One difference to the TM-D710 is that the TH-D72 has native USB. Therefore, I also set PC Port Output (screen 350) to On.
I connected the HT with a Mini-USB cable to a USB-OTG adapter which had a Micro-USB plug. Using a USB-C adapter, I connected it to the BV9900pro. A simple USB OTG app didn’t list any USB devices connected, though. I noticed I had to switch on “OTG data exchange” in Settings > System. Now APRSDroid showed a USB device connected, and I could start tracking.
I noticed two caveats, which might be entirely the smartphone’s fault. (It performs aggressive internal task killing which can’t be fully neutralized even by rooting and modding.) First, the “OTG data exchange” setting’s description says it is switched off after 15 minutes of non-use. In my case, it switches off anyway, even if the phone’s screen is on and APRSDroid is tracking. Second, when I switch to map mode while the USB connection is active, the app freezes and is cumbersome to stop and restart. I always have to switch off OTG first, then study the map, switch back to list view, activate OTG and start tracking again.
Additional thoughts: The TH-D72 is an aging device. It was introduced in 2010, I bought it new when I got licensed in 2018, and it was discontinued soon after. It is still the only device available [besides the TH-D74] that offers an all-in-one solution for Packet/APRS/GPS and also works perfectly from Linux and Android. Although I’ve been a ham for only a short time, it keeps puzzling me how inaccessible information can be to newbies: only recently did I manage to use Packet Radio at 1200 baud to
send Winlink messages directly from the HT (via APRSLink),
send ordinary e-mails directly from the HT,
receive(!) ordinary e-mails directly on the HT, via APRS radio (which is endgame-awesome), and
all of which have been old-school for quite a while now. It also took me four years to figure out that APRS works directly between the HT and a smartphone or tablet. At least I’ve now established these options for myself.
The TM-D710 disappeared from the market last year, which caught me by surprise. After months of searching, I finally managed to buy one used. This mobile TRX also offers on-board APRS, which can be accessed directly from Linux.
Currently, in CQRLOG 2.5.1 on Ubuntu 21.04, QSL export for label printing is broken due to a FreePascal bug:
TRegExpr exec: empty input string.
Version 2.5.2 contains a fix, though. Now I had the choice between these unpleasant options:
Go through dependency hell by using the CQRLOG PPA for Ubuntu. This repo, however, has some unresolvable dependencies. Attempts to work around these moved my system from MariaDB back to MySQL, resulting in a migration of the system database directory, which made a “downgrade” back to MariaDB another huge stumbling block. Luckily, I could switch back to 2.5.1 on MariaDB again.
Go through dependency hell by trying to compile CQRLOG from source. Either I’d have to mess with several odd development packages directly on my system, or use a Docker container. This was a path I didn’t really want to take.
Delay printing QSL cards for several months until Ubuntu 21.10 is out (which hopefully contains that fix).
Couldn’t I somehow work around that bug? After all, I just wanted to dump certain log entries to a CSV. This could be done using an ordinary MySQL client! And this is the procedure to do so:
Start CQRLOG; this launches a MySQL (MariaDB) server instance in the CQRLOG data directory (~/.config/cqrlog/database by default), listening on port 64000. Now, simply connect to it:
$ mysql -h 127.0.0.1 -P64000
Use the desired database:
> use cqrlog001;
Query, using the columns you usually export:
> select qsodate, time_on, callsign, mode, freq,
rst_s, qsl_via, remarks, stx, stx_string
into outfile '/path/to/qsl_test.csv'
fields terminated by ','
from cqrlog_main
where qsl_s in ('SMB', 'SB');
The resulting CSV is already almost in the usual format, except for the date. In Vim, I did these transformations:
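For example, assuming the labels expect dates as DD.MM.YYYY while MySQL writes them as YYYY-MM-DD (the target format here is just an assumption, adjust as needed), a substitution along these lines does it:

:%s/^\(\d\{4}\)-\(\d\d\)-\(\d\d\)/\3.\2.\1/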
Since ham radio call signs can contain both the letter O and the digit 0 at the same time, the zero is often printed as a slashed zero. But because there's no readily available (Unicode) character for this, the letter Ø is most often used instead, since one cannot always use a monospace font that already renders the zero in slashed form. (This problem can also occur in other cases, such as printing hashes or codes.)
There's a lot of ham radio logging software out there that supports a workflow for printing QSL cards or label stickers, and I'm not sure whether any of them supports replacing zeros with Ø before printing, and only in call signs, not in other data such as date, time or frequency. The software I use on Ubuntu, CQRLOG, exports QSOs into a CSV file that can be read by gLabels to prepare or perform the actual print. I wanted a step in between that transforms that CSV file by substituting that Ø only in call signs. It took me some time, but here's my solution (showing a bogus CSV line):
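A minimal sketch of such a filter in Perl, assuming the call sign is the third CSV column (as in the SQL export shown earlier; adjust $call_idx otherwise); the script name and the sample line below are made up:

#!/usr/bin/perl
# Sketch: replace the digit 0 with Ø, but only in the call sign column.
# Assumption: the call sign is the third comma-separated field.
use strict;
use warnings;
use open qw(:std :utf8);   # read and write STDIN/STDOUT as UTF-8

my $call_idx = 2;          # zero-based index of the call sign column

while (my $line = <>) {
    chomp $line;
    my @fields = split /,/, $line, -1;     # -1 keeps trailing empty fields
    $fields[$call_idx] =~ s/0/\x{D8}/g;    # 0 -> Ø (U+00D8), call sign only
    print join(',', @fields), "\n";
}

Fed with a bogus line, it leaves zeros in date and frequency alone:

$ echo '2021-08-15,1834,DL0ABC,SSB,14.250,59,,,,' | ./slash-zeros.pl
2021-08-15,1834,DLØABC,SSB,14.250,59,,,,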
My experiment with crypto coins is over. (No, these are not currencies.)
Last summer, when the crypto hype started taking off, I decided to get my hands dirty and to learn what’s up with this stuff. I decided to spend around €50 on each of a few established crypto coins and expected to lose them all.
My take-away now is: I’m so glad I got everything back into real money. Crypto coins are a fad and, in this shape, will not revolutionize anything but scams. Crypto coins only work in times of (relative) stability and working infrastructure. Crypto coins are no way to protect wealth from disasters such as weather, government, nuclear meltdown or financial system meltdown. Crypto coins need way too much energy; they live in a world of busily buzzing networks and heated-up CPUs and GPUs. All this crypto stuff is not ready for prime time (or has its prime time passed already?); it’s much too complicated, and you need too much and too deep technical knowledge to be able to handle your funds. Here are some examples I experienced myself:
You have to manage a scattered and varied set of crypto wallets to actually store your funds. Mine included online wallets, local wallets on my (Linux) PC and an exchange.
Because you do not want to save your passwords, you have to memorize them very well, and to not forget them, you have to log into your wallets regularly. If you try a few wrong passwords, your IP is throttled or blocked, and you’ve locked yourself out right when you actually need access. Of course, you use 2FA, so you’re also dependent on your smartphones or tablets working happily.
For every transfer, you have to click through a wall of confirmations, captchas, more 2FA codes, and verifications via e-mail. You depend on access to your e-mails. (Do your internet access points and e-mail providers anticipate large-scale disasters?)
If you’re too sloppy and forget to include the payment ID in your transfer to an exchange, you need their support and have to wait for days to get your funds back.
Your balance might suddenly show up as zero. (In my case, it was my local IOTA client.) You have to do research, update your client and try pointless “reattachments to the Tangle” until you finally see your balance again and get rid of it, rid of it, exchange it into real money, my ass.
Exchange of regular (small) amounts might be artificially slowed down or made impossible, because the networks don’t scale.
It might well be that blockchain and smart contracts reach a plateau of productivity one day, but I downright hope that crypto coins will not. Mining rigs are placed in containers next to power plants! Finance portals publish crypto reports as if these were a regular asset class! For the times ahead, better invest in something tangible or in capital that cannot be raided, like social capital or knowledge capital. I want to know things, be able to do things, and rely on things that do not depend on a running and ever-consuming industrial infrastructure.
I noticed that TresoritSolo suddenly ramped up total space from 1 TB to 2 TB without any notice or extra charges. I hope this change is permanent.
Btw, since my (somewhat forced) migration from Wuala 2½ years ago, I’ve become even more convinced of their service. Their clients for Windows, GNU/Linux & Android are very convenient and reliable. Recently they gave their visual appearance and clients a big revamp (even the one for Linux).
The only thing that’s a bit odd is that they impose artificial restrictions on file and directory names. In earlier client versions, files were simply and silently(!) not uploaded if their names ended with a blank or a dot. I was surprised and slightly shocked that these files were simply not protected, without my being aware of it. Luckily, with their recent client updates, there’s now a way to view a list of affected files. Boy, was I surprised that there were even more restrictions: they disallow characters that Windows does not support in file names, such as
: ? " < > |
In addition, name collisions are reported when file names differ only in capitalization. All of these are perfectly normal features of Linux file systems that I have got used to over the years. Another thing is hidden directories on Linux, which start with a dot: one day they simply and silently stopped being uploaded. Luckily, I didn’t have to rename that many files and directories, though they were in the dozens. Other users could be much worse off. Finally, after renaming, the affected files got uploaded; they were in the hundreds, and I had thought they’d been in the cloud for months already.
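To spot such names beforehand on the Linux side, a find expression along these lines lists files and directories whose names contain the characters above or end in a dot or a blank (the sync directory is just an example; case-only collisions aren’t covered):

$ find ~/Tresorit \( -name '*[:?"<>|]*' -o -name '*.' -o -name '* ' \) -print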
Apart from that, I can fully recommend their service, even more now that they cost me 0.09 €/GB per year.
A roar went through the tech-savvy finance community when Yahoo suddenly shut down its Finance API on Nov 1, 2017. Dozens of libraries and scripts that had parsed HTTP output from this API for years ceased to work. No longer could libraries such as Finance::Quote query prices of stocks, funds and currencies and hand these over to applications such as GnuCash. As a user of Finance::Quote (I even wrote an extension module for it), I was affected as well, and I finally stumbled upon a solution: version 1.41 recently introduced an Alpha Vantage module, and querying this source instead of Yahoo is straightforward.
Unfortunately, on Debian or Ubuntu one has to install this version from source, but that's not hard to do. And to use Alpha Vantage, you have to register for an API key, but that's no issue either. One thing that still doesn't work is currency conversion, as that part is still tied to Yahoo's interface. Alpha Vantage does provide currency conversion through its API, though. Looking at AlphaVantage.pm, I recreated a simple piece of Perl logic to perform currency conversion. Note that I haven't packed it into a real Finance::Quote module; it's just a simple Perl script that Works For Me™. For simplicity, I don't do any error handling. It relies on an environment variable that contains the API key. Here's how I did it:
#!/usr/bin/perl -w
use lib '/home/user/lib/perl';
#not needed# use Finance::Quote;
#debug# use Data::Dumper;
use JSON qw( decode_json );
use HTTP::Request::Common;

my $pair = $ARGV[0];    # e.g. "XAU:EUR" for gold price in Euro
@curr = split(/:/, $pair);
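# The rest below is a sketch along the lines of AlphaVantage.pm. Assumptions:
# the API key sits in the ALPHAVANTAGE_API_KEY environment variable (the one
# Finance::Quote uses), and the CURRENCY_EXCHANGE_RATE endpoint and its
# response field names are as documented by Alpha Vantage at the time of writing.
use LWP::UserAgent;

my $url = 'https://www.alphavantage.co/query'
        . '?function=CURRENCY_EXCHANGE_RATE'
        . "&from_currency=$curr[0]"
        . "&to_currency=$curr[1]"
        . "&apikey=$ENV{ALPHAVANTAGE_API_KEY}";

my $ua    = LWP::UserAgent->new;
my $reply = $ua->request(GET $url);          # GET comes from HTTP::Request::Common
my $data  = decode_json($reply->content);    # no error handling, as noted above
my $rate  = $data->{'Realtime Currency Exchange Rate'}{'5. Exchange Rate'};

print "$pair $rate\n";   # e.g. "XAU:EUR 1631.52" (made-up number)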
I recently managed to reactivate my dear old Epson HX-20, a retro computer released in 1983 which I used at the end of the 1980s and in the early 1990s to learn programming. Since I could even read in some BASIC programs that were still stored on the micro cassette, I wondered whether I could rescue the code directly, without using OCR on printouts or even typing it off by hand. I was aware that I was very lucky that this machine still worked after all those years. The soldering seems to have been much more rigid back then; attempting to run old PCs that are a decade younger might cause more issues! To be on the safe side, I invested in a new NiCd battery pack and replaced the original one.
My research first led me to the machine’s RS-232 output, internally called “COM0”. Someone had used it some years ago to connect an HX-20 directly to a PC’s serial port, using a special cable and some adapters. Sadly, it seems that this is no longer an option, since these cables have disappeared, and serial-to-USB adapters only seem to work with a certain chip in this case.
Then I stumbled upon the GPL software HXTape, and I was totally baffled: What, the Epson HX-20 had an external cassette interface as well? I knew that concept from our even older machine, the Texas Instruments TI-99/4A. It connected to a special music cassette recorder and encoded data into simple “magnetic bits” onto the tape:
It was quite funny to listen to the noisy sounds when playing the MCs on an ordinary player.
This bidirectional data transfer works over ordinary mono audio cables, one for each direction. And now it turns out the HX-20 had such an interface as well, and we never used it. But the point is, one can exploit it to decode the audio signals back into the original bits and bytes by connecting the HX-20’s “MIC” port to the microphone input of a PC using a simple mono audio cable with standard 3.5 mm jacks! (How tremendous the analog world was! Keep a music cassette lying around in the basement for decades and then just play it. Try this with your ¡Phone!) And that audio decoding is exactly what HXTape does.
In 1988, when I entered grammar school, [...] I did a bit more serious BASIC programming on an Epson HX-20, which I still own today, but isn’t working anymore. [...] My cousin René taught me some BASIC things like how to use the random number generator.
Well, it turns out that this machine is still alive, as you can see in this video:
You can even watch me loading my cousin’s learning program from the micro cassette: the very lines he’d written for me on Dec 29th, 1988. Its output, though, was designed for a separate, larger screen that I don’t own anymore.
I’m pretty sure that this machine had slept from ~1995 until I woke it up now. I hope I can also have a look at my other programs on that micro cassette.