A Word About Obsidian

I’ve been using Obsidian for a while now, and it’s a tool that gets better as I use it.

Normally, that sort of statement is due to a tool’s complexity being an inhibitor at first, and then gradually an asset. There’s no question that this is true for Obsidian.

However, it’s also the case that Obsidian, itself, has gotten markedly better over the time that I’ve been using it.

What prompted this post is the recent change in the way Obsidian handles tables.

Obsidian has always supported tables, although tables are not part of standard Markdown (they come from extensions such as GitHub Flavored Markdown). The support was very bare-bones, and I used the wonderful Advanced Tables and Table Extended plugins to make them more useful to me.

When a recent update to Obsidian announced that table support had improved, my first thought was to wonder whether I would still need my plugins. While the answer is a qualified “yes”, what Obsidian has added is by no means unhelpful.

I also use Okular as my primary PDF viewer. One of the nice features in Okular is the ability to select a table, including specifying where the row and column breaks fall, and then to paste it elsewhere.

In my previous workflow, this always required a lot of post-processing. Things sped up a bit when I discovered that pasting into LibreOffice resulted in a working table that I could then convert to text and paste into Obsidian.

All of that has changed. I can now paste those tables (some of which are pretty large) directly into Obsidian, with Obsidian taking care of the formatting.

So, what are the qualifications on my “yes” to keeping my two Tables plugins?

  1. Advanced Tables has the marvelous ability to start typing a table and then use a hotkey (I use ctrl-enter or ctrl-tab) to tell Obsidian, “Based upon what I typed, make this a table.” It’s possible that I’ll eventually stop using this in favor of the new table support’s version (using the mouse), but for the time being I’m going to keep Advanced Tables.
  2. Table Extended does something I haven’t seen anywhere else, and it’s pretty cool. The only downside is that it’s not visible in the default editor: you have to go to preview mode to see it. Therefore, all of the editing is manual.

    Table Extended allows me to make multi-row and multi-column cells, and especially multi-row headers. This is absolutely essential for tables where columns are grouped under larger headings.
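For example, Table Extended understands MultiMarkdown-style spans, where a trailing `||` means the cell swallows the next column as well (this little table is invented for illustration):

```markdown
|          | Grouping            ||
| Item     | Quantity | Price    |
|----------|----------|----------|
| Widgets  | 10       | $1.50    |
| Total               || $15.00   |
```

As noted, this only renders in preview mode; in the editor you see the raw pipes.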

Anyway, I may at some point post more about how I’m using Obsidian, but for now I just suggest you try it. Oh, one more thing I love: I use a plugin called Obsidian Git to automatically commit my changes and then push them to a remote git server. Automatic backup, plus I can now work on multiple machines without corrupting my files (because of git’s commit tracking).
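What the plugin automates can be sketched as a shell function (the vault path and the commit message format here are my own assumptions, not the plugin’s defaults):

```shell
# Commit any vault changes and push them to the remote; run it from cron or
# a timer. Pass your vault path as $1 (the default here is an assumption).
backup_vault() {
  (
    cd "${1:-$HOME/ObsidianVault}" || exit 1
    # Anything new, modified, or deleted since the last commit?
    if [ -n "$(git status --porcelain)" ]; then
      git add -A
      git commit -m "vault backup: $(date -u +%Y-%m-%dT%H:%M:%SZ)"
      git push origin HEAD
    fi
  )
}
```

Because every change lands as a commit, git’s history also gives you a crude undo for any note.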

Update on the QNAP TS-473A

It’s been almost a year since I wrote about the QNAP NAS system, and I thought it appropriate to talk about some updated experiences.

First, I’m no longer using the DLNA server as the primary way to stream media. It’s not that it didn’t work, but it was clunky, and I knew my wife would never be happy with it.

This led to a lot of experimentation. Over the same period of time, it became clear that her iMac needed a system software update to continue to be useful as an Internet-facing machine, and since Apple does not support that model of iMac with updates, I installed Linux Mint on it.

The Linux install was easy and straightforward, and I installed a couple of things to make it look a little more Mac-like for her, but Kodi continued to give us problems, and even seemed to be more unstable than before. It certainly didn’t work well with the DLNA from the QNAP (I am still making video available from the old TrueNAS system, although it doesn’t get any new rips) and I could see that my wife was getting very frustrated.

I decided to try a different tack. I have a free account with Plex, but in my experience, they only let you stream your local content locally if you also have a paid subscription. I may be misunderstanding that, or maybe it only applies to the clients, or something, but I could never get it to do what I wanted it to do.

I then looked at Emby and ran into similar issues, so I tried Jellyfin, a fork of an earlier version of Emby. Not only does it have a native client, it also just works in a web browser.

Back to the QNAP. QNAP ships with software called ContainerStation, which allows one to install and run Docker containers on the NAS. It allows installation from Docker Hub, which opens up the possibilities for a vast array of software running on the NAS.

One of the drawbacks of ContainerStation is that it is difficult to modify a docker configuration after it is created — specifically to change the list of file system folders that are mounted inside the docker environment. However, since it is so easy to spin up a second copy of a container and configure it the way you want, this isn’t as bad a deal as it might otherwise be.
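For reference, the ContainerStation setup corresponds roughly to a docker run like the following; the QNAP share paths and container name are assumptions for illustration, so substitute your own media and config folders:

```shell
# Hypothetical QNAP share paths; adjust to the folders you actually use.
# 8096 is Jellyfin's web interface port; /config holds settings and metadata,
# and the media share is mounted read-only.
docker run -d --name jellyfin \
  --restart unless-stopped \
  -p 8096:8096 \
  -v /share/Container/jellyfin/config:/config \
  -v /share/Multimedia:/media:ro \
  jellyfin/jellyfin
```

Spinning up a second copy with a different folder list, as described above, is just a matter of re-running this with different -v flags.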

I installed a Jellyfin docker container, and my wife is now happy with the streaming media available on her iMac. It’s easy for me to add content to, to modify metadata, etc., and she benefits from all the changes I make.

Second, there’s also a docker image for calibre-web. I haven’t decided, yet, whether this is my solution, or if I’m going to instead run an image for Calibre itself (running as a headless server). The benefit to running Calibre is that you can easily make virtual libraries based upon metadata, including tags. On calibre-web, you have to add the books one by one to the “shelf” you want them on. On the other hand, I do think the interface looks nicer for calibre-web.

All in all, I’m still quite pleased with QuTS hero as an OS on the NAS, and with the hardware of the QNAP TS-473A. It is very responsive for what I want to do with it, and in my limited use-case it’s been a good value for the money. That said, the QNAP software for media streaming and control is terrible, and seems designed (like Plex) to try to drive you to their servers and services. No thanks. The reason I do this on-prem is to keep their tendrils out of my stuff. The fact that ContainerStation makes it relatively painless is a clear win. Also, if I need a more complete solution for something down the road, I still have the ability to install Ubuntu, etc., for a more fine-grained software experience.

QNAP TS-473A First Impressions

Christmas came early at the Wood household, as we decided to make good on years of talk about separating the gift-giving from the celebration of the Savior’s birth. Picking a day when we would all be available, and so on, we settled on December 18.

One of the new toys is the QNAP NAS system referenced in the post title.

The interface could be a little more explanatory, and I was glad that I had read up to know that I would need to select the new QuTS hero OS right away if I wanted to use it.

OS installation took a while, but I just left it running and went to do other things, and before long it was all there.

One reason I wanted to use the new QuTS hero 5 OS is that it uses ZFS by default. ZFS is a file system with a reputation for preventing bit rot (the gradual degradation of data from random changes to the storage medium), so I’ve been using it on my homebuilt NAS (which currently runs TrueNAS).

QNAP has a pretty good reputation for their software (and are popular enough that they are the targets of specially-designed malware), but I was still a little nervous about migration. Supposedly, you can also wipe the box and put another OS on it (like TrueNAS), so I could theoretically end up with a similar setup to what I have, but on more NAS-friendly hardware. Still, I have been struggling with the media streaming element of my home server setup.

We’ve traditionally been using file shares and Kodi to stream our media around the house, but it didn’t work quite as well as we’d like. For one thing, running Kodi on each machine that wanted to consume media meant that we had a separate library on each machine. Some of our media is a little eclectic, and the “matches” found in the movie database were usually not very family-friendly. Trying to get the info updated on all client machines was frustrating.

So far, the new QNAP looks promising as a way to get around some of these difficulties. For one thing, the DLNA implementation “just works”. (My client is VLC, which I had generally avoided as a media player in the past, but the interface has gotten a lot better.) This should mean that I can edit the metadata for my music, movies, and TV shows, and the clients (including Kodi) will pull that information when they connect to the DLNA server.

What about the hardware?

Drive installation was really painless. It’s a “nearly” tool-free process that is tool-free if you want it to be. Basically, the device uses drive sleds (provided) that have the old clip-style drive rails. They attach to the outside of the sled and poke through the holes into the mounting screw holes on the drive. For better retention, you can (optionally) use the provided screws to secure the drive to the sled using the screw holes on the underside of the drive.

It’s not as simple and straightforward as the sled-free system on the Synology JBOD system I have, but I really have no complaints.

There are USB ports on the front of the device, and you can program an action to take place when the button on the front is pressed. (Basically, do you want to automatically import media to your server — like connecting your camera, for instance — or back up to NAS storage, etc.)

There are also USB ports on the back, so you can attach more storage (like the Synology JBOD device I have) if the four drive bays (in my model) aren’t enough for you, and you don’t want to spring for a whole new NAS system.

Other details

There are a lot of options for connecting the QNAP to other storage services, either to use cloud services as a backup to the QNAP, or to use the QNAP as a local cache of the cloud services. You can link it together with other NAS devices to back up off-site, so it really makes it easy to make backups, rather than just copies.

I mentioned ZFS briefly when I started this post — one of the other features of ZFS is that it can very easily make RAID systems of varying flexibility. It’s relatively easy to add and remove drives from the RAID, and as long as you’re patient, and your hardware isn’t too slow, you shouldn’t lose data. I have four 8TB drives that will end up living in this QNAP, and that should provide a certain amount of redundancy to the data storage in this NAS.
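As a sketch of what that looks like at the command line (the device names are assumptions, and this command destroys whatever is on the disks):

```shell
# One drive's worth of parity across four 8TB disks: roughly 24TB usable,
# and the pool survives a single drive failure.
zpool create tank raidz1 /dev/sda /dev/sdb /dev/sdc /dev/sdd
zpool status tank
```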

Final observation, before I go back to playing with this thing: QuTS provides an app called Ubuntu Linux Station, which allows me to install an Ubuntu 20.04 or 18.04 VM on the NAS. While this is terrible overkill for most things that I want a NAS to do, it does give me a simple way to set up a Calibre Content Server on the network. I’ve been running one from my desktop, but not only will I feel better with that library stored away from the rest of my stuff (okay, backed up, since I’ll be keeping a copy of everything on my computer, too), but it will make it easier for me to convince my family to look to the network for ebooks.

I’m sure as time goes by I will find out other things that I want this server to do, and so far I’m really impressed by the flexibility of the QuTS to let me do them.

Password Management

A necessity in the digital age

A while back, password management was one of those things people said “I suppose some people do that” about. Most people had at most one or two online accounts, and it was a whole lot harder to get around the Internet and find useful things to compromise or steal.

Things have changed. My currently used password manager tracks 1,692 items, and while there are some duplicates and items that aren’t passwords, not to mention passwords for accounts that I no longer have, that’s way too many passwords for me to manage on my own.

What is a Password Manager?

A Password Manager is a computer program (and possibly a service) that helps a computer user to manage the passwords that are needed for a modern digital life. This is necessary because of a couple of very important points.

  1. Passwords should be secure. They should be long, complicated, and difficult for people to guess. In general, they should not contain identifying information about you (birthdate, anniversary, pet’s name, etc.) but should be as random as possible.
  2. You should use a different password for each service you use.

Let’s look at each of these points in turn, and then we’ll talk about how a password manager can help with them.

Passwords Should Be Secure

In the before time, computers were slow and had extremely limited memory. (This is technically true about almost all times, compared to the times that come after, although I believe Moore’s law has failed.) One of the things that this meant was that it was difficult for a computer to crack a password.

If you’ve played the game Portal 2, you’ll possibly remember the scene where Wheatley tries to guess the password to keep GLaDOS from reactivating. He starts by guessing six As, and then goes to five As followed by C. He’s not very fast, and if the password is longer than 6 characters, he’ll never get it.

Today’s computers are able to run through every possible combination of a six-character password (upper and lower case letters, numbers, and symbols) in a very short amount of time.
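To put rough numbers on that, here is a quick shell calculation, assuming a password drawn from the roughly 95 printable ASCII characters:

```shell
# Back-of-the-envelope: how many candidate passwords a brute-forcer must try.
combos() {
  # usage: combos CHARSET_SIZE LENGTH
  local n=1 i=0
  while [ "$i" -lt "$2" ]; do
    n=$(( n * $1 ))
    i=$(( i + 1 ))
  done
  echo "$n"
}

echo "6 characters: $(combos 95 6) possibilities"
echo "8 characters: $(combos 95 8) possibilities"
```

At a billion guesses per second, the six-character space falls in minutes; eight characters holds out for a couple of months, and every added character multiplies the work by 95.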

With the larger amounts of memory (and storage) programs designed to pilfer passwords can also use a dictionary attack, using words from a word list to try to guess the password more quickly. Some of these word lists include passwords collected from online server breaches.

In short, to make it more difficult for a program to guess your password, you need to have a password that’s difficult to guess. There are two major techniques used for this.

The first technique is to make a jumble of characters. A typical password manager will rate a 20-character code containing UPPER and lower case letters, digits, and symbols as random enough to be excellent, even when ambiguous characters are allowed (i.e. both 0 and O, a zero and a capital O). This type of password is quite secure, but very difficult to remember.

The second technique uses real words, but combines them in ways that are not normal (or grammatically sensible). Although a password like this is easier to remember, it can also be rated excellent, because it is quite long and random enough that a program will have a hard time cracking it. Mixing in upper and lower case letters, as well as digits, makes it more secure still. To a computer, E and e are completely different characters, and not necessarily related.
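A minimal sketch of the first technique from a shell, assuming a Linux-ish system with /dev/urandom (the character set is an arbitrary choice):

```shell
# Generate a jumble-of-characters password of the requested length.
genpw() {
  local len="${1:-20}"
  # 512 random bytes is far more raw material than needed after filtering.
  head -c 512 /dev/urandom | LC_ALL=C tr -dc 'A-Za-z0-9!@#%^&*' | cut -c1-"$len"
}

genpw 20
```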

So, with this kind of password, you have a relatively secure way to access data that is yours and that shouldn’t be shared with other people: your email, your bank account, your computer, etc. However, memorizing even one of these passwords is difficult, and remember, I said you should have a different password for every account you use.

Passwords Should Be Unique

Every once in a while, you’ll hear that service XYZ.com got hacked, and that the hackers got away with personal information of the users of that service. If that personal information included passwords, the hackers (or those they sell the information to) will certainly try to use those passwords in other places, especially in combination with the username connected to that account.

Since usernames are often just email addresses, and since most people don’t have lots and lots of email addresses, this basically means that if you reuse your password, you’ll be reusing both password and username.

Logins are like locks that usually require two keys: your username and your password. Since most services ask for your email address, and allow you to login using that address, hackers now only need to have one of the keys, the password, to try to access your information. If your password is always the same, too, they will have both keys to every account you use.

Why Does It Matter?

I’ve heard a lot of people say, “Well, I don’t do anything important on my computer, and I don’t have very much money in my bank account, and so I won’t lose much if someone is able to break in to my account. Besides, why would they want to?”

This ignores several very real harms:

  1. Once a hacker has access to a server using a legitimate account, he can try to use that account to gain access to other parts of the server. By not using his own account, he’s less likely to be caught on failed attempts.
  2. If a hacker empties your bank account (even if there’s not much money in it) you’ll likely be faced with overdraft charges that can be expensive to clean up.
  3. Sometimes, the hacker just wants access to your computer. He can use it to attack other computers around the world without leaving a trail back to him. This slows your computer down (and if he uses your computer for bitcoin mining, will cost you in electricity and failed computer parts) and makes you a participant in, for example, an attack on the Bank of Scotland.
  4. Some people just want to see the world burn. They may not care about your 1,700 pictures of your dog, but they get pleasure from the thought that by deleting or defacing them, they have exercised power over you.
  5. If you use any kind of account to work with someone else’s data, that person can be harmed through the access to your account. Malicious reviews can be posted under your name, or false information can be inserted into online accounts.

In short, it’s more important than you may realize.

It Sounds Complicated

To be fair, it is complicated, but Password Managers make it somewhat less complicated.

Most password managers share two characteristics: they let you generate secure passwords (as described above) and they store username-password pairs for later use. Most of them also use a single password to unlock this store of information, and many of them allow you to synchronize your passwords between your phone and your computer, for example.

How Does it Work, Then?

  1. You go to a website that you use for some purpose (maybe it’s your email).
  2. If the password manager is locked, you will need to type your one password that unlocks it. Many web browsers provide basic password management, and most of them don’t lock the manager when you’re not using it.
  3. The password manager will either let you copy and paste the username and password into the webpage, or will do it for you.

That’s it!

The nice thing is that, for most password managers, the one password never leaves your computer — rather it’s used to unlock the password database/password store.

Most of them will also allow you to generate a secure password, will update the stored password when you change it, etc.

Extra features include checking that the website is the same as the one where you previously used the password. https://microsoft.com and https://microsoft-us.com are not the same place, and one of them might be owned by a hacker trying to get your password.

Some password managers let you store your credit card information securely so it’s available, but not stored in some stranger’s database. Identification information (driver’s license, passport) and software keys can also be stored, and some managers provide special formats for these.

Warnings!

Not all password managers are created equal. In my opinion, there are several things to watch out for when selecting a password manager:

  1. Where are the passwords stored?
    • If the passwords are stored only on your computer, that’s more secure than if they’re stored on someone’s server. The server is not only a bigger target (millions of users’ passwords) but also easier for a hacker to get to.
  2. How are the passwords stored?
    • If they are encrypted, that’s better protection. However, if they’re just in a text file, that’s not very secure, even if it’s on your own hard drive.
  3. Where is the decryption done?
    • If the password store is decrypted at a remote server, and your passwords are sent back to you, that’s not very secure. However, if the service can’t decrypt your password store at all, that’s more secure. (Sometimes this is called zero knowledge: the company can’t share your information because they don’t know it. It also means you’ll lose all your passwords if you forget the one password that decrypts them all.)
  4. What does it cost?
    • Password managers all cost something. Sometimes it’s just the inconvenience of using one, instead of using the same username and password everywhere on the Internet. However,
      1. Some password managers have an up-front purchase price. This seems to be less common now, because
      2. many password managers have a monthly fee instead. This means that not only do you have an on-going expense, you may also lose access to your passwords if your subscription lapses.
      3. Some password managers have a special format that can’t be exported to a different manager easily. If you decide to change to a different password manager, you could have some problems. (I ran into this when I stopped using the Mac — my password manager was 1Password, which isn’t available off the Mac, and several of the password managers I tried couldn’t make sense of the 1Password export format.)
      4. Some password managers give the serving company access to the password store. This means that if they are hacked, hackers could gain access to all of your passwords.
      5. Some password managers are just plain hard to use.
  5. Does it synchronize with multiple devices?
    • This one bit me with a couple of solutions. If Dropbox (for example) is the only option for synchronizing, a free account will limit you to five devices. Sometimes the manager itself will require a license for every x devices it’s installed on. It’s best to find that out before you commit to a particular password manager.
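The zero-knowledge idea from point 3 is easy to sketch with openssl: the store is encrypted and decrypted locally with the one master password, so only ciphertext would ever reach a sync server. The file name, master password, and sample entry here are all invented for illustration:

```shell
# Encrypt a (fake) password entry locally; the key is derived from the
# master password with PBKDF2 and never leaves this machine.
vault="$(mktemp -d)/store.enc"
printf 'example.com,me,hunter2\n' |
  openssl enc -aes-256-cbc -pbkdf2 -salt -pass pass:my-master-password -out "$vault"

# Reading it back requires the same master password.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:my-master-password -in "$vault"
```

Lose the master password and, as the warning says, the ciphertext is all you have left.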

This is a lot of information to process. I will plan to write a second post comparing some password managers.

Gaming Remotely: challenges

I’ve been playing a D&D game with my kids for the past few years. We are using MapTool for our Virtual Table Top because it’s free, and it’s also pretty impressive. A lot of the number crunching involved in regular Old School D&D is done by the computer, which is what it’s for. (Note, although I’ll generally say D&D, we’re actually playing 1st Edition Advanced D&D. My time with the Basic Set was very short, and I lost interest during 2nd Edition.)

Things have gone pretty well, and we have two players who live off site who join us through the magic of technology. The biggest problem has been our Internet connection. We live in a rural area, and the only Internet options for us have been traditional satellite Internet (bad latency and even worse customer service, coupled with a pricing structure that would make a robber baron blush) and AT&T DSL.

AT&T is one of those companies that has tried its level best to alienate me for years. I pulled the family off of their cell phone service last year, and we’re just waiting for an opportunity to ditch the DSL line as well. The service is poor to start with, but the signal drops several times per day, and that’s just unacceptable.

When the service drops, we lose connection on our voice chat (using Discord) and our remote players also lose their connection to MapTool, and have to be booted and reconnect (once the connection is back up).

We finally have a chance at something better. I’m testing a Starlink setup. It’s not technically available in our area (we’re on the wait list), but I managed to get a chance, and I’ve been running it for just under a week.

The first thing to love is that it’s so much faster than DSL most of the time. The latency isn’t too much worse, and it’s also been quite stable. There’s just one little problem!

Starlink uses the same setup as cell phones for assigning IP addresses. That means, among other things, that the IP address can change, sometimes frequently. It also means customers sit behind carrier-grade NAT, so Starlink doesn’t support (and probably never will support) port forwarding.

MapTool runs a local server, and unless your machine is directly connected to the Internet, you need that port forwarded to your players for them to connect to the server. (Okay, technically, you need the port forwarded from somewhere else to the server, but let’s not get too OCD about this.)

Last week was a blind role playing session, because we discovered the problem too late. I almost always forget about the port forwarding because it works transparently most of the time, and every time I’ve changed router setups, I generally have a glitch the first game day because I’ve forgotten to redo the port forwards. With most routers, it’s a fairly trivial thing to do.

This time, I’ve spent most of the week working on this issue, and I finally fixed it this morning. I’ll start with what didn’t work.

  1. localtunnel
    • This is an npm installation that should work easily. In fact, it’s very similar to the solution that eventually did work.
    • Unfortunately, my remote machine (tested using a computer connected via my phone’s hotspot) couldn’t connect to the server.
    • I have a need-to-use/hate relationship with npm. For whatever reason, I seem to have to continually reinstall applications that use it, including npm itself. All in all, I’m probably just as happy it didn’t work.
  2. Remote.it
    • Oh, how I wanted to like this solution. It has a nice configuration app, it claims to do everything I need, and it even mentions working with Starlink.
    • Nope. Also, nuh uh. Documentation was sparse, and whether I’m just a bonehead or for some other reason, it didn’t work.
    • If someone can tell me how to make it work, I’d be happy to give it another chance.
  3. Tailscale
    • This is a great concept — zero config VPN. Unfortunately, it achieves this by using Single Sign On from Google or MS, or one of those other evil megacorporations. Nope, I spend a fair amount of my life avoiding the monitoring of those beasts, I don’t want to invite them to our games. I didn’t even look far enough to see what costs, if any, there were.
  4. ZeroTier
    • This one is actually pretty amazing. I might go back to it if I get the proper feedback from my group.
    • ZeroTier is an easy-to-configure VPN setup that supports 25 nodes using their free plan.
    • Unfortunately, it requires installation of client software on every connected machine. For a different application, that would be completely worth it. For gaming, where I’ve already asked the players to install MapTool, it’s not going to work. Plus, we’re hoping to move over to Foundry, which will eliminate the need for my players to install any software except a web browser.
  5. SSH Tunnel
    • Technically, all I need for this one is ssh installed on my machine, which I have, and a server that’s not inside my network, which I also have. (An example is this fine website.)
    • I had an application on the Mac that would help with this, but I couldn’t remember the name, so didn’t look for alternatives. (It turns out to have been SSH Tunnel Manager. Now I feel stupid.)
    • I found an app called jEnTunnel. It’s a little java app with a graphical interface that allows me to easily set up tunnels. I haven’t tried to see if it actually works with MapTool yet, but I will update when I have a chance to do that. Chances are it will be my go-to, since I control the other endpoint.
  6. ngrok
    • This is what I ended up using, and what prompted writing this post.
    • ngrok is a tiny little command-line tool that lets me forward a port very simply.
    • In fact, it allows the creation of config files that give a shortcut to opening several ports simultaneously, which could be very useful indeed.
    • They have a ppa, so I can install (and uninstall) using apt.
    • One downside, the external url and port change every time you run it, so it’s a bit more fiddly than some of the other options.
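For reference, an ngrok config along these lines handles the MapTool case (the tunnel name is arbitrary, and 51234 is an assumption: use whatever port your MapTool server actually listens on):

```yaml
# ~/.ngrok2/ngrok.yml
authtoken: YOUR_AUTHTOKEN
tunnels:
  maptool:
    proto: tcp
    addr: 51234
```

`ngrok start maptool` then prints the public host and port to hand to players; as noted, that address changes on every run.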

TL/DR

If you have a similar situation where you need to forward a port out of your firewalled Internet connection, ngrok or an ssh tunnel may be what you’re looking for.
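For the ssh-tunnel option, the whole trick is one command run from the machine hosting the game server (host, user, and the MapTool port are placeholders/assumptions):

```shell
# -R asks the remote sshd to listen on 51234 and relay connections back to
# the local MapTool server. The server's sshd_config needs "GatewayPorts yes"
# before outside players can connect.
ssh -N -R 0.0.0.0:51234:localhost:51234 user@myserver.example.com
```

Players then point their clients at the outside server’s address, and sshd forwards the traffic back through the tunnel.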

Update!

Further experimentation has revealed a couple of unpleasant truths. The first is that, because my server isn’t a VPS, I don’t actually control it. My hosting provider won’t unblock the ports that I need, so forwarding them does nothing good for me. I may decide to pay for a VPS and host my site(s) there, and then I’ll control the ports, but that’s way in the future.

Secondly, I got a response from Remote.it’s support team. It looks like the only way (short of running a VPS) to make it work for my use is to have my players install the Remote.it software on their computers as well. Then, we can link up and have fun. This puts it soundly in the ZeroTier category, and isn’t acceptable right now.

TL/DR 2: ngrok does what I need, albeit not in a pretty fashion.

Phones: The Shipping Wars

I have had a love/hate relationship with my cellular provider for some time. Long enough ago that it could well be the beginning, I was with AT&T, but their customer service irked me fiercely. Between one thing and another, I hadn’t taken the plunge to a different service, but then my son accidentally bought a Tracfone phone that was locked to that service. You see, we had been buying cheap prepaid phones to use with our AT&T SIM cards, but he didn’t realize that some of them run on different underlying networks.

Well, he just got another phone, and the Tracfone device lay around our house for a while, seeking purpose. That purpose arrived when AT&T started disabling our older devices. I activated the Tracfone for my wife, ported the number over, and it was smooth sailing.

However, there were storms on the horizon. Tracfone’s customer service rivals AT&T for horror, and there was one complication after another. Finally, they switched off my phone — the one AT&T had sent me when they were shutting down my older phone.

Well, this got me interested in options again, and I found Reach Mobile. I haven’t used them long enough to know if I like them, but they have been marginally more communicative than Tracfone, and they don’t insist on calling me back from an unrelated number “within 15 minutes” (it’s always longer) and then making me wait on hold after they call.

So, to transfer your number from one phone service to another, you need four pieces of information.

  1. The name of the company you’re transferring from. In my case, this was Tracfone.
  2. The account number. This is trickier, especially if you have multiple phones with a vendor. For Tracfone, it’s the last 15 digits of your SIM card number, which you can find with a magnifying glass on the back of your SIM card if you pull it out of the phone.
  3. The PIN code. This was a real stumper. After wasting time with the chat support at Tracfone, waiting for someone to call me back, waiting on hold for the person who called me to pick up, I finally found that I could have gotten it by myself. To get the Number Transfer Pin for a Tracfone, text NTP to 611611 on the phone that has the number you’re transferring from.
  4. The billing zip code.

Well, once I had located these pieces of information, I was set. There was one small snag: since I had activated my SIM card with Reach Mobile, they were unable to change the number on it. Fortunately, they send two SIM cards when you sign up, so I was able to use the second card to receive my old number.

Well, I’m still waiting for the transfer to go through. Reach suggested contacting Tracfone to ask them to hurry it up, but I think I’d rather carry around two phones for a bit longer.

Using Cloud Storage

Okay, so we really need to answer the question, “Should you use cloud storage?” before we get into the weeds on this one.

Simply put, cloud storage is file server space that’s not on your computer. Typically, it is owned by someone else and physically separate from where you are.

Because the “cloud” is not typically owned by you, you need to be aware that what you store there might be examined by whoever provides the storage; that they may change their policies and force you to find other solutions; that they might be raided by law enforcement in such a way that your (innocent) files are swept up with those of the target; and so on. In other words, don’t blindly trust a cloud.

That said, there are good reasons to use cloud storage. One thing that makes it “cloud” is the ease of access. You can access your files on a number of different devices, and changes in one place are quickly reflected elsewhere. (This can be a problem if things change in a bad way, but there are ways to mitigate that, too.)

As much of a “cloud skeptic” as I am, I have accounts with several cloud providers. I’m going to briefly recount which ones and why, and then discuss how.

The oldest cloud account I have is with Apple’s iCloud service. I got it when it was a free .Mac account, kept paying for it for quite a while, and now have stopped paying for, and using, it. I don’t use a Mac very much at all, and my Macs tend to be well behind the current OS and hardware. Because of that, Apple won’t let me use two-factor authentication, and iCloud is pretty useless without it. So, I don’t even try to access my iCloud storage anymore, and I long since took my data out of it. I still have the account because it’s connected to my Apple ID, which is connected to my Apple Developer ID, etc.

The second oldest account I have is with Dropbox. Dropbox was one of the first, and I was always able to function within the limits of the free account. I don’t use a cloud to back up most of my data (unless you count a local cloud, which we’ll talk about later) partly because of cost and partly because of time. It takes a long time to upload or download terabytes of data. Dropbox worked well, and I used it for a number of programs on the Mac that were able to use Dropbox for sync. (This was largely during the period when you had to pay for .Mac to use the cloud storage for sync, and I wasn’t paying.)

I mostly stopped using Dropbox a few years ago when they started limiting free accounts to using three devices. The primary purpose for my Dropbox account was to sync my contacts and passwords using a third party whose interest was not in my contacts and passwords. To do that, it needed to sync with close to a dozen devices, so three was not going to cut it. I still have the account, and it can be useful for transferring a file to someone, but I use it only slightly more than the iCloud account.

Now it starts to get hard to say what I got next. It was probably Koofr. I was looking for a privacy-supporting cloud service, and heard about Koofr. It’s hosted in the EU, so it has all the benefits and problems associated with that. I get 10GB free, which isn’t nothing, but is a lot less than some of my folders. It has a Linux app, so that’s nice. I don’t really know why I haven’t used it more.

Box is another service where I signed up for the free offer and wasn’t impressed enough to do more with it. It is something I’ll be using at work, so I may find myself using it more in the future.

OneDrive (Microsoft) and Google Drive (guess who) are two more cloud drives that I only use for work. I hate both of those companies, and don’t trust them in any way, so I share as little of my information with them as possible.

pCloud is my new favorite. There are several reasons for this. First, it is privacy-oriented, being based in Switzerland. Second, Linux is a first-class citizen, with a proper app. (To be fair, Dropbox also has a full Linux app. OneDrive and Google Drive don’t, and iCloud isn’t accessible by other means at all.) Third, although it cost me money, I was able to buy 2TB of cloud storage with a lifetime lease. Of course, that’s the lifetime of that lease, and it’s very possible that they will discontinue this service, or be bought out, or something at some point. However, the perpetual license makes it a fixed cost rather than a recurring one, and that’s something I like. They offer encryption as an add-on, which I haven’t purchased yet, but it’s available in the same way with a one-time payment. It syncs nicely with my computer and phone, and the storage is big enough to be useful, especially when considering things like file versioning, which it supports.

OwnCloud/NextCloud are two versions of the same open source software, with NextCloud being a fork of OwnCloud. What is nice about this software is that it’s not necessarily on someone else’s hardware. While I run an instance of one of these on my shared-host web server (that I don’t control), it is trivial to install it on XigmaNAS or HomeAssistant (or many similar open source servers) and use it as a cloud within your home. While firewalls and network security are outside the scope of this article, with proper precautions (like a VPN) you could even have a home-based server that you can access while away from home.
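As a sketch of how little a self-hosted instance can take, the commands below run NextCloud from its official Docker image. The port and volume name are my own choices, not anything the project mandates, and the script is guarded so it does nothing on a machine without Docker.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: run a NextCloud instance from the official Docker image.
# Assumes Docker is installed and running; the guard makes this a no-op otherwise.
nextcloud_status="skipped"
if command -v docker >/dev/null 2>&1; then
    # Persist data in a named volume; serve the web UI on local port 8080.
    docker run -d --name nextcloud \
        -p 8080:80 \
        -v nextcloud_data:/var/www/html \
        nextcloud
    nextcloud_status="started"
fi
echo "nextcloud setup: $nextcloud_status"
```

From there, first-run setup (admin account, database) happens in the browser at http://localhost:8080.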

Accessing “Unsupported” Clouds

So, how can I access OneDrive, Google Drive, and Box from my Linux computer, since I both need these for work and they don’t support my operating system?

Well, for Box, the answer is deceptively simple. John Green wrote an article about mounting a Box drive in Ubuntu, and although one of the comments from 2021 says that Box stopped supporting WebDAV, I find that (in 2022) it still works just fine. Since the credentials are stored in my Gnome keyring, I’m not sure how I would mount a second Box account using this method.
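For anyone who wants to try the same trick, the WebDAV mount boils down to a single fstab entry using the davfs2 package. The endpoint below is Box’s documented WebDAV URL; the local mount point and options are my assumptions, so adjust to taste.

```
# /etc/fstab — mount Box over WebDAV using the davfs2 package
# "user,noauto" lets a non-root user mount it on demand with: mount ~/box
https://dav.box.com/dav  /home/youruser/box  davfs  user,noauto,rw  0  0
```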

However, the product ExpanDrive is another solution. This was something I acquired long, long ago in a Mac bundle, and never really got it to do what I needed it to.

ExpanDrive options

It supports a ridiculous number of cloud providers, both free and costly, and it allows me to mount those volumes seamlessly on my Linux computer. It flaked out a bit when I was trying to mount two different Box accounts, but that’s why I went looking for, and found, John Green’s solution. Using ExpanDrive I can easily access all of the cloud accounts that don’t have a nice Linux client.


Update 14 April 2022: Another contender in the cloud storage arena appears to be Internxt. At this point, I don’t know anything about them except that they advertise on Brave, and they emphasize zero-knowledge file storage for anonymity and security. They offer 10GB on the free plan, which is certainly enough to try them out. If you do, please let me know in the comments.

Filesystem Search in Linux Mint 20.2

When I installed Linux Mint 20.2 Uma for my kids, I was immediately struck that the search panel in Nemo now had a search by contents field.

I work pretty hard to keep my work organized, but there are still times when I can’t quite put my finger on something I’m looking for, so I upgraded to 20.2 as quickly as was possible (I had been running 20.1).

I figured that I had better take the fetters off of tracker and its kin — after all, a file and content search technology that’s part of the Gnome project would surely be the heart of any similar feature of Nemo, right?

Well, after several weeks of having at least one CPU core pegged 24/7, I decided to do some more research. As far as I can determine, tracker is basically an unwanted orphan. Tracker-GUI, the configuration panel for the utility, is gone from the supported repos (and even in Ubuntu, upstream from Mint).

Clutching my courage with both hands (at the prospect of having to redo those weeks of unprofitable cryptic churning), I reset the tracker database, wiping out its indexes and shutting down its processes (tracker daemon -t to terminate all the tracker processes, and tracker reset -r to wipe the data cache).

Finally, I typed in sudo apt remove tracker and found, to my delight, that it only removed tracker and its attendants (tracker-miner and tracker-extract). Then, I hopped into Nemo and did a search for some text I knew existed inside one of the files in a particular folder. Success! After a relatively short time, the window began to be populated with matching files.
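For anyone wanting to replicate this, the whole sequence is short. Here it is as a script, guarded so that it does nothing on a system that doesn’t have tracker installed:

```shell
#!/usr/bin/env bash
# Stop tracker, wipe its cache, and remove it (as done on Linux Mint 20.2).
# Guarded: this is a no-op on a system without the tracker command.
tracker_cleanup="skipped"
if command -v tracker >/dev/null 2>&1; then
    tracker daemon -t        # terminate all running tracker processes
    tracker reset -r         # wipe the index and data cache
    sudo apt remove tracker  # also removes tracker-miner and tracker-extract
    tracker_cleanup="done"
fi
echo "tracker cleanup: $tracker_cleanup"
```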

Best of all, I no longer have a mammoth processor hog flailing about, not to mention its disk and memory usage.

So, if you’re concerned that removing tracker from your Linux Mint 20.2 installation will negatively affect Nemo’s search capabilities, worry no more! Go ahead and uninstall the little beast!

Screenshots

Many years ago I worked for a commercial help desk operation, and one of the things I had to do was take screen shots for documentation, or just to show users what they needed to do.

I had my personal Mac at work, and I owned Snapz Pro X, from the now-defunct Ambrosia Software. It was a great program that did screencast captures as well, and I enjoyed using it. Most of the time, I didn’t even need to use it, as the Mac has a great screenshot utility, activated by command-shift-3 and command-shift-4, that allows taking screenshots of the whole screen or a portion of it. However, some of my work required me to take screenshots of Windows screens. I researched and installed Greenshot, and was happy with that. Since that time, Greenshot has expanded their software to work on Macs.

Fast forward to a more recent time, and I’m switching my work environment over to Linux. Among the needs I have is the need to do screenshots, so I went back to work researching. Greenshot wasn’t an option, so I found Shutter. At that time, the project had been abandoned for some years, but I was able to scrounge up the procedure to make it work on my computer, and I was happy again.

More recently, Shutter has been adopted by new maintainers, installation is straightforward, and there really seems to be no reason to look elsewhere, but I did. In looking around I found Flameshot, which is available for Linux, Mac, and Windows, and seems really nice. I didn’t need to explore it in depth, since Shutter works, but I took it for a spin or two.

For Windows, I’ve installed ShareX, which gets enthusiastic reviews. I haven’t actually had occasion to use it, as I take most of my Windows screenshots from Linux these days, but I thought it worth mentioning.

Finally, as I was fiddling around with trying to assign a key combination (Ctrl-Shift-3 and Ctrl-Shift-4, as it happens) to my screen capture utility, I discovered that I had already done so. In activating the combo, I found that it didn’t activate Shutter, nor did it activate Flameshot or any of the other screenshot utilities I had installed.

Linux Mint has a built-in screen capture utility that works similarly to the one built in to macOS. On activating the screen capture command (I have no idea what the default key combo is, and I’m too lazy to look right now) a screen capture is taken, and the user is prompted with a save dialog that also has a Copy to Clipboard button. (On the Mac, copying the screenshot to the clipboard was accomplished by holding Option along with the other keys.)

Now, while I say that I don’t know what the default combo is, that’s not strictly true. You can certainly use the PrtSc key on most keyboards for this. However, because of the settings I had put in to Mint, on my computer Ctrl-Shift-4 allows me to select a portion of the screen to capture.

Now it’s true that this built-in utility has no editing capability, and that I can’t do timed captures to select active menus, etc. For that, I’ll probably continue to use Shutter, which has some really nice controls for that. I may even just paste screenshots from the native tool into Shutter for editing.

However, for many people, the built-in function is good enough, as it was on the Mac for me for many years. I’m thrilled to see this kind of simple usability making its way into the notoriously difficult-to-use Linux ecosystem.

Installing Linux Mint on an Acer Nitro 5 laptop

This journey isn’t over yet, but it’s been adventurous enough so far that I thought I’d better start documenting things. Otherwise, I’ll never be able to recreate what I’ve done.

My daughter bought herself a new Acer Nitro 5 (AN515-44-R99Q) because her old MacBook is getting very long in the tooth. The new system comes with 16GB RAM, a 256GB NVMe SSD (with Windows 10), and a 1TB hard drive.

Her brother had recently picked up an Acer Aspire 3, and apart from some silliness with the secure boot options, it was a piece of cake to throw Linux Mint 20.2 on there.

The Nitro, however, has a new hybrid graphics setup, using the AMD Renoir chip for low-powered stuff, and an Nvidia GTX 1650 as the high-powered graphics engine. That’s the same card I have in my desktop, so I didn’t have any questions about whether it would run.

Well, the live disk installer wouldn’t get to the desktop using the “standard” setup. Using compatibility mode, however, I was able to get the desktop to appear. There was a little bit of wonkiness: the trackpad wasn’t recognized, apparently, but I threw an old Kensington trackball on there and was installing in no time.

I’ve learned from past experience that you sometimes have pain on first boot if you don’t install the extra media stuff right away, so I hooked up an ethernet cable and off we went.

Installation is a lot faster on this newer hardware than on a lot of machines I’ve worked with, but I still tend to walk away and let it churn after I’ve gotten it configured. (And hope I didn’t forget anything, only to come back and find it patiently waiting for input to start.)

Installation finished, I rebooted the machine, and … nothing but a black screen with a non-blinking cursor.

So, ctrl-alt-F2 to open a console, log in, and sudo apt update.

Now there are a bunch of upgrades, so sudo apt upgrade

To be honest, I’m about 50/50 at this point on whether I want to just install ssh before I get any deeper into the weeds (because the laptop is mounted two feet above my desk, to the right, and I could just ssh in instead of reaching over there to type the commands). However, I want to follow a “normal” process before I get to that, though I’m sure it will come soon.

Well, the upgrade stalled, so it’s ctrl-alt-del and let it reboot. It’s nice at this stage, because while it’s frustrating to have to redo things, at least we’re not worried about losing any data.

So, on this reboot, after getting into the console to log in again (because of the same black screen / non-blinking cursor issue), I’m going to sudo apt install openssh-server. This will allow me to connect to a console from my own computer, which will allow me to interact with the Nitro without stretching or getting out of my chair, and also will allow me to do other things while it’s going.

So, ssh lets me connect to the laptop even when the display is funky, and even if the keyboard on the laptop starts misbehaving. I can install and uninstall stuff, and even reboot if I need to. It’s one of the first tools I install on a computer, even if I intend to sit in front of it most of the time.
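If you want to check whether a machine already has the SSH server in place before reaching for apt, a trivial guard does it. The install command and the hostname in the comments are the usual Mint/Ubuntu forms; the hostname itself is made up.

```shell
#!/usr/bin/env bash
# Check whether an SSH server binary is present, with the install
# command left as a reminder (Mint/Ubuntu: sudo apt install openssh-server).
if command -v sshd >/dev/null 2>&1; then
    sshd_state="present"
else
    sshd_state="missing"
fi
echo "openssh-server: $sshd_state"
# Once the server is running, connect from another machine on the LAN
# (hostname is hypothetical):
#   ssh daughter@nitro.local
```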

At this point, the Nitro is behaving very badly, and even with moderate edits to the grub file that controls the boot process, it’s not allowing me to log in (graphically). I’m going to switch over to Ubuntu. If I recall correctly, it worked in early tests with this machine. If it shows any sign of trouble, I’ll install ssh first 🙂

There are a couple of reasons that I use Mint instead of Ubuntu, even though Mint is based upon Ubuntu.

The first is the default desktop. Ubuntu’s default Desktop Environment (GNOME, which replaced Unity) is clunky, wastes space, and is needlessly obstructive. That’s okay, I know I can install Cinnamon[1] after I install Ubuntu, and they actually have an installer that uses the Mate environment (which is not bad).

The second is more complicated. Ubuntu has really been pushing the Snap install infrastructure. While it sounds great, the more I delve into it, the more I agree with Mint’s developers that it is the kind of oppressive centralization that caused many of us to leave Apple and Microsoft.

The good news is that Ubuntu starts right up without needing compatibility mode, and the trackpad works. (I actually still prefer using the trackball, since it’s right on the desk next to me.) I installed using Mate, installed ssh, updated drivers, and everything worked. With that in mind, I took careful note of the settings (using the inxi -Fxz command) so that I can try to replicate them in Mint.

So, back to the Mint installer. As before, it only boots in compatibility mode. Bummer. Oh, well, let’s wipe that partition and get it installed again.

So, the install is done. We’re doing the first reboot… As expected, a black screen. Well, ctrl-alt-F2 still works, and after logging in I quickly install ssh.

One advantage of doing some of this work behind the screens is that I get to see the error messages dumped to the console. Wow, the nouveau driver is buggy on Mint Cinnamon with this hardware! One simple difference is the Linux kernel being used: Ubuntu is using kernel 5.11.0-34, while Mint is using 5.4.0-74. This should be relatively easy to test.

So, from the ssh session: sudo apt update (I actually already did this before installing ssh) and then sudo apt upgrade to apply the upgrades available. This can be kind of important because some things might break with the new kernel otherwise (not that we would notice, since it looks pretty broken as it is).
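Checking which kernel is actually running is one command, and on a Mint 20.x install the newer kernel series can be pulled in from Ubuntu’s hardware-enablement (HWE) meta-package. The package name below is my assumption for the 20.04 base Mint 20.x is built on, so verify it before installing.

```shell
#!/usr/bin/env bash
# Show the running kernel version, to compare across installs.
kernel="$(uname -r)"
echo "running kernel: $kernel"
# To move a Mint 20.x install onto the newer HWE kernel series
# (assumption: the Ubuntu 20.04 hardware-enablement meta-package):
#   sudo apt install linux-generic-hwe-20.04
```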

And, as it happens, Linux Mint 20.2 with kernel 5.11.0-34 still breaks under Cinnamon or Mate when using the Nvidia drivers. Oh, well, I’ll try to figure that out some other time — right now my daughter wants to use her computer.

So, I wipe the partition again, reboot to the Ubuntu Mate installer, and quickly run the install. I’ll have to get to the drivers at another time, but she did use the laptop during our D&D game today, so at least there’s that.

References

1. Mint’s default Desktop