But OS X, macOS, and at least one Linux distro are/were UNIX certified.
The network gear I manage is only accessible via VPN, or from a trusted internal network…
…and by the gear I manage, I mean my home network (a router and a few managed switches and access points). If a doofus like me can set it up for my home, I’d think that actual companies would be able to figure it out, too.
Add to that photo editing (as much as GIMP is great…). I would guess DAW and video editing would fall under that category, too…and good luck finding many AAA open source games.
IIRC Torvalds uses Fedora.
(Debian for me.)
Remote backup server would be my suggestion.
Configure it with a VPN to talk to your home network and set it up at a trusted friend's or family member's place.
I do this with a raspberry pi and an external HDD that takes daily/weekly/monthly snapshots, with daily rsync. Works nicely for me.
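A rough sketch of what the daily job can look like, using rsync's --link-dest so unchanged files are hard-linked between snapshots (the paths, the backup user, and the 10.0.0.2 VPN address below are placeholders, not my actual setup):

```
#!/bin/sh
# Daily snapshot over the VPN; run from cron on the Pi.
# /mnt/backup is the external HDD, 10.0.0.2 is the home server's VPN address.
DATE=$(date +%F)
rsync -a --delete \
  --link-dest=/mnt/backup/daily/latest \
  backup@10.0.0.2:/home/ \
  "/mnt/backup/daily/$DATE/"
# Point "latest" at the snapshot we just made so tomorrow's run links against it.
ln -sfn "/mnt/backup/daily/$DATE" /mnt/backup/daily/latest
```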
I’m guessing it’s because the developers either have a different speciality that they focus on, are employed to support specific hardware, or both.
Just use your $200+ Fluke to check the batteries, problem solved.
It’s mostly so that I can have SSL handled by nginx (and not per-service), and also for ease of hosting multiple services accessible via subdomains. So every service is its own subdomain.
Additionally, my internal network (as in, my physical LAN) does not have any port forwarding enabled — everything is over WireGuard to my VPS.
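Roughly, each subdomain is just its own nginx server block on the VPS — something like this (the service name, cert paths, address, and port are made up, not my actual config):

```
# Hypothetical nginx server block on the VPS for one service.
# wiki.example.com, the cert paths, and 10.0.0.2 are placeholders.
server {
    listen 443 ssl;
    server_name wiki.example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        # Home server, reached over the WireGuard tunnel.
        proxy_pass http://10.0.0.2:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```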
My method:
VPS with a reverse proxy to my public-facing services. This holds the SSL certs, and communicates with my home network through a WireGuard link configured on my router.
Local computer with reverse proxy for all services. This also has SSL certs, and handles the same services as the VPS, so I can have local/LAN speeds. Additionally, it serves as a reverse proxy for all my private services, such as my router/switches/access point config pages, Jellyfin, etc.
No complaints, it mostly just works. I also have my router override DNS entries for my FQDN to resolve locally, so I use the same URL for accessing public services on my LAN.
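The DNS override is the usual split-horizon trick; on a dnsmasq-based router it can be as small as this (domain and address are placeholders):

```
# Hypothetical dnsmasq snippet on the router: anything under mydomain.com
# resolves to the local reverse proxy instead of the VPS's public IP.
address=/mydomain.com/192.168.1.10
```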
We tend to use between 3kWh (vacation/idle power consumption) and around 8kWh per day. If we switched to an electric stove, water heater, and heat pump, and added a hot tub, that'd increase substantially. But if we added solar (on our long to-do list…), the battery in the article (60kWh) would probably be able to handle all our storage needs, and it'd fit in the garage (bonus if it can be placed outside/under a deck!). I live in a major city, but I would absolutely love to effectively be off grid.
Exciting stuff — it seems these are touted as being extremely robust/safe, which is of course important for me if it’s going to be in/near our house. Storage density not a huge concern, but price is somewhat important — let’s hope this sort of thing ticks all the boxes.
And your VPN connection to work knows your endpoint…
Interestingly, there’s another way of finding out if your coworker is in the office — just walk over to their desk.
Getting TLS certs will be complicated
I just use Let’s Encrypt with a wildcard domain — same certs for public and private facing domains. I’m sure this isn’t best practice, but it’s mostly just for me so I’m not too worried :)
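If it helps anyone: wildcard issuance has to go through the DNS-01 challenge, so it looks roughly like this (mydomain.com is a placeholder, and a DNS-provider plugin can automate the TXT record instead of --manual):

```
# Rough sketch: wildcard certs need the DNS-01 challenge. --manual makes you
# add the _acme-challenge TXT record yourself; DNS plugins can automate it.
certbot certonly --manual --preferred-challenges dns \
  -d 'mydomain.com' -d '*.mydomain.com'
```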
Yeah I don’t expose Jellyfin over the Internet, so it doesn’t matter for me, and wouldn’t work at all over WAN (unless VPN’d to home network).
Also, it’s all reverse proxied, and there’s nothing preventing having two Jellyfin hostnames, e.g., jf-local.mydomain.com and jf-public.mydomain.com.
Another fun trick you can play is to use a private IP on your public DNS records. This is useful for Jellyfin on Chromecast for instance — it uses 8.8.8.8 for DNS lookup (and ignores your router settings), so it wants a fully qualified domain name. But it has no problem accessing local hosts, so long as the address comes from 8.8.8.8’s record.
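In zone-file terms it's just a normal public record whose answer happens to be a private address (name and address made up):

```
; Public DNS record that anyone (including 8.8.8.8) can resolve,
; but the address only means something on my LAN.
jellyfin.mydomain.com.   300   IN   A   192.168.1.50
```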
I have set up local DNS entries (with Pi-Hole) to point to my server, but I don’t know if it’s possible to get certs for that, since it is not a real domain.
So long as your certs are for your fully qualified domain there’s no problem. I do this, as do many people — mydomain.com is fully qualified, but on my own network I override the DNS to the local address. Not a problem at all — the cert is tied to the hostname, not the IP.
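For the Pi-hole side, if I remember right the Local DNS records on Pi-hole v5 are just hosts-file lines in /etc/pihole/custom.list, so something like this is enough (address and name are placeholders):

```
# /etc/pihole/custom.list — hosts-file format: <LAN IP> <FQDN>
192.168.1.10 jellyfin.mydomain.com
```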
The only flaw in Corel’s logic was that as soon as you’re running Linux, you lose all desire to run WordPerfect, and develop an irresistible need to align yourself with vim or emacs…
My university was pretty zen about this — essentially, “don’t use your own access point/router please. But if you do, please talk to your resident (University employed) student IT rep and they can probably help you set it up correctly.”
…but was it the “Windows Uninstall” button…or the “format /dev/sda1 as ext4” button?
I think (?) it’s generally true that the root user should never mess with users’ files.
Imagine your home directory is shared across many systems on a network (my alma mater did this). It would be really bad if a sysadmin for alpha.university.edu removed a program, and suddenly your personal settings were removed from beta.university.edu — even though that computer still has the program.
This is one of the “UNIX on the desktop” issues — a lot is designed for a sysadmin/multiuser situation, and it has some gotchas when using it as a desktop machine (I’m used to/really appreciate the directory structure and settings management at this point, but it may take some getting used to).
EulerOS, a Linux distro, was certified UNIX.