Maintaining perl scripts from the 90s is my ballpark!
Mind, I did write some of them, and they’re still whirring away, which makes it a pretty easy job. Perl’s lack of breaking changes is its greatest strength.
If this was just about the X/Twitter accounts, then X could just suspend them.
Neither. Cinnamon on Debian. Has just enough bling to be pretty and still manages not to be fat, and pretty similar to both your choices.
It won’t be that simple.
For starters, you’re assuming a t-zero response. It’ll likely be a week before people worry enough that LE isn’t coming back and actually act. Then they have to find someone else for, possibly, the hundreds or thousands of certs they’re responsible for. Set up processes with them. Hope that this new provider is able to cope with the massive, MASSIVE surge in demand without falling over themselves.
And that’s assuming your company knows all its certs. That they haven’t changed staff and lost knowledge, or outsourced IT (in which case the provider is likely staggering under the weight of all their clients demanding instant attention) and all that goes with that. Automation is actually bad in this situation because people tend to forget how stuff was done until it breaks. It’s very likely that many certs will simply expire because they were forgotten about, and the first some companies will know of it is when customers start complaining.
LetsEncrypt is genuinely brilliant, but we’ve all added a massive single point of failure into our systems by adopting it.
(Yeah, I’ve written a few disaster plans in my time. Why do you ask?)
Sleeping too well lately? Consider this:
If LetsEncrypt were to suffer a few weeks outage, how much of the internet would break?
It’s the Sharepoint of chat.
Why would I care enough to try and discredit you on any grounds other than what you’ve written here? I don’t know you, and I don’t care about you beyond what I’ve read in this thread, where you come across as arrogant and the aggressor. Not quite the innocent party you’re trying to project.
Don’t worry about replying, I’m going to use Lemmy’s block user system. Not used it before, but I think it’s the best way to deal with someone I have a disagreement with and don’t want to talk to any further, rather than wasting others’ time with vexatious development requests.
EDIT: I’m gonna open an issue so Lemmy lets OPs edit and delete comments on their posts. The amount of argument on here is too bad for a standard centralized moderation model.
Not only do you insult a game that many people have a huge amount of love for, for the weakest reason possible - then you get all salty because people disagree with you.
And THEN, you complain to the developers that you should be able to delete other people’s content that you disagree with?
Seriously, get some perspective and stop being a douche. Please.
but the target audience was just presented factorio 2.0
I totally missed that this was a thing. Saw this and then bought the DLC yesterday at 7pm, figuring I’d get a couple of hours in before going to bed. (I get up early)
Yeah, right. 4am this morning and I was still playing.
Factorio is amazing.
I think your reply would have been more useful if you’d given some pointers about how, instead of just “do it right”.
It’s fine, but not going to be the cheapest.
Cheap to buy: Any old PC desktop, really. Most will run linux and windows fine, depending on what you want. Anywhere from free to £100. If you have an old desktop or laptop already, use that to start with.
Cheap to run: Any mini PC. I run a Lenovo ThinkCentre M53 for low power duties. Cost £40 and runs silently at 10 watts, idle. (I have a secondary, much beefier server for other stuff that runs at around 100 watts, which lives in the garage.)
But plenty of people do run mac minis as home servers, often on Linux. They’re fine - just do your homework on the CPU ability, how much ram you can add, and whether you’re okay with external disks if you can’t fit enough inside.
As far as I can read from that, they’re still maintainers, just have had their credit removed from the contributors page, no?
Still a strange thing to do, and I look forward to an explanation.
The bar chart might be more useful if they weighted each source by its number of users. Facebook isn’t 7 times more hateful than Telegram; it has around 3.5 times as many users – but the two are also used very differently. I use Telegram, but only as a free messaging platform for automated alerts.
Then there are the algorithms, which tend to feed you what you engage with and what comes from the connections you’ve made there. The recent exception is X, which has a very strong political bias and has turned into something that actively pushes hate.
Fair play - it’s good that there’s choice and if it works for you, great. I also totally get the fun of building something yourself.
The local storage is a big one if you don’t have a NAS or home server on the network. Although if you’re linked into the *arrs, you probably have one already. It’s nice when new episodes just turn up automagically in Jellyfin.
I tried Kodi before but I found the commercialisation of it very jarring. Jellyfin is entirely free - your fifth point might give it extra credit for that. The Jellyfin app doesn’t (afaik) feed any info to anyone, but you do need to load it from the Amazon fire menu, so you can’t entirely skip their advertising. It is the only thing I use the fire stick for, and the price is cheap compared to anything else - it cost £25 and works on any TV. Being a dongle, there’s no noise either.
Why not? It’s a computer that displays tv? At 4k, 5.1 audio, that’s not too shabby, no?
I made a PC specifically for streaming video back before sticks were a thing, but it was expensive, noisy and not very good in comparison and I don’t miss it. What about a stick is inferior to what you’re talking about? Genuine question - educate me, please. What software, what hardware, why choose it over something else?
Using the native Jellyfin app available for Amazon’s fire tv stick.
Not really suitable for a Home Theatre PC
Not sure where you got that idea, but it’s absolutely what I use it for. That I can also watch content from multiple sources as well is part of the appeal. Plus no constant upsell like Kodi and Emby.
Scuse the cut and paste, but this is something I recently thought quite hard about and blogged, so stealing my own content:
What to back up?
This is a core question to ask when you start planning. I think it’s quite simply answered by asking the secondary question: “Can I get the data again?”
Don’t back up stuff you downloaded from the public internet unless it’s particularly rare. No TV, no Movies, no software installers. Don’t hoard data you can replace. Do back up stuff you’ve personally created and that doesn’t exist elsewhere, or stuff that would cause you a lot of effort or upset if it wasn’t available. Letters you’ve written, pictures you’ve taken, code you authored, configurations and systems that took you a lot of time to set up and fine tune.
If you want to be able to restore a full system, that’s something else and generally dealt best with imaging – I’m talking about individual file backups here!
Backup Scenario
Multiple household computers. Home linux servers. Many services running natively and in docker. A couple of windows computers.
Daily backups
Once a day, automate backups of your important files.
On my linux machines, that’s directories like /etc, /root and /docker-data, plus some shared files.
On my windows machines, that’s some mapping data, word documents, pictures, geocaching files, generated backups and so on.
You work out the files and get an idea of how much space you need to set aside.
Then, with automated methods, have these files copied or zipped up to a common directory on an always-available server. Let’s call that /backup.
These should be versioned, so that older ones get expired automatically. You can do that with bash scripts, or automated backup software (I use backup-manager for local machines, and backuppc or robocopy for windows ones)
How many copies you keep depends on your preferences – 3 is a sound number, but choose what you want and what disk space you have. More than 1 is a good idea since you may not notice the next day if something is missing or broken.
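To make that concrete, here’s a rough bash sketch of the sort of daily job I mean. The paths, the directory list and the keep-3 retention are just examples to adapt, not a fixed recipe:

```bash
#!/bin/bash
# Daily backup sketch: archive a few important directories into /backup,
# keeping only the most recent copies. Paths and retention are illustrative.
set -euo pipefail

BACKUP_DIR="/backup/$(hostname)"
KEEP=3
STAMP="$(date +%F)"

mkdir -p "$BACKUP_DIR"

# Archive the directories worth keeping (adjust the list to your own machine).
tar -czf "$BACKUP_DIR/files-$STAMP.tar.gz" /etc /root /docker-data

# Expire old versions: keep the newest $KEEP archives, delete the rest.
ls -1t "$BACKUP_DIR"/files-*.tar.gz | tail -n +$((KEEP + 1)) | xargs -r rm --
```

Drop that into a daily cron job (or a systemd timer) and the “once a day, automated” part is covered.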
Monthly Backups – Make them Offline if possible
I puzzled a long time over the best way to do offline backups. For years I would manually copy the contents of /backup to large HDDs once a month. That took an hour or two for a few terabytes.
Now, I attach an external USB hard drive to my server, with a smart power socket controlled by Home Assistant.
This means it’s “cold storage”. The computer can’t access it unless the switch is turned on – something no ransomware knows about. But I can write a script that turns on the power, waits a minute for it to spin up, then mounts the drive and copies the data. When it’s finished, it’ll then unmount the drive and turn off the switch, and lastly, email me to say “Oi, change the drives, human”.
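As a sketch – the Home Assistant address, token, switch entity, drive label, mount point and email address below are all placeholders, so substitute your own – the script looks something like this:

```bash
#!/bin/bash
# Cold-storage backup sketch: power the USB drive on via a Home Assistant
# smart socket, copy the staged backups across, then power it off again.
# All names here are illustrative; it also assumes a working local mail setup.
set -euo pipefail

HA="http://homeassistant.local:8123"
TOKEN="YOUR_LONG_LIVED_ACCESS_TOKEN"
SWITCH="switch.backup_drive"
DEV="/dev/disk/by-label/OFFLINE_BACKUP"
MNT="/mnt/offline"

ha_switch() {  # call Home Assistant's REST API to turn the socket on or off
  curl -s -X POST "$HA/api/services/switch/turn_$1" \
       -H "Authorization: Bearer $TOKEN" \
       -H "Content-Type: application/json" \
       -d "{\"entity_id\": \"$SWITCH\"}" > /dev/null
}

ha_switch on
sleep 60                             # give the drive a minute to spin up

mount "$DEV" "$MNT"
rsync -a --delete /backup/ "$MNT/"   # copy the staged backups to the offline disk
umount "$MNT"

ha_switch off
echo "Offline backup finished - swap the drives, human" \
  | mail -s "Monthly backup done" you@example.com
```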
Once I get that email, I open my safe (fireproof and in a different physical building) and take out the oldest of three USB caddies. Swap that with the one on the server and put that away. Classic Grandfather/Father/Son backups.
Once a year, I change the oldest of those caddies to “Annual backup, 2024” and buy a new one. That way no monthly drive will be older than three years, and I have a (probably still viable) backup by year.
BTW – I use USB3 HDD caddies (and do test for speed – they vary hugely) because I keep a fair bit of data. But you can also use one of the large capacity USB Thumbdrives or MicroSD cards for this. It doesn’t really matter how slowly it writes, since you’ll be asleep when it’s backing up. But you do really want it to be reasonably fast to read data from, and also large enough for your data – the above system gets considerably less simple if you need multiple disks.
Error Check: Of course with automated systems, you need additional automated systems to ensure they’re working! When you complete a backup, touch a file to give you a timestamp of when it was done – online and offline. I find using “tree” to catalogue the files is worthwhile too, so you know what’s on there.
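In script terms, that can be as small as this, tacked onto the end of the offline job (same illustrative /mnt/offline mount point as above):

```bash
# Leave a timestamp marker where you can see it, online and offline,
# plus a catalogue of what actually went onto the drive.
date +"%F %T" | tee /backup/LAST_BACKUP > /mnt/offline/LAST_BACKUP
tree -a /backup -o /mnt/offline/CONTENTS.txt   # catalogue of what was copied
```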
Lastly – test your backups. Once or twice a year, pick a backup at random and ensure you can copy and unpack the files. Ensure they are what you expect and free from errors.
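A spot-check can be as simple as this (assuming your daily backups are .tar.gz archives, as in the earlier sketch):

```bash
# Pick one archive at random and make sure it unpacks without errors.
PICK="$(find /backup -name '*.tar.gz' | shuf -n 1)"
echo "Testing $PICK"
tar -tzf "$PICK" > /dev/null && echo "OK" || echo "PROBLEM - investigate!"
```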
No sign of that happening yet, especially with the results of a certain election in a certain country.