  • Note that to some extent, this might have been a necessary step in computing becoming as popular as it is.

    Folks who remember how flexible and open-ended things were in the 90s are a tiny sliver of the population. Back then, about 1% of the world was participating in the internet; now the majority of the population does.

    I would have loved for the industry to keep up the trends of the 90s (AOL/Prodigy lost out to a federated internet, centralized computing yielded to personal computing) instead of going backwards (end-user devices becoming tethered to internet-hosted software, traffic consolidating onto relatively few domains, home-hosted sites being considered suspicious rather than normal), but this might have just been what it took for the wider population to be able to cope.


  • Gen X and older watched a younger generation born into technology that kind of worked, but was kind of janky. They saw kids figure out obscure VCR programming so they could record what they wanted, but only by navigating a very obtuse interface rendered entirely on a seven-segment display with a few extra static indicators. A teenager could play that new DOS game, but first they had to wrangle conventional memory, upper memory, EMS/XMS, and just the right set of TSRs, which meant mucking about with menu-driven config.sys/autoexec.bat setups tailored to each use case. Consumer electronics and computers of the time demanded a steep learning curve, but they could still do magic, leading to the trope in 80s and 90s media of tech wonder kids doing awesome stuff way better than the adults. Even if you had a super-advanced submarine and very smart people, you needed your teenage computer kid to outclass everyone.
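
    For flavor, this is the kind of thing those kids were hand-tuning: a hypothetical DOS 6 era CONFIG.SYS boot menu (the driver names and paths are made up for illustration; the directives and memory tricks are the real ones):

      REM Illustrative CONFIG.SYS; paths and drivers are hypothetical.
      [MENU]
      MENUITEM=GAMES, Boot lean for DOS games
      MENUITEM=NORMAL, Boot with everything loaded

      [COMMON]
      REM HIMEM provides extended memory (XMS); DOS=HIGH,UMB frees conventional RAM.
      DEVICE=C:\DOS\HIMEM.SYS
      DOS=HIGH,UMB

      [GAMES]
      REM EMM386 with RAM gives games expanded memory (EMS) plus upper memory for TSRs.
      DEVICE=C:\DOS\EMM386.EXE RAM
      DEVICEHIGH=C:\DOS\MOUSE.SYS

      [NORMAL]
      REM NOEMS skips the EMS page frame, leaving more upper memory for drivers.
      DEVICE=C:\DOS\EMM386.EXE NOEMS
      DEVICEHIGH=C:\DOS\SETVER.EXE

    Pick the wrong menu entry, or load one driver low instead of high, and the game would refuse to start for want of a few KB of conventional memory.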

    By now, we’ve made high-res touch screens cheap enough to embed in everything, along with embedded systems that would be the envy of a pretty high-end desktop from the year 2000, hardware capable of running far friendlier operating environments. The rather open-ended internet has largely baked in how participants get to play. The most common devices lock down what the user can do, because the user can’t be trusted not to break themselves with malware.

    The end result is that we may have the same proportion of people with deep technical skills, but a lot of people are now unimpressed. In the mid-90s, less than 1 percent of the population had direct internet experience; by 2008, 25% did. So even if 1% of the population is still really tech savvy, that leaves roughly 24 times as many non-savvy people who don’t need to marvel at the savvy ones, because they’re getting about what they want out of it on their own.


  • “All companies enshittify themselves eventually,”

    This isn’t a foregone conclusion, just the product of a few dynamics that make it increasingly likely:

    • When a place becomes “cool” and “rich”, it attracts job applicants who are all in on “get rich quick”, and as they rise to where they call the shots, they are ready to destroy the core value to pulp out some profit. Older companies like Intel, IBM, and Microsoft in particular have had time for that mindset to get promoted into position.
    • Failing that, investors can rapidly corrupt existing leadership, particularly at “startups”. It might start out sincere, but when a relatively small group finds itself looking at a multi-billion-dollar payday, it’s pedal to the floor, ruining things on the way.
    • Additionally, any major player not in a position to be part of the fun will make its own knock-off and market (or monopolize) its way into displacing the original. Oh look, Microsoft Teams…

    There’s a moderately successful, but also reasonably modest, company that went about 50 years without getting “enshittified”. The founder was passionate about it, kept the company private, and held on until his 80s, when he finally decided he really couldn’t do it anymore. Then the company went public, and the layoffs, price hikes, and cloudification immediately commenced, the floodgates of enshittification opening on his retirement.


  • Frankly, no idea. They talked the talk, but it’s vague enough that I wouldn’t be surprised if they cut out some important people and kept a lot of the bogus crap.

    Speaking from a company that has seen (less dramatic) rhetoric around similar circumstances: everyone on the ground who understood the nuance would have guessed certain projects were getting canned. Then it turned out leadership treated the money-losing projects as sacred cows and instead cut the profitable projects to the bone, putting them at risk rather than giving up the losers.


  • Part of the lackluster CPU problem is that Intel was pissing away its money on other adventures. CPUs were “in the bag”, so they kept spending money on other stuff to try to “create new markets”. Any casual observer knew the fundamental problem was simple: they got screwed on fabrication tech. Then they got screwed again as a lot of the heavy lifting moved to the ‘GPU’ half of the world and they were the only player with zero high-performance GPU product or credibility. But instead they took their investments in very different directions…

    For example, they did a lot to try to make Optane DIMMs happen, up to and including funding a bunch of evangelism to tell people they’d need to rewrite their software around entirely new methods of accessing data to make Optane DIMMs actually do any better than NAND+RAM. The problem was that treated like a disk, Optane was a little faster, but not dramatically so, and used like RAM it was WAY slower, so Intel pushed a vision of a whole new third set of data access APIs… The instant they realized they needed the entire software industry to fundamentally change data access patterns it had used for decades just to make one product work should have been the signal to kill the thing off, but they persisted.
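
    For the curious, that third set of APIs looked roughly like the sketch below, a minimal example assuming PMDK’s libpmem (the library Intel actually shipped for this); the mount path and payload are made up for illustration. Instead of read()/write() on a file, or plain volatile loads and stores, you map the device into your address space and flush explicitly:

      /* Minimal persistent-memory sketch using PMDK's libpmem.
         Assumes /mnt/pmem is a (hypothetical) DAX-mounted filesystem. */
      #include <libpmem.h>
      #include <stdio.h>
      #include <string.h>

      int main(void) {
          size_t mapped_len;
          int is_pmem;

          /* Map a file on the DAX filesystem directly into memory. */
          char *buf = pmem_map_file("/mnt/pmem/example", 4096,
                                    PMEM_FILE_CREATE, 0666,
                                    &mapped_len, &is_pmem);
          if (buf == NULL) {
              perror("pmem_map_file");
              return 1;
          }

          /* Write with ordinary memory stores; no syscall involved... */
          strcpy(buf, "a record that must survive power loss");

          /* ...then make it durable: cache-line flushes if this is real
             persistent memory, msync() fallback otherwise. */
          if (is_pmem)
              pmem_persist(buf, mapped_len);
          else
              pmem_msync(buf, mapped_len);

          pmem_unmap(buf, mapped_len);
          return 0;
      }

    Every one of those explicit persist points replaces a read()/write()/fsync() pattern the industry had used for decades, which is exactly the scale of rewrite that should have rung alarm bells.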

    See also their adventures in weird PCIe interconnects no one asked for (notably, they liked to demo a single NVMe drive being moved between servers, which cost way more than just giving each server another NVMe and moving the data over a traditional fabric). Or embedding FPGAs into CPUs that didn’t have the thermal budget for it, with no advantage over a discrete FPGA. Just a whole bunch of random-ass hardware and software projects with no connection to business results, regardless of how good or bad they were. Intel is a sucker for “build it, and they will come”.


  • I do a moderate amount of work with Intel, and I’d say the problem is not that the people are “shit”, it’s that the bureaucracy is so messed up. You have the people who actually engage with customers (support and sales), whom marketing largely ignores; marketing makes up stuff that isn’t in sync with the field guys, but that hardly matters, because the development executives then go off on their own “cool” ideas without any buy-in from support, sales, or marketing. That has real impact, but then you also have middle managers spooling up side projects with a dozen dedicated people each, adding another layer of effort totally disconnected from any business capability.

    So the end result is an admittedly qualified team toiling away on a project no potential customer will ever hear about, working on problems that someone “imagined” customers having but they never had, or that the industry already solves trivially, and the team doesn’t have the experience to know that. Even when the work is good and people might want it, it’s still doomed to obscurity, because there’s such a disconnect between the engineers and any actual communication with potential customers.