• 2 Posts
  • 965 Comments
Joined 1 year ago
Cake day: June 18th, 2023


  • Buffalox@lemmy.world to Technology@lemmy.world · The GPT Era Is Already Ending · edited 3 days ago

    You wouldn’t need infinite time if you had infinite monkeys.

    Obviously, but as I wrote, BOTH are impossible, so it’s irrelevant. I just didn’t think I’d have to explain WHY infinite monkeys are impossible, while some might think the universe is also infinite in time, which it is not.

    I also already wrote that an infinite random string contains everything.
    But even with infinite monkeys it’s not instant, because technically each monkey still needs to finish a page.

    But I understand what you mean, and that’s exactly why the theorem is so stupid IMO. You could just as well have 1 monkey and infinite time.
    But both are still impossible.

    When I say it’s stupid, I don’t mean as a thought experiment, which is its purpose. The stupid part is when people think they can use it as an analogy or example to describe something real.


  • Buffalox@lemmy.world to Technology@lemmy.world · The GPT Era Is Already Ending · edited 3 days ago

    Infinite monkeys and infinite time are equally stupid, because obviously you can’t have either, for the simple reason that the universe is finite.
    And apart from that, it’s stupid because an infinite random string contains EVERYTHING!

    I’m sorry, it just annoys the hell out of me, because it’s a thought experiment, and it’s stupid to use it as an analogy or example to describe anything in the real world.


  • Buffalox@lemmy.world to Technology@lemmy.world · The GPT Era Is Already Ending · edited 4 days ago

    It’s a great article IMO, worth the read.

    But:

    “This is back to a million monkeys typing for a million years generating the works of Shakespeare,”

    This is such a stupid analogy; the chance of the monkeys accidentally matching even a single full page is practically zero.
    To type a simple 6-letter word like “stupid”, there are 26⁶ = 308,915,776 possible letter combinations for that one word alone.
    A page has about 2000 letters, giving 26²⁰⁰⁰ ≈ 9e+2829 combinations, a number with about 2830 digits. And that’s disregarding punctuation, capital letters, special characters and numbers.
    A million monkeys times a million years times 365 days times 24 hours times 60 minutes times 60 seconds times 10 random keystrokes per second is only 315,360,000,000,000,000,000, or about 3.15e+20 attempts, assuming none are repeated. That’s only 21 digits, leaving it roughly 2809 digits short of covering a single page even once.
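    The arithmetic above can be sanity-checked in a few lines. This is a rough sketch assuming a 26-letter lowercase alphabet, ~2000 letters per page, and the monkey count and typing rate used above:

    ```python
    import math

    ALPHABET = 26        # lowercase English letters only, no punctuation
    PAGE_LETTERS = 2000  # rough number of letters on one page

    # Combinations for one specific 6-letter word like "stupid"
    word_combos = ALPHABET ** 6
    print(word_combos)  # 308915776

    # Number of digits in 26**2000, the combinations for one full page
    page_digits = math.floor(PAGE_LETTERS * math.log10(ALPHABET)) + 1
    print(page_digits)  # 2830

    # Total attempts: 1e6 monkeys * 1e6 years * 10 keystrokes per second
    attempts = 10**6 * 10**6 * 365 * 24 * 60 * 60 * 10
    print(attempts)  # 315360000000000000000, i.e. 21 digits

    # How far short the attempts fall of the page combination count
    print(page_digits - len(str(attempts)))  # 2809
    ```

    Even with generous assumptions, the attempt count is thousands of orders of magnitude short of the search space for a single page.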

    I’m so sick of seeing this analogy, because it misses the point by an insane margin. It is extremely misleading, and completely misrepresents the odds of getting something very complex right by chance.

    To generate a work of Shakespeare by chance is impossible in the lifespan of this universe. The mathematical likelihood is so staggeringly low that it’s considered impossible by AFAIK any scientific and mathematical standard.




  • Corporate ownership, but you can have that and still be generally accepted in the community, like Fedora when controlled by Red Hat, or SUSE when controlled by Novell.
    One of the real problems is their dual-license policy for their open source projects, which grants Canonical a full license and the power to close an open source project if they want. This is decidedly against the spirit of the GPL, but can be done with dual licensing.
    Another problem is the “not invented here” mentality, which undermined Wayland for instance.
    Ultimately the problem is, I guess, that Canonical is (was?) trying to make key Linux technologies exclusive to Ubuntu, with Canonical controlling them. Seemingly an effort to reduce other Linux distros to second-rate players.
    Another example of that (apart from dual licensing and Mir) is their package system Snap, which is open source on the client side but proprietary on the server side.
    Obviously it’s not a good idea for Linux to use proprietary package systems.

    These are of course ideological issues; if you don’t give a shit about those, I suppose Ubuntu is mostly OK. Except for minor annoyances like media not working out of the box, and the PPA system sucking.





  • This is absolutely about cars MADE in China.

    hugely high-end and out of the price range for most people.

    There are lots of cheaper options in Europe now, and more coming all the time.
    The new thing is that they are getting better range, because batteries are getting cheaper.
    There are lots of excellent options around 30K EUR/USD.

    And cars like the Renault 5, expected in January, are coming in just below 20,000 EUR, making it basically also 20k USD here in Denmark.
    Currently there are a couple of options just below 22k EUR, but as stated, more are coming.

    The USA has chosen politically to have less competition by adding a 100% import tax to cars from China, so of course prices are higher in the USA.



  • How much taxpayer money went to tesla?

    Oh boy, Tesla is receiving taxpayer money through so many channels.
    First they got cheap federal loans to build the company, and they got massive direct funding for building the charging grid.
    EVERY single car sold receives a tax credit. And finally, other brands have to pay Tesla for regulatory credits, because those are designed to favor and promote electric cars too.

    So Tesla is indeed a massively subsidized company.




  • what’s perhaps most striking about GenCast is that it requires significantly less computing power than traditional physics-based ensemble forecasts like ENS. According to Google, a single one of its TPU v5 tensor processing units can produce a 15-day GenCast forecast in eight minutes. By contrast, it can take a supercomputer with tens of thousands of processors hours to produce a physics-based forecast.

    If true this is extremely impressive, but this is their own evaluation, so it may be biased.


  • Yes, Chrysler is part of Stellantis, and once upon a time, way way back in what has since been called the 80’s, Chrysler was nearly bankrupt, but a savior came to Chrysler by the name of Lee Iacocca, the mastermind behind the Ford Mustang. He came to Chrysler, saw all that was bad, and fixed it. He undertook to finish a bold new type of car in the Dodge Caravan, which became hugely successful and saved Chrysler. Chrysler went on to become so successful they were even able to buy up other companies like AMC, which also owned Jeep.

    Ah well, as a European I know little about Chrysler today, but I have fond memories of once admiring mostly everything American, and Lee Iacocca and Jack Tramiel are probably the two business leaders I respect the most of all time.

    Sorry to hear Chrysler is now considered safe to bet against. But sadly Stellantis has been shit for some years now.
    Stellantis has many traditionally popular European brands, like Citroën, Peugeot, FIAT, Opel (Vauxhall in the UK), Alfa Romeo and Lancia. And AFAIK all the brands are doing poorly.


  • Funny, the Radeon RX 480 came out in 2016 at a similar price. Is that a coincidence?
    Incidentally, that was the last great generation offering a midrange GPU at a midrange price. The Nvidia 1060 was great too, and the 1080 is arguably one of the best-value cards of all time. Since then everything has been overpriced.

    The RX 480 was later replaced by the RX 580, which was a slight upgrade at great value too. But then the crypto wave hit, and soon a measly 580 cost nearly $1000! Things have never quite returned to normal since. Too little competition with only Nvidia and AMD.


  • I think it’s looking pretty grim

    Absolutely, but for some reason Intel has a history of failing in new areas. Their attempt at the high end with Itanium was really bad, and their attempt at RISC, which mostly ended up in SCSI controllers, was a failure too. Atom failed to be competitive against Arm. Their attempts at compute for the data center have failed for decades against Nvidia; it’s not something that just happened recently. And they tried in the 90’s with a GPU that was embarrassingly bad and failed too.

    They actually failed against the AMD Athlon too, but back then they controlled the market, and managed to keep AMD mostly out of it.
    When the Intel 80386 came out, it was actually slower than the 80286! When the Pentium came out, it was slower than the i486. When the Pentium 4 came out, it was not nearly as efficient as the Pentium 3. Intel has a long history of subpar products. Typically every second design by Intel had much worse IPC, so much so that it was barely compensated for by the higher clocks of a better production process. So in principle every second Intel generation was a bit like the AMD Bulldozer, but where for AMD one such mistake almost crashed the company, Intel managed to keep profiting even from subpar products.

    So it’s not really a recent problem; Intel has a long history of intermittently not being a very strong competitor or very good at designing new products and innovating. And now they’ve lost the throne even on x86! Because AMD beat the crap out of them with chiplets, even though the per-core speed of the original Ryzen was a bit lower than what Intel had.

    What kept Intel afloat and hugely profitable when their designs were inferior was that they were always ahead on the production process. That was until around 2016, when Intel lost the lead because their 10nm process never really worked and suffered multiple years of delays.

    Still, back then Intel always managed to come back, like they did with Core 2, and the brand and the x86 monopoly were enough to keep Intel very profitable, even through major strategic failures like Itanium.