Ain’t it so? I feel like the two confused posters are either demented LLM bots, or non-English speakers using an old style paper dictionary for literal translations from whatever language… Most likely bots though.
That made less than zero sense…
The article talks about LLM developers / operators. Not sure how you got from that to “current AI technologies” - a completely unrelated topic.
I really wish we'd get a truly advanced AI with reasonable resource consumption within my lifetime.
You only wish that for as long as it doesn’t happen. Have you looked at the world we live in? Such tools would be controlled by the same billionaire dipshits for their personal gain as all social media is being used already.
repeat after me: LLMs are not AI.
Not all thoughts are original.
Agreed, and I am also 100% opposed to SW patents. No matter what I wrote, if someone came up with the same idea on their own, and finds out about my implementation later, I absolutely do not expect them to credit me. In the use case you describe, I do not see a problem of using other people’s work in a license breaking way. I do however see a waste of time - you have to triple check everything an LLM spits out - and energy (ref: MS trying to buy / restart a nuclear reactor to power their LLM hardware).
Further, holding me at fault for LLM vendors who may be breaking copyright law is like trying to make a case for me being at fault for murder because I drive a car while car manufacturers lobby to the effect that more people die.
If you drive a car on “autopilot” and get someone killed, you are absolutely at fault for murder. Not in the legal sense, because fuck capitalism, but absolutely in the moral sense. Also, there’s legal precedent in a different example: https://www.findlaw.com/legalblogs/criminal-defense/can-you-get-arrested-for-buying-stolen-goods/
If you unknowingly buy stolen (fenced) goods and it's found out, you will have to return them to the rightful owner without getting your money back - money you would then have to try to recover from the vendor.
In the case of license agreements, you would still be a participant in a license violation - and if you consider a piece of code that would be well recognizable, just think about the following thought experiment:
Assume someone trained the LLM on some source code Disney uses for whatever. Your code gets autocompleted with that and you publish it, and Disney finds out about it. Do you honestly think that the evil motherfuckers at Disney would stop at anything short of having your head served on a silver platter?
Troll elsewhere, dipshit.
oh yeah, those for sure!
You are right. My apologies, and my congratulations for finding the correct “tone” to respond to me ;) The thing is, I am absolutely fed up with especially the bullshit about snake oil vendors selling LLMs as “AI”, and I am much more fed up with corporations on a large scale getting away with - since it’s for profit - what I guess must already be called theft of intellectual property.
When people then use said LLMs to “develop software”, I’m kind of convinced they are about as gone mentally as the MAGA cult and sometimes I just want to vent. However, I chose the word parasite for a reason, because it’s a parasitic way of working: they use the work of other people, which for more specific algorithms, an LLM will reproduce more or less verbatim, while causing harm to such people by basically copy-pasting such code while omitting the license statement - thereby releasing such code (if open source) into the “wild” with an illegally(*) modified license.
Considering on top of that the damage done to the environment by the insane energy consumption for little to no gain, people should not be using LLMs at all. Not even outside coding. This is just another way to contribute to missing our climate goals by a wide margin. Wasting energy like this - basically because people are too lazy to think for themselves - actually gets people killed due to extreme weather events.
So yeah, you have a valid point, but also, I am fed up with the egocentric bullshit world that social media has created and that has culminated in what will soon be a totalitarian regime in the country that once brought peace to Europe by defeating the Nazis and doing a PROPER reeducation of the people. Hooray for going off on a tangent…
Ah, I guess I’ll have to question why I am lying to myself then. Don’t be a douchebag. Don’t use open source without respecting copyrights & licenses. The authors are already providing their work for free. Don’t shit on that legacy.
That statement is as dumb as it is nonsensical.
I know both LLM mechanisms better than you, it would appear, and my point is not so weak that I would have to fabricate a strawman, claim it is what you said, and then proceed to argue against the strawman.
Using LLMs trained on other people’s source code is parasitic behaviour and violates copyrights and licenses.
Fuck Microsoft. But first, Fuck Google, Fuck Amazon, Fuck Facebook. I am sure I could think of a few more that are worse than Microsoft, as bad as they are.
So you use other people’s open source code without crediting the authors or respecting their license conditions? Good for you, parasite.
Mobile one absolutely. And they make humans more stupid and the world worse.
So how good did I think my point was? VR is an artificial hype, especially at a time when almost all major game releases are lacking in story and already put way too much money into graphical effects. It's a gadget.
let me be the one to say: the only people who “need” VR are those earning their money with selling VR products. No one else in the whole wide world actually needs VR.
Ease of adoption (or appearance of)
Thank you for acknowledging that point. Since Win7 or so, almost all major Linux distributions have been shitloads easier to learn than any Windows environment, no matter how unfamiliar you are with Linux. Basically, all major desktop environments behave like an optimized WinXP desktop.
“I aimed my rifle at that person’s head and pulled the trigger, but I swear I didn’t want them to die”
Tesla should be broken up and reassembled with zero overlap in management.
And yes, legally it won’t stick, but the shitty south african oligarch should absolutely be tried for murder.
Not at all. AI is something that uses rules, not statistical guesswork. A simple control loop is already basic AI, but the core mechanism of LLMs is not (the parts before and after token association/prediction are). Don't fall for the marketing bullshit of some dumbass Silicon Valley snake oil vendors.
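To make the "rules, not statistics" distinction concrete, here's a minimal sketch of a rule-based control loop - a toy thermostat. The target temperature, step sizes, and function names are made up for illustration; the point is only that every decision comes from an explicit rule, not from token prediction.

```python
def thermostat_step(current_temp: float, target: float = 21.0) -> str:
    """Decide an action from explicit rules - no statistics involved."""
    if current_temp < target - 0.5:
        return "heat"
    elif current_temp > target + 0.5:
        return "cool"
    return "idle"

# Sense, decide by rule, act - a classic feedback cycle.
temp = 18.0
actions = []
for _ in range(5):
    action = thermostat_step(temp)
    actions.append(action)
    # Crude simulation of the environment responding to the action.
    if action == "heat":
        temp += 1.0
    elif action == "cool":
        temp -= 1.0

print(actions)  # the loop heats until the 21.0 target band is reached, then idles
```

Trivial as it is, this closed loop of sensing and rule-driven action is the kind of mechanism the post counts as basic AI, in contrast to the purely statistical token prediction at the core of an LLM.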