Your comment made my day. Thanks.
Anyone spreading this misinformation and trying to gatekeep being an artist after the avant-garde movement doesn’t have an ounce of education in art history. Generative art, warts and all, is a vital new form of art that’s shaking things up, challenging preconceptions, and getting people angry - just like art should.
Entertainment.
Their policy could never stop anyone in the first place.
Using copyrighted works without permission isn’t illegal and shouldn’t be. You should check out this article by Kit Walsh, a senior staff attorney at the EFF, and this open letter by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.
Someone dumb enough could easily flatten someone backing up with that bug.
Or just not show people what you’re typing.
I can’t tell if this is a joke or not.
A computer like that is useful outside of work. I’d pay for it out of pocket if I had to.
The only thing I got from this is that bro loves ads more than anything in the world.
I accept that regulations are real, but not every way of helping people requires dealing with them. I’m still waiting on that proof, by the way.
There are more ways to help people than making medical software. Rather than acknowledging they could focus on simpler things, you automatically jumped to all projects running afoul of FDA regulations, which is pretty telling. All while you still haven’t named a single project halted by FDA order.
Which projects have been shut down by FDA order?
Open source AI is huge, and I don’t think you need FDA approval to distribute a model. Where are you even getting that from?
What about open source projects?
This isn’t just about research into AI; what some people want would impact all research, criticism, analysis, and archiving. Please re-read the letter.
You should read this letter by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.
Why are scholars and librarians so invested in protecting the precedent that training AI LLMs on copyright-protected works is a transformative fair use? Rachael G. Samberg, Timothy Vollmer, and Samantha Teremi (of UC Berkeley Library) recently wrote that maintaining the continued treatment of training AI models as fair use is “essential to protecting research,” including non-generative, nonprofit educational research methodologies like text and data mining (TDM). If fair use rights were overridden and licenses restricted researchers to training AI on public domain works, scholars would be limited in the scope of inquiries that can be made using AI tools. Works in the public domain are not representative of the full scope of culture, and training AI on public domain works would omit studies of contemporary history, culture, and society from the scholarly record, as Authors Alliance and LCA described in a recent petition to the US Copyright Office. Hampering researchers’ ability to interrogate modern in-copyright materials through a licensing regime would mean that research is less relevant and useful to the concerns of the day.
Have you read this article by Cory Doctorow yet?
Fair use isn’t a loophole; it is copyright law.