“Filter out” is an arms race, and watermarking has very real limitations when it comes to textual content.
It doesn’t take people on the internet saying it, though; just an association between the name and people saying something, which happens to anyone who writes news articles about a topic.
The bots are not reliable summarizers like that. They often can’t tell the difference between the author and the subject of a piece of writing.
All of them. I can post other sites just fine; it’s only washingtonpost.com and wapo.st links that are blocked.
Or from your ISP. The Washington Post ran an article about that today, but links to them get blocked by some sort of filter on lemmy.world
I don’t think that’s tractable.
The scams effectively work as Alzheimer’s screening tools. Teaching will help some people, but not that many.
Depends on how close they can be made in watt-hours per kilo. They might be good enough for vehicles once the technology comes into reasonably widespread use, while avoiding a lot of the issues with trying to acquire sufficient lithium.
Sounds like it.
At some point, that car won’t be cost-effective to repair, and you’ll want to replace it. Be a lot better to have strong privacy legislation in place when that happens.
More that people’s movement data isn’t worth much, so it wouldn’t be a big deal to impose legal requirements on keeping it private.
This is timely in light of JD Vance’s comments about wanting to surveil the body of every woman in America. I just dropped a new investigation into car companies selling off your private location data to shady data brokers. The case for federal privacy legislation has never been stronger.
Per the article, it’s rather better than that.
Pretty much anything trying to predict human behavior is a heuristic; people using them as if they’ve got some kind of certainty is a problem.
My impression from the article is more that they’re not doing any kind of garbage-in assessment: nobody is making sure they’re getting answers about the right person (e.g., some women date more than one guy), some women don’t feel safe giving accurate answers to the police, and there aren’t good failsafes available for when it’s wrong; you’re forced to hire legal counsel and pursue a change via the courts.
He’s not mentioned as being one of them; just all the people around him. I figure he types in the address by hand to check in.
Not with the same detailed, minute-by-minute tracking of where you’ve been.
They’re also buying tracking data from phone apps, so you’d need to make sure you’re not running any of those either.
Yes — also non-native speakers of a language tend to follow similar word choice patterns as LLMs, which creates a whole set of false positives on detection.