- cross-posted to:
- technology@lemmy.world
Edward Zitron has been reading all of Google’s internal emails that have been released as evidence in the DOJ’s antitrust case against Google.
This is the story of how Google Search died, and the people responsible for killing it.
The story begins on February 5th, 2019, when Ben Gomes, Google’s head of search, had a problem. Jerry Dischler, then the VP and General Manager of Ads at Google, and Shiv Venkataraman, then the VP of Engineering, Search and Ads on Google properties, had called a “code yellow” for search revenue due to, and I quote, “steady weakness in the daily numbers” and a likelihood that it would end the quarter significantly behind.
HackerNews thread: https://news.ycombinator.com/item?id=40133976
MetaFilter thread: https://www.metafilter.com/203456/The-core-query-softness-continues-without-mitigation
I was thinking of something slightly different. It would be automatic; a bit more like “federated Google” and less like old-style indexing sites. It’s something like this:
It would be vulnerable to SEO, but less so than Google, because SEO tailored to the algorithm used by one server won’t necessarily work well for another server.
Note, however, that this is “ideas guy” tier; I wouldn’t be surprised if it turns out to be unviable for some reason I’m not aware of.
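For illustration only, here’s a minimal sketch of the general shape of that idea, assuming each instance keeps its own small index and its own ranking function while a client fans a query out to several instances and merges the results. Every name in it (`Instance`, `term_overlap`, `search_federated`, and so on) is made up for this example, not taken from any real project.

```python
# Hypothetical "federated search" sketch: each instance ranks documents
# with its own scoring function; a client queries several instances and
# merges the results. None of this is an existing Lemmy or search API.

from dataclasses import dataclass


@dataclass
class Doc:
    url: str
    text: str


class Instance:
    """One federated search server with its own corpus and ranking rule."""

    def __init__(self, name, docs, scorer):
        self.name = name
        self.docs = docs
        self.scorer = scorer  # each instance can score however it likes

    def search(self, query, limit=5):
        scored = [(self.scorer(query, d), d) for d in self.docs]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [(self.name, score, doc) for score, doc in scored[:limit] if score > 0]


def term_overlap(query, doc):
    """Naive scorer: count occurrences of query terms in the document text."""
    return sum(doc.text.lower().count(t) for t in query.lower().split())


def title_bias(query, doc):
    """A different scorer: weight matches in the URL slug more heavily."""
    base = term_overlap(query, doc)
    slug_hits = sum(t in doc.url.lower() for t in query.lower().split())
    return base + 3 * slug_hits


def search_federated(instances, query, limit=10):
    """Client-side merge: collect results from every instance, re-sort globally."""
    results = []
    for inst in instances:
        results.extend(inst.search(query, limit))
    results.sort(key=lambda r: r[1], reverse=True)
    return results[:limit]


# Two instances with different ranking rules, merged for one query.
a = Instance("alpha", [Doc("https://a.example/rust-intro", "an intro to rust")], term_overlap)
b = Instance("beta", [Doc("https://b.example/rust-tips", "tips for writing rust")], title_bias)
print(search_federated([a, b], "rust intro"))
```

Because each instance scores differently here (`term_overlap` vs. `title_bias`), content tuned to game one scorer doesn’t automatically rank well on another, which is the property described above.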
I think you could do it in Lemmy itself combined with RSS feeds. The mods would curate a list of RSS feeds and use keywords to pick which entries a bot automatically posts (which means that if a programming blog did a post about windsurfing, it wouldn’t show up as long as the meta keywords didn’t match). Mods could take suggestions each week for feeds to add or remove.
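A rough sketch of what such a bot could look like, assuming Python with the feedparser library and using RSS category tags as a stand-in for the “meta keywords” mentioned above; the feed list, keyword set, and `post_to_lemmy()` stub are placeholders, not a real curation list or a real Lemmy API.

```python
# Hypothetical mod-curated RSS bot: pull curated feeds, keep only entries
# whose tags overlap the keyword list, and hand matches to a posting stub.

import feedparser

# Mod-curated feeds and keywords (placeholders for illustration).
FEEDS = [
    "https://example.com/programming.rss",
    "https://example.org/blog/feed.xml",
]
KEYWORDS = {"python", "rust", "compiler", "database"}


def matches(entry) -> bool:
    """Keep an entry only if its RSS tags overlap the mod keyword list,
    so an off-topic post (e.g. windsurfing on a programming blog) is
    skipped when its keywords don't match."""
    tags = {t.get("term", "").lower() for t in entry.get("tags", [])}
    return bool(tags & KEYWORDS)


def post_to_lemmy(title: str, url: str) -> None:
    # Placeholder: a real bot would authenticate against the instance
    # and create the post via Lemmy's API or a client library.
    print(f"Would post: {title} -> {url}")


def run_once() -> None:
    for feed_url in FEEDS:
        feed = feedparser.parse(feed_url)
        for entry in feed.entries:
            if matches(entry):
                post_to_lemmy(entry.get("title", "(untitled)"), entry.get("link", ""))


if __name__ == "__main__":
    run_once()
```

A real version would also need to remember which links it has already posted, so rerunning the bot on a schedule doesn’t duplicate posts.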