The LLM search engine
Ben Werdmuller tried Arc's new “AI”-based search and shares his concerns in Stripping the web of its humanity.
Like all these tools, it outputs falsehoods. But that isn't the worst issue, he explains. Without attribution, the tool gives a false sense of objectivity while hiding its bias:
If I search for “who should I follow in AI?” I get the usual AI influencers, with no mention of Timnit Gebru or Joy Buolamwini (who would be my first choices). If I ask who to follow in tech, I get Elon Musk. It undoubtedly has a lens through which it sees the world.
It's a particular kind of bubble where Elon Musk is worth following and Timnit Gebru doesn't get a mention (I'd very much recommend following her instead).
Ben also notes that when bots consume content instead of humans, that threatens the ecosystem of content and writing:
If we strip [payments or donations to writers] away, there’s no writing, information, or expression for the app to summarize.
Who's going to create the input these tools ingest to generate their output? Google faced various legal challenges over displaying excerpts from news outlets on its news site, but at least it quoted and attributed them while linking to the original. Automated summarization strips away any opportunity for writers to be paid (or even known) for their work.