You may not know exactly what “slop” means in relation to artificial intelligence. But on some level you probably do.
Slop, at least in the fast-moving world of online message boards, is a broad term that has gained traction as a way to describe shoddy or unwanted A.I. content in social media, art, books and, increasingly, search results.
Google suggesting that you could add nontoxic glue to make cheese stick to a pizza? That’s slop. So is a low-priced digital book that seems like the one you were looking for, but isn’t quite. And those posts in your Facebook feed that seemingly came from nowhere? They’re slop as well.
The term became more prevalent last month when Google incorporated its Gemini A.I. model into its U.S. search results. Rather than pointing users toward links, the service now attempts to answer a query directly with an “A.I. Overview”: a chunk of text at the top of the results page that uses Gemini to form its best guess at what the user is looking for.
The change was a response to Microsoft having incorporated A.I. into its Bing search results, and it led to some immediate missteps, prompting Google to announce that it would roll back some of its A.I. features until the problems could be ironed out.
But with the dominant search engines having made A.I. a priority, it appears that vast quantities of information generated by machines, rather than curated largely by humans, will be served up as a daily part of life on the internet for the foreseeable future.