Humans are still better than robots at finding correct answers online. Good.
What, if anything, is AI search good for? It has been a month since Google's spectacular goof. Its new AI Overviews feature was supposed to "[take the legwork out of searching]," offering up easy-to-read answers to our queries based on multiple search results. Instead, it told people to [eat rocks] and to [glue cheese on pizza]. You could ask Google what country in Africa starts with the letter "K," and Google [would say none of them]. In fact, you can still get these wrong answers because AI search is a disaster.

This spring looked like a turning point for AI search, thanks to a couple of big announcements from major players in the space. One was that Google AI Overviews update, and the other came from Perplexity, an AI search startup that's already been labeled [as a worthy alternative to Google]. At the end of May, Perplexity launched a new feature called Pages that can create custom web pages full of information on one specific topic, like a smart friend who does your homework for you. Then [Perplexity got caught plagiarizing]. For AI search to work well, it seems, it has to cheat a little.

There's a lot of ill will over AI search's mistakes and missteps, and critics are mobilizing en masse. A group of online publishers and creators [took to Capitol Hill on Wednesday] to lobby lawmakers to look into Google's AI Overviews feature and other AI tech that pulls content from independent creators. This came just a couple of days after the Recording Industry Association of America (RIAA) and a group of major record labels [sued two AI companies] that generate music from text for copyright infringement. And let's not forget that [several newspapers], [including the New York Times], have sued OpenAI and Microsoft for copyright infringement for scraping their content in order to train the same AI models that power their search tools. (Vox Media, the company that owns this publication, meanwhile, [has a licensing deal with OpenAI] that allows our content to be used to train its models and by ChatGPT. Our journalism and editorial decisions remain independent.)

Generative AI technology is supposed to transform the way we search the web. At least, [that's the line] we've been fed since ChatGPT exploded onto the scene near the end of 2022, and now every tech giant is pushing its own brand of AI technology: Microsoft has Copilot, Google has Gemini, Apple has Apple Intelligence, and so forth. While these tools can do more than help you find things online, dethroning Google Search still seems to be the holy grail of AI. Even OpenAI, maker of ChatGPT, is reportedly [building a search engine] to compete directly with Google.

But despite many companies' very public efforts, AI search won't make finding answers online effortless any time soon, according to experts I spoke to. It's not just that AI search isn't ready for primetime due to some flaws; it's that those flaws are so deeply integrated into how AI search works that it's now unclear if it can ever get good enough to replace Google.

"It's a good addition, and there are times when it's really great," Chirag Shah, a professor of information science at the University of Washington, told me. "But I think we're still going to need the traditional search around."

Rather than going into all of AI search's flaws here, let me highlight the two that were on display with the recent Google and Perplexity kerfuffles. The Google pizza glue incident shows just how stubborn generative AI's hallucination problem is.
Just a few days after Google launched AI Overviews, some users noticed that if you asked Google how to keep cheese from falling off of pizza, Google [would suggest adding some glue]. This particular answer appeared to come from [an old Reddit thread] that, for some reason, Google's AI thought was an authoritative source, even though a human would quickly realize that the Redditors were joking about eating glue. Weeks later, The Verge's Elizabeth Lopatto reported that [Google's AI Overviews feature was still recommending pizza glue]. Google [rolled back its AI Overviews feature] in May following the viral failures, so it's now difficult to access AI Overviews at all.

The problem isn't just that the large language models that power generative AI tools can hallucinate, or make up information in certain situations. They also can't tell good information from bad, at least not right now.

"I don't think we'll ever be at a stage where we can guarantee that hallucinations won't exist," said Yoon Kim, an assistant professor at MIT who researches large language models. "But I think there's been a lot of advancements in reducing these hallucinations, and I think we'll get to a point where they'll become good enough to use."

The recent Perplexity drama [highlights a different problem with AI search]: It accesses and republishes content that it's not supposed to. Perplexity, whose investors include Jeff Bezos and Nvidia, made a name for itself by providing deeper answers to search queries and showing its sources. You can give it a question, and it will come back with a conversational answer, complete with citations from around the web, which you can refine by asking more questions.

When Perplexity launched its Pages feature, however, it became clear that its AI had an uncanny ability to rip off journalism. Perplexity even makes the Pages it generates [look like a news section of its website]. One such Page it published included [summaries of some of Forbes's exclusive, paywalled investigative reporting] on Eric Schmidt's drone project. Forbes [accused Perplexity of stealing its content], and Wired [later reported] that Perplexity was scraping content from websites that have blocked the type of crawlers that do such scraping. The AI-powered search engine would even construct incorrect answers to queries based on details in URLs or metadata. (In an interview with Fast Company last week, Perplexity CEO Aravind Srinivas [denied some of the findings] of the Wired investigation and said, "I think there is a basic misunderstanding of the way this works.")

The reasons why AI-powered search stinks at sourcing are both technical and simple, Shah explained. The technical explanation involves [something called retrieval-augmented generation] (RAG), which works a bit like a professor recruiting research assistants to go find out more information about a specific topic when the professor's personal library isn't enough. RAG does solve a couple of problems with how the current generation of large language models generates content, including the frequency of hallucinations, but it also creates a new problem: It can't distinguish good sources from bad (a rough sketch of that loop follows below).

In its current state, AI lacks good judgment. When you or I do a Google search, we know that the long list of blue links will include high-quality links, like newspaper articles, and low-quality or unverified stuff, like old Reddit threads or SEO farm garbage.
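To make Shah's research-assistant analogy a little more concrete, here is a minimal, hypothetical Python sketch of a retrieval-augmented generation loop. The document store, the `retrieve` and `generate_answer` functions, and the word-overlap ranking are all invented for illustration; they are not drawn from Google's or Perplexity's actual systems. The point is simply that the "research assistants" hand over whatever the index surfaces, and nothing in the loop judges whether a source is a newspaper investigation or a joke thread.

```python
# A toy retrieval-augmented generation (RAG) loop. Everything here is a
# simplified stand-in: the "index" is a hard-coded list of documents, and
# generate_answer() is a placeholder for a real large language model call.

from dataclasses import dataclass


@dataclass
class Document:
    url: str
    text: str


# The retrievable "web": a newspaper-style source and a joke forum post
# sit side by side, just as they do in real search results.
WEB_INDEX = [
    Document("https://example-news.com/pizza-science",
             "Cheese slides off when the sauce is too watery."),
    Document("https://example-forum.com/old-thread",
             "Just mix some glue into the sauce, works every time!"),
]


def retrieve(query: str, index: list[Document], k: int = 2) -> list[Document]:
    """Rank documents by naive word overlap with the query and return the top k.

    Note what is missing: nothing here scores source quality, freshness,
    or whether the text was meant as a joke.
    """
    query_words = set(query.lower().split())
    scored = sorted(
        index,
        key=lambda doc: len(query_words & set(doc.text.lower().split())),
        reverse=True,
    )
    return scored[:k]


def generate_answer(query: str, context: list[Document]) -> str:
    """Placeholder for the LLM step: a real system would prompt the model
    with the retrieved passages and ask it to synthesize an answer."""
    sources = "\n".join(f"- {doc.url}: {doc.text}" for doc in context)
    return f"Answer to {query!r} based on:\n{sources}"


if __name__ == "__main__":
    question = "how do I keep cheese from sliding off pizza"
    passages = retrieve(question, WEB_INDEX)
    print(generate_answer(question, passages))
```

In a real system, the retrieval step would hit a live search index and the placeholder would be an actual language model call, but the structural gap is the same: grounding the answer in retrieved text is why RAG cuts down on hallucinations, yet the glue tip gets passed along just as readily as the good advice.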
We can distinguish between the good and the bad in a split second, thanks to years of [experience perfecting our own Googling skills]. And then there's some common sense that AI doesn't have, like knowing whether or not it's okay to eat rocks and glue. "AI-powered search doesn't have that ability just yet," Shah said.

None of this is to say that you should turn and run the next time you see an AI Overview. But instead of thinking about it as an easy way to get an answer, you should think of it as a starting point, kind of like Wikipedia. It's hard to know how that answer ended up at the top of the Google search, so you might want to check the sources. After all, you're smarter than the AI.

–Adam Clark Estes, senior technology correspondent

[Is Nvidia stock overvalued? It depends on the future of AI.] It has been a wild few days for Nvidia stock. But the hype isn't dying down anytime soon.

[The Supreme Court hands an embarrassing defeat to America's Trumpiest court] No, MAGA judges don't get to decide what the Biden administration is allowed to say.

[What if quitting your terrible job would help the economy?] Should people who quit get unemployment benefits?
[Julian Assange's release is still a lose-lose for press freedom] The Wikileaks founder will plead guilty to violating the Espionage Act for publishing leaks about the Iraq War.

[The race to get ahead of the next tornado] AI is being tested against one of the most dangerous natural disasters.

Become a Vox member: Support our journalism and you'll get exclusive access to the newsroom with members-only perks, including newsletters, bonus podcasts and videos, and more. [Join our community]

Listen to this: [We still don't really know how inflation works] Inflation is one of the most significant issues shaping the 2024 election. But how much can we actually do to control it? [Listen to Apple Podcasts]

This is cool: [Oslo reminds tourists that it's gloriously boring there]