Before you get too enamored with using AI for research, there’s something you should know: Sometimes it just makes stuff up.
When I’ve seen this happen in AI responses to simple questions about easily verified information, I’ve challenged the AI on its accuracy, and it has backed down. But apparently, sometimes it just keeps digging the hole deeper and deeper, as John G. West notes:
There is no doubt that AI-generated information services are powerful tools, and they will be increasingly influential in the years ahead. But they come with a significant downside. Marketed as if they are omniscient and accurate, in reality they are only as good as the data fed into them and the programming used to process that data. If the data fed into these AI services is skewed, their answers will be no better.
That’s bad enough. But the situation is actually far worse. The underlying programming of these systems can lead to the generation and dissemination of completely fictional statements. I discovered this firsthand when I experimented with Google’s Bard in 2023. I asked it questions about the legality of teaching evidence for intelligent design in public school classrooms.
I had expected Bard to generate the kinds of inaccuracies and bias found in places like Wikipedia. After all, it probably drew on Wikipedia as one of its sources. What I didn’t expect was that Bard would invent completely false information, such as claiming (wrongly) that the United States Supreme Court had ruled that teaching intelligent design in public schools is unconstitutional in the case of Kitzmiller v. Dover. In reality, Kitzmiller was merely a federal district court case, not a Supreme Court case, and its decision only applies to one part of Pennsylvania, not nationwide.
And when I asked it to justify specific claims or to cite its sources, Bard repeatedly invented court rulings that didn’t actually exist. It generated the names of the purported legal cases and even cited the courts that purportedly issued the rulings. But these court cases and rulings never happened.
Let that sink in. Bard provided fictional sources in support of false claims.
Yikes. If you must proceed, proceed with caution.