Think AI chatbots are a reliable news source? Think again! These fancy autocomplete machines are fabricating facts. Get your news elsewhere.
An experiment by Nieman Lab revealed a major flaw in AI-powered chatbots like ChatGPT. Despite being trained on content from reputable news outlets, these chatbots readily generate fake URLs for articles that were never published. This "hallucination," as experts call it, raises serious questions about the reliability of information delivered through such AI models.
In Nieman Lab's test, ChatGPT confidently supplied fabricated URLs for stories from publications that OpenAI pays millions of dollars to license. Clicking those links led to dead ends, highlighting the system's inability to distinguish plausible-sounding URLs from real ones.
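If you want to vet an AI-supplied citation yourself, a few lines of code can automate what Nieman Lab did by hand. Below is a minimal sketch, assuming Python with the third-party `requests` package; the article URL is invented purely for illustration:

```python
# Sanity-check a link before trusting it: a dead URL is a strong hint
# that the "source" was hallucinated. Assumes `pip install requests`.
import requests

def link_is_live(url: str) -> bool:
    """Return True if the URL answers with a successful (< 400) status."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=5)
        return resp.ok
    except requests.RequestException:
        return False  # DNS failure, timeout, refused connection, etc.

# Hypothetical chatbot-cited URL -- not a real article.
print(link_is_live("https://example.com/exclusive-story-that-never-ran"))
```

One caveat: some servers reject HEAD requests outright, so a failed check is a prompt to look closer, not proof of fabrication.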
OpenAI acknowledges the issue and promises an improved experience with proper attribution, but offers no explanation for the fake URLs. That lack of transparency only adds to the concern.
The news industry's reliance on these AI models, despite their shortcomings, is another troubling aspect. News outlets keep licensing their content to train AI models, hoping for a financial return, while simultaneously questioning the ethical implications of such deals.
The bigger concern lies in the core functioning of generative AI. These models, akin to fancy autocomplete functions, simply predict the most likely word sequence. They don't possess true comprehension, and their fabricated facts should be a red flag for information seekers.
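To see why "fancy autocomplete" is a fair description, consider a toy sketch of the core idea: pick the statistically most likely next word from training text. Everything below is invented for illustration; real models use neural networks trained on billions of documents, but the objective of predicting the next token is the same.

```python
# A toy next-word predictor: it chooses what is FREQUENT, not what is TRUE.
from collections import Counter, defaultdict

corpus = "the outlet ran the story the outlet ran the correction".split()

# Count which word follows which in the training text.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def most_likely_next(word: str) -> str | None:
    """Return the most frequent follower of `word` -- plausible, not verified."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("outlet"))  # prints "ran", chosen purely on frequency
```

Nothing in that loop knows whether any sentence it produces is true; it only knows which phrases are common. Scale the same objective up enormously, and a plausible-looking but fake URL becomes a perfectly natural output.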
The inability of leading chatbots to solve a simple spelling bee illustrates this point. If they cannot handle basic word tasks, relying on them for factual information is a recipe for misinformation.
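One plausible explanation, and it is an assumption on our part since the test itself doesn't diagnose the failure, is tokenization: chatbots read text as multi-letter chunks rather than individual characters, so letter-level puzzles fall outside what they directly "see." A sketch using OpenAI's open-source `tiktoken` tokenizer shows the effect (exact splits vary by tokenizer version):

```python
# Show how a word actually reaches the model: as subword token IDs, not letters.
# Assumes `pip install tiktoken`.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by recent GPT models
word = "strawberry"
token_ids = enc.encode(word)
pieces = [enc.decode([t]) for t in token_ids]

print(token_ids)  # a short list of integer IDs
print(pieces)     # multi-letter chunks -- the model never sees single letters
```

A model asked to count or rearrange letters is effectively working blindfolded, which makes the spelling-bee stumble unsurprising.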
The takeaway: don't trust AI chatbots like ChatGPT for news or verifiable facts. Their penchant for fabrication poses a serious threat to the accuracy of the information you consume.