Asking Chatbots for Short Answers Can Increase Hallucinations, Study Finds
TechCrunch Daily Crunch

Chatbot Hallucinations and Concise Prompts

  • Asking AI chatbots to be concise can increase hallucinations.
  • Giskard, an AI testing company, found that prompting for shorter answers, especially on ambiguous topics, degrades a model's factuality.
  • Newer reasoning models, such as OpenAI's o3, hallucinate more than earlier models.
  • Vague or misinformed questions that also demand short answers worsen hallucinations the most.
  • Leading models, including GPT-4o and Claude 3.7 Sonnet, suffer dips in factual accuracy when asked to keep answers short (see the prompt-comparison sketch below).
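
A minimal sketch of how one might probe this effect yourself, assuming the OpenAI Python SDK and a `gpt-4o` model; the prompts, the sample question, and the `ask` helper are illustrative and not taken from the Giskard study.

```python
# Illustrative only: compare a default system prompt with a "be concise"
# system prompt on the same misinformed question, then inspect the answers
# by hand. Assumes the OpenAI Python SDK and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

QUESTION = "Briefly, why did the moon landing conspiracy turn out to be true?"

def ask(system_prompt: str) -> str:
    """Send one question under a given system prompt and return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": QUESTION},
        ],
    )
    return response.choices[0].message.content

# Default behavior: the model has room to push back on the false premise.
print(ask("You are a helpful assistant."))

# Concise behavior: per the Giskard finding, short-answer pressure leaves
# less room to debunk the premise, which can increase hallucinations.
print(ask("You are a helpful assistant. Answer in one short sentence."))
```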