
Students' Guide to Generative AI

Always Fact-Check AI

AI "Hallucination"

Hallucination refers to the tendency of AI systems to "make stuff up." This happens because these systems are probabilistic, not deterministic: they generate plausible-sounding text rather than retrieving verified facts. You'll need to fact-check information from AI no matter what, but this is especially true if you're new to a subject area.


Web Search Results as Grounding

When an AI model is combined with a search engine, it hallucinates less. That's because the tool can search the web, ingest the pages it finds, and use the language model to summarize and link to those pages, grounding its answers in real sources.

There may be mistakes in the summary, and there may be mistakes on the websites being summarized. Always follow the links, then read and evaluate the sources yourself.

Nearly all of the major models now include the ability to search the web.


Scholarly Sources as Grounding

Some AI systems combine language models with scholarly sources, using published papers as grounding. See the "AI Tools for Research" tab on this page for a list of AI tools that are connected to the scholarly literature.

Why Read Laterally?