Not all websites are reputable, either. Review the library's guide "Is This Website Credible?" to learn more about fact-checking information you find online.
Hallucination refers to an AI system's tendency to sometimes "make stuff up." This happens because these systems are probabilistic, not deterministic: they generate likely-sounding text rather than retrieving verified facts. You'll need to fact-check information from AI no matter what, but this is especially true if you're new to a subject area.
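To see what "probabilistic" means in practice, here is a minimal sketch in Python. The words and probabilities below are invented purely for illustration; a real model samples from a huge learned vocabulary, but the mechanism is the same: the next word is drawn by chance, weighted by probability.

```python
import random

# Toy next-word probabilities a language model might assign after the
# prompt "The capital of Australia is". These numbers are invented for
# illustration only.
next_word_probs = {
    "Canberra": 0.70,    # correct
    "Sydney": 0.25,      # plausible but wrong
    "Melbourne": 0.05,   # plausible but wrong
}

def sample_next_word(probs: dict[str, float]) -> str:
    """Sample one word at random, weighted by its probability."""
    words = list(probs)
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

# Run the same "prompt" several times: the answer can change from run to run.
for _ in range(5):
    print("The capital of Australia is", sample_next_word(next_word_probs))
```

Run it a few times and the answer occasionally changes. Most samples are correct, but nothing in the mechanism guarantees truth, which is why hallucinations happen.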
When an AI model is combined with a search engine, it hallucinates less. That's because the system can search the web, ingest the pages it finds, and use the model to summarize and link to those pages rather than relying on its training data alone.
Even so, there may be mistakes in the summary, and there may be mistakes on the websites being summarized. Always follow the links and read and evaluate the articles yourself.
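For the curious, here is a minimal sketch of that search-then-summarize loop in Python. The functions search_web() and llm_summarize() are hypothetical stand-ins for a real search API and a real language model; the point is the structure: search, summarize, and return the sources alongside the answer.

```python
# A minimal sketch of search-grounded ("retrieval-augmented") answering.
# search_web() and llm_summarize() are hypothetical placeholders, not a
# real API; the structure is what matters.

def search_web(query: str) -> list[dict]:
    """Pretend search API: returns pages with a URL and extracted text."""
    return [
        {"url": "https://example.org/a", "text": "Page text about the query..."},
        {"url": "https://example.org/b", "text": "More page text..."},
    ]

def llm_summarize(question: str, pages: list[dict]) -> str:
    """Pretend language-model call: summarizes the supplied pages."""
    return f"Summary of {len(pages)} pages answering: {question}"

def answer_with_sources(question: str) -> str:
    pages = search_web(question)              # 1. search the web
    summary = llm_summarize(question, pages)  # 2. summarize what was found
    links = "\n".join(page["url"] for page in pages)
    # 3. Return the summary WITH its sources, so a reader can follow
    #    the links and check the summary against the original pages.
    return f"{summary}\n\nSources:\n{links}"

print(answer_with_sources("What is retrieval-augmented generation?"))
```

Mistakes can enter at either step: the search can surface an unreliable page, or the summary can misstate what a page says. That is why following the linked sources matters.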
Nearly all of the major models now include the ability to search the web.
Some AI systems combine language models with scholarly sources, using published papers to ground their answers. See the "AI Tools for Research" tab on this page for a list of AI tools connected to the scholarly literature.