I asked, “What will I find there?” He replied, “Your answer.” I didn’t have a lot of confidence in the mission, but the address was close by, so I went. Sure enough, the backroom gave me the answer. (The details of what was missing are anticlimactic, but there were some 60,000 missing street signs.)
That’s how to deal with what genAI tools produce. Don’t assume it’s correct, but feel free to ask questions — and make other inquiries — based on that information. It can be helpful if you do the legwork.
It’s important to remember that for every right answer genAI delivers, there will be many wrong answers. (The hyperscalers often seem to forget to mention that.) And sadly, “wrong answers” are not limited to hallucinations.