I've been using ChatGPT for my research, but it keeps spitting out wrong or nonsensical answers. I'm working on a project about environmental policies, and I need factual data spanning a fairly long period. I wanted to make it easier for myself, so I asked ChatGPT. Instead of getting just the facts, I got a mix of correct and totally off-the-wall stuff. I had to fact-check everything, and in the end it took me the same amount of time and effort as if I had done the work myself, except it also cost me the GPT subscription.
I did some research and found out that it's a common problem with AIs, called "hallucination." I need an AI that gives me correct information, not random guesses. No made-up sources, for god's sake.