You’re not crazy. ChatGPT has gotten considerably worse over time

I feel like there is a very common misunderstanding of what AI actually is, and it applies to ChatGPT too.

What people think AI is: an all-knowing entity capable of instantaneously returning an accurate answer or solution to virtually any question or challenge.

What AI actually is: a collection of data that grows over time and, as it grows, becomes more inaccurate, inefficient, and ineffective at solving problems because of the overwhelming amount of information or the ambiguity of the problems.

ChatGPT has indeed gotten worse, and it's because it's being trained on an astounding amount of new data every single day. Just like human beings, the more time goes on, the more it struggles with different problems, even simple ones. As we learn more and more in one area, we become less effective in another, and AI works similarly: more data is provided in area X, so now it suffers in area Y. There is a near-infinite number of areas that AI is 'learning' from, so it's not getting more proficient at one or a few things. Instead, it's becoming more "well rounded," and that means less skilled at A, B, C... but now capable of D, E, F. It can't be good at all of them unless it also has infinite processing power, which it doesn't.
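For what it's worth, the trade-off described above does resemble a real, well-studied effect called catastrophic forgetting: a model trained sequentially on new material can get worse at things it previously handled well. Below is a toy sketch of that idea, and only that idea. Everything in it (the single-parameter linear model, the made-up tasks A and B, the plain numpy gradient descent) is my own illustration under those assumptions, not how ChatGPT is built, trained, or updated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two made-up "skill areas": task A and task B, each a simple linear mapping.
w_a, w_b = 3.0, -2.0
x_a = rng.uniform(-1, 1, size=200)
y_a = w_a * x_a
x_b = rng.uniform(-1, 1, size=200)
y_b = w_b * x_b

def mse(w, x, y):
    """Mean squared error of the one-parameter model y ≈ w * x."""
    return float(np.mean((w * x - y) ** 2))

def train(w, x, y, steps=500, lr=0.1):
    """Plain gradient descent on the MSE of y ≈ w * x."""
    for _ in range(steps):
        grad = np.mean(2 * (w * x - y) * x)
        w -= lr * grad
    return w

w = 0.0
w = train(w, x_a, y_a)                 # learn task A first
print("error on A after learning A:", round(mse(w, x_a, y_a), 4))

w = train(w, x_b, y_b)                 # then keep training only on task B
print("error on A after learning B:", round(mse(w, x_a, y_a), 4))
print("error on B after learning B:", round(mse(w, x_b, y_b), 4))
```

Running it, the error on task A is near zero after learning A, then jumps way up after the model is trained only on B, because a single parameter can't hold both mappings at once. Whether something like that is actually behind any decline in ChatGPT is a separate question, but it's the closest well-studied effect to what I'm describing.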

submitted by /u/databro92