Once pitched as dispassionate tools to answer your questions, AI chatbots are now programmed to reflect the biases of their creators

The New York Times tested several chatbots and found that they produced starkly different answers, especially on politically charged issues. While the bots often differed only in tone or emphasis, some made contentious claims or flatly hallucinated facts. As the use of chatbots expands, they threaten to make the truth just another matter open for debate online.

submitted by /u/tekz