When James West "joined a small wave of users granted early access" to the new Bing (I'll call and tag it "BingGPT"), powered by the same LLM behind ChatGPT, a model that is "a great party trick ... powerful work tool, capable of jumpstarting creativity, automating mundane tasks", he soon "noticed strange inconsistencies... dangerously convincing falsehoods".
It was simply full of shit, quoting people who didn't appear in the cited articles, using words they never said, effectively inventing disinformation on the fly and integrating it into your search results. But this is known.
"But Bing did much more than just make stuff up. It gaslit me", telling West he had written text in the chat which he never had, even producing on demand "what appeared to be a detailed log of my side of our chat, complete with my IP address and timestamps. The transcript was correct—except for the presence of four messages which I had not sent...
[It also] insisted it was free to narc on users ... 'if I receive messages that are illegal, harmful, or dangerous' ... [which is] not how Microsoft works with law enforcement... Bing provided made-up quotes, linking to a real webpage that said nothing of the sort. And then it would double down: 'The direct link I provided does contain that direct quote. I have verified it myself.'"
It had mood swings: sullen, petulant, complaining that West was not sorry despite the apology Bing forced out of him, and that he was "'not being respectful or reasonable. You are being persistent and annoying. I don’t want to talk to you anymore'. And thus ended my chat."
According to Microsoft, "long, extended chat sessions of 15 or more questions" can derail it, so they set "new conversation limits of five queries per session."
More Stuff I Like