ChatGPT suffers from artificial hallucination with hints of sociopathy. When stumped, the much-hyped chatbot will brazenly resort to lying, spitting out plausible-sounding answers stitched together from random falsehoods. Unable to make it, the bot will happily fake it. In May, New York lawyer Steven A. Schwartz discovered this the hard way.