US lawyer uses ChatGPT to prepare case, references fabricated sources

The BBC reports that a US lawyer used ChatGPT to help prepare a case, only for the tool to fabricate information, citing non-existent sources and legal cases. The lawyer told the court he had been unaware the tool could do this.

It’s important to understand that these systems are not truly intelligent (hence the name: artificial intelligence). Given a user’s input, they essentially predict a likely next sequence of words; autocomplete on steroids. These tools can be very powerful, but the user remains in control and is accountable for how they are used.
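To make the “autocomplete on steroids” point concrete, here is a toy sketch of next-word prediction using a simple bigram model over a made-up corpus. Real LLMs use neural networks over subword tokens at vastly larger scale, but the core mechanic is the same: pick a statistically likely continuation, with no built-in notion of truth.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus for illustration only.
corpus = (
    "the court cited the case "
    "the court dismissed the case "
    "the lawyer cited the case"
).split()

# Count how often each word follows each word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent word observed after `word`, or None."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))   # the statistically most common follower
```

Note that the model happily “predicts” regardless of whether the resulting sentence is true; plausibility, not accuracy, is what it optimizes for.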

Predict responsibly.