Woman who relied on AI “hallucinations” loses tax battle

A woman who attempted to use artificial intelligence instead of a lawyer to represent her in court has lost her case.

In her self-represented appeal against a £3,000 tax penalty related to the sale of her house, Felicity Harber relied on case law obtained from an AI chatbot.

Unknown to her, the nine cases she had submitted to support her defense had been fabricated rather than drawn from actual court rulings, as was discovered during the trial.

When first asked where the cases had come from, Ms. Harber said they were from “a friend in a solicitor’s office.” When the tribunal stated it had been unable to locate the cases on legal websites, she conceded it was “possible” that they had been generated by an AI platform such as ChatGPT.

The Solicitors Regulation Authority has cautioned lawyers not to put their trust in AI platforms, because they lack a concept of “reality” and work by predicting the text that should follow a given input.

A “hallucination” occurs when an artificial intelligence generates false information and presents it as fact.

The tribunal determined that the cases were “plausible but incorrect,” indicating that they must have been generated by AI. Although they resembled actual cases, the names, dates, and verdicts differed.

The tribunal was critical of Ms. Harber’s use of artificial intelligence (AI) but acknowledged that she had been unaware the cases were not genuine and had been unable to verify their validity on legal websites.

The tribunal acknowledged that submitting fictitious cases in reasonable-excuse tax appeals is probably less consequential than it would be in many other forms of litigation. Even so, it stressed that citing invented judgments is never harmless.

According to Judge Castel’s ruling, citing fictitious court cases “wastes time and money” and “promotes cynicism about the legal profession.”

Following ChatGPT’s initial release in November 2022, there have been multiple instances of AI infiltrating the legal system.

Over the summer, two New York attorneys were fined after submitting six fictitious AI-generated cases in a court filing. One of them, Steven Schwartz, expressed his shock at the time, saying he had no idea ChatGPT could create fake cases.

Ms. Harber had contested a £3,265 penalty assessed in 2018 for failing to notify HM Revenue & Customs of a capital gains tax liability.

Her claimed “reasonable excuse” for not paying was that she had been unaware of the law and had suffered anxiety and panic attacks since her mother’s death in 2013. The tribunal nonetheless rejected her appeal.

According to Tim Stovold of the tax firm Moore Kingston Smith, Ms. Harber unintentionally trusted artificial intelligence to choose which tax cases would help her fight an HMRC fine.

The chatbot produced nine tax cases that appeared to support her claim that no penalty was due, presenting her mental health conditions and lack of legal knowledge as “reasonable excuses.”

The cases it generated were plausible, but entirely made up.

Although AI is a very useful tool for professionals, human oversight is still required to ensure its output is accurate.
