During the “Star Trek: The Next Generation” episode “Devil’s Due,” the Enterprise’s resident android Data serves as an arbitrator between Captain Jean-Luc Picard and a con artist because, as an android, Data isn’t susceptible to bias or favoritism. The audience is willing to go along with this logic because they know Data is a good person (he successfully argued for his personhood in front of a judge in a prior episode). Plus, “Star Trek” is a work of fiction. But when a real AI gets involved in a real trial, things get scary.
In March 2023, Judge Anoop Chitkara of the High Court of Punjab and Haryana, India, was presiding over a case involving Jaswinder Singh, who had been arrested in 2020 as a murder suspect. Unsure whether bail should be granted, Judge Chitkara asked ChatGPT. The AI promptly recommended denying bail, claiming Singh was “considered a danger to the community and a flight risk” due to the charges, and that in such cases bail was usually set high or rejected “to ensure that the defendant appears in court and does not pose a risk to public safety.”
To be fair, ChatGPT’s recommendation isn’t unreasonable, at least given how little we know about the case, but it is scary because the program merely processes text; it lacks genuine analysis and critical thinking skills. Moreover, this event sets a precedent. Since Judge Chitkara took the AI’s advice, who’s to say other public officials won’t do the same in the future? Will we one day rely on ChatGPT or another AI to pass rulings instead of flesh-and-blood judges? The mere thought is enough to send shivers down your spine.