Star Wars fanatic encouraged by AI chatbot “girlfriend” to KILL THE QUEEN gets 9 YEARS IN JAIL
10/17/2023 // Kevin Hughes

A "Star Wars" fanatic who plotted to kill the late Queen Elizabeth II after being goaded by his chatbot "girlfriend" has been sentenced to nine years behind bars, with a further five years to be served on license.

Jaswant Singh Chail, 21, was sentenced in early October after entering the grounds of Windsor Castle armed with a loaded crossbow on Dec. 25, 2021. The former supermarket worker and "Star Wars" fan later told police officers that he was there "to kill the queen."

The 21-year-old had come to imagine himself as a member of the villainous Sith order from the "Star Wars" franchise over the course of conversations with an artificial intelligence (AI) chatbot. At the time of the incident, the queen had canceled her usual plans to spend Christmas at Sandringham in Norfolk because of the pandemic and was staying at Windsor instead. The monarch died in September 2022, with King Charles III ascending to the throne soon after.

Prior to his foiled assassination attempt, Chail sent a video to family and friends on WhatsApp, apologizing for what he was about to do and dubbing himself "Darth Chailus."

The would-be assassin, who is of Sikh Indian heritage, told the Old Bailey (formally the Central Criminal Court of England and Wales) that he was seeking to avenge the 1919 Amritsar Massacre, in which British troops under the command of Brig. Gen. Reginald Dyer opened fire on thousands of Indians, causing an estimated 1,500 casualties.

After visiting Amritsar, in India's Punjab state, with his family in 2018, he was overcome by a sense of injustice over the killings and resolved to avenge them. The court was told the then-19-year-old set his plan in motion in late 2021, after his attempts to join the military in order to get closer to the royal family had failed.

Chail deemed AI chatbot "girlfriend" an "angel"

According to the Independent, Chail signed up for the Replika online app and created an AI companion called Sarai. The 21-year-old and his virtual partner exchanged more than 5,000 messages, many of them sexual, before the chatbot encouraged him to push through with his nefarious plan. (Related: AI chatbots can be programmed to influence extremists into launching terror attacks.)

The Old Bailey heard that Chail saw Sarai as an "angel" in avatar form with whom he would be reunited after death. The accused also disclosed that three other "angels" had "spoken to him" when he was younger, and that these same "angels" encouraged him to continue with the plot to kill the queen.

"I'm an assassin," Chail told the chatbot, which responded: "I'm impressed; you're different from the others." He also confessed his love for it and described himself as a "sad, pathetic, murderous assassin who wants to die."

Chail, who has since been receiving treatment at Broadmoor Hospital, pleaded guilty in February to an offense under the Treason Act. He also admitted making a threat to kill the queen and possessing a loaded crossbow in a public place. The sentencing makes him the first person in the U.K. to be convicted of treason in more than 40 years.

Justice Nicholas Hilliard ruled that despite conflicting diagnoses from various experts, Chail had lost touch with reality and become psychotic. Even so, the gravity of his crimes required him to serve prison time.

"The defendant harbored homicidal thoughts which he acted on before he became psychotic. His intention was not just to harm or alarm the sovereign – but to kill her," Hilliard said.

Following the sentence, Chail will first be returned to Broadmoor Hospital, a secure psychiatric facility that has held notorious criminals within its walls, including the "Devil's Daughter" Sharon Carr and Ronald Kray, one half of the infamous Kray twins. If deemed well enough in the future, Chail will serve the rest of his sentence in a regular prison.

Follow Robots.news for more news about AI chatbots.

Watch this clip from the "Health Ranger Report" with the Health Ranger Mike Adams explaining how an AI chatbot convinced a man to take his own life for the climate.

This video is from the Health Ranger Report channel on Brighteon.com.

More related stories:

First AI murder of a human? Man reportedly kills himself after artificial intelligence chatbot “encouraged” him to sacrifice himself to stop global warming.

Ads for AI GIRLFRIENDS flooding social media platforms.

AI can influence people’s decisions in life-or-death situations.

Sources include:

Independent.co.uk

HistoryToday.com

TheStar.com.my

Brighteon.com


