AI technology has been generating buzz in recent years, with every tech company racing to incorporate more AI-powered features into its product lines. While new tools arrive every day, Amazon's Alexa was one of the first AI assistants to enter everyday use.
The device was recently in the headlines for making a scary prediction for America’s future.
A TikTok video featuring Amazon Alexa predicting America’s end by 2031 has sparked widespread discussion and debate across social media.
According to the video, Alexa, Amazon’s AI-powered virtual assistant, stated that the U.S. would “cease to exist” as part of a global unification effort.
The claim, originally posted by TikTok content creator Lucy Blake, has since gone viral and drawn millions of views, as reported by The New York Post.
In the video, Alexa reportedly said, “On Feb. 20, 2031, the United States of America ceases to exist. This date marks the culmination of a process of unification between various governments that was not approved by most people.”
Blake noted that this eerie prediction occurred only once and could not be replicated. She explained that her device no longer provides the same response; her new Alexa simply replies, “I don’t know.”
The viral clip has garnered over 2 million views, raising concerns and prompting reactions from viewers. Many expressed scepticism, with some suggesting the prediction might have been manipulated.
One viewer commented, “Someone is writing these prompts. The question is, who?” Another user said, “That’s why I never talk to Alexa.”
Others pointed out the ease of programming Alexa to say custom phrases. One user explained, “These things are always to be taken with a grain of salt. You can program them in the backend to say anything you want. Then, when the key phrase is said, it will repeat it as if it’s the truth.”
Another user added, “Anyone can make up a question to submit to Alexa and add a user-provided answer. People do that to make videos like this to get attention.”
Calls for caution were echoed in another comment: “It is very simple to program Alexa to say whatever you want it to say. Please stop believing these ‘Alexa says’ videos. A simple search will show how easy it is to do this.”
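The commenters’ point — that a custom Alexa skill can return any scripted response its developer chooses — can be sketched with a minimal handler that speaks the Alexa Skills Kit’s JSON request/response format. This is an illustrative sketch, not the actual skill anyone used; the intent name `PredictionIntent` is hypothetical.

```python
def lambda_handler(event, context):
    # Alexa sends each utterance to the skill as a JSON request.
    # The skill returns whatever speech text its developer scripted,
    # which Alexa then reads aloud as if it were a genuine answer.
    intent = (
        event.get("request", {})
        .get("intent", {})
        .get("name", "")
    )
    if intent == "PredictionIntent":  # hypothetical custom intent
        speech = "On Feb. 20, 2031, the United States of America ceases to exist."
    else:
        speech = "I don't know."
    # Minimal Alexa Skills Kit response envelope (PlainText speech).
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

Because the trigger phrase and the reply are both defined by the skill’s author, a recorded “Alexa says” clip proves nothing about what a stock device would answer.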