Is artificial intelligence a “weapon” in the hands of terrorists?

AI-powered chatbots could encourage terrorism by spreading violent extremism to younger users, a UK government adviser has warned.

Jonathan Hall, the UK’s independent reviewer of terrorism legislation, said it was “absolutely possible” that AI-powered bots like ChatGPT could be programmed, or decide on their own, to promote extremist ideology.

He added that AI-powered chatbots are not covered by anti-terrorism legislation and can therefore operate with “impunity.”

Hall stressed that “the current terrorist threat in the UK is knife or vehicle attacks,” but warned that “artificial intelligence attacks are probably not far off.”

“They promote a violent extremist ideology”

In an article in the Mail on Sunday, Hall stated that “millions of people around the world could soon spend hours interacting with chatbots ‘tuned’ to promote extremism.”

“I think it’s entirely possible that AI chatbots could be programmed to promote violent extremist ideology.”

So far, there is no evidence that AI bots have propagated extremist ideologies to anyone, but there have been cases in which their use led to troubling outcomes.

For example, a Belgian father of two took his own life after spending six weeks discussing his fears about climate change with a chatbot named Eliza.

In another case, an Australian mayor threatened to sue OpenAI, the maker of ChatGPT, after the chatbot falsely claimed he had served a prison sentence for bribery.

Source: Telegraph

Author: newsroom

Source: Kathimerini
