‘Genie won’t go back in the bottle’ on AI, says security minister


By PA News

Security minister Tom Tugendhat has suggested that calls to suspend or stop the development of artificial intelligence due to fears about the new technology are misguided (James Manning/PA)

Calls to suspend or stop the development of artificial intelligence due to fears about the new technology are misguided, the security minister has suggested.

Tom Tugendhat, addressing the CyberUK conference in Belfast, said he understands fears about the potential danger of AI but added that the “genie won’t go back in the bottle”.

Italy said last month that it would temporarily block the artificial intelligence software ChatGPT amid global debate about the power of such new tools.

The AI systems powering such chatbots, known as large language models, are able to mimic human writing styles based on the huge trove of digital books and online writings they have ingested.

There is also significant debate about the potential of the new technology, and Mr Tugendhat said the UK can become a leader in the area if the Government and private sector can work together.

However, he acknowledged that criminals and cyber attackers are aware of the uses of AI.

“Cyber attacks work when they find vulnerabilities. AI will cut the cost and complications of cyber attacks by automating the hunt for the chinks in our armour,” he said.

“Already AI can confuse and copy, spreading lies and committing fraud.

“Natural language models can mimic credible news sources, pushing disingenuous narratives at huge scale, and AI image and video generation will get better.”

The security minister also acknowledged the threat posed by Russia, as well as China’s interest in AI.

“Given the stakes, we can all understand the calls to stop AI development altogether,” he said. “But the genie won’t go back in the bottle any more than we can write laws against maths.

“(Russian President Vladimir) Putin has a longstanding strategic interest in AI and has commented that whoever becomes leader in this sphere will rule the world.

“China, with its vast datasets and fierce determination, is a strong rival.

“But AI also threatens authoritarian controls. Other than the United States, the UK is one of only a handful of liberal democratic countries that can credibly lead the world in AI development.

“We can stay ahead, but it will demand investment and co-operation, and not just by government.”

“As for the safety of the technology itself, it’s essential that, by the time we reach the development of AGI (artificial general intelligence), we are confident that it can be safely controlled and aligned to our values and interests.

“Solving this issue of alignment is where our efforts must lie, not in some King Canute-like attempt to stop the inevitable but in a national mission to ensure that, as super-intelligent computers arrive, they make the world safer and more secure.”

Mr Tugendhat is among several senior officials and ministers to have addressed the annual conference, which has been dominated by debate about the challenge posed by China and the threat from Russia-aligned cyber groups.

Lindy Cameron, head of the National Cyber Security Centre, warned earlier this week that more needs to be done to protect the UK from the threat posed by Russia-aligned cyber groups.

And Chancellor of the Duchy of Lancaster Oliver Dowden stressed the danger that a “cyber equivalent of the Wagner group” poses to critical infrastructure.
