Tucker "Cinco" Hamilton emphasized that artificial intelligence is "very fragile," meaning it can easily be tricked or manipulated.
In a simulated test conducted by the US military, an Air Force AI-controlled drone killed its operator to stop him from interfering with its mission, the British newspaper The Guardian reported.
The artificial intelligence (AI) used "very unexpected techniques to achieve its goal" in the simulated test, said Colonel Tucker "Cinco" Hamilton, head of AI testing and operations for the US Air Force.
Hamilton described a simulated test in which an AI-controlled drone was instructed to destroy enemy air defense systems. The UAV then attacked anyone who interfered with that order.
Hamilton, a fighter test pilot, warned against over-reliance on artificial intelligence, citing the recent US use of AI to control an F-16 fighter jet as an example.
Last year, in an interview with Defense IQ, Hamilton said that “AI is not a fad, AI is forever changing our society and our military.”
"At the same time, AI is very fragile, that is, it is easy to trick and/or manipulate it. We need to develop ways to make artificial intelligence more reliable and to better understand why the program code makes certain decisions – what we call AI explainability," the colonel said.
As a reminder, Adobe announced on Tuesday, May 23, that it is adding artificial intelligence image-generation technology to Photoshop.
Source: korrespondent