    Why we should limit the autonomy of AI-enabled weapons

By admin | October 29, 2025

The SkyShark, an autonomous drone built in the United Kingdom, is put on display at the Defence and Security Equipment International exhibition in London. Credit: John Keeble/Getty Images

    Weapons capable of identifying and attacking targets automatically have been in use for more than 80 years. An early example is the Mark 24 Fido, a US anti-submarine torpedo equipped with microphones to home in on targets, which was first deployed against German U-boats in 1943.

    Such ‘first-wave’ autonomous systems were designed to be used in narrowly defined scenarios and programmed to act in response to signals such as the radiofrequency emissions of specific targets. The past ten years have seen the development of more advanced systems that can use artificial intelligence to navigate, identify and destroy targets with little or no human intervention. This has led to growing calls from human-rights groups to ban or regulate the technologies.

    Nehal Bhuta, a professor of international law at the University of Edinburgh, UK, has been investigating the legality and ethics of autonomous weapons for more than a decade. He was among the authors of a report on the responsible use of AI presented to the United Nations Security Council last month by Netherlands Prime Minister Dick Schoof.

    Bhuta says that autonomous weapons, especially those that are AI-enabled, raise multiple ethical and legal concerns, including determining responsibility for system failures and potentially encouraging the intrusive collection of civilian data. He says there is still time for the international community to agree on principles and regulations to limit the risk, and warns that an arms race could ensue if it fails to do so.

    Which legal frameworks and principles currently apply to autonomous weapons systems?

    There is no specific legal framework that applies to the use of autonomy or AI in these systems. Under international humanitarian law, based on the Hague Conventions and the Geneva Conventions, which together set out international law on war and war crimes, weapons must be capable of being used in a manner that can distinguish between civilian and military targets. Attacks must not result in disproportionate harm to civilians, and combatants must take precautions to verify they have the right target and reduce the risk of civilian harm. These international laws apply to all weapons, including the use of advanced autonomous systems, such as the drones deployed by Ukraine in June, which used machine learning to select, identify and strike targets deep within Russia on the basis of preprogrammed instructions.

Professor Nehal Bhuta says it is important for the international community to agree on guidelines regarding the use of autonomous weapons. Credit: Edinburgh Law School

    What are the risks associated with autonomous weapons?

Insufficient care in their development and deployment could compromise compliance with the principles of distinction and proportionality. Could the system generate too many false positives when identifying targets? Might an autonomous weapon calculate that large numbers of civilian deaths are an acceptable price to pay when targeting a suspected enemy soldier? We don't really know yet, because the technology is immature, but these are vast risks. There is also a danger that if a system fails to accurately process incoming data in a rapidly changing environment, it could target the wrong forces or civilians.
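
The false-positive worry above is, at bottom, a base-rate problem: when genuine targets are rare in the population a system observes, even a highly accurate classifier can flag far more innocents than combatants. A minimal sketch of that arithmetic, using entirely hypothetical numbers (none come from the interview):

```python
# Illustrative base-rate calculation only. All figures below are
# hypothetical assumptions chosen to show the effect, not data
# about any real weapons system.

def screening_outcomes(population, target_rate, sensitivity, false_positive_rate):
    """Return (true_positives, false_positives) for one screening pass.

    population          -- number of individuals observed
    target_rate         -- fraction that are genuine targets
    sensitivity         -- fraction of genuine targets correctly flagged
    false_positive_rate -- fraction of non-targets wrongly flagged
    """
    targets = population * target_rate
    non_targets = population - targets
    true_positives = targets * sensitivity
    false_positives = non_targets * false_positive_rate
    return true_positives, false_positives

# 100,000 people observed, 0.1% genuine targets, and a classifier that
# catches 99% of targets while mislabelling only 1% of non-targets:
tp, fp = screening_outcomes(100_000, 0.001, 0.99, 0.01)
# Roughly 99 targets are flagged -- alongside roughly 999 civilians,
# so about ten false alarms for every genuine detection.
print(tp, fp)
```

Under these assumed numbers, a seemingly impressive 99%-accurate system still produces flags that are about 90% civilians, which is why "too many false positives" can coexist with high headline accuracy.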

    To make these systems effective, you have to acquire masses of data, including biometric information, voice calls, e-mails and details of physical movements. That’s a concern if this is done without the consent of those involved. The more you want to do, the more data you need. This creates an incentive to collect data more intrusively.

    Who is legally and ethically responsible when autonomous weapons kill?

    I think it is likely that some sovereign states will in future deploy weapons that are capable of making decisions to kill. The question is whether countries wish to regulate such systems. Effective legal frameworks require ways of attributing responsibility for violations. It can become difficult with complex autonomous weapons systems to identify the individuals responsible for failures and violations.

    The operators of future systems might not be adequately trained in when to ignore a system’s recommendations. They might also develop automation bias, making them unwilling to question a technologically advanced machine. A system could be systematically biased in how it acquires targets — in which case, responsibility would lie somewhere between the developer and the military officials who authorize its use.

    There is a risk that accountability becomes so diffuse that it’s hard to identify the individuals or groups of agents responsible for violations and failures. This is a common problem with complex modern technologies, and I think the answer lies in the adoption of regulatory frameworks for the development and use of autonomous weapons systems.

    What do you say to those who call for a ban on autonomous weapons?
