How universal limitations can be implemented for autonomous lethal weapons


The potential for harm from AI deliberately designed for lethal combat is a pressing issue. The United States and other countries are striving to build military artificial intelligence, such as autonomous drones and weapons, that enhances capabilities in war zones while injuring or killing fewer of their own soldiers. For the United States, this would be a natural extension of the current drone program, which has seen many battlefield accidents, such as the killing of non-combatants in Iraq (Hussain). The Pentagon says it does not intend to remove people from the decision-making process that authorizes the use of lethal force. Still, AI systems have proven to outperform humans in some decision-making tasks. This has raised concerns that global arms competition will escalate into a race toward fully autonomous weapons, which may not be capable of sound decisions (Hussain). As a result, there have been persistent calls to regulate the use of AI in autonomous weapon systems such as unmanned aircraft systems (UAS).

The United Nations is examining how universal limitations can be implemented for lethal autonomous weapons. In 2015, a large number of AI researchers proposed banning autonomous weapons that do not require human control, because such weapons are well suited to destabilizing countries, carrying out assassinations, and oppressing populations, particularly ethnic groups (Berkeley Engineering). These calamities could be carried out by AI systems that already exist, or that will exist sooner or later, where features such as facial recognition make it easier for a drone to hunt down and kill a particular target. Worse, an algorithm loaded onto a drone could select targets based on their skin colour.

One notable aspect of protecting military and national infrastructure is that machines, not humans, will make essential decisions in the world of national security. AI created by humans will affect the administration, operation, and development of military power. It goes beyond swarms of autonomous drones that better target enemies; it offers military commanders new, tested alternatives in combat situations. Although the Department of Defense has promised that a person will always make the final decision to kill another person, there are real questions about what that promise means if AI can improve weapon performance to the point where weapons independently identify targets and determine alternative courses of action to achieve mission objectives based on parameters fed in real time (Ilachinski 11).

The ability to develop artificial thinking will affect each of the three phases of national strategy development: identification, decision-making, and evaluation. This is likely to provide both benefits and drawbacks. After locating and examining large datasets, policymakers will have more detail than ever in their planning briefs on a wide range of topics, from security postures to troop movements and enemy military reconnaissance (Horowitz 41).

Military and security organizations can use artificial intelligence as a targeting system. Artificial intelligence can identify and flag new threats and provide effective countermeasures to mitigate them (Soffar). It can neutralize long-distance threats in a combat zone and provide methods for countering improvised explosive devices (IEDs). For example, drones equipped with AI can scan a particular area to detect threats to infrastructure such as dams. One perceived threat is that terrorists can use small remote-controlled drones as bombs by attaching small explosives and flying them over a military facility (Soffar). Such drones can also be operated autonomously by an AI system. However, military drones fitted with AI technologies can detect such drones and engage signal-jamming capabilities to disable the threat (Soffar).
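The detect-classify-respond loop described above can be sketched in a few lines of code. This is a minimal illustration only: the detection labels, confidence threshold, and selection function below are hypothetical placeholders, not part of any real defense system or API described by the sources.

```python
from dataclasses import dataclass

# Hypothetical sketch of the detect -> classify -> counter loop described
# above. Labels, thresholds, and field names are illustrative assumptions.

@dataclass
class Detection:
    object_id: str
    label: str         # e.g. "bird", "small_drone"
    confidence: float  # classifier confidence in [0, 1]

THREAT_LABELS = {"small_drone"}  # labels treated as potential threats
CONFIDENCE_THRESHOLD = 0.8       # require high confidence before acting

def select_countermeasure_targets(detections):
    """Return ids of detections confident enough to warrant jamming."""
    return [
        d.object_id
        for d in detections
        if d.label in THREAT_LABELS and d.confidence >= CONFIDENCE_THRESHOLD
    ]

detections = [
    Detection("a1", "bird", 0.95),
    Detection("a2", "small_drone", 0.91),
    Detection("a3", "small_drone", 0.55),  # too uncertain to act on
]
print(select_countermeasure_targets(detections))  # prints ['a2']
```

The key design point the paragraph implies is the confidence gate: a countermeasure is engaged only when the classifier is sufficiently sure, which mirrors the broader policy concern about keeping uncertain decisions away from autonomous action.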
