CyberEd Pro

Cyberwarfare / Nation-State Attacks , Fraud Management & Cybercrime , Training & Security Leadership

Has the US Created the Wrong War Machine?

We Need Low-Cost, High-Volume Weapons Systems to Prevail in Future Conflicts
Has the US Created the Wrong War Machine?

Over half a century ago, during the Vietnam War, the United States debuted precision-guided weapons in their contemporary guise. Since then, military forces have pursued ever-greater accuracy and power, and the cost of such armaments has gone up. GPS-guided artillery shells now fetch a price of $100,000 each in the U.S.

See Also: How to Unlock the Power of Zero Trust Network Access Through a Life Cycle Approach

The high cost of these advanced weapons has limited their availability, leading to shortages, such as those experienced by European countries during the conflict in Libya in 2011. Conversely, Israel has opted to use less sophisticated, unguided bombs over Gaza, prioritizing the conservation of its precision munitions over the minimization of unintended damage.

Ukraine

This longstanding dilemma is currently being addressed in Ukraine, where the introduction of first-person view drones is transforming warfare. These drones, modified from consumer models and equipped with explosives, are inexpensive yet capable of causing significant damage by infiltrating enemy lines and targeting tanks and bunkers with precision. The use of such drones is making warfare more perilous for soldiers, and it signifies a major shift toward using more accessible, cost-effective technology in combat.

FPV drones have made thousands of confirmed strikes and led to the formation of specialized drone units, underscoring a trend toward miniaturization and affordability in military hardware. This trend is further illustrated by Ukraine's ambitious plan to manufacture up to 2 million drones, demonstrating a shift away from traditional, more expensive munitions.

Ethics

These developments bring challenges, including ethical considerations and the potential for rapid dissemination among nonstate actors, such as militias and terrorists. The democratization of precision weaponry, as seen in various conflicts around the world, poses a significant threat to both military and civilian targets.

The technological advancements driving this shift, particularly in consumer electronics, are accelerating the pace of innovation and the adoption of autonomy in warfare. This raises questions about the future of combat, including the potential for fully autonomous drones capable of operating independently of human control.

Adapt, Migrate or Perish

As military strategies evolve to incorporate these new technologies, there is a pressing need for defense planning to adapt accordingly. Developing low-cost, high-volume weapons systems, such as drones, is crucial if we want to maintain a competitive edge in future conflicts. This approach requires us to invest in new technologies and develop defenses capable of countering the widespread use of drones in both wartime and peacetime scenarios.

The emergence of intelligent drones and the possibility of autonomous swarms of drones represent a paradigm shift in how warfare is conducted and challenges the traditional notions of battlefield control and human oversight.

Blurred Lines

As the line between human and machine decision-making in combat continues to blur, the implications for military strategy and international security are profound. Has the U.S. created the wrong war machine by developing advanced military versions of prior models, betting heavily on bigger, faster, more powerful versions of our ships, fighter jets, tanks and ordinance, while our enemies have gone in the opposite direction, relying on computers and disposable, weaponized delivery systems?

Autonomous drones and precision-guided munitions are dirt cheap. A weaponized smart drone costs the Iranians $2,000, and they are supplying hundreds of thousands to the Houthis, according to The Wall Street Journal.

The Economist says guided missiles technology is one of the most expensive sectors in smart weaponry. A single medium- to long-range subsonic Tomahawk cruise missile costs roughly $1.5 million, and 50kg air-to-ground Hellfire rockets cost $115,000 each.

The issue isn't whether we should or shouldn't be building huge atomic subs. It's whether we can use expensive hardware to fight off the relatively inexpensive incoming without bankrupting ourselves in the process.

This debate is multifaceted, considering the implications these technologies have for warfare, global security and the future of humanity. Here are several key points to consider:

  • Ethical concerns: The development of autonomous weapons systems raises significant ethical questions, particularly regarding accountability, decision-making in combat and the potential loss of human oversight in life-and-death situations. There's a growing concern about the moral implications of delegating lethal decisions to machines, especially in scenarios where distinguishing between combatants and non-combatants is challenging.
  • Strategic implications: From a strategic standpoint, advanced military technologies offer the potential for more precise and effective operations, reducing the risk to human soldiers and potentially minimizing collateral damage. But they also lower the threshold for engaging in conflict, as the perceived costs and risks of deployment may be reduced when human lives are less directly at risk on one's own side.
  • Security dilemma: The proliferation of advanced military technologies can exacerbate the security dilemma, where actions by one state to enhance its security - such as developing autonomous drones - prompt others to respond in kind. This can lead to an arms race that ultimately makes all parties less secure, particularly as these technologies spread to nonstate actors and potentially malicious users.
  • International law and norms: Current international laws and norms were established in a pre-digital era and do not adequately address the challenges posed by autonomous weapons and cyber warfare. The international community needs to develop new frameworks to regulate the use and proliferation of these technologies and to ensure that they are used responsibly and in accordance with humanitarian law.
  • Technological determinism vs. human agency: There's a debate about whether technological advancements inevitably dictate the direction of military strategy and international relations - technological determinism - or if humans have agency in shaping how technology is developed and employed - constructivism. The choices that policymakers, military leaders and the scientific community make will play a crucial role in determining the path forward. And technology is advancing very swiftly.
  • Existential risks: Advanced military technologies, especially those that incorporate artificial intelligence, pose existential risks if they malfunction, are misused or lead to unintended escalation in conflicts. The potential for autonomous systems to act in unpredictable ways or to be hijacked by malicious actors adds a layer of risk that could have catastrophic consequences.

Developing and deploying advanced military technologies involves balancing the desire to improve national security with the need to navigate the ethical, strategic and existential challenges these technologies present. Should they also be used to bolster a country's battle agenda and mission statements? We fight wars to defend our territories and ways of life.

If we believe that a constitutional republic is the only form of a free world model, then we will fight and die for that belief and to defend the version we embrace.

The real danger here is that the LLM or instruction set upon which the weapons are trained and guided accommodates biases and judgments about identity, race, political ideologies and objectives and may easily be written in ways that "forget" critical scenario possibilities while in conflict.

One example of a real-life use case where we literally shot off our own foot is an Air Force drone strike exercise last year in which a drone was trained with a specific mission statement: to take out several high-value enemy targets. After demonstrating it could easily accomplish its mission, the mission commander called off the drone. But "calling off the drone" was not included in the drone's possibilities. The drone rightfully interpreted this intervention as a barrier to accomplishing its mission and took out the entire control tower and command center before continuing to complete its mission.

The question of whether the U.S. has created the wrong war machine is ultimately a reflection of broader societal values and priorities. It highlights the need for ongoing dialogue, regulation and oversight to ensure that advancements in military technology serve humanity's long-term interests and security.

And it must do this at Mach speed, for we are now engaged as a proxy in three conflicts, and a new line is forming.

Can we act quickly enough? And are we smart enough?



About the Author




Around the Network

Our website uses cookies. Cookies enable us to provide the best experience possible and help us understand how visitors use our website. By browsing inforisktoday.asia, you agree to our use of cookies.