International law and its approach to combat drones

One of the basic considerations in determining whether an action is a crime or merely a consequence of an armed conflict is whether that armed conflict existed in the first place. Countries at war follow different strategies, but all of them are bound by limitations intended to stop states and governments from “crossing the line”. The Rome Statute is the most visible example of what is and is not allowed in time of war. Specifically, its Article 8 sets out the different acts that are considered war crimes and punished when committed. Serious violations of international law and conventions generally fall under the concept of war crimes, in particular when committed as part of a plan or policy or as part of a large-scale commission of such crimes.

The Rome Statute is a comprehensive text setting out the content of international criminal law. But, like society itself, crime evolves. New ways of committing offences, new weapons, new strategies, and people looking for gaps the law has never addressed in order to commit supposed crimes are a constant, all the more so in the era of information and communication technology.

The use of drones has recently become a topic in international law. Drones can be used for entertainment, for humanitarian purposes, for research, or even for war. It is easy to turn a drone into a weapon by making the machine carry an explosive. The biggest difference between these drones, or UCAVs (Unmanned Combat Aerial Vehicles), and other means of warfare is the independence with which they act.

A drone is flown by a remote pilot, but different types of drones can run different programs, to the point of flying and acting almost on their own through the use of AI. That means a drone can range from an unmanned vehicle to a programmed missile; it can take many shapes and, for that reason, it can be dangerous.

As early as 2013, Peter Maurer, president of the International Committee of the Red Cross, spoke about drones and explained the importance of distinguishing them from other weapons. Although UCAVs are not specifically addressed by international humanitarian law, their use is subject to international law. As with any technology, their existence or use does not necessarily imply harm to someone else. However, that is precisely where the debate starts, with combat operations in places such as Afghanistan, Gaza or Yemen. Some argue that the precision with which drones act has caused less destruction and fewer civilian casualties, but their remote operation has also led to erroneous attacks and civilian deaths on many occasions (Maurer, 2013).

Using drones is not illegal in itself, but arming them turns the vehicle into a weapon, which is why international law still applies. One of the main issues is then establishing responsibility for a crime committed with a drone.

Remotely controlled drones are under the constant, if indirect, control of an operator. Although that operator is not physically present, he or she identifies the target, fires the weapon, and sits in a chain of responsibility equal to that of anyone actually fighting on the battlefield with other types of weaponry. “Drone operators are thus no different than the pilots of manned aircraft such as helicopters or other combat aircraft as far as their obligation to comply with international humanitarian law is concerned, and they are no different as far as being targetable under the rules of international humanitarian law.” (Maurer, 2013)

In order to judge the use of a drone as a weapon, there are different factors to take into account: the operator or AI that was in control, the program used in the drone, casualties, damage, morality, and so on.

At the 31st International Conference of the Red Cross and Red Crescent, in 2011, the participating organisations drafted a 53-page report considering the new challenges international law faces in judging contemporary conflicts. In Chapter V, Means and methods of warfare, they deal specifically with drones and AI.

  • Drone usage: Drones are defined as remote-controlled weapons usable from a safe distance by the parties to an armed conflict. Remote-controlled weapon systems are introduced as a new category that allows combatants to be “physically absent from a zone of operations”. This can help reduce civilian casualties and damage to objects by enabling precise, coordinated attacks. Even so, remote-controlled weapons also increase the opportunities to attack the adversary and thus constantly put the population at risk of incidental harm. Drones in particular have great real-time surveillance capacity and have expanded the range of feasible precautionary measures. On the other hand, they carry specific emotional and psychological risks for the operator.
  • AI-controlled weaponry: These are not what the report calls “automated weapon systems” but “autonomous weapon systems”. Automated systems work under defined, pre-programmed conditions and act once those conditions are met; an example would be a sentry gun that fires automatically when it detects movement. Autonomous systems, by contrast, can learn and adapt to the circumstances and environment they encounter. The Red Cross and Red Crescent describe a system with artificial intelligence as one that “would have been capable of implementing IHL”, something that for now does not exist. Such systems would change the paradigm of the conduct of hostilities and raise many moral, ethical, legal and societal questions. It would essentially be an autonomous system able to distinguish between civilians and combatants and to behave ethically on a battlefield. Again, many moral issues arise, the most obvious being whether we can delegate life-and-death decisions to an artificial intelligence.

The Conference report explains how, in the latter case, the person responsible for a crime would be difficult to determine: it would not be clear who is responsible for a war crime committed by an artificial intelligence. Serious violations of IHL and the various war crimes are generally attributed to a human being, so what would happen if an AI committed such crimes? IHL cannot currently resolve the circumstances of such AI use, since the programmer, the manufacturer and the commander who deployed the drone could all fall within the definition without necessarily being responsible. The International Committee of the Red Cross argues that new technologies do not change the existing law; rather, they must respect it.

In that sense, I would like to consider the war crimes listed in the Rome Statute and which of them could be committed with drones. Given the number of crimes the Rome Statute recognises, I will focus mainly on those directly related to the grave breaches of the Geneva Conventions:

  • Wilful killing: It is likely that both an AI and an operator could carry out a killing. If an AI were properly equipped with the ethical safeguards the ICRC envisages, it would not commit any of these crimes, so only currently existing AI is considered here.
  • Torture or inhuman treatment, including biological experiments: I consider it difficult for current AI, as for an operator with a drone, to carry out torture or experiments. It must be mentioned, however, that in his interview the ICRC president, Peter Maurer, noted that drones hovering in the sky can cause serious psychological harm to the populations below, although it is not clear what levels of stress and psychological harm they cause.
  • Wilfully causing great suffering, or serious injury to body or health: Unmanned vehicles can without doubt be used to cause suffering and harm from a safe distance. Some argue that this distance, and the absence of any feeling of a “battlefield”, makes it easier to carry out certain actions with less remorse. And if an AI is programmed with a course of action that inflicts extreme pain on the civilian population through weapon use, for example, it will follow that pattern.
  • Extensive destruction and appropriation of property, not justified by military necessity and carried out unlawfully and wantonly: UCAVs are not limited to precise operations. They can carry powerful weaponry and explosives and cause severe damage to civilian property and structures, whether operated remotely or controlled by AI.
  • Compelling a prisoner of war or other protected person to serve in the forces of a hostile Power: Drones and their operators are highly unlikely to commit this type of crime, since the lack of human presence makes negotiation generally impossible; they cannot act directly on a person or force them to change sides.
  • Wilfully depriving a prisoner of war or other protected person of the rights of fair and regular trial: This type of war crime tends to be reserved to higher positions in the chain of command, so operators would not be included, and it is impossible for current AI to commit this type of offence.
  • Unlawful deportation or transfer or unlawful confinement: As in the previous point, it is unlikely or impossible for a drone or its operator to commit this type of crime directly. That said, their presence can be used to enforce a deportation, whether by escorting those deported or by threatening hostile action.
  • Taking of hostages: Again, as in the previous points, it is difficult, though not impossible, for an unmanned vehicle to take hostages. Under certain circumstances drones could participate in a hostage-taking situation, for instance by guarding a prisoner held as a hostage, but it is unlikely that a drone itself could seize a person.

As shown, not all of these crimes are applicable to drones, but being a weapon of new warfare does not place drones beyond the reach of international humanitarian law. Even so, the ICRC notes that “current norms do not sufficiently regulate some of the challenges posed and might need to be elaborated. For the ICRC, it is important to ensure informed discussion of the issues involved, to call attention to the necessity of assessing the potential humanitarian impact and IHL implications of new and developing technologies and to ensure that they are not employed prematurely under conditions in which respect for IHL cannot be guaranteed.” (ICRC, 2011)

One of the biggest challenges the international community faces in regulating drones lies in the legal field, but moral questions are often involved as well:

  • Operators: The operator of a drone does not necessarily have to be a combatant trained for war operations; the operator is simply a person able to give a series of instructions to a machine that will not inflict direct harm on them. In a sense this is positive, since operators are kept away from risk, although they pose many dangers to other people. But can an operator be someone without military training? The US has been using drones in different operations for years, and it has been reported that the army contracted civilians to pilot UCAVs. By 2015 there were reports that the CIA’s Predator-type drones were being operated by gamers, people who had spent hours memorising videogame controls and who could feel even more comfortable at a drone’s controls than an actual pilot. In 2016 there were also reports of the US Air Force turning to external contractors to fly drones because of increasing demand. This brings up the question of morality again. Drones instantly turn a civilian into a combatant and a target of the opposing party. It is also argued that not being present where the combat takes place spares the pilot or operator physical and psychological traumas that only the battlefield itself could cause, and that the inner conflict over killing is reduced by that geographic distance. But a different danger then falls on the civilian, since the mere act of working as an operator makes them a potential enemy: “The act of associating with alleged militants is apparently crucial to the administration’s selection of many “signature” strikes, i.e., attacks on people whose identities are not known but who are deemed to be combatants by virtue of their behavior.” (Roth, 2013). Even so, it must be considered that, for now, these civilian pilots do not have permission to “fire”; in the Air Force they are limited to surveillance and control operations.

UCAVs directly affect a country’s military capabilities. They can be used to suppress the enemy, support operations, eliminate specific targets, and perform surveillance or defensive missions. These vehicles offer an enormous degree of customisation, limited only by human capacity, which is constantly increasing. Militaries have already begun to consider drone warfare the next step in armed conflict. UCAVs will in the future be used for low-intensity fighting, counter-insurgency and surveillance, but because of the state of the technology and the different legal and moral considerations, it is unlikely that this change will happen any time soon. UCAVs can be used for the most dangerous or precision-based tasks, and the future of their use will depend mostly on how the technology advances.

For that reason, knowing that these advances could come at any moment, the ICRC is an example of prevention. The Committee has been working on the issue of drones since at least 2011, with their mention in the Report and the accompanying discussion of legal, social and ethical issues. Drones have the potential to change the way war is waged, bringing either more destruction or even a form of warfare based solely on drone-versus-drone operations. Policymakers will need to review international law in order to control this development and prevent it from getting out of hand. What worries the ICRC, for example, are the implications for civilians: there have been reported cases of civilian populations in Pakistan harmed by the CIA’s drone programmes, but, as with any weapon, each case needs to be studied in depth.

In conclusion, drones are a new means of contemporary warfare that raise legal questions because of their recent appearance, the possibility that they are not fully covered by IHL, and the difficulty of judging the chain of command for AI-controlled vehicles. They also pose specific ethical problems for the international community because of the possible harm to civilians and the intention of delegating to an AI the question “What is morally acceptable in this situation?” Only future advances in technology and law will show the extent of the issue.



References

Broersma, M. (2015). US Military Recruits Gamers To Fly Killer Drones. Silicon UK. Available at: http://www.silicon.co.uk/e-innovation/military-gamers-drones-160784

Callam, A. (2015). Drone wars: Armed unmanned aerial vehicles. International Affairs Review, 18.

International Committee of the Red Cross (2011). International Humanitarian Law and the Challenges of Contemporary Armed Conflicts – ICRC Report. 31st International Conference of the Red Cross and Red Crescent, Geneva, Switzerland, 28 November – 1 December 2011.

International Committee of the Red Cross (2013). The use of armed drones must comply with laws. Interview with Peter Maurer.

International Committee of the Red Cross (2014). Ensuring the use of drones in accordance with international law.

International Criminal Court (2011). Rome Statute of the International Criminal Court. Available at: https://www.icc-cpi.int/NR/rdonlyres/ADD16852-AEE9-4757-ABE7-9CDC7CF02886/283503/RomeStatutEng1.pdf

Roth, K. (2013). What Rules Should Govern US Drone Attacks? The New York Review of Books. Available at: http://www.nybooks.com/articles/2013/04/04/what-rules-should-govern-us-drone-attacks/

Schmidt, M. (2016). Air Force, Running Low on Drone Pilots, Turns to Contractors in Terror Fight. The New York Times. Available at: https://www.nytimes.com/2016/09/06/us/air-force-drones-terrorism-isis.html?_r=0