Robots Be Ethical

Popular science fiction has often depicted scenarios in which robots turn destructive and dominate their creators, and these portrayals may have shaped our perception of robot technologies. While we may believe that robots cannot think ethically or accomplish tasks meant for humans, advances in technology and in our thinking may prove us wrong. Today, robot applications range from elderly care and driverless cars to critical surgical operations and combat on the battlefield. Developments in robot ethics, AI, and programming techniques could in fact help us design robots that simulate intrinsic ethics and act in desirable ways. While autonomous robots can expand what is possible, humans should retain final authority over, and accountability for, their actions; otherwise, mishaps can quickly escalate. Ultimately, it is up to us to weigh the potential applications and shortcomings of robots and to reconsider our perception of their ethical use in various settings.

Robots can be programmed to behave ethically. Unlike humans, competent robots would not let emotions influence their decisions or favor one individual over another; this would let machines make more unbiased and ethical decisions (Anderson & Anderson, 2010). While programming robots to function ethically in every known situation is a challenging task, their potential to augment human lives can outweigh the case for prohibiting their use in situations where their presence might be debatable (Anderson & Anderson). Researchers have proposed ways, using artificial intelligence and algorithmic techniques, in which machines can be programmed to evaluate new and unfamiliar scenarios, correlate them with human behavior, and then make logical and ethical decisions. These decisions tend to reflect the dominance of one duty over another: benefit over detriment (Anderson & Anderson). Anderson & Anderson strongly imply that if machine software can incorporate learning from varied human situations and responses, robots can be programmed to adopt ethical standards similar to those of humans. As evidence of this learning phenomenon, there are drones and robo-pets that mimic nature in order to feel real and to simulate the kind of learning natural creatures are capable of (Davis, 2014). Part of the Locomorph project, which uses biomimicry to improve the efficiency of robot locomotion, analyzes the adaptability of disabled dogs (Shaw, 2010). The aim is to apply this adaptability when programming robots, which would further improve their ability to cope with dilemmas in their work spaces (Shaw, 2010).

While robots can be programmed to simulate ethical principles, restraining their autonomy can also help achieve useful applications for robots while addressing questions about their ethical implications. Autonomous robots have entered several settings of our everyday lives and can expand the potential applications of robot technology (Anderson & Anderson). On the other hand, the use of autonomous machines in specific settings, such as the military, has fueled several debates (Farmer, 2014). Angela Kane, the UN High Representative for Disarmament Affairs, opposes the use of autonomous robots on battlefields because of the lack of human control and of clear accountability in case of disaster. But Huw Williams, Unmanned Systems Editor at IHS Jane's International Defence Review, argues that current robots do not possess real autonomy and remain under human direction (Farmer, 2014). Anderson & Anderson likewise discuss keeping a balance between the independence of robots and human command.

Above all, altering our perception of robots and robot ethics can help us evaluate their benefits and dangers and cultivate their ethical use. The majority of robots do not have to make life-threatening decisions, yet we feel they should act sympathetically and ethically before we accept them (Anderson & Anderson). When Google began acquiring robotics companies, people questioned its ambitions and raised concerns over its increasing power (Naughton, 2013). While most of us believe that machines are incapable of ever making ethical decisions (Anderson & Anderson), and that robots cannot take over routine tasks performed by humans, Google has essentially challenged these beliefs and made the case for robotic applications and their endless possibilities for enhancing human life (Naughton). The company is working on projects including automated cars, universal free high-speed internet access, and extending the human life span through improved healthcare technologies (Naughton). Beyond perception, potential users of robotic equipment do not always fully evaluate a machine's implications before using it (Rabin, 2013). The da Vinci system, made and heavily marketed by Intuitive Surgical Inc., is increasingly used in surgeries. Rabin describes a growing number of mishaps involving this equipment, the majority of which go unreported. He strongly implies that doctors who use such robotic equipment need to be trained, that its risks and benefits need to be evaluated, and that malfunctions need to be reported.

Developments in robotic technologies can benefit humankind if robots are used ethically. Programming robots effectively by integrating ethical principles and the ability to learn, restraining their autonomy, and finally changing how we think about and use robot technologies can lead to the acceptance of robots in various settings.

Shyamal Anadkat
Shyamal Anadkat (a.k.a. Zostale) is a student in Computer Engineering at the University of Wisconsin-Madison, USA. Innovation, creativity, and the art of critical thinking are a few of the ingredients that craft his way of life. For the rest of his day, he enjoys working out at the gym, playing lawn tennis, mixing music, and penning his thoughts even at 3 in the morning. His lifetime motto is 'to learn, progress, and serve'.