Ethical Decision Making in Robots: Autonomy, Trust and Responsibility

dc.contributor.author: Alaieri, Fahad
dc.contributor.author: Vellino, Andre
dc.date.accessioned: 2016-09-12T21:10:38Z
dc.date.available: 2016-09-12T21:10:38Z
dc.date.issued: 2016-11-01
dc.identifier.citation: Eighth International Conference on Social Robotics 2016
dc.identifier.uri: http://hdl.handle.net/10393/35163
dc.description.abstract: Autonomous robots such as self-driving cars are already able to make decisions that have ethical consequences. As such machines make increasingly complex and important decisions, we will need to know that their decisions are trustworthy and ethically justified. Hence we will need them to be able to explain the reasons for these decisions: ethical decision-making requires that decisions be explainable with reasons. We argue that for people to trust autonomous robots, we need to know which ethical principles they are applying and that their application is deterministic and predictable. If a robot is a self-improving, self-learning robot whose choices and decisions are based on past experience, the decision it makes in any given situation may not be entirely predictable ahead of time or explainable after the fact. This combination of non-predictability and autonomy may confer a greater degree of responsibility on the machine, but it also makes the machine harder to trust.
dc.language.iso: en
dc.subject: robot ethics
dc.subject: autonomy
dc.subject: trust
dc.subject: responsibility
dc.title: Ethical Decision Making in Robots: Autonomy, Trust and Responsibility
dc.type: Conference Proceeding
Collection: Sciences de l'information - Publications // Information Studies - Publications

Files
Robots-Paper-Final.pdf (163.29 kB, Adobe PDF)