Can Robots Be Responsible: A Bumper Theory Approach to Robot Moral Conditioning

ORCiD

Dongbin Lee: 0000-0002-5307-0374

Document Type

Conference Proceeding

Department

Electrical and Computer Engineering

Conference Title

Proceedings - 2023 17th IEEE International Conference on Robotic Computing, IRC 2023

Organization

IEEE

Location

Laguna Hills, CA

Conference Dates

December 11-13, 2023

Date of Presentation

3-25-2024

Abstract

This paper examines an unresolved theoretical controversy: the responsibility gap that arises when responsibility meets ambiguity in autonomous systems. Our inquiry identifies what is missing from this debate and aims to open a dialogue toward resolving it. We propose a moral framework and plan a series of experiments that, when applied, may begin to close the responsibility gap and deepen our understanding of moral cognition. The controversy surrounding morality in autonomous robots and vehicles, synonymous with the responsibility gap problem, is investigated through a bumper theory that relates the severity of a casualty to a level of immorality, governed by a moral rulebook. The proposed experiment uses moral conditioning to drive the robot's behavior toward collision avoidance and to update the robot's level of morality. Our method also supports the argument that robots can be prescribed a moral rulebook that gives them human-like moral cognition. As an experiment to validate that robots can be morally conditioned, robots are developed to detect humans, cars, and non-human objects using machine learning: a fast single-stage YOLO detector is used for human detection, and a multitask convolutional neural network (MTCNN) is used for detecting human faces.
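The last two sentences of the abstract outline the perception pipeline. Below is a minimal, illustrative Python sketch of such a pipeline, assuming the ultralytics YOLO package for person/car detection and the facenet-pytorch MTCNN implementation for face detection; the paper does not name specific libraries, model weights, or thresholds, so all of those choices are assumptions, and classify_scene is a hypothetical helper.

# A minimal sketch of the abstract's detection pipeline, assuming the
# ultralytics YOLO package and the facenet-pytorch MTCNN implementation.
# Library choices, weights, and thresholds are illustrative assumptions.
from ultralytics import YOLO
from facenet_pytorch import MTCNN
from PIL import Image

# Hypothetical pretrained weights; the paper does not specify a YOLO variant.
detector = YOLO("yolov8n.pt")          # fast single-stage detector (COCO classes)
face_detector = MTCNN(keep_all=True)   # multitask CNN for face detection

def classify_scene(image_path: str) -> dict:
    """Count humans, cars, and human faces in a single frame."""
    image = Image.open(image_path).convert("RGB")

    # YOLO pass: map detected class indices to names ("person", "car", ...).
    result = detector(image)[0]
    labels = [result.names[int(c)] for c in result.boxes.cls]
    humans = labels.count("person")
    cars = labels.count("car")

    # MTCNN pass: detect() returns bounding boxes (or None) and confidences.
    boxes, _probs = face_detector.detect(image)
    faces = 0 if boxes is None else len(boxes)

    return {"humans": humans, "cars": cars, "faces": faces}

if __name__ == "__main__":
    print(classify_scene("frame.jpg"))  # e.g. {'humans': 2, 'cars': 1, 'faces': 2}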

Publisher

IEEE

First Page

284

Last Page

287

DOI

10.1109/IRC59093.2023.00053
