
     The Moral Machine is a tool created by the MIT Media Lab to gauge the public's opinions on important moral questions involving robots. MIT built its moral machine to help figure out what a self-driving car should do when it finds itself in a life-or-death dilemma.

     I have created a moral machine to find out the public's opinions on the conflict between Isaac Asimov's First and Second Laws of Robotics and human free will. This is an increasingly important ethical question that will guide the design of robots, which are rapidly moving from industrial applications into people's personal lives. I will use statistical analysis to determine whether our opinions depend on factors such as age, race, gender, country of residence, marital status, or whether or not we have children.
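One common way to test whether opinions depend on a demographic factor is a chi-squared test of independence. The sketch below is purely illustrative and is not taken from the survey's actual analysis: the counts are invented, and the grouping ("has children" vs. "no children" against two hypothetical answer choices) is an assumption for the example.

```python
# Hypothetical sketch: does a survey answer depend on a demographic factor?
# All counts below are invented for illustration only.
from scipy.stats import chi2_contingency

# Rows: respondents with / without children.
# Columns: chose "robot obeys" vs. "robot refuses" in one dilemma.
observed = [[30, 20],
            [10, 40]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
if p < 0.05:
    print("Opinion appears to depend on this factor.")
```

With real survey data, a separate table (and test) would be built for each factor, and the p-values would indicate which factors are associated with people's answers.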

     The First Law of Robotics states: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." The Second Law states: "A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law."
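The two laws together form a simple decision rule: obey, unless obeying would harm a human. As a toy sketch (not from the source), this rule can be written as a one-line function, where `order_causes_harm` stands in for the hard part, actually judging whether an order would harm someone:

```python
# Toy encoding of the first two Laws of Robotics as a decision rule.
# "order_causes_harm" is a hypothetical predicate; deciding its value
# is exactly where the ambiguities discussed below arise.
def robot_should_obey(order_causes_harm: bool) -> bool:
    # Second Law: obey human orders...
    # ...except where obeying would conflict with the First Law.
    return not order_causes_harm

print(robot_should_obey(False))  # a harmless order is obeyed
print(robot_should_obey(True))   # a harmful order is refused
```

The code makes the rule look clean, but the whole difficulty lives inside that boolean: whether an order "causes harm" is precisely what reasonable people disagree about.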

     At first the laws seem straightforward, but the more you think about them, the more ambiguities you find. In fact, almost all laws contain ambiguities. That is why we have court systems: to interpret the laws.

 

     Now you can be the judge, and help develop the future court system that is sure to come one day, by clicking the button below. The survey takes less than 10 minutes. You can go back to earlier questions, but you can only take the survey once per device. You don't have to finish the entire survey; you can quit at any time if you feel uncomfortable.

What Should the Robot do? You Decide!

***CHILDREN UNDER THE AGE OF 18 SHOULD NOT TAKE THIS SURVEY***

At the risk of giving away the clue:

Wrench, Screwdriver, Hammer, Nail, Saw
