Imagine a situation in the not-too-distant future in which an autonomous robot, such as a factory manufacturing robot or a self-driving car, accidentally causes serious injury or even death to a human through some unfortunate combination of circumstances. In such a case, who should be held responsible for the harm and damages? Can the developer or team of developers who wrote the code justifiably be held liable? Or is the incident simply treated as a mechanical malfunction, with nobody held responsible and only the machine destroyed?