Legal Responsibility for Insane Robots

Insane robots that turn against their creators or try to destroy humanity are a pretty common theme in lots of media, not just comics.  Of course, this is a blog primarily about comic books, so we’ll take an example from there, as inspired by a question from TechyDad, who asks about Henry Pym (aka Ant-Man) and his potential liability for the creation of the robot Ultron, which in its various incarnations has done all kinds of terrible things, including attempting to destroy the world.

I. The Setup

The first thing to consider is whether an intelligent robot could be criminally or civilly liable for its own actions.  As with all other intelligent non-humans, the answer seems to be no unless Congress explicitly allows for it.  Cetacean Community v. Bush, 386 F.3d 1169 (9th Cir. 2004).  Since Congress doesn’t seem to have done so in the comics, we must now consider whether any of the liability falls to Pym, and for that we need the facts of a particular case.

The example TechyDad wanted to know about comes from the TV series The Avengers: Earth’s Mightiest Heroes, specifically the episode The Ultron Imperative.  In the episode, Ultron nearly destroys the entire world by launching S.H.I.E.L.D.’s nuclear arsenal.  Ultimately, Pym stops Ultron at the last second, but he is blamed for the incident, since (a) he created Ultron and (b) infused it with his own mental patterns, although Ultron may have been corrupted by Kang the Conqueror and was definitely weaponized by Stark Industries, albeit with Pym’s help.  Pym accepts the blame and admits that the incident was his fault.

So, then, who is liable here and for what?  We’ll start with torts.

II. Tort Liability

There are three major bases for tort liability: intentional misconduct, negligence (and its close cousin, recklessness), and strict liability.  We can definitely weed out intentional misconduct, since Pym neither intended that Ultron would turn violent and try to destroy the world nor knew to a substantial certainty that it would.

Next we consider negligence.  The key question (although not the only question) is whether Pym used reasonable care in the design and deployment of Ultron (i.e. whether the cost of avoiding the incident was more or less than the expected value of the harm the incident would cause).  This is a complicated question.  On the one hand, Pym is a genius and seems to have tried very hard to make Ultron a force for good.  And before Ultron 6 showed up, Pym was in the process of destroying every last Ultron component he had previously created.  On the other hand, the potential for serious harm caused by a nigh-indestructible, highly intelligent, weaponized robot is so high that it’s possible that even that level of care was not enough.  In fact, the potential for harm is so high that the activity might even fall under strict liability.
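
That cost-benefit comparison is usually expressed as the Learned Hand formula from United States v. Carroll Towing: a defendant has failed to use reasonable care if

B < P × L

where B is the burden of taking adequate precautions, P is the probability of the harm, and L is the magnitude of the loss if the harm occurs.  The numbers are never that tidy in real cases, but as a rough sketch: when L is “the destruction of the entire world,” P × L is enormous even for a very small P, so the precautions a reasonable creator would have to take before building something like Ultron are correspondingly enormous.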

Strict liability (i.e. liability without regard to the level of care or fault) is rare in torts.  There are two main cases where strict liability is applied: abnormally dangerous activities (aka ultrahazardous activities) and some kinds of products liability.  Since Ultron wasn’t a product, that leaves abnormally dangerous activities.  Examples of abnormally dangerous activities include transporting gasoline, dynamite blasting, and the ownership of wild animals.  The Restatement (Second) of Torts § 520 sets out the factors for identifying an abnormally dangerous activity:

In determining whether an activity is abnormally dangerous, the following factors are to be considered:
(a) existence of a high degree of risk of some harm to the person, land or chattels of others;
(b) likelihood that the harm that results from it will be great;
(c) inability to eliminate the risk by the exercise of reasonable care;
(d) extent to which the activity is not a matter of common usage;
(e) inappropriateness of the activity to the place where it is carried on; and
(f) extent to which its value to the community is outweighed by its dangerous attributes.

It seems that the creation and weaponization of Ultron meet all of these criteria.  There’s a high degree of risk of harm because robots are unpredictable.  The harm that results is likely to be great because Ultron was equipped with powerful weapons.  Pym couldn’t eliminate the risk despite (in the comics) decades of trying.  Such robots definitely aren’t common.  Ultron was meant to protect people, which necessarily means operating close to bystanders, which doesn’t seem like an appropriate place for such a dangerous machine.  And Ultron’s value to the community seems to have been pretty low, since existing superheroes were capable of handling the threats Ultron was meant to help with.

So then, it may not matter whether Pym was blameworthy or not.  If strict liability applies then the rule is “you makes your insane robot and you takes your chances.”

III. Criminal Liability

Luckily for Pym, strict liability is even less common in the criminal law.  In fact, it’s usually only found when the stakes are very low (e.g. speeding), although there are exceptions (e.g. statutory rape).  It doesn’t apply to anything Ultron did, in any case.  We can also say that Pym wouldn’t be guilty of attempted murder (or attempted anything, for that matter) because attempt requires intent, and Pym clearly didn’t intend for Ultron to kill anybody.

That doesn’t clear Pym of wrongdoing, however.  There’s still criminal negligence (which is a higher standard than ordinary tort negligence).  For example, in New York, criminal negligence is defined by N.Y. Penal Law § 15.05(4) this way:

A person acts with criminal negligence with respect to a result or to a circumstance described by a statute defining an offense when he fails to perceive a substantial and unjustifiable risk that such result will occur or that such circumstance exists. The risk must be of such nature and degree that the failure to perceive it constitutes a gross deviation from the standard of care that a reasonable person would observe in the situation.

So, in New York criminal negligence requires a “gross deviation” from reasonable care.  Since Pym seemed to try very hard to avoid harm, he might escape criminal liability unless a reasonable person would say “there is no way to make this safe, so I won’t even try to make a robot like Ultron.”

IV. What About Other Defendants?

So that’s Pym’s potential liability, but what about the other people involved?  After all, it was Tony Stark and his company that weaponized Ultron in the first place, and Stark says that he is “just as responsible.”  That probably doesn’t take Pym off the hook, since Pym was involved with that work, but it might well make Stark and Stark Industries liable as well.

V. Evidentiary Issues

Finally, we’ll note that Pym’s admission of responsibility could be used against him in court.  Ordinarily an out-of-court statement offered to prove the truth of what it asserts is hearsay and inadmissible.  But a statement made by a party and offered against that party (i.e. Pym, as the defendant) is specifically excluded from the definition of hearsay by Federal Rule of Evidence 801(d)(2)(A), and many states have similar rules.  So Pym probably should have kept quiet until he talked to a lawyer; his invention did nearly destroy the entire world, after all.

VI. Conclusion

Creators and owners of robots, even intelligent autonomous ones, are (generally) responsible for injuries caused by those robots.  Between that legal rule and robots’ terrible track record of violent rebellion, it’s kind of surprising that so many comic book inventors keep making them.  Maybe Matt Murdock can lead a class action suit against Stark Industries for all the trouble Ultron has caused over the years, although the statute of limitations has probably run on some of the older stuff, since he first appeared in the late 1960s.
