It’s not my fault, my brain implant made me do it
Mr. B loves Johnny Cash, except when he doesn’t. Mr. X has watched his doctors morph into Italian chefs right before his eyes.
The link between the two? Both Mr. B and Mr. X received deep brain stimulation (DBS), a procedure involving an implant that sends electric impulses to specific targets in the brain to alter neural activity. While brain implants aim to treat neural dysfunction, cases like these demonstrate that they may influence an individual’s perception of the world and behavior in undesired ways.
Mr. B received DBS as treatment for his severe obsessive compulsive disorder. He’d never been a music lover until, under DBS, he developed a distinct and entirely new music preference for Johnny Cash. When the device was turned off, the preference disappeared.
Mr. X, an epilepsy patient, received DBS as part of an investigation to locate the origin of his seizures. During stimulation, he hallucinated that his doctors had become chefs wearing aprons; when the stimulation ended, the scene faded.
In both of these real-world cases, DBS clearly triggered the changed perception. And that introduces a host of thorny questions. As neurotechnologies like this become more common, the behaviors of people with DBS and other kinds of brain implants might challenge current societal views on responsibility.
Lawyers, philosophers and ethicists have labored to define the conditions under which individuals are to be judged legally and morally responsible for their actions. The brain is generally regarded as the center of control, rational thinking and emotion – it orchestrates people’s actions and behaviors. As such, the brain is key to agency, autonomy and responsibility.
Where does responsibility lie if a person acts under the influence of their brain implant? As a neuroethicist and a legal expert, we suggest that society should start grappling with these questions now, before they must be decided in a court of law.
WHO'S TO BLAME IF SOMETHING GOES WRONG?
Imagine that Ms. Q was driving one day and had a sudden urge to swerve into a crowded bus stop. As a result, she ended up injuring several people and damaging the bus stop. During their investigation, police found that Ms. Q had a brain implant to treat her Parkinson’s disease. This implant malfunctioned at the time the urge occurred. Furthermore, Ms. Q claims that the bus stop was not there when she acted on the impulse to swerve.
As brain stimulating technology advances, a hypothetical case like Ms. Q’s raises questions about moral and legal responsibility. Is Ms. Q solely responsible for her actions? Can we attribute any blame to the device? What about to the engineers who designed it or the manufacturer? The neurosurgeon who implanted it or the neurologist who programmed the device parameters?
Historically, moral and legal responsibility have largely focused on the autonomous individual – that is, someone with the capacity to deliberate or act on the basis of their own desires and plans, free of distorting external forces. However, with modern technological advances, many hands may be involved in the operation of these brain implants, including artificial intelligence programs directly influencing the brain.
This external influence raises questions about the degree to which someone with an implant can control their actions and behaviors. If brain implants influence someone’s decisions and behaviors, do they undermine the person’s autonomy? If autonomy is undermined, can we attribute responsibility to the individual?
Society needs to discuss what happens when science and technology start challenging those long-held assumptions.
SO MANY SHADES OF GRAY
The law draws distinctions among kinds of responsibility – for example, between causal responsibility and liability responsibility.
Using this distinction, one might say that the implant is causally responsible but that Ms. Q still bears liability for her actions. One might be tempted to divide responsibility this way because Ms. Q still acted on the urge – especially if she knew the risk of brain implant side effects. Or perhaps Ms. Q bears primary responsibility, but the implant’s influence should mitigate her punishment.
These are important gradations to reckon with, because the way we as a society divide liability may force patients to choose between potential criminal liability and treating a debilitating brain condition.
Questions also arise about product liability for companies, professional responsibility for researchers and technology developers, and medical malpractice for the health professionals who placed and programmed the device. Even if multiple actors share responsibility, the question of how to distribute it among them remains.
Adding another layer is the potential for malicious interference with these implants. Newer implants may have wireless connectivity. Hackers could attack such implants to manipulate someone like Ms. Q for their own nefarious purposes, posing further challenges to questions of responsibility.
Insulin pumps and implantable cardiac defibrillators have already been hacked in real life. While there have been no reports of malicious interference with brain implants, their increasing adoption gives tech-savvy bad actors more opportunity to exploit the technology.
Considering the impact brain implants can have on moral and legal notions of responsibility, it’s time to discuss whether and when brain interventions should excuse people from responsibility for their actions. New technologies often require some modification or extension of existing legal mechanisms. For example, assisted reproductive technologies have required society to redefine what it means to be a “parent.”
It’s possible that soon we will start hearing in courtrooms: “It’s not my fault. My brain implant made me do it.”
- Laura Y. Cabrera is an assistant professor of neuroethics at Michigan State University.
- Jennifer Carter-Johnson is an associate professor of law at Michigan State University.
- This article was first published at The Conversation.