In case you missed it, February 3, 2005 witnessed the birth of a new tort: negligent delivery of cookies.
It was on that day that a Colorado judge awarded $900 in damages to Wanita Young, who claims to have been terrified by two teenage girls who dropped off homemade cookies at her house.
The defendants, two wholesome girls from Durango, Colorado, had decided – sua sponte, as it were – to bake cookies one evening and leave them as presents for their neighbors. They also (and I hope you’re sitting down for this) delivered each batch of cookies with a heart-shaped note saying “Have a Great Evening.”
But by the time the girls got to Wanita Young’s home, it was 10:30 p.m., and Ms. Young was curled up in bed with a copy of Lawsuits for Dummies. She allegedly mistook the tender knocks on her door for those of a craven (and oddly polite) burglar. Ms. Young ended up in the emergency room with an anxiety attack.
Although the girls apologized and offered to pay her medical bills, Ms. Young took them to court. On February 3, Judge Doug Walker of the La Plata County Small Claims Court found the girls liable for Ms. Young’s medical expenses.
Novel torts – say, serving coffee that’s too hot – pop up like tulips in the spring, and they invariably grab headlines. When most people think about civil law they are probably thinking, whether they know it or not, of torts: malpractice, personal injury, defamation, and fraud are all torts. What is surprising is that the law and terminology of torts arrived comparatively late on the legal scene.
Tort is an old French word that simply means “injury.” It comes from the Latin verb torquere (to twist), from which we get such useful words as torque and torsion. In legal English (but, curiously, no longer in French), it refers to wrongful conduct – other than a breach of contract – giving rise to an action for damages. Lawyers began using tort in this sense as early as 1586, but the term was not immediately popular. Sir William Blackstone, whose 18th Century treatise covered the entire common law of England, makes no mention of “tort.” Instead, he refers to private wrongs and civil injuries.
During the 19th Century, the word tort became increasingly commonplace, and by the 20th Century it was universal. Just why the legal profession abandoned perfectly good English phrases like “private wrongs” in favor of an archaic French word is something of a mystery. Imagine, if you will, that all the lawyers got together and decided that binding agreements should be known, not as contracts, but as soufflés. Wouldn’t that be odd?
For many centuries, the law of civil wrongs was concerned only with intentional acts. A plaintiff who was injured by the intentional act of another could bring an action for trespass. This was trespass in its original broad meaning of “transgression,” a sense that survives in the Lord’s Prayer: “forgive us our trespasses.” The technical term for the lawsuit was trespass vi et armis (trespass by force and arms) which gives you some idea of the subject matter of early tort law.
Today, tort law is largely concerned with liability for non-intentional acts, or negligence. The word negligence entered the law sideways as a Latin adverb (negligenter) used to describe a kind of trespass that was not done with “force and arms.” Blackstone recognized a whole category of such trespasses, including the unskillful work of a “physician, surgeon, or apothecary,” which he called mala praxis, from Latin malum (bad), plus Greek praxis (practice) – the forerunner, of course, of malpractice.
But Blackstone did not refer to “negligence” as a cause of action – because it wasn’t. A person injured in an accident would have to plead “trespass on the case,” which was a request that a court make a limited exception to the rule that trespasses have to be intentional. Each case was considered unique, and right up to the early 20th Century, legal experts could stoutly deny the existence of an action for “negligence.”
We’ve come a long way, baby. The sixth edition of Black’s Law Dictionary lists no fewer than 17 varieties of negligence, some of them distinguished by such fine shades of meaning that they appear to have been devised by medieval monks. There is slight negligence and ordinary negligence; gross negligence and reckless negligence; and even the seemingly contradictory willful negligence. Negligence law employs thousands of lawyers across the country and is the reason for all those lawyer ads on television and radio. New York City alone paid out $570 million in negligence damages in 2004.
Arguably the person most responsible for the emergence of negligence as a separate cause of action was not a judge or even a lawyer, but an engineer, John Loudon McAdam. In the early 1800s, McAdam pioneered a new method of road construction that vastly improved the British turnpike system. Improved roads led to a great increase in the volume and speed of stagecoach traffic, which in turn led to an explosion of road accidents. The result was a glut of lawsuits seeking remedies for what were obviously the careless, but unintentional, acts of the drivers.
Incidentally, roads built by the McAdam system are said to be macadamized; and when somebody thought of covering such roads with tar, it became known as tar-bound macadam, or tarmac.
Anyway, by the 1830s it was settled that all those people injured on macadamized roads could sue, in effect, for negligence, without having to shoehorn the lawsuit into the trespass doctrine. This legal innovation came just in time for the Industrial Revolution, which gave mankind spectacular new ways to behave negligently. The railroads alone killed or injured over 20,000 workers in the U.S. during 1888-89 – a rate that would double by 1906.
Courts recognized that at least some of these industrial injuries resulted from negligence, or the breach of a duty of care, but they struggled for years to define that duty. In 1856, one English court came up with the now-famous standard of the reasonable man; that is, negligence amounts to doing something that a reasonable man wouldn’t, or failing to do something that he would. A later English judge would memorably describe this reasonable guy as “the man on the Clapham omnibus” – presumably because a rational person would sooner ride on an omnibus than one of those deathtrap trains.
In the late 19th Century, zealous lawyers on both sides of the Atlantic sought to expand the frontiers of negligence just as quickly as modern machines could mangle their clients. But judges recoiled in horror against opening the floodgates of litigation – a term that actually predates the Industrial Revolution, having been coined by a New York court in the 1818 case of Whitbeck v. Cook.
Courts developed a number of theories to limit negligence liability. Perhaps the most famous was contributory negligence, which dates to the early 1800s but became a huge factor in Victorian-era litigation. Under that doctrine, if the injured party was even a little bit negligent, then he could not recover a penny. The majority of states have replaced contributory negligence with comparative negligence, which means that a plaintiff’s recovery may be proportionately diminished – but not barred – by his or her own fault.
During the same period, plaintiff-friendly courts swung back with their own rules, such as res ipsa loquitur (“the thing speaks for itself”), which was first articulated in 1863. Under that doctrine, a plaintiff need not allege the specific cause of certain types of accidents that simply would not have happened without negligence.
Other courts adopted a rule of strict liability for activities deemed to be extra-hazardous. This branch of strict liability has largely been superseded by statute and regulation, but the precedents still serve as a cautionary note to those who would engage in particularly risky conduct.
Like baking cookies.
(This column originally appeared in the April 2005 issue of New York Law Journal Magazine).
Friday, April 29, 2005