The Chicago Tribune reported in September that the Indian state of Punjab has created the world’s first monkey jail. At present, there are 13 inmates who have been officially sentenced to hard time for theft, vandalism, and minor assaults.
Meanwhile in Britain, a judge recently granted a reprieve to a German Shepherd named Dino who had been sentenced to die for biting a woman. That decision came after three years of litigation on Dino’s behalf, including arguments before Britain’s House of Lords and the European Court of Human Rights (that’s right, Human Rights).
I know what you’re thinking: those things couldn’t happen here. Who ever heard of putting an animal on trial?
For starters, there was Gifford Pinchot, a former governor of Pennsylvania. In 1924, Pinchot put a Labrador retriever named Pep on trial for killing the Governor’s cat. Pep was found guilty and sentenced to life imprisonment in the State Penitentiary. Pep lived out the rest of his days as inmate C2559.
Death Row Dogs
In fact, dogs are effectively put on trial all the time under vicious dog laws in the U.S., Britain, and elsewhere.
These laws authorize individuals and animal control officers to file complaints against dogs that have attacked people, or that merely seem likely to do so. There is a hearing at which a judge or public health official must decide whether the dog meets the statutory definition of “vicious” or “dangerous.”
Although the dog’s owner may be involved in the proceedings, there’s no doubt that it is the animal itself that is on trial, and it is the animal that faces punishment, be it confinement, removal from the jurisdiction, or death. In one Texas case, the accused dog was actually picked out of a dog lineup by the dog bite victim.
America’s most famous death-row dog was Taro, a three-year-old Japanese Akita who was sentenced to death under New Jersey’s vicious dog law. Taro’s owner appealed the death sentence, and the litigation dragged on for the next three years. In the meantime, Taro became Bergen County’s prisoner No. 914095 and was incarcerated in the Sheriff’s K-9 Unit. In a bit of drama worthy of Hollywood, a last-minute pardon came from then-Governor Christine Todd Whitman.
Dogs have been sued for money damages, too. In 2000, an Indiana prisoner sued a police dog, alleging that the dog was a “person” who violated his rights while “acting under color of state law.” The dog, named Frei, was a sympathetic defendant, having received numerous awards for valor in the line of duty.
The Seventh Circuit threw out the claim against Frei, holding that dogs are not “persons” whether they act under color of state law or not. The court added: “A suit against a dog poses a host of other problems. Was Frei served with process? Did he retain [a] lawyer . . .? Was Frei offered the right of self-representation under 28 U.S.C. §1654? What relief does [plaintiff] seek from a dog – Frei’s awards, perhaps?”
Bees, Worms, and Rats
Although we detect a note of sarcasm in the Seventh Circuit’s opinion, the fact is that there is a long and distinguished history of lawsuits against animals.
In the Middle Ages, ecclesiastical courts conducted countless trials against wild animals that did damage to persons or property. As early as 864, a hive of bees that had stung a man to death in Germany was ordered to be destroyed. Oddly enough, this judgment against bees was rendered by the Council of Worms. (Worms being a city in Germany, but still.)
In the typical case, ecclesiastical prosecutors went after swarms of insects or rodents who destroyed crops. In such cases, the court would order the animals banished from the district. If they failed to leave, the animals risked excommunication or anathema, these being official curses.
The most extraordinary feature of these early trials is their rigorous attention to procedural fairness. Not only would the court issue summonses to the offending animals, reading them aloud in church or nailing them to trees, but it would appoint defense counsel to represent the beasts – at the community’s expense.
And the animal defenders were no slouches. Bartholomew Chassenée, who went on to become one of France’s leading judges, established his reputation in 1522 by defending the rats of Autun, who had been accused of eating the province’s barley crop. Chassenée argued, among other things, that the rats had not been properly served with process and that, in any event, the presence of cats in the neighborhood made it impossible for his clients to appear in court.
In the 1570s, there was a long and expensive litigation brought against certain beetles that were said to be devouring the vineyards in St. Julien in France. The beetles’ lawyer, Pierre Rembaud, put up such a strong defense that the townspeople offered to settle the case by granting the insects their own plot of land.
It would be a mistake to think that these trials were put on as jokes, with a nudge and a wink to the audience. They were conducted with great respect for precedent. According to one ancient treatise, a lawsuit against insects would begin with a formal complaint filed by the inhabitants of an infested town, the plaidoyer des habitans (plaidoyer is the French root of “pleading”), after which the defendants filed their answer, plaidoyer pour les insectes. This may be contrasted with today’s method of proceeding against insects, as set forth by Messrs. Johnson & Johnson: “Shake well. Point nozzle at insects and spray.”
This Time It’s Personal
While ecclesiastical prosecutors were targeting pests, other litigious Europeans were bringing actions against a veritable menagerie of domesticated animals in secular courts.
Pigs were the worst. They wandered the streets freely and, unfortunately, had a taste for small children. In 1386, a French court sentenced a sow to death for killing a child. The convicted pig was, for some inexplicable reason, dressed up in human clothes and hanged in the public square. A century later, six infanticidal piglets narrowly escaped the gallows in Savigny-sur-Etang, when the court held that they had been corrupted by their mother’s bad example.
There are over a hundred recorded examples of animal trials. At the end of the 17th Century, a Russian goat was tried and banished to Siberia for its misdeeds. In 1712, an Austrian court sentenced a dog to confinement in a public pillory for biting the leg of a prominent citizen – proof that dogs have always enjoyed a juicy burgher.
The notion that animals are capable of committing a crime was troubling even to medieval jurists due to the age-old doctrine of mens rea – that a crime requires some measure of intent. For decades, leading scholars wrestled with the apparent inability of animals to form a guilty intent. By the 13th Century, they had figured it out: animals are agents of Satan. But of course.
The English Deo-dandies
While animal trials were all the rage in places like France and Germany, the English were cool to the idea. Instead, English law held that if any animal, or indeed any inanimate object, caused the death of a person, it would simply be forfeited to the King to be sold for the benefit of the poor. The technical term for this is deodand, from the Latin deo dandum, “to be given to God.” The deodand, which was necessary for “the pacification of [God’s] wrath” according to a 1607 treatise, was also on the books in colonial Maryland, Virginia, and Rhode Island.
Nowadays, the idea of the government seizing a piece of “guilty” property sounds distinctly outmoded. But is it? Consider today’s asset forfeiture laws, under which the federal government may seize any property used in the commission of a crime. Thus, if a man robs a bank, the government may not only convict the man, but it can also seize the getaway car. Why is that? To make sure that the car doesn’t do it again? To set an example for other cars?
Perhaps we’re not so advanced as we think. At least, that’s what my cat tells me.
(This column originally appeared in the December 2004 issue of New York Law Journal Magazine).
Friday, October 29, 2004
Column: Boilerplate Special
My lease is up for renewal, so it’s time for that annual ritual in which I start to read the Lease Agreement, get about two paragraphs into it, and then reach for a bottle.
“Not to worry,” I tell myself, “it’s just boilerplate.”
Boilerplate must rank right up there with laches as one of those legal terms that seem to come from nowhere. The word refers to those bits of legal language that are endlessly repeated in certain documents and, for some inexplicable reason, assumed to be harmless.
Boilerplate is all around us, not only in leases, but also mortgages, loan agreements, subpoenas, powers of attorney, and so on. It is through boilerplate that most ordinary people come to know – and hate – legalese.
The Party of the First Part
With its random verbosity and archaic syntax, boilerplate usually sounds like the product of a 19th Century opium smoker out on a bender. What normal person would, for example, refer to people as “the party of the first part” and “the party of the second part”? And yet, one study confirmed that lawyers continued to use those terms at least into the 1980’s.
In the 1935 movie A Night at the Opera, the Marx Brothers rip such language to shreds, literally. Groucho, who is trying to lure Chico into signing a contract, reads the first clause aloud: “the party of the first part shall be known in this contract as the party of the first part.” Chico doesn’t like the sound of that and so they agree to rip that part of the contract out. And on they go tearing out clauses up through the “party of the ninth part.”
After tearing up most of the contract, Groucho and Chico disagree about the final clause: “If any of the parties participating in this contract is shown not to be in their right mind, the entire agreement is automatically nullified.” Groucho offers the classic defense of boilerplate: “It’s all right, that’s, that’s in every contract. That’s, that’s what they call a ‘sanity clause’.” To which Chico defiantly answers: “You can’t fool me! There ain’t no Sanity Clause!”
Riveting Language
The term boilerplate originated in the offices of 19th Century American newspapers. Back then, newspapers were printed from metal plates that were cast from mats (short for matrices) made by the paper’s typesetters. Some of the savvier news agencies and syndicates would send out their press releases or columns in pre-cast metal plates that could not be altered. Editors referred to these pre-packaged plates as “boilerplate” because they resembled the standard-sized iron plates that were riveted together to make boilers. Over time, boilerplate came to mean any part of a newspaper that remained unchanged, issue after issue.
Boilerplate became a journalistic cliché. At some point in the last century, lawyers borrowed the term from the field of journalism, and they have yet to give it back.
Blame Gutenberg, and Gates
Although the term boilerplate arose in the 19th Century, the phenomenon is much older – almost as old as law itself.
In medieval English law, transactions and courtroom allegations gained validity by exact repetition of verbal formulas – missing out a single word, or even stammering, could lead to dismissal of one’s case. In the 17th Century, judges could throw out a pleading because, for example, a single Latin word was misspelled.
It was the printing press that saved lawyers from the fear and loathing brought on by those hyper-precise judges. Almost as soon as Gutenberg’s first Bible rolled off the press, English lawyers were putting together formbooks, that is, collections of sample contracts, pleadings, and other documents that had already passed muster with some court or another. Provided that one copied the form verbatim, no sporting judge could object.
Until the late 18th Century, American lawyers simply borrowed from British formbooks, but after gaining independence, there was a demand for something more homegrown. In 1797, New Jersey lawyer William Griffith struck a blow for independence with his Scrivener’s Guide (the title alone gives one goose bumps), which was advertised as being “Useful for all Gentlemen, especially those that Practice the Law . . . .”
Formbooks are convenient, no doubt, but they have the unfortunate effect of bringing out the most conservative instincts of the legal profession. A cautious lawyer (and is there any other kind?) is loath to depart from the accepted form, and that’s a big reason why legal language has become fossilized.
Ironically, modern technology has given a big boost to archaic language, since the word processor makes boilerplate all but irresistible to busy lawyers. Who wants to reinvent the wheel when a simple cut-and-paste job will have that contract on the partner’s desk in no time? Never mind that the language you’re cutting and pasting was originally drafted by Ben Franklin’s brother-in-law, it still works!
Everyday Boilerplate
Which brings us back to that lease staring me in the face. Although it’s nothing more than pre-printed boilerplate, a small note on the first page informs me that it is written in “plain English format.” That is, it uses the same kind of language that you and I use every day. Like “material misstatement of fact” (a phrase you will no doubt remember from the film Sex, Material Misstatements of Fact, and Videotape), which appears on page 3 of the lease.
Despite the drafter’s efforts, the lease suffers from all the sins of boilerplate. There is legalistic redundancy, such as the insistence that rent be paid “in full without deduction” or that the landlord shall not be liable for “loss, expense, or damage.” There is also rather troubling ambiguity, such as when the landlord is given the right to “enter the apartment at reasonable hours to: repair, inspect, [or] exterminate.” Pests, one hopes.
The great irony of boilerplate is that it appears most often in consumer contracts, that is, contracts that are meant to be read and understood by the great mass of non-lawyers out there. Take your average mortgage – the most important contract that most people will ever enter into. Here medieval boilerplate abounds, not in some musty book of precedents, but in the most up-to-date forms available. One sample mortgage posted on the internet contains the following provision:
Borrower further covenants and warrants to Lender that Borrower is indefeasibly seized of said land in fee simple. . . .
How many people, one wonders, know what “indefeasibly seized” means? Not me. But fortunately I have a law dictionary, which defines seized (also spelled “seised”) as a “feudal term referring to one possessed of a freehold.” Basically, this term is nothing but a faint echo of the days when men wore stockings and the right to a freehold could be based on actual possession.
Not to deny the charm of using feudal terms, but couldn’t the phrase in question be replaced with something like, say, “Borrower owns the land”?
Boilerplate Special
Boilerplate resists innovation. Nevertheless, large, seismic shifts in law and culture can, occasionally, cause boilerplate to shift slightly.
At the height of the Cold War, for example, some insurance companies added a “nuclear clause,” specifying that the word “fire” does not include a nuclear reaction “whether controlled or uncontrolled.” More recently, some form contracts have begun to recognize “domestic partners” where they used to speak only of “spouses.” In California – according to one report – a lawyer changed the standard provision absolving the parties from liability for “acts of God” to “acts of God or other deities.”
Change is good, of course, but it would be even better if lawyers allowed new language to replace the old. As every lawyer knows, however, the “delete” key is infected with cooties, and so new clauses invariably get tacked on to existing boilerplate without anything getting cut.
As a result, going through a contract is like an archaeological dig, with different provisions representing different historical strata. A quick glance at my lease reveals clauses that appear to come from the 1890’s (“trade people must only use … service entrances”); the 1960’s (“No waterbeds allowed in apartment”); and the 1970’s (“Tenant shall conserve energy”).
Now that I really look at it, I can see that my lease is a jumble of inconsistent, incoherent, and burdensome clauses. I’m going to sign it anyway, of course. It’s just boilerplate.
(This column originally appeared in the October 2004 issue of New York Law Journal Magazine).
Wednesday, September 29, 2004
Column: Comma Cause
True story: a few days ago, a partner handed me a mark-up of a brief I had written. His revisions drove me into an uncharacteristic fit of rage because of one unpardonable sin: he removed three commas.
Not just any commas, but serial commas. You know the serial comma. When you have a list of three or more things, the serial comma comes after the penultimate item.
For example, in the phrase “red, white, and blue,” the serial comma is the one after “white.” The Oxford University Press embraces the serial comma, which is, as a result, sometimes referred to as the Oxford comma, while others, including the New York Times, eschew it. In the world of legal writing, the status of the serial comma keeps many of us (well, me) up at night.
Comma Chameleon
Before you dismiss such talk with a brisk “get a life,” just remember that punctuation has decided many a lawsuit, and has even sent men to the gallows, as we shall see.
Take our friend, the serial comma. Linguist Peter Tiersma reports that most lawyers omit the serial comma, and he seems, if I may say, not the least bit worked up about it. In The Elements of Legal Style, Bryan Garner insists that good usage requires the serial comma. Meanwhile, in the recent book Eats, Shoots & Leaves, Lynne Truss advises us not to be “too rigid” about the serial comma. Heretic!
Garner, of course, is correct. As he so pithily states in his book: “omitting [the serial comma] may cause ambiguities, while including it never will.” If your contract states that the vendor must deliver “eggs, milk and macaroni and cheese,” the serial comma will clarify whether the vendor is supplying pasta and cheese separately, or boxes of mac and cheese.
Can There Be Law Without Punctuation?
Legend has it that in the halcyon days of medieval England, statutes were entirely unpunctuated. As if castles, jousting, and giant tankards of ale weren’t reason enough to yearn for the past – imagine, a world without punctuation!
But alas, the late Professor David Mellinkoff disproved that theory through painstaking research – he even found a 2,400-year-old Greek city law with punctuation marks. What is true, however, is that for many centuries punctuation was not standardized. Some scriveners preferred to use dots or points, descended from Greek, while others tended toward the Latin virgule (a forward slash).
When the English began printing their statutes in the 1480’s, the printers had to make sense of the stacks of indifferently punctuated laws. They did their best to tidy up the books with such novelties as section headings and consistent punctuation. It was in this context that lawyers began to assume that since printers had added punctuation, the original statutes must have been entirely unpunctuated. Well, that’s legal reasoning for you.
Once the age of printing got underway, legal English went from being a lightly punctuated affair to the enthusiastically punctuated creature we know today. The rise of punctuation went hand-in-hand with the law’s addiction to run-on sentences. The longer a sentence gets, the more it needs commas and semicolons to prop up its attenuated meaning.
Consider this: lawmakers are many times more long-winded than scientists. Whereas scientific prose has a mean sentence length of 27.6 words, just one penal statute from California consists of a single sentence of 150 words. The problem with criminal law, you might say, is that the sentences are too long.
Matters of Life and Death
From the very beginning, therefore, the task of statutory construction has involved deciphering punctuation. A 16th Century treatise on statutory interpretation discussed the thorny issue of matching up modifiers with their objects with the observation that “the poynctinge & parenthesinge is muche materiall.”
At times, punctuation has turned deadly. In U.S. v. Palmer (1818), the Supreme Court upheld the death penalty for two Bostonians who had robbed a Spanish ship while on the high seas. The conviction turned on how to read the piracy statute, which defined piracy as “murder or robbery, or any other offence, which, if committed within the body of a county, would, by the laws of the United States, be punishable with death.”
The defendants argued that, although they had committed robbery, it was not the sort of robbery that would have been punishable by death if committed within “any county” of the U.S. The prosecution countered that the bit about “punishable by death” only modified the phrase “or any other offence,” so that any robbery or any murder committed on the high seas should constitute piracy.
The majority agreed with the prosecution, evidently swayed by the statute’s artful use of commas, to which a horrified Justice Johnson said in his dissent: “singular as it may appear, it really is the fact in this case, that these men’s lives may depend upon a comma . . .”
The lesson, presumably, is that one shouldn’t depend on commas to save one’s life. But just 100 years after Palmer, the defendant in an English case made exactly the same mistake. During World War I, Sir Roger Casement sailed on a German submarine to Ireland, bent on fomenting a rebellion that might distract the British. He was caught, and duly convicted of treason. On appeal, his counsel argued that the trial court had misconstrued the Treason Act of 1351. The relevant language, translated from the Norman French, states: “If a man be adherent to the king’s enemies in his realm giving them aid and comfort in the realm or elsewhere …”
Casement argued that the act applied only to those who commit treason while in the King’s realm, whereas he had done his plotting outside of Britain. The prosecution, according to Casement, had misread the statute as though it included commas, whereas anybody who’s anybody knows that the original statute was unpunctuated (this was many years before Prof. Mellinkoff disposed of that canard).
Two of the appellate judges dutifully traipsed over to the Public Record Office, magnifying glass in hand, and examined the original statute. There they found, not commas, but their predecessor, the virgule, in just the right places. Casement was hanged.
Elusive Commas
The search for missing commas continues to this day. In 1997, Christopher Carroll appealed his conviction under a federal child pornography statute on the ground that the judge had failed to appreciate the significance of the absence of a comma in the statute.
The First Circuit conceded that the comma appeared to be missing in the U.S. Code Annotated version of the law. “But appearances often are deceiving,” observed Judge Selya, who then dryly added the citation “See Aesop, The Wolf in Sheep’s Clothing (circa 550 B.C.).” By tracking down the official version of the law in the Statutes At Large, the court was able to establish that the elusive comma had been there all along, and thus affirmed the conviction.
In a 1991 case, the Supreme Court interpreted the contours of the federal removal statute based, in part, on the placement of commas. The plaintiff in that case, the International Primate Protection League, brought suit in a Louisiana court seeking to enjoin the National Institutes of Health from using primates in medical research.
The NIH tried to remove the case to federal court on the strength of a law allowing removal of cases in which the defendant is “[a]ny officer of the United States or any agency thereof, or person acting under him.” The Supreme Court read the statute to mean that removal is allowed only when an officer of the United States, or an officer of any agency thereunder is the defendant in a lawsuit. Had Congress meant to include lawsuits where the agency itself is a defendant, the Court held, it would have put a comma after “United States.” Clearly the Justices were in no mood to monkey around with the law’s grammar. (Sorry.) Suffice it to say that this is an area of the law that is still evolving.
So, next time you have a brief to write, pay attention to those little punctuation marks. Just be careful not to fall asleep or, worse yet, lapse into a comma.
(This column originally appeared in the September 2004 issue of New York Law Journal Magazine).
Thursday, April 29, 2004
Column: The Jury is Out
As I write this, Jayson Williams is seeking a mistrial in his criminal case on the ground that lawyers lied to the jury; Martha Stewart says there should be mistrial of her case because one of the jurors lied to the lawyers; and the trial of two former Tyco executives has just ended in mistrial because the jurors couldn’t get along.
Trading One Ordeal For Another
Trial by jury is an ordeal, but ironically, that’s exactly what it was supposed to replace. For centuries, guilt and innocence were determined by the “ordeal,” such as the ordeal by water – if she floats, she’s a witch! Another popular method was ordeal by fire, in which the accused had to walk nine paces with a red-hot iron in both hands – this is thought to be the origin of the phrase “the whole nine yards.”
The Fourth Lateran Council of 1215 spoiled all the fun by effectively banning trial by ordeal. Suddenly, authorities all over Europe had to find an equally scientific way to determine innocence and guilt – a tall order at a time when bloodletting represented cutting-edge medicine. In England, they hit upon the novel idea of canvassing the opinions of twelve local men.
The word jury emerged in the Anglo-Norman dialect of that period, coming from the Old French word juree, and ultimately, from the Latin jurare, meaning “to swear.” Jurare, by the way, is also the root of jurat, the word you sometimes see at the end of an affidavit, or sworn statement.
Why is a juror sworn? Originally, because jurors were also witnesses who were required to testify from personal knowledge if they had it, or, more likely, to simply repeat neighborhood gossip. It wasn’t until the 16th Century that compulsory process was available to compel the testimony of witnesses at trial using the subpoena (Latin for “under penalty”), which finally allowed jurors to sit back and listen to other people talk. And talk, and talk.
Although jurors no longer testify, they still must take an oath in order to be properly impaneled (from the English “panel,” deriving from the Latin pannus, or piece of cloth). In the Martha Stewart case, the defense team is arguing that one of the jurors, Chappell Hartridge, lied in order to get on the jury – which is, of course, a big no-no for oath-takers.
Stewart’s lawyers argue that Mr. Hartridge concealed his criminal record, namely, an assault charge filed by his ex-girlfriend, Gail Outlaw. Confusingly enough, Ms. Outlaw is not herself an outlaw, although she charges that Mr. Hartridge is. In any event, Stewart’s lawyers argue that Hartridge fibbed his way on to the jury because he was eager to convict Stewart because she is rich, or a woman, or both.
The notion that jurors are often tainted by pre-existing biases is not a new one. In their 1875 operetta “Trial By Jury,” Gilbert and Sullivan lampooned the prejudices of juries. In one scene – before the trial has even begun – the entire jury shakes its fists at the defendant and sings in unison:
Monster, dread our damages.
We’re the jury!
Dread our fury!
In fact, such common spirit among jurors seems to be increasingly rare, in this country at least. According to statistics, a “hung jury” happens about 8,000 times a year in the U.S. Make that 8,001 with the Tyco trial.
Hung Juries and Hanging Judges
A hung jury is one that cannot arrive at a verdict after a reasonable period of deliberation. In the Tyco trial, the majority of jurors were reportedly set to convict former executives Dennis Kozlowski and Mark Swartz of grand larceny. But one juror (Juror No. 4, to be exact) held out for the defense.
New York Judge Michael Obus sternly directed the jury to keep deliberating. But after eleven days of deadlock, and reports of overt threats against Juror No. 4, a 79-year-old grandmother, Judge Obus finally decided to declare a mistrial, thus forcing prosecutors to start their six-month case against the defendants all over again.
A hung jury has nothing to do with a hanging judge. In fact, it’s quite the reverse, since a hung jury can’t make up its mind, while a hanging judge is thought to be a little too eager to make up his. Still, the word “hung” has led to some bogus etymologies. One legal humorist, for example, has suggested that the hung jury was born in the Old West – where jurors who were taken in by shyster lawyers would be hanged in punishment.
The phrase does not, in fact, come from the Old West. But it is American in origin. The Oxford English Dictionary lists the first printed reference to a hung jury in Edwin Bryant’s What I Saw in California (1848-49) in which he states: “The jury . . . were what is called ‘hung’; they could not agree . . .”
Bryant’s phrasing obviously suggests that the phrase was already in common use by the late 1840’s. Indeed, there are earlier case reports with references to hung jury and variations thereof. The earliest use of the term in a law report appears in an 1821 case, Evans v. McKinsey. That case, like virtually all of the early cases referring to hung juries, is from the South. So it appears that the term developed somewhere in the South during the early 19th Century.
Linguistically, the phrase seems to derive from the sense of “hung” to mean caught, suspended or delayed (“I got hung up at the office”). Oddly, although the phrase is American, we have not extended the metaphor to describe other deadlocked bodies, say, a “hung Congress.” In Britain, however, commentators have used the term hung parliament to describe the situation when the House of Commons is unable to elect a Prime Minister.
It Only Takes One
The Tyco trial is an example of why juries are so often hung. When unanimity is required – as is still the case in most jurisdictions – it only takes one “angry man” (or angry grandmother) to cause a mistrial.
The need for unanimity has been cheered as a pillar of liberty and decried as an obstruction of justice. Whatever the merits of that debate, the fact is that unanimity was not part of the original plan. Early medieval juries in England did not require unanimity – a few loud shouts of “she turned me into a newt” would suffice for a conviction.
The first recorded case where unanimity was required was in 1367, when an English court refused to accept a majority verdict of guilty. By the end of the fourteenth century, the trend towards unanimity was unstoppable. With that development no doubt came the phenomenon of deadlocked juries. A quick four centuries later, an anonymous Southern American finally came up with a name for this phenomenon: the hung jury.
Renegade Juries
A judge can nullify a jury’s verdict by declaring a mistrial, but that has nothing to do with the term jury nullification. Jury nullification happens when a jury returns a verdict of not guilty, despite their belief that the defendant committed the offense charged. The jury “nullifies” the law on which the prosecutors rely.
In America, nullification goes back to a 1735 New York case in which the colonial governor, William Cosby, brought an action against a printer, John Peter Zenger, for publishing “seditious libels” in the New York Weekly Journal. Although Zenger had clearly printed the alleged libels, the jury voted to acquit, thus nullifying the harsh sedition law of the time.
And finally, let me clear up one confusion. The term jury-rigged, meaning assembled in a makeshift manner, has nothing to do with the legal sense of the word jury. It is an 18th Century nautical term, deriving from the Latin adjutare (to aid), from which we also get the word “adjutant.” Of course, I cannot deny that shady lawyers do occasionally try to “rig” a jury with bribes. But that brings us back to where we began, with mistrials.
(This column originally appeared in the June 2004 issue of New York Law Journal Magazine).
Column: Beam Me Up, Counselor
On January 15, 2004, President Bush unveiled his “New Space Exploration Vision.” Under the President’s plan, American astronauts could be planting the flag on the Moon by 2015, and on Mars after that.
All I can say is: they’d better bring lawyers with them.
For starters, the astronauts could be slapped with a trespassing suit by a Nevada businessman who claims to have valid title to both the Moon and Mars. Welcome to the world of Space Law.
Space Law: The Final Frontier
Space law deals with human activities in outer space. Most issues in space law come under the jurisdiction of the United Nations. The rest, of course, is controlled by the Klingon Empire. Actually, the rest of space law is a hodgepodge of bilateral agreements, national law, and certain “norms” of questionable weight.
Space law presents a great opportunity to the legal linguist because, as a relatively new field, the meanings of some of its key terms are still being worked out. Take space, for example. Clearly, at some point up there, the earth ends and space begins. But where? Experts disagree. Is it where the earth’s atmosphere peters out, somewhere around 80 km up? Or is it the lowest point where a craft can achieve orbit, about 100 km up, or is it – well, you get the picture.
So far, this lack of a definition has not caused any practical problems. But that may just be dumb luck, because defining the boundary of “space” has real implications for tort law. There is a completely different liability regime for accidents caused by aircraft versus those caused by spacecraft (which go by the dreary name of space objects in space law). For the latter, there is – I’m not making this up – the Convention on International Liability for Damage Caused by Space Objects.
Under the Liability Convention, if a space object collides with an aircraft, the country that launched the space object faces absolute liability, but if a space object collides with another space object, it’s fault liability. Thus, in the event of a mid-space collision, astronauts are advised to exchange insurance information and just continue on their way.
Sovereignty is a big problem in space. A country remains sovereign over its adjacent airspace – but how far up does that airspace go? In the 1976 Bogotá Declaration, a number of developing countries asserted that their national territory extends all the way up to geostationary orbit, or about 36,000 km straight up. This is the orbit where a satellite always stays over the same spot on earth. It is a particularly valuable bit of real estate because that’s where communications satellites need to be.
Imagine – if countries owned their orbital space, then they could very well charge you enormous fees for the privilege of having your communication satellite hovering over their territory. Those cell phone calls would suddenly become a lot more expensive.
Property Rights in Space
But the Bogotá Declaration never went anywhere, because it was universally decried as a violation of the Outer Space Treaty of 1967.
The Outer Space Treaty, which has been ratified by 98 countries, including the US, is often called the Magna Carta of space law. The treaty prohibits any state from claiming sovereignty over any part of Outer Space, including the Moon and “other celestial bodies.”
Lest you be confused, celestial bodies is not a bit of cheesy copy from the latest Sports Illustrated swimsuit issue, but rather a recognized legal term. Alas, this is another phrase that has confounded space lawyers because it is not defined in any treaty or convention. Under the Outer Space Treaty, most lawyers interpret celestial bodies to mean all planets, natural satellites of planets, and asteroids anywhere in the universe.
Some scholars, however, urge that the definition of celestial bodies ought to exclude any planet that is inhabited by intelligent beings. Assuming that there are intelligent aliens, the argument goes, it would be rude to foist the 1967 treaty on them.
More recently, overworked UN lawyers have limited the meaning of celestial bodies to those bodies found within our solar system – other systems would be covered later (one thing at a time, please!). The big losers in all this lawmaking, by the way, are comets and meteoroids, which don’t count as celestial bodies under anybody’s definition.
Martian Mortgages
Although the Outer Space Treaty prohibits nations from owning any celestial body, it doesn’t say anything about individuals or corporations.
That omission has created, well, a vacuum, into which a number of individuals have leapt. The most enterprising of these is Dennis Hope, who in 1980 filed papers with the U.S. government and the United Nations laying claim to the Moon and all the planets of the solar system except Earth. His Nevada-based company, Lunar Embassy, has been selling off bits of the solar system to, er, optimistic investors ever since.
Hope has a number of rivals in the celestial property game, including Martin Juergens, a German pensioner who claims that the moon has belonged to his family ever since Frederick the Great of Prussia granted it to one of his ancestors in 1756.
In an effort to forestall such entrepreneurial activities, the UN Committee on the Peaceful Uses of Outer Space came up with the so-called Moon Treaty of 1979, which prohibits governments and private parties from gaining property rights in the surface or subsurface resources of the Moon and the other celestial bodies.
The rationale of the Moon Treaty, borrowed from the Law of the Sea Treaty, is that the Moon and planets are the “common heritage of mankind,” and thus cannot be claimed as private property.
Most of the developed countries felt that the Moon Treaty went a tad too far. After all, if you take the trouble of actually going to Mars, shouldn’t you at least be able to buy a souvenir rock? More seriously, perhaps, a number of space scientists see the potential for lucrative mining operations on various planets and asteroids, and that’s a hard thing for any politician to sign away. Only ten countries have ratified the Moon Treaty, and not one of those has a serious space program.
Resolving Disputes
Next to cosmic radiation, litigation is one of the great hazards of space travel. Consider the International Space Station, in which astronauts spend up to six months in submarine-like conditions. That place is a tort, or even a crime, just waiting to happen (“Hey, that’s my bottle of Tang!”).
In the event of a dispute, lawyers would probably apply the law of the flag from maritime law; that is, the flag of the vessel governs torts and crimes committed on the high seas. But the Space Station does not have one flag; rather, it is composed of “modules” from different countries.
So, if an American astronaut and a Russian cosmonaut get into a fight while in a European Union module, then EU law would govern. Since the EU doesn’t have criminal or tort law, one would have to decide which of the EU’s 25 member states (as of 2004) should supply the governing law. Nothing simpler, really.
And then there are those rovers – Spirit and Opportunity – scurrying about the surface of Mars. What if they had a fender bender? There is no Liability Convention to govern accidents on another planet. Those rovers would have to be subject to local law.
The only problem, of course, is that there is no Martian law. As far as we know.
(This column originally appeared in the April 2004 issue of New York Law Journal Magazine).
Column: Ham on Wry
When a Brooklyn grand jury recently handed down an indictment against his client, defense attorney Ron Aiello didn’t flinch.
“A grand jury will indict a ham sandwich,” scoffed Aiello, whose client, Jeffrey Feldman, has been accused of committing various acts of political corruption, along with his boss, Brooklyn Democratic Party leader Clarence Norman.
His point, of course, was that a grand jury will indict – that is, bring formal charges against – anybody, even certain foodstuffs.
Aiello did not come up with this zinger himself. In fact, the phrase has been bandied about so much in recent years that most people are probably a little afraid to order a ham sandwich, lest they be indicted as accomplices.
The exact origin of the indictable ham sandwich – like so much else in law – is shrouded in mystery. Some sources attribute the phrase to defense lawyer Barry Slotnick in connection with the “Mayflower Madam” prostitution case of the mid-1980s.
Hillary Clinton cites a different source. In her memoir Living History, the former First Lady turned Senator refers to “the immortal words of Edward Bennett Williams, ‘a prosecutor can indict a ham sandwich if he chooses.’” Williams, now deceased, was a giant of the Washington, D.C. bar, and founder of the firm of Williams & Connolly. Although such a quip would not have been out of character for the legendary Williams, I can find no further evidence to support his authorship.
The most commonly cited source of the “ham sandwich” critique is a 1985 interview with then-Chief Judge Sol Wachtler of the New York Court of Appeals. It is possible, however, that Wachtler was repeating something he had heard elsewhere. Either way, Wachtler’s use of the phrase served to popularize it, and it has since become a courthouse staple.
Life Imitates Art
In what might safely be called an ironic twist, Wachtler himself was indicted seven years later for sexual harassment. Trapped within his own piquant metaphor, Wachtler had now become the equivalent of a ham sandwich.
Wachtler was convicted and sentenced to serve time. Because he was diagnosed with a severe mental illness, he was ultimately referred for psychiatric treatment at Rochester Prison, a federal facility associated with the Mayo Clinic. The moral of the story, I suppose, is that even if a ham sandwich is indicted, it still needs Mayo.
The Sandwich in History
Indict is itself a curious word – many people wonder where that silent “c” came from. The word began its life in early Renaissance England with the admirably phonetic spelling indite, meaning “to write down.” In the 17th Century, when the word took on its narrower meaning of “to write down legal charges,” some scholar added the “c” to make the word more faithful to its Latin ancestor, indictare. The innovation, evidently, caught on.
And while we’re on the subject, what is so grand about a grand jury? The adjective simply comes from the French word for “big.” In Medieval England, an accusatory jury consisted of 23 persons, and was referred to as le graunde inquest or grand jury, using the Law French that dominated British proceedings at the time.
The jury that would ultimately decide the guilt or innocence of the suspect had only 12 men, and thus became known in similarly Gallic terms as the petit jury. For some reason, modern English has preserved the “grand” but dropped the “petit,” although the latter word survives in the criminal context in phrases such as petty theft.
Like all clichés, the ham sandwich remark has some basis in fact. A number of studies have criticized the traditional procedures before grand juries – for example, defense counsel cannot appear, and the prosecutor is under no obligation to present exculpatory evidence – as making it too easy for prosecutors to obtain indictments. The institution was abolished in its native Britain in 1948. But in the U.S., the grand jury is more difficult to get rid of, at least at the federal level, because it is enshrined in the Constitution.
Research fails to disclose any instance of a ham sandwich actually being indicted – nor is there any evidence that a ham sandwich has ever committed a crime. The closest I can find is the infamous ham sandwich that was rumored to have caused the choking death of Mama Cass, the beloved lead singer of the 1960s pop group, the Mamas and the Papas. However, an autopsy on Ms. Cass confirmed that she did not choke on a sandwich, ham or otherwise.
Historically, the law appears to be lenient even on people named “Sandwich.” I refer, of course, to John Montagu, the Fourth Earl of Sandwich, for whom the sandwich is named (he ordered his servants to bring him meat between two slices of bread so that he would not have to interrupt his card game). He was a politician of legendary corruption, even by the standards of 18th Century Britain. Despite his plundering of the Royal Treasury, Lord Sandwich was never indicted.
Throwing the Book
Crime and criminal law have long been a rich source of slang. A stooge, for example, is an informant; a suit is a lawyer. Oddly enough, since the late 1960s, counter-culture types have referred to the police as pigs, which would put them in an awkward position if they ever did have to prosecute a ham sandwich.
And let’s say a ham sandwich were to be indicted. What would happen then? The sandwich would have to stand trial and decide whether to testify, or just sit there wearing a honey-glazed expression. It would be an ordeal, even for one with such an admirable mixture of protein and carbs.
If the sandwich were found guilty, the judge might throw the book at it. Here we see up close what a dangerous business it is to mix metaphors. If a judge were to throw the book at a ham sandwich, or indeed any sandwich, the result would be an awful mess – cold cuts, tomato, and mustard splattered all over the courtroom.
“To throw the book” at someone was originally an underworld phrase, probably from the 1930s. The basic image, which still survives, is that of a judge convicting a criminal with every crime in “the book,” an imaginary book of penal laws.
The fact that this metaphor exists at all just shows you how much criminal law has changed over the last century or so. Throughout most of the 19th Century, the phrase would have made little sense, since criminal law was still largely a matter of common law, and therefore not compiled in a single book. A harsh judge would have to “throw the precedents” at a criminal – which doesn’t have quite the same menacing ring to it.
It was only after 1900 that most states got down to the business of enacting all-inclusive penal codes. But the codification of the law penetrated the popular mind quickly enough that by the 1930s, criminals knew that the law was written down in “the book.”
Book ‘em!
Before a suspect can be convicted, or even indicted, the police must arrest him – or “book him,” as the saying goes. It is not at all clear where the phrase comes from – or whether the book of “book him” is the same book that judges like to throw. What is clear is that the phrase became universally known through Hawaii Five-O (which ran from 1968 to 1980). In the show, Detective McGarrett, memorably played by Jack Lord, would typically end each episode by calling out to his second-in-command: “Book ‘em, Danno!”
But not even McGarrett would try to book a ham sandwich. Unless it came with a slice of pineapple.
(This column originally appeared in the February 2004 issue of New York Law Journal).