It has often been to my regret that a fuller treatment of contracts as the keystone to financial cryptography has been beyond me. But it is not beyond Nick Szabo, who posts on a crucial case in the development of the negotiable instrument as a contract of money. He suggests two key differences between negotiable instruments and ordinary contracts:
Negotiable instruments – checks, bank notes, and so on – are promises to pay or orders with an implied promise to pay, and are thus contracts. But they differ from contracts in two important ways. First is the idea of “merger.” Normally, a contract right is an abstraction that is located nowhere in particular but belongs to a party to the contract or to a person to whom that party has assigned that right. Possessing a copy of a normal contract has nothing to do with who has rights under that contract. But in a negotiable instrument, the contract right is “merged” into the document. Assignment of that right takes place simply by transferring the document (in the case of a bearer instrument) or by indorsing (signing) and transferring it.

The second big way negotiable instruments differ from contracts is the “good faith purchaser” or “holder in due course” rule which is illustrated by Miller v. Race. In a normal sale of goods under common law, the new owner’s title to the goods is at risk to the contractual defenses of a prior owner. ...
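To make the “merger” idea concrete as mechanism, here is a minimal sketch - hypothetical Python classes for illustration only, not any real legal or payment API - contrasting an ordinary assignable contract right with instruments where the right travels with the document itself:

```python
# Illustrative sketch only: hypothetical classes, not any real legal or payment API.

class ContractRight:
    """An ordinary contract right: an abstraction recorded apart from any document.

    Possessing a copy of the contract proves nothing; who holds the right
    is a separate fact, tracked here in `owner`.
    """
    def __init__(self, obligor, owner):
        self.obligor = obligor   # the party who promised to pay
        self.owner = owner       # the current assignee of the right

    def assign(self, new_owner):
        # Assignment updates the abstract record; no document need change hands.
        self.owner = new_owner


class BearerInstrument:
    """A bearer instrument: the right is 'merged' into the document itself.

    There is no owner field and no assign() method.  Whoever holds the
    token holds the right; transfer is simply handing the token over.
    """
    def __init__(self, obligor):
        self.obligor = obligor


class OrderInstrument(BearerInstrument):
    """An order instrument adds indorsement: sign it over, then hand it over."""
    def __init__(self, obligor, payee):
        super().__init__(obligor)
        self.indorsements = [payee]   # the chain of signatures travels with the document

    def indorse(self, from_holder, to_holder):
        assert self.indorsements[-1] == from_holder, "only the current holder may indorse"
        self.indorsements.append(to_holder)
```

The point of the sketch is what is missing: the bearer instrument carries no separate ownership record at all, because possession of the document is the right.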
In short, with a normal contract, even once assigned - sold - to new parties, prior parties retain defences by which they can get it back. Yet in the case of Miller v. Race, in the halcyon days of London's note banking, Lord Mansfield declared in 1758:
A bank-note is constantly and universally, both at home and abroad, treated as money, as cash; and paid and received, as cash; and it is necessary, for the purposes of commerce, that their currency should be established and secured.
It is these instruments that we commonly issue on the net. Although issuers sometimes have the technical capability to return them, given a sufficiently good case, they often declare in effect that these are indeed negotiable instruments and that the current holder is the "holder in due course." The digital golds and the digital cashes especially have so declared, to their benefit of much lower costs. Those that aren't so resolved - the PayPals of the world - inflict much higher fees on their users.
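That cost difference can also be seen as mechanism. Below is a minimal sketch - toy Python ledgers assumed purely for illustration, not any real issuer's rules or API - contrasting an issuer that treats every completed transfer as final in the hands of the current holder with one that keeps a reversal path open:

```python
# Illustrative sketch only: toy ledgers, not any real issuer's rules or API.

class FinalIssuer:
    """Issuer that treats the current holder as 'holder in due course':
    a completed transfer is final, so there is no reversal path to build,
    staff, or price into the fees."""

    def __init__(self):
        self.balances = {}

    def issue(self, holder, amount):
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def transfer(self, payer, payee, amount):
        if self.balances.get(payer, 0) < amount:
            raise ValueError("insufficient funds")
        self.balances[payer] -= amount
        self.balances[payee] = self.balances.get(payee, 0) + amount
        # Nothing is recorded for later reversal: finality is the point.


class ReversibleIssuer(FinalIssuer):
    """Issuer that lets a prior holder's defences claw a payment back.
    Every transfer must be logged, disputable, and adjudicated."""

    def __init__(self):
        super().__init__()
        self.history = []

    def transfer(self, payer, payee, amount):
        super().transfer(payer, payee, amount)
        self.history.append((payer, payee, amount))

    def reverse(self, index):
        # A successful dispute re-runs the transfer in the opposite direction
        # (and may itself fail if the funds have already moved on).
        payer, payee, amount = self.history[index]
        super().transfer(payee, payer, amount)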
Posted by iang at January 21, 2006 04:39 PM

Caveat -- that was a discussion of part of the Anglo-American law of paper instruments. It may or may not apply very well to various kinds of digital money. Metaphors are quite useful, but when it comes to the actual law they have their limits. What Bob Hettinga and I call a "digital bearer certificate" may or may not legally be a negotiable instrument. It may not even be desirable that it be considered a negotiable instrument by the law. We call it that primarily because its mechanism reflects some (but by no means all) of the core security functions currently provided by the technology and law of paper bearer instruments. The same, with varying degrees of metaphorical similitude, holds for "digital cash," "digital signature," and so on.
I distinguish between the "functional" and the "selfish" parts of the law. The former we'd like to emulate as mechanism, and the latter is best left to the musty legal tomes.
For example, a "negotiable instrument" under the U.S. Uniform Commercial Code must involve the payment of "money." "Money" is defined in that code as "a medium of exchange currently authorized or adopted by a domestic or foreign government. The term includes a monetary unit of account established by an intergovernmental organization or by agreement between two or more countries." U.C.C. 1-201(b)(24). As a result, many things that are in fact money (e.g. e-gold) probably cannot be "money," and thus cannot be "negotiable instruments," under the U.C.C. I'd classify this as a selfish rather than functional part of the law.
In sum, the law can be used in two very different ways by a security designer, and one should be careful to distinguish them. If one is designing a mechanism in such a way that law carries part of the security burden (e.g. one creates a "digital contract" and expects the state to enforce it), then one has to care about the law as it is (quite a job!) rather than as metaphor. If, OTOH, one is designing a mechanism to emulate or surpass a security function currently performed by law, then one should care about the functional policy reasons for the law, and how to distinguish those from the selfish parts that one's customers don't need as mechanism.