June 16, 2005

Miller & Shapiro on Hayek's market - explaining object orientation

I was struck by how the Introduction to Miller & Shapiro's new paper on concurrency control sought to integrate economics and programming. Here's the Introduction, stolen in fine Hayekian tradition for your reading pleasure. The paper is to appear in the proceedings of Trustworthy Global Computing (so it will miss out on the bona fide FC++ advantage), but I couldn't resist letting slip this teaser!

The fundamental constraint we face as programmers is complexity. It might seem that the systems we can successfully create would be limited to those we can understand. Instead, every day, massive numbers of programmers successfully contribute code towards working systems too complex for anyone to understand as a whole. We succeed by using mechanisms of abstraction and modularity to construct systems whose components we can understand piecemeal, and whose compositions we can in turn understand without fully understanding the components being composed.
To understand these twin problems, of separating components and of composing them, we draw on Friedrich Hayek's examination of how markets address the twin problems of plan coordination: bringing about the cooperative alignment of separately conceived plans, while simultaneously avoiding disruptive plan interference [Hayek45]. His explanation of the need for property rights parallels the rationale for encapsulation in object-oriented systems: to provide a domain (an object's encapsulation boundary) in which an agent (the object) can execute plans (the object's methods) that use resources (the object's private state), where the proper functioning of these plans depends on these resources not being used simultaneously by conflicting plans. By dividing up the resources of society (the state of a computational system) into separately owned chunks (private object states), we enable a massive number of plans to make use of a massive number of resources without needing to resolve a massive number of conflicting assumptions.
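
To make the parallel concrete, here is a minimal sketch in ordinary Java (my own illustration, not from the paper; the Purse and deposit names are hypothetical). The private balance is the owned resource, and the methods are the plans entitled to use it; no outside plan can reach across the encapsulation boundary and interfere mid-update.

    // A Purse "owns" its balance: the private field is the resource,
    // the methods are the plans entitled to use it.
    public final class Purse {
        private long balance; // encapsulated state: no outside plan can touch it directly

        public Purse(long initialBalance) {
            if (initialBalance < 0) throw new IllegalArgumentException("negative balance");
            this.balance = initialBalance;
        }

        // A plan that uses the resource without fear of interference:
        // every caller must come through this boundary.
        public void deposit(long amount, Purse from) {
            if (amount < 0 || from.balance < amount) throw new IllegalArgumentException("bad transfer");
            from.balance -= amount; // same-class access is still inside the boundary
            this.balance += amount;
        }

        public long getBalance() { return balance; }
    }

The point is only the ownership structure: which plans may use which state is settled once by the boundary, rather than negotiated case by case among all possible users.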

But a single object cannot do much by itself. Instead, both objects and markets use abstraction to compose plans together into vast cooperative networks, such as subcontracting graphs, where one agent, employing only its local knowledge, will subcontract out subtasks to others, often in great ignorance of how each subtask will be carried out [Lachmann, Lavoie, Tulloh02].
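
Again as an editorial sketch (mine, not the paper's; Store, MemoryStore and Archiver are hypothetical names), the same subcontracting structure in Java: the Archiver composes its plan out of whatever implements the Store abstraction, using only local knowledge and in complete ignorance of how the subtask will actually be carried out.

    import java.util.HashMap;
    import java.util.Map;

    // The abstraction: the client knows only the contract, not the plan behind it.
    interface Store {
        void put(String key, String value);
        String get(String key);
    }

    // One possible subcontractor; the client never needs to know which one it got.
    final class MemoryStore implements Store {
        private final Map<String, String> data = new HashMap<>();
        public void put(String key, String value) { data.put(key, value); }
        public String get(String key) { return data.get(key); }
    }

    // The client subcontracts the storage subtask through the abstraction,
    // employing only its local knowledge of the Store contract.
    final class Archiver {
        private final Store store;
        Archiver(Store store) { this.store = store; }

        void archive(String id, String document) {
            store.put(id, document); // delegated in ignorance of how it is carried out
        }
    }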

"Programmers are not to be measured by their ingenuity and their logic but by the completeness of their case analysis. Alan Perlis"

The problem Hayek was concerned with, how humans coordinate their plans with each other, certainly has many differences from the world of programming. For the purposes of this paper, the most interesting difference is that, in the human world, the intelligence of the entity who formulates a plan is comparable to that of the entity who executes it. Therefore, the plan doesn't have to prepare for every possible contingency: if something unusual happens, you'll probably be better able to figure out what to do at that point anyway. By contrast, when writing a program, we must express a plan that can deal with all possible relevant contingencies. Even under sequential and benign conditions, the resulting case analysis can be quite painful. As we extend our reach into concurrency, distribution, and mutual suspicion, each of these dimensions threatens an explosion of new cases. To succeed at all three simultaneously, we must find ways to reduce the number of additional cases we need to worry about.
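
A rough illustration of that growth in cases, again mine rather than the paper's (all names hypothetical): the first method below lives in a sequential, benign world and has one case to consider; the second asks the same question of a remote, concurrent, possibly hostile counterpart, and must name every contingency explicitly.

    import java.util.Map;
    import java.util.Optional;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Future;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.TimeoutException;

    final class CaseAnalysis {
        // Sequential, benign world: one case to consider.
        static int priceOf(Map<String, Integer> catalog, String item) {
            return catalog.get(item);
        }

        // Concurrent, distributed, mutually suspicious world: the same question
        // now drags in timeouts, failures, and answers we may not trust.
        static Optional<Integer> priceOf(Future<Integer> quote) {
            try {
                int price = quote.get(2, TimeUnit.SECONDS); // contingency: the partner is too slow
                if (price < 0) return Optional.empty();     // contingency: nonsense from a hostile peer
                return Optional.of(price);
            } catch (TimeoutException | InterruptedException e) {
                return Optional.empty();                    // contingency: no answer at all
            } catch (ExecutionException e) {
                return Optional.empty();                    // contingency: the remote plan itself failed
            }
        }
    }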

Mark later pointed out that he and Bill Tulloh have an entire paper on the Austrian market process, Institutions as Abstraction Boundaries.

Posted by iang at June 16, 2005 04:29 PM
Comments

Yes, but tractability and efficiency are different things.
Consider, for instance, what happened in networking. One could argue that circuit-switched networks fit the encapsulation-of-resources paradigm to a T, yet they are giving way to packet-based networks, which are much more efficient.

Posted by: OL at June 17, 2005 11:36 AM

Say more! There's an important underlying thread in the coordination debate as to whether packet-switching is essential at all levels.

Posted by: Iang at June 17, 2005 12:14 PM

I'm a little confused. You can have a strict market paradigm in a switched network, and an almost egalitarian peering mechanism for packet-based networks, or vice versa. You can even tinker with the parameters to argue that one is more efficient than the other. Both structures demand very different *institutions* to create the underlying mechanism--market or otherwise. What is the heart of the matter, here?

Posted by: Allan Friedman at June 17, 2005 05:32 PM

Allan,
you are going beyond where I was looking there. A recent debate over on cap-talk, and some of my own musings, have been looking at whether we've made a mistake in concentrating on connection-oriented networking at the higher layers instead of packet networking. As the network supports both equally well, I don't see that there is a *current* institutional question to answer, although it certainly raises an interesting line of thought... If one did need an institutional model and had to choose between the two technical approaches to networking, which would one choose?

More, I'm interested in whether the Hayekian approach throws any light on the efficiency aspects of packets versus connections.

Posted by: Iang at June 24, 2005 06:42 AM