"Capital Structure Evolution:
Austrian Observations on the Case of Software Development"

Howard Baetjer, Jr.
Towson University

A paper to be presented at the Southern Economic Association meetings,
New Orleans, November 22, 1999


 
I. An Austrian perspective on capital
Capital as embodied knowledge
Capital structure
Focus on the development process
II. Observations on the process of capital development
Software prototyping as social learning
Designing evolvable software
Capital structure evolution
Increasing understanding of the causal connections between things and human welfare, and increasing control of the less proximate conditions responsible for human welfare, have led mankind, therefore, from a state of barbarism and the deepest misery to its present stage of civilization and well-being. ... Nothing is more certain than that the degree of economic progress of mankind will still, in future epochs, be commensurate with the degree of progress of human knowledge.[1] Carl Menger


Introduction

This paper offers a way to think about the role of capital in the process of economic growth.[2]

My interest in capital and growth began before I ever studied economics, when I worked summers at a Nevada ranch that produces alfalfa. There I observed astonishing improvements in productivity that followed from improvements in capital goods.[3] My framework for understanding began to take shape when I studied Carl Menger on knowledge and capital, and read Adam Smith’s observations on the way workmen build improvements into their tools. It developed with Hayek’s insights on local knowledge and the knowledge problem literature in general. It was focused by studying the Harrod-Domar-Solow growth theory in macroeconomics, whose treatment of capital I found appalling: not only does it treat capital as homogeneous, but it also separates technology and capital, thereby disallowing innovation and improvement in capital goods.

The framework offered here was shaped into current form by my research into software development. Evolving principles of sound software development match up well with and even inform Austrian capital theory. (Indeed, it is not clear to me anymore to what extent my Austrian theory has shaped my understanding of what software developers do, and to what extent my observation of software development has shaped my theory.)

The approach presented here is built on Menger’s view of capital as knowledge, Hayek’s insights about the local, dispersed nature of knowledge, and Lachmann’s insistence on capital structure – that "complementarity is of the essence" in understanding capital.

Certain conclusions rise on these foundations: If capital is knowledge and knowledge is initially dispersed, then new capital development must be a social learning process in which initially dispersed knowledge gets built into the new capital goods. If complementarity is of the essence, then knowledge of the relevant complementarities, of how to fit a new capital good into the existing capital structure, is an important kind of knowledge that must get built into new capital. As for systemic evolution, because the capital structure is a web of overlapping relationships, introduction of a new capital combination in one area will create entrepreneurial opportunities for changes in other areas. Hence the overall evolution of the capital structure is a coevolutionary process in which one development leads to another.

Part I of this paper lays out the foundational concepts of capital as knowledge and the structural relationships of capital. Part II draws conclusions about the nature of the process by which the capital structure evolves, drawing on software development for illustration.

I. An Austrian perspective on capital

The view of capital structure evolution presented here is based on an interpretation of capital as knowledge derived primarily from Menger, and on Lachmann’s insistence that capital functions in a structure. Before considering capital structure evolution as such, let us take time to consider these foundational ideas.

Capital as embodied knowledge[4]

Capital[5] is embodied knowledge. It is human knowledge of how to accomplish some productive purposes, embodied in some medium and ready for productive use. This view is expressed by Carl Menger when he writes,

The quantities of consumption goods at human disposal are limited only by the extent of human knowledge of the causal connections between things, and by the extent of human control over these things. (1981, p. 74)

Because this statement comes in a passage contrasting simple collection of first-order goods with employment of higher order goods in production processes, it is clear that we are to take the use of higher-order goods – capital goods – as the application of the knowledge Menger speaks of. When we know how to produce in a roundabout way, we employ capital goods for the purpose. Our knowledge is to be found in practice not in our heads, but in the capital goods we employ. Capital is embodied knowledge.[6]

More particularly, capital embodies knowledge of how to accomplish some purpose(s). Much of our "knowledge of the causal connections between things," and of how to effect the changes we desire, is not articulate but tacit. In the beginning of Wealth of Nations, Adam Smith speaks of the "skill, dexterity, and judgment" (p. 7) of workers; these attributes are a kind of knowledge, a kinesthetic knowledge located in the hands rather than in the head. The improvements Smith tells us these skilled workers make in their tools are embodiments of that knowledge. The very design of the tool passes on to a less skilled or dexterous worker the ability to accomplish the same results. Consider how the safety razor enables us unskilled and clumsy academics to shave with the blade always at the correct angle, rarely nicking ourselves. How well would we manage with straight razors? The skilled barber's dexterity has been passed on to us, embodied in the design of the safety razor.

Adam Smith gives a clear example of the embodiment of knowledge in capital equipment in his account of the development of early steam engines, on which:

a boy was constantly employed to open and shut alternately the communication between the boiler and the cylinder, according as the piston either ascended or descended. One of those boys, who loved to play with his companions, observed that, by tying a string from the handle of the valve which opened this communication to another part of the machine, the valve would open and shut without his assistance, and leave him at liberty to divert himself with his playfellows. (p. 14)

The tying on of the string, and the addition of the metal rod which was built on to subsequent steam engines to accomplish the same purpose, is an archetypal case of the embodiment of knowledge in a tool. The boy's observation and insight, improved further by the engineers who came after him, were built into the machine for use indefinitely into the future.[7]

The point here concerns theoretical perspective: Capital theory (and growth theory in its treatment of capital) should consider capital goods to be first and foremost knowledge, rather than physical stuff. This is not to say the physical stuff is unimportant: All capital goods are a combination of knowledge and physical stuff, in which the knowledge is embodied. But the knowledge is fundamental and primary; the physical stuff is incidental and secondary.[8] Capital theory will be more powerful if it puts knowledge first.

Consider a hammer, for instance. It is physical wood (the handle) and minerals (the head). But a piece of oak and a chunk of iron do not make a hammer. The hammer is those raw materials plus all the knowledge required to shape the oak into a handle, to transform the iron ore into a steel head, to shape it, fit it, and so on. There is a great deal of knowledge embodied in the precise shape of the head and handle, the curvature of the striking surface, the proportion of head weight to handle length, and so on.

The knowledge is necessary while the particular physical stuff is not. The same knowledge might be embodied in a handle and head made of other materials, say, a carbon-fiber handle and a titanium head (or a buffalo bone and a rock).

Even with a tool as bluntly physical as a hammer, the knowledge component is of overwhelming importance. With precision tools such as microscopes and calibration instruments, the knowledge aspect of the tool becomes more dominant still. We might say, imprecisely but helpfully, that there is a greater proportion of knowledge to physical stuff in a microscope than in a hammer.

Computer software illustrates the point at the logical extreme. Software is less tied to any physical medium than most tools. We think with equal comfort of a given program as a program, whether it is printed out on paper, stored on a diskette, or loaded and running in the circuits of a computer. Of course, to function as capital the software must be loaded and running in the physical medium of a computer. Nevertheless, if we want to understand the contribution of computational devices to production, and their place in the evolution of the capital structure, we must focus on the software and treat the computers’ physical circuits as secondary.[9]

While the distinction between a tool system’s knowledge and its physical embodiment is obvious with computers running software, there is no fundamental difference between software tools and conventional tools in this respect. What is true of software is true of capital goods in general. What a person actually uses is not software alone, but software loaded into a physical system – a computer with a monitor, printer, plotter, or whatever. The computer is the multi-purpose, tangible complement to the special-purpose, intangible knowledge that is software. When a word-processor or computer-assisted design (CAD) package is loaded and run, the system becomes a dedicated writing or drawing tool.

But there is no important difference in this respect between a word-processor and, say, a hammer. The oaken dowel and molten steel are the multi-purpose, tangible complements to the special-purpose, intangible knowledge of how to drive a nail. When the knowledge of what a hammer is is imprinted on the oak in the shape of a smooth, well-proportioned handle, and on the steel in the shape, weight, and hardness of a hammer-head; and when the two are joined together properly; then the whole system – raw oak, raw steel, and knowledge – becomes a dedicated nail-driving tool.

All tools are a combination of knowledge and matter. They are knowledge imprinted on or embodied in matter. Software is to the computer into which it is loaded as the knowledge of traditional tools is to the matter of which those tools are composed.

If this is true, then knowledge is the key aspect of all capital goods. After all, the matter is, and always has been, "there." As Bohm-Bawerk says in discussing what it means to produce:

To create goods is of course not to bring into being materials that never existed before, and it is therefore not creation in the true sense of the word. It is only a conversion of indestructible matter into more advantageous forms, and it can never be anything else. (1959, p. 7)

Likewise the evolution – the improvement – of the capital structure is fundamentally a matter not of accumulating steel or plastic or silicon, but of increasing, refining, and coordinating human knowledge of how to accomplish human purposes.

Capital structure – complementarity and heterogeneity

The second foundational insight of the view of capital structure evolution presented here is the idea of capital structure itself, which Ludwig Lachmann, following Hayek and Schumpeter,[10] has illuminated so usefully.

Horizontal relationships

Understanding the nature of capital development requires a clear appreciation that capital goods work and have value in particular relationships with one another – in the capital structure. (Lachmann 1978, Hayek 1941) New tools contribute to the economy not by being thrown, as it were, into a bubbling production pot, where one ingredient adds as much to the amorphous stew as another. Rather each must fit into a structure, or, to use a less mechanical and static metaphor, each must play a particular role in a particular niche in a kind of economic ecosystem. (Rothschild, 1990) If a capital good is ill adapted to its niche, it makes no contribution, fails to sustain itself, and is selected out.

Lachmann stresses complementarity in use, what we might call horizontal relationships among capital goods used together at the same stage of production. He writes,

It is hard to imagine any capital resource which by itself, operated by human labour but without the use of other capital resources, could turn out any output at all. For most purposes capital goods have to be used jointly. Complementarity is of the essence of capital use. But the heterogeneous capital resources do not lend themselves to combination in any arbitrary fashion. For any given number of them only certain modes of complementarity are technically possible, and only a few of these are economically significant. (1978, p. 3, emphasis in original)

In this context the simultaneous use of fixed capital and working capital comes to mind: In most production processes durable fixed capital performs operations on more transient working capital. Foundries smelt ore into metal; mills grind wheat into flour; printers turn ink and paper into documents; spreadsheet programs transform data into meaningful information.

Let us note in passing that the distinction between an individual capital good and a capital combination composed of individual capital goods is, for any tools other than the most primitive, a matter of viewpoint. A word processing software package, for example, is from one point of view a distinct capital good – an individual tool for composing documents. From a zoomed-out point of view, however, word processing software can be seen as useless by itself: it must rather form part of a capital combination including computer, printer, paper, the electrical power grid, and so on, in order for documents to be produced. From still a third, zoomed-in point of view, word processing software is not a singular capital good at all, but rather a capital combination: It comprises many capabilities such as the footnote formatter, the spell-checker, the layout tools, and so on. We may zoom in more deeply still, and discover that each of these capabilities can itself be usefully understood as a capital combination, some of them remarkably complex. The spell-checker, for example, is composed of a wide variety of large and small software modules, from small string search routines to the massive lexicon database.[11]
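The zooming metaphor can be illustrated with a small sketch. The composition below is hypothetical and of my own devising, not an account of any real word processor; it shows only that the same artifact is a single tool at one zoom level and a combination of parts at the next.

```python
# Hypothetical sketch of nested capital combinations: what appears as one
# capital good at one zoom level is a combination of goods at the next.

class SpellChecker:
    """Zoomed in, itself a combination: a lookup routine plus a lexicon."""
    def __init__(self, lexicon):
        self.lexicon = set(lexicon)   # stands in for the lexicon database
    def check(self, word):
        return word in self.lexicon

class WordProcessor:
    """Zoomed out, a single tool; zoomed in, a combination of capabilities."""
    def __init__(self, spell_checker):
        self.spell_checker = spell_checker
    def misspelled(self, text):
        return [w for w in text.split() if not self.spell_checker.check(w)]

wp = WordProcessor(SpellChecker(["capital", "is", "knowledge"]))
errors = wp.misspelled("capital iz knowledge")   # the combination at work
```

And the word processor, in turn, only functions as part of a still larger combination: computer, printer, power grid, and so on.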

Vertical relationships

The capital structure comprises not only horizontal but also vertical relationships, the familiar relationships between goods of different order. This is Bohm-Bawerk’s roundaboutness. In lengthening the capital structure, we develop tools for producing tools for producing tools and so on. The better the tools at each stage, the better and more cheaply we may produce the goods at the next lower stage. Menger stressed the importance of lengthening the capital structure:

Assume a people which extends its attention to goods of third, fourth, and higher orders... If such a people progressively directs goods of ever higher orders to the satisfaction of its needs, and especially if each step in this direction is accompanied by an appropriate division of labor, we shall doubtless observe that progress in welfare which Adam Smith was disposed to attribute exclusively to the latter factor. (p. 73)

Frequently, there is a kind of recursion or feedback involved, in that developments at one stage make possible developments at a lower stage, which in turn improve processes at the first stage. Better steel, for example, the product of a steel mill, makes possible the construction of better steel mills. The availability of the programming language Smalltalk made possible the user interface builder WindowBuilder, which was itself an improvement to Smalltalk.

Capital structure evolution means increasing complexity

If we visualize together the horizontal and vertical relationships among capital goods, and allow ourselves to zoom in and out mentally so as to keep in mind the nested nature of capital combinations, we see the capital structure as a complex web of capital complementarities. Raw and intermediate goods flow down from the highest orders. At each stage of production fixed capital combinations operate on the converging and branching flows of working capital. Each of the fixed capital goods, the tools of production, is itself an endpoint to some flow of raw and intermediate goods, processed in stages by higher-order fixed capital. (Complementary to the whole fabric, of course, are the human skills and efforts that direct the capital goods, and the processes or routines by which they are directed.)

To understand capital structure evolution, we must focus on the development process; the technology is in the capital

Before turning directly to the process of capital structure evolution, let us note that in order to understand it, our attention has to be on the design process rather than the manufacturing process, because it is with new or enhanced designs (and corresponding processes), rather than by mere accumulation of extant machinery, that the structure of production changes in the most important ways.

To appreciate the distinction, contrast our common conceptions of producing cars, on the one hand, and of producing software, on the other. When we think of GM producing cars, we think of their work creating new instances of extant designs. True, GM employs many designers, who design new cars, but we don't think of that; we think of the assembly line, spot welding, riveting, bolting, etc. We think of the physical work of realizing these designs – imprinting a design on metal and rubber and glass so that a new instance of the design – a new car – comes to be.

When we think of Microsoft's work producing software, by contrast, we think of programmers writing code – creating new designs (or enhancing older designs). True, Microsoft employs people who store the programs onto diskettes, thus in a sense creating instances of the extant designs; but we don't think of that; we think of the late nights at the terminal designing, coding, revising, running, debugging, etc. We think of the mental work of creating new software, that is, new designs, specific instances of which will eventually be copied by the thousands onto CDs and distributed.

I emphasize this point because mainstream economic growth theory, both in its older, Solovian versions and in the "new growth theory" best represented in the work of Paul Romer, generally overlooks (ignores?) the production of new capital good designs. It treats capital as essentially physical. The models generally depend on unchanging production functions in which capital is represented by a single variable exhibiting diminishing returns. In that equilibrium framework,[12] more capital can only mean more of the same kind of capital.[13] When capital accumulation involves using no new designs, but simply greater quantities of existing kinds of goods, we might say that the capital structure has expanded, but not that it has evolved. Evolution implies a change in the way production is carried out, a different production function.

When standard growth theory models do include a role for knowledge, they usually treat knowledge ("technology") and capital as distinct and independent. Technology gets a separate variable in the production function. The approach we take here departs radically from the standard approach. It rejects the separation of technology and capital and holds, rather, that for the vast majority of human production processes, (most of) the technology is in the capital goods used. Capital and technology cannot be separated.
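To make the target of this criticism concrete, the standard treatment can be sketched in conventional notation (the symbols below are the textbook ones, not taken from this paper):

```latex
% Textbook growth accounting: technology A enters as a separate variable
% multiplying a production function of homogeneous capital K and labor L.
Y = A\,F(K, L), \qquad \text{e.g. the Cobb-Douglas form} \quad
Y = A K^{\alpha} L^{1-\alpha}, \qquad 0 < \alpha < 1.
% Because A is independent of K, an improvement in the design of capital
% goods has nowhere to appear: accumulation can only mean a larger K of
% the same homogeneous kind.
```

On the view taken here there is no separate A: the technology is carried in the capital goods themselves, so progress shows up as new kinds of capital and new capital combinations rather than as growth in a scalar multiplier.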

Moreover, in an energetically developing economy such as ours, capital accumulation occurs not through adding more and more (identical) tractors to the same field, but through replacing the generic tractor with specialized planters, cultivators and harvesters, not to mention irrigation and improved seeds. It occurs not through production of more and more (identical) word processors and printers, but through the development of internet publishing.[14] Lachmann writes,

As capital becomes more plentiful its accumulation does not take the form of multiplication of existing items, but that of a change in the composition of capital combinations. Some items will not be increased at all while entirely new ones will appear on the stage. (1978, p. 79)

In our present effort to investigate how the capital structure develops and improves, then, it is essential to focus on production of new (or improved) designs. To that we turn now.

II. Observations on the process of capital development

It follows from our treatment of capital as embodied knowledge that new capital development must be a social learning process: Because capital is embodied knowledge, capital development is a matter of learning, through which the knowledge gets embodied in the new good.[15] Because the necessary knowledge is initially dispersed among many different people who must interact to communicate their particular and often tacit knowledge (Hayek 1945; Polanyi 1958), capital development is a matter of social interaction. Because this interaction takes time and because the capital structure changes as learning occurs, capital development is an on-going process. In brief, because capital is embodied knowledge, capital development is a social learning process.

As we shall see now, the history of the software development process supports this view. We look first at the way the software development process has become oriented toward facilitating social learning, and then at the way in which software itself has evolved for evolvability.

Software prototyping as social learning

Much traditional software development methodology takes a central planning approach, and consequently suffers from the knowledge problems that face complex software systems. It assumes that the knowledge a new software system needs to embody is static, that it can be identified and clearly articulated at the outset.

In the classic "waterfall" model of traditional methodology, software development is supposed to cascade from requirements specification to analysis to design to coding to debugging to delivery. Note that in this approach, design of the software begins only after the software requirements have been fully specified by the clients. The approach was wildly unrealistic:

[T]he conventional 'waterfall' methodology practiced in most large companies today ... requires the creation and approval of numerous detailed documents before the first procedure is ever written ... [and] doesn't allow any modifications once the actual programming has begun. This constraint frustrates [client] managers to no end because they rarely know what they really want until they see it running on a screen, at which point it's too late to make any changes! (Taylor 1990, p. 97, emphasis added)

Failures of immense proportions frequently resulted.

The software development community has been learning from these failures, however, and better approaches have been evolving in recent years.[16] These approaches use various techniques aiming to discover what knowledge is relevant – what needs and opportunities a new software tool may address and how. Important among these techniques is prototyping. Prototyping constitutes a kind of dialogue in which all the various people participate whose knowledge must be embodied in the new capital. The medium for the dialogue is the prototype itself – the emerging design. In a useful sense it is the prototype itself that learns, rather than the human participants, because in the prototype alone may all the relevant knowledge be found in useful form.

The prototyping approach recognizes that the knowledge necessary in a new capital good is initially dispersed, constantly changing, and difficult to articulate. Perhaps most importantly, as the quotation from Taylor above suggests, the clients, for whom the software is being designed, do not know in much detail what they want, nor can they say what it is. Their knowledge is tacit, inarticulate. Accordingly the most fundamental kind of knowledge necessary to the tool-building process – what the tool is to do – is not readily accessible at the outset.

Prototyping provides a means of eliciting this knowledge. Simply put, rapid prototyping works as follows: After the clients give the developer a rough idea of what they want, the developer produces a very simple prototype which the clients can try out on the computer. Then follows a repeated sequence of the following steps:

  • The clients try out the current version of the prototype and react to it. They explain as well as they can what they like and don't like. Equally important, the developers observe what the clients do and don't do with the prototype, what they try, what they ignore, where they are frustrated, and where they are pleased.
  • Informed with this new knowledge, the developer improves and extends the prototype, and offers this next version to the clients for trial.
  • The cycle continues in a kind of dialogue – a conversation in which the prototype itself is passed back and forth, as much as any words about it – until the prototype has been refined to where it contains the functionality the client needs. At that point the initial version of the software to be delivered is defined. Significantly, the prototype itself defines the clients’ needs with an accuracy and completeness that would have been impossible without the prototyping process. The developers may then turn to the details of implementation.
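The cycle just described can be caricatured in a few lines of code. The sketch below is my own, not drawn from any software engineering source the paper cites; its only point is that the clients' latent needs are revealed one reaction at a time, so the finished specification exists at the end of the loop, never at the beginning.

```python
# A schematic sketch (hypothetical) of the rapid-prototyping cycle:
# clients react to each version, and the developer folds that feedback
# into the next one, until the prototype embodies what is needed.

def clients_react(prototype, latent_needs):
    """Clients cannot list their needs up front, but they can point
    at what the current prototype still lacks."""
    return [need for need in latent_needs if need not in prototype]

def revise(prototype, feedback):
    """The developer builds one piece of newly revealed knowledge
    into the next version of the prototype."""
    return prototype | {feedback[0]}

# Hypothetical needs the clients could not have articulated at the outset.
latent_needs = {"spell-checker", "footnotes", "page layout"}
prototype = set()        # the first rough version embodies almost nothing

versions = 0
while True:
    feedback = clients_react(prototype, latent_needs)
    if not feedback:     # the prototype now defines the clients' needs
        break
    prototype = revise(prototype, feedback)
    versions += 1
```

Each pass through the loop corresponds to one trial-and-revision round; the "requirements document" is simply the final state of the prototype.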

Prototyping is a discovery process in which knowledge is revealed and developed. We can identify different kinds of knowledge which prototyping brings out and builds into the new software. One is conscious but inarticulate knowledge. Frequently clients for whom software is being built cannot say very well what they want, but they recognize it when they see it. In using the prototype, the clients bring their tacit knowledge to bear, and where the prototype does not match smoothly with what they actually do or want to do, they detect problems. Notably, they need not be able to explain these problems completely. Tacit knowledge made more explicit through interaction with the prototype need not be made fully articulate, but only clear enough so that it can be communicated to the designer for incorporation into the next version of the prototype.

Another kind of knowledge prototyping brings out is perhaps the most important: unconscious, or latent, knowledge of the capabilities the software should have and how it should function. This kind of knowledge is at issue in the development of new kinds of systems doing new kinds of things. The client users often begin with little more than a conviction that a well-thought-out system must somehow enhance their operations. What that system needs to be and do has to be worked out collaboratively by members of the organization and their software developers.[17] Designers of such systems speak in terms of "exploring what kind of tool [the clients] really needed," of observing how users spend time with the prototype, and of how, thereby, "the key nature of [certain capabilities] became apparent."[18]

Still another kind of knowledge that prototyping generates is knowledge of capabilities that users could not have wanted before the prototyping process began, because they had never thought of such capabilities. These new possibilities occur to them as a consequence of interaction with the prototype, in which the designers, trying to figure out what is wanted, offer certain capabilities to try out. Not infrequently the trial version will not exactly suit, but it gives the users insight into similar capabilities they discover they do want, capabilities that had never occurred to them before.

Prototyping evokes knowledge not only from the clients but also from the designers. The designers themselves are engaged in an evolutionary sub-process of generating the new knowledge that constitutes the evolving design. They are learning not just about what the client needs, but also about what they themselves can do and how to do it. Their knowledge of design principles and various problem-solving techniques is neither all ready to hand, nor static and complete. They discover how to apply this knowledge to new problems in the process of applying it. They ponder, they sketch, they experiment, they try out various ways of decomposing the problem, they make some initial decisions, they repeat the process. They learn by doing.

A good illustration of the manner in which the designer learns through working with the design comes in this description of the initial laying out of the views (screens) that the user will see:

The actual act of laying out the view provides you with another set of information you will need in constructing the prototype. By deciding on the visual grouping of information in the view, you will also be determining any data assembly, or aggregation, capabilities that the view needs.

By laying out a view, the software designer learns something; by deciding on grouping, he determines needed capabilities. In brief, by interacting with the evolving design, the designer learns more about what it should be.

A creative process such as software design is not deterministic, with output dictated by input through some sort of black-box optimization. This would require the designer to grasp the problem in its entirety at a glance, and on that basis to grasp its "correct" solution. On the contrary, software design is an evolutionary process in which the designer "makes sense" of the problem over time, and gradually puts the design together.[19]

The growth of prototyping in software development is an implicit recognition within the software industry that knowledge is more tacit and more dispersed than had previously been appreciated. Software development is coming to be seen as less and less a matter of manipulating static knowledge and more and more a matter of dynamic learning.

In this discussion of prototyping we have deferred paying attention to the necessity for new software (and by extension, any new capital good) to fit successfully into capital combinations. In fact, much of the knowledge generated in the prototyping process is knowledge of necessary complementarities. But because these relationships change as other capital goods in the combinations change, developers face a moving target. And the target does not stop moving once the software is "finished." The challenge of maintaining complementarity with constantly changing capital combinations is known among programmers as software maintenance. To that topic, and to software developers’ strategies for coping with anticipated change, we turn now.

Designing evolvable software: planning for change

The process of software development does not end when the first version is shipped to the customer. The world changes, hence the software must change with it, if it is to maintain or increase its value as a useful capital good. Users' requirements change as their businesses change; the software needs new features to keep up with competitive products; it needs to run on new machines, to be used on networks, etc. On the broad view, as the economy grows and develops, software products must themselves "learn" – be enhanced by the incorporation of new knowledge – to maintain and/or improve their position in the evolving capital structure.

    The process of adapting and enhancing existing software is known as software maintenance. The term seems strange to those unaccustomed to its usage in the software field, because software does not wear out and hence should need no maintenance. To economists, however, the term makes sense. As Hayek has stressed (1935), to maintain capital is fundamentally to maintain its value in the evolving capital structure. Obsolescence is just as important as wear and tear. Programmers speak of "bit rot" – the creeping incompatibility that erodes software's usefulness as the environment changes with new computers, peripherals, operating systems, and so on, while the code itself does not change. Software maintenance is a matter of complementarity. To maintain the value of a piece of software, even when what it does remains constant, requires changing that software to keep it complementary to the changing capital goods with which it must work. The term "maintenance" is thus not misapplied.

    In the early days of computation, when computers had little memory or power and program tasks were simple, programmers could easily "hack" solutions to problems, programming in an ad hoc, undisciplined way. When changes needed to be made to these small systems, it was easy to see what needed to be done. But as systems grew larger and more complex, the software maintenance problem grew faster still. Changing carelessly built systems became much more difficult. One part of the system affected another, which affected another, and so on, in ways that were often entirely unclear. (At its worst this problem is called "spaghetti code.") The expense of maintaining software systems grew overwhelming, reaching approximately 70% of the total cost of a software project by 1987 (Meyer 1988, p. 7).

    To make maintenance easier and less costly, the software industry has been driven to develop techniques that make it possible for systems to change more smoothly and easily.

    The problem illustrates dramatically the truth of Lachmann’s dictum that "complementarity is of the essence." To repeat: the software "maintenance" problem has nothing to do with wear and tear, because bits don’t wear out. It has everything to do with maintaining the software’s complementarity with other capital goods (hardware and software) with which it must be used.

    Best practices in software development now seek to build software so as to facilitate change in general. Good design, in an uncertain world, is design which prepares for change – which allows the software to "learn" easily, to embody new knowledge readily. Good design entails design evolvability.[20]

    Evolvability, in turn, depends on modularity. Simply stated, modularity leads to evolvability for two related reasons: 1) it reduces the amount of change necessary, and 2) it makes more understandable what must be changed. In order for a software system to evolve smoothly as its users and builders learn how it could be improved, its overall structure must allow the system engineers to change it without too much difficulty. When software architecture is appropriately modular, with functionality encapsulated in relatively independent modules, making the necessary changes is relatively easy: Changes are confined to few modules and are therefore relatively easy to identify. In non-modular architectures, by contrast, there are numerous hidden interdependencies among different parts of the system ("spaghetti code"). These make adaptation or extension very difficult, because so many different parts of the system are affected in so many ways that it is not clear what must be done.
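    The point about encapsulation confining change can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the class and function names are invented, not drawn from any system discussed here): client code depends only on a module's interface, so the module's internals can change without disturbing the rest of the system.

    ```python
    # Hypothetical sketch of encapsulation: callers depend only on the
    # module's small public interface, so changes to its internal storage
    # stay confined to the module itself.

    class PriceTable:
        """An encapsulated module: internal representation is hidden."""
        def __init__(self):
            self._prices = {}          # internal detail; free to change

        def set_price(self, item, price):
            self._prices[item] = price

        def price_of(self, item):
            return self._prices[item]

    def invoice_total(table, items):
        # Client code touches only the public interface, never _prices.
        return sum(table.price_of(i) for i in items)

    table = PriceTable()
    table.set_price("widget", 3.0)
    table.set_price("gear", 2.5)
    print(invoice_total(table, ["widget", "gear", "gear"]))  # 8.0
    ```

    If the internal dictionary were later replaced by, say, a database lookup, only `PriceTable` would change; in a "spaghetti" design, where clients read the dictionary directly, every client would have to be found and recoded.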

    The amount of work to be done is not the main issue; the issue is understanding what must be done. A far more important problem than the sheer volume of recoding is the danger that what recoding must be done will not be clear. A non-modular system is significantly more difficult to understand than a modular one: when functionality is added or changed, it is not clear what parts of the system are affected.[21] A great deal of effort must be expended finding out where problems remain. In extreme cases of multiple interdependencies in large systems, the system becomes literally incomprehensible. Then adding or changing functionality in any but trivial ways is so difficult that the task is not one of change but of beginning again and recreating the system entirely. That particular software species, so to speak, becomes extinct, because it can no longer adapt to the changing climate of business.

    Modularity makes possible the evolution of extremely complex systems because it allows people to understand the system in pieces at various levels of abstraction. Each module is understandable as an entity on its own, and the overall system structure is understandable in terms of the relationships among these entities. While no one can understand a whole (large) system in its entirety all at once, in order to maintain the system it is necessary only to understand clearly defined pieces of the whole, and their interrelationships with near neighbors.

    Modularity also promotes evolvability because it leads to decentralized rather than hierarchical architectures, making it easier to add and remove functionality. Traditional design approaches frequently involve functional decomposition, in which a central function or purpose for the system is systematically decomposed into subprocesses at ever more fine-grained levels. In such architectures, it is difficult to add new functionality without reconstructing much of the whole. Modular architectures, by contrast, tend to be designed by representing the various parts of the system being modeled. In such decentralized architectures, the pieces have a more equal relationship; the structure is more organic. Adding functionality is more like adding a node to a network than reconstituting a rigid skeleton.[22]
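    The "adding a node to a network" image can be made concrete with a small sketch. In this hypothetical illustration (the registry pattern and all names here are invented for the example), new functionality joins a decentralized system by registering itself; no existing code is rewritten, as it would have to be in a rigid functional decomposition.

    ```python
    # Hypothetical sketch of a decentralized architecture: capabilities
    # are independent handlers in a registry, not branches of one central
    # decomposition. Adding one is like adding a node to a network.

    handlers = {}

    def register(name):
        """Decorator that adds a self-contained handler to the registry."""
        def wrap(fn):
            handlers[name] = fn
            return fn
        return wrap

    @register("report")
    def report(data):
        return f"{len(data)} records"

    # Later, a new capability joins without touching report() or the registry:
    @register("total")
    def total(data):
        return sum(data)

    print(handlers["total"]([1, 2, 3]))   # 6
    ```

    Removing a capability is equally local: deleting one handler leaves every other node of the "network" untouched, whereas pruning a branch of a hierarchical decomposition typically forces changes up and down the tree.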

    That software developers consciously seek to make their designs evolvable suggests an extension to Hayek and Lachmann’s work on capital maintenance. Both discuss the necessity for producers to restructure their capital combinations when unexpected changes occur. Neither, however, emphasizes the issue raised here, of maintaining flexibility in capital designs so as to cope with future changes that cannot be fully anticipated. Lachmann, for example, speaks of "the changing pattern of resource use which the divergence of results actually experienced from what they had been expected to be, imposes on entrepreneurs." (1978, p. 35, emphasis added). He says,

    The capital stock in existence always contains 'fossils', items that will not be replaced and would not exist at all had their future fate been correctly foreseen at the date of their investment. (1986, p. 61, emphasis added)

    Focusing as he does on changes that must be made in the capital structure when entrepreneurs incorrectly forecast the future, Lachmann seems to suggest that entrepreneurs commit themselves rather firmly to their vision of the future, tying their capital investments tightly to the future needs they anticipate, and allowing themselves little flexibility to adjust if events take a different path. In such cases we can properly speak, as Lachmann does, of "failure" and "error."

    The efforts that software developers make to build software that is evolvable suggest the slightly different view that successful producers of capital goods do not (always) commit themselves to a particular view of the future. In order to maintain the evolvability of their designs, especially in our time of astonishingly rapid capital structure evolution, they instead make their best estimate of a range of likely paths of capital structure evolution, and build into the design of their products the flexibility to cope with that range of paths. The upshot is very much the same, of course: there must be constant adjustment, because the future was not, and could not be, correctly anticipated in all its detail. But many of the imperfectly adapted capital goods in use at any time can be seen as "imperfect" not as a result of failure, but as a result of planned flexibility.[23]

    Because many entrepreneurs are simultaneously planning capital developments based on their various estimations of how the capital structure might evolve, and because their various estimations depend in part on what they see others doing (at least with capital that might complement or substitute for their own), the whole process is co-evolutionary. Which tools become useful and which become obsolete at any time is determined by what other tools happen to be developed at the same time.

    Consequently the notion of a "best solution" to a particular problem is a mirage that appears when one focuses on the moment. In another moment the problem will have changed, and there will be a new "best solution," because others have been working on related problems. There is no fixed skeleton or underlying architecture for the capital structure. The skeleton, the architecture, grows organically as particular entrepreneurs make particular choices.[24] Each choice in response to a particular aspect of a problem poses a new, or at least a changed, problem for other participants in the process. In the words of Peter Allen, a specialist on evolutionary dynamics at the International Ecotechnology Research Centre:

    Evolution is not just about the solving of optimization problems, but also about the optimization problems posed to other populations. It is the emergence of self-consistent 'sets' of populations, both posing and solving the problems and opportunities of their mutual existence that characterizes evolutionary dynamics. (1990, p. 25)

    The developers of new capital goods, then, must try to build their products so that they can readily evolve to maintain a reasonably good fit in the evolving capital structure around them, regardless of how, out of a broad continuum of possibilities, that capital structure does in fact evolve.[25]

    Capital structure evolution

    The capital structure can be seen as a great web of vertical and horizontal relationships. The vertical relationships represent the orders of capital goods and the downward flows of working capital; the horizontal relationships represent the capital combinations used together in production. The whole rests on and makes possible the myriad consumer goods the web exists to produce.

    For purposes of understanding how it evolves, the capital structure can be seen as analogous to a modular software system. Such a system is maintained (developed, enhanced) by the addition and replacement of modules. The modules can be of all different sizes (granularity), because larger modules are composed of smaller modules. Modules are replaced, in a well-maintained system, when designers come up with better (better adapted) functionality. Analogously, the capital structure as a whole can be seen to evolve as entrepreneurs extend it with new capital goods or enhance it by replacing existing capital (combinations) with better (better adapted) ones.
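    The module-replacement analogy above can be sketched in code. This is a hypothetical illustration (the sorter classes and the `pipeline` function are invented for the example): a "better adapted" module replaces an older one behind the same interface, and the rest of the structure continues to work unchanged, just as a new capital good slots into an existing capital combination.

    ```python
    # Hypothetical sketch of maintenance by module replacement: two modules
    # share one interface (a sort() method), so one can supplant the other
    # without disturbing the surrounding "structure".

    class BubbleSorter:
        """The original module: a simple but slow bubble sort."""
        def sort(self, xs):
            xs = list(xs)
            for i in range(len(xs)):
                for j in range(len(xs) - 1 - i):
                    if xs[j] > xs[j + 1]:
                        xs[j], xs[j + 1] = xs[j + 1], xs[j]
            return xs

    class BuiltinSorter:
        """The better-adapted replacement: delegates to Python's sorted()."""
        def sort(self, xs):
            return sorted(xs)

    def pipeline(sorter, data):
        # The surrounding structure depends only on the shared interface.
        return sorter.sort(data)

    # Swapping modules requires no change to pipeline():
    assert pipeline(BubbleSorter(), [3, 1, 2]) == pipeline(BuiltinSorter(), [3, 1, 2])
    ```

    The design choice that makes the swap painless is that `pipeline` never asks which sorter it holds; it relies only on the agreed interface, the software counterpart of a stable pattern of complementarity.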

    Capital structure evolution thus generally involves a lengthening of the capital structure, whether at the edges or within. It entails what Lachmann calls a "'division of capital,' a specialization of individual capital items" (1978, p. 79). It entails what we might call a "complexifying" of the capital structure, an increasing intricacy of the pattern(s) of complementarity among increasingly specialized capital goods, born in the on-going growth and division of knowledge.[26] Lachmann, again:

    We conclude that the accumulation of capital renders possible a higher degree of the division of capital; that capital specialization as a rule takes the form of an increasing number of processing stages and a change in the composition of the raw material flow as well as of the capital combinations at each stage; that the changing pattern of this composition permits the use of new indivisible resources; that these indivisibilities account for increasing returns to capital... (1978, p. 85)

    Capital structure evolution comes about through the unceasing efforts of entrepreneurs to change the capital structure as it exists at any time. Always there are design projects in motion whose issue, their participants hope, will find a profitable place in the structure of production of goods of the next lower order. The acceptance of these designs in capital combinations actually used is not assured. That depends on the simultaneous design efforts of other entrepreneurs who must provide the necessary complements. The intended place of these designs might be filled by some competing product before they are even finished. The project might turn out to be more difficult than anticipated and too expensive to finish. (Millions of dollars' worth of software development falls into this category.) The necessary, anticipated complementary goods might never appear, or might appear in incompatible form. Some designs actually get tried, but are then rejected or replaced in favor of some other good more suitable in that combination.

    The capital structure might be seen, then, as consisting of a main and a secondary part. The (temporary!) main structure consists of all those capital goods that are actually connected by one roundabout strand of relationships or another to actual consumer goods. This main structure is enveloped by secondary structures that are entrepreneurial trials – capital combinations in development and flows of working capital intended to replace (improve) some part of the main structure. Some fail; some succeed and are incorporated. Then the capital combinations they replace must find some other place in the structure or be scrapped. By this process the capital structure extends and becomes more complex as ever-new knowledge is embodied in it.
     
     

    Notes

    [1]  1981, p. 74.

    [2]  Most of what follows is based closely on Baetjer 1998.  Much of what appears in Part I also appears in slightly different form in Baetjer 2000.

    [3]  Joel Mokyr presents compellingly the importance of ever-improving capital equipment in his 1990 book, The Lever of Riches.  The other main source of economic progress is expansion of trade.  Both depend on the institution of rules of just conduct and stable money.

    [4]  The content of this section is adapted from Baetjer (1998), pp. 8-13.

    [5]  Capital goods, as opposed to financial capital, human capital, or capital in the abstract (see Lewin 1999, pp. 5-6) are meant here.  From this point on capital may be taken in all cases to mean capital goods – hardware and software, the “produced means of production,” unless otherwise noted.

    [6]   Hayek writes,
    Take the concept of a 'tool' or 'instrument,' or of any particular tool such as a hammer or a barometer.  It is easily seen that these concepts cannot be interpreted to refer to 'objective facts,' that is, to things irrespective of what people think about them.  Careful logical analysis of these concepts will show that they all express relationships between several (at least three) terms, of which one is the acting or thinking person, the second some desired or imagined effect, and the third a thing in the ordinary sense.  If the reader will attempt a definition he will soon find that he cannot give one without using some term such as 'suitable for' or 'intended for' or some other expression referring to the use for which it is designed by somebody.  And a definition which is to comprise all instances of the class will not contain any reference to its substance, or shape, or other physical attribute.  An ordinary hammer and a steamhammer, or an aneroid barometer and a mercury barometer, have nothing in common except the purpose for which men think they can be used. (1979, p. 44)

    [7]  Joel Mokyr’s book (p. 245 n.) called my attention to the fact that the story is “largely…mythical,” according to Edwin Cannan’s editorial notes to Wealth of Nations (Smith [1776], 1976, book I, pp. 13-14).  The example is helpful nonetheless.  In any case the account given in the source Cannan supposes Smith to have misread makes the point stressed here – that capital embodies knowledge – even if it does not so well support Smith’s point that the source of that knowledge is often workmen using the machines.

    [8]  An anonymous referee of a draft of a related paper (Baetjer 2000, forthcoming) criticized my writing that “capital is essentially knowledge.”  He or she suggested that looking for Platonic essences here was a waste.  The point is well taken.  My purpose is not to find the Platonic essence of capital but to recommend a theoretical orientation.

    [9]  Robert Polutchko of Martin Marietta Corp. tells a story that points out clearly the distinctness of the knowledge embodied in tools from the physical medium in which it is embodied.  He recounts a remarkable exchange between two engineers working on a moonshot.  One, literally a rocket scientist responsible for calculating propulsion capacity, approached the other, a software engineer.  The rocket scientist wanted to know how to calculate the effect of all that software on the mass of the system.  The software engineer didn't understand; was he asking about the weight of the computers?  No, the computers' weight was already accounted for.  Then what was the problem, asked the software engineer.  "Well, you guys are using hundreds of thousands of lines of software in this moonshot, right?"  "Right," said the software engineer.  "Well," asked the rocket scientist, "how much does all that stuff weigh?"  The reply: "... Nothing!!"

    [10]  I was unaware of the connection to Schumpeter until reading Lewin (1999, p. 118, including footnote 3), who helpfully traces the attention to this insight from Lachmann back to both Hayek (e.g. 1941, p. 6) and Schumpeter (1954, pp. 631-632).

    [11]  These nested capital combinations suggest fractals, mathematical objects from chaos theory that reveal the same degree of complexity however closely we zoom in.  Capital combinations are not truly fractal, however, because eventually we do get to (what the designer regards as) the simplest, non-reducible elements.

    [12]  Lewin (1999, p. 79) demonstrates that for a single quantitative representation of capital to be meaningful, the economic system must be in general equilibrium.   In that case, of course, evolution is out of the question.

    [13]  Even when Paul Romer “disaggregates capital into an infinite number of distinct types of producer durables" (1990, p. S80), the disaggregation appears specious – no complementarities or substitutabilities arise.  The production function at the core of his model “expresses output as an additively separable function of all the different types of capital goods so that one additional dollar of trucks has no effect on the marginal productivity of computers.” (p. S81)  To a given capital structure, add buggy whips or microchips and the effect on output will be the same.  Not only does “one additional dollar of trucks have no effect on the marginal productivity of computers,” but also it has no effect on the marginal productivity of either the roads it complements or the railroad freight cars for which it substitutes.  Capital is here still aggregable and thus implicitly homogeneous.  I discuss this at more length in Baetjer, 1998, Appendix A, and develop the critique in Baetjer, 2000.

    [14]  The kind of capital accumulation that can be represented as an increase in the magnitude of K in a production function is thus a special case, and a comparatively unimportant special case, of capital accumulation as it actually happens.  Only in this special case can we be sure that there will be diminishing returns to new capital.  I develop the ideas in this section in Baetjer, 1999.

    [15]   Again, we must think of the design of the new good -- the model of tractor or version of a software application -- rather than any particular instance.

    [16]   In an example of the evolution with which we are concerned, these methodologies – usually expensive software capital combinations complemented by detailed procedures manuals and sometimes a team of consultants – are gradually being selected out by market evolution and replaced with prototyping and related approaches.

    [17]  Some firms undertake simultaneously to restructure and computerize most or all of their operations, in a process sometimes known as business process reengineering.  The development of the software that results, an integrated system known as an enterprise model, requires the participation of managers from all parts of the firm.

    [18]  For a representative account of a project done for Hewlett-Packard, see Whitefield and Auer, from which these quotations are taken (pp. 65, 67).

    [19]  In this respect software design would seem to be akin to writing.  Composition is not a matter of copying out a book that has somehow popped into the writer's head.  Rather the writer works gradually from a vague idea to a fully conceived book, through a process of fleshing out, defining and refining, finding out what "works" by trial and error.  Similarly the software designer uses feedback from the design itself, seeing what works, what has promise, what relationships are revealed that were unclear before.

    [20]  The term of art in software development most closely identified with modularity is “object-oriented.”  Space does not allow explanation of this term here; for an excellent layman’s introduction, see Taylor (1990).

    [21]  This is the whole Y2K problem:  No one knows how many dependencies on the old, two-digit date format remain in large, non-modular systems.

    [22]  The similarity of this distinction to the distinction between centrally planned and market-oriented economies is strong, and has been frequently noted.  See Miller and Drexler, 1988, and Lavoie, Baetjer and Tulloh, 1991.

    [23]  Cf. G.B. Richardson (1990), Information and Investment: A Study in the Working of the Competitive Economy, Oxford: Clarendon Press:  “[A]n entrepreneur will wish his [investment] program to be as flexible or adaptable as possible, in order that it can be modified to take account of changing and unexpected circumstances.” (p. 79)  Quoted in Lewin (1999),  p. 139.

    [24]  Indeed, it may be more helpful to think of the capital structure [and the economy in general] not as a structure at all, but rather an ecosystem.  The notion of structure is too static; ecosystem captures better the interdependence and ceaseless mutual adjustments of the different elements.  For a readable, layman’s discussion of this point of view, see Bionomics by Michael Rothschild.

    [25]  This view is entirely congruent with the view that capital is necessarily always in disequilibrium (Lewin 1999).  There can be no equilibrium where knowledge is combining, recombining, and growing in many different directions at once.

    [26]   Lachmann, following Hayek (1935), holds that over time there develops “an increasing degree of complexity of the pattern of complementarity displayed by the capital structure.” (1975, p. 201)  For a summary of Lachmann’s ideas on this point see Lewin (1999), pp. 130-2.
     
     

    Bibliography

    Allen, Peter M., 1990. "Why the Future Is Not What It Was," prepared for Futures, 6/4/90. Bedford, England: International Ecotechnology Research Center.

    Baetjer, Howard, 1998. Software as Capital: An Economic Perspective on Software Engineering, Los Alamitos, California: IEEE Computer Society.

    ------- 1999. "Austrian Observations on f’’(k)." Working paper.

    ------- 2000. "Capital as Embodied Knowledge: Some implications for the theory of economic growth," forthcoming in Review of Austrian Economics, Vol. 13, no. 2.

    Bohm-Bawerk, Eugen von. 1959 [1889]. Capital and Interest, 3 vols. Trans. G.D. Huncke and H.F. Sennholz. South Holland, Illinois: Libertarian Press.

    Hayek, F.A. 1935. "The Maintenance of Capital," in Profits, Interest and Investment, London: Routledge & Sons.

    ------- 1941. The Pure Theory of Capital, Chicago: University of Chicago Press.

    ------- 1945. "The Use of Knowledge in Society," in Hayek (1948).

    ------- 1948. Individualism and Economic Order, Chicago: University of Chicago Press.

    Huberman, Bernardo A. 1988. The Ecology of Computation, New York: North-Holland.

    Lachmann, L. M. 1975. "Reflections on Hayekian Capital Theory," Paper delivered at the Allied Social Science Association meeting in Dallas, Texas. Photocopy 1975.

    ------- 1978. Capital and its Structure. Kansas City: Sheed Andrews and McMeel.

    ------- 1986. The Market as an Economic Process. New York: Basil Blackwell.

    Lavoie, Don, Baetjer, Howard , and Tulloh, William. 1991. "Coping with Complexity: OOPS and the Economist's Critique of Central Planning," Hotline on Object-Oriented Technology, 3: 1, (Nov.) pp. 6-8.

    Lewin, Peter.  1999.  Capital in Disequilibrium.  New York: Routledge.

    Menger, Carl. 1981 [1871]. Principles of Economics. New York: New York University Press.

    Meyer, Bertrand. 1988. Object-oriented Software Construction, Englewood Cliffs, NJ: Prentice-Hall.

    Miller, Mark S. and Drexler, K. Eric. 1988. "Markets and Computation: Agoric Open Systems," in B.A Huberman, ed., The Ecology of Computation, Amsterdam: North-Holland.

    Mullin, Mark. 1990. Rapid Prototyping for Object-Oriented Systems, Menlo Park, California: Addison-Wesley.

    Polanyi, Michael. 1958. Personal Knowledge, Chicago: University of Chicago Press.

    Romer, Paul M. 1990. "Endogenous Technological Change." Journal of Political Economy, Vol. 98, no. 5, pt. 2.

    Rothschild, Michael. 1990. Bionomics, New York: Henry Holt.

    Smith, Adam. 1976 [1776]. An Inquiry Into the Nature and Causes of the Wealth of Nations. Chicago: University of Chicago Press.

    Taylor, David A. 1990. Object-Oriented Technology: A Manager's Guide, Alameda, California: Servio Corporation.

    Whitefield, Bob and Auer, Ken. 1991. "You can't do that in Smalltalk! Or can you?" Object Magazine, Vol. 1, no.1, May/June.
