-
As is the case of the Walrasian auctioneer device for ensuring market clearing, for instance. More on this point in Section 3 below.
-
See also Leombruni et al. (2005).
-
The use of methodological individualism in Economics was championed by the Austrian School of Economics in the twentieth century, of which Friedrich von Hayek was one of the main exponents (von Hayek, 1948). Hayek's legacy to agent-based modeling and the complex system approach (see e.g. von Hayek, 1967) has been amply recognized (Rosser, 1999; Vaughn, 1999; Koppl, 2000; Vriend, 2002; Rosser, 2009).
-
The general principle of holism was concisely summarized by Aristotle in his Metaphysics: ‘The whole is more than the sum of its parts’.
-
William Emerson Ritter coined the term in 1919.
-
As in the title of the well-known book by Joshua Epstein and Robert Axtell (Epstein & Axtell, 1996).
-
In the Greek theater, a mechanism was used to lower one or more divinities onto the stage to resolve complicated situations from which no apparent way out was available.
-
See also, among many others, Edmonds (1999), Phelan (2001) and Chu et al. (2003) and especially the popular books by Gleick (1987) and Waldrop (1992). A rather critical view of the research on complex systems undertaken at the Santa Fe Institute through the mid-1990s can be found in the writings of the science journalist John Horgan (Horgan, 1995, 1997). A very good account of the relationships between complexity theory, cybernetics, catastrophe theory and chaos theory (the four ‘C's) and their implications for economic theory, can be found in Rosser (1999).
-
Although this perspective is associated with the Santa Fe Institute, it was initiated in Europe by chemists and physicists concerned with emergent structures and disequilibrium dynamics (more precisely, in Brussels by the group of the Nobel-prize-winning physical chemist Ilya Prigogine and in Stuttgart by the group of the theoretical physicist Hermann Haken)—see Haken (1983), Nicolis and Prigogine (1989) and Prigogine and Stengers (1984).
-
This is summarized by the empirical ‘law’ that performance doubles roughly every 2 years.
-
It is worth remembering that some of the brightest minds of their time, gathered around physicist Robert Oppenheimer in the Manhattan Project (the World War II U.S. Army project at Los Alamos aimed at developing the atomic bomb), were reported to spend half their time and effort finding smarter algorithms to save precious computing time on the huge but slow machines available (Gleick, 1992).
-
This makes it possible to integrate tools developed as separate libraries by third parties (e.g. for graphical visualization, statistical analysis, database management, etc.).
-
The categories identified below correspond only partially to Axtell's.
-
Axtell (2000) provides references and examples for each case.
-
In what follows we will refer to the deterministic case. Generalization to the stochastic case requires some changes (mainly regarding the notation), but the idea remains the same.
-
Here and in the following we use ‘behavioral rules’ and similar terms in a loose sense that encompasses the actual intentional behaviors of individuals as well as other factors, such as technology, etc.
-
When the dynamic system has one (or more) stable equilibria and the initial conditions lie in its (their) basin(s) of attraction.
-
Or even not dependent on the initial conditions.
-
Note that the problem of deriving the equilibrium relation (4) from the law of motion (3) is often skipped altogether. Equilibrium conditions are externally imposed, and the dynamics towards the equilibrium is simply ignored: the system ‘jumps’ to the equilibrium.
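The difference between deriving the equilibrium from the explicit dynamics and simply imposing it can be illustrated with a toy model; the following is a minimal Python sketch assuming a hypothetical linear law of motion (the values of a and b are illustrative, not taken from the text):

```python
# Hypothetical linear law of motion x_{t+1} = a + b * x_t with |b| < 1
# (illustrative only; not a model from the text).
a, b = 1.0, 0.5

# 'Jumping' to the equilibrium: impose x* = a + b * x* and solve directly.
x_star = a / (1 - b)

# Explicit dynamics: iterate the law of motion from an initial condition
# inside the basin of attraction (here any x0, since |b| < 1).
x = 10.0
for _ in range(100):
    x = a + b * x

print(x_star)                   # 2.0
print(abs(x - x_star) < 1e-9)   # True: the dynamics converge to x*
```

In this linear case the two routes agree, but the iteration makes explicit what the ‘jump’ hides: convergence depends on the stability condition |b| < 1 and on the initial condition lying in the basin of attraction.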
-
This difficulty is the same experienced in game theory models, where games typically become intractable if they involve more than a handful of players.
-
The relevant exception is when rare events are themselves the focus of the investigation, as in risk management, for instance. Here, simulations may prove extremely useful by dispensing with assumptions (such as the Gaussian distribution of some relevant parameters) that may be necessary to derive algebraic results but have unpleasant properties, such as excessively thin tails. In a simulation, the reproduction of such rare events is limited only by the computational burden imposed on the computer. However, techniques can be used to artificially increase the likelihood of their occurrence.
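One standard technique of this kind is importance sampling: draw from a distribution under which the rare event is common, then reweight each draw by the likelihood ratio. Below is a minimal Python sketch; the Gaussian tail event and the choice of shifting the sampling distribution by the threshold are illustrative assumptions, not taken from the text:

```python
import math
import random

random.seed(0)
threshold = 4.0
n = 100_000

# Naive Monte Carlo for P(Z > 4), Z standard normal: only a handful of
# the 100,000 draws exceed the threshold, so the estimate is very noisy.
naive_hits = sum(random.gauss(0, 1) > threshold for _ in range(n))

# Importance sampling: draw from N(threshold, 1), under which exceedances
# are common, and reweight by the ratio of the target density to the
# sampling density (the normalization constants cancel).
est = 0.0
for _ in range(n):
    x = random.gauss(threshold, 1)
    if x > threshold:
        est += math.exp(-x * x / 2) / math.exp(-((x - threshold) ** 2) / 2)
est /= n

print(naive_hits)   # a handful of hits at best
print(est)          # close to the true tail probability P(Z > 4) ~ 3.17e-05
```

The reweighted estimator remains unbiased for the tail probability, but its variance is orders of magnitude smaller than the naive one's because almost every draw now contributes information about the rare event.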
-
Copyright © Cambridge University Press 2012
Matteo G. Richiardi. 2012. Agent-based computational economics: a short introduction. The Knowledge Engineering Review 27(2): 137–149. doi: 10.1017/S0269888912000100