Systems in Art Making and Art Theory: Complex Networks from the Ashes of Postmodernism

Philip Galanter

Assistant Professor, Department of Visualization, College of Architecture, Texas A&M University

Notions of Complexity in Art

One of the first modern scholars to apply technical notions of complexity to art was the mathematician George Birkhoff. [1] He did this as part of an effort to create a quantitative measure of aesthetic effectiveness. His ‘aesthetic measure’ was defined with a simple formula:

M = O / C

Where:

M is the measure of aesthetic effectiveness
O is the degree of order, i.e. the psychological tension resolved upon perception
C is the degree of complexity, i.e. the effort expended in perception

Birkhoff noted, “The well-known aesthetic demand for ‘unity in variety’ is evidently closely connected with this formula.” One problem with this formula was that O and C were at best proxy measures based on rather arbitrary geometric counting procedures. Another was that empirical tests with human subjects using Birkhoff’s own examples did not confirm his formula. [2]
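As a toy illustration of how the formula behaves (the counts below are hypothetical stand-ins, not Birkhoff's actual geometric counting procedure), a few lines of Python suffice:

# Minimal sketch of Birkhoff's aesthetic measure M = O / C.
def aesthetic_measure(order, complexity):
    return order / complexity

# Hypothetical counts: a square-like figure with high order (many symmetries)
# and low complexity scores well; an irregular figure scores poorly.
print(aesthetic_measure(order=6, complexity=4))   # 1.5
print(aesthetic_measure(order=2, complexity=10))  # 0.2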

Somewhat later Claude Shannon developed ‘Information Theory.’ [3] Shannon was interested in calculating the bandwidth of a communication channel, and for him complexity increased with surprise and with resistance to lossless compression. For example, there would be little information or surprise in the following highly compressible communication:

“AAAAAAAAAAAAAAAA”

There is more information and surprise in something like this:

“I LIKE GENERATIVE ART”

As is popular with smartphone texting, this message could still be somewhat compressed:

“I LIK GENART”

For Shannon the most complex signal, the one with maximum information, is a string of random characters:

“FOZEVQMWPYLSCVJAXLJU”

A truly random string cannot be compressed without losing information.
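The contrast between these examples can be made concrete in a few lines of Python. Per-character Shannon entropy estimates the surprise in each string, and zlib serves as a rough proxy for compressibility (a sketch only; on strings this short the compressor's fixed overhead dominates the byte counts):

import math
import zlib
from collections import Counter

def entropy_per_char(s):
    # Estimated Shannon entropy in bits per character, from symbol frequencies.
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

for msg in ["AAAAAAAAAAAAAAAA", "I LIKE GENERATIVE ART", "FOZEVQMWPYLSCVJAXLJU"]:
    packed = len(zlib.compress(msg.encode("ascii")))
    print(f"{msg!r}: {entropy_per_char(msg):.2f} bits/char, zlib -> {packed} bytes")

The repeated ‘A’s score zero bits per character, the English sentence falls in between, and the random string approaches the maximum for its observed alphabet.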

Earlier forms of computer art, notably those created by the first-generation ‘algorists’ of the 1960s, were born under the influence of this notion of system complexity. Max Bense (in Germany) and Abraham Moles (in France) extended George Birkhoff’s aesthetic measure with Shannon’s information theory, creating the study of ‘information aesthetics.’ Both equated high degrees of order with simplicity, and high degrees of disorder (i.e. randomness) with complexity; both were interested in system redundancy, repetition, randomness, and repertoire.

Max Bense emphasized the notion of analyzing media space in terms of information theory, and then used it as the basis for what he called ‘generative aesthetics’:

The system of generative aesthetics aims at a numerical and operational description of characteristics of aesthetic structures (which can be realized in a number of material elements) which will, as abstract schemes, fall into the three categories of the formation principle, distribution principle, and set principle. These can be manipulated and applied to an unordered set of elements, so as to produce what we perceive macro-aesthetically as complex and orderly arrangements, and micro-aesthetically as redundancies and information. [4]

Abraham Moles combined these notions with findings from perceptual psychology in order to analyze the arts. Using various statistical measures, in 1966 Moles showed how musical works exist on a spectrum from ‘banal’ to ‘novel’ corresponding to the relative order and disorder they exhibit as information. In addition, he demonstrated how various forms of media could be thought of as the union of aesthetic continua in a high-dimensional space. [5]

However, as much as information theory has its place in the analysis of communication channel bandwidth, it does not correspond very well with our experiential sense of complexity in the world generally, or in art specifically. To the extent it equates complexity with disorder, information theory breaks down as a general model of our experience. This is where contemporary complexity science has fashioned a response in the form of effective complexity.

Complexity science takes on widely varied systems such as the stock market, ant colonies, the brain and mind, the evolution of species, autocatalytic chemical systems, the weather, political systems, and social movements. What complex systems have in common is large numbers of smaller scale components whose local interactions result in emergent behavior at a higher scale. These systems are frequently nonlinear and dynamic, and they often include feedback mechanisms. Science views these systems as mechanical and deterministic, yet some are chaotic and ultimately unpredictable due to their sensitivity to initial conditions. [6]
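As a minimal illustration of that sensitivity (using the textbook logistic map rather than any of the systems listed above), two nearly identical starting points diverge completely within a few dozen iterations:

# The logistic map at r = 4 is a standard example of deterministic chaos.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.300000, 0.300001  # initial conditions differing by one part in a million
for _ in range(50):
    a, b = logistic(a), logistic(b)
print(abs(a - b))  # typically a large fraction of the unit interval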

In the context of contemporary complexity theory both highly ordered and highly disordered systems are thought of as being simple. Complex systems such as those noted above, and especially life itself, combine order and disorder. One version of this view championed by physicists Gell-Mann and Lloyd is called “effective complexity.” [7]

Information complexity compared with effective complexity, ©Philip Galanter.

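The relationship the figure describes can also be sketched in code; the curve shapes below are chosen purely for illustration and are not derived from any formal measure:

import numpy as np
import matplotlib.pyplot as plt

disorder = np.linspace(0.0, 1.0, 200)
information = disorder                         # Shannon-style information rises with disorder
effective = 4.0 * disorder * (1.0 - disorder)  # effective complexity peaks between order and disorder

plt.plot(disorder, information, label="information (Shannon)")
plt.plot(disorder, effective, label="effective complexity (Gell-Mann)")
plt.xlabel("disorder")
plt.ylabel("complexity (schematic)")
plt.legend()
plt.show()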

Generative Art as Systems-Based Art

In earlier writing I proposed what has become the most widely cited theory of generative art:

Generative art refers to any art practice where the artist uses a system, such as a set of natural language rules, a computer program, a machine, or other procedural invention, which is set into motion with some degree of autonomy contributing to or resulting in a completed work of art. [8]

It’s worth noting that, for example, systems of chemistry, biology, and nanotechnology can be used in generative art, and generative art is not limited to computer art. In addition, the term ‘generative art’ says nothing about content. It casts a wide net including many forms in the standard canon of art. In fact this establishes generative art as being as old as art itself. In this regard consider, for example, nonobjective symmetry and pattern-based art found in every known culture. [9, 10]

Generative art uses all manner of systems, both simple and complex. Each of these systems can be identified as having a place according to its effective complexity.

Generative art classified by system effective complexity, ©Philip Galanter.


Of all the techniques used by generative artists, evolutionary techniques and genetic algorithms seem to hold the most promise of long-term extensible development. They can be used in combination with virtually every other technique by taking what would be static parameters and treating them as quantities or categories that can be mutated or exchanged with other individuals.
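For example (a minimal sketch; the parameter names, ranges, and operators are invented for illustration), the static parameters of almost any generative routine can be wrapped as a genome with mutation and crossover operators:

import random

# Hypothetical parameters for some existing generative drawing routine,
# treated as a genome rather than as fixed settings.
def random_genome(rng):
    return {"branch_angle": rng.uniform(0, 90),
            "depth": rng.randint(2, 10),
            "palette": rng.choice(["warm", "cool", "mono"])}

def mutate(genome, rng, rate=0.2):
    g = dict(genome)
    if rng.random() < rate:
        g["branch_angle"] = min(90, max(0, g["branch_angle"] + rng.gauss(0, 5)))
    if rng.random() < rate:
        g["depth"] = max(2, g["depth"] + rng.choice([-1, 1]))
    if rng.random() < rate:
        g["palette"] = rng.choice(["warm", "cool", "mono"])
    return g

def crossover(a, b, rng):
    # Exchange genes between two individuals, one parameter at a time.
    return {key: rng.choice([a[key], b[key]]) for key in a}

rng = random.Random(1)
parent_a, parent_b = random_genome(rng), random_genome(rng)
print(mutate(crossover(parent_a, parent_b, rng), rng))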

More than other systems, evolution captures and relentlessly exploits the bottom-up emergence we expect to find in complex systems. Some speculate that biological evolution is just one example of a broader, so-called Universal Darwinism. The thinking here is that much of the structure we see in the world, natural and otherwise, is the result of materials assembling themselves in a way that minimizes or reverses local entropy. While the universe at large tends towards energy dissipation and decay leading to randomness, some clusters persist more than others.

Where matter can retain structural information, where that information can be varied, and where there are external forces against which some structures will fare better than others, a kind of evolution is possible. Structures that persist end up being observed as those that indeed exist.

Universal Darwinism as found in nature, ©Philip Galanter.


The kind of multi-level emergence found in biological systems requires mechanisms difficult to invent de novo. Most genetic algorithms simply take a finite number of genes and directly map them into a phenotype, i.e. a result. Imagine, for example, a genetic algorithm for drawing faces, with genes for eye color, eye shape, nose type, nose size, etc. An extension might allow the concatenation of genes, creating a chromosome of arbitrary length. A chromosome for a typical six-legged insect might grow into one for a centipede.
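A minimal sketch of such a direct genotype-to-phenotype mapping might look like the following; the gene names and the toy ‘renderer’ are invented for illustration:

import random

FACE_GENES = {
    "eye_color": ["brown", "blue", "green"],
    "eye_shape": ["round", "narrow"],
    "nose_type": ["straight", "aquiline", "button"],
    "nose_size": ["small", "medium", "large"],
}

def random_genotype(rng):
    # One gene per feature, mapped directly into the phenotype.
    return {gene: rng.choice(options) for gene, options in FACE_GENES.items()}

def phenotype(genotype):
    # The "rendering" step: here just a textual description of the face.
    return ", ".join(f"{gene}={value}" for gene, value in genotype.items())

print(phenotype(random_genotype(random.Random(42))))

The extension described above would replace the fixed dictionary with a variable-length chromosome, for example a list of limb genes that can grow or shrink.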

More sophisticated is a genetic system that doesn’t map directly into a phenotype, but rather creates a machine that then, in turn, creates the phenotype. But what happens in biology is something even more open-ended and powerful. What happens in nature, and what we’ve not yet captured in software, is that genes create machines that can create machines that can create machines…for an arbitrary number of iterations at larger and larger scales with no obvious limit.

This is shown in diagram 3. I count myself among those who find something very meaningful about this process. It appears to establish a kind of teleology, but not a teleology that requires some other metaphysical realm for its motive force. What we see here is a material teleology. It’s the way the physical universe inevitably sorts itself out.

Towards a Complex-Systems Inspired Worldview

C. P. Snow’s 1959 Rede lecture “The Two Cultures” is perhaps the first popular airing of the rift between the humanities and science. This rift has only become greater in the intervening years:

Literary intellectuals at one pole – at the other scientists, and as the most representative, the physical scientists. Between the two a gulf of mutual incomprehension – sometimes (particularly among the young) hostility and dislike, but most of all lack of understanding…The non-scientists have a rooted impression that the scientists are shallowly optimistic, unaware of man’s condition. On the other hand, the scientists believe that the literary intellectuals are totally lacking in foresight, peculiarly unconcerned with their brother men, in a deep sense anti-intellectual, anxious to restrict both art and thought to the existential moment. And so on. [11]

Postmodernism, deconstruction, post-structuralism, critical theory, and the like introduce notoriously elusive, slippery, and overlapping terms and ideas. Most adherents would argue that this must be the case because each is not so much a position as an attitude and an activity: an attitude of skepticism, and an activity in the business of destabilizing apparently clear and universal propositions. Relative to the modern culture of science, however, the postmodern culture of the humanities can be starkly contrasted.

The modern versus the postmodern, ©Philip Galanter.


Where the science of modernity seeks progress towards the absolute, the postmodern humanities see relative positions in constant circulation. Science tends to fix concepts in rigid hierarchies, whereas those in the humanities seek to collapse assumed structures and expect the random. Modernists believe in truth as authority, whereas postmodernists deny any totalizing truths and keep all candidates in perpetual contention. Modern art seeks heroic practitioners who present form as a kind of revelation, whereas postmodern art eschews formalism as a false claim to useless beauty.

The complexity worldview creates a subsuming synthesis, ©Philip Galanter.


As polar and incommensurate as these two worldviews seem, and as divided as the culture of science and the culture of the humanities have become, a worldview suggested by complexity theory offers a new subsuming synthesis. It suggests a distributed world where progress is neither linear and absolute, nor relative and circular, but rather a process of emergence and coevolution. Complexity suggests that both overly rigid and overly random systems are limited, but networks allow feedback for constant adjustment and adaptation. Finally, complexity can revive formalism in art not as a heroic effort of privilege, but rather in publicly accessible appreciation of natural and invented systems.

Network Systems and the Arts and Humanities

In the arts and humanities perhaps the most potent kind of system considered, and the system that generates the most interpretation, is the network. There is little doubt this is due as much as anything to the growth of the Internet and in particular the network application called the World Wide Web.

It’s my contention that because complexity-based network models have been mostly ignored, the arts and humanities have produced rather poor analyses when it comes to the Internet, the Web, and networks in general. For example, the rhizome is a simple plant root system that succeeds primarily in the reproduction and distribution of plant life. But the rhizome, at least as a meaningful symbol for the power of networks, is something of a myth. Many or most interesting networks, including the Internet and Web, are not rhizome-like.

The power of the rhizome as a metaphor is probably due to its use by Deleuze and Guattari. It is also, not coincidentally, the name of a web-based arts organization. The rhizome is typically suggested as the alternative to hierarchical networks, i.e. tree- or star-structured networks. The rhizome is presented as a network where all nodes are radically equal. But this is a false binary choice.

More typical is a third type of network where, while there are typically multiple paths to any given node, some nodes are more popular and better connected, with higher bandwidth, than others. These ‘scale-free networks,’ where some nodes are more equal than others, can be created through a process called preferential attachment. The driving principle is that the more links a given node has, the more likely it is to collect additional links in the future. As in a chaotic system, slight differences in initial connections can result in radically different networks. This mechanism isn’t merely theoretical or mathematical. For example, Google ranks pages in search results based in part on the count of incoming links. This leads to some nodes being found first in searches, which in turn increases the probability that those nodes will gain new links.
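A minimal preferential-attachment sketch (in the spirit of the Barabási-Albert model; the parameters here are illustrative) shows hubs emerging from this ‘rich get richer’ rule:

import random
from collections import Counter

def preferential_attachment(n_nodes, links_per_new_node=2, seed=0):
    rng = random.Random(seed)
    edges = [(0, 1)]
    # Each appearance of a node in this pool is one unit of degree, so a
    # uniform choice from the pool is a choice proportional to degree.
    degree_pool = [0, 1]
    for new in range(2, n_nodes):
        targets = set()
        while len(targets) < min(links_per_new_node, new):
            targets.add(rng.choice(degree_pool))
        for old in targets:
            edges.append((new, old))
            degree_pool += [new, old]
    return edges

degrees = Counter(node for edge in preferential_attachment(1000) for node in edge)
print(degrees.most_common(5))                        # a few heavily linked hubs...
print(sorted(degrees.values())[len(degrees) // 2])   # ...versus a small median degree

Re-running with a different seed produces different hubs, echoing the point above about sensitivity to initial connections.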

The emergence of networks as complex systems, self-assembling without overt human planning and forming in a scale-free, non-rhizome fashion, happens in many settings and situations. Scale-free networks have been found in studies of social networks such as film collaborations and the co-authorship of mathematics papers. They emerge in growing financial networks and in the formation of airline route maps. Protein-protein interaction networks and semantic networks also exhibit scale-free properties.

There is also a related phenomenon called the Matthew effect, whereby discoveries and inventions tend to be credited to already well-known practitioners rather than to those new on the scene. Scale-free networks typically form due to choices made in an economic context. Where links and nodes are not free, agents will want to maximize the return on their investments. Unlike infrastructure- and material-heavy paper publishing, or old-school filmmaking, the cost of Internet media production is not stratified and offers an inexpensive, smooth slope for ramping up quantity. So after so much democratic promise, why has the Internet taken on a non-rhizome-like form rather than a radically egalitarian scheme?

In short, the Internet and Web have emerged with scale-free features because there is still a constrained resource. Moreover, it’s a resource unlikely to ever become unconstrained. That resource is the reader’s time. As long as readers need to make choices to optimize the return on their time, media networks will not be rhizomes. Even those fluent with complexity science can be led astray by putting the new wine of complex network systems into the old bottles of postmodernism. In his book “Complexity and Postmodernism” the philosopher Paul Cilliers offers the system view of neural connectionism as something already implicit in Derrida’s notions of traces and differences. But doing so exacts a great cost. In his view mental representation becomes recursive and unanchored, and our ability to create meaningful abstractions from empirical evidence is fatally eroded.

Ultimately, Cilliers questions the very practice of any system science when he says: “If something is really complex, it cannot be adequately described by means of a simple theory. Engaging with complexity entails engaging with specific complex systems.” This abandonment of scientific method, and the corrosive skepticism regarding scientific epistemology, stands like a signed confession, and a barrier to further understanding. [12]

Alexander R. Galloway, in “Protocol: How Control Exists After Decentralization,” and with Eugene Thacker in “The Exploit: A Theory of Networks,” offers a fundamentally political critique of the Internet. He argues that while the Internet is popularly viewed as a new decentralized and highly democratic medium, it is in fact a new form of highly efficient control by means of the technological protocols required. [13, 14] Citing numerous philosophers, but most notably Foucault on power relations and Deleuze and his notion of the control society, Galloway attempts to capture the new system science of networks with the standard political tropes of critical theory. In doing this he is all too comfortable sliding rhetorically from the control of data checksums and packet routing to the implication of social control. Further, his worries about the social implications of object-oriented code encapsulation, and the need for political critique to be applied to algorithms, seem at best to be a form of what the philosopher Gilbert Ryle termed a category mistake.

Fortunately there is an alternative already present in the humanities. Process philosophy resonates strongly with our contemporary view of complex systems, networks, and emergence. It offers an alternative to more traditional substance metaphysics where primacy is given to static materials that in turn participate in activities. Dating back to atomistic Greek philosophers such as Democritus and Epicurus, substance metaphysics privileges nouns over verbs.

Process metaphysics proposes that being is dynamic, and it is that dynamic nature that must be the point of departure. Nouns do not dominate verbs; rather, they are fused in a world of becoming, just as a network exists only when there is a fusion of nodes and edges. Process philosophy has deep roots, from Heraclitus in ancient Greece to, interestingly, Taoism in the ancient East. It can be found in Leibniz and in the Americans Peirce, James, and Dewey, and was most fully developed as a comprehensive system by Alfred N. Whitehead.

Interest in this analytic strain of process philosophy has its roots in consideration of the Darwinian theory of evolution, and it has most recently been applied to questions surrounding quantum mechanics, emergence and self-organization, and embodied cognition. It provides a strong, home-built paradigmatic bridge that those in the humanities can use to leave postmodernism behind and cross over to the new world of complex systems and complex network theory.

References:

1. G. D. Birkhoff, Aesthetic Measure (Cambridge, MA: Harvard University Press, 1933), xii, 225.
2. D. J. Wilson, “An Experimental Investigation of Birkhoff’s Aesthetic Measure,” The Journal of Abnormal and Social Psychology 34:3 (1939): 390‒394.
3. Claude E. Shannon, “A Mathematical Theory of Communication,” The Bell System Technical Journal 27:3 (1948): 379‒423.
4. Max Bense, “The Projects of Generative Aesthetics,” in Cybernetics, Art, and Ideas, ed. Jasia Reichardt (Greenwich, CT: New York Graphic Society, 1971), 207.
5. A. A. Moles, Information Theory and Esthetic Perception (Urbana, IL: University of Illinois Press, 1966), 217.
6. Melanie Mitchell, Complexity: A Guided Tour (New York: Oxford University Press, 2009), xvi, 349.
7. Murray Gell-Mann, “What is Complexity?” Complexity 1:1 (1995): 16‒19.
8. Philip Galanter, “What is Generative Art? Complexity theory as a context for art theory” (paper presented at The International Conference on Generative Art, Generative Design Lab, Milan Polytechnic, Milan, Italy, 2003).
9. Istvan Hargittai and Magdolna Hargittai, Symmetry: A Unifying Concept (Berkeley, CA: Shelter Publications, 1994), xviii, 222.
10. Michael Balter, “From a Modern Human’s Brow – or Doodling?” Science 295 (2002): 247-248.
11. C. P. Snow, The Two Cultures (London: Cambridge University Press, 1994), xxiii, 107.
12. Paul Cilliers, Complexity and Postmodernism: Understanding Complex Systems (New York: Routledge, 1998), x, 156.
13. Alexander R. Galloway, Protocol: How Control Exists after Decentralization (Cambridge, MA.: MIT Press, 2004), xxvi, 260.
14. Alexander R. Galloway and Eugene Thacker, The Exploit: A Theory of Networks. (Minneapolis, MN: University of Minnesota Press, 2007), vii, 196.

Bio:

Philip Galanter is an artist, theorist, and curator. As an Assistant Professor at Texas A&M University he conducts graduate studios in generative art and physical computing. He is an MFA graduate of the School of Visual Arts in New York City.

Galanter creates generative hardware systems, light sculptures, video and sound art installations, digital fine art prints, and light-box transparencies. His work has been shown in the United States, Canada, the Netherlands, Peru, Italy, and Tunisia. Venues have included ISEA (formerly Inter-Society for the Electronic Arts), IEEE VIS (The visualization section of the Institute of Electrical and Electronics Engineers), and the International Computer Music Conference.

Galanter’s research includes the artistic exploration of complex systems, and the development of art theory bridging the cultures of science and the humanities. His writing