Week 2 Private: Joel Arellano

Technocratic Hubris

In the first three chapters, Gere describes the evolution of digital culture from a 19th century automated loom to the age of IBM, when the Cold War spurred the newly-christened Department of Defense to pour billions into the development of vast computer arrays designed to model and supplement human decisions. Gere presents the course of these changes as part of a cycle common throughout history, and urges caution. He argues that digital culture is a false god – like the shamanism and mythology of the past – which will grow to dominate, control, and exploit us unless we recognize it as such.

So what is digital culture? Gere states that it is more than software and digital devices; it describes the paradigms and behaviors, including “abstraction, codification, self-regulation, virtualization, and programming” (17), that emerge in cultures possessed of such technology. After describing several examples of early digital machines, Gere focuses on several new fields of study that emerged during the 1940s and ‘50s, each of which was imbued with the five characteristics described above.

Gere begins by describing Claude Shannon’s development of Information Theory, which described communication as the origin, encoding, transmission, and decoding of information, broadly defined. Initially applied to electronic transmissions, Information Theory was soon used in a wide range of fields. At the Macy Conferences, Norbert Wiener and others explored the potential of Information Theory to assist in predicting behavior in human and machine systems, and Wiener referred to their pursuit as the study of “cybernetics.” At the same time, the French Structuralists also contributed to the theoretical development of digital culture by abstracting linguistics into ‘semiotics,’ a theoretical practice in which linguistic representations are divorced from their meanings and both are formally represented. Combined with psychology and anthropology, semiotics provided the means to model human relationships “without reference to the particular embodied circumstances of the phenomena concerned” (62). Though Gere does not mention semiotics again in the assigned reading, it is reasonable to assume that the abstraction and codification of communication assisted cybernetics in modeling human social relationships.

This predictive potential caught the military’s attention. After WWII, the Department of Defense invested heavily in developing computers that could run such models, primarily for the purpose of avoiding global nuclear war. To that end, computers were used both to predict outcomes of various attack scenarios and to supplement human control over our weapons systems through fail-safe automation. The immense responsibility of these tasks demonstrated a new faith in computers to think and act in place of humans. Faced with the chaos humans had created, it’s not surprising that we were open to more reliable means of decision-making. Gere writes that “theories of self-regulation combined with new technologies [gave] a sense of control and mastery in a complex world” (78), and there are few times in our nation’s history when the world has seemed as complex and terrifying as it did during the Cold War.

Beginning during the Cold War and continuing over the subsequent decades, faith in computers spilled into civilian life along with its underpinning paradigms and the “methods of organization suggested by cybernetics” (73). One can appreciate the dazzling effect computers must have had on the population, offering an unbiased, rational, and presumptively perfect way to quantify and make sense of the world humans couldn’t seem to manage on their own. Unsurprisingly, a new, universal framework for studying efficiency was gaining popularity at the same time – systems analysis – which reduced humans and machines alike to components within systems. Applying this theory to business management, companies like IBM focused on hierarchy and sheer quantity of output rather than more human considerations like productivity and quality. Unfortunately, focusing on formal structures and quantity over quality led to inefficiency; Gere points out that technocratic ideas failed when applied to war strategy, industry, and the economy.

Wiener, Shannon, and others recognized that “information was bound up with uncertainty” (54), but Gere’s chronology illustrates how this insight seems to have been lost. From the ‘40s through the ‘80s, digital culture led to technocratic hubris and an apparent over-reliance on the potential of raw digital power to solve human problems. In the introduction, Gere points out that the human impulse to mastery – of self, knowledge, others, and the world around us – has driven us to excess repeatedly throughout the course of human history. Now, digital culture threatens to do so on a global scale, since our banking systems and critical infrastructure are digitally vulnerable. Given recent developments in the ability to digitally track people and communication, one wonders whether we haven’t once again leapt past caution to blind faith in technology, without adequate regard for the potential consequences of our trust in the security, privacy, and reliability of digital systems.