
Showing papers in "IEEE Annals of the History of Computing in 2016"


Journal ArticleDOI
TL;DR: This article looks back at the birth of the fork system call, as remembered by its pioneers, and at its lasting influence on both software development principles and hardware design, which increasingly accommodates parallelism in process execution.
Abstract: The fork call allows a process (or running program) to create new processes. On multiprocessor systems, these processes can run concurrently in parallel. Since its birth 50 years ago, the fork has remained a central element of modern computing, both with regard to software development principles and, by extension, to hardware design, which increasingly accommodates parallelism in process execution. This article looks back at the birth of the fork system call, sharing the story as remembered by its pioneers.
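For readers unfamiliar with the call itself, the following minimal C sketch illustrates the behavior the abstract describes: fork() duplicates the calling process, and the parent and child may then run concurrently on a multiprocessor. This is a generic POSIX illustration, not code from the article or from the original system.

```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid = fork();               /* create a new process */

    if (pid < 0) {                    /* fork failed: no child was created */
        perror("fork");
        return EXIT_FAILURE;
    }
    if (pid == 0) {                   /* child process: fork() returned 0 */
        printf("child:  pid=%d\n", (int)getpid());
    } else {                          /* parent: fork() returned the child's PID */
        printf("parent: pid=%d, child=%d\n", (int)getpid(), (int)pid);
        wait(NULL);                   /* wait for the child to finish */
    }
    return EXIT_SUCCESS;
}
```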

15 citations


Journal ArticleDOI
TL;DR: This article focuses on the first such system, Intrusion Detection Expert System (IDES), developed in the second half of the 1980s at SRI International, and analyzes the disproportionately high contributions of women scientists to leadership in IDS research and development relative to other computer security specialties.
Abstract: As part of a broader prehistory and history of early intrusion-detection systems (IDSs), this article focuses on the first such system, Intrusion Detection Expert System (IDES), which was developed in the second half of the 1980s at SRI International (and SRI's follow-on Next Generation Intrusion Detection Expert System, or NIDES, in the early-to-mid 1990s). It also briefly recounts other early IDSs and the National Security Agency's Computer Misuse and Anomaly Detection (CMAD) Program, and it analyzes the disproportionately high contributions of women scientists to leadership in IDS research and development relative to other computer security specialties.

13 citations


Journal ArticleDOI
TL;DR: This study of the PLI is an entry into the historical relationship between cryptography and packet-switched computer networks.
Abstract: Developed around 1973 by BBN under contract from DARPA, the private line interface (PLI), a cryptographic cybersecurity device used on the Arpanet, operated with minimal modification of the existing network infrastructure, sitting at the "edge" of the network between the network switches and the connected host computers. As a result of the developmental and infrastructural trajectory set in motion by the PLI, significant cryptographic resources remain at the edges (or ends) of the networks that constitute the Internet today. This study of the PLI is an entry into the historical relationship between cryptography and packet-switched computer networks.

12 citations


Journal ArticleDOI
TL;DR: The authors compare the adoption of computers at HSBC in the 1960s and 1970s with the Octopus micropayment system, which was developed in the 1990s by a consortium that excluded financial firms.
Abstract: A dramatic change occurred in retail banking technology in Hong Kong between 1960 and 2000. Initially, the relevant technologies were installed and managed within the boundaries of large banks, such as HSBC. Over the course of this period, however, the industrial organization of the relevant technologies transformed to include provisions outsourced to nonbank institutions. This article seeks to account for this shift in the organization of computer technology. Specifically, the authors compare the adoption of computers at HSBC in the 1960s and 1970s with the Octopus micropayment system, which was developed in the 1990s by a consortium that excluded financial firms, thanks to the development (both in terms of depth and breadth) of an epistemic community of computer professionals and computer-literate managers in Hong Kong.

10 citations


Journal ArticleDOI
David Hemmendinger
TL;DR: Two early networking experiments joined a time-sharing computer at the System Development Corporation with systems at the Stanford Research Institute briefly in 1963 and at MIT Lincoln Laboratory in 1966-1967 and included experiments with the interactive use of remote programs.
Abstract: Two early networking experiments joined a time-sharing computer at the System Development Corporation with systems at the Stanford Research Institute briefly in 1963 and at MIT Lincoln Laboratory in 1966-1967. Both were influenced by J.C.R. Licklider's interest in resource sharing and included experiments with the interactive use of remote programs.

9 citations


Journal ArticleDOI
Aaron Plasek
TL;DR: To effectively explore the intellectual, material, and disciplinary contingencies surrounding both the curation and subsequent distribution of datasets, the field of machine learning needs to be taken seriously as a worthy subject for historical investigation.
Abstract: The construction, maintenance, and mobilization of data used to both constrain and enable machine learning systems poses profound historiographical questions and offers an intellectual opportunity to engage in fundamental questions about novelty in historical narratives. To effectively explore the intellectual, material, and disciplinary contingencies surrounding both the curation and subsequent distribution of datasets, we need to take seriously the field of machine learning as a worthy subject for historical investigation.

9 citations


Journal ArticleDOI
Dongoh Park
TL;DR: The history of the Korean character code is traced, from the introduction of the KSC-5601:1987 standard to the adoption of the Unicode system.
Abstract: Adequately representing the Korean language has been a challenge since the earliest introduction of digital computing and computer-mediated communication. As a local solution to the more general problem of the internationalization of computer languages, the Korean government introduced a new standard character code known as KSC-5601 in 1987. This article traces the history of the Korean character code, from the introduction of the KSC-5601:1987 standard (which stirred up a decade of controversy that was as much political, economic, and cultural as it was technological) to the adoption of the Unicode system.

9 citations


Journal ArticleDOI
Gerardo Con Diaz
TL;DR: This article argues that the journey of Benson and Tabbot's program through the patent system from 1963 to 1972 consists of a series of "ontological contests", which reveals how the nature of computer programs as technologies and inventions was shaped by public administrators, judges, corporate attorneys, and trade associations.
Abstract: In 1972, the US Supreme Court issued Gottschalk v. Benson, one of the most prominent decisions in the history of software patenting. It ruled that a computer program developed at Bell Laboratories by Gary Benson and Arthur Tabbot was ineligible for patent protection. This article argues that the journey of Benson and Tabbot's program through the patent system from 1963 to 1972 consists of a series of "ontological contests"--that is, clashes between attorneys and federal agents who proposed mutually incompatible conceptions of the nature of software, each one designed to serve as a philosophical underpinning for patent law. This argument invites a new historical approach to the study of the history of software patenting, one that reveals how the nature of computer programs as technologies and inventions was shaped by public administrators, judges, corporate attorneys, and trade associations.

8 citations


Journal ArticleDOI
TL;DR: The author shows that manufacturing was not necessarily simpler than design and demonstrates how important these producers' design engineering capabilities were to attracting customers and establishing a solid foundation for their future development.
Abstract: This article explores how Taiwanese laptop contract manufacturers established the foundation of their businesses between the late 1980s and mid-1990s. It casts doubt on the traditional view of a linear progression from manufacturing to design capability in Asia. By examining the three earliest laptop projects in Taiwan, the author shows that manufacturing was not necessarily simpler than design and demonstrates how important these producers' design engineering capabilities were to attracting customers and establishing a solid foundation for their future development.

7 citations


Journal ArticleDOI
TL;DR: This article describes new evidence about two early multilevel-access time-sharing systems, SDC's Q-32 and NSA's RYE, and its security-related consequences for both the 1967 SJCC session and the 1970 Ware Report.
Abstract: The 1967 Spring Joint Computer Conference session organized by Willis Ware and the 1970 Ware Report are widely held by computer security practitioners and historians to have defined the field's origin. This article documents, describes, and assesses new evidence about two early multilevel access, time-sharing systems, SDC's Q-32 and NSA's RYE, and outlines its security-related consequences for both the 1967 SJCC session and 1970 Ware Report. Documentation comes from newly conducted Charles Babbage Institute oral histories, technical literature, archival documents, and recently declassified sources on National Security Agency computing. This evidence shows that early computer security emerged from the intersection of material, cultural, political, and social events and forces.

7 citations


Journal ArticleDOI
TL;DR: This article describes the evolving information infrastructure used by historians to support their research on the history of computing, and the role of IT practitioners, computer executives, scientists, and universities in creating that support since the 1970s.
Abstract: This article is the second in a two-part series exploring the development of the early history of information technologies, from the 1940s to the present. This article describes the evolving information infrastructure used by historians in support of their research on the history of computing and of the role of IT practitioners, computer executives, scientists, and universities in creating that support since the 1970s.

Journal ArticleDOI
TL;DR: A retooling of the maps of the Arpanet to highlight further what is missing from them: flows, gateways, and hierarchy.
Abstract: The earliest and most widespread representation of the Arpanet were network graphs or maps that, arguably, remain its most prominent artifact. In an earlier article, the authors analyzed how the maps were created, what they represented, and how histories of the network parallel their emphases and omissions. Here, the authors begin a retooling of the maps to highlight further what is missing from them: communication flows, gateways to other networks, and hierarchies between its nodes.

Journal ArticleDOI
John Day
TL;DR: The author places the INWG discussions in this wider context to better understand the technical points and implications, their ultimate impact, and the paradigm shift that threatened established business models.
Abstract: In a 2011 Anecdote department article in the Annals, Alex McKenzie provided an excellent account of the events between 1974 and 1976 leading up to INWG 96, a proposed internetwork transport protocol. McKenzie's anecdote focused on the events in INWG (International Network Working Group), which this article shows were a small part of a much larger debate that was going on outside. The author places the INWG discussions in this wider context to better understand the technical points and implications, their ultimate impact, and the paradigm shift that threatened established business models.

Journal ArticleDOI
TL;DR: Without a framework for understanding innovation, as activity and aspiration, the author squanders an opportunity to clarify the relationship between the imperatives of digital electronics and the various ways in which cultures and personalities construct and reconstruct their computers and computer networks in pursuit of human aims.
Abstract: When chroniclers of technological change in the 20th century worship innovation as if it were a god, they often feel freed of the obligation to define the object of their worship. So it is with Walter Isaacson and his popular 2014 book, The Innovators, which begins with a beguiling confession by the author that innovation is “a buzzword, drained of clear meaning.” The author neither addresses the implications of the elusiveness of the innovation concept nor sets down the terms of his engagement with the history of computing, which is more closely the subject of his book, thus squandering a chance to present himself as what he probably aspires to be: a bridge builder between the two mighty rivers of innovation studies. On the one hand, the author recounts the birth, life, and death of digital artifacts; on the other hand, he highlights the people and subcultures that shape digital innovations. But without a framework for understanding innovation, as activity and aspiration, the author squanders an opportunity to clarify the relationship between the imperatives of digital electronics and the various ways in which cultures and personalities construct and reconstruct their computers and computer networks in pursuit of human aims.

Journal ArticleDOI
TL;DR: The previously undocumented history of the 101 Online ecosystem is revealed and reasons why it failed where Minitel had succeeded are suggested.
Abstract: In 1981, videotex and virtual circuits were the hot computer network technologies and promised to bring the world to the masses. Amid a worldwide battle over standards, France started Minitel, which quickly became the first successful mass-market digital information-distribution ecosystem. In 1991, France Telecom launched the American version of Minitel, 101 Online, in San Francisco. 101 Online was as massive a failure as Minitel had been a success. This article reveals the previously undocumented history of the 101 Online ecosystem, suggests reasons why it failed where Minitel had succeeded, and draws lessons for the current policy debate on what information-network architecture and implementation best fosters digital innovation.

Journal ArticleDOI
TL;DR: This article explores what postcolonial science and technology studies mean for historians of computing, reevaluating theories and systems of society and technology in light of the long history of colonialism.
Abstract: Postcolonial studies of science and technology seek to reevaluate our theories and systems of society and technology in light of the ways that they are influenced by the long history of colonialism. This article explores what postcolonial science and technology studies mean for historians of computing.

Journal ArticleDOI
TL;DR: A collection of trip reports from Western computer experts who visited the Soviet Union in the 1960s reveals details about interactions between these Westerners and their Soviet counterparts during the height of the Cold War.
Abstract: During its initial decades, computing emerged as a key technology of the Cold War. Due to a largely unidirectional flow of information into, but not out of, the Soviet Union, Western policy analysts who were wary of the growing Soviet computing prowess sought intelligence from numerous sources. Based on an examination of trip reports from Western computer experts who had visited the Soviet Union, this article looks at the interaction between Western computer specialists and their Soviet counterparts during the height of the Cold War. It describes how these interactions helped to shape and confront American perceptions of Soviet computing and the threats it posed to the West. The previously unexplored first-hand perspectives offered in these largely forgotten trip reports help to illuminate the West's fascination with, and fear of, computer technology behind the Iron Curtain.

Journal ArticleDOI
TL;DR: One of two major IBM supercomputer efforts in the late 1960s, the Advanced Computer Systems project had significantly more ambitious performance goals than the earlier IBM System/360 Model 91 project, and it pioneered many features that became common decades later.
Abstract: The Advanced Computer Systems (ACS) project was one of two major IBM supercomputer efforts in the second half of the 1960s. ACS had significantly more ambitious performance goals than the earlier project that developed the IBM System/360 Model 91, and the ACS-1 instruction set and processor design pioneered many features that became common some two or three decades later, such as multiple condition codes and aggressive out-of-order execution. ACS also pioneered high-speed integrated circuitry that required immersion cooling in liquid fluorocarbon. Although the project was canceled, it brought many talented engineers to California and contributed to several later developments at IBM and beyond, including the Amdahl line of System/370-compatible processors in the 1970s and the IBM 801 and POWER processors in the 1980s.

Journal ArticleDOI
Craig Partridge
TL;DR: This article revisits the preceding five years to show that the IETF, at least initially, sought to avoid the crisis that forced a restructuring of the Internet standards governance process.
Abstract: In June 1992, the Internet Activities Board sought to push the Internet Engineering Task Force into a solution for the Internet's address depletion problem. Its actions provoked a management crisis that forced a restructuring of the Internet standards governance process. Although the events have been characterized as a revolt by the Internet Engineering Task Force, this article revisits the preceding five years to show that the IETF, at least initially, sought to avoid the crisis.

Journal ArticleDOI
TL;DR: This article reviews Walter Isaacson's popular 2014 book on the history of computing, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution.
Abstract: Walter Isaacson, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, Simon & Schuster, 2014. When chroniclers of technological change in the 20th century worship innovation as if it were a god, they often feel freed of the obligation to define the object of their worship. So it is with Walter Isaacson and his popular 2014 book, The Innovators, which begins with a beguiling confession by the author that innovation is “a buzzword, drained of clear meaning” (p. 1). Isaacson neither addresses the implications of the elusiveness of the innovation concept nor sets down the terms of his engagement with the history of computing, which is more closely the subject of his book, thus squandering a chance to present himself as what he probably aspires to be: a bridge builder between the two mighty rivers of innovation studies. On the one hand, he recounts the birth, life, and death of digital artifacts; on the other hand, he highlights the people and subcultures that shape digital innovations. But without a framework for understanding innovation, as activity and aspiration, he squanders an opportunity to clarify the relationship between the imperatives of digital electronics (to what extent these technical factors determine the digital field at play) and the various ways in which cultures and personalities construct and reconstruct their computers and computer networks in pursuit of human aims. Isaacson, a journalist who penned the authorized biography of Steve Jobs after writing popular biographies of Einstein and Ben Franklin, fails to embed these trajectories into the wider context of technological developments in the United States and the world in the second half of the 20th century. In the wider frame of innovation, or why societies and national economies grow and change, Isaacson is even weaker, more uncertain and thinner in his use of sources. Because he only examines successes in computing, rarely looking at losing approaches, his insights into how innovation occurs are usually tautological. His logic is insufferably circular and rarely rises above the level of empty truisms. “The most successful endeavors in the digital age,” he writes in his closing pages, “were those run by leaders who fostered collaboration while also providing a clear vision” (p. 484). For example, since plenty of leaders fail despite possessing widely admired traits, Isaacson resorts to identifying successful computer people—say Andy Grove, Bill Gates, or Steve Jobs—and then posits that they must have these traits because they certainly succeeded in a big way. On closer inspection, however, these super-successful digital tycoons don’t possess all of these traits or have them in the combinations seemingly required for grand achievements. Isaacson is then forced to concede that a great innovator can be both passionately self-possessed and highly open: “The best leaders could be both,” he writes (p. 484). Such post-hoc analyses end up looking like history written by the winners (and for the winners). When Isaacson does invoke an actual trait of successful innovators—that they know how to design and build the novel products they are selling—he again lapses into tautology. Innovators, he writes, share “one thing in common: they were product people.” True, but what else could they be?
If an innovator is, by definition, someone who backs a novel thing or service and wins in the marketplace against inertia or the power of tradition, how could they not possess mastery of their products? Isaacson has written a book that demands attention because of the author’s stature and the scale and scope of the capacious subject he tackles. But alas, The Innovators is not about innovation broadly construed or even innovators as individuals ranging across diverse areas. Rather, Isaacson concentrates on a specific area of innovation: computing and related digital electronics and communications. Following well-established paths that attempt to humanize computer history by associating a person or a set of people with major historical shifts in computing, Isaacson presents a series of potted biographies, largely if not completely derived from existing literature (some of which is decades old) that is organized around successful companies (such as Intel, Apple, Microsoft, and Google) or technological platforms (the PC, computer games, software, the Web, and search are representative examples). While invoking pieties about the importance of teams and collaboration in the rise and spread of computer technologies and what he terms “the digital revolution,” Isaacson subtly and at times bluntly emphasizes, if not exaggerates, the role of signal individuals such as Steve Jobs, Bill Gates, Gordon Moore, and Larry Page, and even earlier, formative actors such as Vannevar Bush (whom he invokes at the opening of four chapters), Alan Turing, and John von Neumann. Isaacson’s approach both panders to popular tastes and reflects them. The popular understanding of the ascent of the computer, as artifact and sociotechnical driving force, lends much weight to the potential effects of great individuals on the course of sociotechnical histories. The habit of personifying phases in the ongoing digital revolution is so strong that biography is easily equated with history. Although human dimensions of technological change should always be explored, the emphasis on heroic individuals can

Journal ArticleDOI
TL;DR: The reality is that no one is any closer to implementing a contextually nimble machine learning system that could, say, engage us in an exegesis of a poem than when Alan Turing first proposed this thought experiment for machine intelligence in 1950.
Abstract: We’ve grown accustomed to speculative narratives about the birth of artificial intelligence (both on the screen and the page) in which computers programmed by us to teach themselves quickly exceed our own mental faculties and physical resources. The idea that an “ultraintelligent machine” will “gradually improve itself out of all recognition” is nearly as old as the term AI itself. It has gained prominence in the popular imagination through a cottage industry of books, articles, and think pieces advocating for the study of safe AI. The idea is to defer or defuse entirely an inevitable apocalypse, and it’s prompted at least one multimillion dollar donation “aimed at keeping AI beneficial to humanity.” The reality is that no one is any closer to implementing a contextually nimble machine learning system that could, say, engage us in an exegesis of a poem than when Alan Turing first proposed this thought experiment for machine intelligence in 1950. One of the most exciting implementations of a “general purpose” algorithm was one that learned how to play 29 Atari video games at “human-level or above” proficiency. That is, what is “general” is its ability to learn different games by only having access to the pixels on the screen and the controller inputs while only being programmed to maximize score. This was such an impressive advance over earlier work that it was featured on the cover of Nature in 2015. The failure to appreciate this point has contributed to myopia in the popular histories of AI that rely on AI researchers as informants while downplaying the enormous body of technical work these informants produced, often relegating the field of machine learning to a mere subfield of AI. The actual historical situation, in terms of the sheer volume and ambit of technical publications produced, suggests the opposite to be true: machine learning has always been center stage, while AI within the larger field of computer science has often had the status of a disciplinary backwater.

Journal ArticleDOI
TL;DR: This article briefly surveys efforts to introduce computerized, centralized, and portable electronic health records in the United States and why they have been largely unsuccessful.
Abstract: How did we wind up with computers in most doctors' offices but little communication between these computers? This article briefly surveys efforts to introduce computerized, centralized, and portable electronic health records in the United States and why they have been largely unsuccessful. Historians of computing have the resources not only to help the general public think more constructively about the roles of computers in medicine, but also to see how what is happening in medicine lays bare the great and small changes the use of information technology has brought to everyday life.

Journal ArticleDOI
TL;DR: All four articles in this special issue cover stories of real-world implementations or applications of computing technology in East Asia, which demonstrates the diversity of this scholarly community.
Abstract: All four articles in this special issue cover stories of real-world implementations or applications of computing technology in East Asia. Each article adopts a unique approach and historiographical strategy, which demonstrates the diversity of this scholarly community.

Journal ArticleDOI
TL;DR: This issue is the second of two Annals special issues extending from the National Science Foundation's Computer Security History Workshop, which gathered historians and pioneers at the Charles Babbage Institute in July 2014.
Abstract: This issue is the second of two Annals special issues extending from the National Science Foundation's Computer Security History Workshop, which gathered historians and pioneers at the Charles Babbage Institute in July 2014. It markedly advances scholarship on many different critical aspects of computer security history.

Journal ArticleDOI
TL;DR: This story reveals not only the nimbleness of the early software industry but also the importance of venture capitalists in the success and failure of early software firms.
Abstract: Before Symantec became a major supplier of security software, it offered a variety of natural language microcomputer software products. Its growth into a security firm was the result of acquisitions of software enterprises and expanding market conditions in the 1980s and 1990s. This story reveals not only the nimbleness of the early software industry but also the importance of venture capitalists in the success and failure of early software firms.

Journal ArticleDOI
TL;DR: Raymond (Ray) Tomlinson was a computer engineer best known for developing the TENEX operating system and for implementing the first email program on the Arpanet in 1971.
Abstract: Raymond (Ray) Tomlinson was a computer engineer best known for developing the TENEX operating system and for implementing the first email program on the Arpanet in 1971. In its official biography, the Internet Hall of Fame states that "Tomlinson's email program brought about a complete revolution, fundamentally changing the way people communicate." This interview is the first in a two-part Annals series based on an oral history conducted by Marc Weber and Gardner Hendrie for the Computer History Museum (CHM) in June 2009.

Journal ArticleDOI
TL;DR: It is shown that the original digital pictures were associated with the original computers in the late 1940s and early 1950s, which establishes a different take on the history of early computers and unifies the history of digital light itself.
Abstract: Digital pictures and computers are now inseparable, so it's surprising how generally unremarked their association was in the beginning. Records reveal that the first digital pictures--the first still pictures, videogames, and computer animations--were made on the earliest computers. Historians have noted this before, but individually without a unifying context. This article shows that the original digital pictures were associated with the original computers in the late 1940s and early 1950s. This fresh perspective on digital pictures establishes a different take on the history of early computers and unifies the history of digital light itself.

Journal ArticleDOI
Caroline Jack
TL;DR: Studying JA's use of computers in Applied Economics reveals how a corporate-sponsored nonprofit group used personal computers to engage students, adapt its traditional outreach methods to the classroom, and bolster an appreciation of private enterprise in American economic life.
Abstract: In late 1982, the corporate-funded business education nonprofit Junior Achievement (JA) distributed 121 donated personal computers to classrooms across the United States as part of its new high school course, Applied Economics. Studying JA's use of computers in Applied Economics reveals how a corporate-sponsored nonprofit group used personal computers to engage students, adapt its traditional outreach methods to the classroom, and bolster an appreciation of private enterprise in American economic life. Mapping the history of how business advocacy and education groups came to adopt software as a means of representing work and commerce offers a new perspective on how systems of cultural meaning have been attached to, and expressed through, computers and computing.

Journal ArticleDOI
TL;DR: This article discusses what postcolonial science and technology studies mean for historians of computing and encourages a discussion of how our categories may be more contingent, and less universal, than we have accepted.
Abstract: In the early 1990s, Indian historian Dipesh Chakrabarty proposed an agenda for “provincializing Europe.” According to Chakrabarty, philosophers, historians, and other scholars who shaped the nature of western social science developed their theoretical and empirical projects to embrace the entirety of humanity. However, they also produced this knowledge in relative, and sometimes absolute, ignorance of the histories and experiences of those living outside of the western world. In his response, Chakrabarty sought to demonstrate how our categories may be more contingent, and less universal, than we have accepted—often without evidence. In other words, this historical method promotes a more limited and thus accurate use of core concepts that usually are translated without any problem, making the provincialization of Europe a cautious engagement with historical research. Since that time we have seen a rise in what is called postcolonial studies of science and technology. Ultimately this field seeks to reevaluate our theories and systems of society and technology in light of the ways that they are influenced by the long history of colonialism. Here I want to continue and encourage a discussion on what postcolonial science and technology studies mean for historians of computing.

Journal ArticleDOI
TL;DR: In the last days of 1982, the corporate-funded business education nonprofit Junior Achievement (JA) began distributing 121 donated personal computers (Xerox 820s, HP-86s, and IBM personal computers) to classrooms in 25 cities across the United States.
Abstract: In the last days of 1982, the corporate-funded business education nonprofit Junior Achievement (JA) began distributing 121 donated personal computers—Xerox 820s, Hewlett Packard HP-86s, and IBM personal computers—to classrooms in 25 cities across the United States. The donated machines were part of JA’s new high school course, Applied Economics. The course curriculum had been designed with the goal of teaching economic principles, business skills, and the appreciation of private enterprise to high school students. The personal computer, JA executives hoped, would draw student interest toward the course and its perspective on American economic life. The course, including its classroom lectures, in-class activities, computerized bookkeeping, and management simulation software, would present a vision of American economic life in which market forces organized the economy, with necessary but decidedly minimal interventions by organized labor or the state. The research for my dissertation focused on the production of these kinds of corporate-sponsored “economic education” media: television documentaries, public service announcements, and school curricula that sponsors and producers hoped would instill an appreciation of private enterprise in the American public. Sponsored economic education media productions were particularly prevalent from the New Deal era through the late Cold War period. Sometimes, corporations directly sponsored and produced economic education materials. More often, however, nonprofit advocacy, outreach, or education groups acted as institutional intermediaries, taking corporate grants or sponsorships to fund media production. These media productions—ranging across diverse forms such as pamphlets, films, television programs, filmstrips, textbooks, and digital software—were efforts to maintain and buttress the social legitimacy of capitalism and private enterprise in the public imagination. Many of the groups involved in economic education were interested in what personal computers could help them communicate to the public. The act of embracing computer technology could help an economic education group signal a forward-looking orientation. Furthermore, economic education groups were interested in software because it could represent complex ideas: from their perspective, software was communicative and therefore potentially persuasive.

Maintaining Meaning

Recent scholarship has emphasized the importance of the maintainers—that is, the people who “keep ordinary existence going” by repairing and maintaining our technological systems. Technological systems, however, are not the only systems that require maintenance. Systems of cultural meaning—that is to say, ideologies—must also be carefully maintained to accommodate changing conditions and avoid ideological breakdowns that could shift the distribution of power, wealth, and perceived social legitimacy within a society. In other words, the advocacy groups, education nonprofits, and corporate sponsors involved in making economic education media were maintainers of a different sort. They used media in their efforts to repair and maintain ideologies, in the process framing a particular set of American capitalist institutional practices and social values as “ordinary existence.” We can examine these maintenance efforts by exploring not only the traditional media such organizations produced, but also their turn to and deployment of computers.
JA’s adoption of computing in the Applied Economics curriculum illustrates how this nonprofit business education group conceptualized personal computer hardware and software: as a means of drawing the attention of disengaged students toward an appreciation of free market perspectives, and as a means of aligning a learning-by-doing tradition with the rhythms and norms of the secondary school classroom. This is not to suggest that there is something intrinsic or natural that links computing to capitalism in the abstract or to the many and varied ways Americans practice capitalism. Rather, the case provides one more example of the myriad ways computers were drawn into, and became modes of expression for, political and economic ideologies. Links drawn between capitalist ideology and computing include Thomas Streeter’s critique of Silicon Valley’s “two guys in a garage” mythos, for example, which draws out the romantic individualism embedded in 1980s personal computing lore. Similarly, Fred Turner’s unraveling of the 1990s cyber-elite’s countercultural roots helps illuminate the emergence of techno-libertarianism. Other works such as Eden Medina’s account of socialist computing visions in Allende’s Chile and Benjamin Peters’ chronicling of Soviet attempts to build a nationwide networked system of computers, however, show that noncapitalist