
Showing papers in "Communications of The ACM in 1991"


Journal ArticleDOI
TL;DR: The Baseline method has been by far the most widely implemented JPEG method to date, and is sufficient in its own right for a large number of applications.
Abstract: For the past few years, a joint ISO/CCITT committee known as JPEG (Joint Photographic Experts Group) has been working to establish the first international compression standard for continuous-tone still images, both grayscale and color. JPEG's proposed standard aims to be generic, to support a wide variety of applications for continuous-tone images. To meet the differing needs of many applications, the JPEG standard includes two basic compression methods, each with various modes of operation. A DCT-based method is specified for "lossy" compression, and a predictive method for "lossless" compression. JPEG features a simple lossy technique known as the Baseline method, a subset of the other DCT-based modes of operation. The Baseline method has been by far the most widely implemented JPEG method to date, and is sufficient in its own right for a large number of applications. This article provides an overview of the JPEG standard, and focuses in detail on the Baseline method.
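As a sketch of the 8x8 DCT step at the heart of the Baseline method, here is a naive direct implementation; real codecs use fast factored transforms and follow the DCT with quantization and entropy coding, none of which is shown here.

```python
import math

def dct2_8x8(block):
    """Naive 2-D DCT-II on an 8x8 block, the transform used by
    baseline JPEG. Direct O(n^4) form, for illustration only."""
    def c(k):
        # Normalization: the DC basis function gets an extra 1/sqrt(2).
        return 1 / math.sqrt(2) if k == 0 else 1.0
    out = [[0.0] * 8 for _ in range(8)]
    for u in range(8):
        for v in range(8):
            s = 0.0
            for x in range(8):
                for y in range(8):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[u][v] = 0.25 * c(u) * c(v) * s
    return out

# Baseline JPEG level-shifts 8-bit samples by 128 before the transform;
# a flat mid-gray block therefore transforms to all-zero coefficients.
block = [[128 - 128] * 8 for _ in range(8)]
coeffs = dct2_8x8(block)
```

For a uniform block, all the signal energy lands in the single DC coefficient, which is why the subsequent quantization and run-length coding of (mostly zero) AC coefficients compresses smooth image regions so well.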

3,944 citations


Journal ArticleDOI
TL;DR: Categories and examples of groupware are described, some underlying research and development issues are discussed, and GROVE, a novel group editor, is explained in some detail as a salient groupware example.
Abstract: Groupware reflects a change in emphasis from using the computer to solve problems to using the computer to facilitate human interaction. This article describes categories and examples of groupware and discusses some underlying research and development issues. GROVE, a novel group editor, is explained in some detail as a salient groupware example.

2,891 citations


Journal ArticleDOI
TL;DR: Design of the MPEG algorithm presents a difficult challenge: quality requirements demand high compression that cannot be achieved with intraframe coding alone, yet the algorithm's random access requirement is best satisfied with pure intraframe coding.
Abstract: The Moving Picture Experts Group (MPEG) standard addresses compression of video signals at approximately 1.5 Mbits/s. MPEG is a generic standard, independent of any particular application. Applications of compressed video on digital storage media include asymmetric applications such as electronic publishing, games and entertainment. Symmetric applications of digital video include video mail, video conferencing, videotelephony and electronic publishing production. Design of the MPEG algorithm presents a difficult challenge, since quality requirements demand high compression that cannot be achieved with intraframe coding alone. The algorithm's random access requirement, however, is best satisfied with pure intraframe coding. MPEG uses predictive and interpolative coding techniques to answer this challenge. Extensive details are presented.
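The intraframe/interframe compromise is commonly resolved by arranging frames into a repeating group of pictures: periodic I-frames provide random access points, P-frames are predicted, and B-frames are interpolated. A minimal sketch (the parameter names and defaults here are illustrative, not taken from the standard):

```python
def gop_pattern(n_frames, i_interval=12, b_between=2):
    """Lay out an MPEG-style group-of-pictures sequence.

    i_interval: distance between I-frames (random access points).
    b_between:  number of interpolated B-frames between anchor frames.
    """
    frames = []
    for i in range(n_frames):
        pos = i % i_interval
        if pos == 0:
            frames.append('I')          # intraframe: decodable on its own
        elif pos % (b_between + 1) == 0:
            frames.append('P')          # predicted from the previous anchor
        else:
            frames.append('B')          # interpolated between anchors
    return ''.join(frames)
```

Shortening `i_interval` improves random access (a decoder can start at any I-frame) at the cost of compression, which is exactly the tension the abstract describes.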

2,447 citations



Journal ArticleDOI
TL;DR: An economic understanding of how information systems affect some key measures of organization structure is developed, addressing the lack of comprehensive analysis of these issues from an economic perspective.
Abstract: The adoption of information technology (IT) in organizations has been growing at a rapid pace. The use of the technology has evolved from the automation of structured processes to systems that are truly revolutionary in that they introduce change into fundamental business procedures. Indeed, it is believed that “More than being helped by computers, companies will live by them, shaping strategy and structure to fit new information technology [25].” While the importance of the relationship between information technology and organizational change is evidenced by the considerable literature on the subject,1 there is a lack of comprehensive analysis of these issues from the economic perspective. The aim of this article is to develop an economic understanding of how information systems affect some key measures of organization structure.

1,023 citations


Journal ArticleDOI
TL;DR: This article attempts to introduce some discipline and order in understanding fault-tolerance issues in distributed system architectures by examining various proposals, discussing their relative merits, and illustrating their use in existing commercial fault-tolerant systems.
Abstract: This article attempts to introduce some discipline and order in understanding fault-tolerance issues in distributed system architectures. This article examines various proposals, discusses their relative merits, and illustrates their use in existing commercial fault-tolerant systems.

784 citations



Journal ArticleDOI
TL;DR: The conclusions of the experience are: reuse library technology is available, it is transferable, and it definitely has a positive financial impact on the organization implementing it.
Abstract: Experience with the development, implementation, and deployment of reuse library technology is reported. The focus is on organizing software collections for reuse using faceted classifications. Briefly described are the successful GTE Data Services Asset Management Program and the steps taken at Contel for furthering reuse technology. The technology developed for reuse libraries is presented, followed by a description of how it was transferred. The experience described indicates that reuse library technology is available and transferable, and that it definitely has a positive financial impact on the organization implementing it.
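A faceted classification describes each reusable asset along several independent dimensions, and retrieval matches on any combination of facets. A minimal sketch, with invented facet names and assets (the article's actual classification scheme may differ):

```python
# Hypothetical reuse-library catalog: each asset is classified by facets.
ASSETS = {
    "quicksort.c": {"function": "sort",   "object": "array", "medium": "memory"},
    "btree.c":     {"function": "search", "object": "tree",  "medium": "disk"},
    "merge.c":     {"function": "sort",   "object": "file",  "medium": "disk"},
}

def lookup(**facets):
    """Return assets whose classification matches every requested facet."""
    return sorted(name for name, cls in ASSETS.items()
                  if all(cls.get(f) == v for f, v in facets.items()))
```

Because each facet narrows the result set independently, a user can start broad (`lookup(function="sort")`) and refine (`lookup(function="sort", medium="disk")`) without a rigid single-hierarchy catalog.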

542 citations



Journal ArticleDOI
TL;DR: Alternative techniques for evaluating the business case for strategic systems have been developed and have worked well in practice.
Abstract: Developing a strategic application (intended to make a company more flexible, more responsive to customer needs, or more able to adapt to rapidly changing conditions in the competitive environment) is fundamentally different from investments undertaken to automate the back office to reduce expenses or increase capacity. Alternative techniques for evaluating the business case for strategic systems have been developed and have worked well in practice. Several cases are presented.

417 citations


Journal ArticleDOI
TL;DR: A system called LaSSIE has been built that uses knowledge representation and reasoning technology to directly address three aspects of the invisibility problem inherent in developing large software systems.
Abstract: The authors discuss the important problem of invisibility that is inherent in the task of developing large software systems. It is pointed out that there are no direct solutions to this problem; however, there are several categories of systems (relational code analyzers, reuse librarians, and project management databases) that can be seen as addressing aspects of the invisibility problem. It is argued that these systems do not adequately deal with certain important aspects of the problem of invisibility: semantic proliferation, multiple views, and the need for intelligent indexing. A system called LaSSIE, which uses knowledge representation and reasoning technology to address each of these three issues directly and thereby help with the invisibility problem, has been built. The authors conclude with an evaluation of the system and a discussion of open problems and ongoing work.


Journal ArticleDOI
TL;DR: ACM first published recommendations for undergraduate programs in computer science in 1968, in a report called "Curriculum '68"; since then the ACM Education Board has been providing updates to its recommendations for computer science programs as well as recommendations for other academic programs in computing.
Abstract: ACM first published recommendations for undergraduate programs in computer science in 1968 in a report called “Curriculum '68.” The report was produced as an activity of the ACM Education Board, which since then has been providing updates to recommendations for computer science programs as well as recommendations for other academic programs in computing.

Journal ArticleDOI
TL;DR: There should be a standard way to make "information about information" interoperable, and when an electronic document is created, its author should be able to incorporate active references to other on-line documents, regardless of the heterogeneity of their notations.
Abstract: The computer and telecommunications industries have made enormous progress in communications technology standardization in recent years. One effect of good communications technology is that people can concentrate on the information being communicated. More and more people are realizing, however, that being able to send and receive files containing information is not enough. It is desirable that all digital documents explicitly indicate in a standard way what kind of notation is used in them. When an electronic document is created, its author should be able to incorporate active references to other on-line documents ("hyperlinks"), regardless of the heterogeneity of their notations. In other words, there should be a standard way to make "information about information" interoperable. Such a standard should, among other things:

Journal ArticleDOI
TL;DR: Groupware is intended to create a shared workspace that supports dynamic collaboration in a work group over space and time constraints and must overcome the hurdle of critical mass.
Abstract: Groupware is intended to create a shared workspace that supports dynamic collaboration in a work group over space and time constraints. To gain the collective benefits of groupware use, the groupware must be accepted by a majority of workgroup members as a common tool. Groupware must overcome the hurdle of critical mass.

Journal ArticleDOI
TL;DR: NSF and DARPA have awarded over $15 million to the Corporation for National Research Initiatives (CNRI) to create testbeds to perform research on the design and development of networks that operate with data rates of about one gigabit per second.
Abstract: NSF-funded collaboratories are experimental and empirical research environments in which domain scientists work with computer, communications, behavioral and social scientists to design systems, participate in collaborative science, and conduct experiments to evaluate and improve the systems. These research projects are concerned with distributed and collaborative research that requires intense reliance on wide-area networks and the Internet, to bring together instruments, laboratories and researchers. Three NSF programs in the Computer and Information Science and Engineering Directorate support the design of collaboratories and coordination experiments: 1. Coordination Theory and Collaboration Technology Special Initiative (CT2). This initiative supports fundamental research of relevance to the design of collaboratories. The research covers a broad spectrum of coordination problems, from formal theory to software design and collaboratory development. For the past two years, the Information Technology and Organizations (ITO) Program has coordinated this initiative. A total of 17 awards have been made. Funding for existing as well as new CT2 awards is expected to be about $3 million this year. Also starting in FY 1991 the initiative has been institutionalized by making it an integral part of the ITO program, with an enhanced base to its budget. 2. Research on scientific databases. A new call seeks proposals for work on problems that are fundamental to the design of scientific databases, written by interdisciplinary groups that include relevant domain scientists. The success of the overall collaboratory design enterprise requires the ability to store and easily access the data and knowledge in extremely large, heterogeneous and distributed databases. The Database and Expert Systems Program coordinates this effort. Funding for proposals under this announcement will total more than $1 million. 3. The Gigabit Network project.
NSF and DARPA have awarded over $15 million to the Corporation for National Research Initiatives (CNRI) to create testbeds to perform research on the design and development of networks that operate with data rates of about one gigabit per second. The availability of gigabit networks may enable a major paradigm shift from text-based to image-based communication. Five contracts awarded by CNRI address network architectures and potential applications for gigabit networks. This testbed research includes distributed computing using multiple supercomputers and workstations, and real-time processing of composite high-speed data streams. The experiments explore the feasibility of group collaboration over the network and the use of gigabit networks to develop simulated environments. Intelligent Systems (IRIS). This division supports fundamental scientific research on the …


Journal ArticleDOI
Morten Kyng1
TL;DR: This article will discuss how to design computer applications that enhance the quality of work and products, and will relate the discussion to current themes in the field of Computer-Supported Cooperative Work (CSCW).
Abstract: This article will discuss how to design computer applications that enhance the quality of work and products, and will relate the discussion to current themes in the field of Computer-Supported Cooperative Work (CSCW). Cooperation is a key element of computer use and work practice, yet here a specific "CSCW approach" is not taken. Instead the focus is on cooperation as an important aspect of work that should be integrated into most computer support efforts. In order to develop successful computer support, however, other aspects such as power, conflict and control must also be considered.

Journal ArticleDOI
TL;DR: A status report about the value of this process of usability design and a description of new ideas for enhancing the use of the process are described.
Abstract: Almost a decade has passed since we started advocating a process of usability design. This article is a status report about the value of this process and, mainly, a description of new ideas for enhancing the use of the process.

Journal ArticleDOI
TL;DR: The history of database system research in the US is one of exceptional productivity and startling economic impact. Barely twenty years old as a basic science research field, database research conducted with Federal support in the nation's universities and in its industrial research laboratories has fueled an information services industry estimated at $10 billion per year in the US alone.
Abstract: The history of database system research in the US is one of exceptional productivity and startling economic impact. Barely twenty years old as a basic science research field, database research conducted with Federal support in the nation's universities and in its industrial research laboratories has fueled an information services industry estimated at $10 billion per year in the US alone. This industry has grown at an average rate of 20 percent per year since 1965 and is continuing to expand at this rate. Achievements in database research underpin fundamental advances in communications systems, transportation and logistics, financial management, knowledge-based systems, accessibility to scientific literature, and a host of other civilian and defense applications. They also serve as the foundation for considerable progress in basic science in various fields ranging from computing to biology.


Journal ArticleDOI
TL;DR: The notion that computers, or more significantly, a computer, will control the lives and destinies of humans actually predates the age of computing.
Abstract: In the discussion of the computer's impact on society, few themes have been as provocative and hotly contested as computerization and centralization. The notion that computers, or more significantly, a computer, will control the lives and destinies of humans actually predates the age of computing. E.M. Forster's novella The Machine Stops, written in 1909, describes a utopian society in which all goods and services are provided by a giant "thinking machine", until the machine stops. Yevgeny Zamyatin's 1924 novel We tells a similar story of a society run by a gigantic machine that eventually becomes the oppressor of the people. And Kurt Vonnegut's 1952 novel Player Piano updated Forster's and Zamyatin's visions, setting the stage for ongoing speculation about the centralizing effects of the modern digital computer.

Journal ArticleDOI
TL;DR: Starburst as discussed by the authors is a research prototype of an extensible relational database management system under development at the IBM Almaden Research Center; it includes hierarchies of user-defined types and functions, large unstructured and structured complex objects, and user-defined rules to respond to changes in the database.
Abstract: Starburst is a research prototype of an extensible relational database management system that is under development at the IBM Almaden Research Center. Through extensions to Starburst, we are incorporating the advanced structuring and data behavior features offered by object-oriented database management systems, while retaining the significant gains in data independence and data integrity of the relational model and upward compatibility with its standard access language, SQL. Some of the advanced features supported by Starburst extensions, described in this paper, include hierarchies of user-defined types and functions, large unstructured and structured complex objects, and user-defined rules to respond to changes in the database.

Journal ArticleDOI
TL;DR: The scalability of a given architecture is defined to be the fraction of the parallelism inherent in a given algorithm that can be exploited by any machine of that architecture as a function of problem size.
Abstract: We define the scalability of a given architecture to be the fraction of the parallelism inherent in a given algorithm that can be exploited by any machine of that architecture as a function of problem size.
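With hypothetical symbols (not from the article), the definition can be written as a ratio: let $P_A(n)$ be the parallelism inherent in the algorithm at problem size $n$, and $P_M(n)$ the portion of it that a machine of the given architecture can exploit. Then the scalability is

```latex
\Psi(n) \;=\; \frac{P_M(n)}{P_A(n)}, \qquad 0 \le \Psi(n) \le 1,
```

so an architecture is perfectly scalable for the algorithm when $\Psi(n) \to 1$, and poorly scalable when the exploitable fraction shrinks as $n$ grows.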

Journal ArticleDOI
TL;DR: Ideas and suggestions from current literature on software documentation are described, recognizing the interconnectedness of the software, its documentation, and the help system.
Abstract: Finally, new and interesting ideas about documentation. It is kind of funny, really. Most documentation is written by technicians, not professional writers. And most technicians would include documentation among their top ten complaints regarding the software they use. Physician, heal thyself. This column describes ideas and suggestions from current literature on software documentation. I hope they will change the way you think about documentation. If you are in the software field, it is almost certain that you will have to write documentation, for either your peers or your users. If you are designing software, you owe it to those you serve to gain an enlightened attitude toward documentation, recognizing the interconnectedness of the software, its documentation, and the help system. Otherwise, you are not a "practical programmer."

Journal ArticleDOI
TL;DR: Groupware, such as electronic mail, conferencing, and on-line editing, has an apparently natural affinity to the team and project work of salaried professional employees.
Abstract: Advanced computer tools designed to facilitate collaboration in a common task or across functions have had a remarkably disappointing record of diffusion and adoption [16]. Technologies that are unresponsive to users' needs will not find their markets. Groupware such as electronic mail, conferencing, and on-line editing, however, has an apparently natural affinity to the team and project work of salaried professional employees.

Journal ArticleDOI
TL;DR: Adoption strategies are needed to guide and accelerate the process of introducing appealing new technologies into the marketplace; finding productive uses for new systems takes time.
Abstract: Multimedia communication systems promise better support for widely distributed workgroups. Their benefits for complex communication (problem-solving, negotiating, planning, and design) seem obvious. Introducing appealing new technologies into the marketplace, however, can require years of investment [13, 22]. In particular, finding productive uses for new systems takes time. Adoption strategies are needed to guide and accelerate the process.

Journal ArticleDOI
TL;DR: This article explores the landscape in which the major object-oriented facilities exist, showing how the CLOS solution is effective within the two contexts.
Abstract: Lisp has a long history as a functional language, where action is invoked by calling a procedure, and where procedural abstraction and encapsulation provide convenient modularity boundaries. A number of attempts have been made to graft object-oriented programming into this framework without losing the essential character of Lisp: to include the benefits of data abstraction, extensible type classification, incremental operator definition, and code reuse through an inheritance hierarchy. The Common Lisp Object System (CLOS) [3], a result of the ANSI standardization process for Common Lisp, represents a marriage of these two traditions. This article explores the landscape in which the major object-oriented facilities exist, showing how the CLOS solution is effective within the two contexts.