
Showing papers in "Communications of The ACM in 2000"


Journal ArticleDOI
TL;DR: The WINS network represents a new monitoring and control capability for applications in such industries as transportation, manufacturing, health care, environmental oversight, and safety and security, and opportunities depend on development of a scalable, low-cost, sensor-network architecture.
Abstract: Wireless integrated network sensors (WINS) provide distributed network and Internet access to sensors, controls, and processors deeply embedded in equipment, facilities, and the environment. The WINS network represents a new monitoring and control capability for applications in such industries as transportation, manufacturing, health care, environmental oversight, and safety and security. WINS combine microsensor technology, low-power signal processing and computation, and low-cost wireless networking in a compact system. Recent advances in integrated circuit technology have enabled construction of far more capable yet inexpensive sensors, radios, and processors, allowing mass production of sophisticated systems linking the physical world to digital data networks [2–5]. Scales range from local to global for applications in medicine, security, factory automation, environmental monitoring, and condition-based maintenance. Compact geometry and low cost allow WINS to be embedded and distributed at a fraction of the cost of conventional wireline sensor and actuator systems. WINS opportunities depend on development of a scalable, low-cost sensor-network architecture. Such applications require delivery of sensor information to the user at a low bit rate through low-power transceivers. Continuous sensor signal processing enables constant monitoring of events in an environment in which short message packets suffice. Future applications of distributed embedded processors and sensors will require vast numbers of devices; conventional methods of sensor networking place an impractical demand on cable installation and network bandwidth. Processing at the source would drastically reduce the financial, computational, and management burden on communication systems.
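The "processing at the source" argument above can be sketched numerically. The node model, byte sizes, and detection threshold below are invented for illustration and are not taken from the WINS architecture:

```python
# Hypothetical sketch: instead of streaming every raw sample to the sink,
# a sensor node processes locally and emits one short packet per detected
# event. With rare events, the bandwidth saving is dramatic.

def raw_stream_bytes(samples, bytes_per_sample=2):
    """Bandwidth cost of shipping every raw sample to the sink."""
    return len(samples) * bytes_per_sample

def event_packets(samples, threshold, packet_bytes=8):
    """Process at the source: send one short packet per detected event."""
    events = [s for s in samples if s > threshold]
    return len(events) * packet_bytes

# 10,000 samples, of which 1 in 100 crosses the (made-up) event threshold
samples = ([3] * 99 + [98]) * 100
print(raw_stream_bytes(samples))             # 20000 bytes streamed raw
print(event_packets(samples, threshold=50))  # 800 bytes as event packets
```

The 25x reduction here is an artifact of the invented event rate, but the shape of the trade-off is what the abstract argues: short message packets suffice when signal processing happens at the sensor.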

3,415 citations


Journal ArticleDOI
TL;DR: The Internet offers vast new opportunities to interact with total strangers; these interactions can be fun, informative, even profitable, but they also involve risk.
Abstract: The Internet offers vast new opportunities to interact with total strangers. These interactions can be fun, informative, even profitable. But they also involve risk. Is the advice of a self-proclaimed expert at expertcentral.com reliable? Will an unknown dotcom site or eBay seller ship items promptly with appropriate packaging? Will the product be the same one described online? Prior to the Internet, such questions were answered, in part, through personal and corporate reputations. Vendors provided references, Better Business Bureaus tallied complaints, and past personal experience and person-to-person gossip told you on whom you could rely and on whom you could not. Participants’ standing in their communities, including their roles in church and civic organizations, served as a valuable hostage. Internet services operate on a vastly larger scale

2,410 citations


Journal ArticleDOI
TL;DR: The ability to track users’ browsing behavior down to individual mouse clicks has brought the vendor and end customer closer than ever before, and it is now possible for a vendor to personalize his product message for individual customers at a massive scale, a phenomenon that is being referred to as mass customization.
Abstract: The ease and speed with which business transactions can be carried out over the Web have been a key driving force in the rapid growth of electronic commerce. Business-to-business e-commerce is the focus of much attention today, mainly due to its huge volume. While there are certainly gains to be made in this arena, most of it is the implementation of much more efficient supply management, payments, etc. On the other hand, e-commerce activity that involves the end user is undergoing a significant revolution. The ability to track users’ browsing behavior down to individual mouse clicks has brought the vendor and end customer closer than ever before. It is now possible for a vendor to personalize his product message for individual customers at a massive scale, a phenomenon that is being referred to as mass customization.
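As a toy illustration of tracking browsing behavior down to individual clicks, the sketch below aggregates a clickstream into per-user interest profiles; the users, categories, and the "top category" personalization rule are all made up:

```python
# Illustrative sketch of clickstream-based personalization: count each
# user's clicks by category, then tailor the product message to the
# category they click most. Real mass customization is far richer.
from collections import Counter

clicks = [  # (user, category) pairs, one per mouse click
    ("alice", "cameras"), ("alice", "cameras"), ("alice", "books"),
    ("bob", "books"), ("bob", "music"), ("bob", "books"),
]

def interest_profile(clicks, user):
    """Aggregate one user's clicks into a category histogram."""
    return Counter(cat for u, cat in clicks if u == user)

def top_interest(clicks, user):
    """The category to personalize toward: the user's most-clicked."""
    return interest_profile(clicks, user).most_common(1)[0][0]

print(top_interest(clicks, "alice"))  # cameras
print(top_interest(clicks, "bob"))    # books
```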

1,429 citations


Journal ArticleDOI
TL;DR: This article attempts to determine why certain consumers are drawn to the Internet and why others are not, and why the perception of the risk associated with shopping on the Internet is low or is overshadowed by its relative convenience.
Abstract: The past century experienced a proliferation of retail formats in the marketplace. However, as a new century begins, these retail formats are being threatened by the emergence of a new kind of store, the online or Internet store. From being almost a novelty in 1995, online retailing sales were expected to reach $7 billion by 2000 [9]. In this increasingly time-constrained world, Internet stores allow consumers to shop from the convenience of remote locations. Yet most of these Internet stores are losing money [6]. Why does such a counterintuitive phenomenon prevail? The explanation may lie in the risks associated with Internet shopping. These risks may arise because consumers are concerned about the security of transmitting credit card information over the Internet. Consumers may also be apprehensive about buying something without touching or feeling it and being unable to return it if it fails to meet their approval. Having said this, however, we must point out that consumers are buying goods on the Internet. This is reflected in the fact that total sales on the Internet are on the increase [8, 11]. Who are the consumers that are patronizing the Internet? Evidently, for them the perception of the risk associated with shopping on the Internet is low or is overshadowed by its relative convenience. This article attempts to determine why certain consumers are drawn to the Internet and why others are not. Since the pioneering research done by Becker [3], it has been accepted that the consumer maximizes his utility subject to not only income constraints but also time constraints. A consumer seeks out his best decision given that he has a limited budget of time and money. While purchasing a product from a store, a consumer has to expend both money and time. Therefore, the consumer patronizes the retail store where his total costs, or the money and time spent in the entire process, are the least. Since the util-
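The Becker-style reasoning above can be made concrete with a small calculation; the prices, hours, and the consumer's monetary value of time below are hypothetical:

```python
# Minimal sketch of the time-plus-money cost comparison: a consumer
# patronizes the channel whose total cost (price paid, plus time spent
# valued at the consumer's own rate) is lowest.

def total_cost(price, hours_spent, value_of_time_per_hour):
    """Total shopping cost = money expended + monetary value of time."""
    return price + hours_spent * value_of_time_per_hour

# Invented numbers: a trip to the store vs. a quick online order
store  = total_cost(price=100, hours_spent=2.0,  value_of_time_per_hour=25)
online = total_cost(price=105, hours_spent=0.25, value_of_time_per_hour=25)
print(store)   # 150.0
print(online)  # 111.25
```

Despite the higher sticker price, the online store wins for this time-constrained consumer, which is the mechanism the article invokes for why perceived risk can be "overshadowed by relative convenience."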

1,088 citations



Journal ArticleDOI
TL;DR: As people become more connected electronically, the ability to achieve a highly accurate automatic personal identification system is substantially more critical and organizations are looking to automated identity authentication systems to improve customer satisfaction and operating efficiency.
Abstract: For this reason, more and more organizations are looking to automated identity authentication systems to improve customer satisfaction and operating efficiency as well as to save critical resources (see Figure 1). Furthermore, as people become more connected electronically, the ability to achieve a highly accurate automatic personal identification system is substantially more critical [5]. Personal identification is the process of associating a particular individual with an identity. Identification can be in the form of verification (also known as authentication), which entails authenticating a claimed identity (“Am I who I claim I am?”), or recognition (also known as identification), which entails determining the identity of a given person from a database of persons known to the system (“Who am I?”). Knowledge-based and token-based automatic personal identification approaches have been the two traditional techniques widely used [8]. Token-based approaches use something you have to make a personal identification, such as a passport, driver’s license, ID card, credit card, or keys. Knowledge-based approaches use something you know to make a personal identification, such as a password or a personal identification number (PIN). Since these traditional approaches are not based on any inherent attributes of an individual to make a personal identification, they suffer from the
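The verification/recognition distinction can be sketched in a few lines. The "templates" and enrollment database below are invented stand-ins for real biometric matching, used purely to show the 1:1 vs. 1:N structure:

```python
# Toy sketch: verification checks one claimed identity against one
# enrolled template (1:1); identification searches the whole enrolled
# database for a match (1:N). Here a "template" is just a string and
# matching is equality, which real biometric systems do not have.

enrolled = {"alice": "tmplA", "bob": "tmplB", "carol": "tmplC"}

def verify(claimed_id, template):
    """1:1 verification: 'Am I who I claim I am?'"""
    return enrolled.get(claimed_id) == template

def identify(template):
    """1:N recognition: 'Who am I?' Search everyone known to the system."""
    for person, tmpl in enrolled.items():
        if tmpl == template:
            return person
    return None

print(verify("alice", "tmplA"))  # True
print(identify("tmplC"))         # carol
```

The practical difference is cost and error behavior: identification must compare against every enrolled person, so both its running time and its chance of a false match grow with the database.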

827 citations


Journal ArticleDOI
TL;DR: The nature of trust and how and where it flourishes online is explored, a conceptual framework for understanding trust is provided, and 10 characteristics of online interaction that can help engineer trust online are offered.
Abstract: Trust matters. It allows us to reveal vulnerable parts of ourselves to others and to know others intimately in return. A climate of trust eases cooperation among people and fosters reciprocal caretaking. The resources—physical, emotional, economic—that would otherwise be consumed guarding against harm can be directed toward more constructive ends. Here, we explore the nature of trust and how and where it flourishes online. We also seek to make sense of seemingly disparate perceptions. For example, some say the public is too trusting online; without thinking, people routinely download software likely to destroy important information or blithely engage in e-auctions or chat rooms with strangers. Others say the public does not trust enough, that people refrain, for example, from e-commerce under the mistaken belief that their financial transactions are not secure. How can we know if the trust we choose to give or withhold is warranted? Can we trust machines or other technological systems? How can those of us who create and maintain the technological infrastructure help establish a climate of trust? Addressing such questions, we provide a conceptual framework for understanding trust, then offer 10 characteristics of online interaction that can help engineer trust online and distinguish between trust in e-commerce activities and trust in online interpersonal interactions.

760 citations


Journal Article
TL;DR: This work states that ERP implementation is more complex due to cross-module integration, data standardization, adoption of the underlying business model (“best practices”), compressed implementation schedule, and the involvement of a large number of stakeholders.
Abstract: ERP software packages that manage and integrate business processes across organizational functions and locations cost millions of dollars to buy, several times as much to implement, and necessitate disruptive organizational change. While some companies have enjoyed significant gains, others have had to scale back their projects and accept minimal benefits, or even abandon implementation of ERP projects [4]. Historically, a common problem when adopting package software has been the issue of “misfits,” that is, the gaps between the functionality offered by the package and that required by the adopting organization [1, 3]. As a result, organizations have had to choose among adapting to the new functionality, living with the shortfall, instituting workarounds, or customizing the package. ERP software, as a class of package software, also presents this problematic choice to organizations. The problem is exacerbated because ERP implementation is more complex due to cross-module integration, data standardization, adoption of the underlying business model (“best practices”), compressed implementation schedule, and the involvement of a large number of stakeholders. The knowledge gap among implementation personnel is usually significant. Few organizational users under…

Christina Soh, Sia Siew Kien, and Joanne Tay-Yap

739 citations


Journal ArticleDOI
TL;DR: The vast majority of security professionals would agree that real-time ID systems are not technically advanced enough to detect sophisticated cyberattacks by trained professionals; these systems have not matured to a level where such attacks are reliably detected, verified, and assessed.
Abstract: The vast majority of security professionals would agree that real-time ID systems are not technically advanced enough to detect sophisticated cyberattacks by trained professionals. For example, during the Langley cyberattack the ID systems failed to detect substantial volumes of email bombs that crashed critical email servers. Coordinated efforts from various international locations were observed as hackers worked to understand the rules-based filter used in counterinformation operations against massive email bomb attacks [1]. At the other end of the technical spectrum, false alarms from ID systems are problematic, persistent, and preponderant. Numerous systems administrators have been the subject of an ID system reporting normal work activities as hostile actions. These types of false alarms result in financial losses to organizations when technical resources are denied access to computer systems or security resources are misdirected to investigate nonintrusion events. In addition, when systems are prone to false alarms, user confidence is marginalized and misused systems are poorly maintained and underutilized. ID systems that examine operating system audit trails, or network traffic [3, 8] and other similar detection systems, have not matured to a level where sophisticated attacks are reliably detected, verified, and assessed. Comprehensive and reliable systems are complex and the technological designs of these advanced
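One reason false alarms are "problematic, persistent, and preponderant" is the base-rate effect, sketched below with invented numbers: because benign events vastly outnumber real intrusions, even a seemingly low false-positive rate swamps the true detections.

```python
# Back-of-the-envelope sketch (all rates are hypothetical): what fraction
# of an ID system's alarms correspond to real intrusions?

def alarm_precision(events, intrusion_rate, detect_rate, false_pos_rate):
    """P(real intrusion | alarm) for a stream of audit events."""
    intrusions = events * intrusion_rate
    benign = events - intrusions
    true_alarms = intrusions * detect_rate      # intrusions caught
    false_alarms = benign * false_pos_rate      # benign events flagged
    return true_alarms / (true_alarms + false_alarms)

# 1M audit events, 1 in 100,000 is a real intrusion,
# 90% detection rate, 1% false-positive rate
p = alarm_precision(1_000_000, 1e-5, 0.9, 0.01)
print(round(p, 4))  # 0.0009 -- over 99.9% of alarms are false
```

This arithmetic is why the abstract's observation matters operationally: at these (made-up but not implausible) rates, nearly every alarm an administrator investigates is a nonintrusion event.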

623 citations


Journal ArticleDOI
TL;DR: A vision of knowledge so disseminated through the mass of mankind that it may reach even the extremes of society: beggars and kings.
Abstract: see knowledge so disseminated through the mass of mankind that it may…reach even the extremes of society: beggars and kings.

579 citations


Journal ArticleDOI
TL;DR: Newton’s language Regiment, also a functional language, is designed to gather streams of data from regions of the amorphous computer and accumulate them at a single point, which allows Regiment to provide region-wide summary functions that are difficult to implement in Proto.
Abstract: Abstraction to Continuous Space and Time. The amorphous model postulates computing particles distributed throughout a space. If the particles are dense, one can imagine the particles as actually filling the space, and create programming abstractions that view the space itself as the object being programmed, rather than the collection of particles. Beal and Bachrach [10, 4] pursued this approach by creating a language, Proto, where programmers specify the behavior of an amorphous computer as though it were a continuous material filling the space it occupies. Proto programs manipulate fields of values spanning the entire space. Programming primitives are designed to make it simple to compile global operations to operations at each point of the continuum. These operations are approximated by having each device represent a nearby chunk of space. Programs are specified in space and time units that are independent of the distribution of particles and of the particulars of communication and execution on those particles (Figure 5). Programs are composed functionally, and many of the details of communication and composition are made implicit by Proto’s runtime system, allowing complex programs to be expressed simply. Proto has been applied to applications in sensor networks like target tracking and threat avoidance, to swarm robotics, and to modular robotics, e.g., generating a planar wave for coordinated actuation. Newton’s language Regiment [45, 44] also takes a continuous view of space and time. Regiment is organized in terms of stream operations, where each stream represents a time-varying quantity over a part of space, for example, the average value of the temperature over a disc of a given radius centered at a designated point. Regiment, also a functional language, is designed to gather streams of data from regions of the amorphous computer and accumulate them at a single point.
This assumption allows Regiment to provide region-wide summary functions that are difficult to implement in Proto.
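A rough sketch of the Regiment-style region summary described above, under the strong simplifying assumption that devices can simply be enumerated centrally (real Proto/Regiment compile such global operations into per-device communication and local aggregation):

```python
# Hedged sketch: the "average temperature over a disc of a given radius
# centered at a designated point" example, computed over simulated
# particles. The device list and API are invented for illustration.
import math

devices = [  # (x, y, temperature) for particles in the amorphous computer
    (0.0, 0.0, 20.0),
    (1.0, 0.0, 22.0),
    (0.0, 1.0, 24.0),
    (5.0, 5.0, 99.0),  # outside the disc; should not contribute
]

def region_average(devices, center, radius):
    """Fold the readings of all devices inside the disc to one value."""
    cx, cy = center
    readings = [t for x, y, t in devices
                if math.hypot(x - cx, y - cy) <= radius]
    return sum(readings) / len(readings)

print(region_average(devices, center=(0.0, 0.0), radius=2.0))  # 22.0
```

In the real languages the interesting work is exactly what this sketch elides: making the region-wide fold robust to particles joining, failing, and communicating only with nearby neighbors.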

Journal ArticleDOI
TL;DR: With this bundle, users can create, maintain, and edit the relationships of organizational units in public security and defense organizations (such as brigades, companies, and platoons), deployed or otherwise.
Abstract: Consequently, public security and defense organizations can see the real-time locations of various people, units, and equipment in their GIS system, thanks to Global Positioning System (GPS) technology, while getting updates from SAP ERP with details about the current status and capabilities of those units, also updated in real time as needed using enterprise services. With this bundle, users can create, maintain, and edit the relationships of organizational units (referred to in enterprise SOA terms as Functional Units) in public security and defense organizations (such as brigades, companies, and platoons), deployed or otherwise. In addition, they can read, change, and update the attributes of positions, personnel, and materials within these units.

Journal ArticleDOI
David Tennenhouse1
TL;DR: The computer science research community now enjoys a rare and exciting opportunity to redefine its agenda and establish the new goals that will propel society beyond interactive computing and the human/machine breakpoint.
Abstract: For the past 40 years, most of the IT research community has focused on interactive computing, J.C.R. Licklider’s powerful and human-centered vision of human-computer symbiosis [3]. In tandem with this research has come the creation of an IT industry that is hurtling toward the human/machine/network breakpoint—the point at which the number of networked interactive computers will surpass the number of people on the planet. We still have a long way to go before Licklider’s vision is attained—and are many years from extending per-capita penetration to most parts of the world. However, “missing science” may no longer be the factor limiting progress toward these long-cherished goals. It is reasonable, though perhaps heretical, to suggest that refinements of the existing science base will be sufficient to drive these efforts forward. It is time for a change. The computer science research community now enjoys a rare and exciting opportunity to redefine its agenda and establish the new goals that will propel society beyond interactive computing and the human/machine breakpoint. In lifting our sights toward a world in which networked computers outnumber human beings by a hundred or thousand to one, we should consider what these “excess” computers will be doing and craft a research agenda that can lead to increased human productivity and quality of life.

Journal ArticleDOI
TL;DR: Ancient social traditions were designed to elicit trust during uncertain encounters, but for many users, strategic trust is difficult to generate, easily shaken, and once shaken extremely difficult to rebuild.
Abstract: Ancient social traditions were designed to elicit trust during uncertain encounters. Handshaking demonstrated the absence of weapons. Clinking of glasses evolved from pouring wine back and forth to prove it was not poisoned. Now, new social traditions are needed to enhance cooperative behaviors in electronic environments supporting e-commerce, e-services, and online communities. Since users of online systems can’t savor a cup of tea with an electronic rug merchant, designers must develop strategies for facilitating e-commerce and auctions. Since users can’t make eye contact and judge intonations with an online lawyer or physician, designers must create new social norms for professional services. Since users can’t stroll through online communities encountering neighbors with their children, designers must facilitate the trust that enables collective action. In parallel, consumer groups must be vigorous in monitoring and reporting deceptions and disreputable business practices. Political scientist Eric Uslaner of the University of Maryland calls trust “the chicken soup of the social sciences. It brings us all sorts of good things—from a willingness to get involved in our communities to higher rates of economic growth ... to making daily life more pleasant. Yet, like chicken soup, it appears to work somewhat mysteriously” [5]. He tries to sort out the mystery by distinguishing between moral trust, or the durable optimistic view that strangers are well-intentioned, and strategic trust, or the willingness of two people to participate in a specific exchange (see Uslaner’s “Social Capital and the Net” in this section). Trust facilitates cooperative behavior. It is a complex term that has generated dozens of doctoral dissertations, not only in sociology and political science, but now in information systems research as well.
There are enough dimensions to trust and its failures to keep scholars and philosophers busy for some time, but e-commerce, e-services, and online community designers need a guide to practical action [4]. The designer’s goal is to engage users quickly and establish and preserve strategic trust under challenging situations. But for many users, strategic trust is difficult to generate, shaken easily, and once shaken extremely difficult to rebuild. Strategic trust is fragile. The extensive literature on trust offers multiple perspectives. In his politically oriented book Trust, Francis Fukuyama, a former U.S. State Department analyst, claims: “Trust is the expectation that arises within a community of regular, honest, and cooperative behavior, based on commonly shared norms, on the part of the members of that community” [2]. This compact definition embodies several key con-

Journal ArticleDOI
TL;DR: This “Technical Opinion” focuses on understanding the nature of information security in the next millennium and suggests a set of principles that would help in managing information security in the future.
Abstract: Rapid advances in electronic networks and computer-based information systems have given us enormous capabilities to process, store, and transmit digital data in most business sectors. This has transformed the way we conduct trade, deliver government services, and provide health care. Changes in communication and information technologies, and particularly their confluence, have raised a number of concerns connected with the protection of organizational information assets. Achieving consensus regarding safeguards for an information system, among different stakeholders in an organization, has become more difficult than solving many technical problems that might arise. This “Technical Opinion” focuses on understanding the nature of information security in the next millennium. Based on this understanding, it suggests a set of principles that would help in managing information security in the future.


Journal ArticleDOI
TL;DR: As the market evolves, a number of important aspects of these IT outsourcing decisions have been explored and can be categorized as descriptive case studies and surveys of the current outsourcing practices.
Abstract: Information technology outsourcing—the practice of transferring IT assets, leases, staff, and management responsibility for delivery of services from internal IT functions to third-party vendors—has become an undeniable trend ever since Kodak's 1989 landmark decision. In recent years, private and public sector organizations worldwide have outsourced significant portions of their IT functions, a notable example being the Commonwealth Bank of Australia. The IT outsourcing market, which was worth $76 billion in 1995, grew to over $120 billion in 1997 [5]. As the market evolves, a number of important aspects of these IT outsourcing decisions have been explored. These studies can be categorized as descriptive case studies, surveys of current outsourcing practices, and surveys of practitioners' perceptions of risks. IT managers commiserate over the challenges of convincing senior executives that, contrary to popular belief, outsourcing isn't always a money-saving option.




Journal ArticleDOI
TL;DR: Here, some of the features of human-human conversation being implemented in this new genre of embodied conversational agent are described, exploring a notable embodied conversational agent—named Rea—based on these features.
Abstract: More than another friendly face, Rea knows how to have a conversation with living, breathing human users with a wink, a nod, and a sidelong glance. Animals and humans all manifest social qualities and skills. Dogs recognize dominance and submission, stand corrected by their superiors, demonstrate consistent personalities, and so forth. On the other hand, only humans communicate through language and carry on conversations with one another. The skills involved in human conversation have developed in such a way as to exploit all the special characteristics of the human body. We make complex representational gestures with our prehensile hands, gaze away and toward one another out of the corners of our centrally set eyes, and use the pitch and melody of our flexible voices to emphasize and clarify what we are saying. Perhaps because conversation is so defining of humanness and human interaction, the metaphor of face-to-face conversation has been applied to human-computer interface design for quite some time. One of the early arguments for the utility of this metaphor pointed to the application of the features of face-to-face conversation in human-computer interaction, including mixed initiative, nonverbal communication, sense of presence, and the rules involved in transferring control [9]. However, although these features have gained widespread recognition, human-computer conversation has only recently become more than a metaphor. That is, only recently have human-computer interface designers taken the metaphor seriously enough to attempt to design a computer that could hold up its end of the conversation with a human user. Here, I describe some of the features of human-human conversation being implemented in this new genre of embodied conversational agent, exploring a notable embodied conversational agent—named Rea—based on these features.
Because conversation is such a primary skill for humans and learned so early in life (practiced, in fact, between infants and their mothers taking turns cooing and burbling…

EMBODIED CONVERSATIONAL INTERFACE AGENTS

Journal ArticleDOI
TL;DR: The current generation of ERP systems also provides reference models or process templates that claim to embody current best business practices; as discussed by the authors, however, these reference models may not represent best practice for every organization.
Abstract: Enterprise resource planning systems are configurable information systems packages that integrate information and information-based processes within and across functional areas in an organization. The current generation of ERP systems also provides reference models or process templates that claim to embody the current best business practices.

Journal ArticleDOI
TL;DR: Emerging technology may blur the network-centric distinction between NAS and SAN; for example, the decreasing specialization of SAN protocols promises SAN-like devices on Ethernet network hardware.
Abstract: SAN with Fibre Channel network hardware that has a greater effect on a user’s purchasing decisions. This article is about how emerging technology may blur the network-centric distinction between NAS and SAN. For example, the decreasing specialization of SAN protocols promises SAN-like devices on Ethernet network hardware. Alternatively, the increasing specialization of NAS systems may embed much of the file system into storage devices. For users, it is increasingly worthwhile to investigate networked storage core and emerging technologies. Today, bits stored online on magnetic disks are so inexpensive that users are finding new, previously unaffordable, uses for storage. At Dataquest’s Storage2000 conference last June in Orlando, Fla., IBM reported that online disk storage is now significantly cheaper than paper or film, the dominant traditional information storage media. Not surprisingly, users are adding storage capacity at about 100% per year. Moreover, the rapid growth of e-commerce, with its huge global customer base and easy-to-use, online transactions, has introduced new market requirements, including bursty, unpredictable spurts in capacity, that demand vendors minimize the time from a user’s order to installation of new storage. In our increasingly Internet-dependent business and computing environment, network storage is the computer.

NETWORK ATTACHED STORAGE ARCHITECTURE

Journal ArticleDOI
TL;DR: Organizations are generally advised to start planning multisite ERP implementations at the strategic level before proceeding to the technical (software and hardware) levels.
Abstract: Historically, ERP systems evolved from MRP II systems, which are designed to manage a production facility’s orders, production plans, and inventories. ERP systems integrate inventory data with financial, sales, and human resources data, allowing organizations to price their products, produce financial statements, and manage the resources of people, materials, and money. Implementing ERP systems can be quite straightforward when organizations are simply structured and operate in one or a few locations. But when organizations are structurally complex and geographically dispersed, implementing ERP systems involves difficult, possibly unique, technical and managerial choices and challenges. The complexities of what are often called “multisite” ERP implementations are discussed here. Like all computer-based information systems, multisite ERP implementations can be analyzed in terms of levels or layers (logical versus physical, hardware versus software). At each level there are different choices to make and different criteria for evaluating the alternatives. However, the layers are interdependent: Choices at one level may limit the available choices or affect the performance of the system at another level. Therefore, organizations are generally advised to start planning multisite ERP implementations at the strategic level before proceeding to the technical (software and hardware) levels. In practice, however, the sheer size and scale of such implementations may encourage organizations to tackle the layers…

MULTISITE ERP IMPLEMENTATIONS

Journal ArticleDOI
TL;DR: This article spells out the several ways in which the CIO and IT function are found asleep at the wheel on ERP, and details the more effective capabilities and organizational and cultural practices that explain ERP, and indeed wider IT, success.
Abstract: By early 2000 the ERP revolution generated over $20 billion in revenues annually for suppliers and an additional $20 billion for consulting firms. However, for many organizations ERP represents the return of the old IT catch-22 with a vengeance: competitively and technically it’s a must-do, but economically there is conflicting evidence, suggesting it is difficult to justify the associated costs, and difficult to implement to achieve a lasting business advantage. Critical success factors, and reasons for failure in ERP implementations, have now been widely researched [4–8]. However, what is more noticeable is how the difficulties experienced in ERP implementations and with their business value are not atypical of most IT projects, especially when they are large and complex, expensive, take over a year or more to install, use new technology, and impact significantly on the organizational culture and existing business processes [5, 6, 10, 11]. Our own work on ERP success and failure factors differs in one essential respect from all previous studies. We have identified serious neglect in ERP implementations in securing the most effective roles for the CIO and IT function. Moreover, in case studies we have found failures in this area to be correlated strongly to subsequent difficulties in achieving delivery and business value. This article spells out the several ways in which we have found the CIO and IT function (and, it must be said, often senior business executives) asleep at the wheel on ERP. We then detail the more effective capabilities and organizational and cultural practices that explain ERP, and indeed wider IT, success. In practice, irrespective of the differing technologies being implemented, we have indeed found that there are still consistent IT management principles to be applied to ERP, and many lessons to be had from history.

Journal ArticleDOI
TL;DR: Knowledge about the needs of potential customers, and the ability to establish personalized services that satisfy those needs, is key to winning the competitive race on the Web and to making a site better fit its users.
Abstract: The Web has become a borderless marketplace for purchasing and exchanging goods and services. While Web users search for, inspect, and occasionally purchase products and services on the Web, companies compete bitterly for each potential customer. The key to winning this competitive race is knowledge about the needs of potential customers and the ability to establish personalized services that satisfy these needs: making a site better fit its users.

Journal ArticleDOI
TL;DR: The purpose of this article is to understand developments in ERP adoption within the European mid-market, based on a large-scale European multicountry/multi-industry survey conducted in mid-1998.
Abstract: Until recently, the major ERP vendors (SAP, Oracle, PeopleSoft, JD Edwards, and Baan) were mainly targeting the high end of the market (companies with more than 1,000 employees), but this market comes close to saturation. Many large companies have already adopted ERP systems and are planning the next step of how to use the installed ERP infrastructures as foundations for e-business [1, 2]. Most of the small- and medium-sized companies still have to make the decision to deploy ERP. The midsize market is an interesting one; for example, the number of midsize companies (50–1,000 employees) in Europe is estimated to exceed 100,000. Data from our research shows that with average annual IT budgets of more than $500,000, the total European midsize market for IT products and services surpasses a staggering $50 billion per year. This market as a whole is very attractive for the major ERP vendors. However, since the wave of adoption by midsize companies is in its early stages, little is known about the developments and drivers that form the basis of ERP adoption decisions. The purpose of this article is to understand developments in ERP adoption within the European mid-market. Our empirical information is based on a large-scale European multicountry/multi-industry survey conducted in mid-1998. Based on the survey data, we will address various issues, such as: How did ERP penetration in the mid-market develop until 1998? (Yvonne van Everdingen, Jos van Hillegersberg, and Eric Waarts)

Journal ArticleDOI
TL;DR: Wireless and mobile networks have provided the flexibility required for an increasingly mobile workforce, while falling prices and soaring subscriber numbers indicate the technological maturity of the field and the tremendous competition among service providers.
Abstract: With the increasing use of small portable computers, wireless networks, and satellites, a trend to support computing on the move has emerged—this trend is known as mobile computing or nomadic computing [3]. Also referred to as anytime/anywhere computing, mobile computing has several interesting and important applications in business (such as instant claim processing and e-commerce), telecommunications and personal communications, national defense (tracking troop movements), emergency and disaster management, real-time control systems, remote operation of appliances, and access to the Internet. Since a user may not maintain a fixed position in such environments, the mobile and wireless networking support that allows mobile users to communicate with other users (fixed or mobile) becomes crucial. A possible scenario may involve several different networks that can support, or can be modified to support, mobile users. When dealing with different wireless networks, a universal mobile device should be able to select the network (LAN, the Internet, PCS, or satellite) that best meets user requirements. Wireless and mobile networks have provided the flexibility required for an increasingly mobile workforce. As shown in Figure 1(a), the worldwide number of cellular, GSM, and PCS subscribers increased from 140 million in 1996 to over 300 million in 1999 and is expected to grow to 650 million by 2001 (see www.gsmdata.com). In the U.S., capital investment increased from $6.3 billion in 1990 to $66.8 billion in 1999, and service revenues were up from $4.5 billion to $38.7 billion in 1999 (see www.wow-com.com), as shown in Figure 1(b). During the same period, the average local monthly bill diminished from $80 to $39, as shown in Figure 1(c), indicating the technological maturity of the field and the tremendous competition among service providers. Many general remarks can be made about wireless systems.
First, the channel capacity typically available in wireless systems is much lower than what is… (Upkar Varshney and Ron Vetter)

Journal ArticleDOI
TL;DR: Modeling methods, architectures, and tools have become increasingly popular because they can help reduce the cost of software implementation and at the same time increase user acceptance of ERP software solutions.
Abstract: Business information systems can be either designed as custom applications or purchased as off-the-shelf standard solutions. The development of custom applications is generally expensive and is often plagued by uncertainties, such as the selection of appropriate development tools, the duration of the development cycle, or the difficulty of assessing costs. Empirical surveys have shown that between half and two-thirds of information systems projects fail [3]. The current tendency to shift from individual development to standardized, prepackaged software solutions is therefore not surprising. Yet standardized ERP systems such as SAP R/3, Oracle Applications, and PeopleSoft have disadvantages, too. Huge storage needs, networking requirements, and training overheads are frequently mentioned ERP problems. However, the scale of the Business Process Reengineering (BPR) and customization tasks involved in the software implementation process is the major reason for ERP dissatisfaction [1]. Baan, PeopleSoft, and SAP calculate that customers spend between three and seven times more money on ERP implementation and associated services than on the purchase of the software license. Our own experiences confirm that the ratio between ERP implementation effort and software purchase is approximately 5 to 1. With hardware and software costs rapidly decreasing, that ratio becomes even worse. This high ratio is due to the fact that ERP systems are more or less easy to install, yet users must also determine which goals (strategies) they wish to reach with the system, how the functionality of the system can achieve this, and how to customize, configure, and technically implement the package. If one realizes that SAP’s R/3 solution comprises more than 5,000 different parameters, the complexity of the implementation process becomes evident. Inevitably, customization and implementation of ERP systems became an industry of its own.
But small- and medium-sized enterprises in particular are not able to pay consultants millions of dollars for ERP implementation. Hence, modeling methods, architectures, and tools have become increasingly popular because they can help reduce the cost of software implementation and at the same time increase user acceptance of ERP software solutions. Several modeling approaches are possible: