
Showing papers on "The Internet published in 2001"


Journal ArticleDOI
08 Mar 2001-Nature
TL;DR: This work aims to understand how an enormous network of interacting dynamical systems — be they neurons, power stations or lasers — will behave collectively, given their individual dynamics and coupling architecture.
Abstract: The study of networks pervades all of science, from neurobiology to statistical physics. The most basic issues are structural: how does one characterize the wiring diagram of a food web or the Internet or the metabolic network of the bacterium Escherichia coli? Are there any unifying principles underlying their topology? From the perspective of nonlinear dynamics, we would also like to understand how an enormous network of interacting dynamical systems-be they neurons, power stations or lasers-will behave collectively, given their individual dynamics and coupling architecture. Researchers are only now beginning to unravel the structure and dynamics of complex networks.
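
As a concrete illustration of the "network of interacting dynamical systems" question, here is a minimal sketch (mine, not the paper's) that simulates Kuramoto phase oscillators over an arbitrary wiring diagram; the random network, coupling strength K, and frequency spread are illustrative assumptions.

```python
import numpy as np

def kuramoto_step(theta, omega, A, K, dt):
    """One Euler step for Kuramoto oscillators coupled along graph A."""
    diff = theta[None, :] - theta[:, None]      # diff[i, j] = theta_j - theta_i
    coupling = (A * np.sin(diff)).sum(axis=1)   # sum over each node's neighbors
    return theta + dt * (omega + K * coupling)

rng = np.random.default_rng(0)
n = 50
A = (rng.random((n, n)) < 0.1).astype(float)    # illustrative random wiring
A = np.triu(A, 1)
A = A + A.T                                     # symmetric, no self-loops
theta = rng.uniform(0.0, 2.0 * np.pi, n)        # initial phases
omega = rng.normal(0.0, 0.1, n)                 # natural frequencies

for _ in range(2000):
    theta = kuramoto_step(theta, omega, A, K=0.5, dt=0.01)

# Order parameter r in [0, 1]: r near 1 means the network has synchronized.
r = abs(np.exp(1j * theta).mean())
print(f"synchronization order parameter r = {r:.2f}")
```

Swapping in a different adjacency matrix A is exactly the "coupling architecture" experiment the abstract describes.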

7,665 citations


Journal Article
TL;DR: Porter argues that, contrary to recent thought, the Internet is not disruptive to most existing industries and established companies, and that, as all companies embrace Internet technology, the Internet itself will be neutralized as a source of advantage.
Abstract: Many of the pioneers of Internet business, both dot-coms and established companies, have competed in ways that violate nearly every precept of good strategy. Rather than focus on profits, they have chased customers indiscriminately through discounting, channel incentives, and advertising. Rather than concentrate on delivering value that earns an attractive price from customers, they have pursued indirect revenues such as advertising and click-through fees. Rather than make trade-offs, they have rushed to offer every conceivable product or service. It did not have to be this way--and it does not have to be in the future. When it comes to reinforcing a distinctive strategy, Michael Porter argues, the Internet provides a better technological platform than previous generations of IT. Gaining competitive advantage does not require a radically new approach to business; it requires building on the proven principles of effective strategy. Porter argues that, contrary to recent thought, the Internet is not disruptive to most existing industries and established companies. It rarely nullifies important sources of competitive advantage in an industry; it often makes them even more valuable. And as all companies embrace Internet technology, the Internet itself will be neutralized as a source of advantage. Robust competitive advantages will arise instead from traditional strengths such as unique products, proprietary content, and distinctive physical activities. Internet technology may be able to fortify those advantages, but it is unlikely to supplant them. Porter debunks such Internet myths as first-mover advantage, the power of virtual companies, and the multiplying rewards of network effects. He disentangles the distorted signals from the marketplace, explains why the Internet complements rather than cannibalizes existing ways of doing business, and outlines strategic imperatives for dot-coms and traditional companies.

3,558 citations


Journal ArticleDOI
TL;DR: An attitudinal model is developed and empirically tested that integrates constructs from technology acceptance research with constructs derived from models of Web behavior; results across two distinct categories of the interactive shopping environment support the differential importance of immersive, hedonic aspects of the new media as well as the more traditional utilitarian motivations.

2,888 citations


Book
Pippa Norris
01 Jan 2001
TL;DR: Digital Divide examines access to and use of the Internet in 179 nations worldwide, finding evidence of a democratic divide between those who do and do not use Internet resources to engage and participate in public life.
Abstract: From the Publisher: Digital Divide examines access and use of the Internet in 179 nations world-wide. A global divide is evident between industrialized and developing societies. A social divide is apparent between rich and poor within each nation. Within the online community, evidence for a democratic divide is emerging between those who do and do not use Internet resources to engage and participate in public life. Part I outlines the theoretical debate between cyber-optimists, who see the Internet as the great leveler, and cyber-skeptics, who expect it to reinforce existing inequalities. Part II examines the virtual political system and the way that representative institutions have responded to new opportunities on the Internet. Part III analyzes how the public has responded to these opportunities in Europe and the United States and develops the civic engagement model to explain patterns of participation via the Internet.

2,824 citations


Journal ArticleDOI
01 Nov 2001
TL;DR: The Internet is becoming the essential communication and information medium in our society, standing alongside electricity and the printing press as one of the greatest innovations of all time. The Internet Galaxy offers an illuminating look at how this new technology will influence business, the economy, and our daily lives.
Abstract: From the Publisher: Manuel Castells is one of the world's leading thinkers on the new information age, hailed by The Economist as "the first significant philosopher of cyberspace," and by Christian Science Monitor as "a pioneer who has hacked out a logical, well-documented, and coherent picture of early 21st century civilization, even as it rockets forward largely in a blur." Now, in The Internet Galaxy, this brilliantly insightful writer speculates on how the Internet will change our lives. Castells believes that we are "entering, full speed, the Internet Galaxy, in the midst of informed bewilderment." His aim in this exciting and profound work is to help us to understand how the Internet came into being, and how it is affecting every area of human life--from work, politics, planning and development, media, and privacy, to our social interaction and life in the home. We are at ground zero of the new network society. In this book, its major commentator reveals the Internet's huge capacity to liberate, but also its ability to marginalize and exclude those who do not have access to it. Castells provides no glib solutions, but asks us all to take responsibility for the future of this new information age. The Internet is becoming the essential communication and information medium in our society, and stands alongside electricity and the printing press as one of the greatest innovations of all time. The Internet Galaxy offers an illuminating look at how this new technology will influence business, the economy, and our daily lives.

2,424 citations


Journal ArticleDOI
R.A. Davis
TL;DR: A cognitive-behavioral model of Pathological Internet Use (PIU) is introduced, which implies a more important role of cognitions in PIU, describes the means by which PIU is both developed and maintained, and provides a framework for the development of cognitive-behavioral interventions for PIU.

2,200 citations


Proceedings ArticleDOI
10 Dec 2001
TL;DR: This measurement study seeks to precisely characterize the population of end-user hosts that participate in Napster and Gnutella, and shows that there is significant heterogeneity and lack of cooperation across peers participating in these systems.
Abstract: The popularity of peer-to-peer multimedia file sharing applications such as Gnutella and Napster has created a flurry of recent research activity into peer-to-peer architectures. We believe that the proper evaluation of a peer-to-peer system must take into account the characteristics of the peers that choose to participate. Surprisingly, however, few of the peer-to-peer architectures currently being developed are evaluated with respect to such considerations. In this paper, we remedy this situation by performing a detailed measurement study of the two popular peer-to-peer file sharing systems, namely Napster and Gnutella. In particular, our measurement study seeks to precisely characterize the population of end-user hosts that participate in these two systems. This characterization includes the bottleneck bandwidths between these hosts and the Internet at large, IP-level latencies to send packets to these hosts, how often hosts connect and disconnect from the system, how many files hosts share and download, the degree of cooperation between the hosts, and several correlations between these characteristics. Our measurements show that there is significant heterogeneity and lack of cooperation across peers participating in these systems.
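
To give a flavor of the active measurements involved, the sketch below (an assumed approach, not the authors' tooling) estimates IP-level latency to peer hosts by timing TCP connection setup; the peer addresses are documentation-range placeholders, and 6346 is the conventional Gnutella port.

```python
import socket
import time

def connect_latency(host, port, timeout=2.0):
    """Rough RTT estimate: time to complete a TCP handshake."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None  # peer unreachable or not listening

# Hypothetical peer list; a real study would harvest these from the overlay.
peers = [("192.0.2.10", 6346), ("198.51.100.7", 6346)]
for host, port in peers:
    rtt = connect_latency(host, port)
    print(host, "unreachable" if rtt is None else f"{rtt * 1000:.1f} ms")
```

Repeating such probes over days is what yields the uptime and latency distributions the paper reports.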

2,189 citations


Book
01 Jan 2001
TL;DR: Intended for use in a senior/graduate level distributed systems course or by professionals, this text systematically shows how distributed systems are designed and implemented in real systems.
Abstract: From the Publisher: Andrew Tanenbaum and Maarten van Steen cover the principles, advanced concepts, and technologies of distributed systems in detail, including: communication, replication, fault tolerance, and security. Intended for use in a senior/graduate level distributed systems course or by professionals, this text systematically shows how distributed systems are designed and implemented in real systems. Written in the superb writing style of other Tanenbaum books, the material also features unique accessibility and a wide variety of real-world examples and case studies, such as NFS v4, CORBA, DOM, Jini, and the World Wide Web. Features: Detailed coverage of seven key principles, with an introductory chapter followed by a chapter devoted to each key principle (communication, processes, naming, synchronization, consistency and replication, fault tolerance, and security), including unique comprehensive coverage of middleware models. Four chapters devoted to state-of-the-art real-world examples of middleware, covering object-based systems, document-based systems, distributed file systems, and coordination-based systems, including CORBA, DCOM, Globe, NFS v4, Coda, the World Wide Web, and Jini. Excellent coverage of timely, advanced distributed systems topics: security, payment systems, recent Internet and Web protocols, scalability, and caching and replication. New: the Prentice Hall Companion Website for this book contains PowerPoint slides, figures in various file formats, other teaching aids, and a link to the author's Web site.

2,011 citations


Proceedings ArticleDOI
21 Oct 2001
TL;DR: It is found that forwarding packets via at most one intermediate RON node is sufficient to overcome faults and improve performance in most cases, demonstrating the benefits of moving some of the control over routing into the hands of end-systems.
Abstract: A Resilient Overlay Network (RON) is an architecture that allows distributed Internet applications to detect and recover from path outages and periods of degraded performance within several seconds, improving over today's wide-area routing protocols that take at least several minutes to recover. A RON is an application-layer overlay on top of the existing Internet routing substrate. The RON nodes monitor the functioning and quality of the Internet paths among themselves, and use this information to decide whether to route packets directly over the Internet or by way of other RON nodes, optimizing application-specific routing metrics. Results from two sets of measurements of a working RON deployed at sites scattered across the Internet demonstrate the benefits of our architecture. For instance, over a 64-hour sampling period in March 2001 across a twelve-node RON, there were 32 significant outages, each lasting over thirty minutes, over the 132 measured paths. RON's routing mechanism was able to detect, recover, and route around all of them, in less than twenty seconds on average, showing that its methods for fault detection and recovery work well at discovering alternate paths in the Internet. Furthermore, RON was able to improve the loss rate, latency, or throughput perceived by data transfers; for example, about 5% of the transfers doubled their TCP throughput and 5% of our transfers saw their loss probability reduced by 0.05. We found that forwarding packets via at most one intermediate RON node is sufficient to overcome faults and improve performance in most cases. These improvements, particularly in the area of fault detection and recovery, demonstrate the benefits of moving some of the control over routing into the hands of end-systems.
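
The core routing decision described above, the direct Internet path versus a single intermediate RON node, can be sketched as follows; this is an illustrative reconstruction under assumed data structures, not RON's implementation.

```python
INF = float("inf")

def best_one_hop(latency, src, dst):
    """Choose direct or one-intermediate-hop forwarding, whichever is faster.

    latency[a][b] is the measured delay from node a to node b
    (INF if the path is currently failed)."""
    best_path, best_cost = (src, dst), latency[src][dst]
    for mid in latency:
        if mid in (src, dst):
            continue
        cost = latency[src][mid] + latency[mid][dst]
        if cost < best_cost:
            best_path, best_cost = (src, mid, dst), cost
    return best_path, best_cost

# Toy measurements: the direct A->C path has failed, but A->B->C works.
latency = {
    "A": {"A": 0, "B": 20, "C": INF},
    "B": {"A": 20, "B": 0, "C": 15},
    "C": {"A": INF, "B": 15, "C": 0},
}
print(best_one_hop(latency, "A", "C"))  # (('A', 'B', 'C'), 35)
```

The same comparison works with loss rate or throughput as the metric, which is the "application-specific routing metrics" point in the abstract.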

1,968 citations


Journal ArticleDOI
TL;DR: The findings indicate that merchant integrity is a major positive determinant of consumer trust in Internet shopping, and that its effect is moderated by the individual consumer's trust propensity.
Abstract: E-commerce success, especially in the business-to-consumer area, is determined in part by whether consumers trust sellers and products they cannot see or touch, and electronic systems with which they have no previous experience. This paper describes a theoretical model for investigating the four main antecedent influences on consumer trust in Internet shopping, a major form of business-to-consumer e-commerce: trustworthiness of the Internet merchant, trustworthiness of the Internet as a shopping medium, infrastructural (contextual) factors (e.g., security, third-party certification), and other factors (e.g., company size, demographic variables). The antecedent variables are moderated by the individual consumer's degree of trust propensity, which reflects personality traits, culture, and experience. Based on the research model, a comprehensive set of hypotheses is formulated and a methodology for testing them is outlined. Some of the hypotheses are tested empirically to demonstrate the applicability of the...
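
The moderation hypothesis, trust propensity altering the effect of antecedents such as merchant integrity, corresponds to an interaction term in a regression. A sketch with simulated data (variable names and coefficients are illustrative, not the paper's measurement items):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "integrity": rng.normal(size=n),    # perceived merchant integrity
    "propensity": rng.normal(size=n),   # individual trust propensity
})
# Simulated outcome: integrity matters more for high-propensity consumers.
df["trust"] = (0.5 * df.integrity + 0.2 * df.propensity
               + 0.3 * df.integrity * df.propensity
               + rng.normal(scale=0.5, size=n))

# 'integrity * propensity' expands to both main effects plus the
# interaction; a significant interaction coefficient is the moderation.
model = smf.ols("trust ~ integrity * propensity", data=df).fit()
print(model.params)
```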

1,941 citations


Journal ArticleDOI
TL;DR: Everybody talks about e-health these days, but few people have come up with a clear definition of this comparatively new term, which was apparently first used by industry leaders and marketing people rather than academics.
Abstract: Everybody talks about e-health these days, but few people have come up with a clear definition of this comparatively new term. Barely in use before 1999, this term now seems to serve as a general "buzzword," used to characterize not only "Internet medicine", but also virtually everything related to computers and medicine. The term was apparently first used by industry leaders and marketing people rather than academics. They created and used this term in line with other "e-words" such as e-commerce, e-business, e-solutions, and so on, in an attempt to convey the promises, principles, excitement (and hype) around e-commerce (electronic commerce) to the health arena, and to give an account of the new possibilities the Internet is opening up to the area of health care. Intel, for example, referred to e-health as "a concerted effort undertaken by leaders in health care and hi-tech industries to fully harness the benefits available through convergence of the Internet and health care." Because the Internet created new opportunities and challenges to the traditional health care information technology industry, the use of a new term to address these issues seemed appropriate. These "new" challenges for the health care information technology industry were mainly (1) the capability of consumers to interact with their systems online (B2C = "business to consumer"); (2) improved possibilities for institution-to-institution transmissions of data (B2B = "business to business"); (3) new possibilities for peer-to-peer communication of consumers (C2C = "consumer to consumer").

Journal ArticleDOI
TL;DR: The authors found that heavy Internet use is associated with increased participation in voluntary organizations and politics, and that people's interaction online supplements their face-to-face and telephone communication without increasing or decreasing it.
Abstract: How does the Internet affect social capital? Do the communication possibilities of the Internet increase, decrease, or supplement interpersonal contact, participation, and community commitment? This evidence comes from a 1998 survey of 39,211 visitors to the National Geographic Society Web site, one of the first large-scale Web surveys. The authors find that people's interaction online supplements their face-to-face and telephone communication without increasing or decreasing it. However, heavy Internet use is associated with increased participation in voluntary organizations and politics. Further support for this effect is the positive association between offline and online participation in voluntary organizations and politics. However, the effects of the Internet are not only positive: The heaviest users of the Internet are the least committed to online community. Taken together, this evidence suggests that the Internet is becoming normalized as it is incorporated into the routine practices of everyday ...

Journal ArticleDOI
TL;DR: The Internet is a critically important research site for sociologists testing theories of technology diffusion and media effects, particularly because it is a medium uniquely capable of integrating modes of communication and forms of content.
Abstract: The Internet is a critically important research site for sociologists testing theories of technology diffusion and media effects, particularly because it is a medium uniquely capable of integrating modes of communication and forms of content. Current research tends to focus on the Internet's implications in five domains: 1) inequality (the “digital divide”); 2) community and social capital; 3) political participation; 4) organizations and other economic institutions; and 5) cultural participation and cultural diversity. A recurrent theme across domains is that the Internet tends to complement rather than displace existing media and patterns of behavior. Thus in each domain, utopian claims and dystopic warnings based on extrapolations from technical possibilities have given way to more nuanced and circumscribed understandings of how Internet use adapts to existing patterns, permits certain innovations, and reinforces particular kinds of change. Moreover, in each domain the ultimate social implications of t...

Book
David Crystal
01 Jan 2001
TL;DR: Covering a range of Internet genres, including e-mail, chat, and the Web, this is a revealing account of how the Internet is radically changing the way we use language.
Abstract: In recent years, the Internet has come to dominate our lives. E-mail, instant messaging and chat are rapidly replacing conventional forms of correspondence, and the Web has become the first port of call for both information enquiry and leisure activity. How is this affecting language? There is a widespread view that as 'technospeak' comes to rule, standards will be lost. In this book, David Crystal argues the reverse: that the Internet has encouraged a dramatic expansion in the variety and creativity of language. Covering a range of Internet genres, including e-mail, chat, and the Web, this is a revealing account of how the Internet is radically changing the way we use language. This second edition has been thoroughly updated to account for more recent phenomena, with a brand new chapter on blogging and instant messaging. Engaging and accessible, it will continue to fascinate anyone who has ever used the Internet.

Book
06 Jul 2001
TL;DR: The book presents network calculus and its application to the Internet, covering basic min-plus and max-plus calculus, optimal multimedia smoothing, and adaptive and packet scale rate guarantees.
Abstract: Contents: Network Calculus; Application of Network Calculus to the Internet; Basic Min-plus and Max-plus Calculus; Min-plus and Max-plus System Theory; Optimal Multimedia Smoothing; FIFO Systems and Aggregate Scheduling; Adaptive and Packet Scale Rate Guarantees; Time Varying Shapers; Systems with Losses.
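
The book's central operation is min-plus convolution, (f ⊗ g)(t) = inf over 0 ≤ s ≤ t of f(s) + g(t − s). The sketch below (my discretization, not code from the book) computes it for sampled curves, plus the standard delay bound as the horizontal deviation between an arrival curve and a service curve.

```python
def min_plus_conv(f, g):
    """Discrete min-plus convolution: (f (x) g)(t) = min_{0<=s<=t} f(s) + g(t-s)."""
    n = min(len(f), len(g))
    return [min(f[s] + g[t - s] for s in range(t + 1)) for t in range(n)]

def delay_bound(alpha, beta):
    """Horizontal deviation h(alpha, beta): worst-case delay for traffic with
    arrival curve alpha crossing a server guaranteeing service curve beta."""
    worst = 0
    for t in range(len(alpha)):
        d = 0
        while t + d < len(beta) and beta[t + d] < alpha[t]:
            d += 1
        worst = max(worst, d)
    return worst

# Token-bucket arrival curve alpha(t) = b + r*t and rate-latency service
# curve beta(t) = R * max(0, t - T); parameters are illustrative.
b, r, R, T, horizon = 5, 1, 2, 3, 40
alpha = [b + r * t for t in range(horizon)]
beta = [R * max(0, t - T) for t in range(horizon)]
print(delay_bound(alpha, beta))          # 6; the closed form is T + b/R = 5.5

# Concatenation: two servers in tandem offer beta1 (x) beta2 as a whole.
beta2 = [3 * max(0, t - 2) for t in range(horizon)]
print(min_plus_conv(beta, beta2)[:8])    # [0, 0, 0, 0, 0, 0, 2, 4]
```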

Journal ArticleDOI
TL;DR: In an experiment, consumers were instructed to gather online information about one of five specific product topics by accessing either online discussions or marketer-generated online information; consumers who gathered information from online discussions reported greater interest in the product topic.

Journal ArticleDOI
TL;DR: SIENA, an event notification service that is designed and implemented to exhibit both expressiveness and scalability, is presented and the service's interface to applications, the algorithms used by networks of servers to select and deliver event notifications, and the strategies used to optimize performance are described.
Abstract: The components of a loosely coupled system are typically designed to operate by generating and responding to asynchronous events. An event notification service is an application-independent infrastructure that supports the construction of event-based systems, whereby generators of events publish event notifications to the infrastructure and consumers of events subscribe with the infrastructure to receive relevant notifications. The two primary services that should be provided to components by the infrastructure are notification selection (i. e., determining which notifications match which subscriptions) and notification delivery (i.e., routing matching notifications from publishers to subscribers). Numerous event notification services have been developed for local-area networks, generally based on a centralized server to select and deliver event notifications. Therefore, they suffer from an inherent inability to scale to wide-area networks, such as the Internet, where the number and physical distribution of the service's clients can quickly overwhelm a centralized solution. The critical challenge in the setting of a wide-area network is to maximize the expressiveness in the selection mechanism without sacrificing scalability in the delivery mechanism. This paper presents SIENA, an event notification service that we have designed and implemented to exhibit both expressiveness and scalability. We describe the service's interface to applications, the algorithms used by networks of servers to select and deliver event notifications, and the strategies used to optimize performance. We also present results of simulation studies that examine the scalability and performance of the service.
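
The notification-selection step, matching content-based subscriptions against notifications, can be illustrated with a deliberately centralized sketch; SIENA's contribution is distributing this across a network of servers, which the toy code below does not attempt.

```python
from typing import Any, Callable

# A subscription is a conjunction of per-attribute predicates.
Subscription = dict[str, Callable[[Any], bool]]

def matches(notification: dict[str, Any], sub: Subscription) -> bool:
    """A notification matches if every constrained attribute satisfies
    its predicate (missing attributes fail the filter)."""
    return all(k in notification and pred(notification[k])
               for k, pred in sub.items())

subscriptions = {
    "alice": {"type": lambda v: v == "stock-quote",
              "price": lambda v: v > 100},
    "bob":   {"type": lambda v: v == "stock-quote"},
}

event = {"type": "stock-quote", "symbol": "XYZ", "price": 120}
recipients = [who for who, sub in subscriptions.items() if matches(event, sub)]
print(recipients)  # ['alice', 'bob']; the delivery step would route to these
```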

Journal ArticleDOI
TL;DR: The results indicate that, in the context of IT basic skills training in undergraduate education, there are no significant differences in performance between students enrolled in the two environments, however, the VLE leads to higher reported computer self-efficacy, while participants report being less satisfied with the learning process.
Abstract: Internet technologies are having a significant impact on the learning industry. For-profit organizations and traditional institutions of higher education have developed and are using web-based courses, but little is known about their effectiveness compared to traditional classroom education. Our work focuses on the effectiveness of a web-based virtual learning environment (VLE) in the context of basic information technology skills training. This article provides three main contributions. First, it introduces and defines the concept of VLE, discussing how a VLE differs from the traditional classroom and differentiating it from the related, but narrower, concept of computer aided instruction (CAI). Second, it presents a framework of VLE effectiveness, grounded in the technology-mediated learning literature, which frames the VLE research domain, and addresses the relationship between the main constructs. Finally, it focuses on one essential VLE design variable, learner control, and compares a web-based VLE to a traditional classroom through a longitudinal experimental design. Our results indicate that, in the context of IT basic skills training in undergraduate education, there are no significant differences in performance between students enrolled in the two environments. However, the VLE leads to higher reported computer self-efficacy, while participants report being less satisfied with the learning process.

Book
01 Jan 2001
TL;DR: This second edition systematically introduces the notion of ontologies to the non-expert reader and demonstrates in detail how to apply this conceptual framework for improved intranet retrieval of corporate information and knowledge and for enhanced Internet-based electronic commerce.
Abstract: This second edition systematically introduces the notion of ontologies to the non-expert reader and demonstrates in detail how to apply this conceptual framework for improved intranet retrieval of corporate information and knowledge and for enhanced Internet-based electronic commerce. The author also describes ontology languages (XML, RDF, and OWL) and ontology tools, and the application of ontologies. In addition to structural improvements, the second edition covers recent developments relating to the Semantic Web, and emerging web-based standard languages.
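
To make the mention of ontology languages concrete, here is a minimal RDF sketch using the rdflib library; the library choice and the tiny product vocabulary are my assumptions, not the book's examples.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/catalog#")
g = Graph()
g.bind("ex", EX)

# A tiny product ontology: a class, a subclass, and one typed instance,
# the kind of structure used for intranet retrieval and e-commerce catalogs.
g.add((EX.Product, RDF.type, RDFS.Class))
g.add((EX.Book, RDFS.subClassOf, EX.Product))
g.add((EX.item42, RDF.type, EX.Book))
g.add((EX.item42, RDFS.label, Literal("Ontologies, 2nd edition")))

print(g.serialize(format="turtle"))
```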

Proceedings Article
13 Aug 2001
TL;DR: A new technique, called "backscatter analysis," is presented that provides a conservative estimate of worldwide denial-of-service activity; the authors believe theirs is the first work to provide quantitative estimates of Internet-wide denial-of-service activity.
Abstract: In this paper, we seek to answer a simple question: "How prevalent are denial-of-service attacks in the Internet today?". Our motivation is to understand quantitatively the nature of the current threat as well as to enable longer-term analyses of trends and recurring patterns of attacks. We present a new technique, called "backscatter analysis", that provides an estimate of worldwide denial-of-service activity. We use this approach on three week-long datasets to assess the number, duration and focus of attacks, and to characterize their behavior. During this period, we observe more than 12,000 attacks against more than 5,000 distinct targets, ranging from well known e-commerce companies such as Amazon and Hotmail to small foreign ISPs and dial-up connections. We believe that our work is the only publicly available data quantifying denial-of-service activity in the Internet.
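
The arithmetic behind backscatter analysis is straightforward: if attackers spoof source addresses uniformly at random, a monitor covering a fraction of the IPv4 space observes that same fraction of each victim's backscatter. A sketch with a made-up observed rate (the /8-sized telescope is an illustrative monitor size):

```python
ADDRESS_SPACE = 2 ** 32   # total IPv4 addresses
monitored = 2 ** 24       # an assumed /8 monitored block (1/256 of the space)

def estimate_attack_rate(observed_pps: float) -> float:
    """Scale backscatter seen at the monitor up to the whole attack.

    With uniformly spoofed sources, the expected observed rate is
    actual_rate * monitored / ADDRESS_SPACE, so invert that ratio."""
    return observed_pps * ADDRESS_SPACE / monitored

# Hypothetical: 500 unsolicited response packets/sec aimed at one victim.
print(estimate_attack_rate(500.0))  # -> 128000.0 packets/sec estimate
```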

01 Sep 2001
TL;DR: This memo specifies the incorporation of ECN (Explicit Congestion Notification) to TCP and IP, including ECN's use of two bits in the IP header.
Abstract: This memo specifies the incorporation of ECN (Explicit Congestion Notification) to TCP and IP, including ECN's use of two bits in the IP header.
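
The two bits in question form the ECN field in the former IP TOS byte, with the four codepoints defined by RFC 3168. A small sketch of reading and marking them (the helper names are mine):

```python
# ECN codepoints from RFC 3168: the low two bits of the IP TOS byte.
NOT_ECT = 0b00  # not ECN-capable transport
ECT_1   = 0b01  # ECN-capable transport, codepoint 1
ECT_0   = 0b10  # ECN-capable transport, codepoint 0
CE      = 0b11  # congestion experienced (set by a congested router)

def ecn_bits(tos_byte: int) -> int:
    return tos_byte & 0b11

def mark_congestion(tos_byte: int) -> int:
    """A router marks instead of dropping only if the packet is ECN-capable."""
    if ecn_bits(tos_byte) in (ECT_0, ECT_1):
        return (tos_byte & ~0b11) | CE
    return tos_byte  # Not-ECT: the router must fall back to dropping

tos = 0b000000_10  # DSCP 0, ECT(0)
print(bin(mark_congestion(tos)))  # low two bits become 11: congestion experienced
```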

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the topological and dynamical properties of real Internet maps in a three-year time interval and find that the Internet is characterized by nontrivial correlations among nodes and different dynamical regimes.
Abstract: The description of the Internet topology is an important open problem, recently tackled with the introduction of scale-free networks. We focus on the topological and dynamical properties of real Internet maps in a three-year time interval. We study higher order correlation functions as well as the dynamics of several quantities. We find that the Internet is characterized by nontrivial correlations among nodes and different dynamical regimes. We point out the importance of node hierarchy and aging in the Internet structure and growth. Our results provide hints towards the realistic modeling of the Internet evolution. Complex networks play an important role in the understanding of many natural systems (1,2). A network is a set of nodes and links, representing individuals and the interactions among them, respectively. Despite this simple definition, growing networks can exhibit a high degree of complexity, due to the inherent wiring entanglement occurring during their growth. The Internet is a capital example of a growing network with technological and economical relevance; however, the collection of router-level maps of the Internet has received the attention of the research community only very recently (3-5). The statistical analysis performed so far has revealed that the Internet exhibits several nontrivial topological properties (wiring redundancy, clustering, etc.). Among them, the presence of a power-law connectivity distribution (6,7) makes the Internet an example of the recently identified class of scale-free networks (8). In this Letter, we focus on the dynamical properties of the Internet. We shall consider the evolution of real Internet maps from 1997 to 2000, collected by the National Laboratory for Applied Network Research (NLANR) (3). In particular, we will inspect the correlation properties of nodes' connectivity, as well as the time behavior of several quantities related to the growth dynamics of new nodes. Our analysis shows dynamical behavior with different growth regimes depending on the node's age and connectivity. The analysis points out two distinct wiring processes: the first one concerns newly added nodes, while the second is related to already existing nodes increasing their interconnections. A feature introduced in this paper refers to the Internet hierarchical structure, reflected in a nontrivial scale-free connectivity correlation function. Finally, we discuss recent models for the generation of scale-free networks in the light of the present analysis of real Internet maps. The results presented in this Letter could help develop more accurate models of the Internet.
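
The "nontrivial correlations among nodes" are commonly summarized by the average nearest-neighbor degree k_nn(k). A sketch of that measurement with networkx, using a synthetic scale-free graph as a stand-in for a real NLANR map:

```python
from collections import defaultdict

import networkx as nx

# Synthetic scale-free graph as a stand-in for a router-level Internet map.
G = nx.barabasi_albert_graph(n=2000, m=2, seed=42)

# k_nn(k): average degree of the neighbors of degree-k nodes.
knn_per_node = nx.average_neighbor_degree(G)
by_degree = defaultdict(list)
for node, knn in knn_per_node.items():
    by_degree[G.degree(node)].append(knn)

for k in sorted(by_degree)[:5]:
    print(k, sum(by_degree[k]) / len(by_degree[k]))
# A k_nn(k) that decreases with k signals hierarchical, disassortative
# wiring of the kind the paper reports for the Internet.
```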

Proceedings ArticleDOI
15 Aug 2001
TL;DR: The goal is to produce a topology generation framework which improves the state of the art and is based on the design principles of representativeness, inclusiveness, and interoperability.
Abstract: Effective engineering of the Internet is predicated upon a detailed understanding of issues such as the large-scale structure of its underlying physical topology, the manner in which it evolves over time, and the way in which its constituent components contribute to its overall function. Unfortunately, developing a deep understanding of these issues has proven to be a challenging task, since it in turn involves solving difficult problems such as mapping the actual topology, characterizing it, and developing models that capture its emergent behavior. Consequently, even though there are a number of topology models, it is an open question as to how representative the topologies they generate are of the actual Internet. Our goal is to produce a topology generation framework which improves the state of the art and is based on the design principles of representativeness, inclusiveness, and interoperability. Representativeness leads to synthetic topologies that accurately reflect many aspects of the actual Internet topology (e.g. hierarchical structure, node degree distribution, etc.). Inclusiveness combines the strengths of as many generation models as possible in a single generation tool. Interoperability provides interfaces to widely-used simulation applications such as ns and SSF and visualization tools like otter. We call such a tool a universal topology generator.
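
As a taste of what "inclusiveness" across generation models means, the sketch below produces topologies from two classic models that such a framework would include (networkx stand-ins, not the authors' generator):

```python
import networkx as nx

# Two classic generation models a universal generator would include:
# Waxman's geometric model and Barabasi-Albert preferential attachment.
waxman = nx.waxman_graph(500, beta=0.4, alpha=0.1, seed=7)
ba = nx.barabasi_albert_graph(500, m=2, seed=7)

for name, g in [("waxman", waxman), ("barabasi-albert", ba)]:
    degs = sorted((d for _, d in g.degree()), reverse=True)
    print(f"{name}: edges={g.number_of_edges()}, top degrees={degs[:5]}")
# Preferential attachment reproduces the heavy-tailed degree distribution
# seen in real Internet maps; the geometric model does not.
```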

Patent
28 Mar 2001
TL;DR: In this paper, a flowchart-based approach is used to build a logical structure for a customer relationship management (CRM) system, which comprises an ordered set of questions and branching logic that are presented to a customer of the business when the customer contacts the business with an inquiry.
Abstract: A flowchart-based tool can be used to build a logical structure. In the context of a customer relationship management (CRM) system, the logical structure can comprise an ordered set of questions and branching logic that are presented to a customer of the business when the customer contacts the business with an inquiry, such as a sales or service inquiry or other interaction. An engine can run a session associated with the logical structure, with the session presenting questions, text, graphics, and the like dynamically to the customer across a network, such as the Internet and a web site. Branching logic determines the appropriate information to present to the user based on answers to previous questions. The engine allows presentation of the information to the user/customer by generating hypertext markup language (HTML) files to display the questions or other elements of the logical structure as part of a user interface on a client terminal of the customer.
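
The patent's ordered questions and branching logic amount to a decision graph keyed by customer answers. A minimal sketch of such a session engine (structure and names are illustrative, not the patented implementation):

```python
# Each node holds a question and maps answers to the next node id.
FLOW = {
    "start": {"question": "Is this a sales or service inquiry?",
              "branches": {"sales": "budget", "service": "product"}},
    "budget": {"question": "What is your budget range?", "branches": {}},
    "product": {"question": "Which product needs service?", "branches": {}},
}

def run_session(flow, answers):
    """Walk the flowchart, yielding each question shown to the customer."""
    node_id = "start"
    for answer in answers:
        node = flow[node_id]
        yield node["question"]
        node_id = node["branches"].get(answer)
        if node_id is None:
            break
    else:
        yield flow[node_id]["question"]  # terminal question, no branches left

for q in run_session(FLOW, ["service"]):
    print(q)  # the patented system would render these as HTML form pages
```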

Journal ArticleDOI
23 May 2001-JAMA
TL;DR: Accessing health information using search engines and simple search terms is not efficient, and coverage of key information on English- and Spanish-language Web sites is poor and inconsistent, although the accuracy of the information provided is generally good.
Abstract: Context: Despite the substantial amount of health-related information available on the Internet, little is known about the accessibility, quality, and reading grade level of that health information. Objective: To evaluate health information on breast cancer, depression, obesity, and childhood asthma available through English- and Spanish-language search engines and Web sites. Design and Setting: Three unique studies were performed from July 2000 through December 2000. Accessibility of 14 search engines was assessed using a structured search experiment. Quality of 25 health Web sites and content provided by 1 search engine was evaluated by 34 physicians using structured implicit review (interrater reliability >0.90). The reading grade level of text selected for structured implicit review was established using the Fry Readability Graph method. Main Outcome Measures: For the accessibility study, proportion of links leading to relevant content; for quality, coverage and accuracy of key clinical elements; and grade level reading formulas. Results: Less than one quarter of the search engine's first pages of links led to relevant content (20% of English and 12% of Spanish). On average, 45% of the clinical elements on English- and 22% on Spanish-language Web sites were more than minimally covered and completely accurate and 24% of the clinical elements on English- and 53% on Spanish-language Web sites were not covered at all. All English and 86% of Spanish Web sites required high school level or greater reading ability. Conclusion: Accessing health information using search engines and simple search terms is not efficient. Coverage of key information on English- and Spanish-language Web sites is poor and inconsistent, although the accuracy of the information provided is generally good. High reading levels are required to comprehend Web-based health information.
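
The Fry method locates a text on a graph by two coordinates: average sentences and average syllables per 100-word sample. The sketch below computes those coordinates with a crude vowel-run syllable heuristic (the heuristic and sampling are my simplifications; the graph lookup itself is omitted):

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: runs of vowels approximate syllables."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fry_coordinates(text: str, sample_size: int = 100):
    """Average syllables and sentences per 100 words: the two axes of
    the Fry Readability Graph."""
    words = re.findall(r"[A-Za-z']+", text)[:sample_size]
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    syllables = sum(count_syllables(w) for w in words)
    scale = 100 / len(words)
    return syllables * scale, sentences * scale

text = ("Asthma is a disease of the airways. Inhaled steroids reduce "
        "inflammation. Take the medicine every day, even when well.")
print(fry_coordinates(text))  # more syllables per 100 words -> harder text
```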

Journal ArticleDOI
TL;DR: The article contains a short summary of the design of this project, a review of main indicators regarding ICT in elementary and lower secondary schools, main obstacles, and an exploration of the co-variation between obstacles and contextual factors at the country level.
Abstract: The main focus of this article is on the perceptions of educational practitioners (at the lower secondary level) regarding obstacles that seriously impede the realization of ICT-related goals of schools. The results are from a worldwide survey among national representative samples of schools from 26 countries. The article contains a short summary of the design of this project, a review of main indicators regarding ICT (Information and Communication Technologies) in elementary and lower secondary schools, main obstacles and an exploration of the co-variation between obstacles and contextual factors at the country-level.

Journal ArticleDOI
TL;DR: It is found that most people use few search terms, submit few modified queries, view few Web pages, and rarely use advanced search features, and that the language of Web queries is distinctive.
Abstract: In studying actual Web searching by the public at large, we analyzed over one million Web queries by users of the Excite search engine. We found that most people use few search terms, submit few modified queries, view few Web pages, and rarely use advanced search features. A small number of search terms are used with high frequency, and a great many terms are unique; the language of Web queries is distinctive. Queries about recreation and entertainment rank highest. Findings are compared to data from two other large studies of Web queries. This study provides an insight into public practices and choices in Web searching.
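
Findings like "a small number of search terms are used with high frequency, and a great many terms are unique" fall out of simple frequency analysis over a query log; a sketch on a toy log (the Excite data is not reproduced here):

```python
from collections import Counter

# Toy stand-in for an Excite-style query log, one query per line.
queries = [
    "yellowstone national park",
    "free mp3",
    "mp3",
    "weather boston",
    "free screensavers",
]

terms = Counter(t for q in queries for t in q.split())
total = sum(terms.values())
unique_once = sum(1 for c in terms.values() if c == 1)

print("mean terms per query:", total / len(queries))       # 'few search terms'
print("most common:", terms.most_common(3))
print(f"terms appearing once: {unique_once}/{len(terms)}")  # 'a great many unique'
```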

Book
01 Jan 2001
TL;DR: Key peer-to-peer pioneers take us beyond the headlines and hype and show how the technology is changing the way we communicate and exchange information.
Abstract: From the Publisher: Upstart software projects Napster, Gnutella, and Freenet have dominated newspaper headlines, challenging traditional approaches to content distribution with their revolutionary use of peer-to-peer file-sharing technologies. Reporters try to sort out the ramifications of seemingly ungoverned peer-to-peer networks. Lawyers, business leaders, and social commentators debate the virtues and evils of these bold new distributed systems. But what's really behind such disruptive technologies -- the breakthrough innovations that have rocked the music and media worlds? And what lies ahead? In this book, key peer-to-peer pioneers take us beyond the headlines and hype and show how the technology is changing the way we communicate and exchange information. Those working to advance peer-to-peer as a technology, a business opportunity, and an investment offer their insights into how the technology has evolved and where it's going. They explore the problems they've faced, the solutions they've discovered, the lessons they've learned, and their goals for the future of computer networking. Until now, Internet communities have been limited by the flat interactive qualities of email and network newsgroups, where people can exchange recommendations and ideas but have great difficulty commenting on one another's postings, structuring information, performing searches, and creating summaries. Peer-to-peer challenges the traditional authority of the client/server model, allowing shared information to reside instead with producers and users. Peer-to-peer networks empower users to collaborate on producing and consuming information, adding to it, commenting on it, and building communities around it. This compilation represents the collected wisdom of today's peer-to-peer luminaries. It includes contributions from Gnutella's Gene Kan, Freenet's Brandon Wiley, Jabber's Jeremie Miller, and many others -- plus serious discussions of topics ranging from accountability and trust to security and performance. Fraught with questions and promise, peer-to-peer is sure to remain on the computer industry's center stage for years to come.

Journal ArticleDOI
TL;DR: The authors explore risk perceptions among consumers of varying levels of Internet experience and how these perceptions relate to online shopping activity, examining whether greater Internet experience is associated with higher or lower perceived risks regarding the privacy and security of online shopping.
Abstract: Government and industry organizations have declared information privacy and security to be major obstacles in the development of consumer-related e-commerce. Risk perceptions regarding Internet privacy and security have been identified as issues for both new and experienced users of Internet technology. This paper explores risk perceptions among consumers of varying levels of Internet experience and how these perceptions relate to online shopping activity. Findings provide evidence of hypothesized relationships among consumers' levels of Internet experience, the use of alternate remote purchasing methods (such as telephone and mail-order shopping), the perceived risks of online shopping, and online purchasing activity. Implications for online commerce and consumer welfare are discussed. The Internet has grown considerably during the past decade, particularly with respect to its use as a tool for communication, entertainment, and marketplace exchange. This rapid growth has been accompanied, however, by concerns regarding the collection and dissemination of consumer information by marketers who participate in online retailing. These concerns pertain to the privacy and security of accumulated consumer data (Briones 1998; Culnan 1999) and the perceived risks that consumers may experience with respect to these issues (Ernst & Young 1999; Milne and Boza 1999; Milne 2000). Consumers' perceived risks associated with online retailing have received limited attention despite their implications for e-commerce. Although some early research suggests that risk perceptions may play a minor role in the adoption of online shopping (Jarvenpaa and Todd 1996-97), several recent industry and government-related studies (e.g., Culnan 1999; Federal Trade Commission (FTC) 1998b, 1998d, 2000) have deemed consumer risk perceptions to be a primary obstacle to the future growth of online commerce. Many involved in online retailing assume that time alone will dissolve consumer concerns regarding the privacy and security of online shopping, yet others argue that greater Internet experience and more widespread publicity of the potential risks of online shopping will lead to increased risk perceptions. To date, no known research has investigated whether higher levels of Internet experience are related to higher or lower levels of perceived risks and concerns regarding the privacy and security of online shopping. Thus, presented here are the results of a study that explores the relationships among Internet experience levels, risk perceptions, and online purchasing rates. This study begins with an examination of Internet users' concerns and perceived risk regarding online shopping. The next area to be examined is how general experience with the Internet and other more-established remote purchasing methods relates to risk perceptions and online purchase rates. Finally, implications for online retailers are discussed with consideration of policy issues surrounding privacy and security on the Internet.
PRIVACY AND SECURITY OF ONLINE CONSUMER INFORMATION
Statistics and data regarding the growth of the Internet [1] have been widely cited in the popular press. Recent accounts report that over half (52%) of American adults use the Internet, which is twice as many as in mid-1997 (Sefton 2000). Moreover, approximately half of current Internet users have purchased products or services online (Sefton 2000), with average per capita online expenditures exceeding $1,200 in 1999 (Ernst & Young 2000).
Looking toward the near future, Ernst & Young (2000) reports that 79 percent of nonbuyers plan to purchase via the Internet during the next twelve months, resulting in online sales of $45 to $50 billion. The issues of privacy and security have been labeled by government and consumer organizations as two major concerns of e-commerce (Briones 1998; CLI 1999; CNN 2000; Consumer Reports Online 1998; FTC 1998a, 2000; Folkers 1998; Judge 1998; Machrone 1998; National Consumers League 1999). …

Proceedings ArticleDOI
06 Jul 2001
TL;DR: If the Internet is the next great subject for Theoretical Computer Science to model and illuminate mathematically, then Game Theory, and Mathematical Economics more generally, are likely to prove useful tools.
Abstract: If the Internet is the next great subject for Theoretical Computer Science to model and illuminate mathematically, then Game Theory, and Mathematical Economics more generally, are likely to prove useful tools. In this talk I survey some opportunities and challenges in this important frontier.