
Proceedings ArticleDOI

Law-governed peer-to-peer auctions

07 May 2002-pp 109-116

TL;DR: A flexible architecture for the creation of Internet auctions is proposed, which allows the custom definition of the auction parameters and provides decentralized control of the auction process.

Abstract: This paper proposes a flexible architecture for the creation of Internet auctions. It allows the custom definition of the auction parameters and provides decentralized control of the auction process. Auction policies are defined as laws in the Law-Governed Interaction (LGI) paradigm. Each of these laws specifies not only the auction algorithm itself (e.g., open-cry, Dutch, etc.) but also how to handle the other parameters usually involved in online auctions, such as certification, auditing, and the treatment of complaints. LGI is used to enforce the rules established in the auction policy within the agents involved in the process. After the agents find out about the auctions, they interact via a peer-to-peer communication protocol, reducing the role of the centralized auction room to an advertising registry and taking advantage of the distributed nature of the Internet to conduct the auction. The paper presents an example of an auction law, illustrating the use of the proposed architecture.
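The decentralized enforcement described in the abstract can be sketched as a small rule object that each agent's controller evaluates locally. This is a minimal illustration, not the actual Moses/LGI interface; the class and method names (`OpenCryLaw`, `on_bid`) are assumptions made for the example.

```python
# Hypothetical sketch of an LGI-style "law" for an open-cry auction.
# Each agent's local controller runs the same rules, so no central
# auction server is needed to police the bids.

class OpenCryLaw:
    def __init__(self, min_increment=1):
        self.highest_bid = 0
        self.highest_bidder = None
        self.min_increment = min_increment

    def on_bid(self, bidder, amount):
        # Rule: forward a bid only if it beats the current best
        # by at least the minimum increment; otherwise reject it.
        if amount < self.highest_bid + self.min_increment:
            return ("reject", bidder, amount)
        self.highest_bid = amount
        self.highest_bidder = bidder
        return ("forward", bidder, amount)

law = OpenCryLaw(min_increment=5)
print(law.on_bid("alice", 100))  # ('forward', 'alice', 100)
print(law.on_bid("bob", 102))    # ('reject', 'bob', 102): below 100 + 5
print(law.on_bid("bob", 110))    # ('forward', 'bob', 110)
```

Because every controller applies the same law to the messages it sends and receives, a misbehaving agent cannot place an under-increment bid even though no central auction room mediates the exchange.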

Topics: Auction theory (77%), Forward auction (70%), Eauction (70%), Spectrum auction (69%), Generalized second-price auction (68%)



Citations



Book ChapterDOI
01 Jan 2005
TL;DR: This chapter proposes an alternative approach, allowing all of a mechanism to be formal and explicit, and presents a taxonomy of declarative rules which can be used to capture a wide variety of negotiation mechanisms in a principled and well-structured way.
Abstract: If agents are to negotiate automatically with one another they must share a negotiation mechanism, specifying what possible actions each party can take at any given time, when negotiation terminates, and what the structure of the resulting agreements is. Current standardization activities such as FIPA [2] and WS-Agreement [3] represent this as a negotiation protocol specifying the flow of messages. However, they omit other aspects of the rules of negotiation (such as obliging a participant to improve on a previous offer), requiring these to be represented implicitly in an agent’s design, potentially resulting in compatibility, maintenance and re-usability problems. In this chapter, we propose an alternative approach, allowing all of a mechanism to be formal and explicit. We present (i) a taxonomy of declarative rules which can be used to capture a wide variety of negotiation mechanisms in a principled and well-structured way; (ii) a simple interaction protocol, which is able to support any mechanism which can be captured using the declarative rules; (iii) a software framework for negotiation that allows agents to effectively participate in negotiations defined using our rule taxonomy and protocol; and (iv) a language for expressing aspects of the negotiation based on OWL-Lite [4]. We provide examples of some of the mechanisms that the framework can support.
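The "rules of negotiation" the abstract mentions, such as obliging a participant to improve on a previous offer, can be made explicit rather than buried in agent code. The sketch below is an illustrative assumption about what one such declarative rule might look like; the function names are invented for the example and are not from the chapter's taxonomy.

```python
# A hedged sketch of a declarative posting rule, checked by a generic
# negotiation host independently of any particular agent's internals.

def improvement_rule(previous_offer, new_offer):
    """A new offer must strictly improve on the agent's previous one."""
    if previous_offer is None:
        return True  # first offer: nothing to improve on
    return new_offer > previous_offer

# The host holds a list of rules; an offer is admitted only if all pass.
rules = [improvement_rule]

def admit(previous_offer, new_offer):
    return all(rule(previous_offer, new_offer) for rule in rules)

print(admit(None, 10))  # True: first offer
print(admit(10, 12))    # True: strict improvement
print(admit(10, 10))    # False: no improvement
```

Keeping such rules as data checked by the host, rather than logic inside each agent, is what lets differently implemented agents share one mechanism without compatibility problems.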

129 citations


Cites background from "Law-governed peer-to-peer auctions"

  • ...It’s true that LGI has been applied to peer-to-peer auctions [26], but the focus of that work was mainly on the peer-to-peer aspect, aiming to dispense with a centralized service for auctions....



Book ChapterDOI
15 Jul 2002
TL;DR: This paper presents an alternative approach, allowing all of a mechanism to be formal and explicit, and presents a taxonomy of declarative rules which can be used to capture a wide variety of negotiation mechanisms in a principled and well-structured way.
Abstract: If agents are to negotiate automatically with one another they must share a negotiation mechanism, specifying what possible actions each party can take at any given time, when negotiation terminates, and what the resulting agreements will be. The current state-of-the-art represents this as a negotiation protocol specifying the flow of messages. However, they omit other aspects of the rules of negotiation (such as obliging a participant to improve on a previous offer), requiring these to be represented implicitly in an agent's design, potentially resulting in compatibility, maintenance and re-usability problems. In this paper, we propose an alternative approach, allowing all of a mechanism to be formal and explicit. We present (i) A taxonomy of declarative rules which can be used to capture a wide variety of negotiation mechanisms in a principled and well-structured way. (ii) A simple interaction protocol, which is able to support any mechanism which can be captured using the declarative rules. (iii) A software framework for negotiation, implemented in JADE [3] that allows agents to effectively participate in negotiations defined using our rule taxonomy and protocol.

81 citations


Journal ArticleDOI
TL;DR: It is shown that while last-minute bidding (sniping) is an effective strategy against bidders engaging in incremental bidding (and against those with common values), in general, delaying bidding is disadvantageous even if delayed bids are sure to be received before the auction closes.
Abstract: We present a mathematical model of the eBay auction protocol and perform a detailed analysis of the effects that the eBay proxy bidding system and the minimum bid increment have on the auction properties. We first consider the revenue of the auction, and we show analytically that when two bidders with independent private valuations use the eBay proxy bidding system there exists an optimal value for the minimum bid increment at which the auctioneer's revenue is maximized. We then consider the sequential way in which bids are placed within the auction, and we show analytically that independent of assumptions regarding the bidders' valuation distribution or bidding strategy the number of visible bids placed is related to the logarithm of the number of potential bidders. Thus, in many cases, it is only a minority of the potential bidders that are able to submit bids and are visible in the auction bid history (despite the fact that the other hidden bidders are still effectively competing for the item). Furthermore, we show through simulation that the minimum bid increment also introduces an inefficiency to the auction, whereby a bidder who enters the auction late may find that its valuation is insufficient to allow them to advance the current bid by the minimum bid increment despite them actually having the highest valuation for the item. Finally, we use these results to consider appropriate strategies for bidders within real world eBay auctions. We show that while last-minute bidding (sniping) is an effective strategy against bidders engaging in incremental bidding (and against those with common values), in general, delaying bidding is disadvantageous even if delayed bids are sure to be received before the auction closes. Thus, when several bidders submit last-minute bids, we show that rather than seeking to bid as late as possible, a bidder should try to be the first sniper to bid (i.e., it should “snipe before the snipers”).

59 citations


Cites background from "Law-governed peer-to-peer auctions"

  • ...RELATED WORK The growth of Web-based electronic commerce has initiated much research into the design of novel mechanisms for online auctions [Fontoura et al. 2002], and also effective bidding strategies for automated bidding agents [Guo 2002; Dumas et al. 2002; Anthony and Jennings 2003]....




Journal ArticleDOI
TL;DR: This paper presents a lightweight and Cooperative multifactOr considered file Replication Protocol (CORP), which dramatically reduces the overhead of both file replication and consistency maintenance.
Abstract: File replication is widely used in structured P2P systems to avoid hot spots in servers and enhance file availability. The number of replicas and replication distance affect the file replication cost. These two elements and the replica update frequency determined in the file replication stage also affect the cost of subsequent consistency maintenance. However, most existing file replication protocols focus on improving file lookup efficiency without considering its cost and its subsequent influence on consistency maintenance. This paper studies the problem about how a server chooses files to replicate and where to replicate files to achieve low cost in both file replication and consistency maintenance stages without compromising the effectiveness of file replication. This paper presents a lightweight and Cooperative multifactOr considered file Replication Protocol (CORP) to achieve this goal. CORP simultaneously takes into account multiple factors including file popularity, update rate, node available capacity, file load, and node locality, aiming to minimize the number of replicas, update frequency, and replication distance. CORP also dynamically adjusts the number of replicas based on ever-changing file popularity and visit pattern. Extensive experimental results from simulation and PlanetLab real-world testbed demonstrate the efficiency and effectiveness of CORP in comparison with other file replication protocols. It dramatically reduces the overhead of both file replication and consistency maintenance. In addition, it exhibits high adaptiveness to skewed lookups and yields significant improvement in reducing overloaded nodes. Specifically, compared to the other replication protocols, CORP can reduce more than 71 percent of file replicas, 84 percent of overloaded nodes, 94 percent of consistency maintenance cost, and 72 percent of file replication and consistency maintenance latency.
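The multifactor choice CORP makes, which file to replicate and on which node, can be sketched as a scoring function over the factors the abstract lists. The weights and formula below are illustrative assumptions for the sketch, not CORP's actual rules.

```python
# Hedged sketch of a multifactor replica-placement score in the spirit
# of CORP: favor popular, rarely updated files placed on nearby nodes
# with spare capacity, to cut both replication and consistency costs.

def replica_score(popularity, update_rate, node_capacity, distance):
    if node_capacity <= 0:
        return 0.0  # an overloaded node is never a candidate
    # Popularity raises the score; update rate and distance lower it,
    # since both inflate consistency-maintenance traffic.
    return (popularity / (1.0 + update_rate)) * node_capacity / (1.0 + distance)

# Pick the best candidate node for one file (hypothetical data).
candidates = [
    {"node": "A", "capacity": 0.8, "distance": 1},
    {"node": "B", "capacity": 0.5, "distance": 0},
]
best = max(candidates,
           key=lambda c: replica_score(10, 0.2, c["capacity"], c["distance"]))
print(best["node"])  # B: zero distance outweighs A's extra capacity
```

The point of combining the factors in one score is the trade-off the paper highlights: a replica that is cheap to create but far away or frequently updated becomes expensive in the consistency-maintenance stage.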

20 citations


Cites background from "Law-governed peer-to-peer auctions"

  • ...In addition, future P2P applications also need consistency support to deliver frequently updated contents such as directory service [27], online auction [28], remote collaboration [29], shared calendar [30], [31], P2P web cache [32], and online games [33]....



References

01 Apr 1997
TL;DR: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind, emphasizing the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity.
Abstract: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind. The emphasis is on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity. Topics covered include an introduction to the concepts in cryptography, attacks against cryptographic systems, key use and handling, random bit generation, encryption modes, and message authentication codes. Recommendations on algorithms and further reading are given at the end of the paper. This paper should enable the reader to build, understand, and evaluate system descriptions and designs based on the cryptographic components described in the paper.

2,153 citations


Proceedings ArticleDOI
01 May 1998
TL;DR: The Michigan Internet AuctionBot is a scalable and robust auction server that supports both software and human agents and is used extensively in classroom exercises and is available to the general Internet population.
Abstract: Market mechanisms such as auctions will likely represent a common interaction medium for agents on the Internet. The Michigan Internet AuctionBot is a flexible, scalable, and robust auction server that supports both software and human agents. The server manages many simultaneous auctions by separating the interface from the core auction procedures. This design provides a responsive interface and tolerates system and network disruptions, but necessitates careful timekeeping procedures to ensure temporal accuracy. The AuctionBot has been used extensively in classroom exercises and is available to the general Internet population. Its flexible specification of auctions in terms of orthogonal parameters makes it a useful device for agent researchers exploring the design space of auction mechanisms.

573 citations


Journal ArticleDOI
Ho-Guen Lee1
TL;DR: An industry case study is presented to demonstrate that the prices of goods traded through electronic marketplaces can actually be higher than those of products sold in traditional markets.
Abstract: The efficiency of electronic means for commerce is sometimes countered by increased product cost, as demonstrated in this case involving an auction system for used cars in Japan. Electronic marketplaces have become increasingly popular alternatives to traditional forms of commerce [8, 9]. This increase in popularity has led many to predict that one effect will be to lower the market price of goods. This reduced-price hypothesis was proposed by Bakos in his seminal article on electronic marketplaces [2]. Buyers in market-intermediated transactions have to bear search costs to obtain information about the prices and product offerings of sellers. High search costs of buyers enable sellers to maintain prices substantially above their marginal costs and result in allocational inefficiencies in market transactions. Electronic market systems can reduce the search costs that buyers must incur to acquire information about seller prices and product offerings, thus enabling buyers to locate suppliers that better match their needs. The lowered search costs allow buyers to look at more product offerings and make it difficult for sellers to sustain high prices. The reduced-price hypothesis predicts that buyers will enjoy lower product prices as a result of the increased competition among sellers in electronic marketplaces. An industry case study is presented in this article to demonstrate that the prices of goods traded through electronic marketplaces can actually be higher than those of products sold in traditional markets. AUCNET is an electronic marketplace introduced to reduce buyers' search costs in used-car transactions in Japan. The average contract price of secondhand cars sold through AUCNET is much higher than that of traditional, non-electronic markets. The industry case study suggests that the analysis of electronic market impacts on product prices should take into account economic factors beyond buyers' search costs.
This article investigates why product prices in AUCNET are higher than those in traditional markets by examining economic variables other than buyers' search costs.

487 citations




Journal ArticleDOI
TL;DR: It is shown that LGI is at least as general as a conventional centralized coordination mechanism (CCM), and that it is more scalable, and generally more efficient, than CCM.
Abstract: Software technology is undergoing a transition from monolithic systems, constructed according to a single overall design, into conglomerates of semiautonomous, heterogeneous, and independently designed subsystems, constructed and managed by different organizations with little, if any, knowledge of each other. Among the problems inherent in such conglomerates, none is more serious than the difficulty of controlling the activities of the disparate agents operating in them, and the difficulty for such agents to coordinate their activities with each other. We argue that the nature of coordination and control required for such systems calls for the following principles to be satisfied: (1) coordination policies need to be enforced; (2) the enforcement needs to be decentralized; (3) coordination policies need to be formulated explicitly, rather than being implicit in the code of the agents involved, and they should be enforced by means of a generic, broad-spectrum mechanism; and (4) it should be possible to deploy and enforce a policy incrementally, without exacting any cost from agents and activities not subject to it. We describe a mechanism called law-governed interaction (LGI), currently implemented by the Moses toolkit, which has been designed to satisfy these principles. We show that LGI is at least as general as a conventional centralized coordination mechanism (CCM), and that it is more scalable, and generally more efficient, than CCM.

356 citations