scispace - formally typeset
Author

Charles R. Standridge

Bio: Charles R. Standridge is an academic researcher from Grand Valley State University. The author has contributed to research in topics: Software & Simulation language. The author has an h-index of 14 and has co-authored 63 publications receiving 1,820 citations. Previous affiliations of Charles R. Standridge include the Florida A&M University – Florida State University College of Engineering.


Papers
18 Jul 1995
TL;DR: This work assesses the potential of proxy servers to cache documents retrieved with the HTTP protocol, and finds that a proxy server really functions as a second level cache, and its hit rate may tend to decline with time after initial loading given a more or less constant set of users.
Abstract: As the number of World-Wide Web users grows, so does the number of connections made to servers. This increases both network load and server load. Caching can reduce both loads by migrating copies of server files closer to the clients that use those files. Caching can be done either at a client or in the network (by a proxy server or gateway). We assess the potential of proxy servers to cache documents retrieved with the HTTP protocol. We monitored traffic corresponding to three types of educational workloads over a one-semester period, and used this as input to a cache simulation. Our main findings are (1) that with our workloads a proxy has a 30-50% maximum possible hit rate no matter how it is designed; (2) that when the cache is full and a document is replaced, least recently used (LRU) is a poor policy, but simple variations can dramatically improve hit rate and reduce cache size; (3) that a proxy server really functions as a second-level cache, and its hit rate may tend to decline with time after initial loading given a more or less constant set of users; and (4) that certain tuning configuration parameters for a cache may have little benefit.
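The trace-driven cache simulation methodology the abstract describes can be sketched in a few lines of Python. This is a generic illustration of an LRU proxy cache; the toy trace and capacity below are invented, not taken from the paper's educational workloads:

```python
from collections import OrderedDict

def simulate_lru(trace, capacity_bytes):
    """Trace-driven simulation of an LRU proxy cache.

    trace: iterable of (url, size_bytes) request pairs.
    Returns the hit rate over the whole trace.
    """
    cache = OrderedDict()  # url -> size, ordered oldest-first
    used = 0
    hits = requests = 0
    for url, size in trace:
        requests += 1
        if url in cache:
            hits += 1
            cache.move_to_end(url)          # mark as most recently used
            continue
        # Miss: evict least-recently-used documents until the new one fits.
        while used + size > capacity_bytes and cache:
            _, evicted_size = cache.popitem(last=False)
            used -= evicted_size
        if size <= capacity_bytes:
            cache[url] = size
            used += size
    return hits / requests if requests else 0.0

# Hypothetical toy trace: repeated requests to a small working set.
trace = [("a", 100), ("b", 200), ("a", 100), ("c", 300), ("a", 100)]
print(simulate_lru(trace, capacity_bytes=350))  # → 0.2
```

Replaying a real request log through variations of the eviction loop is exactly how policy comparisons like the paper's are carried out.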

495 citations

Proceedings ArticleDOI
28 Aug 1996
TL;DR: Surprisingly, the criteria used by several proxy-server removal policies are among the worst performing criteria in the authors' simulation; instead, replacing documents based on size maximizes hit rate in each of the studied workloads.
Abstract: World-Wide Web proxy servers that cache documents can potentially reduce three quantities: the number of requests that reach popular servers, the volume of network traffic resulting from document requests, and the latency that an end-user experiences in retrieving a document. This paper examines the first two using the measures of cache hit rate and weighted hit rate (or fraction of client-requested bytes returned by the proxy). A client request for an uncached document may cause the removal of one or more cached documents. Variable document sizes and types allow a rich variety of policies to select a document for removal, in contrast to policies for CPU caches or demand paging, which manage homogeneous objects. We present a taxonomy of removal policies. Through trace-driven simulation, we determine the maximum possible hit rate and weighted hit rate that a cache could ever achieve, and the removal policy that maximizes hit rate and weighted hit rate. The experiments use five traces of 37 to 185 days of client URL requests. Surprisingly, the criteria used by several proxy-server removal policies (LRU, Hyper-G, and a proposal by Pitkow and Recker) are among the worst performing criteria in our simulation; instead, replacing documents based on size maximizes hit rate in each of the studied workloads.
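The size-based removal criterion the abstract singles out (evict the largest cached document first) can be sketched with a max-heap keyed on document size. The trace below is invented for illustration; intuitively, evicting one large document frees room for many small, frequently re-requested ones, which is why this criterion maximizes raw hit rate:

```python
import heapq

def simulate_size_policy(trace, capacity_bytes):
    """Trace-driven cache simulation with the SIZE removal policy:
    on overflow, evict the largest cached document first."""
    heap = []       # max-heap via negated sizes: (-size, url)
    sizes = {}      # url -> size for currently cached documents
    used = 0
    hits = requests = 0
    for url, size in trace:
        requests += 1
        if url in sizes:
            hits += 1
            continue
        # Miss: evict the largest documents until the new one fits.
        while used + size > capacity_bytes and heap:
            _, victim = heapq.heappop(heap)
            used -= sizes.pop(victim)
        if size <= capacity_bytes:
            sizes[url] = size
            used += size
            heapq.heappush(heap, (-size, url))
    return hits / requests if requests else 0.0

# Toy trace: one large document competes with small, popular ones.
trace = [("big", 25), ("a", 10), ("b", 10), ("a", 10), ("b", 10)]
print(simulate_size_policy(trace, capacity_bytes=30))  # → 0.4
```

Note this sketch ignores weighted hit rate (bytes served from cache), on which a pure size policy fares worse, as the paper's second measure captures.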

480 citations

Journal ArticleDOI
TL;DR: In this paper, a small furniture production company integrated lean tools and sustainability concepts with discrete event simulation modeling and analysis as well as mathematical optimization to make a positive impact on the environment, society and its own financial success.
Abstract: A small furniture production company has integrated lean tools and sustainability concepts with discrete event simulation modeling and analysis as well as mathematical optimization to make a positive impact on the environment, society and its own financial success. The principles of lean manufacturing that aid in the elimination of waste have helped the company meet ever increasing customer demands while preserving valuable resources for future generations. The implementation of lean and sustainable manufacturing was aided by the use of discrete event simulation and optimization to overcome deficits in lean’s traditional implementation strategies. Lean and green manufacturing can have a more significant, positive impact on multiple measures of operational performance when implemented concurrently rather than separately. These ideas are demonstrated by three applications.

194 citations

Journal ArticleDOI
TL;DR: In this article, an enhanced lean process that includes future state validation before implementation is presented, which extends value stream mapping to include time, the behavior of individual entities, structural variability, random variability, and component interaction effects.
Abstract: A traditional lean transformation process does not validate the future state before implementation, relying instead on a series of iterations to modify the system until performance is satisfactory. An enhanced lean process that includes future state validation before implementation is presented. Simulation modeling and experimentation is proposed as the primary validation tool. Simulation modeling and experimentation extends value stream mapping to include time, the behavior of individual entities, structural variability, random variability, and component interaction effects. Experiments can then be conducted to analyze the model and draw conclusions about whether the lean transformation effectively addresses the current-state gap. Industrial applications of the enhanced lean process show its effectiveness.

109 citations

Journal ArticleDOI
TL;DR: In this article, the authors present an economic analysis of the use of end-of-vehicle-life lithium-ion batteries in electric vehicles and plug-in electric hybrid vehicles.
Abstract: Purpose: Lithium-ion batteries that are commonly used in electric vehicles and plug-in hybrid electric vehicles cannot simply be discarded at the end of vehicle application due to the materials of which they are composed. In addition, the US Department of Energy has estimated that the cost per kWh of new lithium-ion batteries for vehicle applications is four times too high, creating an economic barrier to the widespread commercialization of plug-in electric vehicles (USDOE 2014). Thus, reducing this cost by extending the application life of these batteries appears necessary. Even with an extension of application life, all batteries will eventually fail to hold a charge and become unusable, so environmentally safe disposition must be accomplished. These cost and environmental issues can be addressed by remanufacturing end-of-vehicle-life lithium-ion batteries for return to vehicle applications, as well as by repurposing them for stationary applications such as energy storage systems supporting the electric grid. In addition, environmentally safe, "green" disposal processes are required that include disassembly of batteries into component materials for recycling. The hypotheses that end-of-vehicle-application remanufacturing, repurposing, and recycling are each economic are examined. This assessment includes a forecast of the number of such batteries to ensure sufficient volume for conducting these activities.

Design/methodology/approach: The hypotheses that end-of-vehicle-application remanufacturing, repurposing, and recycling are economic are addressed using cost-benefit analysis applied independently to each. Uncertainty is associated with all future costs and benefits. Data from a variety of sources are combined and reasonable assumptions are made. The robustness of the results is confirmed by sensitivity analysis of each key parameter.

Determining that a sufficient volume of end-of-vehicle-application lithium-ion batteries will exist to support remanufacturing, repurposing, and recycling involves estimating a lower bound for the number of such batteries. Based on a variety of forecasts for electric vehicle and plug-in hybrid electric vehicle production, a distribution of life in vehicle use, and the percentage recoverable for further use, three projections of the number of end-of-vehicle-application batteries for the period 2010 to 2050 are developed. The lower bound is then the minimum of these three forecasts. Multiple forecasts based on multiple sources of information help reduce the uncertainty associated with the lower bound, which is particularly important given the short time such vehicles have been in use.

Findings: The number of lithium-ion batteries becoming available annually for remanufacturing, repurposing, and recycling is likely to exceed 3,000,000 between 2029 and 2032, as well as reaching 50% of new vehicle demand between 2020 and 2033. Thus, a sufficient number of batteries will be available. Cost-benefit analysis shows that remanufacturing is economically feasible, saving approximately 40% over new battery use. Repurposing is likewise economically feasible if research and development costs for new applications are less than $82.65 per kWh for an upper-bound sales price of $150.00 per kWh. For a lower bound in R&D expenses of $50 per kWh, the lowest economic sales price is $114.05 per kWh. Recycling becomes economically feasible only if the price of lithium salts increases to $98.60 per kg due to a shortage of new lithium, which is possible but perhaps not likely given increasing demand for lithium-ion batteries.

Research limitations/implications: The demand for lithium-ion batteries for vehicle applications through 2050 has a high degree of uncertainty. Repurposing applications are not yet fully developed and recycling processes are still evolving. There is a high degree of uncertainty associated with the cost-benefit analysis.

Practical implications: Lithium-ion batteries are a major cost component of electric vehicles and plug-in hybrid electric vehicles. One way of reducing this cost is to develop additional uses for such batteries at the end of vehicle application, as well as an environmentally friendly method for recycling battery components as an alternative to destruction and disposal.

Social implications: The use of lithium-ion batteries in vehicles, as opposed to fossil fuels, is consistent with the guiding principles of sustainability in helping to meet current needs without compromising the needs and resources of future generations. Reusing entire lithium-ion batteries, or recycling the materials of which they are composed, further reinforces this sustainability.

Originality/value: The results show that a sufficient number of batteries will be available to support remanufacturing, repurposing, and recycling. Remanufacturing is shown to be economically feasible. Repurposing is shown to be feasible under reasonable conditions on design and development. Recycling will likely not be economically feasible in isolation but will eventually be necessary for all batteries; thus, the costs of recycling must be assigned to original vehicle use, remanufacturing, and repurposing applications. Furthermore, this effort integrates information from a wide variety of sources to show the economic feasibility of end-of-vehicle-application uses for lithium-ion batteries.
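The repurposing break-even reasoning in the findings can be organized as a small per-kWh computation. The function below assumes a simple linear model (revenue must cover R&D plus all remaining per-kWh costs); the `other_costs_per_kwh` figure is a placeholder chosen for illustration, not a number from the paper:

```python
def breakeven_rd_cost(sale_price_per_kwh, other_costs_per_kwh):
    """Highest R&D cost per kWh at which repurposing still breaks even,
    assuming revenue must cover R&D plus all remaining per-kWh costs.
    This linear model is an illustrative assumption, not the paper's."""
    return sale_price_per_kwh - other_costs_per_kwh

# Illustrative only: if the remaining per-kWh costs were $67.35, a
# $150.00/kWh sale price would leave room for up to ~$82.65/kWh of R&D,
# consistent in form with the break-even figures quoted above.
print(breakeven_rd_cost(150.00, 67.35))
```

Sensitivity analysis of the kind the paper describes amounts to sweeping each input of such a function over its plausible range and checking whether the break-even conclusion survives.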

88 citations


Cited by
Journal ArticleDOI
TL;DR: This paper demonstrates the benefits of cache sharing, measures the overhead of the existing protocols, and proposes a new protocol called "summary cache", which reduces the number of intercache protocol messages, reduces the bandwidth consumption, and eliminates 30% to 95% of the protocol CPU overhead, all while maintaining almost the same cache hit ratios as ICP.
Abstract: The sharing of caches among Web proxies is an important technique to reduce Web traffic and alleviate network bottlenecks. Nevertheless, it is not widely deployed due to the overhead of existing protocols. In this paper we demonstrate the benefits of cache sharing, measure the overhead of the existing protocols, and propose a new protocol called "summary cache". In this new protocol, each proxy keeps a summary of the cache directory of each participating proxy, and checks these summaries for potential hits before sending any queries. Two factors contribute to our protocol's low overhead: the summaries are updated only periodically, and the directory representations are very economical, as low as 8 bits per entry. Using trace-driven simulations and a prototype implementation, we show that, compared to existing protocols such as the Internet cache protocol (ICP), summary cache reduces the number of intercache protocol messages by a factor of 25 to 60, reduces the bandwidth consumption by over 50%, and eliminates 30% to 95% of the protocol CPU overhead, all while maintaining almost the same cache hit ratios as ICP. Hence summary cache scales to a large number of proxies. (This paper is a revision of Fan et al. 1998; we add more data and analysis in this version.)
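The "very economical" directory summaries at roughly 8 bits per entry correspond to a Bloom filter, the probabilistic set representation the summary-cache work is associated with. A rough sketch, with all parameters (bits per entry, number of hash probes, hash construction) chosen arbitrarily for illustration:

```python
import hashlib

class BloomSummary:
    """Compact summary of a cache directory: ~8 bits per expected entry.
    Membership tests can yield false positives (a wasted query to a peer
    proxy) but no false negatives, until summaries go stale between the
    periodic updates the protocol relies on."""

    def __init__(self, expected_entries, bits_per_entry=8, num_hashes=4):
        self.m = expected_entries * bits_per_entry   # total bits
        self.k = num_hashes
        self.bits = bytearray((self.m + 7) // 8)

    def _probes(self, url):
        # Derive k bit positions from salted SHA-256 digests of the URL.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{url}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, url):
        for p in self._probes(url):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, url):
        return all((self.bits[p // 8] >> (p % 8)) & 1 for p in self._probes(url))

summary = BloomSummary(expected_entries=1000)
summary.add("http://example.com/index.html")
print(summary.might_contain("http://example.com/index.html"))  # → True
```

A proxy would consult each peer's `BloomSummary` locally and query a peer only on a positive answer, which is how the intercache message count drops so sharply relative to ICP's query-everyone approach.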

2,174 citations

Patent
26 Oct 2001
TL;DR: In this article, the authors present a method and system for creating an innovative file system that separates its directory presentation from its data store, which not only avoids delaying the presentation of the content to the user but also uses a reduced amount of storage space.
Abstract: The invention provides a method and system for creating an innovative file system that separates its directory presentation from its data store. The method and system include the processing, division, distribution, management, synchronization, and reassembly of file system objects in a way that not only avoids delaying the presentation of the content to the user but also uses a reduced amount of storage space. The invention includes the ability to manage and control the integrity of the files distributed across the network, and the ability to serve and reconstruct files in real time using a Virtual File Control System.

1,550 citations

Proceedings Article
08 Dec 1997
TL;DR: GreedyDual-Size as discussed by the authors incorporates locality with cost and size concerns in a simple and nonparameterized fashion for high performance, which can potentially improve the performance of main-memory caching of Web documents.
Abstract: Web caches can not only reduce network traffic and downloading latency, but can also affect the distribution of web traffic over the network through cost-aware caching. This paper introduces GreedyDual-Size, which incorporates locality with cost and size concerns in a simple and non-parameterized fashion for high performance. Trace-driven simulations show that with the appropriate cost definition, GreedyDual-Size outperforms existing web cache replacement algorithms in many aspects, including hit ratios, latency reduction and network cost reduction. In addition, GreedyDual-Size can potentially improve the performance of main-memory caching of Web documents.
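The GreedyDual-Size rule is compact enough to render directly: each document receives a credit H = L + cost/size on insertion or hit, the document with the smallest H is evicted, and the global "inflation" value L is raised to the evicted H so that recently touched documents outrank long-idle ones. A sketch (with unit cost as the default, one of the cost definitions the paper explores):

```python
import heapq

class GreedyDualSize:
    """Sketch of GreedyDual-Size cache replacement.

    Credit H = L + cost/size couples recency (via the inflating L),
    retrieval cost, and document size in one non-parameterized value.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.used = 0
        self.L = 0.0
        self.H = {}        # url -> current credit
        self.size = {}     # url -> document size
        self.heap = []     # (H, url) min-heap with lazy deletion

    def _set_credit(self, url, h):
        self.H[url] = h
        heapq.heappush(self.heap, (h, url))

    def access(self, url, size, cost=1.0):
        """Process one request; returns True on a cache hit."""
        if url in self.H:
            self._set_credit(url, self.L + cost / size)  # refresh on hit
            return True
        # Miss: evict minimum-credit documents until the new one fits.
        while self.used + size > self.capacity and self.H:
            h, victim = heapq.heappop(self.heap)
            if self.H.get(victim) != h:
                continue                 # stale heap entry, skip it
            self.L = h                   # inflate the clock
            del self.H[victim]
            self.used -= self.size.pop(victim)
        if size <= self.capacity:
            self.size[url] = size
            self.used += size
            self._set_credit(url, self.L + cost / size)
        return False
```

Setting `cost` to, say, estimated download latency or network hops is what makes the policy "cost-aware" in the sense the abstract describes.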

1,048 citations

Journal ArticleDOI
TL;DR: The application of discrete-event simulation modeling to health care clinics and systems of clinics (for example, hospitals, outpatient clinics, emergency departments, and pharmacies) and future directions of research and applications are discussed.
Abstract: In recent decades, health care costs have dramatically increased, while health care organisations have been under severe pressure to provide improved quality health care for their patients. Several health care administrators have used discrete-event simulation as an effective tool for allocating scarce resources to improve patient flow, while minimising health care delivery costs and increasing patient satisfaction. The rapid growth in simulation software technology has created numerous new application opportunities, including more sophisticated implementations, as well as combining optimisation and simulation for complex integrated facilities. This paper surveys the application of discrete-event simulation modeling to health care clinics and systems of clinics (for example, hospitals, outpatient clinics, emergency departments, and pharmacies). Future directions of research and applications are also discussed.
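The patient-flow modeling surveyed above reduces, in its simplest form, to a single-provider queue. The toy model below uses exponential interarrival and service times (an M/M/1-style assumption made here for illustration; real clinic studies fit distributions to observed data) and reports the average wait before service:

```python
import random

def simulate_clinic(num_patients, mean_interarrival, mean_service, seed=42):
    """Minimal discrete-event model of a single-provider clinic.

    Patients arrive with exponential interarrival times and are served
    in order by one provider; returns the average wait before service.
    """
    rng = random.Random(seed)

    # Generate arrival instants.
    arrivals, t = [], 0.0
    for _ in range(num_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    server_free_at = 0.0
    total_wait = 0.0
    for arrive in arrivals:
        start = max(arrive, server_free_at)   # wait if provider is busy
        total_wait += start - arrive
        server_free_at = start + rng.expovariate(1.0 / mean_service)
    return total_wait / num_patients

# Illustrative run: patients every ~10 min, ~8 min consultations.
print(simulate_clinic(1000, mean_interarrival=10.0, mean_service=8.0))
```

Extending such a model with multiple resources (rooms, nurses, labs) and routing between them is where the surveyed simulation software earns its keep, and coupling it with optimization is the combined approach the paper highlights.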

891 citations

Journal ArticleDOI
TL;DR: The paper concludes with a discussion of caching and performance issues, using the observed workload characteristics to suggest performance enhancements that seem promising for Internet Web servers.
Abstract: This paper presents a workload characterization study for Internet Web servers. Six different data sets are used in the study: three from academic environments, two from scientific research organizations, and one from a commercial Internet provider. These data sets represent three different orders of magnitude in server activity, and two different orders of magnitude in time duration, ranging from one week of activity to one year. The workload characterization focuses on the document type distribution, the document size distribution, the document referencing behavior, and the geographic distribution of server requests. Throughout the study, emphasis is placed on finding workload characteristics that are common to all the data sets studied. Ten such characteristics are identified. The paper concludes with a discussion of caching and performance issues, using the observed workload characteristics to suggest performance enhancements that seem promising for Internet Web servers.

771 citations