scispace - formally typeset

How do I clear my Exchange server cache? 

Answers from top 8 papers

Insight: We believe that this asymmetric cache study is the first of its kind and provides useful data/insights on cache characteristics of server consolidation.

Insight (open-access proceedings article, 30 Oct 2017, 30 citations): Simulation findings show that the combination of the consumer-cache caching strategy and the RR cache replacement policy is the most convenient in IoT environments in terms of hop reduction ratio, server hit reduction and response latency.

Insight: We show that these do not prevent effective caching and we introduce a special cache replacement algorithm to maximise efficiency.

Insight (proceedings article by V. Sathiyamoorthi and V. Murali Bhaskaran, 19 Apr 2012, 9 citations): Web proxy caches can potentially improve network performance by reducing the number of requests that reach the server, the volume of data transferred through the network, and the delay in accessing Web pages.

Insight (proceedings article by Kevin W. Froese and Richard B. Bunt, 03 Jan 1996, 30 citations): Some recent research has suggested that various strategies for cache management may not be equally suited to the circumstances at both the client and the server.

Insight (proceedings article by H. Zhu and Tao Yang, 22 Apr 2001, 90 citations): The experimental results show that the proposed techniques are effective in supporting coarse-grain cache management and reducing server response times for tested applications.

Insight (open-access proceedings article, 30 Sep 2015, 27 citations): We implemented OPC and experimentally evaluated it over various cache placement policies, showing that it can enhance the impact of ICN packet-level caching, reducing both network and server load.

Insight: Thus, we are able to uncover the resource update history and cache configurations at the server side, and analyze the cache performance in various time granularities.
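Several of the insights above describe the same basic effect: a proxy cache answers repeat requests locally, so fewer requests reach the origin server. A toy sketch of that effect in Python (the class and names are hypothetical illustrations, not taken from any of the cited papers):

```python
class ProxyCache:
    """Toy web-proxy cache: repeat requests are served locally and
    never reach the origin server (hypothetical sketch)."""
    def __init__(self, origin):
        self.origin = origin        # callable: url -> response body
        self.store = {}             # url -> cached body
        self.server_hits = 0        # requests that actually reached the server

    def get(self, url):
        if url not in self.store:   # cache miss: forward to the origin
            self.server_hits += 1
            self.store[url] = self.origin(url)
        return self.store[url]      # cache hit: served from the proxy

proxy = ProxyCache(lambda url: f"<html>{url}</html>")
for url in ["/a", "/b", "/a", "/a", "/b"]:
    proxy.get(url)
# 5 client requests, but only 2 of them reached the server
```

The gap between client requests and `server_hits` is exactly the request-reduction the proxy-cache insight describes; replacement policies (RR, LRU, and the like) only matter once the store has a capacity limit.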

See what other people are reading

What can I use to learn about the ROCKY program, which deals with DEM?
5 answers
To learn about the ROCKY program, which deals with improving the robustness of STT-MRAM cache memory hierarchy against write failures, you can refer to research by Talebi et al. The ROCKY architecture proposes efficient replacement policies to enhance the reliability of STT-MRAM memories by reducing susceptible transitions during write operations. The study demonstrates that ROCKY can decrease the Write Error Rate (WER) of STT-MRAM cache memories by up to 35.4% with minimal performance overhead. This research provides valuable insights into addressing the critical reliability challenges faced by STT-MRAM technology, particularly focusing on mitigating write failures in cache memory hierarchies. By studying the ROCKY program, you can gain a deeper understanding of strategies to enhance the robustness of STT-MRAM-based systems in the face of reliability issues.
How does the speed of download and upload impact the performance and efficiency of online applications and services?
5 answers
The speed of download and upload significantly impacts the performance and efficiency of online applications and services. Different application characteristics, such as random, customized, and routine applications, affect network efficiency differently. The interaction between HTTP versions (HTTP/1.1 and HTTP/2) and TCP congestion control algorithms influences web download speeds and efficiency. Implementing a symmetric dual-circuit Mini RS232 safety interface enhances the safety and efficiency of downloading and uploading processes. High-speed Internet access, diverse access technologies, and application protocols affect network performance and user experience, emphasizing the need for understanding flow-level properties and network effects on application quality. Implementing IO performance acceleration with disk block caching optimizes efficiency for applications making IO requests, enhancing performance and flexibility.
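The core relationship between link speed and perceived application performance is simple arithmetic: bits moved over elapsed time. A minimal sketch (the helper names are hypothetical, for illustration only):

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Convert a measured transfer into megabits per second
    (hypothetical helper, illustrative only)."""
    return (bytes_transferred * 8) / (seconds * 1_000_000)

def transfer_time(payload_bytes: int, mbps: float) -> float:
    """Seconds needed to move a payload at a given link speed."""
    return (payload_bytes * 8) / (mbps * 1_000_000)

# a 5 MB page over a 10 Mbit/s link vs. a 100 Mbit/s link
slow = transfer_time(5_000_000, 10)    # 4.0 seconds
fast = transfer_time(5_000_000, 100)   # 0.4 seconds
```

The tenfold difference in link speed translates directly into page-load latency here, which is why techniques like HTTP/2 multiplexing and disk-block caching (which avoid some transfers entirely) matter as much as raw bandwidth.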
How has the use of FIFOs (named pipes) impacted modern operating systems?
5 answers
The use of FIFOs, also known as named pipes, has significantly impacted modern operating systems by facilitating efficient data transfer and communication between different devices and processes. FIFOs play a crucial role in real-time digital system design, especially for data streaming applications like multimedia devices, and are integral to Unix and Linux interprocess communication architectures. They provide a powerful model for transferring data between devices, utilizing memory controllers, CPUs, and data controllers for seamless data delivery. Additionally, FIFOs with integrated error management ensure reliable data transfer by reacting to errors and maintaining known operational states, preventing data gaps or overlaps during communication. The efficient integration of FIFOs in modern operating systems enhances performance, resource utilization, and overall system design.
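The interprocess-communication role described above can be shown directly on a Unix-like system: one process writes into a named pipe, another reads from it, with the kernel handling delivery in order. A minimal POSIX-only Python sketch (the file name is arbitrary):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo_fifo")
os.mkfifo(path)                       # the FIFO appears in the filesystem like a file

pid = os.fork()
if pid == 0:                          # child process: the producer
    with open(path, "wb") as w:       # opening for write blocks until a reader opens
        w.write(b"hello via fifo")
    os._exit(0)
else:                                 # parent process: the consumer
    with open(path, "rb") as r:
        data = r.read()               # reads until the writer closes its end
    os.waitpid(pid, 0)
    os.unlink(path)
    print(data.decode())              # prints: hello via fifo
```

The two `open` calls rendezvous through the kernel, which is the first-in, first-out ordering guarantee the name refers to.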
What's direct memory access?
4 answers
Direct Memory Access (DMA) is a method that allows peripherals to access the system's memory directly without involving the CPU. DMA controllers facilitate this process by managing data transfers between peripherals and memory, enhancing system performance by reducing CPU involvement in data transfer operations. DMA operations involve transferring data in blocks, optimizing memory access efficiency and reducing latency. DMA controllers can execute microprograms for data reorganization in shared local memory, enhancing processing capabilities. Additionally, DMA management includes cache memory address handling and TCE cache flag utilization for efficient data transfer operations. Overall, DMA streamlines data movement, enhances system performance, and minimizes CPU workload in managing memory operations.
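DMA itself lives in hardware, but the block-transfer idea it exploits — moving an entire buffer in one operation instead of looping element by element — can be mimicked in software. A loose analogy in Python using `ctypes.memmove` (this is not real DMA, only an illustration of bulk block copying):

```python
import ctypes

SIZE = 16
src = (ctypes.c_ubyte * SIZE)(*range(SIZE))   # source buffer
dst = (ctypes.c_ubyte * SIZE)()               # destination buffer, zero-filled

# one bulk copy of the whole block, no per-byte Python loop
ctypes.memmove(dst, src, ctypes.sizeof(src))

assert bytes(dst) == bytes(range(SIZE))
```

A real DMA controller performs this kind of block move between a peripheral and RAM while the CPU does other work, which is where the performance benefit described above comes from.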
What are the best practices for implementing backups in an OpenNebula environment?
4 answers
Best practices for implementing backups in an OpenNebula environment involve utilizing methods like creating snapshots and backups of application output in virtual machines. Additionally, employing a system backup solution can prevent data loss by duplicating and storing server data periodically. Implementing hardware-based mirroring technology can ensure direct mirroring of NVRAM data between servers at an atomic level, enhancing data redundancy and protection. For backing up encrypted data in backend storage without exposing it, a frontend client system with client-controlled keys and minimal server-side processes is recommended. Furthermore, a computer-implemented method for backups may involve identifying data volumes, locating data objects, and backing up references to data objects in archival data stores, optimizing backup efficiency. These practices collectively enhance data protection and recovery capabilities in an OpenNebula environment.
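The "duplicate and store server data periodically" practice can be sketched generically. The snippet below just makes timestamped copies of a file; it is a hypothetical illustration, not an OpenNebula API or CLI call:

```python
import pathlib
import shutil
import time

def timestamped_backup(src: str, dest_dir: str) -> pathlib.Path:
    """Copy a data file into a timestamped .bak archive
    (generic sketch, not an OpenNebula interface)."""
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    target = dest / f"{pathlib.Path(src).name}.{stamp}.bak"
    shutil.copy2(src, target)          # copy2 also preserves file metadata
    return target
```

Running this on a schedule gives the periodic duplication described above; the timestamp in the name keeps successive backups from overwriting each other, enabling point-in-time recovery.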
What are the performance limitations of replicating database systems into a database for analytics?
4 answers
Replicating database systems for analytics can lead to performance limitations due to challenges such as maintaining consistent state for real-time analytics, dealing with cold-cache misses during reconfigurations causing high read-performance impact, and facing trade-offs between consistency and latency in distributed storage systems. While modern streaming systems like Apache Flink struggle to efficiently expose state to analytical queries, proposed solutions involve sending read hints to non-serving replicas to keep caches warm and maintain performance levels during reconfigurations. Additionally, managing data distribution transparently while ensuring scalability remains a challenge, with techniques like sharding impacting system complexity and inter-process communication. These factors collectively highlight the intricate balance required to optimize performance when replicating database systems for analytics.
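The "read hints" mitigation mentioned above — replaying the primary's hot keys on a non-serving replica so its cache is warm before traffic shifts to it — can be sketched like this (all names are hypothetical):

```python
class Replica:
    """Toy replica whose cache fills on first read (hypothetical sketch)."""
    def __init__(self, name):
        self.name = name
        self.cache = set()

    def read(self, key):
        cold = key not in self.cache   # a cold-cache miss is the slow path
        self.cache.add(key)            # reading populates the cache
        return cold

def send_read_hints(hot_keys, standby):
    """Pre-read the primary's hot keys on the standby (the 'read hint')."""
    for key in hot_keys:
        standby.read(key)

standby = Replica("standby-1")
send_read_hints(["user:42", "order:7"], standby)
# after warming, shifting traffic to the standby causes no cold misses on hot keys
assert standby.read("user:42") is False
```

Without the warming step, every hot key would miss on the newly promoted replica at once, producing exactly the read-performance dip during reconfiguration that the answer describes.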
What are the most recent statistics about resilience?
5 answers
Recent statistics on resilience show a growing interest in the concept within various fields. Studies from 2014 to 2018 highlight the multifaceted nature of resilience in sustainability literature, with a focus on frameworks, indicators, and models. Additionally, research in robust statistics has expanded the understanding of resilience by considering perturbations under different distances, leading to the development of robust estimators based on minimum distance functionals. Moreover, in the context of Software-Defined Networking, the performance of the Least Recently Used (LRU) mechanism under DoS attacks has been evaluated, emphasizing the importance of utilizing rule statistics information to enhance cache attack resistance. These diverse studies collectively contribute to the evolving landscape of resilience research across various disciplines.
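The LRU mechanism evaluated in the SDN study above keeps the most recently used entries and evicts the stalest one when the cache is full. A minimal sketch (hypothetical class, for illustration):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache (illustrative sketch)."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.items = OrderedDict()            # insertion order tracks recency

    def get(self, key):
        if key not in self.items:
            return None                       # miss
        self.items.move_to_end(key)           # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)    # evict the least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # "a" is now most recently used
cache.put("c", 3)       # capacity exceeded: evicts "b"
```

A DoS flood of one-off keys exploits exactly this policy — each junk key evicts a legitimate entry — which is why the cited study turns to rule-statistics information to harden the cache against such attacks.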
Why is the browser essential to Google Chrome?
4 answers
The browser is essential to Google Chrome because it serves as the primary gateway for internet users to engage in various online activities. Digital forensic practitioners rely on analyzing browser artifacts, such as cache files, to investigate cybercrimes related to internet usage. Additionally, the comparison between Google Chrome and Brave browsers reveals that both share similar data structures, emphasizing the importance of understanding browser artifacts for forensic analysis. Furthermore, the development of browser extensions, like the one for Google Chromium, highlights the need to address security threats like cross-site scripting attacks, which can exploit vulnerabilities in browsers. Overall, browsers like Google Chrome play a crucial role in providing users with access to online content while also posing security challenges that need to be addressed.
What is a contact list database?
4 answers
A contact list database is a system that stores and manages contact information for efficient communication. Various methods and systems have been developed to enhance the functionality and usability of contact lists. One approach involves dynamically displaying contacts based on user preferences and data associated with each contact identifier. Another method focuses on quick searching within the contact list by utilizing cache data storage structures to improve search efficiency. Additionally, a contact list processing method adjusts the position of contacts based on their communication mark weight, aiming to save time and enhance user experience. Furthermore, a contact list management device automatically transmits relevant contact list data items to users based on their communication behavior, enhancing convenience and user experience. Overall, these innovations aim to optimize contact list management and improve user interaction with contact information.
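The "cache data storage structures to improve search efficiency" idea can be sketched as memoizing prefix searches over the contact list (a hypothetical class, not from the cited systems):

```python
class ContactList:
    """Contact list with a small search cache (illustrative sketch)."""
    def __init__(self, contacts):
        self.contacts = list(contacts)
        self._search_cache = {}                # query -> cached result list

    def search(self, prefix):
        key = prefix.lower()
        if key not in self._search_cache:      # first lookup scans the whole list
            self._search_cache[key] = [
                c for c in self.contacts if c.lower().startswith(key)
            ]
        return self._search_cache[key]         # repeated queries are served from cache

book = ContactList(["Alice", "Albert", "Bob"])
book.search("al")       # returns ['Alice', 'Albert']
```

Repeating a query skips the linear scan entirely, which is the quick-search benefit the answer attributes to cache data storage structures; a real implementation would also invalidate cached entries when contacts change.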
What are the key principles and theories underlying equipment allocation within organizational contexts?
5 answers
Equipment allocation within organizational contexts is guided by several key principles and theories. The Resource Allocation Process (RAP) framework, introduced by Bower, provides a theoretical foundation for understanding how firms allocate scarce financial capital efficiently. In the context of video-on-demand (VoD) services, considerations include transport, storage, streaming, and caching in content delivery networks, with a focus on optimizing resource allocation based on a cost function and hit ratio estimation. Additionally, in multiproject resource allocation scenarios, a bilevel model under a fuzzy random environment is proposed, emphasizing the importance of considering uncertainty in resource allocation and utilizing hybrid algorithms for optimization. These theories highlight the significance of strategic decision-making, fairness considerations, and efficient resource utilization in equipment allocation processes within organizational settings.
Does bobcat behavior include storing food for later consumption?
5 answers
Bobcats, like other felids, exhibit behaviors related to storing food for later consumption. While bobcats are known to engage in behaviors such as locomotion, repeated pacing, vigilance, and grooming before enrichment activities, they also show exploratory and food behaviors more often after environmental enrichment, indicating a potential shift towards storing and consuming food strategically. Additionally, the study on European wildcats in Spain documented caching behavior, where wildcats covered carcasses with leaves or snow, suggesting a form of food storage for later consumption. This behavior aligns with the concept of caching observed in other felid species like pumas, which cache their kills to extend foraging time and maximize energetic gains, particularly when preying on intermediate-sized prey.