scispace - formally typeset

Answers from top 8 papers

Open access · Proceedings Article · 31 Oct 2016 · 11 citations
In this work we identify some useful properties about the potential locations for the new servers, from which we derive a novel algorithm for MinMax, and show that it is efficient when the number of new servers is small.
Probing game servers in approximately ascending RTT expedites the identification of playable servers.
The impact appeared greatest for servers with fewer years of serving experience, wait-persons, younger servers, and servers who worked in establishments without written policies regarding serving practices.
The experimental results demonstrate a better trade-off, improving the average utilization of the remaining servers by 13% through server consolidation.
The simulation and numerical results validate our analysis and show that the PS server outperforms first-come-first-serve servers.
The study on selecting the number of utilized satellite edge servers provides insight for subsequent studies of edge-server scheduling in satellite-terrestrial networks.
Our analysis of profiling and redundancy provides insight to help designers determine how many servers and which servers to select to reduce latency.
Proceedings Article · Jan Kohout, Tomas Pevny · 11 May 2015 · 21 citations
In this work, we propose such a tool relying only on high-level statistics of servers' usage, such as the volumes and times of interactions with the servers.
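One of the insights above — probing game servers in approximately ascending RTT to find a playable one quickly — can be sketched as a simple greedy search. Everything below (server names, latencies, the 100 ms playability threshold) is invented for illustration:

```python
def find_playable_server(estimated_rtts, measure_rtt, threshold_ms=100):
    """Probe servers in approximately ascending estimated RTT and return
    the first whose measured RTT is below the playability threshold."""
    # Sorting by the rough estimate means the first hit is likely to be
    # among the lowest-latency playable servers, and few probes are wasted.
    for server in sorted(estimated_rtts, key=estimated_rtts.get):
        if measure_rtt(server) < threshold_ms:
            return server
    return None  # no playable server found

# Toy usage with simulated probe results instead of real network I/O.
estimates = {"eu-1": 40, "us-1": 120, "ap-1": 200}   # rough estimates (ms)
measured  = {"eu-1": 45, "us-1": 110, "ap-1": 190}   # "actual" probes (ms)
print(find_playable_server(estimates, measured.get))
```

In the toy data the lowest-estimate server also probes fastest, so a single measurement suffices.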

See what other people are reading

What is Docker?
4 answers
Docker is a powerful tool for creating, deploying, and running applications using containers, which are isolated environments containing all necessary components. It allows developers to package applications with dependencies into portable containers, ensuring compatibility across different systems. Docker operates with Dockerfiles, enabling consistent image generation on various platforms. Compared to traditional virtualization, Docker offers lightweight containerization, enhancing resource efficiency and ease of deployment. Docker's versatility extends to running multiple interconnected containers, facilitating the development of robust and fault-tolerant applications. Furthermore, Docker addresses security concerns in cloud environments, offering an alternative to traditional virtualization methods.
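As a concrete illustration of the Dockerfile workflow mentioned above — the base image, file names, and entry point are assumptions for this sketch, not taken from any cited paper — a minimal image for a Python application might look like:

```dockerfile
# Minimal illustrative Dockerfile for a Python application.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached across code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image.
COPY . .

# Default process when a container starts from this image.
CMD ["python", "main.py"]
```

Building and running would then be `docker build -t myapp .` followed by `docker run myapp` (image name `myapp` is likewise hypothetical).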
How does the implementation of technology affect accident rates and injury prevention in the workplace?
5 answers
The implementation of technology in the workplace significantly impacts accident rates and injury prevention. New technologies, such as IoT solutions, Smart Personal Protective Equipment (PPE), and virtual reality-based methodologies, play a crucial role in enhancing workplace safety. IoT solutions focus on injury detection, prevention, and response, while Smart PPE integrates sensor networks to monitor workers and their environment in real-time, providing recommendations for risk mitigation. Additionally, Smart PPE and wearable technologies, like helmets, bracelets, and belts, leverage AI techniques for early anomaly detection and worker safety assurance. Virtual reality and digital checklists aid in conducting surveillance activities, improving inspection efficiency, and fostering a safety culture among workers. Overall, these technological advancements aim to reduce workplace injuries and create safer working environments by adapting to the evolving needs of modern industries.
Why is logging valued by librarians in digital library?
5 answers
Logging is highly valued by librarians in digital libraries due to its crucial role in understanding user behavior and evaluating the quality of services provided. Log analysis helps library managers make informed decisions regarding subscription renewals and service improvements by providing insights into user interactions with the digital resources. Additionally, analyzing log data in digital libraries allows for the identification of different types of search behavior using metadata, enhancing the understanding of user preferences and needs within the collection. Furthermore, the combination of implicitly and explicitly collected data through log analysis improves the overall comprehension of user behavior compared to analyzing these data sets separately. This emphasis on logging underscores its significance in enhancing the quality of digital library systems and services.
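As a hypothetical sketch of the kind of log analysis described above — the log format, field names, and data are all invented — counting search queries from a toy access log might look like:

```python
from collections import Counter

# Toy access log in an invented tab-separated format: user, action, payload.
LOG = """alice\tsearch\tdigital preservation
bob\tdownload\tpaper-17
alice\tsearch\tmetadata standards
carol\tsearch\tdigital preservation"""

def top_queries(log_text):
    """Count only 'search' actions to characterise search behaviour."""
    counts = Counter()
    for line in log_text.splitlines():
        _user, action, payload = line.split("\t")
        if action == "search":
            counts[payload] += 1
    return counts.most_common()   # most frequent queries first

print(top_queries(LOG))
```

Real digital-library logs would need parsing of timestamps, sessions, and result clicks, but the aggregation step is the same in spirit.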
What is the thematic area of Generative AI and Blockchain networks for utilizing and securing dark data?
5 answers
The thematic area of Generative AI and Blockchain networks for utilizing and securing dark data revolves around the integration of advanced technologies to enhance data security, improve AI algorithms, and unlock the potential of unstructured or unused data, commonly referred to as dark data. Generative AI, particularly Generative Adversarial Networks (GANs), plays a crucial role in manipulating datasets to overcome challenges such as limited data sets and class imbalance, thereby enhancing the performance of AI models and security systems. These technologies are instrumental in generating new, representative datasets for training and assessing AI-based detectors, especially in the context of cybersecurity threats like Advanced Persistent Threats (APTs). Blockchain technology, on the other hand, ensures the secure exchange and storage of data by providing a decentralized and tamper-proof ledger, facilitating trusted data sharing among different stakeholders. This is particularly important for securing data across various sectors, including air traffic management and the Internet of Things (IoT), where data integrity and trust are paramount. The integration of blockchain with AI enhances the security of data sharing and the overall network by ensuring data ownership, promoting trusted data exchange, and enabling the creation of intelligent security rules. Furthermore, the combination of these technologies addresses the challenges associated with dark data by enabling the secure and efficient utilization of this vast, untapped resource. By leveraging blockchain for secure data transactions and AI for data processing and analysis, it is possible to upgrade the value of dark data, providing more transparent and efficient results. 
This synergy not only enhances the performance of AI algorithms but also fosters a more secure and trustworthy digital ecosystem, facilitating the realization of a highly unified information society that seamlessly integrates cyber, physical, and social systems.
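The tamper-evidence property attributed to blockchain above can be illustrated with a toy hash chain — a deliberately minimal sketch, not a real blockchain (no consensus, no signatures; all data is invented):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block that commits to its payload and its predecessor's hash."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """A chain is valid only if every hash matches its block's contents and
    every block points at the hash of the block before it."""
    for i, block in enumerate(chain):
        body = json.dumps({"data": block["data"], "prev": block["prev"]},
                          sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False            # block contents were altered
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False            # link to the previous block is broken
    return True

genesis = make_block("record A", "0" * 64)
chain = [genesis, make_block("record B", genesis["hash"])]
print(verify(chain))       # the untouched chain verifies
chain[0]["data"] = "tampered"
print(verify(chain))       # any alteration breaks verification
```

Because each block's hash covers the previous block's hash, altering any earlier record invalidates every later link — the property that makes the ledger tamper-evident.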
Why is logging important for librarian in digital libraries?
5 answers
Logging is crucial for librarians in digital libraries as it provides valuable insights into user behavior and system performance. By analyzing log data, library managers can understand how members interact with subscribed services, aiding in decision-making processes such as subscription renewals. Additionally, log analysis helps in evaluating search behavior, identifying different types of user interactions, and predicting user needs based on metadata. Furthermore, website logs enable the reconstruction of library websites to enhance findability and usability for patrons. Log data also play a significant role in assessing the quality of services and documents provided by digital libraries, contributing to the evaluation of system quality and interoperability.
Essential considerations for video streaming platforms?
5 answers
Essential considerations for video streaming platforms encompass various aspects. The Practice Theory highlights the significance of analyzing consumption patterns, content recommendations, and user-generated data to understand individuality and identity formation. Operational networking issues play a crucial role in ensuring quality of experience during the delivery of high-bitrate media over the internet. In the performance of video streaming systems, the data transmission mechanism employed by servers significantly impacts video quality and presentation, with buffering occupancy being a key metric to consider. Hardware acceleration through integrated circuits with direct memory access and stream traffic management circuits enhances data stream control and exchange efficiency within the system. These elements collectively contribute to the seamless functioning and user experience of video streaming platforms.
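The role of buffering occupancy mentioned above can be illustrated with a toy playback-buffer simulation; all rates and thresholds are invented, and real players use far more elaborate logic:

```python
def simulate_buffer(download_per_step, playback_per_step, start_threshold):
    """Track playback-buffer occupancy (seconds of video) per time step.
    Playback starts once start_threshold seconds are buffered and stalls
    (rebuffering) whenever the buffer cannot cover one step of playback."""
    buffered, playing, stalls, history = 0.0, False, 0, []
    for downloaded in download_per_step:
        buffered += downloaded
        if not playing and buffered >= start_threshold:
            playing = True                      # enough buffered: start playing
        if playing:
            if buffered >= playback_per_step:
                buffered -= playback_per_step   # consume one step of video
            else:
                buffered, playing = 0.0, False  # stall: wait to rebuffer
                stalls += 1
        history.append(round(buffered, 2))
    return history, stalls

# A mid-stream bandwidth drop causes one stall before recovery.
history, stalls = simulate_buffer([2, 2, 0, 0, 0, 0, 3],
                                  playback_per_step=1, start_threshold=3)
print(history, stalls)
```

Tracking occupancy this way makes the trade-off concrete: a higher start threshold delays playback but absorbs longer bandwidth dips before a stall.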
Privacy virtual reality used in education for kids?
5 answers
Privacy in virtual reality (VR) used for educational purposes, especially for children with learning impediments, is a critical concern. The immersive nature of VR learning environments (VRLEs) such as vSocial poses security, privacy, and safety (SPS) threats that can adversely impact the educational experience. To address these risks, a novel risk assessment framework utilizing attack trees has been proposed to quantify threats and vulnerabilities in VRLEs. By converting attack trees into stochastic timed automata representations, statistical model checking can be employed to enhance security and privacy in social VRLEs. Implementing design principles such as hardening, diversity, and the principle of least privilege can significantly reduce the probability of security breaches and privacy leaks in VRLE systems.
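A drastically simplified stand-in for the attack-tree analysis described above (not the paper's stochastic timed automata method) is to propagate leaf probabilities up AND/OR nodes; the threat scenario and all probabilities below are invented:

```python
def attack_prob(node):
    """Evaluate a toy attack tree. Leaves carry success probabilities;
    an AND node needs every child attack, an OR node needs any one.
    Child events are assumed independent for simplicity."""
    if "p" in node:                       # leaf
        return node["p"]
    child_ps = [attack_prob(c) for c in node["children"]]
    if node["op"] == "AND":
        prob = 1.0
        for p in child_ps:
            prob *= p                     # all children must succeed
        return prob
    prob_all_fail = 1.0                   # OR: succeeds unless all fail
    for p in child_ps:
        prob_all_fail *= 1 - p
    return 1 - prob_all_fail

# Hypothetical VRLE threat: leak data by phishing an instructor OR by
# exploiting the server AND bypassing its access control (numbers invented).
tree = {"op": "OR", "children": [
    {"p": 0.1},                                          # phishing
    {"op": "AND", "children": [{"p": 0.3}, {"p": 0.2}]}, # exploit + bypass
]}
print(round(attack_prob(tree), 3))
```

Hardening a single leaf (lowering its probability) propagates directly into the root figure, which is what makes the tree useful for comparing countermeasures.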
Edge as a service reduce ownership cost?
5 answers
Edge as a Service (EaaS) can significantly reduce ownership costs by offering more efficient solutions for deploying IoT applications. Running security-focused IoT applications such as jamming detection at the Edge instead of in the Cloud yielded up to a 2.13-fold advantage in a Total Cost of Ownership (TCO) model. Additionally, operating servers under lower-than-nominal conditions can yield up to 9% power savings, enabling potential 100% gains in the TCO/area-coverage metric, effectively doubling the area covered at the same TCO. Furthermore, EaaS platforms can reduce data traffic to the cloud by up to 95% and cut application latency by 40%-60%, showcasing the cost-saving benefits of leveraging Edge computing.
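The TCO comparisons above reduce to simple arithmetic. A minimal sketch with invented figures (not the numbers from the cited studies) shows how an edge deployment's upfront hardware cost can be outweighed by lower recurring costs:

```python
def tco(capex, opex_per_year, years):
    """Total cost of ownership: upfront cost plus recurring cost over the horizon."""
    return capex + opex_per_year * years

# Invented five-year figures for a cloud vs. edge deployment.
cloud = tco(capex=0,    opex_per_year=12000, years=5)  # pay-as-you-go traffic/compute
edge  = tco(capex=8000, opex_per_year=4000,  years=5)  # buy hardware, cheaper to run
print(cloud, edge, round(cloud / edge, 2))             # edge wins despite the capex
```

The crossover point depends entirely on the horizon: over a short enough period, the capex-free cloud option would come out ahead.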
What is TOE framework and technology adoption in agriculture?
5 answers
The TOE (Technology-Organization-Environment) framework is utilized in one of the cited studies to identify drivers of and challenges to adopting MLOps tools in agriculture. It highlights that factors such as ML usage, performance drivers, and security significantly drive technology adoption, while the regulatory environment and organizational preparation influence it moderately. More broadly, technology adoption in agriculture emphasizes the integration of Artificial Intelligence (AI) and the Internet of Things (IoT) to revolutionize traditional farming methods. This integration enables smart and sustainable agriculture by enhancing resource management, reducing pesticide use, improving water efficiency, and optimizing crop health. The adoption of IoT and AI technologies in agriculture is crucial for addressing future challenges and meeting the increasing global demand for agricultural goods by 2050.
What are the existing approaches in visual analytics for health insurance?
4 answers
Existing approaches in visual analytics for health insurance include FraudAuditor, a three-stage visual analytics approach proposed to detect collusive fraud in health insurance. This approach integrates expert knowledge with interactive co-visit network construction, a community detection algorithm, and tailored visualizations to identify suspicious fraudulent groups effectively. Additionally, research on data breaches in healthcare provider settings emphasizes the importance of visual analytics and data visualization tools in studying breach occurrences, causes, and impacts, with recommendations for proactive measures to mitigate breaches. Furthermore, the UCReg approach presents a unified method for generating, evaluating, and applying regression models in high-dimensional datasets, particularly in health records analysis, highlighting the user's crucial role in model building and knowledge extraction.
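As a simplified stand-in for FraudAuditor's co-visit network construction and community detection — using plain connected components instead of a real community-detection algorithm, with invented records — the idea can be sketched as:

```python
from collections import defaultdict
from itertools import combinations

# Invented visit records: (patient, clinic, date). Patients who appear at the
# same clinic on the same day are linked in the co-visit network.
visits = [
    ("p1", "clinicA", "2023-01-05"), ("p2", "clinicA", "2023-01-05"),
    ("p2", "clinicB", "2023-01-09"), ("p3", "clinicB", "2023-01-09"),
    ("p4", "clinicC", "2023-02-01"),
]

def covisit_groups(records):
    """Link patients sharing a (clinic, date) visit, then return the
    connected components as candidate groups for closer inspection."""
    together = defaultdict(set)
    for patient, clinic, date in records:
        together[(clinic, date)].add(patient)
    adj = defaultdict(set)
    for group in together.values():
        for a, b in combinations(sorted(group), 2):
            adj[a].add(b)
            adj[b].add(a)
    seen, components = set(), []
    for patient, _, _ in records:         # depth-first search per component
        if patient in seen:
            continue
        stack, comp = [patient], set()
        while stack:
            node = stack.pop()
            if node not in comp:
                comp.add(node)
                stack.extend(adj[node] - comp)
        seen |= comp
        components.append(sorted(comp))
    return components

print(covisit_groups(visits))   # p1, p2, p3 are linked transitively; p4 is isolated
```

In the actual approach, a dedicated community-detection algorithm and expert-guided interactive visualization replace this naive component extraction.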
How to write the application from the start to the end?
4 answers
To write an application from start to end, one should consider various aspects such as the application's functionality, user interface, and performance optimization. Additionally, simulations and tracking studies can be utilized to ensure smooth operation and efficiency. It is crucial to focus on minimizing transverse emittance, energy spread, and space charge effects during the application's development. Incorporating correction schemes for trajectory adjustments and studying the impact of jitter sources on performance parameters are essential steps in the process. Moreover, understanding the application-specific semantics and the delegation of functions play a vital role in designing a reliable and functional application. By following these guidelines and utilizing advanced tools for simulations and optimizations, one can successfully develop an application from its inception to completion.