
Answers from top 12 papers

Papers (12): Insights
Network game servers, in which a huge number of users can play a common game through an interconnecting network, can be designed in a distributed system architecture with the benefits of scalability, reliability, and cost-effectiveness.
A disk array can provide high aggregate throughput, but only if the server can effectively balance the load on the disks.
The concept provides a framework for transforming NAS operations in line with global modernization trends.
NAS tests on different devices show that file-access connections over both wireless and LAN perform well, and that the resulting data center at the Computer Laboratory of the University of Technology Sumbawa facilitates cheap and efficient data storage.
Proceedings article: Lu Fan, Hamish Taylor, Phil Trinder, 19 Sep 2007 (60 citations)
In this framework, the functionalities of a traditional game server are distributed, capitalising on the potential of P2P networks, and enabling the MMOG to scale better in both communication and computation.
Hence, a Raspberry Pi can be used as a cloud server that serves as a storage device for real-time applications.
The preliminary performance evaluations show a significant improvement in speedup on a typical NAS OpenMP benchmark application.
Experimental results using multi-programmed NAS workloads suggest that the proposed methods can greatly reduce the effect of cache contention on multi-core systems.
Open-access proceedings article: 01 Oct 2019 (260 citations)
Encapsulation provides a unified view of NAS and randomly wired networks.
These tests indicate that X-NAS attains a quicker response time and higher throughput than a conventional single NAS, so its cost-performance scalability is also higher.
Proceedings article: 08 Dec 2009 (33 citations)
We demonstrate MPI-NeTSim's usefulness in analyzing the effect of the network on communication by using our environment to study the impact of a slow link on the NAS benchmarks.
Both of our models offer superior performance compared with other concurrent work on one-shot NAS.

See what other people are reading

How do instant transfer methods differ from traditional transfer methods in terms of security and fraud prevention measures?
5 answers
Instant transfer methods differ from traditional transfer methods in security and fraud prevention by leveraging real-time capabilities to strengthen transaction security. Traditional methods relied on static rules and could take days to process a transaction, whereas instant methods use automated systems for interbank transfers that avoid revealing sensitive account details, reducing the risk of fraud. Instant communication protocols additionally incorporate security extensions to prevent data leakage and improve overall security. In the IoT setting, instant encrypted transmission schemes have been proposed to protect users' data during rapid IoT data exchanges. These advances prioritize real-time fraud detection and secure data transmission, setting instant transfer methods apart from traditional, slower processes.
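As a concrete illustration of the real-time screening such systems rely on, here is a minimal Python sketch of a velocity check that flags a burst of instant transfers from one account. The window length, threshold, and account identifier are illustrative assumptions, not values from any of the cited papers.

```python
import time
from collections import defaultdict, deque

# Toy real-time velocity check: hold an account that issues too many
# instant transfers within a short window (thresholds are assumptions).
WINDOW_S = 60
MAX_TRANSFERS = 5

recent = defaultdict(deque)  # account id -> timestamps of recent transfers

def screen_transfer(account, now=None):
    """Return True if the transfer may proceed, False to hold it for review."""
    now = time.time() if now is None else now
    q = recent[account]
    while q and now - q[0] > WINDOW_S:
        q.popleft()                    # drop events outside the window
    if len(q) >= MAX_TRANSFERS:
        return False                   # burst of transfers: hold for review
    q.append(now)
    return True

if __name__ == "__main__":
    # The sixth and seventh transfers within the same minute are held.
    for i in range(7):
        print(i, screen_transfer("acct-42", now=1000.0 + i))
```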
How to plan a disassembly process?
5 answers
To plan a disassembly process efficiently, human-robot collaboration (HRC) and human behavior prediction (HBP) are crucial factors to consider. By utilizing a sequence planner that assigns tasks in real-time between a human operator and a robot, challenges such as uncertainties, computational costs, and safety interruptions can be overcome. The disassembly planning method should focus on the flexibility of humans in dealing with complex tasks and the accuracy of robots, while targeting components based on remanufacturability parameters for increased efficiency. Additionally, addressing model variations, physical uncertainties, and incomplete product information through a disassembly sequence planner that distributes tasks between humans and robots in a collaborative setting can automate the process effectively. Considering uncertainties in disassembly time, cost, and effort through stochastic optimization algorithms and multiobjective genetic algorithms can lead to a successful disassembly sequence under uncertain conditions.
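To make the last point concrete, below is a minimal Python sketch of an evolutionary search for a disassembly sequence under stochastic task times. The component names, precedence constraints, priority weights, and cost model are all assumptions invented for this illustration, not data from the cited studies.

```python
import random

# Hypothetical 6-component product: mean removal times, priority weights
# (remove high-value or hazardous parts early), and precedence constraints.
TASKS = ["cover", "battery", "fan", "board", "screen", "frame"]
MEAN_TIME = {"cover": 2, "battery": 5, "fan": 3, "board": 8, "screen": 4, "frame": 6}
PRIORITY = {"battery": 5, "board": 3}
PRECEDENCE = [("cover", "battery"), ("cover", "board"), ("board", "fan")]

def feasible(seq):
    # A sequence is feasible if every pair (a before b) is respected.
    pos = {t: i for i, t in enumerate(seq)}
    return all(pos[a] < pos[b] for a, b in PRECEDENCE)

def expected_cost(seq, samples=40):
    """Mean priority-weighted completion time under stochastic task times."""
    if not feasible(seq):
        return float("inf")            # penalise precedence violations
    total = 0.0
    for _ in range(samples):
        clock, cost = 0.0, 0.0
        for t in seq:
            clock += random.triangular(0.8, 1.5, 1.0) * MEAN_TIME[t]
            cost += PRIORITY.get(t, 1) * clock
        total += cost
    return total / samples

def mutate(seq):
    # Swap two positions to produce a neighbouring sequence.
    a, b = random.sample(range(len(seq)), 2)
    child = list(seq)
    child[a], child[b] = child[b], child[a]
    return child

def evolve(generations=200, pop_size=30):
    pop = [random.sample(TASKS, len(TASKS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=expected_cost)
        survivors = pop[: pop_size // 2]   # truncation selection
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return min(pop, key=expected_cost)

if __name__ == "__main__":
    best = evolve()
    print("best sequence:", best, "expected cost: %.1f" % expected_cost(best))
```

A multiobjective variant would keep a Pareto front over time, cost, and effort instead of collapsing them into one weighted score.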
What are some internet of things devices?
5 answers
Internet of Things (IoT) devices encompass a wide range of interconnected physical objects controlled through embedded software applications. These devices include home appliances, office equipment, industrial machines, motor vehicles, and more. Specifically, IoT devices are increasingly utilized in various fields such as healthcare monitoring, smart homes, industrial applications, agriculture, city infrastructure, and remote monitoring. The IoT landscape is rapidly expanding, with an estimated 30 billion connected devices currently, expected to double in the next four years, with a significant portion being wirelessly powered machine-to-machine devices. These devices play a crucial role in enhancing surveillance, monitoring, data collection, and remote control capabilities across diverse sectors, illustrating the vast potential and versatility of IoT technology in modern society.
What does Arduino have to do with IoT?
5 answers
Arduino plays a significant role in the Internet of Things (IoT) by enabling the development of innovative systems for environmental monitoring, air pollution detection, and alcohol detection. Arduino-based IoT systems utilize low-cost microcontrollers, sensors, and wireless communication modules to collect real-time data. These systems can monitor various environmental factors like temperature, humidity, air quality, and alcohol levels. Arduino's open-source nature encourages community-driven development, customization, and collaboration in creating IoT solutions. However, the widespread use of Arduino in IoT projects raises security concerns due to the lack of deep knowledge in ICT security among hobbyist programmers. Overall, Arduino serves as a versatile platform for implementing IoT applications in diverse fields, ranging from environmental monitoring to road safety enhancement.
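On the host side, a few lines of Python are enough to collect such readings. The sketch below assumes a hypothetical Arduino program that prints one JSON object per line over USB serial; the port name, baud rate, and field names are assumptions, and the pyserial package is required.

```python
import json
import time

import serial  # pyserial; assumes an Arduino streaming JSON lines over USB

PORT = "/dev/ttyUSB0"   # hypothetical port; e.g. COM3 on Windows
BAUD = 9600             # must match Serial.begin() in the Arduino program

def read_samples(n=10):
    """Collect n sensor readings such as {"temp_c": 24.1, "humidity": 51}."""
    with serial.Serial(PORT, BAUD, timeout=2) as link:
        time.sleep(2)   # many Arduinos reset when the serial port opens
        samples = []
        while len(samples) < n:
            line = link.readline().decode("utf-8", errors="ignore").strip()
            if not line:
                continue
            try:
                samples.append(json.loads(line))
            except json.JSONDecodeError:
                continue    # skip partial or malformed lines
        return samples

if __name__ == "__main__":
    for reading in read_samples():
        print(reading)
```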
How does a resource-based view approach influence the innovation process?
5 answers
A resource-based view (RBV) approach influences the innovation process by emphasizing the importance of resources, capabilities, and competencies within an organization. RBV suggests that firms with diverse sets of stakeholders and access to diverse information sources are more likely to develop different types of innovations, such as eco-innovations and product/process innovations. Additionally, RBV strategy and entrepreneurial orientation, when combined with a focus on innovation, mediate competitive advantage in Small and Medium Enterprises (SMEs). Moreover, in technology-based start-ups, RBV highlights the significance of technological capabilities, entrepreneurship, and technological competitiveness in driving technological innovation. Therefore, adopting an RBV approach can enhance a firm's innovation potential by leveraging its internal resources and external relationships.
How did Rittel and Webber's work contribute to the field of transport planning?
5 answers
Rittel and Webber's work significantly influenced the field of transport planning by introducing the concept of wicked problems, which are complex, interconnected issues that are challenging to solve due to their ambiguous nature and involvement of multiple stakeholders. Their framework emphasized the importance of considering diverse perspectives and the dynamic nature of planning processes, aligning with the idea that transport planning is inherently a political practice that must address social diversity and equity concerns. Additionally, their approach highlighted the need for comprehensive and integrated transport management to address the multifaceted challenges posed by high traffic volumes, environmental impacts, and the necessity for sustainable transport solutions. By acknowledging the intricate nature of transport systems and the necessity for inclusive and holistic planning strategies, Rittel and Webber's work laid the foundation for a more nuanced and effective approach to transport planning.
What is contract management software?
5 answers
Contract management software refers to technology-based platforms designed to streamline decision-making, planning, control, and evaluation of business performance related to contracts. Traditional manual contract management processes often lack clarity and efficiency, leading to resource wastage. To address this, electronic contract management systems have been developed, incorporating features like private key logins for enhanced security. These systems typically include components such as user terminals, partner terminals, platform servers, and various processing parts to ensure effective contract electronization, signing, and security. Additionally, advancements in contract management software have introduced features like standby modules for improved operational efficiency and continuity in case of server failures. Overall, contract management software plays a crucial role in enhancing operational effectiveness and security in managing contracts within organizations.
How is Big Data applied to astronomical surveys?
5 answers
Big Data is revolutionizing astronomical surveys by handling the massive volumes of diverse data generated. Astronomical studies, like photometric reverberation mapping, utilize Big Data to measure time delays between light curves in different bands to determine the accretion disc size in Active Galactic Nuclei. Machine learning algorithms, such as Artificial Neural Networks and Random Forest, are applied using Apache Spark to process large astronomical datasets for tasks like photometric redshift estimation. The rise of Big Data in astronomy is evident from the exponential growth in data size, with archives containing over 100 terabytes of data and billions of detected sources. Polystore systems are proposed to integrate heterogeneous data in astronomy, addressing the challenge of managing unstructured data from various sources.
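As a small illustration of the machine-learning side, here is a Python sketch of Random Forest photometric redshift estimation using scikit-learn on synthetic magnitudes. The cited work runs at survey scale on Apache Spark; the toy color-to-redshift relation below is purely an assumption for the demo.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for survey photometry: five broadband magnitudes
# (think u, g, r, i, z) with an invented mapping from one color to redshift.
rng = np.random.default_rng(0)
mags = rng.uniform(16, 25, size=(5000, 5))
z_true = 0.4 * (mags[:, 1] - mags[:, 3]) + 0.05 * rng.standard_normal(5000)

X_train, X_test, y_train, y_test = train_test_split(mags, z_true, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))  # fit quality on held-out data
```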
What is the kirkpatrick model?
5 answers
The Kirkpatrick Model is a comprehensive evaluation framework used in various educational settings, such as Business English training, statistics courses, and nursing education. It consists of four interrelated levels: reaction, learning, behavior, and results. The model assesses the effectiveness of training programs by evaluating learners' responses, knowledge acquisition, behavioral changes, and overall outcomes. For instance, in a study of a community-based sport psychology program, the model was used to measure young people's reactions, their learning outcomes, and the relationship between the two. Kirkpatrick's model provides a structured approach to assessing the quality of education and training initiatives, emphasizing the importance of understanding learners' needs, assessing training delivery, and ensuring continuous improvement in educational practices.
What is visualization?
5 answers
Visualization is the process of making invisible or complex information visible through various means such as graphics, images, and multimedia. It plays a crucial role in fields like scientific computing, education, media communication, and cultural studies. Visualization aids in understanding behavior, data analysis, and enhancing communication by appealing to the visual senses of the audience. In educational settings, visualization enhances students' comprehension, critical thinking skills, and ability to create schematic maps. Moreover, visualization techniques are essential in extracting meaningful insights from vast amounts of data in scientific and computational research. The evolution of visualization methods has led to innovative approaches like direct visualization, which involves creating visual representations directly from media objects, benefiting fields like humanities and cultural institutions.
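A tiny Python example makes the core claim tangible: a trend that is invisible in a raw array of numbers becomes obvious once plotted. The signal here is synthetic, invented for the demo.

```python
import matplotlib.pyplot as plt
import numpy as np

# The same noisy measurements, as raw numbers they look like noise;
# plotted, the underlying sinusoidal trend is immediately visible.
rng = np.random.default_rng(1)
t = np.linspace(0, 4 * np.pi, 200)
signal = np.sin(t) + 0.3 * rng.standard_normal(t.size)

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(t, signal, lw=0.8, label="noisy samples")
ax.plot(t, np.sin(t), lw=2, label="underlying trend")
ax.set_xlabel("time")
ax.set_ylabel("amplitude")
ax.legend()
fig.tight_layout()
fig.savefig("visualization_demo.png")
```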
What is inference latency?
5 answers
Inference latency refers to the time taken for a model to make predictions or decisions after receiving input. It is a crucial metric in deep learning, especially on mobile and edge devices. Various challenges, such as hardware heterogeneity and runtime optimizations, make accurately predicting inference latency difficult. To address this, researchers have developed innovative systems like nn-Meter, which predicts DNN model inference latency accurately by dividing the process into kernels and utilizing adaptive sampling. Additionally, new approaches in neural architecture search aim to increase concurrency and distribution opportunities to reduce inference latency, highlighting the importance of exploring new architectural spaces. Techniques like channel-wise packing and structured pruning have also been proposed to shorten inference latency while maintaining acceptable accuracy levels in homomorphic encryption-enabled CNNs.
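A minimal Python sketch of how inference latency is typically measured: run warm-up iterations first (to populate caches and trigger any lazy initialization), then time repeated forward passes and report median and tail latency. The "model" here is a stand-in matrix computation, not any particular DNN or the nn-Meter method.

```python
import statistics
import time

import numpy as np

def fake_model(x):
    # Stand-in for a real network's forward pass (two matrix multiplies).
    w = fake_model.weights
    return np.tanh(x @ w) @ w.T

fake_model.weights = np.random.rand(512, 512).astype(np.float32)

def measure_latency(model, batch, warmup=10, runs=100):
    """Return (median, p95) per-inference latency in milliseconds."""
    for _ in range(warmup):              # untimed warm-up iterations
        model(batch)
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        model(batch)
        times.append((time.perf_counter() - start) * 1e3)
    times.sort()
    return statistics.median(times), times[int(0.95 * len(times))]

if __name__ == "__main__":
    x = np.random.rand(1, 512).astype(np.float32)
    med, p95 = measure_latency(fake_model, x)
    print(f"median {med:.3f} ms, p95 {p95:.3f} ms")
```

Reporting a tail percentile alongside the median matters because on real devices contention and frequency scaling make the slowest runs much worse than the typical one.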