
Answers from top 8 papers

Our case study of server consolidation also shows that TMemCanal can improve the performance of memory-intensive services by up to 400%.
This presents an opportunity to greatly reduce server hardware cost by provisioning the right amount of memory reliability for different applications.
This therefore results in a failure to meet the memory capacity requirements of modern server and workstation applications.
Our experiments show that the optimal buffer allocation shifts to placing more memory at the server as the server has progressively less information about future frame sizes.
Proceedings article (DOI): Scott Pakin and G. Johnson, 17 Sep 2007, 20 citations
This paper represents the first study of memory-server performance at supercomputing scales.
For this reason, PCM cannot deliver satisfactory memory bandwidth for high-end computing environments such as multiprocessing and server systems.
Results indicate the DBM particles are completely dispersed in the putty.
This is possibly due to elastic memory effects consequent on moulding the putty materials.

See what other people are reading

What is the definition of mind?
5 answers
The mind can be defined as a cognitive structure that organizes various mental capacities along dimensions such as bodily sensation, cognition, and sometimes emotion, as indicated by research across diverse cultures. In the realm of advanced parallel computer architecture, MIND (Memory, Intelligence, and Network Device) stands out as a Processor-in-Memory (PIM) architecture that integrates DRAM bit cells and CMOS logic devices on the same silicon die, supporting global shared memory and efficient parallel execution models. Furthermore, the human mind is described as a functional capability of the brain, responsible for subjective interpretations based on neurocognitive processes, cognitive judgments, and emotional experiences, shaping individual actions and responses based on detected external events and changes over time and space. Integrating Western and Indian perspectives, the concept of mind is pivotal for understanding psychological processes, cognition, consciousness, self, and paranormal phenomena, offering complementary insights for a comprehensive understanding of human functioning and development.
What is impulsive adoption?
5 answers
Impulsive adoption refers to the spontaneous and immediate acceptance and utilization of a technology or application without much premeditation. In the context of mobile shopping apps in India, the Unified Theory of Acceptance and Use of Technology 2 (UTAUT2) model was extended to include impulsiveness as a construct in understanding the adoption of such apps. This study found that impulsiveness, along with other factors like performance expectancy, effort expectancy, hedonic motivation, habit, and behavioral intention, significantly influences the use behavior of mobile shopping apps. The addition of impulsiveness to the model highlights the impulsive nature of some users in quickly embracing and engaging with mobile shopping applications, showcasing the diverse factors influencing technology adoption in different contexts.
Why is fever hot?
5 answers
Fever is a physiological response that results from increased prostaglandin synthesis in the hypothalamus due to pyrogenic cytokines, signaling immune system activation. Fever can be caused by various factors such as infections, immune disorders, malignancy, and drug side effects, among others. It is a common reason for seeking pediatric care, with most cases being due to viral illnesses that require supportive care and parental education. Despite fever being a beneficial response aiding in clearing infections, concerns about fever, termed fever phobia, persist among parents and healthcare professionals, leading to unnecessary treatments and anxiety. The management of fever in children involves educating parents, preventing complications, and providing supportive care, emphasizing the importance of understanding the underlying causes and appropriate management strategies.
How many citations does Watson have?
5 answers
Watson and Crick's 1953 paper on the double helix structure of DNA has been cited extensively. Despite being cited over 2,000 times since 1961, Watson and Crick themselves have only referenced their own paper twice. The reasons for citing their work vary, with a significant proportion of citations being historical rather than actively using the findings. Additionally, a parallel version of Galton-Watson processes has been studied for generating tree-shaped structures, with over 49,000 citations attesting to the usefulness of random trees. Furthermore, the WATSON COST Action, initiated in 2020, aims to integrate scientific knowledge on stable isotopes in water to understand water dynamics in the Earth's Critical Zone, involving around 230 members from 37 European countries.
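As an aside on the Galton-Watson model mentioned above, the short Python sketch below simulates a basic, non-parallel branching process in which each individual has a Poisson-distributed number of children; the offspring distribution and parameter values are illustrative assumptions, not details from the cited work:

import math
import random

def sample_poisson(lam):
    # Knuth's method for drawing a Poisson random variate
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def galton_watson(mean_offspring, max_generations=20):
    # Population size per generation: every individual independently
    # produces a Poisson(mean_offspring) number of children.
    sizes = [1]                                    # generation 0: one ancestor
    while sizes[-1] > 0 and len(sizes) <= max_generations:
        sizes.append(sum(sample_poisson(mean_offspring) for _ in range(sizes[-1])))
    return sizes

if __name__ == "__main__":
    random.seed(0)
    print(galton_watson(0.9))   # subcritical: the lineage dies out almost surely
    print(galton_watson(1.5))   # supercritical: the lineage may keep growing

Each surviving path through the generations corresponds to a branch of the random tree that the process generates.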
What is the relationship between Shamatha or Samatha and obtaining or gaining Siddhis, Abhijñās or Magickal powers?
5 answers
Shamatha, also known as Samatha, plays a crucial role in the attainment of Siddhis or Abhijñās, which are considered as magical powers in various traditions. The practice of Shamatha involves cultivating a calm and focused mind through meditation, which is essential for developing the mental clarity and concentration required to access these extraordinary abilities. Siddhis are not only inherent in yogic practices but are also seen as a logical consequence of the metaphysical principles underlying traditions like Sāṃkhya/Yoga. Furthermore, the transcendental mental state known as Samadhi, closely related to Shamatha, is considered the pinnacle of human achievement and enlightenment, conferring transformative powers and immortality. Therefore, the relationship between Shamatha and gaining Siddhis or magical powers is deeply intertwined in various spiritual and philosophical traditions.
Which is better, BCA or B VOC, in software development?
5 answers
In software development, the B method has been widely utilized for safety-critical systems. On the other hand, the BC-BSP system has been proposed for large-scale data processing, showing higher efficiency compared to MapReduce-based applications when data fits in memory. The BC-VoD model, utilizing BC graphs and independent spanning trees, has demonstrated advantageous performance in VoD systems. While B method has proven successful in industrial projects like the METEOR automatic train control system, showing cost-effectiveness and reliability, the BC-BSP system offers scalability and fault tolerance for large-scale graph processing. Therefore, the choice between B method and BC-BSP depends on the specific requirements of the software project, with B method being suitable for safety-critical systems and BC-BSP for large-scale data processing.
In memory studies, how do scholars treat the plurality of memories?
5 answers
Scholars in memory studies approach the plurality of memories in various ways. They emphasize the importance of understanding different forms of memory, such as multidirectional memory, conflicting memory, divided memory, and fragmented memory. Memory researcher activists advocate for transparency, equality, justice, and pluralism, aiming to challenge singular victimhood-based identity politics and power-based knowledge production mechanisms. It is recognized that history and memory serve distinct functions in society, with historical science contributing to shaping the politics of memory while preserving multiplicity for a wide range of possible actions. Additionally, the exploration of survivor testimonies from Nazi concentration camps highlights the diverse subjectivities linked to memory, emphasizing the need for a nuanced understanding to prevent errors in memory treatment.
Discuss future memory trends and upcoming memory technologies. Provide online references
5 answers
Memory studies have evolved significantly, encompassing diverse phenomena like museums, memorials, and genocidal conflicts. Emerging nonvolatile memory (NVM) technologies like STTMRAM, PCRAM, and RRAM are gaining traction, with applications from last-level cache to storage-class memory. The memory system is becoming a hub for innovation, focusing on energy, security, and reliability alongside high density. ReRAM technologies are crucial for optimized neuromorphic hardware, offering low-power solutions for cognitive systems and emulating plasticity in spiking neural networks. Various emerging memory device technologies, including 3D integrated memory, phase-change RAM, and resistive RAM, present opportunities and challenges for future computer systems. These trends highlight a shift towards intelligent, energy-efficient, and innovative memory solutions in the evolving landscape of memory technologies.
Can memory compaction help improve system performance?
5 answers
Memory compaction plays a crucial role in enhancing system performance by addressing issues like memory fragmentation and improving memory utilization efficiency. Various approaches like FlatPack, CompactGpu, and innovative page compaction methods have been proposed to optimize memory management. These methods aim to reduce memory traffic, decrease memory fragmentation, and enhance memory defragmentation processes on different architectures, including GPUs. Memory compaction not only enables the creation of contiguous memory space but also contributes to reducing rearrangement time and improving system performance significantly. Additionally, data compaction techniques in memory systems, such as solid-state drives, further demonstrate the benefits of efficient data organization for enhancing overall system performance. Overall, memory compaction strategies prove to be instrumental in improving system performance by optimizing memory utilization and reducing memory-related bottlenecks.
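To make the idea concrete, here is a minimal Python sketch that slides live allocations toward the low end of a simulated heap so that the free space becomes one contiguous region; it is a toy illustration of compaction in general, not the FlatPack or CompactGpu algorithms cited above, and the heap layout is an assumption made for the example:

def compact(heap, free_marker=None):
    # Return a compacted copy of the heap plus a relocation map old_index -> new_index.
    relocation = {}
    compacted = []
    for old_index, cell in enumerate(heap):
        if cell is not free_marker:              # keep live cells, preserving order
            relocation[old_index] = len(compacted)
            compacted.append(cell)
    # All free cells now form a single contiguous tail.
    compacted.extend([free_marker] * (len(heap) - len(compacted)))
    return compacted, relocation

if __name__ == "__main__":
    # 'A'..'D' are live allocations; None marks a free (fragmented) cell.
    heap = ['A', None, 'B', None, None, 'C', None, 'D']
    new_heap, moves = compact(heap)
    print(new_heap)   # ['A', 'B', 'C', 'D', None, None, None, None]
    print(moves)      # {0: 0, 2: 1, 5: 2, 7: 3}

After compaction the heap can satisfy a four-cell contiguous request that the fragmented layout could not, which is the same benefit the surveyed techniques pursue at much larger scales.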
What is RAMP™ (Reconfigurable Analog Module Processor) technology in analog computers for cyber security?
10 answers
RAMP™ (Reconfigurable Analog Module Processor) technology represents a significant advancement in the field of analog computing, particularly in enhancing cybersecurity measures. This technology is built upon the foundation of Field-Programmable Analog Arrays (FPAAs) and incorporates various innovative features to secure intellectual property (IP) and protect against security vulnerabilities in analog and mixed-signal circuits. The core of RAMP technology involves the use of obfuscation techniques to mask the topology of analog circuits, thereby creating a large search space for potential attackers and making brute-force or topology attacks highly impractical. This is achieved by efficiently hiding the configuration of differential-mode and single-ended-mode analog circuits, which significantly enhances the security robustness of the system against unauthorized access or tampering.

Moreover, RAMP technology integrates dynamic protection mechanisms against security vulnerabilities through the use of authentication blocks within the signal chain. This ensures that only validated components can communicate, further fortifying the system's defense against potential breaches. The programmability aspect of RAMP, facilitated by configurable analog blocks (CABs) and configurable digital blocks (CDBs), allows for a wide range of analog functions and mixed-signal processing capabilities. This flexibility is crucial for adapting to various cybersecurity requirements without the need for dedicated hardware for each specific application. Additionally, RAMP technology supports the development of general-purpose analog signal processing systems through configurable integrator blocks (CIBs), enabling the implementation of a broad range of signal processing operations essential for cybersecurity applications.

The RAMP project also emphasizes the importance of community collaboration and open-source development for tackling the challenges of parallel processing, which is vital for advancing cybersecurity measures in analog computing. Lastly, the inclusion of memory modules with logic support for various access modes within the RAMP system underscores the technology's versatility and capability to handle complex data processing tasks, further enhancing its applicability in cybersecurity domains.

In summary, RAMP™ technology leverages the strengths of reconfigurable analog processing, advanced security features, and community-driven innovation to address the pressing cybersecurity challenges in analog computing environments.
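To give the "large search space" and authentication-block arguments a concrete shape, the toy Python sketch below shows how quickly an attacker's brute-force space grows with the number of configurable blocks and models component validation with an HMAC check; the block counts, wiring options, and key handling are illustrative assumptions, not parameters of any actual RAMP™ or FPAA product:

import hashlib
import hmac

def search_space(num_blocks, options_per_block):
    # Configurations an attacker must consider if every configurable analog
    # block can be wired in options_per_block distinct ways.
    return options_per_block ** num_blocks

def authenticate(component_id, tag, shared_key):
    # Sketch of an authentication block: only components presenting a valid
    # MAC over their identity are allowed onto the signal chain.
    expected = hmac.new(shared_key, component_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

if __name__ == "__main__":
    # Hypothetical device with 64 configurable blocks and 16 wiring options each.
    print(f"brute-force search space: {search_space(64, 16):.3e} configurations")
    key = b"device-shared-secret"                # placeholder secret for the example
    tag = hmac.new(key, b"sensor-frontend-7", hashlib.sha256).digest()
    print(authenticate(b"sensor-frontend-7", tag, key))   # True

With 64 blocks and 16 options each, the space already exceeds 10^77 configurations, which is why an obfuscated topology is described as impractical to recover by brute force.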
How have Felix Uchenna Samuel's contributions influenced the development of modern computing systems?
5 answers
Felix Uchenna Samuel's contributions have significantly impacted the development of modern computing systems. By proposing an in-memory implementation of fast and energy-efficient logic (FELIX) that combines Processing-in-Memory (PIM) with memories, Samuel introduced a novel approach to reduce data movement and latency in computing systems. This innovation enabled single-cycle operations directly in crossbar memory, enhancing efficiency and speed in processing tasks. Additionally, Samuel's work extended single-cycle operations to implement complex functions like XOR and addition in memory, achieving lower latency compared to existing techniques. These advancements in in-memory computing have paved the way for more energy-efficient and high-performance computing systems, aligning with the growing demand for ultra-low power ICs in the era of the Internet of Things.
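The short Python sketch below is a software model of the in-memory logic idea described here: XOR and addition are composed from simple bitwise primitives so that the result is produced where the operands already reside rather than after moving them to a processor. It illustrates only the logical composition, not the FELIX crossbar circuit itself, and the 8-bit row width is an assumption made for the example:

def mem_xor(a, b):
    # XOR composed from OR / AND / NOT style primitives, as an in-memory
    # array could evaluate across two stored rows; results kept to 8 bits.
    return (a | b) & ~(a & b) & 0xFF

def mem_add(a, b):
    # Addition built from the same primitives: XOR yields the sum bits and
    # AND-then-shift yields the carry bits, repeated until no carry remains.
    while b:
        carry = (a & b) << 1
        a = mem_xor(a, b)
        b = carry & 0xFF
    return a

if __name__ == "__main__":
    print(bin(mem_xor(0b11001010, 0b10100110)))   # 0b1101100
    print(mem_add(23, 42))                         # 65

Composing richer functions from a few bulk bitwise primitives is the essence of the latency argument: the operation completes inside the array instead of paying for a round trip over the memory bus.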