
Showing papers in "Computing in Science and Engineering in 2017"


Journal ArticleDOI
TL;DR: The basic ideas of blockchain are reviewed and a sample minimalist implementation in Python is presented.
Abstract: Blockchain is a new technology, based on hashing, which is at the foundation of the platforms for trading cryptocurrencies and executing smart contracts. This article reviews the basic ideas of this technology and provides a sample minimalist implementation in Python.
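
The article's own Python code is not reproduced on this page; purely as an illustration of the hash-chaining idea the abstract describes, a minimal sketch might look like the following (the block fields and the SHA-256 choice are assumptions, not the paper's implementation).

```python
# Illustrative sketch only -- not the article's implementation.
# A minimal hash-chained ledger: each block stores the hash of its
# predecessor, so altering any block invalidates all later hashes.
import hashlib
import json
import time


def block_hash(block):
    """Deterministically hash a block's contents (sorted keys for stability)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def new_block(data, prev_hash):
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}


# Build a tiny chain: a genesis block followed by two data blocks.
chain = [new_block("genesis", "0" * 64)]
for item in ["tx: alice->bob 5", "tx: bob->carol 2"]:
    chain.append(new_block(item, block_hash(chain[-1])))


def is_valid(chain):
    """Verify that every block references the hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )


print(is_valid(chain))          # True
chain[1]["data"] = "tampered"   # any edit breaks the chain
print(is_valid(chain))          # False
```

Because each block embeds the hash of its predecessor, changing any historical block invalidates every later link, which is the property the trading and smart-contract platforms mentioned above build on.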

240 citations


Journal ArticleDOI
TL;DR: The end of Moore's law may be the best thing that has happened in computing since the beginning of Moore's law and should enable a new era of creativity by encouraging computer scientists to invent new biologically inspired paradigms, implemented on emerging architectures.
Abstract: The end of Moore's law may be the best thing that has happened in computing since the beginning of Moore's law. Confronting the end of an epoch should enable a new era of creativity by encouraging computer scientists to invent new biologically inspired paradigms, implemented on emerging architectures, with hybrid circuits and systems that combine the best of scaled silicon CMOS with new devices, physical interactions and materials.

117 citations


Journal ArticleDOI
TL;DR: Having demonstrated scalability and programmability, neuromorphic engineers now seek to encode continuous signals with spike trains in a manner that promises greater energy efficiency than all-analog or all-digital computing across a five-decade precision range.
Abstract: As transistors shrink to nanoscale dimensions, trapped electrons--blocking "lanes" of electron traffic--are making it difficult for digital computers to work. In stark contrast, the brain works fine with single-lane nanoscale devices that are intermittently blocked (ion channels). Conjecturing that it achieves error-tolerance by combining analog dendritic computation with digital axonal communication, neuromorphic engineers (neuromorphs) began emulating dendrites with subthreshold analog circuits and axons with asynchronous digital circuits in the mid-1980s. Three decades in, researchers achieved a consequential scale with Neurogrid--the first neuromorphic system that has billions of synaptic connections. Researchers then tackled the challenge of mapping arbitrary computations onto neuromorphic chips in a manner robust to lanes intermittently--or even permanently--blocked by trapped electrons. Having demonstrated scalability and programmability, they now seek to encode continuous signals with spike trains in a manner that promises greater energy efficiency than all-analog or all-digital computing across a five-decade precision range.

96 citations


Journal ArticleDOI
TL;DR: A hybrid approach that uses agents for IoT modeling and OMNeT for simulation is proposed, providing mapping guidelines between the agent paradigm and the OMNeT simulator abstractions.
Abstract: The focus of the Internet has recently shifted from current computers and mobile devices to everyday objects, people, and places; consequently, the Internet of Things (IoT) promises to be not only a compelling vision but the actual driving force of the upcoming fourth Industrial Revolution. Novel cyber-physical, customized, and highly pervasive services are impacting our lives, involving several stakeholders and fostering an unseen globally interconnected ecosystem. However, IoT system development is a multifaceted process that’s complex, error-prone, and time-consuming. Although modeling and simulation are crucial aspects that could effectively support IoT system development, an integrated approach synergistically providing both of them is lacking. The authors propose a hybrid approach that uses agents for IoT modeling and OMNeT for simulation, providing mapping guidelines between the agent paradigm and the OMNeT simulator abstractions. The proposed approach has been applied in small-, medium-, and large-scale IoT scenarios, where relevant performance indexes of IoT entities communication have been measured and analyzed.

84 citations


Journal ArticleDOI
TL;DR: The US Department of Energy's Exascale Computing Project is a partnership between the DOE Office of Science and the National Nuclear Security Administration and its mission is to transform today's high-performance computing ecosystem by executing a multifaceted plan.
Abstract: The US Department of Energy's (DOE) Exascale Computing Project is a partnership between the DOE Office of Science and the National Nuclear Security Administration. Its mission is to transform today's high-performance computing (HPC) ecosystem by executing a multifaceted plan: developing mission-critical applications of unprecedented complexity; supporting US national security initiatives; partnering with the US HPC industry to develop exascale computer architectures; collaborating with US software vendors to develop a software stack that is both exascale-capable and usable on US industrial- and academic-scale systems; and training the next-generation workforce of computer and computational scientists, engineers, mathematicians, and data scientists.

84 citations


Journal Article
TL;DR: It is concluded that systematic evaluation of educational programs not only allows for the appraisal of instructional effectiveness but also allows for progressive refinement of educational initiatives.
Abstract: Of the many interventions that might be used to improve the responsible conduct of research, educational interventions are among the most frequently employed. However, educational interventions come in many forms and have proven of varying effectiveness. Recognition of this point has led to calls for the systematic evaluation of responsible conduct of research educational programs. In the present effort, the basic principles underlying evaluation of educational programs are discussed. Subsequently, the application of these principles in the evaluation of responsible conduct of research educational programs is described. It is concluded that systematic evaluation of educational programs not only allows for the appraisal of instructional effectiveness but also allows for progressive refinement of educational initiatives. Ethics in the sciences and engineering is of concern not only because of its impact on progress in the research enterprise but also because the work of scientists and engineers impacts the lives of many people. Recognition of this point has led to a number of initiatives intended to improve the ethical conduct of investigators (National Academy of Engineering, 2009; National Institute of Medicine, 2002; National Academy of Sciences, 1992). Although a number of interventions have been proposed as a basis for improving ethical conduct, for example development of ethical guidelines, open data access, and better mentoring, perhaps the most widely applied approach has been ethics education (Council of Graduate Schools, 2012), an intervention often referred to as training in the responsible conduct of research (RCR). When one examines the available literature on RCR training, it is apparent that a wide variety of approaches have been employed. Some RCR courses are based on a self-paced, online, instructional framework (e.g. Braunschweiger and Goodman, 2007). Other RCR courses involve face-to-face instruction over longer periods of time using realistic exercises and cases (e.g. Kligyte, Marcy, Waples, Sevier, Godfrey, Mumford, and Hougen, 2008).

Committee note: As the committee launched this study, members realized that questions related to the effectiveness of Responsible Conduct of Research education programs and how they might be improved were an essential part of the study task. A significant amount of work has been done to explore these topics. This work has yielded important insights, but additional research is needed to strengthen the evidence base relevant to several key policy questions. The committee asked one of the leading researchers in this field, Michael D. Mumford, to prepare a review characterizing the current state of knowledge and describing future priorities and pathways for assessing and improving RCR education programs. The resulting review constitutes important source material for Chapter 10 of the report. The committee also believes that the review adds value to this report as a standalone document, and is including it as an appendix.

42 citations


Journal ArticleDOI
TL;DR: The authors survey transparent computing and indicate future directions, from traditional terminals to mobile devices; by adopting this computing paradigm, user terminals become more lightweight, with enhanced security, improved energy efficiency, and cross-platform capability.
Abstract: With the emergence of cloud computing, big data, and mobile networks, the computing paradigm and related technologies have experienced significant development over the past 10 years. Concomitantly, the prevalence of smartphones, wearable devices, and mobile applications has been constantly changing our daily lives. Terminals are evolving toward lightweight, intelligent, highly secure, and convenient devices. As an emerging computing paradigm, cloud computing focuses primarily on providing services through servers and networks but without addressing the inherent challenges and concerns of user terminals, such as energy efficiency, security, and cross-platform compatibility. Consequently, these challenges remain in the era of cloud computing and big data. In this article, the authors present a review of a promising computing paradigm: transparent computing. Similar to cloud computing, transparent computing stores software and user data at specific servers. More specifically, it extends the bus transmission in traditional computer architecture to the network. User interruptions at a terminal are redirected to a specific server through a network connection to request the corresponding instructions and data, which are subsequently executed at the terminal in a page-streaming pattern. By adopting this computing paradigm, user terminals are becoming more lightweight (nearly bare) with enhanced security (dumping after using), improved energy efficiency (no terminal storage), and cross-platform capability (low layer compatibility). This article presents a comprehensive survey and indicates future directions of transparent computing, from traditional terminals to mobile devices.
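
To make the page-streaming idea concrete, here is a hedged sketch of a "nearly bare" terminal that fetches and executes code pages on demand; the in-memory server_lookup stand-in, the page names, and the tiny cache are hypothetical illustrations, not the article's protocol.

```python
# Illustrative sketch of the page-streaming idea described above; the
# "server" here is an in-memory dict standing in for a remote store
# (a hypothetical placeholder, not the article's protocol).
SERVER_PAGES = {
    "app/page0": b"print('hello from streamed page 0')",
    "app/page1": b"print('hello from streamed page 1')",
}


def server_lookup(page_id):
    """Stand-in for a network request to the transparent-computing server."""
    return SERVER_PAGES[page_id]


class Terminal:
    """A nearly bare terminal: no persistent storage, only a small cache."""

    def __init__(self, cache_size=1):
        self.cache = {}
        self.cache_size = cache_size

    def fetch(self, page_id):
        if page_id not in self.cache:
            if len(self.cache) >= self.cache_size:
                self.cache.pop(next(iter(self.cache)))  # evict oldest entry
            self.cache[page_id] = server_lookup(page_id)
        return self.cache[page_id]

    def run(self, page_id):
        # Pages are fetched on demand and executed locally, never installed.
        exec(compile(self.fetch(page_id), page_id, "exec"))


t = Terminal()
t.run("app/page0")
t.run("app/page1")
```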

40 citations


Journal ArticleDOI
TL;DR: Levels of Student Participation and Stages of Relevant Curriculum are developed to help POGIL faculty make their classrooms more inclusive and their curriculum more relevant.
Abstract: Seven instructors at five institutions adopted process-oriented guided inquiry learning (POGIL) activities for their first-year courses. These POGIL activities were designed to prompt students to reflect on the relevance of the curriculum to their own lives. Students were significantly more comfortable with computers after taking the POGIL courses, even compared to students taking the same course from the same instructors. However, there was no overall effect on students' interest in taking more CS classes. Based on these findings, the authors developed "Levels of Student Participation and Stages of Relevant Curriculum" to help POGIL faculty make their classrooms more inclusive and their curriculum more relevant. Follow-up interviews with seven instructors demonstrated a marked increase in their plans to make their course content relevant to students.

35 citations


Journal ArticleDOI
TL;DR: A range of new algorithmic techniques is emerging in the context of exascale computing; many of them defy the common wisdom of high-performance computing and are considered unorthodox, but they could turn out to be a necessity in the near future.
Abstract: On the eve of exascale computing, traditional wisdom no longer applies. High-performance computing is gone as we know it. This article discusses a range of new algorithmic techniques emerging in the context of exascale computing, many of which defy the common wisdom of high-performance computing and are considered unorthodox, but could turn out to be a necessity in the near future.

34 citations


Journal ArticleDOI
TL;DR: In recent years, both cost decline rates and other economically valuable technical benefits flowing from use of the newest chip-making technology seem to have substantially diminished.
Abstract: Moore's law in the semiconductor industry came to mean regular, predictable introductions of new manufacturing technology that enabled denser electronics. Adoption of new manufacturing technology, and its effect on silicon wafer-processing costs, determined how rapidly transistor manufacturing costs declined. In recent years, both cost decline rates and other economically valuable technical benefits flowing from use of the newest chip-making technology seem to have substantially diminished.

24 citations


Journal ArticleDOI
TL;DR: This article not only links computational thinking skills to fundamental cognitive competencies but also describes pedagogical tools that have proven effective in teaching them at early ages.
Abstract: A decade of discourse to capture the essence of computational thinking has resulted in a broad set of skills whose teaching continues to pose challenges because of the reliance on the use of electronic computers and programming concepts. This article not only links computational thinking skills to fundamental cognitive competencies but also describes pedagogical tools that have proven effective in teaching them at early ages.

Journal ArticleDOI
TL;DR: In this paper, the authors describe findings from analysis of 64 in-depth interviews with young women who in high school expressed interest in computing by applying for the National Center for Women & Information Technology's (NCWIT) Award for Aspirations in Computing.
Abstract: Previous research has suggested that access and exposure to computing, social supports, preparatory privilege, a sense of belonging in computing, and a computing identity all contribute to women pursuing computing as a field of study or intended career. What we know less about is what keeps young women persisting in computing despite the obstacles they encounter. This article describes findings from analysis of 64 in-depth interviews with young women who in high school expressed interest in computing by applying for the National Center for Women & Information Technology's (NCWIT) Award for Aspirations in Computing. The dataset includes award winners and nonwinners, some of whom have persisted in computing and some who have not. The authors' findings suggest that multiple, redundant supports, including community reinforcement, as well as a bolstered sense of identity/belonging, may make the difference in who persists and who does not.

Journal ArticleDOI
TL;DR: This new phase, called 3D power scaling, will continue to support increases in transistor count at Moore's law pace well into the next decade.
Abstract: The combination of Moore's law and Dennard's scaling laws constituted the fundamental guidelines that provided the semiconductor industry with the foundation of a technically and economically viable roadmap until the end of the previous century. During this time, the transistor evolved from bipolar to PMOS to NMOS to CMOS. However, by the mid-1990s it was clear that fundamental physical limits of the MOS transistor were going to halt Dennard's scaling by 2005, at best. Eventually the power consumed by a single IC, under the relentless growth in the number of transistors and the continuous increase in operating frequency, pushed the IC temperature beyond reliable limits. Several new memory devices operating on completely new physics principles have been invented in the past 10 years and have already demonstrated that computing performance can be substantially improved by monolithically integrating several of these new heterogeneous memory layers on top of logic layers powered by a combination of CMOS and "new switch" transistors. This new phase, called 3D power scaling, will continue to support increases in transistor count at Moore's law pace well into the next decade.

Journal ArticleDOI
TL;DR: Improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory are reported.
Abstract: Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.

Journal ArticleDOI
TL;DR: The TAEG software did not consistently yield accurate results; its results varied significantly based on the specific bridge configuration and loading conditions, most likely because of differences between field conditions and the idealized conditions used by the software for analysis.
Abstract: An investigation was performed to assess the performance of the Torsional Analysis of Exterior Girders (TAEG) software in predicting rotation of exterior bridge girders due to eccentric construction loads associated with placement of the concrete deck. TAEG is frequently used by bridge contractors and designers to assess exterior girder rotation and design restraint systems. The primary objective was to investigate whether the program yielded consistent and reliable rotation results as compared to field measurements taken during bridge deck placements. The TAEG software was used to analyze four bridge construction projects, and the results were compared to field measurements and results obtained from a finite element analysis (FEA). A secondary objective was to investigate how the program utilizes different bridge and construction parameters. Based on rotations measured in the field and determined using the finite element modeling, the program did not consistently yield accurate results; its results varied significantly based on the specific bridge configuration and loading conditions. This inconsistency was most likely a result of differences between field conditions and the idealized conditions used by the software for analysis.

Journal ArticleDOI
TL;DR: The author shows, in comparison to the ideal case, that a relatively simple protocol based on the DEVS abstract simulator is both generally applicable and able to achieve significant speedup in parallel distributed simulation.
Abstract: The Discrete Event Systems Specification (DEVS) formalism has been widely disseminated in this magazine and elsewhere for its applicability to computational science and engineering. While research in parallel and distributed simulation has been active in the past several decades, the utility of many techniques for high-performance DEVS simulation has been limited. Recent research has shown that a reconstruction of Amdahl's and Gustafson's laws for parallelizing sequential code can afford a better understanding of the underlying principles and their application to simulation of DEVS models. In this article, the author shows, in comparison to the ideal case, that a relatively simple protocol based on the DEVS abstract simulator is both generally applicable and able to achieve significant speedup in parallel distributed simulation.
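
For reference, the textbook forms of the two laws the abstract mentions are given below (the article's contribution is a DEVS-specific reconstruction of them, which is not reproduced here); p is the parallelizable fraction of the work and N the number of processors.

```latex
% Amdahl's law: speedup for a fixed problem size
S_{\mathrm{Amdahl}}(N) = \frac{1}{(1 - p) + p/N}

% Gustafson's law: speedup when the problem size scales with N
S_{\mathrm{Gustafson}}(N) = (1 - p) + pN
```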

Journal ArticleDOI
TL;DR: An introductory overview of the graph Fourier transform and of graph wavelet transforms based on dictionaries of graph spectral filters, namely spectral graph wavelet transforms, is presented.
Abstract: The conventional wavelet transform is widely used in image and signal processing, where a signal is decomposed into a combination of known signals. By analyzing the individual contributions, the behavior of the original signal can be inferred. In this article, the authors present an introductory overview of the extension of this theory into graph domains. They review the graph Fourier transform and graph wavelet transforms that are based on dictionaries of graph spectral filters, namely, spectral graph wavelet transforms. Then, the main features of the graph wavelet transforms are presented using real and synthetic data.
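
As a concrete illustration of the graph Fourier transform the authors review, the following minimal NumPy sketch diagonalizes the graph Laplacian of a small path graph and applies one spectral filter; the graph, the signal, and the heat-like kernel are illustrative assumptions, not the article's examples.

```python
# Minimal sketch (not the article's code): graph Fourier transform via the
# eigendecomposition of the graph Laplacian, plus one illustrative spectral
# filter. The 4-node path graph and the heat-like kernel are assumptions.
import numpy as np

# Adjacency matrix of a 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian L = D - A
eigvals, U = np.linalg.eigh(L)          # eigenvectors form the graph Fourier basis

x = np.array([1.0, 0.0, 0.0, 0.0])      # a signal defined on the graph's nodes

x_hat = U.T @ x                          # graph Fourier transform
x_rec = U @ x_hat                        # inverse transform

# One spectral filter g(lambda): a heat-like low-pass kernel (illustrative).
g = np.exp(-2.0 * eigvals)
x_filtered = U @ (g * x_hat)             # filtering in the spectral domain

print(np.allclose(x, x_rec))             # True: the basis is orthonormal
print(x_filtered)
```

A spectral graph wavelet transform follows the same pattern, but applies a dictionary of such kernels at multiple scales rather than a single filter.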

Journal ArticleDOI
TL;DR: The guest editor highlights the topics touched on in the articles in this special issue: as we enter the Internet-of-Things era and lightweight mobile devices become the main online terminals in daily life, transparent computing promises great opportunities as well as new challenges.
Abstract: The guest editor highlights the topics touched on in the articles in this special issue. As we enter the Internet-of-Things era, in which lightweight mobile devices become the main online terminals in daily life, transparent computing promises great opportunities as well as new challenges.

Journal ArticleDOI
TL;DR: The implementation of the PEEMD and its application for the analysis of Earth science data, including the solution of the Lorenz model, an idealized terrain-induced flow, and Hurricane Sandy are discussed.
Abstract: To efficiently perform multiscale analysis of high-resolution, global, multiple-dimensional datasets, the authors have deployed the parallel ensemble empirical mode decomposition (PEEMD) package by implementing three-level parallelism into the ensemble empirical mode decomposition (EEMD), achieving scaled performance on 5,000 cores. In this study, they discuss the implementation of the PEEMD and its application for the analysis of Earth science data, including the solution of the Lorenz model, an idealized terrain-induced flow, and Hurricane Sandy.
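
A simplified, serial sketch of ensemble EMD, the computation that PEEMD parallelizes, is shown below; the crude sifting loop, fixed iteration counts, and noise level are assumptions made for illustration, and the ensemble loop marks one of the levels at which parallelism can be applied.

```python
# Simplified sketch of ensemble empirical mode decomposition (EEMD), the
# serial core of what PEEMD parallelizes; the sifting here is deliberately
# crude (fixed iteration counts, cubic-spline envelopes) and is not the
# authors' implementation.
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline


def envelope(t, x, comparator):
    """Cubic-spline envelope through the local extrema selected by comparator."""
    idx = argrelextrema(x, comparator)[0]
    if len(idx) < 2:
        return None
    # Include the endpoints so the spline covers the whole record.
    idx = np.concatenate(([0], idx, [len(x) - 1]))
    return CubicSpline(t[idx], x[idx])(t)


def sift_one_imf(t, x, n_sift=10):
    """Extract one intrinsic mode function by repeatedly removing the mean envelope."""
    h = x.copy()
    for _ in range(n_sift):
        upper = envelope(t, h, np.greater)
        lower = envelope(t, h, np.less)
        if upper is None or lower is None:
            break
        h = h - 0.5 * (upper + lower)
    return h


def emd(t, x, n_imf=3):
    imfs, residue = [], x.copy()
    for _ in range(n_imf):
        imf = sift_one_imf(t, residue)
        imfs.append(imf)
        residue = residue - imf
    return np.array(imfs)


def eemd(t, x, n_ensemble=50, noise_std=0.2, n_imf=3, seed=0):
    rng = np.random.default_rng(seed)
    acc = np.zeros((n_imf, len(x)))
    # This ensemble loop is embarrassingly parallel -- one of the levels of
    # parallelism a package like PEEMD can exploit across many cores.
    for _ in range(n_ensemble):
        noisy = x + noise_std * x.std() * rng.standard_normal(len(x))
        acc += emd(t, noisy, n_imf)
    return acc / n_ensemble


t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 40 * t)
imfs = eemd(t, signal)
print(imfs.shape)  # (3, 500)
```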

Journal ArticleDOI
TL;DR: A 4D segmentation method is proposed that considers 3D+t data as a 4D hyper object, using a D4Q81 lattice in a lattice Bhatnagar-Gross-Krook (LBGK) simulation, where time is considered a fourth dimension for defining directions of particle momentum in the LBGK model.
Abstract: This article proposes a 4D segmentation method that considers 3D+t data as a 4D hyper object, using a D4Q81 lattice in a lattice Bhatnagar-Gross-Krook (LBGK) simulation, where time is considered a fourth dimension for defining directions of particle momentum in the LBGK model. The authors implemented 4D-LBGK on 20 4D hypersphere and hypercube images with varying amounts of Gaussian noise added (0–300 percent). The average Dice values were 94.56 and 93.35 percent for the hypersphere and hypercube segmentations, respectively, demonstrating good segmentation accuracy. They also applied 4D-LBGK to 4D intracranial aneurysm CTA images and compared the results to five established segmentation methods, demonstrating that the proposed method is accurate and robust. The aneurysm segmentation was subsequently used in an LBGK-based biomechanics simulation to provide potentially novel information for understanding the physiology and treatment of intracranial aneurysms.
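
For readers unfamiliar with the method, the standard single-relaxation-time LBGK update underlying the approach is shown below; the D4Q81 direction set and the segmentation-specific equilibrium distribution are the article's own and are not reproduced here.

```latex
% Single-relaxation-time (BGK) lattice Boltzmann update; i indexes the 81
% discrete directions of the D4Q81 lattice and \tau is the relaxation time.
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t)
  = f_i(\mathbf{x}, t)
  - \frac{1}{\tau}\Bigl[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t) \Bigr]
```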

Journal ArticleDOI
TL;DR: A pilot programming course in which high school students majoring in computer science were introduced to the visual, scenario-based programming language of live sequence charts (LSC), finding that LSCs, with their unique characteristics, can be a good vehicle for teaching a second, very different programming paradigm.
Abstract: This article describes a pilot programming course in which high school students were introduced, through the visual programming language of live sequence charts (LSC), to a new paradigm termed scenario-based programming. The rationale underlying this course was teaching high school students a second, very different programming paradigm. Using LSC for this purpose has other advantages, such as exposing students to high-level programming, dealing with nondeterminism and concurrency, and referring to human-computer interaction (HCI) issues. This work also contributes to the discussion about guiding principles for curriculum development. It highlights an important principle: the educational objective of a course should include more than mere knowledge enhancement. A course should be examined and justified through its contribution to learning fundamental ideas and forming useful habits of mind.

Journal ArticleDOI
TL;DR: The results show that the magnitude of experimentally measured compressive strength increases for the core concrete, due to the confinement from the steel, which is proportional to the ratio of the area of steel to the area of concrete.
Abstract: High-strength concrete-filled steel tubes (CFST) provide a common construction material in China. The purpose of this research was to determine the axial load properties for CFST subjected to concentric and eccentric loading in a series of experiments. The results show that the magnitude of experimentally measured compressive strength increases for the core concrete, due to the confinement from the steel, which is proportional to the ratio of the area of steel to the area of concrete. If the slenderness ratio is kept constant, the column's bearing capacity and maximum strain decrease as the eccentricity-to-radius ratio increases. Formulas to estimate the load-bearing capacity for short and for slender eccentrically loaded columns were established from the data. The results have been compared statistically to other published results to show that a general linear form of the capacity equation is warranted for high-strength concrete-filled steel tubes.

Journal ArticleDOI
TL;DR: The design philosophy, features, and capabilities of the software framework, and computational approaches underlying NavyFOAM are described, followed by a description of the V&V effort and application examples selected from Navy's recent R&D and acquisition programs.
Abstract: The main challenge facing simulation-based hydrodynamic design of naval ships comes from the complexity of the salient physics involved around ships, which is further compounded by the multidisciplinary nature of ship applications. Simulation of the flow physics using “first principles” is computationally very expensive and time-consuming. Other challenges largely pertain to software engineering, ranging from software architecture to verification and validation (V&V) and quality assurance. This article presents a computational fluid dynamics (CFD) framework called NavyFOAM that has been built around OpenFOAM, an open source CFD library written in C++ that heavily draws upon object-oriented programming. In the article, the design philosophy, features, and capabilities of the software framework, and the computational approaches underlying NavyFOAM, are described, followed by a description of the V&V effort and application examples selected from the Navy's recent R&D and acquisition programs.

Journal ArticleDOI
TL;DR: The author reviews some of the history of computational physics education, briefly discusses his experience with the program at Illinois State University, and suggests some direction for the future.
Abstract: While some physics educators have included computing in courses or have developed specialized courses for over 50 years, computational physics education has only slowly made inroads into the broader physics education community. Even now, when computation is arguably more important than ever in physics research and applications, it's rare that a physics department offers more than a single course in the topic to its undergraduate students. There have been several times over the years when interest in a more global approach to computational physics education has surged, only to subside without attaining the goal that computing finally take an essential role in the education of undergraduate physicists. The author reviews some of the history of computational physics education, briefly discusses his experience with the program at Illinois State University, and suggests some direction for the future.

Journal ArticleDOI
TL;DR: Interactive visualization and analysis tools have been developed by different suppliers to help researchers manage and understand the multivariate data produced by coal combustion simulations, supporting efforts to increase the efficiency, safety, and cleanliness of such systems.
Abstract: One of the problems with which researchers of different domains, such as chemistry and fluid dynamics, are concerned is the optimization of coal combustion processes to increase the efficiency, safety, and cleanliness of such systems. The coal combustion process is reproduced by using complex simulations that normally produce highly complex data comprising many characteristics. Such datasets are employed by scientists to validate their hypotheses or to present new hypotheses, and the data analysis is mostly restricted to time-consuming workflows capable of examining only a portion of the data’s full spectrum. To support the experts, interactive visualization and analysis tools have been developed by different suppliers to manage and understand multivariate data.

Journal ArticleDOI
TL;DR: This toolbox contains adjustable modules for date and orbital parameters, which help users better understand the effects of satellite position on matters such as eclipse, the magnetic field, and satellite communication with the ground station.
Abstract: This article presents software for a satellite position and attitude control system based on mathematical modeling. The software is developed using Matlab for conceptual design; an added benefit is that it shortens design time and decreases design costs. Providing interactive modules for different actuators with various configurations and control algorithms makes this toolbox suitable for analyzing the effect of design parameters on satellite response and stability. Moreover, this toolbox contains adjustable modules for date and orbital parameters, which help users better understand the effects of satellite position on matters such as eclipse, the magnetic field, and satellite communication with the ground station. Taking position data, the software computes disturbance torques and gives users the ability to analyze satellite attitude control performance. The software’s ability to show simulation results as a set of graphics and text windows makes it more user friendly.
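
As an example of the kind of disturbance torque that can be computed from position data, a NumPy sketch of the standard gravity-gradient torque is shown below; the inertia matrix and position vector are made-up values, and this is not the toolbox's Matlab code.

```python
# Illustrative NumPy sketch (not the toolbox's Matlab code) of one common
# disturbance torque computed from position data: the gravity-gradient
# torque  T = 3*mu/|r|^3 * (r_hat x I r_hat),  with r_hat the nadir unit
# vector expressed in the body frame. The inertia matrix and position
# below are made-up example values.
import numpy as np

MU_EARTH = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2


def gravity_gradient_torque(r_body, inertia):
    """Gravity-gradient torque (N*m) for position vector r_body (m, body frame)."""
    r = np.linalg.norm(r_body)
    r_hat = r_body / r
    return 3.0 * MU_EARTH / r**3 * np.cross(r_hat, inertia @ r_hat)


# Example: roughly a 700-km orbit radius, slightly asymmetric inertia (kg*m^2).
I_body = np.diag([10.0, 12.0, 8.0])
r_body = np.array([7.078e6, 1.0e5, 5.0e4])   # position in body frame, m
print(gravity_gradient_torque(r_body, I_body))
```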

Journal ArticleDOI
TL;DR: This research presents a double hashing methodology that first collects statistics about element distribution and then maps between elements of the array and indexes based on the knowledge collected during the first hashing.
Abstract: In the past few years, researchers have introduced several sorting algorithms to enhance time complexity, space complexity, and stability. This article presents a double hashing methodology that first collects statistics about element distribution and then maps between elements of the array and indexes based on the knowledge collected during the first hashing.
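
The abstract gives no pseudocode; one possible reading of the two-pass idea, sketched here as a simple distribution (bucket) sort, is that the first pass gathers statistics (minimum and maximum) and the second maps each element to an index computed from them. The sketch below illustrates that general approach and is not the authors' algorithm.

```python
# One possible reading of the two-pass idea, sketched as a distribution
# (bucket) sort: a first pass gathers statistics (min/max), a second pass
# maps each element to an index from those statistics. This is an
# illustration of the general approach, not the authors' algorithm.
def distribution_sort(arr):
    if len(arr) < 2:
        return list(arr)

    # Pass 1: collect statistics about the element distribution.
    lo, hi = min(arr), max(arr)
    if lo == hi:
        return list(arr)

    # Pass 2: map each element to a bucket index based on those statistics.
    n = len(arr)
    buckets = [[] for _ in range(n)]
    scale = (n - 1) / (hi - lo)
    for x in arr:
        buckets[int((x - lo) * scale)].append(x)

    # Buckets are ordered by construction; sort only within each bucket.
    out = []
    for b in buckets:
        out.extend(sorted(b))
    return out


print(distribution_sort([7, 3, 9, 1, 4, 4, 8]))  # [1, 3, 4, 4, 7, 8, 9]
```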

Journal ArticleDOI
TL;DR: This article presents the system model and describes a series of key technologies in CASP, including the client parameter acquisition scheme, the user behavior analysis approach, the service choice algorithm, and the transmission optimization method.
Abstract: Driven by the development of the mobile Internet and the emergence of heterogeneous mobile devices, context-aware applications are attracting growing interest. Based on the idea of transparent computing, the authors propose a novel context-aware service provision (CASP) architecture to transparently and actively provide suitable services to clients. In this article, they present the system model and describe a series of key technologies in CASP, including the client parameter acquisition scheme, the user behavior analysis approach, the service choice algorithm, and the transmission optimization method. Based on the established architecture, they developed software and ran a performance evaluation, proving that CASP offers a good user experience and outstanding generality.

Journal ArticleDOI
TL;DR: A highly scalable system for power-efficient wearable devices, TCID (transparent computing-based intelligent device), which leverages streaming-based execution to make applications in a wearable device obtain data from a server at runtime, eliminating limitations on the size of storage space for program execution on the device.
Abstract: The authors present a highly scalable system for power-efficient wearable devices, TCID (transparent computing-based intelligent device), which leverages streaming-based execution to make applications in a wearable device obtain data from a server at runtime, eliminating limitations on the size of storage space for program execution on the device. To improve performance of streaming-based execution, adaptive network operations and an object-based cache are proposed. The adaptive network operations are designed to deliver high network throughput for various data access patterns, and the object-based cache makes the cache management adaptive to the running mode. The authors have implemented TCID in an intelligent watch. Measurements of a wide range of workloads demonstrate the benefits of this design.
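
The object-based cache is described only at a high level; a generic sketch of a cache whose capacity adapts to the device's running mode might look like the following (the mode names, capacities, and plain-LRU eviction are assumptions, not TCID's actual design).

```python
# Generic sketch of an object-based cache whose capacity adapts to the
# device's running mode; the mode names, capacities, and eviction policy
# (plain LRU) are assumptions, not TCID's actual design.
from collections import OrderedDict

MODE_CAPACITY = {"interactive": 64, "background": 16}   # made-up sizes


class ObjectCache:
    def __init__(self, mode="interactive"):
        self.items = OrderedDict()
        self.capacity = MODE_CAPACITY[mode]

    def set_mode(self, mode):
        """Adapt cache capacity to the running mode, evicting if needed."""
        self.capacity = MODE_CAPACITY[mode]
        while len(self.items) > self.capacity:
            self.items.popitem(last=False)               # drop least recently used

    def get(self, key, fetch):
        """Return the cached object, fetching it from the server on a miss."""
        if key in self.items:
            self.items.move_to_end(key)                  # mark as recently used
            return self.items[key]
        obj = fetch(key)                                 # e.g., a network request
        self.items[key] = obj
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)
        return obj


cache = ObjectCache()
print(cache.get("icon/home", fetch=lambda k: f"<bytes of {k}>"))
cache.set_mode("background")
```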

Journal ArticleDOI
TL;DR: Some frequent roles of code in computational science are outlined; it's common for a piece of code to have several roles, at the same time or as an evolution over time.
Abstract: Many of us write code regularly as part of our scientific activity, perhaps even as a full-time job. But even though we write--and use--more and more code, we rarely think about the roles that this code will have in our research, in our publications, and ultimately in the scientific record. In this article, the author outlines some frequent roles of code in computational science. These roles aren't exclusive; in fact, it's common for a piece of code to have several roles, at the same time or as an evolution over time. Thinking about these roles, ideally before starting to write the code, is a good habit to develop.