
Showing papers on "Applied science" published in 2015


Book ChapterDOI
01 Jan 2015
TL;DR: In this article, the main concepts of design science research are presented, along with the foundations for applying design science research as a research method and the methods formalized by several authors for its operationalization.
Abstract: This chapter presents the main concepts of design science research, which is a method that is conducted under the paradigm of design science to operationalize research. In addition to these concepts, the foundations for the application of design science research as a research method and the methods formalized by several authors for its operationalization are presented. A comparison of design science research with two alternate methods is performed. Rather than attempting an exhaustive comparison in this book, we compare design science research with methods that are commonly used for qualitative research in Brazil: case study and action research.

75 citations


01 Jan 2015

52 citations


Journal ArticleDOI

35 citations



Dissertation
01 Jan 2015
TL;DR: This thesis describes the creation of an application that translates spoken food diaries into nutritional database entries and explores different methods for converting brands, descriptions, and food item names into entries in nutritional databases.
Abstract: The ability to accurately and efficiently track nutritional intake is a powerful tool in combating obesity and other food related diseases. Currently, many methods used for this task are time consuming or easily abandoned; however, a natural language based application that converts spoken text to nutritional information could be a convenient and effective solution. This thesis describes the creation of an application that translates spoken food diaries into nutritional database entries. It explores different methods for solving the problem of converting brands, descriptions and food item names into entries in nutritional databases. Specifically, we constructed a cache of over 4,000 food items, and also created a variety of methods to allow refinement of database mappings. We also explored methods of dealing with ambiguous quantity descriptions and the mapping of spoken quantity values to numerical units. When assessed by 500 users entering their daily meals on Amazon Mechanical Turk, the system was able to map 83.8% of the correctly interpreted spoken food items to relevant nutritional database entries. It was also able to find a logical quantity for 92.2% of the correct food entries. Overall, this system shows a significant step towards the intelligent conversion of spoken food diaries to actual nutritional feedback. Thesis Supervisor: James Glass Title: Senior Research Scientist
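The abstract does not detail the matching algorithm itself; as a purely illustrative sketch of the kind of lookup described (mapping a spoken food phrase to a database entry), one could score token overlap against a small set of hypothetical entries. All names, nutrient values, and the scoring rule below are assumptions for illustration, not the thesis's method.

```python
# Hypothetical food database and token-overlap matcher; illustrative only,
# not the caching/refinement pipeline described in the thesis.
FOOD_DB = {
    "cheerios cereal": {"calories_per_100g": 357},
    "whole milk": {"calories_per_100g": 61},
    "banana raw": {"calories_per_100g": 89},
}

def best_match(spoken: str):
    """Return the entry whose name shares the most tokens with the spoken phrase."""
    tokens = set(spoken.lower().split())
    score, name = max((len(tokens & set(n.split())), n) for n in FOOD_DB)
    return (name, FOOD_DB[name]) if score > 0 else None

print(best_match("a bowl of cheerios"))  # ('cheerios cereal', {'calories_per_100g': 357})
```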

9 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine the teaching of an experienced science faculty member who had a strong interest in teaching undergraduate and graduate science courses and the nature of science specifically, and find that robust SMK and interest in the nature of science helped him address the different nature of science aspects and produce original content-embedded examples for teaching the nature of science.
Abstract: This is an interpretive case study examining the teaching of an experienced science faculty member who had a strong interest in teaching undergraduate and graduate science courses and the nature of science specifically. The study was interested in how he transformed knowledge from his experience as a scientist and his ideas about the nature of science into forms accessible to his students. Data included observations (through the 12-week semester) and field notes, Views of Nature of Science-Form B, as well as a semi-structured interview. Deductive analysis based on existing codes and categories was applied. Results revealed that robust SMK and interest in the nature of science helped him address the different nature of science aspects and produce original content-embedded examples for teaching the nature of science. Although he was able to include the nature of science as part of a graduate course and to address nature of science myths that graduate students held, nature of science assessment was missing from his teaching. When subject matter knowledge and nature of science understanding support each other, this may be a key element in successful nature of science learning and teaching. As with science teachers, the development of assessment of the nature of science may take more time than the development of other components of instruction (i.e., instructional strategy) for science faculty. Hence, this result may indicate a specific need for support in developing this component of teaching.

8 citations



01 Jan 2015
TL;DR: This dissertation introduces a novel process to study and analyze sensor data in order to obtain information pertaining to mobile targets at the sub-pixel level and utilizes a set of algorithmic tools for change detection, target extraction and analysis, super-pixel processing and target refinement.
Abstract: This dissertation introduces a novel process to study and analyze sensor data in order to obtain information pertaining to mobile targets at the sub-pixel level. The process design is modular in nature and utilizes a set of algorithmic tools for change detection, target extraction and analysis, super-pixel processing and target refinement. The scope of this investigation is confined to a staring sensor that records data of sub-pixel vehicles traveling horizontally across the ground. Statistical models of the targets and background are developed with noise and jitter effects. Threshold Change Detection, Duration Change Detection and Fast Adaptive Power Iteration (FAPI) Detection techniques are the three methods used for target detection. The PolyFit and FermiFit are two tools developed and employed for target analysis, which allows for flexible processing. Tunable parameters in the detection methods, along with filters for false alarms, show the adaptability of the procedures. Super-pixel processing tools are designed, and Refinement Through Tracking (RTT) techniques are investigated as post-processing refinement options. The process is tested on simulated datasets, and validated with sensor datasets obtained from RP Flight Systems, Inc.
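The dissertation names its detection methods without algorithmic detail in this abstract; as a minimal sketch of what a threshold-style change detection over a staring-sensor frame stack can look like, the following uses simulated frames and an assumed k-sigma rule. The threshold, frame shapes, and simulated mover are illustrative assumptions, not the author's parameters.

```python
import numpy as np

def threshold_change_detection(frames: np.ndarray, k: float = 3.0) -> np.ndarray:
    """Flag pixels deviating from the temporal background mean by more than
    k standard deviations in any frame.

    frames: array of shape (num_frames, rows, cols) from a staring sensor.
    Returns a boolean mask of candidate sub-pixel target detections.
    """
    background = frames.mean(axis=0)      # per-pixel temporal mean
    noise = frames.std(axis=0) + 1e-9     # per-pixel temporal std (avoid divide-by-zero)
    return np.abs(frames - background) > k * noise

# Toy example: a static 32x32 background with one simulated sub-pixel mover.
rng = np.random.default_rng(0)
frames = rng.normal(100.0, 2.0, size=(20, 32, 32))
for t in range(20):
    frames[t, 16, 5 + t] += 25.0          # bright pixel drifting horizontally
mask = threshold_change_detection(frames)
print(mask.sum(), "pixel detections across all frames")
```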

7 citations


Journal ArticleDOI
TL;DR: This article examined first- to fifth-grade elementary science textbooks with regard to their presentations of the nature of science, since textbooks are key resources in elementary science curricula and should convey an accurate conception of science.
Abstract: An important objective in science education is for students to understand the nature of science. Therefore, it is important that science textbooks, which are key resources in elementary science curricula, convey an accurate conception of the nature of science. This study examined first- to fifth-grade elementary science textbooks with regard to their presentations of the nature of science.

6 citations


Journal ArticleDOI
21 Oct 2015-Isis
TL;DR: A list of the major institutions in the United States and Canada where opportunities exist for graduate studies and research in the history of science (and technology) and the history of medicine is presented in this paper.
Abstract: THE FOLLOWING LIST has been drawn up on the basis of information received from the major institutions in the United States and Canada where there exist opportunities for graduate studies and research in the history of science (and technology) and the history of medicine. It should be noted that these studies may be offered in connection with related but here unlisted fields, such as history, philosophy of science, sociology of science, and science policy studies, which may or may not be administered by the same department or program. I have listed the name and address of the institution, the name and fields of interest of full-time faculty, and the names of associated faculty, who are usually people in other departments of the university who contribute to the teaching and advise in the program. Further, as an indication of the size of the department, I have given the number of graduate students enrolled, but I warn that these figures are only approximate and not entirely comparable. Qualifications and standards for admission and requirements for the program differ of course from place to place and may be found by direct enquiry. Provision for graduate fellowships also varies from institution to institution, and again this may be found by direct enquiry. This list includes only the major institutions that have come to our attention as employing and training professionals in our fields. There are a large number of universities, colleges, and medical schools where some history of science or history of medicine is taught (at least 350 in the U.S.A. according to an excellent prior compilation by Prof. Duane H. D. Roller), usually in connection with another program, and in many of these cases it is possible to take a master's or a doctor's degree including some work in the history of science.

5 citations



Proceedings ArticleDOI
11 Mar 2015
TL;DR: In this paper, the authors describe applied research conducted in cooperation between the Faculty of Applied Sciences and the Techmania Science Centre, and present the unique Science on a Sphere system, its features, control mechanisms and limitations.
Abstract: This paper describes applied research conducted in cooperation between the Faculty of Applied Sciences and the Techmania Science Centre. It presents the unique Science on a Sphere system, its features, control mechanisms and limitations. This special visualization system brings a new approach to presenting geographical data; on the other hand, it offers only minimal opportunity for user interaction. Interaction should enable better participation of the science centre in the learning process. The paper shows how the existing Science on a Sphere projection system has been extended with interactive features.

Dissertation
01 Jan 2015
TL;DR: A nonparametric framework for modeling an evolving sequence of (estimated) probability distributions which distinguishes the effects of sequential progression on the observed distribution from extraneous sources of noise (i.e. latent variables which perturb the distributions independently of the sequence-index).
Abstract: We present a nonparametric framework for modeling an evolving sequence of (estimated) probability distributions which distinguishes the effects of sequential progression on the observed distribution from extraneous sources of noise (i.e. latent variables which perturb the distributions independently of the sequence-index). To discriminate between these two types of variation, our methods leverage the underlying assumption that the effects of sequential progression follow a consistent trend. Our methods are motivated by the recent rise of single-cell RNA-sequencing time course experiments, in which an important analytic goal is the identification of genes relevant to the progression of a biological process of interest at cellular resolution. As existing statistical tools are not suited for this task, we introduce a new regression model for (ordinal-value, univariate-distribution) covariate-response pairs where the class of regression functions reflects coherent changes to the distributions over increasing levels of the covariate, a concept we refer to as trends in distributions. Through simulation study and extensive application of our ideas to data from recent single-cell gene-expression time course experiments, we demonstrate numerous strengths of our framework. Finally, we characterize both theoretical properties of the proposed estimators and the generality of our trend-assumption across diverse types of underlying sequential-progression effects, thus highlighting the utility of our framework for a wide variety of other applications involving the analysis of distributions with associated ordinal labels. Thesis Supervisor: Tommi S. Jaakkola Title: Professor of Electrical Engineering and Computer Science Thesis Supervisor: David K. Gifford Title: Professor of Electrical Engineering and Computer Science
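The abstract does not give the estimator itself; as a rough, hand-rolled illustration of the underlying idea (a consistent trend over an ordinal index versus extraneous variation), one could fit an isotonic regression to a per-timepoint distribution summary. This is a far simpler stand-in than the distribution-level regression the thesis proposes, and the simulated data below are invented for the example.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical data: expression samples of one gene at 5 ordered timepoints.
rng = np.random.default_rng(1)
timepoints = np.repeat(np.arange(5), 50)               # ordinal covariate
samples = rng.normal(loc=timepoints * 0.4, scale=1.0)  # noisy upward trend

# Summarize each timepoint's empirical distribution by its mean, then fit a
# monotone trend; residual spread is attributed to extraneous variation.
means = np.array([samples[timepoints == t].mean() for t in range(5)])
trend = IsotonicRegression().fit_transform(np.arange(5), means)

print("per-timepoint means:", np.round(means, 2))
print("monotone trend fit: ", np.round(trend, 2))
```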


Journal ArticleDOI
TL;DR: In this article, the authors focus on the process of reverse engineering and provide a model 5E lesson in which they flip the typical science-to-engineering sequence and, instead, use principles of engineering design as a springboard from which to develop scientific concepts.
Abstract: The Next Generation Science Standards call for the integration of science and engineering. Often, the introduction of engineering activities occurs after instruction in the science content. That is, engineering is used as a way for students to elaborate on science ideas that have already been explored. However, using only this sequence of instruction communicates a limited view of the relationship between science and engineering. In this article, we focus on the process of reverse engineering, and we provide a model 5E lesson in which we flip the typical science-to-engineering sequence and, instead, use principles of engineering design as a springboard from which to develop scientific concepts. Specifically, students use principles of engineering to deconstruct already engineered devices (i.e., different types of coffeemakers) in an effort to propose scientific explanations (i.e., factors affecting solubility) for the design features. These proposed explanations are then tested by isolating indivi...

14 Jun 2015
TL;DR: This paper investigates a number of curricula and syllabi to identify a list of topics/concepts that appear central to the learning objectives of Materials Science and Engineering, and reports on an initial review of data compilations and tools, the results of a survey and focus groups responding to an explorative version of a database.
Abstract: The academic areas of Materials Science and Materials Engineering have different emphases at different universities. Some would argue that the former is more focused on understanding materials (why) while the latter is more focused on making use of them (how). Another way of looking at these areas is that they emphasize the microscopic (or even nanoscopic) aspects of materials or the macroscopic aspects, respectively. Together, they constitute an important part of many engineering programs and may therefore be treated jointly as Materials Science and Engineering. In this paper, we have investigated a number of curricula and syllabi to identify a list of topics/concepts that appear central to the learning objectives of Materials Science and Engineering. Among the top candidates were: characteristic material properties of the main material groups, modification of microstructure by various (thermal/mechanical) processes, binary phase diagrams, micrographs and materials characterization and testing. In a project involving engineering and Materials Science students, databases were designed containing facts and visual information for the purpose of introductory materials teaching. A non-exhaustive review of existing teaching resources for these areas reveals that many are highly specialized on one topic (e.g., crystallography) or one group of materials (e.g., metals). We are therefore exploring ways to integrate several of the core themes mentioned in the list above, to facilitate assignments, projects or self-directed studies in Materials Science and Engineering. A standard materials selection software package was used as a starting point, since it offered comprehensive material property databases and the possibility to add tailor-made data records and entire data tables. Furthermore, links between, e.g., heat treatments, phase diagrams and micrographs can be set up. In this paper, we report on an initial review of data compilations and tools, and the results of a survey and focus groups responding to an explorative version of a database. We aim to share our findings with the materials community, hoping to get feedback and inspire educational ideas.
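The paper's resource is built on a commercial materials-selection package; as an illustrative stand-in for the kind of property filtering such tools perform, a few hand-entered records can be queried as below. The records use approximate textbook-order values and are not taken from the paper's database.

```python
# Illustrative material records (approximate values, for demonstration only)
# and a simple selection filter of the kind a materials-selection tool performs.
materials = [
    {"name": "Al 6061-T6", "group": "metal",   "density_g_cm3": 2.70, "yield_MPa": 276},
    {"name": "AISI 304",   "group": "metal",   "density_g_cm3": 8.00, "yield_MPa": 215},
    {"name": "PEEK",       "group": "polymer", "density_g_cm3": 1.32, "yield_MPa": 100},
]

def select(records, max_density, min_yield):
    """Return names of materials no denser than max_density with yield strength >= min_yield."""
    return [m["name"] for m in records
            if m["density_g_cm3"] <= max_density and m["yield_MPa"] >= min_yield]

print(select(materials, max_density=3.0, min_yield=150))  # ['Al 6061-T6']
```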



29 Apr 2015

BookDOI
01 Jan 2015

Proceedings ArticleDOI
24 Feb 2015
TL;DR: Changing the landscape of computer science education at the high school level is a key component of several initiatives of the National Science Foundation and of the private sector, including both non-profits and for-profits.
Abstract: Changing the landscape of computer science education at the high school level is a key component of several initiatives of (1) the National Science Foundation (NSF), e.g., as the cornerstone of the CE21 program partnered with academic institutions, and (2) the private sector, with both non-profits such as Code.org, CodeVA, and MassCan, and for-profits such as Tynker, CodeHS, and Trinket. This collaboration between privately and publicly funded initiatives is designed to reach every student and to achieve this at scale.

01 Jan 2015
TL;DR: HMS automation refers to the use of computers and associated peripheral media such as disks, printers, and optical media, and the utilization of computer-based products and services in the performance of all types of institutional functions and operations.
Abstract: HMS automation refers to the use of computers and associated peripheral media such as disks, printers, and optical media, and the utilization of computer-based products and services in the performance of all types of institutional functions and operations. Computers are capable of introducing a great deal of automation into operations and functions, since they are electronic, programmable and capable of controlling the processes being performed.

01 Jan 2015
TL;DR: Saadeh et al. as discussed by the authors proposed a new benchmark system for wind energy transmission systems; a new parameter estimation technique for determining the bus admittance matrix (Ybus), based on recognizing that bus injection currents Ibus can be viewed as signals produced by a random process, is also proposed to further calibrate power system models.
Abstract: In this research a new benchmark system is proposed for wind energy transmission systems. New model development, validation, and calibration methods for power transmission systems are proposed and implemented as well. First, a model reduction criterion is chosen based on electrical interconnection and geographical information. Model development is then done using reduction techniques on an operational model provided by a transmission operator based on the chosen criterion. Then model validation is performed using actual PMU synchrophasor measurements provided by a utility company. The model development and validation process ensures the accuracy of the developed model and makes for a realistic benchmark system for wind generation transmission systems. The new proposed model development and validation methods are generic and can be used to model any power transmission system for various simulation needs. Nevertheless, the accuracy of the benchmark model is constrained by the accuracy of the initial operational model. In this research, a new parameter estimation technique for determining the bus admittance matrix (Ybus) is also proposed to further calibrate power system models. Ybus estimation is done using recorded PMU synchrophasor measurements. The approach proposed in this research is based on recognizing that bus injection currents Ibus can be viewed as signals produced by a random process. In this manner, the corresponding bus voltages Vbus are also stochastic signals that are related through a cross-covariance matrix to the vector Ibus. Using estimation techniques developed for statistical signal processing, the cross-covariance matrix is shown to be Zbus.
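The abstract's central observation is that, with bus injection currents Ibus treated as a random process, the bus voltages satisfy V = Zbus·I, so Zbus can be recovered from second-order statistics of synchrophasor samples. Below is a minimal numerical sketch of that idea using an invented 3-bus impedance matrix and simulated measurements rather than real PMU data, and a generic cross-covariance estimate rather than the dissertation's specific estimator.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up symmetric bus impedance matrix Zbus for a 3-bus system (illustrative only).
Z_true = np.array([[0.30, 0.10, 0.05],
                   [0.10, 0.25, 0.08],
                   [0.05, 0.08, 0.20]])

# Simulate zero-mean random injection currents and the resulting bus voltages V = Z I.
I = rng.normal(size=(3, 5000))           # 5000 synthetic synchrophasor samples
V = Z_true @ I

# Estimate Z from sample second moments: Cov(V, I) @ inv(Cov(I, I)) -> Z.
C_vi = V @ I.T / I.shape[1]
C_ii = I @ I.T / I.shape[1]
Z_est = C_vi @ np.linalg.inv(C_ii)

print(np.round(Z_est, 3))                # close to Z_true
```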

BookDOI
01 Jan 2015
TL;DR: The class of regular queries and its well-behavedness are described; the resulting theory is inspired and informed by, but quite different from, the theory of regular languages.
Abstract: The classical theory of regular languages was initiated in the 1950s and reached a mature and stable state in the 1970s. In particular, the computational complexity of several decision problems for regular expressions, including emptiness, universality, and equivalence, is well understood. A new application area for regular languages emerged in the 1990s in the context of graph databases, where regular expressions provide a way to formulate queries over graphs. In this new context, the classical theory needs to be reconsidered. It turns out that the new context is a fertile area, and gives rise to an elegant theory of regular queries, which is inspired and informed by, but quite different from, the theory of regular languages. In this talk I will describe the class of regular queries and its well-behavedness.
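As a concrete illustration of regular expressions used as graph queries (the talk itself stays at the level of theory), here is a toy regular path query evaluator that searches the product of an edge-labeled graph and a small hand-built automaton; the graph, labels, and query are invented for the example.

```python
from collections import deque

# Toy edge-labeled graph: node -> list of (label, target) pairs (invented data).
graph = {
    "a": [("knows", "b")],
    "b": [("knows", "c"), ("worksAt", "x")],
    "c": [("worksAt", "y")],
    "x": [],
    "y": [],
}

# Hand-built NFA for the regular path query  knows+ . worksAt
nfa = {(0, "knows"): {0, 1}, (1, "worksAt"): {2}}
start_state, accept_states = 0, {2}

def regular_path_query(source: str):
    """Return nodes reachable from `source` along a path whose label sequence
    matches the query, via BFS over (graph node, NFA state) pairs."""
    seen = {(source, start_state)}
    queue, answers = deque(seen), set()
    while queue:
        node, state = queue.popleft()
        for label, target in graph[node]:
            for nxt in nfa.get((state, label), set()):
                if (target, nxt) not in seen:
                    seen.add((target, nxt))
                    queue.append((target, nxt))
                    if nxt in accept_states:
                        answers.add(target)
    return answers

print(regular_path_query("a"))  # {'x', 'y'}: a-knows-b-worksAt-x and a-knows-b-knows-c-worksAt-y
```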



DissertationDOI
10 Nov 2015
TL;DR: This dissertation proposes a set of methods to investigate models of metabolic networks, and proposes a map between a biological organism and the von Neumann architecture, where metabolism executes reactions mapped to instructions of a Turing machine, therefore providing insights into computational processes.
Abstract: To paraphrase Stan Ulam, a Polish mathematician who became a leading figure in the Manhattan Project, in this dissertation I focus not only on how computer science can help biologists, but also on how biology can inspire computer scientists. On one hand, computer science provides powerful abstraction tools for metabolic networks. Cell metabolism is the set of chemical reactions taking place in a cell, with the aim of maintaining the living state of the cell. Due to the intrinsic complexity of metabolic networks, predicting the phenotypic traits resulting from a given genotype and metabolic structure is a challenging task. To this end, mathematical models of metabolic networks, called genome-scale metabolic models, contain all known metabolic reactions in an organism and can be analyzed with computational methods. In this dissertation, I propose a set of methods to investigate models of metabolic networks. These include multi-objective optimization, sensitivity, robustness and identifiability analysis, and are applied to a set of genome-scale models. Then, I augment the framework to predict metabolic adaptation to a changing environment. The adaptation of a microorganism to new environmental conditions involves shifts in its biochemical network and in the gene expression level. However, gene expression profiles do not provide a comprehensive understanding of the cellular behavior. Examples are the cases in which similar profiles may cause different phenotypic outcomes, while different profiles may give rise to similar behaviors. In fact, my idea is to study the metabolic response to diverse environmental conditions by predicting and analyzing changes in the internal molecular environment and in the underlying multi-omic networks. I also adapt statistical and mathematical methods (including principal component analysis and hypervolume) to evaluate short term metabolic evolution and perform comparative analysis of metabolic conditions. On the other hand, my vision is that a biomolecular system can be cast as a “biological computer”, therefore providing insights into computational processes. I therefore study how computation can be performed in a biological system by proposing a map between a biological organism and the von Neumann architecture, where metabolism executes reactions mapped to instructions of a Turing machine. A Boolean string represents the genetic knockout strategy and also the executable program stored in the “memory” of the organism. I use this framework to investigate scenarios of communication among cells, gene duplication, and lateral gene transfer. Remarkably, this mapping allows estimating the computational capability of an organism, taking into account also transmission events and communication outcomes.
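The abstract does not spell out the underlying optimization, but a standard single-objective building block for analyzing genome-scale metabolic models is flux balance analysis; the sketch below applies it to an invented three-reaction toy network and is not the dissertation's multi-objective framework.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions):
#   R1: -> A       R2: A -> B       R3: B -> (biomass, to be maximized)
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])

bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds for each reaction
c = [0, 0, -1]                          # linprog minimizes, so negate the biomass flux

# Flux balance analysis: maximize biomass flux subject to steady state S v = 0.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)         # expected: [10, 10, 10]
```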

01 Jan 2015

Dissertation
01 Jan 2015
TL;DR: The first platform merges fluorescence cytometry with label-free profiling of the small-molecule composition of tens of thousands of cells based on matrix-assisted laser desorption/ionization (MALDI) mass spectrometry, while the second platform is demonstrated for on-chip enrichment and recovery of circulating tumor cells and high-content immunophenotyping of small clinical samples.
Abstract: Many important areas of research regarding human health, such as immunology and cancer biology, deal with highly heterogeneous populations of cells where the contributions of individual players cannot be ignored. Single-cell technologies aim to resolve this heterogeneity by analyzing many individual cells in a high-throughput manner. Here we developed two examples of such tools that rely on microfabricated arrays of microwells. The first platform merges fluorescence cytometry with label-free profiling of the small molecule composition of tens of thousands of cells based on matrix-assisted laser desorption/ionization (MALDI) mass spectrometry. We evaluated several materials and approaches to chip fabrication suitable for interfacing with a MALDI instrument. We also developed an analytical pipeline for efficient processing of cells on the chip and demonstrated its application to the analysis of brain tumor samples. The second platform provides a new format of microwell arrays for fluorescence cytometry that improves their compatibility with a range of automated equipment and enables more efficient processing of a greater number of samples, while preserving viability and identity of cells for subsequent analyses. We demonstrated its utility for on-chip enrichment and recovery of circulating tumor cells (CTCs) and high-content immunophenotyping of small clinical samples. Thesis Supervisor: J. Christopher Love Title: Associate Professor of Chemical Engineering Thesis Co-Advisor: Darrell J. Irvine Title: Professor of Materials Science and Engineering and Biological Engineering

01 Jan 2015
TL;DR: The purpose of the paper was to present a critical overview of the main opinions and research results on "seabuckthornology" as a new interdisciplinary science, as presented at various international conferences and other events held in different countries in which the author took part and expressed his opinions as an expert in the field.
Abstract: The purpose of the paper was to present a critical overview of the main opinions and research results on "seabuckthornology" as a new interdisciplinary science, as presented at various international conferences and other events held in different countries in which the author took part and expressed his opinions as an expert in the field. The current opinion of many experts is that sea buckthorn is the result of long, hard work in the fields of research, practice, landscape architecture, production, soil science, and animal and human health. It is an important plant of the 3rd millennium. The only problem many experts face is that it is very difficult to organize all the multidisciplinary information from botany, geology, marketing, medicine, biochemistry, agronomy, management, etc. The solution is the elaboration of a statute for the new interdisciplinary science of "seabuckthornology" and the creation of a multilingual database, updated permanently, so that at any moment a manufacturer producing sea buckthorn oil can find offers from the entire world and obtain all parameters and prices, together with addresses, fax and phone numbers, e-mail, etc., within a few minutes. In the current world crises, a scientific approach to sea buckthorn may be a solution to health and environmental problems.