scispace - formally typeset

Showing papers on "Graphical user interface published in 2016"


Journal ArticleDOI
TL;DR: The latest version of the Molecular Evolutionary Genetics Analysis (Mega) software, which contains many sophisticated methods and tools for phylogenomics and phylomedicine, has been optimized for use on 64-bit computing systems for analyzing larger datasets.
Abstract: We present the latest version of the Molecular Evolutionary Genetics Analysis (Mega) software, which contains many sophisticated methods and tools for phylogenomics and phylomedicine. In this major upgrade, Mega has been optimized for use on 64-bit computing systems for analyzing larger datasets. Researchers can now explore and analyze tens of thousands of sequences in Mega. The new version also provides an advanced wizard for building timetrees and includes new functionality to automatically predict gene duplication events in gene family trees. The 64-bit Mega is made available in two interfaces: graphical and command line. The graphical user interface (GUI) is a native Microsoft Windows application that can also be used on Mac OS X. The command-line Mega is available as a native application for Windows, Linux, and Mac OS X and is intended for high-throughput and scripted analyses. Both versions are available from www.megasoftware.net free of charge.

33,048 citations


Patent
14 Jun 2016
TL;DR: Newness and distinctiveness are claimed in the features of ornamentation shown inside the broken-line circle in the accompanying representation.
Abstract: Newness and distinctiveness is claimed in the features of ornamentation as shown inside the broken line circle in the accompanying representation.

1,500 citations

Patent
08 Mar 2016
TL;DR: In this article, an electronic device with a touch-sensitive surface, a display, and one or more sensors to detect intensity of contacts: displays a plurality of user interface objects in a first user interface; detects a contact while a focus selector is at the location of a first user interface object; and detects an increase in a characteristic intensity of the contact to a first intensity threshold.
Abstract: An electronic device with a touch-sensitive surface, a display, and one or more sensors to detect intensity of contacts: displays a plurality of user interface objects in a first user interface; detects a contact while a focus selector is at a location of a first user interface object; and, while the focus selector is at the location of the first user interface object: detects an increase in a characteristic intensity of the contact to a first intensity threshold; in response, visually obscures the plurality of user interface objects, other than the first user interface object, while maintaining display of the first user interface object; detects that the characteristic intensity of the contact continues to increase above the first intensity threshold; and, in response, dynamically increases the amount of visual obscuring of the plurality of user interface objects, other than the first user interface object.
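The claimed behavior, with nothing obscured below the first intensity threshold and obscuring that increases dynamically above it, can be sketched as a simple intensity-to-obscuring mapping. This is an illustrative reading of the claim, not the patent's implementation; the function name, the linear ramp, and the assumed maximum intensity are all hypothetical.

```python
def obscuring_amount(intensity, first_threshold, max_intensity):
    """Map a contact's characteristic intensity to a visual-obscuring
    fraction in [0, 1].

    Below the first intensity threshold nothing is obscured; above it,
    obscuring increases (here linearly, an assumption) with intensity
    and saturates at 1.0 at an assumed maximum intensity.
    """
    if intensity <= first_threshold:
        return 0.0
    span = max_intensity - first_threshold
    return min(1.0, (intensity - first_threshold) / span)
```

The key property matching the claim is monotonicity: as the contact's intensity continues to increase past the threshold, the amount of obscuring applied to the non-selected objects increases, while the selected object itself is never obscured.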

Journal ArticleDOI
TL;DR: EosFit7-GUI is a full graphical user interface designed to simplify the analysis of thermal expansion and equations of state (EoSs), allowing users to easily perform least-squares fitting of EoS parameters to diffraction data collected as a function of varying pressure, temperature or both.
Abstract: EosFit7-GUI is a full graphical user interface designed to simplify the analysis of thermal expansion and equations of state (EoSs). The software allows users to easily perform least-squares fitting of EoS parameters to diffraction data collected as a function of varying pressure, temperature or both. It has been especially designed to allow rapid graphical evaluation of both parametric data and the EoS fitted to the data, making it useful both for data analysis and for teaching.
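For the thermal-expansion case, the kind of least-squares fit described reduces, in its simplest form, to fitting volume against temperature. The sketch below is a minimal illustration, not EosFit7's code: it assumes a linear model V(T) = V0(1 + αT) and synthetic data, whereas the software fits far richer EoS models.

```python
def fit_thermal_expansion(temps, volumes):
    """Ordinary least-squares fit of V(T) = V0 * (1 + alpha * T),
    i.e. a straight line V = a + b*T with V0 = a and alpha = b / a."""
    n = len(temps)
    st, sv = sum(temps), sum(volumes)
    stt = sum(t * t for t in temps)
    stv = sum(t * v for t, v in zip(temps, volumes))
    b = (n * stv - st * sv) / (n * stt - st * st)  # slope
    a = (sv - b * st) / n                          # intercept = V0
    return a, b / a                                # (V0, alpha)

# Synthetic unit-cell volumes: V0 = 100 A^3, alpha = 3e-5 per K
temps = [300, 400, 500, 600, 700]
vols = [100 * (1 + 3e-5 * t) for t in temps]
v0, alpha = fit_thermal_expansion(temps, vols)
```

With noise-free synthetic data the fit recovers V0 and α exactly; with real diffraction data the same closed-form normal equations give the least-squares estimates and their residuals.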


Journal ArticleDOI
TL;DR: The PmagPy software package is described, and the power of data discovery and reuse is illustrated through a reanalysis of published paleointensity data that shows how the effectiveness of selection criteria can be tested.
Abstract: The Magnetics Information Consortium (MagIC) database provides an archive with a flexible data model for paleomagnetic and rock magnetic data. The PmagPy software package is a cross-platform, open-source set of tools written in Python for the analysis of paleomagnetic data that serves as one interface to MagIC, accommodating various levels of user expertise. PmagPy facilitates thorough documentation of sampling, measurements, data sets, visualization, and interpretation of paleomagnetic and rock magnetic experimental data. Although not the only route into the MagIC database, PmagPy makes preparation of newly published data sets for contribution to MagIC a byproduct of normal data analysis and allows manipulation as well as reanalysis of data sets downloaded from MagIC with a single software package. The graphical user interface (GUI), Pmag GUI, enables use of much of PmagPy's functionality, but the full capabilities of PmagPy extend well beyond it. Over 400 programs and functions can be called from the command-line interface or from within interactive Jupyter notebooks. Use of PmagPy within a notebook allows documentation of the workflow from the laboratory to the production of each published figure or data table, making research results fully reproducible. The PmagPy design and its development on GitHub accommodate extensions to its capabilities through development of new tools by the user community. Here we describe the PmagPy software package and illustrate the power of data discovery and reuse through a reanalysis of published paleointensity data, showing how the effectiveness of selection criteria can be tested.
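A core calculation in the kind of directional analysis PmagPy performs is the Fisher (1953) mean of a set of paleomagnetic directions. The sketch below implements the standard statistic from scratch in plain Python; it is not PmagPy's API, just the textbook computation for illustration.

```python
import math

def fisher_mean(decs, incs):
    """Fisher (1953) mean of paleomagnetic directions (degrees).

    Each (declination, inclination) pair is converted to a unit vector;
    the vectors are summed, and the resultant length R gives the mean
    direction and the precision parameter k = (N - 1) / (N - R).
    """
    xs = ys = zs = 0.0
    for d, i in zip(decs, incs):
        d, i = math.radians(d), math.radians(i)
        xs += math.cos(i) * math.cos(d)
        ys += math.cos(i) * math.sin(d)
        zs += math.sin(i)
    n = len(decs)
    r = math.sqrt(xs * xs + ys * ys + zs * zs)
    mean_dec = math.degrees(math.atan2(ys, xs)) % 360.0
    mean_inc = math.degrees(math.asin(zs / r))
    k = (n - 1) / (n - r)
    return mean_dec, mean_inc, k
```

Tightly clustered directions give a large k (high precision), while scattered directions drive R well below N and k toward small values, which is the basis for many of the selection criteria the paper discusses.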

Journal ArticleDOI
01 Jan 2016
TL;DR: This paper presents a brain-controlled intelligent wheelchair with automatic navigation capability that substantially alleviates the mental burden of the user.
Abstract: The concept of controlling a wheelchair using brain signals is promising. However, the continuous control of a wheelchair based on unstable and noisy electroencephalogram signals is unreliable and generates a significant mental burden for the user. A feasible solution is to integrate a brain–computer interface (BCI) with automated navigation techniques. This paper presents a brain-controlled intelligent wheelchair with the capability of automatic navigation. Using an autonomous navigation system, candidate destinations and waypoints are automatically generated based on the existing environment. The user selects a destination using a motor imagery (MI)-based or P300-based BCI. According to the determined destination, the navigation system plans a short and safe path and navigates the wheelchair to the destination. During the movement of the wheelchair, the user can issue a stop command with the BCI. Using our system, the mental burden of the user can be substantially alleviated. Furthermore, our system can adapt to changes in the environment. Two experiments based on MI and P300 were conducted to demonstrate the effectiveness of our system.
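The control flow described, selecting one of the automatically generated destinations with the BCI, letting the navigation system drive, and aborting with a BCI stop command, can be sketched as a small state machine. All class and method names here are illustrative assumptions, not the authors' implementation.

```python
class WheelchairController:
    """Minimal state machine for the BCI-driven navigation loop:
    the user picks a destination from auto-generated candidates,
    the navigation system drives, and a BCI stop command aborts."""

    def __init__(self, candidate_destinations):
        self.candidates = candidate_destinations  # from the navigation system
        self.state = "IDLE"
        self.destination = None

    def select_destination(self, bci_choice):
        # bci_choice: candidate index decoded from the MI- or P300-based BCI
        if self.state == "IDLE":
            self.destination = self.candidates[bci_choice]
            self.state = "NAVIGATING"

    def on_bci_stop(self):
        # user-issued stop command halts the wheelchair mid-route
        if self.state == "NAVIGATING":
            self.state = "IDLE"
            self.destination = None

    def on_arrival(self):
        if self.state == "NAVIGATING":
            self.state = "IDLE"
```

Confining the BCI to discrete, infrequent decisions (choose a destination, stop) rather than continuous steering is exactly what reduces the user's mental burden in the paper's design.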