
Showing papers presented at "Bioinformatics and Bioengineering in 2015"


Proceedings ArticleDOI
02 Nov 2015
TL;DR: This survey discusses the security and privacy issues in current mHealth systems and their impact, and reviews the latest threats, attacks and proposed countermeasures that could support secure, sensitive mHealth systems.
Abstract: mHealth is a growing field that enables individuals to monitor their health status and facilitates the sharing of medical records with physicians and between hospitals anytime and anywhere. Unfortunately, smartphones and mHealth applications are still vulnerable to a wide range of security threats due to their portability and weaknesses in management and design. Nevertheless, mHealth users are becoming more aware of the security and privacy issues related to their personal healthcare information. This survey discusses the security and privacy issues in current mHealth systems and their impact. We also discuss the latest threats, attacks and proposed countermeasures that could support secure sensitive mHealth systems. Finally, we conclude with a brief summary of open security problems that still need to be addressed in the mHealth field.

48 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: This paper presents a framework for classification of Human Epithelial Type 2 cell IIF images using convolutional neural networks (CNNs) and achieves a significant increase in the performance over other approaches that have been used until now.
Abstract: Automated cell classification in Indirect Immunofluorescence (IIF) images has the potential to be an important tool in clinical practice and research. This paper presents a framework for the classification of Human Epithelial Type 2 cell IIF images using convolutional neural networks (CNNs). Previous state-of-the-art methods show a classification accuracy of 75.6% on a benchmark dataset. We conduct an exploration of different strategies for enhancing, augmenting and processing training data in a CNN framework for image classification. Our proposed strategy for preparing the training data and for pre-training and fine-tuning the CNN led to a significant increase in performance over the approaches used until now. Specifically, our method achieves an 80.25% classification accuracy. Source code and models to reproduce the experiments in the paper are made publicly available.
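A minimal sketch of the pre-train-then-fine-tune strategy the abstract describes, using PyTorch/torchvision as a stand-in framework; the paper's exact architecture, augmentation pipeline and data loaders are not specified here, so `train_loader`, the six-class label count and all hyperparameters are assumptions:

```python
# Sketch only: a generic ImageNet-pretrained backbone fine-tuned for HEp-2
# cell classification; not the authors' exact network or training recipe.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 6  # HEp-2 benchmarks typically distinguish six staining patterns

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # replace classifier head

# Fine-tune: smaller learning rate for the pretrained backbone, larger for the head.
optimizer = torch.optim.SGD([
    {"params": [p for n, p in model.named_parameters() if not n.startswith("fc")],
     "lr": 1e-4},
    {"params": model.fc.parameters(), "lr": 1e-3},
], momentum=0.9)
criterion = nn.CrossEntropyLoss()

def train_epoch(train_loader):  # train_loader is a placeholder DataLoader
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```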

40 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: A system based on the proposed approach has been developed and deployed as an Android smartphone application for real-time stress monitoring; it provides a continuous stress score that helps the user understand stress patterns across the day and take appropriate measures to manage stressful situations.
Abstract: Continuous monitoring of an individual's stress levels is essential to manage stress and mental state in an effective way. With increasing ubiquity of wearable heart rate monitors and their unobtrusiveness, HRV (Heart rate variability) derived from heart rate signals has emerged as one of the most relevant parameters for continuous monitoring of stress. In the present work, we have made an attempt to address the challenges about distinguishing between stressed and non-stressed state of a person based on just one minute of IBI (Inter Beat Interval) records with good accuracy. Such ultra-short term analysis of HRV is particularly advantageous towards capturing very short term fluctuations in mental stress levels and enhanced scope for frequent monitoring. We have analyzed various time domain, frequency domain and nonlinear HRV features to narrow down to a most influential set of features for accurate classification between stressed and non-stressed state. We have identified RMSSD (root mean square of successive differences) of IBI series to be the most direct indicator of stressed state. We also provide a continuous stress score which, when used in continuous monitoring scenario, provides the user with adequate details about his/her stress levels. This helps the user to understand stress patterns across a day in a better way and to take appropriate measures to manage stressful situations. We have developed and deployed a system, based on above concept, on smartphone as an android application for real-time stress monitoring.
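The abstract's key feature, RMSSD over a one-minute IBI window, is simple enough to state in code. A minimal sketch (the synthetic IBI values are illustrative only; the paper's trained classifier and stress score are not reproduced):

```python
# RMSSD over a one-minute inter-beat-interval (IBI) window, the HRV feature
# the abstract identifies as the most direct indicator of stress.
import numpy as np

def rmssd(ibi_ms: np.ndarray) -> float:
    """Root mean square of successive differences of an IBI series (ms)."""
    diffs = np.diff(ibi_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Illustrative example: one minute of IBIs around 800 ms (75 bpm).
rng = np.random.default_rng(0)
ibi = 800 + rng.normal(0, 25, size=75)
print(f"RMSSD = {rmssd(ibi):.1f} ms")  # lower RMSSD is associated with stress
```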

27 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: The practical application of data mining methods for estimating the survival rate and disease relapse of breast cancer patients is described, and it is concluded that the classifiers clearly learn some of the concepts of breast cancer survivability and recurrence.
Abstract: In this paper, we describe the practical application of data mining methods for estimating the survival rate and disease relapse of breast cancer patients. A comparative study of prominent machine learning models was carried out, and the achieved results indicate that the classifiers clearly learn some of the concepts of breast cancer survivability and recurrence. These algorithms were successfully applied to a novel breast cancer dataset from the Clinical Center of Kragujevac. The Naive Bayes classifier is selected as the model for prognosis of cancer survivability on the basis of the 5-year survival rate, while the Artificial Neural Network achieved the best performance in prognosis of cancer recurrence. Selecting the twenty attributes most relevant to survivability prognosis can give new insight into the set of prognostic factors that medical experts should observe.

24 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: A new technique to use these two sources together to improve the prediction of influenza outbreaks is proposed and promising results for both nowcasting and forecasting with linear regression models are achieved.
Abstract: Prediction of influenza outbreaks is of utmost importance for health practitioners, officials and the public. With the increasing usage of the internet, it has become easier and more valuable to fetch and process internet search query data. There are two significant platforms that people widely use: Google and Wikipedia. On both platforms, access logs are available, which means we can see how often any query or article was searched. Google has its own web service for monitoring and forecasting influenza-like illness, called Google Flu Trends, which provides estimates of influenza activity for some countries. The second alternative is the Wikipedia access logs, which provide the number of visits to Wikipedia articles. Previous studies have worked with these platforms separately. In this paper, we propose a new technique that uses the two sources together to improve the prediction of influenza outbreaks. We achieved promising results for both nowcasting and forecasting with linear regression models.
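A sketch of the central idea: a linear regression that maps the two signals (Google Flu Trends estimates and Wikipedia access counts) to an influenza-activity target. The column names and the toy weekly values below are placeholders, not data from the paper:

```python
# Nowcasting sketch: combine two web-activity signals in one regression.
# All numbers are synthetic toy values for illustration only.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "gft": [2.1, 2.7, 3.9, 5.2, 4.4],              # Google Flu Trends estimate
    "wiki_influenza": [1200, 1500, 2300, 3100, 2600],  # Wikipedia article visits
    "ili": [1.9, 2.5, 3.6, 4.9, 4.1],               # ground-truth ILI rate
})

X = df[["gft", "wiki_influenza"]].values
y = df["ili"].values
model = LinearRegression().fit(X, y)

# Forecasting variant: shift the target k weeks ahead before fitting.
print(model.coef_, model.intercept_)
```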

24 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: Experimental results show that the proposed method outperforms the other classification methods on the MCC index and achieves higher accuracy than all methods except the SVM and LDA classifiers.
Abstract: In this study, we propose a novel method for medical data classification that integrates a new heuristic algorithm inspired by the black hole phenomenon, called the Black Hole Algorithm (BHA), with the C4.5 decision tree. To evaluate the effectiveness of the proposed method, it is applied to two microarray datasets and five different medical datasets obtained from the UCI machine learning repository. The results of the BHA + C4.5 implementation are compared to seven well-known benchmark classification methods (a support vector machine with a Radial Basis Function kernel, Classification And Regression Tree (CART), the C4.5 decision tree, the C5.0 decision tree, Linear Discriminant Analysis (LDA), Self-Organizing Map and Naive Bayes). A repeated five-fold cross-validation method is used to assess the performance of the classifiers. Two criteria are used for model evaluation: Matthews' Correlation Coefficient (MCC) and accuracy. Experimental results show that our proposed method outperforms the other classification methods on the MCC index and achieves higher accuracy than all methods except the SVM and LDA classifiers.
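The evaluation protocol is standard and can be sketched directly. The BHA component itself is not reproduced; a plain decision tree on a medical dataset bundled with scikit-learn stands in for BHA + C4.5 (an assumption), while the repeated five-fold cross-validation and MCC/accuracy scoring mirror the abstract:

```python
# Repeated five-fold cross-validation scored with MCC and accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RepeatedStratifiedKFold, cross_validate
from sklearn.metrics import make_scorer, matthews_corrcoef
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # stand-in medical dataset
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
scoring = {"mcc": make_scorer(matthews_corrcoef), "acc": "accuracy"}

# An entropy-based tree approximates C4.5; the BHA feature search is omitted.
scores = cross_validate(DecisionTreeClassifier(criterion="entropy"),
                        X, y, cv=cv, scoring=scoring)
print(scores["test_mcc"].mean(), scores["test_acc"].mean())
```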

15 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: The proposed method is a promising tool to detect necrosis in heterogeneous whole slide images, showing its robustness to varying visual appearances.
Abstract: Automatic detection of necrosis in histological images is an interesting problem of digital pathology that needs to be addressed. Determining the presence and extent of necrosis can provide useful information for disease diagnosis and prognosis, and the detected necrotic regions can also be excluded before analyzing the remaining living tissue. This paper describes a novel appearance-based method to detect tumor necrosis in histopathological whole slide images. Studies are performed on heterogeneous microscopic images of gastric cancer containing tissue regions with variation in malignancy level and stain intensity. Textural image features are extracted from image patches to efficiently represent necrotic appearance in the tissue, and machine learning is performed using support vector machines followed by discriminative thresholding for our complex datasets. The classification results are quantitatively evaluated for different image patch sizes using two cross-validation approaches, namely three-fold and leave-one-out cross-validation, and the best average cross-validation rate of 85.31% is achieved for the most suitable patch size. The proposed method is therefore a promising tool for detecting necrosis in heterogeneous whole slide images, showing robustness to varying visual appearances.

13 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: A compressive sensing approach is used to suppress or completely eliminate unwanted artifacts in TFD design methods, and a basis pursuit algorithm is employed for signal reconstruction in order to reduce resolution loss.
Abstract: Signals with time-varying frequency content are generally best represented in the time-frequency (TF) domain, with the components' instantaneous frequency laws being their key nonstationary features. There are a number of methods for calculating a TF distribution (TFD); however, most of them add unwanted artifacts, making the TFD interpretation more difficult. In this paper, we use a compressive sensing approach to suppress or completely eliminate those artifacts. Furthermore, a basis pursuit algorithm is employed for signal reconstruction in order to reduce resolution loss, which is a major obstacle of existing TFD design methods.
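Basis pursuit itself is a convex program, min ||x||_1 subject to Ax = b. A minimal sketch with CVXPY under a random Gaussian measurement matrix; the TFD-specific dictionary and measurement setup from the paper are not reproduced:

```python
# Basis pursuit sketch: recover a sparse vector from few linear measurements.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
n, m = 128, 40                      # signal length, number of measurements
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(0, 1, 5)  # 5-sparse signal
A = rng.normal(0, 1, (m, n)) / np.sqrt(m)   # illustrative sensing matrix
b = A @ x_true

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == b])
prob.solve()
print(np.max(np.abs(x.value - x_true)))     # near zero when recovery succeeds
```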

11 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: The proposed algorithm provides a novel optic disc localization and segmentation technique that detects multiple candidate optic disc regions in a fundus image using enhancement and segmentation, and extracts a hybrid feature set for each candidate region consisting of vessel-based and intensity-based features.
Abstract: The optic disc is one of the fundamental regions of the internal retina that helps ophthalmologists in the analysis and early diagnosis of many retinal diseases, such as optic atrophy, optic neuritis, papilledema, ischemic optic neuropathy, glaucoma and diabetic retinopathy. An accurate and early diagnosis requires an accurate optic disc examination. The presence of different retinal abnormalities and non-uniform illumination make optic disc localization a challenging task. There is a need to detect and localize the optic disc in fundus images with high accuracy to make diagnosis with Computer Aided Systems developed for ophthalmic disease diagnosis more reliable. The proposed algorithm provides a novel optic disc localization and segmentation technique that detects multiple candidate optic disc regions in a fundus image using enhancement and segmentation. The proposed system then extracts a hybrid feature set for each candidate region, consisting of vessel-based and intensity-based features, which is fed to an SVM classifier. The final decision on the optic disc region is made by computing the Manhattan distance from the mean of the training data feature matrix. The proposed system has been evaluated on publicly available datasets and one local dataset, and the results show the validity of the proposed system.
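The final selection step is compact enough to make concrete. A sketch of choosing, among candidate regions, the feature vector with the smallest Manhattan (L1) distance to the mean training feature vector; feature extraction is assumed done elsewhere, and the shapes are illustrative:

```python
# Final decision rule from the abstract: pick the candidate region whose
# feature vector lies closest (L1 distance) to the training-set mean.
import numpy as np

def select_optic_disc(candidates: np.ndarray, train_features: np.ndarray) -> int:
    """candidates: (k, d) candidate feature vectors; train_features: (n, d)."""
    mean_vec = train_features.mean(axis=0)
    l1 = np.abs(candidates - mean_vec).sum(axis=1)  # Manhattan distance per candidate
    return int(np.argmin(l1))                       # index of the chosen region
```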

11 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: Evidence is provided that such an approach could effectively discriminate indoor-activity High Density Regions, which may subsequently be transferred to datasets originating from seniors' real homes in the light of context-aware gait analysis.
Abstract: Gait analysis is nowadays considered a promising contributor to the early detection of cognitive and physical deterioration in elderly people. However, the majority of recent efforts on indoor gait analysis methodologies are limited in that they only exploit the average walking speed. Applying density-based clustering algorithms to indoor location datasets could accelerate context awareness in gait analysis and consequently augment information quality with regard to underlying gait disorders. This work presents the application of DBSCAN, a well-known algorithm for knowledge discovery, to indoor Kinect location datasets collected in the Active and Healthy Aging Living Lab in the Lab of Medical Physics of the Aristotle University of Thessaloniki. The aim of the paper is to provide evidence that such an approach can effectively discriminate indoor-activity High Density Regions, which may subsequently be transferred to datasets originating from seniors' real homes in the light of context-aware gait analysis.
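A sketch of the clustering step with scikit-learn's DBSCAN on 2-D location samples; the file name, eps and min_samples are assumptions that would need tuning to the Kinect dataset's spatial scale:

```python
# DBSCAN over indoor (x, y) position samples to find High Density Regions.
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical file: rows of x, y coordinates in metres.
positions = np.loadtxt("kinect_xy.csv", delimiter=",")
labels = DBSCAN(eps=0.3, min_samples=20).fit_predict(positions)

# Label -1 marks noise; every other label is one high-density activity region.
for region in sorted(set(labels) - {-1}):
    centroid = positions[labels == region].mean(axis=0)
    print(f"region {region}: centroid {centroid}, {np.sum(labels == region)} samples")
```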

11 citations


Proceedings ArticleDOI
02 Nov 2015
TL;DR: A low-invasive framework for estimating changes in cognitive performance using heart rate variability has the potential to help managerial personnel produce performance-change reports for their workers, suggest reasons for changes in performance, and encourage changes in working style based on HRV.
Abstract: This paper presents a low-invasive framework for estimating changes in cognitive performance using heart rate variability (HRV). Although HRV is a common physiological indicator of autonomic nervous activity or central nervous fatigue, there are individual differences in the relationship between HRV and such internal states. The new framework enables an estimation model to be determined from the HRV characteristics of individuals performing tasks that require cognitive effort. It also enables users working in a chair to have changes in their cognitive performance estimated without interrupting their work or using the many devices that most previous methods require. Experimental results show the framework can estimate mental fatigue, defined based on cognitive performance, from HRV at the same level as previous work that used more invasive methods (a multi-channel electroencephalogram (EEG) sensor or multiple vital sensors). It can also estimate changes in cognitive performance for most subjects, and one of the proposed methods in the framework yields more effective and useful estimates than the others. The framework therefore has the potential to help managerial personnel produce performance-change reports for their workers, suggest reasons for changes in performance, and encourage changes in working style based on HRV.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: The proposed algorithm is robust to outliers, and experimental results show that models learned using data scaled by the proposed algorithm generally outperform the ones using min-max mapping and z-score which are currently the most commonly used data scaling algorithms.
Abstract: Gene expression data are widely used in classification tasks for medical diagnosis. Data scaling is recommended and helpful for learning classification models. In this study, we propose a data scaling algorithm that transforms the data uniformly to an appropriate interval by learning a generalized logistic function to fit the empirical cumulative distribution function of the data. The proposed algorithm is robust to outliers, and experimental results show that models learned from data scaled by the proposed algorithm generally outperform those using min-max mapping and z-score, which are currently the most commonly used data scaling algorithms.
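A sketch of the scaling idea under an assumed three-parameter generalized logistic form (the paper's exact parameterization and fitting procedure are not specified here): fit the function to each feature's empirical CDF, then use it as the scaling map.

```python
# Fit a generalized logistic function to a feature's empirical CDF and use
# the fitted function to scale values into (0, 1). The functional form and
# initial guesses are assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def gen_logistic(x, a, k, x0):
    return 1.0 / (1.0 + np.exp(-k * (x - x0))) ** a

def fit_scaler(feature: np.ndarray):
    xs = np.sort(feature)
    ecdf = np.arange(1, len(xs) + 1) / len(xs)       # empirical CDF values
    params, _ = curve_fit(gen_logistic, xs, ecdf,
                          p0=[1.0, 1.0, np.median(xs)], maxfev=10000)
    return lambda x: gen_logistic(x, *params)        # scaling map into (0, 1)

rng = np.random.default_rng(0)
expr = rng.lognormal(0.0, 1.0, 500)                  # outlier-heavy synthetic feature
scale = fit_scaler(expr)
print(scale(np.array([0.5, 1.0, 50.0])))             # extreme outliers saturate near 1
```

Because the fitted curve saturates at the tails, extreme outliers map close to 0 or 1 instead of distorting the whole scale, which is the robustness property the abstract claims over min-max mapping.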

Proceedings ArticleDOI
02 Nov 2015
TL;DR: For the first time, an unsupervised methodology is evaluated using more than 978 hours of EEG recordings from a public database, achieving high seizure detection sensitivity with significantly reduced human intervention.
Abstract: An unsupervised methodology for the detection of Epileptic seizures in EEG recordings is proposed. The time-frequency content of the EEG signals is extracted using the Short Time Fourier Transform. The analysis focuses on the EEG energy distribution among the well-established delta, theta and alpha rhythms (2–13 Hz), as energy variations in these frequency bands are widely associated with seizure activity. Relying on seizure rhythmicity, the classification is performed by isolating the segments where each rhythm is more clearly and dominantly expressed over the others. For the first time, an unsupervised methodology is evaluated using more than 978 hours of EEG recordings from a public database. The results show that the proposed methodology achieves high seizure detection sensitivity with significantly reduced human intervention.
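A sketch of the band-energy computation at the core of the method, using SciPy's STFT; the sampling rate, window length and synthetic signal are assumptions:

```python
# STFT-based energy in the delta/theta/alpha range (2-13 Hz) per time frame,
# the quantity the abstract ties to seizure rhythmicity.
import numpy as np
from scipy.signal import stft

fs = 256                                   # Hz, assumed EEG sampling rate
rng = np.random.default_rng(0)
eeg = rng.normal(0, 1, fs * 60)            # one minute of synthetic signal

f, t, Z = stft(eeg, fs=fs, nperseg=2 * fs) # 2 s analysis windows
band = (f >= 2) & (f <= 13)                # delta + theta + alpha range
band_energy = np.sum(np.abs(Z[band]) ** 2, axis=0)
total_energy = np.sum(np.abs(Z) ** 2, axis=0)

# Frames where the 2-13 Hz rhythms dominate are candidate seizure activity.
dominance = band_energy / total_energy
print(dominance[:5])
```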

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This research confirmed the assumption that human perception of electrical stimulation improves as frequency rises, but it also opened new questions and provided ideas for future work.
Abstract: Electrical stimulation and vibration stimulation are the two most investigated interfaces for conveying information to humans. It is well known that a change in electrical stimulation is not linearly related to the sensation that humans perceive. This paper presents an experiment, along with its results, investigating the relationship between changes in electrical stimulation parameters (e.g. pulse frequency and pulse width) and the perceived sensation on human skin. Our assumption is that human perception of electrical stimulation improves as frequency rises (from 10 to 100 Hz), i.e. the resolution of perception rises. Psychophysics algorithms were used both for finding the minimal difference in sensation intensity that humans can detect for different parameters (the just noticeable difference) and for data analysis. This research confirmed the assumption but also opened new questions and provided ideas for future work.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: The results indicate that speech disorders are seen significantly more in patients whose UPDRS exceeds the experimentally determined threshold value (15), which validates that vocal impairments can be used as early indicators of the disease.
Abstract: Recently, there is increasing motivation to develop telemonitoring systems that enable cost-effective screening of Parkinson's Disease (PD) patients. These systems are generally based on measuring the motor system disorders seen in PD patients with the help of non-invasive data collection tools. Vocal impairment is one of the most common PD symptoms in the early stages of the disease, and building such telemonitoring systems around detecting the level of vocal impairment results in reliable motor UPDRS tracking systems. In this paper, we aim to determine the optimal UPDRS threshold value that can be discriminated by vocal features extracted from the sustained vowel phonations of PD patients. For this purpose, we used a publicly available PD telemonitoring dataset consisting of speech recordings of 42 PD patients. We converted the UPDRS prediction problem into a binary classification problem for various motor UPDRS threshold values, and fed the features to k-Nearest Neighbor and Support Vector Machine classifiers to discriminate PD patients whose UPDRS is less than or greater than the specified threshold value. The results indicate that speech disorders are seen significantly more in patients whose UPDRS exceeds the experimentally determined threshold value (15). Besides, considering that the motor UPDRS ranges from 0 to 108, the relatively low UPDRS threshold of 15 validates that vocal impairments can be used as early indicators of the disease.
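The threshold search reduces to a family of binary classification problems. A sketch with a k-NN classifier and cross-validated accuracy per candidate threshold; the vocal feature matrix `X` and UPDRS vector are placeholders for the telemonitoring data:

```python
# Turn motor-UPDRS prediction into binary classification at a candidate
# threshold and score how well vocal features discriminate the two groups.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def threshold_accuracy(X: np.ndarray, updrs: np.ndarray, thr: float) -> float:
    y = (updrs > thr).astype(int)          # 1 = UPDRS above the threshold
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X, y, cv=5).mean()

# Scan candidate thresholds and keep the most discriminable one; the paper
# reports 15 on its 42-patient dataset.
# best = max(range(5, 40), key=lambda t: threshold_accuracy(X, updrs, t))
```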

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This study presents a set of automatic methods for quantifying the motor symptoms of PD and shows that these automatically extracted features can be used to distinguish PD from other movement disorders causing tremor, namely essential tremor (ET), functional tremor (FT) and enhanced physiological tremor (EPT).
Abstract: An easily performed and objective test of patients' fine motor skills would be valuable in the diagnosis of Parkinson's disease (PD). In this study we present a set of automatic methods for quantifying the motor symptoms of PD and show that these automatically extracted features can be used to distinguish PD from other movement disorders causing tremor, namely essential tremor (ET), functional tremor (FT) and enhanced physiological tremor (EPT). The classification accuracies (mean of sensitivity and specificity) for separating PD from the other tremor syndromes were 82.0% for ET, 69.8% for FT and 72.2% for EPT.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: A web-based simulator is developed for generating image sequences of moving spermatozoa cells and for assessing multiple-object tracking algorithms, especially Computer Aided Sperm Analysis (CASA) systems.
Abstract: In this research, a web-based simulator is developed that can be used to generate image sequences of moving spermatozoa cells. It can be used to assess multiple-object tracking algorithms, especially Computer Aided Sperm Analysis (CASA) systems. The developed software offers many useful parameters, such as image blurring and noise addition, and gives full control over sperm counts and types. To illustrate the performance of the developed simulator, three parameters (spermatozoa population, standard deviation of the Gaussian blur filter, and noise intensity) were swept, and the results of three different multiple-object tracking algorithms were compared as an application of this simulation.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This study investigates a cost-sensitive approach for false alarm suppression while keeping near-perfect true alarm detection rates, outperforming state-of-the-art methods, which compromise true alarm detection rate for a higher false alarm suppression rate, on these challenging applications.
Abstract: High false alarm rates in intensive care units (ICUs) cause desensitization among care providers, thus risking patients' lives. Providing early detection of true and false cardiac arrhythmia alarms can alert hospital personnel and avoid alarm fatigue, so that they act only on true life-threatening alarms, hence improving efficiency in ICUs. However, suppressing false alarms cannot be an excuse to suppress true alarm detection rates. In this study, we investigate a cost-sensitive approach for false alarm suppression while keeping near-perfect true alarm detection rates. Our experiments on two life-threatening cardiac arrhythmia datasets from Physionet's MIMIC II repository provide evidence that the proposed method is capable of identifying patterns that can distinguish false and true alarms using, on average, 60% of the available time series length. Using temporal uncertainty estimates of time series predictions, we were able to estimate the confidence in our early classification predictions, thereby providing a cost-sensitive prediction model for ECG signal classification. The results from the proposed method are interpretable, providing medical personnel a visual verification of the predicted results. In the conducted experiments, moderate false alarm suppression rates were achieved (34.29% for Asystole and 20.32% for Ventricular Tachycardia) while keeping near-100% true alarm detection, outperforming state-of-the-art methods, which compromise true alarm detection rate for a higher false alarm suppression rate, on these challenging applications.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This paper has proposed a k-mer based database searching and local alignment tool using box queries on BoND-SD-tree indexing, which is quite efficient for indexing and searching in Non-Ordered Discrete Data Space (NDDS).
Abstract: In the past, genome sequence databases used main-memory indexing, such as the suffix tree, for fast sequence searches. With next generation sequencing technologies, the amount of sequence data being generated is huge, and main-memory indexing is limited by the amount of memory available. K-mer based techniques are increasingly used for various genome sequence database applications such as local alignment. K-mers can also provide an excellent basis for creating efficient disk-based indexing. In this paper, we propose a k-mer based database searching and local alignment tool using box queries on BoND-SD-tree indexing. The BoND-tree is quite efficient for indexing and searching in a Non-Ordered Discrete Data Space (NDDS). We conducted experiments on searching DNA sequence databases using back-translated protein query sequences and compared the results with existing methods. We also implemented local alignment of back-translated protein query sequences against large DNA sequence databases using this index-based k-mer search. The performance of this local alignment approach has been compared with that of NCBI's Tblastn. The results are quite promising and justify the significance of the proposed approach.
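A sketch of the k-mer side of the pipeline: decomposing sequences into k-mers with their positions, the raw material for a disk-based index and for seeding local alignments. The BoND-tree and box-query machinery are not reproduced, and the sequence and k below are illustrative:

```python
# Build an in-memory k-mer -> (sequence id, offset) map; a disk-based index
# such as the BoND-tree would replace the dict in the real system.
from collections import defaultdict

def kmers(seq: str, k: int):
    for i in range(len(seq) - k + 1):
        yield i, seq[i:i + k]

def build_kmer_index(sequences: dict, k: int = 11):
    """Map each k-mer to the (sequence id, offset) pairs where it occurs."""
    index = defaultdict(list)
    for seq_id, seq in sequences.items():
        for pos, mer in kmers(seq, k):
            index[mer].append((seq_id, pos))
    return index

index = build_kmer_index({"chr_demo": "ACGTACGTGGACGTACGTAA"}, k=5)
print(index["ACGTA"])   # hits that would seed a local alignment
```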

Proceedings ArticleDOI
02 Nov 2015
TL;DR: A framework for classifying epileptic and non-epileptic events from multi-channel EEG data is proposed, in which the non-epileptic class consists of two types of paroxysmal episodes of loss of consciousness, namely the psychogenic non-epileptic seizure (PNES) and the vasovagal syncope (VVS).
Abstract: Misdiagnosis of epilepsy, even by experienced clinicians, can expose patients to medical procedures and treatments with potential complications. Moreover, diagnostic delays (7 to 10 years on average) impose an economic burden at the individual and population levels. In this paper, a framework for classifying epileptic and non-epileptic events from multi-channel EEG data is proposed. In contrast to relevant studies in the literature, in this study the non-epileptic class consists of two types of paroxysmal episodes of loss of consciousness, namely the psychogenic non-epileptic seizure (PNES) and the vasovagal syncope (VVS). EEG signals are represented in the spectral-spatial-temporal domain. A tensor-based approach is employed to extract signature features to feed the classification models. Tucker decomposition is applied to capture the essence of the original, high-dimensional feature space and extract a multilinear discriminative subspace. The classification models were evaluated on EEG epochs from 11 subjects in an inter-subject cross-validation setting and achieved an accuracy of 96%.
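A sketch of the tensor step with TensorLy: arrange epochs as a spectral x spatial x temporal tensor, apply Tucker decomposition, and project onto the learned factors to obtain compact features. The dimensions and ranks are illustrative, not the paper's:

```python
# Tucker decomposition of a spectral x spatial x temporal EEG tensor.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
X = tl.tensor(rng.normal(size=(40, 19, 128)))   # freq bins x channels x time frames

core, factors = tucker(X, rank=[10, 8, 16])     # multilinear subspace ranks (assumed)

# Project the tensor onto the factor matrices to get compact features that
# could feed a downstream classifier.
features = tl.tenalg.multi_mode_dot(X, [f.T for f in factors])
print(core.shape, features.shape)               # both (10, 8, 16)
```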

Proceedings ArticleDOI
02 Nov 2015
TL;DR: A web application that combines computational methodologies and data visualization techniques to deliver comprehensible illustrations of cellular complexity for voluminous molecular datasets, linking individual genes with the biological processes in which they participate and prioritizing those processes according to their involvement in the studied cellular phenotype.
Abstract: The development of several biomedical ontologies and databases for structuring and categorizing knowledge in the life sciences, particularly those describing the functions and interactions of biomolecules, has contributed to the rapid inflation of the semantic information universe that describes cellular complexity at different scales. Together with the ever-growing volume of high-throughput molecular data generated by DNA microarray or NGS experiments, they stress the need for powerful, intuitive data representation methods that make sense of the myriad interactions and pinpoint those with a causal contribution to the phenotypes studied. In this paper, we present a web application that combines computational methodologies and data visualization techniques to deliver comprehensible illustrations of cellular complexity for voluminous molecular datasets, linking individual genes with the relevant biological processes in which they participate, while prioritizing those processes according to their involvement in the studied cellular phenotype. The application highlights molecular information (functions, processes, cellular compartments) according to several criteria (enrichment score, expression, etc.), sorts out regulatory hub genes with a pivotal role in the studied phenotype and, most importantly, provides novel visualization modules that offer an efficient, intuitive illustration aiding easy systems-level interpretation. The pipeline is showcased here using a colon cancer dataset.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This paper uses BWAKIT- and GATK-based software for processing large volumes of genomic data, referred to as the "NGS workflow at SIDRA", and analyzes the performance bottlenecks and application optimization in terms of scalability and of running multiple instances of the NGS workflow with different genome data within a node.
Abstract: Advances in Next Generation Sequencing (NGS) technology are associated with an ever-increasing volume of genomic data every year. These genomic data are efficiently processed with empirical parallelism using High Performance Computing (HPC). The processed data can be used for genome-wide association studies, genetics, personalized medicine and many other areas. Different kinds of algorithms and implementations are used in different phases of genome processing. In this paper, we used BWAKIT- and GATK-based software for processing large volumes of genomic data, referred to as the "NGS workflow at SIDRA". We used BWAKIT for genome alignment and GATK for variant discovery in the NGS workflow, which require heavy computation and a large amount of memory, respectively. We observed that CPU utilization is no more than 45% during variant discovery; hence, it is necessary to understand the optimal selection of resources (in terms of the number of threads or cores) during NGS workflow automation. We analyzed the performance bottlenecks and application optimization in terms of "scalability" (using the maximum available CPUs and memory) and "multiple instances of the NGS workflow with different genome data within a node" (processing a larger volume of genome data concurrently with a limited set of CPUs and memory). We observed 40%, 65%, 71% and 76% improvements in performance while processing 2, 4, 8 and 16 samples concurrently using our own scheduling heuristics. As a result, our proposed NGS workflow automation improves performance by up to 76% compared to application-scalability-based workflows.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: A computationally efficient method to estimate the optimal order of the autoregressive (AR) modeling of electroencephalographic (EEG) signals in order to use the AR coefficients as features for the analysis of EEG signals and the automatic detection of epileptic seizures is proposed.
Abstract: In this paper, we propose a computationally efficient method to estimate the optimal order of the autoregressive (AR) modeling of electroencephalographic (EEG) signals in order to use the AR coefficients as features for the analysis of EEG signals and the automatic detection of epileptic seizures. The estimation of the optimal AR-order is made using regression analysis of statistical features extracted from the samples of the EEG signals. The proposed method was evaluated in both background and ictal EEG segments using recordings from 10 epileptic patients. The experimental evaluation showed that the mean absolute error of the estimated optimal AR order is approximately 4 units.
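For context, a sketch of the brute-force reference computation that the paper's regression is trained to shortcut: selecting the AR order that minimizes an information criterion for a segment. The criterion, maximum order and synthetic segment are assumptions, and the paper's regression model itself is not shown:

```python
# Exhaustive AR-order selection via AIC with statsmodels; the paper instead
# predicts this order cheaply from statistical features of the segment.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def optimal_ar_order(segment: np.ndarray, max_order: int = 30) -> int:
    aics = []
    for p in range(1, max_order + 1):
        res = AutoReg(segment, lags=p).fit()
        aics.append(res.aic)
    return int(np.argmin(aics)) + 1      # order with the lowest AIC

rng = np.random.default_rng(0)
seg = rng.normal(size=512)               # stands in for a background EEG segment
print(optimal_ar_order(seg))
```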

Proceedings ArticleDOI
02 Nov 2015
TL;DR: The aim of this paper is to present the use of the Walsh-Hadamard transform in the analysis of electromyographic signals representing uterine contractile activity in pigs.
Abstract: The aim of this paper is to present the use of the Walsh-Hadamard transform in the analysis of electromyographic signals representing uterine contractile activity in pigs. Fourier spectral analysis is widely used in many biomedical applications; however, for binary time series the Walsh-Hadamard transform, based on square or rectangular waves with peaks of ±1, is more accurate. The dominant normalized sequency can serve as a parameter describing the biomedical signal, which may have diagnostic importance.
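A sketch of a Walsh-ordered Hadamard analysis and the dominant normalized sequency, built on SciPy's Hadamard matrix with rows reordered by their number of sign changes; the signal length must be a power of two, and the synthetic ±1 series is illustrative:

```python
# Walsh spectrum via a sequency-ordered Hadamard matrix, plus the dominant
# normalized sequency the abstract proposes as a descriptive parameter.
import numpy as np
from scipy.linalg import hadamard

def walsh_spectrum(signal: np.ndarray) -> np.ndarray:
    n = len(signal)                          # must be a power of two
    H = hadamard(n)
    # Sequency of a row = its number of sign changes.
    sequency = np.sum(np.abs(np.diff(H, axis=1)) // 2, axis=1)
    H_walsh = H[np.argsort(sequency)]        # rows reordered by sequency
    return H_walsh @ signal / n

rng = np.random.default_rng(0)
emg = np.sign(rng.normal(size=64))           # binary-like +/-1 series
spec = walsh_spectrum(emg)
dominant = np.argmax(np.abs(spec[1:])) + 1   # skip the DC term
print("dominant normalized sequency:", dominant / len(emg))
```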

Proceedings ArticleDOI
02 Nov 2015
TL;DR: The aim of this study is to develop a framework for improving Health Care processes by introducing Business Process Modeling Notation (BPMN) methodologies.
Abstract: The importance of developing the Primary Care network in a developed country is indisputably high. The increasing pressure for productivity improvement and cost reduction requires activities focused on the control and optimization of care processes to improve their efficiency and effectiveness. A keystone of this priority is the use of effective and comprehensive processes in everyday practice. The aim of this study is to develop a framework for improving Health Care processes by introducing Business Process Modeling Notation (BPMN) methodologies. The large majority of processes in organizations and healthcare centers belonging to the Greek Primary Care system were collected. For each process, a BPMN diagram was developed and presented. The analysis and quantitative indexes were captured by integrating the processes into a web application designed and implemented for this purpose. The application's outcome has been partially analyzed, as it is to be tested in a Health Care Center in Kissamos, Crete, Greece.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This paper can serve as a starting point for determining an adequate combination of technical equipment and specifications that works favorably for the operation of a planned real-time biofeedback application.
Abstract: Science and technology are ever more frequently used in sports to achieve a competitive advantage. Motion tracking systems, in connection with biomechanical biofeedback, help accelerate motor learning. Requirements for various parameters important in real-time biofeedback applications are discussed, with special focus on feedback loop delays and real-time operation. Optical tracking and inertial sensor tracking systems are presented and compared. Real-time sensor signal acquisition and real-time processing challenges, in connection with biomechanical biofeedback, are presented. This paper can serve as a starting point for determining an adequate combination of technical equipment and specifications that works favorably for the operation of a planned real-time biofeedback application.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: The results show that the superior ligaments are most beneficial for an accurate representation of the middle ear frequency response.
Abstract: The aim of this study is to investigate the effect of the mallear and incudal ligaments on the tympanic membrane and stapes footplate displacement in a finite element model of the middle ear. Three cases were simulated: one without the ligaments, one including the posterior incudal and the anterior mallear ligaments, and one additionally including the superior mallear and incudal ligaments. A maximum stapes footplate displacement of 0.023 μm was observed at a frequency of 1024 Hz when exciting the tympanic membrane with a sinusoidal sound pressure level (SPL) of 90 dB. The computational results were validated against experimental measurements from the literature. In conclusion, our results show that the superior ligaments are most beneficial for an accurate representation of the middle ear frequency response. Excellent agreement is observed between our results and human temporal bone experimental data as well as other finite element studies.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: An accurate geometrical model of a customized plate implant for the fixation of mandible fractures is presented; it can be used for the production of plate implants and for the simulation of orthodontic interventions.
Abstract: Internal mandible fractures are a common injury because of the mandible's lack of structural support. Various fixation elements are used in the treatment of such injuries. To improve the quality of orthodontic interventions, anatomically correct and geometrically accurate customized implants are necessary. In this paper, an accurate geometrical model of a customized plate implant for the fixation of a mandible fracture is presented. A new method has been developed for the creation of such models, based on reverse engineering techniques applied to the CT scan of the specific patient's mandible. With this method it is possible to create a geometrical model of the customized plate implant whose geometry and topology conform to the shape of the specific patient's mandible. The side of the implant that is in contact with the periosteum, the outer layer of the mandible, is fully aligned with the shape of the mandible's outer surface near the fracture. The obtained model(s) can be used for the production of plate implants and/or for the simulation of orthodontic interventions.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: The measured force values and positions of body parts are used together with finite element method simulation to obtain the von Mises stress distribution in the tibia, femur and cartilage of the knee joint.
Abstract: In this paper, a methodology for vertical jump analysis is presented, along with measured results of the vertical force during jumps. Six subjects (members of the "Red Star" football club) performed different types of jump (flywheel jump, jump without flywheel, and jump with landing on the left or right foot) while the vertical ground reaction force was measured using a force plate. One axial load cell force sensor was also used. The measured force values and positions of body parts are used together with finite element method simulation to obtain the von Mises stress distribution in the tibia, femur and cartilage of the knee joint. The average value of the von Mises stress has a significant impact on injuries and the condition of the knee cartilage.

Proceedings ArticleDOI
02 Nov 2015
TL;DR: This study identified sub-paths disrupted by STAT3 in functional glioma pathways, and the proposed algorithmic approach is expected to aid researchers in determining the biological relevance of binding sites over functional sub-paths and to provide insights for new disease treatments.
Abstract: Demand for analyzing very large datasets is increasing, especially with the introduction of chromatin immunoprecipitation sequencing, a recent Next Generation Sequencing method used to analyze protein interactions with DNA. The development of new technologies is revolutionizing genome-wide analysis and scientists' ability to better understand the underlying biology, but inferring gene regulatory networks from such data is still a major challenge in systems biology. The complexity of reactions at the molecular level in living cells, and the fact that such knowledge relates to specific phenotypes, necessarily imply that a key molecular target should be considered within the framework of its gene regulatory network. The objective of our study is to explore the effect of proteins under specific conditions (e.g. treatment or starvation) on functional sub-pathways for a specific phenotype. Using public microarray expression datasets for glioma and the KEGG human gene regulatory networks as proof of concept, we identified sub-paths disrupted by STAT3 in functional glioma pathways. We expect that the proposed algorithmic approach could aid researchers in determining the biological relevance of binding sites over functional sub-paths and provide insights for new disease treatments.