
Showing papers presented at "International Conference on Bioinformatics in 2011"


Journal ArticleDOI
16 Jun 2011
TL;DR: This article presents a view of recent activity in corporate social media management hiring and contrasts it with webmaster hiring during the mid-1990s. The authors explain the core business competencies of strategy development, business analytics, creativity, and collaboration.
Abstract: The article focuses on the influence that social media is having on the corporate landscape. It offers a view of recent activity in corporate social media management hiring and contrasts it with the webmaster hiring of the mid-1990s. The hiring activity reflects the growth in the time people are spending online – socially. It also describes how social media management influences brand awareness and brand reputation. It explains the core business competencies of strategy development, business analytics, creativity, and collaboration. The article cites the level of hiring as a call for academic course offerings in corporate social media management.

92 citations


Journal ArticleDOI
12 Sep 2011
TL;DR: SEMM (Search Engine Marketing Management) as discussed by the authors is a generalization of traditional Search Engine Optimization (SEO) that focuses on return on investment (ROI) management instead of relevant traffic building (as is the case with mainstream SEO).
Abstract: As the number of sites on the Web increased in the mid-to-late 90s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay-per-click programs offered by Open Text in 1996 and then Goto.com in 1998. Goto.com later changed its name to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary money-makers for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010. Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "Search Engine Marketing" was proposed by Danny Sullivan in 2001 to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals. Some of the latest theoretical advances include Search Engine Marketing Management (SEMM). SEMM relates to activities including SEO but focuses on return on investment (ROI) management instead of relevant traffic building (as is the case with mainstream SEO). SEMM also integrates organic SEO, which tries to achieve a top ranking without paid placement, and pay-per-click SEO. For example, some attention is placed on web page layout design and how content and information are displayed to the website visitor.

79 citations


Journal ArticleDOI
01 May 2011
TL;DR: Using RFID and social networking technology, Conferator provides the means for effective management of personal contacts and corresponding conference information before, during, and after a conference.
Abstract: Conferator is a novel social conference system that provides the management of social interactions and context information in ubiquitous and social environments. Using RFID and social networking technology, Conferator provides the means for effective management of personal contacts and corresponding conference information before, during, and after a conference. We describe the system in detail, before we analyze and discuss results of a typical application of the Conferator system.

71 citations


Journal ArticleDOI
05 Jan 2011
TL;DR: This paper provides a conspectus of the major issues in cloud computing privacy and should be regarded as an introductory paper on this important topic.
Abstract: Cloud computing is a model for providing on-demand access to computing service via the Internet. In this instance, the Internet is the transport mechanism between a client and a server located somewhere in cyberspace, as compared to having computer applications residing on an “on premises” computer. Adoption of cloud computing practically eliminates two ongoing problems in IT service provisioning: the upfront costs of acquiring computational resources and the time delay of building and deploying software applications. The technology is not without a downside, which in this case is the privacy of business and personal information. This paper provides a conspectus of the major issues in cloud computing privacy and should be regarded as an introductory paper on this important topic.

66 citations


Journal ArticleDOI
16 Jun 2011
TL;DR: In this article, the authors examine why the implementation of organizational change is so complicated and suggest ways to break down the barriers to change, noting that a significant failure rate exists when it comes to organizational change.
Abstract: A significant failure rate exists when it comes to organizational change. Managers understand the importance of organizational change, but many of them do not know how to execute it effectively. This study examines the reasons why the implementation of organizational change is so complicated and suggests ways to break down the barriers to change.

49 citations


Journal ArticleDOI
Chris Rose
25 Jan 2011
TL;DR: Social media web sites allow users to share information, communicate with each other, network, and interact, but because of the easy transfer of information between different social media sites, information that should be private becomes public, opening users to serious security risks.
Abstract: Social media web sites allow users to share information, communicate with each other, network, and interact, but because of the easy transfer of information between different social media sites, information that should be private becomes public and opens users to serious security risks. In addition, there is massive over-sharing of information by the users of these sites, and if this is combined with the increased availability of location-based information, it can all be aggregated, causing unacceptable risks and unintended consequences for users.

49 citations


Journal ArticleDOI
30 Nov 2011
TL;DR: A parameter-free method, interconnectedness (ICN), ranks candidate genes by assessing their closeness to known disease genes in a network, and can well complement other network-based methods in the context of prioritizing candidate disease genes.
Abstract: Genome-wide disease-gene finding approaches may sometimes provide us with a long list of candidate genes. Since using purely experimental approaches to verify all candidates could be expensive, a number of network-based methods have been developed to prioritize candidates. Such tools usually have a set of parameters pre-trained using available network data. This means that re-training network-based tools may be required when existing biological networks are updated or when networks from different sources are to be tried. We developed a parameter-free method, interconnectedness (ICN), to rank candidate genes by assessing their closeness to known disease genes in a network. ICN was tested using 1,993 known disease-gene associations and achieved a success rate of ~44% using a protein-protein interaction network under a test scenario of simulated linkage analysis. This performance is comparable with those of other well-known methods, and ICN outperforms other methods when a candidate disease gene is not directly linked to known disease genes in a network. Interestingly, we show that a combined scoring strategy could enable ICN to achieve an even better performance (~50%) than other methods used alone. ICN, a user-friendly method, can well complement other network-based methods in the context of prioritizing candidate disease genes.
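The abstract does not give the exact ICN statistic, so the following is only a minimal sketch of the general idea: score each candidate gene by its network closeness to known disease genes. The toy protein-protein interaction graph, the gene names (`G1`..`G5`), and the closeness function 1/(1+d) are all assumptions for illustration, not the paper's definitions.

```python
from collections import deque

def shortest_path_lengths(graph, source):
    """BFS distances from source in an unweighted interaction graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, ()):
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def icn_like_score(graph, candidate, disease_genes):
    """Closeness-style score: sum of 1/(1+d) over reachable known disease
    genes. Illustrative only; the published ICN statistic differs."""
    dist = shortest_path_lengths(graph, candidate)
    return sum(1.0 / (1 + dist[g]) for g in disease_genes if g in dist)

# Toy protein-protein interaction network (hypothetical gene names).
ppi = {
    "G1": ["G2", "G3"],
    "G2": ["G1", "G4"],
    "G3": ["G1", "G4"],
    "G4": ["G2", "G3", "G5"],
    "G5": ["G4"],
}
known = ["G4", "G5"]
candidates = ["G1", "G2", "G3"]
ranking = sorted(candidates, key=lambda g: icn_like_score(ppi, g, known),
                 reverse=True)
```

Candidates adjacent to known disease genes (`G2`, `G3`) outrank the more distant `G1`; a real run would substitute a genome-scale PPI network for the toy dictionary.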

47 citations


Journal ArticleDOI
30 Dec 2011
TL;DR: A set of 15 web metrics that can play an important role in understanding web visitor behavior is defined, and suggestions are provided on how these metrics can help make a website more popular.
Abstract: Web metrics are established goals and standards for measuring website performance. Web Analytics can be used to analyze and statistically process user and customer behavior. Web Analytics especially refers to the use of data collected from a website to determine which aspects of the website work towards the business objectives. This paper provides a web-metrics-based approach that can be used to analyze and improve web usage patterns. We define a set of 15 web metrics that can play an important role in understanding web visitor behavior and provide suggestions on how these metrics can help make a website more popular. We describe the approach by considering a case study of the website gndu.ac.in for data collected over a period of five months.
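As an illustration of the kind of measurement the paper describes (the abstract does not enumerate its 15 metrics), here is a minimal sketch computing a few widely used web metrics from a hypothetical hit log: page views, unique visitors, bounce rate, and pages per visit. The record format and metric definitions are standard textbook ones, not the paper's.

```python
from collections import defaultdict

def basic_web_metrics(hits):
    """Compute a few common web metrics from (visitor_id, page) hit records.
    Metric definitions are generic, not the paper's 15-metric set."""
    page_views = len(hits)
    pages_by_visitor = defaultdict(list)
    for visitor, page in hits:
        pages_by_visitor[visitor].append(page)
    unique_visitors = len(pages_by_visitor)
    # Bounce rate: share of visitors who viewed exactly one page.
    bounces = sum(1 for pages in pages_by_visitor.values() if len(pages) == 1)
    return {
        "page_views": page_views,
        "unique_visitors": unique_visitors,
        "bounce_rate": bounces / unique_visitors if unique_visitors else 0.0,
        "pages_per_visit": page_views / unique_visitors if unique_visitors else 0.0,
    }

# Hypothetical hit log for a university site.
hits = [("v1", "/"), ("v1", "/courses"), ("v2", "/"), ("v3", "/admissions")]
metrics = basic_web_metrics(hits)
```

In practice the hit records would come from server logs or a page-tagging script collected over the study period.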

45 citations


Proceedings Article
01 Jan 2011
TL;DR: In this article, the authors focus on the fundamental variables influencing equity prices of 'A' Group and 'B' Group shares of the banking companies listed at BSE and find that company specific factors such as market capitalization and dividend yield have significant influence on the equity prices.
Abstract: The Indian stock market has witnessed a paradigm shift in the last two decades of economic reforms. Knowledge of the relative influence of fundamental factors on equity prices is useful to corporate management, government and investors. The post reform period witnessed deregulatory initiatives in the banking sector, an important constituent of the financial sector of the economy. The focus of this paper is on the fundamental variables influencing equity prices of ‘A’ Group and ‘B’ Group shares of the banking companies listed at BSE. Correlation and multiple regression analysis were employed in the analysis of data. The findings reveal that company specific factors such as market capitalization and dividend yield have significant influence on the equity prices of ‘A’ group shares and in the case of group ‘B’ shares book value per share emerged significant.

40 citations


Journal ArticleDOI
01 Dec 2011
TL;DR: A gentle, not too formal introduction to smoothed analysis is given by means of two examples: the k-means method for clustering and the Nemhauser/Ullmann algorithm for the knapsack problem.
Abstract: Many algorithms perform very well in practice but have poor worst-case performance. The reason for this discrepancy is that worst-case analysis is often far too pessimistic a measure of an algorithm's performance. In order to provide a more realistic performance measure that can explain the practical performance of algorithms, smoothed analysis has been introduced. It is a hybrid of classical worst-case analysis and average-case analysis, in which performance is measured on inputs that are subject to random noise. We give a gentle, not too formal introduction to smoothed analysis by means of two examples: the k-means method for clustering and the Nemhauser/Ullmann algorithm for the knapsack problem.
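The measurement smoothed analysis studies, performance on randomly perturbed inputs, can be sketched empirically. This toy demo runs plain Lloyd's k-means on 1-D points and averages the iteration count over Gaussian perturbations of the input; it is illustrative only and makes no claim about the bounds proved in the paper.

```python
import random

def kmeans_iterations(points, k, max_iter=1000):
    """Lloyd's k-means on 1-D points; returns iterations until centers stop moving."""
    centers = list(points[:k])  # deterministic init keeps the demo reproducible
    for it in range(1, max_iter + 1):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            clusters[nearest].append(p)
        new_centers = [sum(c) / len(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        if new_centers == centers:
            return it
        centers = new_centers
    return max_iter

def smoothed_iterations(points, k, sigma, trials=10, seed=0):
    """Average iteration count over Gaussian perturbations of the input:
    the quantity smoothed analysis studies, measured here empirically."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        noisy = [p + rng.gauss(0.0, sigma) for p in points]
        total += kmeans_iterations(noisy, k)
    return total / trials

data = [float(i) for i in range(10)]
unperturbed = kmeans_iterations(data, 2)
smoothed_avg = smoothed_iterations(data, 2, sigma=0.5, trials=5)
```

Smoothed analysis asks how the expected iteration count grows with input size when `sigma > 0`, interpolating between worst-case (`sigma = 0`) and average-case (large `sigma`) analysis.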

37 citations


Journal ArticleDOI
30 Nov 2011
TL;DR: A comprehensive study suggests a genotype-phenotype correlation in β-mannosidosis and extrapolates observed mutations from one species to homologous positions in other organisms, based on the proximity of the mutations to the enzyme active site and the co-location of mutations from different organisms.
Abstract: Lysosomal β-D-mannosidase is a glycosyl hydrolase that breaks down the glycosidic bonds at the non-reducing end of N-linked glycoproteins. Hence, it is a crucial enzyme in the polysaccharide degradation pathway. Mutations in the MANBA gene, which codes for lysosomal β-mannosidase, result in improper coding and malfunctioning of the protein, leading to β-mannosidosis. Studying the location of mutations on the enzyme structure is a rational approach to understanding the functional consequences of these mutations. Accordingly, the pathology and clinical manifestations of the disease can be correlated with the genotypic modifications. The wild-type and inherited mutations of β-mannosidase were studied across four different species (human, cow, goat, and mouse), employing a previously demonstrated comprehensive homology modeling and mutational mapping technique, which reveals a correlation between the variation of genotype and the severity of phenotype in β-mannosidosis. The X-ray crystallographic structure of β-mannosidase from Bacteroides thetaiotaomicron was used as the template for 3D structural modeling of the wild-type enzymes containing all the associated ligands. These wild-type models subsequently served as templates for building mutational structures. Truncations account for approximately 70% of the mutational cases. In general, the proximity of mutations to the active site determines the severity of phenotypic expressions. Mapping mutations to the MANBA gene sequence has identified five mutational hot-spots. Although restrained by a limited dataset, our comprehensive study suggests a genotype-phenotype correlation in β-mannosidosis. A predictive approach for detecting likely β-mannosidosis is also demonstrated, in which we extrapolate observed mutations from one species to homologous positions in other organisms based on the proximity of the mutations to the enzyme active site and the co-location of mutations from different organisms.
Apart from aiding the detection of mutational hotspots in the gene, where novel mutations could be disease-implicated, this approach also provides a way to predict new disease mutations. Higher expression of the exoglycosidase chitobiase is said to play a vital role in determining disease phenotypes in human and mouse. A larger dataset of inherited mutations, as well as a parallel study of β-mannosidase and chitobiase activities in prospective patients, would help to better understand the underlying causes of β-mannosidosis.

Journal ArticleDOI
01 Sep 2011
TL;DR: An overview of a dissertation that addresses the problem of XSS as a whole: it starts with a systematic deduction of the causes and consequences of XSS, proceeds by presenting countermeasures to mitigate potential XSS-based attacks, and finally provides a type-based methodology that guarantees the creation of XSS-free applications.

Journal ArticleDOI
01 Jan 2011
TL;DR: It is argued that Privacy 3.0 should be a combination of (1) Data minimization, (2) User control of personal information disclosure, and (3) Contextual integrity.
Abstract: Over the last two decades, privacy has been fading away. Some people have even stated: you have zero privacy – get over it! As privacy researchers, we are not willing to accept this statement. Therefore, we analyze the causes of this fading away of privacy and develop a set of approaches to preserve or even regain privacy. We argue that Privacy 3.0 should be a combination of (1) Data minimization, (2) User control of personal information disclosure, and (3) Contextual integrity. Data minimization is one of the main motivations for the development of privacy-enhancing technologies, which aim to limit the collection and processing of personal data by data controllers. User control of personal information disclosure supports users in deciding which personal information is released to whom and in which situation. Contextual integrity provides a new quality of privacy by making the original context in which particular personal data were generated easily accessible to all entities that are aware of those data.

Journal ArticleDOI
12 Sep 2011
TL;DR: The authors discuss both broad historical and philosophical theories of strategic management, as well as specific communication and human resource management theories and practices, and conclude with an application chapter emphasizing how the Roman Catholic Church needs to develop a strategy to integrate learning and innovation in order to reconcile and communicate its central message locally.
Abstract: This paper discusses both broad historical and philosophical theories of strategic management, as well as specific communication and human resource management theories and practices. It concludes with an application chapter emphasizing how the Roman Catholic Church needs to develop a strategy to integrate learning and innovation in order to reconcile and communicate its central message locally. Although the Church is built upon a hierarchical organizational culture in which strict obedience to institutional directives dominates the communities it serves, its internal diversity is forcing the Vatican to ensure specialized sub-cultures are not polarized.

Journal ArticleDOI
30 Nov 2011
TL;DR: The results indicate that there are key differences in functions and evolutionary constraints among singleton genes and duplicated genes with or without alternative splicing, implying that gene duplication and alternative splicing may have different functional significance in the evolution of speciation diversity.
Abstract: Gene duplication provides resources for developing novel genes and new functions while retaining the original functions. In addition, alternative splicing can increase the complexity of expression at the transcriptome and proteome level without increasing the number of gene copies in the genome. Duplication and alternative splicing are thought to work together to provide diverse functions or expression patterns for eukaryotes. Previously, it was believed that duplication and alternative splicing were negatively correlated and probably interchangeable. We looked into the relationship between the occurrence of alternative splicing and duplication at different times after duplication events. We found duplication and alternative splicing were indeed inversely correlated if only recently duplicated genes were considered, but they became positively correlated when we took ancient duplications into account. Specifically, for slightly or moderately duplicated genes with gene families containing 2-7 paralogs, genes were more likely to evolve alternative splicing and had on average a greater number of alternative splicing isoforms after long-term evolution compared to singleton genes. On the other hand, large gene families (containing at least 8 paralogs) had a lower proportion of alternative splicing and fewer alternative splicing isoforms on average, even when ancient duplicated genes were taken into consideration. We also found that duplicated genes having alternative splicing were under tighter evolutionary constraints than those having no alternative splicing, and showed an enrichment of genes that participate in molecular transducer activities. We studied the association between occurrences of alternative splicing and gene duplication. Our results indicate that there are key differences in functions and evolutionary constraints among singleton genes and duplicated genes with or without alternative splicing.
This implies that gene duplication and alternative splicing may have different functional significance in the evolution of speciation diversity.
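The sign-flipping correlation the study reports can be sketched with a toy computation. The (paralog count, mean isoform count) pairs below are invented for illustration, chosen so that small families trend positive and large families trend negative; they are not the paper's data.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, computed with no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented (paralog count, mean alternative-splicing isoform count) pairs
# mimicking the reported trend: positive within small families (2-7
# paralogs), negative within large ones (>= 8 paralogs).
small_sizes, small_isoforms = [2, 3, 4, 5, 6, 7], [1.6, 1.9, 2.0, 2.2, 2.3, 2.5]
large_sizes, large_isoforms = [8, 10, 12, 16], [1.8, 1.5, 1.3, 1.1]
corr_small = pearson(small_sizes, small_isoforms)
corr_large = pearson(large_sizes, large_isoforms)
```

Splitting the correlation by family size is what lets a positive and a negative trend coexist in the same dataset, which is how the overall sign can depend on whether ancient duplications are included.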

Journal ArticleDOI
05 Jan 2011
TL;DR: In this article, the authors investigate the phenomenon of trust in its personal and impersonal forms, i.e., interpersonal and institutional trust with multiple foci (coworkers, supervisor, and institution), by addressing a threefold agenda.
Abstract: Mayer et al.'s (1995) integrative model of organizational trust clarified some of the confusion regarding interpersonal trust between two specific parties by differentiating between the factors that cause trust, trust itself, and the outcomes of trust: 1) it defined trust as the willingness of a party to be vulnerable to the actions of another party; 2) trust is in turn influenced by a trustor's propensity to trust and the trustor's perceived characteristics of a trustee (e.g., the trustee's trustworthiness based on his or her ability, benevolence, and integrity); 3) trust leads to actual risk taking in a relationship, since behavioral trust is the assuming of risk; and 4) the desired outcome results from that risk taking. However, researchers such as Luhmann (1979), McCauley & Kuhnert (1992), Zaheer et al. (1998), Costigan et al. (1998), McKnight et al. (1998, 2002), and Atkinson and Butcher (2003) hold the view that locating the root of trust in individual relationships is not in opposition to experiencing trust both inside interpersonal relationships and as an institutional phenomenon beyond them. Moreover, as Colquitt et al. (2007) investigated, it remains unclear which trust antecedents have the strongest relationships with trust, whether trust fully mediates the effects of trustworthiness and trust propensity on behavioral outcomes, and whether trust relationships vary according to whether the trustee is a leader or a coworker. Therefore, the aim of this conceptual paper is to investigate the phenomenon of trust in its personal and impersonal forms, i.e., interpersonal and institutional (organizational) trust, with multiple foci (coworkers, supervisor, and institution), by addressing a threefold agenda: (1) to explore the concept of trust by distinguishing its antecedents, i.e., the trustor's propensity to trust, the trustor's perceived characteristics of a trustee, and institutional trust, from trust itself; (2) to explore whether trust fully mediates the effects of trustworthiness, trust propensity, and institutional trust on behavioral outcomes such as organizational commitment, organizational citizenship behavior, and employee task performance; and (3) to explore whether trust relationships vary according to whether the trustee is a supervisor/leader or a coworker.

Journal ArticleDOI
25 Jan 2011
TL;DR: In this paper, the authors explore the characteristics of social entrepreneurship and the factors that make a difference in its success or failure, and shed some light on what social entrepreneurship is and what it is not.
Abstract: Social entrepreneurship is not new, but has gained greater visibility and recognition in recent years due to its growing worldwide impact. As in the case of business entrepreneurship, social entrepreneurship starts with an entrepreneur who has a novel idea, an innovative product or service, a creative approach to solving a perceived problem, a new business model, and/or a previously untried approach to product or service delivery. However, social entrepreneurship differs from business entrepreneurship because it is after sustainable solutions to societal problems and aims at social change rather than market expansion. It is, therefore, seen more as an agent of change than a profit-seeking enterprise. This paper explores the characteristics of social entrepreneurship, and the factors that make a difference in its success or failure. It also sheds some light on what a social entrepreneurship is and what it is not. Finally, it examines the missions and contributions of six successful social entrepreneurships: The Grameen Bank of Bangladesh, ADAPT of Egypt, BRAC of Bangladesh, Instituto de Pesquisas Ecologicas of Brazil, the Aravind Eye Care Hospitals of India, and Televerde’s Prison Call Centers of the United States. The impact of the first four has spread beyond their countries of origin, either through the geographic expansion of their operations or the application of the same concept or business model by social enterprises in other countries.

Journal ArticleDOI
05 Jan 2011
TL;DR: This paper reviews the alternatives for assessing the participation of an individual student on a team, discusses the cases in which each approach was used, and identifies the positives and negatives of each approach.
Abstract: Cooperative learning is an instructional model in which students work together toward a common goal. Research has clearly shown that cooperation results in higher levels of achievement. Although students may be part of a cooperative learning environment, they are also responsible for their own individual achievement. This makes student evaluation a challenge because you are evaluating individual as well as team effort. This paper will review the alternatives for assessing the participation of an individual student on a team, as well as discuss the cases in which each approach was used. It identifies the positives and negatives of each approach.

Journal ArticleDOI
12 Sep 2011
TL;DR: In this article, the authors examined the social networks of European business owners according to employment size after approximately three years of survival as a business and found that the sources of advice used at start-up varied by the size of business with employers of ten or more people more likely to report having received advice from professional acquaintances, financial institutions and training programs, and less likely to have received any advice from family and friends or professional consultants.
Abstract: Social networks are important to new entrepreneurs and small business owners because the ability to access information, advice, and necessary resources is vital to the success of new firms. This study examines the social networks of European business owners according to employment size after approximately three years of survival as a business. The results show that the sources of advice used at start-up varied by the size of business with employers of ten or more people more likely to report having received advice from professional acquaintances, financial institutions and training programs, and less likely to have received advice from family and friends or professional consultants. Although these people were more likely to report that they did not need advice, they were also the least likely to report that they had no access to advice. Those with between one and nine employees were the most likely to report using professional consultants (a formal source), suggesting their informal social networks were not as well-developed.

Journal ArticleDOI
25 Jan 2011
TL;DR: In this paper, the authors focused on the relationship between employee loyalty and the growth of companies, using a structured questionnaire administered to a sample of Slovenian companies from service and manufacturing industries.
Abstract: Employees are crucial for the achievement of internal quality and consequently for the business performance of companies. The quality of employees, their competencies, loyalty, and commitment are extremely important for achieving business performance. For the development of employee loyalty, it can be important that employees find challenge, interest, and a feeling of accomplishment in the work they perform. The way employees are treated in the organization is decisive in determining whether they will indeed become an integral part of the competitive advantage of the company. The paper focuses on employee loyalty and the growth of companies. A hypothesis about the relationship between employee loyalty and firm growth was developed and empirically tested. Data collection was based on responses to a structured questionnaire administered to a sample of Slovenian companies from service and manufacturing industries. The hypothesis was tested using regression analysis. Findings indicate a positive relationship between employee loyalty and firm growth, particularly for manufacturing firms. Recommendations for companies are also provided.

Journal ArticleDOI
01 May 2011
TL;DR: An overview of the ideas and tools behind Enterprise 2.0 is presented, and challenges and approaches for management are discussed.
Abstract: Enterprise 2.0 is an approach to broadening the participation of employees in enterprise knowledge management. Building on concepts and tools from Web 2.0, the effort required to participate is minimized and a broad audience is reached. This has positive effects on employees' motivation to participate. In this article we present an overview of the ideas and tools behind Enterprise 2.0, and discuss challenges and approaches for management.

Journal ArticleDOI
05 Jan 2011
TL;DR: This paper will examine the issue of Information Security Governance (ISG) of an enterprise information system, elaborate on the ISG framework, discuss the relevant legislation, and assess how ISG can be framed to meet legislative requirements, showing due diligence and continuous process monitoring.
Abstract: Enterprises now operate in the network economy. The network economy depends on the information infrastructure provided via the Internet. Organizations of all types (business, academia, government, etc.) face risks resulting from their ever-increasing reliance on this infrastructure. Because of this, the US government has enacted a number of laws to secure cyberspace. This paper examines the issue of Information Security Governance (ISG) of an enterprise information system, elaborates on the ISG framework, discusses the legislation, and finally assesses how ISG can be framed to meet legislative requirements, showing due diligence and continuous process monitoring.

Journal ArticleDOI
25 Jan 2011
TL;DR: In this article, the authors developed a rather comprehensive inventory of experiential learning styles and methods, including both a descriptive and an exploratory perspective, for improving higher management education in Slovenia.
Abstract: Nowadays, lecturers in higher education need an awareness of the experiential learning style preferences of students in order to develop and utilize effective and efficient teaching and pedagogical strategies and methods. The experiential learning styles literature has had a revival in recent years, especially in the first decade of the 21st century (Alban & Metcalfe, 2002; Duff & Duffy, 2002; Kayes, 2003; Loo, 2004; Reynolds & Vince, 2007; Cowen & Kazamias, 2009). A review of the literature on experiential learning shows that this intense and growing interest extends to Slovenia, especially in the case of the University of Ljubljana, Faculty of Economics (FELU; http://www.ef.uni-lj.si/en/). In April 2010 FELU joined an elite group of institutions that have achieved business accreditation from AACSB International. Moreover, combined with EQUIS accreditation, FELU is ranked among the 45 best business schools worldwide. The purpose of this paper is to offer better insight into the experiential learning practices at FELU in order to develop appropriate teaching and pedagogical strategies for improving higher management education in Slovenia. The research objective of this study was to develop a rather comprehensive inventory of experiential learning styles and methods, including both a descriptive and an exploratory perspective. In the theoretical part of the study, a qualitative meta-analysis method was used to review the literature background of the study. In the empirical part, Principal Axis Factoring with varimax rotation was performed on the explanatory variables with the primary goal of data reduction. A modified version of experiential learning style theory was used as the research instrument in the questionnaire to determine Slovenian students' experiential learning styles.
From the research process we can summarize the thesis that matching students' experiential learning-style preferences with a complementary course syllabus improves management education, academic achievement, and students' attitudes toward learning.

Journal ArticleDOI
30 Nov 2011
TL;DR: Of the 49 manuscripts (selected from 104 submissions) accepted to BMC Genomics and BMC Bioinformatics conference supplements, 24 are featured in this issue, covering software tools, genome/proteome analysis, systems biology (networks, pathways, bioimaging) and drug discovery and design.
Abstract: The 2011 International Conference on Bioinformatics (InCoB), the annual scientific conference of the Asia-Pacific Bioinformatics Network (APBioNet), is hosted in Kuala Lumpur, Malaysia, and co-organized with the first ISCB-Asia conference of the International Society for Computational Biology (ISCB). InCoB and the sequencing of the human genome are both celebrating their tenth anniversaries, and InCoB's goalposts for the next decade, implementing standards in bioinformatics and globally distributed computational networks, will be discussed and adopted at this conference. Of the 49 manuscripts (selected from 104 submissions) accepted to BMC Genomics and BMC Bioinformatics conference supplements, 24 are featured in this issue, covering software tools, genome/proteome analysis, systems biology (networks, pathways, bioimaging) and drug discovery and design.

Journal ArticleDOI
30 Nov 2011
TL;DR: Comparative analyses of different ribosomal RNA structures reveal several conserved base triple motifs in 50S rRNA structures, indicating an important role in structural stabilization and ultimately RNA function.
Abstract: Highly hydrogen bonded base interactions play a major part in stabilizing the tertiary structures of complex RNA molecules, such as transfer-RNAs, ribozymes and ribosomal RNAs. We describe the graph theoretical identification and searching of highly hydrogen bonded base triples, where each base is involved in at least two hydrogen bonds with the other bases. Our approach correlates theoretically predicted base triples with literature-based compilations and other actual occurrences in crystal structures. The use of ‘fuzzy’ search tolerances has enabled us to discover a number of triple interaction types that have not been previously recorded nor predicted theoretically. Comparative analyses of different ribosomal RNA structures reveal several conserved base triple motifs in 50S rRNA structures, indicating an important role in structural stabilization and ultimately RNA function.
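The graph-theoretical idea in the abstract can be illustrated with a toy sketch: treat bases as nodes and observed hydrogen bonds as (possibly repeated) edges, then enumerate candidate triples in which every base makes at least two hydrogen bonds with the other two members. This is a hypothetical simplification, not the authors' actual search tool, and it does not model their 'fuzzy' search tolerances; the base labels are invented for illustration.

```python
from collections import defaultdict
from itertools import combinations

def find_base_triples(hbonds):
    """Find base triples in a hydrogen-bond multigraph.

    hbonds: iterable of (base_i, base_j) pairs, one entry per observed
    hydrogen bond (a pair may appear more than once).
    A triple qualifies when each of its three bases is involved in at
    least two hydrogen bonds with the other two members.
    """
    pair_count = defaultdict(int)   # H-bond count per unordered base pair
    bases = set()
    for a, b in hbonds:
        pair_count[frozenset((a, b))] += 1
        bases.update((a, b))
    triples = []
    for a, b, c in combinations(sorted(bases), 3):
        ab = pair_count[frozenset((a, b))]
        ac = pair_count[frozenset((a, c))]
        bc = pair_count[frozenset((b, c))]
        # each base's H-bond count within the triple must be >= 2
        if ab + ac >= 2 and ab + bc >= 2 and ac + bc >= 2:
            triples.append((a, b, c))
    return triples
```

A lone Watson-Crick pair (one H-bonded pair with no third partner) is correctly rejected, while a base pairing with two partners, or a pair reinforced by a second bond to a third base, is reported as a triple.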

Journal ArticleDOI
01 Dec 2011
TL;DR: Standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and a parametric accelerated failure time model with the Weibull distribution, were used to detect differentially expressed proteins.
Abstract: Protein abundance in quantitative proteomics is often based on observed spectral features derived from LC-MS experiments. Peak intensities are largely non-Normal in distribution. Furthermore, LC-MS data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques were used to detect differentially expressed proteins, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and a parametric survival model, the accelerated failure time model with the Weibull distribution. The statistical operating characteristics of each method are explored using both real and simulated data sets.
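One of the non-parametric tests mentioned above, the Wilcoxon-Mann-Whitney rank sum test, can be sketched in a few lines. This is a minimal illustration of the U statistic only (average ranks for ties, no p-value, no censoring adjustment); the proper handling of left-censored intensities is exactly what the survival models in the paper address, so this sketch assumes fully observed intensities.

```python
def mann_whitney_u(x, y):
    """U statistic for sample x against sample y.

    Pools both samples, assigns 1-based ranks (average rank for ties),
    and returns U = R_x - n_x (n_x + 1) / 2, where R_x is the rank sum
    of sample x. U = 0 means every x lies below every y.
    """
    pooled = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1                      # [i, j) is a run of tied values
        avg = (i + j + 1) / 2.0         # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    r_x = sum(r for r, (_, g) in zip(ranks, pooled) if g == 0)
    return r_x - len(x) * (len(x) + 1) / 2.0
```

In practice one would use a library implementation with tie correction and p-values; the point here is only that the statistic depends on ranks, which is why skewed, non-Normal peak intensities do not invalidate it.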

Journal ArticleDOI
30 Nov 2011
TL;DR: It is shown that non-orthogonal MFs have better performance than orthogonal MFs and K-means for clustering microarray data; the evaluated BSNMF algorithm applies bi-directional sparseness constraints superimposed on non-negative constraints.
Abstract: Clustering-based methods for gene-expression analysis have been shown to be useful in biomedical applications such as cancer subtype discovery. Among them, matrix factorization (MF) is advantageous for clustering gene expression patterns from DNA microarray experiments, as it efficiently reduces the dimension of gene expression data. Although several MF methods have been proposed for clustering gene expression patterns, a systematic evaluation has not been reported yet. Here we evaluated the clustering performance of orthogonal and non-orthogonal MFs by a total of nine performance measurements in four gene expression datasets and one well-known clustering dataset. Specifically, we employed a non-orthogonal MF algorithm, BSNMF (Bi-directional Sparse Non-negative Matrix Factorization), that applies bi-directional sparseness constraints superimposed on non-negative constraints, so that each factor comprises a few dominantly co-expressed genes and samples. Non-orthogonal MFs tended to show better clustering-quality and prediction-accuracy indices than orthogonal MFs as well as a traditional method, K-means. Moreover, BSNMF showed improved performance in these measurements. Non-orthogonal MFs including BSNMF also showed good performance in the functional enrichment test using Gene Ontology terms and biological pathways. In conclusion, the clustering performance of orthogonal and non-orthogonal MFs was appropriately evaluated for clustering microarray data by comprehensive measurements. This study showed that non-orthogonal MFs have better performance than orthogonal MFs and K-means for clustering microarray data.
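The core mechanism of sparseness-constrained non-negative MF can be sketched with standard multiplicative updates, adding an L1 penalty term to the denominator of the H update. This is a generic illustration, not the BSNMF algorithm from the paper (BSNMF penalizes both factors bi-directionally); matrix sizes and the `lam` penalty are illustrative assumptions.

```python
import random

def matmul(A, B):
    """Plain list-of-lists matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf_sparse(V, k, lam=0.05, iters=300, seed=1):
    """Non-negative factorization V ~= W H with an L1 penalty lam on H.

    Multiplicative updates keep W and H non-negative:
      H <- H * (W^T V) / (W^T W H + lam)
      W <- W * (V H^T) / (W H H^T)
    """
    rng = random.Random(seed)
    m, n, eps = len(V), len(V[0]), 1e-9
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(k)]
    for _ in range(iters):
        WtV = matmul(transpose(W), V)
        WtWH = matmul(matmul(transpose(W), W), H)
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + lam + eps)
              for j in range(n)] for i in range(k)]
        VHt = matmul(V, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps)
              for j in range(k)] for i in range(m)]
    return W, H

def cluster_labels(H):
    """Assign each sample (column of H) to its dominant factor."""
    k, n = len(H), len(H[0])
    return [max(range(k), key=lambda i: H[i][j]) for j in range(n)]
```

On a small block-structured expression matrix, the dominant factor per column recovers the sample clusters, which is the sense in which MF doubles as a clustering method in the abstract.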

Proceedings Article
01 Jan 2011
TL;DR: In this paper, the authors explore the scope for improvement needed for the survival of the Gramya Banks and identify areas where the Regional Rural Banks, or Gramya Banks, face challenges and need to work further to achieve the desired results.
Abstract: It is a stylized fact in the fields of economic growth theory and economic history that innovation is the engine of economic growth. Like technological innovations, innovations in the banking sector are expected to change all aspects of economic activity, bringing about a great improvement in economic performance. Rapid financial innovation over the past decades has changed the array of banking services available to customers but has, at the same time, complicated the environment in which Gramya Banks must handle the new challenges of innovative technologies. In quick succession we have seen the introduction of Debit Cards, ATMs, Phone Banking, Negotiable Order of Withdrawal (NOW) Accounts, Certificates of Deposit, Mortgages, Automatic Transfer Accounts, Overnight Repurchase Agreements, Eurodollars, Commercial Paper, Money Market Mutual Funds, Banker's Acceptances, Derivatives, Securitization and, more importantly, IT-based payment and settlement systems. In some areas the Regional Rural Banks, or Gramya Banks, face certain challenges and hence need to work further to achieve the desired results, particularly with regard to fully leveraging the available technology for rendering better banking services to the public at large. Considering this, the present paper tries to explore the scope for improvement required for the survival of the Gramya Banks. The Gramya Banks have only one option: to keep pace with technological development and upgrade themselves accordingly. Innovation has no meaning unless its benefits reach the common man.

Journal ArticleDOI
01 Jul 2011
TL;DR: This paper investigates the similarities and differences between Clouds and Grids by evaluating two successful projects, namely for the provision of native high-performance computing applications as Grid workflows and for the self-management of Cloud infrastructures.
Abstract: Cloud Computing represents a novel and promising approach for implementing scalable ICT systems for individual, community and business use, relying on the latest achievements of diverse research areas, such as Grid computing, service-oriented computing, business processes, and virtualization. From the technological point of view, Grid computing is considered the most closely related predecessor technology of Cloud computing. Although Cloud and Grid computing differ in many aspects, for example in the general model for the provision of computational resources, which is commercially based in Clouds and community based in Grids, there are many similarities. In this paper we investigate the similarities and differences between Clouds and Grids by evaluating two successful projects from both areas, namely an infrastructure for the provision of native high-performance computing applications as Grid workflows and an infrastructure for the self-management of Cloud applications.

Journal ArticleDOI
30 Dec 2011
TL;DR: This study analyzes the performance of the static (HLFET) and dynamic (DLS) BNP parallel scheduling algorithms for allocating the tasks of a distributed database over a number of processors.
Abstract: Parallel processing is a technique of executing multiple tasks concurrently on different processors. Parallel processing is used to solve complex problems that require a vast amount of processing time. Task scheduling is one of the major problems of parallel processing. The objective of this study is to analyze the performance of the static (HLFET) and dynamic (DLS) BNP parallel scheduling algorithms for allocating the tasks of a distributed database over a number of processors. Throughout the study the focus is on measuring the impact of the number of processors on different performance metrics, such as makespan, speed-up and processor utilization, using the HLFET and DLS BNP task scheduling algorithms.
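The HLFET (Highest Level First with Estimated Times) algorithm mentioned above can be sketched as a list scheduler: rank each task by its static b-level (the longest compute-cost path from the task to an exit task), then repeatedly place the highest-level ready task on the earliest-free processor. This is a simplified illustration under stated assumptions: communication costs between processors are ignored, and the task graph and costs below are invented for the example.

```python
def hlfet_schedule(cost, succ, n_proc):
    """Schedule a task DAG on n_proc processors with HLFET; return makespan.

    cost: {task: computation time}; succ: {task: list of successor tasks}.
    """
    # static b-level: longest compute-cost path from task to an exit task
    level = {}
    def blevel(t):
        if t not in level:
            level[t] = cost[t] + max((blevel(s) for s in succ.get(t, ())), default=0)
        return level[t]
    for t in cost:
        blevel(t)

    pred_of = {t: [] for t in cost}
    for t, ss in succ.items():
        for s in ss:
            pred_of[s].append(t)

    waiting = {t: len(ps) for t, ps in pred_of.items()}
    ready = [t for t in cost if waiting[t] == 0]
    proc_free = [0] * n_proc          # time each processor becomes free
    finish = {}
    while ready:
        ready.sort(key=lambda t: -level[t])          # highest level first
        t = ready.pop(0)
        p = min(range(n_proc), key=proc_free.__getitem__)
        # cannot start before the processor is free or predecessors finish
        start = max([proc_free[p]] + [finish[q] for q in pred_of[t]])
        finish[t] = start + cost[t]
        proc_free[p] = finish[t]
        for s in succ.get(t, ()):
            waiting[s] -= 1
            if waiting[s] == 0:
                ready.append(s)
    return max(finish.values())
```

Running the sketch on a diamond-shaped DAG shows the kind of makespan-versus-processor-count comparison the study performs: adding a second processor lets the two middle tasks overlap and shortens the schedule.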