
Showing papers by "The University of Nottingham Ningbo China" published in 2009


Journal ArticleDOI
TL;DR: The purpose of the present study was to determine the miRNA expression profile of gastric cancer.
Abstract: Background and Aim: MicroRNAs (miRNAs) play important roles in carcinogenesis. The global miRNA expression profile of gastric cancer has not been reported. The purpose of the present study was to determine the miRNA expression profile of gastric cancer. Methods: Total RNA was first extracted from primary gastric cancer tissues and adjacent non-tumorous tissues, and the isolated small RNAs (< 300 nt) were then 3′-extended with a poly(A) tail. Hybridization was carried out on a μParaflo™ microfluidic chip (LC Sciences, Houston, TX, USA). After hybridization detection by fluorescence labeling using tag-specific Cy3 and Cy5 dyes, hybridization images were collected using a laser scanner and digitized using Array-Pro image analysis software (Media Cybernetics, Silver Spring, MD, USA). To validate the results and investigate the biological meaning of the differentially expressed miRNAs, immunohistochemistry was used to detect the differential expression of target genes. Results: The most highly expressed miRNAs in non-tumorous tissues were miR-768-3p, miR-139-5p, miR-378, miR-31, miR-195, miR-497 and miR-133b. Three of them, miR-139-5p, miR-497 and miR-768-3p, were first found in non-tumorous tissues. The most highly expressed miRNAs in gastric cancer tissues were miR-20b, miR-20a, miR-17, miR-106a, miR-18a, miR-21, miR-106b, miR-18b, miR-421, miR-340*, miR-19a and miR-658. Among them, miR-340*, miR-421 and miR-658 were first found highly expressed in cancer cells. The expression of some target genes (such as Rb and PTEN) in cancer tissues was found to be decreased. Conclusion: To our knowledge, this is the first report about these miRNAs associated with gastric cancer. This new information may suggest the potential roles of these miRNAs in the diagnosis of gastric cancer.

452 citations


Journal ArticleDOI
TL;DR: Immunohistochemistry results suggest that IL‐13–mediated liver fibrogenesis may take place in the absence of phospho–signal transducer and activator of transcription protein 6 signaling, and, depending on the cause of liver damage, a predominance of TGF‐β or IL‐13 signaling is found.

119 citations


Journal ArticleDOI
TL;DR: In this article, the determinants of firm growth are summarized and classified into three dimensions: individual, organizational, and environmental. Organizational determinants, the availability of financial capital, and the firm's scalability are found to have the strongest positive impact on firm growth.
Abstract: Firm growth is an important indicator of a thriving economy. Although the determinants of firm growth have been studied in various disciplines, an integrated analysis is still lacking. This paper attempts to provide such an analysis. Many determinants of firm growth are summarized and classified into three dimensions: individual, organizational, and environmental determinants. By conducting an empirical study of 523 Dutch small and medium-sized firms, we identify the determinants of firm growth, which is measured by employment growth. Our findings show that environmental determinants do not affect firm growth. Individual ones do: entrepreneurs with growth motivation and technical knowledge are more likely to grow their firms, while entrepreneurs characterized by a strong need for achievement are less likely to engage in firm growth. Organizational determinants have the most influence on firm growth: the older the firm, the less likely it is to grow. Availability of financial capital is found to be crucial to firm growth. Finally, the firm's scalability (its preparedness to grow) is found to have a positive impact on firm growth.

108 citations


Journal ArticleDOI
09 Sep 2009-PLOS ONE
TL;DR: Sequence analysis indicated that genetic diversity existed among H5N1 viruses isolated from wild birds, and continuous monitoring in the field should be carried out to determine whether H5N1 virus can be maintained by wild birds.
Abstract: Background The highly pathogenic H5N1 avian influenza emerged in 1996 in Asia and has recently spread to Europe and Africa. At present, effective monitoring and data analysis of H5N1 are not sufficient in mainland China. Methodology/Principal Findings From April 2004 to August 2007, we collected 14,472 wild bird samples covering 56 species of 10 orders in 14 provinces of China and monitored the prevalence of influenza virus using RT-PCR specific for the H5N1 subtype. The 149 positive samples involved six orders. Anseriformes had the highest prevalence while Passeriformes had the lowest (2.70% versus 0.36%). Among the 24 positive species, mallard (Anas platyrhynchos) had the highest prevalence (4.37%). A difference in prevalence was found among the 14 provinces: Qinghai had a higher prevalence than the other 13 provinces combined (3.88% versus 0.43%). The prevalence in three species in Qinghai province (Pintail (Anas acuta), Mallard (Anas platyrhynchos) and Tufted Duck (Aythya fuligula)) was markedly higher than in the other 13 provinces. Sequence analysis indicated that the 17 strains isolated from wild birds were distributed in five clades (2.3.1, 2.2, 2.5, 6, and 7), which suggested that genetic diversity existed among H5N1 viruses isolated from wild birds. The five isolates from Qinghai came from one clade (2.2) and had a short evolutionary distance from the isolates obtained from Qinghai in 2005. Conclusions/Significance We have measured the prevalence of H5N1 virus in 56 species of wild birds in 14 provinces of China. Continuous monitoring in the field should be carried out to determine whether H5N1 virus can be maintained by wild birds.

68 citations


Journal ArticleDOI
TL;DR: In this article, the negative effects of culture shock are examined, along with ways to minimize psychological discomfort when entering new cultural patterns, which may help people experiencing culture shock to maintain psychological health.
Abstract: In recent years, international communication has become a common phenomenon because of the trend of globalization. As a result, more people experience culture shock, which causes growing concern. This project takes the negative effects of culture shock into account and pays attention to how to minimize psychological discomfort when entering new cultural patterns. First of all, the main reasons for and negative impacts of culture shock are given. Next, a number of solutions are described and their effectiveness is evaluated. Finally, this paper justifies the preferred solution, and then noticeable points are emphasized. Considering that character and disposition vary among people, choosing appropriate methods and keeping psychological stress at a controllable level are found to be very important for attaining satisfactory results. This may help people who are experiencing culture shock to maintain psychological health.

68 citations


Journal ArticleDOI
TL;DR: In this paper, the impact of UK practices with respect to the measurement and disclosure of intangible assets, focusing on RD activities, was investigated and it was shown that market forces are not generally sufficient to ensure adequate disclosures with regards to intangibles by considering the cases of two biotechnology firms involved in the issuance of misleading disclosures.
Abstract: This paper considers the impact of UK practices with respect to the measurement and disclosure of intangible assets, focusing on R&D activities. We first update prior UK work relating R&D activities to market prices. Second, given the clearly identified role of disclosure outside of the financial statements in helping market participants value R&D expenditures, we consider whether market forces are generally sufficient to ensure adequate disclosures with respect to intangibles by considering the cases of two biotechnology firms involved in the issuance of misleading disclosures. Within this context, we consider how disclosure regulation and enforcement mechanisms have evolved in recent years, and how this evolution has likely been affected by our ‘scandal’ cases. Our conclusions are that the case of the UK does not give rise to any wide-scale concerns about the economic ill-effects caused by the current state of recognition and disclosure with respect to expenditures on intangibles. Further, market forces are unlikely to be sufficient in ensuring honest and timely disclosures with respect to intangibles, but the combination of official regulation and voluntary self-regulation appears to have stemmed the tide of any such disclosure scandals in the UK.

61 citations


Journal ArticleDOI
TL;DR: In this article, the short-run performance of UK firms acquiring foreign target firms over a period of 1994 to 2003 was investigated and the impact of deal size and other firm-specific factors on performance was explored.
Abstract: Purpose – The aim of this paper is to consider the short‐run performance of UK firms acquiring foreign target firms over the period 1994-2003 and to explore the impact of deal size and other firm‐specific factors on performance. Cross‐border mergers and acquisitions have witnessed substantial growth worldwide, with the UK being one of the top acquiring nations in the global market for corporate control. Design/methodology/approach – The paper first uses event study methodology to analyse short‐run share price performance. It then uses a univariate analysis to examine the factors influencing short‐run performance, based on a sample of 373 acquisitions over the period 1994 to 2003. Findings – The study finds that UK acquirers do not earn statistically significant positive abnormal returns in the short‐run. Univariate analysis shows that the short‐run performance of UK acquirers is influenced by the form of the target, the acquisition strategy, the geographical origin of the target firm and the payment methods. However, the ...
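The event-study step can be made concrete with a short sketch. This is a minimal market-model illustration of how a short-run abnormal return around an acquisition announcement might be computed, not the authors' actual procedure; the window lengths, the simulated return series and the function name are assumptions.

```python
import numpy as np

def car_market_model(stock_ret, market_ret, est_win, event_win):
    """Cumulative abnormal return (CAR) around an announcement.

    stock_ret, market_ret : aligned 1-D arrays of daily returns.
    est_win   : slice used to estimate the market model (pre-event days).
    event_win : slice covering the event window around the announcement.
    """
    # Estimate the market model R_i = alpha + beta * R_m by OLS.
    beta, alpha = np.polyfit(market_ret[est_win], stock_ret[est_win], 1)
    # Abnormal return = actual return minus the model's expected return.
    ar = stock_ret[event_win] - (alpha + beta * market_ret[event_win])
    return ar.sum()

# Toy data: 250 trading days of simulated returns for one acquirer.
rng = np.random.default_rng(0)
market = rng.normal(0.0004, 0.01, 250)
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.015, 250)

# Days 0-199 as the estimation window, days 230-234 as a 5-day event window.
print(f"5-day CAR: {car_market_model(stock, market, slice(0, 200), slice(230, 235)):.4f}")
```

In an actual study, the per-deal CARs would then be averaged across the sample of acquisitions and tested for statistical significance.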

61 citations


Journal ArticleDOI
TL;DR: The study suggests that radioimmunotherapy using intravesically instilled 213Bi-anti-EGFR-mAb is a promising option for treatment of bladder cancer in patients.
Abstract: Transurethral resection of urothelial carcinoma often results in tumor recurrence due to disseminated tumor cells. Therefore, new therapeutic strategies are urgently needed. The aim of this study was to establish an orthotopic human bladder carcinoma mouse model using the epidermal growth factor receptor (EGFR)–overexpressing bladder carcinoma cell line EJ28 and to compare therapeutic efficacy of intravesically instilled α-particle–emitting 213Bi-anti-EGFR-monoclonal antibody (mAb) with mitomycin C. Methods: Female Swiss nu/nu mice were intravesically inoculated with luciferase-transfected EJ28 human bladder carcinoma cells after the induction of urothelial lesions by electrocautery. At different time points after cell inoculation, mice were treated intravesically with 213Bi-anti-EGFR-mAb, mitomycin C, or unlabeled anti-EGFR-mAb. Tumor development and therapeutic response were evaluated via bioluminescence imaging. Results: Mice without therapy and those treated with unlabeled anti-EGFR-mAb reached a median survival of 41 d and 89 d, respectively. Mice that underwent therapy with 0.925 MBq of 213Bi-anti-EGFR-mAb 1 h, 7 d, or 14 d after cell instillation survived more than 300 d in 90%, 80%, and 40% of the cases, respectively. Therapy with 0.37 MBq 1 h or 7 d after tumor cell inoculation resulted in survival of more than 300 d in 90% and 50% of mice, respectively. Mitomycin C treatment after 1 h and 7 d prolonged survival to more than 300 d in 40% and 50%, respectively; however, treatment turned out to be nephrotoxic. In contrast, no signs of nephrotoxicity could be observed after 213Bi-anti-EGFR-mAb treatment. Conclusion: The study suggests that radioimmunotherapy using intravesically instilled 213Bi-anti-EGFR-mAb is a promising option for treatment of bladder cancer in patients.

58 citations


Journal ArticleDOI
TL;DR: The morphological characteristics of the anterior ethmoidal artery (AEA), namely that it runs parallel to the ethmoid roof, forming a slight posterolateral to anteromedial curve as it passes from the orbit to the cribriform plate, are the most reliable factors for identifying the artery during surgery.
Abstract: Objectives: To provide anatomical data to help identify and locate the anterior ethmoidal artery (AEA) precisely during endoscopic procedures. Method: We dissected 15 adult cadaver heads, which provided 30 specimens, to study morphological characteristics, courses, and several types of variations. Results: We found the average diameter of the AEA to be 0.80 ± 0.24 mm. In 85.7% of the cases, the artery was seen between the second and third lamella. Other locations were over the roof of the frontal recess cells (10.7%) and the roof of the posterior ethmoid sinus (3.6%). The AEA ran parallel to the ethmoid roof and formed a slight curve. When viewed from the superior side, the angle formed by the long axis of the artery and the lamina papyracea was 60.5 degrees ± 16.4 degrees. In 83.3% of the cases, the anterior ethmoidal canal (AEC) was identified as a separate canal, and in 16.7% the canal was embedded in the ethmoid roof. In 10 of the 30 cases (33.3%), the AEC presented some degree of dehiscence. Conclusion: As a result of these dissections, we found that the AEA's course in the ethmoid roof varies. The morphological characteristics—that the AEA runs parallel to the ethmoid roof, forming a slight posterolateral to anteromedial curve as it passes from the orbit to the cribriform plate—are the most reliable factors used to identify the artery during surgery.

48 citations


Proceedings ArticleDOI
24 Apr 2009
TL;DR: The results show that the proposed recommender algorithm combining slope one scheme and user based collaborative filtering can improve the accuracy of the collaborative filtering recommender system.
Abstract: Predicting the products a customer would like on the basis of other customers' ratings for these products has become a well-known approach adopted by many personalized recommendation systems on the Internet. With the development of electronic commerce, the numbers of customers and products grow rapidly, resulting in a sparse rating dataset. Poor recommendation quality is one major challenge in collaborative filtering recommender systems. To solve this problem, this paper proposes a personalized recommendation algorithm combining the Slope One scheme and user-based collaborative filtering. The method employs the Slope One scheme to fill the vacant ratings of the user-item matrix where necessary. It then utilizes user-based collaborative filtering to produce the recommendation. Experiments were conducted on a common data set using different filtering algorithms. The results show that the proposed recommender algorithm combining the Slope One scheme and user-based collaborative filtering can improve the accuracy of the collaborative filtering recommender system.
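The two-stage idea can be sketched as follows: a weighted Slope One pass fills the vacant cells of the user-item matrix, and a user-based pass then predicts from the densified matrix. This is only an illustrative sketch; the toy ratings, the Pearson similarity and the neighborhood size are assumptions, not the paper's experimental setup.

```python
import numpy as np

def slope_one_fill(R):
    """Fill missing entries (np.nan) of a user-item rating matrix with
    the weighted Slope One predictor."""
    n_items = R.shape[1]
    dev = np.zeros((n_items, n_items))   # average rating difference item j -> item i
    cnt = np.zeros((n_items, n_items))   # number of users who rated both items
    for i in range(n_items):
        for j in range(n_items):
            both = ~np.isnan(R[:, i]) & ~np.isnan(R[:, j])
            if i != j and both.any():
                dev[i, j] = np.mean(R[both, i] - R[both, j])
                cnt[i, j] = both.sum()
    filled = R.copy()
    for u in range(R.shape[0]):
        rated = np.where(~np.isnan(R[u]))[0]
        for i in np.where(np.isnan(R[u]))[0]:
            w = cnt[i, rated]
            if w.sum() > 0:
                filled[u, i] = np.sum((dev[i, rated] + R[u, rated]) * w) / w.sum()
    return filled

def user_based_predict(filled, u, i, k=2):
    """Predict user u's rating on item i from the k most similar users
    (Pearson similarity computed on the densified matrix)."""
    sims = np.array([np.corrcoef(filled[u], filled[v])[0, 1] if v != u else -np.inf
                     for v in range(filled.shape[0])])
    neighbours = np.argsort(sims)[-k:]
    return filled[neighbours, i].mean()

R = np.array([[5, 3, np.nan, 1],
              [4, np.nan, np.nan, 1],
              [1, 1, np.nan, 5],
              [np.nan, 1, 5, 4.]])
dense = slope_one_fill(R)
print(user_based_predict(dense, u=0, i=2))
```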

45 citations


Proceedings ArticleDOI
16 May 2009
TL;DR: An approach that combines the advantages of memory-based and model-based collaborative filtering by joining the two methods, and can provide better recommendations than traditional collaborative filtering.
Abstract: The collaborative filtering (CF) technique has proved to be one of the most successful techniques in recommender systems. Two types of algorithms for collaborative filtering have been researched: memory-based CF and model-based CF. Memory-based approaches identify the similarity between two users by comparing their ratings on a set of items and have suffered from two fundamental problems: sparsity and scalability. Alternatively, model-based approaches have been proposed to alleviate these problems, but they tend to limit the range of users. This paper presents an approach that combines the advantages of these two kinds of approaches by joining the two methods. First, it employs memory-based CF to fill the vacant ratings of the user-item matrix. Then, it uses item-based CF as the model-based component to form the nearest neighbors of every item. Finally, it produces a prediction for the target user on the target item in real time. The collaborative filtering recommendation method combining memory-based CF and model-based CF can provide better recommendations than traditional collaborative filtering.
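A sketch of the item-based (model) stage is given below, assuming the user-item matrix has already been densified by the memory-based pass described in the abstract. The adjusted-cosine similarity, the neighborhood size and the toy matrix are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np

def item_similarity(R):
    """Adjusted-cosine similarity between item columns of a dense
    user-item matrix (each rating centred on its user's mean)."""
    centred = R - R.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(centred, axis=0)
    sim = centred.T @ centred / np.outer(norms, norms)
    np.fill_diagonal(sim, 0.0)            # an item is not its own neighbour
    return sim

def item_based_predict(R, sim, u, i, k=2):
    """Predict user u's rating of item i as a similarity-weighted average
    over the k most similar, positively correlated items."""
    order = np.argsort(sim[i])[::-1]
    neighbours = [j for j in order[:k] if sim[i, j] > 0]
    if not neighbours:
        return float(R[u].mean())         # fall back to the user's mean rating
    weights = sim[i, neighbours]
    return float(R[u, neighbours] @ weights / weights.sum())

# Toy dense matrix, e.g. produced by the memory-based filling stage.
R = np.array([[5., 3., 4., 1.],
              [4., 2., 4., 1.],
              [1., 1., 2., 5.],
              [2., 1., 3., 4.]])
sim = item_similarity(R)
print(round(item_based_predict(R, sim, u=2, i=0), 2))
```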

Journal ArticleDOI
TL;DR: In this paper, laser-metal inert gas (MIG) hybrid welding of AZ31 magnesium alloy is discussed in terms of weld shape, microstructure characteristics and mechanical properties, in comparison with single laser and arc welding.
Abstract: Laser-metal inert gas (MIG) hybrid welding of AZ31 magnesium alloy is discussed in terms of weld shape, microstructure characteristics and mechanical properties, in comparison with single laser and arc welding. The stable MIG arc, reliable droplet transfer and regular weld that are hardly obtained in single MIG welding can be obtained in hybrid welding through laser-arc synergic effects. The ultimate tensile strength and elongation of the hybrid weld are far higher than those of the laser weld and reach 97.8% and 87.5% of the base metal values, respectively. Under the experimental conditions, the efficiency of hybrid welding is 1.20 times that of single laser welding. An obvious difference is observed between the wide upper part (arc zone) and the narrow lower part (laser zone): the arc zone has a coarser grain size and a wider partially melted zone than the laser zone. Finally, the porosity reduction mechanism of the hybrid weld is discussed according to the weld pool shape and the forces acting on it.

Journal ArticleDOI
TL;DR: In this paper, a solvothermal process has been applied for the preparation of monodisperse magnetite by the decomposition of chelate iron alkoxide complexes with diethylene glycol.

Journal ArticleDOI
TL;DR: In this article, the authors highlight the role of risk calculations in the manufacturing technology selection process by elaborating the contribution of the risk associated with manufacturing technology alternatives, in the shape of opportunities and threats, in different decision-making environments.
Abstract: Purpose – The purpose of this paper is to present results obtained from a developed technology selection framework and provide a detailed insight into the risk calculations and their implications in the manufacturing technology selection process. Design/methodology/approach – The results illustrated in the paper are the outcome of an action research study that was conducted in an aerospace company. Findings – The paper highlights the role of risk calculations in the manufacturing technology selection process by elaborating the contribution of the risk associated with manufacturing technology alternatives, in the shape of opportunities and threats, in different decision‐making environments. Practical implications – The research quantifies the risk associated with the different available manufacturing technology alternatives. This quantification of risk crystallises the process of technology selection decision making and supports an industrial manager in achieving objective and comprehensive decisions regarding selection of a ma...

Journal ArticleDOI
TL;DR: The retention time of the target acid compounds shortened with the increase of the alkyl chain length and the concentrations of ionic liquids, probably due to the delocalization of the positive charge on the imidazolium cation.
Abstract: In the present study, 1-butyl-3-methylimidazolium chloride ([C(4)MIM]Cl), 1-octyl-3-methylimidazolium chloride ([C(8)MIM]Cl), and 1-decyl-3-methylimidazolium chloride ([C(10)MIM]Cl) were adopted as mobile phase additives in high performance liquid chromatography (HPLC) to simultaneously separate phenoxy acid herbicides and phenols at neutral pH. It was found that by using 20 mM of [C(4)MIM]Cl, baseline separation and good chromatograms for all the acid compounds were obtained on a normal reversed-phase C(18) column. The retention time of the target acid compounds shortened with increasing alkyl chain length and ionic liquid concentration, probably due to the delocalization of the positive charge on the imidazolium cation, the repulsion between the chloride ions of the ionic liquids and the acid compounds, as well as the steric hindrance effect. The mechanism of ionic liquids as mobile phase additives for the separation of acid compounds is discussed.

Proceedings ArticleDOI
16 May 2009
TL;DR: This paper proposes a collaborative filtering recommendation algorithm based on item classification to pre-produce the vacant ratings where necessary, and then uses item-based collaborative filtering to produce the recommendations.
Abstract: Collaborative filtering systems provide personalized services that aim at predicting a user's interest in items available in the application systems. With the development of electronic commerce, the numbers of users and items grow rapidly, resulting in a sparse user-item rating dataset. Poor recommendation quality is one major challenge in collaborative filtering recommender systems. Sparsity of users' ratings is the major cause of the poor quality, and traditional similarity measures perform poorly in this situation. To address this issue, this paper proposes a collaborative filtering recommendation algorithm based on item classification to pre-produce the ratings. The approach classifies the items to predict the ratings of the vacant values where necessary, and then uses item-based collaborative filtering to produce the recommendations. The collaborative filtering recommendation method based on item classification prediction can alleviate the sparsity problem of the user-item rating dataset and can provide better recommendations than traditional collaborative filtering.
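The abstract does not spell out how items are classified, so the sketch below simply clusters item rating profiles with k-means and then fills each vacant rating from the user's own ratings within that item's class; the clustering choice, the class count and the toy data are assumptions meant only to convey the pre-production step.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_items(R, n_classes=2, seed=0):
    """Group items into classes by clustering their rating profiles
    (the use of k-means here is an assumption, not the paper's method)."""
    cols = np.where(np.isnan(R), np.nanmean(R), R).T   # items as rows, NaN -> global mean
    return KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit_predict(cols)

def fill_by_item_class(R, labels):
    """Replace each vacant rating with the user's average rating over items
    of the same class (falling back to the user's overall mean)."""
    filled = R.copy()
    for u in range(R.shape[0]):
        for i in np.where(np.isnan(R[u]))[0]:
            same = (labels == labels[i]) & ~np.isnan(R[u])
            filled[u, i] = np.mean(R[u][same]) if same.any() else np.nanmean(R[u])
    return filled

R = np.array([[5, 4, np.nan, 1, 2],
              [4, np.nan, 5, 2, 1],
              [np.nan, 1, 2, 5, 4],
              [1, 2, 1, np.nan, 5.]])
labels = classify_items(R)
print(fill_by_item_class(R, labels))
```

The densified matrix would then feed an item-based prediction step such as the one sketched earlier in this listing.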

Proceedings ArticleDOI
23 Jan 2009
TL;DR: The experimental results on the MovieLens dataset show that the algorithm combining the SVD method and the item-based method is promising, since it not only solves some of the recorded problems of recommender systems but also helps increase the accuracy of systems employing it.
Abstract: Recommender systems are introduced as an intelligent technique to deal with the problem of information and product overload. Their purpose is to provide efficient personalized solutions in economic business domains. Collaborative filtering is a widely used method of providing recommendations using ratings on items from users. However, it has three major limitations: accuracy, data sparsity and scalability. This paper proposes a new collaborative filtering algorithm to solve the problems mentioned above. We utilize the results of singular value decomposition (SVD) to fill the vacant ratings and then use the item-based method to produce predictions for unrated items. Our experimental results on the MovieLens dataset show that the algorithm combining the SVD method and the item-based method is promising, since it not only solves some of the recorded problems of recommender systems but also helps increase the accuracy of systems employing it.
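A minimal sketch of the SVD filling step is shown below: missing entries are seeded with item means, a rank-k reconstruction is computed, and only the vacant cells are replaced. The seeding strategy, the rank and the toy matrix are assumptions; the item-based prediction step would then run on the filled matrix, as in the earlier sketch.

```python
import numpy as np

def svd_fill(R, k=2):
    """Fill vacant ratings with a rank-k SVD reconstruction.
    Missing entries are first seeded with item means, then replaced by
    the corresponding entries of the low-rank approximation."""
    item_means = np.nanmean(R, axis=0)
    seeded = np.where(np.isnan(R), item_means, R)
    U, s, Vt = np.linalg.svd(seeded, full_matrices=False)
    approx = (U[:, :k] * s[:k]) @ Vt[:k]     # rank-k reconstruction
    return np.where(np.isnan(R), approx, R)  # keep observed ratings untouched

R = np.array([[5, 3, np.nan, 1],
              [4, np.nan, 4, 1],
              [1, 1, np.nan, 5],
              [np.nan, 1, 5, 4.]])
print(np.round(svd_fill(R, k=2), 2))
```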

Journal ArticleDOI
TL;DR: The results confirmed the importance of the ratio of auxin (IAA) to cytokinin (BA and KT) in the manipulation of shoot regeneration in J. effusus L. The efficient plant regeneration system developed here will be helpful for rapid micropropagation and further genetic improvement.
Abstract: Wetland species mat rush (Juncus effusus L.) is an important economic plant, but no information is available regarding plant regeneration, callus induction, and its proliferation from in vitro seed grown plantlets. The present study investigates the effects of growth regulator combinations and medium innovation on the tissue culture system of five mat rush varieties. Addition of N6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D) in Murashige and Skoog (MS) medium showed a significantly positive effect on callus proliferation, plant regeneration, and its multiplication compared to the medium devoid of BA. The highest callus induction frequency (80.95%, 90.48%, 75.40%, 70.83%, and 83.33%) was observed in MS medium containing 0.5 mg L−1 (2.2 μM) BA in Yinlin-1, Nonglin-4, Gangshan, Taicao, and Taiwan green, respectively. Various growth regulator combinations with successive subculture (medium replacement) were found essential to develop organogenic calluses and to regenerate shoots. The combination of 0.1 mg L−1 BA (0.4 μM) and 2 mg L−1 2,4-D (9.0 μM) in MS medium was found best for callus proliferation for all the varieties under trial. The plant regeneration required two steps involving successive medium replacements as well as optimal hormonal balances. Successful plant regeneration (over 70%) was observed only by transferring the organogenic callus from regeneration medium I [MS medium containing 0.5 mg L−1 BA (2.2 μM) and 1.0 mg L−1 kinetin (KT; 4.6 μM)] to regeneration medium II [MS medium containing 0.5 mg L−1 BA (2.2 μM), 1.0 mg L−1 KT (4.6 μM) and 3.0 mg L−1 indoleacetic acid (IAA; 17.1 μM)]. Our results confirmed the importance of the ratio of auxin (IAA) to cytokinin (BA and KT) in the manipulation of shoot regeneration in J. effusus L. The maximum plant survival frequencies and multiplication rates (90.97% and 5.40, and 94.23% and 8.25) were recorded in the presence of 0.5 mg L−1 BA (2.2 μM) in the 1/2 MS multiplication medium for the varieties Nonglin-4 and Taicao, respectively. About a 100% survival rate was also observed for all the varieties in soil conditions. The efficient plant regeneration system developed here will be helpful for rapid micropropagation and further genetic improvement in J. effusus L.

Journal ArticleDOI
TL;DR: In this paper, a high performance liquid chromatography-direct chemical vapour generation-flame atomization-atomic fluorescence spectrometry (HPLC-CVG-FA-AFS) system for speciation of methylmercury (MeHg+), inorganic mercury (Hg2+) and ethylmercury (EtHg+) without using post-column digestion is developed and characterized.
Abstract: A high performance liquid chromatography-direct chemical vapour generation-flame atomization-atomic fluorescence spectrometry (HPLC-CVG-FA-AFS) system for speciation of methylmercury (MeHg+), inorganic mercury (Hg2+) and ethylmercury (EtHg+) without using post-column digestion is developed and characterized. In this novel system, organomercurial species separated by chromatography were transformed to their hydrides by KBH4, further atomized in the flame atomizer and detected by AFS. The conventionally used on-line UV or microwave digestion system was omitted, and no oxidation reagent was needed, which significantly simplified the instrumentation. Under the optimized conditions, the detection limits were 0.2, 0.4 and 0.4 µg L−1 (as Hg) for MeHg+, Hg2+, and EtHg+ (100 µL injection), which corresponds to absolute detection limits of 0.02, 0.04 and 0.04 ng (as Hg) for MeHg+, Hg2+, and EtHg+, respectively. The sensitivity of the developed method was comparable with the conventional high performance liquid chromatography-UV digestion-cold vapour generation-atomic fluorescence spectrometry (HPLC-UV-CVG-AFS) system. Validation with biological certified reference materials showed that the proposed method is simple and accurate for mercury speciation.

Proceedings ArticleDOI
12 Jun 2009
TL;DR: The roots of Genetic Programming are revisited, and it is concluded that the mechanisms of the process of evolution (i.e. selection, inheritance and variation) are highly suited to the process; genetic code is an effective transmitter of information and crossover is an effective way to search through the viable combinations.
Abstract: We revisit the roots of Genetic Programming (i.e. Natural Evolution), and conclude that the mechanisms of the process of evolution (i.e. selection, inheritance and variation) are highly suited to the process; genetic code is an effective transmitter of information and crossover is an effective way to search through the viable combinations. Evolution is not without its limitations, which are pointed out, and it appears to be a highly effective problem solver; however, we over-estimate the problem solving ability of evolution, as it is often trying to solve "self-imposed" survival problems. We are concerned with the evolution of Turing Equivalent programs (i.e. those with iteration and memory). Each of the mechanisms which make evolution work so well is examined from the perspective of program induction. Computer code is not as robust as genetic code, and is therefore poorly suited to the process of evolution, resulting in an insurmountable landscape which cannot be navigated effectively with current syntax-based genetic operators. Crossover has problems being adopted in a computational setting, primarily due to a lack of context of exchanged code. A review of the literature reveals that evolved programs contain at most two nested loops, indicating that there is a glass ceiling on what can currently be accomplished.

Proceedings ArticleDOI
24 Apr 2009
TL;DR: A personalized recommendation approach that joins user clustering and item-based collaborative filtering is proposed to solve the scalability problem in collaborative filtering.
Abstract: Personalized recommender systems consist of services that produce recommendations and are widely used in electronic commerce. Many recommendation systems employ collaborative filtering technology. With the gradual increase in customers and products in electronic commerce systems, the time-consuming nearest-neighbor search for the target customer over the whole customer space fails to meet the real-time requirement of a recommender system. To solve this scalability problem in collaborative filtering, this paper proposes a personalized recommendation approach that joins user clustering and item-based collaborative filtering. Users are clustered based on their ratings on items, and each cluster has a cluster center. Based on the similarity between the target user and the cluster centers, the nearest neighbors of the target user can be found and used to pre-produce predictions where necessary. The proposed approach then utilizes item-based collaborative filtering to produce the recommendations. The recommendation method joining user clustering and item-based collaborative filtering is more scalable than the traditional one.
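The scalability idea, searching for neighbors only inside the target user's cluster rather than over all users, can be sketched as follows. The use of k-means on mean-imputed rating vectors, the cluster count and the toy data are assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_users(R, n_clusters=2, seed=0):
    """Cluster users on their (mean-imputed) rating vectors; returns the
    fitted model so a target user can be matched to the nearest center."""
    X = np.where(np.isnan(R), np.nanmean(R, axis=0), R)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(X)
    return km, X

def neighbours_in_cluster(km, X, target, k=2):
    """Restrict the neighbor search to the target user's own cluster,
    which is what makes the approach scale better than a full scan."""
    labels = km.labels_
    same = np.where((labels == labels[target]) & (np.arange(len(X)) != target))[0]
    dists = np.linalg.norm(X[same] - X[target], axis=1)
    return same[np.argsort(dists)[:k]]

R = np.array([[5, 4, 1, np.nan],
              [4, 5, np.nan, 1],
              [np.nan, 4, 2, 1],
              [1, np.nan, 5, 4],
              [2, 1, 4, 5.]])
km, X = cluster_users(R)
print(neighbours_in_cluster(km, X, target=0))
```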

Proceedings ArticleDOI
23 Jan 2009
TL;DR: A new collaborative filtering personalized recommendation algorithm is proposed which applies the user demography information to improve the prediction accuracy by efficiently managing the problem of data sparsity.
Abstract: Personalized recommendation systems are web-based systems that aim at predicting a user's interest in available products and services by relying on previously rated items and dealing with the problem of information and product overload. User demography information associated with a user's personality is rarely considered in the personalization process, especially in collaborative filtering (CF), which is a very important technology in recommendation systems. In this paper, a new collaborative filtering personalized recommendation algorithm is proposed which applies user demography information. The method combines the rating similarity and the user demography similarity in the recommendation process to improve prediction accuracy by efficiently managing the problem of data sparsity. The experiments suggest that collaborative filtering based on the combined similarity provides dramatically better recommendation quality than collaborative filtering based on rating similarity alone.
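A sketch of the combined similarity is given below: a rating-based Pearson term is blended with a simple demographic matching term through a weight alpha. The attribute encoding, the blending weight and the toy profiles are assumptions; the abstract does not specify them.

```python
import numpy as np

def rating_similarity(r_u, r_v):
    """Pearson correlation over co-rated items (np.nan marks a missing rating)."""
    both = ~np.isnan(r_u) & ~np.isnan(r_v)
    if both.sum() < 2:
        return 0.0
    return float(np.corrcoef(r_u[both], r_v[both])[0, 1])

def demographic_similarity(d_u, d_v):
    """Fraction of matching demographic attributes (e.g. gender, age band, job)."""
    return float(np.mean([a == b for a, b in zip(d_u, d_v)]))

def combined_similarity(r_u, r_v, d_u, d_v, alpha=0.7):
    """Weighted blend of rating and demography similarity; the weight 0.7
    is an arbitrary choice for this sketch, not the paper's value."""
    return alpha * rating_similarity(r_u, r_v) + (1 - alpha) * demographic_similarity(d_u, d_v)

ratings = np.array([[5, 3, np.nan, 1],
                    [np.nan, 3, 4, 1.]])
demo = [("F", "18-25", "student"), ("F", "18-25", "engineer")]
print(round(combined_similarity(ratings[0], ratings[1], demo[0], demo[1]), 3))
```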

Proceedings ArticleDOI
15 May 2009
TL;DR: Aiming at the problem of data sparsity in collaborative filtering, a collaborative filtering algorithm based on BP neural networks is presented, which can produce more accurate recommendations than the traditional method.
Abstract: Collaborative filtering is one of the most successful technologies in recommender systems and, with the development of the Internet, is widely used in many personalized recommendation areas, such as e-commerce, digital libraries and so on. The K-nearest neighbor method is a popular way to realize collaborative filtering. Its key technique is to find the k nearest neighbors of a given user to predict his interests. However, most collaborative filtering algorithms suffer from data sparsity, which leads to inaccurate recommendations. Aiming at the problem of data sparsity in collaborative filtering, a collaborative filtering algorithm based on BP neural networks is presented. This method first uses a BP neural network to fill the vacant ratings, then uses collaborative filtering to form the nearest neighborhood, and finally generates recommendations. Collaborative filtering based on BP neural network smoothing can produce more accurate recommendations than the traditional method.
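The filling step can be sketched with a small feed-forward network trained by back-propagation on the observed (user, item, rating) triples, here using one-hot user and item inputs. The input encoding, the network size and the use of scikit-learn's MLPRegressor are assumptions, not the paper's architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def one_hot(u, i, n_users, n_items):
    """Encode a (user, item) pair as concatenated one-hot indicators
    (an assumption; the paper does not pin down the input encoding)."""
    x = np.zeros(n_users + n_items)
    x[u] = 1.0
    x[n_users + i] = 1.0
    return x

def bp_fill(R, hidden=(16,), seed=0):
    """Fill vacant ratings with a BP-trained feed-forward network that
    learns from the observed entries of the user-item matrix."""
    n_users, n_items = R.shape
    X = [one_hot(u, i, n_users, n_items)
         for u in range(n_users) for i in range(n_items) if not np.isnan(R[u, i])]
    y = [R[u, i]
         for u in range(n_users) for i in range(n_items) if not np.isnan(R[u, i])]
    net = MLPRegressor(hidden_layer_sizes=hidden, max_iter=5000,
                       random_state=seed).fit(np.array(X), np.array(y))
    filled = R.copy()
    for u in range(n_users):
        for i in np.where(np.isnan(R[u]))[0]:
            filled[u, i] = net.predict(one_hot(u, i, n_users, n_items).reshape(1, -1))[0]
    return filled

R = np.array([[5, 3, np.nan, 1],
              [4, np.nan, 4, 1],
              [1, 1, np.nan, 5],
              [np.nan, 1, 5, 4.]])
print(np.round(bp_fill(R), 2))
```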

Proceedings ArticleDOI
25 Apr 2009
TL;DR: The results show that the proposed recommender algorithm combining rough set theory and item based collaborative filtering can improve the accuracy of the collaborative filtering recommendation system.
Abstract: Recommender systems represent personalized services that aim at predicting users' interest in information items available in the application domain. The collaborative filtering technique has proved to be one of the most successful techniques in recommendation systems in recent years. Poor recommendation quality is one major challenge in collaborative filtering recommender systems, and sparsity of users' ratings is the major cause of the poor quality. To solve this problem, this paper proposes an item-based collaborative filtering recommendation algorithm using rough set theory for prediction. The method employs rough set theory to fill the vacant ratings of the user-item matrix where necessary. It then utilizes item-based collaborative filtering to produce the recommendation. Experiments were conducted on a common data set using different filtering algorithms. The results show that the proposed recommender algorithm combining rough set theory and item-based collaborative filtering can improve the accuracy of the collaborative filtering recommendation system.
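The abstract does not detail the rough set construction, so the sketch below only conveys the flavor: ratings are discretized into coarse symbolic values, users indiscernible from the target user on the items he or she has rated form a class, and that class's mean rating fills the vacancy. The discretization threshold, the fallback to the item mean and the toy data are all assumptions.

```python
import numpy as np

def discretize(x, low=2.5):
    """Map a rating onto a coarse symbolic value (rough set theory operates
    on such discrete attribute values); 'L' = low, 'H' = high, '?' = missing."""
    if np.isnan(x):
        return "?"
    return "L" if x <= low else "H"

def rough_set_fill(R):
    """Fill each vacant rating with the mean rating given to that item by the
    user's indiscernibility class: users whose discretized ratings agree with
    the target user on every item the target user has rated."""
    n_users, n_items = R.shape
    D = [[discretize(R[u, i]) for i in range(n_items)] for u in range(n_users)]
    filled = R.copy()
    for u in range(n_users):
        cond = [i for i in range(n_items) if D[u][i] != "?"]     # condition attributes
        cls = [v for v in range(n_users)
               if v != u and all(D[v][i] == D[u][i] for i in cond)]
        for i in np.where(np.isnan(R[u]))[0]:
            votes = [R[v, i] for v in cls if not np.isnan(R[v, i])]
            filled[u, i] = np.mean(votes) if votes else np.nanmean(R[:, i])
    return filled

R = np.array([[5, 4, np.nan, 1],
              [5, 5, 4, 1],
              [4, 4, 5, 2],
              [1, 2, 1, np.nan],
              [2, 1, 1, 5.]])
print(np.round(rough_set_fill(R), 2))
```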

Proceedings ArticleDOI
24 Apr 2009
TL;DR: A new personalized recommendation approach based on BP neural networks and item-based collaborative filtering is presented, which efficiently alleviates the sparsity of the rating data and promises more accurate recommendations than conventional collaborative filtering.
Abstract: Recommendation systems help people find interesting things and, with the development of the Internet, are widely used in our lives. The collaborative filtering technique has proved to be one of the most successful techniques in recommendation systems in recent years. Poor recommendation quality is one major challenge in collaborative filtering recommender systems, and sparsity of the source data set is the major cause of the poor quality. Aiming at the problem of data sparsity in collaborative filtering, a new personalized recommendation approach based on BP neural networks and item-based collaborative filtering is presented. This method uses a BP neural network to fill the vacant ratings where necessary, uses item-based collaborative filtering to form the nearest neighborhood, and then generates recommendations. The experimental results show that the algorithm efficiently alleviates the sparsity of the rating data and makes recommendations more accurately than conventional collaborative filtering.

Journal ArticleDOI
TL;DR: Improvement of awareness and general control of hypertension were demonstrated and education of both physicians and patients regarding optimal BP control should be reinforced in the future.
Abstract: Controlling hypertension is important to protect renal function and prevent cardiovascular disease in chronic kidney disease (CKD) patients. However, data on hypertension awareness, treatment and control among CKD patients are limited. Two nationwide surveys were conducted in China in 1999–2000 and 2004–2005 among, respectively, 1328 and 1244 adult, non-dialysis, hypertensive CKD patients, to assess the status of hypertension awareness, treatment and control and associated factors. A standard questionnaire was adopted, and blood pressure (BP) was measured by trained staff according to a standard protocol in both surveys. Compared with the data from 1999–2000, the data from 2004–2005 showed increased awareness (87.2 vs. 75.7%, P<0.001), treatment (85.9 vs. 80.4%, P=0.001) and control (30.0 vs. 21.1%, P<0.001, by the general threshold of BP<140/90 mm Hg; 7.7 vs. 5.9%, P=0.075, by an optimal threshold of BP<130/80 mm Hg) of hypertension. The odds ratios for general BP control were 1.4 (95% confidence index (CI), 1.1–1.7) for female gender, 1.1 (95% CI, 1.0–1.1) for high estimated glomerular filtration rate, 1.3 (95% CI, 1.1–1.6) for treatment in a local hospital, 2.8 (95% CI, 2.0–3.9) for hypertension awareness and 1.7 (95% CI, 1.4–1.9) for combined treatment. General physicians from local hospitals made greater contributions to the total improvement. Lack of treatment was mainly due to patients not recognizing the necessity for it. This is the first report of hypertension awareness, treatment and control among hypertensive CKD patients from a developing country. Improvement of awareness and general control of hypertension were demonstrated. Education of both physicians and patients regarding optimal BP control should be reinforced in the future.

Proceedings ArticleDOI
22 May 2009
TL;DR: A recommendation algorithm combining case-based reasoning and item-based collaborative filtering can alleviate the sparsity issue and produce more accurate recommendations than traditional recommender systems.
Abstract: Recommender systems can find information of interest to users based on information filtering algorithms. The collaborative filtering technique has proved to be one of the most successful techniques in recommender systems, and there are two approaches: user-based collaborative filtering and item-based collaborative filtering. Data sparsity is the main problem in recommender systems, and it leads to poor accuracy. To solve the sparsity problem, this paper presents a personalized recommendation algorithm joining case-based reasoning and item-based collaborative filtering. First, it employs case-based reasoning to fill the vacant ratings of the user-item matrix. Then, it produces a prediction for the target user on the target item using item-based collaborative filtering. The recommendation algorithm combining case-based reasoning and item-based collaborative filtering can alleviate the sparsity issue and produce more accurate recommendations than traditional recommender systems.

Journal ArticleDOI
TL;DR: Gefitinib (AstraZeneca, Newark, DE, USA) is the first targeted drug to be approved for advanced non-small cell lung cancer (NSCLC) that failed to respond to chemotherapy as mentioned in this paper.
Abstract: Gefitinib (AstraZeneca, Newark, DE, USA) is the first targeted drug to be approved for advanced non-small cell lung cancer (NSCLC) that failed to respond to chemotherapy. It has a fairly effective antitumor activity in patients with tumors harboring epidermal growth factor receptor (EGFR) gene mutations. However, the effect of gefitinib as a neoadjuvant or preoperative therapy remains unclear, especially unknown in patients without EGFR gene mutations. The objective of the present case was to investigate the response of gefitinib in such a patient.

Journal Article
TL;DR: A teaching mode for the Foreign Trade Oral English course is designed based on George W. Gagnon, Jr. and Michelle Collay's constructivist learning design, as discussed in this paper.
Abstract: At present, the Foreign Trade Oral English class has a lot of problems, including the limited practice opportunities, the out-of-date teaching method, the failure to exploit the potential of students and the limitation in the teaching material. The writer hopes to provide some ideas for both the teachers and learners by trying to design the teaching mode of this course based on George W. Gagnon, Jr. and Michelle Collay's constructivist learning design.

Proceedings ArticleDOI
26 Dec 2009
TL;DR: This paper implements image interpolation algorithms as a test case to discuss how different tiling strategies affect a program's performance, and demonstrates that an optimized tiling strategy on one GPU model is not always a good solution when executed on other GPU models, especially when some external conditions are changed.
Abstract: The strategy of using CUDA-compatible GPUs as a parallel computation solution to improve the performance of programs has become more and more widely adopted in the two years since the CUDA platform was released. Its benefits extend from the graphics domain to many other computationally intensive domains. Tiling, as the most general and important technique, is widely used for optimization in CUDA programs. However, new GPU models with better compute capabilities have been released, along with new versions of the CUDA SDK, and these updated compute capabilities must be considered when optimizing with the tiling technique. In this paper, we implement image interpolation algorithms as a test case to discuss how different tiling strategies affect a program's performance. We especially focus on how different GPU models affect the tiling's effectiveness by executing the same program on testing platforms equipped with two different GPU models. The results demonstrate that an optimized tiling strategy on one GPU model is not always a good solution when executed on other GPU models, especially when some external conditions are changed.
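As a rough, CPU-side analogue of the tiling strategy discussed above, the sketch below upscales an image tile by tile and times several tile sizes, with the tile size standing in for the CUDA block/tile configuration whose optimum depends on the hardware. It is not the paper's CUDA implementation; the nearest-neighbour interpolation, image size and tile sizes are assumptions.

```python
import time
import numpy as np

def upscale_tile(img, scale, r0, c0, th, tw):
    """Nearest-neighbour upscaling of one output tile starting at (r0, c0)."""
    rows = np.arange(r0, min(r0 + th, img.shape[0] * scale)) // scale
    cols = np.arange(c0, min(c0 + tw, img.shape[1] * scale)) // scale
    return img[np.ix_(rows, cols)]

def upscale_tiled(img, scale, tile):
    """Assemble the upscaled image tile by tile; `tile` plays the role that
    the CUDA block/tile size plays on the GPU."""
    H, W = img.shape[0] * scale, img.shape[1] * scale
    out = np.empty((H, W), dtype=img.dtype)
    for r0 in range(0, H, tile):
        for c0 in range(0, W, tile):
            block = upscale_tile(img, scale, r0, c0, tile, tile)
            out[r0:r0 + block.shape[0], c0:c0 + block.shape[1]] = block
    return out

img = np.random.rand(512, 512).astype(np.float32)
for tile in (16, 32, 64, 128, 256):
    t0 = time.perf_counter()
    upscale_tiled(img, scale=2, tile=tile)
    print(f"tile={tile:4d}  {time.perf_counter() - t0:.4f} s")
```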