
Showing papers presented at "International Conference on Service Operations and Logistics, and Informatics in 2012"


Proceedings ArticleDOI
08 Jul 2012
TL;DR: This paper studies a green VRPSPD problem by including fuel consumption and carbon emission costs in the model, and shows that the proposed g-VRPSPD model can generate more environment-friendly routes without sacrificing much total travel distance.
Abstract: Reducing greenhouse gases (GHG), especially carbon dioxide, has become a critical environmental issue worldwide and has attracted attention in all economic sectors and industries. Ranking second in cargo turnover volume in China, road cargo transportation has become a key carbon-reduction field in the country. In this paper, we study the vehicle routing problem with simultaneous pickups and deliveries (VRPSPD), a commonly encountered practice in road cargo transportation, especially in reverse logistics. More specifically, we study a green VRPSPD (g-VRPSPD) problem by including fuel consumption and carbon emission costs in the model. Our numerical results show that, compared with the traditional distance-minimizing model, the proposed g-VRPSPD model can generate more environment-friendly routes without sacrificing much total travel distance. We also provide guidelines for green vehicle routing strategies.
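As a rough illustration of the modeling idea, the sketch below evaluates a fixed route under a load-dependent fuel model and prices both fuel and carbon. All coefficients, the tuple layout of `stops`, and the linear fuel-load relation are illustrative assumptions, not values from the paper:

```python
def green_route_cost(stops, fuel_empty=0.20, fuel_per_ton=0.05,
                     fuel_price=1.5, co2_per_liter=2.6, co2_price=0.05):
    """stops: list of (leg_km, delivery_tons, pickup_tons) along a fixed route.
    Returns (total_km, fuel_liters, money_cost) under a linear load-fuel model
    with illustrative coefficients."""
    load = sum(d for _, d, _ in stops)   # leave the depot carrying all deliveries
    km = fuel = 0.0
    for leg_km, delivery, pickup in stops:
        km += leg_km
        # fuel burned on this leg grows linearly with the current load
        fuel += leg_km * (fuel_empty + fuel_per_ton * load)
        load += pickup - delivery        # simultaneous pickup and delivery at the stop
    cost = fuel * fuel_price + fuel * co2_per_liter * co2_price
    return km, fuel, cost
```

Under such a model, a slightly longer route that front-loads deliveries (so the truck runs lighter for most of the trip) can have a lower fuel-plus-carbon cost than the distance-minimizing route, which is the trade-off the g-VRPSPD objective captures.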

39 citations


Proceedings ArticleDOI
Chun Hua Tian1, Junchi Yan1, Jin Huang1, Yu Wang1, Dong-Sup Kim1, Tongnyoul Yi1 
08 Jul 2012
TL;DR: The negative pressure wave is a popular method to detect the occurrence and location of leak incidents in oil/gas pipelines; a multiple-sensor pairing algorithm is presented to ensure the robustness and locating precision of the LDS.
Abstract: The negative pressure wave is a popular method for detecting the occurrence and location of leak incidents in oil/gas pipelines. Three core technical challenges and the related algorithms are discussed in this paper. The first is data quality: the balance between noise level and locating precision is considered in the filter design. The second is the dynamic slope in anomaly detection, for which a bi-SPC (Statistical Process Control) algorithm is proposed to make the threshold adaptive. The third is false alarms caused by normal changes in working conditions, for which a multiple-sensor pairing algorithm is presented. With these algorithms, the robustness and locating precision of the LDS (Leak Detection System) can be ensured.
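The abstract does not give the localization formula, but the standard negative-pressure-wave method locates a leak from the wave's arrival-time difference at two end sensors. A minimal sketch, assuming a known constant wave speed:

```python
def leak_location(pipe_length, wave_speed, t_upstream, t_downstream):
    """Classic negative-pressure-wave localization between two end sensors.
    A leak at distance x from the upstream sensor produces arrival times
    t_up = x / a and t_down = (L - x) / a, so x = (L + a * (t_up - t_down)) / 2.
    Inputs are assumed pre-filtered (the paper's filtering and pairing steps
    are not reproduced here)."""
    return (pipe_length + wave_speed * (t_upstream - t_downstream)) / 2.0
```

For example, on a 1000 m segment with a 1000 m/s wave speed, a wave arriving 0.4 s earlier at the upstream sensor places the leak 300 m from that sensor; the sensitivity of this formula to timing noise is exactly why the filter design and threshold adaptivity matter.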

37 citations


Proceedings ArticleDOI
08 Jul 2012
TL;DR: A parallel traffic simulation module of ATS with GPU is built and a maximum speedup factor of 105 is obtained compared with the CPU implementations.
Abstract: Traffic micro-simulation is an important tool in Intelligent Transportation Systems (ITS) research. In micro-simulation, a bottom-up system can be built from the interactions of vehicle agents, road agents, traffic-light agents, etc. The Artificial societies, Computational experiments, and Parallel execution (ACP) approach suggests integrating other metropolitan systems, such as logistics, infrastructure, legal and regulatory, weather and environmental systems, to build an Artificial Transportation System (ATS) to help solve ITS problems. This is reasonable because the transportation system is a complex system affected by many interacting systems. However, the computing burden can be very heavy, as many agents of different kinds interact in parallel in an ATS. In recent years, Graphics Processing Units (GPUs) have been applied successfully in many areas for parallel computing. Compared with a traditional CPU cluster, a GPU has the obvious advantage of low hardware cost and electricity consumption. In this paper, we build a parallel traffic simulation module of an ATS with a GPU. The simulation results are reasonable, and a maximum speedup factor of 105 is obtained compared with the CPU implementation.

21 citations


Proceedings ArticleDOI
08 Jul 2012
TL;DR: A real-time data-collection system based on Radio Frequency Identification (RFID) is integrated into VSM to collect object information throughout a manufacturing system on the production floor, enabling interaction with the processes, people, material, and any other constraints relevant to the production situation.
Abstract: Value Stream Mapping (VSM) is one of the most powerful lean manufacturing tools, used for quick analysis of product and information flow through a manufacturing system from door to door. This versatile method visualizes product flows as a snapshot; it describes production behavior only within a specific period of the total production time, which can sometimes be misleading during lean-tool implementation. In this paper, a real-time data-collection system based on Radio Frequency Identification (RFID) is integrated into VSM to collect object information throughout a manufacturing system on the production floor. The integrated RFID-VSM system, called Dynamic Value Stream Mapping (DVSM), provides real-time data for VSM so that it can interact with the processes, people, material, and any other constraints relevant to the production situation. As time progresses and the operation is producing, workers and managers can interact live with the animated flow: queues build up, inventory depletes, people move, and so on. Based on the real situation, managers can take the right decision at the right time, while workers make changes to processing capacity, labor requirements, flow, and cell layout to optimize, design, or develop the future state.

19 citations


Proceedings ArticleDOI
08 Jul 2012
TL;DR: A recent study into universities, which are among the most common yet probably the least understood service systems, is reported, and the nature of universities is explored from both external and internal perspectives.
Abstract: Service systems are a main focus of service research; the advancement of Service Science relies on a thorough understanding of how service systems work. While the service research literature offers a range of theories for examining service systems, empirical evidence is still scarce. This paper reports a recent study into universities, which are among the most common yet probably the least understood service systems. More specifically, the nature of universities is explored from both external and internal perspectives. The external perspective considers the university as a single entity and analyzes its impact on the city in which it resides. It is found that universities make a significant contribution to their localities as a source of employment, expenditure, knowledge and talent, and equally rely heavily on the support of local communities. The internal perspective reveals the inner workings of universities through case studies. A total of 12 functional domains are identified, including education, research, finance, buildings and utilities, among others, each providing a wide range of services. A structured method is developed to improve service quality by analyzing value co-creation processes among different stakeholders, such as students, faculty and administrators. The paper provides fresh insights into service systems, and we hope it will inspire more empirical research into universities and service systems in general.

17 citations


Proceedings ArticleDOI
Yu Wang1, Chunhua Tian1, Junchi Yan1, Jin Huang1
08 Jul 2012
TL;DR: The problem formulations and major methods of three typical optimization tasks are reviewed and analyzed, along with the challenges and potential research directions of these problems.
Abstract: Owing to its safe operation and low expense, the fuel pipeline system plays a crucial role in oil/gas transportation. In the pipe-network industry, there are diverse optimization tasks in every phase from design to operation. Previously, most works have focused on developing effective algorithmic variants to strengthen some aspect of search capability; however, few review papers have been published, although they are desiderata for engineering solution providers. In this paper, we try to fill this gap by reviewing the existing optimization tasks and classifying them according to two kinds of attributes. The problem formulations and major methods of three typical optimization tasks are reviewed and analyzed, and the challenges and potential research directions of these problems are also discussed.

16 citations


Proceedings ArticleDOI
08 Jul 2012
TL;DR: An overview of urban traffic coordination controls (UTCCs) is given, and several main hot topics in UTCCs are surveyed: the correlation-degree analysis of intersections, the division of traffic-control subareas and the objects of coordination.
Abstract: Urban traffic coordination controls (UTCCs) can make full use of the mutual advantages of intersections, improving traffic capacity and decreasing the possibility of congestion at intersections. This paper gives an overview of UTCCs. After reviewing the concept of UTCCs, we survey several main hot topics: the correlation-degree analysis of intersections, the division of traffic-control subareas and the objects of coordination. The survey aims to promote research in this area and concludes with some comments on future research directions.

15 citations


Proceedings ArticleDOI
08 Jul 2012
TL;DR: An overview of communication in the GSD process is presented; key communication issues related to activities in the GSD process are described, along with some communication-enabling techniques and tools.
Abstract: The global software development (GSD) process is a growing focus for researchers. Although the GSD process poses many challenges, it also offers advantages. Various forms of communication can reduce these challenges and improve the probability of success of GSD. This paper presents an overview of communication in the GSD process, describes key communication issues related to activities in the GSD process as well as some communication-enabling techniques and tools, and then highlights key communication issues for the future.

15 citations


Proceedings ArticleDOI
Junchi Yan1, Chunhua Tian1, Jin Huang1, Yu Wang1
08 Jul 2012
TL;DR: A recently proposed machine learning algorithm, the Twin Gaussian Process (TGP), is described and compared with other widely used algorithms; experimental results show that TGP can be a useful tool for load forecasting.
Abstract: Load forecasting is an attractive and complicated application of machine learning theory and algorithms. Continuous efforts have been made by both academics and industry, using various methods such as regression, Artificial Neural Networks (ANN), time series models like Auto Regressive Moving Average (ARMA) models, Gaussian Processes (GP) and Genetic Algorithms (GA). Non-parametric models are not widely used in the forecasting domain, yet the promising results from recent applications of Gaussian Processes indicate the potential value of this kind of algorithm. In this paper, we describe a recently proposed machine learning algorithm, the Twin Gaussian Process (TGP), and apply it to the load forecasting task. Unlike the standard Gaussian Process model, the Twin Gaussian Process places GP priors on both covariances and responses, and obtains the output via Kullback-Leibler divergence minimization between two GPs modeled as normal distributions over finite index sets of training and testing examples. As a result, TGP is able to account for the correlations of both inputs and outputs. In our case study, TGP is evaluated and compared with other widely used algorithms, and experimental results show that TGP can be a useful tool for load forecasting.

13 citations


Proceedings ArticleDOI
08 Jul 2012
TL;DR: An agent-based traffic simulator is built and Ordinal Optimization (OO) is employed as the optimizer, which shows the effectiveness of the OO method and the power of GPU for parallel computing.
Abstract: Traffic signal coordination has long been a hot topic in Intelligent Transportation Systems (ITS) research. Simulation-based optimization is an important method for optimizing coordination designs, as traffic systems are not easily described accurately by mathematical models. For the traffic simulator in this method, micro-simulation has become more popular than macroscopic or mesoscopic simulation in recent years, as micro-simulation can describe drivers' decision-making processes and driving behaviors within the ITS framework. However, the computing burden of micro-simulation is usually very heavy, as a large number of vehicles are modeled and simulated separately. For the optimizer, iterative algorithms such as Genetic Algorithms (GA), Ant Colony Optimization (ACO), and Particle Swarm Optimization (PSO) are widely used to find global optimal solutions. However, these algorithms usually take too much time to converge on a large-scale road network. To alleviate the computing burden and speed up convergence, we build an agent-based traffic simulator and employ Ordinal Optimization (OO) as the optimizer. We accelerate both the simulator and the optimizer with a Graphics Processing Unit (GPU), which has been applied successfully in many areas for parallel computing. Simulation results show the effectiveness of the OO method and the power of the GPU for parallel computing.

10 citations


Proceedings ArticleDOI
Qi Hu1, Shenghua Bao1, Jingmin Xu1, Wenli Zhou1, Min Li1, Heyuan Huang1 
08 Jul 2012
TL;DR: Experimental results show that the proposed method with a proper estimation of edges and vertices in the social network can achieve significant performance improvement for finding the missing recipients.
Abstract: Email is one of the most essential IT services for modern enterprises. However, missing desired recipients on sent messages happens frequently and results in great communication confusion and collaboration inefficiency. In this paper, we propose an effective Email recipient recommendation service, implemented by a participant co-occurrence social network based method, to help Email users find missing recipients. Given the task of recommending recipients, we studied three key factors based on the social network: 1) a general recipient recommendation algorithm based on the social network; 2) the method for updating the edge weights of the social network for a new query (message); and 3) the method for measuring correlated vertices by the social metric of closeness centrality. An extensive evaluation has been conducted on the Enron Email Corpus and the Lotus Notes Email Corpus. Experimental results show that the proposed method, with a proper estimation of edges and vertices in the social network, achieves significant performance improvement in finding missing recipients. Five real missing-recipient cases from the Enron Email Corpus further verify the effectiveness of the proposed method. In addition, we implemented the service in IBM Lotus Notes and invited dozens of colleagues to join a pilot. The initial feedback received from participants is very positive.
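A minimal sketch of the co-occurrence idea, assuming the network is simply message-participant co-occurrence counts; the paper's edge-weight updates and closeness-centrality scoring are not reproduced, and all function names and inputs here are illustrative:

```python
from collections import defaultdict
from itertools import combinations

def build_cooccurrence(messages):
    """Edge weight = number of past messages in which two people co-occur
    as participants (a simplified stand-in for the paper's social network)."""
    w = defaultdict(int)
    for participants in messages:
        for a, b in combinations(sorted(set(participants)), 2):
            w[(a, b)] += 1
    return w

def recommend(w, seeds, candidates, k=3):
    """Score each candidate by total co-occurrence weight with the recipients
    already on the message (the seeds), and return the top k."""
    def weight(a, b):
        return w.get((min(a, b), max(a, b)), 0)
    scored = [(sum(weight(c, s) for s in seeds), c)
              for c in candidates if c not in seeds]
    return [c for score, c in sorted(scored, reverse=True)[:k] if score > 0]
```

For instance, after `build_cooccurrence` over a sent-mail archive, typing one recipient on a new message lets `recommend` surface the colleagues who most often appeared alongside that person, which is the basic signal the paper's richer method refines.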

Proceedings ArticleDOI
08 Jul 2012
TL;DR: Simulation results show that the remanufacturer should source from the backup supplier starting from the period at which its profit is approximately zero; by comparing the simulation results with and without the strategy, the sourcing time can be decided according to the remanufacturer's profit.
Abstract: Supply disruptions have a low probability but severe impacts on supply chain members. There are two effective approaches to mitigate supply disruptions: (1) splitting orders among multiple suppliers; (2) sourcing from regular suppliers first and then from specified backup suppliers when a supply disruption occurs. For the latter, when the remanufacturer should source from the backup supplier is the key issue. In this paper, we focus on the sourcing time for the remanufacturer in a reverse supply chain when a supply disruption occurs. First, the stock-flow diagram of the reverse supply chain with supply disruption is presented in Vensim® 5.10. Second, the simulation results of the remanufacturer's profit under different supply disruption periods are shown. Then, by analyzing the simulation results, the strategy for the sourcing time can be decided according to the remanufacturer's profit. Finally, by comparing the simulation results with and without the strategy, we find that the remanufacturer should source from the backup supplier starting from the period at which its profit is approximately zero.

Proceedings ArticleDOI
Hong Bo Li1, Wei Wang1, Hongwei Ding1, Jin Dong1
08 Jul 2012
TL;DR: This paper presents a new method for allocating commodity shelves in a supermarket based on mining customers' shopping paths and transaction data, and builds a benefit-optimization model to obtain the optimal allocation considering the profit, sales volume, and purchase probability of each commodity.
Abstract: How to deploy commodities across different shelves in a supermarket so as to obtain better benefit for merchants while considering convenience for customers is an important topic in retail. In this paper, we present a new method for allocating commodity shelves in a supermarket based on mining customers' shopping paths and transaction data. Customers' shopping-path data can be obtained from shopping carts or baskets on which RFID (Radio Frequency Identification) tags are located, and shopping transaction data can be obtained from POS (Point of Sale) machines. By integrating and mining the frequent-path data and transaction data, the See-Buy Rate, the approximate probability that customers purchase a commodity when they see it, can be calculated for each commodity. Based on the See-Buy Rate, we build a benefit-optimization model to obtain the optimal allocation considering the profit, sales volume, and purchase probability of each commodity. Finally, a computational example illustrates how to apply the method in practice.
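A toy illustration of the See-Buy Rate and a greedy stand-in for the benefit-optimization model; the paper's actual model also weighs sales volume and other constraints, and all names and numbers here are made up:

```python
def see_buy_rate(shelf_views, purchases):
    """See-Buy Rate per commodity: the fraction of customers who bought it
    among those whose RFID-traced path passed its shelf."""
    return {c: purchases.get(c, 0) / v for c, v in shelf_views.items() if v > 0}

def allocate_shelves(rates, unit_profit, shelf_traffic):
    """Greedy sketch of the allocation idea: put the commodities with the
    highest expected profit per viewer (See-Buy Rate x unit profit) on the
    most-visited shelves. Returns {shelf: commodity}."""
    items = sorted(rates, key=lambda c: rates[c] * unit_profit[c], reverse=True)
    shelves = sorted(shelf_traffic, key=shelf_traffic.get, reverse=True)
    return dict(zip(shelves, items))
```

The greedy pairing is only a sketch; the paper formulates this as an optimization model, which matters once shelf capacities and cross-commodity effects enter the picture.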

Proceedings ArticleDOI
Feng Chen1, Changrui Ren1, Qinhua Wang1, Bing Shao1
08 Jul 2012
TL;DR: A new process definition language for the Internet of Things is provided: an XML-based language whose structure includes three sections, services, sequence and vars, where the sequence section contains the specific business process of the IoT application.
Abstract: In the Internet of Things, many instruments and sensors connect to the Internet and can be controlled through it to achieve a Smart Planet. One of the key challenges is to integrate instruments and sensors into business processes. SOA is an ideal infrastructure for business process management, and many business process definition languages are available for process orchestration. Instrument functions are encapsulated as web services and can be organized together with other web services. But these device-oriented web services differ from common web services because a device cannot be controlled by more than one client at the same time, a feature that traditional process definition languages do not support. In this paper, a new process definition language for the Internet of Things is provided. It is an XML-based language whose structure includes three sections: services, sequence and vars. The services section describes service information; both device-oriented and common web services are described there. The sequence section is the main body of the process and contains the specific business process of the IoT application; it enables executing several commands in order.

Proceedings ArticleDOI
08 Jul 2012
TL;DR: A support tool for the demand forecast management of local pharmacies based on the forecast of the requirements obtained through the implementation of a Radial Basis Function (RBF) neural network is introduced.
Abstract: This research introduces a support tool for demand-forecast management in local pharmacies. It is based on forecasting requirements through a Radial Basis Function (RBF) neural network. An accurate forecast makes it possible to reduce the average stock level and consequently the costs of warehousing and the space needed for storage.
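The abstract does not specify the network configuration; a minimal RBF-network sketch with Gaussian features around fixed centers and output weights fitted by least squares (one common training scheme, assumed here rather than taken from the paper) might look like:

```python
import numpy as np

def rbf_features(x, centers, width):
    # Gaussian basis function activations for 1-D inputs
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def rbf_fit(x, y, centers, width):
    # output-layer weights by linear least squares on the basis activations
    w, *_ = np.linalg.lstsq(rbf_features(x, centers, width), y, rcond=None)
    return w

def rbf_predict(x, centers, width, w):
    return rbf_features(x, centers, width) @ w
```

With centers spread over the input range (e.g. recent time or demand-driver values), the network interpolates smooth demand curves well, which is what makes RBF networks attractive for short-horizon forecasting; center placement and width are the main tuning choices.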

Proceedings ArticleDOI
08 Jul 2012
TL;DR: This paper studies a two-echelon dual-channel supply chain system in which a manufacturer sells a single product to customers through its owned online channel and an independent retailer using a Stackelberg game model.
Abstract: This paper studies a two-echelon dual-channel supply chain system in which a manufacturer sells a single product to customers through its own online channel and through an independent retailer. We consider a Stackelberg game model in which the manufacturer (the leader) announces the online channel price and the wholesale price. In response, the retailer (the follower) sets a retail price to maximize its retail profit. Potential consumers choose the purchase channel based on price and their acceptance of web-based purchases. Our main interest is to analyze how different types of products may affect the optimal pricing decisions of each player when potential consumers differ in their willingness to buy the product and their preference for buying online. Due to the complexity of the problem, we use an agent-based modeling approach to investigate this issue. Our numerical results show that the optimal prices differ significantly from those of the base model, and that product category has a great influence on market segmentation, which in turn noticeably affects demand in both channels.
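The paper itself uses agent-based simulation; as a simpler illustration of the Stackelberg structure (the leader announces wholesale and online prices, the follower best-responds with a retail price), here is a grid-search sketch with an assumed linear demand split. All coefficients and the demand form are illustrative, not from the paper:

```python
def demand(p_retail, p_online):
    """Illustrative linear demand split between the two channels:
    each channel loses demand to its own price and gains from the other's."""
    d_online = max(0.0, 8.0 - p_online + 0.5 * p_retail)
    d_retail = max(0.0, 10.0 - p_retail + 0.5 * p_online)
    return d_online, d_retail

def retailer_best_response(w, p_online):
    """Follower: pick the retail price maximizing (p - w) * retail demand."""
    prices = [w + 0.1 * i for i in range(1, 200)]
    return max(prices, key=lambda p: (p - w) * demand(p, p_online)[1])

def stackelberg():
    """Leader: search wholesale and online prices, anticipating the
    retailer's best response, to maximize the manufacturer's total profit."""
    best = None
    for w in [2.0 + 0.5 * i for i in range(10)]:
        for po in [2.0 + 0.5 * i for i in range(10)]:
            p = retailer_best_response(w, po)
            d_on, d_re = demand(p, po)
            profit = po * d_on + w * d_re
            if best is None or profit > best[0]:
                best = (profit, w, po, p)
    return best  # (manufacturer_profit, wholesale, online_price, retail_price)
```

The agent-based approach in the paper replaces the closed-form `demand` function here with heterogeneous consumer agents, which is what lets it capture differing willingness to pay and online acceptance across product categories.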

Proceedings ArticleDOI
08 Jul 2012
TL;DR: The method regards the average cost of one successful execution of a composite service as its actual execution cost, and then selects composite services so as to minimize this cost.
Abstract: Web services are becoming the most promising technology for cloud computing. When a single web service fails to satisfy a requestor's multiple functional demands, web services need to be composed, subject to cost and other constraints. It is hard to obtain an optimal service composition with existing composition methods because the analysis of the dynamic execution process is neglected. This paper uses Petri nets to model and analyze the dynamic execution processes of service compositions, taking into account the effect of component reliability on composite-service performance. The method regards the average cost of one successful execution of a composite as its actual execution cost, and then selects composite services so as to minimize this cost. Simulation results show that the proposed method is superior to other methods in execution cost.
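The cost notion in the abstract can be made concrete in the simplest case: if each run of a composition costs c and succeeds with probability r, the number of attempts until a success is geometric, so the expected cost of one successful execution is c / r. A sketch (the Petri-net analysis that derives c and r for a real composition is not reproduced):

```python
def effective_cost(cost_per_run, reliability):
    """Average cost of one *successful* execution: failed runs still incur
    cost, so with success probability r each success costs c / r on average."""
    assert 0 < reliability <= 1
    return cost_per_run / reliability

def select_composition(candidates):
    """Pick the composition minimizing effective execution cost rather than
    raw per-run cost. candidates: list of (name, cost_per_run, reliability)."""
    return min(candidates, key=lambda c: effective_cost(c[1], c[2]))
```

This is why a composition that is cheaper per attempt can still lose: at cost 5 and reliability 0.5 its effective cost is 10, worse than a cost-7 composition with reliability 0.95.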

Proceedings ArticleDOI
Weishan Dong1, Li Li1, Changjin Zhou1, Yu Wang1, Min Li1, Chunhua Tian1, Wei Sun1 
08 Jul 2012
TL;DR: A novel approach to discover the Generalized Spatial Association Rules (GSAR), which are capable of expressing richer information including not only spatial, but also non-spatial and taxonomy information of spatial objects is proposed.
Abstract: Spatial association rule mining is an important technique of spatial data mining and business intelligence. Nevertheless, traditional spatial association rule mining approaches have a significant limitation: they cannot effectively involve and exploit non-spatial information. As a result, many interesting rules mixing spatial and non-spatial information, which provide extra insights and reveal hidden patterns, cannot be found. In this paper, we propose a novel approach to discover Generalized Spatial Association Rules (GSAR), which can express richer information, including not only spatial but also non-spatial and taxonomy information of spatial objects. Meanwhile, the additional computation introduced costs only linear time. A case study on a real crime dataset shows that many interesting and meaningful crime patterns can be discovered using the proposed approach, whereas traditional approaches cannot find such patterns at all.

Proceedings ArticleDOI
08 Jul 2012
TL;DR: This twofold assessment enables a fit-gap analysis and a maturity appraisal; the novelties of GEF are its agility and universality, as well as the completeness of its analysis.
Abstract: We present a technique in the domain of the business architecture of enterprises. It is called GEF (General Enterprise Framework) because it intends to be universal and to apply to the whole enterprise. GEF is a grid in which the activities of business processes are classified into five levels - Plan, Execute, Monitor, Control, Manage information. In an ideal situation, all levels are defined and computerized. By surveying the current situation, management can check to what extent the organization's business processes (a) cover the levels (completeness) and (b) are IT-supported (computerization). This twofold assessment enables a fit-gap analysis and a maturity appraisal. The novelties of GEF are its agility and universality, as well as the completeness of its analysis. A test on real projects showed that GEF is easy to use and effective.

Proceedings ArticleDOI
08 Jul 2012
TL;DR: A data-driven approach to automatically choose blocking column(s), motivated by the modus operandi of data quality practitioners, is presented; it produces a ranked list of columns by evaluating their appropriateness for blocking on the basis of factors including data quality and distribution.
Abstract: Record linkage is an essential but expensive step in enterprise data management. In most deployments, blocking techniques are employed to reduce the number of record-pair comparisons and hence the computational complexity of the task. Blocking algorithms require a careful selection of the column(s) to be used for blocking. This selection is critical to the accuracy and speed-up offered by the blocking technique; it can depend heavily on the quality of the data and normally requires an experienced data quality practitioner who can exploit prior domain knowledge to analyse a small sample of the huge database. In this paper, we present a data-driven approach to automatically choose blocking column(s), motivated by the modus operandi of data quality practitioners. Our approach produces a ranked list of columns by evaluating their appropriateness for blocking on the basis of factors including data quality and distribution. We evaluate our choice of blocking columns through experiments on real-world and synthetic datasets, and we extend our approach to scenarios where more than one column can be used for blocking.
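The abstract names data quality and distribution as scoring factors without giving the formula; one plausible sketch, offered as an assumption rather than the paper's actual score, ranks columns by completeness times normalized value entropy (a complete column whose values spread records evenly across many blocks scores highest):

```python
import math
from collections import Counter

def blocking_scores(rows, columns):
    """Rank candidate blocking columns by a simple heuristic:
    completeness (non-missing fraction) x normalized value entropy
    (how evenly the column's values would split records into blocks)."""
    n = len(rows)
    scores = {}
    for col in columns:
        values = [r[col] for r in rows if r.get(col) not in (None, "")]
        completeness = len(values) / n
        counts = Counter(values)
        if len(counts) > 1:
            probs = [c / len(values) for c in counts.values()]
            entropy = -sum(p * math.log(p) for p in probs) / math.log(len(counts))
        else:
            entropy = 0.0  # a constant (or empty) column creates one giant block
        scores[col] = completeness * entropy
    return sorted(scores, key=scores.get, reverse=True)
```

A real scorer would also penalize over-blocking (near-unique columns like IDs, which block nothing useful) and value-level noise; the heuristic above only captures the two factors the abstract names.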

Proceedings ArticleDOI
Xiaoqing Wang1, Hongwei Ding1, Wei Wang1, Li Zhang1, Yongqing Xue1 
08 Jul 2012
TL;DR: The current international status and publication status of carbon emission management are surveyed, and logistics carbon-emission optimization is analyzed through a case study.
Abstract: Global warming is probably the most pressing ecological, economic and societal challenge facing our generation. Many industries and enterprises have begun taking steps toward carbon emission reduction and management. Logistics activities contribute billions of tons of carbon emissions annually, with emission-reduction opportunities above 50 percent. This paper surveys the current international status and the publication status of carbon emission management. Logistics carbon emission management is addressed and its importance analyzed. A logistics carbon-footprint evaluation framework is provided, different low-carbon technologies for logistics enterprises are introduced, and logistics carbon-emission optimization is analyzed through a case study.

Proceedings ArticleDOI
Junchi Yan1, Chunhua Tian1, Yu Wang1, Jin Huang1
08 Jul 2012
TL;DR: An incremental learning algorithm, the online support vector regression model, is presented; it enjoys the merits of lower memory requirements and lower computational overhead compared with batch methods.
Abstract: Modeling methods that aim to predict electricity prices accurately should be capable of handling a continuous stream of data while remaining responsive to potential structural changes. To this end, traditional machine learning approaches are widely applied, such as multi-linear regression, Artificial Neural Networks (ANN), time series models like Auto Regressive Moving Average (ARMA) models, Gaussian Processes (GP), random forests and Genetic Algorithms (GA), all of which fall into two categories: parametric and non-parametric models. A practical challenge in forecasting streaming data is the structural variation of the testing samples, which makes the training samples not necessarily representative of newly arriving samples. In such an online forecasting context, an incremental supervised learning algorithm is better suited than a batch-mode one, since it can adapt to newly arriving streaming data by accommodating the possible variations of new samples, and it allows for the removal of old data if necessary. An incremental learning algorithm, the online support vector regression model, is presented in this paper; it enjoys the merits of lower memory requirements and lower computational overhead compared with batch methods. Promising results are demonstrated by comparison with other typical regression methods on a publicly available benchmark data set for the electricity price forecasting task.
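A full online SVR implementation is lengthy; the incremental-update idea itself can be sketched with recursive least squares, which likewise updates per sample with bounded memory and can down-weight old data via a forgetting factor. This is a stand-in for the update pattern, not the paper's algorithm:

```python
import numpy as np

class RecursiveLeastSquares:
    """Incremental linear model updated one sample at a time: a compact
    stand-in for the online-update idea (the paper uses online support
    vector regression, which is considerably more involved)."""
    def __init__(self, dim, forgetting=1.0):
        self.w = np.zeros(dim)          # current weight estimate
        self.P = np.eye(dim) * 1000.0   # large initial covariance (weak prior)
        self.lam = forgetting           # < 1 exponentially down-weights old samples

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)          # gain vector
        self.w = self.w + k * (y - x @ self.w)  # correct by prediction error
        self.P = (self.P - np.outer(k, Px)) / self.lam

    def predict(self, x):
        return np.asarray(x, dtype=float) @ self.w
```

Each `update` is O(dim^2) with no stored history, which is exactly the memory/compute advantage the abstract claims for the incremental approach over retraining a batch model on every new price sample; setting `forgetting < 1` is one way to stay responsive to structural change.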

Proceedings ArticleDOI
08 Jul 2012
TL;DR: A two-step approach is presented with a data-driven model estimation for each HVAC configuration and an optimization step taking into account dynamic trade-offs between changing preferences of occupants and energy usage with possibly time-varying penalty constants.
Abstract: We present a comprehensive approach for leveraging sensor networks to improve HVAC (Heating, Ventilation, and Air Conditioning) services in terms of occupants' preferences as well as sustainability. A two-step approach is presented, with a data-driven model estimation for each HVAC configuration and an optimization step taking into account dynamic trade-offs between the changing preferences of occupants and energy usage, with possibly time-varying penalty constants. We further enable consideration of potentially available forecasts of relevant variables such as outside temperature. Application of the suggested approach is illustrated through a case study, and benefits as well as limitations are discussed.

Proceedings ArticleDOI
08 Jul 2012
TL;DR: The ambition is to be able to suggest improvements to the state of the art by proposing better means to model dynamics of dramatic transformations that lead to turnaround of corporations.
Abstract: This paper discusses Business Model transformation in corporate turnaround events. This is accomplished with a Business Model Framework and by studying a set of cases in which the demand-supply constellation has changed dramatically. The cases are from Finland (Nokia) and China (Baidu). A fourth case compares two business models that existed simultaneously but in different parts of the world.

Proceedings ArticleDOI
08 Jul 2012
TL;DR: Experiments on real datasets show that the dynamic user adaptive combination strategy can significantly enhance the performance of the recommendation and the external open resource IMDB is a very good information resource for recommendation.
Abstract: Movie recommendation is a very popular service on Internet-based movie websites such as NetFlix and MovieLens. The performance of the recommendation plays a key role in the user experience. Existing work has shown that combining content-based and collaborative filtering algorithms is the best approach to movie recommendation. Nevertheless, the performance of such a hybrid algorithm strongly depends on the strategy for combining the basic pure algorithms. Existing work usually uses a static combination strategy, which may generate even worse performance for some users. To solve this problem, in this paper we propose a new item-based hybrid algorithm that uses a dynamic user-adaptive combination strategy. Besides, we also exploit the external open resource IMDB as the movie content data. Experiments on real datasets show that the dynamic user-adaptive combination strategy can significantly enhance the performance of the recommendation, and that the external open resource IMDB is a very good information resource for recommendation.
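The abstract does not fully specify the dynamic strategy; one plausible sketch of a per-user adaptive combination is to weight the content-based and collaborative predictors by their relative error on each user's held-out ratings. The function names and the weighting rule below are illustrative, not the paper's.

```python
def user_adaptive_weight(held_out, content_pred, cf_pred):
    """Return per-user weight alpha for the content-based predictor,
    based on each predictor's mean absolute error on held-out ratings."""
    err_c = sum(abs(content_pred(i) - r) for i, r in held_out) / len(held_out)
    err_f = sum(abs(cf_pred(i) - r) for i, r in held_out) / len(held_out)
    if err_c + err_f == 0:
        return 0.5
    return err_f / (err_c + err_f)   # lower content error -> higher alpha

def hybrid_score(item, alpha, content_pred, cf_pred):
    """Blend the two base predictors with the user-specific weight."""
    return alpha * content_pred(item) + (1 - alpha) * cf_pred(item)
```

Because alpha is computed per user, a user whose taste is well captured by content features leans on the content predictor, while a user with many like-minded neighbors leans on collaborative filtering, unlike a single static global weight.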

Proceedings ArticleDOI
08 Jul 2012
TL;DR: It is proposed in this paper that academic intelligence, journal intelligence, conference intelligence, paper intelligence and so on be integrated to establish an intelligent scientific research collaboration platform.
Abstract: Currently, research issues are becoming increasingly global and complex. In order to build a more professional and comprehensive ability to solve problems, this paper proposes that academic intelligence, journal intelligence, conference intelligence, paper intelligence and so on be integrated to establish an intelligent scientific research collaboration platform. Taking the system application of Science and Technology Review as an example, the process of scientific research collaboration is carried out to verify the effectiveness of the system. In conclusion, the scientific research collaboration platform can satisfy the comprehensive needs of effectively acquiring a mass of information, launching scientific research collaboration, and facilitating academic communication.

Proceedings ArticleDOI
08 Jul 2012
TL;DR: It is found that if the sample data satisfy the IIA property, MNL is the first choice in mode split forecasting, and the nested logit model and heteroscedastic extreme value (HEV) model are not significantly better than the multinomial logit model.
Abstract: Four typical discrete choice models, the multinomial logit (MNL) model, the nested logit (NL) model, the heteroscedastic extreme value (HEV) model and the mixed logit model, have been proposed and implemented in empirical investigations, although there is no universally acknowledged principle for choosing among them. Here we report a study testing these models in a travel mode choice case. We calibrated the four models using software we programmed ourselves. We found that if the sample data satisfy the IIA property, MNL is the first choice for mode split forecasting; the nested logit and heteroscedastic extreme value models are not significantly better than the multinomial logit model. The mixed logit model corrects the IIA flaw, but is somewhat more difficult to estimate.
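The MNL choice probability is a softmax over the modes' systematic utilities, P(i) = exp(V_i) / Σ_j exp(V_j). A minimal sketch, with illustrative utility values rather than the paper's calibrated coefficients:

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j).
    The max utility is subtracted first for numerical stability."""
    vmax = max(utilities.values())
    expv = {mode: math.exp(v - vmax) for mode, v in utilities.items()}
    total = sum(expv.values())
    return {mode: e / total for mode, e in expv.items()}
```

The IIA property mentioned in the abstract is visible directly in this formula: the ratio P(car)/P(bus) depends only on V_car and V_bus, so adding a new alternative rescales all probabilities proportionally, which is the flaw the mixed logit model corrects.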

Proceedings ArticleDOI
08 Jul 2012
TL;DR: A new experiment platform frame, which is based on SPA (Spreading Activation) model and social centrality, is proposed to validate the information spreading speed in social network, to help the administration department to manage social network more efficiently.
Abstract: With the fast development of the Internet and the increase of data volume on it, the Internet has played an increasingly important role in information spreading in recent years. It is reported that a social network is a typical complex network with a scale-free degree distribution. Emergency cases are happening more and more often in China and all over the world, and governments have realized that it is necessary to monitor and manage information spreading on social networks. In this paper, a new experiment platform framework, based on the SPA (Spreading Activation) model and social centrality, is proposed to validate information spreading speed in a social network, to help administration departments manage social networks more efficiently.
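A spreading-activation pass over a social graph can be sketched as iterative propagation of activation from seed nodes along edges with a decay factor. The graph structure, decay, rounds and threshold below are illustrative, not the platform's actual parameters.

```python
def spread_activation(graph, seeds, decay=0.5, rounds=3, threshold=0.01):
    """graph: {node: [neighbors]}; seeds: {node: initial activation}.
    Each round, every newly activated node passes decay * activation,
    split evenly among its neighbors; activation below the threshold
    stops spreading."""
    activation = dict(seeds)
    frontier = dict(seeds)
    for _ in range(rounds):
        nxt = {}
        for node, a in frontier.items():
            neighbors = graph.get(node, [])
            if not neighbors or a < threshold:
                continue
            share = decay * a / len(neighbors)
            for nb in neighbors:
                nxt[nb] = nxt.get(nb, 0.0) + share
        for nb, a in nxt.items():
            activation[nb] = activation.get(nb, 0.0) + a
        frontier = nxt
    return activation
```

Counting how many rounds it takes for activation to reach a given fraction of nodes gives one simple proxy for the spreading speed the platform is meant to validate; weighting the share passed to each neighbor by its centrality would fold in the social-centrality component.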

Proceedings ArticleDOI
08 Jul 2012
TL;DR: With the help of APS, an actual power system's control, scheduling, optimization and management can be further improved by providing theoretical guidance and technical support for rolling optimization in normal situations and emergency management in abnormal situations.
Abstract: A modern power system is a typical multi-level complex giant system consisting of physical infrastructures, human operators, social resources, etc. Conventional analytical methods and simulation systems cannot provide sufficient guidance for its operation and management, because they are mainly based on physical models, natural phenomena, or other existing control methods rooted in reductionism. The ACP approach, consisting of artificial systems (A), computational experiments (C) and parallel execution (P), is based on holism and complex system theory and has specific advantages in research on power systems. In this article, the ACP approach is applied to build an artificial power system (APS) using multi-agent complex networks. With the help of the APS, an actual power system's control, scheduling, optimization and management can be further improved by providing theoretical guidance and technical support for rolling optimization in normal situations and emergency management in abnormal situations. As a case study, an APS is constructed with actual data from the North China power grid, and its vulnerability is simulated and analyzed under random, dynamic and static attacks.
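The attack experiments can be sketched as node removals followed by measuring the fraction of nodes remaining in the largest connected component, a standard proxy for grid vulnerability. The toy ring network in the usage below stands in for the actual North China grid data.

```python
def largest_component_fraction(nodes, edges):
    """Size of the largest connected component / total node count,
    via union-find with path halving."""
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        if u in parent and v in parent:
            parent[find(u)] = find(v)
    sizes = {}
    for n in nodes:
        r = find(n)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / len(nodes) if nodes else 0.0

def attack(nodes, edges, removed):
    """Fraction of surviving nodes still in the largest component
    after the attacked nodes are removed."""
    keep = [n for n in nodes if n not in removed]
    kept = [(u, v) for u, v in edges if u not in removed and v not in removed]
    return largest_component_fraction(keep, kept)
```

A random attack draws `removed` uniformly; a static (targeted) attack removes the highest-degree nodes, which typically fragments a scale-free network far faster.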

Proceedings ArticleDOI
Rong Zeng Cao1, Wei Ding1, Zhong Su1
08 Jul 2012
TL;DR: A business outcome-based pricing model for IT services is developed by understanding the causal relationships between business outcomes and IT capabilities; the model aims to balance the interests of IT service providers and clients.
Abstract: This paper identifies two trends occurring in the IT services industry. First, the value proposition of services is shifting towards delivering business outcomes. Second, this shift poses a serious threat of commoditization to traditional models of IT-focused outsourcing and pure IT cost plays. The authors developed a business outcome-based pricing model for IT services by understanding the causal relationships between business outcomes and IT capabilities. The model aims to balance the interests of IT service providers and clients, minimizing the providers' risk while maximizing the clients' satisfaction. Finally, examples are given of how business outcome-based pricing has been used in different industries.
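The abstract does not give the pricing formula; a common structure that outcome-based contracts of this kind resemble is a reduced base fee plus a gain-share on measured outcomes above an agreed baseline, with an optional cap to bound the client's exposure. All names and numbers below are illustrative, not the paper's model.

```python
def outcome_based_fee(base_fee, outcome, baseline, share, cap=None):
    """Provider fee = base fee + share of the measured business outcome
    above the agreed baseline, optionally capped."""
    bonus = share * max(0.0, outcome - baseline)
    if cap is not None:
        bonus = min(bonus, cap)
    return base_fee + bonus
```

The base fee limits the provider's downside when outcomes fall short, while the gain-share aligns the provider's revenue with the client's realized business value, which is the balancing act the abstract describes.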