scispace - formally typeset
Author

Jiajie Fan

Bio: Jiajie Fan is an academic researcher from Fudan University. The author has contributed to research in topics: Light-emitting diode & Phosphor. The author has an h-index of 17 and has co-authored 124 publications receiving 1,307 citations. Previous affiliations of Jiajie Fan include Hong Kong Polytechnic University & Delft University of Technology.


Papers
Journal ArticleDOI
TL;DR: From this review, a number of challenges that result from the rapid adoption of IoT-based PHM are identified and include appropriate analytics, security, IoT platforms, sensor energy harvesting, IoT business models, and licensing approaches.
Abstract: Prognostics and systems health management (PHM) is an enabling discipline that uses sensors to assess the health of systems, diagnoses anomalous behavior, and predicts the remaining useful performance over the life of the asset. The advent of the Internet of Things (IoT) enables PHM to be applied to all types of assets across all sectors, thereby creating a paradigm shift that is opening up significant new business opportunities. This paper introduces the concepts of PHM and discusses the opportunities provided by the IoT. Developments are illustrated with examples of innovations from manufacturing, consumer products, and infrastructure. From this review, a number of challenges that result from the rapid adoption of IoT-based PHM are identified. These include appropriate analytics, security, IoT platforms, sensor energy harvesting, IoT business models, and licensing approaches.

176 citations

Journal ArticleDOI
TL;DR: A quick and efficient evaluation criterion is proposed for the thermal management of IGBTs, based on the requirements for the junction-to-case thermal resistance and the equivalent heat transfer coefficient of the test samples.
Abstract: With increasing attention toward the sustainable development of energy and the environment, power electronics (PEs) are attracting growing interest across various energy systems. The insulated gate bipolar transistor (IGBT), a power electronic device with numerous advantages and potential for development toward higher voltage and current ratings, has been used in a broad range of applications. However, the continuing miniaturization and rapidly increasing power ratings of IGBTs produce remarkably high heat fluxes, which require complex thermal management. In this paper, studies of the thermal management of IGBTs are reviewed, and their results are analyzed, compared, and classified. Thermal models for accurately calculating dynamic heat dissipation are divided into analytical models, numerical models, and thermal network models. The thermal resistances of current IGBT modules are also studied. Surveying current IGBT products, we observe that the junction-to-case thermal resistance generally decreases inversely with total thermal power. In addition, the cooling solutions for IGBTs are reviewed, and the performance of the various solutions is studied and compared. Finally, we propose a quick and efficient evaluation criterion for the thermal management of IGBTs based on the requirements for the junction-to-case thermal resistance and the equivalent heat transfer coefficient of the test samples.

171 citations
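The evaluation criterion above reduces, in its simplest form, to a one-node thermal-resistance budget: the junction temperature is the case temperature plus dissipated power times the junction-to-case resistance. A minimal sketch of that arithmetic (the numbers are illustrative, not from the paper):

```python
def junction_temp(t_case_c, power_w, r_th_jc):
    """Junction temperature from a one-node thermal model:
    T_j = T_c + P * R_th(junction-to-case)."""
    return t_case_c + power_w * r_th_jc

def max_allowed_r_th_jc(t_j_max_c, t_case_c, power_w):
    """Largest junction-to-case thermal resistance that keeps the
    junction at or below t_j_max_c for a given dissipated power."""
    return (t_j_max_c - t_case_c) / power_w

# Hypothetical module: 1 kW dissipation, 80 C case, 150 C junction limit.
r_req = max_allowed_r_th_jc(150.0, 80.0, 1000.0)
print(r_req)                                  # 0.07 K/W budget
print(junction_temp(80.0, 1000.0, 0.05))      # 130.0 C with a 0.05 K/W module
```

A module whose datasheet R_th(j-c) exceeds the computed budget fails this quick check and needs a more aggressive cooling solution.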

Journal ArticleDOI
TL;DR: In this article, the degradation-data-driven method (DDDM) was used to predict the reliability of HPWLEDs by analyzing lumen maintenance data collected under the IES LM-80-08 lumen maintenance test standard.
Abstract: High-power white light-emitting diodes (HPWLEDs) have attracted much attention in the lighting market. However, as highly reliable electronic products that are unlikely to fail under traditional life tests or even accelerated life tests, HPWLEDs have lifetimes that are difficult to estimate using traditional reliability assessment techniques. In this paper, the degradation-data-driven method (DDDM), which is based on the general degradation path model, was used to predict the reliability of HPWLEDs by analyzing lumen maintenance data collected under the IES LM-80-08 lumen maintenance test standard. The final results showed that, compared to the IES TM-21-11 lumen lifetime estimation method, DDDM (comprising the approximation method, the analytical method, and the two-stage method) yields much more reliability information (e.g., mean time to failure, confidence interval, and reliability function) and more accurate predictions. Among the three methods, the two-stage method produced the highest prediction accuracy.

143 citations
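The degradation-path idea behind both DDDM and TM-21-style extrapolation can be illustrated with the usual exponential lumen-decay model, phi(t) = B * exp(-alpha * t), fitted to LM-80-style readings and extrapolated to the 70% threshold (L70). A minimal sketch with synthetic data (the decay rate and readings are made up for illustration):

```python
import math

def fit_lumen_decay(times, lumen_frac):
    """Least-squares fit of phi(t) = B * exp(-alpha * t) in log space,
    where it reduces to simple linear regression."""
    n = len(times)
    ys = [math.log(p) for p in lumen_frac]
    t_bar = sum(times) / n
    y_bar = sum(ys) / n
    alpha = -sum((t - t_bar) * (y - y_bar) for t, y in zip(times, ys)) / \
            sum((t - t_bar) ** 2 for t in times)
    b = math.exp(y_bar + alpha * t_bar)
    return alpha, b

def l70_life(alpha, b):
    """Time at which the maintained lumen fraction crosses 0.70 (L70)."""
    return math.log(b / 0.70) / alpha

# Synthetic LM-80-style readings every 1000 h, exact decay rate 2e-5/h.
times = [1000, 2000, 3000, 4000, 5000, 6000]
data = [math.exp(-2e-5 * t) for t in times]
alpha, b = fit_lumen_decay(times, data)
print(round(l70_life(alpha, b)))   # 17834 h
```

DDDM goes further than this single-curve fit by modeling unit-to-unit variation in the degradation paths, which is what yields confidence intervals and a full reliability function rather than a point estimate.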

Journal ArticleDOI
TL;DR: In this paper, after analyzing the materials and geometries for high-power white LED lighting at all levels, failure modes, mechanisms, and effects analysis (FMMEA) was used in the PoF-based PHM approach to identify and rank the potential failures emerging from the design process.
Abstract: Recently, high-power white light-emitting diodes (LEDs) have attracted much attention due to their versatility in applications and to the increasing market demand for them. So great attention has been focused on producing highly reliable LED lighting. How to accurately predict the reliability of LED lighting is emerging as one of the key issues in this field. Physics-of-failure-based prognostics and health management (PoF-based PHM) is an approach that utilizes knowledge of a product's life cycle loading and failure mechanisms to design for and assess reliability. In this paper, after analyzing the materials and geometries for high-power white LED lighting at all levels, i.e., chips, packages and systems, failure modes, mechanisms and effects analysis (FMMEA) was used in the PoF-based PHM approach to identify and rank the potential failures emerging from the design process. The second step in this paper was to establish the appropriate PoF-based damage models for identified failure mechanisms that carry a high risk.

108 citations
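The FMMEA step described above amounts to scoring candidate failure mechanisms and ranking them by risk so that damage models are built only for the high-risk ones. A minimal sketch of that prioritization, using a simple severity-times-occurrence score (the mechanisms and scores below are hypothetical, not the paper's actual rankings):

```python
# Hypothetical failure mechanisms for an LED package, each scored 1-10.
mechanisms = [
    {"mechanism": "die-attach fatigue",         "severity": 8, "occurrence": 6},
    {"mechanism": "phosphor thermal quenching", "severity": 6, "occurrence": 7},
    {"mechanism": "encapsulant yellowing",      "severity": 5, "occurrence": 8},
    {"mechanism": "wire-bond lift-off",         "severity": 9, "occurrence": 3},
]

def rank_by_risk(items):
    """Rank mechanisms by risk score = severity * occurrence, highest
    first -- FMMEA prioritization in its simplest form."""
    return sorted(items, key=lambda m: m["severity"] * m["occurrence"],
                  reverse=True)

for m in rank_by_risk(mechanisms):
    print(m["mechanism"], m["severity"] * m["occurrence"])
```

In the paper's workflow, the top-ranked mechanisms are the ones that get physics-of-failure damage models in the second step.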

Journal ArticleDOI
TL;DR: A particle filter-based (PF-based) prognostic approach based on both Sequential Monte Carlo (SMC) and Bayesian techniques to predict the lumen maintenance life of LED light sources is described.
Abstract: Highlights:
- Destructive life testing is time-consuming and expensive for estimating an LED's life.
- The TM-21 standard with least-squares regression has weaknesses in predicting LED life.
- A dynamic recursive particle filter (PF) method is developed to model lumen degradation data.
- An SMC method is proposed to predict the RUL distribution with a confidence interval.
- The PF has higher accuracy than the TM-21 standard for long-term LED life prediction.

Lumen degradation is a common failure mode in LED light sources. Lumen maintenance life, defined as the time when the maintained percentage of the initial light output falls below a failure threshold, is a key characteristic for assessing the reliability of LED light sources. Owing to the long lifetime and high reliability of LED light sources, it is challenging to estimate their lumen maintenance life using traditional life testing that records failure data. This paper describes a particle-filter-based (PF-based) prognostic approach, built on Sequential Monte Carlo (SMC) and Bayesian techniques, to predict the lumen maintenance life of LED light sources. Lumen maintenance degradation data collected from an accelerated degradation test were used to demonstrate the prediction algorithm and methodology of the proposed PF approach. Its feasibility and prediction accuracy were then validated and compared with the TM-21 standard method created by the Illuminating Engineering Society of North America (IESNA). Finally, a robustness study was conducted to analyze how the initialization of parameters impacts the prediction accuracy and uncertainties of the proposed PF method. The results show that, compared to the TM-21 method, the PF approach achieves better prediction performance, with an error of less than 5% in predicting the long-term lumen maintenance life of LED light sources.

73 citations
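The particle-filter idea can be sketched with a minimal bootstrap filter: particles carry candidate decay rates, get reweighted by the likelihood of each lumen reading, and are resampled; the remaining useful life (RUL) is then extrapolated from the posterior estimate. This is a toy sketch of the SMC mechanism on synthetic data, not the paper's exact algorithm or data:

```python
import math, random

random.seed(0)

def particle_filter_rul(times, lumens, n_particles=2000, threshold=0.70):
    """Minimal bootstrap particle filter: each particle is a candidate
    decay rate alpha for phi(t) = exp(-alpha * t). Particles are
    reweighted by a Gaussian likelihood of each reading, resampled,
    and the RUL is extrapolated from the posterior-mean alpha."""
    particles = [random.uniform(1e-6, 1e-4) for _ in range(n_particles)]
    sigma = 0.01   # assumed measurement noise on the lumen fraction
    for t, y in zip(times, lumens):
        weights = [math.exp(-0.5 * ((y - math.exp(-a * t)) / sigma) ** 2)
                   for a in particles]
        particles = random.choices(particles, weights=weights, k=n_particles)
    alpha_hat = sum(particles) / n_particles
    life = math.log(1.0 / threshold) / alpha_hat   # time to hit L70
    return life - times[-1]                        # hours beyond last reading

# Synthetic readings generated from a known decay rate of 3e-5 per hour.
times = [1000, 2000, 3000, 4000]
lumens = [math.exp(-3e-5 * t) for t in times]
rul = particle_filter_rul(times, lumens)
print(round(rul))
```

The paper's method additionally propagates the particle cloud forward to obtain a full RUL distribution and confidence interval, rather than the single point estimate shown here.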


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently—those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers—the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90%, and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.

29,323 citations
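The spatial-decomposition algorithm owes its scaling to cell lists: atoms are binned into cells at least one cutoff wide, so each atom only checks the 27 surrounding cells instead of all N atoms, and each processor can own a block of cells. A serial sketch of that core idea (a toy illustration, not the paper's message-passing implementation):

```python
import itertools, random

def min_image_dist2(p, q, box):
    """Squared distance under periodic boundaries (minimum image)."""
    d2 = 0.0
    for a, b in zip(p, q):
        d = abs(a - b)
        d = min(d, box - d)
        d2 += d * d
    return d2

def neighbor_pairs(positions, box, cutoff):
    """All atom pairs within `cutoff`, found by binning atoms into cells
    at least `cutoff` wide and checking only the 27 neighboring cells."""
    ncell = max(1, int(box // cutoff))
    size = box / ncell
    cells = {}
    for i, p in enumerate(positions):
        key = tuple(int(c // size) % ncell for c in p)
        cells.setdefault(key, []).append(i)
    pairs = set()
    for key, atoms in cells.items():
        for off in itertools.product((-1, 0, 1), repeat=3):
            nkey = tuple((k + o) % ncell for k, o in zip(key, off))
            for i in atoms:
                for j in cells.get(nkey, ()):
                    if i < j and min_image_dist2(positions[i], positions[j],
                                                 box) < cutoff ** 2:
                        pairs.add((i, j))
    return pairs

random.seed(1)
pos = [tuple(random.uniform(0, 10.0) for _ in range(3)) for _ in range(60)]
found = neighbor_pairs(pos, 10.0, 2.5)
print(len(found))
```

This turns the O(N^2) all-pairs search into O(N) work per timestep, which is what makes per-region assignment of atoms to processors scale to the hundred-million-atom runs reported above.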

Journal ArticleDOI
01 Jan 1977-Nature
TL;DR: A review of Light-Emitting Diodes by A. A. Bergh and P. J. Dean, a monograph on the physics and applications of light-emitting diodes.
Abstract: Light-Emitting Diodes. (Monographs in Electrical and Electronic Engineering.) By A. A. Bergh and P. J. Dean. Pp. viii+591. (Clarendon: Oxford; Oxford University: London, 1976.) £22.

1,560 citations

01 Jan 2011
TL;DR: In this paper, a polynomial dimensional decomposition (PDD) method for global sensitivity analysis of stochastic systems subject to independent random input following arbitrary probability distributions is presented.
Abstract: This paper presents a polynomial dimensional decomposition (PDD) method for global sensitivity analysis of stochastic systems subject to independent random input following arbitrary probability distributions. The method involves Fourier-polynomial expansions of lower-variate component functions of a stochastic response by measure-consistent orthonormal polynomial bases, analytical formulae for calculating the global sensitivity indices in terms of the expansion coefficients, and dimension-reduction integration for estimating the expansion coefficients. Due to identical dimensional structures of PDD and analysis-of-variance decomposition, the proposed method facilitates simple and direct calculation of the global sensitivity indices. Numerical results of the global sensitivity indices computed for smooth systems reveal significantly higher convergence rates of the PDD approximation than those from existing methods, including polynomial chaos expansion, random balance design, state-dependent parameter, improved Sobol’s method, and sampling-based methods. However, for non-smooth functions, the convergence properties of the PDD solution deteriorate to a great extent, warranting further improvements. The computational complexity of the PDD method is polynomial, as opposed to exponential, thereby alleviating the curse of dimensionality to some extent. Mathematical modeling of complex systems often requires sensitivity analysis to determine how an output variable of interest is influenced by individual or subsets of input variables. A traditional local sensitivity analysis entails gradients or derivatives, often invoked in design optimization, describing changes in the model response due to the local variation of input. Depending on the model output, obtaining gradients or derivatives, if they exist, can be simple or difficult. 
In contrast, a global sensitivity analysis (GSA), increasingly becoming mainstream, characterizes how the global variation of input, due to its uncertainty, impacts the overall uncertain behavior of the model. In other words, GSA constitutes the study of how the output uncertainty from a mathematical model is divvied up, qualitatively or quantitatively, to distinct sources of input variation in the model [1].

1,296 citations
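The variance-based quantity the PDD method computes analytically, the first-order Sobol index S_i = Var(E[Y|X_i]) / Var(Y), can also be estimated by the sampling-based methods it is compared against. A minimal pick-freeze Monte Carlo sketch on a toy additive model (the model and sample size are illustrative; for Y = 2*X1 + X2 with independent standard-normal inputs, the exact indices are 0.8 and 0.2):

```python
import random

random.seed(0)

def model(x1, x2):
    """Toy additive model: analytically S1 = 4/5, S2 = 1/5."""
    return 2.0 * x1 + x2

def sobol_first_order(n=100_000):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol indices
    S_i = Var(E[Y|X_i]) / Var(Y): freeze coordinate i from sample A,
    redraw the other coordinate from sample B, and take the covariance."""
    a = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
    b = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
    y = [model(*p) for p in a]
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / n
    indices = []
    for i in range(2):
        y_i = [model(*(pa[k] if k == i else pb[k] for k in range(2)))
               for pa, pb in zip(a, b)]
        cov = sum(v * w for v, w in zip(y, y_i)) / n - mean * (sum(y_i) / n)
        indices.append(cov / var)
    return indices

s1, s2 = sobol_first_order()
print(round(s1, 2), round(s2, 2))
```

The point of PDD (and of polynomial chaos) is that, once the expansion coefficients are known, these same indices fall out of analytical formulae instead of requiring the 2n model evaluations per index used here.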

Journal ArticleDOI
TL;DR: This paper explores the role of Internet of Things (IoT) and its impact on supply chain management (SCM) through an extensive literature review and finds that most studies have focused on conceptualising the impact of IoT with limited analytical models and empirical studies.
Abstract: This paper explores the role of the Internet of Things (IoT) and its impact on supply chain management (SCM) through an extensive literature review. Important aspects of IoT in SCM are covered, including the definition of IoT, the main IoT technology enablers, and various SCM processes and applications. We offer several categorisations of the extant literature, such as by methodology and industry sector, and focus on a classification based on major supply chain processes. In addition, a bibliometric analysis of the literature is presented. We find that most studies have focused on conceptualising the impact of IoT, with limited analytical models and empirical studies. Moreover, most studies have focused on the delivery supply chain process and on the food and manufacturing supply chains. Areas of future SCM research that can support IoT implementation are also identified.

727 citations

Proceedings ArticleDOI
15 Mar 2006
TL;DR: In this article, a damage-precursor-based residual-life computation approach for various package elements is presented, to prognosticate electronic systems prior to the appearance of any macro-indicators of damage.
Abstract: In this paper, a damage-precursor-based residual-life computation approach for various package elements is presented to prognosticate electronic systems prior to the appearance of any macro-indicators of damage. To implement the system-health monitoring system, precursor variables, or leading indicators of failure, have been identified for various package elements and failure mechanisms. Model algorithms have been developed to correlate precursors with impending failure for the computation of residual life. Package elements investigated include first-level interconnects, dielectrics, chip interconnects, underfills, and semiconductors. Examples of damage proxies include the phase growth rate of solder interconnects, intermetallics, normal stress at the chip interface, and interfacial shear stress.

331 citations
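The residual-life computation described above can be reduced, in its simplest conceivable form, to extrapolating a calibrated damage proxy to its failure level. A toy sketch under a deliberately simple linear-growth assumption (the proxy, growth rate, and failure level are all hypothetical, not values from the paper):

```python
def residual_life(precursor_now, precursor_fail, growth_rate):
    """Toy damage-proxy extrapolation: if a precursor (e.g. a solder
    phase-growth measure) increases linearly at `growth_rate` per cycle,
    the remaining life is the number of cycles until it reaches the
    calibrated failure level. Linear growth is an illustrative
    assumption; real model algorithms fit mechanism-specific trends."""
    if precursor_now >= precursor_fail:
        return 0.0
    return (precursor_fail - precursor_now) / growth_rate

# Hypothetical: proxy grows 0.002 units/cycle, fails at 1.0, now at 0.4.
print(residual_life(0.4, 1.0, 0.002))   # 300.0 cycles remaining
```

The value of the precursor approach is exactly this: the extrapolation runs on a measurable leading indicator, so remaining life can be estimated before any functional failure is visible.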