Author

Ioan Cristian Trelea

Bio: Ioan Cristian Trelea is an academic researcher. The author has contributed to research in the topics Fermentation & Chemistry, has an h-index of 1, and has co-authored 1 publication receiving 2,399 citations.

Papers
Journal ArticleDOI
TL;DR: The particle swarm optimization algorithm is analyzed using standard results from dynamic system theory, and graphical guidelines for parameter selection are derived; the resulting parameter settings give performance superior to previously published results.

2,554 citations
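For context, the analysis above concerns the standard PSO velocity and position update rule. Below is a minimal Python sketch of that rule; the particular values w = 0.729 and c1 = c2 = 1.494 are illustrative constriction-type settings from the stable region that such analyses identify, not values quoted from the paper, and the function and variable names are my own.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=200,
                 w=0.729, c1=1.494, c2=1.494, seed=0):
    """Minimize f over a box with the standard PSO update rule.

    The inertia/acceleration values are illustrative settings from the
    stable parameter region; they are not taken verbatim from the paper.
    """
    rng = rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: [(lo, hi), ...] per dimension
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))  # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()                                  # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()          # global best position

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# Example: minimize the 5-dimensional sphere function.
best_x, best_f = pso_minimize(lambda z: float(np.sum(z ** 2)), [(-5, 5)] * 5)
```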

Journal ArticleDOI
TL;DR: In this paper, the individual and combined effects of fermentation parameters on cell biomass productivity and on the resistance to freezing, freeze-drying and freeze-dried storage of Lactobacillus delbrueckii subsp. bulgaricus CFL1 were investigated.
Abstract: AIM This study investigates the individual and combined effects of fermentation parameters for improving cell biomass productivity and the resistance to freezing, freeze-drying and freeze-dried storage of Lactobacillus delbrueckii subsp. bulgaricus CFL1. METHODS AND RESULTS Cells were cultivated at different temperatures (42°C and 37°C) and pHs (5.8 and 4.8) and harvested at various growth phases (mid-exponential, deceleration and stationary). Specific acidifying activity was determined after fermentation, freezing, freeze-drying and freeze-dried storage. Multiple regression analyses were performed to identify the effects of the fermentation parameters on the losses of specific acidifying activity and to generate the corresponding 3D response surfaces. A multi-objective decision approach was applied to optimize biomass productivity and specific acidifying activity. Higher temperature increased biomass productivity, whereas low pH during growth reduced the loss of specific acidifying activity after freezing and freeze-drying. Furthermore, freeze-drying resistance was favored by a later harvest time. CONCLUSIONS Productivity, freezing resistance and freeze-drying resistance of Lactobacillus delbrueckii subsp. bulgaricus CFL1 were differentially affected by the fermentation parameters studied. No single fermentation condition improved both productivity and resistance to freezing and freeze-drying. Thus, Pareto fronts were helpful for jointly optimizing productivity and resistance when cells were grown at 42°C and pH 4.8 and harvested at the deceleration phase.
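Since the abstract describes a multi-objective (Pareto) trade-off between biomass productivity and acidifying-activity loss, here is a minimal sketch of how a Pareto front is extracted from candidate conditions; the numeric values and the function name are hypothetical, not data from the study.

```python
def pareto_front(points):
    """Return the non-dominated (productivity, loss) pairs, where higher
    productivity and lower loss are preferred. points: list of tuples."""
    front = []
    for prod, loss in points:
        dominated = any(p >= prod and l <= loss and (p, l) != (prod, loss)
                        for p, l in points)
        if not dominated:
            front.append((prod, loss))
    return front

# Hypothetical fermentation conditions: (biomass productivity, activity loss).
candidates = [(1.2, 0.35), (1.5, 0.50), (0.9, 0.20), (1.5, 0.45), (1.1, 0.40)]
print(pareto_front(candidates))  # -> [(1.2, 0.35), (0.9, 0.20), (1.5, 0.45)]
```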
Journal ArticleDOI
Abstract: Aivars Aboltins Busra Acoglu Kong S. Ah-Hen Karim Allaf Stephania Aragón-Rojas Tetsuya Araki Michał Araszkiewicz Diego Arcos-Aviles Kornkanok Aryusuk Getachew Assegehegn SM Roknul Azam Rubiyah Baini B. M. Balboni Devrim Balköse Jan Banout Antonello Barresi José Marcelo Honório Ferreria Barros Andrew Bayly János Beke Soottawat Benjakul Lyes Bennamoun Carolina Beres Manuel Binelo Gokhan Bingol Luigi Capozzi Gisandro Carvalho Siyuan Chang Manop Charoenchaitrakool Ho Hisien Chen Li Chen Long Chen Meiqian Chen Naphaporn Chiewchan Bimal Chitrakar Claudia Cogne Paula M. R. Correia Francis Courtois Nenad Crnomarkovic Li-Zhen Deng Yun Deng Martin Doß Pascal Dufour Diego Elustondo Thiago Euzébio Zhongxiang Fang Xue Feng Fabiano Fernandes Adam Figiel Jose Finzer Petra Foerst Sylwester Furmaniak Chihiro Fushimi Volker Gaukel Jolanta Gawałek Bhupendra M Ghodki Zeljko Gorišek Sandaka Gourishankar Alejandro Grimm Sebastian Gruber Raquel Guiné M. Amdadul Hague Maite Harguindeguy Atsushi Hashimoto W.L.J. Hinrichs Ulrich Hirn Henrik Holmberg Kornel Hulak Jan Iciek Koreyoshi Imamura Yoshinori Itaya Masashi Iwata Hiroyuki Iyota M.E. Jaramillo-Flores Fuji Jian Mohammad Joardder Mohammad Jouki Hasan Jubaer Ian Kemp Janne Keränen Seddik Khalloufi Sarfaraz Khan Ragab Khir Hanna Kowalska Sitaraman Krishnan Peter Kubbutat Tadeusz Kudra Jundika Kurnia Timothy Langrish Thunnop Laokuldilok Asefeh Latifi Johnselvakumar Lawrence Hehe Li Yongming Li Xiang Lu William Lubitz Tariq Mahmood Mateusz Malinowski Thad Maloney Agata Marzec Klaudia Masztalerz Yasuhiro Matsushita Natalia Menshutina Janie Moore Jun Mu Antonio Mulet Kyuya Nakagawa Lebovka Nikolai Amir Nili-Ahmadabadi Lars Nilsson Takahisa Nishizu Marcello Nitz Lovrenc Novak Sylvanus Odjo Martín Olazar Wanderley Oliveira Jeremy Olivier

Cited by
Journal ArticleDOI
TL;DR: Simulation results show that JADE is better than, or at least comparable to, other classic or adaptive DE algorithms, the canonical particle swarm optimization, and other evolutionary algorithms from the literature in terms of convergence performance for a set of 20 benchmark problems.
Abstract: A new differential evolution (DE) algorithm, JADE, is proposed to improve optimization performance by implementing a new mutation strategy, "DE/current-to-pbest", with an optional external archive and by updating control parameters in an adaptive manner. DE/current-to-pbest is a generalization of the classic "DE/current-to-best", while the optional archive operation utilizes historical data to provide information about the direction of progress. Both operations diversify the population and improve convergence performance. The parameter adaptation automatically updates the control parameters to appropriate values and avoids the need for a user's prior knowledge of the relationship between the parameter settings and the characteristics of optimization problems. It is thus helpful for improving the robustness of the algorithm. Simulation results show that JADE is better than, or at least comparable to, other classic or adaptive DE algorithms, the canonical particle swarm optimization, and other evolutionary algorithms from the literature in terms of convergence performance for a set of 20 benchmark problems. JADE with an external archive shows promising results for relatively high-dimensional problems. In addition, it clearly shows that there is no fixed control parameter setting suitable for various problems or even for different optimization stages of a single problem.

2,778 citations
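To make the mutation strategy named in the abstract concrete, below is a sketch of a DE/current-to-pbest/1 step with an optional archive. Only the mutation is shown; JADE's crossover, selection, archive truncation, and adaptation of F and CR are omitted, and details such as the choice p = 0.05 and the function signature are assumptions.

```python
import numpy as np

def current_to_pbest_mutation(pop, fitness, archive, F, p=0.05, rng=None):
    """DE/current-to-pbest/1 mutation with optional external archive (sketch).

    pop:     (NP, D) array of current solutions (minimization assumed).
    archive: list of previously replaced parent vectors (may be empty).
    F:       per-individual scale factors, shape (NP,).
    """
    rng = rng or np.random.default_rng()
    NP, _ = pop.shape
    n_pbest = max(1, int(round(p * NP)))
    pbest_idx = np.argsort(fitness)[:n_pbest]        # top 100*p % individuals
    union = np.vstack([pop] + list(archive)) if archive else pop

    mutants = np.empty_like(pop)
    for i in range(NP):
        x_pbest = pop[rng.choice(pbest_idx)]
        r1 = rng.integers(NP)
        while r1 == i:
            r1 = rng.integers(NP)
        r2 = rng.integers(len(union))                # r2 may point into the archive
        while r2 == i or r2 == r1:
            r2 = rng.integers(len(union))
        # v_i = x_i + F_i * (x_pbest - x_i) + F_i * (x_r1 - x~_r2)
        mutants[i] = pop[i] + F[i] * (x_pbest - pop[i]) + F[i] * (pop[r1] - union[r2])
    return mutants
```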

Journal ArticleDOI
01 Dec 2009
TL;DR: An adaptive particle swarm optimization (APSO) that features better search efficiency than classical particle swarm optimization (PSO) is presented; it can perform a global search over the entire search space with faster convergence speed.
Abstract: An adaptive particle swarm optimization (APSO) that features better search efficiency than classical particle swarm optimization (PSO) is presented. More importantly, it can perform a global search over the entire search space with faster convergence speed. The APSO consists of two main steps. First, by evaluating the population distribution and particle fitness, a real-time evolutionary state estimation procedure is performed to identify, in each generation, one of four defined evolutionary states: exploration, exploitation, convergence, and jumping out. This enables automatic control of the inertia weight, acceleration coefficients, and other algorithmic parameters at run time to improve search efficiency and convergence speed. Then, an elitist learning strategy is performed when the evolutionary state is classified as convergence. The strategy acts on the globally best particle to help it jump out of likely local optima. The APSO has been comprehensively evaluated on 12 unimodal and multimodal benchmark functions, and the effects of parameter adaptation and elitist learning are studied. Results show that APSO substantially enhances the performance of the PSO paradigm in terms of convergence speed, global optimality, solution accuracy, and algorithm reliability. As APSO introduces only two new parameters to the PSO paradigm, it does not introduce additional design or implementation complexity.

1,713 citations
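As a rough illustration of the evolutionary-state-driven parameter control described above, the sketch below computes an evolutionary factor from the population spread and maps it to an inertia weight in roughly [0.4, 0.9] with a sigmoid. The exact formulas used in the paper may differ; the ones here are assumptions in the same spirit.

```python
import numpy as np

def evolutionary_factor(positions, gbest_index):
    """Evolutionary factor in [0, 1] derived from the population distribution:
    how spread out the globally best particle is relative to the swarm
    (an assumed proxy for exploration vs. convergence)."""
    diff = positions[:, None, :] - positions[None, :, :]
    mean_dist = np.sqrt((diff ** 2).sum(axis=-1)).sum(axis=1) / (len(positions) - 1)
    d_g, d_min, d_max = mean_dist[gbest_index], mean_dist.min(), mean_dist.max()
    return float((d_g - d_min) / (d_max - d_min + 1e-12))

def adaptive_inertia(f):
    """Sigmoid mapping of the evolutionary factor to an inertia weight in
    roughly [0.4, 0.9]: large f (exploration) -> larger inertia weight."""
    return 1.0 / (1.0 + 1.5 * np.exp(-2.6 * f))
```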

Book
24 Feb 2006
TL;DR: This book presents particle swarm optimization, covering topics from first formulations and optimal parameter settings to TRIBES (co-operation of tribes) and the dynamics of a swarm.
Abstract: Foreword. Introduction. Part 1: Particle Swarm Optimization. Chapter 1. What is a difficult problem? Chapter 2. On a table corner. Chapter 3. First formulations. Chapter 4. Benchmark set. Chapter 5. Mistrusting chance. Chapter 6. First results. Chapter 7. Swarm: memory and influence graphs. Chapter 8. Distributions of proximity. Chapter 9. Optimal parameter settings. Chapter 10. Adaptations. Chapter 11. TRIBES or co-operation of tribes. Chapter 12. On the constraints. Chapter 13. Problems and applications. Chapter 14. Conclusion. Part 2: Outlines. Chapter 15. On parallelism. Chapter 16. Combinatorial problems. Chapter 17. Dynamics of a swarm. Chapter 18. Techniques and alternatives. Further Information. Bibliography. Index.

1,293 citations

Journal ArticleDOI
TL;DR: Current theoretical studies of particle swarm optimization are extended to investigate particle trajectories for general swarms, including the influence of the inertia term, and a formal proof that each particle converges to a stable point is provided.

1,194 citations
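For context, convergence analyses of this kind typically study the deterministic recurrence obtained by holding the attractor fixed. A commonly derived stability condition is sketched below in LaTeX; the exact formulation in the cited paper may differ.

```latex
% Deterministic PSO trajectory with inertia weight w and combined
% acceleration \varphi = c_1 + c_2, attractor p held fixed:
\[
  x_{t+1} = (1 + w - \varphi)\,x_t - w\,x_{t-1} + \varphi\,p .
\]
% The trajectory converges to the fixed point x^* = p if and only if
\[
  |w| < 1 \qquad \text{and} \qquad 0 < \varphi < 2\,(1 + w).
\]
```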

Journal ArticleDOI
TL;DR: In this paper, a comparative study has been carried out to show the effectiveness of the WCA over other well-known optimizers in terms of computational effort and function value.

1,181 citations