Author

Xavier del Toro Garcia

Other affiliations: University of New South Wales
Bio: Xavier del Toro Garcia is an academic researcher from the University of Castilla–La Mancha. The author has contributed to research in topics: Computer science & Harmonics. The author has an h-index of 8, and has co-authored 20 publications receiving 277 citations. Previous affiliations of Xavier del Toro Garcia include the University of New South Wales.

Papers
Journal ArticleDOI
TL;DR: In this paper, the design, construction and evaluation of a contactless battery charger for electric vehicles (EVs) based on inductive power transfer (IPT) is presented.
Abstract: The design, construction and evaluation of a contactless battery charger for electric vehicles (EVs) based on inductive power transfer (IPT) is presented in this study. The design of such systems entails a high degree of complexity because of the large number of design parameters involved and, consequently, trade-offs in selecting the key design parameters have to be established. The design process and selection of the IPT system parameters is detailed in this study, considering the most common specifications of EV chargers and the practical issues of the implementation. Regarding the compensation scheme, which is one of the main issues in the design, series compensation in both the primary and secondary has been adopted because of the advantages identified after a comprehensive analysis. A laboratory prototype has been built and tested, providing extensive results of the system performance in terms of efficiency and power transfer capability, considering load power variations, as well as changes in the air gap between coils. A detailed analysis of the efficiency of each stage in the IPT system and their contribution to the overall efficiency is also provided.

75 citations
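The series-series compensation scheme adopted in the paper tunes a capacitor against each coil's self-inductance at the operating frequency. A minimal sketch of that sizing rule is below; the 85 kHz operating frequency and the coil inductances are assumed illustrative values, not the parameters of the paper's prototype.

```python
import math

def series_cap(L, f):
    """Capacitance that resonates in series with inductance L at frequency f."""
    w = 2.0 * math.pi * f
    return 1.0 / (w ** 2 * L)

f_op = 85e3          # Hz, assumed operating frequency
L_primary = 120e-6   # H, assumed primary coil self-inductance
L_secondary = 110e-6 # H, assumed secondary coil self-inductance

C_p = series_cap(L_primary, f_op)
C_s = series_cap(L_secondary, f_op)
# With series-series compensation the resonant capacitance depends only
# on the coil self-inductance, not on the load or coupling coefficient,
# which is one of the advantages of the scheme.
```

A larger coil inductance calls for a smaller compensation capacitor, so here C_p comes out smaller than C_s.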

Journal ArticleDOI
TL;DR: In this paper, a fast and accurate method to estimate the fundamental frequency of an electric power system is presented, based on an algebraic method, which is able to calculate the frequency of a pure sinusoidal signal using three samples.
Abstract: The continuous monitoring of voltage characteristics in electric power systems, such as in microgrids, is required for power quality assessment, grid control, and protection purposes. Due to the presence of disturbances in the grid voltage, such as harmonics, imbalances, noise, and offsets introduced by the instrumentation, among others, the frequency-estimation process has to be robust against all these disturbances to obtain an accurate estimation of the frequency value. This paper presents a fast and accurate method to estimate the fundamental frequency of an electric power system. The estimation method works properly in balanced and imbalanced three-phase systems, and even in single-phase systems. The proposed solution is based on an algebraic method, which is able to calculate the frequency of a pure sinusoidal signal using three samples. A filtering stage is used to increase the robustness of the algorithm against disturbances in a wide frequency range. Simulation and experimental results show the good performance of the method for single- and three-phase systems with a high level of harmonic distortion, even in the presence of amplitude, phase, and frequency changes.

60 citations
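The core algebraic idea, recovering the frequency of a pure sinusoid from three samples, can be sketched from the standard identity x[n-1] + x[n+1] = 2·x[n]·cos(ωΔt). The snippet below is an illustration of that three-sample principle only; the paper's full method adds a filtering stage for robustness that is not reproduced here.

```python
import math

def estimate_frequency(x0, x1, x2, dt):
    """Estimate the frequency of a pure sinusoid from three consecutive
    samples spaced dt seconds apart, using
        x[n-1] + x[n+1] = 2 * x[n] * cos(w * dt).
    """
    if abs(x1) < 1e-12:
        raise ValueError("middle sample too close to zero")
    c = (x0 + x2) / (2.0 * x1)
    c = max(-1.0, min(1.0, c))   # clamp numerical noise into acos domain
    w = math.acos(c) / dt        # angular frequency (rad/s)
    return w / (2.0 * math.pi)   # frequency (Hz)

# Example: a 50 Hz signal sampled at 10 kHz.
fs, f = 10_000.0, 50.0
dt = 1.0 / fs
s = [math.sin(2 * math.pi * f * n * dt + 0.3) for n in (100, 101, 102)]
f_est = estimate_frequency(*s, dt)
```

On a clean sinusoid the estimate is exact up to floating-point error; with harmonics or noise the raw three-sample formula degrades, which is why a filtering stage like the paper's is needed in practice.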

Journal Article
TL;DR: In this paper, a comparison between two control strategies for Permanent Magnet Synchronous Motors (PMSM): Field Oriented Control (FOC) and Direct Torque Control (DTC) is presented.
Abstract: This paper presents a comparison between two control strategies for Permanent Magnet Synchronous Motors (PMSM): Field Oriented Control (FOC) and Direct Torque Control (DTC). These two strategies can be considered among the family of Vector Control (VC) methods and provide a solution for high-performance drives. This paper presents the implementation of both strategies in PMSM drives. Advantages and disadvantages of both methods are discussed and different simulation tests are performed to illustrate the features of both methods. The criteria followed to establish a fair comparison between the two control methods are also presented.

53 citations
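The DTC side of the comparison selects inverter voltage vectors directly from hysteresis comparators on torque and flux. A minimal sketch of the classical Takahashi-style selection rule is below, expressed with modular arithmetic over the six active vectors V1..V6 of a two-level inverter; this is the textbook scheme, not necessarily the exact table used in the paper.

```python
import math

def select_vector(flux_angle, flux_err_sign, torque_err_sign):
    """Pick the active vector index (1..6) from the stator-flux sector
    and the signs (+1/-1) of the flux and torque hysteresis comparators.
    """
    # Sector 1 is centred on 0 rad; each of the six sectors spans 60 degrees.
    sector = int(math.floor((flux_angle + math.pi / 6) / (math.pi / 3))) % 6
    if torque_err_sign > 0:
        step = 1 if flux_err_sign > 0 else 2    # advance the flux vector
    else:
        step = -1 if flux_err_sign > 0 else -2  # retard the flux vector
    return (sector + step) % 6 + 1

# Flux in sector 1 (angle near 0), both comparators asking for an
# increase: the rule selects V2, which pushes the flux forward in
# rotation while growing its magnitude, raising the torque.
```

FOC, by contrast, regulates d/q current components through PI controllers and a modulator, which is why the two methods trade off switching behaviour and dynamic response differently.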

Journal ArticleDOI
TL;DR: The numerical results show that use of the SAHJA leads to a saving in terms of computational cost without requiring any extra hardware components, and an application to the design of Permanent Magnet Synchronous Motor drives is shown.
Abstract: Purpose – This paper aims to propose a reliable local search algorithm having steepest descent pivot rule for computationally expensive optimization problems. In particular, an application to the design of Permanent Magnet Synchronous Motor (PMSM) drives is shown.Design/methodology/approach – A surrogate assisted Hooke‐Jeeves algorithm (SAHJA) is proposed. The SAHJA is a local search algorithm with the structure of the Hooke‐Jeeves algorithm, which employs a local surrogate model dynamically constructed during the exploratory move at each step of the optimization process.Findings – Several numerical experiments have been designed. These experiments are carried out both on the simulation model (off‐line) and at the actual plant (on‐line). Moreover, the off‐line experiments have been considered in non‐noisy and noisy cases. The numerical results show that use of the SAHJA leads to a saving in terms of computational cost without requiring any extra hardware components.Originality/value – The surrogate approa...

29 citations
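The SAHJA builds on the Hooke-Jeeves pattern search, whose exploratory and pattern moves it augments with a surrogate model. The sketch below shows only that underlying exploratory/pattern-move skeleton, with the surrogate omitted; function names and parameters are illustrative, not the paper's implementation.

```python
def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10_000):
    """Bare-bones Hooke-Jeeves pattern search minimising f from x0."""
    base = list(x0)
    fbase = f(base)
    for _ in range(max_iter):
        # Exploratory move: probe +/- step along each coordinate.
        trial, ftrial = list(base), fbase
        for i in range(len(trial)):
            for d in (step, -step):
                cand = list(trial)
                cand[i] += d
                fc = f(cand)
                if fc < ftrial:
                    trial, ftrial = cand, fc
                    break
        if ftrial < fbase:
            # Pattern move: extrapolate along the successful direction.
            pattern = [2 * t - b for t, b in zip(trial, base)]
            fp = f(pattern)
            if fp < ftrial:
                base, fbase = pattern, fp
            else:
                base, fbase = trial, ftrial
        else:
            step *= shrink  # no improvement: refine the mesh
            if step < tol:
                break
    return base, fbase

# Example: minimise a quadratic with its minimum at (3, -1).
quadratic = lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2
x_best, f_best = hooke_jeeves(quadratic, [0.0, 0.0])
```

In the SAHJA, the point of the surrogate is to answer most of these exploratory probes from a cheap local model instead of the expensive simulation, which is where the reported computational saving comes from.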

21 Dec 2005
TL;DR: The new controller is shown to reduce the ripple in the torque and flux responses; lower current distortion and a lower switching frequency of the semiconductor devices are also obtained with the new system.
Abstract: This paper presents a novel controller based on Direct Torque Control (DTC) strategy. This controller is designed to be applied in the control of Induction Motors (IM) fed with a three-level Voltage Source Inverter (VSI). This type of inverter has several advantages over the standard two-level VSI, such as a greater number of levels in the output voltage waveforms, lower dV/dt, less harmonic distortion in voltage and current waveforms and lower switching frequencies. In the new controller, torque and stator flux errors are used together with the stator flux angular frequency to generate a reference voltage vector. Experimental results of the novel system are presented and compared with those obtained for Classical DTC system employing a two-level VSI. The new controller is shown to reduce the ripple in the torque and flux responses. Lower current distortion and switching frequency of the semiconductor devices are also obtained in the new system presented.

24 citations
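The abstract credits the three-level VSI with a greater number of output voltage levels than the standard two-level inverter. One quick way to see this is to enumerate all 3^3 = 27 switching states (each phase clamped to -Vdc/2, 0, or +Vdc/2) and project them into the alpha-beta plane with the Clarke transform; the 27 states collapse onto 19 distinct space vectors. The sketch below is a generic illustration, not code from the paper.

```python
import cmath
import math

Vdc = 1.0
levels = (-Vdc / 2, 0.0, Vdc / 2)
a = cmath.exp(2j * math.pi / 3)  # 120-degree rotation operator

# Clarke-transform every switching state and collect the distinct
# space vectors (rounding merges numerically equal redundant states).
vectors = set()
for va in levels:
    for vb in levels:
        for vc in levels:
            v = (2.0 / 3.0) * (va + a * vb + a * a * vc)
            vectors.add((round(v.real, 6), round(v.imag, 6)))

n_distinct = len(vectors)  # 19 distinct vectors from 27 states
```

The redundant states (several switch combinations producing the same vector) are what give three-level modulation its extra freedom, e.g. for balancing the DC-link capacitor voltages.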


Cited by
01 Nov 2000
TL;DR: In this paper, the authors reviewed ultracapacitor science and technology and compared the power density characteristics of ultracapacitors and batteries at the same charge/discharge efficiency; projections for carbon-based ultracapacitors indicate energy densities of 10 Wh/kg or higher with power densities of 1–2 kW/kg.
Abstract: The science and technology of ultracapacitors are reviewed for a number of electrode materials, including carbon, mixed metal oxides, and conducting polymers. More work has been done using microporous carbons than with the other materials, and most of the commercially available devices use carbon electrodes and an organic electrolyte. The energy density of these devices is 3–5 Wh/kg with a power density of 300–500 W/kg for high-efficiency (90–95%) charge/discharges. Projections of future developments using carbon indicate that energy densities of 10 Wh/kg or higher are likely, with power densities of 1–2 kW/kg. A key problem in the fabrication of these advanced devices is the bonding of the thin electrodes to a current collector such that the contact resistance is less than 0.1 Ω cm². Special attention is given in the paper to comparing the power density characteristics of ultracapacitors and batteries. The comparisons should be made at the same charge/discharge efficiency.

2,437 citations
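The few-Wh/kg specific-energy figure quoted for carbon devices can be sanity-checked from the capacitor energy formula E = ½CV². The cell parameters below (3000 F, 2.7 V, 0.5 kg) are assumed values typical of commercial carbon ultracapacitor cells, not data from the paper.

```python
C_cell = 3000.0   # F, assumed cell capacitance
V_rated = 2.7     # V, assumed rated voltage
mass_kg = 0.5     # kg, assumed cell mass

energy_J = 0.5 * C_cell * V_rated ** 2   # stored energy, joules
energy_Wh = energy_J / 3600.0            # converted to watt-hours
specific_energy = energy_Wh / mass_kg    # Wh/kg

# The result lands in the same few-Wh/kg range the review reports for
# carbon-electrode devices; usable energy is lower still, since
# discharging only down to V/2 delivers 75% of the stored energy.
```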

01 Sep 2010

2,148 citations

Book
01 Nov 2002
TL;DR: The book advocates driving development with automated tests, a style of development called "Test-Driven Development" (TDD for short), which aims to dramatically reduce the defect density of code and make the subject of work crystal clear to all involved.
Abstract: From the Book: "Clean code that works" is Ron Jeffries' pithy phrase. The goal is clean code that works, and for a whole bunch of reasons: clean code that works is a predictable way to develop, because you know when you are finished without having to worry about a long bug trail; clean code that works gives you a chance to learn all the lessons that the code has to teach you, since if you only ever slap together the first thing you think of, you never have time to think of a second, better, thing; clean code that works improves the lives of the users of our software; clean code that works lets your teammates count on you, and you on them; and writing clean code that works feels good. But how do you get to clean code that works? Many forces drive you away from clean code, and even code that works. Without taking too much counsel of our fears, here's what we do: drive development with automated tests, a style of development called "Test-Driven Development" (TDD for short). In Test-Driven Development, you: write new code only if you first have a failing automated test, and eliminate duplication. Two simple rules, but they generate complex individual and group behavior. Some of the technical implications are: you must design organically, with running code providing feedback between decisions; you must write your own tests, since you can't wait twenty times a day for someone else to write a test; your development environment must provide rapid response to small changes; and your designs must consist of many highly cohesive, loosely coupled components, just to make testing easy. The two rules imply an order to the tasks of programming: 1. Red: write a little test that doesn't work, perhaps doesn't even compile at first. 2. Green: make the test work quickly, committing whatever sins necessary in the process. 3. Refactor: eliminate all the duplication created in just getting the test to work. Red/green/refactor: the TDD mantra.
Assuming for the moment that such a style is possible, it might be possible to dramatically reduce the defect density of code and make the subject of work crystal clear to all involved. If so, writing only code demanded by failing tests also has social implications: if the defect density can be reduced enough, QA can shift from reactive to proactive work; if the number of nasty surprises can be reduced enough, project managers can estimate accurately enough to involve real customers in daily development; if the topics of technical conversations can be made clear enough, programmers can work in minute-by-minute collaboration instead of daily or weekly collaboration; and, again, if the defect density can be reduced enough, we can have shippable software with new functionality every day, leading to new business relationships with customers. So, the concept is simple, but what's my motivation? Why would a programmer take on the additional work of writing automated tests? Why would a programmer work in tiny little steps when their mind is capable of great soaring swoops of design? Courage. Test-driven development is a way of managing fear during programming. I don't mean fear in a bad way, pow widdle prwogwammew needs a pacifiew, but fear in the legitimate, this-is-a-hard-problem-and-I-can't-see-the-end-from-the-beginning sense. If pain is nature's way of saying "Stop!", fear is nature's way of saying "Be careful." Being careful is good, but fear has a host of other effects: it makes you tentative, makes you want to communicate less, makes you shy from feedback, and makes you grumpy. None of these effects are helpful when programming, especially when programming something hard. So, how can you face a difficult situation and: instead of being tentative, begin learning concretely as quickly as possible; instead of clamming up, communicate more clearly; and instead of avoiding feedback, search out helpful, concrete feedback? (You'll have to work on grumpiness on your own.)
Imagine programming as turning a crank to pull a bucket of water from a well. When the bucket is small, a free-spinning crank is fine. When the bucket is big and full of water, you’re going to get tired before the bucket is all the way up. You need a ratchet mechanism to enable you to rest between bouts of cranking. The heavier the bucket, the closer the teeth need to be on the ratchet. The tests in test-driven development are the teeth of the ratchet. Once you get one test working, you know it is working, now and forever. You are one step closer to having everything working than you were when the test was broken. Now get the next one working, and the next, and the next. By analogy, the tougher the programming problem, the less ground should be covered by each test. Readers of Extreme Programming Explained will notice a difference in tone between XP and TDD. TDD isn’t an absolute like Extreme Programming. XP says, “Here are things you must be able to do to be prepared to evolve further.” TDD is a little fuzzier. TDD is an awareness of the gap between decision and feedback during programming, and techniques to control that gap. “What if I do a paper design for a week, then test-drive the code? Is that TDD?” Sure, it’s TDD. You were aware of the gap between decision and feedback and you controlled the gap deliberately. That said, most people who learn TDD find their programming practice changed for good. “Test Infected” is the phrase Erich Gamma coined to describe this shift. You might find yourself writing more tests earlier, and working in smaller steps than you ever dreamed would be sensible. On the other hand, some programmers learn TDD and go back to their earlier practices, reserving TDD for special occasions when ordinary programming isn’t making progress. There are certainly programming tasks that can’t be driven solely by tests (or at least, not yet). 
Security software and concurrency, for example, are two topics where TDD is not sufficient to mechanically demonstrate that the goals of the software have been met. Security relies on essentially defect-free code, true, but also on human judgement about the methods used to secure the software. Subtle concurrency problems can't be reliably duplicated by running the code. Once you are finished reading this book, you should be ready to: start simply, write automated tests, and refactor to add design decisions one at a time. This book is organized into three sections. First, an example of writing typical model code using TDD. The example is one I got from Ward Cunningham years ago, and have used many times since: multi-currency arithmetic. In it you will learn to write tests before code and grow a design organically. Second, an example of testing more complicated logic, including reflection and exceptions, by developing a framework for automated testing. This example also serves to introduce you to the xUnit architecture that is at the heart of many programmer-oriented testing tools. In the second example you will learn to work in even smaller steps than in the first example, including the kind of self-referential hooha beloved of computer scientists. Third, patterns for TDD. Included are patterns for deciding what tests to write, how to write tests using xUnit, and a greatest-hits selection of the design patterns and refactorings used in the examples. I wrote the examples imagining a pair programming session. If you like looking at the map before wandering around, you may want to go straight to the patterns in Section 3 and use the examples as illustrations. If you prefer just wandering around and then looking at the map to see where you've been, try reading the examples through and referring to the patterns when you want more detail about a technique, then using the patterns as a reference.
Several reviewers have commented they got the most out of the examples when they started up a programming environment and entered the code and ran the tests as they read. A note about the examples. Both examples, multi-currency calculation and a testing framework, appear simple. There are (and I have seen) complicated, ugly, messy ways of solving the same problems. I could have chosen one of those complicated, ugly, messy solutions to give the book an air of “reality.” However, my goal, and I hope your goal, is to write clean code that works. Before teeing off on the examples as being too simple, spend 15 seconds imagining a programming world in which all code was this clear and direct, where there were no complicated solutions, only apparently complicated problems begging for careful thought. TDD is a practice that can help you lead yourself to exactly that careful thought.

1,864 citations
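The red/green/refactor loop described in the foreword can be shown in miniature with the book's multi-currency example. The class and method names below loosely follow that example but are an illustration, not the book's exact code.

```python
# RED: write a little test first. With no Money class yet, the assert
# at the bottom cannot even run: a failing (red) test.
# GREEN: write just enough code to make the test pass.
class Money:
    def __init__(self, amount, currency):
        self.amount = amount
        self.currency = currency

    def times(self, multiplier):
        """Return a new Money scaled by multiplier (money is immutable)."""
        return Money(self.amount * multiplier, self.currency)

    def __eq__(self, other):
        return (self.amount, self.currency) == (other.amount, other.currency)

# The test, now passing (green):
five = Money(5, "USD")
assert five.times(2) == Money(10, "USD")
# REFACTOR: with the bar green, remove duplication and clean up the
# design, re-running the test after every small step.
```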

Journal ArticleDOI
TL;DR: Several classes of optimization problems, such as discrete, continuous, constrained, multi-objective and characterized by uncertainties, are addressed by indicating the memetic “recipes” proposed in the literature.
Abstract: Memetic computing is a subject in computer science which considers complex structures such as the combination of simple agents and memes, whose evolutionary interactions lead to intelligent complexes capable of problem-solving. The founding cornerstone of this subject has been the concept of memetic algorithms, that is a class of optimization algorithms whose structure is characterized by an evolutionary framework and a list of local search components. This article presents a broad literature review on this subject focused on optimization problems. Several classes of optimization problems, such as discrete, continuous, constrained, multi-objective and characterized by uncertainties, are addressed by indicating the memetic “recipes” proposed in the literature. In addition, this article focuses on implementation aspects and especially the coordination of memes which is the most important and characterizing aspect of a memetic structure. Finally, some considerations about future trends in the subject are given.

522 citations
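The defining structure the review describes, an evolutionary framework whose offspring are refined by a local-search "meme", can be sketched in a few lines. The problem (a 2-D sphere function), the greedy coordinate-descent meme, and all parameters below are illustrative choices, not a recipe from the article.

```python
import random

def sphere(x):
    """Toy objective: sum of squares, minimised at the origin."""
    return sum(v * v for v in x)

def local_search(x, f, step=0.1, iters=20):
    """Greedy coordinate descent: the 'meme' applied to each offspring."""
    x = list(x)
    for _ in range(iters):
        for i in range(len(x)):
            for d in (step, -step):
                cand = list(x)
                cand[i] += d
                if f(cand) < f(x):
                    x = cand
        step *= 0.7
    return x

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
for _ in range(30):
    pop.sort(key=sphere)          # selection: keep the fittest half
    parents = pop[:5]
    children = []
    for _ in range(5):
        a, b = random.sample(parents, 2)
        # Crossover (midpoint) plus a small Gaussian mutation...
        child = [(ai + bi) / 2 + random.gauss(0, 0.1) for ai, bi in zip(a, b)]
        # ...then the memetic step: refine each offspring locally.
        children.append(local_search(child, sphere))
    pop = parents + children

best = min(pop, key=sphere)
```

The coordination question the article highlights shows up even here: how strong the meme is (its step size and iteration budget) against how much exploration the evolutionary layer keeps.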

Journal ArticleDOI
TL;DR: In this article, the authors present a comprehensive literature review of the latest research and development in the field of the microgrid as a promising power system, giving a broad overview of worldwide research trends on this currently significant topic.
Abstract: The integration of distributed energy resources to form microgrids will be highly significant in the near future. This paper presents the latest research and development in the field of the microgrid as a promising power system through a comprehensive literature review, giving a broad overview of the worldwide research trend on microgrids, a most significant topic at present. The survey reveals that the integration of distributed energy resources, operation, control, power quality issues, and the stability of microgrid systems should be explored further to implement microgrids successfully in real power scenarios.

431 citations