Author

James C. Smith

Other affiliations: Harvard University, Vanderbilt University, Wellcome Trust
Bio: James C. Smith is an academic researcher at the Francis Crick Institute. The author has contributed to research in the topics of Mesoderm and Xenopus, has an h-index of 93, and has co-authored 436 publications receiving 37,251 citations. Previous affiliations of James C. Smith include Harvard University and Vanderbilt University.


Papers
01 Jan 2015
TL;DR: In the second edition, the authors have reorganized the material to focus on problems, how to represent them, and then how to choose and design algorithms for different representations.
Abstract: The overall structure of this new edition is three-tier: Part I presents the basics, Part II is concerned with methodological issues, and Part III discusses advanced topics. In the second edition the authors have reorganized the material to focus on problems, how to represent them, and then how to choose and design algorithms for different representations. They also added a chapter on problems, reflecting the overall book focus on problem-solvers, a chapter on parameter tuning, which they combined with the parameter control and "how-to" chapters into a methodological part, and finally a chapter on evolutionary robotics with an outlook on possible exciting developments in this field. The book is suitable for undergraduate and graduate courses in artificial intelligence and computational intelligence, and for self-study by practitioners and researchers engaged with all aspects of bioinspired design and optimization.

4,461 citations
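The book's organizing idea, that the chosen problem representation dictates which variation operators and algorithm design make sense, can be sketched in a few lines of code. The following Python fragment is a hypothetical illustration, not code from the book: a generic (1+1) evolutionary loop into which either a bit-string representation with bit-flip mutation or a permutation representation with swap mutation can be plugged.

```python
import random

def evolve(init, mutate, fitness, generations=200):
    """Generic (1+1) evolutionary loop: keep the mutant only if it is no worse."""
    parent = init()
    best = fitness(parent)
    for _ in range(generations):
        child = mutate(parent)
        f = fitness(child)
        if f >= best:
            parent, best = child, f
    return parent, best

# Representation 1: bit strings (e.g. OneMax), varied by bit-flip mutation.
def init_bits(n=20):
    return [random.randint(0, 1) for _ in range(n)]

def flip(bits, p=0.05):
    return [b ^ 1 if random.random() < p else b for b in bits]

# Representation 2: permutations (e.g. an ordering problem), varied by swap mutation.
def init_perm(n=10):
    perm = list(range(n))
    random.shuffle(perm)
    return perm

def swap(perm):
    a, b = random.sample(range(len(perm)), 2)
    child = perm[:]
    child[a], child[b] = child[b], child[a]
    return child

# Maximise the number of 1s in a bit string.
print(evolve(init_bits, flip, fitness=sum))
# Sort a permutation by minimising total displacement from position.
print(evolve(init_perm, swap, fitness=lambda p: -sum(abs(p[i] - i) for i in range(len(p)))))
```

Only the init, mutate and fitness functions change between representations; the search loop itself stays the same.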

Book
01 Jan 2003
TL;DR: The authors have reorganized the material to focus on problems, how to represent them, and then how to choose and design algorithms for different representations, and added a chapter on evolutionary robotics with an outlook on possible exciting developments in this field.
Abstract: The overall structure of this new edition is three-tier: Part I presents the basics, Part II is concerned with methodological issues, and Part III discusses advanced topics. In the second edition the authors have reorganized the material to focus on problems, how to represent them, and then how to choose and design algorithms for different representations. They also added a chapter on problems, reflecting the overall book focus on problem-solvers, a chapter on parameter tuning, which they combined with the parameter control and "how-to" chapters into a methodological part, and finally a chapter on evolutionary robotics with an outlook on possible exciting developments in this field. The book is suitable for undergraduate and graduate courses in artificial intelligence and computational intelligence, and for self-study by practitioners and researchers engaged with all aspects of bioinspired design and optimization.

3,364 citations

Book Chapter
TL;DR: A classification of different approaches based on a number of complementary features is provided, and special attention is paid to setting parameters on-the-fly, which has the potential of adjusting the algorithm to the problem while solving the problem.
Abstract: The issue of setting the values of various parameters of an evolutionary algorithm is crucial for good performance. In this paper we discuss how to do this, beginning with the issue of whether these values are best set in advance or are best changed during evolution. We provide a classification of different approaches based on a number of complementary features, and pay special attention to setting parameters on-the-fly. This has the potential of adjusting the algorithm to the problem while solving the problem. This paper is intended to present a survey rather than a set of prescriptive details for implementing an EA for a particular type of problem. For this reason we have chosen to interleave a number of examples throughout the text. Thus we hope to both clarify the points we wish to raise as we present them, and also to give the reader a feel for some of the many possibilities available for controlling different parameters. © Springer-Verlag Berlin Heidelberg 2007.

1,307 citations
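One classic instance of the on-the-fly parameter control surveyed in this chapter is step-size adaptation in a (1+1) evolution strategy. The sketch below is a simplified, hypothetical Python illustration of a per-generation variant of Rechenberg's 1/5 success rule, not code from the chapter: the mutation strength sigma is adjusted while the algorithm runs, widening the search after a successful mutation and narrowing it otherwise.

```python
import random

def one_fifth_rule_es(fitness, x0, sigma=1.0, generations=500, c=0.82):
    """(1+1) evolution strategy with on-the-fly step-size control.

    The mutation strength sigma is adapted during the run: it grows when the
    mutant improves on the parent (success) and shrinks otherwise, steering
    the observed success rate towards roughly 1/5.
    """
    x, fx = x0[:], fitness(x0)
    for _ in range(generations):
        y = [xi + random.gauss(0.0, sigma) for xi in x]
        fy = fitness(y)
        if fy < fx:          # success: accept the mutant and widen the search
            x, fx = y, fy
            sigma /= c
        else:                # failure: keep the parent and narrow the search
            sigma *= c
    return x, fx, sigma

# Minimise the sphere function; sigma shrinks automatically as the optimum nears.
best, value, final_sigma = one_fifth_rule_es(lambda v: sum(vi * vi for vi in v), [5.0] * 10)
print(value, final_sigma)
```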

Journal Article
04 May 2000, Nature
TL;DR: It is shown that the zebrafish silberblick locus encodes Wnt11 and that Slb/Wnt11 activity is required for cells to undergo correct convergent extension movements during gastrulation, and that the correct extension of axial tissue is at least partly dependent on medio-lateral cell intercalation in paraxial tissue.
Abstract: Vertebrate gastrulation involves the specification and coordinated movement of large populations of cells that give rise to the ectodermal, mesodermal and endodermal germ layers. Although many of the genes involved in the specification of cell identity during this process have been identified, little is known of the genes that coordinate cell movement. Here we show that the zebrafish silberblick (slb) locus [1] encodes Wnt11 and that Slb/Wnt11 activity is required for cells to undergo correct convergent extension movements during gastrulation. In the absence of Slb/Wnt11 function, abnormal extension of axial tissue results in cyclopia and other midline defects in the head [2]. The requirement for Slb/Wnt11 is cell non-autonomous, and our results indicate that the correct extension of axial tissue is at least partly dependent on medio-lateral cell intercalation in paraxial tissue. We also show that the slb phenotype is rescued by a truncated form of Dishevelled that does not signal through the canonical Wnt pathway [3], suggesting that, as in flies [4], Wnt signalling might mediate morphogenetic events through a divergent signal transduction cascade. Our results provide genetic and experimental evidence that Wnt activity in lateral tissues has a crucial role in driving the convergent extension movements underlying vertebrate gastrulation.

1,006 citations

Journal Article
04 Oct 1991, Cell
TL;DR: This paper describes the cloning and expression of a Xenopus homolog of Brachyury, Xbra, and shows that expression of Xbra occurs as a result of mesoderm induction in Xenopus, both in response to the natural signal and in response to the mesoderm-inducing factors activin A and basic FGF.

983 citations


Cited by
Journal Article
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).

13,246 citations
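The fourth category above, per-user customization such as a learned mail filter, is easy to make concrete. The snippet below is a hypothetical, minimal Python sketch rather than anything from the paper: a mistake-driven perceptron over a bag of words that learns which messages a particular user rejects; the training data, function names and tokenization are purely illustrative.

```python
from collections import defaultdict

def tokens(text):
    return text.lower().split()

def train_filter(examples, epochs=10):
    """Learn per-word weights from (message, rejected?) pairs with a simple perceptron."""
    w = defaultdict(float)
    bias = 0.0
    for _ in range(epochs):
        for text, rejected in examples:
            score = bias + sum(w[t] for t in tokens(text))
            predicted = score > 0
            if predicted != rejected:        # mistake-driven update
                step = 1.0 if rejected else -1.0
                bias += step
                for t in tokens(text):
                    w[t] += step
    return w, bias

def is_unwanted(text, w, bias):
    return bias + sum(w[t] for t in tokens(text)) > 0

# Hypothetical training data: messages this user has kept (False) or rejected (True).
history = [
    ("cheap meds online buy now", True),
    ("meeting moved to 3pm", False),
    ("you won a free prize click here", True),
    ("lunch tomorrow?", False),
]
w, b = train_filter(history)
print(is_unwanted("free prize inside", w, b))            # True
print(is_unwanted("agenda for the 3pm meeting", w, b))   # False
```

The filter is maintained automatically: each new accept/reject decision by the user becomes another training example, so the rules are updated without a programmer rewriting them.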

Journal Article
01 Apr 1988, Nature
TL;DR: In this paper, a sedimentological core and petrographic characterisation of samples from eleven boreholes in the Lower Carboniferous of the Bowland Basin (Northwest England) is presented.
Abstract: Deposits of clastic carbonate-dominated (calciclastic) sedimentary slope systems in the rock record have been identified mostly as linearly-consistent carbonate apron deposits, even though most ancient clastic carbonate slope deposits fit submarine fan systems better. Calciclastic submarine fans are consequently rarely described and poorly understood, and very little is known about mud-dominated calciclastic submarine fan systems in particular. Presented in this study are a sedimentological core and petrographic characterisation of samples from eleven boreholes in the Lower Carboniferous of the Bowland Basin (Northwest England) that reveal a >250 m thick calciturbidite complex deposited in a calciclastic submarine fan setting. Seven facies are recognised from core and thin-section characterisation and are grouped into three carbonate turbidite sequences. They include: 1) calciturbidites, comprising mostly high- to low-density, wavy-laminated, bioclast-rich facies; 2) low-density densite mudstones, characterised by planar-laminated and unlaminated mud-dominated facies; and 3) calcidebrites, which are muddy or hyper-concentrated debris-flow deposits occurring as poorly-sorted, chaotic, mud-supported floatstones. These

9,929 citations

Journal Article
TL;DR: The transforming growth factor beta (TGF-beta) family of growth factors controls the development and homeostasis of most tissues in metazoan organisms, and mutations in TGF-beta signalling pathways are the cause of various forms of human cancer and developmental disorders.
Abstract: The transforming growth factor beta (TGF-beta) family of growth factors control the development and homeostasis of most tissues in metazoan organisms. Work over the past few years has led to the elucidation of a TGF-beta signal transduction network. This network involves receptor serine/threonine kinases at the cell surface and their substrates, the SMAD proteins, which move into the nucleus, where they activate target gene transcription in association with DNA-binding partners. Distinct repertoires of receptors, SMAD proteins, and DNA-binding partners seemingly underlie, in a cell-specific manner, the multifunctional nature of TGF-beta and related factors. Mutations in these pathways are the cause of various forms of human cancer and developmental disorders.

7,710 citations

Journal Article
TL;DR: The Artificial Bee Colony (ABC) algorithm is an optimization algorithm based on the intelligent behaviour of a honey bee swarm; applied here to multivariable function optimization, it outperformed the other algorithms in the comparison.
Abstract: Swarm intelligence is a research branch that models populations of interacting agents or swarms that are able to self-organize. An ant colony, a flock of birds or an immune system is a typical example of a swarm system. Bees swarming around their hive are another example of swarm intelligence. The Artificial Bee Colony (ABC) algorithm is an optimization algorithm based on the intelligent behaviour of a honey bee swarm. In this work, the ABC algorithm is used to optimize multivariable functions, and the results produced by ABC, the Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and the Particle Swarm Inspired Evolutionary Algorithm (PS-EA) are compared. The results show that ABC outperforms the other algorithms.

6,377 citations
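The employed-bee / onlooker-bee / scout cycle that ABC is built around can be summarized in a short sketch. The Python code below is a hypothetical, simplified illustration rather than the authors' implementation; the population size, abandonment limit and test function are arbitrary choices made for the example.

```python
import random

def abc_minimise(f, dim, bounds, n_food=20, limit=30, cycles=500):
    """Artificial Bee Colony sketch: employed bees, onlooker bees and scouts."""
    lo, hi = bounds
    foods = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    fits = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbour(i):
        # Perturb one dimension of source i relative to a random partner k.
        k = random.choice([j for j in range(n_food) if j != i])
        d = random.randrange(dim)
        x = foods[i][:]
        x[d] += random.uniform(-1, 1) * (foods[i][d] - foods[k][d])
        x[d] = min(max(x[d], lo), hi)
        return x

    def try_improve(i):
        x = neighbour(i)
        fx = f(x)
        if fx < fits[i]:
            foods[i], fits[i], trials[i] = x, fx, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(n_food):                          # employed bee phase
            try_improve(i)
        weights = [1.0 / (1.0 + ft) for ft in fits]      # onlookers prefer better sources
        for _ in range(n_food):                          # onlooker bee phase
            i = random.choices(range(n_food), weights=weights)[0]
            try_improve(i)
        for i in range(n_food):                          # scout phase: abandon exhausted sources
            if trials[i] > limit:
                foods[i] = [random.uniform(lo, hi) for _ in range(dim)]
                fits[i], trials[i] = f(foods[i]), 0

    best = min(range(n_food), key=lambda i: fits[i])
    return foods[best], fits[best]

# Example: minimise the Rosenbrock function in 5 dimensions.
def rosenbrock(v):
    return sum(100 * (v[i + 1] - v[i] ** 2) ** 2 + (1 - v[i]) ** 2 for i in range(len(v) - 1))

print(abc_minimise(rosenbrock, dim=5, bounds=(-5.0, 5.0)))
```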