
Showing papers by "Tel Aviv University" published in 2007


Journal ArticleDOI
TL;DR: The arsenal of nanocarriers and molecules available for selective tumour targeting is detailed, and the challenges in cancer treatment are emphasized.
Abstract: Nanotechnology has the potential to revolutionize cancer diagnosis and therapy. Advances in protein engineering and materials science have contributed to novel nanoscale targeting approaches that may bring new hope to cancer patients. Several therapeutic nanocarriers have been approved for clinical use. However, to date, there are only a few clinically approved nanocarriers that incorporate molecules to selectively bind and target cancer cells. This review examines some of the approved formulations and discusses the challenges in translating basic research to the clinic. We detail the arsenal of nanocarriers and molecules available for selective tumour targeting, and emphasize the challenges in cancer treatment.

7,443 citations


Journal ArticleDOI
TL;DR: The results show that the bias is reliably demonstrated with different experimental paradigms and under a variety of experimental conditions, but that its overall effect size is moderate (d = 0.45).
Abstract: This meta-analysis of 172 studies (N = 2,263 anxious, N = 1,768 nonanxious) examined the boundary conditions of threat-related attentional biases in anxiety. Overall, the results show that the bias is reliably demonstrated with different experimental paradigms and under a variety of experimental conditions, but that its overall effect size is moderate (d = 0.45). Although processes requiring conscious perception of threat contribute to the bias, a significant bias is also observed with stimuli outside awareness. The bias is of comparable magnitude across different types of anxious populations (individuals with different clinical disorders, high-anxious nonclinical individuals, anxious children and adults) and is not observed in nonanxious individuals. Empirical and clinical implications as well as future directions for research are discussed.

3,262 citations


Journal ArticleDOI
25 May 2007-Science
TL;DR: A large-scale proteomic analysis of proteins phosphorylated in response to DNA damage on consensus sites recognized by ATM and ATR is performed and more than 900 regulated phosphorylation sites encompassing over 700 proteins are identified.
Abstract: Cellular responses to DNA damage are mediated by a number of protein kinases, including ATM (ataxia telangiectasia mutated) and ATR (ATM and Rad3-related). The outlines of the signal transduction portion of this pathway are known, but little is known about the physiological scope of the DNA damage response (DDR). We performed a large-scale proteomic analysis of proteins phosphorylated in response to DNA damage on consensus sites recognized by ATM and ATR and identified more than 900 regulated phosphorylation sites encompassing over 700 proteins. Functional analysis of a subset of this data set indicated that this list is highly enriched for proteins involved in the DDR. This set of proteins is highly interconnected, and we identified a large number of protein modules and networks not previously linked to the DDR. This database paints a much broader landscape for the DDR than was previously appreciated and opens new avenues of investigation into the responses to DNA damage in mammals.

2,967 citations


Journal ArticleDOI
TL;DR: Clinical diagnostic criteria for probable and possible PD-D are proposed; PD-D is characterized by impairment in attention, memory, executive and visuo-spatial functions, and behavioral symptoms such as affective changes, hallucinations, and apathy are frequent.
Abstract: Dementia has been increasingly recognized to be a common feature in patients with Parkinson's disease (PD), especially in old age. Specific criteria for the clinical diagnosis of dementia associated with PD (PD-D), however, have been lacking. A Task Force, organized by the Movement Disorder Society, was charged with the development of clinical diagnostic criteria for PD-D. The Task Force members were assigned to sub-committees and performed a systematic review of the literature, based on pre-defined selection criteria, in order to identify the epidemiological, clinical, ancillary, and pathological features of PD-D. Clinical diagnostic criteria were then developed based on these findings and group consensus. The incidence of dementia in PD is increased up to six times, point prevalence is close to 30%, and older age and the akinetic-rigid form are associated with higher risk. PD-D is characterized by impairment in attention, memory, executive and visuo-spatial functions; behavioral symptoms such as affective changes, hallucinations, and apathy are frequent. There are no specific ancillary investigations for the diagnosis; the main pathological correlate is Lewy body-type degeneration in cerebral cortex and limbic structures. Based on the characteristic features associated with this condition, clinical diagnostic criteria for probable and possible PD-D are proposed.

2,454 citations


Journal ArticleDOI
25 May 2007-Science
TL;DR: This paper showed that there is a broad consensus among climate models that this region will dry in the 21st century and that the transition to a more arid climate should already be under way.
Abstract: How anthropogenic climate change will affect hydroclimate in the arid regions of southwestern North America has implications for the allocation of water resources and the course of regional development. Here we show that there is a broad consensus among climate models that this region will dry in the 21st century and that the transition to a more arid climate should already be under way. If these models are correct, the levels of aridity of the recent multiyear drought or the Dust Bowl and the 1950s droughts will become the new climatology of the American Southwest within a time frame of years to decades.

1,912 citations


Journal ArticleDOI
TL;DR: The Fifth International Workshop-Conference on Gestational Diabetes Mellitus (GDM) was held in Chicago, IL, 11-13 November 2005 under the sponsorship of the American Diabetes Association as mentioned in this paper.
Abstract: The Fifth International Workshop-Conference on Gestational Diabetes Mellitus (GDM) was held in Chicago, IL, 11–13 November 2005 under the sponsorship of the American Diabetes Association. The meeting provided a forum for review of new information concerning GDM in the areas of pathophysiology, epidemiology, perinatal outcome, long-range implications for mother and her offspring, and management strategies. New information and recommendations related to each of these major topics are summarized in the report that follows. The issues regarding strategies and criteria for the detection and diagnosis of GDM were not reviewed or discussed in detail, since it is anticipated that the Hyperglycemia and Adverse Pregnancy Outcome (HAPO) study will provide data in mid-2007 that will foster the development of criteria for the diagnosis of GDM that are based on perinatal outcomes. Thus, for the interim, the participants of the Fifth International Workshop-Conference on GDM endorsed a motion to continue use of the definition, classification criteria, and strategies for detection and diagnosis of GDM that were recommended at the Fourth Workshop-Conference. Those guidelines are reproduced (with minor modifications) in this article in appendix Tables 1 and 2. The invited lectures, topical discussions, and posters presented at the conference and the invited manuscripts that appear in this issue of Diabetes Care served as the basis for the following summary and recommendations. ### Pathophysiology #### General considerations. Current diagnostic criteria assign the diagnosis of GDM to women with glucose levels in the upper ∼5–10% of the population distribution. The hyperglycemia varies in severity from glucose concentrations that would be diagnostic of diabetes outside of pregnancy to concentrations that are asymptomatic and only slightly above normal, but associated with some increased risk of fetal morbidity. Like all forms of hyperglycemia, GDM is characterized by insulin levels that are insufficient to meet insulin demands. The causes of pancreatic β-cell dysfunction that …

1,619 citations


Journal ArticleDOI
TL;DR: The relaxation hypothesis is confirmed through an ab initio numerical investigation of the dynamics of hard-core bosons on a one-dimensional lattice, and a natural extension of the Gibbs ensemble to integrable systems results in a theory that is able to predict the mean values of physical observables after relaxation.
Abstract: In this Letter we pose the question of whether a many-body quantum system with a full set of conserved quantities can relax to an equilibrium state, and, if it can, what the properties of such a state are. We confirm the relaxation hypothesis through an ab initio numerical investigation of the dynamics of hard-core bosons on a one-dimensional lattice. Further, a natural extension of the Gibbs ensemble to integrable systems results in a theory that is able to predict the mean values of physical observables after relaxation. Finally, we show that our generalized equilibrium carries more memory of the initial conditions than the usual thermodynamic one. This effect may have many experimental consequences, some of which have already been observed in the recent experiment on the nonequilibrium dynamics of one-dimensional hard-core bosons in a harmonic potential [T. Kinoshita et al., Nature (London) 440, 900 (2006)].

1,390 citations
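The "natural extension of the Gibbs ensemble" referred to above is the generalized Gibbs ensemble (GGE). In the notation standard for this construction, with conserved charges {Î_m}, the post-relaxation state is described by

```latex
\hat{\rho}_{\mathrm{GGE}}
  = \frac{1}{Z}\,\exp\!\Big(-\sum_{m}\lambda_m \hat{I}_m\Big),
\qquad
Z = \operatorname{Tr}\,\exp\!\Big(-\sum_{m}\lambda_m \hat{I}_m\Big),
```

where each Lagrange multiplier λ_m is fixed by matching the conserved expectation value in the initial state, ⟨Î_m⟩ at t = 0. Because there is one constraint per conserved quantity, the ensemble retains more memory of the initial conditions than the usual thermodynamic one, as the abstract notes.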


Posted Content
TL;DR: In this paper, the authors developed a theory of demand for insurance that emphasizes the interaction between market insurance, self-insurance, and self-protection, and analyzed the effects of changes in prices, income, and other variables on the demand for these alternative forms of insurance.
Abstract: The article develops a theory of demand for insurance that emphasizes the interaction between market insurance, self-insurance, and self-protection. The effects of changes in prices, income, and other variables on the demand for these alternative forms of insurance are analyzed using the state preference approach to behavior under uncertainty. Market insurance and self-insurance are shown to be substitutes, but market insurance and self-protection can be complements. The analysis challenges the notion that moral hazard is an inevitable consequence of market insurance, by showing that under certain conditions the latter may lead to a reduction in the probabilities of hazardous events.

1,370 citations
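A minimal sketch of the state-preference setup this abstract describes (the notation is ours, not the article's): with initial wealth W, loss probability p(x) decreasing in self-protection spending x, loss size L(s) decreasing in self-insurance spending s, and market coverage I purchased at premium rate π, the consumer solves

```latex
\max_{I,\,s,\,x}\;
\bigl(1 - p(x)\bigr)\,u\!\bigl(W - \pi I - s - x\bigr)
\;+\;
p(x)\,u\!\bigl(W - \pi I - s - x - L(s) + I\bigr).
```

Market insurance and self-insurance both shrink the net loss in the bad state, which is why they substitute for one another, while self-protection operates on the probability p(x) common to both states, which is why it can complement market insurance.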


Journal ArticleDOI
TL;DR: The coral probiotic hypothesis proposes a dynamic relationship between symbiotic microorganisms and corals that selects for the coral holobiont best suited to the prevailing environmental conditions; its generalization leads to the hologenome theory of evolution.
Abstract: Coral microbiology is an emerging field, driven largely by a desire to understand, and ultimately prevent, the worldwide destruction of coral reefs. The mucus layer, skeleton and tissues of healthy corals all contain large populations of eukaryotic algae, bacteria and archaea. These microorganisms confer benefits to their host by various mechanisms, including photosynthesis, nitrogen fixation, the provision of nutrients and infection prevention. Conversely, in conditions of environmental stress, certain microorganisms cause coral bleaching and other diseases. Recent research indicates that corals can develop resistance to specific pathogens and adapt to higher environmental temperatures. To explain these findings the coral probiotic hypothesis proposes the occurrence of a dynamic relationship between symbiotic microorganisms and corals that selects for the coral holobiont that is best suited for the prevailing environmental conditions. Generalization of the coral probiotic hypothesis has led us to propose the hologenome theory of evolution.

1,261 citations


Journal ArticleDOI
TL;DR: Research has shown that different dimensions of psychological distance affect mental construal and that these construals, in turn, guide prediction, evaluation, and behavior.

1,249 citations


Book
01 Jan 2007
TL;DR: The current computational approaches for the functional annotation of proteins are described, including direct methods, which propagate functional information through the network, and module-assisted methods, which infer functional modules within the network and use those for the annotation task.
Abstract: Functional annotation of proteins is a fundamental problem in the post-genomic era. The recent availability of protein interaction networks for many model species has spurred on the development of computational methods for interpreting such data in order to elucidate protein function. In this review, we describe the current computational approaches for the task, including direct methods, which propagate functional information through the network, and module-assisted methods, which infer functional modules within the network and use those for the annotation task. Although a broad variety of interesting approaches has been developed, further progress in the field will depend on systematic evaluation of the methods and their dissemination in the biological community.
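As a concrete illustration of the "direct methods" category (a generic neighbor-voting sketch, not any specific algorithm from the review; the graph and annotations below are hypothetical):

```python
# Direct network-based annotation: iteratively propagate function labels
# from annotated proteins to their interaction partners by majority vote.
import networkx as nx
from collections import Counter

G = nx.Graph([("P1", "P2"), ("P2", "P3"), ("P3", "P4"), ("P1", "P4")])
labels = {"P1": "kinase", "P4": "kinase"}  # hypothetical known annotations

for _ in range(5):  # a few propagation rounds suffice on small graphs
    updates = {}
    for node in G:
        if node in labels:
            continue
        votes = Counter(labels[nbr] for nbr in G[node] if nbr in labels)
        if votes:
            updates[node] = votes.most_common(1)[0][0]  # majority label
    if not updates:
        break
    labels.update(updates)

print(labels)  # P2 and P3 inherit "kinase" from their annotated neighbors
```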

Journal ArticleDOI
TL;DR: The extension of reinforcement learning models to free-operant tasks unites psychologically and computationally inspired ideas about the role of tonic dopamine in striatum, explaining from a normative point of view why higher levels of dopamine might be associated with more vigorous responding.
Abstract: Rationale: Dopamine neurotransmission has long been known to exert a powerful influence over the vigor, strength, or rate of responding. However, there exists no clear understanding of the computational foundation for this effect; predominant accounts of dopamine's computational function focus on a role for phasic dopamine in controlling the discrete selection between different actions and have nothing to say about response vigor or indeed the free-operant tasks in which it is typically measured. Objectives: We seek to accommodate free-operant behavioral tasks within the realm of models of optimal control and thereby capture how dopaminergic and motivational manipulations affect response vigor. Methods: We construct an average reward reinforcement learning model in which subjects choose both which action to perform and also the latency with which to perform it. Optimal control balances the costs of acting quickly against the benefits of getting reward earlier and thereby chooses a best response latency. Results: In this framework, the long-run average rate of reward plays a key role as an opportunity cost and mediates motivational influences on rates and vigor of responding. We review evidence suggesting that the average reward rate is reported by tonic levels of dopamine, putatively in the nucleus accumbens. Conclusions: Our extension of reinforcement learning models to free-operant tasks unites psychologically and computationally inspired ideas about the role of tonic dopamine in striatum, explaining from a normative point of view why higher levels of dopamine might be associated with more vigorous responding.
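A toy reduction of the model's central tradeoff (our simplification for illustration, not the authors' full average-reward construction): responding with latency τ incurs a vigor cost roughly proportional to 1/τ while forfeiting average reward at rate R̄ for the duration τ, so the optimal latency falls as the average reward rate, putatively reported by tonic dopamine, rises.

```python
import numpy as np

def optimal_latency(vigor_cost: float, avg_reward_rate: float) -> float:
    """Latency minimizing vigor_cost / tau + avg_reward_rate * tau.
    Setting the derivative to zero gives tau* = sqrt(vigor_cost / avg_reward_rate)."""
    return np.sqrt(vigor_cost / avg_reward_rate)

for rbar in (0.5, 1.0, 2.0, 4.0):  # rising average reward rate ("tonic dopamine")
    print(f"avg reward rate {rbar:.1f} -> optimal latency {optimal_latency(1.0, rbar):.2f}")
# Higher opportunity cost of time -> shorter latencies, i.e. more vigorous responding.
```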

Journal ArticleDOI
TL;DR: By learning from real‐world examples, autonomous agents display complex natural behaviors that are often missing in crowd simulations.
Abstract: We present an example-based crowd simulation technique. Most crowd simulation techniques assume that the behavior exhibited by each person in the crowd can be defined by a restricted set of rules. This assumption limits the behavioral complexity of the simulated agents. By learning from real-world examples, our autonomous agents display complex natural behaviors that are often missing in crowd simulations. Examples are created from tracked video segments of real pedestrian crowds. During a simulation, autonomous agents search for examples that closely match the situation that they are facing. Trajectories taken by real people in similar situations are copied to the simulated agents, resulting in seemingly natural behaviors.
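The example lookup at the heart of this approach can be sketched with a k-d tree; the feature encoding and data below are hypothetical placeholders, not the paper's actual representation.

```python
# Example-based lookup: each tracked example encodes a pedestrian's local
# situation as a feature vector; an agent copies the trajectory of the
# closest example. All data here are hypothetical.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
example_states = rng.random((1000, 6))            # e.g. neighbor positions/velocities
example_trajectories = rng.random((1000, 10, 2))  # stored 10-step (x, y) continuations

tree = cKDTree(example_states)

agent_state = rng.random(6)             # the situation the simulated agent faces
_, idx = tree.query(agent_state)        # nearest tracked example
next_steps = example_trajectories[idx]  # copy the real pedestrian's trajectory
```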

Journal ArticleDOI
TL;DR: The main focus of this article is to operationalize the diagnosis of PD-D and to propose practical guidelines based on a two-level process depending upon the clinical scenario and the expertise of the evaluator involved in the assessment.
Abstract: A preceding article described the clinical features of Parkinson's disease dementia (PD-D) and proposed clinical diagnostic criteria for "probable" and "possible" PD-D. The main focus of this article is to operationalize the diagnosis of PD-D and to propose practical guidelines based on a two-level process depending upon the clinical scenario and the expertise of the evaluator involved in the assessment. Level I is aimed primarily at the clinician with no particular expertise in neuropsychological methods, but who requires a simple, pragmatic set of tests that are not excessively time-consuming. Level I can be used alone or in concert with Level II, which is more suitable when there is the need to specify the pattern and the severity of the dementia of PD-D for clinical monitoring, research studies or pharmacological trials. Level II tests can also be proposed when the diagnosis of PD-D remains uncertain or equivocal at the end of a Level I evaluation. Given the lack of evidence-based standards for some tests when applied in this clinical context, we have tried to make practical and unambiguous recommendations, based upon the available literature and the collective experience of the Task Force. We accept, however, that further validation of certain tests and modifications in the recommended cutoff values will be required through future studies.

Journal ArticleDOI
Ehud Gazit1
TL;DR: In this tutorial review the process and applications of peptide self-assembly into nanotubes, nanospheres, nanofibrils, nanotapes, and other ordered structures at the nano-scale are discussed.
Abstract: In this tutorial review the process and applications of peptide self-assembly into nanotubes, nanospheres, nanofibrils, nanotapes, and other ordered structures at the nano-scale are discussed. The formation of well-ordered nanostructures by a process of self-association represents the essence of modern nanotechnology. Such self-assembled structures can be formed by a variety of building blocks, both organic and inorganic. Of the organic building blocks, peptides are among the most useful ones. Peptides possess the biocompatibility and chemical diversity that are found in proteins, yet they are much more stable and robust and can be readily synthesized on a large scale. Short peptides can spontaneously associate to form nanotubes, nanospheres, nanofibrils, nanotapes, and other ordered structures at the nano-scale. Peptides can also form macroscopic assemblies such as hydrogels with nano-scale order. The application of peptide building blocks in biosensors, tissue engineering, and the development of antibacterial agents has already been demonstrated.

Journal ArticleDOI
TL;DR: In this paper, a worldwide panel of experts on the study and treatment of those exposed to disaster and mass violence was assembled to extrapolate from related fields of research and to gain consensus on intervention principles.
Abstract: Given the devastation caused by disasters and mass violence, it is critical that intervention policy be based on the most updated research findings. However, to date, no evidence-based consensus has been reached supporting a clear set of recommendations for intervention during the immediate and the mid-term post mass trauma phases. Because it is unlikely that there will be evidence in the near or mid-term future from clinical trials that cover the diversity of disaster and mass violence circumstances, we assembled a worldwide panel of experts on the study and treatment of those exposed to disaster and mass violence to extrapolate from related fields of research, and to gain consensus on intervention principles. We identified five empirically supported intervention principles that should be used to guide and inform intervention and prevention efforts at the early to mid-term stages. These are promoting: 1) a sense of safety, 2) calming, 3) a sense of self- and community efficacy, 4) connectedness, and 5) hope.

Journal ArticleDOI
TL;DR: NMSS can be used to assess the frequency and severity of NMS in PD patients across all stages in conjunction with the recently validated non‐motor questionnaire.
Abstract: Non-motor symptoms (NMS) in Parkinson's disease (PD) are common and significantly reduce quality of life, yet at present there is no validated clinical tool to assess their progress or potential response to treatment. A new 30-item scale for the assessment of NMS in PD (NMSS) was developed. NMSS contains nine dimensions: cardiovascular, sleep/fatigue, mood/cognition, perceptual problems, attention/memory, gastrointestinal, urinary, sexual function, and miscellany. The metric attributes of this instrument were analyzed. Data from 242 patients (mean age 67.2 +/- 11 years, duration of disease 6.4 +/- 6 years, 57.3% male) across all stages of PD were collected from centers in Europe, the USA, and Japan. The mean NMSS score was 56.5 +/- 40.7 (range: 0-243), and only one patient declared no NMS. The scale provided 99.2% complete data for the analysis, with the total score being free of floor and ceiling effects. Satisfactory scaling assumptions (multitrait scaling success rate >95% for all domains except miscellany) and internal consistency were reported for most of the domains (mean alpha, 0.61). Factor analysis supported the a priori nine-domain structure (63% of the variance), while a small test-retest study showed satisfactory reproducibility (ICC > 0.80) for all domains except cardiovascular (ICC = 0.45). In terms of validity, the scale showed modest association with indicators of motor symptom severity and disease progression but a high correlation with other measures of NMS (NMSQuest) and a health-related quality of life measure (PDQ-8) (both rS = 0.70). In conclusion, NMSS can be used to assess the frequency and severity of NMS in PD patients across all stages in conjunction with the recently validated non-motor questionnaire.
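For readers unfamiliar with the reported metrics: the internal consistency figure ("mean alpha, 0.61") is Cronbach's alpha. A minimal computation on hypothetical item-score data (real NMSS scores, not random ones, would be used in practice):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_subjects, n_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(1)
scores = rng.integers(0, 13, size=(242, 30)).astype(float)  # hypothetical 30-item data
print(f"alpha = {cronbach_alpha(scores):.2f}")  # random items give alpha near 0
```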

Journal ArticleDOI
TL;DR: A global meta-analysis of plant trait responses to grazing, framed by six major conceptual models, shows that grazing favours annual, short, and prostrate plants, and that climate and evolutionary history of herbivory modify these responses.
Abstract: Herbivory by domestic and wild ungulates is a major driver of global vegetation dynamics. However, grazing is not considered in dynamic global vegetation models, or more generally in studies of the effects of environmental change on ecosystems at regional to global scale. An obstacle to this is a lack of empirical tests of several hypotheses linking plant traits with grazing. We, therefore, set out to test whether some widely recognized trait responses to grazing are consistent at the global level. We conducted a meta-analysis of plant trait responses to grazing, based on 197 studies from all major regions of the world, and using six major conceptual models of trait response to grazing as a framework. Data were available for seven plant traits: life history, canopy height, habit, architecture, growth form (forb, graminoid, herbaceous legume, woody), palatability, and geographic origin. Covariates were precipitation and evolutionary history of herbivory. Overall, grazing favoured annual over perennial plants, short plants over tall plants, prostrate over erect plants, and stoloniferous and rosette architecture over tussock architecture. There was no consistent effect of grazing on growth form. Some response patterns were modified by particular combinations of precipitation and history of herbivory. Climatic and historical contexts are therefore essential for understanding plant trait responses to grazing. Our study identifies some key traits to be incorporated into plant functional classifications for the explicit consideration of grazing into global vegetation models used in global change research. Importantly, our results suggest that plant functional type classifications and response rules need to be specific to regions with different climate and herbivory history.

Book ChapterDOI
TL;DR: This volume surveys electron transfer from isolated molecules to biomolecules, including models for medium reorganization and donor-acceptor coupling and treatments of fluctuations and coherence in long-range and multicenter electron transfer.
Abstract: Contents: Electron Transfer Past and Future (R Marcus); Electron Transfer Reactions in Solution: A Historical Perspective (N Sutin); Electron Transfer - From Isolated Molecules to Biomolecules (M Bixon & J Jortner); Charge Transfer in Bichromophoric Molecules in the Gas Phase (D Levy); Long-Range Charge Separation in Solvent-Free Donor-Bridge-Acceptor Systems (B Wegewijs & J Verhoeven); Electron Transfer and Charge Separation in Clusters (C Dessent, et al); Control of Electron Transfer Kinetics: Models for Medium Reorganization and Donor-Acceptor Coupling (M Newton); Theories of Structure-Function Relationships for Bridge-Mediated Electron Transfer Reactions (S Skourtis & D Beratan); Fluctuations and Coherence in Long-Range and Multicenter Electron Transfer (G Iversen, et al); Lanczos Algorithm for Electron Transfer Rates in Solvents with Complex Spectral Densities (A Okada, et al); Spectroscopic Determination of Electron Transfer Barriers and Rate Constants (K Omberg, et al); Photoinduced Electron Transfer Within Donor-Spacer-Acceptor Molecular Assemblies Studied by Time-Resolved Microwave Conductivity (J Warman, et al); From Close Contact to Long-Range Intramolecular Electron Transfer (J Verhoeven); Photoinduced Electron Transfers Through sigma Bonds in Solution (N-C Yang, et al); Indexes.

Journal ArticleDOI
Arie Levant1
TL;DR: With a recently developed robust exact differentiator applied, robust output-feedback controllers with finite-time convergence are produced, capable of controlling any general uncertain single-input-single-output process with relative degree 2.
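For context, the first-order robust exact differentiator in question is usually written as follows (reproduced here from the standard sliding-mode literature, so treat the exact form as a sketch): given a measured signal f(t) whose derivative has a known Lipschitz bound,

```latex
\dot{z}_0 = z_1 - \lambda_0\,\lvert z_0 - f(t)\rvert^{1/2}\,\operatorname{sign}\bigl(z_0 - f(t)\bigr),
\qquad
\dot{z}_1 = -\lambda_1\,\operatorname{sign}\bigl(z_0 - f(t)\bigr),
```

with ż₀ converging to ḟ(t) in finite time for suitably chosen gains λ₀, λ₁.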

Journal ArticleDOI
TL;DR: This analysis uses information on the connectivity of the network shells to separate, in a unique (no parameters) way, the Internet into three subcomponents: a nucleus that is a small, very well connected, globally distributed subgraph; a fractal subcomponent that is able to connect the bulk of the Internet without congesting the nucleus, with self-similar properties and critical exponents predicted from percolation theory; and dendrite-like structures, usually isolated nodes connected to the rest of the network only through the nucleus.
Abstract: We study a map of the Internet (at the autonomous systems level), by introducing and using the method of k-shell decomposition and the methods of percolation theory and fractal geometry, to find a model for the structure of the Internet. In particular, our analysis uses information on the connectivity of the network shells to separate, in a unique (no parameters) way, the Internet into three subcomponents: (i) a nucleus that is a small (≈100 nodes), very well connected globally distributed subgraph; (ii) a fractal subcomponent that is able to connect the bulk of the Internet without congesting the nucleus, with self-similar properties and critical exponents predicted from percolation theory; and (iii) dendrite-like structures, usually isolated nodes that are connected to the rest of the network through the nucleus only. We show that our method of decomposition is robust and provides insight into the underlying structure of the Internet and its functional consequences. Our approach of decomposing the network is general and also useful when studying other complex networks.
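The decomposition is straightforward to reproduce on any graph. A minimal sketch with networkx, taking the nucleus to be simply the maximal k-shell (a simplification of the paper's procedure) and using a synthetic graph in place of the AS-level Internet map:

```python
# k-shell decomposition sketch; a Barabasi-Albert graph stands in for
# the autonomous-systems map, and the nucleus is the maximal k-shell.
import networkx as nx

G = nx.barabasi_albert_graph(10_000, 3, seed=42)

core = nx.core_number(G)                    # node -> k-shell index
k_max = max(core.values())
nucleus = {n for n, k in core.items() if k == k_max}

# Peel the nucleus off and inspect what stays connected without it.
rest = G.subgraph(n for n in G if n not in nucleus)
components = sorted(nx.connected_components(rest), key=len, reverse=True)
print(f"nucleus: {len(nucleus)} nodes (k = {k_max})")
print(f"largest component without the nucleus: {len(components[0])} nodes")
```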

Journal ArticleDOI
TL;DR: A detailed overview of the theoretical and computational approaches that have been taken to understand transport in molecular junctions when electron-vibration (vibronic) interactions are involved can be found in this article, in which the authors define a particular microscopic model Hamiltonian.
Abstract: Transport of electrons in a single molecule junction is the simplest problem in the general subject area of molecular electronics. In the past few years, this area has been extended to probe beyond the simple tunnelling associated with large energy gaps between electrode Fermi level and molecular levels, to deal with smaller gaps, with near-resonance tunnelling and, particularly, with effects due to interaction of electronic and vibrational degrees of freedom. This overview is devoted to the theoretical and computational approaches that have been taken to understanding transport in molecular junctions when these vibronic interactions are involved. After a short experimental overview, and discussion of different test beds and measurements, we define a particular microscopic model Hamiltonian. That overall Hamiltonian can be used to discuss all of the phenomena dealt with subsequently. These include transition from coherent to incoherent transport as electron/vibration interaction increases in strength, inelastic electron tunnelling spectroscopy and its interpretation and measurement, effects of interelectronic repulsion treated at the Hubbard level, noise in molecular transport junctions, non-linear conductance phenomena, heating and heat conduction in molecular transport junctions, and current-induced chemical reactions. In each of these areas, we use the same simple model Hamiltonian to analyse energetics and dynamics. While this overview does not attempt to survey the literature exhaustively, it does provide appropriate references to the current literature (both experimental and theoretical). We also attempt to point out directions in which further research is required to answer cardinal questions concerning the behaviour and understanding of vibrational effects in molecular transport junctions.
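The kind of microscopic model Hamiltonian such treatments start from, a single resonant level coupled linearly to one molecular vibration and to two metallic leads (our shorthand; the review's full model carries additional terms), is

```latex
\hat{H} = \varepsilon_0\,\hat{d}^{\dagger}\hat{d}
 + \omega_0\,\hat{a}^{\dagger}\hat{a}
 + \lambda\,\hat{d}^{\dagger}\hat{d}\,\bigl(\hat{a} + \hat{a}^{\dagger}\bigr)
 + \sum_{k \in L,R} \varepsilon_k\,\hat{c}_k^{\dagger}\hat{c}_k
 + \sum_{k \in L,R} \bigl(V_k\,\hat{c}_k^{\dagger}\hat{d} + \text{h.c.}\bigr),
```

where ε₀ is the molecular level, ω₀ the vibration frequency, λ the electron-vibration coupling, and V_k the molecule-lead tunnelling matrix elements.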

Journal ArticleDOI
01 Oct 2007-Proteins
TL;DR: FireDock's prediction results are comparable to current state‐of‐the‐art refinement methods while its running time is significantly lower, and its refinement procedure significantly improves the ranking of the rigid‐body PatchDock algorithm for these cases.
Abstract: Here, we present FireDock, an efficient method for the refinement and rescoring of rigid-body docking solutions. The refinement process consists of two main steps: (1) rearrangement of the interface side-chains and (2) adjustment of the relative orientation of the molecules. Our method accounts for the observation that most interface residues that are important in recognition and binding do not change their conformation significantly upon complexation. Allowing full side-chain flexibility, a common procedure in refinement methods, often causes excessive conformational changes. These changes may distort preformed structural signatures, which have been shown to be important for binding recognition. Here, we restrict side-chain movements, and thus manage to reduce the false-positive rate noticeably. In the later stages of our procedure (orientation adjustments and scoring), we smooth the atomic radii. This allows for the minor backbone and side-chain movements and increases the sensitivity of our algorithm. FireDock succeeds in ranking a near-native structure within the top 15 predictions for 83% of the 30 enzyme-inhibitor test cases, and for 78% of the 18 semiunbound antibody-antigen complexes. Our refinement procedure significantly improves the ranking of the rigid-body PatchDock algorithm for these cases. The FireDock program is fully automated. To our knowledge, FireDock's prediction results are comparable to current state-of-the-art refinement methods while its running time is significantly lower. The method is available at http://bioinfo3d.cs.tau.ac.il/FireDock/.

Posted Content
TL;DR: In this article, a simple model of international trade with heterogeneous firms is developed, which is consistent with a number of stylized features of the data, and the model predicts positive and zero trade flows across pairs of countries, and it allows the number of exporting firms to vary across destination countries.
Abstract: We develop a simple model of international trade with heterogeneous firms that is consistent with a number of stylized features of the data. In particular, the model predicts positive as well as zero trade flows across pairs of countries, and it allows the number of exporting firms to vary across destination countries. As a result, the impact of trade frictions on trade flows can be decomposed into the intensive and extensive margins, where the former refers to the trade volume per exporter and the latter refers to the number of exporters. This model yields a generalized gravity equation that accounts for the self-selection of firms into export markets and their impact on trade volumes. We then develop a two-stage estimation procedure that uses a selection equation into trade partners in the first stage and a trade flow equation in the second. We implement this procedure parametrically, semi-parametrically, and non-parametrically, showing that in all three cases the estimated effects of trade frictions are similar. Importantly, our method provides estimates of the intensive and extensive margins of trade. We show that traditional estimates are biased, and that most of the bias is not due to selection but rather due to the omission of the extensive margin. Moreover, the effect of the number of exporting firms varies across country pairs according to their characteristics. This variation is large, and particularly so for trade between developed and less developed countries and between pairs of less developed countries.
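A compressed sketch of the two-stage logic (a Heckman-style correction on synthetic data; the paper's own second stage additionally controls for the extensive margin through the predicted selection propensity, which we omit here):

```python
# Stage 1: probit for whether a country pair trades at all.
# Stage 2: trade-flow regression on trading pairs with a selection correction.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000
log_dist = rng.normal(size=n)        # synthetic log trade-friction proxy
X = sm.add_constant(log_dist)

u = rng.normal(size=n)               # selection error, correlated with flows
trades = (1.0 - log_dist + u > 0).astype(float)
log_flow = 5.0 - 0.8 * log_dist + 0.5 * u + rng.normal(size=n)

probit = sm.Probit(trades, X).fit(disp=0)
z_hat = X @ probit.params            # linear index from the selection equation
mills = norm.pdf(z_hat) / norm.cdf(z_hat)   # inverse Mills ratio

mask = trades == 1
X2 = np.column_stack([X[mask], mills[mask]])
ols = sm.OLS(log_flow[mask], X2).fit()
print(ols.params)  # friction coefficient, purged of selection bias
```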

Journal ArticleDOI
TL;DR: A smooth second-order sliding mode control-based guidance law is designed and compared with the augmented proportional navigation guidance law via computer simulations of a guided missile intercepting a maneuvering ballistic target.

Journal ArticleDOI
TL;DR: In this article, a conceptual framework that concerns the sociopsychological foundation and dynamics of intractable conflict is presented, which describes how societies involved in this reality adapt to the conditions of infractable conflicts.
Abstract: The article presents a conceptual framework that concerns the sociopsychological foundation and dynamics of intractable conflict. First, it defines and characterizes the nature of intractable conflict, and then it describes how societies involved in this reality adapt to the conditions of intractable conflict. This adaptation meets three fundamental challenges: satisfying the needs of the society members, coping with stress, and withstanding the rival. In trying to confront them successfully, societies develop appropriate sociopsychological infrastructure, which includes collective memory, ethos of conflict, and collective emotional orientations. This infrastructure fulfills important individual and collective level functions, including the important role of formation, maintenance, and strengthening of a social identity that reflects this conflict. Special attempts are made to disseminate this infrastructure via societal channels of communication and institutionalize it. The evolved sociopsychological infrastructure ...

Journal ArticleDOI
TL;DR: It seems that vitamin D has crossed the boundaries of calcium metabolism and has become a significant factor in a number of physiological functions, specifically as a biological inhibitor of inflammatory hyperactivity.
Abstract: Vitamin D is frequently prescribed by rheumatologists to prevent and treat osteoporosis. Several observations have shown that vitamin D inhibits proinflammatory processes by suppressing the enhanced activity of immune cells that take part in the autoimmune reaction. Moreover, recent evidence strongly suggests that vitamin D supplementation may be therapeutically beneficial, particularly for Th1-mediated autoimmune disorders. Some reports imply that vitamin D may even be preventive in certain disorders such as multiple sclerosis and diabetes type 1. It seems that vitamin D has crossed the boundaries of calcium metabolism and has become a significant factor in a number of physiological functions, specifically as a biological inhibitor of inflammatory hyperactivity.

Journal ArticleDOI
01 Nov 2007-Nature
TL;DR: The analytical approach provides a universal scaling dependence of the mean FPT on both the volume of the confining domain and the source–target distance, which is applicable to a broad range of stochastic processes characterized by length-scale-invariant properties.
Abstract: How long does it take a random walker to reach a given target point? This quantity, known as a first-passage time (FPT), has led to a growing number of theoretical investigations over the past decade. The importance of FPTs originates from the crucial role played by first encounter properties in various real situations, including transport in disordered media, neuron firing dynamics, spreading of diseases or target search processes. Most methods of determining FPT properties in confining domains have been limited to effectively one-dimensional geometries, or to higher spatial dimensions only in homogeneous media. Here we develop a general theory that allows accurate evaluation of the mean FPT in complex media. Our analytical approach provides a universal scaling dependence of the mean FPT on both the volume of the confining domain and the source-target distance. The analysis is applicable to a broad range of stochastic processes characterized by length-scale-invariant properties. Our theoretical predictions are confirmed by numerical simulations for several representative models of disordered media, fractals, anomalous diffusion and scale-free networks.
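The universal scaling in question is commonly summarized as follows (our restatement in terms of the walk dimension d_w and fractal dimension d_f of the medium; the constants A and B are schematic):

```latex
\langle T \rangle \;\sim\;
\begin{cases}
V\bigl(A - B\,r^{\,d_w - d_f}\bigr), & d_w < d_f \quad \text{(non-compact exploration)},\\
V\bigl(A + B\ln r\bigr), & d_w = d_f,\\
V\,B\,r^{\,d_w - d_f}, & d_w > d_f \quad \text{(compact exploration)},
\end{cases}
```

with V the volume of the confining domain and r the source-target distance.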

Journal ArticleDOI
TL;DR: The results offer a resolution to a central controversy regarding the coupling between neurons, LFP, and BOLD signals by demonstrating, for the first time, that the coupling of single units to the other measures is variable yet tightly related to the degree of interneuronal correlations in the human auditory cortex.

Proceedings ArticleDOI
26 Dec 2007
TL;DR: The proposed algorithm is fully automatic, based on local saliency, motion detection, and object detectors, and is compared to the state of the art in image retargeting.
Abstract: Video retargeting is the process of transforming an existing video to fit the dimensions of an arbitrary display. A compelling retargeting aims at preserving the viewers' experience by maintaining the information content of important regions in the frame, whilst keeping their aspect ratio. An efficient algorithm for video retargeting is introduced. It consists of two stages. First, the frame is analyzed to detect the importance of each region in the frame. Then, a transformation that respects the analysis shrinks less important regions more than important ones. Our analysis is fully automatic and based on local saliency, motion detection and object detectors. The performance of the proposed algorithm is demonstrated on a variety of video sequences, and compared to the state of the art in image retargeting.
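A drastically simplified sketch of the two-stage pipeline (importance analysis, then non-uniform shrinking), using gradient magnitude as a stand-in for the paper's saliency, motion, and object-detector analysis:

```python
# Toy width retargeting: sample important columns densely, so less
# important regions absorb most of the shrinking. Gradient magnitude
# stands in for the paper's saliency + motion + detector analysis.
import numpy as np

def retarget_width(frame: np.ndarray, new_w: int) -> np.ndarray:
    h, w = frame.shape
    gx = np.abs(np.diff(frame, axis=1, append=frame[:, -1:]))
    importance = gx.sum(axis=0) + 1e-3          # per-column importance
    cdf = np.cumsum(importance / importance.sum())
    # Uniform quantiles through the inverse importance CDF: important
    # columns are hit by more quantiles and thus better preserved.
    src_cols = np.searchsorted(cdf, np.linspace(0, 1, new_w, endpoint=False))
    return frame[:, np.clip(src_cols, 0, w - 1)]

frame = np.random.default_rng(2).random((120, 160))  # hypothetical grayscale frame
print(retarget_width(frame, 100).shape)              # -> (120, 100)
```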