
Showing papers in "International Journal of Modern Physics A in 2014"


Journal ArticleDOI
TL;DR: The ALICE experiment at the CERN Large Hadron Collider as mentioned in this paper continuously took data during the first physics campaign of the machine from fall 2009 until early 2013, using proton and lead-ion beams.
Abstract: ALICE is the heavy-ion experiment at the CERN Large Hadron Collider. The experiment continuously took data during the first physics campaign of the machine from fall 2009 until early 2013, using proton and lead-ion beams. In this paper we describe the running environment and the data handling procedures, and discuss the performance of the ALICE detectors and analysis methods for various physics observables.

691 citations


Journal ArticleDOI
TL;DR: The Kibble-Zurek mechanism as mentioned in this paper was developed to describe the associated nonequilibrium dynamics and to estimate the density of defects as a function of the quench rate through the transition.
Abstract: In the course of a nonequilibrium continuous phase transition, the dynamics ceases to be adiabatic in the vicinity of the critical point as a result of the critical slowing down (the divergence of the relaxation time in the neighborhood of the critical point). This enforces a local choice of the broken symmetry and can lead to the formation of topological defects. The Kibble–Zurek mechanism (KZM) was developed to describe the associated nonequilibrium dynamics and to estimate the density of defects as a function of the quench rate through the transition. During recent years, several new experiments investigated the formation of defects in phase transitions induced by a quench both in classical and quantum mechanical systems. At the same time, some established results were called into question. We review and analyze the Kibble–Zurek mechanism focusing in particular on this surge of activity, and suggest possible directions for further progress.
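For orientation (a standard textbook form of the KZM estimate, not quoted from the paper): combining the equilibrium scalings of the correlation length, $\xi \sim |\epsilon|^{-\nu}$, and of the relaxation time, $\tau \sim |\epsilon|^{-z\nu}$, with a linear quench $\epsilon(t) = t/\tau_Q$ yields the frozen-out domain size and defect density

$$ \hat{\xi} \sim \tau_Q^{\nu/(1+z\nu)}, \qquad n \sim \frac{\hat{\xi}^{\,d}}{\hat{\xi}^{\,D}} \sim \tau_Q^{-(D-d)\nu/(1+z\nu)}, $$

where D is the spatial dimension, d the dimensionality of the defects (d = 0 for point defects, d = 1 for strings) and $\tau_Q$ the quench time.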

249 citations


Journal ArticleDOI
TL;DR: The MoEDAL experiment at Point 8 of the LHC ring is the seventh and newest LHC experiment, dedicated to the search for highly-ionizing particle avatars of physics beyond the Standard Model, extending significantly the discovery horizon of the LHC.
Abstract: The MoEDAL experiment at Point 8 of the LHC ring is the seventh and newest LHC experiment. It is dedicated to the search for highly-ionizing particle avatars of physics beyond the Standard Model, extending significantly the discovery horizon of the LHC. A MoEDAL discovery would have revolutionary implications for our fundamental understanding of the Microcosm. MoEDAL is an unconventional and largely passive LHC detector comprised of the largest array of Nuclear Track Detector stacks ever deployed at an accelerator, surrounding the intersection region at Point 8 on the LHC ring. Another novel feature is the use of paramagnetic trapping volumes to capture both electrically and magnetically charged highly-ionizing particles predicted in new physics scenarios. It includes an array of TimePix pixel devices for monitoring highly-ionizing particle backgrounds. The main passive elements of the MoEDAL detector do not require a trigger system, electronic readout, or online computerized data acquisition. The aim of this paper is to give an overview of the MoEDAL physics reach, which is largely complementary to the programs of the large multipurpose LHC detectors ATLAS and CMS.

142 citations


Journal ArticleDOI
TL;DR: The mirror dark matter theory as discussed by the authors was proposed to accommodate the existence of a hidden sector, which is a set of new particles and forces interacting with the known particles predominantly via gravity.
Abstract: A simple way to accommodate dark matter is to postulate the existence of a hidden sector, that is, a set of new particles and forces interacting with the known particles predominantly via gravity. In general, this leads to a large set of unknown parameters; however, if the hidden sector is an exact copy of the standard model sector, then an enhanced symmetry arises. This symmetry, which can be interpreted as space–time parity, connects each ordinary particle (e, ν, p, n, γ, …) with a mirror partner (e′, ν′, p′, n′, γ′, …). If this symmetry is completely unbroken, then the mirror particles are degenerate with their ordinary particle counterparts, and would interact amongst themselves with exactly the same dynamics that govern ordinary particle interactions. The only new interaction postulated is photon–mirror photon kinetic mixing, whose strength ϵ is the sole new fundamental (Lagrangian) parameter relevant for astrophysics and cosmology. It turns out that such a theory, with suitably chosen initial conditions effective in the very early universe, can provide an adequate description of dark matter phenomena provided that $\epsilon \sim 10^{-9}$. This review focusses on three main developments of this mirror dark matter theory during the last decade: early universe cosmology, galaxy structure and the application to direct detection experiments.
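For context (a standard expression, included as an illustration rather than quoted from the paper), the photon–mirror photon kinetic mixing referred to above is usually written as the renormalizable Lagrangian term

$$ \mathcal{L}_{\rm mix} = \frac{\epsilon}{2}\, F^{\mu\nu} F'_{\mu\nu}, $$

where $F^{\mu\nu}$ and $F'_{\mu\nu}$ are the ordinary and mirror electromagnetic field strength tensors; the phenomenologically preferred value quoted above corresponds to $\epsilon \sim 10^{-9}$.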

141 citations


Journal ArticleDOI
TL;DR: In this article, a metric connection compatible with both the generalised metric and the O(D, D) structure was constructed for double field theory and the doubled action was constructed in terms of generalised torsion of this connection.
Abstract: We construct an action for double field theory using a metric connection that is compatible with both the generalised metric and the O(D, D) structure. The connection is simultaneously torsionful and flat. Using this connection, one may construct a proper covariant derivative for double field theory. We then write the doubled action in terms of the generalised torsion of this connection. This action then exactly reproduces that required for double field theory and gauged supergravity.

109 citations


Journal ArticleDOI
TL;DR: In this article, the status of black holes within a particular proposal for quantum gravity, Weinberg's asymptotic safety program, is reviewed and an improved quantum picture of the Schwarzschild black hole is developed, including angular momenta, higher-derivative corrections and the implications of extra dimensions.
Abstract: Black holes are probably among the most fascinating objects populating our universe. Their characteristic features found within general relativity, encompassing space–time singularities, event horizons, and black hole thermodynamics, provide a rich testing ground for quantum gravity ideas. We review the status of black holes within a particular proposal for quantum gravity, Weinberg's asymptotic safety program. Starting from a brief survey of the effective average action and scale setting procedures, an improved quantum picture of the black hole is developed. The Schwarzschild black hole and its generalizations including angular momenta, higher-derivative corrections and the implications of extra dimensions are discussed in detail. In addition, the quantum singularity emerging for the inclusion of a cosmological constant is elucidated and linked to the phenomenon of a dynamical dimensional reduction of space–time.
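As a rough illustration of the "RG-improvement" logic (a schematic sketch under common assumptions, not the paper's own expressions), one replaces Newton's constant in the Schwarzschild lapse function by a running coupling evaluated at a radius-dependent scale, e.g.

$$ f(r) = 1 - \frac{2\,G(r)\,M}{r}, \qquad G(r) \simeq \frac{G_0}{1 + \omega\, G_0/r^2}, $$

obtained from the asymptotically safe running $G(k) \simeq G_0/(1+\omega G_0 k^2)$ with the identification $k \propto 1/r$; the improved geometry reduces to the classical one at large r, while the effective gravitational coupling weakens toward the center.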

96 citations


Journal ArticleDOI
M. Wakamatsu
TL;DR: In this article, it is argued that there is now general agreement that there are at least two physically inequivalent gauge-invariant decompositions, (I) and (II), of the nucleon spin.
Abstract: Is a gauge-invariant complete decomposition of the nucleon spin possible? Although this is a difficult theoretical question on which a complete consensus has not yet been reached, there is now general agreement that there are at least two physically inequivalent gauge-invariant decompositions, (I) and (II), of the nucleon spin. In these two decompositions, the intrinsic spin parts of quarks and gluons are common. What distinguishes the two decompositions are the orbital angular momentum parts. The orbital angular momenta of quarks and gluons appearing in decomposition (I) are the so-called "mechanical" orbital angular momenta, while those appearing in decomposition (II) are the generalized (gauge-invariant) "canonical" ones. For this reason, these decompositions are also called the "mechanical" and "canonical" decompositions of the nucleon spin, respectively. A crucially important question is which decomposition is more favorable from the observational viewpoint. The main objective of this concise review is to try to answer this question with careful consideration of the recent intensive research on this problem.
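Schematically (notation introduced here only for orientation, not taken from the review), the quark orbital angular momentum operators that distinguish the two decompositions can be written as

$$ L^q_{\rm mech} = \int d^3x\, \psi^\dagger \big[\vec{x} \times (-i\vec{D})\big]\psi, \qquad L^q_{\rm can} = \int d^3x\, \psi^\dagger \big[\vec{x} \times (-i\vec{D}_{\rm pure})\big]\psi, $$

with $\vec{D} = \vec{\nabla} - ig\vec{A}$ the full covariant derivative and $\vec{D}_{\rm pure} = \vec{\nabla} - ig\vec{A}_{\rm pure}$ built from the pure-gauge part of the gluon field, which is what makes the generalized "canonical" orbital angular momentum gauge invariant.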

93 citations


Journal ArticleDOI
TL;DR: In this article, the cosmological dynamics of a recently proposed infrared modification of the Einstein equations, based on the introduction of a nonlocal term constructed with $m^2 g_{\mu\nu}\Box^{-1}R$, where m is a mass parameter, are studied.
Abstract: We study the cosmological dynamics of a recently proposed infrared modification of the Einstein equations, based on the introduction of a nonlocal term constructed with $m^2 g_{\mu\nu}\Box^{-1}R$, where m is a mass parameter. The theory automatically generates a dynamical dark energy component that can reproduce the observed value of the dark energy density without introducing a cosmological constant. Fixing m so as to reproduce the observed value $\Omega_{\rm DE}\simeq 0.68$, and writing $w_{\rm DE}(a) = w_0+(1-a)w_a$, the model provides a neat prediction for the equation of state parameters of dark energy, $w_0\simeq -1.042$ and $w_a\simeq -0.020$, and more generally provides a pure prediction for $w_{\rm DE}$ as a function of redshift. We show that, because of some freedom in the definition of $\Box^{-1}$, one can extend the construction so as to define a more general family of nonlocal models. However, in a first approximation this turns out to be equivalent to adding an explicit cosmological constant term on top of the dynamical dark energy component. This leads to an extended model with two parameters, $\Omega_\Lambda$ and m. Even in this extension the EOS parameter $w_0$ is always on the phantom side, in the range $-1.33 \lesssim w_0 \leq -1$, and there is a prediction for the relation between $w_0$ and $w_a$.
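For reference (a standard consequence of the CPL form quoted above, not an additional result of the paper), the parametrization $w_{\rm DE}(a) = w_0 + (1-a)w_a$ integrates to the dark energy density evolution

$$ \rho_{\rm DE}(a) = \rho_{\rm DE,0}\, \exp\!\left[3\int_a^1 \frac{1+w_{\rm DE}(a')}{a'}\, da'\right] = \rho_{\rm DE,0}\, a^{-3(1+w_0+w_a)}\, e^{-3 w_a (1-a)}, $$

so that the quoted values $w_0 \simeq -1.042$, $w_a \simeq -0.020$ fully specify the background expansion history of the minimal model.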

89 citations


Journal ArticleDOI
TL;DR: In this paper, the effects of quantum gravity on black hole physics were discussed and the tunneling radiation of scalar particles and fermions was discussed using modified fundamental commutation relations.
Abstract: In this review, we discuss the effects of quantum gravity on black hole physics. After a brief review of the origin of the minimal observable length from various quantum gravity theories, we present the tunneling method. To incorporate quantum gravity effects, we modify the Klein–Gordon equation and Dirac equation by the modified fundamental commutation relations. Then we use the modified equations to discuss the tunneling radiation of scalar particles and fermions. The corrected Hawking temperatures are related to the quantum numbers of the emitted particles. Quantum gravity corrections slow down the increase of the temperatures. The remnants are observed as $M_{\rm Res} \gtrsim \frac{M_p}{\sqrt{\beta_0}}$. The mass is quantized by the modified Wheeler–DeWitt equation and is proportional to n in quantum gravity regime. The thermodynamical property of the black hole is studied by the influence of quantum gravity effects.
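For reference (a commonly used form of such deformed relations, shown here as an illustration rather than quoted from the review), the one-dimensional modified commutator and the corresponding uncertainty relation read

$$ [x, p] = i\hbar\,(1 + \beta p^2), \qquad \Delta x\, \Delta p \gtrsim \frac{\hbar}{2}\big(1 + \beta(\Delta p)^2\big), $$

which implies a minimal observable length $\Delta x_{\rm min} \sim \hbar\sqrt{\beta}$; inserting this deformation into the Klein–Gordon and Dirac equations is what leads to the corrected Hawking temperatures and the remnant scale $M_{\rm Res} \gtrsim M_p/\sqrt{\beta_0}$ quoted above.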

86 citations


Journal ArticleDOI
TL;DR: The Axion Dark Matter eXperiment at High Frequencies (ADMX-HF) as discussed by the authors was designed to detect axions in a microwave cavity permeated by a magnetic field.
Abstract: The axion is a light pseudoscalar particle which suppresses CP-violating effects in strong interactions and also happens to be an excellent dark matter candidate. Axions constituting the dark matter halo of our galaxy may be detected by their resonant conversion to photons in a microwave cavity permeated by a magnetic field. The current generation of the microwave cavity experiment has demonstrated sensitivity to plausible axion models, and upgrades in progress should achieve the sensitivity required for a definitive search, at least for low mass axions. However, a comprehensive strategy for scanning the entire mass range, from 1–1000 μeV, will require significant technological advances to maintain the needed sensitivity at higher frequencies. Such advances could include sub-quantum-limited amplifiers based on squeezed vacuum states, bolometers, and/or superconducting microwave cavities. The Axion Dark Matter eXperiment at High Frequencies (ADMX-HF) represents both a pathfinder for first data in the 20–100 μeV range (~5–25 GHz), and an innovation test-bed for these concepts.
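The quoted frequency range follows from the resonance condition that the cavity mode frequency match the axion mass (a simple conversion, included for orientation):

$$ h\nu = m_a c^2 \;\Rightarrow\; \nu \simeq 0.242\ {\rm GHz} \times \left(\frac{m_a}{1\ \mu{\rm eV}}\right), $$

so the 20–100 μeV window corresponds to roughly 4.8–24 GHz, consistent with the ~5–25 GHz stated above.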

80 citations


Journal ArticleDOI
TL;DR: In this paper, a review of statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments is presented, together with their reproduction of lattice QCD thermodynamics and the historical development of the statistical approach to high-energy reactions.
Abstract: We review some recent highlights from the applications of statistical–thermal models to different experimental measurements and to lattice QCD thermodynamics made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical–thermal models for heavy-ion physics. We find that Heinz Koppe formulated, already in 1948, an almost complete recipe for the statistical–thermal models. In 1950, Enrico Fermi generalized this statistical approach: he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process, which likely reflect many features of high-energy reactions dominated by the density of final states in phase space. In 1964, Hagedorn systematically analyzed high-energy phenomena using the tools of statistical physics and introduced the concept of a limiting temperature based on the statistical bootstrap model. Many-particle systems can thus quite often be studied with the help of statistical–thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical–thermal models for particle production, fluctuations and collective flow in heavy-ion experiments, as well as their reproduction of lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. Higher-order moments of the multiplicity distributions are also discussed; they offer deep insights into particle production and into critical fluctuations, and we therefore use them to characterize the freeze-out parameters and to suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration possible deviations from the ideal hadron gas; we highlight various types of interactions, dissipative properties and location dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials and the compressible bag model.
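As a minimal illustration of the statistical–thermal approach described above (a standard textbook expression, not quoted from the review), the primary yield density of a hadron species i in the ideal hadron-resonance gas is

$$ n_i = \frac{g_i}{2\pi^2} \int_0^\infty \frac{p^2\, dp}{\exp\!\big[(E_i - \mu_i)/T\big] \pm 1}, \qquad E_i = \sqrt{p^2 + m_i^2}, $$

(natural units), with degeneracy $g_i$, a chemical potential $\mu_i$ built from the baryon, strangeness and charge chemical potentials, and the chemical freeze-out temperature T extracted by fitting the measured multiplicities.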

Journal ArticleDOI
TL;DR: In this article, the authors investigated the radiation of spin-1 particles by black holes in (1+1) dimensions within the Proca equation and showed that the emission temperature for the Schwarzschild background geometry is the same as the Hawking temperature corresponding to scalar particles emission.
Abstract: We investigate the radiation of spin-1 particles by black holes in (1+1) dimensions within the Proca equation. The process is considered as quantum tunneling of bosons through an event horizon. It is shown that the emission temperature for the Schwarzschild background geometry is the same as the Hawking temperature corresponding to scalar particle emission. We also obtain the radiation temperatures for the de Sitter, Rindler and Schwarzschild–de Sitter space–times. In the particular case when the two horizons of the Schwarzschild–de Sitter space–time coincide, the Nariai temperature is recovered. The thermodynamic entropy of a black hole is calculated for the Schwarzschild–de Sitter space–time with two horizons.
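For reference (a standard result, recalled here rather than taken from the abstract), the Hawking temperature reproduced in the Schwarzschild case is

$$ T_H = \frac{\hbar\kappa}{2\pi c\, k_B} = \frac{\hbar c^3}{8\pi G M k_B}, $$

with κ the surface gravity of the horizon; the result above states that spin-1 (Proca) tunneling yields this same temperature as scalar emission.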

Journal ArticleDOI
TL;DR: The second-order phase transition of a Reissner–Nordström–de Sitter (RN–dS) black hole was investigated in this article, showing that the position of the phase transition point is independent of the electric charge of the system.
Abstract: After introducing the connection between the black hole horizon and the cosmological horizon, we discuss the thermodynamic properties of Reissner–Nordström–de Sitter (RN–dS) space–time. We present the condition under which an RN–dS black hole can exist. Employing Ehrenfest's classification, we conclude that the phase transition of the RN–dS black hole is second order. The position of the phase transition point is independent of the electric charge of the system; it depends only on the ratio of the black hole horizon to the cosmological horizon.

Journal ArticleDOI
TL;DR: In this paper, the authors describe the origin and the basic features causing e-cloud formation in accelerators, then review some of the theoretical work produced to simulate and analyze such phenomenon.
Abstract: Low energy electrons in accelerators are known to interact with the circulating beam, giving rise to the formation of a so-called e-cloud. Such an e-cloud may induce detrimental effects on the accelerated beam quality and stability, and these effects have been observed in most accelerators of positively charged particles. A longstanding effort has so far been devoted to understanding in detail the physical origin of the e-cloud, its build-up and its interaction with the circulating beam. We first describe the origin and the basic features causing e-cloud formation in accelerators, and then review some of the theoretical work produced to simulate and analyze this phenomenon. In selected cases, theoretical expectations and experimental observations are compared, to address the importance of benchmarking codes against observations in order to reach the required predictive capability. To this end, codes need realistic input parameters which correctly describe the material and surface properties at the basis of e-cloud formation and build-up. The experimental efforts, performed worldwide in many surface and material science laboratories, to measure these essential parameters are then presented and critically reviewed. Finally, we describe some of the e-cloud mitigation strategies adopted so far and draw some conclusions.

Journal ArticleDOI
TL;DR: In this paper, a review on the Feynman problem and an original research presentation on the relations between Fermionic theories and qubits theories, both regarded in the novel framework of operational probabilistic theories are presented.
Abstract: The present paper is both a review on the Feynman problem, and an original research presentation on the relations between Fermionic theories and qubits theories, both regarded in the novel framework of operational probabilistic theories. The most relevant results about the Feynman problem of simulating Fermions with qubits are reviewed, and in the light of the new original results, the problem is solved. The answer is twofold. On the computational side, the two theories are equivalent, as shown by Bravyi and Kitaev [S. B. Bravyi and A. Y. Kitaev, Ann. Phys. 298, 210 (2002)]. On the operational side, the quantum theory of qubits and the quantum theory of Fermions are different, mostly in the notion of locality, with striking consequences on entanglement. Thus the emulation does not respect locality, as it was suspected by Feynman [R. Feynman, Int. J. Theor. Phys. 21, 467 (1982)].
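The locality issue mentioned above can be illustrated (as a standard example, not drawn from the paper itself) by the Jordan–Wigner encoding of fermionic modes into qubits,

$$ a_j = \Big(\prod_{k<j} \sigma^z_k\Big)\, \sigma^-_j, \qquad a_j^\dagger = \Big(\prod_{k<j} \sigma^z_k\Big)\, \sigma^+_j, $$

which reproduces the canonical anticommutation relations but maps a single fermionic mode onto a string of qubit operators, so that operations local in the fermionic theory need not be local in the qubit theory.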

Journal ArticleDOI
TL;DR: An overview of the experimental study of quark-gluon matter produced in relativistic nucleus-nucleus collisions is given in this paper, with emphasis on recent measurements at the Large Hadron Collider.
Abstract: An overview is given on the experimental study of quark–gluon matter produced in relativistic nucleus–nucleus collisions, with emphasis on recent measurements at the Large Hadron Collider.

Journal ArticleDOI
TL;DR: Forger and Romero as discussed by the authors showed the equivalence between the resulting symplectic and Poisson structures without passing through the canonical Hamiltonian formalism as an intermediary, even in the presence of constraints and gauge symmetries.
Abstract: It is well known that both the symplectic structure and the Poisson brackets of classical field theory can be constructed directly from the Lagrangian in a covariant way, without passing through the noncovariant canonical Hamiltonian formalism. This is true even in the presence of constraints and gauge symmetries. These constructions go under the names of the covariant phase space formalism and the Peierls bracket. We review both of them, paying more careful attention than usual to the precise mathematical hypotheses that they require, and illustrate them in examples. An extensive historical overview of the development of these constructions is also provided. The novel aspect of our presentation is a significant expansion and generalization of an elegant and quite recent argument by Forger and Romero showing the equivalence between the resulting symplectic and Poisson structures without passing through the canonical Hamiltonian formalism as an intermediary. We generalize it to cover theories with constraints and gauge symmetries and formulate precise sufficient conditions under which the argument holds. These conditions include a local condition on the equations of motion that we call hyperbolizability, and some global conditions of cohomological nature. The details of our presentation may shed some light on subtle questions related to the Poisson structure of gauge theories and their quantization.

Journal ArticleDOI
TL;DR: In this paper, the theoretical aspects of central exclusive production were discussed and the phenomenological implications in a variety of processes, comparing to existing collider data and addressing the possibilities for the future.
Abstract: We review recent results within the Durham model of central exclusive production. We discuss the theoretical aspects of this approach and consider the phenomenological implications in a variety of processes, comparing to existing collider data and addressing the possibilities for the future.

Journal ArticleDOI
TL;DR: In this article, a review of recent developments in the understanding of field theories in the perturbative regime is presented, in particular the notions of analyticity, unitarity and locality, and the singularity structure of scattering amplitudes in general interacting theories.
Abstract: We review some recent developments in the understanding of field theories in the perturbative regime. In particular, we discuss the notions of analyticity, unitarity and locality, and therefore the singularity structure of scattering amplitudes in general interacting theories. We describe their tree-level structure and their on-shell representations, as well as the links between the tree-level structure itself and the structure of the loop amplitudes. Finally, we describe the on-shell diagrammatics recently proposed both on general grounds and in the remarkable example of planar supersymmetric theories.

Journal ArticleDOI
TL;DR: In this article, the existence of new stable charged leptons and quarks is argued to be possible if they are hidden in elusive "dark atoms"; in such scenarios the excess −2 charged particles are bound with primordial helium in O-helium "atoms."
Abstract: The nonbaryonic dark matter of the universe is assumed to consist of new stable forms of matter. Their stability reflects a symmetry of the micro world, and particle candidates for cosmological dark matter are the lightest particles that carry new conserved quantum numbers. Dark matter candidates can appear in new families of quarks and leptons, and the existence of new stable charged leptons and quarks is possible if they are hidden in elusive "dark atoms." Such a possibility, strongly restricted by the constraints on anomalous isotopes of light elements, is not excluded in scenarios that predict stable doubly charged particles. In these scenarios the excess −2 charged particles are bound with primordial helium in O-helium "atoms," maintaining a specific nuclear-interacting form of dark matter, which may provide an interesting solution to the puzzles of the direct dark matter searches.

Journal ArticleDOI
TL;DR: Theoretical and experimental techniques employed in dedicated searches for dark matter at hadron colliders are reviewed in this article, where bounds from the 7 TeV and 8 TeV proton-proton collisions at the Large Hadron Collider (LHC) have been collected and the results interpreted.
Abstract: Theoretical and experimental techniques employed in dedicated searches for dark matter at hadron colliders are reviewed. Bounds from the 7 TeV and 8 TeV proton–proton collisions at the Large Hadron Collider (LHC) on dark matter interactions have been collected and the results interpreted. We review the current status of the Effective Field Theory picture of dark matter interactions with the Standard Model. Currently, LHC experiments have stronger bounds on operators leading to spin-dependent scattering than direct detection experiments, while direct detection probes are more constraining for spin-independent scattering for WIMP masses above a few GeV.
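Two representative contact operators of the kind used in this framework (illustrative examples, not a list taken from the review) are

$$ \mathcal{O}_V = \frac{(\bar{\chi}\gamma^\mu\chi)(\bar{q}\gamma_\mu q)}{\Lambda^2}, \qquad \mathcal{O}_A = \frac{(\bar{\chi}\gamma^\mu\gamma^5\chi)(\bar{q}\gamma_\mu\gamma^5 q)}{\Lambda^2}, $$

where the vector operator leads to spin-independent scattering and the axial-vector operator to spin-dependent scattering, which is why collider limits on the latter can exceed those from direct detection.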

Journal ArticleDOI
TL;DR: In this article, a general method for the construction of the twist operator (satisfying the cocycle and normalization conditions) corresponding to a deformed coalgebra structure is presented, together with the twist for the natural realization (classical basis) of κ-Minkowski space–time.
Abstract: The quantum phase space described by the Heisenberg algebra possesses an undeformed Hopf algebroid structure. The κ-deformed phase space with noncommutative coordinates is realized in terms of the undeformed quantum phase space. There are infinitely many such realizations related by similarity transformations. For a given realization, we construct the corresponding coproducts of commutative coordinates and momenta (bialgebroid structure). The κ-deformed phase space has a twisted Hopf algebroid structure. A general method for the construction of the twist operator (satisfying the cocycle and normalization conditions) corresponding to the deformed coalgebra structure is presented. In particular, the twist for the natural realization (classical basis) of κ-Minkowski space–time is presented. The cocycle condition, the κ-Poincare algebra and the R-matrix are discussed. Twist operators in arbitrary realizations are constructed from the twist in a given realization using similarity transformations. Some examples are presented. The important physical applications of twists, realizations, the R-matrix and the Hopf algebroid structure are discussed.
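For orientation (standard defining relations, not quoted from the paper), the noncommutative coordinates of κ-Minkowski space–time satisfy

$$ [\hat{x}^0, \hat{x}^i] = \frac{i}{\kappa}\,\hat{x}^i, \qquad [\hat{x}^i, \hat{x}^j] = 0, $$

with κ a mass-like deformation parameter; the realizations discussed above express these $\hat{x}^\mu$ in terms of the commutative phase-space variables of the undeformed Heisenberg algebra.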

Journal ArticleDOI
TL;DR: In this article, a self-adjoint dynamical time operator is introduced in Dirac's relativistic formulation of quantum mechanics and shown to satisfy a commutation relation with the Hamiltonian analogous to that of the position and momentum operators.
Abstract: A self-adjoint dynamical time operator is introduced in Dirac's relativistic formulation of quantum mechanics and shown to satisfy a commutation relation with the Hamiltonian analogous to that of the position and momentum operators. The ensuing time-energy uncertainty relation involves the uncertainty in the instant of time when the wave packet passes a particular spatial position and the energy uncertainty associated with the wave packet at the same time, as envisaged originally by Bohr. The instantaneous rate of change of the position expectation value with respect to the simultaneous expectation value of the dynamical time operator is shown to be the phase velocity, in agreement with de Broglie's hypothesis of a particle associated wave whose phase velocity is larger than c. Thus, these two elements of the original basis and interpretation of quantum mechanics are integrated into its formal mathematical structure. Pauli's objection is shown to be resolved or circumvented. Possible relevance to current developments in electron channeling, in interference in time, in Zitterbewegung-like effects in spintronics, graphene and superconducting systems and in cosmology is noted.
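The phase-velocity statement is the familiar de Broglie relation (recalled here for clarity, not taken from the paper): for a relativistic particle moving with velocity v < c,

$$ v_{\rm phase} = \frac{\omega}{k} = \frac{E}{p} = \frac{c^2}{v} > c, \qquad v_{\rm group} = \frac{dE}{dp} = \frac{p c^2}{E} = v, $$

so the superluminal phase velocity of the associated wave is compatible with a subluminal particle (group) velocity.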

Journal ArticleDOI
TL;DR: The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which is called the maximum entropy principle as discussed by the authors.
Abstract: We give evidence for the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_h$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary $v_h$. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

Journal ArticleDOI
TL;DR: In this article, dual-gauge transformations for Abelian 1-form, 2-form and 3-form gauge theories were studied, and the 6D Abelian 3-form theory was shown to be a perfect model for the Hodge theory, realizing the de Rham cohomological operators of differential geometry.
Abstract: Taking the simple examples of an Abelian 1-form gauge theory in two (1+1)-dimensions, a 2-form gauge theory in four (3+1)-dimensions and a 3-form gauge theory in six (5+1)-dimensions of space–time, we establish that such gauge theories respect, in addition to the gauge symmetry transformations that are generated by the first-class constraints of the theory, additional continuous symmetry transformations. We christen the latter symmetry transformations as the dual-gauge transformations. We generalize the above gauge and dual-gauge transformations to obtain the proper (anti-)BRST and (anti-)dual-BRST transformations for the Abelian 3-form gauge theory within the framework of BRST formalism. We concisely mention such symmetries for the 2D free Abelian 1-form and 4D free Abelian 2-form gauge theories and briefly discuss their topological aspects in our present endeavor. We conjecture that any arbitrary Abelian p-form gauge theory would respect the above cited additional symmetry in D = 2p(p = 1, 2, 3, …) dimensions of space–time. By exploiting the above inputs, we establish that the Abelian 3-form gauge theory, in six (5+1)-dimensions of space–time, is a perfect model for the Hodge theory whose discrete and continuous symmetry transformations provide the physical realizations of all aspects of the de Rham cohomological operators of differential geometry. As far as the physical utility of the above nilpotent symmetries is concerned, we demonstrate that the 2D Abelian 1-form gauge theory is a perfect model of a new class of topological theory and 4D Abelian 2-form as well as 6D Abelian 3-form gauge theories are the field theoretic models for the quasi-topological field theory.
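For reference (standard definitions from differential geometry, added only to make the cohomological statement concrete), the de Rham cohomological operators realized by these symmetries are the exterior derivative, the co-exterior derivative and the Laplacian,

$$ d, \qquad \delta = \pm *\, d\, *, \qquad \Delta = (d+\delta)^2 = d\delta + \delta d, \qquad d^2 = 0 = \delta^2, $$

where * denotes the Hodge duality operation; in such field-theoretic models the (anti-)BRST transformations are typically identified with d, the (anti-)dual-BRST transformations with δ, and a bosonic symmetry with Δ.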

Journal ArticleDOI
TL;DR: In this article, the consequences of deforming the canonical commutation relations consistently with the existence of a minimum length and a maximum momentum were analyzed, and a modified version of second quantization was constructed.
Abstract: In this paper, we will analyze the consequences of deforming the canonical commutation relations consistent with the existence of a minimum length and a maximum momentum. We first generalize the deformation of first quantized canonical commutation relation to second quantized canonical commutation relation. Thus, we arrive at a modified version of second quantization. A modified Wheeler–DeWitt equation will be constructed by using this deformed second quantized canonical commutation relation. Finally, we demonstrate that in this modified theory the big bang singularity gets naturally avoided.
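A one-dimensional deformation of the type described above (a form commonly quoted in the literature, up to convention-dependent numerical coefficients, and given here only as an illustration) is

$$ [x, p] = i\hbar\left(1 - \alpha p + 2\alpha^2 p^2\right), $$

which encodes both a minimal measurable length $\Delta x_{\rm min} \sim \hbar\alpha$ and a maximal measurable momentum $p_{\rm max} \sim 1/\alpha$; promoting this deformation to the second-quantized field commutators and then to the Wheeler–DeWitt equation is the strategy summarized above.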

Journal ArticleDOI
TL;DR: In this paper, the authors study moduli masses and sharpen the claim that moduli dominated the pre-BBN universe and conjecture that in any string theory with stabilized moduli there will be at least one modulus field whose mass is of order (or less than) the gravitino mass.
Abstract: In recent years it has been realized that pre-BBN decays of moduli can be a significant source of dark matter production, giving a "nonthermal WIMP miracle" and substantially reduced fine-tuning in cosmological axion physics. We study moduli masses and sharpen the claim that moduli dominated the pre-BBN universe. We conjecture that in any string theory with stabilized moduli there will be at least one modulus field whose mass is of order (or less than) the gravitino mass. Cosmology then generically requires the gravitino mass not be less than about 30 TeV and the cosmological history of the universe is nonthermal prior to BBN. Stable LSPs produced in these decays can account for the observed dark matter if they are "wino-like." We briefly consider implications for the LHC, rare decays, and dark matter direct detection and point out that these results could prove challenging for models attempting to realize gauge mediation in string theory.
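The ~30 TeV figure follows from a standard order-of-magnitude estimate (sketched here for clarity; numerical prefactors are convention dependent). A modulus of mass $m_\phi$ decays through Planck-suppressed operators with width $\Gamma_\phi \sim m_\phi^3/M_p^2$, so its decay reheats the universe to

$$ T_{\rm RH} \sim \sqrt{\Gamma_\phi M_p} \sim \frac{m_\phi^{3/2}}{M_p^{1/2}} \approx 3\ {\rm MeV}\left(\frac{m_\phi}{30\ {\rm TeV}}\right)^{3/2}, $$

and requiring $T_{\rm RH}$ above the few-MeV scale needed for successful BBN pushes $m_\phi$, and hence the comparable gravitino mass, to roughly 30 TeV or above.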

Journal ArticleDOI
TL;DR: In this paper, a consistent quantum treatment of general gauge theories with an arbitrary gauge fixing in the presence of soft breaking of the BRST symmetry in the field-antifield formalism is developed.
Abstract: A consistent quantum treatment of general gauge theories with an arbitrary gauge-fixing in the presence of soft breaking of the BRST symmetry in the field–antifield formalism is developed. It is based on a gauged (involving a field-dependent parameter) version of finite BRST transformations. The prescription allows one to restore the gauge-independence of the effective action at its extremals and therefore also that of the conventional S-matrix for a theory with BRST-breaking terms being additively introduced into a BRST-invariant action in order to achieve a consistency of the functional integral. We demonstrate the applicability of this prescription within the approach of functional renormalization group to the Yang–Mills and gravity theories. The Gribov–Zwanziger action and the refined Gribov–Zwanziger action for a many-parameter family of gauges, including the Coulomb, axial and covariant gauges, are derived perturbatively on the basis of finite gauged BRST transformations starting from Landau gauge. It is proved that gauge theories with soft breaking of BRST symmetry can be made consistent if the transformed BRST-breaking terms satisfy the same soft BRST symmetry breaking condition in the resulting gauge as the untransformed ones in the initial gauge, and also without this requirement.

Journal ArticleDOI
TL;DR: In this paper, the authors review the historical development of particle accelerators used for external beam radiotherapy and discuss the more recent progress towards more capable and cost-effective sources of particles.
Abstract: Recent developments for the delivery of proton and ion beam therapy have been significant, and a number of technological solutions now exist for the creation and utilisation of these particles for the treatment of cancer. In this paper we review the historical development of particle accelerators used for external beam radiotherapy and discuss the more recent progress towards more capable and cost-effective sources of particles.

Journal ArticleDOI
TL;DR: The complexity of radiative neutrino-mass models can be judged by: (i) whether they require the imposition of ad hoc symmetries, (ii) the number of new multiplets they introduce and (iii) the number of arbitrary parameters that appear as discussed by the authors.
Abstract: The complexity of radiative neutrino-mass models can be judged by: (i) whether they require the imposition of ad hoc symmetries, (ii) the number of new multiplets they introduce and (iii) the number of arbitrary parameters that appear. Considering models that do not employ new symmetries, the simplest models have two new multiplets and a minimal number of new parameters. With this in mind, we search for the simplest models of radiative neutrino mass. We are led to two models, containing a real scalar triplet and a charged scalar doublet (respectively), in addition to the charged singlet scalar considered by Zee [h+~(1, 1, 2)]. These models are essentially simplified versions of the Zee model and appear to be the simplest models of radiative neutrino mass. However, despite successfully generating nonzero masses, present-day data is sufficient to rule these simple models out. The Zee and Zee–Babu models therefore remain as the simplest viable models. Moving beyond the minimal cases, we find a new model of two-loop masses that employs the charged doublet Φ~(1, 2, 3) and the doubly-charged scalar k++~(1, 1, 4). This is the sole remaining model that employs only three new noncolored multiplets.