Author

Roger Penrose

Bio: Roger Penrose is an academic researcher from the University of Oxford. The author has contributed to research on topics including General relativity and Quantum gravity. The author has an h-index of 78 and has co-authored 201 publications receiving 39,379 citations. Previous affiliations of Roger Penrose include University College London and King's College London.


Papers
Posted Content
TL;DR: In this paper, the authors present strong observational evidence of numerous previously unobserved anomalous circular spots, of significantly raised temperature, in the CMB sky, having angular radii between 0.03 and 0.04 radians.
Abstract: This paper presents strong observational evidence of numerous previously unobserved anomalous circular spots, of significantly raised temperature, in the CMB sky. The spots have angular radii between 0.03 and 0.04 radians (i.e. angular diameters between about 3 and 4 degrees). There is a clear cut-off at that size, indicating that each anomalous spot would have originated from a highly energetic point-like source, located at the end of inflation -- or else point-like at the conformally expanded Big Bang, if it is considered that there was no inflationary phase. The significant presence of these anomalous spots was initially noticed in the Planck 70 GHz satellite data by comparison with 1000 standard simulations, and then confirmed by extending the comparison to 10000 simulations. Such anomalous points were then found at precisely the same locations in the WMAP data, their significance confirmed by comparison with 1000 WMAP simulations. Planck and WMAP have very different noise properties, and it seems exceedingly unlikely that the observed presence of anomalous points in the same directions on both maps could come entirely from noise. Subsequently, further confirmation was found in the Planck data by comparison with 1000 FFP8.1 MC simulations (with $l \leq 1500$). The existence of such anomalous regions, resulting from point-like sources at the conformally stretched-out Big Bang, is a predicted consequence of conformal cyclic cosmology (CCC), these sources being the Hawking points of the theory, resulting from the Hawking radiation from supermassive black holes in a cosmic aeon prior to our own.
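The significance claim rests on comparing candidate spots against large ensembles of simulated skies. The snippet below is a minimal illustrative sketch of that kind of empirical significance test, assuming synthetic Gaussian data in place of the real Planck/WMAP maps; the spot statistic, noise level, and data are placeholder assumptions, not the authors' actual pipeline.

```python
import numpy as np

# Minimal illustrative sketch of an empirical significance test against simulations
# (synthetic Gaussian data; not the authors' actual Planck/WMAP pipeline).

rng = np.random.default_rng(0)

def spot_statistic(pixel_temps):
    """Mean temperature over the pixels of a candidate circular spot (radius ~0.03-0.04 rad)."""
    return float(np.mean(pixel_temps))

# Hypothetical data: one observed spot and 10000 simulated spots at the same sky location.
n_pix = 500
observed_spot = rng.normal(loc=30e-6, scale=100e-6, size=n_pix)            # raised-temperature spot
simulated_spots = rng.normal(loc=0.0, scale=100e-6, size=(10000, n_pix))   # standard simulations

obs = spot_statistic(observed_spot)
sims = simulated_spots.mean(axis=1)

# Empirical p-value: fraction of simulations at least as extreme as the observation.
p_value = (np.sum(sims >= obs) + 1) / (len(sims) + 1)
print(f"observed spot mean = {obs:.2e} K, empirical p-value = {p_value:.4f}")
```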

16 citations

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the impact of the de Sitter effect (geodetic precession) on the error budget of the LARES 2 frame-dragging experiment and show that its uncertainty has a negligible impact on the final error budget.
Abstract: In two previous papers we presented the LARES 2 space experiment aimed at a very accurate test of frame-dragging and at other tests of fundamental physics and measurements of space geodesy and geodynamics. We presented the error sources of the LARES 2 experiment, its error budget, and Monte Carlo simulations and covariance analyses confirming an accuracy of a few parts in one thousand in the test of frame-dragging. Here we discuss the impact of the orbital perturbation known as the de Sitter effect, or geodetic precession, on the error budget of the LARES 2 frame-dragging experiment. We show that the uncertainty in the de Sitter effect has a negligible impact on the final error budget because of the very accurate results now available for the test of the de Sitter precession and because of its very nature. The total error budget in the LARES 2 test of frame-dragging remains at a level of the order of $0.2\%$, as determined in the first two papers of this series.
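For orientation, the de Sitter (geodetic) precession of a near-Earth satellite's orbit, treated as a gyroscope carried around the Sun with the Earth, is about 19.2 milliarcseconds per year; the short sketch below evaluates the standard formula $\Omega_{dS} = \frac{3}{2}\,(GM_\odot)^{3/2}/(c^2 a^{5/2})$ for a circular orbit at 1 au. The formula and constants are textbook values quoted for illustration, not numbers taken from the LARES 2 papers themselves.

```python
import math

# Geodetic (de Sitter) precession rate for a gyroscope on a circular orbit of
# radius a around a mass M:  Omega = (3/2) * (G*M)**1.5 / (c**2 * a**2.5).
# Evaluated for the Earth-satellite system orbiting the Sun at 1 au; textbook
# constants, not values taken from the LARES 2 error-budget papers.

GM_SUN = 1.32712440018e20   # m^3 / s^2
C = 299_792_458.0           # m / s
AU = 1.495978707e11         # m
YEAR = 3.15576e7            # s (Julian year)
RAD_TO_MAS = 180.0 / math.pi * 3600.0 * 1000.0

omega = 1.5 * GM_SUN**1.5 / (C**2 * AU**2.5)      # rad / s
print(f"de Sitter precession ~ {omega * YEAR * RAD_TO_MAS:.1f} mas/yr")  # ~19.2 mas/yr
```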

13 citations

01 Jan 2006
TL;DR: The authors argue that the Second Law of thermodynamics implies the initial state of the universe must have been extraordinarily special, and that neither "random" initial conditions, the smallness of the early universe, nor the anthropic principle comes close to explaining this specialness.
Abstract: Proposals for describing the initial state of the universe hardly ever address a certain fundamental conundrum [1] — yet this is a conundrum whose significance is, in a certain sense, obvious. The issue arises from one of the most fundamental principles of physics: the Second Law of thermodynamics. According to the Second Law, roughly speaking, the entropy of the universe increases with time, where the term “entropy” refers to an appropriate measure of disorder or lack of “specialness” of the state of the universe. Since the entropy increases in the future direction of time, it must decrease in the past time-direction. Accordingly, the initial state of the universe must be the most special of all, so any proposal for the actual nature of this initial state must account for its extreme specialness. Proposals have been put forward from time to time (such as in various forms of “inflationary cosmology” and the previously popular “chaotic cosmology”) in which it is suggested that the initial state of the universe ought to have been in some sense “random”, and various physical processes are invoked in order to provide mechanisms whereby the universe might be driven into the special state in which it appears actually to have been, at slightly later stages. But “random” means “non-special” in the extreme; hence the conundrum just referred to. Sometimes theorists have tried to find an explanation via the fact that the early universe was very “small”, this smallness perhaps allowing only a tiny number of alternative initial states, or perhaps they try to take refuge in the anthropic principle, which would be a selection principle in favour of certain special initial states that allow the eventual evolution of intelligent life. Neither of these suggested explanations gets close to resolving the issue, however. It may be seen that, with time-symmetrical dynamical laws, the mere smallness of the early universe does not provide a restriction on its degrees of freedom. For we may contemplate a universe model in the final stages of collapse. It must do something, in accordance with its dynamical laws, and we expect it to collapse to some sort of complicated space-time singularity, a singularity encompassing as many degrees of freedom as were already present in its earlier nonsingular collapsing phase. Time-reversing this situation, we see that an initial singular state could also contain as many degrees of freedom as such a collapsing one. But in our actual universe, almost all of those degrees of freedom were somehow not activated. What about the anthropic principle? Again, this is virtually no help to us whatever in resolving our conundrum. It is normally assumed that life had to arise via complicated evolutionary processes, and these processes required particular conditions, and particular physical laws, including the Second Law. The Second Law was certainly a crucial part of evolution, in the way that our particular form of life actually came about. But the very action of this Second Law tells us that however special the universe may be now, with life existing in it now, it must have been far more special at an earlier stage in which life was not present. From the purely anthropic point of view, this earlier far more special phase was not needed; it would have been much more likely that our present “improbable” stage came about simply by chance, rather than coming about via an earlier even more improbable stage.
When the Second Law is a crucial component, there is always a far more probable set of initial conditions that would lead to this same state of affairs, namely one in which the Second Law was violated prior to the situation now! As another aspect of this same issue, we may think of the vastness of our actual universe, most of which had no actual bearing on our existence. Though very special initial conditions were indeed required for our existence in our particular spatial location, we did not actually need these same special conditions at distant places in the universe. Yet as we look out at the universe, we see the same kind of conditions, acting according to the same Second Law of thermodynamics, no matter how far out we look. If we take the view that the Second Law was introduced in our vicinity merely for our own benefit, then we are left with no explanation for the extravagance of this same Second Law having to be invoked uniformly throughout the universe, as it appears to be as far as our powerful instruments are able to probe.

13 citations


Cited by
Journal ArticleDOI
TL;DR: In this article, it is shown that quantum mechanical effects cause black holes to create and emit particles as if they were hot bodies with a temperature inversely proportional to their mass, leading to a slow decrease in the mass of the black hole and to its eventual disappearance.
Abstract: In the classical theory black holes can only absorb and not emit particles. However it is shown that quantum mechanical effects cause black holes to create and emit particles as if they were hot bodies with temperature $\frac{\hbar\kappa}{2\pi k} \approx 10^{-6}\left(\frac{M_\odot}{M}\right)\,^\circ\mathrm{K}$, where $\kappa$ is the surface gravity of the black hole. This thermal emission leads to a slow decrease in the mass of the black hole and to its eventual disappearance: any primordial black hole of mass less than about $10^{15}$ g would have evaporated by now. Although these quantum effects violate the classical law that the area of the event horizon of a black hole cannot decrease, there remains a Generalized Second Law: $S + \frac{1}{4}A$ never decreases, where $S$ is the entropy of matter outside black holes and $A$ is the sum of the surface areas of the event horizons. This shows that gravitational collapse converts the baryons and leptons in the collapsing body into entropy. It is tempting to speculate that this might be the reason why the Universe contains so much entropy per baryon.
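As a numerical check of the quoted scaling, the exact Hawking temperature $T = \hbar c^3/(8\pi G M k_B)$ comes out at a few times $10^{-8}$ K for a solar-mass black hole, of the same rough order as the $10^{-6}(M_\odot/M)$ K figure quoted in the abstract. The sketch below uses standard SI constants and is an illustration, not material from the paper itself.

```python
import math

# Illustrative sketch (standard textbook formula, not from the paper itself):
# Hawking temperature of a Schwarzschild black hole, T = hbar * c**3 / (8*pi*G*M*k_B),
# which scales as 1/M, matching the (M_sun / M) dependence quoted in the abstract.

HBAR = 1.054571817e-34   # J s
C = 299_792_458.0        # m / s
G = 6.67430e-11          # m^3 / (kg s^2)
K_B = 1.380649e-23       # J / K
M_SUN = 1.98892e30       # kg

def hawking_temperature(mass_kg: float) -> float:
    """Hawking temperature in kelvin for a black hole of the given mass."""
    return HBAR * C**3 / (8.0 * math.pi * G * mass_kg * K_B)

print(f"T(M_sun)       ~ {hawking_temperature(M_SUN):.1e} K")   # ~6e-8 K
print(f"T(1e12 kg PBH) ~ {hawking_temperature(1e12):.1e} K")    # ~1e11 K, hot enough to evaporate
```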

10,923 citations

Journal ArticleDOI
TL;DR: The authors review the very fast theoretical and experimental progress in quantum cryptography, arguably the first application of quantum mechanics at the level of individual quanta, with emphasis on open questions and technological issues.
Abstract: Quantum cryptography could well be the first application of quantum mechanics at the individual quanta level. The very fast progress in both theory and experiments over the recent years is reviewed, with emphasis on open questions and technological issues.
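As a concrete illustration of what "quantum cryptography at the individual quanta level" involves, the sketch below simulates the sifting step of a BB84-style key exchange (random bits and bases, keeping only positions where the bases agree). This is a generic textbook protocol written for illustration; it is not code or notation from the review itself.

```python
import numpy as np

# Illustrative BB84-style sifting (textbook protocol, not from the review itself):
# Alice encodes random bits in random bases, Bob measures in random bases, and the
# two keep only the positions where their bases agree (the "sifted key").

rng = np.random.default_rng(42)
n = 1000

alice_bits  = rng.integers(0, 2, n)   # raw key bits
alice_bases = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal
bob_bases   = rng.integers(0, 2, n)

# Ideal channel, no eavesdropper: Bob's result equals Alice's bit when bases match,
# and is random otherwise.
same_basis = alice_bases == bob_bases
bob_bits = np.where(same_basis, alice_bits, rng.integers(0, 2, n))

sifted_alice = alice_bits[same_basis]
sifted_bob   = bob_bits[same_basis]
qber = np.mean(sifted_alice != sifted_bob)   # ~0 here; eavesdropping would raise it

print(f"sifted key length: {len(sifted_alice)} of {n}, QBER = {qber:.3f}")
```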

6,949 citations

Journal ArticleDOI
TL;DR: In this paper, the concept of black-hole entropy was introduced as a measure of information about a black hole interior which is inaccessible to an exterior observer, and it was shown that the entropy is equal to the ratio of the black hole area to the square of the Planck length times a dimensionless constant of order unity.
Abstract: There are a number of similarities between black-hole physics and thermodynamics. Most striking is the similarity in the behaviors of black-hole area and of entropy: Both quantities tend to increase irreversibly. In this paper we make this similarity the basis of a thermodynamic approach to black-hole physics. After a brief review of the elements of the theory of information, we discuss black-hole physics from the point of view of information theory. We show that it is natural to introduce the concept of black-hole entropy as the measure of information about a black-hole interior which is inaccessible to an exterior observer. Considerations of simplicity and consistency, and dimensional arguments indicate that the black-hole entropy is equal to the ratio of the black-hole area to the square of the Planck length times a dimensionless constant of order unity. A different approach making use of the specific properties of Kerr black holes and of concepts from information theory leads to the same conclusion, and suggests a definite value for the constant. The physical content of the concept of black-hole entropy derives from the following generalized version of the second law: When common entropy goes down a black hole, the common entropy in the black-hole exterior plus the black-hole entropy never decreases. The validity of this version of the second law is supported by an argument from information theory as well as by several examples.
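In the now-standard form, with the dimensionless constant fixed at 1/4 by Hawking's later result, the black-hole entropy reads $S = k_B A / (4\,l_P^2)$ with $l_P^2 = G\hbar/c^3$. The sketch below evaluates this for a solar-mass Schwarzschild black hole, giving roughly $10^{77}\,k_B$; the coefficient and the numbers are standard results quoted for orientation, not content of Bekenstein's paper itself.

```python
import math

# Illustrative sketch: Bekenstein-Hawking entropy S = k_B * A / (4 * l_P^2) for a
# Schwarzschild black hole, with the 1/4 coefficient fixed by Hawking's later result.
# Standard constants for orientation only; not numbers from Bekenstein's paper.

C = 299_792_458.0        # m / s
G = 6.67430e-11          # m^3 / (kg s^2)
HBAR = 1.054571817e-34   # J s
M_SUN = 1.98892e30       # kg

def bh_entropy_in_kB(mass_kg: float) -> float:
    """Dimensionless entropy S / k_B of a Schwarzschild black hole of the given mass."""
    r_s = 2.0 * G * mass_kg / C**2            # Schwarzschild radius
    area = 4.0 * math.pi * r_s**2             # horizon area
    l_p2 = G * HBAR / C**3                    # Planck length squared
    return area / (4.0 * l_p2)

print(f"S(M_sun) / k_B ~ {bh_entropy_in_kB(M_SUN):.2e}")   # ~1e77
```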

6,591 citations

Proceedings ArticleDOI
Lov K. Grover
01 Jul 1996
TL;DR: In this paper, it was shown that a quantum mechanical computer can find a marked item in an unsorted database of N elements in only $O(\sqrt{N})$ steps, whereas a classical computer requires on the order of $N$ examinations.
Abstract: Quantum mechanical computers were proposed in the early 1980's [Benioff80] and shown to be at least as powerful as classical computers, an important but not surprising result, since classical computers, at the deepest level, ultimately follow the laws of quantum mechanics. The description of quantum mechanical computers was formalized in the late 80's and early 90's [Deutsch85][BB92][BV93][Yao93] and they were shown to be more powerful than classical computers on various specialized problems. In early 1994, [Shor94] demonstrated that a quantum mechanical computer could efficiently solve a well-known problem for which there was no known efficient algorithm using classical computers. This is the problem of integer factorization, i.e. finding the prime factors of a given integer N, in a time which is a finite power of $O(\log N)$.
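In contrast with the factoring problem mentioned above, this paper's own contribution is a quantum search algorithm that finds a marked item among N with about $\frac{\pi}{4}\sqrt{N}$ oracle queries. The snippet below is a small statevector simulation of that amplitude-amplification idea, written for illustration; it is not the paper's notation or code.

```python
import numpy as np

# Illustrative statevector simulation of Grover-style amplitude amplification
# (for exposition only; not code or notation from the paper itself).

n_qubits = 6
N = 2 ** n_qubits
marked = 42                                    # hypothetical index of the marked item

state = np.full(N, 1.0 / np.sqrt(N))           # uniform superposition over N items

def oracle(psi):
    """Flip the sign of the amplitude on the marked item."""
    out = psi.copy()
    out[marked] *= -1.0
    return out

def diffusion(psi):
    """Inversion about the mean amplitude."""
    return 2.0 * psi.mean() - psi

iterations = int(round(np.pi / 4.0 * np.sqrt(N)))   # ~ (pi/4) * sqrt(N) queries
for _ in range(iterations):
    state = diffusion(oracle(state))

print(f"after {iterations} iterations, P(marked) = {abs(state[marked])**2:.3f}")  # close to 1
```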

6,335 citations

Journal ArticleDOI
TL;DR: Recognition-by-components (RBC) provides a principled account of the heretofore undecided relation between the classic principles of perceptual organization and pattern recognition.
Abstract: The perceptual recognition of objects is conceptualized to be a process in which the image of the input is segmented at regions of deep concavity into an arrangement of simple geometric components, such as blocks, cylinders, wedges, and cones. The fundamental assumption of the proposed theory, recognition-by-components (RBC), is that a modest set of generalized-cone components, called geons (N ≤ 36), can be derived from contrasts of five readily detectable properties of edges in a two-dimensional image: curvature, collinearity, symmetry, parallelism, and cotermination. The detection of these properties is generally invariant over viewing position and image quality and consequently allows robust object perception when the image is projected from a novel viewpoint or is degraded. RBC thus provides a principled account of the heretofore undecided relation between the classic principles of perceptual organization and pattern recognition: The constraints toward regularization (Prägnanz) characterize not the complete object but the object's components. Representational power derives from an allowance of free combinations of the geons. A Principle of Componential Recovery can account for the major phenomena of object recognition: If an arrangement of two or three geons can be recovered from the input, objects can be quickly recognized even when they are occluded, novel, rotated in depth, or extensively degraded. The results from experiments on the perception of briefly presented pictures by human observers provide empirical support for the theory. Any single object can project an infinity of image configurations to the retina. The orientation of the object to the viewer can vary continuously, each giving rise to a different two-dimensional projection. The object can be occluded by other objects or texture fields, as when viewed behind foliage. The object need not be presented as a full-colored textured image but instead can be a simplified line drawing. Moreover, the object can even be missing some of its parts or be a novel exemplar of its particular category. But it is only with rare exceptions that an image fails to be rapidly and readily classified, either as an instance of a familiar object category or as an instance that cannot be so classified (itself a form of classification).

5,464 citations