Institution

Johannes Kepler University of Linz

Education · Linz, Oberösterreich, Austria

About: Johannes Kepler University of Linz is an education organization based in Linz, Oberösterreich, Austria. It is known for research contributions in the topics of Computer science and Thin film. The organization has 6605 authors who have published 19243 publications receiving 385667 citations.


Papers
Journal Article
TL;DR: The C^s-smoothness of isogeometric functions is found to be equivalent to geometric smoothness of the same order of their graph surfaces, which motivates calling them C^s-smooth geometrically continuous isogeometric functions.
Abstract: We study the linear space of C^s-smooth isogeometric functions defined on a multi-patch domain Ω ⊂ R^2. We show that the construction of these functions is closely related to the concept of geometric continuity of surfaces, which originated in geometric design. More precisely, the C^s-smoothness of isogeometric functions is found to be equivalent to geometric smoothness of the same order (G^s-smoothness) of their graph surfaces. This motivates us to call them C^s-smooth geometrically continuous isogeometric functions. We present a general framework to construct a basis and explore potential applications in isogeometric analysis. The space of C^1-smooth geometrically continuous isogeometric functions on bilinearly parameterized two-patch domains is analyzed in more detail. Numerical experiments with bicubic and biquartic functions for performing L^2 approximation and for solving Poisson's equation and the biharmonic equation on two-patch geometries are presented and indicate optimal rates of convergence.

87 citations
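The central equivalence above can be made concrete with the standard form of the G^1 (tangent-plane) continuity condition from geometric design; the coefficient functions α and β below are generic notation, not necessarily the paper's exact symbols:

```latex
% Two patch parameterizations F^{(1)}, F^{(2)} meeting along a
% common interface (here u = 1 on patch 1, u = 0 on patch 2) are
% G^1-continuous if they join with matching tangent planes:
F^{(1)}(1,v) = F^{(2)}(0,v), \qquad
\partial_u F^{(2)}(0,v)
  = \alpha(v)\,\partial_u F^{(1)}(1,v)
  + \beta(v)\,\partial_v F^{(1)}(1,v),
\quad \alpha(v) > 0.
```

An isogeometric function is then C^1-smooth across the interface exactly when its graph surface, built over the two patches, satisfies a condition of this type.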

Proceedings Article
15 Feb 2014
TL;DR: A new, practical algorithm that performs control flow sensitive Partial Escape Analysis in a dynamic Java compiler that allows Escape Analysis, Scalar Replacement and Lock Elision to be performed on individual branches is presented.
Abstract: Escape Analysis allows a compiler to determine whether an object is accessible outside the allocating method or thread. This information is used to perform optimizations such as Scalar Replacement, Stack Allocation and Lock Elision, allowing modern dynamic compilers to remove some of the abstractions introduced by advanced programming models. The all-or-nothing approach taken by most Escape Analysis algorithms prevents all these optimizations as soon as there is one branch where the object escapes, no matter how unlikely this branch is at runtime. This paper presents a new, practical algorithm that performs control flow sensitive Partial Escape Analysis in a dynamic Java compiler. It allows Escape Analysis, Scalar Replacement and Lock Elision to be performed on individual branches. We implemented the algorithm on top of Graal, an open-source Java just-in-time compiler, and it performs well on a diverse set of benchmarks. In this paper, we evaluate the effect of Partial Escape Analysis on the DaCapo, ScalaDaCapo and SpecJBB2005 benchmarks, in terms of run-time, number and size of allocations and number of monitor operations. It performs particularly well in situations with additional levels of abstraction, such as code generated by the Scala compiler. It reduces the amount of allocated memory by up to 58.5%, and improves performance by up to 33%.

87 citations

Journal Article
TL;DR: In this article, the authors study the performance of classical and quantum machine learning (ML) models in predicting outcomes of physical experiments and show that for any input distribution D(x), a classical ML model can provide accurate predictions on average by accessing E a number of times comparable to the optimal quantum ML model.
Abstract: We study the performance of classical and quantum machine learning (ML) models in predicting outcomes of physical experiments. The experiments depend on an input parameter x and involve execution of a (possibly unknown) quantum process E. Our figure of merit is the number of runs of E required to achieve a desired prediction performance. We consider classical ML models that perform a measurement and record the classical outcome after each run of E, and quantum ML models that can access E coherently to acquire quantum data; the classical or quantum data are then used to predict the outcomes of future experiments. We prove that for any input distribution D(x), a classical ML model can provide accurate predictions on average by accessing E a number of times comparable to the optimal quantum ML model. In contrast, for achieving accurate predictions on all inputs, we prove that an exponential quantum advantage is possible. For example, to predict the expectations of all Pauli observables in an n-qubit system ρ, classical ML models require 2^{Ω(n)} copies of ρ, but we present a quantum ML model using only O(n) copies. Our results clarify where a quantum advantage is possible and highlight the potential for classical ML models to address challenging quantum problems in physics and chemistry.

87 citations
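The "classical ML" access model in this abstract records one classical outcome per run of the process. A minimal single-qubit toy of that model (function names and the Bloch-vector parameterization are illustrative choices, not the paper's construction) estimates a Pauli-Z expectation value from repeated single-copy measurements:

```python
import random

def measure_pauli_z(bloch_z, rng):
    """Simulate one projective Z measurement on a single qubit whose
    Bloch-vector z component is bloch_z: the outcome is +1 with
    probability (1 + bloch_z) / 2, and -1 otherwise."""
    return 1 if rng.random() < (1 + bloch_z) / 2 else -1

def estimate_expectation(bloch_z, runs, seed=0):
    """Classical access model: one run of the process per recorded
    outcome; the estimate is the empirical mean of the outcomes."""
    rng = random.Random(seed)
    outcomes = [measure_pauli_z(bloch_z, rng) for _ in range(runs)]
    return sum(outcomes) / runs
```

The statistical error shrinks like 1/sqrt(runs) for a single fixed observable; the paper's lower bound concerns the much harder task of predicting all 4^n Pauli observables of an n-qubit state at once.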

Journal Article
02 Mar 2021
TL;DR: This work compares six different GNN-based generative models in GraphINVENT, and shows that ultimately the gated-graph neural network performs best against the metrics considered here.
Abstract: Deep learning methods applied to chemistry can be used to accelerate the discovery of new molecules. This work introduces GraphINVENT, a platform developed for graph-based molecular design using graph neural networks (GNNs). GraphINVENT uses a tiered deep neural network architecture to probabilistically generate new molecules a single bond at a time. All models implemented in GraphINVENT can quickly learn to build molecules resembling the training set molecules without any explicit programming of chemical rules. The models have been benchmarked using the MOSES distribution-based metrics, showing how GraphINVENT models compare well with state-of-the-art generative models. This work is one of the first thorough graph-based molecular design studies, and illustrates how GNN-based models are promising tools for molecular discovery.

87 citations
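GraphINVENT's generation loop builds a molecular graph one action at a time, where a learned GNN policy proposes the next action. The toy builder below (action names and the fixed action sequence are invented for illustration; the real system samples actions probabilistically) shows the bond-at-a-time construction idea:

```python
def build_graph(actions):
    """Replay a sequence of graph-building actions:
    ("add_node", label)  - add an atom with the given type,
    ("add_edge", i, j)   - add a single bond between atoms i and j,
    ("stop",)            - terminate generation."""
    nodes, edges = [], []
    for action in actions:
        if action[0] == "add_node":
            nodes.append(action[1])
        elif action[0] == "add_edge":
            _, i, j = action
            edges.append((i, j))
        elif action[0] == "stop":
            break
    return nodes, edges

# A C-C-O fragment grown one bond at a time:
nodes, edges = build_graph([
    ("add_node", "C"), ("add_node", "C"), ("add_edge", 0, 1),
    ("add_node", "O"), ("add_edge", 1, 2), ("stop",),
])
```

In GraphINVENT the policy over these actions is produced by a tiered GNN conditioned on the partial graph, which is what lets the model learn chemistry from data rather than from explicit rules.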

Proceedings Article
15 Sep 2010
TL;DR: This work modified an existing high-performance virtual machine to allow arbitrary changes to the definition of loaded classes, including adding and deleting fields and methods, and also allows any kind of change to the class and interface hierarchy.
Abstract: Dynamic code evolution is a technique to update a program while it is running. In an object-oriented language such as Java, this can be seen as replacing a set of classes with new versions. We modified an existing high-performance virtual machine to allow arbitrary changes to the definition of loaded classes. Besides adding and deleting fields and methods, we also allow any kind of change to the class and interface hierarchy. Our approach focuses on increasing developer productivity during debugging. Changes can be applied at any point at which a Java program can be suspended. The evaluation section shows that our modifications to the virtual machine have no negative performance impact on normal program execution. The fast in-place instance update algorithm ensures that the performance characteristics of a change are comparable with performing a full garbage collection run. Standard Java development environments are capable of using the code evolution features of our modified virtual machine, so no additional tools are required.

86 citations
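The in-place instance update described above has a rough analogue in dynamic languages. The Python sketch below (class and helper names invented; this mirrors only the user-visible effect, not the VM's algorithm) retargets an existing instance to a new class version and back-fills a newly added field with a default value:

```python
class PointV1:
    """Old class version: two fields, no methods."""
    def __init__(self, x, y):
        self.x, self.y = x, y

class PointV2:
    """New class version: adds a field and a method."""
    def __init__(self, x, y, z=0):
        self.x, self.y, self.z = x, y, z
    def norm_sq(self):
        return self.x ** 2 + self.y ** 2 + self.z ** 2

def evolve(obj, new_cls, defaults):
    """In-place 'instance update': retarget the class pointer and
    initialize fields the old version did not have."""
    obj.__class__ = new_cls
    for name, value in defaults.items():
        if not hasattr(obj, name):
            setattr(obj, name, value)
    return obj
```

Existing field values survive the update while new members become available immediately, which is the behavior the paper's VM provides for running Java programs.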


Authors

Showing all 6718 results

Name | H-index | Papers | Citations
Wolfgang Wagner | 156 | 2,342 | 123,391
A. Paul Alivisatos | 146 | 470 | 101,741
Klaus-Robert Müller | 129 | 764 | 79,391
Christoph J. Brabec | 120 | 896 | 68,188
Andreas Heinz | 108 | 1,078 | 45,002
Niyazi Serdar Sariciftci | 99 | 591 | 54,055
Lars Samuelson | 96 | 850 | 36,931
Peter J. Oefner | 90 | 348 | 30,729
Dmitri V. Talapin | 90 | 303 | 39,572
Tomás Torres | 88 | 625 | 28,223
Ramesh Raskar | 86 | 670 | 30,675
Siegfried Bauer | 84 | 422 | 26,759
Alexander Eychmüller | 82 | 444 | 23,688
Friedrich Schneider | 82 | 554 | 27,383
Maksym V. Kovalenko | 81 | 360 | 34,805
Network Information
Related Institutions (5)
Karlsruhe Institute of Technology: 82.1K papers, 2.1M citations (92% related)
École Polytechnique Fédérale de Lausanne: 98.2K papers, 4.3M citations (91% related)
RWTH Aachen University: 96.2K papers, 2.5M citations (91% related)
Georgia Institute of Technology: 119K papers, 4.6M citations (91% related)
Nanyang Technological University: 112.8K papers, 3.2M citations (91% related)

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2024 | 2
2023 | 54
2022 | 187
2021 | 1,404
2020 | 1,412
2019 | 1,365