Institution

Central European University

Education · Vienna, Austria
About: Central European University is an education organization based in Vienna, Austria. It is known for its research contributions in the topics of Politics and European Union. The organization has 1358 authors who have published 4186 publications receiving 85246 citations. The organization is also known as: CEU & Közép-Európai Egyetem.


Papers
Proceedings ArticleDOI
01 Oct 2020
TL;DR: Transformers is an open-source library that consists of carefully engineered state-of-the-art Transformer architectures under a unified API and a curated collection of pretrained models made by and available for the community.
Abstract: Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
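As a quick illustration of the unified API described above, the following minimal sketch uses the library's pipeline interface; it assumes the transformers package is installed and that a default pretrained model can be downloaded on first use (the input sentence is invented for the example).

# Minimal sketch of the Transformers pipeline API.
# Assumes: pip install transformers, plus network access to fetch a
# default pretrained sentiment model on the first call.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # loads a pretrained model under the unified API
result = classifier("Transformers makes state-of-the-art NLP accessible.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': ...}]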

4,798 citations

Journal ArticleDOI
TL;DR: A cross-cultural study of behavior in ultimatum, public goods, and dictator games in a range of small-scale societies exhibiting a wide variety of economic and cultural conditions found that the canonical model, based on self-interest, fails in all of the societies studied.
Abstract: Researchers from across the social sciences have found consistent deviations from the predictions of the canonical model of self-interest in hundreds of experiments from around the world. This research, however, cannot determine whether the uniformity results from universal patterns of human behavior or from the limited cultural variation available among the university students used in virtually all prior experimental work. To address this, we undertook a cross-cultural study of behavior in ultimatum, public goods, and dictator games in a range of small-scale societies exhibiting a wide variety of economic and cultural conditions. We found, first, that the canonical model, based on self-interest, fails in all of the societies studied. Second, our data reveal substantially more behavioral variability across social groups than has been found in previous research. Third, group-level differences in economic organization and the structure of social interactions explain a substantial portion of the behavioral variation across societies: the higher the degree of market integration and the higher the payoffs to cooperation in everyday life, the greater the level of prosociality expressed in experimental games. Fourth, the available individual-level economic and demographic variables do not consistently explain game behavior, either within or across groups. Fifth, in many cases experimental play appears to reflect the common interactional patterns of everyday life.

1,589 citations

Posted Content
09 Oct 2019
TL;DR: Transformers is an open-source library that consists of carefully engineered state-of-the-art Transformer architectures under a unified API and a curated collection of pretrained models made by and available for the community.
Abstract: Recent advances in modern Natural Language Processing (NLP) research have been dominated by the combination of Transfer Learning methods with large-scale Transformer language models. With them came a paradigm shift in NLP, with the starting point for training a model on a downstream task moving from a blank task-specific model to a general-purpose pretrained architecture. Still, creating these general-purpose models remains an expensive and time-consuming process, restricting the use of these methods to a small subset of the wider NLP community. In this paper, we present Transformers, a library for state-of-the-art NLP, making these developments available to the community by gathering state-of-the-art general-purpose pretrained models under a unified API together with an ecosystem of libraries, examples, tutorials and scripts targeting many downstream NLP tasks. Transformers features carefully crafted model implementations and high-performance pretrained weights for two main deep learning frameworks, PyTorch and TensorFlow, while supporting all the necessary tools to analyze, evaluate and use these models in downstream tasks such as text/token classification, question answering and language generation, among others. Transformers has gained significant organic traction and adoption among both the researcher and practitioner communities. At Hugging Face, we are committed to pursuing the development of Transformers with the ambition of creating the standard library for building NLP systems.
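The abstract's point about a single API across checkpoints and frameworks can be sketched with the library's Auto* classes; the checkpoint name below is one public model chosen purely for illustration, and the snippet assumes the transformers package and PyTorch are installed.

# Sketch of the unified Auto* loading API (PyTorch backend).
# The checkpoint name is an illustrative public model, not one
# prescribed by the paper; any compatible checkpoint loads the same way.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(name)   # same call works for any checkpoint
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("A unified API for pretrained models.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(dim=-1).item()])  # e.g. POSITIVE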

1,261 citations

Journal ArticleDOI
TL;DR: The results show that, from birth, human infants prefer to look at faces that engage them in mutual gaze and that, at an early age, healthy babies show enhanced neural processing of direct gaze.
Abstract: Making eye contact is the most powerful mode of establishing a communicative link between humans. During their first year of life, infants learn rapidly that the looking behaviors of others convey significant information. Two experiments were carried out to demonstrate special sensitivity to direct eye contact from birth. The first experiment tested the ability of 2- to 5-day-old newborns to discriminate between direct and averted gaze. In the second experiment, we measured 4-month-old infants' brain electric activity to assess neural processing of faces when accompanied by direct (as opposed to averted) eye gaze. The results show that, from birth, human infants prefer to look at faces that engage them in mutual gaze and that, from an early age, healthy babies show enhanced neural processing of direct gaze. The exceptionally early sensitivity to mutual gaze demonstrated in these studies is arguably the major foundation for the later development of social skills.

1,199 citations

Journal ArticleDOI
TL;DR: A list from A to Z of twenty-six proposals regarding what "good" QCA-based research entails, both with regard to QCA as a research approach and as an analytical technique, is presented.
Abstract: As a relatively new methodological tool, QCA is still a work in progress. Standards of good practice are needed in order to enhance the quality of its applications. We present a list from A to Z of twenty-six proposals regarding what "good" QCA-based research entails, both with regard to QCA as a research approach and as an analytical technique. Our suggestions are subdivided into three categories: criteria referring to the research stages before, during, and after the analytical moment of data analysis. This listing can be read as a guideline for authors, reviewers, and readers of QCA.

975 citations


Authors

Showing all 1416 results

Name                     H-index   Papers   Citations
Albert-László Barabási   152       438      200119
Dan Sperber              67        207      32068
Gergely Csibra           67        172      16635
Herbert Gintis           66        269      35339
János Kertész            64        369      19276
Rosario N. Mantegna      62        268      20543
Saul Estrin              58        359      16448
Philippe C. Schmitter    51        167      17240
Günther Knoblich         49        156      10789
Robert J. Willis         49        125      14068
János Kornai             45        203      13830
Philip N. Howard         44        129      8566
Milos R. Popovic         44        312      7458
Ernest Gellner           42        166      11173
David Stark              41        133      8238
Network Information
Related Institutions (5)
London School of Economics and Political Science
35K papers, 1.4M citations

88% related

University of Sussex
44.6K papers, 2M citations

84% related

Royal Holloway, University of London
20.9K papers, 851.2K citations

83% related

York University
43.3K papers, 1.5M citations

83% related

University of Kent
28.3K papers, 811.7K citations

82% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2023   14
2022   123
2021   358
2020   348
2019   334
2018   287