Topic

Character (mathematics)

About: Character (mathematics) is a research topic. Over the lifetime, 46,723 publications have been published within this topic, receiving 411,412 citations.


Papers
Book
01 Jan 1956
TL;DR: Though it incorporates much new material, this new edition preserves the general character of the book in providing a collection of solutions of the equations of diffusion and describing how these solutions may be obtained.
Abstract: Though it incorporates much new material, this new edition preserves the general character of the book in providing a collection of solutions of the equations of diffusion and describing how these solutions may be obtained.

20,495 citations

Journal ArticleDOI
TL;DR: This paper proposes a new approach based on the skip-gram model in which each word is represented as a bag of character n-grams and the word vector is the sum of these n-gram representations, allowing models to be trained quickly on large corpora and word representations to be computed for words that did not appear in the training data.
Abstract: Continuous word representations, trained on large unlabeled corpora, are useful for many natural language processing tasks. Popular models that learn such representations ignore the morphology of words by assigning a distinct vector to each word. This is a limitation, especially for languages with large vocabularies and many rare words. In this paper, we propose a new approach based on the skip-gram model, where each word is represented as a bag of character n-grams. A vector representation is associated with each character n-gram, and words are represented as the sum of these representations. Our method is fast, allowing models to be trained on large corpora quickly, and it can compute word representations for words that did not appear in the training data. We evaluate our word representations on nine different languages, on both word similarity and analogy tasks. By comparing to recently proposed morphological word representations, we show that our vectors achieve state-of-the-art performance on these tasks.

7,537 citations
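
The subword scheme described in the abstract above lends itself to a short illustration. The following Python snippet is a minimal sketch, not the authors' implementation: it extracts the character n-grams of a word (with '<' and '>' boundary markers) and forms the word vector as the sum of the n-gram vectors, assuming a hypothetical ngram_vectors lookup table that stands in for trained embeddings.

```python
import numpy as np

def char_ngrams(word, n_min=3, n_max=6):
    """Character n-grams of a word, with '<' and '>' marking word boundaries."""
    marked = f"<{word}>"
    return [marked[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(marked) - n + 1)]

def word_vector(word, ngram_vectors, dim=100):
    """Represent a word as the sum of its character n-gram vectors.
    n-grams absent from the lookup table are simply skipped."""
    vec = np.zeros(dim)
    for gram in char_ngrams(word):
        if gram in ngram_vectors:
            vec += ngram_vectors[gram]
    return vec

# Toy usage: random placeholder embeddings stand in for trained ones.
rng = np.random.default_rng(0)
ngram_vectors = {g: rng.normal(size=100) for g in char_ngrams("where")}
print(word_vector("where", ngram_vectors).shape)  # (100,)
```

In the method the paper describes, the n-gram vectors themselves are learned with the skip-gram objective; here they are random placeholders purely to show the composition step that also yields vectors for out-of-vocabulary words.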

Book
15 Jan 1976
TL;DR: A monograph on group representations and character theory, covering topics from characters and integrality through Brauer's theorem, the Schur index, projective representations, character degrees, character correspondence, and character tables, as discussed by the authors.
Abstract: Contents: Algebras, modules, and representations; Group representations and characters; Characters and integrality; Products of characters; Induced characters; Normal subgroups; T.I. sets and exceptional characters; Brauer's theorem; Changing the field; The Schur index; Projective representations; Character degrees; Character correspondence; Linear groups; Changing the characteristic; Some character tables; Bibliographic notes; References; Index.

2,657 citations

Posted Content
TL;DR: A new approach based on the skip-gram model, where each word is represented as a bag of character n-grams and words are represented as the sum of these representations, which achieves state-of-the-art performance on word similarity and analogy tasks.
Abstract: Continuous word representations, trained on large unlabeled corpora, are useful for many natural language processing tasks. Popular models that learn such representations ignore the morphology of words by assigning a distinct vector to each word. This is a limitation, especially for languages with large vocabularies and many rare words. In this paper, we propose a new approach based on the skip-gram model, where each word is represented as a bag of character $n$-grams. A vector representation is associated with each character $n$-gram, and words are represented as the sum of these representations. Our method is fast, allowing models to be trained on large corpora quickly, and it allows us to compute word representations for words that did not appear in the training data. We evaluate our word representations on nine different languages, on both word similarity and analogy tasks. By comparing to recently proposed morphological word representations, we show that our vectors achieve state-of-the-art performance on these tasks.

2,425 citations


Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2024    2
2023    3,365
2022    7,818
2021    1,037
2020    1,521
2019    1,881