scispace - formally typeset

Adjacency list

About: Adjacency list is a research topic. Over the lifetime of the topic, 4,419 publications have been published, receiving 78,449 citations.
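As a quick illustration of the data structure this topic is named after, here is a minimal sketch of an adjacency-list representation of an undirected graph (the edge set is an arbitrary example):

```python
from collections import defaultdict

# Minimal adjacency-list representation of an undirected graph:
# each vertex maps to the list of its neighbours.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

# adj[2] now lists the neighbours of vertex 2.
```

Compared with an adjacency matrix, this representation uses O(V + E) space and iterates over a vertex's neighbours in time proportional to its degree, which is why it is the standard choice for sparse graphs.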


Papers
Journal ArticleDOI
TL;DR: The cocoons are shown to be organized into a hierarchy that is a sub-hierarchy of the one produced by the standard complete-linkage clustering algorithm, which offers a new point of view on what complete linkage achieves when applied to image data.

64 citations

Journal ArticleDOI
Quanxue Gao, Jingjie Ma, Hailin Zhang, Xinbo Gao, Yamin Liu
TL;DR: An adjacency graph is constructed to model the intraclass variation that characterizes the most important properties, such as the diversity of patterns, and this diversity is then incorporated into the discriminant objective function for linear dimensionality reduction.
Abstract: Manifold learning is widely used in machine learning and pattern recognition. However, manifold learning only considers the similarity of samples belonging to the same class and ignores the within-class variation of the data, which impairs the generalization and stability of the algorithms. To address this, we construct an adjacency graph to model the intraclass variation that characterizes the most important properties, such as the diversity of patterns, and then incorporate this diversity into the discriminant objective function for linear dimensionality reduction. Finally, we introduce an orthogonality constraint on the basis vectors and propose an orthogonal algorithm called stable orthogonal local discriminant embedding. Experimental results on several standard image databases demonstrate the effectiveness of the proposed dimensionality reduction approach.
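The within-class adjacency graph described in the abstract can be sketched generically. The function below builds a same-class k-nearest-neighbour graph with heat-kernel weights — a common construction in this family of methods, not necessarily the exact scheme of the paper:

```python
import numpy as np

def within_class_adjacency(X, labels, k=2, t=1.0):
    """Symmetric weight matrix W with W[i, j] > 0 only when i and j
    share a class label and one is among the other's k nearest
    same-class neighbours; weights use exp(-||xi - xj||^2 / t)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        same = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not same:
            continue
        dists = np.array([np.sum((X[i] - X[j]) ** 2) for j in same])
        for j in np.array(same)[np.argsort(dists)[:k]]:
            W[i, j] = W[j, i] = np.exp(-np.sum((X[i] - X[j]) ** 2) / t)
    return W
```

Because edges are restricted to same-class pairs, the graph captures intraclass variation only; cross-class entries of W stay zero by construction.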

64 citations

Journal ArticleDOI
TL;DR: In a case study, three ways of finding a solution were examined: a random search algorithm, a simulated annealing algorithm and the prebiased random search method of the SNAP II program, which was the fastest.
Abstract: Regulations defining the maximum opening size in the sub-alpine region of Sweden introduce new planning issues. The combinatorial problems that arise in harvest planning become very complex, but can be solved by different methods. In a case study, three ways of finding a solution were examined: a random search algorithm, a simulated annealing algorithm and the prebiased random search method found in the SNAP II program. Two alternatives were studied, one with no road in the area and one with a road constructed. All three methods were found to give feasible solutions. Simulated annealing produced the best solutions in terms of present net value, while the SNAP II program was the fastest. SNAP II did not give solutions as good as the others in the case with a road, probably owing to the lack of a distinct gradient in the structure.
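The simulated annealing method compared above can be illustrated with a generic sketch on a toy 0/1 optimisation problem; the geometric cooling schedule and Metropolis acceptance rule are standard textbook choices, not the harvest-planning formulation of the paper:

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=10.0, cooling=0.95,
                        steps=500, seed=0):
    """Generic simulated annealing: always accept improving moves,
    accept worsening moves with probability exp(-delta / T), and
    cool the temperature T geometrically each step."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        cy = cost(y)
        if cy < c or rng.random() < math.exp(-(cy - c) / t):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        t *= cooling
    return best, best_c

# Toy instance: find a 0/1 vector matching a hidden target.
target = [1, 0, 1, 1, 0, 1]

def cost(x):
    return sum(a != b for a, b in zip(x, target))

def flip_one_bit(x, rng):
    y = list(x)
    y[rng.randrange(len(y))] ^= 1
    return y

best, best_c = simulated_annealing(cost, flip_one_bit, [0] * 6)
```

The high initial temperature lets the search wander across plateaus early on, which is exactly the property that helps on rugged harvest-scheduling landscapes; as T falls, the procedure degenerates into hill climbing.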

64 citations

Proceedings ArticleDOI
21 Oct 1985
TL;DR: An algorithm is given to construct a cell decomposition of R^d, including adjacency information, defined by any given set of rational polynomials in d variables, extending a recent algorithm of Ben-Or, Kozen, and Reif for deciding the theory of real closed fields.
Abstract: We give an algorithm to construct a cell decomposition of R^d, including adjacency information, defined by any given set of rational polynomials in d variables. The algorithm runs in single exponential parallel time, and in NC for fixed d. The algorithm extends a recent algorithm of Ben-Or, Kozen, and Reif for deciding the theory of real closed fields.

64 citations

Proceedings Article
30 Apr 2020
TL;DR: This work presents a novel and principled solution for modeling both the global absolute positions of words and their order relationships, and is the first work in NLP to link imaginary numbers in complex-valued representations to concrete meanings (i.e., word order).
Abstract: Sequential word order is important when processing text. Currently, neural networks (NNs) address this by modeling word position using position embeddings. The problem is that position embeddings capture the position of individual words, but not the ordered relationship (e.g., adjacency or precedence) between individual word positions. We present a novel and principled solution for modeling both the global absolute positions of words and their order relationships. Our solution generalizes word embeddings, previously defined as independent vectors, to continuous word functions over a variable (position). The benefit of continuous functions over variable positions is that word representations shift smoothly with increasing positions. Hence, word representations in different positions can correlate with each other in a continuous function. The general solution of these functions can be extended to complex-valued variants. We extend CNN, RNN and Transformer NNs to complex-valued versions to incorporate our complex embedding (we make all code available). Experiments on text classification, machine translation and language modeling show gains over both classical word embeddings and position-enriched word embeddings. To our knowledge, this is the first work in NLP to link imaginary numbers in complex-valued representations to concrete meanings (i.e., word order).
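The continuous word functions described in the abstract can be sketched schematically: each embedding coordinate becomes amplitude * exp(i * (frequency * position + phase)), so shifting the position by one rotates every coordinate by a fixed angle. The dimensions and parameter values below are arbitrary illustrations, not the paper's learned parameters:

```python
import numpy as np

def complex_position_embedding(amp, pos, freq, phase):
    """Word-as-function-of-position: coordinate-wise
    amp * exp(1j * (freq * pos + phase))."""
    return amp * np.exp(1j * (freq * pos + phase))

d = 4
amp = np.ones(d)                  # per-coordinate amplitudes
freq = np.linspace(0.1, 0.4, d)   # per-coordinate frequencies
phase = np.zeros(d)               # initial phases

e1 = complex_position_embedding(amp, 1, freq, phase)
e2 = complex_position_embedding(amp, 2, freq, phase)
# Moving one position multiplies each coordinate by exp(1j * freq):
# a pure rotation, independent of the absolute starting position.
```

This is what makes representations at nearby positions correlate smoothly: the relative transformation between positions p and p+1 is the same rotation everywhere, so order relationships such as adjacency are encoded directly in the phase.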

64 citations


Network Information
Related Topics (5)
Optimization problem
96.4K papers, 2.1M citations
82% related
Probabilistic logic
56K papers, 1.3M citations
82% related
Cluster analysis
146.5K papers, 2.9M citations
81% related
Matrix (mathematics)
105.5K papers, 1.9M citations
81% related
Robustness (computer science)
94.7K papers, 1.6M citations
80% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  209
2022  439
2021  283
2020  280
2019  296
2018  232