
Showing papers on "Chunking (computing) published in 2001"


01 Jan 2001
TL;DR: In this paper, an adaptive behavior that integrates planning with learning is presented: planning is performed with a hierarchical approach interleaved with execution, while learning is devised to identify new abstract operators.
Abstract: Planning in a dynamic environment is a complex task that requires several issues to be investigated in order to manage the associated search complexity. In this paper, an adaptive behavior that integrates planning with learning is presented. The former is performed adopting a hierarchical approach, interleaved with execution. The latter, devised to identify new abstract operators, adopts a chunking technique on successful plans. Integration between planning and learning is also promoted by an agent architecture explicitly designed for supporting abstraction.
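
For illustration only (the paper gives no code), a minimal Python sketch of chunking a successful plan into a reusable abstract operator; the Operator fields and the chunk_plan logic are assumptions, not the authors' system:

```python
from dataclasses import dataclass

@dataclass
class Operator:
    """A (possibly abstract) planning operator; the fields are illustrative."""
    name: str
    preconditions: frozenset
    effects: frozenset
    body: tuple = ()  # sub-operators; non-empty for chunked (abstract) operators

def chunk_plan(plan):
    """Collapse a successful plan (a sequence of Operators) into one abstract operator.

    The chunk's preconditions are whatever the plan needs from the outside world;
    its effects are whatever the plan has made true by the end.
    """
    needed, produced = set(), set()
    for op in plan:
        needed |= (op.preconditions - produced)  # not supplied by earlier steps
        produced |= op.effects
    return Operator(
        name="chunk(" + "+".join(op.name for op in plan) + ")",
        preconditions=frozenset(needed),
        effects=frozenset(produced),
        body=tuple(plan),
    )

# Example: a two-step plan becomes a single reusable macro-operator.
pick = Operator("pick", frozenset({"at_obj"}), frozenset({"holding"}))
place = Operator("place", frozenset({"holding", "at_goal"}), frozenset({"delivered"}))
macro = chunk_plan([pick, place])
print(macro.name, macro.preconditions, macro.effects)
```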

9 citations



Proceedings ArticleDOI
07 Oct 2001
TL;DR: A system of chunk-like annotation to describe Chinese predicate-argument structures, and some of the work in developing learned models for automatically annotating fresh text according to this system are described.
Abstract: This paper introduces a system of chunk-like annotation to describe Chinese predicate-argument structures, and describes some of our work in developing learned models for automatically annotating fresh text according to this system. The annotation is very similar in form to other chunking systems, except that chunks are defined not bottom-up but top-down, in terms of relationship to a main predicate. Bottom-up parsing of these structures seems to require great consideration of structural information and long-distance influences. Explicit representation of chunk structure during parsing allows us to provide more informative features, and experiments show that these give significant improvements in performance.
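
A minimal sketch of what top-down, predicate-relative chunk annotation might look like once flattened to per-token tags for a learned model; the English sentence, the role labels, and the to_bio helper are illustrative assumptions, not the paper's annotation scheme:

```python
# Chunks are labelled by their relation to the main predicate, then flattened
# to BIO tags so a learned sequence model can be trained on them.

sentence = ["the", "committee", "approved", "the", "new", "budget"]

# Top-down, predicate-relative chunk annotation: (start, end_exclusive, role)
chunks = [(0, 2, "ARG0"), (2, 3, "PRED"), (3, 6, "ARG1")]

def to_bio(tokens, chunks):
    """Flatten predicate-relative chunks into per-token BIO tags."""
    tags = ["O"] * len(tokens)
    for start, end, role in chunks:
        tags[start] = "B-" + role
        for i in range(start + 1, end):
            tags[i] = "I-" + role
    return tags

for tok, tag in zip(sentence, to_bio(sentence, chunks)):
    print(f"{tok}\t{tag}")
```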

4 citations


Journal ArticleDOI
TL;DR: This paper describes Ada95 bindings for HDF4 and HDF5, the current versions of the NCSA Hierarchical Data Format (HDF), intended for storage of large, diverse collections of scientific data and for retrieving subsets of these data.
Abstract: This paper describes Ada95 bindings for HDF4 and HDF5, the current versions of the NCSA Hierarchical Data Format (HDF). These self-describing file formats are intended for storage of large, diverse collections of scientific data and for retrieving subsets of these data. The libraries also support data compression, chunking of large arrays, and automatic conversion of vendor-specific binary formats for a variety of data types.
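
For illustration, chunked and compressed HDF5 storage via the Python h5py bindings (the paper itself describes Ada95 bindings; the dataset name and chunk shape here are arbitrary):

```python
import numpy as np
import h5py  # Python bindings shown for illustration only

data = np.random.rand(4096, 4096)

with h5py.File("example.h5", "w") as f:
    # Chunked, compressed storage: HDF5 stores the array in 256x256 tiles,
    # so a subset can be read without touching the whole dataset.
    f.create_dataset(
        "measurements",
        data=data,
        chunks=(256, 256),     # chunk shape
        compression="gzip",    # per-chunk compression
        compression_opts=4,
    )

with h5py.File("example.h5", "r") as f:
    # Reads only the chunks that overlap the requested slice.
    subset = f["measurements"][100:200, 100:200]
    print(subset.shape)
```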

3 citations


01 May 2001
TL;DR: In this paper, well-known state-of-the-art data-driven algorithms are applied to part-of-speech tagging and shallow parsing of Swedish texts.
Abstract: In this paper, well-known state-of-the-art data-driven algorithms are applied to part-of-speech tagging and shallow parsing of Swedish texts.
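
A toy sketch of a data-driven pipeline in this spirit: a unigram tagger trained from a tiny tagged sample, followed by a rule-based NP chunker for shallow parsing. The corpus, tag set, and helper names are illustrative assumptions, not the algorithms evaluated in the paper:

```python
from collections import Counter, defaultdict

# Toy tagged corpus (English placeholders; the paper works on Swedish text).
train = [
    [("the", "DT"), ("old", "JJ"), ("parser", "NN"), ("works", "VB")],
    [("the", "DT"), ("new", "JJ"), ("tagger", "NN"), ("fails", "VB")],
]

# Unigram tagger: pick the most frequent tag seen for each word in training.
counts = defaultdict(Counter)
for sent in train:
    for word, tag in sent:
        counts[word][tag] += 1
model = {w: c.most_common(1)[0][0] for w, c in counts.items()}

def tag(tokens, default="NN"):
    return [(t, model.get(t, default)) for t in tokens]

def np_chunk(tagged):
    """Greedy shallow parse: group maximal runs of DT/JJ/NN tokens into NP chunks."""
    chunks, current = [], []
    for word, t in tagged:
        if t in {"DT", "JJ", "NN"}:
            current.append(word)
        else:
            if current:
                chunks.append(("NP", current))
                current = []
            chunks.append((t, [word]))
    if current:
        chunks.append(("NP", current))
    return chunks

print(np_chunk(tag(["the", "new", "parser", "works"])))
```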

2 citations



Book ChapterDOI
01 Mar 2001
TL;DR: In this chapter, the term 'collocation' is used loosely to describe any generally accepted grouping of words into phrases or clauses, and it makes sense to regard collocations as items which frequently occur together and have some degree of semantic unpredictability.
Abstract: The term ‘collocation’ is used to refer to a group of words that belong together, either because they commonly occur together like take a chance , or because the meaning of the group is not obvious from the meaning of the parts, as with by the way or to take someone in (trick them). A major problem in the study of collocation is determining in a consistent way what should be classified as a collocation. This is a problem because they occur in a variety of general forms and with a variety of relationships between the words that make up the collocation. In this book, the term collocation will be used to loosely describe any generally accepted grouping of words into phrases or clauses. From a learning point of view, it makes sense to regard collocations as items which frequently occur together and have some degree of semantic unpredictability. These two criteria justify spending time on collocations because of the return in fluency and nativelike selection. Collocation is often described as a ‘Firthian’ term (Kjellmer, 1982: 25; Fernando, 1996: 29), but Palmer used it many years before Firth and produced a substantial report which used a restricted definition of collocation, focusing mainly on items whose meaning is not obvious from their parts. ‘Each [collocation] … must or should be learnt, or is best or most conveniently learnt as an integral whole or independent entity, rather than by the process of piecing together their component parts.’ (Palmer, 1933: 4)
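
The "frequently occur together" criterion is often operationalised with a co-occurrence score; below is a small sketch using pointwise mutual information over adjacent word pairs, one common choice and not something taken from this chapter:

```python
import math
from collections import Counter

def pmi_collocations(tokens, min_count=2):
    """Score adjacent word pairs by pointwise mutual information.

    High-PMI, frequent pairs are candidate collocations in the 'frequently occur
    together' sense; semantic unpredictability still needs human judgement.
    """
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    scores = {}
    for (w1, w2), c in bigrams.items():
        if c < min_count:
            continue
        p_pair = c / (n - 1)
        p1, p2 = unigrams[w1] / n, unigrams[w2] / n
        scores[(w1, w2)] = math.log2(p_pair / (p1 * p2))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

text = "take a chance and by the way take a chance again by the way".split()
print(pmi_collocations(text))
```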

2 citations


Proceedings Article
22 May 2001

1 citation


01 Jan 2001
TL;DR: Short-protection circuitry, incorporated in one application into a high-power CMOS driver, detects a short at the circuit output and feeds back a signal that forces the output to the shorted rail (ground or supply voltage VDD), eliminating the excessive current flow.
Abstract: Electronic logic circuitry operates to detect and correct a short circuit condition, be it a short to system ground or supply voltage, in a circuit output. In one application, the short protection circuitry is incorporated into a high power CMOS driver. If the circuit output is shorted to ground, the short is detected and a signal is fed back to the input. This feedback signal forces the output to ground thus eliminating the excessive current flow. Conversely, if there is a short to supply voltage, VDD, a signal is fed back to the input that forces the output to supply voltage VDD, again eliminating the excessive current flow.
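
A toy behavioral model (not a circuit simulation) of the described feedback: when a short is detected, the output is driven to the shorted rail so no current flows across the short. The function and signal names are illustrative:

```python
def protected_output(requested_level, short_to=None):
    """Toy behavioral model of the short-protection feedback described above."""
    if short_to == "GND":
        return "LOW"   # feedback forces the output to ground
    if short_to == "VDD":
        return "HIGH"  # feedback forces the output to the supply rail
    return requested_level  # normal operation: output follows the input

print(protected_output("HIGH"))                   # HIGH (no fault)
print(protected_output("HIGH", short_to="GND"))   # LOW  (short detected, output follows the rail)
```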

1 citation


Journal ArticleDOI
TL;DR: Simon was a true polymath whose research started in management science and political science, later encompassed operations research, statistics and economics, and finally included computer science, artificial intelligence, psychology, education, philosophy of science, biology, and the sciences of design.
Abstract: With the disappearance of Herbert A. Simon, we have lost one of the most original thinkers of the 20th century. Highly influential in a number of scientific fields—some of which he actually helped create, such as artificial intelligence or information-processing psychology—Simon was a true polymath. His research started in management science and political science, later encompassed operations research, statistics and economics, and finally included computer science, artificial intelligence, psychology, education, philosophy of science, biology, and the sciences of design. His often controversial ideas earned him wide scientific recognition and essentially all the top awards of the fields in which he researched, including the Turing Award from the Association for Computing Machinery, with Allen Newell, in 1975, the Nobel Prize in economics, in 1978, and the Gold Medal Award for Psychological Science from the American Psychological Foundation, in 1988.

1 citation