Author
Ronald F. Boisvert
Other affiliations: Purdue University
Bio: Ronald F. Boisvert is an academic researcher at the National Institute of Standards and Technology. His research focuses on mathematical software and software more broadly. He has an h-index of 29 and has co-authored 91 publications receiving 5,953 citations. His previous affiliations include Purdue University.
Papers published on a yearly basis
Papers
01 May 2010
TL;DR: This handbook results from a 10-year project conducted by the National Institute of Standards and Technology with an international group of expert authors and validators and is destined to replace its predecessor, the classic but long-outdated Handbook of Mathematical Functions, edited by Abramowitz and Stegun.
Abstract: Modern developments in theoretical and applied science depend on knowledge of the properties of mathematical functions, from elementary trigonometric functions to the multitude of special functions. These functions appear whenever natural phenomena are studied, engineering problems are formulated, and numerical simulations are performed. They also crop up in statistics, financial models, and economic analysis. Using them effectively requires practitioners to have ready access to a reliable collection of their properties. This handbook results from a 10-year project conducted by the National Institute of Standards and Technology with an international group of expert authors and validators. Printed in full color, it is destined to replace its predecessor, the classic but long-outdated Handbook of Mathematical Functions, edited by Abramowitz and Stegun. Included with every copy of the book is a CD with a searchable PDF of each chapter.
3,646 citations
01 Jan 1985
330 citations
30 May 1996
TL;DR: The MatrixMarket as mentioned in this paper is a repository of data for the testing of numerical algorithms and mathematical software for matrix computations, designed to accommodate both dense and sparse matrices, as well as software to generate matrices.
Abstract: We describe a repository of data for the testing of numerical algorithms and mathematical software for matrix computations. The repository is designed to accommodate both dense and sparse matrices, as well as software to generate matrices. It has been seeded with the well-known Harwell-Boeing sparse matrix collection. The raw data files have been augmented with an integrated World Wide Web interface which describes the matrices in the collection quantitatively and visually. For example, each matrix has a Web page which details its attributes, graphically depicts its sparsity pattern, and provides access to the matrix itself in several formats. In addition, a search mechanism is included which allows retrieval of matrices based on a variety of attributes, such as type and size, as well as through free-text search in abstracts. The URL is http://math.nist.gov/MatrixMarket.
240 citations
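The MatrixMarket repository distributes matrices in the Matrix Market exchange format, a simple line-oriented text format. As an illustrative sketch (not the repository's own tooling), a minimal reader for the "coordinate real general" variant of that format might look like:

```python
import io

def read_mm_coordinate(f):
    """Parse a minimal Matrix Market 'coordinate' file into a dict
    mapping (row, col) -> value, plus the matrix shape.
    Handles only the 'matrix coordinate real general' variant."""
    header = f.readline()
    assert header.startswith("%%MatrixMarket matrix coordinate real general")
    line = f.readline()
    while line.startswith("%"):          # skip comment lines
        line = f.readline()
    nrows, ncols, nnz = map(int, line.split())
    entries = {}
    for _ in range(nnz):
        i, j, v = f.readline().split()   # indices are 1-based in the format
        entries[(int(i), int(j))] = float(v)
    return (nrows, ncols), entries

sample = """%%MatrixMarket matrix coordinate real general
% a 3x3 matrix with 4 nonzeros
3 3 4
1 1 1.0
2 2 2.5
3 1 -1.0
3 3 4.0
"""
shape, entries = read_mm_coordinate(io.StringIO(sample))
```

In practice, `scipy.io.mmread` reads these files directly into a sparse matrix, so a hand-rolled parser is only useful for understanding the format.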
01 Jan 1997
TL;DR: A repository of data for the testing of numerical algorithms and mathematical software for matrix computations designed to accommodate both dense and sparse matrices, as well as software to generate matrices.
Abstract: We describe a repository of data for the testing of numerical algorithms and mathematical software for matrix computations. The repository is designed to accommodate both dense and sparse matrices, as well as software to generate matrices. It has been seeded with the well-known Harwell-Boeing sparse matrix collection. The raw data files have been augmented with an integrated World Wide Web interface which describes the matrices in the collection quantitatively and visually. For example, each matrix has a Web page which details its attributes, graphically depicts its sparsity pattern, and provides access to the matrix itself in several formats. In addition, a search mechanism is included which allows retrieval of matrices based on a variety of attributes, such as type and size, as well as through free-text search in abstracts. The URL is http://math.nist.gov/MatrixMarket/.
228 citations
01 Sep 1996
TL;DR: In this article, the authors present some background on software libraries and problem-solving environments, and discuss the long path to this vision of scientific software's future and the roadblocks in the way.
Abstract: As more scientists and engineers adopt computation as a primary tool, they will want more problem-solving help from easy-to-use, comprehensive software systems. A workshop discussed the long path to this vision of scientific software's future, and the roadblocks in the way. To put the workshop's findings in context, the paper presents some background on software libraries and problem-solving environments.
138 citations
Cited by
University of Jyväskylä1, University of California, Los Angeles2, California Polytechnic State University3, Los Alamos National Laboratory4, National Research University – Higher School of Economics5, University of California, Berkeley6, University of Birmingham7, Australian Nuclear Science and Technology Organisation8, University of Washington9, University of Massachusetts Amherst10, University of West Bohemia11, Brigham Young University12, University of Texas at Austin13, Universidade Federal de Minas Gerais14, Google15
TL;DR: SciPy as discussed by the authors is an open source scientific computing library for the Python programming language, which includes functionality spanning clustering, Fourier transforms, integration, interpolation, file I/O, linear algebra, image processing, orthogonal distance regression, minimization algorithms, signal processing, sparse matrix handling, computational geometry, and statistics.
Abstract: SciPy is an open source scientific computing library for the Python programming language. SciPy 1.0 was released in late 2017, about 16 years after the original version 0.1 release. SciPy has become a de facto standard for leveraging scientific algorithms in the Python programming language, with more than 600 unique code contributors, thousands of dependent packages, over 100,000 dependent repositories, and millions of downloads per year. This includes usage of SciPy in almost half of all machine learning projects on GitHub, and usage by high profile projects including LIGO gravitational wave analysis and creation of the first-ever image of a black hole (M87). The library includes functionality spanning clustering, Fourier transforms, integration, interpolation, file I/O, linear algebra, image processing, orthogonal distance regression, minimization algorithms, signal processing, sparse matrix handling, computational geometry, and statistics. In this work, we provide an overview of the capabilities and development practices of the SciPy library and highlight some recent technical developments.
12,774 citations
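As a small illustration of the breadth described in the abstract, the snippet below (assuming SciPy is installed) touches two of the listed areas, numerical integration and sparse matrix handling:

```python
import numpy as np
from scipy import integrate, sparse

# Numerical integration: quad returns (value, error estimate).
value, err = integrate.quad(lambda x: x**2, 0.0, 1.0)  # exact answer is 1/3

# Sparse matrices: build a 3x3 COO matrix, convert to CSR for arithmetic.
rows = np.array([0, 1, 2])
cols = np.array([0, 1, 0])
data = np.array([1.0, 2.0, 3.0])
A = sparse.coo_matrix((data, (rows, cols)), shape=(3, 3)).tocsr()
x = A @ np.ones(3)  # sparse matrix-vector product
```

These two calls barely scratch the surface of the subpackages the abstract lists, but they show the typical usage pattern: import a subpackage, call a well-documented function, get a NumPy-compatible result.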
University of Jyväskylä1, California Polytechnic State University2, University of California, Los Angeles3, Los Alamos National Laboratory4, National Research University – Higher School of Economics5, University of California, Berkeley6, University of Birmingham7, Australian Nuclear Science and Technology Organisation8, University of Washington9, University of Massachusetts Amherst10, University of West Bohemia11, Brigham Young University12, University of Texas at Austin13, Universidade Federal de Minas Gerais14, Google15
TL;DR: SciPy as discussed by the authors is an open-source scientific computing library for the Python programming language, which has become a de facto standard for leveraging scientific algorithms in Python, with over 600 unique code contributors, thousands of dependent packages, over 100,000 dependent repositories and millions of downloads per year.
Abstract: SciPy is an open-source scientific computing library for the Python programming language. Since its initial release in 2001, SciPy has become a de facto standard for leveraging scientific algorithms in Python, with over 600 unique code contributors, thousands of dependent packages, over 100,000 dependent repositories and millions of downloads per year. In this work, we provide an overview of the capabilities and development practices of SciPy 1.0 and highlight some recent technical developments.
6,244 citations
22 May 2010
TL;DR: This work describes a Natural Language Processing software framework which is based on the idea of document streaming, i.e. processing corpora document after document, in a memory independent fashion, and implements several popular algorithms for topical inference, including Latent Semantic Analysis and Latent Dirichlet Allocation in a way that makes them completely independent of the training corpus size.
Abstract: Large corpora are ubiquitous in today's world, and memory quickly becomes the limiting factor in practical applications of the Vector Space Model (VSM). We identify a gap in existing VSM implementations: their scalability and ease of use. We describe a Natural Language Processing software framework based on the idea of document streaming, i.e. processing corpora document after document in a memory-independent fashion. In this framework, we implement several popular algorithms for topical inference, including Latent Semantic Analysis and Latent Dirichlet Allocation, in a way that makes them completely independent of the training corpus size. Particular emphasis is placed on straightforward and intuitive framework design, so that modifications and extensions of the methods, and their application by interested practitioners, are effortless. We demonstrate the usefulness of our approach on a real-world scenario of computing document similarities within an existing digital library, DML-CZ.
3,965 citations
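The document-streaming idea this abstract describes (processing a corpus one document at a time, so that memory use depends on vocabulary size rather than corpus size) can be sketched in plain Python, independently of the framework itself:

```python
from collections import Counter

def stream_corpus(docs):
    """Yield one tokenized document at a time; nothing here forces the
    whole corpus into memory, so `docs` may itself be a generator
    reading documents lazily from disk."""
    for doc in docs:
        yield doc.lower().split()

def document_frequencies(doc_stream):
    """Single streaming pass: count in how many documents each term
    appears. Memory grows with the vocabulary, not the corpus."""
    df = Counter()
    for tokens in doc_stream:
        df.update(set(tokens))  # each term counted once per document
    return df

corpus = ["the cat sat", "the dog ran", "a cat ran"]
df = document_frequencies(stream_corpus(corpus))
```

This is only a toy statistic, but the same one-pass generator pattern is what lets streaming implementations of LSA and LDA train on corpora far larger than RAM.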
TL;DR: The University of Florida Sparse Matrix Collection, a large and actively growing set of sparse matrices that arise in real applications, is described and a new multilevel coarsening scheme is proposed to facilitate this task.
Abstract: We describe the University of Florida Sparse Matrix Collection, a large and actively growing set of sparse matrices that arise in real applications. The Collection is widely used by the numerical linear algebra community for the development and performance evaluation of sparse matrix algorithms. It allows for robust and repeatable experiments: robust because performance results with artificially generated matrices can be misleading, and repeatable because matrices are curated and made publicly available in many formats. Its matrices cover a wide spectrum of domains, including those arising from problems with underlying 2D or 3D geometry (such as structural engineering, computational fluid dynamics, model reduction, electromagnetics, semiconductor devices, thermodynamics, materials, acoustics, computer graphics/vision, robotics/kinematics, and other discretizations) and those that typically do not have such geometry (optimization, circuit simulation, economic and financial modeling, theoretical and quantum chemistry, chemical process simulation, mathematics and statistics, power networks, and other networks and graphs). We provide software for accessing and managing the Collection, from MATLAB™, Mathematica™, Fortran, and C, as well as an online search capability. Graph visualization of the matrices is provided, and a new multilevel coarsening scheme is proposed to facilitate this task.
3,456 citations
1,524 citations