Author

William H. Richardson

Bio: William H. Richardson is an academic researcher from Palmetto Health Richland. The author has contributed to research in the topics of Envenomation and Antivenom, has an h-index of 11, and has co-authored 27 publications receiving 3,972 citations. Previous affiliations of William H. Richardson include the University of California, San Diego and Memorial Hospital of South Bend.

Papers
Journal ArticleDOI
TL;DR: An iterative method of restoring degraded images was developed by treating images, point spread functions, and degraded images as probability-frequency functions and by applying Bayes’s theorem.
Abstract: An iterative method of restoring degraded images was developed by treating images, point spread functions, and degraded images as probability-frequency functions and by applying Bayes’s theorem. The method functions effectively in the presence of noise and is adaptable to computer operation.

3,869 citations
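
This is the paper that introduced what is now commonly called Richardson-Lucy deconvolution. As a rough illustration of the iterative Bayesian update it describes, here is a minimal 1-D sketch; the test signal, point spread function, and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50, eps=1e-12):
    """Iterative Bayesian restoration (Richardson-Lucy) of a 1-D signal.

    observed : degraded, non-negative signal
    psf      : point spread function, assumed normalized to sum to 1
    """
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / (blurred + eps)                    # avoid division by zero
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Hypothetical demonstration: blur a two-spike signal, then restore it.
truth = np.zeros(64)
truth[20], truth[40] = 1.0, 0.5
psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
degraded = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(degraded, psf, iterations=100)
```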

Journal ArticleDOI
TL;DR: Using a modified Delphi method, an evidence-informed treatment algorithm is provided in an attempt to reduce variation in care and possibly improve clinical outcomes of crotaline envenomation.
Abstract: Envenomation by crotaline snakes (rattlesnake, cottonmouth, copperhead) is a complex, potentially lethal condition affecting thousands of people in the United States each year. Treatment of crotaline envenomation is not standardized, and significant variation in practice exists. A geographically diverse panel of experts was convened for the purpose of deriving an evidence-informed unified treatment algorithm. Research staff analyzed the extant medical literature and performed targeted analyses of existing databases to inform specific clinical decisions. A trained external facilitator used modified Delphi and structured consensus methodology to achieve consensus on the final treatment algorithm. A unified treatment algorithm was produced and endorsed by all nine expert panel members. This algorithm provides guidance about clinical and laboratory observations, indications for and dosing of antivenom, adjunctive therapies, post-stabilization care, and management of complications from envenomation and therapy. Clinical manifestations and ideal treatment of crotaline snakebite differ greatly, and can result in severe complications. Using a modified Delphi method, we provide evidence-informed treatment guidelines in an attempt to reduce variation in care and possibly improve clinical outcomes.

169 citations

Journal ArticleDOI
TL;DR: A case of a 13-year-old male who was stung by ∼700 honey bees and developed progressive upper-body swelling and systemic manifestations of mass envenomation including rhabdomyolysis, renal insufficiency, and a transient transaminase elevation is presented.
Abstract: Massive envenomations by honey bees are capable of causing multiorgan dysfunction as a result of the direct toxic effects of the large venom load received. Although all varieties of honey bee have the potential for these attacks, the Africanized honey bee (Apis mellifera scutellata) is the most commonly implicated subspecies. In the United States, the Africanized strain is found primarily in the southwestern states and is known for its highly defensive behavior if disturbed. Mechanisms behind the multiorgan dysfunction produced by these mass envenomations are not clearly understood. We present a case of a 13-year-old male who was stung by ∼700 honey bees and developed progressive upper-body swelling and systemic manifestations of mass envenomation including rhabdomyolysis, renal insufficiency, and a transient transaminase elevation.

67 citations

Journal ArticleDOI
TL;DR: In this murine model of amatoxin-induced hepatotoxicity, N-acetylcysteine, benzylpenicillin, cimetidine, thioctic acid, and silybin were not effective in limiting hepatic injury after alpha-amanitin poisoning.

55 citations

Journal ArticleDOI
TL;DR: A cost-benefit analysis of a regional poison center provides significant positive return on investment in addition to non-monetary benefits, and shows the value of current PC phone services.
Abstract: Background. Funding poison center (PC) operations has become a major challenge nationwide. Increasingly, state and federal budget cuts have resulted in diminished funding to PCs. Objectives. In an effort to demonstrate the value of current PC phone services, a cost-benefit analysis of a regional center was completed. Methods. A telephone survey was used to collect data from PC callers during an 8-week period in 2004. Callers with human exposure poisonings determined by the PC to be of minimal or no risk were asked to complete the phone survey. Callers were asked their alternative plan if the PC staff had not been available to assist them. Benefits were measured as healthcare charges potentially avoided. Results. A total of 652 caller surveys were completed. The benefit-to-cost ratio was 7.67 (95% CI 6.83, 8.50). Conclusion. In addition to non-monetary benefits, the operation of a regional poison center provides significant positive return on investment.

43 citations
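
The reported benefit-to-cost ratio is simply the charges avoided divided by the cost of providing the service. The sketch below illustrates that arithmetic together with a bootstrap confidence interval of the kind quoted above; the per-caller charges and per-call cost are made-up placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs -- NOT the study's data.
avoided_charges = rng.gamma(shape=2.0, scale=400.0, size=652)  # per-caller charges avoided ($)
cost_per_call = 45.0                                           # center operating cost per call ($)

ratio = avoided_charges.mean() / cost_per_call
print(f"benefit-to-cost ratio: {ratio:.2f}")

# Bootstrap a 95% confidence interval for the ratio.
boot = [
    rng.choice(avoided_charges, size=avoided_charges.size, replace=True).mean() / cost_per_call
    for _ in range(2000)
]
low, high = np.percentile(boot, [2.5, 97.5])
print(f"95% CI: ({low:.2f}, {high:.2f})")
```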


Cited by
Journal ArticleDOI
TL;DR: An analogy is drawn between images and statistical mechanics systems; performing the analogous annealing operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, producing a highly parallel "relaxation" algorithm for MAP estimation.
Abstract: We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs distribution. Because of the Gibbs distribution, Markov random field (MRF) equivalence, this assignment also determines an MRF image model. The energy function is a more convenient and natural mechanism for embodying picture attributes than are the local characteristics of the MRF. For a range of degradation mechanisms, including blurring, nonlinear deformations, and multiplicative or additive noise, the posterior distribution is an MRF with a structure akin to the image model. By the analogy, the posterior distribution defines another (imaginary) physical system. Gradual temperature reduction in the physical system isolates low energy states ("annealing"), or what is the same thing, the most probable states under the Gibbs distribution. The analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations. The result is a highly parallel "relaxation" algorithm for MAP estimation. We establish convergence properties of the algorithm and we experiment with some simple pictures, for which good restorations are obtained at low signal-to-noise ratios.

18,761 citations
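
As a concrete illustration of the annealing-based relaxation described above, the sketch below runs an annealed Gibbs sampler on a toy binary (Ising-style) MRF for image denoising. The prior, coupling strengths, and annealing schedule are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def annealed_gibbs_restore(noisy, beta=1.0, eta=2.0, sweeps=30, t0=4.0, t_final=0.1):
    """Toy MAP restoration of a +/-1 image by Gibbs sampling with annealing.

    Energy favors agreement between neighboring pixels (weight beta) and
    agreement with the observed noisy pixel (weight eta).
    """
    rng = np.random.default_rng(0)
    x = noisy.copy()
    h, w = x.shape
    temps = np.geomspace(t0, t_final, sweeps)        # gradually lower the temperature
    for t in temps:
        for i in range(h):
            for j in range(w):
                nb = 0.0
                if i > 0:
                    nb += x[i - 1, j]
                if i < h - 1:
                    nb += x[i + 1, j]
                if j > 0:
                    nb += x[i, j - 1]
                if j < w - 1:
                    nb += x[i, j + 1]
                # Conditional probability that pixel (i, j) is +1 given its neighbors.
                local = beta * nb + eta * noisy[i, j]
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * local / t))
                x[i, j] = 1.0 if rng.random() < p_plus else -1.0
    return x

# Hypothetical usage: a noisy square on a +/-1 background.
clean = -np.ones((32, 32))
clean[8:24, 8:24] = 1.0
flips = np.random.default_rng(1).random(clean.shape) < 0.15
noisy = np.where(flips, -clean, clean)
restored = annealed_gibbs_restore(noisy)
```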

Journal ArticleDOI
21 Oct 1999-Nature
TL;DR: An algorithm for non-negative matrix factorization is demonstrated that is able to learn parts of faces and semantic features of text, in contrast to other methods that learn holistic, not parts-based, representations.
Abstract: Is perception of the whole based on perception of its parts? There is psychological and physiological evidence for parts-based representations in the brain, and certain computational theories of object recognition rely on such representations. But little is known about how brains or computers might learn the parts of objects. Here we demonstrate an algorithm for non-negative matrix factorization that is able to learn parts of faces and semantic features of text. This is in contrast to other methods, such as principal components analysis and vector quantization, that learn holistic, not parts-based, representations. Non-negative matrix factorization is distinguished from the other methods by its use of non-negativity constraints. These constraints lead to a parts-based representation because they allow only additive, not subtractive, combinations. When non-negative matrix factorization is implemented as a neural network, parts-based representations emerge by virtue of two properties: the firing rates of neurons are never negative and synaptic strengths do not change sign.

11,500 citations
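
A minimal sketch of the factorization described here, using scikit-learn's NMF implementation; the matrix below is random placeholder data standing in for the face-image or document-term matrices used in the paper, and the rank is an arbitrary choice.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical stand-in data: each row is a non-negative feature vector
# (e.g. a vectorized face image or a document's word counts).
rng = np.random.default_rng(0)
V = rng.random((200, 361))

model = NMF(n_components=25, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(V)    # non-negative per-sample encodings
H = model.components_         # non-negative basis "parts"

# Because W and H are non-negative, each sample is an additive (never
# subtractive) combination of the learned parts: V is approximately W @ H.
reconstruction_error = np.linalg.norm(V - W @ H)
```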

01 Jan 1999
TL;DR: In this article, non-negative matrix factorization is used to learn parts of faces and semantic features of text, in contrast to principal components analysis and vector quantization, which learn holistic, not parts-based, representations.

9,604 citations

Proceedings Article
01 Jan 2000
TL;DR: Two different multiplicative algorithms for non-negative matrix factorization are analyzed and one algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence.
Abstract: Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm. The algorithms can also be interpreted as diagonally rescaled gradient descent, where the rescaling factor is optimally chosen to ensure convergence.

7,345 citations
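
The multiplicative updates analyzed in this paper can be written in a few lines. The sketch below is a bare-bones version of the update pair that minimizes the squared error ||V - WH||^2, under the assumption of dense random initialization; the KL-divergence variant follows the same multiplicative pattern with different update factors.

```python
import numpy as np

def nmf_multiplicative(V, r, iterations=200, eps=1e-10):
    """NMF via multiplicative updates for the least-squares objective."""
    rng = np.random.default_rng(0)
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(iterations):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H; stays non-negative
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W; stays non-negative
    return W, H

# Hypothetical usage on a random non-negative matrix.
V = np.random.default_rng(1).random((100, 60))
W, H = nmf_multiplicative(V, r=10)
error = np.linalg.norm(V - W @ H)
```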

Proceedings ArticleDOI
21 Jul 2017
TL;DR: It is concluded that the NTIRE 2017 challenge pushes the state-of-the-art in single-image super-resolution, reaching the best results to date on the popular Set5, Set14, B100, Urban100 datasets and on the authors' newly proposed DIV2K.
Abstract: This paper introduces a novel large dataset for example-based single image super-resolution and studies the state-of-the-art as emerged from the NTIRE 2017 challenge. The challenge is the first challenge of its kind, with 6 competitions, hundreds of participants and tens of proposed solutions. Our newly collected DIVerse 2K resolution image dataset (DIV2K) was employed by the challenge. In our study we compare the solutions from the challenge to a set of representative methods from the literature and evaluate them using diverse measures on our proposed DIV2K dataset. Moreover, we conduct a number of experiments and draw conclusions on several topics of interest. We conclude that the NTIRE 2017 challenge pushes the state-of-the-art in single-image super-resolution, reaching the best results to date on the popular Set5, Set14, B100, Urban100 datasets and on our newly proposed DIV2K.

2,388 citations
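
Among the diverse evaluation measures mentioned in the abstract, peak signal-to-noise ratio (PSNR) is the most common fidelity metric for single-image super-resolution. The sketch below shows that computation on placeholder arrays; the image sizes and noise level are illustrative, not DIV2K data.

```python
import numpy as np

def psnr(reference, estimate, peak=255.0):
    """Peak signal-to-noise ratio between a reference image and an estimate."""
    mse = np.mean((reference.astype(float) - estimate.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical evaluation: random arrays stand in for a ground-truth
# high-resolution image and a super-resolved prediction.
rng = np.random.default_rng(0)
ground_truth = rng.integers(0, 256, size=(128, 128, 3))
prediction = np.clip(ground_truth + rng.normal(0, 5, ground_truth.shape), 0, 255)
print(f"PSNR: {psnr(ground_truth, prediction):.2f} dB")
```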