Institution
University of Saskatchewan
Education • Saskatoon, Saskatchewan, Canada
About: University of Saskatchewan is an education organization based in Saskatoon, Saskatchewan, Canada. It is known for its research contributions in the topics of population and health care. The organization has 25,021 authors who have published 52,579 publications receiving 1,483,049 citations. The organization is also known as USask.
Papers published on a yearly basis
Papers
TL;DR: A novel multiresolution color image segmentation (MCIS) algorithm based on Markov random fields (MRFs) is proposed: a relaxation process that converges to the MAP (maximum a posteriori) estimate of the segmentation.
Abstract: Image segmentation is the process by which an original image is partitioned into homogeneous regions. In this paper, a novel multiresolution color image segmentation (MCIS) algorithm based on Markov random fields (MRFs) is proposed. The proposed approach is a relaxation process that converges to the MAP (maximum a posteriori) estimate of the segmentation. A quadtree structure is used to implement the multiresolution framework, and simulated annealing is employed to control the splitting and merging of nodes so as to minimize an energy function and thereby maximize the MAP estimate. The multiresolution scheme enables the use of different dissimilarity measures at different resolution levels; consequently, the proposed algorithm is noise resistant. Since global clustering information about the image is required, the scale space filter (SSF) is employed as the first step, and the multiresolution approach is then used to refine the segmentation. Experimental results on both synthesized and real images are very encouraging. To evaluate these results quantitatively, a new evaluation criterion is proposed and developed.
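The core loop the abstract describes (a Metropolis-style annealing relaxation over an MRF energy with a data-fidelity term and a Potts smoothness term) can be sketched in miniature. This is a much-simplified illustration, not the paper's method: the quadtree multiresolution framework, the scale space filter, color features, and the paper's dissimilarity measures are all omitted, and every parameter value is invented for the toy example.

```python
import random
from math import exp

random.seed(0)  # deterministic toy run

def local_energy(img, lab, i, j, k, means, beta):
    """Energy of assigning label k at pixel (i, j): data term + Potts smoothness."""
    e = (img[i][j] - means[k]) ** 2                  # fidelity to the class mean
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < len(img) and 0 <= nj < len(img[0]):
            e += beta * (k != lab[ni][nj])           # penalize label disagreement
    return e

def anneal_segment(img, means, beta=2.0, t0=5.0, cooling=0.99, sweeps=200):
    """Simulated-annealing relaxation toward a MAP labeling (illustrative only)."""
    h, w = len(img), len(img[0])
    # initialize each pixel to its nearest class mean
    lab = [[min(range(len(means)), key=lambda k: (v - means[k]) ** 2) for v in row]
           for row in img]
    t = t0
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                k_new = random.randrange(len(means))
                d = (local_energy(img, lab, i, j, k_new, means, beta)
                     - local_energy(img, lab, i, j, lab[i][j], means, beta))
                if d <= 0 or random.random() < exp(-d / t):
                    lab[i][j] = k_new                # Metropolis acceptance rule
        t *= cooling                                 # geometric cooling schedule
    return lab

# noisy 6x6 grayscale image: left half near 0, right half near 10
img = [[(0 if j < 3 else 10) + random.choice([-1, 0, 1]) for j in range(6)]
       for _ in range(6)]
labels = anneal_segment(img, means=[0.0, 10.0])
```

With well-separated class means the data term dominates, so the annealed labeling recovers the two halves of the image despite the added noise.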
530 citations
TL;DR: Together with earlier observations of electronic order in other cuprate families, these findings suggest the existence of a generic charge-ordered state in underdoped cuprates and uncover its intimate connection to the pseudogap regime.
Abstract: The understanding of the origin of superconductivity in cuprates has been hindered by the apparent diversity of intertwining electronic orders in these materials. We combined resonant x-ray scattering (REXS), scanning-tunneling microscopy (STM), and angle-resolved photoemission spectroscopy (ARPES) to observe a charge order that appears consistently in surface and bulk, and in momentum and real space within one cuprate family, Bi2Sr(2-x)La(x)CuO(6+δ). The observed wave vectors rule out simple antinodal nesting in the single-particle limit but match well with a phenomenological model of a many-body instability of the Fermi arcs. Combined with earlier observations of electronic order in other cuprate families, these findings suggest the existence of a generic charge-ordered state in underdoped cuprates and uncover its intimate connection to the pseudogap regime.
529 citations
TL;DR: A series of results supports two general assertions about this process: first, objects are first identified at a particular level of abstraction that is neither the most general nor the most specific possible; second, the entry point for a given object covaries with its typicality, which affects whether or not the object will be identified at the "basic" level.
526 citations
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1,491 more • Institutions (239)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, covering the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.
526 citations
TL;DR: A strong case can be made for moving away from ad hoc use of aggregated efficiency metrics and towards a framework based on purpose-dependent evaluation metrics and benchmarks that allows for more robust model adequacy assessment.
Abstract: A traditional metric used in hydrology to summarize model performance is the Nash–Sutcliffe efficiency (NSE). Increasingly an alternative metric, the Kling–Gupta efficiency (KGE), is used instead. When NSE is used, NSE = 0 corresponds to using the mean flow as a benchmark predictor. The same reasoning is applied in various studies that use KGE as a metric: negative KGE values are viewed as bad model performance, and only positive values are seen as good model performance. Here we show that using the mean flow as a predictor does not result in KGE = 0, but instead KGE = 1 − √2 ≈ −0.41. Thus, KGE values greater than −0.41 indicate that a model improves upon the mean flow benchmark, even if the model's KGE value is negative. NSE and KGE values cannot be directly compared, because their relationship is non-unique and depends in part on the coefficient of variation of the observed time series. Therefore, modellers who use the KGE metric should not let their understanding of NSE values guide them in interpreting KGE values, and should instead develop new understanding based on the constitutive parts of the KGE metric and the explicit use of benchmark values to compare KGE scores against. More generally, a strong case can be made for moving away from ad hoc use of aggregated efficiency metrics and towards a framework based on purpose-dependent evaluation metrics and benchmarks that allows for more robust model adequacy assessment.
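The abstract's central claim is easy to reproduce numerically. Below is a minimal pure-Python sketch of NSE and of the 2009 formulation of KGE, KGE = 1 − √((r−1)² + (α−1)² + (β−1)²); treating the correlation of a constant predictor as r = 0 follows the limiting convention implied by the text, and the toy flow values are invented for illustration.

```python
from math import sqrt
from statistics import mean, pstdev

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE(sim) / SSE(mean-flow benchmark)."""
    m = mean(obs)
    sse = sum((s - o) ** 2 for o, s in zip(obs, sim))
    return 1.0 - sse / sum((o - m) ** 2 for o in obs)

def kge(obs, sim):
    """Kling-Gupta efficiency, 2009 form: 1 - sqrt((r-1)^2 + (a-1)^2 + (b-1)^2)."""
    mo, ms = mean(obs), mean(sim)
    so, ss = pstdev(obs), pstdev(sim)
    if ss == 0.0:
        r = 0.0  # limiting convention: a constant predictor has zero correlation
    else:
        cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / len(obs)
        r = cov / (so * ss)
    alpha = ss / so   # variability ratio
    beta = ms / mo    # bias ratio
    return 1.0 - sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [3.0, 7.0, 2.0, 9.0, 5.0, 4.0]   # toy "observed flows"
bench = [mean(obs)] * len(obs)         # mean-flow benchmark predictor

print(round(nse(obs, bench), 3))       # 0.0
print(round(kge(obs, bench), 3))       # -0.414, i.e. 1 - sqrt(2)
```

The mean-flow benchmark scores NSE = 0 but KGE = 1 − √2 ≈ −0.41 (β = 1 exactly, while r and α both sit a full unit away from their ideal value of 1), which is exactly why negative KGE values cannot be read the way negative NSE values are.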
524 citations
Authors
Showing all 25277 results
Name | H-index | Papers | Citations |
---|---|---|---|
Tomas Hökfelt | 158 | 1033 | 95979 |
Frederick Wolfe | 119 | 417 | 101272 |
Christopher G. Goetz | 116 | 651 | 59510 |
John P. Giesy | 114 | 1162 | 62790 |
Helmut Kettenmann | 104 | 380 | 40211 |
Paul M. O'Byrne | 104 | 605 | 56520 |
Susan S. Taylor | 104 | 518 | 42108 |
Keith A. Hobson | 103 | 653 | 41300 |
Mark S. Tremblay | 100 | 541 | 43843 |
James F. Fries | 100 | 369 | 83589 |
Gordon McKay | 97 | 661 | 61390 |
Jonathan D. Adachi | 96 | 589 | 31641 |
Wenjun Zhang | 96 | 976 | 38530 |
William C. Dement | 96 | 340 | 43014 |
Chris Ryan | 95 | 971 | 34388 |