
Showing papers by "University of Science and Technology of China" published in 2019


Proceedings ArticleDOI
25 Feb 2019
TL;DR: This paper proposes a network that maintains high-resolution representations through the whole process of human pose estimation and empirically demonstrates the effectiveness of the network through the superior pose estimation results over two benchmark datasets: the COCO keypoint detection dataset and the MPII Human Pose dataset.
Abstract: In this paper, we are interested in the human pose estimation problem with a focus on learning reliable high-resolution representations. Most existing methods recover high-resolution representations from low-resolution representations produced by a high-to-low resolution network. Instead, our proposed network maintains high-resolution representations through the whole process. We start from a high-resolution subnetwork as the first stage, gradually add high-to-low resolution subnetworks one by one to form more stages, and connect the multi-resolution subnetworks in parallel. We conduct repeated multi-scale fusions such that each of the high-to-low resolution representations receives information from other parallel representations over and over, leading to rich high-resolution representations. As a result, the predicted keypoint heatmap is potentially more accurate and spatially more precise. We empirically demonstrate the effectiveness of our network through the superior pose estimation results over two benchmark datasets: the COCO keypoint detection dataset and the MPII Human Pose dataset. In addition, we show the superiority of our network in pose tracking on the PoseTrack dataset. The code and models are publicly available at https://github.com/leoxiaobin/deep-high-resolution-net.pytorch.
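To make the parallel multi-resolution idea concrete, below is a minimal PyTorch sketch of a two-branch stage with cross-resolution fusion. The class name, channel counts and two-branch setup are illustrative assumptions, not the released HRNet code (see the linked repository for the authors' implementation).

```python
# Minimal sketch of parallel high/low-resolution streams with repeated fusion.
# Channel counts and the two-branch setup are illustrative, not the HRNet release.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoBranchFusion(nn.Module):
    """Keeps a high- and a low-resolution stream in parallel and exchanges
    information between them, instead of recovering resolution at the end."""
    def __init__(self, c_high=32, c_low=64):
        super().__init__()
        self.high = nn.Conv2d(c_high, c_high, 3, padding=1)   # high-res branch
        self.low = nn.Conv2d(c_low, c_low, 3, padding=1)      # low-res branch
        self.high_to_low = nn.Conv2d(c_high, c_low, 3, stride=2, padding=1)
        self.low_to_high = nn.Conv2d(c_low, c_high, 1)

    def forward(self, x_high, x_low):
        h = F.relu(self.high(x_high))
        l = F.relu(self.low(x_low))
        # Multi-scale fusion: each branch also receives the other branch.
        h_out = h + F.interpolate(self.low_to_high(l), size=h.shape[-2:],
                                  mode="bilinear", align_corners=False)
        l_out = l + self.high_to_low(h)
        return h_out, l_out

if __name__ == "__main__":
    fuse = TwoBranchFusion()
    x_high = torch.randn(1, 32, 64, 48)   # high-resolution stream
    x_low = torch.randn(1, 64, 32, 24)    # half-resolution stream
    h, l = fuse(x_high, x_low)
    print(h.shape, l.shape)  # torch.Size([1, 32, 64, 48]) torch.Size([1, 64, 32, 24])
```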

2,979 citations


Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, +403 more (82 institutions)
TL;DR: In this article, the Event Horizon Telescope was used to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87.
Abstract: When surrounded by a transparent emission region, black holes are expected to reveal a dark shadow caused by gravitational light bending and photon capture at the event horizon. To image and study this phenomenon, we have assembled the Event Horizon Telescope, a global very long baseline interferometry array observing at a wavelength of 1.3 mm. This allows us to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87. We have resolved the central compact radio source as an asymmetric bright emission ring with a diameter of 42 ± 3 μas, which is circular and encompasses a central depression in brightness with a flux ratio ≳10:1. The emission ring is recovered using different calibration and imaging schemes, with its diameter and width remaining stable over four different observations carried out on different days. Overall, the observed image is consistent with expectations for the shadow of a Kerr black hole as predicted by general relativity. The asymmetry in brightness in the ring can be explained in terms of relativistic beaming of the emission from a plasma rotating close to the speed of light around a black hole. We compare our images to an extensive library of ray-traced general-relativistic magnetohydrodynamic simulations of black holes and derive a central mass of M = (6.5 ± 0.7) × 10^9 M☉. Our radio-wave observations thus provide powerful evidence for the presence of supermassive black holes in the centers of galaxies and as the central engines of active galactic nuclei. They also present a new tool to explore gravity in its most extreme limit and on a mass scale that was so far not accessible.

2,589 citations


Journal ArticleDOI
TL;DR: This review provides a comprehensive account of significant progress in the design and synthesis of MOF-based materials, including MOFs, MOF composites and MOF derivatives, and their application to carbon capture and conversion.
Abstract: Rapidly increasing atmospheric CO2 concentrations threaten human society, the natural environment, and the synergy between the two. In order to ameliorate the CO2 problem, carbon capture and conversion techniques have been proposed. Metal–organic framework (MOF)-based materials, a relatively new class of porous materials with unique structural features, high surface areas, chemical tunability and stability, have been extensively studied with respect to their applicability to such techniques. Recently, it has become apparent that the CO2 capture capabilities of MOF-based materials significantly boost their potential toward CO2 conversion. Furthermore, MOF-based materials’ well-defined structures greatly facilitate the understanding of structure–property relationships and their roles in CO2 capture and conversion. In this review, we provide a comprehensive account of significant progress in the design and synthesis of MOF-based materials, including MOFs, MOF composites and MOF derivatives, and their application to carbon capture and conversion. Special emphasis is placed on the relationships between the CO2 capture capacities of MOF-based materials and their catalytic CO2 conversion performance.

1,378 citations


Proceedings ArticleDOI
15 Jun 2019
TL;DR: This work presents a reformulation of Deformable Convolutional Networks that improves its ability to focus on pertinent image regions, through increased modeling power and stronger training, and guides network training via a proposed feature mimicking scheme that helps the network to learn features that reflect the object focus and classification power of R-CNN features.
Abstract: The superior performance of Deformable Convolutional Networks arises from its ability to adapt to the geometric variations of objects. Through an examination of its adaptive behavior, we observe that while the spatial support for its neural features conforms more closely than regular ConvNets to object structure, this support may nevertheless extend well beyond the region of interest, causing features to be influenced by irrelevant image content. To address this problem, we present a reformulation of Deformable ConvNets that improves its ability to focus on pertinent image regions, through increased modeling power and stronger training. The modeling power is enhanced through a more comprehensive integration of deformable convolution within the network, and by introducing a modulation mechanism that expands the scope of deformation modeling. To effectively harness this enriched modeling capability, we guide network training via a proposed feature mimicking scheme that helps the network to learn features that reflect the object focus and classification power of R-CNN features. With the proposed contributions, this new version of Deformable ConvNets yields significant performance gains over the original model and produces leading results on the COCO benchmark for object detection and instance segmentation.
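The modulation mechanism described above can be sketched with torchvision's deformable convolution operator, which accepts an optional per-sample mask (assumed torchvision >= 0.9). Layer sizes, names and initialization here are illustrative, not the paper's released model.

```python
# Hedged sketch of modulated deformable convolution: one branch predicts sampling
# offsets, another predicts modulation scalars that can down-weight irrelevant content.
import torch
import torch.nn as nn
from torchvision.ops import deform_conv2d

class ModulatedDeformConv(nn.Module):
    def __init__(self, c_in, c_out, k=3):
        super().__init__()
        self.k = k
        self.weight = nn.Parameter(torch.randn(c_out, c_in, k, k) * 0.01)
        self.offset_conv = nn.Conv2d(c_in, 2 * k * k, 3, padding=1)  # per-location offsets
        self.mask_conv = nn.Conv2d(c_in, k * k, 3, padding=1)        # per-sample modulation

    def forward(self, x):
        offset = self.offset_conv(x)
        mask = torch.sigmoid(self.mask_conv(x))   # modulation in [0, 1]
        return deform_conv2d(x, offset, self.weight, padding=self.k // 2, mask=mask)

if __name__ == "__main__":
    layer = ModulatedDeformConv(16, 32)
    y = layer(torch.randn(2, 16, 28, 28))
    print(y.shape)  # torch.Size([2, 32, 28, 28])
```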

1,373 citations


Journal ArticleDOI
26 Jul 2019 - PeerJ
TL;DR: Comparing MetaBAT 2 to alternative software tools on over 100 real-world metagenome assemblies shows superior accuracy and computing speed; the authors recommend that the community adopt MetaBAT 2 for their metagenome binning experiments.
Abstract: We previously reported on MetaBAT, an automated metagenome binning software tool to reconstruct single genomes from microbial communities for subsequent analyses of uncultivated microbial species. MetaBAT has become one of the most popular binning tools largely due to its computational efficiency and ease of use, especially in binning experiments with a large number of samples and a large assembly. MetaBAT requires users to choose parameters to fine-tune its sensitivity and specificity. If those parameters are not chosen properly, binning accuracy can suffer, especially on assemblies of poor quality. Here, we developed MetaBAT 2 to overcome this problem. MetaBAT 2 uses a new adaptive binning algorithm to eliminate manual parameter tuning. We also performed extensive software engineering optimization to increase both computational and memory efficiency. Comparing MetaBAT 2 to alternative software tools on over 100 real-world metagenome assemblies shows superior accuracy and computing speed. Binning a typical metagenome assembly takes only a few minutes on a single commodity workstation. We therefore recommend that the community adopt MetaBAT 2 for their metagenome binning experiments. MetaBAT 2 is open source software and available at https://bitbucket.org/berkeleylab/metabat.
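For readers unfamiliar with metagenome binning, the sketch below illustrates one of the signals such binners rely on: a tetranucleotide-frequency (composition) vector per contig. It is only a toy illustration and does not reproduce MetaBAT 2's adaptive scoring or its abundance-correlation component.

```python
# Toy illustration of the composition signal used by metagenome binners:
# a normalized tetranucleotide frequency (TNF) vector per contig.
from itertools import product
from collections import Counter

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]

def tetranucleotide_freq(contig: str) -> list[float]:
    """Return a normalized 256-dimensional tetranucleotide frequency vector."""
    counts = Counter(contig[i:i + 4] for i in range(len(contig) - 3))
    total = sum(counts[k] for k in KMERS) or 1
    return [counts[k] / total for k in KMERS]

if __name__ == "__main__":
    contig = "ACGTACGTGGCCAATTACGTACGT"
    tnf = tetranucleotide_freq(contig)
    print(len(tnf), round(sum(tnf), 3))  # 256 1.0
```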

1,334 citations


Posted Content
TL;DR: The superiority of the proposed HRNet in a wide range of applications, including human pose estimation, semantic segmentation, and object detection, is shown, suggesting that the HRNet is a stronger backbone for computer vision problems.
Abstract: High-resolution representations are essential for position-sensitive vision problems, such as human pose estimation, semantic segmentation, and object detection. Existing state-of-the-art frameworks first encode the input image as a low-resolution representation through a subnetwork that is formed by connecting high-to-low resolution convolutions in series (e.g., ResNet, VGGNet), and then recover the high-resolution representation from the encoded low-resolution representation. Instead, our proposed network, named the High-Resolution Network (HRNet), maintains high-resolution representations through the whole process. There are two key characteristics: (i) connect the high-to-low resolution convolution streams in parallel; (ii) repeatedly exchange information across resolutions. The benefit is that the resulting representation is semantically richer and spatially more precise. We show the superiority of the proposed HRNet in a wide range of applications, including human pose estimation, semantic segmentation, and object detection, suggesting that the HRNet is a stronger backbone for computer vision problems. All the code is available at this https URL.

1,278 citations


Proceedings ArticleDOI
18 Jul 2019
TL;DR: Wang et al. propose Neural Graph Collaborative Filtering (NGCF), which exploits the user-item graph structure by propagating embeddings on it, effectively injecting the collaborative signal into the embedding process in an explicit manner.
Abstract: Learning vector representations (aka. embeddings) of users and items lies at the core of modern recommender systems. Ranging from early matrix factorization to recently emerged deep learning based methods, existing efforts typically obtain a user's (or an item's) embedding by mapping from pre-existing features that describe the user (or the item), such as ID and attributes. We argue that an inherent drawback of such methods is that, the collaborative signal, which is latent in user-item interactions, is not encoded in the embedding process. As such, the resultant embeddings may not be sufficient to capture the collaborative filtering effect. In this work, we propose to integrate the user-item interactions - more specifically the bipartite graph structure - into the embedding process. We develop a new recommendation framework Neural Graph Collaborative Filtering (NGCF), which exploits the user-item graph structure by propagating embeddings on it. This leads to the expressive modeling of high-order connectivity in user-item graph, effectively injecting the collaborative signal into the embedding process in an explicit manner. We conduct extensive experiments on three public benchmarks, demonstrating significant improvements over several state-of-the-art models like HOP-Rec [39] and Collaborative Memory Network [5]. Further analysis verifies the importance of embedding propagation for learning better user and item representations, justifying the rationality and effectiveness of NGCF. Codes are available at https://github.com/xiangwang1223/neural_graph_collaborative_filtering.
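A simplified PyTorch sketch of one embedding-propagation step on a small user-item bipartite graph is given below. The dense adjacency, layer sizes and the exact message form are illustrative simplifications of the NGCF formulation; the linked repository remains the reference implementation.

```python
# Simplified NGCF-style embedding propagation over a user-item bipartite graph.
import torch
import torch.nn as nn

def propagate(emb, adj, w1, w2):
    """One propagation layer: each node aggregates messages from its neighbors,
    including an element-wise (affinity) interaction term, as in NGCF."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    norm_adj = adj / (deg.sqrt() @ deg.sqrt().T)   # symmetric normalization
    side = norm_adj @ emb                          # aggregated neighbor embeddings
    msg = w1(side) + w2(side * emb)                # affinity-aware message
    return torch.relu(w1(emb) + msg)

if __name__ == "__main__":
    n_users, n_items, d = 4, 6, 8
    emb = torch.randn(n_users + n_items, d)        # users first, then items
    adj = torch.zeros(n_users + n_items, n_users + n_items)
    interactions = [(0, 4), (0, 5), (1, 4), (2, 7), (3, 9)]   # (user node, item node)
    for u, i in interactions:
        adj[u, i] = adj[i, u] = 1.0
    w1, w2 = nn.Linear(d, d, bias=False), nn.Linear(d, d, bias=False)
    out = propagate(emb, adj, w1, w2)
    print(out.shape)  # torch.Size([10, 8])
```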

1,225 citations


Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, +251 more (56 institutions)
TL;DR: In this article, the authors present measurements of the properties of the central radio source in M87 using Event Horizon Telescope data obtained during the 2017 campaign, and find that >50% of the total flux at arcsecond scales comes from near the horizon and that the emission is dramatically suppressed interior to this region by a factor >10, providing direct evidence of the predicted shadow of a black hole.
Abstract: We present measurements of the properties of the central radio source in M87 using Event Horizon Telescope data obtained during the 2017 campaign. We develop and fit geometric crescent models (asymmetric rings with interior brightness depressions) using two independent sampling algorithms that consider distinct representations of the visibility data. We show that the crescent family of models is statistically preferred over other comparably complex geometric models that we explore. We calibrate the geometric model parameters using general relativistic magnetohydrodynamic (GRMHD) models of the emission region and estimate physical properties of the source. We further fit images generated from GRMHD models directly to the data. We compare the derived emission region and black hole parameters from these analyses with those recovered from reconstructed images. There is a remarkable consistency among all methods and data sets. We find that >50% of the total flux at arcsecond scales comes from near the horizon, and that the emission is dramatically suppressed interior to this region by a factor >10, providing direct evidence of the predicted shadow of a black hole. Across all methods, we measure a crescent diameter of 42 ± 3 μas and constrain its fractional width to be <0.5. Associating the crescent feature with the emission surrounding the black hole shadow, we infer an angular gravitational radius of GM/Dc² = 3.8 ± 0.4 μas. Folding in a distance measurement of 16.8 (+0.8/−0.7) Mpc gives a black hole mass of M = (6.5 ± 0.2|stat ± 0.7|sys) × 10^9 M☉. This measurement from lensed emission near the event horizon is consistent with the presence of a central Kerr black hole, as predicted by the general theory of relativity.
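As a quick arithmetic check of the quoted numbers, the snippet below converts the angular gravitational radius and the distance from the abstract into a mass using θ_g = GM/(Dc²) and standard physical constants.

```python
# Consistency check of the quoted numbers: an angular gravitational radius of
# 3.8 uas at D = 16.8 Mpc should give M ~ 6.5e9 solar masses.
G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m s^-1
M_sun = 1.989e30       # kg
Mpc = 3.0857e22        # m
uas_to_rad = 1e-6 / 206265.0

theta_g = 3.8 * uas_to_rad          # angular gravitational radius, radians
D = 16.8 * Mpc                      # distance to M87, meters
M = theta_g * D * c**2 / G          # from theta_g = G M / (D c^2)
print(f"M = {M / M_sun:.2e} M_sun")  # ~6.5e+09 M_sun
```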

1,024 citations


Journal ArticleDOI
TL;DR: In this paper, a spontaneous gas-foaming method was used to prepare nitrogen-doped ultrathin carbon nanosheets (NCNs) by simply pyrolysing a mixture of citric acid and NH4Cl.
Abstract: Rational design and facile preparation of non-noble trifunctional electrocatalysts with high performance, low cost and strong durability for the oxygen reduction reaction (ORR), oxygen evolution reaction (OER) and hydrogen evolution reaction (HER) are in high demand but remain a significant challenge. Herein, we report a spontaneous gas-foaming method to prepare nitrogen-doped ultrathin carbon nanosheets (NCNs) by simply pyrolysing a mixture of citric acid and NH4Cl. Under the optimized pyrolysis temperature (carbonized at 1000 °C) and mass ratio of precursors (1 : 1), the synthesized NCN-1000-5 sample possesses an ultrathin sheet structure, an ultrahigh specific surface area (1793 m2 g−1), and rich edge defects, and exhibits low overpotential and robust stability for the ORR, OER and HER. By means of density functional theory (DFT) computations, we revealed that the intrinsic active sites for the ORR, OER and HER are the carbon atoms located at the armchair edge and adjacent to the graphitic N dopants. When practically used as a catalyst in rechargeable Zn–air batteries, a high energy density (806 W h kg−1), a low charge/discharge voltage gap (0.77 V) and an ultralong cycle life (over 330 h) were obtained at 10 mA cm−2 for NCN-1000-5. This work not only presents a versatile strategy to develop advanced carbon materials with ultrahigh specific surface area and abundant edge defects, but also provides useful guidance for designing and developing multifunctional metal-free catalysts for various energy-related electrocatalytic reactions.

955 citations


Proceedings ArticleDOI
TL;DR: This work develops a new recommendation framework Neural Graph Collaborative Filtering (NGCF), which exploits the user-item graph structure by propagating embeddings on it, effectively injecting the collaborative signal into the embedding process in an explicit manner.
Abstract: Learning vector representations (aka. embeddings) of users and items lies at the core of modern recommender systems. Ranging from early matrix factorization to recently emerged deep learning based methods, existing efforts typically obtain a user's (or an item's) embedding by mapping from pre-existing features that describe the user (or the item), such as ID and attributes. We argue that an inherent drawback of such methods is that, the collaborative signal, which is latent in user-item interactions, is not encoded in the embedding process. As such, the resultant embeddings may not be sufficient to capture the collaborative filtering effect. In this work, we propose to integrate the user-item interactions -- more specifically the bipartite graph structure -- into the embedding process. We develop a new recommendation framework Neural Graph Collaborative Filtering (NGCF), which exploits the user-item graph structure by propagating embeddings on it. This leads to the expressive modeling of high-order connectivity in user-item graph, effectively injecting the collaborative signal into the embedding process in an explicit manner. We conduct extensive experiments on three public benchmarks, demonstrating significant improvements over several state-of-the-art models like HOP-Rec and Collaborative Memory Network. Further analysis verifies the importance of embedding propagation for learning better user and item representations, justifying the rationality and effectiveness of NGCF. Codes are available at this https URL.

953 citations


Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, +251 more (58 institutions)
TL;DR: In this article, the first Event Horizon Telescope (EHT) images of M87 were presented, using observations from April 2017 at 1.3 mm wavelength, showing a prominent ring with a diameter of ~40 μas, consistent with the size and shape of the lensed photon orbit encircling the "shadow" of a supermassive black hole.
Abstract: We present the first Event Horizon Telescope (EHT) images of M87, using observations from April 2017 at 1.3 mm wavelength. These images show a prominent ring with a diameter of ~40 μas, consistent with the size and shape of the lensed photon orbit encircling the "shadow" of a supermassive black hole. The ring is persistent across four observing nights and shows enhanced brightness in the south. To assess the reliability of these results, we implemented a two-stage imaging procedure. In the first stage, four teams, each blind to the others' work, produced images of M87 using both an established method (CLEAN) and a newer technique (regularized maximum likelihood). This stage allowed us to avoid shared human bias and to assess common features among independent reconstructions. In the second stage, we reconstructed synthetic data from a large survey of imaging parameters and then compared the results with the corresponding ground truth images. This stage allowed us to select parameters objectively to use when reconstructing images of M87. Across all tests in both stages, the ring diameter and asymmetry remained stable, insensitive to the choice of imaging technique. We describe the EHT imaging procedures, the primary image features in M87, and the dependence of these features on imaging assumptions.

Journal ArticleDOI
TL;DR: In this article, the authors present a comprehensive study and evaluation of existing single-image dehazing algorithms, using a new large-scale benchmark consisting of both synthetic and real-world hazy images, called REalistic Single-Image DEhazing (RESIDE).
Abstract: We present a comprehensive study and evaluation of existing single-image dehazing algorithms, using a new large-scale benchmark consisting of both synthetic and real-world hazy images, called REalistic Single-Image DEhazing (RESIDE). RESIDE highlights diverse data sources and image contents, and is divided into five subsets, each serving different training or evaluation purposes. We further provide a rich variety of criteria for dehazing algorithm evaluation, ranging from full-reference metrics to no-reference metrics and to subjective evaluation, and the novel task-driven evaluation. Experiments on RESIDE shed light on the comparisons and limitations of the state-of-the-art dehazing algorithms, and suggest promising future directions.
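The full-reference side of such an evaluation typically reduces to metrics like PSNR and SSIM against the haze-free ground truth. Below is a minimal sketch using scikit-image (the channel_axis argument assumes version 0.19 or later); RESIDE's no-reference, subjective and task-driven criteria are not reproduced here.

```python
# Minimal full-reference dehazing evaluation: PSNR and SSIM against ground truth.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def full_reference_scores(ground_truth: np.ndarray, dehazed: np.ndarray):
    psnr = peak_signal_noise_ratio(ground_truth, dehazed, data_range=1.0)
    ssim = structural_similarity(ground_truth, dehazed, data_range=1.0, channel_axis=-1)
    return psnr, ssim

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.random((64, 64, 3)).astype(np.float32)   # stand-in for a haze-free image
    output = np.clip(gt + 0.05 * rng.standard_normal(gt.shape), 0, 1).astype(np.float32)
    print(full_reference_scores(gt, output))
```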

Posted Content
TL;DR: A new pre-trainable generic representation for visual-linguistic tasks, called Visual-Linguistic BERT (VL-BERT), which adopts the simple yet powerful Transformer model as the backbone, and extends it to take both visual and linguistic embedded features as input.
Abstract: We introduce a new pre-trainable generic representation for visual-linguistic tasks, called Visual-Linguistic BERT (VL-BERT for short). VL-BERT adopts the simple yet powerful Transformer model as the backbone, and extends it to take both visual and linguistic embedded features as input. In it, each element of the input is either a word from the input sentence or a region-of-interest (RoI) from the input image. It is designed to fit most visual-linguistic downstream tasks. To better exploit the generic representation, we pre-train VL-BERT on the massive-scale Conceptual Captions dataset, together with a text-only corpus. Extensive empirical analysis demonstrates that the pre-training procedure can better align the visual-linguistic clues and benefit downstream tasks, such as visual commonsense reasoning, visual question answering and referring expression comprehension. It is worth noting that VL-BERT achieved first place among single models on the leaderboard of the VCR benchmark. Code is released at this https URL.
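The input construction, with word tokens and RoI features embedded into one sequence for a single Transformer, can be sketched as below. The dimensions, the RoI projection and the plain nn.TransformerEncoder are illustrative stand-ins rather than VL-BERT's released architecture.

```python
# Schematic sketch: embed word tokens and RoI features into a common space,
# add segment embeddings, and process the joint sequence with one Transformer.
import torch
import torch.nn as nn

class TinyVisualLinguisticEncoder(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, roi_dim=2048):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)
        self.roi_proj = nn.Linear(roi_dim, d_model)    # visual feature -> token space
        self.segment_emb = nn.Embedding(2, d_model)    # 0 = text, 1 = visual
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, token_ids, roi_feats):
        text = self.word_emb(token_ids) + self.segment_emb(torch.zeros_like(token_ids))
        vis_seg = torch.ones(roi_feats.shape[:2], dtype=torch.long, device=roi_feats.device)
        visual = self.roi_proj(roi_feats) + self.segment_emb(vis_seg)
        return self.encoder(torch.cat([text, visual], dim=1))   # joint sequence

if __name__ == "__main__":
    model = TinyVisualLinguisticEncoder()
    tokens = torch.randint(0, 1000, (2, 12))   # 12 word tokens per example
    rois = torch.randn(2, 5, 2048)             # 5 RoI features per image
    print(model(tokens, rois).shape)           # torch.Size([2, 17, 64])
```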

Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, +259 more (62 institutions)
TL;DR: In this article, a large library of models based on general relativistic magnetohydrodynamic (GRMHD) simulations and synthetic images produced by general relativistic ray tracing was constructed and compared with the observed visibilities.
Abstract: The Event Horizon Telescope (EHT) has mapped the central compact radio source of the elliptical galaxy M87 at 1.3 mm with unprecedented angular resolution. Here we consider the physical implications of the asymmetric ring seen in the 2017 EHT data. To this end, we construct a large library of models based on general relativistic magnetohydrodynamic (GRMHD) simulations and synthetic images produced by general relativistic ray tracing. We compare the observed visibilities with this library and confirm that the asymmetric ring is consistent with earlier predictions of strong gravitational lensing of synchrotron emission from a hot plasma orbiting near the black hole event horizon. The ring radius and ring asymmetry depend on black hole mass and spin, respectively, and both are therefore expected to be stable when observed in future EHT campaigns. Overall, the observed image is consistent with expectations for the shadow of a spinning Kerr black hole as predicted by general relativity. If the black hole spin and M87's large scale jet are aligned, then the black hole spin vector is pointed away from Earth. Models in our library of non-spinning black holes are inconsistent with the observations as they do not produce sufficiently powerful jets. At the same time, in those models that produce a sufficiently powerful jet, the latter is powered by extraction of black hole spin energy through mechanisms akin to the Blandford-Znajek process. We briefly consider alternatives to a black hole for the central compact object. Analysis of existing EHT polarization data and data taken simultaneously at other wavelengths will soon enable new tests of the GRMHD models, as will future EHT campaigns at 230 and 345 GHz.

Journal ArticleDOI
01 Mar 2019
TL;DR: Shui et al. report a class of concave Fe–N–C single-atom catalysts possessing an enhanced external surface area and mesoporosity that meets the 2018 PGM-free catalyst activity target.
Abstract: To achieve the US Department of Energy 2018 target set for platinum-group metal-free catalysts (PGM-free catalysts) in proton exchange membrane fuel cells, the low density of active sites must be overcome. Here, we report a class of concave Fe–N–C single-atom catalysts possessing an enhanced external surface area and mesoporosity that meets the 2018 PGM-free catalyst activity target, and a current density of 0.047 A cm−2 at 0.88 V (iR-free) under 1.0 bar H2–O2. This performance stems from the high density of active sites, which is realized through exposing inaccessible Fe–N4 moieties (that is, increasing their utilization) and enhancing the mass transport of the catalyst layer. Further, we establish structure–property correlations that provide a route for designing highly efficient PGM-free catalysts for practical application, achieving a power density of 1.18 W cm−2 under 2.5 bar H2–O2, and an activity of 129 mA cm−2 at 0.8 V (iR-free) under 1.0 bar H2–air. Iron single-atom catalysts are among the most promising fuel cell cathode materials in acid electrolyte solution. Now, Shui, Xu and co-workers report concave-shaped Fe–N–C nanoparticles with increased availability of active sites and improved mass transport, meeting the US Department of Energy 2018 target for platinum-group metal-free fuel cell catalysts.

Proceedings ArticleDOI
25 Jul 2019
TL;DR: Wang et al. propose Knowledge Graph Attention Network (KGAT), which explicitly models high-order connectivities in the knowledge graph (KG) in an end-to-end fashion.
Abstract: To provide more accurate, diverse, and explainable recommendation, it is compulsory to go beyond modeling user-item interactions and take side information into account. Traditional methods like factorization machine (FM) cast it as a supervised learning problem, which assumes each interaction as an independent instance with side information encoded. Due to the overlook of the relations among instances or items (e.g., the director of a movie is also an actor of another movie), these methods are insufficient to distill the collaborative signal from the collective behaviors of users. In this work, we investigate the utility of knowledge graph (KG), which breaks down the independent interaction assumption by linking items with their attributes. We argue that in such a hybrid structure of KG and user-item graph, high-order relations --- which connect two items with one or multiple linked attributes --- are an essential factor for successful recommendation. We propose a new method named Knowledge Graph Attention Network (KGAT) which explicitly models the high-order connectivities in KG in an end-to-end fashion. It recursively propagates the embeddings from a node's neighbors (which can be users, items, or attributes) to refine the node's embedding, and employs an attention mechanism to discriminate the importance of the neighbors. Our KGAT is conceptually advantageous to existing KG-based recommendation methods, which either exploit high-order relations by extracting paths or implicitly modeling them with regularization. Empirical results on three public benchmarks show that KGAT significantly outperforms state-of-the-art methods like Neural FM and RippleNet. Further studies verify the efficacy of embedding propagation for high-order relation modeling and the interpretability benefits brought by the attention mechanism. We release the codes and datasets at https://github.com/xiangwang1223/knowledge_graph_attention_network.
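The attentive neighbor aggregation at the heart of KGAT can be sketched as follows: score each (head, relation, tail) neighbor, normalize the scores with a softmax, and aggregate tail embeddings to refine the head node. The scoring form and shapes below loosely follow the paper's description; the linked repository contains the authors' implementation.

```python
# Simplified sketch of attentive aggregation over knowledge-graph neighbors.
import torch
import torch.nn.functional as F

def attentive_aggregate(e_head, e_rels, e_tails, w_r):
    """e_head: (d,), e_rels and e_tails: (n_neighbors, d), w_r: (d, d)."""
    proj_head = e_head @ w_r                       # relation-space projection of head
    proj_tails = e_tails @ w_r                     # and of each neighboring tail
    scores = (proj_tails * torch.tanh(proj_head + e_rels)).sum(dim=1)
    attn = F.softmax(scores, dim=0)                # importance of each neighbor
    refined = attn.unsqueeze(1).mul(e_tails).sum(dim=0)
    return attn, refined

if __name__ == "__main__":
    d, n = 8, 4
    torch.manual_seed(0)
    e_head, e_rels, e_tails = torch.randn(d), torch.randn(n, d), torch.randn(n, d)
    attn, agg = attentive_aggregate(e_head, e_rels, e_tails, torch.randn(d, d))
    print(attn, agg.shape)  # neighbor weights sum to 1; aggregated embedding is (8,)
```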

Proceedings ArticleDOI
TL;DR: This work proposes a new method named Knowledge Graph Attention Network (KGAT), which explicitly models the high-order connectivities in KG in an end-to-end fashion and significantly outperforms state-of-the-art methods like Neural FM and RippleNet.
Abstract: To provide more accurate, diverse, and explainable recommendation, it is compulsory to go beyond modeling user-item interactions and take side information into account. Traditional methods like factorization machine (FM) cast it as a supervised learning problem, which assumes each interaction as an independent instance with side information encoded. Due to the overlook of the relations among instances or items (e.g., the director of a movie is also an actor of another movie), these methods are insufficient to distill the collaborative signal from the collective behaviors of users. In this work, we investigate the utility of knowledge graph (KG), which breaks down the independent interaction assumption by linking items with their attributes. We argue that in such a hybrid structure of KG and user-item graph, high-order relations --- which connect two items with one or multiple linked attributes --- are an essential factor for successful recommendation. We propose a new method named Knowledge Graph Attention Network (KGAT) which explicitly models the high-order connectivities in KG in an end-to-end fashion. It recursively propagates the embeddings from a node's neighbors (which can be users, items, or attributes) to refine the node's embedding, and employs an attention mechanism to discriminate the importance of the neighbors. Our KGAT is conceptually advantageous to existing KG-based recommendation methods, which either exploit high-order relations by extracting paths or implicitly modeling them with regularization. Empirical results on three public benchmarks show that KGAT significantly outperforms state-of-the-art methods like Neural FM and RippleNet. Further studies verify the efficacy of embedding propagation for high-order relation modeling and the interpretability benefits brought by the attention mechanism.

Journal ArticleDOI
TL;DR: It is shown that MOFs provide a powerful platform to study photocatalysis, in which the involved three key processes, namely, light harvesting, electron-hole separation, and surface redox reactions, can be rationally improved.
Abstract: Conspectus: To meet the ever-increasing global demand for energy, conversion of solar energy to chemical/thermal energy is very promising. Light-mediated catalysis, including photocatalysis (organic transformations, water splitting, CO2 reduction, etc.) and photothermal catalysis play key roles in solar to chemical/thermal energy conversion via the light–matter interaction. The major challenges in traditional semiconductor photocatalysts include insufficient sunlight utilization, charge carrier recombination, limited exposure of active sites, and particularly the difficulty of understanding the structure–activity relationship. Metal–organic frameworks (MOFs), featuring semiconductor-like behavior, have recently captured broad interest toward photocatalysis and photothermal catalysis because of their well-defined and tailorable porous structures, high surface areas, etc. These advantages are beneficial for rational structural modulation for improved light harvesting and charge separation as well as other eff...

Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, +394 more (78 institutions)
TL;DR: The Event Horizon Telescope (EHT) as mentioned in this paper is a very long baseline interferometry (VLBI) array that comprises millimeter and submillimeter-wavelength telescopes separated by distances comparable to the diameter of the Earth.
Abstract: The Event Horizon Telescope (EHT) is a very long baseline interferometry (VLBI) array that comprises millimeter- and submillimeter-wavelength telescopes separated by distances comparable to the diameter of the Earth. At a nominal operating wavelength of ~1.3 mm, EHT angular resolution (λ/D) is ~25 μas, which is sufficient to resolve nearby supermassive black hole candidates on spatial and temporal scales that correspond to their event horizons. With this capability, the EHT scientific goals are to probe general relativistic effects in the strong-field regime and to study accretion and relativistic jet formation near the black hole boundary. In this Letter we describe the system design of the EHT, detail the technology and instrumentation that enable observations, and provide measures of its performance. Meeting the EHT science objectives has required several key developments that have facilitated the robust extension of the VLBI technique to EHT observing wavelengths and the production of instrumentation that can be deployed on a heterogeneous array of existing telescopes and facilities. To meet sensitivity requirements, high-bandwidth digital systems were developed that process data at rates of 64 gigabit s−1, exceeding those of currently operating cm-wavelength VLBI arrays by more than an order of magnitude. Associated improvements include the development of phasing systems at array facilities, new receiver installation at several sites, and the deployment of hydrogen maser frequency standards to ensure coherent data capture across the array. These efforts led to the coordination and execution of the first Global EHT observations in 2017 April, and to event-horizon-scale imaging of the supermassive black hole candidate in M87.
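A back-of-envelope check of the quoted resolution: the diffraction limit λ/D for a 1.3 mm wavelength and an Earth-diameter baseline is of the same order as the ~25 μas figure above (the baseline length used here is an assumed round number).

```python
# Order-of-magnitude check of the quoted EHT angular resolution.
wavelength = 1.3e-3            # m, nominal observing wavelength
baseline = 1.27e7              # m, roughly the diameter of the Earth (assumed)
rad_to_uas = 206265.0 * 1e6    # 1 rad = 206265 arcsec = 2.06e11 uas
print(f"lambda/D ~ {wavelength / baseline * rad_to_uas:.0f} uas")  # ~21 uas
```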

Journal ArticleDOI
TL;DR: This review summarizes the mechanisms of intrinsic- and extrinsic-environment-induced decomposition of perovskite quantum dots and some possible solutions to improve the stability of PQDs together with suggestions for further improving the performance of pc-LEDs as well as the device lifetime.
Abstract: Beyond the unprecedented success achieved in photovoltaics (PVs), lead halide perovskites (LHPs) have shown great potential in other optoelectronic devices. Among them, nanometer-scale perovskite quantum dots (PQDs) with fascinating optical properties including high brightness, tunable emission wavelength, high color purity, and high defect tolerance have been regarded as promising alternative down-conversion materials in phosphor-converted light-emitting diodes (pc-LEDs) for lighting and next-generation of display technology. Despite the promising applications of perovskite materials in various fields, they have received strong criticism for the lack of stability. The poor stability has also attracted much attention. Within a few years, numerous strategies towards enhancing the stability have been developed. This review summarizes the mechanisms of intrinsic- and extrinsic-environment-induced decomposition of PQDs. Simultaneously, the strategies for improving the stability of PQDs are reviewed in detail, which can be classified into four types: (1) compositional engineering; (2) surface engineering; (3) matrix encapsulation; (4) device encapsulation. Finally, the challenges for applying PQDs in pc-LEDs are highlighted, and some possible solutions to improve the stability of PQDs together with suggestions for further improving the performance of pc-LEDs as well as the device lifetime are provided.

Journal ArticleDOI
TL;DR: In this article, the authors show that the formation of a highly stable Cu-C-O-In intermediate at the Cu-In dual sites is the key feature determining selectivity.
Abstract: Due to the large number of possible products and their similar reduction potentials, a significant challenge in CO2 photoreduction is achieving selectivity to a single product while maintaining high conversion efficiency. Controlling the reaction intermediates that form on the catalyst surface through careful catalyst design is therefore crucial. Here, we prepare atomically thin layers of sulfur-deficient CuIn5S8 that contain charge-enriched Cu–In dual sites, which are highly selective towards photocatalytic production of CH4 from CO2. We propose that the formation of a highly stable Cu–C–O–In intermediate at the Cu–In dual sites is the key feature determining selectivity. We suggest that this configuration not only lowers the overall activation energy barrier, but also converts the endoergic protonation step to an exoergic reaction process, thus changing the reaction pathway to form CH4 instead of CO. As a result, the CuIn5S8 single-unit-cell layers achieve near 100% selectivity for visible-light-driven CO2 reduction to CH4 over CO, with a rate of 8.7 μmol g−1 h−1. Many different molecules can form during photocatalytic reduction of CO2, so identifying catalyst structure–product selectivity relationships is vital. Here, the authors find that sulfur-deficient CuIn5S8 is highly selective to CH4 and suggest that the presence of Cu–In binding sites is key to this behaviour.

Proceedings ArticleDOI
01 Jun 2019
TL;DR: This work develops a new framework for incrementally learning a unified classifier, e.g. a classifier that treats both old and new classes uniformly, and incorporates three components, cosine normalization, less-forget constraint, and inter-class separation, to mitigate the adverse effects of the imbalance.
Abstract: Conventionally, deep neural networks are trained offline, relying on a large dataset prepared in advance. This paradigm is often challenged in real-world applications, e.g. online services that involve continuous streams of incoming data. Recently, incremental learning receives increasing attention, and is considered as a promising solution to the practical challenges mentioned above. However, it has been observed that incremental learning is subject to a fundamental difficulty -- catastrophic forgetting, namely adapting a model to new data often results in severe performance degradation on previous tasks or classes. Our study reveals that the imbalance between previous and new data is a crucial cause to this problem. In this work, we develop a new framework for incrementally learning a unified classifier, e.g. a classifier that treats both old and new classes uniformly. Specifically, we incorporate three components, cosine normalization, less-forget constraint, and inter-class separation, to mitigate the adverse effects of the imbalance. Experiments show that the proposed method can effectively rebalance the training process, thus obtaining superior performance compared to the existing methods. On CIFAR-100 and ImageNet, our method can reduce the classification errors by more than 6% and 13% respectively, under the incremental setting of 10 phases.
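Two of the three components mentioned above can be sketched compactly in PyTorch: a cosine-normalized classifier, which puts old- and new-class logits on a comparable scale, and a less-forget style term that discourages features from rotating away from those of the frozen old model. Scalars, shapes and names are illustrative assumptions, not the paper's released code.

```python
# Hedged sketch of cosine normalization and a less-forget style distillation term.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifier(nn.Module):
    def __init__(self, feat_dim, n_classes, scale=16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(n_classes, feat_dim) * 0.01)
        self.scale = scale

    def forward(self, features):
        # Both features and class weights are L2-normalized, so old and new
        # classes produce logits on a comparable scale.
        return self.scale * F.normalize(features, dim=1) @ F.normalize(self.weight, dim=1).T

def less_forget_loss(new_feats, old_feats):
    # Penalize rotation of each sample's feature relative to the frozen old model.
    return (1.0 - F.cosine_similarity(new_feats, old_feats, dim=1)).mean()

if __name__ == "__main__":
    clf = CosineClassifier(feat_dim=64, n_classes=20)
    feats_new, feats_old = torch.randn(8, 64), torch.randn(8, 64)
    logits = clf(feats_new)
    loss = F.cross_entropy(logits, torch.randint(0, 20, (8,))) + less_forget_loss(feats_new, feats_old)
    print(logits.shape, float(loss))
```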

Journal ArticleDOI
TL;DR: In this paper, the authors provide a comprehensive review of the thermal runaway phenomenon and related fire dynamics in single- and multi-cell battery packs, as well as potential fire prevention measures.

Journal ArticleDOI
TL;DR: This review summarizes recent advances in the design and synthesis of stable MOFs and highlights the relationships between the stability and functional applications.
Abstract: Metal-organic frameworks (MOFs) have been recognized as one of the most important classes of porous materials due to their unique attributes and chemical versatility. Unfortunately, some MOFs suffer from the drawback of relatively poor stability, which would limit their practical applications. In the recent past, great efforts have been invested in developing strategies to improve the stability of MOFs. In general, stable MOFs possess potential toward a broader range of applications. In this review, we summarize recent advances in the design and synthesis of stable MOFs and MOF-based materials via de novo synthesis and/or post-synthetic structural processing. Also, the relationships between the stability and functional applications of MOFs are highlighted, and finally, the subsisting challenges and the directions that future research in this field may take have been indicated.

Journal ArticleDOI
TL;DR: In this paper, the authors used onion-like nanospheres of carbon (OLC) to anchor stable atomically dispersed Pt to act as a catalyst for hydrogen evolution reaction (HER) electrocatalysts.
Abstract: Dispersing catalytically active metals as single atoms on supports represents the ultimate in metal utilization efficiency and is increasingly being used as a strategy to design hydrogen evolution reaction (HER) electrocatalysts. Although platinum (Pt) is highly active for HER, given its high cost it is desirable to find ways to improve performance further while minimizing the Pt loading. Here, we use onion-like nanospheres of carbon (OLC) to anchor stable atomically dispersed Pt to act as a catalyst (Pt1/OLC) for the HER. In acidic media, the performance of the Pt1/OLC catalyst (0.27 wt% Pt) in terms of a low overpotential (38 mV at 10 mA cm−2) and high turnover frequencies (40.78 H2 s−1 at 100 mV) is better than that of a graphene-supported single-atom catalyst with a similar Pt loading, and comparable to a commercial Pt/C catalyst with 20 wt% Pt. First-principle calculations suggest that a tip-enhanced local electric field at the Pt site on the curved support promotes the reaction kinetics for hydrogen evolution. Isolating metal atoms on supports is becoming an increasingly studied approach to design water splitting electrocatalysts. Here, the authors prepare a hydrogen evolution catalyst comprising atomically dispersed Pt atoms on curved carbon supports, which outperform similar catalysts where the support is flat.

Book ChapterDOI
Matej Kristan, Ales Leonardis, Jiří Matas, Michael Felsberg, +155 more (47 institutions)
23 Jan 2019
TL;DR: The Visual Object Tracking challenge VOT2018 is the sixth annual tracker benchmarking activity organized by the VOT initiative; results of over eighty trackers are presented; many are state-of-the-art trackers published at major computer vision conferences or in journals in the recent years.
Abstract: The Visual Object Tracking challenge VOT2018 is the sixth annual tracker benchmarking activity organized by the VOT initiative. Results of over eighty trackers are presented; many are state-of-the-art trackers published at major computer vision conferences or in journals in the recent years. The evaluation included the standard VOT and other popular methodologies for short-term tracking analysis and a “real-time” experiment simulating a situation where a tracker processes images as if provided by a continuously running sensor. A long-term tracking subchallenge has been introduced to the set of standard VOT sub-challenges. The new subchallenge focuses on long-term tracking properties, namely coping with target disappearance and reappearance. A new dataset has been compiled and a performance evaluation methodology that focuses on long-term tracking capabilities has been adopted. The VOT toolkit has been updated to support both standard short-term and the new long-term tracking subchallenges. Performance of the tested trackers typically by far exceeds standard baselines. The source code for most of the trackers is publicly available from the VOT page. The dataset, the evaluation kit and the results are publicly available at the challenge website (http://votchallenge.net).

Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, +243 more (60 institutions)
TL;DR: In this paper, the Event Horizon Telescope (EHT) 1.3 mm radio wavelength observations of the supermassive black hole candidate at the center of the radio galaxy M87 and the quasar 3C 279, taken during the 2017 April 5-11 observing campaign are presented.
Abstract: We present the calibration and reduction of Event Horizon Telescope (EHT) 1.3 mm radio wavelength observations of the supermassive black hole candidate at the center of the radio galaxy M87 and the quasar 3C 279, taken during the 2017 April 5–11 observing campaign. These global very long baseline interferometric observations include for the first time the highly sensitive Atacama Large Millimeter/submillimeter Array (ALMA), reaching an angular resolution of 25 μas, with characteristic sensitivity limits of ~1 mJy on baselines to ALMA and ~10 mJy on other baselines. The observations present challenges for existing data processing tools, arising from the rapid atmospheric phase fluctuations, wide recording bandwidth, and highly heterogeneous array. In response, we developed three independent pipelines for phase calibration and fringe detection, each tailored to the specific needs of the EHT. The final data products include calibrated total intensity amplitude and phase information. They are validated through a series of quality assurance tests that show consistency across pipelines and set limits on baseline systematic errors of 2% in amplitude and 1° in phase. The M87 data reveal the presence of two nulls in correlated flux density at ~3.4 and ~8.3 Gλ and temporal evolution in closure quantities, indicating intrinsic variability of compact structure on a timescale of days, or several light-crossing times for a black hole of a few billion solar masses. These measurements provide the first opportunity to image horizon-scale structure in M87.
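For context on the closure quantities mentioned above, the snippet below computes a closure phase for a station triangle and shows that station-based phase errors cancel in the product of the three baseline visibilities; the numbers are synthetic.

```python
# Closure phase for a station triangle: the sum of baseline visibility phases
# around the loop, which is immune to station-based phase (gain) errors.
import numpy as np

def closure_phase(v_ab: complex, v_bc: complex, v_ca: complex) -> float:
    """Closure phase (degrees) for baselines AB, BC and CA."""
    return float(np.degrees(np.angle(v_ab * v_bc * v_ca)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Unit-modulus station gain errors g_i cancel in the closure product.
    g = np.exp(1j * rng.uniform(0, 2 * np.pi, 3))
    true = [1.0 * np.exp(1j * 0.3), 0.8 * np.exp(1j * -0.5), 0.6 * np.exp(1j * 0.1)]
    observed = [true[0] * g[0] * np.conj(g[1]),
                true[1] * g[1] * np.conj(g[2]),
                true[2] * g[2] * np.conj(g[0])]
    print(closure_phase(*true), closure_phase(*observed))  # equal up to rounding
```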

Journal ArticleDOI
01 Apr 2019
TL;DR: Wu et al. constructed a series of alloy-supported Ru1 catalysts using different PtCu alloys through sequential acid etching and electrochemical leaching, and found a volcano relation between OER activity and the lattice constant of the alloys.
Abstract: Single-atom precious metal catalysts hold the promise of perfect atom utilization, yet control of their activity and stability remains challenging. Here we show that engineering the electronic structure of atomically dispersed Ru1 on metal supports via compressive strain boosts the kinetically sluggish electrocatalytic oxygen evolution reaction (OER), and mitigates the degradation of Ru-based electrocatalysts in an acidic electrolyte. We construct a series of alloy-supported Ru1 using different PtCu alloys through sequential acid etching and electrochemical leaching, and find a volcano relation between OER activity and the lattice constant of the PtCu alloys. Our best catalyst, Ru1–Pt3Cu, delivers 90 mV lower overpotential to reach a current density of 10 mA cm−2, and an order of magnitude longer lifetime over that of commercial RuO2. Density functional theory investigations reveal that the compressive strain of the Ptskin shell engineers the electronic structure of the Ru1, allowing optimized binding of oxygen species and better resistance to over-oxidation and dissolution. While Ru-based electrocatalysts are among the most active for acidic water oxidation, they suffer from severe deactivation. Now, Yuen Wu, Wei-Xue Li and co-workers report a core–shell Ru1–Pt3Cu catalyst with surface-dispersed Ru atoms for a highly active and stable oxygen evolution reaction in acid electrolyte.

Journal ArticleDOI
16 Jan 2019 - Joule
TL;DR: In this paper, earth-abundant Ni single-atom catalysts were synthesized on commercial carbon black via a facile method and further employed in a gas-phase electrocatalytic reactor under ambient conditions.

Journal ArticleDOI
TL;DR: In this article, the authors demonstrate that robust bifunctional oxygen reduction reaction (ORR) and oxygen evolution reaction (OER) activity can be achieved by inducing lattice strain in noble-metal-free metal-organic frameworks (MOFs).
Abstract: Oxygen electrocatalysis is central to technologies such as fuel cells and electrolysers, but challenges remain due to the lack of effective earth-abundant electrocatalysts and insufficient understanding of catalytic mechanisms. Here we demonstrate that robust bifunctional oxygen reduction reaction (ORR) and oxygen evolution reaction (OER) activity can be achieved by inducing lattice strain in noble-metal-free metal–organic frameworks (MOFs). Lattice-strained NiFe MOFs exhibit mass activities of 500 A gmetal−1 at a half-wave potential of 0.83 V for the ORR and 2,000 A gmetal−1 at an overpotential of 0.30 V for the OER, which are 50–100 times that of pristine NiFe metal–organic frameworks. The catalyst maintains ~97% of its initial activity after 200 h of continuous ORR/OER reaction at a high current density of 100–200 mA cm−2. Using operando synchrotron spectroscopies, we observed a key superoxide *OOH intermediate emerging on Ni4+ sites during both the ORR and OER processes, which suggests a four-electron mechanistic pathway. Metal–organic frameworks (MOFs) are increasingly being explored as electrocatalysts for the oxygen evolution and reduction reactions, which are important processes in electrolysers and fuel cells. Here, the authors increase the activity of MOFs for these reactions by introducing strain into the lattice using UV light illumination.