Author

Hao Wang

Bio: Hao Wang is an academic researcher from the University of Southern Queensland. The author has contributed to research in topics: Medicine & Computer science. The author has an h-index of 89 and has co-authored 1,599 publications receiving 43,904 citations. Previous affiliations of Hao Wang include Beijing Institute of Technology & Qingdao University of Science and Technology.


Papers
Posted Content
TL;DR: This paper proposes the convolutional LSTM (ConvLSTM) and uses it to build an end-to-end trainable model for the precipitation nowcasting problem and shows that it captures spatiotemporal correlations better and consistently outperforms FC-LSTM and the state-of-the-art operational ROVER algorithm.
Abstract: The goal of precipitation nowcasting is to predict the future rainfall intensity in a local region over a relatively short period of time. Very few previous studies have examined this crucial and challenging weather forecasting problem from the machine learning perspective. In this paper, we formulate precipitation nowcasting as a spatiotemporal sequence forecasting problem in which both the input and the prediction target are spatiotemporal sequences. By extending the fully connected LSTM (FC-LSTM) to have convolutional structures in both the input-to-state and state-to-state transitions, we propose the convolutional LSTM (ConvLSTM) and use it to build an end-to-end trainable model for the precipitation nowcasting problem. Experiments show that our ConvLSTM network captures spatiotemporal correlations better and consistently outperforms FC-LSTM and the state-of-the-art operational ROVER algorithm for precipitation nowcasting.
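A minimal sketch of the idea described above, written in PyTorch purely for illustration: the four LSTM gates are produced by a single 2-D convolution over the concatenated input and hidden state, so both the input-to-state and state-to-state transitions are convolutional. The layer sizes, the 3x3 kernel, and the omission of the paper's peephole (Hadamard) terms are simplifying assumptions, not the authors' exact configuration.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    # One ConvLSTM step: gates come from a convolution over [input, hidden],
    # so spatial structure is preserved in both transitions.
    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        self.gates = nn.Conv2d(in_channels + hidden_channels,
                               4 * hidden_channels,
                               kernel_size, padding=kernel_size // 2)

    def forward(self, x, state):
        h, c = state                                   # hidden and cell state maps
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)                  # convolutional cell update
        h = o * torch.tanh(c)
        return h, c

# Example step on a batch of two 64x64 single-channel radar-like frames
cell = ConvLSTMCell(in_channels=1, hidden_channels=16)
x = torch.zeros(2, 1, 64, 64)
h = torch.zeros(2, 16, 64, 64)
c = torch.zeros(2, 16, 64, 64)
h, c = cell(x, (h, c))
print(h.shape)    # torch.Size([2, 16, 64, 64])

Stacking several such cells and unrolling them over the input frames gives the kind of end-to-end trainable encoding-forecasting structure the abstract refers to.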

4,487 citations

Proceedings Article
07 Dec 2015
TL;DR: In this article, a convolutional LSTM (ConvLSTM) was proposed that captures spatiotemporal correlations better and consistently outperforms FC-LSTM for precipitation nowcasting.
Abstract: The goal of precipitation nowcasting is to predict the future rainfall intensity in a local region over a relatively short period of time. Very few previous studies have examined this crucial and challenging weather forecasting problem from the machine learning perspective. In this paper, we formulate precipitation nowcasting as a spatiotemporal sequence forecasting problem in which both the input and the prediction target are spatiotemporal sequences. By extending the fully connected LSTM (FC-LSTM) to have convolutional structures in both the input-to-state and state-to-state transitions, we propose the convolutional LSTM (ConvLSTM) and use it to build an end-to-end trainable model for the precipitation nowcasting problem. Experiments show that our ConvLSTM network captures spatiotemporal correlations better and consistently outperforms FC-LSTM and the state-of-the-art operational ROVER algorithm for precipitation nowcasting.

2,474 citations

Journal ArticleDOI
TL;DR: In this paper, a review of the tensile properties of natural fiber reinforced polymer composites is presented; several chemical modifications are employed to improve the interfacial matrix-fiber bonding, resulting in enhanced tensile strength of the composites.
Abstract: This paper is a review of the tensile properties of natural fiber reinforced polymer composites. Natural fibers have recently become attractive to researchers, engineers and scientists as an alternative reinforcement for fiber reinforced polymer (FRP) composites. Owing to their low cost, fairly good mechanical properties, high specific strength, non-abrasiveness, eco-friendliness and biodegradability, they are exploited as a replacement for conventional fibers such as glass, aramid and carbon. The tensile properties of natural fiber reinforced polymers (both thermoplastics and thermosets) are mainly influenced by the interfacial adhesion between the matrix and the fibers. Several chemical modifications are employed to improve the interfacial matrix-fiber bonding, resulting in the enhancement of tensile properties of the composites. In general, the tensile strength of natural fiber reinforced polymer composites increases with fiber content up to a maximum or optimum value, after which it drops, whereas the Young's modulus increases steadily with increasing fiber loading. Khoathane et al. [1] found that the tensile strength and Young's modulus of composites reinforced with bleached hemp fibers increased markedly with increasing fiber loading. Mathematical modelling of the tensile properties was also reviewed: the tensile strengths of different natural fiber reinforced HDPE composites predicted by the rule of mixtures (ROM) were very close to the experimental values, and the Halpin-Tsai equation was found to be the most effective in predicting the Young's modulus of composites containing different types of natural fibers.
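As a worked illustration of the two models named above, the sketch below computes a rule-of-mixtures strength estimate and a Halpin-Tsai modulus estimate in Python. The fibre and matrix property values and the shape factor zeta are hypothetical placeholders, not data from the review.

def rom_strength(sigma_f, sigma_m, v_f):
    # Rule of mixtures: volume-fraction-weighted average of fibre and
    # matrix strengths (upper-bound estimate for aligned fibres).
    return sigma_f * v_f + sigma_m * (1.0 - v_f)

def halpin_tsai_modulus(e_f, e_m, v_f, zeta=2.0):
    # Halpin-Tsai estimate of the composite Young's modulus; zeta is a
    # geometry-dependent shape factor (the value 2.0 is a placeholder).
    eta = (e_f / e_m - 1.0) / (e_f / e_m + zeta)
    return e_m * (1.0 + zeta * eta * v_f) / (1.0 - eta * v_f)

# Hypothetical hemp/HDPE system at 30 vol% fibre
print(rom_strength(sigma_f=550.0, sigma_m=25.0, v_f=0.3))    # strength in MPa
print(halpin_tsai_modulus(e_f=45.0, e_m=1.1, v_f=0.3))       # modulus in GPa

This mirrors the observation above that ROM predictions track tensile strength closely, while the Halpin-Tsai equation is better suited to predicting the Young's modulus of short-fibre composites.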

1,757 citations

Proceedings ArticleDOI
10 Aug 2015
TL;DR: This paper proposes a hierarchical Bayesian model called collaborative deep learning (CDL), which jointly performs deep representation learning for the content information and collaborative filtering for the ratings (feedback) matrix.
Abstract: Collaborative filtering (CF) is a successful approach commonly used by many recommender systems. Conventional CF-based methods use the ratings given to items by users as the sole source of information for learning to make recommendations. However, the ratings are often very sparse in many applications, causing CF-based methods to degrade significantly in their recommendation performance. To address this sparsity problem, auxiliary information such as item content information may be utilized. Collaborative topic regression (CTR) is an appealing recent method taking this approach which tightly couples the two components that learn from two different sources of information. Nevertheless, the latent representation learned by CTR may not be very effective when the auxiliary information is very sparse. To address this problem, we generalize recent advances in deep learning from i.i.d. input to non-i.i.d. (CF-based) input and propose in this paper a hierarchical Bayesian model called collaborative deep learning (CDL), which jointly performs deep representation learning for the content information and collaborative filtering for the ratings (feedback) matrix. Extensive experiments on three real-world datasets from different domains show that CDL can significantly advance the state of the art.
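The sketch below spells out a simplified version of the joint objective described above: a one-hidden-layer autoencoder on the item content whose code is tied to the item latent factors of a weighted matrix factorization. The variable names, the single hidden layer, the tanh activations, and the lam_* values are assumptions for illustration; the paper itself uses a stacked denoising autoencoder inside a hierarchical Bayesian model.

import numpy as np

def cdl_objective(R, C, X, U, V, W1, b1, W2, b2,
                  lam_u=0.1, lam_v=10.0, lam_w=1e-4, lam_n=1e3):
    # R: (n_users, n_items) feedback matrix; C: confidence weights for R
    # X: (n_items, n_vocab) bag-of-words item content
    # U, V: user and item latent factor matrices, shape (*, k)
    # W1,b1 / W2,b2: encoder / decoder weights of a 1-hidden-layer autoencoder
    H = np.tanh(X @ W1 + b1)                 # encode content into a k-dim code
    X_rec = np.tanh(H @ W2 + b2)             # reconstruct the content
    rating_loss = 0.5 * np.sum(C * (R - U @ V.T) ** 2)
    tie_loss = 0.5 * lam_v * np.sum((V - H) ** 2)    # item factors track the code
    recon_loss = 0.5 * lam_n * np.sum((X - X_rec) ** 2)
    reg = 0.5 * lam_u * np.sum(U ** 2) \
        + 0.5 * lam_w * (np.sum(W1 ** 2) + np.sum(W2 ** 2))
    return rating_loss + tie_loss + recon_loss + reg

Training alternates between updating U and V with the autoencoder code held fixed and updating the autoencoder weights by gradient descent, which is how the ratings and the content regularize each other.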

1,546 citations

Journal ArticleDOI
TL;DR: In this article, a comprehensive overview of surface treatments applied to natural fibres for advanced composites applications is presented, and the effects of different chemical treatments on cellulosic fibres used as reinforcements for thermosets and thermoplastics are studied.
Abstract: This paper provides a comprehensive overview of the different surface treatments applied to natural fibres for advanced composites applications. In practice, the major drawbacks of using natural fibres are their high degree of moisture absorption and poor dimensional stability. The primary objective of surface treatments on natural fibres is to maximize the bonding strength, and hence the stress transferability, in the composites. The overall mechanical properties of natural fibre reinforced polymer composites are highly dependent on the morphology, aspect ratio, hydrophilic tendency and dimensional stability of the fibres used. The effects of different chemical treatments on cellulosic fibres that are used as reinforcements for thermosets and thermoplastics are studied. The chemical treatments include alkali, silane, acetylation, benzoylation, acrylation and acrylonitrile grafting, maleated coupling agents, permanganate, peroxide, isocyanate, stearic acid, sodium chlorite, triazine, fatty acid derivative (oleoyl chloride) and fungal treatments. The significance of chemically treated natural fibres is seen in the improved mechanical strength and dimensional stability of the resultant composites as compared with pristine (untreated) samples.

1,158 citations


Cited by
Journal ArticleDOI

08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast at first, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i, the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time, an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently, namely those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90%, and an 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
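A minimal sketch of the third strategy (spatial decomposition), assuming a cubic box divided into a regular grid of processor subdomains; the box size, processor grid, and function name are illustrative rather than taken from the paper, and a real implementation would also exchange "ghost" atoms within the force cutoff of each subdomain boundary.

import numpy as np

def assign_to_subdomains(positions, box_length, procs_per_side):
    # Map each atom to the processor that owns its spatial region.
    # positions: (n_atoms, 3) coordinates in [0, box_length)
    # The processor grid has procs_per_side**3 cubic subdomains.
    cell = box_length / procs_per_side
    grid = np.floor(positions / cell).astype(int) % procs_per_side
    ranks = (grid[:, 0] * procs_per_side + grid[:, 1]) * procs_per_side + grid[:, 2]
    return {r: np.where(ranks == r)[0] for r in range(procs_per_side ** 3)}

# Example: 1,000 random atoms in a 20.0-unit box on a 2x2x2 processor grid
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 20.0, size=(1000, 3))
owners = assign_to_subdomains(pos, box_length=20.0, procs_per_side=2)
print({rank: len(idx) for rank, idx in owners.items()})

The atom- and force-decomposition variants differ only in what stays fixed per processor (a block of atoms or a block of the force matrix) rather than a region of space; as the abstract notes, the spatial algorithm gives the highest parallel efficiencies for large problems.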

29,323 citations

Journal ArticleDOI
TL;DR: Slowing momentum for some cancers amenable to early detection is juxtaposed with notable gains for other common cancers; it is also notable that long-term rapid increases in liver cancer mortality have attenuated in women and stabilized in men.
Abstract: Each year, the American Cancer Society estimates the numbers of new cancer cases and deaths that will occur in the United States and compiles the most recent data on population-based cancer occurrence. Incidence data (through 2016) were collected by the Surveillance, Epidemiology, and End Results Program; the National Program of Cancer Registries; and the North American Association of Central Cancer Registries. Mortality data (through 2017) were collected by the National Center for Health Statistics. In 2020, 1,806,590 new cancer cases and 606,520 cancer deaths are projected to occur in the United States. The cancer death rate rose until 1991, then fell continuously through 2017, resulting in an overall decline of 29% that translates into an estimated 2.9 million fewer cancer deaths than would have occurred if peak rates had persisted. This progress is driven by long-term declines in death rates for the 4 leading cancers (lung, colorectal, breast, prostate); however, over the past decade (2008-2017), reductions slowed for female breast and colorectal cancers, and halted for prostate cancer. In contrast, declines accelerated for lung cancer, from 3% annually during 2008 through 2013 to 5% during 2013 through 2017 in men and from 2% to almost 4% in women, spurring the largest ever single-year drop in overall cancer mortality of 2.2% from 2016 to 2017. Yet lung cancer still caused more deaths in 2017 than breast, prostate, colorectal, and brain cancers combined. Recent mortality declines were also dramatic for melanoma of the skin in the wake of US Food and Drug Administration approval of new therapies for metastatic disease, escalating to 7% annually during 2013 through 2017 from 1% during 2006 through 2010 in men and women aged 50 to 64 years and from 2% to 3% in those aged 20 to 49 years; annual declines of 5% to 6% in individuals aged 65 years and older are particularly striking because rates in this age group were increasing prior to 2013. It is also notable that long-term rapid increases in liver cancer mortality have attenuated in women and stabilized in men. In summary, slowing momentum for some cancers amenable to early detection is juxtaposed with notable gains for other common cancers.

15,080 citations