Institution
Harbin Institute of Technology
Education • Harbin, China
About: Harbin Institute of Technology is an education organization based in Harbin, China. It is known for its research contributions in the topics of Microstructure and Control theory. The organization has 88259 authors who have published 109297 publications, receiving 1603393 citations. The organization is also known as: HIT.
Topics: Microstructure, Control theory, Ultimate tensile strength, Alloy, Laser
Papers published on a yearly basis
Papers
TL;DR: It is argued that anode properties affect not only the reaction kinetics of the various steps of electrochemical (EC) organic oxidation but also the pathway of phenol electrolysis.
510 citations
TL;DR: The metabolism of anammox bacteria and the factors affecting it are critically reviewed, with a focus on forming biofilms/granules that retain anammox bacteria in sewage treatment so that treatment becomes energy-neutral or energy-positive.
509 citations
TL;DR: Proton exchange membrane fuel cell (PEMFC) is considered a promising, clean and highly efficient power generation technology for the 21st century, as mentioned in this paper.
506 citations
TL;DR: DeepFM as mentioned in this paper combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture, which has a shared input to its "wide" and "deep" parts.
Abstract: Learning sophisticated feature interactions behind user behaviors is critical in maximizing CTR for recommender systems. Despite great progress, existing methods seem to have a strong bias towards low- or high-order interactions, or require expertise feature engineering. In this paper, we show that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions. The proposed model, DeepFM, combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture. Compared to the latest Wide & Deep model from Google, DeepFM has a shared input to its "wide" and "deep" parts, with no need of feature engineering besides raw features. Comprehensive experiments are conducted to demonstrate the effectiveness and efficiency of DeepFM over the existing models for CTR prediction, on both benchmark data and commercial data.
504 citations
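The DeepFM abstract above only describes the architecture at a high level. The sketch below is a minimal illustration of that idea in PyTorch, not the authors' released code: an FM component (first-order weights plus pairwise interactions) and a deep MLP share the same embedding input. All field sizes, embedding dimensions, and layer widths are illustrative assumptions.

```python
# Minimal DeepFM-style sketch (assumed hyperparameters, not the paper's settings).
import torch
import torch.nn as nn

class DeepFM(nn.Module):
    def __init__(self, field_dims, embed_dim=8, hidden=(64, 32)):
        super().__init__()
        num_features = sum(field_dims)
        # First-order ("wide") weights and the shared embedding table.
        self.linear = nn.Embedding(num_features, 1)
        self.bias = nn.Parameter(torch.zeros(1))
        self.embedding = nn.Embedding(num_features, embed_dim)
        # Deep component operating on the same embeddings.
        layers, in_dim = [], len(field_dims) * embed_dim
        for h in hidden:
            layers += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        layers.append(nn.Linear(in_dim, 1))
        self.mlp = nn.Sequential(*layers)
        # Offsets map per-field ids to rows of the flat embedding table.
        offsets = torch.tensor([0] + list(field_dims[:-1])).cumsum(0)
        self.register_buffer("offsets", offsets)

    def forward(self, x):                      # x: (batch, num_fields) integer ids
        x = x + self.offsets                   # shift ids into the flat table
        emb = self.embedding(x)                # (batch, num_fields, embed_dim)
        # FM part: first-order term plus second-order pairwise interactions.
        first_order = self.linear(x).sum(dim=1).squeeze(1) + self.bias
        square_of_sum = emb.sum(dim=1).pow(2)
        sum_of_square = emb.pow(2).sum(dim=1)
        second_order = 0.5 * (square_of_sum - sum_of_square).sum(dim=1)
        # Deep part: the same embeddings, flattened, through an MLP.
        deep = self.mlp(emb.flatten(start_dim=1)).squeeze(1)
        return torch.sigmoid(first_order + second_order + deep)

# Hypothetical usage: three categorical fields with 10, 20 and 30 values.
model = DeepFM(field_dims=[10, 20, 30])
batch = torch.randint(0, 10, (4, 3))           # per-field ids must be < field size
print(model(batch).shape)                      # torch.Size([4])
```

The key design point the abstract highlights is visible here: both the FM terms and the MLP read from one embedding table, so no separate hand-crafted cross features are needed for the "wide" part.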
TL;DR: In this article, a robust superhydrophobic polysiloxane layer was coated onto the surface of 3D porous polyurethane sponges through a one-step solution immersion method.
Abstract: The fabrication of robust superhydrophobic 3D porous materials is of great importance for both academic research and industrial applications. The main challenge is the poor adhesion between porous substrates and superhydrophobic coatings. In this study, a robust superhydrophobic polysiloxane layer was coated onto the surface of 3D porous polyurethane sponges through a one-step solution immersion method. The durability of the resulting sponges was investigated by repeated mechanical compressions, ultrasonication in polar solvents, and strong acid/alkali attacks. Results revealed that the superhydrophobic sponges showed excellent elasticity, high mechanical durability and good chemical stability. By combining the special wettability and high porosity, the sponges exhibited high oil-absorption capacity and high selectivity when they were employed as absorptive materials for cleaning oils on the water surface. More importantly, the superhydrophobic sponges could be reused for oil–water separation for more than 300 cycles without losing their superhydrophobicity, exhibiting the highest reusability and durability among the reported counterparts. Therefore, the present study offers a simple and low-cost strategy for large-scale fabrication of robust superhydrophobic 3D porous materials that might be applied to the cleanup of oil spills on the water surface.
502 citations
Authors
Showing all 89023 results
Name | H-index | Papers | Citations
---|---|---|---
Jiaguo Yu | 178 | 730 | 113300 |
Lei Jiang | 170 | 2244 | 135205 |
Gang Chen | 167 | 3372 | 149819 |
Xiang Zhang | 154 | 1733 | 117576 |
Hui-Ming Cheng | 147 | 880 | 111921 |
Yi Yang | 143 | 2456 | 92268 |
Bruce E. Logan | 140 | 591 | 77351 |
Bin Liu | 138 | 2181 | 87085 |
Peng Shi | 137 | 1371 | 65195 |
Hui Li | 135 | 2982 | 105903 |
Lei Zhang | 135 | 2240 | 99365 |
Jie Liu | 131 | 1531 | 68891 |
Lei Zhang | 130 | 2312 | 86950 |
Zhen Li | 127 | 1712 | 71351 |
Kurunthachalam Kannan | 126 | 820 | 59886 |