Institution
Brno University of Technology
Education • Brno, Czechia
About: Brno University of Technology is an education organization based in Brno, Czechia. It is known for research contributions in the topics of Computer science and Fracture mechanics. The organization has 6339 authors who have published 15226 publications receiving 194088 citations. The organization is also known as Vysoké učení technické v Brně and BUT.
Papers published on a yearly basis
Papers
11 Jun 2019
TL;DR: This work demonstrates that efficient approximations can be introduced into the computational path of DNN accelerators while retraining is avoided entirely, and proposes a simple weight updating scheme that compensates for the inaccuracy introduced by approximate multipliers.
Abstract: The state-of-the-art approaches employ approximate computing to reduce the energy consumption of DNN hardware. Approximate DNNs then require extensive retraining afterwards to recover from the accuracy loss caused by the use of approximate operations. However, retraining of complex DNNs does not scale well. In this paper, we demonstrate that efficient approximations can be introduced into the computational path of DNN accelerators while retraining can completely be avoided. ALWANN provides highly optimized implementations of DNNs for custom low-power accelerators in which the number of computing units is lower than the number of DNN layers. First, a fully trained DNN (e.g., in TensorFlow) is converted to operate with 8-bit weights and 8-bit multipliers in convolutional layers. A suitable approximate multiplier is then selected for each computing element from a library of approximate multipliers in such a way that (i) one approximate multiplier serves several layers, and (ii) the overall classification error and energy consumption are minimized. The optimizations, including the multiplier selection problem, are solved by means of the multiobjective optimization algorithm NSGA-II. In order to completely avoid the computationally expensive retraining of DNNs, which is usually employed to improve the classification accuracy, we propose a simple weight updating scheme that compensates for the inaccuracy introduced by employing approximate multipliers. The proposed approach is evaluated for two architectures of DNN accelerators with approximate multipliers from the open-source “EvoApprox” library, while executing three versions of ResNet on CIFAR-10. We report that the proposed approach saves 30% of the energy needed for multiplication in convolutional layers of ResNet-50 while the accuracy is degraded by only 0.6% (0.9% for ResNet-14).
The proposed technique and approximate layers are available as an open-source extension of TensorFlow at https://github.com/ehw-fit/tf-approximate.
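The retraining-free weight updating scheme described in the abstract can be sketched as follows. This is a minimal illustration of the idea, not the ALWANN implementation: `approx_mult` is a toy truncating multiplier standing in for an EvoApprox circuit, and `build_weight_map` is a hypothetical helper that exhaustively remaps each quantized weight to the value whose approximate products best match the exact ones over all activations.

```python
# Sketch of a retraining-free weight update: remap each quantized weight w
# to the weight w' whose approximate products best match the exact products
# w * x over the activation range. Toy code, not the ALWANN implementation.

def approx_mult(a, b):
    """Toy approximate multiplier: drop the 4 least-significant product bits.
    A stand-in for a real EvoApprox circuit (an assumption for this sketch)."""
    return ((a * b) >> 4) << 4

def build_weight_map(mult, levels=256):
    """For every weight w in [0, levels), find the candidate w' minimising
    the total absolute error |mult(w', x) - w * x| over all activations x."""
    table = {}
    for w in range(levels):
        best_w, best_err = w, float("inf")
        for cand in range(levels):
            err = sum(abs(mult(cand, x) - w * x) for x in range(levels))
            if err < best_err:
                best_w, best_err = cand, err
        table[w] = best_w
    return table

# A small range keeps the exhaustive search fast for the demo.
wmap = build_weight_map(approx_mult, levels=32)
```

In the real setting the remapped weights are simply written into the deployed model, so no gradient-based retraining is needed; the search above is per-weight-value and independent of the training data.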
88 citations
20 Jan 2013
TL;DR: This work presents a simple and efficient framework for automatic verification of systems with a parametric number of communicating processes; it relies on an abstraction function that views the system from the perspective of a fixed number of processes.
Abstract: We present a simple and efficient framework for automatic verification of systems with a parametric number of communicating processes. The processes may be organized in various topologies such as words, multisets, rings, or trees. Our method needs to inspect only a small number of processes in order to show correctness of the whole system. It relies on an abstraction function that views the system from the perspective of a fixed number of processes. The abstraction is used during the verification procedure in order to dynamically detect cut-off points beyond which the search of the state space need not continue. We show that the method is complete for a large class of well quasi-ordered systems including Petri nets. Our experimentation on a variety of benchmarks demonstrates that the method is highly efficient and that it works well even for classes of systems with undecidable verification problems.
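The cut-off idea can be illustrated on a toy example, with the caveat that this is not the paper's framework: below, a simple token-passing ring is viewed through windows of k consecutive processes, and a hypothetical `cutoff` helper grows the ring until no new k-process views appear.

```python
# Toy illustration of cut-off detection: verify ring instances of growing
# size and stop once the set of k-process "views" stabilises.
# This is a sketch of the general idea, not the paper's algorithm.

def reachable_ring(n):
    """Reachable states of an n-process ring where one token circulates:
    the state is just the index of the token holder."""
    return set(range(n))

def views(n, k):
    """All k-process windows (boolean tuples: 'holds token?') observable
    in reachable states of an n-process token ring."""
    out = set()
    for token in reachable_ring(n):
        state = tuple(i == token for i in range(n))
        for i in range(n):
            out.add(tuple(state[(i + j) % n] for j in range(k)))
    return out

def cutoff(k, n_max=20):
    """Smallest ring size beyond which growing the ring adds no new k-views."""
    prev = views(k, k)
    for n in range(k + 1, n_max):
        cur = views(n, k)
        if cur == prev:
            return n - 1
        prev = cur
    return None
```

For this toy system the cut-off is k + 1: once the ring is large enough that a window can lie entirely away from the token, larger instances contribute nothing new, so the search need not continue.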
88 citations
TL;DR: In this article, the authors investigated the hydration of ternary blends comprising Portland cement, blast-furnace slag and metakaolin using X-ray diffraction and isothermal calorimetry.
88 citations
TL;DR: Using low-energy electron diffraction (LEED), density functional theory (DFT) and scanning tunneling microscopy (STM), Todorova et al. re-analyzed the Pd(100)-(√5 × √5)R27°-O surface oxide structure, which consists, in the most recent model, of a strained PdO(101) layer on top of the surface.
88 citations
TL;DR: The nonlocal probabilistic theory developed in Part I is applied in numerical studies of plain concrete beams and is compared to the existing test data on the modulus of rupture.
Abstract: The nonlocal probabilistic theory developed in Part I is applied in numerical studies of plain concrete beams and is compared to the existing test data on the modulus of rupture. For normal size test beams, the deterministic theory is found to dominate and give adequate predictions for the mean. But the present probabilistic theory can further provide the standard deviation and the entire probability distribution (calculated via Latin hypercube sampling). For very large beam sizes, the statistical size effect dominates and the mean prediction approaches asymptotically the classical Weibull size effect. This is contrary to structures failing only after the formation of a large crack, for which the classical Weibull size effect is asymptotically approached for very small structure sizes. Comparison to the existing test data on the modulus of rupture demonstrates good agreement with both the measured means and the scatter breadth.
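The Latin hypercube sampling mentioned in the abstract can be sketched as follows. This is the generic textbook construction, not the authors' code: each of the d dimensions is split into n equal strata, and every stratum in every dimension is hit exactly once.

```python
import random

def latin_hypercube(n, d, seed=0):
    """Draw n samples in [0, 1)^d by Latin hypercube sampling: each
    dimension is divided into n equal strata, each hit exactly once."""
    rng = random.Random(seed)
    # one independently shuffled stratum order per dimension
    strata = [rng.sample(range(n), n) for _ in range(d)]
    samples = []
    for i in range(n):
        # place the point uniformly inside its assigned stratum
        point = [(strata[j][i] + rng.random()) / n for j in range(d)]
        samples.append(point)
    return samples
```

Compared to plain Monte Carlo, this stratification spreads the samples evenly along every input axis, which is why it is commonly used to estimate output distributions (such as the strength distribution here) with relatively few structural analyses.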
88 citations
Authors
Showing all 6383 results
| Name | H-index | Papers | Citations |
|---|---|---|---|
| Georg Kresse | 111 | 430 | 244729 |
| Patrik Schmuki | 109 | 763 | 52669 |
| Michael Schmid | 88 | 715 | 30874 |
| Robert M. Malina | 88 | 691 | 38277 |
| Jiří Jaromír Klemeš | 64 | 565 | 14892 |
| Alessandro Piccolo | 62 | 284 | 14332 |
| René Kizek | 61 | 672 | 16554 |
| George Danezis | 59 | 209 | 11516 |
| Stevo Stević | 58 | 374 | 9832 |
| Edvin Lundgren | 57 | 286 | 10158 |
| Franz Halberg | 55 | 750 | 15400 |
| Vojtech Adam | 55 | 611 | 14442 |
| Lukas Burget | 53 | 252 | 21375 |
| Jan Cermak | 53 | 238 | 9563 |
| Hynek Hermansky | 51 | 317 | 14372 |