BookDOI

Deep Learning with Python

01 Jan 2017
About: This book was published on 2017-01-01 and has received 1025 citations to date. It focuses on the topics: Python (programming language) & Computing Methodologies.
Citations
Journal ArticleDOI
TL;DR: In this article, the authors provide a thorough overview on using a class of advanced machine learning techniques, namely deep learning (DL), to facilitate the analytics and learning in the IoT domain.
Abstract: In the era of the Internet of Things (IoT), an enormous amount of sensing devices collect and/or generate various sensory data over time for a wide range of fields and applications. Based on the nature of the application, these devices will result in big or fast/real-time data streams. Applying analytics over such data streams to discover new information, predict future insights, and make control decisions is a crucial process that makes IoT a worthy paradigm for businesses and a quality-of-life improving technology. In this paper, we provide a thorough overview on using a class of advanced machine learning techniques, namely deep learning (DL), to facilitate the analytics and learning in the IoT domain. We start by articulating IoT data characteristics and identifying two major treatments for IoT data from a machine learning perspective, namely IoT big data analytics and IoT streaming data analytics. We also discuss why DL is a promising approach to achieve the desired analytics in these types of data and applications. The potential of using emerging DL techniques for IoT data analytics is then discussed, and its promises and challenges are introduced. We present a comprehensive background on different DL architectures and algorithms. We also analyze and summarize major reported research attempts that leveraged DL in the IoT domain. The smart IoT devices that have incorporated DL in their intelligence background are also discussed. DL implementation approaches on the fog and cloud centers in support of IoT applications are also surveyed. Finally, we shed light on some challenges and potential directions for future research. At the end of each section, we highlight the lessons learned based on our experiments and review of the recent literature.
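
The survey stays at the level of architectures and use cases, but the "IoT streaming data analytics" treatment it names maps onto only a few lines of Keras. The sketch below is not taken from the paper: the window length, channel count, number of classes, and the randomly generated sensor windows are hypothetical stand-ins, included only to show the shape of such a pipeline.

    # Minimal sketch (not from the survey): classify fixed-length windows of
    # simulated sensor readings, one plausible form of IoT streaming analytics.
    # All shapes, labels, and hyperparameters here are hypothetical.
    import numpy as np
    import tensorflow as tf

    WINDOW, FEATURES, CLASSES = 64, 3, 4   # hypothetical window length, channels, event classes

    # Stand-in for a buffered sensor stream: 1000 windows of noise-like data.
    x = np.random.randn(1000, WINDOW, FEATURES).astype("float32")
    y = np.random.randint(0, CLASSES, size=(1000,))

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, FEATURES)),
        tf.keras.layers.Conv1D(32, 5, activation="relu"),   # local temporal patterns
        tf.keras.layers.LSTM(32),                            # longer-range dependencies
        tf.keras.layers.Dense(CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=3, batch_size=32, validation_split=0.2)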

903 citations

Posted Content
TL;DR: Ten concerns for deep learning are presented, and it is suggested that deep learning must be supplemented by other techniques if the authors are to reach artificial general intelligence.
Abstract: Although deep learning has historical roots going back decades, neither the term "deep learning" nor the approach was popular just over five years ago, when the field was reignited by papers such as Krizhevsky, Sutskever and Hinton's now classic (2012) deep network model of Imagenet. What has the field discovered in the five subsequent years? Against a background of considerable progress in areas such as speech recognition, image recognition, and game playing, and considerable enthusiasm in the popular press, I present ten concerns for deep learning, and suggest that deep learning must be supplemented by other techniques if we are to reach artificial general intelligence.

779 citations


Cites background from "Deep Learning with Python"

  • ...A partial list includes Brenden Lake and Marco Baroni (2017), François Chollet (2017), Robin Jia and Percy Liang (2017), Dileep George and others at Vicarious (Kansky et al., 2017) and Pieter Abbeel and colleagues at Berkeley (Stoica et al., 2017)....

  • ...(Chollet makes quite similar points in the closing chapters of his (Chollet, 2017) text.)...

  • ...Yet deep learning may well be approaching a wall, much as I anticipated earlier, at the beginning of the resurgence (Marcus, 2012), and as leading figures like Hinton (Sabour, Frosst, & Hinton, 2017) and Chollet (2017) have begun to imply in recent months....

  • ...I thank Christina Chen, François Chollet, Ernie Davis, Zack Lipton, Stefano Pacifico, Suchi Saria, and Athena Vouloumanos for sharp-eyed comments, all generously supplied on short notice during the holidays at the close of 2017....

  • ...François Chollet, Google, author of Keras neural network library December 18, 2017 ‘Science progresses one funeral at a time....

Journal ArticleDOI
TL;DR: It is shown that experiment- and simulation-based data mining in combination with machine learning tools provides exceptional opportunities to enable highly reliable identification of fundamental interrelations within materials for characterization and optimization in a scale-bridging manner.
Abstract: Machine learning tools represent key enablers for empowering material scientists and engineers to accelerate the development of novel materials, processes and techniques. One of the aims of using such approaches in the field of materials science is to achieve high-throughput identification and quantification of essential features along the process-structure-property-performance chain. In this contribution, machine learning and statistical learning approaches are reviewed in terms of their successful application to specific problems in the field of continuum materials mechanics. They are categorized with respect to their type of task designated to be either descriptive, predictive or prescriptive; thus to ultimately achieve identification, prediction or even optimization of essential characteristics. The respective choice of the most appropriate machine learning approach highly depends on the specific use-case, type of material, kind of data involved, spatial and temporal scales, formats, and desired knowledge gain as well as affordable computational costs. Different examples are reviewed involving case-by-case dependent application of different types of artificial neural networks and other data-driven approaches such as support vector machines, decision trees and random forests as well as Bayesian learning, and model order reduction procedures such as principal component analysis, among others. These techniques are applied to accelerate the identification of material parameters or salient features for materials characterization, to support rapid design and optimization of novel materials or manufacturing methods, to improve and correct complex measurement devices, or to better understand and predict fatigue behavior, among other examples. Besides experimentally obtained datasets, numerous studies draw required information from simulation-based data mining. Altogether, it is shown that experiment- and simulation-based data mining in combination with machine learning tools provides exceptional opportunities to enable highly reliable identification of fundamental interrelations within materials for characterization and optimization in a scale-bridging manner. Potentials of further utilizing applied machine learning in materials science and empowering significant acceleration of knowledge output are pointed out.
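
As a concrete illustration of the descriptive-then-predictive pattern the review describes, the sketch below chains principal component analysis with a random forest regressor using scikit-learn. It is not drawn from any of the reviewed studies; the synthetic "microstructure descriptors" and the target property are hypothetical placeholders.

    # Minimal sketch (illustrative only): dimensionality reduction followed by a
    # data-driven regressor, on synthetic stand-ins for material features.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))                          # hypothetical microstructure descriptors
    y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=500)   # hypothetical target property

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = make_pipeline(PCA(n_components=10),             # descriptive step: salient features
                          RandomForestRegressor(n_estimators=200, random_state=0))
    model.fit(X_train, y_train)
    print("R^2 on held-out data:", model.score(X_test, y_test))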

222 citations


Cites background from "Deep Learning with Python"

  • ...Due to the good readability of Python’s syntax, the convenience and easy access for machine learning and data mining newcomers is increased (Chollet, 2018)....

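The readability argument is easiest to see in code. The following sketch is adapted from the style of introductory example the cited book is known for: a small dense network on the MNIST digits, written with the Keras Sequential API. The layer sizes and training settings are illustrative choices, not a prescription from the book.

    # Minimal sketch in the spirit of the book's introductory example:
    # a two-layer dense network classifying MNIST digits.
    from tensorflow import keras
    from tensorflow.keras import layers

    (train_images, train_labels), (test_images, test_labels) = keras.datasets.mnist.load_data()
    train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255
    test_images = test_images.reshape((10000, 28 * 28)).astype("float32") / 255

    model = keras.Sequential([
        layers.Dense(512, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="rmsprop",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_images, train_labels, epochs=5, batch_size=128)
    print(model.evaluate(test_images, test_labels))
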
Journal ArticleDOI
TL;DR: This work surveys 19 studies that relied on CNNs to automatically identify crop diseases, describing their profiles, their main implementation aspects and their performance, and provides guidelines to improve the use of CNNs in operational contexts.
Abstract: Deep learning techniques, and in particular Convolutional Neural Networks (CNNs), have led to significant progress in image processing. Since 2016, many applications for the automatic identification of crop diseases have been developed. These applications could serve as a basis for the development of expertise assistance or automatic screening tools. Such tools could contribute to more sustainable agricultural practices and greater food production security. To assess the potential of these networks for such applications, we survey 19 studies that relied on CNNs to automatically identify crop diseases. We describe their profiles, their main implementation aspects and their performance. Our survey allows us to identify the major issues and shortcomings of works in this research area. We also provide guidelines to improve the use of CNNs in operational contexts as well as some directions for future research.

186 citations


Cites background or methods from "Deep Learning with Python"

  • ...This means that information about the validation data indirectly leaks into the model, resulting in an artificial ability to perform well on these images (Chollet, 2017)....

  • ...Fine-tuning consists in using the weights of a pre-trained model to initialize the model and then training all or part of these weights on the target dataset (Chollet, 2017)....

  • ...Secondly, the images must be normalized to help the model to converge more quickly as well as to better generalize on unseen data (Chollet, 2017)....

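The three excerpts above touch on validation leakage, fine-tuning, and input normalization. The sketch below shows one way these fit together in Keras, assuming a hypothetical crop-disease image dataset: the class count and the train_ds/val_ds/test_ds pipelines are placeholders, and MobileNetV2 is used only as a convenient pretrained backbone, not because the surveyed studies used it.

    # Minimal sketch (hypothetical dataset): normalize inputs, fine-tune a
    # pretrained CNN, and evaluate the test set only once at the end so no
    # information about it leaks into model selection.
    import tensorflow as tf

    NUM_CLASSES = 10                      # hypothetical number of disease classes

    base = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                             include_top=False,
                                             weights="imagenet",
                                             pooling="avg")
    base.trainable = False                # stage 1: train only the new classification head

    inputs = tf.keras.Input(shape=(224, 224, 3))
    x = tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1)(inputs)   # normalize [0, 255] -> [-1, 1]
    x = base(x, training=False)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)

    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    # model.fit(train_ds, validation_data=val_ds, epochs=5)   # hypothetical tf.data pipelines

    # Stage 2 of fine-tuning: unfreeze the backbone and continue with a much smaller learning rate.
    base.trainable = True
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
                  loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    # model.fit(train_ds, validation_data=val_ds, epochs=5)
    # model.evaluate(test_ds)             # touched once, after all tuning decisions
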
Journal ArticleDOI
TL;DR: It was found that XGBoost and LSTM provided the most accurate load predictions in the shallow and deep learning categories, respectively, and both outperformed the best baseline model, which uses the previous day’s data for prediction.

157 citations


Cites methods from "Deep Learning with Python"

  • ...These two tools are powerful, as most Kaggle competition winners used either the XGBoost library (for shallow machine learning) or Keras (for deep learning) [33]....
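To make the comparison in the excerpt concrete, the sketch below fits both tool families to the same toy load-forecasting task: an XGBoost regressor on lagged values, a small Keras LSTM on the same windows, and the previous-day persistence baseline mentioned in the TL;DR above. The weekly sine-plus-noise series and every hyperparameter are synthetic stand-ins, not the paper's data or models.

    # Minimal sketch (synthetic data): XGBoost vs. a Keras LSTM vs. a
    # previous-day persistence baseline on a toy daily-load series.
    import numpy as np
    import tensorflow as tf
    from xgboost import XGBRegressor

    rng = np.random.default_rng(0)
    days = np.arange(1000)
    load = 100 + 20 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 5, size=1000)  # weekly pattern + noise

    LAGS = 7
    X = np.stack([load[i:i + LAGS] for i in range(len(load) - LAGS)])  # last 7 days as features
    y = load[LAGS:]                                                    # next day's load
    X_tr, X_te, y_tr, y_te = X[:800], X[800:], y[:800], y[800:]

    baseline_mae = np.mean(np.abs(X_te[:, -1] - y_te))                 # "previous day" baseline

    xgb = XGBRegressor(n_estimators=300, max_depth=3).fit(X_tr, y_tr)  # shallow learner
    xgb_mae = np.mean(np.abs(xgb.predict(X_te) - y_te))

    lstm = tf.keras.Sequential([                                       # deep learner
        tf.keras.layers.Input(shape=(LAGS, 1)),
        tf.keras.layers.LSTM(32),
        tf.keras.layers.Dense(1),
    ])
    lstm.compile(optimizer="adam", loss="mae")
    lstm.fit(X_tr[..., None], y_tr, epochs=10, batch_size=32, verbose=0)
    lstm_mae = np.mean(np.abs(lstm.predict(X_te[..., None]).ravel() - y_te))

    print(f"baseline MAE={baseline_mae:.2f}  XGBoost MAE={xgb_mae:.2f}  LSTM MAE={lstm_mae:.2f}")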