Deep learning frameworks are appearing at a rapid pace, but only a few of them can run on clusters, use GPUs, and support topologies beyond feed-forward networks at the same time. DeepLearning4J, Apache SystemML and TensorSpark offer all of this without forcing you to learn exotic new programming languages, and they also scale out on well-established infrastructures like Apache Spark. In this talk we will introduce DeepLearning4J and Apache SystemML on top of Apache Spark, with an example that builds an anomaly detector for IoT sensor data using an LSTM autoencoder neural network. We will also explain how Apache SystemML uses cost-based optimisers for neural network training and how TensorSpark parallelises TensorFlow on Apache Spark.
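The core idea of autoencoder-based anomaly detection mentioned above is that windows the model reconstructs poorly are flagged as anomalous. The following is a minimal sketch of just that scoring step, with hypothetical helper names (`anomaly_scores`, `flag_anomalies`) and toy data; in the talk's setup, the reconstructions would come from the trained LSTM autoencoder rather than being passed in directly.

```python
import numpy as np

def anomaly_scores(windows, reconstructions):
    # Mean squared reconstruction error per window
    # (averaged over timesteps and sensor channels).
    return np.mean((windows - reconstructions) ** 2, axis=(1, 2))

def flag_anomalies(scores, threshold):
    # A window whose reconstruction error exceeds the
    # threshold is treated as anomalous.
    return scores > threshold

# Toy data: 3 windows of 4 timesteps x 2 sensors.
rng = np.random.default_rng(0)
windows = rng.normal(size=(3, 4, 2))
recon = windows.copy()
recon[2] += 5.0  # simulate a poorly reconstructed (anomalous) window

scores = anomaly_scores(windows, recon)
print(flag_anomalies(scores, threshold=1.0))  # → [False False  True]
```

The threshold is an assumption here; in practice it is usually calibrated on reconstruction errors observed during healthy operation.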
Chief Data Scientist @ IBM Watson IoT
Romeo Kienzler is Chief Data Scientist and DeepLearning/AI Engineer at IBM Watson IoT, and as an IBM Certified Senior Architect he helps clients worldwide solve their data analysis challenges. He holds an M.Sc. (ETH) in Computer Science with specialisation in Information Systems, Bioinformatics and Applied Statistics from the Swiss Federal Institute of Technology Zurich. He works as an Associate Professor for artificial intelligence at a Swiss university, and his current research focus is on cloud-scale machine learning and deep learning using open source technologies including R, Apache Spark, Apache SystemML, Apache Flink, DeepLearning4J and TensorFlow. He also contributes to various open source projects. He regularly speaks at international conferences and has significant publications in the areas of data mining, machine learning and Blockchain technologies. His latest book, on Mastering Apache Spark V2.X, was recently published: [amzn.to link] Romeo Kienzler is a member of the IBM Technical Expert Council and the IBM Academy of Technology, IBM's leading brain trusts. #ibmaot