



Page 1: Gatsby kaken-2017-pfn okanohara

AI in the real world: Automobile, Robotics,

Bio/Healthcare and Art Creation

Daisuke Okanohara

Preferred Networks

[email protected]

May 11, 2017 @ Gatsby-Kaken Joint Workshop

Page 2: Gatsby kaken-2017-pfn okanohara

Preferred Networks (PFN)

“Make everything intelligent and collaborative”

Founded: March 2014 (Founders: Toru Nishikawa (CEO), Daisuke Okanohara (EVP))

Office: Tokyo, San Mateo

Employees: ~80 (doubling every year)

Investors: FANUC, Toyota, NTT


Page 3: Gatsby kaken-2017-pfn okanohara

Preferred Networks’ positioning in AI: Industrial IoT


[Diagram: market map with Consumer vs. Industrial on one axis and Cloud vs. Device on the other. PFN targets the industrial, edge-side quadrant: infrastructure, factory, robot, automotive, healthcare, and smart city (Industry 4.0).]

Page 4: Gatsby kaken-2017-pfn okanohara

Automobile

Page 5: Gatsby kaken-2017-pfn okanohara

Robotics

Page 6: Gatsby kaken-2017-pfn okanohara

Anomaly Detection

Page 7: Gatsby kaken-2017-pfn okanohara

Example: FANUC Reducer Anomaly Detection [Presented at iREX 2015]


Anomaly detection using deep generative models

[Figure: actual sensor data from reducers, labelled normal vs. anomaly; panels contrast a case with no anomaly and cases where anomalies were found.]

Page 8: Gatsby kaken-2017-pfn okanohara

Can predict the failure much earlier than the existing methods

We heavily use deep generative models to detect anomalies

[Chart: anomaly score (y-axis) vs. elapsed time (x-axis) up to the robot failure. The deep-learning-based method crosses the detection threshold about 40 days before the failure, while the existing methods detect it only just before the failure (about 15 days before).]
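To make the approach concrete: a common way to use a generative model for anomaly detection is to train it only on normal sensor windows and score new windows by how poorly the model reconstructs them. The sketch below is not the FANUC system; it is a minimal Chainer autoencoder, and the layer sizes, placeholder data, and threshold are illustrative assumptions.

```python
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L
from chainer import optimizers

class AutoEncoder(chainer.Chain):
    """Small dense autoencoder trained on normal sensor windows only."""
    def __init__(self, n_in, n_hidden=32):
        super(AutoEncoder, self).__init__(
            enc=L.Linear(n_in, n_hidden),
            dec=L.Linear(n_hidden, n_in),
        )

    def reconstruct(self, x):
        return self.dec(F.relu(self.enc(x)))

    def __call__(self, x):
        # Training objective: reconstruction error on normal data.
        return F.mean_squared_error(self.reconstruct(x), x)

def anomaly_score(model, x):
    # Windows the model cannot reconstruct well get a high score.
    return float(F.mean_squared_error(model.reconstruct(x), x).data)

# Placeholder for actual reducer sensor windows, shape (N, window_size).
normal = np.random.randn(1000, 64).astype(np.float32)

model = AutoEncoder(n_in=64)
opt = optimizers.Adam()
opt.setup(model)
for epoch in range(10):
    for i in range(0, len(normal), 100):
        opt.update(model, normal[i:i + 100])

# At monitoring time, raise an alarm when the score crosses a threshold.
new_window = np.random.randn(1, 64).astype(np.float32)
print('anomaly' if anomaly_score(model, new_window) > 0.5 else 'normal')
```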

Page 9: Gatsby kaken-2017-pfn okanohara

Life Science

Page 10: Gatsby kaken-2017-pfn okanohara

The National Cancer Center in Japan and Preferred Networks start collaborative research in deep learning

Page 11: Gatsby kaken-2017-pfn okanohara

Accuracy for Breast Cancer Diagnosis

Mammography: 80%

SOTA liquid biopsy: 90%

SOTA liquid biopsy with deep learning: 99%

Page 12: Gatsby kaken-2017-pfn okanohara

Art Creator

Page 13: Gatsby kaken-2017-pfn okanohara

Random sampling of images using GAN [2015]


Page 14: Gatsby kaken-2017-pfn okanohara

PaintsChainer (#PaintsChainer)

GAN training: U-Net generator + super-resolution (sketched below)

Released in Jan. 2017; about one million line drawings have already been colorized

A much-improved newer version will be released soon

http://free-illustrations.gatag.net/2014/01/10/220000.html
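As a rough illustration of the U-Net idea behind this kind of line-art colorization (this is not PaintsChainer's actual network; the layer counts, channel sizes, and loss are assumptions):

```python
import chainer
import chainer.functions as F
import chainer.links as L

class TinyUNet(chainer.Chain):
    """Toy U-Net-style generator: grayscale line art (1ch) -> RGB (3ch).
    Skip connections carry fine line detail from encoder to decoder."""
    def __init__(self):
        super(TinyUNet, self).__init__(
            e1=L.Convolution2D(1, 32, 4, stride=2, pad=1),     # H -> H/2
            e2=L.Convolution2D(32, 64, 4, stride=2, pad=1),    # H/2 -> H/4
            d1=L.Deconvolution2D(64, 32, 4, stride=2, pad=1),  # H/4 -> H/2
            d2=L.Deconvolution2D(64, 3, 4, stride=2, pad=1),   # H/2 -> H
        )

    def __call__(self, x):
        h1 = F.relu(self.e1(x))
        h2 = F.relu(self.e2(h1))
        h3 = F.relu(self.d1(h2))
        # Concatenate the skip connection along channels before upsampling.
        h3 = F.concat((h3, h1), axis=1)     # 32 + 32 = 64 channels
        return F.sigmoid(self.d2(h3))       # RGB in [0, 1]

# In a pix2pix-style setup, this generator would be trained with an
# adversarial loss from a discriminator plus a pixel-wise L1/L2 loss.
```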

Page 15: Gatsby kaken-2017-pfn okanohara

PaintsChainer

Tweet from @munashihc

Page 16: Gatsby kaken-2017-pfn okanohara

Technologies

Page 17: Gatsby kaken-2017-pfn okanohara

Chainer: Flexible deep learning framework

https://github.com/pfnet/chainer

113 contributors

2,473 stars & 639 forks

8,804 commits

Active development & release

— v1.0.0 (June 2015) to v1.23.0 (May 2017)


Original developer: Seiya Tokui
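Chainer's flexibility comes from its define-by-run design: the computation graph is recorded while ordinary Python code runs, so the forward pass can contain data-dependent branches and loops. A minimal sketch (the toy MLP and random data are illustrative only):

```python
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L
from chainer import optimizers

class MLP(chainer.Chain):
    def __init__(self):
        super(MLP, self).__init__(
            l1=L.Linear(4, 16),
            l2=L.Linear(16, 3),
        )

    def __call__(self, x):
        # The graph is built as this Python code executes (define-by-run),
        # so branches and loops over the data are allowed here.
        return self.l2(F.relu(self.l1(x)))

model = MLP()
opt = optimizers.SGD(lr=0.1)
opt.setup(model)

x = np.random.randn(8, 4).astype(np.float32)
t = np.random.randint(0, 3, size=8).astype(np.int32)

loss = F.softmax_cross_entropy(model(x), t)
model.cleargrads()
loss.backward()   # backprop through the graph recorded in the forward pass
opt.update()
print(float(loss.data))
```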

Page 18: Gatsby kaken-2017-pfn okanohara

ChainerRL: deep reinforcement learning library [2016]

Implements various SOTA deep RL algorithms

— Users can quickly try Atari 2600 and OpenAI Gym tasks

Yasuhisa Fujita
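To show how quickly a Gym task can be tried, here is a sketch roughly following the ChainerRL quickstart for (Double) DQN on CartPole; the class names and arguments are recalled from early ChainerRL releases and should be treated as assumptions that may differ slightly between versions.

```python
import numpy as np
import chainer
import chainerrl
import gym

env = gym.make('CartPole-v0')
obs_size = env.observation_space.shape[0]
n_actions = env.action_space.n

# Q-function, optimizer, replay buffer, and exploration policy.
q_func = chainerrl.q_functions.FCStateQFunctionWithDiscreteAction(
    obs_size, n_actions, n_hidden_channels=50, n_hidden_layers=2)
opt = chainer.optimizers.Adam(eps=1e-2)
opt.setup(q_func)
replay = chainerrl.replay_buffer.ReplayBuffer(capacity=10 ** 5)
explorer = chainerrl.explorers.ConstantEpsilonGreedy(
    epsilon=0.1, random_action_func=env.action_space.sample)

agent = chainerrl.agents.DoubleDQN(
    q_func, opt, replay, gamma=0.99, explorer=explorer,
    replay_start_size=500, target_update_interval=100)

for episode in range(200):
    obs, reward, done = env.reset(), 0.0, False
    while not done:
        action = agent.act_and_train(obs.astype(np.float32), reward)
        obs, reward, done, _ = env.step(action)
    agent.stop_episode_and_train(obs.astype(np.float32), reward, done)
```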

Page 19: Gatsby kaken-2017-pfn okanohara

To process this huge amount of data, we need to apply parallel computing to deep learning

Page 20: Gatsby kaken-2017-pfn okanohara

ChainerMN: Scalable Training of Deep Learning Models

ChainerMN developer: Takuya Akiba
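The basic data-parallel pattern with ChainerMN is a thin layer on top of an ordinary Chainer training script: one MPI process per GPU, with gradients averaged by an all-reduce inside the optimizer. A sketch under those assumptions (make_model and load_dataset are hypothetical helpers standing in for any Chainer model and dataset):

```python
import chainer
import chainermn

# Launched with one MPI process per GPU, e.g. `mpiexec -n 8 python train.py`.
comm = chainermn.create_communicator()
device = comm.intra_rank                     # GPU id within this node

model = make_model()                         # hypothetical: any chainer.Chain
chainer.cuda.get_device(device).use()
model.to_gpu()

# The multi-node optimizer all-reduces gradients before each update,
# so every worker applies the same averaged gradient (data parallelism).
optimizer = chainermn.create_multi_node_optimizer(
    chainer.optimizers.Adam(), comm)
optimizer.setup(model)

# Each worker trains on its own shard of the dataset.
train = chainermn.scatter_dataset(load_dataset(), comm)  # hypothetical helper
# ...then set up iterators, updater, and trainer exactly as in single-GPU Chainer.
```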

Page 21: Gatsby kaken-2017-pfn okanohara

Scaling Results for CNTK, MXNet, TensorFlow, and Chainer

Page 22: Gatsby kaken-2017-pfn okanohara

Validation Accuracy against # of GPUs

Page 23: Gatsby kaken-2017-pfn okanohara


Future AI needs 100 Exa ~ 1 Zetta flops

Estimated compute requirements per domain (P: Peta, E: Exa, F: Flops):

Speech Recognition: 10P~ Flops (5,000 hours of speech, 0.1 million hours of generated speech [Baidu 2015])

Image/Video Recognition: 10P (image) ~ 10E (video) Flops (100 million images)

Life Science: 100P ~ 1E Flops (10M SNPs per person; 100 PF for 1 million people, 1 EF for 100 million)

Robotics/Drone: 1E ~ 100E Flops (1 TB/device/year, 1 million ~ 100 million devices)

Autonomous Driving: 1E ~ 100E Flops (1 TB/car/day, 10~1,000 cars, 100 days)

Machine-generated data is much bigger than human-generated data

These estimates are based on: finishing training on 1 GB of data within 1 day requires about 1 Tflops
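The slide's rule of thumb (training on 1 GB of data within one day needs roughly 1 Tflops of sustained compute) reproduces the per-domain ranges; a small sketch of the arithmetic for the autonomous-driving row:

```python
TFLOPS_PER_GB_PER_DAY = 1.0  # rule of thumb from the slide

def required_flops(data_gb):
    """Sustained flops needed to finish training on data_gb within one day."""
    return data_gb * TFLOPS_PER_GB_PER_DAY * 1e12

# Autonomous driving: 1 TB/car/day, 10-1000 cars, 100 days of data.
low = required_flops(1000 * 10 * 100)      # 1e18 flops = 1 Exaflops
high = required_flops(1000 * 1000 * 100)   # 1e20 flops = 100 Exaflops
print('%.0e .. %.0e flops' % (low, high))  # matches the 1E ~ 100E range above
```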

Page 24: Gatsby kaken-2017-pfn okanohara

Computing Infrastructure

Current PFN’s infrastructure

— >1000 GPUs, ~ 10PFlops, connected by InfiniBand in 2Q 2017

— Still not enough for current R&D demand

Unsupervised learning, learning from Video, RL

We are developing a new chip specialized for DL ops

— A super power-efficient chip enabling ~1 Peta DL ops per chip

— Plan to build a cluster capable of 1 Exa DL ops by 2019

Since the brain has ~1 Zetta Flops*1, we require more resources

— We expect to have such a cluster by 2034

— This is optimistic, but we expect several new technologies to emerge


*1 http://timdettmers.com/2015/07/27/brain-vs-deep-learning-singularity/

Page 25: Gatsby kaken-2017-pfn okanohara

Semi-supervised Learning: Virtual Adversarial Training [arXiv:1704.03976]

SOTA for semi-supervised learning on CIFAR-10 and SVHN

Takeru Miyato

* Experimental results including CIFAR-10 and SVHN are in preparation for submission
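For reference, the virtual adversarial loss from the cited paper penalizes, for both labelled and unlabelled examples, how much the predictive distribution changes under the worst-case small perturbation of the input (found approximately by power iteration):

```latex
\mathcal{L}_{\mathrm{vadv}}(x, \theta)
  = D_{\mathrm{KL}}\!\left( p(y \mid x, \hat{\theta}) \,\big\|\, p(y \mid x + r_{\mathrm{vadv}}, \theta) \right),
\qquad
r_{\mathrm{vadv}} = \operatorname*{arg\,max}_{\lVert r \rVert_2 \le \epsilon}
  D_{\mathrm{KL}}\!\left( p(y \mid x, \hat{\theta}) \,\big\|\, p(y \mid x + r, \theta) \right)
```

The full training objective adds the average of this term over labelled and unlabelled data to the supervised cross-entropy loss.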

Page 26: Gatsby kaken-2017-pfn okanohara

IMSAT (VAT) [Hu and Miyato 17]

IMSAT: VAT + information maximization criterion for unsupervised discrete coding

SOTA on unsupervised clustering and hash learning

Result obtained during a 2016 summer internship
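As described in the cited work, IMSAT combines the self-augmented (VAT-style) smoothness penalty with an information-maximization term over the discrete codes; roughly:

```latex
\min_{\theta} \; \mathcal{R}_{\mathrm{SAT}}(\theta)
  \;-\; \lambda \, \big( H(Y) - H(Y \mid X) \big)
```

where H(Y) - H(Y|X) is the mutual information I(X;Y) between inputs and their discrete representations, and R_SAT is the VAT-style regularizer.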

Page 27: Gatsby kaken-2017-pfn okanohara

Conclusion and Future Work

From recognition to planning, control, and creation

— Deep learning was first used for recognition tasks but is now used for many different tasks

Future Work

— Increase data and computing resources significantly (×1000)?

Generate high-volume data in the real world (using robotics?)

New hardware and networks achieving 1 Zetta flops

— Interpretability and controllability of AI systems in critical tasks

— A new way to accumulate the knowledge obtained

New languages and communication for machines (and humans)

— We can learn a lot from brain research