
"Human Capacity Cognitive Computing" by Patrick Ehlen, PhD and Chief Scientist at Loop AI Labs (loop.ai)



Human Capacity Cognitive Computing

Patrick Ehlen, Chief Scientist, Loop AI Labs

March 16, 2017

Cognitive Computing Platforms

• IDC forecast: $12.5B market in 2019 (CAGR of 35%)

• By 2018, ½ of consumers will regularly interact with cognitive computing services

• How will it work?
• What is “cognitive computing,” anyhow?

Which is more difficult?


Communicate in multiple modalities

Embed and recurse over long sequences

This is the rat that ate the malt that lay in the house that Jack built.

“Discrete Infinity”: infinite use of finite means

This is the farmer sowing the corn, that kept the cock that crowed in the morn, that waked the priest all shaven and shorn, that married the man all tattered and torn, that kissed the maiden all forlorn, that milked the cow with the crumpled horn, that tossed the dog, that worried the cat, that killed the rat, that ate the malt that lay in the house that Jack built.
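To make “infinite use of finite means” concrete, here is a minimal Python sketch of recursive embedding generating an unbounded set of sentences from a small, fixed lexicon. The lexicon and rules are illustrative, not from the talk:

```python
# A minimal sketch of "discrete infinity": a finite set of rules and words
# can generate an unbounded set of sentences by embedding clauses recursively.
# The lexicon and rules here are illustrative, not from the talk.
import random

AGENTS = ["the rat", "the cat", "the dog", "the cow with the crumpled horn"]
VERBS = ["ate", "worried", "tossed", "milked"]

def clause(depth):
    """Recursively embed one 'that <agent> <verb> ...' clause per depth level."""
    if depth == 0:
        return "the malt that lay in the house that Jack built"
    return f"{random.choice(AGENTS)} that {random.choice(VERBS)} {clause(depth - 1)}"

# Each added level of embedding yields a new, well-formed sentence:
for d in range(1, 4):
    print(f"This is {clause(d)}.")
```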

• Multimodal Communication
• Recursion
• Discrete Infinity

“The Human Capacity”

3 Possible Sources:

• Interface
• Learning Algorithm
• Architecture

Deep Learning

Hinton (’81, ’86)

• Assemblies for different semantic roles (hack)

Hinton, G.E. (1981) Implementing semantic networks in parallel hardware.
Hinton, G.E. (1986) Learning distributed representations of concepts.

Frankland & Greene (2015)

• Assemblies for different semantic roles (brain)

Frankland, S.M. & Greene, J.D. (2015) An architecture for encoding sentence meanings in left mid-superior temporal cortex. PNAS 112:37

Recap

• (Even though we can all do it…) Language is Hard

• “Human Capacity” probably arises from special architecture

Semantics

• What does anything mean?

Dictionary approach

• Define a thing by its necessary & sufficient features

Bachelor #1, Bachelor #2

Katz, J.J. & Fodor, J.A. (1963) The structure of a semantic theory. Language 39:2

[Figure: one dedicated unit per semantic feature: noun, human, animal, male, never married, young, unmated, seal, academic degree]

“Units Representation”

“Matrix Representation”

Bachelor #1, Bachelor #2

[Figure: the two senses plotted as points in a space with axes human, animal, young]

“Vector Space Representation”
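A minimal sketch of these representations in Python, assuming binary feature values (the particular 0/1 assignments are illustrative): the two senses of “bachelor” as rows of a feature matrix, then compared as vectors in feature space:

```python
# Sketch: the two senses of "bachelor" as rows of a binary feature matrix
# ("Matrix Representation"), then compared as points in feature space
# ("Vector Space Representation"). Feature values are illustrative.
import numpy as np

features = ["noun", "human", "animal", "male", "never married",
            "young", "unmated", "seal", "academic degree"]

bachelor_1 = np.array([1, 1, 0, 1, 1, 1, 0, 0, 0])  # unmarried man
bachelor_2 = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0])  # young fur seal without a mate

cosine = bachelor_1 @ bachelor_2 / (np.linalg.norm(bachelor_1) * np.linalg.norm(bachelor_2))
print(f"cosine similarity: {cosine:.2f}")  # the two senses share only a few features
```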

Problems with Dictionary Approach:

• “necessary and sufficient” features: neither necessary nor sufficient

• Prototypical examples (E. Rosch): “robin” is more representative of bird than “finch” or “penguin”
• Things don’t categorize so easily

• Metaphorical & analogic nature of language (G. Lakoff & M. Johnson, D. Hofstadter)

Problems with Dictionary Approach:

• “Edge cases” (C. Fillmore):
• “Widow”: is a woman who murdered her husband a widow?

“Max went too far today and teapotted a policeman”

(H. Clark)

Distributional approach:

• Determine what words mean solely by their lexical context (surrounding words)

the quick brown fox jumped over the lazy dog
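A minimal sketch of the distributional idea, assuming a two-word context window (the window size and corpus are illustrative):

```python
# Sketch: characterize each word by counts of the words that appear
# around it. Window size and corpus are illustrative.
from collections import Counter, defaultdict

corpus = "the quick brown fox jumped over the lazy dog".split()
window = 2  # words on each side count as context

contexts = defaultdict(Counter)
for i, word in enumerate(corpus):
    lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
    for j in range(lo, hi):
        if j != i:
            contexts[word][corpus[j]] += 1

print(contexts["fox"])  # Counter({'quick': 1, 'brown': 1, 'jumped': 1, 'over': 1})
```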


[Figure: “Context Features”: each word as a row of counts over the context features that surround it]

Distributional approach:

• Determine what words mean solely by their lexical context (surrounding words)

• Use dimensionality reduction to collapse into latent factors (or “microfeatures”)
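A sketch of that reduction using a truncated SVD, one standard way to get latent factors (the count matrix and the choice of k are illustrative, not from the talk):

```python
# Sketch: collapse a sparse word-by-context count matrix into a few
# latent factors ("microfeatures") with a truncated SVD. Values illustrative.
import numpy as np

# rows: words; columns: context features (co-occurrence counts)
counts = np.array([
    [2, 0, 1, 0, 3],
    [0, 1, 0, 2, 0],
    [2, 0, 1, 0, 2],
])

U, S, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2  # keep the top-k latent context features
latent = U[:, :k] * S[:k]  # each row is now a dense k-dimensional word vector
print(latent.round(2))
```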

Distributional approach:

[Figure: “Latent Context Features”: the reduced matrix, with each word as a dense vector over a few latent factors]

Local Representation:

[Figure: one dedicated unit per feature: noun, human, animal, male, never married, young, unmated, seal, academic degree]

Distributed Representation:

x0 x1 x2 x3 x4 x5 x6 x7


[Figure: “young” and “bachelor” as nearby points in a space with axes x1, x2, x3]
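A minimal sketch of the contrast, with illustrative numbers: one-hot (local) vectors are mutually orthogonal, while distributed vectors spread meaning over shared units x0..x7, so related words can overlap:

```python
# Sketch: a local (one-hot) representation dedicates one unit per symbol;
# a distributed representation spreads meaning over shared units x0..x7,
# so related words ("young", "bachelor") land near each other.
import numpy as np

local_bachelor = np.zeros(8); local_bachelor[3] = 1.0   # unit 3 = "bachelor"
local_young    = np.zeros(8); local_young[5] = 1.0      # unit 5 = "young"
print(local_bachelor @ local_young)  # one-hot vectors are orthogonal: always 0

# distributed: values on all 8 units (illustrative numbers)
dist_bachelor = np.array([0.7, 0.1, 0.5, 0.9, 0.0, 0.8, 0.2, 0.1])
dist_young    = np.array([0.6, 0.2, 0.4, 0.8, 0.1, 0.9, 0.1, 0.0])
print(np.round(dist_bachelor @ dist_young, 2))  # overlap reflects shared features
```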

Deep Learning

• Learn distributed representations using neural networks

• Learn from data as it comes in
• Learn from sequences (e.g., sentences)
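As a sketch of what learning distributed representations from sequences can look like, here is a tiny skip-gram-style update loop in plain numpy, one common way to learn word vectors; the corpus, sizes, and learning rate are illustrative, not the talk’s method:

```python
# Minimal sketch: nudge a word's vector toward predicting its neighbors
# (a skip-gram-style objective). Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
vocab = "the quick brown fox jumped over lazy dog".split()
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 4                       # vocab size, embedding width
W_in = rng.normal(0, 0.1, (V, D))          # word vectors (what we want to learn)
W_out = rng.normal(0, 0.1, (D, V))         # context-prediction weights

def step(center, context, lr=0.1):
    h = W_in[idx[center]]                  # current vector for the center word
    scores = h @ W_out
    p = np.exp(scores - scores.max()); p /= p.sum()   # softmax over vocab
    p[idx[context]] -= 1.0                 # gradient of cross-entropy wrt scores
    W_in[idx[center]] -= lr * (W_out @ p)  # update the word vector
    W_out[:] -= lr * np.outer(h, p)        # and the output weights

for _ in range(200):                       # "learn from data as it comes in"
    step("fox", "quick"); step("fox", "brown"); step("fox", "jumped")
print(W_in[idx["fox"]].round(2))           # a learned distributed representation
```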

Deep Learning

• Learn from lots of additional context features (not just other words)

• Visual features (CNNs)
• Parse structure (Recursive NNs)
• Higher-level abstractions from earlier sequences (RNNs)
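A minimal sketch of the RNN idea: the hidden state folds each new input into a running summary of everything earlier in the sequence (all sizes and weights below are illustrative stand-ins):

```python
# Sketch: a recurrent network carries context forward through a sequence.
# Each step folds the new input into a running hidden state, the
# "higher-level abstraction from earlier sequences". Sizes illustrative.
import numpy as np

rng = np.random.default_rng(1)
D, H = 4, 6                                # input width, hidden width
W_x = rng.normal(0, 0.3, (H, D))           # input-to-hidden weights
W_h = rng.normal(0, 0.3, (H, H))           # hidden-to-hidden (recurrent) weights

def rnn(inputs):
    h = np.zeros(H)                        # hidden state = accumulated context
    for x in inputs:                       # one word vector per step
        h = np.tanh(W_x @ x + W_h @ h)
    return h                               # summary of the whole sequence

sentence = [rng.normal(size=D) for _ in range(5)]   # stand-in word vectors
print(rnn(sentence).round(2))
```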

Deep Learning

• Learn from lots of additional context features (not just other words)

“Max went too far today and teapotted a policeman”

(H. Clark)

Deep Learning

• Learn from lots of additional context features (not just other words)

• For Human Capacity Cognitive Computing: HUGE potential “context feature” input space
• Very sparse

Deep Learning

• Large, sparse input fully-connected to many layers

• Complex memory assemblies
• RNN
• LSTM or GRU to retain relevant context features from further upstream
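A sketch of that pipeline, assuming a GRU cell for concreteness (all dimensions and weights are illustrative; an LSTM would add a separate cell state):

```python
# Sketch: a large, sparse input projected through a fully-connected layer,
# then a GRU cell whose gates decide which context features to retain
# from further upstream. All sizes and weights are illustrative.
import numpy as np

rng = np.random.default_rng(2)
N, D, H = 10_000, 16, 8                    # sparse input dim, dense dim, hidden dim
W_proj = rng.normal(0, 0.05, (D, N))       # fully-connected projection layer
Wz, Wr, Wc = (rng.normal(0, 0.3, (H, D + H)) for _ in range(3))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(h, x_sparse):
    x = W_proj @ x_sparse                  # dense projection of sparse features
    xh = np.concatenate([x, h])
    z = sigmoid(Wz @ xh)                   # update gate: how much new vs. old
    r = sigmoid(Wr @ xh)                   # reset gate: which old context to drop
    c = np.tanh(Wc @ np.concatenate([x, r * h]))
    return (1 - z) * h + z * c             # retained context plus new content

h = np.zeros(H)
for _ in range(3):                         # sparse inputs arriving in sequence
    x = np.zeros(N); x[rng.integers(0, N, size=5)] = 1.0   # ~5 active features
    h = gru_step(h, x)
print(h.round(2))
```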

Human Capacity Cognitive Computing Platform

• Handle context feature input from multiple modalities and project into a single representation space (see the sketch after this list)

• Support architectures with specialized assemblies permitting recursion / embedding

• “Discrete Infinity”
• “Fluid” interpretation and understanding
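A minimal sketch of the multimodal projection bullet above: modality-specific linear maps carry text and vision features into one shared space where they can be compared directly (dimensions and weights are illustrative stand-ins):

```python
# Sketch: context features from two modalities (text, vision) projected
# into one shared representation space by modality-specific linear maps.
# Dimensions and weights are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(3)
TEXT_D, VISION_D, SHARED_D = 300, 512, 128
P_text = rng.normal(0, 0.05, (SHARED_D, TEXT_D))
P_vision = rng.normal(0, 0.05, (SHARED_D, VISION_D))

text_features = rng.normal(size=TEXT_D)      # e.g., a word/sentence vector
vision_features = rng.normal(size=VISION_D)  # e.g., CNN image features

shared_text = P_text @ text_features
shared_vision = P_vision @ vision_features
# both now live in the same space, so they can be compared directly
cos = shared_text @ shared_vision / (
    np.linalg.norm(shared_text) * np.linalg.norm(shared_vision))
print(round(float(cos), 3))
```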

Loop Cognitive Computing Platform

• GPU-based appliance
• Human Capacity understanding
• Learns from unstructured and structured data
• Produces a structured representation
• Understands concepts in the context of their domain

Your use of the word “teapot” does not match any of my dictionary entries.