
Damageless Information Hiding Technique using Neural Network

Keio University

Graduate School of Media and Governance

Kensuke Naoe

Abstract

An information hiding technique without embedding any data into the target content

Pattern recognition model

Neural network as classifier (extraction key)

Advantages and disadvantages

Outline

Background

Motivation

Current Problem

Proposed Method

Experiment Results

Future Work

Conclusion

Background

Emergence of the Internet

Content is widely distributed

Information hiding provides reliability

Digital watermarking for Digital Rights Management

Steganography for covert channels

Motivation and current problem

Use one information hiding algorithm with another to strengthen the security of the content:

Digital watermarking

Steganography

Fingerprinting

There are many good information hiding algorithms, but they are difficult to combine:

Embedding may obstruct previously embedded data

Applying another information hiding algorithm may require recalculating the fingerprint of the content

Research Objective

To hide or to relate certain information without embedding any information into the target content

Ability to collaborate with other information hiding algorithms to strengthen the security

Proposed Method

Approach

Embed the hidden signal into a pattern recognition model instead of into the content

Neural network as classifier (extraction key)

Only proper extraction key will lead to proper hidden signal

Why use neural network?

Has the abilities of:

Tolerance to noise

Error correction and complementation

Additional learning

Multi-layered perceptron model

Backpropagation learning (supervised learning)

Proposed Method (Embedding)

1. Frequency transformation of the content
2. Selection of feature subblocks
3. Use feature values as input values for the neural network
4. Generation of the classifier (extraction key)

The hidden signal serves as the teacher signal; the coordinates of the feature subblocks become part of the extraction key. A minimal code sketch of these steps follows.
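The following Python is a hedged illustration of this key generation, under assumptions the slides do not state: 8x8 DCT subblocks, five coefficients per block as features, one hidden layer of 10 neurons, and a short bit vector as the hidden signal. Names such as `make_extraction_key` are hypothetical, not the author's implementation.

```python
# Minimal sketch of key generation (embedding), NOT the author's code:
# assumes 8x8 DCT feature subblocks and a small one-hidden-layer perceptron.
import numpy as np

def dct2(block):
    """2-D DCT-II of a square block via the orthonormal DCT matrix."""
    n = block.shape[0]
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    c = np.sqrt(2.0 / n) * np.cos((2 * i + 1) * k * np.pi / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c @ block @ c.T

def features(image, coords):
    """DC plus a few AC coefficients from each chosen 8x8 subblock
    (the coefficient positions here are illustrative)."""
    f = []
    for r, cc in coords:
        b = dct2(image[r:r + 8, cc:cc + 8].astype(float))
        f.extend([b[0, 0], b[0, 1], b[1, 0], b[4, 4], b[7, 7]])
    return np.tanh(np.array(f))        # squash into a stable input range

def make_extraction_key(image, coords, hidden_bits, epochs=2000, lr=0.5, seed=0):
    """Backpropagation: train the network to output the hidden signal
    (teacher signal) for this content's feature values."""
    x = features(image, coords)
    t = np.asarray(hidden_bits, float)
    rng = np.random.default_rng(seed)
    w1 = rng.normal(scale=0.5, size=(x.size, 10))   # input -> hidden
    w2 = rng.normal(scale=0.5, size=(10, t.size))   # hidden -> output
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    for _ in range(epochs):
        h = sig(x @ w1)                              # forward pass
        y = sig(h @ w2)
        d2 = (y - t) * y * (1 - y)                   # output deltas
        d1 = (d2 @ w2.T) * h * (1 - h)               # hidden deltas
        w2 -= lr * np.outer(h, d2)                   # weight updates
        w1 -= lr * np.outer(x, d1)
    return (w1, w2), coords        # weights + subblock coordinates = key
```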

Proposed Method (Extraction)

1. Frequency transformation of the content
2. Selection of the feature subblocks (coordinates taken from the extraction key)
3. Use feature values as input values for the neural network
4. Application of the classifier (extraction key)

The hidden signal appears as the output signal. A matching extraction sketch follows.
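A matching extraction sketch, reusing `dct2` and `features` from the embedding sketch above (again hypothetical names): only a holder of the trained weights and the subblock coordinates recovers the signal.

```python
# Extraction sketch: recompute the feature values at the key's coordinates
# and feed them through the trained classifier; threshold to recover bits.
def extract_hidden_signal(image, key):
    (w1, w2), coords = key
    x = features(image, coords)                  # same features as embedding
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    y = sig(sig(x @ w1) @ w2)                    # forward pass only
    return (y > 0.5).astype(int)                 # output signal -> hidden bits
```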

What is a neural network?

A neuron (nerve cell) only receives signals and dispatches signals to connected neurons; when organically connected, neurons can process complicated tasks.

A network built from these neurons is called a neural network.

Multi-layered perceptron model

Often used as a non-linear pattern classifier

Calculation of the network

The input value of a neuron is the sum of the products of the network weights and the output values of the previous layer:

$$x_j = \sum_{i=1}^{N} w_{ij}\, y_i$$

where $y_1, \dots, y_N$ are the outputs of the previous layer, $w_{ij}$ is the weight from neuron $i$ to neuron $j$, and $y_j$ is the output of neuron $j$.
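As a minimal illustration, this weighted sum for a whole layer is one matrix-vector product (the numbers below are arbitrary):

```python
import numpy as np

y = np.array([0.2, 0.9, 0.5])           # previous-layer outputs y_1..y_N
W = np.array([[0.1, -0.4],              # W[i, j] holds the weight w_ij
              [0.7,  0.2],
              [-0.3, 0.5]])
x = y @ W                                # x[j] = sum_i w_ij * y_i
print(x)                                 # [0.5  0.35]
```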

Generating the classifier (extraction key)

1. Frequency transformation of the content
2. Selection of feature subblocks
3. Use feature values as input values for the neural network
4. Generation of the classifier (extraction key)

The hidden signal serves as the teacher signal; the coordinates of the feature subblocks become part of the extraction key.

Patterns and signals to conceal

pattern  hidden signal   pattern  hidden signal   pattern  hidden signal
   1        00000          11        01010          21        10100
   2        00001          12        01011          22        10101
   3        00010          13        01100          23        10110
   4        00011          14        01101          24        10111
   5        00100          15        01110          25        11000
   6        00101          16        01111          26        11001
   7        00110          17        10000          27        11010
   8        00111          18        10001          28        11011
   9        01000          19        10010          29        11100
  10        01001          20        10011          30        11101
                                                    31        11110
                                                    32        11111
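The table is simply the 5-bit binary encoding of the pattern numbers 1-32, offset by one; a one-liner reproduces it:

```python
# Reproduce the pattern -> hidden-signal table above.
table = {p: format(p - 1, "05b") for p in range(1, 33)}
assert table[1] == "00000" and table[22] == "10101" and table[32] == "11111"
```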

Backpropagation learning

[Figures: Networks 1-5, signal value vs. input pattern (1-32) after backpropagation learning]

Further experiments

Can the proposed method extract the hidden signal from a high-pass filtered image or a JPEG compressed image? (A test sketch and the result figures follow.)
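A hedged sketch of such a robustness check, reusing `extract_hidden_signal` from above: distort the image, re-extract, and count surviving bits. The 3x3 box-blur high-pass below is only a stand-in; the slides do not specify the actual filter used.

```python
import numpy as np

def box_highpass(img):
    """Crude high-pass: the image minus its 3x3 box blur (illustrative only)."""
    pad = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    blur = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return img - blur

def surviving_bits(image, key, hidden_bits):
    """How many hidden bits survive the distortion?"""
    recovered = extract_hidden_signal(box_highpass(image), key)
    return int(np.sum(recovered == np.asarray(hidden_bits)))
```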

[Figures: Networks 1-5, signal value vs. pattern (1-32), original vs. high-pass filtered image]

[Figures: Networks 1-5, output signal vs. pattern (1-32), original vs. JPEG compressed image]

Future work

Because the method relies on the positions of the feature subblocks, it is weak against geometric attacks: rotation, expansion, and shrinking

Key sharing has to rely on another security technology

Conclusion

An information hiding technique without embedding any data into the target content, using a neural network

Ability to collaborate with other information hiding algorithms

Thank you

Appendix

Tradeoffs for information hiding

Capacity (amount of data to be embedded)

Watermarking (Digital Rights Management): not important; a small amount is enough

Steganography (covert channel): important; the more the better

Fingerprinting (integrity check): not important; the more the better

Robustness (tolerance against attacks on the container)

Watermarking: important; must not be destroyed

Steganography: not important; content and hidden data are not related

Fingerprinting: important; should be weak against alteration

Invisibility (transparency of hidden data)

Watermarking: important; should not disturb the content

Steganography: important; existence should be kept secret

Fingerprinting: not important; existence can be disclosed

Three-layered perceptron model

Three-layer, feed-forward model

Input function: sigmoid function

Backpropagation learning

[Figure: input layer ($x_1, \dots, x_M$), hidden layer, output layer, with weights $w_{ij}$ between input and hidden layers and $w_{jk}$ between hidden and output layers]

Sigmoid function

The input function for the multi-layered perceptron model; "sigmoid" means shaped like the letter S:

$$y = \frac{1}{1 + \exp(-x)}$$
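The derivative needed by backpropagation has the convenient form $y' = y(1 - y)$ when expressed through the output; a small sketch:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):
    """Derivative of the sigmoid, written in terms of its output y = sigmoid(x)."""
    return y * (1.0 - y)
```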

Selection of feature values

A feature subblock has a DC value and various AC values (low, middle, and high frequency). A sketch of reading these bands follows.
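A sketch of splitting an 8x8 DCT block into DC and low/middle/high AC bands; the band boundaries by coordinate sum u + v are an assumption, not the author's exact choice:

```python
def dc_and_ac_bands(dct_block):
    """DC coefficient plus AC coefficients grouped into rough frequency bands."""
    dc = dct_block[0, 0]
    low  = [dct_block[u, v] for u in range(8) for v in range(8) if 0 < u + v <= 2]
    mid  = [dct_block[u, v] for u in range(8) for v in range(8) if 2 < u + v <= 7]
    high = [dct_block[u, v] for u in range(8) for v in range(8) if u + v > 7]
    return dc, low, mid, high
```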

[Figures: percentage vs. level of alteration (increasing in steps of 0.1, levels 1-20), comparing selected feature subblocks with other subblocks; panels: hidden neurons = 10 / threshold = 0.05, hidden neurons = 10 / threshold = 0.1, hidden neurons = 20 / threshold = 0.1]