
Edge Preserving Image Restoration using L1 norm

Vivek Agarwal
The University of Tennessee, Knoxville


Outline

• Introduction

• Regularization based image restoration
  – L2 norm regularization
  – L1 norm regularization

• Tikhonov regularization

• Total Variation regularization

• Least Absolute Shrinkage and Selection Operator (LASSO)

• Results

• Conclusion and future work


Introduction - Physics of Image Formation

[Diagram] Forward process: the object f(x',y') passes through the imaging system with kernel K(x,y,x',y'); the resulting image g(x,y) then passes through the registration system, where noise is added, giving the recorded data g(x,y) + noise. The reverse process (restoration) runs from the recorded data back to f(x',y').
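As a minimal sketch of this forward process, assuming a spatially invariant kernel so that K reduces to a convolution (the test object and noise level are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Forward process: blur a test object f with a spatially invariant
# kernel K (here Gaussian) and add registration noise to get g.
rng = np.random.default_rng(0)
f = np.zeros((64, 64))
f[24:40, 24:40] = 1.0                    # simple square "object" f(x', y')
g_clean = gaussian_filter(f, sigma=2.0)  # imaging system: g = K * f
g = g_clean + 0.01 * rng.standard_normal(f.shape)  # recorded g(x, y) + noise
```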


Image Restoration

• Image restoration is a subset of image processing.

• It is a highly ill-posed problem.

• Most image restoration algorithms use least squares.

• L2 norm based algorithms produce smooth restorations, which are inaccurate if the image contains edges.

• L1 norm algorithms preserve the edge information in the restored images, but they are slow.


Well-Posed Problem

In 1923, the French mathematician Hadamard introduced the notion of well-posed problems. According to Hadamard, a problem is called well-posed if

1. A solution for the problem exists (existence).

2. This solution is unique (uniqueness).

3. This unique solution is stable under small perturbations in the data; in other words, small perturbations in the data should cause small perturbations in the solution (stability).

If at least one of these conditions fails, the problem is called ill-posed (incorrectly posed) and demands special consideration.


Existence

To deal with non-existence we have to enlarge the domain where the solution is sought.

Example: a quadratic equation ax^2 + bx + c = 0 in general form has two solutions:

$$x_1 = \frac{-b + \sqrt{b^2 - 4ac}}{2a}, \qquad x_2 = \frac{-b - \sqrt{b^2 - 4ac}}{2a}$$

If $b^2 - 4ac < 0$, there are no real roots; however, there are complex ones:

$$x_{1,2} = \frac{-b}{2a} \pm i\,\frac{\sqrt{4ac - b^2}}{2a}$$

[Diagram] In the real domain there is no solution; enlarging to the complex domain, a solution exists. Non-existence is harmful.
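A quick numerical check of this enlargement of the domain, as a sketch using Python's standard cmath module (the helper name is illustrative):

```python
import cmath

def quadratic_roots(a, b, c):
    """Both roots of a*x**2 + b*x + c = 0, valid even when b**2 - 4*a*c < 0."""
    d = cmath.sqrt(b * b - 4 * a * c)  # complex sqrt handles negative values
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

print(quadratic_roots(1, 0, 1))  # x**2 + 1 = 0 -> (1j, -1j): complex domain only
```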


Uniqueness

Non-uniqueness is usually caused by a lack or absence of information about the underlying model.

Example: neural networks. The error surface has multiple local minima, and many of these minima fit the training data very well; however, the generalization capabilities of these different solutions (predictive models) can be very different, ranging from poor to excellent. How do we pick a model that is going to generalize well?

[Diagram] Solution #1: bad or good? Solution #2: bad or good? Solution #3: bad or good?


Uniqueness

• Non-uniqueness is not always harmful. It depends on what we are looking for. If we are looking for a desired effect, that is, we know what a good solution looks like, then we can be happy with multiple solutions, simply picking a good one from the variety.

• Non-uniqueness is harmful if we are looking for an observed effect, that is, we do not know what a good solution looks like.

• The best way to combat non-uniqueness is to specify a model using prior knowledge of the domain, or at least to restrict the space where the desired model is searched.


Instability

Instability is caused by an attempt to reverse cause-effect relationships. Nature always solves only the forward problem, because of the arrow of time: cause always goes before effect.

In practice we very often have to reverse the relationship, that is, to go from effect to cause. Examples: convolution-deconvolution, Fredholm integral equations of the first kind.

[Diagram] Cause → Forward Operation → Effect; the inverse problem runs from Effect back to Cause.
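A small sketch of this instability for deconvolution, using an illustrative 1-D Gaussian blur matrix A: even tiny noise added to the effect g destroys the naive reconstruction of the cause f.

```python
import numpy as np

# Build a 1-D Gaussian blur matrix A (the forward operation), blur a
# "cause" f into an "effect" g, add tiny noise, then invert naively.
n = 64
x = np.linspace(-1.0, 1.0, n)
A = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
A /= A.sum(axis=1, keepdims=True)       # each row is a normalized kernel

f = (np.abs(x) < 0.3).astype(float)     # cause: a box signal
g = A @ f + 1e-6 * np.random.default_rng(1).standard_normal(n)

f_naive = np.linalg.solve(A, g)         # effect -> cause, no regularization
print(f"cond(A) = {np.linalg.cond(A):.2e}")           # huge condition number
print(f"error  = {np.linalg.norm(f_naive - f):.2e}")  # the noise is blown up
```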


L1 and L2 Norms

The general expression for the p-norm is given as

$$\|x\|_p = \Big(\sum_i |x_i|^p\Big)^{1/p}$$

L2 norm: $\|x\|_2^2 = \sum_i |x_i|^2$ is the Euclidean distance, or vector distance.

L1 norm: $\|x\|_1 = \sum_i |x_i|$ is also known as the Manhattan norm, because it corresponds to the sum of the distances along the coordinate axes.
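For concreteness, both norms in numpy:

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])
print(np.linalg.norm(x, ord=1))  # L1 (Manhattan): |3| + |-4| + |1| = 8.0
print(np.linalg.norm(x, ord=2))  # L2 (Euclidean): sqrt(9 + 16 + 1) ~ 5.10
```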


Why Regularization?

• Most restoration is based on least squares, but if the problem is ill-posed then the least squares method fails.



Regularization

The general formulation for regularization techniques is

$$\min_x\; \|Ax - g\|_2^2 + \lambda^2 \|Lx\|_2^2$$

where $\|Ax - g\|_2^2$ is the error term, $\lambda$ is the regularization parameter, and $\|Lx\|_2^2$ is the penalty term.


Tikhonov Regularization

• Tikhonov regularization is an L2 norm, or classical, regularization technique.

• It produces a smoothing effect on the restored image.

• In zero order Tikhonov regularization, the regularization operator L is the identity matrix.

• The expression used to compute the Tikhonov regularized solution is

$$f = \left(A^{T}A + \lambda^2 L^{T}L\right)^{-1} A^{T} g$$

• In higher order Tikhonov regularization, L is either a first order or a second order differentiation matrix.
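A minimal numpy sketch of this solution (the helper name is illustrative; passing L = None gives the zero order case):

```python
import numpy as np

def tikhonov_restore(A, g, lam, L=None):
    """Tikhonov solution f = (A^T A + lam^2 L^T L)^(-1) A^T g."""
    n = A.shape[1]
    L = np.eye(n) if L is None else L   # zero order: L = identity
    return np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ g)

# e.g. with the 1-D blur system A, g from the instability sketch:
# f_tik = tikhonov_restore(A, g, lam=1e-3)
```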


Tikhonov Regularization

[Figure] Original image (left) and blurred image (right).

Tikhonov Regularization - Restoration

[Figure] Reconstructed image for λ = 7.9123e-12.

Total Variation

• Total Variation (TV) is a deterministic approach.

• This regularization method preserves the edge information in the restored images.

• The TV regularization penalty function obeys the L1 norm.

• The mathematical expression for TV regularization is given as

$$T(x) = \frac{1}{2}\|Ax - b\|_2^2 + \lambda \int |\nabla x|\,dx$$


Difference between Tikhonov regularization and Total Variation

S.No | Tikhonov Regularization | Total Variation Regularization
1. | $\|Ax - g\|_2^2 + \lambda^2\|Ix\|_2^2$ | $\|Ax - g\|_2^2 + \lambda\|\nabla x\|_1$
2. | Assumes smooth and continuous information | Smoothness is not assumed
3. | Computationally less complex | Computationally more complex
4. | Restored image is smooth | Restored image is blocky and preserves the edges


Computational Challenges

Total Variation:

$$T(x) = \frac{1}{2}\|Ax - b\|_2^2 + \lambda \int |\nabla x|\,dx$$

Gradient:

$$\nabla T(x) = A^{T}Ax - A^{T}b - \lambda\,\nabla \cdot \left(\frac{\nabla x}{|\nabla x|}\right) = 0$$

Setting the gradient to zero yields a non-linear PDE.


Computational Challenges (contd.)

• An iterative method is necessary to solve it.

• The TV function is non-differentiable at zero.

• The operator $\nabla \cdot \left(\frac{\nabla x}{|\nabla x|}\right)$ is non-linear.

• The ill conditioning of the operator causes numerical difficulties.

• Good preconditioning is required.


Computation of the Regularization Operator

Total Variation is computed using the formulation

$$T(f) = \frac{1}{2}\|Af - g\|_2^2 + \lambda \int |\nabla f|\,dx$$

The total variation solution is obtained after minimization of T(f). Its gradient can be written in terms of the total variation penalty function L and the least squares terms as

$$\mathrm{grad}\,T(f) = \left(A^{T}A + \lambda L(f)\right)f - A^{T}g$$

Setting the gradient to zero and lagging L by one step gives the fixed point iteration

$$\left(A^{T}A + \lambda L(f^{(\nu)})\right)f^{(\nu+1)} = A^{T}g$$

gAfLAAf


Computation of the Regularization Operator

Discretization of the total variation function:

$$T(f) = \frac{1}{2}\sum_{i=1}^{n_x}\sum_{j=1}^{n_y}\sqrt{\left(D_{i,j}^{x}f\right)^2 + \left(D_{i,j}^{y}f\right)^2}$$

where the difference operators are

$$D_{i,j}^{x}f = \frac{f_{i+1,j} - f_{i,j}}{\Delta x}, \qquad D_{i,j}^{y}f = \frac{f_{i,j+1} - f_{i,j}}{\Delta y}$$

To remove the non-differentiability at zero, the square root is smoothed with $\psi(t) = \sqrt{t^2 + \beta^2}$, and the gradient of the total variation is obtained from the smoothed functional

$$T(f) = \frac{1}{2}\sum_{i=1}^{n_x}\sum_{j=1}^{n_y}\psi\!\left(\left(D_{i,j}^{x}f\right)^2 + \left(D_{i,j}^{y}f\right)^2\right)$$
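A short sketch of evaluating the discretized, smoothed functional (boundary handling and the ψ convention are illustrative assumptions):

```python
import numpy as np

def smoothed_tv(f, beta=1e-3, dx=1.0, dy=1.0):
    """Discrete smoothed total variation of a 2-D image f (a sketch)."""
    Dxf = (f[1:, :-1] - f[:-1, :-1]) / dx   # D^x_{i,j} f on the common grid
    Dyf = (f[:-1, 1:] - f[:-1, :-1]) / dy   # D^y_{i,j} f on the common grid
    return 0.5 * np.sum(np.sqrt(Dxf**2 + Dyf**2 + beta**2))
```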


Regularization Operator

The regularization operator is computed using the expression

$$L(f) = D_x^{T}\,\mathrm{diag}\!\left(\psi'(f)\right)D_x + D_y^{T}\,\mathrm{diag}\!\left(\psi'(f)\right)D_y = \begin{bmatrix} D_x^{T} & D_y^{T} \end{bmatrix} \begin{bmatrix} \mathrm{diag}(\psi'(f)) & 0 \\ 0 & \mathrm{diag}(\psi'(f)) \end{bmatrix} \begin{bmatrix} D_x \\ D_y \end{bmatrix}$$

where

$$\psi'_{i,j}(f) = \psi'\!\left(\left(D_{i,j}^{x}f\right)^2 + \left(D_{i,j}^{y}f\right)^2\right)$$
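A sparse-matrix sketch of assembling L(f) under the same illustrative conventions (Neumann boundaries and the weight formula are assumptions, scipy.sparse assumed available):

```python
import numpy as np
import scipy.sparse as sp

def diff_matrix(n):
    """n x n forward difference matrix; last row zeroed (Neumann boundary)."""
    D = sp.diags([-1.0, 1.0], [0, 1], shape=(n, n), format="lil")
    D[-1, -1] = 0.0
    return D.tocsr()

def tv_operator(f, beta=1e-3):
    """Assemble L(f) = Dx^T diag(psi') Dx + Dy^T diag(psi') Dy for image f."""
    ny, nx = f.shape
    Dx = sp.kron(sp.eye(ny), diff_matrix(nx))   # differences along rows
    Dy = sp.kron(diff_matrix(ny), sp.eye(nx))   # differences along columns
    v = f.ravel()
    w = 1.0 / np.sqrt((Dx @ v) ** 2 + (Dy @ v) ** 2 + beta**2)  # psi' weights
    W = sp.diags(w)
    return Dx.T @ W @ Dx + Dy.T @ W @ Dy
```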


Lasso Regression

• Lasso, for "Least Absolute Shrinkage and Selection Operator", is a shrinkage and selection method for linear regression introduced by Tibshirani (1996).

• It minimizes the usual sum of squared errors, with a bound on the sum of the absolute values of the coefficients:

$$\min_{\beta}\;\sum_i \Big(y_i - \sum_j x_{ij}\beta_j\Big)^2 \quad \text{subject to} \quad \sum_j |\beta_j| \le s$$

• Computing the lasso solution is a quadratic programming problem that is best solved by the least angle regression (LARS) algorithm.

• Lasso also uses an L1 penalty norm.
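As a sketch, the equivalent penalized (Lagrangian) form of the lasso can also be solved by iterative soft thresholding (ISTA); this is an illustrative alternative to the least angle regression algorithm mentioned above:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm (component-wise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, iters=500):
    """ISTA for the penalized lasso: min ||y - X b||_2^2 + lam * ||b||_1."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # reciprocal Lipschitz constant
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ b - y)             # half-gradient of the LS term
        b = soft_threshold(b - step * grad, step * lam / 2.0)
    return b
```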


Ridge Regression and Lasso Equivalence

• The cost function of ridge regression is given as

$$C = \sum_{i=1}^{p}\left(y_i - \hat{f}(x_i)\right)^2 + \lambda\sum_{j=1}^{m} w_j^2$$

• Ridge regression is identical to zero order Tikhonov regularization.

• The analytical solutions of ridge and Tikhonov are similar:

$$\beta = \left(X^{T}X + \lambda I\right)^{-1}X^{T}y$$

• The bias introduced favors solutions with small weights, and the effect is to smooth the output function.
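The analytical solution, as a one-line numpy sketch:

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge / zero order Tikhonov: (X^T X + lam I)^(-1) X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```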


Ridge Regression and Lasso Equivalence

• Instead of a single value of λ, a different value λ_j can be used for each pixel:

$$C = \sum_{i=1}^{p}\left(y_i - \hat{f}(x_i)\right)^2 + \sum_{j=1}^{m} \lambda_j w_j^2$$

• This should provide the same solution as lasso regression (regularization).

• Thus we establish the relation between lasso and zero order Tikhonov; since both lasso and Total Variation use L1 norm penalties, a relation between Total Variation and lasso should also exist.

[Diagram] Tikhonov ↔ Lasso: proved. Lasso ↔ Total Variation (both are L1 norm penalties): our aim, to prove.


L1 norm regularization - Restoration

[Figure] Synthetic images: input image (left) and blurred and noisy image (right).


L1 norm regularization - Restoration

LASSO Restoration

Total Variation Restoration


L1 norm regularization - Restoration

[Figure] I, II, and III degrees of blur: blurred and noisy images; Total Variation regularization; LASSO regularization.


L1 norm regularization - Restoration

[Figure] I, II, and III levels of noise: blurred and noisy images; Total Variation regularization; LASSO regularization.


Cross Section of Restoration

[Figure] Cross sections of the Total Variation regularization and LASSO regularization restorations at different degrees of blurring.


Cross Section of Restoration

[Figure] Cross sections of the Total Variation regularization and LASSO regularization restorations at different levels of noise.


Comparison of Algorithms

[Figure] Original image, LASSO restoration, Tikhonov restoration, and Total Variation restoration.


Effect of Different Levels of Noise and Blurring

[Figure] Blurred and noisy image, LASSO restoration, Tikhonov restoration, and Total Variation restoration.


Numerical Analysis of Results - Airplane

First Level of Noise (Plane):

Method | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 2 | 10 | 2.05e-02 | 81.4 | 1.74 | 2.50
LASSO Regression | 1 | 6 | 1.00e-04 | 81.4 | 1.81 | 0.80
Tikhonov Regularization | -- | -- | 1.288e-10 | 81.4 | 9.85 | 0.20

Second Level of Noise (Plane):

Method | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 1 | 15 | 1e-03 | 83.5 | 3.54 | 1.4
LASSO Regression | 1 | 2 | 1e-03 | 83.5 | 4.228 | 0.8
Tikhonov Regularization | -- | -- | 1.12e-10 | 83.5 | 11.2 | 0.30


Numerical Analysis of Results - Shelves and Airplane

Shelves:

Method | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 2 | 11 | 1.00e-04 | 84.1 | 2.01 | 2.00
LASSO Regression | 1 | 8 | 1.00e-06 | 84.1 | 1.23 | 0.90

Plane:

Method | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 2 | 10 | 1.00e-03 | 81.2 | 3.61 | 2.10
LASSO Regression | 1 | 14 | 1.00e-03 | 81.2 | 3.59 | 1.00


Graphical Representation – 5 Real Images

Different degrees of Blur

[Charts] For each degree of blur, paired bar charts over images 1-5: image restoration time (in minutes) and residual error (in %), comparing the TV method with the Lasso method.


Graphical Representation - 5 Real Images

Different levels of Noise

[Charts] For each level of noise, paired bar charts over images 1-5: image restoration time (in minutes) and residual error (in %), comparing the TV method with the Lasso method.


Effect of Blurring and Noise

[Charts] Effect of blurring: residual error (%) for images 1-5 at the first, second, and third degrees of blurring. Effect of noise: residual error (%) for images 1-5 at the first, second, and third levels of noise. Effect of blurring on error and time: restoration time (minutes) and residual error (%) versus degree of blurring. Effect of noise on error and restoration time: restoration time (minutes) and residual error (%) versus noise level.


Conclusion

• The Total Variation method preserves the edge information in the restored image.

• The restoration time of Total Variation regularization is high.

• LASSO provides an impressive alternative to TV regularization.

• The restoration time of LASSO regularization is half that of TV regularization.

• The restoration quality of LASSO is better than or equal to that of TV regularization.


Conclusion

• Both LASSO and TV regularization fail to suppress the noise in the restored images.

• Analysis shows that an increase in the degree of blur increases the restoration error.

• An increase in the noise level does not have a significant influence on the restoration time, but it affects the residual error.