Anomaly Detection Using Graphical Lasso (グラフィカル Lasso を用いた異常検知), M1 高品 佑也




  1. Anomaly Detection Using Graphical Lasso, M1 高品 佑也
  2. (slide content lost in extraction)
  3. (slide content lost in extraction)
  4. Anomaly detection compares the likelihoods ln p(x | y = normal) and p(x | y = anomaly)
  5. In practice only ln p(x | y = normal) is usually available
  6. Anomaly score: ln p(x | y = anomaly) - ln p(x | y = normal); without anomalous data, use -ln p(x | y = normal)
  7. (slide content lost in extraction)
  8. Model the normal data as a zero-mean multivariate Gaussian: p(x) = N(x | 0, Λ^{-1})  (1), with precision matrix Λ = Σ^{-1}  (2)
  9. Estimate Λ by maximum a posteriori with a Laplace prior:
     Λ̂ = argmax_Λ { Σ_{n=1}^N ln N(x^{(n)} | 0, Λ^{-1}) + ln p(Λ) },  p(Λ) ∝ exp(-ρ ||Λ||_1),
     where the scale parameter ρ controls the strength of the penalty
  10. Dropping terms that do not depend on Λ gives
      Λ̂ = argmax_Λ { ln det Λ - tr(SΛ) - ρ ||Λ||_1 }  (3),
      where S is the sample covariance matrix
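Objective (3) is easy to evaluate directly. A minimal sketch (the helper name and toy matrices are my own; note that many implementations penalize only the off-diagonal entries of Λ) confirms that with ρ = 0 the objective is maximized at Λ = S^{-1}:

```python
import numpy as np

def glasso_objective(Lam, S, rho):
    """Penalized log-likelihood (3): ln det(Lam) - tr(S Lam) - rho * ||Lam||_1."""
    sign, logdet = np.linalg.slogdet(Lam)
    if sign <= 0:
        return -np.inf  # the objective is only defined for positive-definite Lam
    return logdet - np.trace(S @ Lam) - rho * np.abs(Lam).sum()

S = np.array([[2.0, 0.5],
              [0.5, 1.0]])           # toy sample covariance
mle = np.linalg.inv(S)               # with rho = 0 the maximizer is S^{-1}
nearby = mle + 0.1 * np.eye(2)
assert glasso_objective(mle, S, 0.0) > glasso_objective(nearby, S, 0.0)
```

Increasing ρ trades likelihood for sparsity, so the maximizer moves away from S^{-1} and acquires exact zeros.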
  11. As in the Lasso, the L_1 penalty drives many entries of Λ̂ exactly to zero
  12. A sparse Λ defines a Gaussian Markov random field (MRF): Λ_{i,j} = 0 means x_i and x_j are conditionally independent given the other variables
  13. (slide on the graphical lasso; remaining content lost in extraction)
  14. (slide content lost in extraction)
  15. Score each variable separately: a_i(x) = -ln p(x_i | x_{-i}), where x_{-i} denotes all variables except x_i
  16. For the Gaussian model this evaluates to
      a_i(x) = -ln p(x_i | x_{-i}) = (1/2) ln(2π / Λ_{i,i}) + (1 / (2 Λ_{i,i})) (Σ_{j=1}^M Λ_{i,j} x_j)^2  (4)
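Score (4) is cheap to evaluate once Λ̂ is available. A minimal sketch (the function name is my own):

```python
import numpy as np

def anomaly_scores(x, Lam):
    """Per-variable score (4): a_i(x) = 0.5*ln(2*pi/Lam_ii) + (sum_j Lam_ij x_j)^2 / (2*Lam_ii)."""
    diag = np.diag(Lam)
    m = Lam @ x                      # m_i = sum_j Lam_{i,j} x_j
    return 0.5 * np.log(2 * np.pi / diag) + m**2 / (2 * diag)

# With Lam = I the variables are independent standard normals, so a_i(x)
# reduces to the negative log of the univariate N(0, 1) density at x_i.
x = np.array([0.0, 2.0])
print(anomaly_scores(x, np.eye(2)))
```

A large a_i flags variable i as the one behaving inconsistently with the rest of the measurement.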
  17. Fit two precision matrices: Λ on the reference data and Λ' on the test data
  18. Density ratio of the two Gaussian models:
      r(x) = N(x | 0, Λ'^{-1}) / N(x | 0, Λ^{-1}) ∝ exp{ -(1/2) x^T (Λ' - Λ) x }
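With both precision matrices plugged in directly, the log of this ratio has a closed form: the normalizing constants contribute the log-determinant difference. A sketch (function name is my own):

```python
import numpy as np

def log_density_ratio(x, Lam_test, Lam_ref):
    """ln r(x) = ln N(x | 0, Lam_test^{-1}) - ln N(x | 0, Lam_ref^{-1})."""
    _, ld_test = np.linalg.slogdet(Lam_test)
    _, ld_ref = np.linalg.slogdet(Lam_ref)
    return 0.5 * (ld_test - ld_ref) - 0.5 * x @ (Lam_test - Lam_ref) @ x

# Identical models give ln r(x) = 0 for every x.
assert log_density_ratio(np.ones(3), 2 * np.eye(3), 2 * np.eye(3)) == 0.0
```

Estimating the two matrices separately and plugging them in is only one option; the next slide's direct estimation of r(x) avoids modeling each density on its own.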
  19. Direct density-ratio estimation [8]: writing p(x) = r(x) p'(x), estimate
      r̂ = argmin_r KL(p || r p'),  subject to ∫ dx p'(x) r(x) = 1
  20. Related extension: sparse Gaussian MRF mixtures for anomaly detection [7]
  21. Implementation: sklearn.covariance.GraphLasso; notebook glasso/glassoanomaly.ipynb at https://github.com/ytakashina/notebooks/
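A minimal usage sketch with scikit-learn (in recent releases the estimator is named GraphicalLasso; GraphLasso is the older name used on the slide; the chain-structured toy precision is my own):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso  # called GraphLasso in older scikit-learn

rng = np.random.default_rng(0)
# Ground-truth chain graph: variables 0 and 2 interact only through variable 1,
# so the true precision has Lambda[0, 2] == 0.
true_prec = np.array([[2.0, 0.6, 0.0],
                      [0.6, 2.0, 0.6],
                      [0.0, 0.6, 2.0]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(true_prec), size=2000)

model = GraphicalLasso(alpha=0.1).fit(X)  # alpha plays the role of rho in (3)
Lam = model.precision_
print(np.round(Lam, 2))
```

model.precision_ is the Λ̂ that feeds the per-variable score (4) on slide 16.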
  22. Appendix
  23. Proof 1: the conditional is p(x_1, x_2 | x_3, ..., x_M) = p(x) / p(x_3, ..., x_M), where p(x_3, ..., x_M) = ∫ dx_1 dx_2 p(x)
  24. Proof 2: for Gaussian x this becomes
      p(x_1, x_2 | x_3, ..., x_M) = (2π)^{-M/2} det Λ^{1/2} exp{ -(1/2) x^T Λ x } / p(x_3, ..., x_M)
  25. Expanding the quadratic form (using Λ_{i,j} = Λ_{j,i}):
      x^T Λ x = Σ_{i=1}^M x_i Σ_{j=1}^M Λ_{i,j} x_j
              = Σ_{i=1}^2 x_i Σ_{j=1}^2 Λ_{i,j} x_j + 2 Σ_{i=1}^2 x_i Σ_{j=3}^M Λ_{i,j} x_j + Σ_{i=3}^M x_i Σ_{j=3}^M Λ_{i,j} x_j
              = Λ_{1,1} x_1^2 + 2 Λ_{1,2} x_1 x_2 + Λ_{2,2} x_2^2 + 2 x_1 Σ_{j=3}^M Λ_{1,j} x_j + 2 x_2 Σ_{j=3}^M Λ_{2,j} x_j + const.
  26. When Λ_{1,2} = 0, the single-variable conditionals are
      p(x_1 | x_3, ..., x_M) ∝ exp{ -(1/2)(Λ_{1,1} x_1^2 + 2 x_1 Σ_{j=3}^M Λ_{1,j} x_j) } and
      p(x_2 | x_3, ..., x_M) ∝ exp{ -(1/2)(Λ_{2,2} x_2^2 + 2 x_2 Σ_{j=3}^M Λ_{2,j} x_j) }
  27. The joint conditional
      p(x_1, x_2 | x_3, ..., x_M) ∝ exp{ -(1/2)(Λ_{1,1} x_1^2 + 2 Λ_{1,2} x_1 x_2 + Λ_{2,2} x_2^2 + 2 x_1 Σ_{j=3}^M Λ_{1,j} x_j + 2 x_2 Σ_{j=3}^M Λ_{2,j} x_j) }
      equals the product p(x_1 | x_3, ..., x_M) p(x_2 | x_3, ..., x_M) exactly when the cross term 2 Λ_{1,2} x_1 x_2 vanishes
  28. Hence Λ_{1,2} = 0 ⇔ p(x_1, x_2 | x_3, ..., x_M) = p(x_1 | x_3, ..., x_M) p(x_2 | x_3, ..., x_M); by the same argument, for any pair (i, j), Λ_{i,j} = 0 ⇔ x_i and x_j are conditionally independent given the other variables
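The equivalence on slide 28 can be checked numerically. In the sketch below (the 3-variable precision matrix is my own), x_1 and x_2 are marginally correlated because both couple to x_3, yet their conditional covariance given x_3, computed as a Schur complement, vanishes because Λ_{1,2} = 0:

```python
import numpy as np

# Precision with Lam[0, 1] == 0: x1 and x2 interact only through x3.
Lam = np.array([[2.0, 0.0, 0.8],
                [0.0, 2.0, 0.8],
                [0.8, 0.8, 2.0]])
Sigma = np.linalg.inv(Lam)

# Marginally, x1 and x2 are correlated ...
print("marginal cov:", Sigma[0, 1])

# ... but conditioning on x3 decouples them: the conditional covariance of
# (x1, x2) given x3 is the Schur complement of the x3 block.
cond = Sigma[:2, :2] - Sigma[:2, 2:] @ np.linalg.inv(Sigma[2:, 2:]) @ Sigma[2:, :2]
print("conditional cov:", cond[0, 1])
```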
  29. Proof 3: solve (3) one row/column at a time. Partition
      Λ̃ = [ L  l ; l^T  λ ],  W̃ = Λ̃^{-1} = [ W  w ; w^T  w_0 ],  S̃ = [ R  s ; s^T  s_r ];
      each sweep updates one column w, ŵ = argmin_w { w^T W^{-1} w : ||w - s||_∞ ≤ ρ }
  30. This subproblem is equivalent to a Lasso problem:
      β̂ = argmin_β { (1/2) ||W^{1/2} β - b||_2^2 + ρ ||β||_1 },  where b = W^{-1/2} s and β = W^{-1} w
  31. Proof 4: derivation of (4). Let L be the precision matrix of the marginal p(x_{-i}). Then
      a_i(x) = -ln p(x_i | x_{-i})
             = -ln [ (2π)^{-M/2} det Λ^{1/2} exp{ -(1/2) x^T Λ x } / ( (2π)^{-(M-1)/2} det L^{1/2} exp{ -(1/2) x_{-i}^T L x_{-i} } ) ]
             = (1/2) ln 2π - (1/2) ln (det Λ / det L) + (1/2) (x^T Λ x - x_{-i}^T L x_{-i}).
      With l = Λ_{-i,i} and λ = Λ_{i,i}, the Schur-complement identity gives det Λ / det L = λ, and
      x^T Λ x - x_{-i}^T L x_{-i} = 2 x_i l^T x_{-i} + λ x_i^2 + (l^T x_{-i})^2 / λ = (1/λ) (Σ_{j=1}^M Λ_{i,j} x_j)^2,
      which yields (4)
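The two identities behind (4), taking L as the precision of the marginal p(x_{-i}), can be verified numerically on a random positive-definite Λ of my own construction:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
Lam = A @ A.T + 4 * np.eye(4)          # random positive-definite precision
Sigma = np.linalg.inv(Lam)
i, rest = 0, [1, 2, 3]

# L = precision of the marginal p(x_{-i}) = inverse of the covariance sub-block.
L = np.linalg.inv(Sigma[np.ix_(rest, rest)])

# Schur-complement identity: det(Lam) / det(L) = Lam_{i,i}.
assert np.isclose(np.linalg.det(Lam) / np.linalg.det(L), Lam[i, i])

# Quadratic-form identity:
# x^T Lam x - x_{-i}^T L x_{-i} = (sum_j Lam_{i,j} x_j)^2 / Lam_{i,i}.
x = rng.normal(size=4)
lhs = x @ Lam @ x - x[rest] @ L @ x[rest]
assert np.isclose(lhs, (Lam[i] @ x) ** 2 / Lam[i, i])
```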
  32. [1] (in Japanese), 2015. [2] (in Japanese), Vol. 5, 2017
  33. [3] T. Ide (in Japanese). http://ideresearch.net/papers/2009_DMSM_Ide.pdf, 2009. [4] T. Ide (in Japanese). http://latent dynamics.net/01/2010_LD_Ide.pdf, 2010
  34. [5] J. Friedman et al. Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9:432-441, 2008. [6] O. Banerjee et al. Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. Journal of Machine Learning Research, 9:485-516, 2008
  35. [7] T. Ide et al. Sparse Gaussian Markov random field mixtures for anomaly detection. Proceedings of the 2016 IEEE International Conference on Data Mining, 955-960, 2016. [8] S. Liu et al. Direct learning of sparse changes in Markov networks by density ratio estimation. Neural Computation, 26:1169-1197, 2014