PyCon JP 2017 (2017-09-08), ota


  1. The Automatic Text Generation by using NLP & DL with Python. PyCon JP 2017 in TOKYO, 2017/9/8, 02:40PM-03:10PM
  2. 1. / 1.1 / 1.2 / 1.3
  3. 1.1 Amazon
  4. 1.2 2 1. 2. 3. 300-500
  5. 1.2
  6. 1.3 1 E2E NLG Challenge http://www.macs.hw.ac.uk/InteractionLab/E2E/ cf. (Text Summarization) 10
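The E2E NLG Challenge linked on this slide maps a meaning representation (attribute-value pairs) to a short restaurant description. A minimal template-based sketch: the MR string format (`name[The Eagle], eatType[coffee shop]`) follows the E2E data, but the templates and attribute handling below are illustrative only.

```python
# Minimal template-based generation from an E2E-style meaning representation.
# The MR format follows the E2E NLG Challenge data; templates are hypothetical.
import re

def parse_mr(mr):
    """Parse 'attr[value], attr[value]' into a dict."""
    return dict(re.findall(r"(\w+)\[([^\]]+)\]", mr))

def realize(mr):
    slots = parse_mr(mr)
    parts = ["{} is a {}".format(slots.get("name", "It"), slots.get("eatType", "venue"))]
    if "food" in slots:
        parts.append("serving {} food".format(slots["food"]))
    if "area" in slots:
        parts.append("in the {} area".format(slots["area"]))
    return " ".join(parts) + "."

mr = "name[The Eagle], eatType[coffee shop], food[Japanese], area[riverside]"
print(realize(mr))
# -> The Eagle is a coffee shop serving Japanese food in the riverside area.
```

Neural systems in the challenge learn this mapping end-to-end instead of hand-writing the templates.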
  7. 1.3.2 WEB cf.
  8. 2. 2.1 2.2 2.3 2.4 (RNN) / LSTM / GAN
  9. 2.1 3 1. 2. 3. / LSTM
  10. 2.2 Markov property http://postd.cc/from-what-is-a-markov-model-to-here-is-how-markov-models-work-1/
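The Markov property named on this slide is the basis of the classic word-chain text generator: the next word is sampled conditioned only on the current state (here, the last two words). A minimal sketch; the corpus and order are illustrative.

```python
# Word-level Markov chain generation: by the Markov property, the next word
# depends only on the current state (the preceding `order` words).
import random
from collections import defaultdict

def build_chain(words, order=2):
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    for _ in range(length):
        successors = chain.get(state)
        if not successors:          # reached a state with no observed successor
            break
        out.append(rng.choice(successors))
        state = tuple(out[-len(state):])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran to the door".split()
print(generate(build_chain(corpus)))
```

With a larger corpus and higher order the output becomes more fluent but also more prone to copying the source verbatim, which is exactly the trade-off the n-gram evaluation in section 4.4 measures.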
  11. 2.3 H. P. Luhn (1958), Luhn's method
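Luhn's 1958 method scores each sentence by how densely it contains "significant" words, i.e. words that remain frequent after stopword removal. A simplified sketch: the stopword list, the frequency threshold, and the squared-hits ratio below are simplifications of Luhn's cluster-based measure.

```python
# Simplified Luhn (1958) extractive summarization: rank sentences by the
# density of significant (frequent, non-stopword) words they contain.
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "of", "and", "to", "in", "it", "with", "was"}

def luhn_summarize(sentences, top_n=1, min_freq=2):
    words = [w for s in sentences for w in s.lower().split() if w not in STOPWORDS]
    freq = Counter(words)
    significant = {w for w, c in freq.items() if c >= min_freq}

    def score(sentence):
        tokens = sentence.lower().split()
        hits = sum(1 for t in tokens if t in significant)
        # Simplified cluster ratio: (significant words)^2 / sentence length.
        return (hits * hits) / len(tokens) if tokens else 0.0

    return sorted(sentences, key=score, reverse=True)[:top_n]

sents = [
    "Python makes text generation easy",
    "Text generation with Python uses language models",
    "Lunch was good",
]
print(luhn_summarize(sents))
```

The `sumy` library mentioned in the NLP community ships a full `LuhnSummarizer` if a production implementation is needed; this sketch only shows the scoring idea.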
  12. 2.4 (RNN) / LSTM / GAN: Andrej Karpathy's char-rnn, tiny shakespeare [7]; Long Short-Term Memory (LSTM); Recurrent Neural Network (RNN), 1995; LSTM, Epoch 100, GPU; GAN
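The LSTM the slide refers to can be made concrete with one forward step of a single cell: input, forget, and output gates plus a candidate cell state. A NumPy sketch; the random weights are stand-ins for what a framework like Keras would learn during training.

```python
# One forward step of an LSTM cell in NumPy, to make the gating explicit.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """x: input vector; h_prev/c_prev: previous hidden and cell state.
    W stacks the four gate weight matrices applied to [h_prev, x]."""
    z = W @ np.concatenate([h_prev, x]) + b
    H = h_prev.size
    i = sigmoid(z[0:H])           # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    o = sigmoid(z[2 * H:3 * H])   # output gate
    g = np.tanh(z[3 * H:4 * H])   # candidate cell state
    c = f * c_prev + i * g        # new cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c

rng = np.random.default_rng(0)
H, X = 4, 3                       # hidden and input sizes (illustrative)
W = rng.normal(size=(4 * H, H + X))
b = np.zeros(4 * H)
h, c = lstm_step(rng.normal(size=X), np.zeros(H), np.zeros(H), W, b)
```

Char-rnn stacks such cells, feeds characters one at a time, and samples the next character from a softmax over the hidden state.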
  13. 3. ([7]) 3.1 3.2
  14. 3.1
  15. 3.2 ([7]) 1) : 100-200 2) keras (RNN/LSTM): 5000 3) Luhn: 1000 4) LexRank/TextRank: 300-400 5) tensorflow/seq2seq: 100000
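Among the compared methods, LexRank/TextRank rank sentences on a similarity graph. A minimal sketch of that idea, using word-overlap (Jaccard) similarity and PageRank-style power iteration; the damping factor 0.85 is the conventional default, and real implementations use TF-IDF cosine (LexRank) or richer similarity.

```python
# Minimal TextRank/LexRank-style sentence ranking: sentences are graph nodes,
# edges are word-overlap similarity, importance comes from power iteration.
import numpy as np

def rank_sentences(sentences, damping=0.85, iters=50):
    sets = [set(s.lower().split()) for s in sentences]
    n = len(sentences)
    sim = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j and (sets[i] | sets[j]):
                sim[i, j] = len(sets[i] & sets[j]) / len(sets[i] | sets[j])
    # Row-normalize into a transition matrix (uniform row if no overlap).
    rowsum = sim.sum(axis=1, keepdims=True)
    trans = np.where(rowsum > 0, sim / np.maximum(rowsum, 1e-12), 1.0 / n)
    score = np.full(n, 1.0 / n)
    for _ in range(iters):
        score = (1 - damping) / n + damping * trans.T @ score
    return score

scores = rank_sentences(["the cat sat", "the cat ran", "hello world"])
print(scores)   # the isolated sentence gets the lowest score
```

A summary is then the top-scoring sentences in original order.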
  16. 4. ([8]) 4.1 4.2 4.2.1 4.2.2 4.2.2.1 4.2.2.2 4.3 4.4 4.4.1 4.4.2 n-gram (n = 1-5) 4.4.3
  17. 4.1 [: 2] 1. 4.2.1 2 B L . K : : 3::7 / 034 19 5/2 065
  18. 4.2.2 2 4.2.2.1 1 361 4.2.2.2 2 1,800 70g 2 (455)
  19. 4.2.2 2 4.2.2.1 1 361
  20. 4.2.2 2 4.2.2.2 2 1,800 70g 2 (455)
  21. 4.3 2 5 . 5 4 3 2 - 1
  22. 4.3 2 ! ? ? ? ? 351
  23. 4.3 2 1 2 "11 "
  24. 4.3 2 Y 90 Y 2 90 Y 1 5 ! 3 70% 10 1 1 572
  25. 4.3 2 2 5
  26. 4.4 n-gram (n = 1, 2, 3, 4, 5), Google 5-gram ([10])
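Section 4.4 evaluates generated texts by comparing their n-grams for n = 1 to 5. The exact statistic used on the slides is not recoverable from this transcript; the sketch below uses one common choice, the Jaccard overlap of word n-gram sets, which likewise shrinks as n grows.

```python
# Word n-gram comparison between two texts, here via Jaccard set overlap.
# Higher overlap at large n indicates near-verbatim copying between texts.
def ngrams(tokens, n):
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def ngram_overlap(text_a, text_b, n):
    a, b = ngrams(text_a.split(), n), ngrams(text_b.split(), n)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

a = "the cat sat on the mat"
b = "the cat sat on the rug"
for n in range(1, 6):
    print(n, round(ngram_overlap(a, b, n), 3))
```

The decreasing scores for larger n mirror the pattern in the slide's 2-gram through 5-gram figures: longer shared sequences are rarer than shared words.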
  27. 4.4.1 21
  28. 4.4.1 a 456, b 448, c 405; 4.4.2 n-gram (n = 1-5), a: , b: a, c: b; ab: 2-gram 1.151, 3-gram 0.582, 4-gram 0.506, 5-gram 0.388; bc: 2-gram 1.386, 3-gram 0.798, 4-gram 0.317, 5-gram 0.207
  29. 4.4.1 a 456 21
  30. 4.4.1 b 448 21
  31. 4.4.1 c 405 21
  32. 4.4.2 n-gram (n = 1-5), a: , b: a, c: b; ab: 2-gram 1.151, 3-gram 0.582, 4-gram 0.506, 5-gram 0.388; bc: 2-gram 1.386, 3-gram 0.798, 4-gram 0.3171, 5-gram 0.2075
  33. 4.4.3 3-gram, 4-gram, 5-gram, n = 5
  34. 5. entity-grid model [11] [9][10] Sentence ordering (Generative Adversarial Network: GAN)
  35. 5. Sentence Ordering, Coherence; Entity-grid model; GAN; entity-grid
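The entity-grid model cited here represents a text as a grid of sentences by entities, used for coherence scoring and sentence ordering. A toy, parser-free sketch: real grids record each entity's syntactic role (subject S, object O, other X) from a parser, whereas this version only marks presence, the grid's degenerate form.

```python
# Toy entity grid: rows are sentences, columns are entities; a cell is "X"
# when the entity is mentioned and "-" otherwise. Full entity-grid models
# use a syntactic parser to record S/O/X roles instead of bare presence.
def entity_grid(sentences, entities):
    grid = []
    for s in sentences:
        tokens = set(s.lower().split())
        grid.append(["X" if e in tokens else "-" for e in entities])
    return grid

sentences = ["the cat chased the mouse", "the mouse escaped", "the cat slept"]
entities = ["cat", "mouse"]
for row in entity_grid(sentences, entities):
    print(" ".join(row))
```

Coherence models then score the column transition patterns (e.g. X followed by -) and prefer sentence orders whose transitions look like those of human-written text.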
  36. 10/1 Julia http://www.nts-book.co.jp/item/detail/summary/it/20171001_35.html
  37. URL https://www.slideshare.net/secret/etCvel5EtrAjfr