
Natural language processing in action : understanding, analyzing, and generating text with Python / Hobson Lane, Cole Howard, Hannes Max Hapke [Printed text]

By: Lane, Hobson
Contributor(s): Hapke, Hannes Max | Howard, Cole
Language: English
Publisher: Shelter Island (New York) : Manning, 2019
Description: xviii, 512 p. : graphs ; 24 cm
ISBN: 978-1-61729-463-1
Subject(s): Automatic language analysis (Linguistics) | Python (programming language)
Contents:
PART 1: Wordy machines ; 1. Packets of thought (NLP overview) ; 2. Build your vocabulary (word tokenization) ; 3. Math with words (TF-IDF vectors) ; 4. Finding meaning in word counts (semantic analysis) ; PART 2: Deeper learning (Neural networks) ; 5. Baby steps with neural networks (perceptrons and backpropagation) ; 6. Reasoning with word vectors (Word2vec) ; 7. Getting words in order with convolutional neural networks (CNNs) ; 8. Loopy (recurrent) neural networks (RNNs) ; 9. Improving retention with long short-term memory networks ; 10. Sequence-to-sequence models and attention ; PART 3: Getting real (real-world NLP challenges) ; 11. Information extraction (named entity extraction and question answering) ; 12. Getting chatty (dialog engines) ; 13. Scaling up (optimization, parallelization, and batch processing) ; Appendix A: Your NLP tools ; Appendix B: Playful Python and regular expressions ; Appendix C: Vectors and matrices (linear algebra fundamentals) ; Appendix D: Machine learning tools and techniques ; Appendix E: Resources ; Appendix F: Glossary ; Appendix G: Setting up your GPU ; Appendix H: Locality-sensitive hashing.
Item type: Libro (Book)
Current location: Biblioteca Universidad Europea del Atlántico, Fondo General
Collection: No ficción (Non-fiction)
Call number: 81'322 LAN nat
Status: Available
Date due: —
Barcode: 3419
Course reserves: Lingüística Aplicada a la Tecnología


Bibliography: p. 481-489.

