Foundation Models for Natural Language Processing

Pre-trained Language Models Integrating Media

Artificial Intelligence: Foundations, Theory, and Algorithms

Publisher's Description

This open access book provides a comprehensive overview of the state of the art in research and applications of Foundation Models and is intended for readers familiar with basic Natural Language Processing (NLP) concepts. 

In recent years, a revolutionary new paradigm has been developed for training NLP models. These models are first pre-trained on large collections of text documents to acquire general syntactic knowledge and semantic information. They are then fine-tuned for specific tasks, which they can often solve with superhuman accuracy. When the models are large enough, they can be instructed by prompts to solve new tasks without any fine-tuning. Moreover, they can be applied to a wide range of media and problem domains, ranging from image and video processing to robot control learning. Because they provide a blueprint for solving many tasks in artificial intelligence, they have been called Foundation Models.
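As a rough illustration of the two usage modes described above, the following Python sketch loads a pre-trained BERT checkpoint and attaches a fresh classification head for fine-tuning, and then steers a generative GPT-2 model with a prompt alone. The Hugging Face transformers library, the checkpoint names, and the toy prompt are illustrative assumptions and are not taken from the book itself.

```python
# Minimal sketch of "pre-train, then fine-tune" vs. "prompt without fine-tuning",
# assuming the Hugging Face transformers library (with PyTorch installed).
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    pipeline,
)

# 1) Pre-train, then fine-tune: load a pre-trained BERT checkpoint and attach
#    a new 2-class head; its weights would be updated on labeled task data
#    (the training loop itself is omitted here).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
classifier = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
inputs = tokenizer("The movie was great!", return_tensors="pt")
logits = classifier(**inputs).logits  # task scores from the (not yet trained) head

# 2) Prompting without fine-tuning: a generative model is steered by the
#    instruction in the prompt alone; no weights are changed.
generator = pipeline("text-generation", model="gpt2")
print(generator("Translate English to French: cheese =>", max_new_tokens=10))
```

In practice the fine-tuned classifier is trained on labeled examples for the target task, whereas the prompted model relies entirely on what it learned during pre-training.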


After a brief introduction to basic NLP models, the main pre-trained language models BERT, GPT, and the sequence-to-sequence Transformer are described, as well as the concepts of self-attention and context-sensitive embeddings. Then, different approaches to improving these models are discussed, such as expanding the pre-training criteria, increasing the length of input texts, or including extra knowledge. An overview of the best-performing models for about twenty application areas is then presented, e.g., question answering, translation, story generation, dialog systems, generating images from text, etc. For each application area, the strengths and weaknesses of current models are discussed, and an outlook on further developments is given. In addition, links are provided to freely available program code. A concluding chapter summarizes the economic opportunities, mitigation of risks, and potential developments of AI.
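For readers who want a concrete picture of the self-attention operation behind the context-sensitive embeddings mentioned above, the following NumPy sketch computes single-head scaled dot-product attention. The random toy inputs, dimensions, and function name are illustrative assumptions, not code from the book.

```python
# Single-head scaled dot-product self-attention, written with plain NumPy.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model) token embeddings; Wq/Wk/Wv: learned projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # pairwise similarities, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V  # each output row mixes information from all tokens

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))                     # 5 tokens, d-dimensional embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
contextual = self_attention(X, Wq, Wk, Wv)      # context-sensitive embeddings
print(contextual.shape)                         # (5, 8)
```

Because every output row is a weighted mixture over all input tokens, the same word receives a different embedding in different sentences, which is what "context-sensitive" means here.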

GENRE: Computers & Internet
RELEASED: May 23, 2023
LANGUAGE: English (EN)
LENGTH: 454 pages
PUBLISHER: Springer International Publishing
SELLER: Springer Nature B.V.
SIZE: 63.2 MB
Hands-On Large Language Models (2024)
Natural Language Processing with Transformers, Revised Edition (2022)
ChatGPT Unplugged Complete Guide (2023)
Automated Machine Learning (2019)
500 Artificial Intelligence (AI) Interview Questions and Answers (2020)
Deep Learning for Coders with fastai and PyTorch (2020)
Künstliche Intelligenz (2021)
Artificial Intelligence (2024)
Retrieval-Augmented Generation (RAG): Empowering Large Language Models (LLMs) (2023)
Hypergraph Computation (2023)
Representation Learning for Natural Language Processing (2020)
Quantum Computing in Artificial Intelligence: Enhancing Applications (2024)
Artificial Intelligence and Cognitive Science (2023)
Big Data and Artificial Intelligence in Digital Finance (2022)
AI Ethics (2023)
Heterogeneous Graph Representation Learning and Applications (2022)
Towards a Code of Ethics for Artificial Intelligence (2017)
Multi-Modal Robotic Intelligence (2025)
Neural Text-to-Speech Synthesis (2023)