Dublin Core
Title
Foundation Models for Natural Language Processing: Pre-trained Language Models Integrating Media
Subject
Natural language & machine translation
Computational linguistics
Artificial intelligence
Expert systems / knowledge-based systems
Machine learning
Description
This open access book provides a comprehensive overview of the state of the art in research and applications of Foundation Models and is intended for readers familiar with basic Natural Language Processing (NLP) concepts. In recent years, a revolutionary new paradigm has been developed for training NLP models. These models are first pre-trained on large collections of text documents to acquire general syntactic knowledge and semantic information. Then, they are fine-tuned for specific tasks, which they can often solve with superhuman accuracy. When the models are large enough, they can be instructed by prompts to solve new tasks without any fine-tuning. Moreover, they can be applied to a wide range of different media and problem domains, ranging from image and video processing to robot control learning.
Creator
Paaß, Gerhard
Giesselbach, Sven
Source
https://directory.doabooks.org/handle/20.500.12854/107926
Publisher
Springer Nature
Date
2023
Contributor
Wahyuni
Rights
http://creativecommons.org/licenses/by/4.0/
Format
PDF
Language
English
Type
Textbook
Identifier
DOI
10.1007/978-3-031-23190-2
ISBN
9783031231902, 9783031231896