About "Modelling Natural Language with Claude Shannon's Notion of Surprisal"
Have you ever wondered how the principles behind Shannon's groundbreaking Information Theory can be interwoven with the intricate fabric of linguistic communication? This book takes you on a fascinating journey, offering insights into how humans process and comprehend language. By applying Information Theory to natural language semantics, it unravels the connection between regularities in linguistic messages and the cognitive intricacies of language processing.

Highlighting the intersections of information theory with linguistics, philosophy, cognitive psychology, and computer science, this book serves as an inspiration for anyone seeking to understand the predictive power of Information Theory in modelling human communication. It builds on seminal works by figures in the field such as Dretske, Hale, and Zipf, exploring concepts like surprisal theory and the principle of least effort.

With its empirical approach, the book not only discusses theory but also ventures into applications of Shannon's Information Theory to real-world language data, strengthened by advanced statistical methods and machine learning. It touches upon challenging areas such as the distinction between mathematical and semantic information, the concept of information in linguistic utterances, and the intricate interplay between truth, context, and meaning.

Whether you are a linguist, a cognitive psychologist, a philosopher, or simply an enthusiast eager to dive deep into the world where language meets information, this book promises a thought-provoking journey.
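As a minimal illustration of the surprisal theory mentioned above (a sketch, not taken from the book itself): Shannon's surprisal of an event with probability p is -log2 p bits, so rarer words carry more information. The toy corpus and unigram counts below are invented for demonstration only.

```python
import math

def surprisal(probability: float) -> float:
    """Shannon surprisal in bits: -log2(p). Rarer events are more surprising."""
    if not 0.0 < probability <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(probability)

# Hypothetical toy corpus; a real study would estimate probabilities
# from a large corpus or a language model's conditional predictions.
corpus = "the cat sat on the mat the cat slept".split()
counts = {}
for word in corpus:
    counts[word] = counts.get(word, 0) + 1
total = len(corpus)

for word in ("the", "cat", "slept"):
    p = counts[word] / total
    print(f"{word}: p = {p:.3f}, surprisal = {surprisal(p):.2f} bits")
# "the" (frequent) has low surprisal; "slept" (rare) has high surprisal.
```

In reading-time studies inspired by this line of work, such per-word surprisal values (usually conditioned on the preceding context rather than unigram frequency) are correlated with human processing effort.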