EconPapers
Quantifying Context With and Without Statistical Language Models

Cassandra L. Jacobs (University of Wisconsin, Department of Psychology)

Chapter 34 in Handbook of Cognitive Mathematics, 2022, pp 1053-1081 from Springer

Abstract: Context is a driving force behind many cognitive computational models and touches nearly all aspects of human cognition. Moreover, researchers in computational linguistics and natural language processing have increasingly incorporated contextual factors into their models of language. This chapter introduces the reader to the most frequent approaches to quantifying context, from symbolic to statistical approaches. The reader will gain a deeper understanding of the parallels between the different approaches and how they build on each other.
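As a minimal illustration of the statistical side of "quantifying context" (a sketch for orientation only, not code from the chapter), a bigram language model treats context as the single preceding word and estimates the conditional probability of the next word from corpus counts. The toy corpus and function names below are purely illustrative.

```python
from collections import Counter

# Toy corpus; purely illustrative, not data from the chapter.
corpus = "the cat sat on the mat the cat slept".split()

# Count context words and (context, next-word) pairs.
unigrams = Counter(corpus[:-1])
bigrams = Counter(zip(corpus[:-1], corpus[1:]))

def p_next(word, prev):
    """Maximum-likelihood estimate of P(word | prev)."""
    if unigrams[prev] == 0:
        return 0.0
    return bigrams[(prev, word)] / unigrams[prev]

# "the" is followed by "cat" twice and "mat" once in this corpus.
print(p_next("cat", "the"))  # 2/3
print(p_next("mat", "the"))  # 1/3
```

Richer models in the same spirit simply widen the conditioning context, from longer n-grams up to the distributed representations used by neural language models.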

Keywords: Context; Language comprehension; Language production; Neural networks; Statistical language models
Date: 2022

There are no downloads for this item; see the EconPapers FAQ for hints about obtaining it.


Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-031-03945-4_17

Ordering information: This item can be ordered from
http://www.springer.com/9783031039454

DOI: 10.1007/978-3-031-03945-4_17


More chapters in Springer Books from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-11-30
Handle: RePEc:spr:sprchp:978-3-031-03945-4_17