EconPapers    
Decoding Consumer Preferences Using Attention-Based Language Models

Joshua Foster and Fredrik Odegaard

Papers from arXiv.org

Abstract: This paper proposes a new demand estimation method using attention-based language models. An encoder-only language model is trained in a two-stage process to analyze the natural language descriptions of used cars from a large US-based online auction marketplace. The approach enables semi-nonparametric estimation of the demand primitives of a structural model representing the private valuations and market size for each vehicle listing. In the first stage, the language model is fine-tuned to encode the target auction outcomes using the natural language vehicle descriptions. In the second stage, the trained language model's encodings are projected into the parameter space of the structural model. The model's capability to conduct counterfactual analyses within the trained market space is validated using a subsample of withheld auction data, which includes a set of unique "zero-shot" instances.
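The second stage described in the abstract maps the fine-tuned encoder's listing embeddings into the structural parameter space. A minimal sketch of that idea, assuming a simple linear projection learned by least squares (the paper's actual projection method, embedding dimensions, and parameter names are not specified here and are purely illustrative):

```python
import numpy as np

# Hypothetical stage-2 sketch: project encoder embeddings of vehicle
# descriptions into structural demand parameters (e.g., a valuation
# location, a valuation scale, and a market size per listing).
# All data below is synthetic; dimensions are assumptions.
rng = np.random.default_rng(0)
n_listings, embed_dim, n_params = 200, 32, 3

# Stand-in for stage-1 output: one embedding per vehicle description.
E = rng.normal(size=(n_listings, embed_dim))

# Synthetic "true" linear map, used only to generate demo targets.
W_true = rng.normal(size=(embed_dim, n_params))
theta = E @ W_true + 0.01 * rng.normal(size=(n_listings, n_params))

# Stage 2: learn a linear map from embedding space to the structural
# parameter space via ordinary least squares.
W_hat, *_ = np.linalg.lstsq(E, theta, rcond=None)

theta_hat = E @ W_hat
rmse = float(np.sqrt(np.mean((theta_hat - theta) ** 2)))
print(f"projection RMSE: {rmse:.4f}")
```

In practice the projection head would be trained jointly with, or on top of, the frozen encoder; the least-squares fit above only illustrates the embedding-to-parameter mapping in its simplest form.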

Date: 2025-07

Downloads: (external link)
http://arxiv.org/pdf/2507.17564 Latest version (application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2507.17564



Page updated 2025-07-26
Handle: RePEc:arx:papers:2507.17564