Generic decoding of seen and imagined objects using hierarchical visual features
Tomoyasu Horikawa and Yukiyasu Kamitani
Additional contact information
Tomoyasu Horikawa: ATR Computational Neuroscience Laboratories
Yukiyasu Kamitani: ATR Computational Neuroscience Laboratories
Nature Communications, 2017, vol. 8, issue 1, 1-15
Abstract:
Object recognition is a key function in both human and machine vision. While brain decoding of seen and imagined objects has been achieved, the prediction is limited to training examples. We present a decoding approach for arbitrary objects using the machine vision principle that an object category is represented by a set of features rendered invariant through hierarchical processing. We show that visual features, including those derived from a deep convolutional neural network, can be predicted from fMRI patterns, and that greater accuracy is achieved for low-/high-level features with lower-/higher-level visual areas, respectively. Predicted features are used to identify seen/imagined object categories (extending beyond decoder training) from a set of computed features for numerous object images. Furthermore, decoding of imagined objects reveals progressive recruitment of higher-to-lower visual representations. Our results demonstrate a homology between human and machine vision and its utility for brain-based information retrieval.
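The abstract describes a two-step pipeline: first decode visual feature values from fMRI activity patterns, then identify the object category whose precomputed feature vector best matches the predicted features. The sketch below illustrates that idea on synthetic data only; the ridge-regression decoder, the array sizes, and the correlation-based matching are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' code) of the generic-decoding idea in the
# abstract: (1) train a regression model to predict visual features from fMRI
# voxel patterns, (2) identify the category of a new sample by comparing the
# predicted feature vector with precomputed category-average features.
# All data here are synthetic; model choice and sizes are assumptions.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_train, n_voxels, n_features = 200, 500, 100   # assumed sizes
n_categories = 50

# Synthetic "ground-truth" mapping from features to voxels (demo only)
true_weights = rng.normal(size=(n_features, n_voxels))
train_features = rng.normal(size=(n_train, n_features))
train_fmri = train_features @ true_weights + 0.5 * rng.normal(size=(n_train, n_voxels))

# 1) Feature decoder: predict each feature dimension from fMRI patterns
decoder = Ridge(alpha=1.0).fit(train_fmri, train_features)

# Category-average feature vectors (in the paper these would come from many
# object images passed through a CNN; here they are random stand-ins)
category_features = rng.normal(size=(n_categories, n_features))

# 2) Identification: simulate a test sample drawn from category 7
test_features = category_features[7] + 0.3 * rng.normal(size=n_features)
test_fmri = test_features @ true_weights + 0.5 * rng.normal(size=n_voxels)
predicted = decoder.predict(test_fmri[None, :])[0]

# Pick the candidate category whose average features correlate best with
# the predicted feature vector
corrs = [np.corrcoef(predicted, c)[0, 1] for c in category_features]
print("identified category:", int(np.argmax(corrs)))  # expected: 7 on this synthetic example

Because identification only compares predicted features against a candidate set, the candidate categories need not appear in the decoder's training data, which is the sense in which the decoding is "generic".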
Date: 2017
Downloads: https://www.nature.com/articles/ncomms15037 (abstract, text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:8:y:2017:i:1:d:10.1038_ncomms15037
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/ncomms15037