Comparing object recognition from binary and bipolar edge images for visual prostheses
Jae-Hyun Jung, Tian Pu and Eli Peli
Working Paper from Harvard University OpenScholar
Abstract:
Visual prostheses require an effective image representation method because of their limited display capability: low resolution with only 2 or 3 grayscale levels. Edges derived from abrupt luminance changes in images carry essential information for object recognition, and binary (black-and-white) edge images have typically been used to convey that information. However, in scenes with complex, cluttered backgrounds, the recognition rate of binary edge images by human observers is limited and additional information is required. The polarity of edges and cusps (black or white features on a gray background) carries important additional information; polarity may provide shape-from-shading cues that are missing in binary edge images, and this depth information may be restored by using bipolar edges. We compared object recognition rates of 26 subjects viewing 16 images presented as binary and as bipolar edge images, to determine the possible impact of bipolar filtering in visual prostheses with 3 or more grayscale levels. Recognition rates were higher with bipolar edge images, and the improvement was significant in scenes with complex backgrounds. The results also suggest that erroneous shape-from-shading interpretation of bipolar edges arising from pigment, rather than from object shape boundaries, may confound recognition.
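The distinction between binary and bipolar edge images described above can be sketched with a signed Laplacian filter: a bipolar edge image keeps the sign of the response (black or white features on gray), while a binary edge image collapses polarity into a single edge level. The 3x3 kernel, threshold value, and quantization scheme below are illustrative assumptions, not the specific filter used in the paper.

```python
import numpy as np

def bipolar_edges(img, thresh=0.1):
    """Quantize a signed Laplacian response to 3 levels:
    -1 (black cusp), 0 (gray background), +1 (white cusp)."""
    # Symmetric 3x3 Laplacian kernel (illustrative choice)
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=float)
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    resp = sum(k[i, j] * p[i:i + h, j:j + w]
               for i in range(3) for j in range(3))
    bipolar = np.zeros((h, w), dtype=int)
    bipolar[resp > thresh] = 1    # bright side of the edge
    bipolar[resp < -thresh] = -1  # dark side of the edge
    return bipolar

def binary_edges(img, thresh=0.1):
    """Collapse polarity: any strong response becomes an edge (1) on background (0)."""
    return (bipolar_edges(img, thresh) != 0).astype(int)
```

Applied to a vertical luminance step, `bipolar_edges` marks the dark side of the boundary -1 and the bright side +1, retaining the shading direction, while `binary_edges` marks both sides with the same level, discarding it.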
Downloads:
http://scholar.harvard.edu/jaehyun_jung/node/475401
Persistent link: https://EconPapers.repec.org/RePEc:qsh:wpaper:475401