
Perceptual Annotation: Measuring Human Vision to Improve Computer Vision

IEEE Transactions on Pattern Analysis and Machine Intelligence 36(8): 1679-1686

For many problems in computer vision, human learners are considerably better than machines. Humans possess highly accurate internal recognition and learning mechanisms that are not yet understood, and they frequently have access to more extensive training data through a lifetime of unbiased experience with the visual world. We propose to use visual psychophysics to directly leverage the abilities of human subjects to build better machine learning systems. First, we use an advanced online psychometric testing platform to make new kinds of annotation data available for learning. Second, we develop a technique for harnessing these new kinds of information, "perceptual annotations," for support vector machines. A key intuition for this approach is that while it may remain infeasible to dramatically increase the amount of data and high-quality labels available for the training of a given system, measuring the exemplar-by-exemplar difficulty and pattern of errors of human annotators can provide important information for regularizing the solution of the system at hand. A case study for the problem of face detection demonstrates that this approach yields state-of-the-art results on the challenging FDDB data set.
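The core idea of the abstract, using per-exemplar human difficulty to regularize an SVM, can be illustrated with a minimal sketch. This is not the paper's exact formulation; it assumes a hypothetical `human_accuracy` signal (fraction of annotators labeling each exemplar correctly) and maps it onto per-sample slack weights via scikit-learn's `sample_weight`:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy data: 2-D points from two classes, stand-ins for image features.
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Hypothetical perceptual annotations: fraction of human annotators who
# labeled each exemplar correctly (1.0 = easy for humans, 0.5 = hard).
human_accuracy = rng.uniform(0.5, 1.0, size=100)

# Weight each exemplar's slack penalty by human difficulty: exemplars that
# humans also find hard contribute less to the loss, so the margin is not
# forced to fit perceptually ambiguous cases as strictly.
weights = human_accuracy  # simplest possible mapping; the paper derives its own

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y, sample_weight=weights)
print(clf.score(X, y))
```

The design point is that `sample_weight` scales each example's contribution to the hinge-loss term, which is one concrete way to inject exemplar-level human-difficulty information into the regularized SVM objective.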


Accession: 058525030


PMID: 26353347

DOI: 10.1109/tpami.2013.2297711

Related references

Determination of far vision with DIN 58220 (criterion 6/10): computer vision test with high resolution monitor in comparison with vision charts. Klinische Monatsblätter für Augenheilkunde 211(6): 380-386, 1998

Preattentive computer vision towards a two-stage computer vision system for the extraction of qualitative descriptors and the cues for focus of attention. Image and Vision Computing 12(9): 583-599, 1994

Measuring stereoscopic vision by means of the distant vision stereotester and the Roemhild-Feldes near vision stereotester. Klinische Monatsblätter für Augenheilkunde 192(1): 68-71, 1988

Computer vision and perceptual psychology. Psychological Bulletin 92(2): 283-309, 1982

Perceptual equations: Implications for computer vision. Irish Journal of Psychology 14(3): 330-342, 1993

Real-world scene perception and perceptual organization: Lessons from Computer Vision. 2013

Design of the Low Vision Quality-of-life Questionnaire (LVQOL) and measuring the outcome of low-vision rehabilitation. American Journal of Ophthalmology 130(6): 793-802, 2000

Measuring outcomes of vision rehabilitation with the Veterans Affairs Low Vision Visual Functioning Questionnaire. Investigative Ophthalmology & Visual Science 47(8): 3253-3261, 2006

Links between vision and somatosensation: Vision can improve the felt position of the unseen hand. Current Biology 11(12): 975-980, 26 June, 2001

A Vision Enhancement System to Improve Face Recognition with Central Vision Loss. Optometry and Vision Science 95(9): 738-746, 2018

Application of the SP theory of intelligence to the understanding of natural vision and the development of computer vision. Springerplus 3: 552, 2014

Benchmarking neuromorphic vision: lessons learnt from computer vision. Frontiers in Neuroscience 9: 374, 2015

Measuring the height of a fluidized bed by computer vision. Chemical Engineering Science 95: 33-42, 2013

Joint solution of low, intermediate, and high-level vision tasks by evolutionary optimization: Application to computer vision at low SNR. IEEE Transactions on Neural Networks 5(1): 83-95, 1994

Comparison study between biological vision and computer vision. Hang Tian Yi Xue Yu Yi Xue Gong Cheng 14(4): 303-307, 2001