Journal article

Getting "fumpered": Classifying objects by what has been done to them


Authors list: Fleming, Roland W.; Schmidt, Filipp

Publication year: 2019

Journal: Journal of Vision

Volume number: 19

Issue number: 4

ISSN: 1534-7362

Open access status: Gold

DOI Link: https://doi.org/10.1167/19.4.15

Publisher: Association for Research in Vision and Ophthalmology


Abstract
Every object acquires its shape from some kind of generative process, such as manufacture, biological growth, or self-organization, in response to external forces. Inferring such generative processes from an observed shape is computationally challenging because a given process can lead to radically different shapes, and similar shapes can result from different generative processes. Here, we suggest that in some cases, generative processes endow objects with distinctive statistical features that observers can use to classify objects according to what has been done to them. We found that from the very first trials in an eight-alternative forced-choice classification task, observers were extremely good at classifying unfamiliar objects by the transformations that had shaped them. Further experiments show that the shape features underlying this ability are distinct from Euclidean shape similarity and that observers can separate and voluntarily respond to both aspects of objects. Our findings suggest that perceptual organization processes allow us to identify salient statistical shape features that are diagnostic of generative processes. By so doing, we can classify objects we have never seen before according to the processes that shaped them.



Citation Styles

Harvard Citation style: Fleming, R. and Schmidt, F. (2019) Getting "fumpered": Classifying objects by what has been done to them, Journal of Vision, 19(4), Article 15. https://doi.org/10.1167/19.4.15

APA Citation style: Fleming, R., & Schmidt, F. (2019). Getting "fumpered": Classifying objects by what has been done to them. Journal of Vision, 19(4), Article 15. https://doi.org/10.1167/19.4.15


Last updated on 2025-06-20 at 12:10