Pedestrians with low vision are at risk of injury when hazards, such as steps and posts, have low visibility. This study aims to validate the software implementation of a computational model that estimates hazard visibility. The model takes as input a photorealistic 3D rendering of an architectural space, together with the acuity and contrast sensitivity of a low-vision observer, and outputs estimates of the visibility of hazards in the space. Our experiments explored whether the model could predict the likelihood of observers correctly identifying hazards. In Experiment 1, we tested fourteen normally sighted subjects wearing blur goggles that simulated moderate or severe acuity reduction. In Experiment 2, we tested ten low-vision subjects with moderate to severe acuity reduction. Subjects viewed computer-generated images of a walkway containing five possible targets ahead—big step-up, big step-down, small step-up, small step-down, or a flat continuation. Each subject saw these stimuli across 250 trials with varied lighting and viewpoints, and indicated which of the five targets was present. On each trial, the model generated a score estimating the visibility of the target. If the model is valid, these scores should predict how accurately the subjects identified the targets. We used logistic regression to examine the relationship between the scores and the participants' responses. For twelve of the fourteen normally sighted subjects with artificial acuity reduction and for all ten low-vision subjects, there was a significant relationship between the scores and the participants' probability of correct identification. These experiments provide evidence for the validity of a computational model that predicts the visibility of architectural hazards.
This work lays the foundation for future validation of this hazard evaluation tool, which may help architects assess the visibility of hazards in their designs, thereby enhancing the accessibility of spaces for people with low vision.
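The per-subject analysis described above can be sketched as a logistic regression of trial correctness on the model's visibility score. The sketch below uses synthetic data and a minimal pure-Python maximum-likelihood fit; it is an illustration of the statistical approach, not the study's actual data or code, and all names and parameter values are hypothetical.

```python
import math
import random

# Synthetic stand-in for one subject's 250 trials: each trial has a model
# visibility score in [0, 1] and a binary outcome (1 = target identified
# correctly). Outcomes are generated so that higher scores tend to yield
# correct identification, mimicking the hypothesized relationship.
random.seed(0)
scores = [random.uniform(0.0, 1.0) for _ in range(250)]
correct = [1 if random.random() < 1 / (1 + math.exp(-(4 * s - 2))) else 0
           for s in scores]

def fit_logistic(x, y, lr=0.5, epochs=5000):
    """Fit P(correct) = sigmoid(b0 + b1 * score) by gradient ascent
    on the log-likelihood (a minimal substitute for a stats package)."""
    b0, b1 = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p          # gradient w.r.t. intercept
            g1 += (yi - p) * xi   # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic(scores, correct)
# A positive fitted slope (b1 > 0) indicates that higher visibility
# scores predict a higher probability of correct identification.
print(f"intercept={b0:.2f} slope={b1:.2f}")
```

In the study itself, a significant positive slope for a subject is the evidence that the model's scores predict that subject's identification accuracy; a standard package would additionally report a p-value for the slope.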
Original language: English (US)
State: Published - November 2021
Bibliographical note: © 2021 Liu et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
PubMed MeSH publication types:
- Journal Article
- Research Support, N.I.H., Extramural