A Sign Similarity Approach to an Information Retrieval Inspired Visual Dictionary for Sign Language Learners
In this paper, we propose that visual dictionaries for sign language learners should take visual similarity into account in a manner inspired by the design and evaluation of information retrieval systems. Sign language learners would benefit from a visual dictionary that allows them to search for the translation of a sign by using a webcam to capture themselves executing it. As computer vision technology develops towards that goal, we point out that systems returning only exact matches do not necessarily serve learners well. Rather, such systems should take sign similarity into account: they should be robust to less-than-perfect sign execution and should surface signs that are visually similar but different in meaning. The contribution of this paper is a sign similarity measure designed on the basis of interviews with signers and sign language learners. We also present two sets of experiments that demonstrate how the measure can be used both to evaluate a visual dictionary and as an objective function for sign ranking.