Deaf students face a persistent split of visual attention between the signer and instructional materials. Although virtual reality (VR) is often promoted as an educational solution, it typically reinforces hearing norms (e.g., overlaying captions or interpreter boxes onto recreations of hearing classrooms). Our work foregrounds Deaf leadership and reclaims design authority in VR: in a mixed-hearing team led by Deaf scholars, we designed and evaluated a VR classroom prototype featuring three signer-placement modes (corner, parallel, and transparent). Twelve Deaf participants explored the prototype during a 15-minute lecture and then took part in qualitative semi-structured interviews. Participants reported a reduced attention split and improved visibility, and suggested that VR may support flexibility and comprehension in Deaf learning. From these reflections, we introduce a five-dimension conceptual framework---proximity, customizability, visual efficiency, cultural fit, and task flexibility---that organizes how Deaf signers evaluate signer placements. This work moves Deaf Tech theory into practice, opening pathways for future Deaf-centered, culturally grounded HCI.
ACM CHI Conference on Human Factors in Computing Systems