A New Atlas article about Cornell University’s experimental eyewear, which can read wearers’ silent speech commands, mentions the EarCommand system developed in the lab of Zhanpeng Jin, associate professor in the Department of Computer Science and Engineering.
New Atlas reported on technology being developed in the lab of Zhanpeng Jin, associate professor of computer science and engineering, that will allow people to give spoken commands to their smartphones silently.
Futurity reported that a team led by Zhanpeng Jin, associate professor in the Department of Computer Science and Engineering, used Bluetooth earbuds and a deep learning system to diagnose three common ear conditions with a simple, noninvasive audio test that sends a sonar-like chirp into the ear to map its structure.
Auto-flaggers for work zones, contactless fingerprint scanners, and a tool to defend against fake media were just a few of the inventive student research projects presented at the inaugural Russell Agrusa CSE Student Innovation Competition.
Element 14 reports that UB engineers led by Zhanpeng Jin developed SonicASL, a system that uses modified noise-canceling headphones paired with a smartphone to detect and translate American Sign Language.
Technology.org reported on new noise-canceling headphones developed by Zhanpeng Jin that, when paired with a smartphone, can “see” and translate American Sign Language (ASL).