A team of MIT researchers has trained a neural network to recognize what action a person is performing on a piece of carpet.
The "intelligent carpet" is built from roughly 9,000 sensors, a specially designed pressure-sensitive film, and conductive threads. The researchers deliberately left cameras out of the final design, citing growing concerns about cameras invading privacy. To train the system, however, the team recorded a visual feed synchronized with the carpet's electrical signals, and used this paired data to teach a neural network to identify human actions performed on the carpet.
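The training pipeline described above can be sketched in miniature. The snippet below is a heavily simplified illustration, not the paper's method: the grid size, the two example actions, the synthetic "footprint" frames, and the nearest-centroid classifier standing in for the actual deep neural network are all assumptions made for demonstration. In the real project, labels came from the synchronized camera feed used only during training.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID = (96, 96)  # ~9,200 taxels, roughly the ~9,000 sensors reported (assumed layout)


def synth_frame(action):
    """Generate a synthetic pressure frame: 'stand' leaves two small
    footprints; 'pushup' leaves hand and foot contacts (illustrative only)."""
    frame = np.zeros(GRID)
    if action == "stand":
        spots = [(45, 40), (45, 56)]
    else:  # "pushup"
        spots = [(20, 40), (20, 56), (75, 40), (75, 56)]
    rr, cc = np.meshgrid(np.arange(GRID[0]), np.arange(GRID[1]), indexing="ij")
    for r, c in spots:
        frame += np.exp(-((rr - r) ** 2 + (cc - c) ** 2) / 30.0)
    return frame + rng.normal(0, 0.05, GRID)


# Build a small labeled dataset of flattened pressure maps.
actions = ["stand", "pushup"]
X = np.stack([synth_frame(a).ravel() for a in actions for _ in range(20)])
y = np.array([i for i in range(len(actions)) for _ in range(20)])

# Nearest-centroid classifier: a toy stand-in for the team's neural network.
centroids = np.stack([X[y == k].mean(axis=0) for k in range(len(actions))])


def predict(frame):
    """Classify a pressure frame by its nearest class centroid."""
    dists = ((centroids - frame.ravel()) ** 2).sum(axis=1)
    return actions[int(dists.argmin())]


print(predict(synth_frame("pushup")))
```

Even this toy version shows why the camera is only needed at training time: once the mapping from pressure patterns to actions is learned, classification runs on the tactile signal alone.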
Yunzhu Li, a Ph.D. student and co-author of the paper, suggested that the carpet could be used for "workout purposes": users could perform exercises on the carpet and then receive feedback such as calories burned and reps completed. The carpet does have limitations, as most of the pressure information it captures comes from the lower body. Going forward, the researchers aim to accurately detect two people using the mat at once, and to estimate users' height and weight.