In the anime series Naruto, ninjas use 12 basic “hand seals,” or simple gestures named after the 12 animals in the Chinese Zodiac, to fight their enemies and do ninja stuff.
In the show, the characters fly through sequences of seals at blinding speed. So if you're trying to learn the sequences, as plenty of the fans you've probably seen wearing Naruto headbands in public do, a tool that tests your accuracy could help.
YouTuber AngryCoder designed an algorithm that detects and recognizes Naruto hand seals in real time, as they're performed in front of a computer's camera.
According to the project video, the idea came from wanting to learn more about machine learning, but not being able to decide on a worthwhile project to practice with.
“Of course I didn’t want it to be analytical or statistics related… math is pretty boring to watch,” they said in the video.
Every frame of the hand gesture is translated into a set of 1s and 0s, passed through a model that identifies the hand seal, and assigned a probability rating, much like an object-recognition algorithm that decides whether an image more closely matches a turtle or a gun.
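To make that pipeline concrete, here is a minimal sketch of the idea described above: a frame is thresholded into 1s and 0s, flattened, and scored by a softmax classifier over the 12 seals. This is not AngryCoder's actual code; the seal list, the linear model, and the placeholder weights are all assumptions for illustration (a real version would use a trained network on labeled frames).

```python
import numpy as np

# The 12 Zodiac-animal hand seals (names per the series).
SEALS = ["Rat", "Ox", "Tiger", "Hare", "Dragon", "Snake",
         "Horse", "Ram", "Monkey", "Bird", "Dog", "Boar"]

def binarize(frame, threshold=128):
    """Turn a grayscale frame into the flat 1s-and-0s vector the article describes."""
    return (frame > threshold).astype(np.float32).ravel()

def classify(binary_vec, weights, bias):
    """Score the binary vector and return a probability rating per hand seal."""
    logits = binary_vec @ weights + bias
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    probs = exp / exp.sum()
    return dict(zip(SEALS, probs))

# Demo with random placeholder weights (hypothetical, untrained).
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(32, 32))           # stand-in camera frame
weights = rng.normal(scale=0.01, size=(32 * 32, 12))  # placeholder model
bias = np.zeros(12)

probs = classify(binarize(frame), weights, bias)
best = max(probs, key=probs.get)
print(f"best guess: {best} ({probs[best]:.1%})")
```

With untrained weights the "best guess" is essentially random; the point is only the shape of the pipeline: binary vector in, one probability per seal out.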
This may be an incredibly nerdy use of machine learning, but it’s refreshing to see someone learning their skills on something that doesn’t involve experimenting on women’s bodies or furthering the police surveillance state.