Marking another step forward in the march of algorithmic & embedded governance, researchers at the University of Colorado have developed a process for effectively detecting & deleting “objectionable” (i.e. nude) content in videos. By running facial detection, determining whether the upper body is in view, and looking for signature motions of skin & hands, the suite of algorithms can determine a statistical likelihood – a “misbehaving probability” – of naughty behavior in the stream. After a quick analysis, the offending video stream can be automatically killed.
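The fusion step described above might look something like the following minimal sketch: several per-frame detector scores are combined into a single “misbehaving probability,” and the stream is cut when it crosses a threshold. The detector names, weights, and threshold here are illustrative assumptions, not the published SafeVchat parameters.

```python
# Hypothetical sketch of detector fusion into a "misbehaving probability".
# Weights and the 0.8 cutoff are assumptions for illustration only.

def misbehaving_probability(face_score, upper_body_score, skin_motion_score):
    """Weighted fusion of normalized detector scores, each in [0, 1].

    A clearly visible face and clothed upper body lower suspicion;
    signature skin/hand motion raises it.
    """
    weights = {"face": 0.2, "upper_body": 0.3, "skin_motion": 0.5}
    return (weights["face"] * (1.0 - face_score)
            + weights["upper_body"] * (1.0 - upper_body_score)
            + weights["skin_motion"] * skin_motion_score)

def should_kill_stream(scores, threshold=0.8):
    """Return True when the fused probability exceeds the cutoff."""
    return misbehaving_probability(*scores) > threshold
```

In a real system each score would come from a vision model run on sampled frames; the appeal of a simple weighted fusion is that it is cheap enough to evaluate within the sub-second budget the researchers report.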

The team ran tests on Chatroulette.com, a streaming site regularly plagued by video flashers, and found their “obscene content detection system”, known as SafeVchat, outperformed existing filters. Additional work will aim to reduce the response time from its current 0.9 seconds per stream in order to scale effectively for high-traffic video sites like Chatroulette.

University of Colorado

[via New Scientist]

Image by mandiberg