ISPs, social networks and web publishers need to take more responsibility for content uploaded to their websites, according to a UK company that has developed in-stream image recognition software to filter out unsuitable photos.
WeSee said its piFilter service examines images as they are uploaded from phones or computers, filtering them in real time using pattern recognition and artificial intelligence to spot potential child abuse images or nudity.
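WeSee has not published piFilter's API, but the in-stream approach described above — score each image at upload time, then let the platform decide what to do with it — can be sketched roughly as follows. The function names, thresholds and category labels here are illustrative assumptions, not WeSee's implementation, and the classifier is a stand-in stub rather than a real pattern-recognition model.

```python
# Hypothetical sketch of an in-stream upload filter. The classifier,
# thresholds and category names are illustrative, not WeSee's API.

ALLOW, REVIEW, BLOCK = "allow", "review", "block"

def classify(image_bytes):
    """Stand-in for a pattern-recognition model: returns a score in
    [0, 1] estimating how likely the image is unsuitable content.
    A real system would run a trained model here; this stub derives
    a score from the payload size so the example is runnable."""
    return min(len(image_bytes) % 100 / 100.0, 1.0)

def filter_upload(image_bytes, block_threshold=0.9, review_threshold=0.5):
    """Decide what happens to an upload before it is published.
    The platform sets its own policy via the thresholds, matching
    the article's point that the tool does not act as judge and jury."""
    score = classify(image_bytes)
    if score >= block_threshold:
        return BLOCK
    if score >= review_threshold:
        return REVIEW  # queue for human moderation
    return ALLOW
```

The key design point the article describes is that the filter runs inline at upload, before publication, rather than relying on after-the-fact user reports.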
The company said current systems relied too heavily on tags, which can be left blank, on consumers reporting indecent images, and on manual moderation.
"Facebook has more than 50 million pictures uploaded every week - there is no way it can review them all," Adrian Moxley, WeSee co-founder, told PC Pro. "This is about social and corporate responsibility. If you create a platform you need a system of controlling it."
The company said it was launching the service in the wake of warnings last year from the Child Exploitation and Online Protection Centre (CEOP) that children were increasingly posting explicit "self-taken" images online.
Despite playing the child protection card, the company stressed it would not be acting as judge and jury over what content might be blocked at source.
"It is up to the publisher, ISP or Facebook what their approach might be; this is a tool to help them control their environment," said Moxley. "We can help operators and social media sites control what users are sharing."
Costs for the service would vary depending on the scale of the corporate customer, but WeSee said it had been in talks with BT, with the service for a company of the telco's size costing 0.03 cents per thousand images scanned.
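Taken at face value, the quoted rate puts a number on Moxley's Facebook example above: at 0.03 cents per thousand images, the roughly 50 million weekly uploads he cites would cost on the order of $15 a week to scan. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope cost at the quoted rate of 0.03 cents
# per thousand images scanned.
rate_cents_per_thousand = 0.03
images_per_week = 50_000_000  # weekly upload figure quoted in the article

cost_cents = images_per_week / 1000 * rate_cents_per_thousand
print(f"${cost_cents / 100:.2f} per week")  # prints "$15.00 per week"
```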