App that ‘undresses’ people in photos with machine learning has been taken offline


DeepNude, an application that uses machine learning to undress people, mainly women, in photos, has been taken offline. The makers say they did not expect the app’s popularity and considered the risk of abuse too great.

‘The world is not ready for DeepNude’, the makers say in a tweet, though earlier that day they had written that they were working to fix technical issues caused by the app’s many visits and purchases. The makers do not address the reactions to the app. On social media, the response to the application and its withdrawal was mixed: some people were enthusiastic about the app’s possibilities, while others labeled the idea morally reprehensible. According to The Register, the app “doesn’t work well” on men.

The desktop version of the app, available for Linux and Windows 10, was online for only five days; the browser version, however, had been active since March 28. The free version of the application worked, but covered the produced photos with prominent watermarks. The paid version, which cost $50, applied a smaller watermark.

Although the project has been discontinued, activated paid copies and downloaded free copies will continue to work. These can be put back online by others, the free version can be cracked, and someone else could publish a clone of the application on the web, which means it is effectively already too late to stop DeepNude, although the makers no longer earn anything from it. DeepNude was developed by a team in Estonia.

Using machine learning to fake photos and videos can yield convincing results. The so-called DeepFakes app can replace faces in photos and videos with another person’s face. Immediately after its release, people used the application to paste the faces of celebrities onto those of porn actresses. Such images are banned on Twitter. The US Department of Defense has developed tools to identify these fakes.
