Pirate4x4.Com : 4x4 and Off-Road Forum


nahmus 06-27-2019 01:00 PM

First there was deepfakes, now there's DeepNude
 
This Horrifying App Undresses a Photo of Any Woman With a Single Click
https://www.vice.com/en_us/article/k...s-of-any-woman

While as a nerd I am fascinated by the technology, it is a little scary. Soon we'll have to go back to film and Polaroids if we ever want to trust anything again. Personally, if I had created this, I don't think I would have released it. There are plenty of women who want you to see them nude. There should be some privacy left in the digital age.


Quote:

A programmer created an application that uses neural networks to remove clothing from the images of women, making them look realistically nude.

The software, called DeepNude, uses a photo of a clothed person and creates a new, naked image of that same person. It swaps clothes for naked breasts and a vulva, and only works on images of women. When Motherboard tried using an image of a man, it replaced his pants with a vulva. While DeepNude works with varying levels of success on images of fully clothed women, it appears to work best on images where the person is already showing a lot of skin. We tested the app on dozens of photos and got the most convincing results on high resolution images from Sports Illustrated Swimsuit issues.

Since Motherboard discovered deepfakes in late 2017, the media and politicians have focused on the dangers they pose as a disinformation tool. But the most devastating use of deepfakes has always been in how they're used against women: whether experimenting with the technology on images of women without their consent, or maliciously spreading nonconsensual porn on the internet. DeepNude is an evolution of that technology that is easier to use and faster than creating deepfakes. DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies.

"This is absolutely terrifying," Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. "Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."

This is an “invasion of sexual privacy,” Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, told Motherboard.

“Yes, it isn’t your actual vagina, but... others think that they are seeing you naked,” she said. “As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore.”

DeepNude launched on June 23 as a website that shows a sample of how the software works, along with downloadable Windows and Linux applications.

Motherboard downloaded the application and tested it on a Windows machine. It installed and launched like any other Windows application and didn't require technical expertise to use. In the free version of the app, the output images are partially covered with a large watermark. In a paid version, which costs $50, the watermark is removed, but a stamp that says "FAKE" is placed in the upper-left corner. (Cropping out the "fake" stamp or removing it with Photoshop would be very easy.)
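
The watermark and "FAKE" stamp described above are simple image overlays. As a rough illustration (not DeepNude's actual code; the filenames and coordinates are placeholders), stamping a corner label with Python's Pillow library takes only a few lines, which also shows why the article considers the mark easy to defeat: it sits in a fixed corner and touches nothing else in the image.

Code:

# Illustrative corner stamp with Pillow; filenames and coordinates are placeholders.
from PIL import Image, ImageDraw

img = Image.open("input.jpg").convert("RGB")
draw = ImageDraw.Draw(img)
draw.rectangle([(10, 10), (120, 40)], fill=(255, 0, 0))  # solid stamp background
draw.text((30, 18), "FAKE", fill=(255, 255, 255))        # label in the upper-left corner
img.save("stamped.jpg")
# Because the stamp occupies a fixed corner, a simple crop removes it entirely.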

Motherboard tested it on more than a dozen images of women and men in varying states of dress, from fully clothed to string bikinis, and with a variety of skin tones. The results vary dramatically, but when fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic. The algorithm accurately fills in details where clothing used to be: the angles of the breasts beneath the clothing, nipples, and shadows.

But it's not flawless. Most images, and low-resolution images especially, produced some visual artifacts. DeepNude failed entirely with some photographs that used weird angles, lighting, or clothing that seemed to throw off the neural network it uses. When we fed it an image of the cartoon character Jessica Rabbit, it distorted and destroyed the image altogether, throwing stray nipples into a blob of a figure.

In an email, the anonymous creator of DeepNude, who requested to go by the name Alberto, told Motherboard that the software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. Pix2pix uses generative adversarial networks (GANs), which work by training an algorithm on a huge dataset of images—in the case of DeepNude, more than 10,000 nude photos of women, the programmer said—and then trying to improve against itself. This algorithm is similar to what's used in deepfake videos, and what self-driving cars use to "imagine" road scenarios.
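
For the technically curious, the scheme described there, a generator learning image-to-image translation against a discriminator, can be sketched in a few dozen lines. Below is a minimal conditional-GAN training step in PyTorch; it is not DeepNude's code, the tiny networks merely stand in for pix2pix's actual U-Net generator and PatchGAN discriminator, and random tensors stand in for a real paired dataset.

Code:

# Minimal pix2pix-style conditional GAN training step (illustrative sketch only).
import torch
import torch.nn as nn

gen = nn.Sequential(                     # stand-in generator: image -> image
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
)
disc = nn.Sequential(                    # stand-in discriminator: (input, output) pair -> score map
    nn.Conv2d(6, 16, 3, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(16, 1, 3, padding=1),
)
adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()
g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4)

def train_step(src, tgt, lambda_l1=100.0):
    """One adversarial step on a batch of paired images (src -> tgt)."""
    fake = gen(src)

    # Discriminator: score real pairs as 1, generated pairs as 0.
    d_opt.zero_grad()
    real_score = disc(torch.cat([src, tgt], dim=1))
    fake_score = disc(torch.cat([src, fake.detach()], dim=1))
    d_loss = (adv_loss(real_score, torch.ones_like(real_score))
              + adv_loss(fake_score, torch.zeros_like(fake_score)))
    d_loss.backward()
    d_opt.step()

    # Generator: fool the discriminator while staying close to the target (L1 term).
    g_opt.zero_grad()
    fake_score = disc(torch.cat([src, fake], dim=1))
    g_loss = (adv_loss(fake_score, torch.ones_like(fake_score))
              + lambda_l1 * l1_loss(fake, tgt))
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()

# Smoke test with random tensors standing in for a real paired dataset.
print(train_step(torch.randn(2, 3, 64, 64), torch.randn(2, 3, 64, 64)))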

The algorithm only works with women, Alberto said, because images of nude women are easier to find online—but he's hoping to create a male version, too.

"The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it," he said. "All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future."

Deepfake videos, by comparison, take hours or days to render a believable face swap. Even for a skilled editor, manually using Photoshop to realistically change a clothed portrait into a nude one would take several minutes.
Why DeepNude was created

Alberto said he was inspired to create DeepNude by ads for gadgets like X-Ray glasses that he saw while browsing magazines from the 1960s and 70s, which he had access to during his childhood. The logo for DeepNude, a man wearing spiral glasses, is an homage to those ads.

"Like everyone, I was fascinated by the idea that they could really exist and this memory remained," he said. "About two years ago I discovered the potential of AI and started studying the basics. When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one. Eureka. I realized that x-ray glasses are possible! Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results."

Alberto said he continued to experiment out of "fun" and curiosity.

"I'm not a voyeur, I'm a technology enthusiast,” he said. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That's why I created DeepNude."

Unprompted, he said he's always asked himself whether the program should have ever been made: "Is this right? Can it hurt someone?" he asked.

"I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial)," he said, noting that DeepNude doesn't transmit images itself, only creates them and allows the user to do what they will with the results.

"I also said to myself: the technology is ready (within everyone's reach)," he said. "So if someone has bad intentions, having DeepNude doesn't change much... If I don't do it, someone else will do it in a year."
Better, and much worse, than deepfakes

In the year and a half since Motherboard discovered deepfakes on Reddit, the machine learning technology it employs has moved at breakneck speed. Algorithmic face-swaps have gone from requiring hundreds of images and days of processing time in late 2017, to requiring only a handful of images, or even just text inputs, and a few hours of time, in recent months.


Motherboard showed the DeepNude application to Hany Farid, a computer-science professor at UC Berkeley who has become a widely cited expert on the digital forensics of deepfakes. Farid was shocked at this development and at the ease with which it can be done.

"We are going to have to get better at detecting deepfakes, and academics and researchers are going to have to think more critically about how to better safeguard their technological advances so that they do not get weaponized and used in unintended and harmful ways," Farid said. "In addition, social media platforms are going to have to think more carefully about how to define and enforce rules surrounding this content. And, our legislators are going to have to think about how to thoughtfully regulate in this space."

Deepfakes have become a widespread, international phenomenon, but platform moderation and legislation have so far failed to keep up with this fast-moving technology. In the meantime, women are victimized by deepfakes and left behind in favor of a more US-centric political narrative. Though deepfakes have been weaponized most often against unconsenting women, most headlines and political fear have focused on their fake-news potential.

Even bills like the DEEPFAKES Accountability Act, introduced earlier this month, aren't enough to stop this technology from hurting real people.

"It’s a real bind—deepfakes defy most state revenge porn laws because it’s not the victim’s own nudity depicted, but also our federal laws protect the companies and social media platforms where it proliferates," attorney Carrie Goldberg, whose law firm specializes in revenge porn, told Motherboard. "It’s incumbent on the public to avoid consumption of what we call at my office humili-porn. Whether it’s revenge porn or deepfakes, don’t click or link or share or like! That’s how these sites make money. People need to stop letting their Id drive internet use and use the internet ethically and conscientiously."

DeepNude is easier to use and more easily accessible than deepfakes have ever been. Whereas deepfakes require a lot of technical expertise, huge datasets, and access to expensive graphics cards, DeepNude is a consumer-facing app that is easier to install than most video games and can produce a believable nude in 30 seconds with the click of a single button.

Emanuel Maiberg contributed reporting to this article.

Editor's note, June 27 1:05 p.m. EST: This story originally included five side-by-side images of various celebrities and DeepNude-manipulated images of those celebrities. While the images were redacted to not show explicit nudity, after hearing from our readers, academic experts, and colleagues, we realized that those images could do harm to the real people in them. We think it's important to show the real consequences that new technologies unleashed on the world without warning have on people, but we also have to make sure that our reporting minimizes harm. For that reason, we have removed the images from the story, and regret the error.

thefishguy77 06-27-2019 02:32 PM

Link to app download?

For science...



Norm 06-27-2019 02:35 PM

Quote:

Originally Posted by thefishguy77 (Post 44500554)
Link to app download?

For science...



https://cdn.shopify.com/s/files/1/05...sses_large.jpg

nahmus 06-27-2019 02:38 PM

Quote:

Originally Posted by thefishguy77 (Post 44500554)
Link to app download?

For science...



It's in the linked article.

PROJECTJUNKIE 06-27-2019 02:54 PM

I love this shit. The average human has been too easy to manipulate into believing anything they hear or see. Stuff like this will bring skepticism back into the mainstream.
"Yeah, I see that evidence, but what else do you have?"

Harry Johnson 06-27-2019 02:55 PM

I've been beta testing this software for a while now.

Here's it working in one of the early revisions:

http://images6.memedroid.com/images/...502ac4526.jpeg

CDA 455 06-27-2019 02:59 PM

:laughing:


$50 for an app to cut and paste tits and ass onto pics.

Photoshop isn't good enough anymore?

thefishguy77 06-27-2019 03:36 PM

Quote:

Originally Posted by Harry Johnson (Post 44500588)
I've been beta testing this software for a while now.

Here's it working in one of the early revisions:

http://images6.memedroid.com/images/...502ac4526.jpeg

That's funny

DirtyComanche 06-27-2019 03:38 PM

I want evidence that it's actually that good. Side by side pictures of what it thought the snatch flappers would look like, versus what they actually do.

arickvan 06-27-2019 03:50 PM

Quote:

Originally Posted by PROJECTJUNKIE (Post 44500586)
I love this shit. The average human has been too easy to manipulate into believing anything they hear or see. Stuff like this will bring skepticism back into the mainstream.
"Yeah, I see that evidence, but what else do you have?"

I bet that's what you want...

ALEX JONES!!

EverNoob 06-27-2019 03:56 PM

This is new?

I've been beating it to photos of Jennifer Lawrence at 18 for at least a decade now. How old is she anyway, like 38?

https://imgur.com/0wxnMwj.jpg

Is there some type of disclaimer I need to post with this? I didn't post the marked-up photo thingie.

arickvan 06-27-2019 04:02 PM

Update today at 3 PM EST: the creator has taken down the app.

Guess we'll never know...

DirtyComanche 06-27-2019 04:09 PM

Quote:

Originally Posted by arickvan (Post 44500662)
Update today at 3 PM EST: the creator has taken down the app.

Guess we'll never know...

Somebody must have sent him pics of his mother.

Roc Doc 06-27-2019 05:41 PM

Quote:

Originally Posted by DirtyComanche (Post 44500670)
Somebody must have sent him pics of his mother.

:laughing:

arickvan 06-27-2019 06:05 PM

Quote:

Originally Posted by DirtyComanche (Post 44500670)
Quote:

Originally Posted by arickvan (Post 44500662)
Update today at 3 PM EST: the creator has taken down the app.

Guess we'll never know...

Somebody must have sent him pics of his mother.

😆

Just think of the poor programmer who had to look at hundreds of nekkid old ladies to create a base model.

nahmus 06-27-2019 06:19 PM

Wonder if the people who bought it can still use it?

DirtyComanche 06-27-2019 06:34 PM

Quote:

Originally Posted by nahmus (Post 44500746)
Wonder if the people who bought it can still use it?

Better be some refunds if they can't.

underbelly 06-27-2019 06:53 PM

Does anybody have the download link for Android?

They removed the download options from their site. If somebody has it, please help a brother out.

DEER TICK 06-27-2019 06:57 PM

Quote:

Originally Posted by EverNoob (Post 44500658)
This is new?

I've been beating it to photos of Jennifer Lawrence at 18 for at least a decade now. How old is she anyway, like 38?

https://imgur.com/0wxnMwj.jpg

Is there some type of disclaimer I need to post with this? I didn't post the marked-up photo thingie.



35%, there bud

DirtyComanche 06-27-2019 07:06 PM

Quote:

Originally Posted by underbelly (Post 44500790)
They removed the download options from their site. If somebody has it, please help a brother out.

Did you register here just to ask for it?

We're a forum full of perverts, sure, but not rich perverts. So no, nobody paid the $50 for it. :flipoff2:

gladman 06-27-2019 07:23 PM

Quote:

Originally Posted by DirtyComanche (Post 44500772)
Better be some refunds if they can't.

Not exactly refunds, just pictures of what looks like their money.

Roc Doc 06-27-2019 07:46 PM

Quote:

Originally Posted by DEER TICK (Post 44500792)
35%, there bud

It's a link, ya prude. What did you think it was, recipes for pancakes?

thefishguy77 06-27-2019 07:59 PM

It shows up for me without the link. Did everboob just join the board?



45acp 06-27-2019 08:05 PM

How does the software predict vajayjayness?

I been with chicks with fat full lips that had knife wounds down there... and chicks with no lips at all that had Arbys in a leg lock with 7 different sets of labias. :confused:

3nuts 06-27-2019 08:53 PM

Search for it on GitHub, perhaps?


