Aviv Ovadya and the Coming “Infocalypse”

In a far-ranging, frightening, and fascinating interview, Buzzfeed News catches up with engineer and tech prognosticator Aviv Ovadya, who anticipated the current scourge of “fake news” and says we haven’t seen anything yet.


“He Predicted The 2016 Fake News Crisis. Now He’s Worried About An Information Apocalypse.”
By Charlie Warzel
Buzzfeed
February 11, 2018

In mid-2016, Aviv Ovadya realized there was something fundamentally wrong with the internet — so wrong that he abandoned his work and sounded an alarm. A few weeks before the 2016 election, he presented his concerns to technologists in the San Francisco Bay Area and warned of an impending crisis of misinformation in a presentation he titled “Infocalypse.”

The web and the information ecosystem that had developed around it were wildly unhealthy, Ovadya argued. The incentives that governed its biggest platforms were calibrated to reward information that was often misleading, polarizing, or both. Platforms like Facebook, Twitter, and Google prioritized clicks, shares, ads, and money over quality of information, and Ovadya couldn’t shake the feeling that it was all building toward something bad — a kind of critical threshold of addictive and toxic misinformation. The presentation was largely ignored by employees from the Big Tech platforms — including a few from Facebook who would later go on to drive the company’s News Feed integrity effort.

“At the time, it felt like we were in a car careening out of control and it wasn’t just that everyone was saying, ‘We’ll be fine’ — it’s that they didn’t even see the car,” he said.

Ovadya saw early what many — including lawmakers, journalists, and Big Tech CEOs — wouldn’t grasp until months later: Our platformed and algorithmically optimized world is vulnerable — to propaganda, to misinformation, to dark targeted advertising from foreign governments — so much so that it threatens to undermine a cornerstone of human discourse: the credibility of fact.

But it’s what he sees coming next that will really scare the shit out of you. Read more.

Terror Pranking, a Brief History

There’s a huge difference between socially revealing satirical commentary and scaring the shit out of everyone…


Inside the world of extreme ‘terror pranking’
BBC
February 11, 2018

Fake bombs, staged murders, stunts that resemble acid attacks – as competition for eyeballs on YouTube gets fiercer every day, popular vloggers are resorting to extreme pranks to get clicks.

Arya Mosallah’s video channel had more than 650,000 subscribers. But his YouTube career came skidding to a halt with a video titled “Throwing Water On Peoples Faces PT. 2”. In it, he approaches several people, and after a brief conversation, throws a cup of water in their faces.

Many viewers thought the prank in the video looked like an attempt by the British social media star to mimic an acid attack – amid a recent increase in such crimes in London and across the UK.

YouTube deleted Mosallah’s channel – and then a second channel he set up. He told the BBC he had not meant to reference acid attacks – but that he would continue to produce prank videos.

But Arya Mosallah is certainly not the first YouTuber to get into trouble for prank videos. His story, along with the controversy over hugely popular YouTuber Logan Paul joking about a suicide victim to his young audience, has put a spotlight on extreme content on YouTube.

But although it appears to be on the rise – and is getting more attention from news outlets – extreme pranking is not an entirely new phenomenon. For some time, vloggers have been faking bomb attacks and murders, tricking and frightening friends and members of the public in an attempt to up their view counts. Read the rest of this article here.

Speaking of Artificial Intelligence…

Friendly foes seek world peace at the Olympics but are shown the door instead.


‘Trump’ And ‘Kim’ Thrown Out Of Opening Ceremony
Reuters
February 10, 2018

“We wanted to surprise everyone and bring world peace and then we’re being escorted out by security guards, which I think is really unfair.”

PYEONGCHANG-GUN, SOUTH KOREA – FEBRUARY 09: Impersonators of Donald Trump and Kim Jong Un pose during the Opening Ceremony of the PyeongChang 2018 Winter Olympic Games at PyeongChang Olympic Stadium on February 9, 2018 in Pyeongchang-gun, South Korea. (Photo by Ryan Pierse/Getty Images)

PYEONGCHANG, South Korea (Reuters) – People dressed up as U.S. President Donald Trump and North Korean leader Kim Jong Un caused a commotion when they appeared in the stands at the Olympic Games Opening Ceremony on Friday before swiftly being shown out by security staff.

“We’re getting along great,” said the Kim lookalike, who declined to give his name.

“We wanted to surprise everyone and bring world peace and then we’re being escorted out by security guards, which I think is really unfair,” he added. “Doesn’t everyone want peace?”

The ceremony was marked by a show of unity between North and South Korea.

Athletes from both countries entered the arena under a flag depicting a unified Korea as American Vice President Mike Pence and Kim’s sister, Kim Yo Jong, looked on from a VIP box.

“Is my sister getting the same treatment?” the Kim lookalike said as he was escorted out of the stadium.

Deepfake: AI-Assisted Porn

Hey! What’s my face doing on a porn star’s body?


Everything You Need To Know About The Face-Swap Technology That’s Sweeping The Internet (And Getting Banned Everywhere)
Digg
February 8, 2018

Gal Gadot’s face on someone else’s body. Image: Screenshot from SendVids

In the past couple of months, “deepfake” has gone from a nonsense word to a widely used synonym for videos in which one person’s face is digitally grafted onto another person’s body. The most popular – and troubling – type of deepfake is artificially produced porn appearing to star famous actresses like Gal Gadot, Daisy Ridley and Scarlett Johansson. Sites like Reddit and Pornhub have made moves to ban pornographic deepfakes in recent days, but it’s never been easier for anyone with an internet connection to make disturbingly real-looking porn by mapping almost anyone’s face over those of porn performers. Here’s what you need to know.

‘Deepfake’ Celebrity Porn First Emerged In December

In an only somewhat hyperbolically titled article called “AI-Assisted Fake Porn Is Here and We’re All Fucked,” Motherboard’s Samantha Cole interviewed the first Redditor to post convincing face-swapped videos, who called himself “deepfakes.” (“Deepfake” has since become the term used for doctored videos produced by the technology.) “Deepfakes” explained how he created a porn video appearing to star Gal Gadot. Read the rest here.