
The Inside Story of How Signal Became the Private Messaging App for an Age of Fear and Distrust

Signal has become a tool for everyone from whistleblowers to Black Lives Matter protesters. As its profile grows, can the app maintain its commitment to privacy?

Ama Russell and Evamelo Oleita had never been to a protest before June. But as demonstrations against systemic racism and police brutality began to spread across the U.S. earlier this year, the two 17-year-olds from Michigan, both of whom are Black, were inspired to organize one of their own.

Seeking practical help, Oleita reached out to Michigan Liberation, a local civil rights group. The activist who replied told her to download the messaging app Signal. “They were saying that to be safe, they were using Signal now,” Oleita tells TIME. It turned out to be useful advice. “I think Signal became the most important tool for protesting for us,” she says.

Within a month, Oleita and Russell had arranged a nonviolent overnight occupation at a detention center on the outskirts of Detroit, in protest of a judge’s decision to put a 15-year-old Black schoolgirl in juvenile detention for failing to complete her schoolwork while on probation. The pair used Signal to discuss tactics and to communicate with the teams marshaling protesters and liaising with police.

“I don’t think anything we say is incriminating, but we definitely don’t trust the authorities,” says Russell. “We don’t want them to know where we are, so they can’t stop us at any point. On Signal, being able to communicate efficiently, and knowing that nothing is being tracked, definitely makes me feel very secure.”

Signal is an end-to-end encrypted messaging service, similar to WhatsApp or iMessage, but owned and operated by a non-profit foundation rather than a corporation, and with more wide-ranging security protections. One of the first things you see when you visit its website is a 2015 quote from the NSA whistleblower Edward Snowden: “I use Signal every day.” Now, it’s clear that increasing numbers of ordinary people are using it too.

“Any time there is some form of unrest or a contentious election, there seems to be an opportunity for us to build our audience,” says Brian Acton, the Signal Foundation’s co-founder and executive chairman, in an interview with TIME. “It’s a little bit bittersweet, because a lot of times our spikes come from bad events. It’s like, woohoo, we’re doing great — but the world’s on fire.”

Indeed, just as protests against systemic racism and police brutality intensified this year, downloads of Signal surged. They rose by 50% in the U.S. between March and August compared with the prior six months, according to data shared with TIME by the app analytics firm App Annie, which tracks information from the Apple and Google app stores. In Hong Kong they rose by 1,000% over the same period, coinciding with Beijing’s imposition of a controversial national security law. (The Signal Foundation, the non-profit that runs the app, doesn’t share official download numbers for what it says are privacy reasons.)

“We’re seeing a lot more people attending their first actions or protests this year—and one of the first things I tell them to do is download Signal,” says Jacky Brooks, a Chicago-based activist who leads security and safety for Kairos, a group that trains people of color to use digital tools to organize for social change. “Signal and other end-to-end encryption technology have become vital tools in protecting organizers and activists.”

Read more: Young Activists Drive Peaceful Protests Across the U.S.

In June, Signal took its most explicitly activist stance yet, rolling out a new feature allowing users to blur people’s faces in photos of crowds. Days later, in a blog post titled “Encrypt your face,” the Signal Foundation announced it would begin distributing face masks to protesters, “to help support everyone self-organizing for change in the streets.” Asked if the chaos of 2020 has pushed Signal to become a more outwardly activist organization, Acton pauses. “I don’t know if I would say more,” he says. “I would say that right now it’s just congruent. It’s a continuation of our ongoing mission to protect privacy.”

Brian Acton speaks at the WIRED25 Summit on November 8, 2019, in San Francisco, California. Phillip Faraone/Getty Images for WIRED

What makes Signal different

Signal’s user base (somewhere in the tens of millions, according to app store data) is still a fraction of that of its main competitor, WhatsApp, which has some 2 billion users and is owned by Facebook. But among protesters, dissidents and investigative journalists, Signal is increasingly the gold standard because of how little data it keeps about its users. At their core, both apps use cryptography to ensure that the messages, images and videos they carry can be seen only by the sender and the recipient, not by governments, spies or even the designers of the app itself. But on Signal, unlike on WhatsApp, metadata is encrypted too, meaning that even authorities with a warrant cannot obtain your address book, see who you’re talking to and when, or read your messages.
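To see what “end-to-end” means in practice, the underlying public-key idea can be sketched in a few lines of Python with the PyNaCl library. This is an illustration of the general concept only, not Signal’s actual protocol, and the key names and message text are invented for the example.

# A minimal sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Illustrative only: Signal's real protocol layers forward secrecy, deniability
# and metadata protections on top of this basic public-key exchange.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public halves ever leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"See you at the march at noon")

# Only Bob, holding his private key, can decrypt. A server relaying the
# ciphertext sees only random-looking bytes.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'See you at the march at noon'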

“Historically, when an investigative journalist’s source is prosecuted in retaliation for something they have printed, prosecutors will go after metadata logs and call logs about who’s been calling whom,” says Harlo Holmes, the director of newsroom digital security at the Freedom of the Press Foundation.

WhatsApp states on its website that it does not store logs of who is messaging whom “in the ordinary course of providing our service.” Yet it has the technical capacity to do so. In some cases, including when it believes doing so is necessary to keep users safe or to comply with legal process, the company says, “we may collect, use, preserve, and share user information,” including “information about how some users interact with others on our service.”

Signal, by contrast, could not hand such data to law enforcement even if it wanted to. (It’s not clear that it does: in early June, Signal’s founder and CEO Moxie Marlinspike tweeted “ACAB,” shorthand for All Cops Are Bastards, in response to allegations that police had stockpiled personal protective equipment amid the pandemic.) In 2016, a Virginia grand jury subpoenaed Signal for data about one of its users, but because the app encrypts virtually all of its metadata, the only information Signal could provide was the date and time the user downloaded the app and when they had last used it. “Signal works very, very hard in order to protect their users by limiting the amount of metadata that is available in the event of a subpoena,” Holmes says.

The approach has not won Signal fans in the Justice Department, which is supporting a new bill that would require purveyors of encrypted software to insert “backdoors” to make it possible for authorities to access people’s messages. Opponents say the bill would undermine both democracy and the very principles that make the app so secure in the first place. Ironically, Signal is commonly used by senior Trump Administration officials and those in the intelligence services, who consider it one of the most secure options available, according to reporters in TIME’s Washington bureau.

Signal’s value system aligns neatly with the belief, popular in Silicon Valley’s early days, that encryption is the sole key to individual liberty in a world where authorities will use technology to further their inevitably authoritarian goals. Known as crypto-anarchism, this philosophy emerged in the late 1980s among libertarian computer scientists and influenced the thinking of many programmers, including Marlinspike. “Crypto-anarchists thought that the one thing you can rely on to guarantee freedom is basically physics, which in the mid 1990s finally allowed you to build systems that governments couldn’t monitor and couldn’t control,” says Jamie Bartlett, the author of The People vs Tech, referring to the mathematical rules that make good encryption so secure. “They were looking at the Internet that they loved but they could see where it was going. Governments would be using it to monitor people, businesses would be using it to collect data about people. And unless they made powerful encryption available to ordinary people, this would turn into a dystopian nightmare.”

Signal’s founder Moxie Marlinspike during a TechCrunch event on September 18, 2017, in San Francisco, California. Steve Jennings/Getty Images for TechCrunch

As a young adult in the 1990s, Marlinspike, who declined to be interviewed for this story, spent his time on the fringes of society, teaching himself computer science, hacking into insecure servers, and illegally hitching rides on freight trains across the United States. A tall white man with dreadlocks, he had always distrusted authority, but Snowden’s leaks appeared to crystallize his views. In a post published on his blog in June 2013, which is no longer accessible online, Marlinspike wrote about the danger these new surveillance capabilities posed when exercised by a state that could not be trusted. “Police already abuse the immense power they have, but if everyone’s every action were being monitored … then punishment becomes purely selective,” he wrote. “Those in power will essentially have what they need to punish anyone they’d like, whenever they choose, as if there were no rules at all.” But, Marlinspike argued, this problem was not unsolvable. “It is possible to develop user-friendly technical solutions that would stymie this type of surveillance,” he wrote.

By the time he’d written that blog post, Marlinspike had already set about building such a “user-friendly technical solution.” Called the TextSecure Protocol (later the Signal Protocol), it was a recipe for strong end-to-end encryption ensuring that only the sender and recipient of a message could read its contents, and not authorities or bad actors wishing to pry. In 2010 Marlinspike launched two apps based on the protocol, one for text messaging and another for encrypted phone calls. In 2014 he merged them, and Signal was born.
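One piece of how the protocol delivers that guarantee can be seen in a stripped-down sketch. The Signal Protocol ratchets its keys: each message is encrypted with a fresh key derived from the previous one, and the old key is discarded, so even a device compromised later cannot unlock earlier conversations. The toy Python below, with invented labels and only standard-library hashing, illustrates just that ratcheting idea; the real Double Ratchet also mixes in fresh Diffie-Hellman exchanges and handles lost or out-of-order messages.

# A toy sketch of the key "ratchet" at the heart of the Signal Protocol:
# every message gets a one-time key derived from a rolling chain key, and the
# old keys are thrown away. Illustrative only; the real Double Ratchet also
# mixes in fresh Diffie-Hellman exchanges and handles out-of-order delivery.
import hashlib
import hmac

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive the next chain key and a one-time message key from the current chain key."""
    next_chain_key = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
    return next_chain_key, message_key

# A shared secret would normally come from an initial key exchange.
chain_key = b"example shared secret"
for text in [b"first message", b"second message", b"third message"]:
    chain_key, message_key = ratchet(chain_key)
    # message_key would encrypt `text` once, then be deleted.
    print(message_key.hex()[:16], text)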

The app was kept afloat thanks to nearly $3 million in funding from the Open Technology Fund, a Congress-funded nonprofit that finances projects aimed at countering censorship and surveillance. In keeping with security best practices, the Signal Protocol is open source, meaning it is publicly available for analysts around the world to audit and suggest improvements. (Signal’s other main competitor, Telegram, is not end-to-end encrypted by default, and security researchers have raised concerns about its homegrown encryption scheme and its closed-source server software.) But although secure by all accounts, Signal in 2014 was hardly user-friendly. It had a relatively small user base, made up mostly of digital-security geeks. That wasn’t the kind of influence Marlinspike wanted.

Read more: How the Trump Administration is Undermining the Open Technology Fund

So Marlinspike sought out Acton, who had co-founded WhatsApp in 2009 along with Jan Koum. The pair had since grown it into the largest messaging app in the world, and in 2014 Facebook snapped it up for a record-setting $19 billion. Marlinspike’s views on privacy aligned with theirs (Koum had grown up under the ever-present surveillance of Soviet Ukraine) and in 2016, with Facebook’s blessing, they worked to integrate the Signal Protocol into WhatsApp, encrypting billions of conversations globally. It was a huge step toward Marlinspike’s dream of an Internet that rejected, rather than enabled, surveillance. “The big win is when a billion people are using WhatsApp and don’t even know it’s encrypted,” he told Wired magazine in 2016. “I think we’ve already won the future.”

But Acton, who was by now a billionaire thanks to the buyout, would soon get into an acrimonious dispute with Facebook’s executives. When he and Koum agreed to the sale in 2014, Acton scrawled a note to Koum stipulating the ways WhatsApp would remain separate from its new parent company: “No ads! No games! No gimmicks!” Even so, while Acton was still at the company in 2016, WhatsApp introduced new terms of service that forced users, if they wanted to keep using the app, to agree that their WhatsApp data could be accessed by Facebook. It was Facebook’s first step toward monetizing the app, which at the time was barely profitable.

Acton was growing alarmed at what he saw as Facebook’s plans to add advertisements and track even more user data. In Sept. 2017, he walked away from the company, leaving behind $850 million in Facebook stock that would have vested in the coming months had he stayed. (As of September 2020, Facebook still hasn’t inserted ads into…

Billy Perrigo
