
Why You Should Stop Using Facebook Messenger


Facebook Messenger logo seen displayed on a smartphone. / Avishek Das
If you’re one of the 1.3 billion people using Facebook Messenger, then you have already been warned of significant delays to critical security enhancements. That upgrade has been described by company executives as “essential,” even as they confirm serious slippage. Now the situation has become much worse, with two critical developments.
 
“We need to find a balance of safety, privacy, and security,” Facebook says of its hyper-scale Messenger app, which is second only to WhatsApp in popularity. The issue is that Messenger has taken center stage in the fight between law enforcement and big tech over privacy, and that has left more than a billion users in a security limbo-land.

Unlike WhatsApp, your content is not fully secured on Facebook Messenger—the company itself has admitted to spying on your content, and we recently exposed it for “secretly” downloading private links and files sent between users.

Facebook says it wants to fix this—albeit Facebook says a lot of things. It promises to widen the end-to-end encryption that secures WhatsApp to protect Messenger as well as messages sent over Instagram. That promise was made two years ago—and what we have thus far is delay after delay. Messenger and Instagram have seen some integration, though nothing that enhances any user’s security or privacy.

If you’re a Facebook Messenger user, then this shift to end-to-end encryption is critical. As I’ve said multiple times before, don’t just take my word for it—Facebook says the same, admitting, in fact, that such security stops the company itself snooping on your messages. And, again, remember it has admitted to doing exactly that.

The idea of Facebook extending encryption to its billion-plus Messenger users as well as its billion-plus Instagram users has caused something of a meltdown among hawkish lawmakers and security agencies.

The latest to add their voice to this mix is Ken McCallum, the relatively new head of Britain’s MI5, who says that by widening default end-to-end encryption, Facebook would in effect be giving a “free pass” to “some of the worst people in our society,” including terrorists and those organizing child sexual abuse, “where they know that nobody can see what they’re doing.”

When WhatsApp pushed the big red button in 2016, end-to-end encrypting all messages for all users, there wasn’t the focus on encryption that there is now. It passed under the radar. Suddenly hundreds of millions (now two billion) users were able to use a secure mainstream platform, leaving lawmakers in the “dark.” If WhatsApp can’t see your content, neither can a law enforcement agency—even with a warrant.
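The “dark” effect described above can be sketched with a toy key exchange. The snippet below is a deliberately simplified Diffie-Hellman illustration in Python, showing why a relay server that only ever sees public values cannot derive the message key. It is not WhatsApp’s actual implementation, which uses the full Signal protocol with authentication and far larger parameters; the group values here are toy-sized and insecure, chosen purely for readability.

```python
# Toy Diffie-Hellman exchange: the shared secret is computed on the two
# endpoints and never travels over the wire, so the relaying server
# (here, imagine WhatsApp's infrastructure) cannot recover the key.
# NOT the Signal protocol: no authentication, insecure toy parameters.
import hashlib
import secrets

P = 0xFFFFFFFB  # public prime modulus (toy size; real groups are 2048-bit+)
G = 5           # public generator

def keypair():
    private = secrets.randbelow(P - 2) + 1  # stays on the device
    public = pow(G, private, P)             # safe to relay via the server
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The server sees only alice_pub and bob_pub; each side derives the
# same secret locally from its own private key.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# The message key the server never sees:
key = hashlib.sha256(str(alice_secret).encode()).hexdigest()
```

The point of the sketch is the asymmetry: a warrant served on the server yields only the public values, which are useless without one of the device-held private keys.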

We’ve known for some weeks now that Facebook has delayed its Messenger security upgrade until next year, “at the earliest,” and while this has been painted as a technical issue given the sprawling nature of the platform, the clever money is on it being more complex than that. Let’s see how fast Facebook works out compensatory tech to deal with Apple’s new App Tracking Transparency, by way of comparison.

If MI5’s boss sounding a warning was the first piece of news aimed at Facebook’s Menlo Park HQ, then the second was the quite extraordinary backtrack the company was forced into over WhatsApp’s controversial change of terms. The contrast between Facebook Messenger and its WhatsApp stablemate has been fascinating this year—and, in reality, sends a very clear message to those 1.3 billion Messenger users that it’s now time to jump ship.

WhatsApp used its default end-to-end encryption as its primary (some might say only) defense against the accusation that Facebook was encroaching on the privacy of its users. WhatsApp’s boss even penned an opinion piece giving all the reasons such security was “essential,” albeit under threat from lawmakers and regulators.

The irony in WhatsApp insisting end-to-end encryption is needed, while Messenger users go without, is stark enough on its own. But the opinion piece also provides multiple examples—medical, financial and so on—where users don’t want their content open to interception from shadowy big brother characters. The opinion piece suggests governments, foreign and domestic, but Facebook itself is a more likely foe.

“Surrendering our privacy would paralyze us,” WhatsApp boss Will Cathcart warns. “The power of technology is that it lets us connect at extraordinary speed and scale and democratizes information better than anything ever invented. But if we choose to erode our privacy and security, it will do the opposite. Instead of sharing our ideas, it will shut them down. Instead of bringing us closer together, it will keep us apart. Instead of giving everyone in the world a voice, it will silence us.”

The WhatsApp backtrack over changed terms was a game-changer. Not because it resolves a particularly critical here and now issue for its users, many of whom have already accepted the new terms, but because it showed how governmental pressure could be brought to bear on Facebook in the new environment to change its course.

It became clear that Facebook would hit governmental blockers. The move was blocked in Germany and challenged elsewhere, and while India’s digital security law—which would compromise WhatsApp’s security in different ways—is itself under challenge, the episode sends a clear message to Facebook that it shouldn’t be crossing any red lines.

According to Burcu Kilic, Director of the Digital Rights Program at Public Citizen, the attempt to change WhatsApp’s user terms “is further proof that Facebook is abusing its dominant market power.” She says that the retreat came after “regulators in Turkey, India, South Africa, Brazil, Germany, and Colombia heard our call.”

And so, the ultimate irony here is that WhatsApp came under so much international pressure over user privacy that it had to change its plans, even as international pressure continues to intensify on WhatsApp to compromise user privacy with backdoors to allow law enforcement a window on user content. “We cannot take end-to-end encryption for granted,” WhatsApp’s Will Cathcart has said, calling out Europe, India and Brazil. “There remains serious pressure to take it away.”

Forcing WhatsApp to break its encryption will have a huge impact on secure communications worldwide. And many in the government security community recognize that while this will help its own investigators, it will also put at risk the hundreds of millions around the world that rely on WhatsApp (and Signal and iMessage and Telegram’s secret messages) to stay safe from their own authorities.

But preventing an expansion of this encryption is a different matter—especially when you consider the critical differences between WhatsApp and Messenger, which is multi-platform, doesn't require a phone number or even a phone to use, and so is easier to use anonymously and by minors.

And so, there are some sensible arguments for keeping Messenger outside the default end-to-end encryption used by WhatsApp. It’s used by many more minors, and it’s easy to jump from Facebook’s mainstream platform into Messenger, and thus there is a clear danger that it can be used to groom the vulnerable. That’s not the same with WhatsApp—you can’t prowl the platform for other users and contact them at will.

Facebook doesn’t need to know much about WhatsApp users—the app doesn't link to a profile. Integrating with its profile-driven social media platforms, Instagram and Facebook itself, breaks this model and invites privacy risks. Adding commercial and financial features potentially does the same.

We need to recognize the difference between social media and messaging: one invites personal information to be shared quasi-publicly, while the other does the opposite. The debate has shone a light on this difference. No one is suggesting terrorism and other serious crimes are plotted en masse on Messenger in full view, but it is part of the world’s largest social media platform, inviting different risks.

Remember, we’re not just talking plotting here—this is about dangerous interest groups, outreach, radicalization, grooming, recruitment. WhatsApp began life as a point-to-point messenger, an upgrade over SMS. And while groups have complicated this, and Facebook seems determined to evolve it into more of a social media platform, it’s still used primarily as a close-contact messenger.

As such, tapping into that content would simply push users to other messaging options. With Messenger, by contrast, you’re looking to flag threads, behaviors, people of interest, and then follow those threads to see where they lead. Shutting off the comms between all those people would, in law enforcement’s view, prevent such flags from being raised and investigators from looking into those “rooms” to see what might be there.

The case for terrorism and serious and organized crime is harder to make—certainly beyond outreach, recruitment and radicalization. But threads lead to new places.

And so, let’s game this out. Facebook cannot afford to compromise WhatsApp’s encryption for one absolutely critical reason—it has been the primary defense against the data sharing backlash that has hammered the platform this year.

Don’t worry, WhatsApp has said over and over—we can’t see your content, and neither can Facebook. Take away end-to-end encryption and that argument collapses. It’s no surprise that WhatsApp is now suing the Indian government to prevent it from having to identify “the first originator of information,” a short hop from breaking encryption.

But if/when Facebook has Messenger encryption ready, you can expect a huge backlash from governments around the world to stop it switching over. And it is highly likely that some form of compromise will be needed. That could be kicking Messenger encryption down the road even further or changing the model. And the risk in changing the model, in weakening the security, is that this would likely apply to WhatsApp as well—they will fall under the same umbrella, after all.

“Our plans to encrypt Facebook Messenger and Instagram Direct Messaging will follow the WhatsApp model,” the company maintains. “WhatsApp uses a combination of other signals—including user reports, unencrypted account-level information, account-level metadata and, critically, in this context, certain pieces of traffic data—to help keep users safe and to prevent our services being misused to cause harm.”

The argument is made as a carve-out from user privacy directives to enable Facebook to mine metadata in order to flag dangerous behaviors. But metadata can only go so far, and it won’t be enough for lawmakers and security agencies.

And so, take WhatsApp’s advice and insist on default end-to-end encryption, but don’t wait for Facebook Messenger to deliver its update, if it ever does—make the switch instead, certainly for your personal messaging. Assume that anything sent over Messenger is open to monitoring and review.

You don’t need to delete Messenger, which is hard to do anyway if you remain a Facebook user, but you should switch your personal chats and certainly anything sensitive over to WhatsApp—or, even better, to Signal.