WhatsApp Adds New Privacy Tools, Including Online Activity Controls and the Ability to Silently Leave Group Chats

Amid ongoing concerns about how its default encryption can be used to organize criminal activity, WhatsApp has announced some additional privacy features, providing even more assurance and control for users in various respects.

First off, WhatsApp’s giving users more control over how others see them in the app, with the option to switch off online activity markers, or restrict those signals to certain users.

As shown here, you’ll soon be able to decide who can see when you’re online in the app – ‘Everyone’, ‘Contacts’, ‘My Contacts Except’ or ‘Nobody’.

That’ll provide more capacity to avoid unwanted interactions by hiding your active status, which could be of significant value for users who want to go about their interactions in their own time and space.

WhatsApp’s also adding a new option to leave groups silently, so you can skip out of a group chat without alerting all group members.

WhatsApp updates

As you can see, group admins will still know you’ve left the chat, but there won’t be a ‘John Doe has left the discussion’ notification for all users in the thread.

In addition to this, WhatsApp is also extending the time window for deleting your messages from your chats.

And finally, WhatsApp’s also rolling out screenshot blocking for ‘View Once’ messages:

“View Once is already an incredibly popular way to share photos or media that don’t need to have a permanent digital record. Now we’re enabling screenshot blocking for View Once messages for an added layer of protection. We’re testing this feature now and are excited to roll it out to users soon.”

WhatsApp updates

That could facilitate even more private sharing on WhatsApp, which may also lead to more questionable material being shared, if that’s what people want. That specific aspect has been the focus of various authorities, in various regions, who have called on Meta to enable a level of messaging access for law enforcement, in order to avoid its apps being used for illegal activity that is currently shielded by its privacy measures.

Recently, the UK National Cyber Security Center published a research paper that proposed a new automated scanning process for WhatsApp, and other messaging tools, which would better facilitate the detection of illegal exchanges, while still maintaining privacy for users. The European Union has also proposed new legislation that would put more onus on Meta itself to detect and report any such activity within its platforms.

Thus far, Meta has resisted all calls to add in ‘back door’ access, or anything like it, arguing that the trade-off between all users’ privacy, and catching out the small percentage of criminal activity, is simply too great to consider.

As explained by WhatsApp chief Will Cathcart in response to the UK proposal:

“What’s being proposed is that we – either directly or indirectly through software – read everyone’s messages. I don’t think people want that.”

Indeed, Meta is actually still in the process of rolling out end-to-end encryption in all of its messaging tools, with both Messenger and Instagram Direct getting enhanced security features, in order to bring them into line with WhatsApp.

The next stage, then, will be to integrate all of its messaging platforms into a single back-end, facilitating cross-platform chat – though Meta has delayed the full implementation of this due to ongoing regulatory questions and concerns.


And there is valid concern here. An inarguable side effect of the connective capacity of social media is that while social platforms and messaging apps enable everyone to ‘find their tribe’, those tribes are not always wholesome communities of knitting enthusiasts and TV show fans.

Sometimes, those tribes are dangerous, even criminal. And with encryption hiding any such exchanges, from everyone, there’s no telling just how significant this could be, and what types of activity WhatsApp could be facilitating through its circuits.

But as Cathcart notes, the alternative is that all of WhatsApp’s 2 billion active users lose their privacy, due to the actions of a likely few.

It’s a difficult argument, which looks set to carry on for some time yet.



British teenager died after “negative effects of online content”: coroner


Molly Russell was exposed to online material ‘that may have influenced her in a negative way’ – Copyright POOL/AFP/File Philip FONG

A 14-year-old British girl died from an act of self harm while suffering from the “negative effects of online content”, a coroner said Friday in a case that shone a spotlight on social media companies.

Molly Russell was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness,” Andrew Walker ruled at North London Coroner’s Court.

The teenager “died from an act of self-harm while suffering depression”, he said, but added it would not be “safe” to conclude it was suicide.

Some of the content she viewed was “particularly graphic” and “normalised her condition,” said Walker.

Russell, from Harrow in northwest London, died in November 2017, leading her family to set up a campaign highlighting the dangers of social media.

“There are too many others similarly affected right now,” her father Ian Russell said after the ruling.


“At this point, I just want to say however dark it seems, there is always hope.

“I hope that this will be an important step in bringing about much needed change,” he added.

The week-long hearing became heated when the family’s lawyer, Oliver Sanders, took an Instagram executive to task.

A visibly angry Sanders asked Elizabeth Lagone, the head of health and wellbeing at Meta, Instagram’s parent company, why the platform allowed children to use it when it was “allowing people to put potentially harmful content on it”.

“You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this,” he said.

Lagone apologised after being shown footage, viewed by Russell, that “violated our policies”.

Of the 16,300 posts Russell saved, shared or liked on Instagram in the six-month period before her death, 2,100 related to depression, self-harm or suicide, the inquest heard.

Children’s charity NSPCC said the ruling “must be a turning point”.


“Tech companies must be held accountable when they don’t make children’s safety a priority,” tweeted the charity.

“This must be a turning point,” it added, stressing that any delay to a government bill dealing with online safety “would be inconceivable to parents”.

