
Facebook is Taking Legal Action Against a Company Using Cloaking to Re-Direct Ads


Facebook has filed a new lawsuit over violations of its terms and conditions, this time taking aim at a company called ‘LeadCloak’ and its use of ad ‘cloaking’ to redirect user actions.

As explained by Facebook:

“Cloaking is a malicious technique that impairs ad review systems by concealing the nature of the website linked to an ad. When ads are cloaked, a company’s ad review system may see a website showing an innocuous product such as a sweater, but a user will see a different website, promoting deceptive products and services which, in many cases, are not allowed.”

Facebook’s Integrity Team Lead Rob Leathern has also shared a video overview of how cloaking works, for added context.
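To make the mechanism a little more concrete, here’s a minimal, hypothetical sketch (in Python) of the idea Facebook describes: a cloaking server checks who is asking, serves an innocuous page to anything that looks like an ad-review crawler, and sends ordinary visitors somewhere else entirely. The bot signatures and URLs below are invented for illustration, and are not taken from LeadCloak’s actual software.

```python
# Hypothetical illustration of the cloaking concept described above.
# A cloaking server decides which landing page to return based on
# whether the request looks like it comes from an ad reviewer.

# Invented, illustrative fingerprints of ad-review crawlers (assumption).
REVIEW_BOT_SIGNATURES = ("facebookexternalhit", "adsbot")

def choose_landing_page(user_agent: str) -> str:
    """Return the URL a given visitor would be shown."""
    ua = user_agent.lower()
    if any(sig in ua for sig in REVIEW_BOT_SIGNATURES):
        # The ad reviewer sees an innocuous storefront (e.g. a sweater shop).
        return "https://example.com/innocuous-sweater-shop"
    # Ordinary users are quietly sent to the real, deceptive destination.
    return "https://example.com/deceptive-offer"

if __name__ == "__main__":
    # What the reviewer's crawler would see vs. what a regular user would see.
    print(choose_landing_page("Mozilla/5.0 (compatible; adsbot/1.0)"))
    print(choose_landing_page("Mozilla/5.0 (Windows NT 10.0; Win64)"))
```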

Among various violations, Facebook says that LeadCloak’s software has been used to conceal websites featuring scams related to COVID-19, cryptocurrency, pharmaceuticals, diet pills, and fake news pages. Some of these cloaked websites also included images of celebrities.

It’s the latest in Facebook’s increasing legal action against companies that violate its terms – over the past year, Facebook has initiated legal proceedings against:

  • Companies that sell fake followers and likes, an effort Facebook has stepped up since New York’s Attorney General ruled last February that selling fake social media followers and likes is illegal
  • Two different app developers over ‘click injection fraud’, which simulates clicks in order to extract ad revenue
  • Two companies over creating malware, and tricking Facebook users into installing it in order to steal personal information
  • An organization that registered various domain names which, Facebook claims, were intended to deceive people into believing they were affiliated with Facebook apps, via scams like emails asking users to log in to correct an error

These types of scams have been problematic for a long time, but Facebook is now pursuing official legal recourse to stop them, which could help to establish precedents that the company can then refer to in future proceedings.

Essentially, Facebook’s taking a harder stance against such scams. After the Cambridge Analytica controversy, Facebook’s not taking any more chances, and if it can extract bigger penalties for such violations, it can also use them as a warning to others who may be looking to attempt the same.

In the past, scammers might have gotten away with nothing more than a platform ban, but increasingly, Facebook’s looking to take things further – which should, hopefully, act as a deterrent, in addition to resolving each individual case.


Such proceedings can take time, but it’ll be interesting to see what results Facebook gets in each case, and how they inform future efforts to combat similar scams.

In addition, Facebook says that it’s also looking to work with other digital platforms to share learnings, and address the same issues across the broader industry.

Socialmediatoday.com


New Screenshots Highlight How Snapchat’s Coming ‘Family Center’ Will Work


Snapchat’s parental control options look close to launch, with new screenshots based on back-end code showing how Snap’s coming ‘Family Center’ will look in the app.

As you can see in these images, shared by app intelligence company Watchful (via TechCrunch), the Family Center will enable parents to see who their child is engaging with in the app, along with who they’ve added, who they’re following, etc.

That could provide a new level of assurance for parents – though it could also be problematic for Snap, which has become a key resource for more private, intimate connection, with its anti-public posting ethos and disappearing messages helping to cement its place as an alternative to other social apps.

That’s really how Snap has carved out its niche. While other apps are about broadcasting your life to the wider world, Snap is about connecting with a small group of friends, where you can share your more private, secret thoughts without concern of them living on forever, and coming back to bite you at a later stage.

That also, of course, means that more questionable, dangerous communications are happening in the app. Various reports have investigated how Snap is used for sending lewd messages, and arranging hook-ups, while drug dealers reportedly now use Snap to organize meet-ups and sales.

Which, of course, is why parents will be keen to get more insight into such activity – but I can’t imagine Snap users will be so welcoming of an intrusive tool in this respect.

But if parents know that the option exists, teens may have little choice but to accept it, and that could be problematic for Snap. Teen users will need to approve their parents’ invitation to enable Family Center monitoring, but you can see how this could become an issue for many younger users in the app.


Still, the protective benefits may well be worth it, with random hook-ups and other engagements posing significant risks. And with kids as young as 13 able to create a Snapchat account, there are many vulnerable youngsters engaging in the app.


But it could reduce Snap’s appeal, as more parents become aware of the tool.

Snapchat hasn’t provided any further insight into the new Family Center, or when it will be released, but it looks close to launch based on these images.  
