
Instagram ‘pods’ game the algorithm by coordinating likes and comments on millions of posts



Researchers at NYU have identified hundreds of groups of Instagram users, some with thousands of members, that systematically exchange likes and comments in order to game the service’s algorithms and boost visibility. In the process, they also trained machine learning agents to identify whether a post has been juiced in this way.

“Pods,” as they’ve been dubbed, straddle the line between real and fake engagement, making them tricky to detect or take action against. And while they used to be a niche threat (and still are compared with fake account and bot activity), the practice is growing in volume and efficacy.

Pods are easily found via searching online, and some are open to the public. The most common venue for them is Telegram, as it’s more or less secure and has no limit to the number of people who can be in a channel. Posts linked in the pod are liked and commented on by others in the group, with the effect of those posts being far more likely to be spread widely by Instagram’s recommendation algorithms, boosting organic engagement.

Reciprocity as a service

The practice of groups mutually liking one another’s posts is called reciprocity abuse, and social networks are well aware of it, having removed setups of this type before. But the practice has never been studied or characterized in detail, the team from NYU’s Tandon School of Engineering explained.

“In the past they’ve probably been focused more on automated threats, like giving credentials to someone to use, or things done by bots,” said lead author of the study Rachel Greenstadt. “We paid attention to this because it’s a growing problem, and it’s harder to take measures against.”

On a small scale it doesn’t sound too threatening, but the study found nearly 2 million posts that had been manipulated by this method, with more than 100,000 users taking part in pods. And that’s just the ones in English, found using publicly available data. The paper describing the research was published in the Proceedings of the World Wide Web Conference and can be read here.

Importantly, the reciprocal liking does more than inflate apparent engagement. Posts submitted to pods got large numbers of artificial likes and comments, yes, but that activity deceived Instagram’s algorithm into promoting them further, leading to much more engagement even on posts not submitted to the pod.

When contacted for comment, Instagram initially said that this activity “violates our policies and we have numerous measures in place to stop it,” and said that the researchers had not collaborated with the company on the research.


In fact the team was in contact with Instagram’s abuse team from early on in the project, and it seems clear from the study that whatever measures are in place have not, at least in this context, had the desired effect. I pointed this out to the representative and will update this post if I hear back with any more information.

“It’s a grey area”

But don’t reach for the pitchforks just yet — the fact is this kind of activity is remarkably hard to detect, because really it’s identical in many ways to a group of friends or like-minded users engaging with each other’s content in exactly the way Instagram would like. And really, even classifying the behavior as abuse isn’t so simple.

“It’s a grey area, and I think people on Instagram think of it as a grey area,” said Greenstadt. “Where does it end? If you write an article and post it on social media and send it to friends, and they like it, and they sometimes do that for you, are you part of a pod? The issue here is not necessarily that people are doing this, but how the algorithm should treat this action, in terms of amplifying or not amplifying that content.”

Obviously if people are doing it systematically with thousands of users and even charging for access (as some groups do), that amounts to abuse. But drawing the line isn’t easy.

More important is that the line can’t be drawn unless you first define the behavior, which the researchers did by carefully inspecting the differences in patterns of likes and comments on pod-boosted and ordinary posts.

“They have different linguistic signatures,” explained co-author Janith Weerasinghe. “What words they use, the timing patterns.”

As you might expect, strangers obligated to comment on posts they don’t actually care about tend to use generic language, saying things like “nice pic” or “wow” rather than more personal remarks. Some groups actually warn against this, Weerasinghe said, but not many.

The list of top words used reads, predictably, like the comment section on any popular post, though perhaps that speaks to a more general lack of expressiveness on Instagram than anything else.


But statistical analysis of thousands of such posts, both pod-powered and normal, showed a distinctly higher prevalence of “generic support” comments, often showing up in a predictable pattern.

This data was used to train a machine learning model, which when set loose on posts it had never seen, was able to identify posts given the pod treatment with as high as 90% accuracy. This could help surface other pods — and make no mistake, this is only a small sample of what’s out there.
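The paper’s actual model isn’t reproduced here, but the features it describes — generic comment language and tight timing patterns — can be sketched as a toy detector. The phrase list and thresholds below are invented for illustration, not taken from the study:

```python
from datetime import datetime, timedelta

# Illustrative "generic support" phrases; the study derived its vocabulary
# statistically, so this hard-coded set is purely an assumption.
GENERIC_PHRASES = {"nice pic", "wow", "great shot", "love it", "amazing"}

def generic_ratio(comments):
    """Fraction of comments that are just a generic-support phrase."""
    if not comments:
        return 0.0
    hits = sum(1 for c in comments if c.lower().strip(" !.") in GENERIC_PHRASES)
    return hits / len(comments)

def burstiness(timestamps, window=timedelta(minutes=30)):
    """Fraction of comments arriving within `window` of the first one --
    pod members tend to respond quickly after a link is dropped in the group."""
    if not timestamps:
        return 0.0
    start = min(timestamps)
    return sum(1 for t in timestamps if t - start <= window) / len(timestamps)

def looks_pod_boosted(comments, timestamps,
                      ratio_cutoff=0.6, burst_cutoff=0.8):
    # Cutoffs are hand-tuned for illustration; the real classifier was
    # trained on labeled data rather than built from fixed thresholds.
    return (generic_ratio(comments) >= ratio_cutoff
            and burstiness(timestamps) >= burst_cutoff)

base = datetime(2019, 3, 1, 12, 0)
comments = ["Nice pic", "wow", "Love it!", "amazing", "great shot"]
times = [base + timedelta(minutes=m) for m in (1, 2, 4, 5, 7)]
print(looks_pod_boosted(comments, times))  # True for this contrived example
```

A trained model would weigh many more signals, but even this crude pair of features separates contrived pod-style activity from ordinary comment threads.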

“We got a pretty good sample for the time period of the easily accessible, easily findable pods,” said Greenstadt. “The big part of the ecosystem that we’re missing is pods that are smaller but more lucrative, that have to have a certain presence on social media already to join. We’re not influencers, so we couldn’t really measure that.”

The number of pods and the posts they manipulate has grown steadily over the last two years. About 7,000 posts were found during March of 2017. A year later that number had jumped to nearly 55,000. March of 2019 saw over 100,000, and the number continued to increase through the end of the study’s data. It’s safe to say that pods are now posting over 4,000 times a day — and each one is getting a large amount of engagement, both artificial and organic. Pods now have 900 users on average, and some had over 10,000.

You may be thinking: “If a handful of academics using publicly available APIs and Google could figure this out, why hasn’t Instagram?”

As mentioned before, it’s possible the teams there have simply not considered this to be a major threat and consequently have not created policies or tools to prevent it. Rules proscribing using a “third party app or service to generate fake likes, follows, or comments” arguably don’t apply to these pods, since in many ways they’re identical to perfectly legitimate networks of users (though Instagram clarified that it considers pods as violating the rule). And certainly the threat from fake accounts and bots is of a larger scale.

And while it’s possible that pods could be used as a venue for state-sponsored disinformation or other political purposes, the team didn’t notice anything happening along those lines (though they were not looking for it specifically). So for now the stakes are still relatively small.

That said, Instagram clearly has access to data that would help to define and detect this kind of behavior, and its policies and algorithms could be changed to accommodate it. No doubt the NYU researchers would love to help.




5 Effective Ways to Run Facebook Ads A/B Tests




Facebook Ads A/B tests, or split tests, let advertisers try different versions of an ad with various campaign elements. The process helps them arrive at the best version for the organization’s target audience.

A/B tests offer a vast pool of variations to try, and it’s easy to get caught up and lose your way before arriving at the best version in limited time. To better understand the topic, you can read the Facebook ad testing guide. Here are five effective ways to run Facebook Ads A/B tests:

1) Start with the minimal number of variables

This approach helps you isolate the impact of each variable. The fewer the variables, the more relevant and conclusive the results. Once you have your versions, run the results through an A/B significance test to determine whether they are statistically valid.
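Such a significance check can be done by hand with a standard two-proportion z-test. The sketch below is textbook statistics, not Facebook’s internal test, and the click and impression counts are hypothetical:

```python
from math import sqrt, erf

def ab_significance(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is variant B's click rate different from A's?
    Returns (z, two-sided p-value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p = (clicks_a + clicks_b) / (n_a + n_b)        # pooled click rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # pooled standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: variant A got 120/2400 clicks, variant B 168/2400.
z, p = ab_significance(120, 2400, 168, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05: unlikely to be chance
```

A p-value below 0.05 is the conventional bar for declaring the winning variant’s lift real rather than noise.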

2) Select the correct structure

There are two structures in A/B tests: a single ad test, and multiple single-variation ad sets. In the first structure, all the variations go under one ad set; in the second, each variation gets a separate ad set. Of the two, the second generally produces cleaner results.

3) Use spreadsheets to stay organized


These spreadsheets help collect and analyze data to get meaningful insights and arrive at data-backed decisions.

4) Target the right audience and set realistic time goals

One approach is to choose an entirely new audience, drawn from a data pool that is large and distinct from your existing campaigns. The reason for choosing a different audience is that overlap can cause Facebook to mix up your ads and produce contaminated results.

Another approach to choosing the right audience is to pick geography. It works better, especially when you have business in a particular region.   

It’s also essential to set a realistic timeline for your testing. Facebook suggests one should run a test for at least four days, but you can choose to run the test for up to 30 days.   

5) Set an ideal budget

The concept of a perfect budget is subjective. You can set it yourself, or Facebook can set it for you based on your testing data. A large part of the test budget goes toward avoiding audience duplication: if the same audience sees multiple variations, it can skew the test results.
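The budget ultimately comes down to how many users each variant must reach. A standard power calculation gives a rough sketch; this is generic statistics, not anything Facebook-specific, and the baseline rate and lift below are hypothetical:

```python
from math import sqrt, ceil

def sample_size_per_variant(p_base, lift, alpha_z=1.96, power_z=0.84):
    """Approximate users needed per variant to detect an absolute `lift`
    over baseline rate `p_base` at 5% significance and 80% power
    (textbook two-proportion formula)."""
    p2 = p_base + lift
    num = (alpha_z * sqrt(2 * p_base * (1 - p_base))
           + power_z * sqrt(p_base * (1 - p_base) + p2 * (1 - p2))) ** 2
    return ceil(num / lift ** 2)

# Hypothetical: 5% baseline CTR, hoping to detect a 1-point absolute lift.
n = sample_size_per_variant(0.05, 0.01)
print(n)  # thousands of users per variant, which drives the budget
```

Multiplying the per-variant sample size by your expected cost per impression gives a floor for the test budget; smaller lifts require dramatically larger (and pricier) samples.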

Besides these five ideas, a few more action points will make the testing process more efficient. Put the website’s domain link in the ad rather than the landing page link, which looks untidy. Add an appropriate call-to-action button, such as ‘Learn More’ or ‘Buy Now.’ It’s also important to check how your ad comes across on different devices — mobile, tablets, etc.


Another strategy that works is engaging the customer: add social engagement buttons such as ‘Like’ or ‘Comment.’ Use high-resolution images, as consumers tend to distrust low-quality, heavily edited ones.

A/B test results can teach you more about audience behavior patterns, and conducting the tests on Facebook streamlines the whole process. With the results in hand, advertisers and marketers know which creatives to use.

To sum up, you can run an effective A/B test campaign within a specified budget; you don’t need to spend massive amounts to get your advertising right. With a good understanding of your business and consumers, you can draw the correct conclusions about how each variation performs.
