SOCIAL

Planning for 2023: What Social Media Marketers Need to Win in 2023
January is, for many, a month of reflection, goal-setting, strategizing and planning for the year ahead. 

In line with this, we’ve kicked off the new year with a series of articles covering the latest stats, tips and strategies to help social media marketers build an effective game plan for 2023.

Below, you’ll find links to our 2023 social media planning series, which includes:

  • Content strategy guidelines to help you define your brand’s content mission and set SMART goals
  • Organic posting tips for Facebook, Instagram, TikTok, Twitter, LinkedIn, Snapchat and Pinterest 
  • Explainers on how to research key topics of interest in your niche, understand the competitive landscape, and find your audience and connect with them where they’re active
  • A holiday calendar and notes on the best days and times to post to each of the major platforms

 

LinkedIn Shares New Insights into How Public Group Posts are Distributed
Could LinkedIn groups be making a comeback?

I mean, probably not. The halcyon days of robust LinkedIn groups are long gone; most have since been overrun by spammers and scammers seeking attention at all costs, which has rendered many groups, and their notifications, spam in their own right.

But maybe, there is a way for LinkedIn to get at least some groups back on track.

Maybe.

Today, LinkedIn published a new technical overview of the work that it’s put into building public groups, an option that it’s still in the process of rolling out to all users.

Public groups, as the name suggests, are wholly viewable by members and non-members alike, rather than requiring you to join a group to see what’s happening within it. Until a year ago, LinkedIn users could only create “listed” or “unlisted” groups, with listed communities showing up in relevant searches, and unlisted ones hidden from non-member view. So you could find a listed group, but you’d still have to join it to see the discussions happening within. Public groups, by contrast, are both listed and openly viewable.
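The three visibility modes reduce to a simple matrix, which can be sketched as follows. This is a hypothetical illustration only; the names and structure here are assumptions, not LinkedIn’s actual API.

```python
# A hypothetical sketch of the three group visibility modes described
# above. Names and structures are illustrative assumptions, not
# LinkedIn's actual implementation.

GROUP_VISIBILITY = {
    # mode: (appears in search, content viewable by non-members)
    "unlisted": (False, False),
    "listed": (True, False),
    "public": (True, True),
}

def can_view_content(mode: str, is_member: bool) -> bool:
    """Members can always view; non-members only if the content is open."""
    _appears_in_search, open_content = GROUP_VISIBILITY[mode]
    return is_member or open_content

# A listed group can be found, but its discussions stay members-only;
# a public group's discussions are open to anyone.
print(can_view_content("listed", is_member=False))  # False
print(can_view_content("public", is_member=False))  # True
```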

Which, according to LinkedIn, has been a positive:

Over the last few years, the Groups product has evolved significantly across feed, notifications, creators, group discovery, content moderation, and other domains of organizer tooling. In continuation of these improvements, we launched public groups to help non-group members see valuable conversations happening in groups, and to help group organizers and creators foster more engagement and a stronger community. This has led to a 35% increase in daily group contributors and a more than 10% incremental increase in joins in these groups.

Which makes sense. Enabling users to view what’s happening within groups, especially highly active, well-moderated ones, is going to attract more members. But it is also interesting to consider whether there might be value in switching your group to public, and making it more of a focus.

Within the new technical overview, LinkedIn explains that public group posts are eligible to be distributed to members’ timelines, as well as to their extended networks.

“For posts created inside public group, we set the distribution to MAIN_FEED to allow for distribution on the home feed to group members, first degree connections of the author, and first degree connections of any members who react/comment/repost on the post. This helps increase distribution of public group posts.”

That could facilitate good distribution for public feed posts, and could help to increase engagement within your LinkedIn group.
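As a rough illustration of the distribution rule LinkedIn describes, the eligible home-feed audience for a public group post could be sketched like this. All names and structures here are assumptions for illustration, not LinkedIn’s actual implementation.

```python
# Hypothetical sketch of the MAIN_FEED distribution rule described
# above: group members, the author's first-degree connections, and
# the first-degree connections of anyone who reacts, comments, or
# reposts all become eligible to see the post in their home feed.
# Names and structures are illustrative assumptions only.

def feed_audience(group_members, author_connections, engager_connections):
    """Return the set of users eligible to see a public group post."""
    audience = set(group_members) | set(author_connections)
    for connections in engager_connections:  # one set per engaging user
        audience |= set(connections)
    return audience

# Example: each engagement expands the post's reach.
members = {"ana", "bo"}
author_conns = {"cy", "bo"}
engager_conns = [{"di"}, {"ana", "ed"}]
print(sorted(feed_audience(members, author_conns, engager_conns)))
# ['ana', 'bo', 'cy', 'di', 'ed']
```

The point the sketch makes is that every reaction, comment, or repost adds a new first-degree network to the eligible audience, which is why engagement compounds distribution for public group posts.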

Another strong lure is that only group members can comment on a public group post. Anyone can react to a public group update, but you have to actually join the community, which you can do via the post’s CTA, to participate in the discussion.

In combination, this could be a powerful way to maximize group engagement, and depending on where that fits into your strategy, it could put more emphasis on LinkedIn groups as a means to broaden connection and community.

Though, as noted, many soured on LinkedIn groups long ago, once the spammers settled in. Back in 2018, LinkedIn actually tried to initiate a groups refresh, with new regulations around spam, and limits on notifications about group activity, to discourage misuse.

That, seemingly, didn’t have a huge impact, but as LinkedIn notes, it has continued to update its group rules and processes, in order to make it a more compelling product.

Could it be worthy of consideration once again?

There are definitely things to like here, and for those who already have active LinkedIn groups, making the switch to “Public” could have some benefit.

I do think that LinkedIn groups require strong moderation to maximize their value, and establishing a core focus statement for your group, clarifying what it’s for, is also essential to guide your direction.

But maybe, they’re worth a look once again.

Maybe.

You can read more about LinkedIn’s latest public groups updates here.


X Raises Questions on Content Moderation After Navalny’s Wife Allegedly Banned

Amidst speculation surrounding the banning of Navalny’s wife from X, questions arise over the platform’s content moderation policies in Europe. 

(Photo: Yulia Navalnaya) Screenshot from Yulia Navalnaya on X

Continuing Alexei Navalny’s Fight

Alexei Navalny, a prominent Russian anti-corruption activist, died under mysterious circumstances in a Siberian penal colony on Feb. 16. While the exact cause of Navalny’s death remains unclear, Western officials have pointed fingers at Russian President Vladimir Putin.

Navalny’s wife, Yulia Navalnaya, fueled speculation further with claims in a video statement. She alleged that Russian authorities may be withholding her husband’s corpse to eliminate evidence of a deadly nerve agent, Novichok.

The video accused Putin of orchestrating her husband’s demise and pledged to continue his work. This development raises concerns about X’s content moderation practices and its implications for freedom of speech in Europe.

In a video shared in Russian, she conveyed her aspiration for a liberated Russia, emphasizing her desire to live in and contribute to a free country. Following this, Navalnaya rapidly amassed a significant online following, receiving an outpouring of support in thousands of sympathetic messages.

According to a report from The Guardian, Navalnaya currently resides in a location undisclosed to the public outside of Russia. She established her X account in February and made her inaugural post on the 19th while in Brussels, engaging with EU officials regarding her husband’s passing. 

Facing X Suspension

However, her presence on X was briefly suspended on Tuesday, triggering widespread user concern. During the suspension, allegations circulated suggesting that owner Elon Musk harbored sympathies toward Putin.

X’s Safety team later clarified that the account suspension resulted from an error in the platform’s spam detection system, which erroneously flagged @yulia_navalnaya’s account. 

As per Daily Dot, the suspension was promptly lifted upon the team’s realization of the mistake, with assurances of enhancements to the platform’s defense mechanism. X’s announcement does not explicitly indicate whether Navalnaya’s account suspension resulted from an automated system. 


However, attributing the suspension to a “defense mechanism” and the pledge to “update the defense” led some information analysts to infer that human intervention was not involved in the initial account shutdown.

This interpretation prompted swift scrutiny from researchers, who questioned the accuracy of attributing the suspension, even implicitly, to an automated decision.

Responding to the statement, Michael Veale, an associate professor of Digital Rights & Regulation at University College London’s Faculty of Laws, expressed skepticism. He noted the irony, given X’s previous claims under the Digital Services Act that it refrains from automated content moderation.

Adopted by the EU in October 2022, the Digital Services Act (DSA) aims to combat illegal content, ensure advertising transparency, and counter disinformation.

Among its mandates, the act necessitates platforms to disclose moderation determinations in the DSA Transparency Database, detailing factors like the rationale behind the decision, the content type in question, and whether automation was involved in the decision-making process.
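To illustrate the kind of disclosure involved, a single database entry might carry fields along these lines. This is a hypothetical shape only; the field names here are assumptions, not the DSA Transparency Database’s actual schema.

```python
# Hypothetical sketch of the fields a DSA Transparency Database entry
# might carry for one moderation decision. Field names are illustrative
# assumptions, not the database's actual schema.

from dataclasses import dataclass

@dataclass
class ModerationRecord:
    platform: str
    content_type: str          # e.g. "account", "post", "video"
    decision: str              # e.g. "suspension", "removal"
    rationale: str             # the stated grounds for the decision
    automated_detection: bool  # was the content flagged automatically?
    automated_decision: bool   # was the decision itself automated?

# Example: the kind of entry an automated spam-defense suspension
# would produce under this sketch.
record = ModerationRecord(
    platform="X",
    content_type="account",
    decision="suspension",
    rationale="flagged by spam defense mechanism",
    automated_detection=True,
    automated_decision=True,
)
print(record.automated_decision)
```

Under the DSA, it is exactly the last two flags, whether detection and the decision itself were automated, that would distinguish the explanation X gave from its prior claims of human-only moderation.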

A 2023 study by University of Bremen researchers, scrutinizing moderation verdicts uploaded to the database over a single day, revealed that X exclusively relied on human moderation for its decisions.

Consequently, X reported significantly fewer moderation determinations than other platforms during the observed period.


Written by Inno Flores

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.




EU Officials Launch Investigation into TikTok Over Potential DSA Violations

EU officials are wasting no time enacting their new powers under the Digital Services Act (DSA), with the European Commission announcing a new investigation into whether TikTok is currently in violation of DSA rules in relation to the protection of minors in the app.

Concerns have actually been raised around TikTok’s compliance on several fronts, including systemic risks related to app addiction, its age verification processes, its security measures for minors, data transparency, and more.

As per the European Commission:

On the basis of the preliminary investigation conducted so far, including on the basis of an analysis of the risk assessment report sent by TikTok in September 2023, as well as TikTok’s replies to the Commission’s formal Requests for Information (on illegal content, protection of minors, and data access), the Commission has decided to open formal proceedings against TikTok under the Digital Services Act.

It’s the second major probe under the new DSA laws, with X also currently under EU investigation over its efforts in restricting illegal content, and stopping the spread of misinformation in the app.

TikTok will now need to provide further information to EU investigators to assess its efforts, with a maximum penalty of up to 6% of its global annual turnover on the cards if it is found to be in violation.

Though that’s probably unlikely, given that the DSA also includes clauses that enable investigators to “accept any commitment made by TikTok to remedy on the matters subject to the proceeding”.

Given that the DSA has only recently been initiated, this will probably be the outcome of these early investigations, though EU officials may also want to send a strong message early, in order to underline the seriousness of the new rules.

Though there’s also this:

The duration of an in-depth investigation depends on several factors, including the complexity of the case, the extent to which the company concerned cooperates with the Commission and the exercise of the rights of defence.

So any investigation could carry on for some time, meaning we won’t know the outcome for a while yet. But again, potentially, TikTok could face big fines if it is found to be in breach, and it fails to take action to address any highlighted concerns.

It’ll be interesting to see how EU officials look to enact the regulations, and keep each platform in line with these more restrictive processes. That could get especially complex with the DSA, given the variable interpretations around what constitutes adequate action on certain fronts.

As such, these early cases could play a key role in establishing precedent, which could indeed see big fines coming, and could even force apps to reassess their operations in the region as a result.  

I mean, Meta has threatened that before, and depending on how EU officials approach these new laws, there could be further concerns on this front.
