
Meta’s Developing an ‘Ethical Framework’ for the Use of Virtual Influencers



With the rise of digital avatars, and indeed, fully digital characters that have evolved into genuine social media influencers in their own right, online platforms now have an obligation to establish clear markers as to what’s real and what’s not, and how such creations can be used in their apps.

The coming metaverse shift will further complicate this, with the rise of virtual depictions blurring the lines of what will be allowed, in terms of representation. But with many virtual influencers already operating, Meta is now working to establish ethical boundaries on their application.

As explained by Meta:

From synthesized versions of real people to wholly invented “virtual influencers” (VIs), synthetic media is a rising phenomenon. Meta platforms are home to more than 200 VIs, with 30 verified VI accounts hosted on Instagram. These VIs boast huge follower counts, collaborate with some of the world’s biggest brands, fundraise for organizations like the WHO, and champion social causes like Black Lives Matter.

Some of the more well-known examples on this front are Shudu, who has more than 200k followers on Instagram, and Lil’ Miquela, who has an audience of over 3 million in the app.

At first glance, you wouldn’t necessarily realize that these are not actual people, which makes such characters a great vehicle for brand and product promotions, as they can be utilized 24/7 and placed into any environment. But that also leads to concerns about body image perception, deepfakes, and other forms of misuse through false or unclear representation.


Deepfakes, in particular, may be problematic, with Meta citing this campaign, featuring English football star David Beckham, as an example of how new technologies are evolving to expand elements like language use for varying purposes.


The well-known ‘DeepTomCruise’ account on TikTok is another example of just how far these technologies have come, and it’s not hard to imagine a scenario where they could be used to, say, show a politician saying or doing something that he or she never actually did – which could have significant real-world impacts.

Which is why Meta is working with developers and experts to establish clearer boundaries on such use – because while there is potential for harm, there are also beneficial uses for such depictions.

Imagine personalized video messages that address individual followers by name. Or celebrity brand ambassadors appearing as salespeople at local car dealerships. A famous athlete would make a great tutor for a kid who loves sports but hates algebra.

Such use cases will increasingly become the norm as VR and AR technologies are developed, with these platforms placing digital characters front and center, and establishing new norms for digital connection.

It would be better for users to know what’s real and what’s not, and as such, Meta needs clear regulations to remove dishonest depictions and enforce transparency over VI use.

But then again, much of what you see on Instagram these days is not real, with filters and editing tools altering people’s appearance well beyond what’s normal, or realistic. That can also have damaging consequences, and while Meta’s looking to implement rules on VI use, there’s arguably a case for similar transparency in editing tools applied to posted videos and images as well.


That’s a more complex element, particularly as such tools also enable people to feel more comfortable in posting, which no doubt increases their in-app activity. Would Meta be willing to put more focus on this element if it could risk impacting user engagement? The data on the impact of Instagram on people’s mental health are pretty clear, with comparison being a key concern.


Should that also come under the same umbrella of increased digital transparency?

It doesn’t appear to be included in the initial framework as yet, but it’s another element that should be examined at some stage, especially given the harmful effects that social media usage can have on young women.

But however you look at it, this is no doubt a rising element of concern, and it’s important for Meta to build guardrails and rules around the use of virtual influencers in its apps.

You can read more about Meta’s approach to virtual influencers here.






New Screenshots Highlight How Snapchat’s Coming ‘Family Center’ Will Work


Snapchat’s parental control options look close to launch, with new screenshots based on back-end code showing how Snap’s coming ‘Family Center’ will look in the app.

As you can see in these images, shared by app intelligence company Watchful (via TechCrunch), the Family Center will enable parents to see who their child is engaging with in the app, along with who they’ve added, who they’re following, etc.

That could provide a new level of assurance for parents – though it could also be problematic for Snap, which has become a key resource for more private, intimate connection, with its anti-public-posting ethos and disappearing messages helping to cement its place as an alternative to other social apps.

That’s really how Snap has carved out its niche. While other apps are about broadcasting your life to the wider world, Snap is about connecting with a small group of friends, where you can share your more private, secret thoughts without concern of them living on forever and coming back to bite you at a later stage.

That also, of course, means that more questionable, even dangerous communications happen in the app. Various reports have investigated how Snap is used for sending lewd messages and arranging hook-ups, while drug dealers reportedly now use Snap to organize meet-ups and sales.

Which, of course, is why parents will be keen to get more insight into such activity – but I can’t imagine Snap users will be as welcoming of such an intrusive tool.

Teen users will need to accept their parents’ invitation to enable Family Center monitoring, but if parents know the option exists, many teens may have little choice but to agree – and you can see how that could become an issue for many younger users in the app.


Still, the protective benefits may well be worth it, with random hook-ups and other engagements posing significant risks. And with kids as young as 13 able to create a Snapchat account, there are many vulnerable youngsters engaging in the app.


At the same time, it could reduce Snap’s appeal as more parents become aware of the tool.

Snapchat hasn’t provided any further insight into the new Family Center, or when it will be released, but based on these screenshots, it looks close to launch.
