

Instagram’s Testing a New, Full-Screen Main Feed of Feed Posts, Stories and Reels Content




It’s been on the horizon for a while, given the evolving usage trends in the app. And now, it looks a step closer to reality, with Instagram testing a new, fully-integrated home feed that would do away with the top Stories bar, and present everything in an immersive, full-screen, swipeable UI.

As you can see in this example, posted by app researcher Alessandro Paluzzi, the experimental Instagram feed would include regular Feed posts, Stories and Reels all within a single flow.

Stories would be presented with a frame bar at the bottom of the display, indicating that you can swipe left to see the other frames, while videos have a progress bar instead.

It’s a more intuitive, more logical way to present Instagram content, which would also align with evolving, TikTok-led usage trends. The update would also enable algorithmic improvements based on your response to each specific post, as opposed to the current format, which presents things in different ways, and often shows more than one post on screen at a time.

Which is where TikTok has been able to gain its most significant advantage. Because all TikTok clips are displayed one at a time, in full-screen, everything you do while viewing that post can be used as a measure of your response to that specific content. If you tap ‘Like’ on a clip, if you watch it all the way through, if you let it play twice, swipe back to it again – every response is specific to that video, which gives TikTok a level of advantage in determining the specific elements of interest in each clip, which it can then align with your profile to improve your feed recommendations.


That’s why TikTok’s feed is so addictive – and while Instagram Reels are also presented in the same way, Instagram hasn’t yet been able to crack the algorithm as effectively as TikTok has, fueling its more immersive, more addictive ‘For You’ content stream.

This new presentation style could help to change that, and would be a big step in moving into line with the broader TikTok trend, which shows no sign of slowing as yet. And given that Reels is now the largest contributor to engagement growth on Instagram, and users spend more time with Stories than they do with their main feed, it makes perfect sense.


Again, I’ve been predicting that this would happen for the last two years – and really, the only surprise here is that it’s taken IG this long to actually move to live testing of the format.

Which, it’s important to note, hasn’t begun just yet. This is a back-end prototype at present, which might still not see the light of day. But it probably will, and given the state of its development, as shown here, I’d be expecting to see this soon, giving users a whole new way to engage with all of Instagram’s different content formats, while also aligning with the platform’s stated push on video content.

Indeed, back in December, Instagram chief Adam Mosseri said that video would be a key focus for IG in 2022.

“We’re going to double-down on our focus on video and consolidate all of our video formats around Reels”

This seems like the logical next step on this front, and another repositioning in its face-off with TikTok, aimed at mitigating TikTok’s rising dominance in the space.




Meta’s Developing New Spatial Audio Tools for AR and VR to Enhance Virtual Experiences




Visual elements are the main focus of next-level digital experiences, like AR and VR tools, but audio also plays a key role in facilitating fully immersive interaction, with the sounds that you hear helping to transport your mind, and bring virtual environments to life.

Which is where Meta’s latest research comes in – in order to facilitate more true-to-life AR and VR experiences, Meta’s developing new spatial audio tools which respond to different environments as displayed within visuals.

As you can see in this video overview, Meta’s work here revolves around the commonalities of sound that people expect to experience in certain environments, and how that can be translated into digital realms.

As explained by Meta:

“Whether it’s mingling at a party in the metaverse or watching a home movie in your living room through augmented reality (AR) glasses, acoustics play a role in how these moments will be experienced […] We envision a future where people can put on AR glasses and relive a holographic memory that looks and sounds the exact way they experienced it from their vantage point, or feel immersed by not just the graphics but also the sounds as they play games in a virtual world.”

That could make its coming metaverse much more immersive, and could actually play a far more significant role in the experience than you might initially expect.


Meta’s already factored this in, to at least some degree, with the first generation version of its Ray-Ban Stories glasses, which include open air speakers that deliver sound directly into your ears.

Which is a surprisingly sleek addition – the way the speakers are positioned enables fully immersive audio without the need for earbuds. Which seems like it shouldn’t work, but it does, and it may already be a key selling point for the device.


In order to take its immersive audio elements to the next stage, Meta’s making three new models for audio-visual understanding open to developers.

“These models, which focus on human speech and sounds in video, are designed to push us toward a more immersive reality at a faster rate.”

Meta has already developed its own self-supervised visual-acoustic matching model, as outlined in the video clip, and by opening up the research to more developers and audio experts, it hopes to build even more realistic audio translation tools, further extending this work.

Which, again, could be more significant than you think.

As noted by Meta CEO Mark Zuckerberg:

“Getting spatial audio right will be one of the things that delivers that ‘wow’ factor in what we’re building for the metaverse. Excited to see how this develops.”


Similar to the audio elements in Ray-Ban Stories, that ‘wow’ factor may well be what gets more people buying VR headsets, which could then help to usher in the next stage of digital connection that Meta is building towards.

As such, it could end up being a major advance, and it’ll be interesting to see how Meta looks to build its spatial audio tools to enhance its VR and AR systems.

