

The State of Digital Accessibility: Three Key Challenges


The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Earlier this year, the Department of Justice (DOJ) published its first web accessibility guidance in 10 years. It was meant to remind businesses of all sizes that their websites — just like physical storefronts — need to be accessible to people with disabilities. 

The DOJ guidance comes at a time when the majority of US businesses, swept up in accelerated digital transformation, are struggling to make their websites accessible to people of all abilities.

According to WebAIM’s most recent accessibility analysis of the top one million homepages, 97% of websites have accessibility errors — such as low contrast text and missing written descriptions of images — failing to meet some of the basic Web Content Accessibility Guidelines (WCAG), a de facto international standard. This is a slight improvement from 2020, when 98% of homepages were inaccessible.
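Low contrast text, the most common error class in the WebAIM analysis, is not a matter of taste: WCAG defines a precise relative-luminance formula and minimum ratios. A minimal Python sketch of that check (the function names are illustrative, not from any particular accessibility library):

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color given as 0-255 integers."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; WCAG AA requires >= 4.5:1 for normal text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Checks like this are exactly what automated scanners such as WebAIM's run against every text/background pair on a page.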

With only 3% of the Internet accessible, we have an urgent problem on a big scale. 

There are a number of reasons why, despite the growing awareness of digital accessibility, expectations of inclusivity, and renewed efforts by the government, we are still lagging behind. 

Among those reasons are the following three challenges that reflect the state of digital accessibility today.

Three key challenges in digital accessibility 

1. The lack of clarity on legal requirements 

Illustration of a hand bringing down a purple gavel onto the web accessibility icon.

The Americans with Disabilities Act (ADA), which prohibits discrimination based on disability, and other laws governing accessibility in the United States were written before the Internet became an integral part of our lives. Today, the Justice Department and courts across the country decide on digital accessibility lawsuits on a case-by-case basis, relying on WCAG as a technical standard. But because these guidelines haven’t been codified, for many businesses it’s hard to know with certainty which standards are applicable to them, whether their websites meet legal requirements, and what specific steps they should take to comply with the laws.  

The Justice Department’s 2022 guidance somewhat addresses this ambiguity by reaffirming that web accessibility is a requirement under Title III of the ADA, which requires any business “open to the public” to make its online content and services accessible to people who rely on assistive technologies, such as screen readers, to browse the Internet.

With the current laws, businesses can choose how to ensure their content is accessible to people with disabilities. The DOJ guidance points to the WCAG and the Section 508 Standards (which the US federal government uses for its own websites), but it doesn’t provide a new legal standard. For example, it’s not clear whether businesses with online-only stores have to adhere to the same legal standard as those with both physical locations and e-commerce sites. 

With so much left to interpretation, including how many and which WCAG criteria a website needs to conform with in order to be considered ADA compliant, it’s hard for businesses to know where they stand when it comes to digital accessibility compliance. 

Further complicating matters is the complex and ever-changing nature of the Internet.

2. The dynamic nature of the Internet 

Illustration of several web page examples floating against a purple and teal background.

Whether it’s personalization based on user actions and preferences or new content creation, websites are constantly changing, posing an ongoing challenge to keep them accessible. Every change, no matter how small (like adding a new product description or an image), can potentially make content inaccessible to users with disabilities.

In a recent analysis of 3,500 websites across 22 industries, including healthcare, e-commerce, and employment, AudioEye, a web accessibility platform, found that 79% of the websites had at least three severe accessibility errors that could potentially block an assistive technology user from interacting with the content and/or completing the goal of a site visit, such as submitting a form or requesting information. 

The same analysis found that 83% of e-commerce sites, 78% of healthcare sites, and 77% of jobs and career sites had accessibility errors that blocked or significantly impacted users’ ability to complete key tasks, such as viewing product descriptions, making a purchase, filling out an application, or booking an appointment.

Considering the dynamic nature of the Internet and the speed of content creation (more than 250,000 sites are launched every day), it’s clear we need a web accessibility solution that can monitor for accessibility errors in real-time and help fix issues as they come up. 

And while automation can provide rapid improvement at scale, it cannot solve all errors. 

3. Current limits of technology

Illustration of the web accessibility icon in a pink circle with a crack through it, centered among web page examples.

Even the best accessibility automation today, which can detect up to 70% of common accessibility errors and resolve two-thirds of them, cannot solve complex accessibility issues that require human judgment. Detecting more subtle errors often requires an understanding of context that is beyond even the most sophisticated AI today. For example, automation can detect that an image lacks a written description, or alt text, but it cannot tell whether an existing description is meaningful or accurate. Even with human judgment, if you ask two people to describe an image, their descriptions may be similar, but it is unlikely they would be exactly the same. Determining which description is the better one is also subjective, and AI is not yet able to make those types of judgments.
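The alt-text case illustrates the split well: detecting a *missing* description is mechanical, while judging an existing one takes human (or near-human) understanding. A rough sketch of the mechanical half, using only Python's standard-library HTML parser (the HTML snippet is an invented example):

```python
from html.parser import HTMLParser

class MissingAltAuditor(HTMLParser):
    """Flags <img> tags with no alt attribute, and empty alt values separately.

    An empty alt ("") is valid for purely decorative images, so it is only
    a warning; a missing alt attribute is always an error for informative images.
    """
    def __init__(self):
        super().__init__()
        self.missing, self.empty = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "?")
        if "alt" not in attrs:
            self.missing.append(src)
        elif not attrs["alt"].strip():
            self.empty.append(src)

auditor = MissingAltAuditor()
auditor.feed(
    '<img src="chart.png">'
    '<img src="spacer.gif" alt="">'
    '<img src="logo.png" alt="Acme logo">'
)
print(auditor.missing)  # ['chart.png']
print(auditor.empty)    # ['spacer.gif']
```

What no parser can do is decide whether "Acme logo" is an accurate, useful description of the image it accompanies — which is precisely the gap the article describes.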

AudioEye’s analysis of 20,000 websites across industries showed that even the sites that were using some type of automated digital accessibility solution (about 6% of all sites in the analysis) still had accessibility errors with significant impact on the user experience.

In another analysis, this time a manual audit of 55 randomly selected websites that used manual testing and remediation services (the traditional approach), AudioEye found over 950 accessibility issues. More than 40 of these sites had one or more severe accessibility issues, such as non-functional site navigation, unlabeled graphics, inaccessible video controls, and other issues that made digital content and tools inaccessible to people with disabilities.

Looking specifically at their own customers’ websites, AudioEye found that the majority of accessibility issues (up to 95%) can be fixed and prevented using a mix of automated and manual remediations, leveraging JavaScript, without the need to modify the original source code.

What will it take to solve digital accessibility at scale?

Accessibility solutions today range from simple automation-only tools to labor-intensive manual audits. AudioEye’s research, which included both automated and manual analysis of websites across industries, showed that the most effective way to solve web accessibility at scale is through a combination of technology and human expertise. 

To learn more about the state of digital accessibility and the role of technology in solving accessibility at scale, download AudioEye’s white paper on Building for Digital Accessibility at Scale, which includes research details.




YouTube Ad Specs, Sizes, and Examples [2024 Update]


Introduction

With billions of users each month, YouTube is the world’s second largest search engine and top website for video content. This makes it a great place for advertising. To succeed, advertisers need to follow the correct YouTube ad specifications. These rules help your ad reach more viewers, increasing the chance of gaining new customers and boosting brand awareness.

Types of YouTube Ads

Video Ads

  • Description: These play before, during, or after a YouTube video on computers or mobile devices.
  • Types:
    • In-stream ads: Can be skippable or non-skippable.
    • Bumper ads: Non-skippable, short ads that play before, during, or after a video.

Display Ads

  • Description: These appear in different spots on YouTube and usually use text or static images.
  • Note: YouTube does not support display image ads directly on its app, but these can be targeted to YouTube.com through Google Display Network (GDN).

Companion Banners

  • Description: Appears to the right of the YouTube player on desktop.
  • Requirement: Must be purchased alongside In-stream ads, Bumper ads, or In-feed ads.

In-feed Ads

  • Description: Resemble videos with images, headlines, and text. They link to a public or unlisted YouTube video.

Outstream Ads

  • Description: Mobile-only video ads that play outside of YouTube, on websites and apps within the Google video partner network.

Masthead Ads

  • Description: Premium, high-visibility banner ads displayed at the top of the YouTube homepage for both desktop and mobile users.

YouTube Ad Specs by Type

Skippable In-stream Video Ads

  • Placement: Before, during, or after a YouTube video.
  • Resolution:
    • Horizontal: 1920 x 1080px
    • Vertical: 1080 x 1920px
    • Square: 1080 x 1080px
  • Aspect Ratio:
    • Horizontal: 16:9
    • Vertical: 9:16
    • Square: 1:1
  • Length:
    • Awareness: 15-20 seconds
    • Consideration: 2-3 minutes
    • Action: 15-20 seconds

Non-skippable In-stream Video Ads

  • Description: Must be watched completely before the main video.
  • Length: 15 seconds (or 20 seconds in certain markets).
  • Resolution:
    • Horizontal: 1920 x 1080px
    • Vertical: 1080 x 1920px
    • Square: 1080 x 1080px
  • Aspect Ratio:
    • Horizontal: 16:9
    • Vertical: 9:16
    • Square: 1:1

Bumper Ads

  • Length: Maximum 6 seconds.
  • File Format: MP4, Quicktime, AVI, ASF, Windows Media, or MPEG.
  • Resolution:
    • Horizontal: 640 x 360px
    • Vertical: 480 x 360px

In-feed Ads

  • Description: Show alongside YouTube content, like search results or the Home feed.
  • Resolution:
    • Horizontal: 1920 x 1080px
    • Vertical: 1080 x 1920px
    • Square: 1080 x 1080px
  • Aspect Ratio:
    • Horizontal: 16:9
    • Square: 1:1
  • Length:
    • Awareness: 15-20 seconds
    • Consideration: 2-3 minutes
  • Headline/Description:
    • Headline: Up to 2 lines, 40 characters per line
    • Description: Up to 2 lines, 35 characters per line

Display Ads

  • Description: Static images or animated media that appear on YouTube next to video suggestions, in search results, or on the homepage.
  • Image Size: 300×60 pixels.
  • File Type: GIF, JPG, PNG.
  • File Size: Max 150KB.
  • Max Animation Length: 30 seconds.

Outstream Ads

  • Description: Mobile-only video ads that appear on websites and apps within the Google video partner network, not on YouTube itself.
  • Logo Specs:
    • Square: 1:1 (200 x 200px).
    • File Type: JPG, GIF, PNG.
    • Max Size: 200KB.

Masthead Ads

  • Description: High-visibility ads at the top of the YouTube homepage.
  • Resolution: 1920 x 1080 or higher.
  • File Type: JPG or PNG (without transparency).
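Teams that automate creative QA often encode spec tables like the ones above as data and validate assets before upload. A small illustrative Python sketch (the limits mirror the display ad specs listed above; the functions are hypothetical helpers, not a Google API):

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a pixel resolution to its aspect ratio, e.g. 1920x1080 -> '16:9'."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

def check_display_ad(width: int, height: int, size_kb: int, file_type: str) -> list:
    """Validate a creative against the display ad specs above (300x60, max 150KB)."""
    errors = []
    if (width, height) != (300, 60):
        errors.append(f"expected 300x60, got {width}x{height}")
    if size_kb > 150:
        errors.append(f"file is {size_kb}KB, max is 150KB")
    if file_type.upper() not in ("GIF", "JPG", "PNG"):
        errors.append(f"unsupported file type: {file_type}")
    return errors

print(aspect_ratio(1920, 1080))            # 16:9
print(check_display_ad(300, 60, 120, "png"))  # [] (passes)
```

The same pattern extends naturally to the video formats: derive the aspect ratio from the resolution and compare it against the allowed values for that ad type.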

Conclusion

YouTube offers a variety of ad formats to reach audiences effectively in 2024. Whether you want to build brand awareness, drive conversions, or target specific demographics, YouTube provides a dynamic platform for your advertising needs. Always follow Google’s advertising policies and the technical ad specs to ensure your ads perform their best. Ready to start using YouTube ads? Contact us today to get started!




A deeper dive into data, personalization and Copilots


Salesforce launched a collection of new, generative AI-related products at Connections in Chicago this week. They included new Einstein Copilots for marketers and merchants and Einstein Personalization.

To better understand not only the potential impact of the new products but also the evolving Salesforce architecture, we sat down with Bobby Jania, CMO of Marketing Cloud.

Dig deeper: Salesforce piles on the Einstein Copilots

Salesforce’s evolving architecture

It’s hard to deny that Salesforce likes coming up with new names for platforms and products (what happened to Customer 360?) and this can sometimes make the observer wonder if something is brand new, or old but with a brand new name. In particular, what exactly is Einstein 1 and how is it related to Salesforce Data Cloud?

“Data Cloud is built on the Einstein 1 platform,” Jania explained. “The Einstein 1 platform is our entire Salesforce platform and that includes products like Sales Cloud, Service Cloud — that it includes the original idea of Salesforce not just being in the cloud, but being multi-tenancy.”

Data Cloud — not an acquisition, of course — was built natively on that platform. It was the first product built on Hyperforce, Salesforce’s new cloud infrastructure architecture. “Since Data Cloud was on what we now call the Einstein 1 platform from Day One, it has always natively connected to, and been able to read anything in Sales Cloud, Service Cloud [and so on]. On top of that, we can now bring in, not only structured but unstructured data.”

That’s a significant progression from the position, several years ago, when Salesforce had stitched together a platform around various acquisitions (ExactTarget, for example) that didn’t necessarily talk to each other.

“At times, what we would do is have a kind of behind-the-scenes flow where data from one product could be moved into another product,” said Jania, “but in many of those cases the data would then be in both, whereas now the data is in Data Cloud. Tableau will run natively off Data Cloud; Commerce Cloud, Service Cloud, Marketing Cloud — they’re all going to the same operational customer profile.” They’re not copying the data from Data Cloud, Jania confirmed.

Another thing to know is that it’s possible for Salesforce customers to import their own datasets into Data Cloud. “We wanted to create a federated data model,” said Jania. “If you’re using Snowflake, for example, we more or less virtually sit on your data lake. The value we add is that we will look at all your data and help you form these operational customer profiles.”

Let’s learn more about Einstein Copilot

“Copilot means that I have an assistant with me in the tool where I need to be working that contextually knows what I am trying to do and helps me at every step of the process,” Jania said.

For marketers, this might begin with a campaign brief developed with Copilot’s assistance, the identification of an audience based on the brief, and then the development of email or other content. “What’s really cool is the idea of Einstein Studio where our customers will create actions [for Copilot] that we hadn’t even thought about.”

Here’s a key insight (back to nomenclature). We reported on Copilot for marketers, Copilot for merchants, and Copilot for shoppers. It turns out, however, that there is just one Copilot, Einstein Copilot, and these are use cases. “There’s just one Copilot, we just add these for a little clarity; we’re going to talk about marketing use cases, about shoppers’ use cases. These are actions for the marketing use cases we built out of the box; you can build your own.”

It’s surely going to take a little time for marketers to learn to work easily with Copilot. “There’s always time for adoption,” Jania agreed. “What is directly connected with this is, this is my ninth Connections and this one has the most hands-on training that I’ve seen since 2014 — and a lot of that is getting people using Data Cloud, using these tools rather than just being given a demo.”

What’s new about Einstein Personalization

Salesforce Einstein has been around since 2016 and many of the use cases seem to have involved personalization in various forms. What’s new?

“Einstein Personalization is a real-time decision engine and it’s going to choose next-best-action, next-best-offer. What is new is that it’s a service now that runs natively on top of Data Cloud.” A lot of real-time decision engines need their own set of data that might actually be a subset of data. “Einstein Personalization is going to look holistically at a customer and recommend a next-best-action that could be natively surfaced in Service Cloud, Sales Cloud or Marketing Cloud.”

Finally, trust

One feature of the presentations at Connections was the reassurance that, although public LLMs like ChatGPT could be selected for application to customer data, none of that data would be retained by the LLMs. Is this just a matter of written agreements? No, not just that, said Jania.

“In the Einstein Trust Layer, all of the data, when it connects to an LLM, runs through our gateway. If there was a prompt that had personally identifiable information — a credit card number, an email address — at a minimum, all that is stripped out. The LLMs do not store the output; we store the output for auditing back in Salesforce. Any output that comes back through our gateway is logged in our system; it runs through a toxicity model; and only at the end do we put PII data back into the answer. There are real pieces beyond a handshake that this data is safe.”
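Salesforce has not published the Trust Layer's internals, but the kind of pre-prompt redaction Jania describes can be sketched with simple pattern matching. A hedged, simplified Python illustration (two toy regexes; a production system would use far more robust PII detection and, per Jania, reinsert the real values into the final answer):

```python
import re

# Illustrative patterns only: real PII detection covers many more formats.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
CARD = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")

def strip_pii(prompt: str) -> str:
    """Replace obvious PII with placeholder tokens before the prompt leaves the gateway."""
    prompt = EMAIL.sub("[EMAIL]", prompt)
    prompt = CARD.sub("[CARD]", prompt)
    return prompt

print(strip_pii("Refund jane.doe@example.com, card 4111 1111 1111 1111."))
# Refund [EMAIL], card [CARD].
```

The placeholders preserve the prompt's structure so the LLM can still reason about "a customer email" without ever seeing the address itself.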

