How to Use Chrome to View a Website as Googlebot

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Introduction to Googlebot spoofing

In this article, I’ll describe how and why to use Google Chrome (or Chrome Canary) to view a website as Googlebot.

We’ll set up a web browser specifically for Googlebot browsing. Using a user-agent browser extension is often close enough for SEO audits, but extra steps are needed to get as close as possible to emulating Googlebot.

Skip to “How to set up your Googlebot browser”.

Why should I view a website as Googlebot?

For many years, we technical SEOs had it easy when auditing websites: HTML and CSS were web design’s cornerstone languages, and JavaScript was generally used for embellishments (such as small animations on a webpage).

Increasingly, though, whole websites are being built with JavaScript.

Originally, web servers sent complete websites (fully rendered HTML) to web browsers. These days, many websites are rendered client-side (in the web browser itself) – whether that’s Chrome, Safari, or whatever browser a search bot uses – meaning the user’s browser and device must do the work to render a webpage.

SEO-wise, some search bots don’t render JavaScript, so they won’t see webpages built with it. And compared to HTML and CSS, JavaScript is very expensive to render: it uses much more of a device’s processing power (draining its battery faster) and much more of Google’s, Bing’s, or any other search engine’s server resources.

Even Googlebot has difficulties with JavaScript and delays rendering it until some time after its initial URL discovery – sometimes for days or weeks, depending on the website. When I see “Discovered – currently not indexed” for several URLs in Google Search Console’s Coverage (or Pages) section, the website is more often than not JavaScript-rendered.

Attempting to get around potential SEO issues, some websites use dynamic rendering, so each page has two versions: a client-side rendered version for people using browsers, and a server-side rendered (static HTML) version for bots such as Googlebot.

Generally, I find that this setup overcomplicates websites and creates more technical SEO issues than a server-side rendered or traditional HTML website. A mini rant here: there are exceptions, but generally, I think client-side rendered websites are a bad idea. Websites should be designed to work on the lowest common denominator of devices, with progressive enhancement (through JavaScript) used to improve the experience for people on devices that can handle the extras. This is something I will investigate further, but my anecdotal evidence suggests client-side rendered websites are generally more difficult to use for people who rely on accessibility devices such as screen readers. There are instances where technical SEO and usability cross over.

Technical SEO is about making websites as easy as possible for search engines to crawl, render, and index (for the most relevant keywords and topics). Like it or lump it, the future of technical SEO, at least for now, includes lots of JavaScript and different webpage renders for bots and users.

Viewing a website as Googlebot means we can see discrepancies between what a person sees and what a search bot sees. What Googlebot sees doesn’t need to be identical to what a person using a browser sees, but main navigation and the content you want the page to rank for should be the same.

That’s where this article comes in. For a proper technical SEO audit, we need to see what the most common search engine sees. In most English-speaking countries, at least, that’s Google.

Why use Chrome (or Chrome Canary) to view websites as Googlebot?

Can we see exactly what Googlebot sees?

No.

Googlebot itself uses a (headless) version of the Chrome browser to render webpages. Even with the settings suggested in this article, we can never be exactly sure of what Googlebot sees. For example, no browser setting can replicate how or when Googlebot processes a JavaScript website. Sometimes JavaScript breaks, so Googlebot might see something different from what was intended.

The aim is to emulate Googlebot’s mobile-first indexing as closely as possible.

When auditing, I use my Googlebot browser alongside Screaming Frog SEO Spider’s Googlebot spoofing and rendering, and Google’s own tools such as URL Inspection in Search Console (which can be automated using SEO Spider), and the render screenshot and code from the Mobile Friendly Test.

Even Google’s own publicly available tools aren’t 100% accurate in showing what Googlebot sees. But along with the Googlebot browser and SEO Spider, they can point towards issues and help with troubleshooting.

Why use a separate browser to view websites as Googlebot?

1. Convenience

Having a dedicated browser saves time. Without relying on or waiting for other tools, I get an idea of how Googlebot sees a website in seconds.

While auditing a website that served different content to browsers and Googlebot, and where issues included inconsistent server responses, I needed to switch between the default browser user-agent and Googlebot more often than usual. But constant user-agent switching using a Chrome browser extension was inefficient.

Some Googlebot-specific Chrome settings don’t persist between browser tabs or sessions, and some affect every open tab. For example, disabling JavaScript may stop websites in background tabs that rely on it from working (such as task management, social media, or email applications).

Short of having a developer build a headless Chrome solution, the “Googlebot browser” setup is the easiest way to spoof Googlebot.
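For anyone curious what a scripted route looks like, here is a minimal sketch using Python and Playwright (a tooling choice assumed for this example, not something the setup below requires). It launches headless Chromium with a spoofed Googlebot Smartphone user-agent and prints the rendered HTML; the URL and the Chrome version inside the user-agent string are placeholders.

from playwright.sync_api import sync_playwright

# Placeholder user-agent: copy the current Googlebot Smartphone string from
# Chrome DevTools (Network conditions) rather than relying on this example.
GOOGLEBOT_SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    context = browser.new_context(user_agent=GOOGLEBOT_SMARTPHONE_UA)
    page = context.new_page()
    page.goto("https://www.example.com/", wait_until="networkidle")  # placeholder URL
    print(page.content())  # HTML after client-side rendering
    browser.close()

The rest of the setup described below needs no code at all.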

2. Improved accuracy

Browser extensions can impact how websites look and perform. This approach keeps the number of extensions in the Googlebot browser to a minimum.

3. Forgetfulness

It’s easy to forget to switch Googlebot spoofing off between browsing sessions, which can lead to websites not working as expected. I’ve even been blocked from websites for spoofing Googlebot, and had to email them with my IP to remove the block.

For which SEO audits is a Googlebot browser useful?

The most common use-case for SEO audits is likely websites using client-side rendering or dynamic rendering. You can easily compare what Googlebot sees to what a general website visitor sees.

Even with websites that don’t use dynamic rendering, you never know what you might find by spoofing Googlebot. After over eight years auditing e-commerce websites, I’m still surprised by issues I haven’t come across before.

Example Googlebot comparisons for technical SEO and content audits:

  • Is the main navigation different?

  • Is Googlebot seeing the content you want indexed?

  • If a website relies on JavaScript rendering, will new content be indexed promptly, or so late that its impact is reduced (e.g. for forthcoming events or new product listings)?

  • Do URLs return different server responses? For example, incorrect URLs can return 200 OK for Googlebot but 404 Not Found for general website visitors.

  • Is the page layout different to what the general website visitor sees? For example, I often see links as blue text on a black background when spoofing Googlebot. While machines can read such text, we want to present something that looks user-friendly to Googlebot. If it can’t render your client-side website, how will it know? (Note: a website might display as expected in Google’s cache, but that isn’t the same as what Googlebot sees.)

  • Do websites redirect based on location? Googlebot mostly crawls from US-based IPs.

It depends on how in-depth you want to go, but Chrome itself has many useful features for technical SEO audits. I sometimes compare its Console and Network tab data for a general visitor vs. a Googlebot visit (e.g. Googlebot might be blocked from files that are essential for page layout or required to display certain content).
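As a quick illustration of the server-response comparison above, you can request the same URL with a normal browser user-agent and with a Googlebot user-agent, then compare status codes. A rough sketch with Python’s requests library (assumed tooling; the URL and user-agent strings are placeholders), bearing in mind that some servers vary responses on more than the user-agent, such as IP address or cookies:

import requests

URL = "https://www.example.com/some-page"  # placeholder

USER_AGENTS = {
    "Browser": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/102.0.5005.115 Safari/537.36"
    ),
    "Googlebot": (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 "
        "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    ),
}

for label, user_agent in USER_AGENTS.items():
    # allow_redirects=False shows the first response (e.g. a 301) rather than the final page
    response = requests.get(URL, headers={"User-Agent": user_agent}, allow_redirects=False)
    print(f"{label}: {response.status_code}")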

How to set up your Googlebot browser

Once set up (which takes about half an hour), the Googlebot browser solution makes it easy to quickly view webpages as Googlebot.

Step 1: Download and install Chrome or Canary

If Chrome isn’t your default browser, use it as your Googlebot browser.

If Chrome is your default browser, download and install Chrome Canary. Canary is a development version of Chrome where Google tests new features, and it can be installed and run separately to Chrome’s default version.

Named after the yellow canaries once used to detect poisonous gases in mines, Canary has a yellow icon that makes it easy to spot in the Windows taskbar:

Screenshot of the yellow Chrome Canary icon in a Windows 10 taskbar

As Canary is a development version of Chrome, Google warns that it “can be unstable.” But I’ve yet to have issues using it as my Googlebot browser.

Step 2: Install browser extensions

I installed five browser extensions and a bookmarklet on my Googlebot browser. I’ll list the extensions, then advise on settings and why I use them.

For emulating Googlebot (the links are the same whether you use Chrome or Canary):

  • User-Agent Switcher

  • Web Developer

  • Windscribe (or another VPN)

Not required to emulate Googlebot, but my other favorites for technical SEO auditing of JavaScript websites:

  • Link Redirect Trace

  • View Rendered Source

  • NoJS Side-by-Side (bookmarklet)

User-Agent Switcher extension

User-Agent Switcher does what it says on the tin: switches the browser’s user-agent. Chrome and Canary have a user-agent setting, but it only applies to the tab you’re using and resets if you close the browser.

I take the Googlebot user-agent string from Chrome’s browser settings, which at the time of writing will be the latest version of Chrome (note that below, I’m taking the user-agent from Chrome and not Canary).

To get the user-agent, access Chrome DevTools (by pressing F12 or using the hamburger menu to the top-right of the browser window, then navigating to More tools > Developer tools). See the screenshot below or follow these steps:

  1. Go to the Network tab

  2. From the top-right hamburger menu: More tools > Network conditions

  3. Click the Network conditions tab that appears lower down the window

  4. Untick “Use browser default”

  5. Select “Googlebot Smartphone” from the list, then copy and paste the user-agent from the field below the list into the User-Agent Switcher extension list (another screenshot below). Don’t forget to switch Chrome back to its default user-agent if it’s your main browser.
    • At this stage, if you’re using Chrome (and not Canary) as your Googlebot browser, you may as well tick “Disable cache” (more on that later).

Screenshot of DevTools showing the steps described above
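For reference, the Googlebot Smartphone string you copy will look something like the line below, where W.X.Y.Z stands for the current Chrome version; always take the up-to-date value from your own DevTools list rather than from this example.

Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)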

To access User-Agent Switcher’s list, right-click its icon in the browser toolbar and click Options (see screenshot below). “Indicator Flag” is text that appears in the browser toolbar to show which user-agent has been selected — I chose GS to mean “Googlebot Smartphone”:

Screenshot showing User-Agent Switcher options described in the paragraph above

I added Googlebot Desktop and the bingbots to my list, too.

Why spoof Googlebot’s user agent?

Web servers detect what is browsing a website from a user-agent string. For example, the user-agent for a Windows 10 device using the Chrome browser at the time of writing is:

Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.115 Safari/537.36

If you’re interested in why other browsers seem to be named in the Chrome user-agent string, read History of the user-agent string.

Web Developer extension

Web Developer is a must-have browser extension for technical SEOs. In my Googlebot browser, I switch between disabling and enabling JavaScript to see what Googlebot might see with and without JavaScript.

Why disable JavaScript?

Short answer: Googlebot doesn’t execute all of a page’s JavaScript (and sometimes none of it) when it first crawls a URL. We want to see the webpage before any JavaScript is executed.

Long answer: that would be a whole other article.
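If you would rather script that before-JavaScript view than toggle the extension, headless Chromium can be opened with JavaScript switched off. A minimal sketch in Python with Playwright (again an assumed tooling choice, with a placeholder URL):

from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    context = browser.new_context(java_script_enabled=False)  # no client-side rendering
    page = context.new_page()
    page.goto("https://www.example.com/")  # placeholder URL
    print(page.content())  # the page before any JavaScript runs
    browser.close()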

Windscribe (or another VPN)

Windscribe (or your choice of VPN) is used to spoof Googlebot’s US location. I use a pro Windscribe account, but the free account allows up to 2GB data transfer a month and includes US locations.

I don’t think the specific US location matters, but I pretend Gotham is a real place (in a time when Batman and co. have eliminated all villains):

Windscribe browser extension showing location set to New York: Gotham, with a background of the United States of America flag behind a blue overlay

Ensure settings that may impact how webpages display are disabled — Windscribe’s extension blocks ads by default. The two icons to the top-right should show a zero.

For the Googlebot browser scenario, I prefer a VPN browser extension to an application, because the extension is specific to my Googlebot browser.

Why spoof Googlebot’s location?

Googlebot mostly crawls websites from US IPs, and there are many reasons for spoofing Googlebot’s primary location.

Some websites block or show different content based on geolocation. If a website blocks US IPs, for example, Googlebot may never see the website and therefore cannot index it.

Another example: some websites redirect to different websites or URLs based on location. If a company had a website for customers in Asia and a website for customers in America, and redirected all US IPs to the US website, Googlebot would never see the Asian version of the website.
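A quick way to check for location-based redirects is to follow a URL’s redirect chain and note where you end up. To genuinely test from a US IP you would still need the VPN above, or a US proxy passed to the request. A rough sketch with Python’s requests; the URL, user-agent, and proxy address are all placeholders:

import requests

# Run this with and without the US VPN (or uncomment the proxy) and compare destinations.
response = requests.get(
    "https://www.example-asia-site.com/",  # placeholder URL
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    allow_redirects=True,
    # proxies={"https": "http://user:pass@us-proxy.example.com:8080"},  # placeholder US proxy
)

for hop in response.history:  # each redirect on the way
    print(hop.status_code, hop.url)
print("Final:", response.status_code, response.url)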

Other Chrome extensions useful for auditing JavaScript websites

With Link Redirect Trace, I see at a glance what server response a URL returns.

The View Rendered Source extension enables easy comparison of raw HTML (what the web server delivers to the browser) and rendered HTML (the code rendered on the client-side browser).
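That raw-versus-rendered comparison can also be scripted by fetching a page once as the server delivers it and once after rendering, then diffing the two. A hedged sketch combining requests, Playwright, and Python’s difflib (all assumed tooling choices, with a placeholder URL):

import difflib

import requests
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/"  # placeholder

raw_html = requests.get(URL, headers={"User-Agent": "Mozilla/5.0"}).text  # as served

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()  # after client-side rendering
    browser.close()

diff = difflib.unified_diff(
    raw_html.splitlines(), rendered_html.splitlines(),
    fromfile="raw", tofile="rendered", lineterm="",
)
print("\n".join(list(diff)[:40]))  # first 40 changed lines as a quick check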

I also added the NoJS Side-by-Side bookmarklet to my Googlebot browser. It compares a webpage with and without JavaScript enabled, within the same browser window.

Step 3: Configure browser settings to emulate Googlebot

Next, we’ll configure the Googlebot browser settings in line with what Googlebot doesn’t support when crawling a website.

What doesn’t Googlebot crawling support?

  • Service workers (because people clicking to a page from search results may never have visited before, so it doesn’t make sense to cache data for later visits).

  • Permission requests (e.g. push notifications, webcam, geolocation). If content relies on any of these, Googlebot will not see that content.

  • Googlebot is stateless so doesn’t support cookies, session storage, local storage, or IndexedDB. Data can be stored in these mechanisms but will be cleared before Googlebot crawls the next URL on a website.

These bullet points are summarized from an interview by Eric Enge with Google’s Martin Splitt.
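If you want to approximate those constraints in a script rather than through browser settings, a fresh browser context per URL carries no cookies or storage between pages, and Playwright can block service workers. A rough sketch, where the URLs are placeholders and the result only approximates Googlebot’s behavior:

from playwright.sync_api import sync_playwright

URLS = ["https://www.example.com/", "https://www.example.com/category/"]  # placeholders

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    for url in URLS:
        # A brand-new context per URL: no cookies, localStorage, or IndexedDB
        # survive from the previous page, and service workers are blocked.
        context = browser.new_context(service_workers="block")
        page = context.new_page()
        page.goto(url)
        print(url, len(page.content()))
        context.close()  # discard all state before the next URL
    browser.close()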

Step 3a: DevTools settings

To open Developer Tools in Chrome or Canary, press F12, or using the hamburger menu to the top-right, navigate to More tools > Developer tools:

Screenshot showing the steps described above to access DevTools

The Developer Tools window is generally docked within the browser window, but I sometimes prefer it in a separate window. For that, change the “Dock side” in the second hamburger menu:

Screenshot showing the 'Dock side' of DevTools
Disable cache

If using normal Chrome as your Googlebot browser, you may have done this already.

Otherwise, via the DevTools hamburger menu, click to More tools > Network conditions and tick the “Disable cache” option:

DevTools screenshot showing the actions described above to disable cache
Block service workers

To block service workers, go to the Application tab > Service Workers > tick “Bypass for network”:

Screenshot showing the steps described above to disable service workers

Step 3b: General browser settings

In your Googlebot browser, navigate to Settings > Privacy and security > Cookies (or visit chrome://settings/cookies directly) and choose the “Block all cookies (not recommended)” option (isn’t it fun to do something “not recommended”?):

Screenshot showing how to block cookies in Chrome settings

Also in the “Privacy and security” section, choose “Site settings” (or visit chrome://settings/content) and individually block Location, Camera, Microphone, Notifications, and Background sync (and likely anything that appears there in future versions of Chrome):

Screenshot of Chrome's privacy settings

Step 4: Emulate a mobile device

Finally, as our aim is to emulate Googlebot’s mobile-first crawling, emulate a mobile device within your Googlebot browser.

Towards the top-left of DevTools, click the device toolbar toggle, then choose a device to emulate in the browser (you can add other devices too):

Screenshot showing mobile device emulation in Chrome

Whichever device you choose, bear in mind that Googlebot doesn’t scroll on webpages; instead, it renders using a window with a long vertical height.

I recommend testing websites in desktop view, too, and on actual mobile devices if you have access to them.
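For completeness, the scripted equivalent of the device toolbar is a built-in device descriptor plus an unusually tall viewport, which loosely mimics Googlebot rendering a long page without scrolling. A sketch with Playwright, where the device name, viewport height, and URL are assumptions:

from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    device = dict(p.devices["Pixel 5"])                    # built-in mobile descriptor
    device["viewport"] = {"width": 412, "height": 12000}   # assumed tall, "no-scroll" viewport
    browser = p.chromium.launch(headless=True)
    context = browser.new_context(**device)
    page = context.new_page()
    page.goto("https://www.example.com/")  # placeholder URL
    page.screenshot(path="mobile-render.png", full_page=True)
    browser.close()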

How about viewing a website as bingbot?

To create a bingbot browser, use a recent version of Microsoft Edge with the bingbot user agent.

Bingbot is similar to Googlebot in terms of what it does and doesn’t support.

Yahoo! Search, DuckDuckGo, Ecosia, and other search engines are either powered by or based on Bing search, so Bing is responsible for a higher percentage of search than many people realize.

Summary and closing notes

So, there you have your very own Googlebot emulator.

Using an existing browser to emulate Googlebot is the easiest method to quickly view webpages as Googlebot. It’s also free, assuming you already use a desktop device that can install Chrome and/or Canary.

Other tools exist to help “see” what Google sees. I enjoy testing Google’s Vision API (for images) and its Natural Language API.

Auditing JavaScript websites — especially when they’re dynamically rendered — can be complex, and a Googlebot browser is one way of making the process simpler. If you’d like to learn more about auditing JavaScript websites and the differences between standard HTML and JavaScript-rendered websites, I recommend looking up articles and presentations from Jamie Indigo, Joe Hall, and Jess Peck. Two of them contribute to a video introduction to JavaScript SEO (linked from the original article) that touches on points I mentioned above.

Questions? Something I missed? Tweet me @AlexHarfordSEO. Thanks for reading!



How To Combine PR and Content Marketing Superpowers To Achieve Business Goals

A figure pulls open a dress shirt to reveal the term PR on a Superman-like costume, reflecting the superpower resulting from combining content and PR.

A transformative shift is happening, and it’s not AI.

The aisle between public relations and content marketing is rapidly narrowing. If you’re smart about the convergence, you can forever enhance your brand’s storytelling.

The goals and roles of content marketing and PR overlap more and more. The job descriptions look awfully similar. Shrinking budgets and a shrewd eye for efficiency mean you and your PR pals could face the chopping block if you don’t streamline operations and deliver on the company’s goals (because marketing communications is always first to be axed, right?).

Yikes. Let’s take a big, deep breath. This is not a threat. It’s an opportunity.

Reach across the aisle to PR and streamline content creation, improve distribution strategies, and get back to the heart of what you both are meant to do: Build strong relationships and tell impactful stories.

So, before you panic-post that open-to-work banner on LinkedIn, consider these tips from content marketing, PR, and journalism pros who’ve figured out how to thrive in an increasingly narrowing content ecosystem.

1. See journalists as your audience

Savvy pros know the ability to tell an impactful story — and support it with publish-ready collateral — grounds successful media relationships. And as a content marketer, your skills in storytelling and connecting with audiences, including journalists, naturally support your PR pals’ media outreach.

Strategic storytelling creates content focused on what the audience needs and wants. Sharing content on your blog or social media builds relationships with journalists who source those channels for story ideas, event updates, and subject matter experts.

“Embedding PR strategies in your content marketing pieces informs your audience and can easily be picked up by media,” says Alex Sanchez, chief experience officer at BeWell, New Mexico’s Health Insurance Marketplace. “We have seen reporters do this many times, pulling stories from our blogs and putting them in the nightly news — most of the time without even reaching out to us.”

Acacia James, weekend producer/morning associate producer at WTOP radio in Washington, D.C., says blogs and social media posts are helpful to her work. “If I see a story idea, and I see that they’re willing to share information, it’s easier to contact them — and we can also backlink their content. It’s huge for us to be able to use every avenue.” 

Kirby Winn, manager of PR at ImpactLife, says reporters and assignment editors are key consumers of their content. “And I don’t mean a news release that just hit their inbox. They’re going to our blog and consuming our stories, just like any other audience member,” he says. “Our organization has put more focus into content marketing in the past few years — it supports a media pitch so well and highlights the stories we have to tell.”

Storytelling attracts earned media that might not pick up the generic news topic. “It’s one thing to pitch a general story about how we help consumers sign up for low-cost health insurance,” Alex says. “Now, imagine a single mom who just got a plan after years of thinking it was too expensive. She had a terrible car accident, and the $60,000 ER bill that would have ruined her financially was covered. Now that’s a story journalists will want to cover, and that will be relatable to their audience and ours.” 

2. Learn the media outlet’s audience

Seventy-three percent of reporters say one-fourth or less of the stories pitched are relevant to their audiences, according to Cision’s 2023 State of the Media Report (registration required).

PR pros are known for building relationships with journalists, while content marketers thrive in building communities around content. Merge these best practices to build desirable content that works for your target audience and the media’s audiences simultaneously.

WTOP’s Acacia James says sources who show they’re ready to share helpful, relevant content often win pitches for coverage. “In radio, we do a lot of research on who is listening to us, and we’re focused on a prototype called ‘Mike and Jen’ — normal, everyday people in Generation X … So when we get press releases and pitches, we ask, ‘How interested will Mike and Jen be in this story?’” 

3. Deliver the full content package (and make journalists’ jobs easier)

Cranking out content to their media outlet’s standards has never been tougher for journalists. Newsrooms are significantly understaffed, and anything you can do to make their lives easier will be appreciated and potentially rewarded with coverage. Content marketers are built to think about all the elements to tell the story through multiple mediums and channels.

“Today’s content marketing pretty much provides a package to the media outlet,” says So Young Pak, director of media relations at MedStar Washington Hospital Center. “PR is doing a lot of storytelling work in advance of media publication. We (and content marketing) work together to provide the elements to go with each story — photos, subject matter experts, patients, videos, and data points, if needed.”   

At WTOP, the successful content package includes audio. “As a radio station, we are focused on high-quality sound,” Acacia James says. “Savvy sources know to record and send us voice memos, and then we pull cuts from the audio … You will naturally want to do someone a favor if they did you one — like providing helpful soundbites, audio, and newsworthy stories.”  

While production value matters to some media, you shouldn’t stress about it. “In the past decade, how we work with reporters has changed. Back in the day, if they couldn’t be there in person, they weren’t going to interview your expert,” says Jason Carlton, an accredited PR professional and manager of marketing and communications at Intermountain Health. “During COVID, we had to switch to virtual interviewing. Now, many journalists are OK with running a Teams or Zoom interview they’ve done with an expert on the news.”

BeWell’s Alex Sanchez agrees. “I’ve heard old school PR folks cringe at the idea of putting up a Zoom video instead of getting traditional video interviews. It doesn’t really matter to consumers. Focus on the story, on the timeliness, and the relevance. Consumers want authenticity, not super stylized, stiff content.”

4. Unite great minds to maximize efficiency

Everyone needs to set aside the debate about which team — PR or content marketing — gets credit for the resulting media coverage.

At MedStar Washington Hospital Center, So Young and colleagues adopt a collaborative mindset on multichannel stories. “We can get the interview and gather information for all the different pieces — blog, audio, video, press release, internal newsletter, or magazine. That way, we’re not trying to figure things out individually, and the subject matter experts only have to have that conversation once,” she says.

Regular, cross-team meetings are essential to understand the best channels for reaching key audiences, including the media. A story that began life as a press release might reap SEO and earned media gold if it’s strategized as a blog, video, and media pitch.

“At Intermountain Health, we have individual teams for media relations, marketing, social media, and hospital communications. That setup works well because it allows us to bring in the people who are the given experts in those areas,” says Intermountain’s Jason Carlton. “Together, we decide if a story is best for the blog, a media pitch, or a mix of channels — that way, we avoid duplicating work and the risk of diluting the story’s impact.”

5. Measure what matters

Cutting through the noise to earn media mentions requires keen attention to metrics. Since content marketing and PR metrics overlap, synthesizing the data in your team meetings can save time while streamlining your storytelling efforts.

“For content marketers, using analytical tools such as GA4 can help measure the effectiveness of their content campaigns and landing pages to determine meaningful KPIs such as organic traffic, keyword rankings, lead generation, and conversion rates,” says John Martino, director of digital marketing for Visiting Angels. “PR teams can use media coverage and social interactions to assess user engagement and brand awareness. A unified and omnichannel approach can help both teams demonstrate their value in enhancing brand visibility, engagement, and overall business success.”

To track your shared goals, launch a shared dashboard that helps tell the combined “story of your stories” to internal and executive teams. Among the metrics to monitor:

  • Page views: Obviously, this queen of metrics continues to be important across PR and content marketing. Take your analysis to the next level by evaluating which niche audiences are contributing to these views to further hone your storytelling targets, including media outlets.
  • Earned media mentions: Through a media tracker service or good old Google Alerts, you can tally the echo of your content marketing and PR. Look at your site’s referral traffic report to identify media outlets that send traffic to your blog or other web pages.
  • Organic search queries: Dive into your analytics platform to surface organic search queries that lead to visitors. Build from those questions to develop stories that further resonate with your audience and your targeted media.
  • On-page actions: When visitors show up on your content, what are they doing? What do they click? Where do they go next? Building next-step pathways is your bread and butter in content marketing — and PR can use them as a natural pipeline for media to pick up more stories, angles, and quotes.

But perhaps the biggest metric to track is team satisfaction. Who on the collaborative team had the most fun writing blogs, producing videos, or calling the news stations? Lean into the natural skills and passions of your team members to distribute work properly, maximize the team output, and improve relationships with the media, your audience, and internal teams.

“It’s really trying to understand the problem to solve — the needle to move — and determining a plan that will help them achieve their goal,” Jason says. “If you don’t have those measurable objectives, you’re not going to know whether you made a difference.”

Don’t fear the merger

Whether you deliberately work together or not, content marketing and public relations are tied together. ImpactLife’s Kirby Winn explains, “As soon as we begin to talk about (ourselves) to a reporter who doesn’t know us, they are certainly going to check out our stories.”

But consciously uniting PR and content marketing will ease the challenges you both face. Working together allows you to save time, eliminate duplicate work, and gain free time to tell more stories and drive them into impactful media placements.

Register to attend Content Marketing World in San Diego. Use the code BLOG100 to save $100. Can’t attend in person this year? Check out the Digital Pass for access to on-demand session recordings from the live event through the end of the year.

Cover image by Joseph Kalinowski/Content Marketing Institute

Trends in Content Localization – Moz

Multinational fast food chains are one of the best-known examples of recognizing that product menus may sometimes have to change significantly to serve distinct audiences. The video accompanying the original post is a short run-through of the same business selling smokehouse burgers, kofta, paneer, and rice bowls in an effort to appeal to people in a variety of places. I can’t personally judge the validity of these representations, but what I can see is that, in such cases, you don’t merely localize your content but the products on which your content is founded.

Sometimes, even the branding of businesses is different around the world; what we call Burger King in America is Hungry Jack’s in Australia, Lays potato chips here are Sabritas in Mexico, and DiGiorno frozen pizza is familiar in the US, but Canada knows it as Delissio.

Tales of product tailoring failures often become famous, likely because some of them may seem humorous from a distance, but cultural sensitivity should always be taken seriously. If a brand you are marketing is on its way to becoming a large global seller, the best insurance against reputation damage and revenue loss as a result of cultural insensitivity is to employ regional and cultural experts whose first-hand and lived experiences can steward the organization in acting with awareness and respect.


