

Are you still using spreadsheets to manage your work? Take our poll




Earlier this year, revenue orchestration platform LeanData released a report suggesting that lead management remains a “heavily manual” process. Based on a survey of more than 1,700 sales, marketing and operations professionals, the results showed that, despite all the talk of digital transformation, the number two challenge for revenue teams was too many manual processes and not enough automation (the number one challenge was insufficient pipeline).

LeanData, which partnered with Sales Hacker, Outreach and Heinz Marketing in conducting the survey, is interested in that result, of course, because lead management is precisely the process they offer to automate. We were struck by the contrast with Scott Brinker’s recent statement that we are arriving at a post-digital-transformation era: “(C)ompanies are no longer planning to become ‘digital.’ They are digital.”

And then we got the results of our 2022 MarTech Career and Salary Survey. Among the surprising nuggets to be mined from our findings was that 77% of respondents identified spreadsheets as the tool they spend the most time (10 or more hours a week) working with. That doesn’t mean that spreadsheets are a marketer’s most important tool, but it does suggest that manual processes remain a key part of daily life for marketing managers and staff.

We wanted to extend the opportunity to all our readers — B2B, B2C, agencies — to give us a reality check on spreadsheet use. MarTech is marketing, we like to say, and certainly today’s marketing is fundamentally data-driven and digital. But is it too soon to say that marketers are working in a digital and largely automated environment?

Download the 2022 MarTech Career and Salary Survey here



About The Author

Kim Davis is the Editorial Director of MarTech. Born in London, but a New Yorker for over two decades, Kim started covering enterprise software ten years ago. His experience encompasses SaaS for the enterprise, digital-ad-data-driven urban planning, and applications of SaaS, digital technology, and data in the marketing space.


He first wrote about marketing technology as editor of Haymarket’s The Hub, a dedicated marketing tech website, which subsequently became a channel on the established direct marketing brand DMN. Kim joined DMN proper in 2016 as a senior editor, becoming Executive Editor, then Editor-in-Chief, a position he held until January 2020.

Prior to working in tech journalism, Kim was Associate Editor at a New York Times hyper-local news site, The Local: East Village, and has previously worked as an editor of an academic publication, and as a music journalist. He has written hundreds of New York restaurant reviews for a personal blog, and has been an occasional guest contributor to Eater.




8 Must-Haves for High-Quality Original Research




Content marketing can get attention for your company. Publishing original research can earn something much more elusive: authority.

Producing original research indicates your brand has insights about your industry that no one else owns. Given how hard it is to say something truly new and compelling with content marketing, it’s a wonder more brands don’t publish original research.

But not all research projects are created equal, and over the years, I’ve learned to spot research inexperience. These eight signals tell me whether a study was well designed and produced.

1. Tell stories; don’t take inventory

In my 15-plus years of writing and publishing research, this is the No. 1 thing people get wrong: When undertaking a research project, they heed the siren’s call – “Let’s find out all the things!”

After all, when spending all this time and money on a survey, why not ask … everything? The problem is (a) participants won’t complete all of the survey and (b) reading through all those things is downright boring.

Get focused. For example, a broad study about artificial intelligence wouldn’t work well, but research on the challenges of adopting AI tech in health care would. Narrow the area of study so you can extract meaningful, never-before-seen insights.



2. Clearly state methodology

A competent study – whether survey-based or otherwise – publishes a clear methodology. It includes the sample size, how respondents were recruited, and demographic summaries relevant to the study (e.g., gender, years of experience, role, geography). These details help your reader gut-check whether your findings are worthwhile.
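The sample size you publish also lets readers judge how precise your findings can possibly be. As a rough illustration (not something from the article), here is a minimal Python sketch of the standard margin-of-error calculation for a survey proportion at 95% confidence; the sample size of 1,700 is borrowed from the LeanData survey mentioned above purely as a convenient example.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error for a survey proportion.

    n: sample size
    p: observed proportion (0.5 is the conservative worst case)
    z: z-score for the confidence level (1.96 ~ 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A survey of 1,700 respondents can report proportions to within
# roughly +/- 2.4 percentage points at 95% confidence.
print(f"{margin_of_error(1700):.3f}")  # -> 0.024
```

Quadrupling the sample size only halves the margin of error, which is one reason a focused study of the right audience beats an expensive attempt to survey everyone.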

3. Sample competently or be transparent about potential bias

How did you source respondents for your survey? Did you use a panel and try to get an accurate sampling based on the underlying population? While ideal, this approach isn’t always cost-effective. For niche groups, panels can cost well over $50 per complete survey, a price many companies simply can’t pay.

Sampling your own audience is free and effective, but you should be transparent about the biases that may surface based on that audience. For example, if you’re determining how SEO-savvy new business owners are, polling members of an SEO tech platform means your results will skew toward people who are already more advanced in the subject.

4. Ask questions that don’t push a point of view

You think you’re being sneaky, but I see you. Don’t design questions to push your product or service.

The funny thing: If you ask a patently self-promotional question, your survey takers will kick your butt. I was once involved in a study where the client asked a barely disguised question about people’s preference for their product or their competitors’. The smart survey takers smelled a rat and chose the more “primitive” product from the lineup. It was a bit of a “&%$# you and the horse you rode in on.” It made me giggle.


5. Work with a report writer who speaks data

Not all good writers have the skills to report your research findings. You want a writer-analyst, not just a writer — someone who won’t simply regurgitate the basic facts of the survey but will put those findings in context and explain why they’re fascinating. Experienced journalists are excellent analysts, and I seek them out for tough research-reporting assignments.

6. Tie to relevant, timely, and properly cited third-party research

Excellent research reports tie in other research data to make their cases. For example, if you do a research study about telemedicine in health care, your audience will respect the findings even more if you mention other respectable research organizations uncovering similar issues. But for the love of all things nerdy, please don’t quote a third-party study if it’s more than two years old. And quite frankly, in pandemic times, I’m loath to quote studies more than three months old. Be smart about what data are truly relevant.



7. Charts and graphs should be able to stand alone

This is a personal pet peeve, but it earns a place on this list. If you publish a blog post or any type of report with original charts and graphs, each one should be able to stand alone. In other words, if someone published just your chart image, it should contain all the information a viewer needs to make sense of that insight — the key finding, the question asked, the sample size, and the source.


Find out more about designing templates for charts and graphs in this article I wrote for the Content Marketing Institute.
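To make the "stand alone" checklist concrete, here is a minimal sketch of a chart that carries its own key finding, question, sample size, and source. It assumes matplotlib is available, and every number and label in it is an invented placeholder, not a finding from any real study.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Invented placeholder data -- not results from any actual survey.
labels = ["Spreadsheets", "CRM", "Analytics", "Email tools"]
shares = [77, 52, 48, 41]

fig, ax = plt.subplots(figsize=(7, 4))
ax.barh(labels, shares, color="#4a7ebb")
ax.invert_yaxis()  # show the largest bar at the top
ax.set_xlabel("% of respondents")

# Key finding as the headline title.
ax.set_title("Spreadsheets dominate marketers' workweeks",
             loc="left", fontweight="bold")

# Question asked, sample size, and source as a footnote,
# so the image makes sense even when shared on its own.
fig.text(0.01, 0.01,
         'Q: "Which tools do you spend 10+ hours a week using?"'
         "   n = 1,700   Source: Example Co. survey, 2022",
         fontsize=8)

fig.tight_layout(rect=(0, 0.06, 1, 1))  # leave room for the footnote
fig.savefig("standalone_chart.png", dpi=150)
```

The point isn’t the styling — it’s that the headline, question, n, and source travel with the image wherever it ends up.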

8. Design visuals for clarity, not sparkle

We all get taken in by cool visualizations. Which of us hasn’t swooned over Edward Tufte’s books? (OK, maybe not, but I know I’m not the only one.) But let’s get real; few (I daresay none) of us need to channel Beautiful Evidence to publish thought leadership research. Your mantra should be KISS (keep it simple, stupid).

Always design for clarity, and don’t be afraid of the good ol’ bar or line chart if you’re inexperienced. My favorite book for chart and graph design is The Wall Street Journal’s Guide to Information Graphics. There is time, young grasshopper, to graduate to spider charts and heat maps.

Respect the signals

Writing all that is particularly painful because I sound like a research scold, but these early signals indicate whether a research project was run competently. I share them with a tremendous amount of humility, having made plenty of mistakes in my years of designing research.

What would you add to the list of good research hallmarks?



Want more content marketing tips, insights, and examples? Subscribe to workday or weekly emails from CMI.

Cover image by Joseph Kalinowski/Content Marketing Institute


