How Do You Set Smart SEO Goals for Your Team/Agency/Project? – Whiteboard Friday

Posted by randfish

Are you sure that your current SEO goals are the best fit for your organization? It's incredibly important that they tie into both your company goals and your marketing goals, as well as provide specific, measurable metrics you can work to improve. In this edition of Whiteboard Friday, Rand outlines how to set the right SEO goals for your team and shares two examples of how different businesses might go about doing just that.

Setting Smart SEO Goals for Your Team, Agency, or Project


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about SEO goals: how to set smart ones, how to measure your progress against them, and how to amplify those goals to the rest of your organization so that people really buy into SEO.

This is a big challenge. So many folks I've talked to in the field have basically said, "I'm not sure exactly how to set goals for our SEO team that are the right ones." I think a particularly pernicious problem arose once Google took away the keyword-level data for SEO referrals.

So, from paid search, you can see this click was on this keyword and sent traffic to this page and then here's how it performed after that. In organic search, you can no longer do that. You haven't been able to do it for a few years now. Because of that removal, proving the return on investment for SEO has been really challenging. We'll talk in a future Whiteboard Friday about proving ROI. But let's focus here on how you get some smart SEO goals that are actually measurable, trackable, and pertain intelligently to the goals of the business, the organization.

Where to start:

So the first thing, the first problem that I see is that a lot of folks start here, which seems like a reasonable idea, but is actually a terrible idea. Don't start with your SEO goals. When your SEO team gets together or when you get together with your consultants, your agency, don't start with what the SEO goals should be.

  • Start with the company goals. This is what our company is trying to accomplish this quarter or this year or this month.
  • Marketing goals. Go from there to here's how marketing is going to contribute to those company goals. So if the company has a goal of increasing sales, marketing's job is what? Is marketing's job improving the conversion funnel? Is it getting more traffic to the top of the funnel? Is it bringing back more traffic that's already been to the site but needs to be re-earned? Those marketing goals should be tied directly to the company goals so that anyone and everyone in the organization can clearly see, "Here's why marketing is doing what they're doing."
  • SEO goals. Next, here's how SEO contributes to those marketing goals. So if the goal is around, as we mentioned, growing traffic to the top of the funnel, for example, SEO could be very broad in their targeting. If it's bringing people back, you've got to get much more narrow in your keyword targeting.
  • Specific metrics to measure and improve. From those SEO goals, you can get the outcome of specific metrics to measure and improve.

Measurable goal metrics

So that list is kind of right here. It's not very long. There are not that many things in the SEO world that we can truly measure directly. So measurable goal metrics might be things like...

1. Rankings, which we can measure in three ways: globally, nationally, or locally. You can choose how to set those up.

2. Organic search visits. So this would be just the raw traffic that is sent from organic search.

3. Branded vs. non-branded search visits. You can also separate that raw traffic into branded search versus non-branded search. This is much more challenging than it is with paid, because we don't have the keyword data. Thus, we have to use an implied or inferred model, where essentially we say, "These pages are likely to be receiving branded search traffic, versus these pages that are likely to be receiving non-branded search traffic."

A good example: the homepage of most brands is likely to get primarily branded search traffic, whereas resource pages, blog pages, and content marketing-style pages are mostly going to get unbranded traffic. So you can weight those appropriately as you see fit.

Tracking your rankings is crucially important, because that way you can see which pages show up for branded queries versus which pages show up for unbranded queries, and then you can build pretty darn good models of branded search versus non-branded search visits based on which landing pages are going to get traffic.
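To make that inference concrete, here's a minimal Python sketch of the landing-page model. The branded-page list and visit counts are invented for illustration; the point is the classify-then-tally pattern:

```python
# Hypothetical sketch: infer branded vs. non-branded organic visits
# from landing-page data, since keyword-level data is unavailable.
# Page classifications and visit counts are made up for illustration.

# Pages we believe attract mostly branded queries (e.g., the homepage)
# vs. mostly non-branded (blog and resource pages).
BRANDED_PAGES = {"/", "/about", "/contact"}

def classify_visit(landing_page: str) -> str:
    """Label an organic visit by its landing page."""
    return "branded" if landing_page in BRANDED_PAGES else "non-branded"

def split_organic_visits(visits):
    """Tally organic visits into branded vs. non-branded buckets."""
    totals = {"branded": 0, "non-branded": 0}
    for landing_page, count in visits:
        totals[classify_visit(landing_page)] += count
    return totals

# Example analytics export: (landing page, organic visits)
sample = [("/", 1200), ("/blog/boot-care-guide", 800),
          ("/resources/sizing", 450), ("/about", 150)]
print(split_organic_visits(sample))
# {'branded': 1350, 'non-branded': 1250}
```

In practice you'd feed this a real landing-page report from your analytics tool, and the branded-page set (or a URL pattern) would be maintained per site based on what you learn from rank tracking.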

4. SERP ownership. This is about your reputation in the search results: essentially looking at the page of results that comes up for a given query and what's in it. There might be results you don't like and don't want, and results you really do want, and success or failure can be measured directly through the rankings in the SERP.

5. Search volume. So for folks who are trying to improve their brand's affinity and reputation on the web and trying to grow the quantity of branded search, which is a good metric, you can look at that through things like Google Trends or through a Google AdWords campaign or through something like Moz's Keyword Explorer.

6. Links and link metrics. You could look at the growth or shrinkage of links over time. You can measure that through things like the number of linking root domains, the total number of links, and authority or spam metrics and how those are distributed.

7. Referral traffic. And last, but not least, most SEO campaigns, especially those that focus on links or improving rankings, are going to also send referral traffic from the links that are built. So you can watch referral traffic and what those referrers are and whether they came from pages where you built links with SEO intent.

So taking all of these metrics, these should be applied to the SEO goals that you choose that match up with your marketing and company goals. I wanted to try and illustrate this, not just explain it, but illustrate it through two examples that are very different in what they're measuring.

Example one

So, first off, Taft Boots, they've been advertising like crazy to me on Instagram. Apparently, I must need new boots.

  • Grow online sales. Let's say that their big company goal for 2018 is "grow online sales to core U.S. customers, so the demographics and psychographics they're already reaching, by 30%."
  • Increase top of funnel website traffic by 50%. So marketing says, "All right, you know what? There's a bunch of ways to do that, but we think that our best opportunity to do that is to grow top of funnel, because we can see how top of funnel turns into sales over time, and we're going to target a number of 50% growth." This is awesome. This can turn into very measurable, actionable SEO goals.
  • Grow organic search visits 70%. We can say, "Okay, we know that search is going to contribute an outsized quantity of this 50% growth. So what we want to do is take search traffic up by 70%. How are we going to do that? We have four different plans.
    • A. We're going to increase our blog content, quality and quantity.
    • B. We're going to create new product pages that are more detailed, that are better optimized, that target good searches.
    • C. We're going to create a new resources section with some big content pieces.
    • D. We're going to improve our link profile and Domain Authority."
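The cascade from company goal to marketing goal to SEO goal is just compounding percentages over baselines. A tiny sketch, with all baseline numbers invented (only the 30%/50%/70% targets come from the example above):

```python
# Hypothetical sketch of the goal cascade for the Taft Boots example.
# Baseline numbers are invented; only the growth percentages come
# from the example in the text.

baseline = {
    "online_sales": 1_000_000,        # annual revenue, assumed
    "top_of_funnel_visits": 500_000,  # assumed
    "organic_search_visits": 200_000, # assumed
}

growth_targets = {
    "online_sales": 0.30,             # company goal: +30%
    "top_of_funnel_visits": 0.50,     # marketing goal: +50%
    "organic_search_visits": 0.70,    # SEO goal: +70%
}

# Each team's target is its baseline grown by its agreed percentage.
targets = {metric: round(value * (1 + growth_targets[metric]))
           for metric, value in baseline.items()}
print(targets)
# {'online_sales': 1300000, 'top_of_funnel_visits': 750000,
#  'organic_search_visits': 340000}
```

The useful part of writing it down this way is that everyone can see the SEO number is derived from the marketing number, which is derived from the company number, rather than being picked in isolation.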

Now, you might say, "Wait a minute. Rand, this is a pretty common SEO methodology here." Yes, but many times this is not directly tied to the marketing goals, which is not directly tied to the business goals. If you want to have success as an SEO, you want to convince people to keep investing in you, you want to keep having that job or that consulting gig, you've got to connect these up.

From these, we can then say, "Okay, for each one, how do we measure it?" Well...

  • A. Quantity of content and search visits/piece. Blog content can be measured through the quantity of content we produce, the search visits that each of those pieces produce, and what the distribution and averages are.
  • B. Rankings and organic traffic. These are a great way to measure product pages and whether we're hitting our goals there.
  • C. Link growth, rankings, and traffic. That's a great way to measure the new resources section.
  • D. Linking root domains plus the DA distribution and maybe Spam Score distribution. That's a great way to measure whether we're improving our link profile.

Across all of these, the big-picture goal is going to be measured by the contribution of search visits to non-homepage, non-branded pages that feed the conversion funnel. So we have a methodology to create a smart goal and system here.
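Measuring goal D, the link profile, comes down to counting linking root domains and reporting how their authority scores are distributed. A hypothetical sketch (the domains and DA scores are made up, and the bucket boundaries are an arbitrary choice):

```python
# Hypothetical sketch of measuring link-profile goal D: count linking
# root domains and bucket them by Domain Authority. The domain list
# and DA scores are invented for illustration.

from collections import Counter

# (linking root domain, Domain Authority score) -- assumed data,
# e.g. from a link-index export
linking_domains = [("example-blog.com", 35), ("bigmedia.com", 78),
                   ("nichesite.net", 52), ("spammy.biz", 8),
                   ("partner.org", 61)]

def da_bucket(score: int) -> str:
    """Bucket a DA score into a coarse range for distribution reporting."""
    if score >= 60:
        return "60+"
    if score >= 30:
        return "30-59"
    return "0-29"

total_lrds = len(linking_domains)
distribution = Counter(da_bucket(da) for _, da in linking_domains)
print(total_lrds, dict(sorted(distribution.items())))
# 5 {'0-29': 1, '30-59': 2, '60+': 2}
```

Tracked month over month, the same two numbers (total linking root domains and the shape of the DA distribution) tell you whether the link profile is actually improving or just getting bigger.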

Example two

Another example, totally different, but let's try it out because I think that many folks have trouble connecting non-e-commerce pages, non-product stuff. So we're going to use Book-It Theatre. They're a theater group here in the Seattle area. They use the area beneath Seattle Center House as their space. They basically will take popular books and literature and convert them into plays. They'll adapt them into screenplays and then put on performances. It's quite good. We've been to a few shows, Geraldine and I have, and we really like them.

So their goal — I'm making this up, I don't actually know if this is their goal — but let's say they want to...

  • Attract theater goers from outside the Seattle area. So they're looking to hit tourists and critics, people who are not just locals, because they want to expand their brand.
  • Reach audiences in 4 key geographies — LA, Portland, Vancouver, Minneapolis. So they decide, "You know what? Marketing can contribute to this in four key geographies, and that's where we're going to focus a bunch of efforts — PR efforts, outreach efforts, offline media, and SEO. The four key geographies are Los Angeles, Portland, Vancouver, and Minneapolis. We think these are good theater-going towns where we can attract the right audiences."

So what are we going to do as SEOs? Well, as SEOs, we better figure out what's going to match up to this.

  • Drive traffic from these regions to Book-It Theatre's pages and to reviews of our show. So it's not just content on our site. We want to drive people to the critics and press who've reviewed us.
    • A. So we're going to create some geo landing pages, maybe some special offers for people from each of these cities.
    • B. We're going to identify third-party reviews and hopefully get critics who will write reviews, and we're going to ID those and try and drive traffic to them.
    • C. We're going to do the same with blog posts and informal critics.
    • D. We're going to build some content pages around the books that we're adapting, hoping to drive traffic that's interested in those books from all over the United States to our pages and, hopefully, to our show.

So there are ways to measure each of these.

  • A. Localized rankings in Moz Pro or a bunch of other rank tracking tools. You can set up geo-specific localized rankings. "I want to track rankings in Vancouver, British Columbia. I want to track rankings from Los Angeles, California." Those might look different than the ones you see here in Seattle, Washington.
  • B. We can do localized rankings and visits from referrals for the third-party reviews. We won't be able to track the visits that those pages receive, but if they mention Book-It Theatre and link to us, we can see, oh yes, look, the Minneapolis Journal wrote about us and they linked to us, and we can see what the reviews are from there.
  • C. We can do localized rankings and visits from referrals for the third-party blog posts.
  • D. Local and national ranking, organic visits. For these Book-It content pages, of course, we can track our local and national rankings and the organic visits.

For each of these individually, and for all of them as a whole, we can measure the contribution of search visits from non-Seattle regions. We can exclude Seattle, or all of Washington State, in our analytics and ask: How much traffic did we get from those regions? Was it more than last year? What's it contributing to the ticket-sales conversion funnel?
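That non-Seattle measurement is a simple exclusion filter over a geo report. A sketch with invented visit counts:

```python
# Hypothetical sketch of the Book-It Theatre measurement: strip
# Seattle/Washington traffic from an analytics geo export and compare
# non-local organic visits year over year. All numbers are invented.

EXCLUDED_REGIONS = {"Seattle", "Washington"}

def non_local_visits(rows):
    """Sum organic visits from regions outside the home market."""
    return sum(v for region, v in rows if region not in EXCLUDED_REGIONS)

this_year = [("Seattle", 9000), ("Los Angeles", 1200),
             ("Portland", 900), ("Vancouver", 700), ("Minneapolis", 500)]
last_year = [("Seattle", 8500), ("Los Angeles", 700),
             ("Portland", 650), ("Vancouver", 400), ("Minneapolis", 250)]

current, prior = non_local_visits(this_year), non_local_visits(last_year)
growth = (current - prior) / prior
print(current, prior, f"{growth:.0%}")  # 3300 2000 65%
```

The same pattern works in any analytics tool that lets you segment by region; the code just makes the "remove the home market, then compare" logic explicit.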

You can see how, if you build these smart goals and you measure them correctly and you align them with what the company and the marketing team is trying to do, you can build something really special. You can get great involvement from the rest of your teams, and you can show the value of SEO even to people who might not believe in it already.

All right, everyone. Look forward to your thoughts and feedback in the comments, and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


How Do You Set Smart SEO Goals for Your Team/Agency/Project? - Whiteboard Friday posted first on https://moz.com/blog

The MozCon 2018 Final Agenda

Posted by Trevor-Klein

MozCon 2018 is just around the corner — just over six weeks away — and we're excited to share the final agenda with you today. There are some familiar faces, and some who'll be on the MozCon stage for the first time, with topics ranging from the evolution of searcher intent to the increasing importance of local SEO, and from navigating bureaucracy for buy-in to cutting the noise out of your reporting.

We're also thrilled to announce this year's winning pitches for our six MozCon Community Speaker slots! If you're not familiar, each year we hold several shorter speaking slots, asking you all to submit your best pitches for what you'd like to teach everyone at MozCon. The winners — all members of the Moz Community — are invited to the conference alongside all our other speakers, and are always some of the most impressive folks on the stage. Check out the details of their talks below, and congratulations to this year's roster!

Still need your tickets? We've got you covered, but act fast — they're over 70% sold!

Pick up your ticket to MozCon!

The Agenda


Monday, July 9


8:30–9:30 am

Breakfast and registration

Doors to the conference will open at 8:00 for those looking to avoid registration lines and grab a cup of coffee (or two) before breakfast, which will be available starting at 8:30.


9:30–9:45 am

Welcome to MozCon 2018!
Sarah Bird

Moz CEO Sarah Bird will kick things off by sharing everything you need to know about your time at MozCon 2018, including conference logistics and evening events.

She'll also set the tone for the show with an update on the state of the SEO industry, illustrating the fact that there's more opportunity in it now than there's ever been before.


9:50–10:20 am

The Democratization of SEO
Jono Alderson

How much time and money do we collectively burn by fixing the same kinds of basic, "binary," well-defined things over and over again (e.g., meta tags, 404s, URLs), when we could be teaching others throughout our organizations not to break them in the first place?

As long as we "own" technical SEO, there's no reason (for example) for the average developer to learn it or care — so they keep making the same mistakes. We proclaim that others are doing things wrong, but by doing so we only reinforce the line between our skills and theirs.

We need to start giving away bits of the SEO discipline, and technical SEO is probably the easiest thing for us to stop owning. We need more democratization, education, collaboration, and investment in open source projects so we can fix things once, rather than a million times.


10:20–10:50 am

Mobile-First Indexing or a Whole New Google
Cindy Krum

The emergence of voice search and Google Assistant is forcing Google to change its model in search, to favor its own entity understanding of the world, so that questions and queries can be answered in context. Many marketers are struggling to understand how their website and their job as an SEO or SEM will change as searches focus more on entity understanding, context, and action-oriented interaction. This shift can either provide massive opportunities or create massive threats to your company and your job; the main determining factor is how you choose to prepare for the change.


10:50–11:20 am

AM Break


11:30–11:50 am

It Takes a Village:
2x Your Paid Search Revenue by Smashing Silos
Community speaker: Amy Hebdon

Your company's unfair advantage to skyrocketing paid search revenue is within your reach, but it's likely outside the control of your paid search team. Good keywords and ads are just a few cogs in the conversion machine. The truth is, the success of the entire channel depends on people who don't touch the campaigns, and may not even know how paid search works. We'll look at how design, analysis, UX, PM and other marketing roles can directly impact paid search performance, including the most common issues that arise, and how to immediately fix them to improve ROI and revenue growth.


11:50 am–12:10 pm

The #1 and Only Reason Your SEO Clients Keep Firing You
Community speaker: Meredith Oliver

You have a kick-ass keyword strategy. Seriously, it could launch a NASA rocket; it's that good. You have the best 1099 local and international talent on your SEO team that working from home and an unlimited amount of free beard wax can buy. You have a super-cool animal inspired company name like Sloth or Chinchilla that no one understands, but the logo is AMAZING. You have all of this, yet, your client turnover rate is higher than Snoop Dogg's audience on an HBO comedy special. Why? You don't talk to your clients. As in really communicate, teach them what you know, help them get it, really get it, talk to them. How do I know? I was you. In my agency's first five years we churned and burned through clients faster than Kim Kardashian could take selfies. My mastermind group suggested we *proactively* set up and insist upon a monthly review meeting with every single client. It was a game-changer, and we immediately adopted the practice. Ten years later we have a 90% client retention rate and more than 30 SEO clients on retainer.



12:10–12:30 pm

Why "Blog" Is a Misnomer for Our 2018 Content Strategy
Community speaker: Taylor Coil

At the end of 2017, we totally redesigned our company's blog. Why? Because it's not really a blog anymore: it's an evergreen collection of traffic and revenue-generating resources. The former design catered to a time-oriented strategy surfacing consistently new posts with short half-lives. That made sense when we started our blog in 2014. Today? Not so much. In her talk, Taylor will detail how to make the perspective shift from "blog" to "collection of resources," why that shift is relevant in 2018's content landscape, and what changes you can make to your blog's homepage, nav, and taxonomy that reflect this new perspective.


12:30–2:00 pm

Lunch


2:05–2:35 pm

Near Me or Far:
How Google May Be Deciding Your Local Intent For You
Rob Bucci

In August 2017, Google stated that local searches without the "near me" modifier had grown by 150% and that searchers were beginning to drop geo-modifiers — like zip code and neighborhood — from local queries altogether. But does Google still know what searchers are after?

For example: the query [best breakfast places] suggests that quality takes top priority; [breakfast places near me] indicates that close proximity is essential; and [breakfast places in Seattle] seems to cast a city-wide net; while [breakfast places] is largely ambiguous.

By comparing non-geo-modified keywords against those modified with the prepositional phrases "near me" and "in [city name]" and qualifiers like "best," we hope to understand how Google interprets different levels of local intent and uncover patterns in the types of SERPs produced.

With a better understanding of how local SERPs behave, SEOs can refine keyword lists, tailor content, and build targeted campaigns accordingly.


2:35–3:05 pm

None of Us Is as Smart as All of Us
Lisa Myers

Success in SEO, or in any discipline, is frequently reliant on people's ability to work together. Lisa Myers started Verve Search in 2009, and from the very beginning was convinced of the importance of building a diverse team, then developing and empowering them to find their own solutions.

In this session she'll share her experiences and offer actionable advice on how to attract, develop, and retain the right people in order to build a truly world-class team.


3:05–3:35 pm

PM Break


3:45–4:15 pm

Search-Driven Content Strategy
Stephanie Briggs

Google's improvements in understanding language and search intent have changed how and why content ranks. As a result, many SEOs are chasing rankings that Google has already decided are hopeless. Stephanie will cover how this should impact the way you write and optimize content for search, and will help you identify the right content opportunities. She'll teach you how to persuade organizations to invest in content, and will share examples of strategies and tactics she has used to grow content programs by millions of visits.


4:15–4:55 pm

Ranking Is a Promise: Can You Deliver?
Dr. Pete Meyers

In our rush to rank, we put ourselves first, neglecting what searchers (and our future customers) want. Google wants to reward sites that deliver on searcher intent, and SERP features are a window into that intent. Find out how to map keywords to intent, understand how intent informs the buyer funnel, and deliver on the promise of ranking to drive results that attract clicks and customers.


7:00–10:00 pm

Kickoff Party

Networking the Mozzy way! Join us for an evening of fun on the first night of the conference (stay tuned for all the details!).



Tuesday, July 10


8:30–9:30 am

Breakfast


9:35–10:15 am

Content Marketing Is Broken
and Only Your M.O.M. Can Save You
Oli Gardner

Traditional content marketing focuses on educational value at the expense of product value, which is a broken and outdated way of thinking. We all need to sell a product, and our visitors all need a product to improve their lives, but we're so afraid of being seen as salesy that somehow we got lost, and we forgot why our content even exists. We need our M.O.M.s! No, not your actual mother. Your Marketing Optimization Map — your guide to exploring the nuances of optimized content marketing through a product-focused lens.

In this session you'll learn data and lessons from Oli's biggest ever content marketing experiment, and how those lessons have changed his approach to content; a context-to-content-to-conversion strategy for big content that converts; advanced methods for creating "choose your own adventure" navigational experiences to build event-based behavioral profiles of your visitors (using GTM and GA); and innovative ways to productize and market the technology you already have, with use cases your customers had never considered.


10:15–10:45 am

Lies, Damned Lies, and Analytics
Russ Jones

Search engine optimization is a numbers game. We want some numbers to go up (links, rankings, traffic, and revenue), others to go down (bounce rate, load time, and budget). Underlying all these numbers are assumptions that can mislead, deceive, or downright ruin your campaigns. Russ will help uncover the hidden biases, distortions, and fabrications that underlie many of the metrics we have come to trust implicitly and from the ashes show you how to build metrics that make a difference.


10:45–11:15 am

AM Break


11:25–11:55 am

The Awkward State of Local
Mike Ramsey

You know it exists. You know what a citation is, and have a sense for the importance of accurate listings. But with personalization and localization playing an increasing role in every SERP, local can no longer be seen in its own silo — every search and social marketer should be honing their understanding. For that matter, it's also time for local search marketers to broaden the scope of their work.


11:55 am–12:25 pm

The SEO Cyborg:
Connecting Search Technology and Its Users
Alexis Sanders

SEO requires a delicate balance of working for the humans you're hoping to reach, and the machines that'll help you reach them. To make a difference in today's SERPs, you need to understand the engines, site configurations, and even some machine learning, in addition to the emotional, raw, authentic connections with people and their experiences. In this talk, Alexis will help marketers of all stripes walk that line.


12:25–1:55 pm

Lunch


2:00–2:30 pm

Email Unto Others:
The Golden Rules for Human-Centric Email Marketing
Justine Jordan

With the arrival of GDPR and the ease with which consumers can unsubscribe and report spam, it's more important than ever to treat people like people instead of just leads. To understand how email marketing is changing and to identify opportunities for brands, Litmus surveyed more than 3,000 marketers worldwide. Justine will cover the biggest trends and challenges facing email today and help you put the human back in marketing's most personal — and effective — channel.

2:30–3:00 pm

Your Red-Tape Toolkit:
How to Win Trust and Get Approval for Search Work
Heather Physioc

Are your search recommendations overlooked and misunderstood? Do you feel like you hit roadblocks at every turn? Are you worried that people don't understand the value of your work? Learn how to navigate corporate bureaucracy and cut through red tape to help clients and colleagues understand your search work — and actually get it implemented. From diagnosing client maturity to communicating where search fits into the big picture, these tools will equip you to overcome obstacles to doing your best work.


3:00–3:30 pm

PM Break


3:40–4:10 pm

The Problem with Content &
Other Things We Don't Want to Admit
Casie Gillette

Everyone thinks they need content but they don't think about why they need it or what they actually need to create. As a result, we are overwhelmed with poor quality content and marketers are struggling to prove the value. In this session, we'll look at some of the key challenges facing marketers and how a data-driven strategy can help us make better decisions.


4:10–4:50 pm

Excel Is for Rookies:
Why Every Search Marketer Needs to Get Strong in BI, ASAP
Wil Reynolds

The analysts are coming for your job, not AI (at least not yet). Analysts stopped using Excel years ago; they use Tableau, Power BI, Looker! They see more data than you, and that is what is going to make them a threat to your job. They might not know search, but they know data. I'll document my obsession with Power BI and the insights I can glean in seconds, which is helping every single client at Seer at the speed of light. Search marketers must run to this opportunity, because analysts miss out on the insights: more often than not, they use these tools to report. We use them to find insights.



Wednesday, July 11


8:30–9:30 am

Breakfast


9:35–10:15 am

Machine Learning for SEOs
Britney Muller

People generally react to machine learning in one of two ways: either with a combination of fascination and terror brought on by the possibilities that lie ahead, or with looks of utter confusion and slight embarrassment at not really knowing much about it. With the advent of RankBrain, not even higher-ups at Google can tell us exactly how some things rank above others, and the impact of machine learning on SEO is only going to increase from here. Fear not: Moz's own senior SEO scientist, Britney Muller, will talk you through what you need to know.


10:15–10:45 am

Shifting Toward Engagement and Reviews
Darren Shaw

With search results adding features and functionality all the time, and users increasingly finding what they need without ever leaving the SERP, we need to focus more on the forest and less on the trees. Engagement and behavioral optimization are key. In this talk, Darren will offer new data to show you just how tight the proximity radius around searchers really is, and how reviews can be your key competitive advantage, detailing new strategies and tactics to take your reviews to the next level.

10:45–11:15 am

AM Break


11:25–11:45 am

Location-Free Local SEO
Community speaker: Tom Capper

Let's talk about local SEO without physical premises. Not the Google My Business kind — the kind of local SEO that job boards, house listing sites, and national delivery services have to reckon with. Should they have landing pages, for example, for "flower delivery in London?"

This turns out to be a surprisingly nuanced issue: In some industries, businesses are ranking for local terms without a location-specific page, and in others local pages are absolutely essential. I've worked with clients across several industries on why these sorts of problems exist, and how to tackle them. How should you figure out whether you need these pages, how can you scale them and incorporate them in your site architecture, and how many should you have for what location types?


11:45 am–12:05 pm

SEO without Traffic
Community speaker: Hannah Thorpe

Answer boxes, voice search, and a reduction in the number of results displayed sometimes all result in users spending more time in the SERPs and less on our websites. But does that mean we should stop investing in SEO?

This talk will cover what metrics we should now care about, and how strategies need to change, covering everything from measuring more than just traffic and rankings to expanding your keyword research beyond just keyword volumes.


12:05–12:25 pm

Tools Change, People Don't:
Empathy-Driven Online Marketing
Community speaker: Ashley Greene

When everyone else zags, the winners zig. As winners, while your 101+ competitors are trying to automate 'til the cows come home and split test their way to greatness, you're zigging. Whether you're B2B or B2C, you're marketing to humans. Real people. Homo sapiens. But where is the human element in the game plan? Quite simply, it has gone missing, which provides a window of opportunity for the smartest marketers.

In this talk, Ashley will provide a framework of simple user interview and survey techniques to build customer empathy and your "voice of customer" playbook. Using real examples from companies like Slack, Pinterest, Intercom, and Airbnb, this talk will help you uncover your customers' biggest problems and pain points; know what, when, and how your customers research (and Google!) a need you solve; and find new sources of information and influencers so you can unearth distribution channels and partnerships.


12:25–1:55 pm

Lunch


2:00–2:30 pm

You Don't Know SEO
Michael King

Or maybe, "SEO you don't know you don't know." We've all heard people throw jargon around in an effort to sound smart when they clearly don't know what it means, and our industry of SEO is no exception. There are aspects of search that are acknowledged as important, but seldom actually understood. Michael will save us from awkward moments, taking complex topics like the esoteric components of information retrieval and log-file analysis, pairing them with a detailed understanding of technical implementation of common SEO recommendations, and transforming them into tools and insights we wish we'd never neglected.

2:30–3:00 pm

What All Marketers Can Do about Site Speed
Emily Grossman

At this point, we should all have some idea of how important site speed is to our performance in search. The recently announced "speed update" underscored that fact yet again. It isn't always easy for marketers to know where to start improving their site's speed, though, and a lot of folks mistakenly believe that site speed should only be a developer's problem. Emily will clear that up with an actionable tour of just how much impact our own work can have on getting our sites to load quickly enough for today's standards.

3:00–3:30 pm

PM Break


3:40–4:10 pm

Traffic vs. Signal
Dana DiTomaso

With an ever-increasing slate of options in tools like Google Tag Manager and Google Data Studio, marketers of all stripes are falling prey to the habit of "I'll collect this data because maybe I'll need it eventually," when in reality it's creating a lot of noise for zero signal.

We're still approaching our metrics from the organization's perspective, and not from the customer's perspective. Why, for example, are we not reporting on (or even thinking about, really) how quickly a customer can do what they need to do? Why are we still fixated on pageviews? In this talk, Dana will focus our attention on what really matters.


4:10–4:50 pm

Why Nine out of Ten Marketing Launches Suck
(And How to Be the One that Doesn't)
Rand Fishkin

More than ever before, marketers are launching things — content, tools, resources, products — and being held responsible for how/whether they resonate with customers and earn the amplification required to perform. But this is hard. Really, really hard. Most of the projects that launch, fail. What separates the wheat from the chaff isn't just the quality of what's built, but the process behind it. In this presentation, Rand will present examples of dismal failures and skyrocketing successes, and dive into what separates the two. You'll learn how anyone can make a launch perform better, and benefit from the power of being "new."


7:00–11:30 pm

MozCon Bash

Join us at Garage Billiards to wrap up the conference with an evening of networking, billiards, bowling, and karaoke with MozCon friends new and old. Don't forget to bring your MozCon badge and US ID or passport.



Grab your ticket today!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


The MozCon 2018 Final Agenda posted first on https://moz.com/blog

Backlink Blindspots: The State of Robots.txt

Posted by rjonesx.

Here at Moz we have committed to making Link Explorer as similar to Google as possible, specifically in the way we crawl the web. I have discussed in previous articles some metrics we use to ascertain that performance, but today I wanted to spend a little bit of time talking about the impact of robots.txt and crawling the web.

Most of you are familiar with robots.txt as the method by which webmasters can direct Google and other bots to visit only certain pages on the site. Webmasters can be selective, allowing certain bots to visit some pages while denying other bots access to the same. This presents a problem for companies like Moz, Majestic, and Ahrefs: we try to crawl the web like Google, but certain websites deny access to our bots while allowing that access to Googlebot. So, why exactly does this matter?

Why does it matter?

Graph showing how crawlers hop from one link to another

As we crawl the web, if a bot encounters a robots.txt file that blocks it, it can't crawl that site's content. We can still see the links that point to the site, but we're blind to the content of the site itself, and we can't see its outbound links. This leads to an immediate deficiency in the link graph, at least in terms of being similar to Google (assuming Googlebot is not similarly blocked).

But that isn't the only issue. Being blocked by robots.txt causes a cascading failure in the form of crawl prioritization. As a bot crawls the web, it discovers links and has to prioritize which to crawl next. Let's say Google finds 100 links and prioritizes the top 50 to crawl. However, a different bot finds those same 100 links but is blocked by robots.txt from crawling 10 of the top 50 pages. It's forced to crawl around those, choosing a different 50 pages to crawl. That different set of crawled pages will, of course, return a different set of links. In the next round of crawling, the blocked bot will not only have a different set of pages it's allowed to crawl; the set it has discovered will itself differ, because it crawled different pages in the first round.
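To make the cascade concrete, here's a toy simulation. All page names, priorities, and the blocked set are invented for illustration; this is not Moz's actual crawler logic:

```python
# Toy model of crawl-prioritization drift: two crawlers see the same
# ranked list of discovered links, but one is blocked by robots.txt
# from a few high-priority pages and must crawl around them.

def crawl_round(ranked_links, blocked, budget):
    """Crawl the top `budget` links the bot is allowed to fetch."""
    return [url for url in ranked_links if url not in blocked][:budget]

# Ten discovered links, already sorted by priority (highest first).
links = [f"page-{i}" for i in range(10)]

google_pages = crawl_round(links, blocked=set(), budget=5)
other_pages = crawl_round(links, blocked={"page-1", "page-3"}, budget=5)

print(google_pages)  # ['page-0', 'page-1', 'page-2', 'page-3', 'page-4']
print(other_pages)   # ['page-0', 'page-2', 'page-4', 'page-5', 'page-6']
```

The two bots now crawl different pages in the very first round, so they discover different links, and the gap compounds with every round after that.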

Long story short, much like the proverbial butterfly flapping its wings and eventually causing a hurricane, small differences in robots.txt that block some bots and allow others ultimately lead to a very different link graph compared to what Google actually sees.

So, how are we doing?

You know I wasn't going to leave you hanging. Let's do some research. Let's analyze the top 1,000,000 websites on the Internet according to Quantcast and determine which bots are blocked, how frequently, and what impact that might have.

Methodology

The methodology is fairly straightforward.

  1. Download the Quantcast Top Million
  2. Download the robots.txt if available from all top million sites
  3. Parse the robots.txt to determine whether the home page and other pages are available
  4. Collect link data related to blocked sites
  5. Collect total pages on-site related to blocked sites
  6. Report the differences among crawlers
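Step 3 can be sketched with Python's standard-library robots.txt parser. The rules below are a made-up example of the "allow Google, block everyone else" pattern, not a file from any real site:

```python
# Check whether different user-agents may fetch a page under a given
# robots.txt. This example allows Googlebot everything and blocks all
# other bots from the entire site.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/page"))  # True
print(parser.can_fetch("dotbot", "https://example.com/page"))     # False
```

Run against a million downloaded robots.txt files, a check like this is enough to tally which sites admit Googlebot while turning other crawlers away.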

Total sites blocked

The first and easiest metric to report is the number of sites which block individual crawlers (Moz, Majestic, Ahrefs) while allowing Google. Most sites that block one of the major SEO crawlers block them all: they simply formulate robots.txt to allow major search engines while blocking other bot traffic. Lower is better.

Bar graph showing number of sites blocking each SEO tool in robots.txt

Of the sites analyzed, 27,123 blocked MJ12Bot (Majestic), 32,982 blocked Ahrefs, and 25,427 blocked Moz. This means that among the major industry crawlers, Moz is the least likely to be turned away from a site that allows Googlebot. But what does this really mean?

Total RLDs blocked

As discussed previously, one big issue with disparate robots.txt entries is that they stop the flow of PageRank. If Google can see a site, they can pass link equity from referring domains through the site's outbound links on to other sites. If a site is blocked by robots.txt, it's as though the outbound lanes of traffic on all the roads going into the site are blocked. By counting all the inbound lanes of traffic, we can get an idea of the total impact on the link graph. Lower is better.

According to our research, Majestic ran into dead ends on 17,787,118 referring domains, Ahrefs on 20,072,690, and Moz on 16,598,365. Once again, Moz's robots.txt profile was the most similar to Google's. But referring domains aren't the only issue with which we should be concerned.

Total pages blocked

Most pages on the web only have internal links. Google isn't interested in creating a link graph — they're interested in creating a search engine. Thus, a bot designed to act like Google needs to be just as concerned about pages that only receive internal links as it is about pages that receive external links. Another metric we can measure is the total number of pages blocked, using Google's site: query to estimate the number of pages Google has access to that a different crawler does not. So, how do the competing industry crawlers perform? Lower is better.

Once again, Moz shines on this metric. It's not just that Moz is blocked by fewer sites — Moz is blocked by smaller, less important sites. Majestic misses the opportunity to crawl 675,381,982 pages, Ahrefs misses 732,871,714, and Moz misses 658,015,885. That's nearly a 75-million-page difference between Ahrefs and Moz in the top million sites on the web alone.

Unique sites blocked

Most of the robots.txt disallows facing Moz, Majestic, and Ahrefs are simply blanket blocks of all bots that don't represent major search engines. However, we can isolate the cases where a specific bot is named deliberately for exclusion while its competitors remain allowed. For example, how many times is Moz blocked while Ahrefs and Majestic are allowed? Which bot is singled out the most? Lower is better.
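For illustration, a deliberate single-bot exclusion looks something like this in robots.txt (a hypothetical file; DotBot is Moz's crawler):

```
# Everything is open by default...
User-agent: *
Disallow:

# ...but one named crawler is turned away entirely
User-agent: dotbot
Disallow: /
```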

Ahrefs is singled out by 1,201 sites, Majestic by 7,152, and Moz by 904. It is understandable that Majestic has been singled out, given that they have been operating a very large link index for a decade or more. It took Moz 10 years to accumulate 904 individual robots.txt blocks, and it took Ahrefs 7 years to accumulate 1,201. But let me give some examples of why this matters.

If you care about links from name.com, hypermart.net, or eclipse.org, you can't rely solely on Majestic.

If you care about links from popsugar.com, dict.cc, or bookcrossing.com, you can't rely solely on Moz.

If you care about links from dailymail.co.uk, patch.com, or getty.edu, you can't rely solely on Ahrefs.

And regardless of what you do or which provider you use, you can't see links from yelp.com, who.int, or findarticles.com.

Conclusions

While Moz's crawler, DotBot, clearly enjoys the robots.txt profile closest to Google's among the three major link indexes, there's still a lot of work to be done. We work very hard on crawler politeness to ensure that we're not a burden to webmasters, which allows us to crawl the web in a manner more like Google. We will continue to improve our performance across the web and bring you the best backlink index possible.

Thanks to Dejan SEO for the beautiful link graph used in the header image and Mapt for the initial image used in the diagrams.





What Google’s GDPR Compliance Efforts Mean for Your Data: Two Urgent Actions

Posted by willcritchlow

It should be quite obvious for anyone that knows me that I’m not a lawyer, and therefore that what follows is not legal advice. For anyone who doesn’t know me: I’m not a lawyer, I’m certainly not your lawyer, and what follows is definitely not legal advice.

With that out of the way, I wanted to give you some bits of information that might feed into your GDPR planning, as they come up more from the marketing side than the pure legal interpretation of your obligations and responsibilities under this new legislation. While most legal departments will be considering the direct impacts of the GDPR on their own operations, many might miss the impacts that other companies’ (namely, in this case, Google’s) compliance actions have on your data.

But I might be getting a bit ahead of myself: it’s quite possible that not all of you know what the GDPR is, and why or whether you should care. If you do know what it is, and you just want to get to my opinions, go ahead and skip down the page.

What is the GDPR?

The tweet-length version is that the GDPR (General Data Protection Regulation) is new EU legislation covering data protection and privacy for EU citizens, and it applies to all companies offering goods or services to people in the EU.

Even if you aren’t based in the EU, it applies to your company if you have customers who are, and it has teeth (fines of up to the greater of 4% of global revenue or EUR20m). It comes into force on May 25. You have probably heard about it through the myriad organizations who put you on their email list without asking and are now emailing you to “opt back in.”

In most companies, it will not fall to the marketing team to research everything that has to change and achieve compliance, though it is worth getting up to speed with at least the high-level outline and, in particular, the requirements around informed consent, which the regulation defines as:

"...any freely given, specific, informed, and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."

As always, when laws are made about new technology, there are many questions to be resolved and, indeed, jokes to be made.

But my post today isn’t about what you should do to get compliant — that’s specific to your circumstances — and a ton has been written about this already:

My intention is not to write a general guide, but rather to warn you about two specific things you should be doing with analytics (Google Analytics in particular) as a result of changes Google is making because of GDPR.

Unexpected consequences of GDPR

When you deal directly with a person in the EU, and they give you personally identifiable information (PII) about themselves, you are typically in what is called the "data controller" role. The GDPR also identifies another role, which it calls "data processor," which is any other company your company uses as a supplier and which handles that PII. When you use a product like Google Analytics on your website, Google is taking the role of data processor. While most of the restrictions of the GDPR apply to you as the controller, the processor must also comply, and it’s here that we see some potentially unintended (but possibly predictable) consequences of the legislation.

Google is unsurprisingly seeking to minimize their risk (I say it's unsurprising because those GDPR fines could be as large as $4.4 billion based on last year's revenue if they get it wrong). They are doing this firstly by pushing as much of the obligation onto you (the data controller) as possible, and secondly, by going further than the GDPR requires by default and aggressively shutting down accounts that infringe their terms (regardless of whether the infringement also infringes the GDPR).

This is entirely rational — with GA being in most cases a product offered for free, and the value coming to Google entirely in the aggregate, it makes perfect sense to limit their risks in ways that don’t degrade their value, and to just kick risky setups off the platform rather than taking on extreme financial risk for individual free accounts.

It’s not only Google, by the way. There are other suppliers doing similar things which will no doubt require similar actions, but I am focusing on Google here simply because GA is pervasive throughout the web marketing world. Some companies are even going as far as shutting down entirely for EU citizens (like unroll.me). See this Twitter thread of others.

Consequence 1: Default data retention settings for GA will delete your data

Starting on May 25, Google will be changing the default for data retention, meaning that if you don’t take action, certain data older than the cutoff will be automatically deleted.

You can read more about the details of the change on Krista Seiden’s personal blog (Krista works at Google, but this post is written in her personal capacity).

The reason I say that this isn't strictly a GDPR thing is that it relates to changes Google is making on their end to ensure they comply with their obligations as a data processor. It gives you tools you might need, but isn't strictly related to your own GDPR compliance. There is no single "right" answer to how long you need to (or are allowed to) keep this data stored in GA under the GDPR, but by my reading, given that it shouldn't be PII anyway (see below), it isn't really a GDPR question for most organizations. In particular, there is no reason to think that Google's default is the correct, mandated, or only setting you can choose under the GDPR.

Action: Review the promises being made by your legal team and your new privacy policy to understand the correct retention setting for your org. In the absence of explicit promises to your users, my understanding is that you can retain any of this data you were allowed to capture in the first place unless you receive a deletion request against it. So while most orgs will have at least some changes to make to their privacy policies, most GA users can change the setting back to retain this data indefinitely.

Consequence 2: Google is deleting GA accounts for capturing PII

It has long been against the Terms of Service to store any personally identifiable information (PII) in Google Analytics. Recently, though, it appears that Google has become far more diligent in checking for the presence of PII and robust in their handling of accounts found to contain any. Put more simply, Google will delete your account if they find PII.

It’s impossible to know for sure that this is GDPR-related, but being able if necessary to demonstrate to regulators that they are taking strict actions against anyone violating their PII-related terms is an obvious move for Google to reduce the risk they face as a Data Processor. It makes particular sense in an area where the vast majority of accounts are free accounts. Much like the previous point, and the reason I say that this is related to Google’s response to the GDPR coming into force, is that it would be perfectly possible to get your users’ permission to record their data in third-party services like GA, and fully comply with the regulations. Regardless of the permissions your users give you, Google’s GDPR-related crackdown (and heavier enforcement of the related terms that have been present for some time) means that it’s a new and greater risk than it was before.

Action: Audit your GA profile and implementation for PII risks:

  • There are various ways you can search within GA itself to find data that could be personally identifying in places like page titles, URLs, custom data, etc. (see these two excellent guides)
  • You can also audit your implementation by reviewing rules in tag manager and/or reviewing the code present on key pages. The most likely suspects are the places where people log in, take key actions on your site, give you additional personal information, or check out
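As a starting point for that audit, a sketch like the following can flag URLs whose query strings contain email-shaped values, one of the most common ways PII slips into analytics. The page paths below are made up, and a real audit would check more PII patterns (names, phone numbers, addresses) across page titles and custom dimensions too:

```python
# Flag page URLs whose query-string values look like email addresses.
import re
from urllib.parse import parse_qs, urlparse

EMAIL = re.compile(r"[^@\s&=/]+@[^@\s&=/]+\.[A-Za-z]{2,}")

def flag_email_pii(urls):
    flagged = []
    for url in urls:
        params = parse_qs(urlparse(url).query)  # decodes %40 back to @
        if any(EMAIL.search(v) for values in params.values() for v in values):
            flagged.append(url)
    return flagged

pages = [
    "/thank-you?order=12345",
    "/account?email=jane.doe%40example.com",
    "/search?q=running+shoes",
]
print(flag_email_pii(pages))  # ['/account?email=jane.doe%40example.com']
```

Remember that filtering flagged data inside GA isn't enough; anything this kind of scan turns up needs to be stripped before it is ever sent.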

Don’t take your EU law advice from big US tech companies

The internal effort and coordination required at Google to do their bit to comply even "just" as a data processor is significant. Unfortunately, there are strong arguments that this kind of ostensibly user-friendly regulation, which imposes outsize compliance burdens on smaller companies, will cement the duopoly and dominance of Google and Facebook and enable them to pass the costs and burdens of compliance onto sectors that are already struggling.

Regardless of the intended or unintended consequences of the regulation, it seems clear to me that we shouldn’t be basing our own businesses’ (and our clients’) compliance on self-interested advice and actions from the tech giants. No matter how impressive their own compliance, I’ve been hugely underwhelmed by guidance content they’ve put out. See, for example, Google’s GDPR “checklist” — not exactly what I’d hope for:

Client Checklist: As a marketer we know you need to select products that are compliant and use personal data in ways that are compliant. We are committed to complying with the GDPR and would encourage you to check in on compliance plans within your own organisation. Key areas to think about:

  • How does your organisation ensure user transparency and control around data use?
  • Do you explain to your users the types of data you collect and for what purposes?
  • Are you sure that your organisation has the right consents in place where these are needed under the GDPR?
  • Do you have all of the relevant consents across your ad supply chain?
  • Does your organisation have the right systems to record user preferences and consents?
  • How will you show to regulators and partners that you meet the principles of the GDPR and are an accountable organisation?

So, while I’m not a lawyer, definitely not your lawyer, and this is not legal advice, if you haven’t already received any advice, I can say that you probably can’t just follow Google’s checklist to get compliant. But you should, as outlined above, take the specific actions you need to take to protect yourself and your business from their compliance activities.





GDPR: What it Means for Google Analytics & Online Marketing

Posted by Angela_Petteys

If you’ve been on the Internet at all in the past few months, you’ve probably seen plenty of notices about privacy policy updates from one service or another. As a marketer, a few of those notices have most likely come from Google.

With the General Data Protection Regulation (GDPR) set to go into effect on May 25th, 2018, many Internet services have been scrambling to get in compliance with the new standards — and Google is no exception. Given the nature of the services Google provides to marketers, GDPR has meant some significant changes in how they conduct business. And, in turn, some marketers may have to take steps to make sure their use of Google Analytics is allowable under the new rules. But a lot of marketers aren't entirely sure what exactly GDPR is, what it means for their jobs, and what they need to do to follow the rules.

What is GDPR?

GDPR is a very broad reform that gives citizens who live in the European Economic Area (EEA) and Switzerland more control over how their personal data is collected and used online. GDPR introduces a lot of new rules and if you’re up for a little light reading, you can check out the full text of the regulation online. But here are a few of the most significant changes:

  • Companies and other organizations have to be more transparent and clearly state what information they’re collecting, what it will be used for, how they’re collecting it, and if that information will be shared with anyone else. They can also only collect information that is directly relevant for its intended use. If the organization collecting that information later decides to use it for a different purpose, they must get permission again from each individual.
  • GDPR also spells out how that information needs to be given to consumers. That information can no longer be hidden in long privacy policies filled with legal jargon. The information in disclosures needs to be written in plain language and “freely given, specific, informed, and unambiguous.” Individuals also have to take an action which clearly gives their consent to their information being collected. Pre-checked boxes and notices that rely on inaction as a way of giving consent will no longer be allowed. If a user does not agree to have their information collected, you cannot block them from accessing content based on that fact.
  • Consumers also have the right to see what information a company has about them, request that incorrect information be corrected, revoke permission for their data to be saved, and have their data exported so they can switch to another service. If someone decides to revoke their permission, the organization needs to not only remove that information from their systems in a timely manner, they also need to have it removed from anywhere else they’ve shared that information.
  • Organizations must also be able to give proof of the steps they’re taking to be in compliance. This can include keeping records of how people opt in to being on marketing lists and documentation regarding how customer information is being protected.
  • Once an individual’s information has been collected, GDPR sets out requirements for how that information is stored and protected. If a data breach occurs, consumers must be notified within 72 hours. Failing to comply with GDPR can come with some very steep consequences. If a data breach occurs because of non-compliance, a company can be hit with fines as high as €20 million or 4% of the company’s annual global revenue, whichever amount is greater.

Do US-based businesses need to worry about GDPR?

Just because a business isn’t based in Europe doesn’t necessarily mean they’re off the hook as far as GDPR goes. If a company is based in the United States (or elsewhere outside the EEA), but conducts business in Europe, collects data about users from Europe, markets themselves in Europe, or has employees who work in Europe, GDPR applies to them, too.

Even if you’re working with a company that only conducts business in a very specific geographic area, you might occasionally get some visitors to your site from people outside of that region. For example, let’s say a pizza restaurant in Detroit publishes a blog post about the history of pizza on their site. It’s a pretty informative post and as a result, it brings in some traffic from pizza enthusiasts outside the Detroit area, including a few visitors from Spain. Would GDPR still apply in that sort of situation?

As long as it’s clear that a company’s goods or services are only available to consumers in the United States (or another country outside the EEA), GDPR does not apply. Going back to the pizza restaurant example, the other content on their site is written in English, emphasizes their Detroit location, and definitely doesn’t make any references to delivery to Spain, so those few page views from Spain wouldn’t be anything to worry about.

However, let’s say another US-based company has a site with the option to view German and French language versions of pages, lets customers pay with Euros, and uses marketing language that refers to European customers. In that situation, GDPR would apply since they are more clearly soliciting business from people in Europe.

Google Analytics & GDPR

If you use Google Analytics, Google is your data processor and since they handle data from people all over the world, they’ve had to take steps to become compliant with GDPR standards. However, you/your company are considered the data controller in this relationship and you will also need to take steps to make sure your Google Analytics account is set up to meet the new requirements.

Google has been rolling out some new features to help make this happen. In Analytics, you now have the ability to delete the information of individual users if they request it. They've also introduced data retention settings which allow you to control how long individual user data is saved before being automatically deleted. Google has set the default to 26 months, but if you're working with a US-based company that strictly conducts business in the United States, you can set it to never expire if you want to — at least until data protection laws change here, too. It's important to note that this only applies to data about individual users and events, so aggregate data about high-level information like pageviews won't be impacted.

To make sure you’re using Analytics in compliance with GDPR, a good place to start is by auditing all the data you collect to make sure it’s all relevant to its intended purpose and that you aren’t accidentally sending any personally identifiable information (PII) to Google Analytics. Sending PII to Google Analytics was already against its Terms of Service, but very often, it happens by accident when information is pushed through in a page URL. If it turns out you are sending PII to Analytics, you’ll need to talk to your web development team about how to fix it because using filters in Analytics to block it isn’t enough — you need to make sure it’s never sent to Google Analytics in the first place.

PII includes anything that can potentially be used to identify a specific person, either on its own or when combined with another piece of information, like an email address, a home address, a birthdate, a zip code, or an IP address. IP addresses weren’t always considered PII, but GDPR classifies them as an online identifier. Don’t worry, though — you can still get geographical insights about the visitors to your site. All you have to do is turn on IP anonymization and the last portion of an IP address will be replaced with a zero, so you can still get a general idea of where your traffic is coming from, although it will be a little less precise.
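For an IPv4 address, the masking works like this (a simplified sketch of the behavior; GA also anonymizes IPv6 addresses, which zeroes more bits):

```python
# Simplified illustration of IPv4 anonymization: the last octet is
# zeroed before the address is stored, keeping coarse geography while
# dropping the part that identifies an individual host.
def anonymize_ipv4(ip: str) -> str:
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

print(anonymize_ipv4("203.0.113.42"))  # 203.0.113.0
```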

If you use Google Tag Manager, IP anonymization is pretty easy. Just open your Google Analytics tag or its settings variable, choose "More Settings," and select "Fields to Set." Then enter "anonymizeIp" in the "Field Name" box, enter "true" in the "Value" box, and save your changes.

If you don’t use GTM, talk to your web development team about editing the Google Analytics code to anonymize IP addresses.

Pseudonymous information like user IDs and transaction IDs is still acceptable under GDPR, but it needs to be protected: user and transaction IDs should be alphanumeric database identifiers, not written out in plain text.
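One way to keep such IDs opaque is to send a keyed hash of the internal identifier rather than anything human-readable. This is a sketch of one design choice, not a GDPR compliance guarantee, and the secret key and function name here are invented:

```python
# Derive an opaque, alphanumeric pseudonymous ID from an internal
# user identifier using a keyed hash (HMAC-SHA256).
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-real-secret"  # placeholder key

def pseudonymize(user_id: str) -> str:
    digest = hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # short alphanumeric identifier

pid = pseudonymize("jane.doe@example.com")
print(pid.isalnum(), len(pid))  # True 16
```

The same input always maps to the same ID, so reports still line up per user, while the analytics platform never sees the underlying email address.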

Also, if you haven’t already done so, don’t forget to take the steps Google has mentioned in some of those emails they’ve sent out. If you’re based outside the EEA and GDPR applies to you, go into your Google Analytics account settings and accept the updated terms of processing. If you’re based in the EEA, the updated terms have already been included in your data processing terms. If GDPR applies to you, you’ll also need to go into your organization settings and provide contact information for your organization.

Privacy policies, forms, & cookie notices

Now that you’ve gone through your data and checked your settings in Google Analytics, you need to update your site’s privacy policy, forms, and cookie notices. If your company has a legal department, it may be best to involve them in this process to make sure you’re fully compliant.

Under GDPR, a site’s privacy policy needs to be clearly written in plain language and answer basic questions like what information is being collected, why it’s being collected, how it’s being collected, who is collecting it, how it will be used, and if it will be shared with anyone else. If your site is likely to be visited by children, this information needs to be written simply enough for a child to be able to understand it.

Forms and cookie notices also need to provide that kind of information. Cookie consent forms with really vague, generic messages like, “We use cookies to give you a better experience and by using this site, you agree to our policy,” are not GDPR compliant.

GDPR & other types of marketing

The impact GDPR will have on marketers isn’t just limited to how you use Google Analytics. If you use some particular types of marketing in the course of your job, you may have to make a few other changes, too.

Referral deals

If you work with a company that does “refer a friend”-type promotions where a customer has to enter information for a friend to receive a discount, GDPR is going to make a difference for you. Giving consent for data to be collected is a key part of GDPR and in these sorts of promotions, the person being referred can’t clearly consent to their information being collected. Under GDPR, it is possible to continue this practice, but it all depends on how that information is being used. If you store the information of the person being referred and use it for marketing purposes, it would be a violation of GDPR standards. However, if you don’t store that information or process it, you’re OK.

Email marketing

If you’re an email marketer and already follow best industry standards by doing things like only sending messages to those who clearly opt in to your list and making it easy for people to unsubscribe, the good news is that you’re probably in pretty good shape. As far as email marketing goes, GDPR is going to have the biggest impact on those who do things that have already been considered sketchy, like buying lists of contacts or not making it clear when someone is signing up to receive emails from you.

Even if you think you’re good to go, it’s still a good time to review your contacts and double check that your European contacts have indeed opted into being on your list and that it was clear what they were signing up for. If any of your contacts don’t have their country listed or you’re not sure how they opted in, you may want to either remove them from your list or put them on a separate segment so they don’t get any messages from you until you can get that figured out. Even if you’re confident your European contacts have opted in, there’s no harm in sending out an email asking them to confirm that they would like to continue receiving messages from you.

Creating a double opt-in process isn’t mandatory, but it would be a good idea since it helps remove any doubt over whether or not a person has agreed to be on your list. While you’re at it, take a look at the forms people use to sign up for your list and make sure they’re in line with GDPR standards: no pre-checked boxes, and clear language stating that they’re agreeing to receive emails from you.

For example, here’s a non-GDPR compliant email signup option I recently saw on a checkout page. They tell you what they’re planning to send to you, but the fact that it’s a pre-checked box placed underneath the more prominent “Place Order” button makes it very easy for people to unintentionally sign up for emails they might not actually want.

Jimmy Choo, on the other hand, also gives you the chance to sign up for emails while making a purchase, but since the box isn’t pre-checked, it’s good to go under GDPR.

Marketing automation

As is the case with standard email marketing, marketing automation specialists will need to make sure they have clear consent from everyone who has agreed to be part of their lists. Check your European contacts to make sure you know how they opted in. Also review the ways people can opt into your list to make sure it’s clear what, exactly, they’re signing up for, so that their consent can be considered valid.

If you use marketing automation to re-engage customers who have been inactive for a while, you may need to get permission to contact them again, depending on how long it has been since they last interacted with you.

Some marketing automation platforms have functionality which will be impacted by GDPR. Lead scoring, for example, is now considered a form of profiling and you will need to get permission from individuals to have their information used in that way. Reverse IP tracking also needs consent.

It’s also important to make sure your marketing automation platform and CRM system are set to sync automatically. If a person on your list unsubscribes and continues receiving emails because of a lapse between the two, you could get in trouble for not being GDPR compliant.

Gated content

A lot of companies use gated content, like free reports, whitepapers, or webinars, as a way to generate leads. The way they see it, the person’s information serves as the price of admission. But since GDPR prohibits blocking access to content if a person doesn’t consent to their information being collected, is gated content effectively useless now?

GDPR doesn’t completely eliminate the possibility of gated content, but there are now higher standards for collecting user information. Basically, if you’re going to have gated content, you need to be able to prove that the information you collect is necessary for you to provide the deliverable. For example, if you were organizing a webinar, you’d be justified in collecting email addresses since attendees need to be sent a link to join in. You’d have a harder time claiming an email address was required for something like a whitepaper since that doesn’t necessarily have to be delivered via email. And of course, as with any other form on a site, forms for gated content need to clearly state all the necessary information about how the information being collected will be used.

If you don’t get a lot of leads from European users anyway, you may want to just block all gated content from European visitors. Another option would be to go ahead and make that information freely available to visitors from Europe.

Google AdWords

If you use Google AdWords to advertise to European residents, Google already required publishers and advertisers to get permission from end users by putting disclaimers on the landing page, but GDPR makes some changes to these requirements. Google will now require publishers to get clear consent from individuals to have their information collected. Not only does this mean you have to give more information about how a person’s information will be used, you’ll also need to keep records of consent and tell users how they can opt out later on if they want to do so. If a person doesn’t give consent to having their information collected, Google will make it possible to serve them non-personalized ads.

In the end

GDPR is a significant change and trying to grasp the full scope of its changes is pretty daunting. This is far from being a comprehensive guide, so if you have any questions about how GDPR applies to a particular client you’re working with, it may be best to get in touch with their legal department or team. GDPR will impact some industries more than others, so it’s best to get some input from someone who truly understands the law and how it applies to that specific business.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


GDPR: What it Means for Google Analytics & Online Marketing posted first on https://moz.com/blog

Let’s Make Money: 4 Tactics for Agencies Looking to Succeed – Whiteboard Friday

Posted by rjonesx.

We spend a lot of time discussing SEO tactics, but in a constantly changing industry, one thing that deserves more attention are the tactics agencies should employ in order to see success. From confidently raising your prices to knowing when to say no, Moz's own Russ Jones covers four essential success tactics that'll ultimately increase your bottom line in today's edition of Whiteboard Friday.

Agency tactics

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I am Russ Jones, and I can't tell you how excited I am for my first Whiteboard Friday. I am Principal Search Scientist here at Moz. But before coming to Moz, for the 10 years prior to that, I was the Chief Technology Officer of a small SEO agency back in North Carolina. So I have a strong passion for agencies and consultants who are on the ground doing the work, helping websites rank better and helping build businesses.

So what I wanted to do today was spend a little bit of time talking about the lessons that I learned at an agency that admittedly I only learned through trial and error. But before we even go further, I just wanted to thank the folks at Hive Digital who I learned so much from, Jeff and Jake and Malcolm and Ryan, because the team effort over time is what ended up building an agency. Any agency that succeeds knows that that's part of it. So we'll start with that thank-you.

But what I really want to get into is that we spend a lot of time talking about SEO tactics, but not really about how to succeed in an industry that changes rapidly, in which there's almost no certification, and where it can be difficult to explain to customers exactly how they're going to be successful with what you offer. So what I'm going to do is break down four really important rules that I learned over the course of those 10 years. We're going to go through each one of them as quickly as possible, but at the same time, hopefully you'll walk away with some good ideas. Some of these might feel a little bit awkward at first, but just follow me.

1. Raise prices

The first rule, number one in Let's Make Money, is raise your prices. Now, I remember quite clearly, two years into my job at Hive Digital — it was called Virante then — we were talking about raising prices. We were just looking at our customers and saying to ourselves, "There's no way they can afford it." But then, luckily, we had the foresight to realize that there was more to raising prices than just charging your customers more.

How it benefits old customers

The first thing that just hit us automatically was... "Well, with our old customers, we can just discount them. It's not that bad. We're in the same place as we always were." But then it occurred to us, "Wait, wait, wait. If we discount our customers, then we're actually increasing our perceived value." Our existing customers now think, "Hey, they're actually selling something better that's more expensive, but I'm getting a deal," and by offering them that deal because of their loyalty, you engender more loyalty. So it can actually be good for old customers.

How it benefits new customers

Now, for new customers, once again, same sort of situation. You've increased the perceived value. So your customers who come to you think, "Oh, this company is professional. This company is willing to invest. This company is interested in providing the highest quality of services." In reality, because you've raised prices, you can. You can spend more time and money on each customer and actually do a better job. The third part is, "What's the worst that could happen?" If they say no, you offer them the discount. You're back where you started. You're in the same position that you were before.

How it benefits your workers

Now, here's where it really matters — your employees, your workers. If you are offering bottom-line prices, you can't offer them raises, you can't offer them training, you can't hire them help, and you can't get better workers. But if you raise prices, the whole ecosystem that is your agency will do better.

How it improves your resources

Finally, and most importantly, which we'll talk a little bit more later, is that you can finally tool up. You can get the resources and capital that you need to actually succeed. I drew this kind of out.

If we have a graph of quality of services that you offer and the price that you sell at, most agencies think that they're offering great quality at a little price, but the reality is you're probably down here. You're probably under-selling your services and, because of that, you can't offer the best that you can.

You should be up here. You should be offering higher quality, your experts who spend time all day studying this, and raising prices allows you to do that.

2. Schedule

Now, raising prices is only part one. The second thing is discipline, and I am really horrible about this. The reality is that I'm the kind of guy who looks for the latest and greatest and just jumps into it, but schedule matters. As hard as it is to admit it, I learned this from the CPC folks because they know that they have to stay on top of it every day of the week.

Well, here's something that we kind of came up with as I was leaving the company, and that was to set all of our customers as much as possible into a schedule.

  • Annually: We would do a complete keyword and competitor analysis.
  • Semi-annually: Twice a year, we would do content analysis. What should you be writing about? What's changed in your industry? What are different keywords you might be able to target now, given additional resources?
  • Quarterly: You need to be doing a complete link analysis. It's just a big enough issue that you've got to look at it every couple of months.
  • Monthly: You should be looking at your crawls. Moz will do that every week for you, but you should give your customers an idea, over the course of a month, of what's changed.
  • Weekly: You should be checking rankings.

But there are three things that, when you do all of these types of analysis, you need to keep in mind. Each one of them is a...

  • Report
  • Hours for consulting
  • Phone call

This might seem like a little bit of overkill. But of course, if one of these comes back and nothing changed, you don't need to do the phone call, but each one of these represents additional money in your pocket and importantly better service for your customers.

It might seem hard to believe that when you go to a customer and you tell them, "Look, nothing's changed," you're actually giving them value, but the truth is that if you go to the dentist and he tells you that you don't have a cavity, that's good news. You shouldn't say to yourself at the end of the day, "Why'd I go to the dentist in the first place?" You should say, "I'm so glad I went to the dentist." With that same positive outlook, you should be selling to your customers over and over and over again, hoping to give them the clarity they need to succeed.

3. Tool up!

So number three, you're going to see this a lot in my videos because I just love SEO tools, but you've got to tool up. Once you've raised prices and you're making more money with your customers, you actually can. Tools are superpowers. Tools allow you to do things that humans just can't do. Like I can't figure out the link graph on my own. I need tools to do it. But tools can do so much more than just auditing existing clients. For example, they can give you...

Better leads:

You can use tools to find opportunities. Take the tools within Moz, for example: say you've already successfully serviced a car dealership, and you want to find other car dealerships in the area that are really good and have an opportunity to rank, but aren't doing as well as they should be in the SERPs. Tools like Moz can do that. You don't just have to use them to help your clients. You can use them to help yourself.

Better pre-audits:

Nobody should walk into a sales call blind. You know who the prospect is and what their website is, so you can start with a great pre-audit.

Faster workflows:

Which means you make more money quicker. If you can do your keyword analysis annually in half the time because you have the right tool for it, then you're going to make far more money and be able to serve more customers.

Bulk pricing:

This one is just mind-blowingly simple. It's bulk pricing. Every tool out there, the more you buy from them, the lower the price is. I remember at my old company sitting down at one point and recognizing that every customer that came in the door would need to spend about $1,000 on individual accounts to match what they were getting through us by being able to take advantage of the bulk discounts that we were getting as an agency by buying these seats on behalf of all of our customers.

So tell your clients when you're talking to them on the phone, in the pitch be like, "Look, we use Moz, Majestic, Ahrefs, SEMrush," list off all of the competitors. "We do Screaming Frog." Just name them all and say, "If you wanted to go out and just get the data yourself from these tools, it would cost you more than we're actually charging you." The tools can sell themselves. You are saving them money.

4. Just say NO

Now, the last section, real quickly, are the things you've just got to learn to say no to. One of them has a little nuance to it. There's going to be some bite back in the comments, I'm pretty sure, but I want to be careful with it.

No month-to-month contracts

The first thing to say no to is month-to-month contracts.

If a customer comes to you and says, "Look, we want to do SEO, but we want to be able to cancel every 30 days," the reality is this: they're not interested in investing in SEO. They're interested in dabbling in SEO. They're interested in experimenting with SEO. Well, that's not going to succeed. It's only going to take one competitor or two who actually invest in it to beat them out, and when they beat them out, you're going to look bad and they're going to cancel their account with you. So sit down with them and explain that SEO is a long-term strategy, and that it's just not worth it to your company to bring on customers who aren't interested in investing in SEO. Say it politely, but just turn it away.

Don't turn anything away

Now, notice that my next thing is don't turn anything away. So here's something careful. Here's the nuance. It's really important to learn to fire clients who are bad for your business, where you're losing money on them or they're just impolite, but that doesn't mean you have to turn them away. You just need to turn them in the right direction. That right direction might be tools themselves. You can say, "Look, you don't really need our consulting hours. You should go use these tools." Or you can turn them to other fledgling businesses, friends you have in the industry who might be struggling at this time.

I'll tell you a quick example. We don't have much time, but many, many years ago, we had a client that came to us. At our old company, we had a couple of rules about who we would work with. We chose not to work in the adult industry. But at the time, I had a friend in the industry. He lived outside of the United States, and he had fallen on hard times. He literally had his business taken away from him via a series of just really unscrupulous events. I picked up the phone and gave him a call. I didn't turn away the customer. I turned them over to this individual.

That very next year, he had ended up landing a new job at the top of one of the largest gambling organizations in the world. Well, frankly, they weren't on our list of people we couldn't work with. We landed the largest contract in the history of our company at that time, and it set our company straight for an entire year. It was just because instead of turning away the client, we turned them to a different direction. So you've got to say no to turning away everybody. They are opportunities. They might not be your opportunity, but they're someone's.

No service creep

The last one is service creep. Oh, man, this one is hard. A customer comes up to you and they list off three things that you offer that they want, and then they say, "Oh, yeah, we need social media management." Somebody else comes up to you with three things that you offer, and they say, "Oh yeah, we need you to write content," and that's not something you do. You've just got to not do that. You've got to learn to shave off services that you can't offer. Instead, turn them over to people who can do them and do them very well.

What you're going to end up doing in your conversation, your sales pitch, is saying, "Look, I'm going to be honest with you. We are great at some things, but this isn't our cup of tea. We know someone who's really great at it." That honesty, that candidness, is just going to give them such a better relationship with you, and it's going to build a stronger relationship with those other specialty companies, who are going to send business your way. So it's really important to learn to say no to service creep.

Well, anyway, there's a lot that we went over there. I hope it wasn't too much too fast, but hopefully we can talk more about it in the comments. I look forward to seeing you there. Thanks.

Video transcription by Speechpad.com




Let's Make Money: 4 Tactics for Agencies Looking to Succeed - Whiteboard Friday posted first on https://moz.com/blog

Time to Act: Review Responses Just Evolved from “Extra” to “Expected”

Posted by MiriamEllis

I’ve advocated the use of Google’s owner response review feature since it first rolled out in 2010. This vital vehicle defends brand reputation and revenue, offering companies a means of transforming dissatisfied consumers into satisfied ones, supporting retention so that less has to be spent on new customer acquisition. I consider review responses to be a core customer service responsibility. Yet, eight years into the existence of this feature, marketing forums are still filled with entry-level questions like:

  • Should I respond to reviews?
  • Should I respond to positive reviews?
  • How should I respond to negative reviews?

Over the years, I’ve seen different local SEO consultants reply in differing degrees to these common threads, but as of May 11, 2018, both agencies and brands woke to a new day: the day on which Google announced it would be emailing notifications like this to consumers when a business responds to their reviews, prompting them to view the reply.

Surveys indicate that well over 50% of consumers already expect responses within days of reviewing a business. With Google’s rollout, we can assume that this number is about to rise.

Why is this noteworthy news? I’ll explain exactly that in this post, plus demo how Moz Local can be a significant help to owners and marketers in succeeding in this new environment.

When "extra" becomes "expected"

In the past, owner responses may have felt like something extra a business could do to improve management of its reputation. Perhaps a company you’re marketing has been making the effort to respond to negative reviews, at the very least, but you’ve let replying to positive reviews slide. Or maybe you respond to reviews when you can get around to it, with days or weeks transpiring between consumer feedback and brand reaction.

Google’s announcement is important for two key reasons:

1) It signals that Google is turning reviews into a truly interactive feature, in keeping with so much else they’ve rolled out to the Knowledge Panel in recent times. Like booking buttons and Google Questions & Answers, notifications of owner responses are Google’s latest step towards making Knowledge Panels transactional platforms instead of static data entities. Every new feature brings us that much closer to Google positioning itself between providers and patrons for as many transactional moments as possible.

2) It signals a major turning point in consumer expectations. In the past, reviewers have left responses from motives of “having their say,” whether that’s to praise a business, warn fellow consumers, or simply document their experiences.

Now, imagine a patron who writes a negative review of two different restaurants he dined at for Sunday lunch and dinner. On Monday, he opens his email to find a Google notification that Restaurant A has left an owner response sincerely apologizing and reasonably explaining why service was unusually slow that weekend, but that Restaurant B is meeting his complaint about a rude waiter with dead air.

“So, Restaurant A cares about me, and Restaurant B couldn’t care less,” the consumer is left to conclude, creating an emotional memory that could inform whether he’s ever willing to give either business a second chance in the future.

Just one experience of receiving an owner response notification will set the rules of the game from here on out, making all future businesses that fail to respond seem inaccessible, neglectful, and even uncaring. It’s the difference between reviewers narrating their experiences from random motives, and leaving feedback with the expectation of being heard and answered.

I will go so far as to predict that Google’s announcement ups the game for all review platforms, because it will make owner responses to consumer sentiment an expected, rather than extra, effort.

The burden is on brands

Because no intelligent business would believe it can succeed in modern commerce while appearing unreachable or unconcerned, Google’s announcement calls for a priority shift. For brands large and small, it may not be an easy one, but it should look something like this:

  • Negative reviews are now direct cries for help to our business; we will respond with whatever help we can give within X number of hours or days upon receipt
  • Positive reviews are now thank-you notes directly to our company; we will respond with gratitude within X number of hours or days upon receipt

Defining X is going to have to depend on the resources of your organization, but in an environment in which consumers expect your reply, the task of responding must now be moved from the back burner to a hotter spot on the stovetop. Statistics differ in past assessments of consumer expectations of response times:

  • In 2016, GetFiveStars found that 15.6% of consumers expected a reply within 1–3 hours, and 68.3% expected a reply within 1–3 days of leaving a review.
  • In 2017, RevLocal found that 52% of consumers expected responses within 7 days.
  • Overall, 30% of survey respondents told BrightLocal in 2017 that owner responses were a factor they looked at in judging a business.

My own expectation is that all of these numbers will now rise as a result of Google’s new function, meaning that the safest bet will be the fastest possible response. If resources are limited, I recommend prioritizing negative sentiment, aiming for a reply within hours rather than days as the best hope of winning back a customer. “Thank yous” for positive sentiment can likely wait for a couple of days, if absolutely necessary.

It’s inspiring to know that a recent Location3 study found that brands which do a good job of responding to reviews saw an average conversion rate of 13.9%, versus lackluster responders whose conversion rate was 10.4%. Depending on what you sell, those 3.5 points could be financially significant! But it’s not always easy to be optimally responsive.

If your business is small, accelerating response times can feel like a burden because of lack of people resources. If your business is a large, multi-location enterprise, the burden may lie in organizing awareness of hundreds of incoming reviews in a day, as well as keeping track of which reviews have been responded to and which haven’t.

The good news is…

Moz Local can help

The screenshot above is taken from the Moz Local dashboard. If you’re a customer, just log into your Moz Local account and go to your review section. From the “sources” section, choose “Google” — you’ll see the option to filter your reviews by “replied” and “not replied.” You’ll instantly be able to see which reviews you haven’t yet responded to. From there, simply use the in-dashboard feature that enables you to respond to your (or your clients’) reviews, without having to head over to your GMB dashboard or log into a variety of different clients’ GMB dashboards. So easy!

I highly recommend that all our awesome customers do this today and be sure you’ve responded to all of your most recent reviews. And, in the future, if you’re working your way through a stack of new, incoming Google reviews, this function should make it a great deal easier to keep organized about which ones you’ve checked off and which ones are still awaiting your response. I sincerely hope this function makes your work more efficient!

Need some help setting the right review response tone?

Please check out Mastering the Owner Response to the Quintet of Google My Business Reviews, which I published in 2016 to advocate responsiveness. It will walk you through these typical types of reviews you’ll be receiving:

  • “I love you!”
  • “I haven’t made up my mind yet.”
  • “There was hair in my taco...”
  • “I’m actually your competitor!”
  • “I’m citing illegal stuff.”

The one update I’d make to the advice in the above piece, given Google’s rollout of the new notification function, would be to increase the number of positive reviews to which you’re responding. In 2016, I suggested that enterprises managing hundreds of locations should aim to express gratitude for at least 10% of favorable reviews. In 2018, I’d say reply with thanks to as many of these as you possibly can. Why? Because reviews are now becoming more transactional than ever, and if a customer says, “I like you,” it’s only polite to say, “Thanks!”. As more customers begin to expect responsiveness, failure to acknowledge praise could feel uncaring.

I would also suggest that responses to negative reviews more strongly feature a plea to the customer to contact the business so that things can be “made right.” GetFiveStars co-founder, Mike Blumenthal, is hoping that Google might one day create a private channel for brands and consumers to resolve complaints, but until that happens, definitely keep in mind that:

  1. The new email alerts will ensure that more customers realize you’ve responded to their negative sentiment.
  2. If, while “making things right” in the public response, you also urge the unhappy customer to let you make things “more right” in person, you will enhance your chances of retaining him.
  3. If you are able to publicly or privately resolve a complaint, the customer may feel inspired to amend his review and raise your star rating; over time, more customers doing this could significantly improve your conversions and, possibly, your local search rankings.
  4. All potential customers who see your active responses to complaints will understand that your policies are consumer-friendly, which should increase the likelihood of them choosing your business for transactions.

Looking ahead

One of the most interesting aspects I’m considering as of the rollout of response notifications is whether it may ultimately impact the tone of reviews themselves. In the past, some reviewers have given way to excesses in their sentiment, writing about companies in the ugliest possible language… language I’ve always wanted to hope they wouldn’t use face-to-face with other human beings at the place of business. I’m wondering now whether knowing there’s a very good chance a brand will respond to their feedback could lessen the instances of consumers taking wild, often anonymous potshots at brands, and create a more real-world, conversational environment.

In other words, instead of: “You overcharged me $3 for a soda and I know it’s because you’re [expletive] scammers, liars, crooks!!! Everyone beware of this company!!!”

We might see: “Hey guys, I just noticed a $3 overcharge on my receipt. I’m not too happy about this.”

The former scenario is honestly embarrassing. Trying to make someone feel better when they’ve just called you a thief feels a bit ridiculous and depressing. But the latter scenario is, at least, situation-appropriate instead of blown out of all proportion, creating an opening for you and your company to respond well and foster loyalty.

I can’t guarantee that reviewers will tone it down a bit if they feel more certain of being heard, but I’m hoping it will go that way in more and more cases.

What do you think? How will Google’s new function impact the businesses you market and the reviewers you serve? Please share your take and your tips with our community!




Time to Act: Review Responses Just Evolved from "Extra" to "Expected" posted first on https://moz.com/blog

How to Write Meta Descriptions in a Constantly Changing World (AKA Google Giveth, Google Taketh Away)

Posted by Dr-Pete

Summary: As of mid-May 2018, Google has reverted back to shorter display snippets. Our data suggests these changes are widespread and that most meta descriptions are being cut off in the previous range of about 155–160 characters.

Back in December, Google made a significant shift in how they displayed search snippets, with our research showing many snippets over 300 characters. Over the weekend, they seem to have rolled back that change (Danny Sullivan partially confirmed this on Twitter on May 14). Besides the obvious question — What are the new limits? — it may leave you wondering how to cope when the rules keep changing. None of us have a crystal ball, but I'm going to attempt to answer both questions based on what we know today.
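Given that summary, a quick sanity check for your own pages might look like this. It's a minimal sketch, not a definitive rule: the 155-character threshold is just the approximate display limit cited above, and the sample descriptions are invented for illustration.

```python
LIMIT = 155  # approximate display limit from the summary above (an assumption, not a hard rule)

def flag_long_descriptions(descriptions, limit=LIMIT):
    """Return (description, length) pairs likely to be truncated in the SERP."""
    return [(d, len(d)) for d in descriptions if len(d) > limit]

# Invented meta descriptions for illustration.
metas = [
    "A concise description well under the limit.",
    "A much longer description " + "that rambles on " * 12,
]

for desc, length in flag_long_descriptions(metas):
    print(f"{length} chars, likely truncated: {desc[:40]}...")
```

Because the limit keeps moving, it's safer to treat this kind of check as a warning flag than a hard validation step.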

Lies, dirty lies, and statistics...

I pulled all available search snippets from the MozCast 10K (page-1 Google results for 10,000 keywords), since that's a data set we collect daily and that has a rich history. There were 89,383 display snippets across that data set on the morning of May 15.

I could tell you that, across the entire data set, the minimum length was 6 characters, the maximum was 386, and the mean was about 159. That's not very useful, for a couple of reasons. First, telling you to write meta descriptions between 6–386 characters isn't exactly helpful advice. Second, we're dealing with a lot of extremes. For example, here's a snippet on a search for "USMC":

Marine Corps Community Services may be a wonderful organization, but I'm sorry to report that their meta description is, in fact, "apple" (Google appends the period out of, I assume, desperation). Here's a snippet for a search on the department store "Younkers":

Putting aside their serious multi-brand confusion, I think we can all agree that "BER Meta TAG1" is not optimal. If these cases teach you anything, it's only about what not to do. What about on the opposite extreme? Here's a snippet with 386 characters, from a search for "non-compete agreement":

Notice the "Jump to Exceptions" and links at the beginning. Those have been added by Google, so it's tough to say what counts against the character count and what doesn't. Here's one without those add-ons that clocks in at 370 characters, from a search for "the Hunger Games books":

So, we know that longer snippets do still exist. Note, though, that both of these snippets come from Wikipedia, which is an exception to many SEO rules. Are these long descriptions only fringe cases? Looking at the mean (or even the median, in this case) doesn't really tell us.

The big picture, part 1

Sometimes, you have to let the data try to speak for itself, with a minimum of coaxing. Let's look at all of the snippets that were cut off (ending in "...") and remove video results (we know from previous research that these skew a bit shorter). This leaves 42,863 snippets (just under half of our data set). Here's a graph of all of the cut-off lengths, gathered into 25 character bins (0–25, 26–50, etc.):

This looks very different from our data back in December, and is clearly clustered in the 150–175 character range. We see a few Google display snippets cut off after the 300+ range, but those are dwarfed by the shorter cut-offs.

The big picture, part 2

Obviously, there's a lot happening in that 125–175 character range, so let's zoom in and look at just the middle portion of the frequency distribution, broken up into smaller, 5-character buckets:

We can see pretty clearly that the bulk of cut-offs are happening in the 145–165 character range. Before December, our previous guidelines for meta descriptions were to keep them below 155 characters, so it appears that Google has more or less reverted to the old rules.

Keep in mind that Google uses proportional fonts, so there is no exact character limit. Some people have hypothesized a pixel-width limit, like with title tags, but I've found that more difficult to pin down with multi-line snippets (the situation gets even weirder on mobile results). Practically, it's also difficult to write to a pixel limit. The data suggests that 155 characters is a reasonable approximation.
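Because the real cut-off is based on rendered pixel width, any character-based check is only an approximation. Still, a quick sanity-check function (a sketch built on that 155-character assumption, with a function name of my own invention) can flag descriptions likely to be truncated:

```python
def preview_snippet(description, limit=155):
    """Approximate Google's cut-off: truncate at the last whole word
    within `limit` characters and append an ellipsis. Google actually
    truncates by pixel width, so treat this as a rough sanity check."""
    if len(description) <= limit:
        return description
    truncated = description[:limit].rsplit(" ", 1)[0]
    return truncated + " ..."

print(preview_snippet("A short description survives intact."))
```

Running existing meta descriptions through a check like this is a quick way to see which ones are at risk of ending in an ellipsis.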

To the Wayback Machine... ?!

Should we just go back to a 155 character cut-off? If you've already written longer meta descriptions, should you scrap that work and start over? The simple truth is that none of us know what's going to happen next week. The way I see it, we have four viable options:

(1) Let Google handle it

Some sites don't have meta descriptions at all. Wikipedia happens to be one of them. Now, Google's understanding of Wikipedia's content is much deeper than most sites (thanks, in part, to Wikidata), but many sites do fare fine without the tag. If your choice is to either write bad, repetitive tags or leave them blank, then I'd say leave them blank and let Google sort it out.

(2) Let the ... fall where it may

You could just write to the length you think is ideal for any given page (within reason), and if the snippets get cut off, don't worry about it. Maybe the ellipsis (...) adds intrigue. I'm half-joking, but the reality is that a cut-off isn't the kiss of death. A good description should entice people to want to read more.

(3) Chop everything at 155 characters

You could go back and mercilessly hack all of your hard work back to 155 characters. I think this is generally going to be time badly spent and may result in even worse search snippets. If you want to rewrite shorter meta descriptions for your most important pages, that's perfectly reasonable, but keep in mind that some results are still showing longer snippets and this situation will continue to evolve.

(4) Write length-adaptive descriptions

Is it possible to write a description that works well at both lengths? I think it is, with some care and planning. I wouldn't necessarily recommend this for every single page, but maybe there is a way to have our cake and eat at least half of it, too...

The 150/150 approach

I've been a bit obsessed with the "inverted pyramid" style of writing lately. This is a journalistic style where you start with the lead or summary of your main point and then break that down into the details, data, and context. While this approach is well suited to the web, its origins come from layout limitations in print. You never knew when your editor would have to cut your article short to fit the available space, so the inverted pyramid style helped guarantee that the most important part would usually be spared.

What if we took this approach to meta descriptions? In other words, why not write a 150-character "lead" that summarizes the page, and then add 150 characters of useful but less essential detail (when adding that detail makes sense and provides value)? The 150/150 isn't a magic number — you could even do 100/100 or 100/200. The key is to make sure that the text before the cut can stand on its own.

Think of it a bit like an ad, with two separate lines of copy. Let's take this blog post:

Line 1 (145 chars.)

In December, we reported that Google increased search snippets to over 300 characters. Unfortunately, it looks like the rules have changed again.

Line 2 (122 chars.)

According to our new research (May 2018), the limit is back to 155-160 characters. How should SEOs adapt to these changes?

Line 1 has the short version of the story and hopefully lets searchers know they're heading down the right path. Line 2 dives into a few details and gives away just enough data (hopefully) to be intriguing. If Google uses the longer description, it should work nicely, but if they don't, we shouldn't be any worse for wear.
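To make the two-part idea concrete, here's a small sketch that checks whether the lead stands on its own and whether the combined copy fits the longer display length. The function name and the 155/300 limits are my own approximations drawn from the data above, not official rules:

```python
def check_adaptive_description(lead, detail, short_limit=155, long_limit=300):
    """Sanity-check a 150/150-style meta description: the lead should
    stand alone within the short display limit, and lead + detail
    should fit the longer one. Limits are rough approximations."""
    problems = []
    if len(lead) > short_limit:
        problems.append("lead is %d chars (over %d)" % (len(lead), short_limit))
    combined = lead + " " + detail
    if len(combined) > long_limit:
        problems.append("combined is %d chars (over %d)" % (len(combined), long_limit))
    return problems

lead = ("In December, we reported that Google increased search snippets to over "
        "300 characters. Unfortunately, it looks like the rules have changed again.")
detail = ("According to our new research (May 2018), the limit is back to "
          "155-160 characters. How should SEOs adapt to these changes?")
print(check_adaptive_description(lead, detail))
```

An empty list means the description works at both lengths; otherwise you get a note on which half needs trimming.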

Should you even bother?

Is this worth the effort? I think writing effective descriptions that engage search visitors is still very important in theory (and may even indirectly impact rankings), but you may find you can write perfectly well within a 155-character limit. We also have to face the reality that Google seems to be rewriting more and more descriptions. This is difficult to measure, as many rewrites are partial, but there's no guarantee that your meta description will be used as written.

Is there any way to tell when a longer snippet (>300 characters) will still be used? Some SEOs have hypothesized a link between longer snippets and featured snippets at the top of the page. In our overall data set, 13.3% of all SERPs had featured snippets. If we look at just SERPs with a maximum display snippet length of 160 characters (i.e. no result was longer than 160 characters), the featured snippet occurrence was 11.4%. If we look at SERPs with at least one display snippet over 300 characters, featured snippets occurred at a rate of 41.8%. While that second data set is fairly small, it is a striking difference. There does seem to be some connection between Google's ability to extract answers in the form of featured snippets and their ability or willingness to display longer search snippets. In many cases, though, these longer snippets are rewrites or taken directly from the page, so even then there's no guarantee that Google will use your longer meta description.
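The comparison above boils down to computing a featured-snippet rate within each group of SERPs. A minimal sketch of that calculation, using made-up sample data rather than the MozCast set:

```python
# Sketch: featured-snippet occurrence rate by max display-snippet length.
# Each record is one SERP: its longest snippet length, and whether a
# featured snippet appeared. The data below is illustrative only.
serps = [
    {"max_len": 158, "featured": False},
    {"max_len": 152, "featured": True},
    {"max_len": 320, "featured": True},
    {"max_len": 310, "featured": False},
    {"max_len": 149, "featured": False},
]

def featured_rate(serps, predicate):
    """Share of SERPs matching `predicate` that show a featured snippet."""
    group = [s for s in serps if predicate(s)]
    return sum(s["featured"] for s in group) / len(group) if group else 0.0

short_rate = featured_rate(serps, lambda s: s["max_len"] <= 160)
long_rate = featured_rate(serps, lambda s: s["max_len"] > 300)
print(short_rate, long_rate)
```

With the real data, the same split produced the 11.4% vs. 41.8% contrast described above.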

For now, it appears that the 155-character guideline is back in play. If you've already increased some of your meta descriptions, I don't think there's any reason to panic. It might make sense to rewrite overly long descriptions on critical pages, especially if the cut-offs are leading to bad results. If you do choose to rewrite some of them, consider the 150/150 approach — at least then you'll be a bit more future-proofed.



A Machine Learning Guide for Average Humans

Posted by alexis-sanders

Machine learning (ML) has grown consistently in worldwide prevalence. Its implications have stretched from small, seemingly inconsequential victories to groundbreaking discoveries. The SEO community is no exception. An understanding and intuition of machine learning can support our understanding of the challenges and solutions Google's engineers are facing, while also opening our minds to ML's broader implications.

The advantages of gaining a general understanding of machine learning include:

  • Gaining empathy for engineers, who are ultimately trying to establish the best results for users
  • Understanding what problems machines are solving, their current capabilities, and scientists' goals
  • Understanding the competitive ecosystem and how companies are using machine learning to drive results
  • Preparing oneself for what many industry leaders call a major shift in our society (Andrew Ng refers to AI as a "new electricity")
  • Understanding basic concepts that often appear within research (it's helped me with understanding certain concepts that appear within Google Brain's research)
  • Growing as an individual and expanding your horizons (you might really enjoy machine learning!)
  • Experiencing the very fulfilling, empowering feeling of working code and produced data (even if it's a very humble result)

I spent a year taking online courses, reading books, and learning about learning (...as a machine). This post is the fruit borne of that labor -- it covers 17 machine learning resources (including online courses, books, guides, conference presentations, etc.) comprising the most affordable and popular machine learning resources on the web (through the lens of a complete beginner). I've also added a summary of "If I were to start over again, how I would approach it."

This article isn't about credit or degrees. It's about regular Joes and Joannas with an interest in machine learning, and who want to spend their learning time efficiently. Most of these resources will consume over 50 hours of commitment. Ain't nobody got time for a painful waste of a work week (especially when this is probably completed during your personal time). The goal here is for you to find the resource that best suits your learning style. I genuinely hope you find this research useful, and I encourage comments on which materials prove most helpful (especially ones not included)! #HumanLearningMachineLearning


Executive summary:

Here's everything you need to know in a chart:

| Machine Learning Resource | Time (hours) | Cost ($) | Year | Credibility | Code | Math | Enjoyability |
|---|---|---|---|---|---|---|---|
| Jason Maye's Machine Learning 101 slidedeck: 2 years of headbanging, so you don't have to | 2 | $0 | '17 | 3 | 1 | 1 | 5 |
| {ML} Recipes with Josh Gordon Playlist | 2 | $0 | '16 | 3 | 3 | 1 | 4 |
| Machine Learning Crash Course | 15 | $0 | '18 | 4 | 4 | 2 | 4 |
| OCDevel Machine Learning Guide Podcast | 30 | $0 | '17- | 1 | 1 | 1 | 5 |
| Kaggle's Machine Learning Track (part 1) | 6 | $0 | '17 | 3 | 5 | 1 | 4 |
| Fast.ai (part 1) | 70 | $70* | '16 | 4 | 5 | 3 | 5 |
| Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems | 20 | $25 | '17 | 4 | 4 | 2 | 3 |
| Udacity's Intro to Machine Learning (Kate/Sebastian) | 60 | $0 | '15 | 4 | 4 | 3 | 3 |
| Andrew Ng's Coursera Machine Learning | 55 | $0 | '11 | 5 | 2 | 4 | 1 |
| iPullRank Machine Learning Guide | 3 | $0 | '17 | 1 | 1 | 1 | 3 |
| Review Google PhD | 2 | $0 | '17 | 5 | 4 | 2 | 2 |
| Caltech Machine Learning on iTunes | 27 | $0 | '12 | 5 | 2 | 5 | 2 |
| Pattern Recognition & Machine Learning by Christopher Bishop | 150 | $75 | '06 | 5 | 2 | 5 | N/A |
| Machine Learning: Hands-on for Developers and Technical Professionals | 15 | $50 | '15 | 2 | 3 | 2 | 3 |
| Introduction to Machine Learning with Python: A Guide for Data Scientists | 15 | $25 | '16 | 3 | 3 | 3 | 2 |
| Udacity's Machine Learning by Georgia Tech | 96 | $0 | '15 | 5 | 1 | 5 | 1 |
| Machine Learning Stanford iTunes by Andrew Ng | 25 | $0 | '08 | 5 | 1 | 5 | N/A |

(Credibility, Code, Math, and Enjoyability are rated on a 1–5 scale.)

*Free, but there is the cost of running an AWS EC2 instance (~$70 when I finished, but I did tinker a ton and made a Rick and Morty script generator, which I ran many epochs [rounds] of...)


Here's my suggested program:

1. Starting out (estimated 60 hours)

Start with shorter content targeting beginners. This will allow you to get the gist of what's going on with minimal time commitment.

2. Ready to commit (estimated 80 hours)

By this point, learners would understand their interest levels. Continue with content focused on applying relevant knowledge as fast as possible.

3. Broadening your horizons (estimated 115 hours)

If you've made it through the last section and are still hungry for more knowledge, move on to broadening your horizons. Read content focused on teaching the breadth of machine learning -- building an intuition for what the algorithms are trying to accomplish (whether visually or mathematically).

Your next steps

By this point, you will already have AWS running instances, a mathematical foundation, and an overarching view of machine learning. This is your jumping-off point to determine what you want to do.

You should be able to determine your next step based on your interest, whether it's entering Kaggle competitions; doing Fast.ai part two; diving deep into the mathematics with Pattern Recognition & Machine Learning by Christopher Bishop; taking Andrew Ng's newer Deeplearning.ai course on Coursera; learning more about specific tech stacks (TensorFlow, Scikit-Learn, Keras, Pandas, Numpy, etc.); or applying machine learning to your own problems.


Why am I recommending these steps and resources?

I am not qualified to write an article on machine learning. I don't have a PhD. I took one statistics class in college, which marked the first moment I truly understood "fight or flight" reactions. And to top it off, my coding skills are lackluster (at their best, they're chunks of reverse-engineered code from Stack Overflow). Despite my many shortcomings, this piece had to be written by someone like me, an average person.

Statistically speaking, most of us are average (ah, the bell curve/Gaussian distribution always catches up to us). Since I'm not tied to any elitist sentiments, I can be real with you. Below is a high-level summary of my reviews of all the classes I took, along with a plan for how I would approach learning machine learning if I could start over. Click to expand each course for the full version with notes.


In-depth reviews of machine learning courses:

Starting out

Jason Maye's Machine Learning 101 slidedeck: 2 years of head-banging, so you don't have to ↓

{ML} Recipes with Josh Gordon ↓

Google's Machine Learning Crash Course with TensorFlow APIs ↓

OCDevel's Machine Learning Guide Podcast ↓

Kaggle Machine Learning Track (Lesson 1) ↓


Ready to commit

Fast.ai (part 1 of 2) ↓

Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems ↓


Broadening your horizons

Udacity: Intro to Machine Learning (Kate/Sebastian) ↓

Andrew Ng's Coursera Machine Learning Course ↓


Additional machine learning opportunities

iPullRank Machine Learning Guide ↓

Review Google PhD ↓

Caltech Machine Learning iTunes ↓

"Pattern Recognition & Machine Learning" by Christopher Bishop ↓

Machine Learning: Hands-on for Developers and Technical Professionals ↓

Introduction to Machine Learning with Python: A Guide for Data Scientists ↓

Udacity: Machine Learning by Georgia Tech ↓

Andrew Ng's Stanford's Machine Learning iTunes ↓


Motivations and inspiration

If you're wondering why I spent a year doing this, then I'm with you. I'm genuinely not sure why I set my sights on this project, much less why I followed through with it. I saw Mike King give a session on Machine Learning. I was caught off guard, since I knew nothing on the topic. It gave me a pesky, insatiable curiosity itch. It started with one course and then spiraled out of control. Eventually it transformed into an idea: a review guide on the most affordable and popular machine learning resources on the web (through the lens of a complete beginner). Hopefully you found it useful, or at least somewhat interesting. Be sure to share your thoughts or questions in the comments!



Monitoring Featured Snippets – Whiteboard Friday

Posted by BritneyMuller

We've covered finding featured snippet opportunities. We've covered the process of targeting featured snippets you want to win. Now it's time for the third and final piece of the puzzle: how to monitor and measure the effectiveness of all your efforts thus far. In this episode of Whiteboard Friday, Britney shares three pro tips on how to make sure your featured snippet strategy is working.

Monitoring featured snippets

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we are going over part three of our three-part series all about featured snippets. So part one was about how to discover those featured snippet opportunities, part two was about how to target those, and this final one is how to properly monitor and measure the effectiveness of your targeting.

So we'll jump right in. So there are a couple different steps and things you can do to go through this.

I. Manually resubmit URL and check SERP in incognito

First is just to manually resubmit a URL after you have tweaked that page to target that featured snippet. Super easy to do. All you do is go to Google and you type in "add URL to Google." You will see a box pop up where you can submit that URL. You can also go through Search Console and submit it manually there. This just helps Google crawl the page a little faster and hopefully reconsider it for that featured snippet sooner.

From there, you can start to check for the keyword in an incognito window. So, in Chrome, you go to File > New Incognito Window. It tends to be a little less personalized than your regular browsing session when you're doing a search. This way, you start to get an idea of whether or not you're moving up in that search result. This can take anywhere from, I kid you not, a couple of minutes to months.

So Google tends to test different featured snippets over a long period of time, but occasionally I've had experience and I know a lot of you watching have had different experiences where you submit that URL to Google and boom — you're in that featured snippet. So it really just depends, but you can keep an eye on things this way.

II. Track rankings for target keyword and Search Console data!

But you also want to keep in mind that you want to start also tracking for rankings for your target keyword as well as Search Console data. So what does that click-through rate look like? How are the impressions? Is there an upward trend in you trying to target that snippet?

So, in my test set, I have seen an average increase of around 80% in rankings alone for those keywords. So that's a good sign that we're improving these pages and hopefully helping us earn more featured snippets.

III. Check for other featured snippets

Then this last kind of pro tip here is to check for other instances of featured snippets. This is a really fun thing to do. So if you do just a basic search for "what are title tags," you're going to see Moz in the featured snippet. Then if you do "what are title tags" with -site:Moz.com appended, you're going to see another featured snippet that Google is pulling from a different page, one that is not on Moz.com. So it's really interesting to evaluate the types of content that they are testing and pulling for featured snippets.

Another trick that you can do is to append the parameter &num= to the end of your Google search URL. Typically, when you do a search for "what are title tags," you'll see google.com/search?... in your address bar. If you append &num=2, &num=3, &num=4, and so on, Google will only show that many results on the page, and you can check whether it pulls a different featured snippet from each smaller quota of results. It's really, really interesting, and you start to see what they're testing and all that great stuff. So definitely play around with these two hacks right here.
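Building those test URLs by hand gets tedious, so here's a small sketch using Python's standard library to construct them. The function name is my own, and note that Google may change or ignore the num parameter at any time:

```python
from urllib.parse import urlencode

def google_search_url(query, num_results=None):
    """Build a Google search URL, optionally appending the num
    parameter to limit how many results the page shows."""
    params = {"q": query}
    if num_results is not None:
        params["num"] = num_results
    return "https://www.google.com/search?" + urlencode(params)

print(google_search_url("what are title tags", num_results=3))
```

Opening the generated URLs for num=2, 3, 4, and so on lets you quickly compare which featured snippet Google serves at each result count.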


Then lastly, you really just want to set the frequency of your monitoring to meet your needs. So hopefully, you have all of this information in a spreadsheet somewhere. You might have the keywords that you're targeting as well as are they successful yet, yes or no. What's the position? Is that going up or down?

Then you can start to prioritize. If you're doing hundreds, you're trying to target hundreds of featured snippets, maybe you check the really, really important ones once a week. Some of the others maybe are monthly checks.

From there, you really just need to keep track of, "Okay, well, what did I do to make that change? What was the improvement to that page to get it in the featured snippet?" That's where you also want to keep detailed notes on what's working for you and in your space and what's not.

So I hope this helps. I look forward to hearing all of your featured snippet targeting stories. I've gotten some really awesome emails and look forward to hearing more about your journey down below in the comments. Feel free to ask me any questions and I look forward to seeing you on our next edition of Whiteboard Friday. Thanks.

Video transcription by Speechpad.com

