Image Link Building – Whiteboard Friday

Posted by BritneyMuller

Image link building is a delicate art. It involves considerations distinct from traditional link building, and doing it successfully requires a balance of creativity, curiosity, and the right tools on hand. In today's Whiteboard Friday, Moz's own SEO and link building aficionado Britney Muller offers up concrete advice for successfully building links via images.


Video Transcription

Hey, Moz fans, welcome to another edition of Whiteboard Friday. Today we're going to go over all things image link building, which is sort of an art. I'm so excited to dig into this with you.

Know your link targets

So first and foremost, you need to know your link targets:

I. Popular industry platforms - top pages

What are those top platforms or websites that you would really like to acquire a link from? Then, from there, you can start to understand who might be influencers on those platforms, who's writing the content, who might you contact, and also what are the top pages currently for those sites. There are a number of tools that give you a glimpse into that information. Moz's OSE, Open Site Explorer, will show you top pages. SEMrush has a top page report. SimilarWeb has a popular page report. You can dig into all that information there, really interesting stuff.

II. Old popular images - update!

You can also dig into old, popular images and update them. What are the old popular images in your space that you might have an opportunity to revamp? A really neat way to find them is BuzzSumo's infographics filter: you enter the industry or topic you're trying to address, then filter by infographics to see what comes up.

III. Transform popular content into images

You can also just transform popular content into images, and I think there is so much opportunity in doing that for new statistics reports, new data that comes out. There are tons of great opportunities to transform those into multiple images and leverage that across different platforms for link building.

IV. Influencers

Again, just understanding who those influencers are.

Do your keyword research

So, from here, we're going to dive into the keyword research part of this whole puzzle, and this is really understanding the intent behind people searching about the topic or the product or whatever it might be. Something you can do is evaluate keywords with link intent. This is a brilliant concept I heard about a couple weeks back on Dan Shure's podcast. Thank you, Dan. Essentially, it's the idea that queries with words like "statistics" or "facts" after the keyword have link intent baked into the search query. It's brilliant. Those individuals are searching for something to reference, to maybe link to, to include in a presentation or an article or whatever that might be. It has this basic link intent.
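
As a rough sketch of how you might operationalize this idea, you could expand your seed keywords with link-intent modifiers before checking their volumes in your keyword tool. The modifier list below is my own assumption, not from the video:

```python
# Sketch: expand seed keywords with modifiers that tend to signal
# "link intent" (searchers hunting for something to cite).
# The modifier list is an assumption for illustration purposes.

LINK_INTENT_MODIFIERS = ["statistics", "facts", "data", "research"]

def link_intent_queries(seed_keywords):
    """Build candidate queries to check for volume in your keyword tool."""
    return [f"{kw} {mod}" for kw in seed_keywords
            for mod in LINK_INTENT_MODIFIERS]

# Usage: feed the output into your keyword research tool of choice.
print(link_intent_queries(["seo", "link building"])[:4])
```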

Another thing you want to evaluate is anything around images. Do any of your keywords plus "pictures" or "photos," etc., have good search volume and opportunity? What does that search result currently look like? You have to evaluate what's currently ranking to understand what's working and what's not. At my old agency, I used to say I didn't want anyone writing a piece of content until they had read all ten of the current search results for the keyword or phrase we were targeting. Why would you write anything until you have a full understanding of how that landscape currently looks and how you can make something way better?

Rand also mentioned this really cool tip: if you find some keywords, it's good to evaluate whether the image carousel shows up for those searches, because if it does, that's a little glimpse into searcher intent that leads to images. That's a good sign that you're on the right track to optimize for a certain image. It's something to keep in mind.

Provide value

So, from here, we're going to move up to providing value. Now we're in the brainstorming stage. Hopefully, you've gotten some ideas, you know where you want links from, and you need to provide value in some way. It could be a...

I. Reference/bookmark. Maybe something that people would bookmark; that always works.

II. Perspective is a really interesting one. So some of the most beautiful data visualizations do this extremely well, where they can simplify a confusing concept or a lot of data. It's a great way to leverage images and graphics.

III. Printouts still work really well. Moz has the SEO Dev Cheat Sheet that I have seen printed all over at different agencies, and that's really neat to see it adding value directly.

IV. Curate images. We see this a lot with different articles. Maybe the top 25 to 50 images from this tradeshow or this event or whatever it might be. That's a great way to build links and get people fired up about a curated piece of content.

Gregory Ciotti — I don't know if I'm saying that right — has an incredible article I suggest you all read called "Why a Visual Really Is Worth a Thousand Words," and he mentions don't be afraid to get obvious. I love that, because I think all too often we tend to overthink images and executing things in general. Why not just state the obvious and see how it goes? He's got great examples.

Optimize

So, from here, we are going to move into optimization. If any of you need a brush-up on image optimization, I highly suggest you check out Rand's Whiteboard Friday on image SEO. It covers everything. But some of the basics are your...

Title

You want to make sure that the title of the image has your keyword and explains what it is that you're trying to convey.

Alt text

Alt text was first and foremost designed for the visually impaired, so you need to be mindful that screen readers will read it aloud to explain what the image actually is. First and foremost, be helpful: describe the image accurately and descriptively.

Compression

Compression is huge. Page speed is so big right now. I hear about it all the time. I know you guys do too. But one of the easiest ways to help page speed is to compress those huge images. There's a ton of great free tools out there, like Optimizilla, where you can bulk upload a bunch of large images and then bulk download. It makes it super easy. There are also some desktop programs, if you're doing this kind of stuff all the time, that will automatically compress images you download or save. That might be worth looking into if you do this a lot.
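
If you'd rather script bulk compression than use a web tool, here's a minimal sketch using the Pillow library (the quality value of 70 is an arbitrary starting point I chose for illustration, not a recommendation from the video):

```python
# Minimal bulk-compression sketch using Pillow (pip install Pillow).
# Re-encodes an image as a JPEG at a lower quality setting; the
# quality value of 70 is an assumption to tune, not a fixed rule.
from io import BytesIO
from PIL import Image

def compress_jpeg(data: bytes, quality: int = 70) -> bytes:
    """Return a re-encoded, usually much smaller, JPEG byte string."""
    img = Image.open(BytesIO(data)).convert("RGB")
    out = BytesIO()
    img.save(out, "JPEG", quality=quality, optimize=True)
    return out.getvalue()
```

Loop this over a folder of large images and you get the same bulk-upload/bulk-download effect the web tools provide.
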

Host the image

You want the image to live on your own domain. You can leverage it on other platforms, but you want the original to be on your site.

SRCSET

The srcset attribute gets a little technical, but it's super interesting: it's this really incredible image attribute that lets you specify different versions of an image to be served at different viewport sizes. So you can not only have different images show up on different devices and at different sizes, but you can also tailor the same image, serving one version to a mobile user and another to a tablet, etc. John Henshaw has some of the greatest stuff on srcset. I highly suggest you look at some of his articles; he's doing really cool things with it. Check that out.
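
As a rough illustration (the file names and breakpoints here are hypothetical, not from the video), srcset markup might look like:

```html
<!-- Serve a smaller file on narrow viewports; filenames and widths are made up -->
<img src="whiteboard-800.jpg"
     srcset="whiteboard-400.jpg 400w,
             whiteboard-800.jpg 800w,
             whiteboard-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Whiteboard outlining the image link building process">
```

The browser picks the smallest file that still looks sharp for the current viewport and screen density, which ties directly back to the page speed point above.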

Promotion

So, from here, you want to promote your images. You obviously want to share them on popular platforms, and you want to reach back out to some of the people and pages you dug into earlier. If you updated a piece of content, make them aware of that. Or if you transformed a really popular piece of content into visuals, you might want to share that with the person who has been sharing the original. You want to tap into that previous research with your promotion.

Inform the influencers

Ask people to share it. There is nothing wrong with asking your network to share something you've worked really hard on, and hopefully that works in reverse, too: don't be afraid to share something a connection of yours worked really hard on.

Monitor the image SERPs

From here, you need to monitor. One of the best ways to do this is Google reverse image search. So if you go to Google and you click the images tab, there's that little camera icon that you can click on and upload images to see where else they live on the web. This is a great way to figure out who is using your image, where it's being held, are you getting a backlink or are you not. You want to keep an eye on all of that stuff.

Two other tools for this that I've heard about are Image Raider and TinEye, but I have not had great experience with either. I would love to hear your comments below if you have.

Reverse image search with Google works the best for me. This is also an awesome opportunity for someone to enter the market and create a Google Alert-style service for images. I don't think anyone is actually doing that right now. If you know someone who is, please let me know down below in the comments. But it could be a cool business opportunity, right? I don't know.

So for monitoring, let's say you find your image is being used on different websites. Now you need to do some basic outreach to get that link. You want to request that link for using your image.
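
Before you send outreach, you can do a rough programmatic check of whether a page that uses your image already links back to you. A minimal standard-library sketch (the domain is a placeholder):

```python
# Sketch: does a page's HTML contain an anchor link to our domain?
# Uses only the standard library; "us.com" is a placeholder domain.
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Records whether any <a href> on the page points at our domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if self.domain in href:
                self.found = True

def links_back(html: str, domain: str) -> bool:
    finder = LinkFinder(domain)
    finder.feed(html)
    return finder.found
```

Fetch each page that embeds your image, run it through `links_back`, and only email the owners of pages where it returns False.
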

This is just a super basic template that I came up with. You can use it. You can change it, do whatever you want. But it's just:

Hi, [first name].
Thank you so much for including our image in your article. Great piece. Just wondering if you could link to us.com as the source.
Thanks,
Britney

Something like that. Something short, to the point. If you can make it more personalized, please do so. I can't stress that enough. People will take you way more seriously if you have some nugget of personal information or connection that you can make.

From there, you just sort of stay in this loop. After you go through this process, you need to continue to promote your content and continue to monitor and do outreach and push that to maximize your link building efforts.

So I hope you enjoyed this. I look forward to hearing all of your thoughts down below in the comments and to seeing you all later. Thanks for joining us on this edition of Whiteboard Friday.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Image Link Building - Whiteboard Friday posted first on https://moz.com/blog

Moz the Monster: Anatomy of an (Averted) Brand Crisis

Posted by Dr-Pete

On the morning of Friday, November 10, we woke up to the news that John Lewis had launched an ad campaign called "Moz the Monster". If you're from the UK, John Lewis needs no introduction, but for our American audience, they're a high-end retail chain that's gained a reputation for a decade of amazing Christmas ads.

It's estimated that John Lewis spent upwards of £7m on this campaign (roughly $9.4M). It quickly became clear that they had organized a multi-channel effort, including a #mozthemonster Twitter campaign.

From a consumer perspective, Moz was just a lovable blue monster. From the perspective of a company that has spent years building a brand, John Lewis was potentially going to rewrite what "Moz" meant to the broader world. From a search perspective, we were facing a rare possibility of competing for our own brand on Google results if this campaign went viral (and John Lewis has a solid history of viral campaigns).

Step #1: Don't panic

At the speed of social media, it can be hard to stop and take a breath, but you have to remember that that speed cuts both ways. If you're too quick to respond and make a mistake, that mistake travels at the same speed and can turn into a self-fulfilling prophecy, creating exactly the disaster you feared.

The first step is to get multiple perspectives quickly. I took to Slack in the morning (I'm two hours ahead of the Seattle team) to find out who was awake. Two of our UK team (Jo and Eli) were quick to respond, which had the added benefit of getting us the local perspective.

Collectively, we decided that, in the spirit of our TAGFEE philosophy, a friendly monster deserved a friendly response. Even if we chose to look at it purely from a pragmatic, tactical standpoint, John Lewis wasn't a competitor, and going in metaphorical guns-blazing against a furry blue monster and the little boy he befriended could've been step one toward a reputation nightmare.

Step #2: Respond (carefully)

In some cases, you may choose not to respond, but in this case we felt that friendly engagement was our best approach. Since the Seattle team was finishing their first cup of coffee, I decided to test the waters with a tweet from my personal account:

I've got a smaller audience than the main Moz account, and a personal tweet sent while the west coast was still getting in gear meant less exposure. The initial response was positive, and we even got a little bit of feedback, such as suggestions to monitor UK Google SERPs (see "Step #3").

Our community team (thanks, Tyler!) quickly followed up with an official tweet:

While we didn't get direct engagement from John Lewis, the general community response was positive. Roger Mozbot and Moz the Monster could live in peace, at least for now.

Step #3: Measure

There was a longer-term fear – would engagement with the Moz the Monster campaign alter Google SERPs for Moz-related keywords? Google has become an incredibly dynamic engine, and the meaning of any given phrase can rewrite itself based on how searchers engage with that phrase. I decided to track "moz" itself across both the US and UK.

In that first day of the official campaign launch, searches for "moz" were already showing news ("Top Stories") results in the US and UK, with the text-only version in the US:

...and the richer Top Stories carousel in the UK:

The Guardian article that announced the campaign launch was also ranking organically, near the bottom of page one. So, even on day one, we were seeing some brand encroachment and knew we had to keep track of the situation on a daily basis.

Just two days later (November 12), Moz the Monster had captured four page-one organic results for "moz" in the UK (at the bottom of the page):

While it still wasn't time to panic, John Lewis' campaign was clearly having an impact on Google SERPs.

Step #4: Surprises

On November 13, it looked like the SERPs might be returning to normal. The Moz Blog had regained the Top Stories block in both US and UK results:

We weren't in the clear yet, though. A couple of days later, a plagiarism scandal broke, and it was dominating the UK news for "moz" by November 18:

This story also migrated into organic SERPs after The Guardian published an op-ed piece. Fortunately for John Lewis, the follow-up story didn't last very long. It's an important reminder, though, that you can't take your eyes off of the ball just because it seems to be rolling in the right direction.

Step #5: Results

It's one thing to see changes in the SERPs, but how was all of this impacting search trends and our actual traffic? Here's the data from Google Trends for a 4-week period around the Moz the Monster launch (2 weeks on either side):

The top graph is US trends data, and the bottom graph is UK. The large spike in the middle of the UK graph is November 10, where you can see that interest in the search "moz" increased dramatically. However, this spike fell off fairly quickly and US interest was relatively unaffected.

Let's look at the same time period for Google Search Console impression and click data. First, the US data (isolated to just the keyword "moz"):

There was almost no change in impressions or clicks in the US market. Now, the UK data:

Here, the launch spike in impressions is very clear, and closely mirrors the Google Trends data. However, clicks to Moz.com were, like the US market, unaffected. Hindsight is 20/20, and we were trying to make decisions on the fly, but the short-term shift in Google SERPs had very little impact on clicks to our site. People looking for Moz the Monster and people looking for Moz the search marketing tool are, not shockingly, two very different groups.

Ultimately, the impact of this campaign was short-lived, but it is interesting to see how quickly a SERP can rewrite itself based on the changing world, especially with an injection of ad dollars. At one point (in UK results), Moz the Monster had replaced Moz.com in over half (5 of 8) page-one organic spots and Top Stories – an impressive and somewhat alarming feat.

By December 2, Moz the Monster had completely disappeared from US and UK SERPs for the phrase "moz". New, short-term signals can rewrite search results, but when those signals fade, results often return to normal. So, remember not to panic and track real, bottom-line results.

Your crisis plan

So, how can we generalize this to other brand crises? What happens when someone else's campaign treads on your brand's hard-fought territory? Let's restate our 5-step process:

(1) Remember not to panic

The very word "crisis" almost demands panic, but remember that you can make any problem worse. I realize that's not very comforting, but unless your office is actually on fire, there's time to stop and assess the situation. Get multiple perspectives and make sure you're not overreacting.

(2) Be cautiously proactive

Unless there's a very good reason not to (such as a legal reason), it's almost always best to be proactive and respond to the situation on your own terms. At least acknowledge the situation, preferably with a touch of humor. These brand intrusions are, by their nature, high profile, and if you pretend it's not happening, you'll just look clueless.

(3) Track the impact

As soon as possible, start collecting data. These situations move quickly, and search rankings can change overnight in 2017. Find out what impact the event is really having as quickly as possible, even if you have to track some of it by hand. Don't wait for the perfect metrics or tracking tools.

(4) Don't get complacent

Search results are volatile and social media is fickle – don't assume that a lull or short-term change means you can stop and rest. Keep tracking, at least for a few days and preferably for a couple of weeks (depending on the severity of the crisis).

(5) Measure bottom-line results

As the days go by, you'll be able to more clearly see the impact. Track as deeply as you can – long-term rankings, traffic, even sales/conversions where necessary. This is the data that tells you if the short-term impact in (3) is really doing damage or is just superficial.

The real John Lewis

Finally, I'd like to give a shout-out to someone who has felt a much longer-term impact of John Lewis' successful holiday campaigns. Twitter user and computer science teacher @johnlewis has weathered his own brand crisis year after year with grace and humor:

So, a hat-tip to John Lewis, and, on behalf of Moz, a very happy holidays to Moz the Monster!



Keyword Research Beats Nate Silver’s 2016 Presidential Election Prediction

Posted by BritneyMuller

100% of statisticians would say this is a terrible method for predicting elections. However, in the case of 2016’s presidential election, analyzing the geographic search volume of a few telling keywords “predicted” the outcome more accurately than Nate Silver himself.

The 2016 US Presidential Election was a nail-biter, and many of us followed along with the famed statistician’s predictions in real time on FiveThirtyEight.com. Silver’s predictions, though more accurate than many, were still disrupted by the election results.

In an effort to better understand our country (and current political chaos), I dove into keyword research state-by-state searching for insights. Keywords can be powerful indicators of intent, thought, and behavior. What keyword searches might indicate a personal political opinion? Might there be a common denominator search among people with the same political beliefs?

It’s generally agreed that Fox News leans to the right and CNN leans to the left. And if we’ve learned anything this past year, it’s that the news you consume can have a strong impact on what you believe, in addition to the confirmation bias already present in seeking out particular sources of information.

My crazy idea: What if Republican states showed more “fox news” searches than “cnn”? What if those searches revealed a bias and an intent that exit polling seemed to obscure?

The limitations to this research were pretty obvious. Watching Fox News or CNN doesn’t necessarily correlate with voter behavior, but could it be a better indicator than the polls? My research says yes. I researched other media outlets as well, but the top two ideologically opposed news sources — in any of the 50 states — were consistently Fox News and CNN.

Using Google Keyword Planner (connected to a high-paying Adwords account to view the most accurate/non-bucketed data), I evaluated each state's search volume for “fox news” and “cnn.”

Eight states showed the exact same search volumes for both. Excluding those from my initial test, my results accurately predicted 42/42 of the 2016 presidential state outcomes including North Carolina and Wisconsin (which Silver mis-predicted). Interestingly, "cnn" even mirrored Hillary Clinton, similarly winning the popular vote (25,633,333 vs. 23,675,000 average monthly search volume for the United States).

In contrast, Nate Silver accurately predicted 45/50 states using a statistical methodology based on polling results.


This gets even more interesting:

The eight states showing the same average monthly search volume for both “cnn” and “fox news” are Arizona, Florida, Michigan, Nevada, New Mexico, Ohio, Pennsylvania, and Texas.

However, I was able to dive deeper via the GrepWords API (a keyword research tool that powers Keyword Explorer's data) to discover that Arizona, Nevada, New Mexico, Pennsylvania, and Ohio each show slightly different “cnn” vs. “fox news” search averages over the previous 12-month period. Those search volume averages are:


State        | “fox news” avg monthly search volume | “cnn” avg monthly search volume | KWR Prediction | 2016 Vote
Arizona      | 566,333                              | 518,583                         | Trump          | Trump
Nevada       | 213,833                              | 214,583                         | Hillary        | Hillary
New Mexico   | 138,833                              | 142,916                         | Hillary        | Hillary
Ohio         | 845,833                              | 781,083                         | Trump          | Trump
Pennsylvania | 1,030,500                            | 1,063,583                       | Hillary        | Trump

Four out of five isn’t bad! This brought my new prediction up to 46/47.

Silver and I each got Pennsylvania wrong. The GrepWords API shows the average monthly search volume for “cnn” was ~33,083 searches higher than “fox news” (to put that in perspective, that’s ~0.26% of the state’s population). That razor-thin search margin is mirrored in the vote itself: Trump’s 48.2% win against Clinton’s 47.5%.
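
The tie-breaker rule applied above (whichever outlet draws the higher average monthly search volume "wins" the state) can be sketched in a few lines, using the figures from the table:

```python
# Sketch of the prediction rule: the outlet with the higher average
# monthly search volume "wins" the state. Figures are the five
# tie-breaker states from the table above.
VOLUMES = {
    # state: ("fox news" volume, "cnn" volume)
    "Arizona": (566333, 518583),
    "Nevada": (213833, 214583),
    "New Mexico": (138833, 142916),
    "Ohio": (845833, 781083),
    "Pennsylvania": (1030500, 1063583),
}

def predict(fox: int, cnn: int) -> str:
    """More "fox news" searches -> Trump; otherwise -> Hillary."""
    return "Trump" if fox > cnn else "Hillary"

for state, (fox, cnn) in VOLUMES.items():
    print(f"{state}: {predict(fox, cnn)}")
```

Running this reproduces the KWR Prediction column, including the Pennsylvania miss.
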

Nate Silver and I have very different day jobs, and he wouldn’t make many of these hasty generalizations. Any prediction method can be right a couple times. However, it got me thinking about the power of keyword research: how it can reveal searcher intent, predict behavior, and sometimes even defy the logic of things like statistics.

It’s also easy to predict the past. What happens when we apply this model to today's Senate race?

Can we apply this theory to Alabama’s special election in the US Senate?

After completing the above research on a whim, I realized that we’re on the cusp of yet another hotly contested, extremely close election: the upcoming Alabama senate race, between controversy-laden Republican Roy Moore and Democratic challenger Doug Jones, fighting for a Senate seat that hasn’t been held by a Democrat since 1992.

I researched each Alabama county — 67 in total — for good measure. There are obviously a ton of variables at play. However, my theory correctly “predicts” the 2016 presidential county vote in 52 of the 67 counties (77.6%).

Even when giving the Democratic nominee more weight in the very low search volume counties (19 counties showed a search volume difference of less than 500), my numbers lean pretty far to the right (48/67 counties Republican):

It should be noted that my theory incorrectly guessed two of the five largest Alabama counties, Montgomery and Jefferson, which both voted Democrat in 2016.

Greene and Macon Counties should both vote Democrat; their very slight “cnn” over “fox news” search volume is confirmed by their previous presidential election results.

I realize state elections are not won by county, they’re won by popular vote, and the state of Alabama searches for “fox news” 204,000 more times a month than “cnn” (to put that in perspective, that’s around ~4.27% of Alabama’s population).

All things aside and regardless of outcome, this was an interesting exploration into how keyword research can offer us a glimpse into popular opinion, future behavior, and search intent. What do you think? Any other predictions we could make to test this theory? What other keywords or factors would you look at? Let us know in the comments.



Not-Actually-the-Best Local SEO Practices

Posted by MiriamEllis

It’s never fun being the bearer of bad news.

You’re on the phone with an amazing prospect. Let’s say it’s a growing appliance sales and repair provider with 75 locations in the western US. Your agency would absolutely love to onboard this client, and the contact is telling you, with some pride, that they’re already ranking pretty well for about half of their locations.

With the right strategy, getting them the rest of the way there should be no problem at all.

But then you notice something, and your end of the phone conversation falls a little quiet as you click through from one of their Google My Business listings in Visalia to Streetview and see… not a commercial building, but a house. Uh-oh. In answer to your delicately worded question, you find out that 45 of this brand’s listings have been built around the private homes of their repairmen — an egregious violation of Google’s guidelines.

“I hate to tell you this…,” you clear your throat, and then you deliver the bad news.


If you do in-house Local SEO, do it for clients, or even just answer questions in a forum, you’ve surely had the unenviable (yet vital) task of telling someone they’re “doing it wrong,” frequently after they’ve invested considerable resources in creating a marketing structure that threatens to topple due to a crack in its foundation. Sometimes you can patch the crack, but sometimes, whole edifices of bad marketing have to be demolished before safe and secure new buildings can be erected.

Here are 5 of the commonest foundational marketing mistakes I’ve encountered over the years as a Local SEO consultant and forum participant. If you run into these in your own work, you’ll be doing someone a big favor by delivering “the bad news” as quickly as possible:

1. Creating GMB listings at ineligible addresses

What you’ll hear:

“We need to rank for these other towns, because we want customers there. Well, no, we don’t really have offices there. We have P.O. Boxes/virtual offices/our employees’ houses.”

Why it’s a problem:

Google’s guidelines state:

  • Make sure that your page is created at your actual, real-world location
  • PO Boxes or mailboxes located at remote locations are not acceptable.
  • Service-area businesses—businesses that serve customers at their locations—should have one page for the central office or location and designate a service area from that point.

All of this adds up to Google saying you shouldn’t create a listing for anything other than a real-world location, but it’s extremely common to see a) spammers simply creating tons of listings for non-existent locations, b) people of good will not knowing the guidelines and doing the same thing, and c) service area businesses (SABs) feeling they have to create fake-location listings because Google won’t rank them for their service cities otherwise.

In all three scenarios, the brand puts itself at risk for detection and listing removal. Google can catch them, competitors and consumers can catch them, and marketers can catch them. Once caught, any effort that was put into ranking and building reputation around a fake-location listing is wasted. Better to have devoted resources to risk-free marketing efforts that will add up to something real.

What to do about it:

Advise the SAB owner to self-report the problem to Google. I know this sounds risky, but Google My Business forum Top Contributor Joy Hawkins let me know that she’s never seen a case in which Google has punished a business that self-reported accidental spam. The owner will likely need to un-verify the spam listings (see how to do that here) and then Google will likely remove the ineligible listings, leaving only the eligible ones intact.

What about dyed-in-the-wool spammers who know the guidelines and are violating them regardless, turning local pack results into useless junk? Get to the spam listing in Google Maps, click the “Suggest an edit” link, toggle the toggle to “Yes,” and choose the radio button for spam. Google may or may not act on your suggestion. If not, and the spam is misleading to consumers, I think it’s always a good idea to report it to the Google My Business forum in hopes that a volunteer Top Contributor may escalate an egregious case to a Google staffer.

2. Sharing phone numbers between multiple entities

What you’ll hear:

“I run both my dog walking service and my karate classes out of my house, but I don’t want to have to pay for two different phone lines.”

-or-

“Our restaurant has 3 locations in the city now, but we want all the calls to go through one number for reservation purposes. It’s just easier.”

-or-

“There are seven doctors at our practice. Front desk handles all calls. We can’t expect the doctors to answer their calls personally.”

Why it’s a problem:

There are actually multiple issues at hand on this one. First of all, Google’s guidelines state:

  • Provide a phone number that connects to your individual business location as directly as possible, and provide one website that represents your individual business location.
  • Use a local phone number instead of a central, call center helpline number whenever possible.
  • The phone number must be under the direct control of the business.

This rules out having the phone number of a single location representing multiple locations.

Confusing to Google

Google has also been known in the past to phone businesses for verification purposes. Should a business answer “Jim’s Dog Walking” when a Google rep is calling to verify that the phone number is associated with “Jim’s Karate Lessons,” we’re in trouble. Shared phone numbers have also been suspected in the past of causing accidental merging of Google listings, though I’ve not seen a case of this in a couple of years.

Confusing for businesses

As for the multi-practitioner scenario, the reality is that some business models simply don’t allow for practitioners to answer their own phones. Calls for doctors, dentists, attorneys, etc. are traditionally routed through a front desk. This reality calls into question whether forward-facing listings should be built for these individuals at all. We’ll dive deeper into this topic below, in the section on multi-practitioner listings.

Confusing for the ecosystem

Beyond Google-related concerns, Moz Local’s awesome engineers have taught me some rather amazing things about the problems shared phone numbers can create for citation-building campaigns in the greater ecosystem. Many local business data platforms are highly dependent on unique phone numbers as a signal of entity uniqueness (the “P” in NAP is powerful!). So, for example, if you submit both Jim’s Dog Walking and Jim’s Bookkeeping to Infogroup with the same number, Infogroup may publish both listings, but leave the phone number fields blank! And without a phone number, a local business listing is pretty worthless.

It’s because of realities like these that a unique phone number for each entity is a requirement of the Moz Local product, and should be a prerequisite for any citation building campaign.

What to do about it:

Let the business owner know that a unique phone number for each business entity, each business location, and each forward-facing practitioner who wants to be listed is a necessary business expense (and, hey, likely tax deductible, too!). Once the investment has been made in the unique numbers, the work ahead involves editing all existing citations to reflect them. The free tool Moz Check Listing can help you instantly locate existing citations for the purpose of creating a spreadsheet that details the bad data, allowing you to start correcting it manually. Or, to save time, the business owner may wish to invest in a paid, automated citation correction product like Moz Local.

Pro tip: Apart from removing local business listing stumbling blocks, unique phone numbers have an added bonus in that they enable the benefits of associating KPIs like clicks-to-call to a given entity, and existing numbers can be ported into call tracking numbers for even further analysis of traffic and conversions. You just can’t enjoy these benefits if you lump multiple entities together under a single, shared number.

3. Keyword stuffing GMB listing names

What you’ll hear:

“I have 5 locations in Dallas. How are my customers supposed to find the right one unless I add the neighborhood name to the business name on the listings?”

-or-

“We want customers to know we do both acupuncture and massage, so we put both in the listing name.”

-or-

“Well, no, the business name doesn’t actually have a city name in it, but my competitors are adding city names to their GMB listings and they’re outranking me!”

Why it’s a problem:

Long story short, it’s a blatant violation of Google’s guidelines to put extraneous keywords in the business name field of a GMB listing. Google states:

  • Your name should reflect your business’ real-world name, as used consistently on your storefront, website, stationery, and as known to customers.
  • Including unnecessary information in your business name is not permitted, and could result in your listing being suspended.

What to do about it:

I consider this a genuine Local SEO toughie. On the one hand, Google’s lack of enforcement of these guidelines, and apparent lack of concern about the whole thing, makes it difficult to adequately alarm business owners about the risk of suspension. I’ve successfully reported keyword stuffing violations to Google and have had them act on my reports within 24 hours… only to have the spammy names reappear hours or days afterwards. If there’s a suspension of some kind going on here, I don’t see it.

Simultaneously, Google’s local algo apparently continues to be influenced by exact keyword matches. When a business owner sees competitors outranking him via outlawed practices which Google appears to ignore, the Local SEO may feel slightly idiotic urging guideline-compliance from his patch of shaky ground.

But, do it anyway. For two reasons:

  1. If you’re not teaching business owners about the importance of brand building at this point, you’re not really teaching marketing. Ask the owner, “Are you into building a lasting brand, or are you hoping to get by on tricks?” Smart owners (and their marketers) will see that it’s a more legitimate strategy to build a future based on earning permanent local brand recognition for Lincoln & Herndon, than for Springfield Car Accident Slip and Fall Personal Injury Lawyers Attorneys.
  2. I find it interesting that, in all of Google’s guidelines, the word “suspended” is used only a few times, and one of these rare instances relates to spamming the business title field. In other words, Google is using the strongest possible language to warn against this practice, and that makes me quite nervous about tying large chunks of reputation and rankings to a tactic against which Google has forewarned. I remember that companies were doing all kinds of risky things on the eve of the Panda and Penguin updates and they woke up to a changed webscape in which they were no longer winners. Because of this, I advocate alerting any business owner who is risking his livelihood to chancy shortcuts. Better to build things for real, for the long haul.

Fortunately, it only takes a few seconds to sign into a GMB account and remove extraneous keywords from a business name. If it needs to be done at scale for large multi-location enterprises across the major aggregators, Moz Local can get the job done. Will removing spammy keywords from the GMB listing title cause the business to move down in Google’s local rankings? It’s possible that they will, but at least they’ll be able to go forward building real stuff, with the moral authority to report rule-breaking competitors and keep at it until Google acts.

And tell owners not to worry about Google not being able to sort out a downtown location from an uptown one for consumers. Google’s ability to parse user proximity is getting better every day. Mobile-local packs prove this out. If one location is wrongly outranking another, chances are good the business needs to do an audit to discover weaknesses that are holding the more appropriate listing back. That’s real strategy - no tricks!

4. Creating a multi-site morass

What you’ll hear:

“So, to cover all 3 of our locations, we have greengrocerysandiego.com, greengrocerymonterey.com and greengrocerymendocino.com… but the problem is, the content on the three sites is kind of all the same. What should we do to make the sites different?”

-or-

“So, to cover all of our services, we have jimsappliancerepair.com, jimswashingmachinerepair.com, jimsdryerrepair.com, jimshotwaterheaterrepair.com, jimsrefrigeratorrepair.com. We’re about to buy jimsvacuumrepair.com … but the problem is, there’s not much content on any of these sites. It feels like management is getting out of hand.”

Why it’s a problem:

Definitely a frequent topic in SEO forums, the practice of relying on exact match domains (EMDs) proliferates because of Google’s historic bias in their favor. The ranking influence of EMDs has been the subject of a Google update and has lessened over time. I wouldn’t want to try to rank for competitive terms with creditcards.com or insurance.com these days.

But if you believe EMDs no longer work in the local-organic world, read this post in which a fellow’s surname/domain name gets mixed up with a distant city name and he ends up ranking in the local packs for it! Chances are, you see weak EMDs ranking all the time for your local searches — more’s the pity. And, no doubt, this ranking boost is the driving force behind local business models continuing to purchase multiple keyword-oriented domains to represent branches of their company or the variety of services they offer. This approach is problematic for 3 chief reasons:

  1. It’s impractical. The majority of the forum threads I’ve encountered in which small-to-medium local businesses have ended up with two, or five, or ten domains invariably lead to the discovery that the websites are made up of either thin or duplicate content. Larger enterprises are often guilty of the same. What seemed like a great idea at first, buying up all those EMDs, turns into an unmanageable morass of web properties that no one has the time to keep updated, to write for, or to market.
  2. Specific to the multi-service business, it’s not a smart move to put single-location NAP on multiple websites. In other words, if your construction firm is located at 123 Main Street in Funky Town, but consumers and Google are finding that same physical address associated with fences.com, bathroomremodeling.com, decks.com, and kitchenremodeling.com, you are sowing confusion in the ecosystem. Which is the authoritative business associated with that address? Some business owners further compound problems by assuming they can then build separate sets of local business listings for each of these different service-oriented domains, violating Google’s guidelines, which state:

    Do not create more than one page for each location of your business.

    The whole thing can become a giant mess, instead of the clean, manageable simplicity of a single brand, tied to a single domain, with a single NAP signal.
  3. With rare-to-nonexistent exceptions, I consider EMDs to be missed opportunities for brand building. Imagine, if instead of being Whole Foods at WholeFoods.com, the natural foods giant had decided they needed to try to squeeze a ranking boost out of buying 400+ domains to represent the eventual number of locations they now operate. WholeFoodsDallas.com, WholeFoodsMississauga.com, etc? Such an approach would get out of hand very fast.

Even the smallest businesses should take cues from big commerce. Your brand is the magic password you want on every consumer’s lips, associated with every service you offer, in every location you open. As I recently suggested to a Moz community member, be proud to domain your flower shop as rossirovetti.com instead of hoping FloralDelivery24hoursSanFrancisco.com will boost your rankings. It’s authentic, easy to remember, looks trustworthy in the SERPs, and is ripe for memorable brand building.

What to do about it:

While I can’t speak to the minutiae of every single scenario, I’ve yet to be part of a discussion about multi-sites in the Local SEO community in which I didn’t advise consolidation. Basically, the business should choose a single, proud domain and, in most cases, 301 redirect the old sites to the main one, then work to get as many external links that pointed to the multi-sites to point to the chosen main site. This oldie but goodie from the Moz blog provides a further technical checklist from a company that saw a 40% increase in traffic after consolidating domains. I’d recommend that any business that is nervous about handling the tech aspects of consolidation in-house should hire a qualified SEO to help them through the process.

5. Creating ill-considered practitioner listings

What you’ll hear:

“We have 5 dentists at the practice, but one moved/retired last month and we don’t know what to do with the GMB listing for him.”

-or-

“Dr. Green is outranking the practice in the local results for some reason, and it’s really annoying.”

Why it’s a problem:

I’ve saved the most complex for last! Multi-practitioner listings can be a blessing, but they’re so often a bane that my position on creating them has evolved to a point where I only recommend building them in specific cases.

When Google first enabled practitioner listings (listings that represent each doctor, lawyer, dentist, or agent within a business) I saw them as a golden opportunity for a given practice to dominate local search results with its presence. However, Google’s subsequent unwillingness to simply remove practitioner duplicates, coupled with the rollout of the Possum update which filters out shared category/similar location listings, coupled with the number of instances I’ve seen in which practitioner listings end up outranking brand listings, has caused me to change my opinion of their benefits. I should also add that the business title field on practitioner listings is a hotbed of Google guideline violations — few business owners have ever read Google’s nitty gritty rules about how to name these types of listings.

In a nutshell, practitioner listings gone awry can result in a bunch of wrongly-named listings often clouded by duplicates that Google won’t remove, all competing for the same keywords. Not good!

What to do about it:

You’ll have multiple scenarios to address when offering advice about this topic.

1.) If the business is brand new, and there is no record of it on the Internet as of yet, then I would only recommend creating practitioner listings if it is necessary to point out an area of specialization. So, for example, if a medical practice has 5 MDs, the listing for the practice covers that, with no added listings needed. But, if a medical practice has 5 MDs and an Otolaryngologist, it may be good marketing to give the specialist his own listing, because it has its own GMB category and won’t be competing with the practice for rankings. *However, read on to understand the challenges you take on any time a practitioner listing is created.

2.) If the multi-practitioner business is not new, chances are very good that there are listings out there for present, past, and even deceased practitioners.

  • If a partner is current, be sure you point his listing at a landing page on the practice’s website, instead of at the homepage, see if you can differentiate categories, and do your utmost to optimize the practice’s own listing — the point here is to prevent practitioners from outranking the practice. What do I mean by optimization? Be sure the practice’s GMB listing is fully filled out, you’ve got amazing photos, you’re actively earning and responding to reviews, you’re publishing a Google Post at least once a week, and your citations across the web are consistent. These things should all strengthen the listing for the practice.
  • If a partner is no longer with the practice, it’s ideal to unverify the listing and ask Google to mark it as moved to the practice — not to the practitioner’s new location. Sound goofy? Read Joy Hawkins’ smart explanation of this convoluted issue.
  • If, sadly, a practitioner has passed away, contact Google to show them an obituary so that the listing can be removed.
  • If a listing represents what is actually a solo practitioner (instead of a partner in a multi-practitioner business model) and his GMB listing is now competing with the listing for his business, you can ask Google to merge the two listings.

3.) If a business wants to create practitioner listings, and they feel up to the task of handling any ranking or situational management concerns, there is one final proviso I’d add. Google’s guidelines state that practitioners should be “directly contactable at the verified location during stated hours” in order to qualify for a GMB listing. I’ve always found this requirement rather vague. Contactable by phone? Contactable in person? Google doesn’t specify. Presumably, a real estate agent in a multi-practitioner agency might be directly contactable, but as my graphic above illustrates, we wouldn’t really expect the same public availability of a surgeon, right? Point being, it may only make marketing sense to create a practitioner listing for someone who needs to be directly available to the consumer public for the business to function. I consider this a genuine grey area in the guidelines, so think it through carefully before acting.

Giving good help

It’s genuinely an honor to advise owners and marketers who are strategizing for the success of local businesses. In our own small way, local SEO consultants live in the neighborhood Mister Rogers envisioned in which you could look for the helpers when confronted with trouble. Given the livelihoods dependent on local commerce, rescuing a company from a foundational marketing mistake is satisfying work for people who like to be “helpers,” and it carries a weight of responsibility.

I’ve worked in 3 different SEO forums over the past 10+ years, and I’d like to close with some things I’ve learned about helping:

  1. Learn to ask the right questions. Small nuances in business models and scenarios can necessitate completely different advice. Don’t be scared to come back with second and third rounds of follow-up queries if someone hasn’t provided sufficient detail for you to advise them well. Read all details thoroughly before replying.
  2. Always, always consult Google’s guidelines, and link to them in your answers. It’s absolutely amazing how few owners and marketers have ever encountered them. Local SEOs are volunteer liaisons between Google and businesses. That’s just the way things have worked out.
  3. Don’t say you’re sure unless you’re really sure. If a forum or client question necessitates a full audit to surface a useful answer, say so. Giving pat answers to complicated queries helps no one, and can actually hurt businesses by leaving them in limbo, losing money, for an even longer time.
  4. Network with colleagues when weird things come up. Ranking drops can be attributed to new Google updates, or bugs, or other factors you haven’t yet noticed but that a trusted peer may have encountered.
  5. Practice humility. 90% of what I know about Local SEO, I’ve learned from people coming to me with problems for which, at some point, I had to discover answers. Over time, the work put in builds up our store of ready knowledge, but we will never know it all, and that’s humbling in a very good way. Community members and clients are our teachers. Let’s be grateful for them, and treat them with respect.
  6. Finally, don’t stress about delivering “the bad news” when you see someone who is asking for help making a marketing mistake. In the long run, your honesty will be the best gift you could possibly have given.

Happy helping!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Not-Actually-the-Best Local SEO Practices posted first on https://moz.com/blog

What Do Google’s New, Longer Snippets Mean for SEO? – Whiteboard Friday

Posted by randfish

Snippets and meta descriptions have brand-new character limits, and it's a big change for Google and SEOs alike. Learn about what's new, when it changed, and what it all means for SEO in this edition of Whiteboard Friday.

What do Google's new, longer snippets mean for SEO?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about Google's big change to the snippet length.

This is the display length of the snippet for any given result in the search results that Google provides. This is on both mobile and desktop. It also impacts the meta description, which is where many snippets come from; they're taken from the meta description tag of the web page. Google essentially said just last week, "Hey, we have officially increased the length, the recommended length, and the display length of what we will show in the text snippet of standard organic results."

So I'm illustrating that for you here. I did a search for "net neutrality bill," something that's on the minds of a lot of Americans right now. You can see here that this article from The Hill, which is a recent article — it was two days ago — has a much longer text snippet than what we would normally expect to find. In fact, I went ahead and counted this one and then showed it here.


So basically, at the old 165-character limit, which is what you would have seen prior to the middle of November on most every search result (occasionally Google would show a longer one for very specific kinds of results), more than 90% of search snippets were 165 characters or less, according to data from SISTRIX, which put out a great report that I'll link to here. Then Google added basically a few more lines.

So now, on mobile and desktop, instead of an average of two or three lines, we're talking three, four, five, sometimes even six lines of text. So this snippet here is 266 characters that Google is displaying. The next result, from Save the Internet, is 273 characters. Again, this might be because Google sort of realized, "Hey, we almost got all of this in here. Let's just carry it through to the end rather than showing the ellipsis." But you can see that 165 characters would cut off right here. This one actually does a good job of displaying things.

So imagine a searcher is querying for something in your field and they're just looking for a basic understanding of what it is. So they've never heard of net neutrality. They're not sure what it is. So they can read here, "Net neutrality is the basic principle that prohibits internet service providers like AT&T, Comcast, and Verizon from speeding up, slowing down, or blocking any . . ." And that's where it would cut off. Or that's where it would have cut off in November.

Now, if I got a snippet like that, I need to visit the site. I've got to click through in order to learn more. That doesn't tell me enough to give me the data to go through. Now, Google has tackled this before with things, like a featured snippet, that sit at the top of the search results, that are a more expansive short answer. But in this case, I can get the rest of it because now, as of mid-November, Google has lengthened this. So now I can get, "Any content, applications, or websites you want to use. Net neutrality is the way that the Internet has always worked."

Now, you might quibble and say this is not a full, thorough understanding of what net neutrality is, and I agree. But for a lot of searchers, this is good enough. They don't need to click any more. This extension from 165 to 275 or 273, in this case, has really done the trick.

What changed?

So this can mean a bunch of changes for SEO too. The change that happened here is that Google updated basically two things. One, they updated the snippet length, and two, they updated their guidelines around it.

So Google's had historic guidelines that said, well, you want to keep your meta description tag between about 160 and 180 characters. I think that was the number. They've updated that to say there's no official recommended meta description length. But on Twitter, Danny Sullivan said that he would probably not make it greater than 320 characters. In fact, we and other data providers that collect a lot of search results didn't find many that extended beyond 300. So I think that's a reasonable thing.

When?

When did this happen? It started in about mid-November. November 22nd is when SISTRIX's dataset begins to show the increase, and it quickly passed 50%. As of December 2nd, it's sitting at about 51% of search results that have these longer snippets in at least 1 of the top 10.

Here's the amazing thing, though — 51% of search results have at least one. Many of those, because they're still pulling old meta descriptions, or meta descriptions that SEOs have optimized for the 165-character limit, are still very short. So if you're the person who goes and updates your important pages right now, especially since it's holiday time with lots of e-commerce action, you might be able to get more real estate in the search results than any of your competitors in the SERPs, because they're not updating theirs.

How will this affect SEO?

So how is this going to really change SEO? Well, three things:

A. It changes how marketers should write and optimize the meta description.

We're going to be writing a little bit differently because we have more space. We're going to be trying to entice people to click, but we're going to be very conscientious that we want to try and answer a lot of this in the search result itself, because if we can, there's a good chance that Google will rank us higher, even if we're actually sort of sacrificing clicks by helping the searcher get the answer they need in the search result.

B. It may impact click-through rate.

We'll be looking at Jumpshot data over the next few months and the year ahead. We think there are two likely ways it could go: probably negatively, meaning fewer clicks on less complex queries, but conversely, possibly more clicks on some more complex queries, because people are more enticed by the longer description. Fingers crossed, that's kind of what you want to see as a marketer.

C. It may lead to lower click-through rate further down in the search results.

If you think about the fact that the real estate that used to hold three results is now, as of a month ago, taken up by two, well, maybe people won't scroll as far down. Maybe the results higher up will in fact draw more of the clicks, and thus being further down on page one will have less value than it used to.

What should SEOs do?

What are things that you should do right now? Number one, make a priority list — you should probably already have this — of your most important landing pages by search traffic, the ones that receive the most search traffic on your website, organic search. Then I would go and reoptimize those meta descriptions for the longer limits.

Now, you can judge as you will. My advice would be go to the SERPs that are sending you the most traffic, that you're ranking for the most. Go check out the limits. They're probably between about 250 and 300, and you can optimize somewhere in there.

The second thing I would do is if you have internal processes or your CMS has rules around how long you can make a meta description tag, you're going to have to update those probably from the old limit of somewhere in the 160 to 180 range to the new 230 to 320 range. It doesn't look like many are smaller than 230 now, at least limit-wise, and it doesn't look like anything is particularly longer than 320. So somewhere in there is where you're going to want to stay.
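Since these limits are easy to check programmatically, here's a minimal Python sketch of the kind of length audit described above. The thresholds and the `audit` helper are my own assumptions for illustration, not a Moz or Google tool, and the exact cutoffs are judgment calls based on the 230-320 range discussed here.

```python
# Minimal sketch: flag meta descriptions that fall outside the
# roughly-recommended range (~230-320 characters). Thresholds are
# assumptions for illustration, not official Google limits.

MIN_LEN = 230   # shorter than this leaves SERP real estate unused
MAX_LEN = 320   # beyond this, Google is likely to truncate with an ellipsis

def audit(descriptions):
    """Return a list of (url, length, verdict) tuples."""
    report = []
    for url, text in descriptions.items():
        n = len(text)
        if n < MIN_LEN:
            verdict = "too short - consider expanding"
        elif n > MAX_LEN:
            verdict = "too long - likely truncated"
        else:
            verdict = "ok"
        report.append((url, n, verdict))
    return report

# Invented example pages:
pages = {
    "/net-neutrality": "Net neutrality is the basic principle...",
    "/long-page": "x" * 340,
}
for url, n, verdict in audit(pages):
    print(url, n, verdict)
```

Running something like this against your priority list of landing pages gives you the spreadsheet of candidates to rewrite first.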

Good luck with your new meta descriptions and with your new snippet optimization. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com





Don’t Be Fooled by Data: 4 Data Analysis Pitfalls & How to Avoid Them

Posted by Tom.Capper

Digital marketing is a proudly data-driven field. Yet, as SEOs especially, we often have such incomplete or questionable data to work with, that we end up jumping to the wrong conclusions in our attempts to substantiate our arguments or quantify our issues and opportunities.

In this post, I’m going to outline 4 data analysis pitfalls that are endemic in our industry, and how to avoid them.

1. Jumping to conclusions

Earlier this year, I conducted a ranking factor study around brand awareness, and I posted this caveat:

"...the fact that Domain Authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following is likely:
  • Links cause sites to rank well
  • Ranking well causes sites to get links
  • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings"
    ~ Me

However, I want to go into this in a bit more depth and give you a framework for analyzing these yourself, because it still comes up a lot. Take, for example, this recent study by Stone Temple, which you may have seen in the Moz Top 10 or Rand’s tweets, or this excellent article discussing SEMRush’s recent direct traffic findings. To be absolutely clear, I’m not criticizing either of the studies, but I do want to draw attention to how we might interpret them.

Firstly, we do tend to suffer a little confirmation bias — we’re all too eager to call out the cliché “correlation vs. causation” distinction when we see successful sites that are keyword-stuffed, but all too approving when we see studies doing the same with something we think is or was effective, like links.

Secondly, we fail to critically analyze the potential mechanisms. The options aren’t just causation or coincidence.

Before you jump to a conclusion based on a correlation, you’re obliged to consider various possibilities:

  • Complete coincidence
  • Reverse causation
  • Joint causation
  • Linearity
  • Broad applicability

If those don’t make any sense, then that’s fair enough — they’re jargon. Let’s go through an example:

Before I warn you not to eat cheese because you may die in your bedsheets, I’m obliged to check that it isn’t any of the following:

  • Complete coincidence - Is it possible that so many datasets were compared, that some were bound to be similar? Why, that’s exactly what Tyler Vigen did! Yes, this is possible.
  • Reverse causation - Is it possible that we have this the wrong way around? For example, perhaps your relatives, in mourning for your bedsheet-related death, eat cheese in large quantities to comfort themselves? This seems pretty unlikely, so let’s give it a pass. No, this is very unlikely.
  • Joint causation - Is it possible that some third factor is behind both of these? Maybe increasing affluence makes you healthier (so you don’t die of things like malnutrition), and also causes you to eat more cheese? This seems very plausible. Yes, this is possible.
  • Linearity - Are we comparing two linear trends? A linear trend is a steady rate of growth or decline. Any two statistics which are both roughly linear over time will be very well correlated. In the graph above, both our statistics are trending linearly upwards. If the graph was drawn with different scales, they might look completely unrelated, like this, but because they both have a steady rate, they’d still be very well correlated. Yes, this looks likely.
  • Broad applicability - Is it possible that this relationship only exists in certain niche scenarios, or, at least, not in my niche scenario? Perhaps, for example, cheese does this to some people, and that’s been enough to create this correlation, because there are so few bedsheet-tangling fatalities otherwise? Yes, this seems possible.

So we have 4 “Yes” answers and one “No” answer from those 5 checks.

If your example doesn’t get 5 “No” answers from those 5 checks, it’s a fail, and you don’t get to say that the study has established either a ranking factor or a fatal side effect of cheese consumption.
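The linearity trap in particular is easy to demonstrate. Here's a small Python sketch, with invented numbers rather than Vigen's actual data, showing that two unrelated series that both trend steadily upward come out almost perfectly correlated:

```python
# Two made-up series: annual cheese consumption and bedsheet-related
# deaths. Both trend steadily upward, with no causal link between them.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

cheese_lbs = [29.8 + 0.3 * i for i in range(10)]                 # steady rise
bedsheet_deaths = [327 + 15 * i + (-1) ** i for i in range(10)]  # noisy steady rise

r = pearson(cheese_lbs, bedsheet_deaths)
print(round(r, 3))  # very close to 1.0 despite no causal link
```

Any two metrics with a steady rate of growth will score like this, which is exactly why a high correlation alone can't establish a ranking factor.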

A similar process should apply to case studies, which are another form of correlation — the correlation between you making a change, and something good (or bad!) happening. For example, ask:

  • Have I ruled out other factors (e.g. external demand, seasonality, competitors making mistakes)?
  • Did I increase traffic by doing the thing I tried to do, or did I accidentally improve some other factor at the same time?
  • Did this work because of the unique circumstance of the particular client/project?

This is particularly challenging for SEOs, because we rarely have data of this quality, but I’d suggest an additional pair of questions to help you navigate this minefield:

  • If I were Google, would I do this?
  • If I were Google, could I do this?

Direct traffic as a ranking factor passes the “could” test, but only barely — Google could use data from Chrome, Android, or ISPs, but it’d be sketchy. It doesn’t really pass the “would” test, though — it’d be far easier for Google to use branded search traffic, which would answer the same questions you might try to answer by comparing direct traffic levels (e.g. how popular is this website?).

2. Missing the context

If I told you that my traffic was up 20% week on week today, what would you say? Congratulations?

What if it was up 20% this time last year?

What if I told you it had been up 20% year on year, up until recently?

It’s funny how a little context can completely change this. This is another problem with case studies and their evil inverted twin, the traffic drop analysis.

If we really want to understand whether to be surprised at something, positively or negatively, we need to compare it to our expectations, and then figure out what deviation from our expectations is “normal.” If this is starting to sound like statistics, that’s because it is statistics — indeed, I wrote about a statistical approach to measuring change way back in 2015.

If you want to be lazy, though, a good rule of thumb is to zoom out, and add in those previous years. And if someone shows you data that is suspiciously zoomed in, you might want to take it with a pinch of salt.
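To see how much the baseline matters, here is a toy calculation with invented session counts: the same week looks like a triumph against last week and a shrug against last year.

```python
# Hypothetical weekly session counts for one site
this_week = 12000
last_week = 10000
same_week_last_year = 11500

# Week on week: the headline number
wow = (this_week - last_week) / last_week                      # +20%

# Year on year: the context
yoy = (this_week - same_week_last_year) / same_week_last_year  # about +4%

print(f"WoW: {wow:+.0%}, YoY: {yoy:+.0%}")
```

And if the site had been growing 20% year on year until recently, that +4% is bad news dressed up as good news.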

3. Trusting our tools

Would you make a multi-million dollar business decision based on a number that your competitor could manipulate at will? Well, chances are you do, and the number can be found in Google Analytics. I’ve covered this extensively in other places, but there are some major problems with most analytics platforms, including:

  • How easy they are to manipulate externally
  • How arbitrarily they group hits into sessions
  • How vulnerable they are to ad blockers
  • How they perform under sampling, and how obvious they make this

For example, did you know that, above a certain amount of traffic (~500,000 within the date range), the Google Analytics API v3 can heavily sample data whilst telling you that the data is unsampled? Neither did I, until we ran into it whilst building Distilled ODN.

Similar problems exist with many “Search Analytics” tools. My colleague Sam Nemzer has written a bunch about this — did you know that most rank tracking platforms report completely different rankings? Or how about the fact that the keywords grouped by Google (and thus by tools like SEMrush and STAT, too) are not equivalent, and don’t necessarily have the volumes quoted?

It’s important to understand the strengths and weaknesses of tools that we use, so that we can at least know when they’re directionally accurate (as in, their insights guide you in the right direction), even if not perfectly accurate. All I can really recommend here is that skilling up in SEO (or any other digital channel) necessarily means understanding the mechanics behind your measurement platforms — which is why all new starts at Distilled end up learning how to do analytics audits.

One of the most common solutions to the root problem is combining multiple data sources, but…

4. Combining data sources

There are numerous platforms out there that will “defeat (not provided)” by bringing together data from two or more of:

  • Analytics
  • Search Console
  • AdWords
  • Rank tracking

The problems here are that, firstly, these platforms do not have equivalent definitions, and secondly, ironically, (not provided) tends to break them.

Let’s deal with definitions first, with an example — let’s look at how each source counts organic traffic to a single landing page:

  • In Search Console, these are reported as clicks, and can be vulnerable to heavy, invisible sampling when multiple dimensions (e.g. keyword and page) or filters are combined.
  • In Google Analytics, these are reported using last non-direct click, meaning that your organic traffic includes a bunch of direct sessions, time-outs that resumed mid-session, etc. That’s without getting into dark traffic, ad blockers, etc.
  • In AdWords, most reporting uses last AdWords click, and conversions may be defined differently. In addition, keyword volumes are bundled, as referenced above.
  • Rank tracking is location specific, and inconsistent, as referenced above.

Fine, though — it may not be precise, but you can at least get to some directionally useful data given these limitations. However, about that “(not provided)”...

Most of your landing pages get traffic from more than one keyword. It’s very likely that some of these keywords convert better than others, particularly if they are branded, meaning that even the most thorough click-through rate model isn’t going to help you. So how do you know which keywords are valuable?

The best answer is to generalize from AdWords data for those keywords, but it’s very unlikely that you have analytics data for all those combinations of keyword and landing page. Essentially, the tools that report on this make the very bold assumption that a given page converts identically for all keywords. Some are more transparent about this than others.

Again, this isn’t to say that those tools aren’t valuable — they just need to be understood carefully. The only way you could reliably fill in these blanks created by “not provided” would be to spend a ton on paid search to get decent volume, conversion rate, and bounce rate estimates for all your keywords, and even then, you’ve not fixed the inconsistent definitions issues.
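Here is a sketch of how far off that assumption can be, using invented keyword-level numbers that a “(not provided)”-era tool never gets to see:

```python
# Hypothetical keyword-level truth, hidden from the tool by "(not provided)"
sessions = {"brand term": 800, "generic term": 200}
conversions = {"brand term": 80, "generic term": 2}  # true rates: 10% vs 1%

# What the tool sees: one blended page-level rate, applied to every keyword
blended_rate = sum(conversions.values()) / sum(sessions.values())  # 0.082

for kw, n in sessions.items():
    true_rate = conversions[kw] / n
    assumed = n * blended_rate
    print(f"{kw}: true rate {true_rate:.1%}, "
          f"assumed {assumed:.0f} conversions vs actual {conversions[kw]}")
# The branded term is undervalued (~66 assumed vs 80 actual), and the generic
# term is overvalued roughly eightfold (~16 assumed vs 2 actual).
```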

Bonus peeve: Average rank

I still see this way too often. Three questions:

  1. Do you care more about losing rankings for ten very low volume queries (10 searches a month or less) than for one high volume query (millions plus)? If the answer isn’t “yes, I absolutely care more about the ten low-volume queries”, then this metric isn’t for you, and you should consider a visibility metric based on click through rate estimates.
  2. When you start ranking at 100 for a keyword you didn’t rank for before, does this make you unhappy? If the answer isn’t “yes, I hate ranking for new keywords,” then this metric isn’t for you — because that will lower your average rank. You could of course treat all non-ranking keywords as position 100, as some tools allow, but is a drop of 2 average rank positions really the best way to express that 1/50 of your landing pages have been de-indexed? Again, use a visibility metric, please.
  3. Do you like comparing your performance with your competitors? If the answer isn’t “no, of course not,” then this metric isn’t for you — your competitors may have more or fewer branded keywords or long-tail rankings, and these will skew the comparison. Again, use a visibility metric.
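To make the contrast concrete, here is a minimal sketch comparing average rank with a simple CTR-weighted visibility metric. The CTR curve below is a crude invention for illustration, not a fitted model:

```python
def avg_rank(ranked):
    """Average position across ranking keywords (the misleading metric)."""
    return sum(rank for rank, _ in ranked) / len(ranked)

def visibility(ranked, ctr):
    """Estimated clicks: search volume weighted by a CTR-by-position curve."""
    return sum(volume * ctr(rank) for rank, volume in ranked)

def toy_ctr(rank):
    # Invented curve: click share falls off with position, ~zero past page two
    return 0.3 / rank if rank <= 20 else 0.0

before = [(3, 10000)]            # one keyword at position 3, 10k searches/month
after = before + [(100, 5000)]   # plus a brand-new ranking at position 100

print(avg_rank(before), avg_rank(after))  # 3.0 vs 51.5 -- looks like a disaster
print(visibility(before, toy_ctr),
      visibility(after, toy_ctr))         # ~1000 estimated clicks both times
```

The new position-100 ranking wrecks the average but contributes no clicks either way; a visibility metric reports the thing that actually matters.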

Conclusion

Hopefully, you’ve found this useful. To summarize the main takeaways:

  • Critically analyse correlations & case studies by seeing if you can explain them as coincidence, as reverse causation, as joint causation via some third factor, as an artifact of two linear trends, or as something that only applies in niche scenarios.
  • Don’t look at changes in traffic without looking at the context — what would you have forecasted for this period, and with what margin of error?
  • Remember that the tools we use have limitations, and do your research on how that impacts the numbers they show. “How has this number been produced?” is an important component in “What does this number mean?”
  • If you end up combining data from multiple tools, remember to work out the relationship between them — treat this information as directional rather than precise.

Let me know what data analysis fallacies bug you, in the comments below.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Don't Be Fooled by Data: 4 Data Analysis Pitfalls & How to Avoid Them posted first on https://moz.com/blog

Our Readership: Results of the 2017 Moz Blog Reader Survey

Posted by Trevor-Klein

This blog is for all of you. In a notoriously opaque and confusing industry that's prone to frequent changes, we see immense benefit in helping all of you stay on top of the game. To that end, every couple of years we ask for a report card of sorts, hoping to get a sense not only for how your jobs have changed, but also for how we can improve.

About a month ago, we asked you all to take a reader survey, and nearly 600 of you generously gave your time. The results, summarized in this post, were immensely helpful, and were a reminder of how lucky we are to have such a thoughtful community of readers.

I've offered as much data as I can, and when possible, I've also trended responses against the same questions from our 2015 and 2013 surveys, so you can get a sense for how things have changed. There's a lot here, so buckle up. =)


Who our readers are

To put all of this great feedback into context, it helps to know a bit about who the people in our audience actually are. Sure, we can glean a bit of information from our site analytics, and can make some educated guesses, but neither of those can answer the questions we're most curious about. What's your day-to-day work like, and how much SEO does it really involve? Would you consider yourself more of an SEO beginner, or more of an SEO wizard? And, most importantly, what challenges are you facing in your work these days? The answers give us a fuller understanding of where the rest of your feedback comes from.

What is your job title?

Readers of the Moz Blog have a multitude of backgrounds, from CEOs of agencies to in-the-weeds SEOs of all skill levels. One of the most common themes we see, though, is a skew toward the more general marketing industry. I know that word clouds have their faults, but it's still a relatively interesting way to gauge how often things appear in a list like this, so here's what we've got this year:

Of note, similar to our results in 2015, the word "marketing" is the most common result, followed by the word "SEO" and the word "manager."

Here's a look at the top 20 terms used in this year's results, along with the percentage of responses containing each term. You'll also see those same percentages from the 2015 and 2013 surveys to give you an idea of what's changed -- the darker the bar, the more recent the survey:

The thing that surprises me the most about this list is how little it's changed in the four-plus years since we first asked the question (a theme you'll see recur in the rest of these results). In fact, the top 20 terms this year are nearly identical to the top 20 terms four years ago, with only a few things sliding up or down a few spots.

What percentage of your day-to-day work involves SEO?

We hear a lot about people wearing multiple hats for their companies. One person who took this survey noted that even at a 9,000-person company, they were the only one who worked on SEO, and it was only about 80% of their job. That idea is backed up by this data, which shows an incredibly broad range of responses. More than 10% of respondents barely touch SEO, and not even 14% say they're full-time:

One interesting thing to note is the sharp decline in the number of people who say that SEO isn't a part of their day-to-day at all. That shift is likely a result of our shift back toward SEO, away from related areas like social media and content marketing. I think we had attracted a significant number of community managers and content specialists who didn't work in SEO, and we're now seeing the pendulum swing the other direction.

On a scale of 1-5, how advanced would you say your SEO knowledge is?

The similarity between this year's graph for this question and those from 2015 and 2013 is simply astonishing:

There's been a slight drop in folks who say they're at an expert level, and a slight increase in folks who have some background, but are relative beginners. But only slight. The interesting thing is, our blog traffic has increased significantly over these four years, so the newer members of our audience bear a striking resemblance to those of you who've been around for quite some time. In a sense, that's reassuring -- it paints a clear picture for us as we continue refining our content.

Do you work in-house, or at an agency/consultancy?

Here's another window into just how little our audience has changed in the last couple of years:

A slight majority of our readers still work in-house for their own companies, and about a third still work on SEO for their company's clients.

Interestingly, though, respondents who work for clients deal with many of the same issues as those who work in-house -- especially in trying to convey the value of their work in SEO. They're just trying to send that message to external clients instead of internal stakeholders. More details on that come from our next question:

What are some of the biggest challenges you face in your work today?

I'm consistently amazed by the time and thought that so many of you put into answering this question, and rest assured, your feedback will be presented to several teams around Moz, both on the marketing and the product sides. For this question, I organized each and every response into recurring themes, tallying each time those themes were mentioned. Here are all the themes that were mentioned 10 or more times:

Challenge | # of mentions
My clients / colleagues / bosses don't understand the value of SEO | 59
The industry and tactics are constantly changing; algo updates | 45
Time constraints | 44
Link building | 35
My clients / colleagues / bosses don't understand how SEO works | 29
Content (strategy / creation / marketing) | 25
Resource constraints | 23
It's difficult to prove ROI | 18
Budget constraints | 17
It's a difficult industry in which to learn tools and techniques | 16
I regularly need to educate my colleagues / employees | 16
It's difficult to prioritize my work | 16
My clients either don't have or won't offer sufficient budget / effort | 15
Effective reporting | 15
Bureaucracy, red tape, other company problems | 11
It's difficult to compete with other companies | 11
I'm required to wear multiple hats | 11

More than anything else, it's patently obvious that one of the greatest difficulties faced by any SEO is explaining it to other people in a way that demonstrates its value while setting appropriate expectations for results. Whether it's your clients, your boss, or your peers that you're trying to convince, it isn't an easy case to make, especially when it's so difficult to show what kind of return a company can see from an investment in SEO.

We also saw tons of frustrated responses about how the industry is constantly changing, and it takes too much of your already-constrained time just to stay on top of those changes.

In terms of tactics, link building easily tops the list of challenges. That makes sense, as it's the piece of SEO that relies most heavily on the cooperation of other human beings (and humans are often tricky beings to figure out). =)

Content marketing -- both the creation/copywriting side as well as the strategy side -- is still a challenge for many folks in the industry, though fewer people mentioned it this year than in 2015, so I think we're all starting to get used to how those skills overlap with the more traditional aspects of SEO.


How our readers read

With all that context in mind, we started to dig into your preferences in terms of formats, frequency, and subject matter on the blog.

How often do you read posts on the Moz Blog?

This is the one set of responses that caused a bit of concern. We've seen a steady decrease in the number of people who say they read every day, a slight decrease in the number of people who say they read multiple times each week, and a dramatic increase in the number of people who say they read once a week.

The 2015 decrease came after an expansion in the scope of subjects we covered on the blog -- as we branched away from just SEO, we published more posts about social media, email, and other aspects of digital marketing. We knew that not all of those subjects were relevant for everyone, so we expected a dip in frequency of readership.

This year, though, we've attempted to refocus on SEO, and might have expected a bit of a rebound. That didn't happen:

There are two other factors at play here. For one thing, we no longer publish a post every single weekday. After our publishing volume experiment in 2015, we realized it was safe (even beneficial) to emphasize quality over quantity, so if we don't feel like a post turned out the way we hoped, we don't publish it until we've had a chance to improve it. That means we're down to about four posts per week. We've also made a concerted effort to publish more posts about local SEO, as that's relevant to our software and an increasingly important part of the work of folks in our industry.

It could also be a question of time -- we've already covered how little time everyone in our industry has, and with that problem continuing, there may just be less time to read blog posts.

If anyone has any additional insight into why they read less often than they once did, please let us know in the comments below!

On which types of devices do you prefer to read blog posts?

We were surprised by the responses to this question in 2013, and they've only gotten more extreme:

Nearly everyone prefers to read blog posts on a full computer. Only about 15% of folks add their phones into the equation, and the number of people in all the other buckets is extremely small. In 2013, our blog didn't have a responsive design, and was quite difficult to read on mobile devices. We thought that might have had something to do with people's responses -- maybe they were just used to reading our blog on larger screens. The trend in 2015 and this year, though, proves that's not the case. People just prefer reading posts on their computers, plain and simple.

Which other site(s), if any, do you regularly visit for information or education on SEO?

This was a new question for this year. We have our own favorite sites, of course, but we had no idea how the majority of folks would respond to this question. As it turns out, there was quite a broad range of responses listing sites that take very different approaches:

Site | # of responses
Search Engine Land | 184
Search Engine Journal | 89
Search Engine Roundtable | 74
SEMrush | 51
Ahrefs | 50
Search Engine Watch | 41
Quick Sprout / Neil Patel | 35
HubSpot | 33
Backlinko | 31
Google Blogs | 29
The SEM Post | 21
Kissmetrics | 17
Yoast | 16
Distilled | 13
SEO by the Sea | 13

I suppose it's no surprise that the most prolific sites sit at the top. They've always got something new, even if the stories don't often go into much depth. We've tended to steer our own posts toward longer-form, in-depth pieces, and I think it's safe to say (based on these responses and some answers to questions below) that it'd be beneficial for us to include some shorter stories, too. In other words, depth shouldn't necessarily be a requisite for a post to be published on the Moz Blog. We may start experimenting with a more "short and sweet" approach to some posts.


What our readers think of the blog

Here's where we get into more specific feedback about the Moz Blog, including whether it's relevant, how easy it is for you to consume, and more.

What percentage of the posts on the Moz Blog would you say are relevant to you and your work?

Overall, I'm pretty happy with the results here, as SEO is a broad enough industry (and we've got a broad enough audience) that there's simply no way we're going to hit the sweet spot for everyone with every post. But those numbers toward the bottom of the chart are low enough that I feel confident we're doing pretty well in terms of topic relevance.

Do you feel the Moz Blog posts are generally too basic, too advanced, or about right?

Responses to this question have made me smile every time I see them. This is clearly one thing we're getting about as right as we could expect to. We're even seeing a slight balancing of the "too basic" and "too advanced" columns over time, which is great:

We also asked the people who told us that posts were "too basic" or "too advanced" to what extent they felt that way, using a scale from 1-5 (1 being "just a little bit too basic/advanced" and 5 being "way too basic/advanced"). The responses tell us that the people who feel posts are too advanced feel more strongly about that opinion than the people who feel posts are too basic:

This makes some sense, I think. If you're just starting out in SEO, which many of our readers are, some of the posts on this blog are likely to go straight over your head. That could be frustrating. If you're an SEO expert, though, you probably aren't frustrated by posts you see as too basic for you -- you just skip past them and move on with your day.

This does make me think, though, that we might benefit from offering a dedicated section of the site for folks who are just starting out -- more than just the Beginner's Guide. That's actually something that was specifically requested by one respondent this year.

In general, what do you think about the length of Moz Blog posts?

While it definitely seems like we're doing pretty well in this regard, I'd also say we've got some room to tighten things up a bit, especially in light of the lack of time so many of you mentioned:

There were quite a few comments specifically asking for "short and sweet" posts from time to time -- offering up useful tips or news in a format that didn't expound on details because it didn't have to. I think sprinkling some of those types of posts in with the longer-form posts we have so often would be beneficial.

Do you ever comment on Moz Blog posts?

This was another new question this year. Even though so many sites are removing comment sections from their blogs, we've always believed in their value. Sometimes the discussions we see in comments end up being the most helpful part of the posts, and we value our community too much to keep that from happening. So, we were happy to see that a full quarter of respondents have participated in comments:

We also asked for a bit of info about why you either do or don't comment on posts. The top reasons why you do were pretty predictable -- to ask a clarifying question related to the post, or to offer up your own perspective on the topic at hand. The #3 reason was interesting -- 18 people mentioned that they like to comment in order to thank the author for their hard work. This is a great sentiment, and as someone who's published several posts on this blog, I can say for a fact that it does feel pretty great. At the same time, those comments are really only written for one person -- the author -- and are a bit problematic from our perspective, because they add noise around the more substantial conversations, which are what we like to see most.

I think the solution is going to lie in a new UI element that allows readers to note their appreciation to the authors without leaving one of the oft-maligned "Great post!" comments. There's got to be a happy medium there, and I think it's worth our finding it.

The reasons people gave for not commenting were even more interesting. A bunch of people mentioned the need to log in (sorry, folks -- if we didn't require that, we'd spend half our day removing spam!). The most common response, though, involved a lack of confidence. Whether it was worded along the lines of "I'm an introvert" or along the lines of "I just don't have a lot of expertise," there were quite a few people who worried about how their comments would be received.

I want to take this chance to encourage those of you who feel that way to take the step, and ask questions about points you find confusing. At the very least, I can guarantee you aren't the only ones, and others like you will appreciate your initiative. One of the best ways to develop your expertise is to get comfortable asking questions. We all work in a really confusing industry, and the Moz Blog is all about providing a place to help each other out.

What, if anything, would you like to see different about the Moz Blog?

As usual, the responses to this question were chock full of great suggestions, and again, we so appreciate the amount of time you all spent providing really thoughtful feedback.

One pattern I saw was requests for more empirical data -- hard evidence that things should be done a certain way, whether through case studies or other formats. Another pattern was requests for step-by-step walkthroughs. That makes a lot of sense for an industry of folks who are strapped for time: Make things as clear-cut as possible, and where we can, offer a linear path you can walk down instead of asking you to holistically understand the subject matter, then figure that out on your own. (That's actually something we're hoping to do with our entire Learning Center: Make it easier to figure out where to start, and where to continue after that, instead of putting everything into buckets and asking you all to figure it out.)

Whiteboard Friday remains a perennial favorite, and we were surprised to see more requests for more posts about our own tools than we had requests for fewer posts about our own tools. (We've been wary of that in the past, as we wanted to make sure we never crossed from "helpful" into "salesy," something we'll still focus on even if we do add another tool-based post here and there.)

We expected a bit of feedback about the format of the emails -- we're absolutely working on that! -- but didn't expect to see so many folks requesting that we bring back YouMoz. That's something that's been in the backs of our minds, and while it may not take the same form it did before, we do plan on finding new ways to encourage the community to contribute content, and hope to have something up and running early in 2018.

Request | # of responses
More case studies | 26
More Whiteboard Friday (or other videos) | 25
More long-form step-by-step training/guides | 18
Clearer steps to follow in posts; how-tos | 11
Bring back UGC / YouMoz | 9
More from Rand | 9
Improve formatting of the emails | 9
Higher-level, less-technical posts | 8
More authors | 7
More news (algorithm updates, e.g.) | 7
Shorter posts, "quick wins" | 7
Quizzes, polls, or other engagement opportunities | 6
Broader range of topics (engagement, CRO, etc.) | 6
More about Moz tools | 5
More data-driven, less opinion-based | 5

What our readers want to see

This section is a bit more future-facing; much of what we asked before had to do with how things have been in the past.

Which of the following topics would you like to learn more about?

There were very, very few surprises in this list. Lots of interest in on-page SEO and link building, as well as other core tactical areas of SEO. Content, branding, and social media all took dips -- that makes sense, given the fact that we don't usually post about those things anymore, and we've no doubt lost some audience members who were more interested in them as a result. Interestingly, mobile took a sizable dip, too. I'd be really curious to know what people think about why that is. My best guess is that with mobile-first indexing from Google and with responsive designs having become so commonplace, there isn't as much of a need to think about mobile differently as there was a couple of years ago. Also of note: When we did this survey in 2015, Google had recently rolled out its "Mobile-Friendly Update," not-so-affectionately referred to by many in the industry as Mobilegeddon. So... it was on our minds. =)

Which of the following types of posts would you most like to see on the Moz Blog?

This is a great echo and validation of what we took away from the more general question about what you'd like to see different about the Blog: More tactical posts and step-by-step walkthroughs. Posts that cut to the chase and offer a clear direction forward, as opposed to some of the types at the bottom of this list, which offer more opinions and cerebral explorations:


What happens next?

Now we go to work. =)

We'll spend some time fully digesting this info, and coming up with new goals for 2018 aimed at making improvements inspired by your feedback. We'll keep you all apprised as we start moving forward.

If you have any additional insight that strikes you in taking a look at these results, please do share it in the comments below -- we'd love to have those discussions.

For now, we've got some initial takeaways that we're already planning to take action on.

Primary takeaways

There are some relatively obvious things we can take away from these results that we're already working on:

  • People in all businesses are finding it quite difficult to communicate the value of SEO to their clients, bosses, and colleagues. That's something we can help with, and we'll be developing materials in the near future to try to alleviate some of that particular frustration.
  • There's a real desire for more succinct, actionable, step-by-step walkthroughs on the Blog. We can pretty easily explore formats for posts that are off our "beaten path," and will attempt to make things easier to consume through improvements to both the content itself and its delivery. I think there's some room for more "short and sweet" mixed in with our longer norm.
  • The bulk of our audience does more than just SEO, despite a full 25% of them having it in their job titles, and the challenges you mentioned include a bunch of areas that are related to, but outside the traditional world of SEO. Since you all are clearly working on those sorts of things, we should work to highlight and facilitate the relationship between the SEO work and the non-SEO marketing work you do.
  • In looking through some of the other sites you all visit for information on SEO, and knowing the kinds of posts they typically publish, it's clear we've got an opportunity to publish more news. We've always dreamed of being more of a one-stop shop for SEO content, and that's good validation that we may want to head down that path.

Again, thank you all so much for the time and effort you spent filling out this survey. Hopefully you'll notice some changes in the near (and not-so-near) future that make it clear we're really listening.

If you've got anything to add to these results -- insights, further explanations, questions for clarification, rebuttals of points, etc. -- please leave them in the comments below. We're looking forward to continuing the conversation. =)




Our Readership: Results of the 2017 Moz Blog Reader Survey posted first on https://moz.com/blog

How Local SEO Fits In With What You’re Already Doing

Posted by MiriamEllis

islandfinal.jpg

You own, work for, or market a business, but you don’t think of yourself as a Local SEO.

That’s okay. The forces of history have, in fact, conspired in some weird ways to make local search seem like an island unto itself. Out there, beyond the horizon, there may be technicians puzzling out NAP, citations, owner responses, duplicate listings, store locator widgets and the like, but it doesn’t seem like they’re talking about your job at all.

And that’s the problem.

If I could offer you a seat in my kayak, I’d paddle us over to that misty isle, and we’d go ashore. After we’d walked around a bit, talking to the locals, it would hit you that the language barrier you’d once perceived is a mere illusion, as is the distance between you.

By sunset — whoa! Look around again. This is no island. You and the Local SEOs are all mainlanders, reaching towards identical goals of customer acquisition, service, and retention via an exceedingly enriched and enriching skill set. You can use it all.

Before I paddle off into the darkness, under the rising stars, I’d like to leave you a chart that plots out how Local SEO fits in with everything you’ve been doing all along.

The roots of the divide

Why is Local SEO often treated as separate from the rest of marketing? We can narrow this down to three contributing factors:

1) Early separation of the local and organic algos

Google’s early-days local product was governed by an algorithm that was much more distinct from their organic algorithm than it is today. It was once extremely common, for example, for businesses without websites to rank well locally. This didn’t do much to form clear bridges between the offline, organic, and local marketing worlds. But then came Google’s Pigeon update in 2014, which signaled Google’s intention of deeply tying the two algorithms together.

This should ultimately impact the way industry publications, SaaS companies, and agencies present local as an extension of organic SEO, but we’re not quite there yet. I continue to encounter examples of large companies which are doing an amazing job with their website strategies, their e-commerce solutions and their paid outreach, but which are only now taking their first steps into local listings management for their hundreds of physical locations. It’s not that they’re late to the party — it’s just that they’ve only recently begun to realize what a large party their customers are having with their brands’ location data layers on the web.

2) Inheriting the paid vs. organic dichotomy

Local SEO has experienced the same lack-of-adoption/awareness as organic SEO. Agencies have long fought the uphill battle against a lopsided dependence on paid advertising. This phenomenon is highlighted by historic stats like these showing brands investing some $10 million in PPC vs. $1 million in SEO, despite studies like this one which show PPC earning less than 10% of clicks in search.

My take on this is that the transition from traditional offline paid advertising to its online analog was initially easier for many brands to get their heads around. And there have been ongoing challenges in proving direct ROI from SEO in the simple terms a PPC campaign can provide. To this day, we’re still all seeing statistics like only 17% of small businesses investing in SEO. In many ways, the SEO conundrum has simply been inherited by every Local SEO.

3) A lot to take in and on

Look at the service menu of any full-service digital marketing agency and you’ll see just how far it’s had to stretch over the past couple of decades to encompass an ever-expanding range of publicity opportunities:

  • Technical website audits
  • On-site optimization
  • Linkbuilding
  • Keyword research
  • Content dev and promotion
  • Brand building
  • Social media marketing
  • PPC management
  • UX audits
  • Conversion optimization
  • Etc.

Is it any wonder that agencies feel spread a bit too thin when considering how to support yet further needs and disciplines? How do you find the bandwidth, and the experts, to be able to offer:

  • Ongoing citation management
  • Local on-site SEO
  • Local landing page dev
  • Store locator SEO
  • Review management
  • Local brand building
  • Local link building
  • And abstruse forms of local Schema implementation...

And while many agencies have met the challenge by forming smart, strategic partnerships with providers specializing in Local SEO solutions, the agency is still then tasked with understanding how Local fits in with everything else they’re doing, and then explaining this to clients. At the multi-location and enterprise level, even amongst the best-known brands, high-level staffers may have no idea what it is the folks in the in-house Local SEO department are actually doing, or why their work matters.

To tie it all together … that’s what we need to do here. With a shared vision of how all practitioners are working on consumer-centric outreach, we can really get somewhere. Let’s plot this out, together:

Sharing is caring

“We see our customers as invited guests to a party, and we are the hosts. It's our job every day to make every important aspect of the customer experience a little bit better.”
- Jeff Bezos, Amazon

Let’s imagine a sporting goods brand, established in 1979, that’s grown to 400 locations across the US while also becoming well-known for its e-commerce presence. Whether aspects of marketing are being outsourced or it’s all in-house, here is how 3 shared consumer-centric goals unify all parties.

[Chart: shared consumer-centric goals across marketing disciplines]

As we can see from the above chart, there is definitely an overlap of techniques, particularly between SEOs and Local SEOs. Yet overall, it’s not the language or tactics, but the end game and end goals that unify all parties. Viewed properly, consumers are what make all marketing a true team effort.

Before I buy that kayak…

On my commute, I hear a radio ad promoting a holiday sale at some sporting goods store, but which brand was it?

Then I turn to the Internet to research kayak brands, and I find your website’s nicely researched, written, and optimized article comparing the best models in 2017. It’s ranking #2 organically. Those Sun Dolphins look pretty good, according to your massive comparison chart.

I think about it for a couple of days and go looking again, and I see your AdWords spot advertising your 30% off sale. This is the third time I’ve encountered your brand.

On my day off, I’m doing a local search for your brand, which has impressed me so far. I’m ready to look at these kayaks in person. Thanks to the fact that you properly managed your recent move across town by updating all of your major citations, I’m finding an accurate address on your Google My Business listing. Your reviews are mighty favorable, too. They keep mentioning how knowledgeable the staff is at your location nearest me.

And that turns out to be true. At first, I’m disappointed that I don’t see any Sun Dolphins on your shelves — your website comparison chart spoke well of them. As a sales associate approaches me, I notice in-store signage above his head, featuring a text/phone hotline for complaints. I don’t really have a complaint… not yet… but it’s good to know you care.

“I’m so sorry. We just sold out of Sun Dolphins this morning. But we can have one delivered to you within 3 days. We have in-store pickup, too,” the salesperson says. “Or, maybe you’d be interested in another model with comparable features. Let me show you.”

Turns out, your staffer isn’t just helpful — his training has made him so well-versed in your product line that he’s able to match my needs to a perfect kayak for me. I end up buying an Intex on the spot.

The cashier double-checks with me that I’ve found everything satisfactory and lets me know your brand takes feedback very seriously. She says my review would be valued, and my receipt invites me to read your reviews on Google, Yelp, and Facebook… and offers a special deal for signing up for your email newsletter.

My subsequent 5-star review signals to all departments of your company that a company-wide goal was met. Over the next year, my glowing review also influences 20 of my local neighbors to choose you over a competitor.

After my first wet, cold, and exciting kayaking trip, I realize I need to invest in a better waterproof jacket for next time. Your email newsletter hits my inbox at just the right time, announcing your Fourth of July sale. I’m about to become a repeat customer… worth up to 10x the value of my first purchase.

“No matter how brilliant your mind or strategy, if you’re playing a solo game, you’ll always lose out to a team.”
- Reid Hoffman, Co-Founder of LinkedIn

There’s a kind of magic in this adventurous mix of marketing wins. Subtract anything from the picture, and you may miss out on the customer. It’s been said that great teams beat with a single heart. The secret lies in seeing every marketing discipline and practitioner as part of your team, doing what your brand has been doing all along: working with dedication to acquire, serve and retain consumers. Whether achievement comes via citation management, conversion optimization, or a write-up in the New York Times, the end goal is identical.

It’s also long been said that the race is to the swift. Media mogul Rupert Murdoch appears to agree, stating that, in today’s world, it’s not big that beats small — it’s fast that beats slow. How quickly your brand is able to integrate all forms of on-and-offline marketing into its core strategy, leaving no team as an island, may well be what writes your future.




How Local SEO Fits In With What You’re Already Doing posted first on https://moz.com/blog

Designing a Page’s Content Flow to Maximize SEO Opportunity – Whiteboard Friday

Posted by randfish

Controlling and improving the flow of your on-site content can actually help your SEO. What's the best way to capitalize on the opportunity present in your page design? Rand covers the questions you need to ask (and answer) and the goals you should strive for in today's Whiteboard Friday.

Designing a page's content flow to maximize SEO opportunity

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about designing a page's content flow to help with your SEO.

Now, unfortunately, somehow in the world of SEO tactics, this one has gotten left by the wayside. I think a lot of people in the SEO world are investing in things like content and solving searchers' problems and getting to the bottom of searcher intent. But unfortunately, the page design and the flow of the elements, the UI elements, the content elements that sit in a page is discarded or left aside. That's unfortunate because it can actually make a huge difference to your SEO.

Q: What needs to go on this page, in what order, with what placement?

So if we're asking ourselves like, "Well, what's the question here?" Well, it's what needs to go on this page. I'm trying to rank for "faster home Wi-Fi." Right now, Lifehacker and a bunch of other people are ranking in these results. It gets a ton of searches. I can drive a lot of revenue for my business if I can rank there. But what needs to go on this page in what order with what placement in order for me to perform the best that I possibly can? It turns out that sometimes great content gets buried in a poor page design and poor page flow. But if we want to answer this question, we actually have to ask some other ones. We need answers to at least these three:

A. What is the searcher in this case trying to accomplish?

When they enter "faster home Wi-Fi," what's the task that they want to get done?

B. Are there multiple intents behind this query, and which ones are most popular?

What's the popularity of those intents in what order? We need to know that so that we can design our flow around the most common ones first and the secondary and tertiary ones next.

C. What's the business goal of ranking? What are we trying to accomplish?

That's always going to have to be balanced out with what is the searcher trying to accomplish. Otherwise, in a lot of cases, there's no point in ranking at all. If we can't get our goals met, we should just rank for something else where we can.

Let's assume we've got some answers:

Let's assume that, in this case, we have some good answers to these questions so we can proceed. So pretty simple. If I search for "faster home Wi-Fi," what I want is usually going to be...

A. Faster download speed at home.

That's what the searcher is trying to accomplish. But there are multiple intents behind this. Sometimes the searcher is looking to do that...

B1. With their current ISP and their current equipment.

They want to know things they can optimize that don't cause them to spend money. Can they place their router in different places? Can they change out a cable? Do they need to put it in a different room? Do they need to move their computer? Is the problem something else that's interfering with their Wi-Fi in their home that they need to turn off? Those kinds of issues.

B2. With a new ISP.

Or can they get a new ISP? They might be looking for an ISP that can provide them with faster home internet in their area, and they want to know what's available, which is a very different intent than the first one.

B3. With current ISP but new equipment.

Maybe they want to keep their ISP, but they're willing to upgrade to new equipment. So they're looking for the equipment they could buy to get faster speeds from their current ISP. In many areas of the United States, sadly, only one ISP can provide service, so switching isn't an option, but changing out the equipment is.

C. Affiliate revenue with product referrals.

Let's assume that (C) is we know that what we're trying to accomplish is affiliate revenue from product referrals. So our business is basically we're going to send people to new routers or the Google Mesh Network home device, and we get affiliate revenue by passing folks off to those products and recommending them.

Now we can design a content flow.

Okay, fair enough. We now have enough to be able to take care of this design flow. The design flow can involve lots of things. There are a lot of things that could live on a page, everything from navigation to headline to the lead-in copy or the header image or body content, graphics, reference links, the footer, a sidebar potentially.

The elements that go in here are not actually what we're talking about today. We can have that conversation too. I want a headline that's going to tell people that I serve all of these different intents. I want to have a lead-in that has a potential to be the featured snippet in there. I want a header image that can rank in image results and be in the featured snippet panel. I'm going to want body content that serves all of these in the order that's most popular. I want graphics and visuals that suggest to people that I've done my research and I can provably show that the results that you get with this different equipment or this different ISP will be relevant to them.

But really, what we're talking about here is the flow that matters. The content itself, the problem is that it gets buried. What I see many times is folks will take a powerful visual or a powerful piece of content that's solving the searcher's query and they'll put it in a place on the page where it's hard to access or hard to find. So even though they've actually got great content, it is buried by the page's design.

5 big goals that matter.

The goals that matter here and the ones that you should be optimizing for when you're thinking about the design of this flow are:

1. How do I solve the searcher's task quickly and enjoyably?

So that's about user experience as well as the UI. I know that, for many people, they are going to want to see and, in fact, the result that's ranking up here on the top is Lifehacker's top 10 list for how to get your home Wi-Fi faster. They include things like upgrading your ISP, and here's a tool to see what's available in your area. They include maybe you need a better router, and here are the best ones. Maybe you need a different network or something that expands your network in your home, and here's a link out to those. So they're serving that purpose up front, up top.

2. Serve these multiple intents in the order of demand.

So if we can intuit that most people want to stick with their ISP but are willing to change equipment, we can serve that intent first (B3). We can serve the no-new-spend optimizations second (B1), and the change-my-ISP option third (B2), which is actually the ideal fit in this scenario for us.

3. Optimize for the business goal without sacrificing one and two.

I would urge you to design generally with the searcher in mind and if you can fit in the business goal, that is ideal. Otherwise, what tends to happen is the business goal comes first, the searcher comes second, and you come tenth in the results.

4. If possible, try to claim the featured snippet and the visual image that go up there.

That means using the lead-in up at the top. It's usually the first paragraph or the first few lines of text in an ordered or unordered list, along with a header image or visual in order to capture that featured snippet. That's very powerful for search results that are still showing it.

5. Limit our bounce back to the SERP as much as possible.

In many cases, this means limiting some of the UI or design flow elements that hamper people from solving their problems or that annoy or dissuade them. So, for example, advertising that pops up or overlays that appear before I've gotten two-thirds of the way down the page really tend to hamper efforts, really tend to increase this bounce back to the SERP (what's often called pogo-sticking), and can harm your rankings dramatically. Design flows where the content that actually solves the problem sits below an advertising block or a promotional block are also very limiting.

So to the degree that we can control the design of our pages and optimize for that, we can actually take existing content that you might already have and improve its rankings without having to remake it, without needing new links, simply by improving the flow.

I hope we'll see lots of examples of those in the comments, and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com




Designing a Page's Content Flow to Maximize SEO Opportunity - Whiteboard Friday posted first on https://moz.com/blog

The Complete Guide to Direct Traffic in Google Analytics

Posted by tombennet

When it comes to direct traffic in Analytics, there are two deeply entrenched misconceptions.

The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.

In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.

What is direct traffic?

In short, Google Analytics will report a traffic source of "direct" when it has no data on how the session arrived at your website, or when the referring source has been configured to be ignored. You can think of direct as GA’s fall-back option for when its processing logic has failed to attribute a session to a particular source.

To properly understand the causes and fixes for direct traffic, it’s important to understand exactly how GA processes traffic sources. The following flow-chart illustrates how sessions are bucketed — note that direct sits right at the end as a final "catch-all" group.

Broadly speaking, and disregarding user-configured overrides, GA’s processing follows this sequence of checks:

AdWords parameters > Campaign overrides > UTM campaign parameters > Referred by a search engine > Referred by another website > Previous campaign within timeout period > Direct

Note the penultimate processing step (previous campaign within timeout), which has a significant impact on the direct channel. Consider a user who discovers your site via organic search, then returns via direct a week later. Both sessions would be attributed to organic search. In fact, campaign data persists for up to six months by default. The key point here is that Google Analytics is already trying to minimize the impact of direct traffic for you.
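The fall-through logic above can be sketched in code. This is a deliberately simplified illustration of the ordering, not GA's actual implementation; the `Hit` type, field names, and search-engine list are all hypothetical:

```python
# Illustrative sketch of GA's source-attribution fall-through.
# Field names and the Hit type are hypothetical, not GA's API;
# the "campaign overrides" step is omitted for brevity.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Hit:
    gclid: Optional[str] = None       # AdWords auto-tagging parameter
    utm_source: Optional[str] = None  # UTM campaign parameters
    referrer: Optional[str] = None    # HTTP referrer, if any

SEARCH_ENGINES = {"google.com", "bing.com", "yahoo.com"}

def attribute(hit: Hit, previous_campaign: Optional[str] = None) -> str:
    """Walk the checks in order; 'direct' is the final catch-all."""
    if hit.gclid:
        return "paid search (AdWords)"
    if hit.utm_source:
        return f"campaign: {hit.utm_source}"
    if hit.referrer in SEARCH_ENGINES:
        return "organic search"
    if hit.referrer:
        return f"referral: {hit.referrer}"
    if previous_campaign:  # campaign timeout not yet expired
        return previous_campaign
    return "direct"
```

Note how the organic-search example from above falls out of this model: a returning user with no referrer data but a prior organic session within the timeout window is still attributed to organic search, not direct.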

What causes direct traffic?

Contrary to popular belief, there are actually many reasons why a session might be missing campaign and traffic source data. Here we will run through some of the most common.

1. Manual address entry and bookmarks

The classic direct-traffic scenario, this one is largely unavoidable. If a user types a URL into their browser’s address bar or clicks on a browser bookmark, that session will appear as direct traffic.

Simple as that.

2. HTTPS > HTTP

When a user follows a link on a secure (HTTPS) page to a non-secure (HTTP) page, no referrer data is passed, meaning the session appears as direct traffic instead of as a referral. Note that this is intended behavior. It’s part of how the secure protocol was designed, and it does not affect other scenarios: HTTP to HTTP, HTTPS to HTTPS, and even HTTP to HTTPS all pass referrer data.
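The four protocol combinations reduce to a simple truth table, sketched here for clarity:

```python
# The default referrer behavior described above: of the four
# protocol combinations, only HTTPS -> HTTP drops the referrer.
def referrer_is_passed(source: str, destination: str) -> bool:
    return not (source == "https" and destination == "http")
```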

So, if your referral traffic has tanked but direct has spiked, it could be that one of your major referrers has migrated to HTTPS. The inverse is also true: If you’ve migrated to HTTPS and are linking to HTTP websites, the traffic you’re driving to them will appear in their Analytics as direct.

If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data which is being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like Let’s Encrypt, but that’s not to say you should neglect the potentially significant SEO implications of a site migration. Remember, HTTPS and HTTP/2 are the future of the web.

If, on the other hand, you’ve already migrated to HTTPS and are concerned about your users appearing to partner websites as direct traffic, you can implement the meta referrer tag. Cyrus Shepard has written about this on Moz before, so I won’t delve into it now. Suffice to say, it’s a way of telling browsers to pass some referrer data to non-secure sites, and can be implemented as a <meta> element or HTTP header.

3. Missing or broken tracking code

Let’s say you’ve launched a new landing page template and forgotten to include the GA tracking code. Or, to use a scenario I’m encountering more and more frequently, imagine your GTM container is a horrible mess of poorly configured triggers, and your tracking code is simply failing to fire.

Users land on this page without tracking code. They click on a link to a deeper page which does have tracking code. From GA’s perspective, the first hit of the session is the second page visited, meaning that the referrer appears as your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as per default configuration), the session is bucketed as direct. This will happen even if the first URL is tagged with UTM campaign parameters.

As a short-term fix, you can try to repair the damage by simply adding the missing tracking code. To prevent it happening again, carry out a thorough Analytics audit, move to a GTM-based tracking implementation, and promote a culture of data-driven marketing.

4. Improper redirection

This is an easy one. Don’t use meta refreshes or JavaScript-based redirects — these can wipe or replace referrer data, leading to direct traffic in Analytics. You should also be meticulous with your server-side redirects, and — as is often recommended by SEOs — audit your redirect file frequently. Complex chains are more likely to result in a loss of referrer data, and you run the risk of UTM parameters getting stripped out.

Once again, control what you can: use carefully mapped (i.e. non-chained) code 301 server-side redirects to preserve referrer data wherever possible.
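Chain-collapsing can even be automated against a redirect map before it's deployed. A minimal sketch, assuming your redirects live in a simple old-URL to new-URL mapping (the function name and data shape are illustrative, not tied to any particular server):

```python
# Hypothetical helper: collapse chained redirects so every entry
# points at its final destination in a single hop, reducing the
# risk of lost referrer data and stripped UTM parameters.
def flatten_redirects(redirects: dict) -> dict:
    flat = {}
    for src in redirects:
        seen, dest = {src}, redirects[src]
        while dest in redirects:      # follow the chain to its end
            if dest in seen:          # guard against redirect loops
                raise ValueError(f"redirect loop at {dest}")
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat
```

For example, a map containing `/old -> /interim` and `/interim -> /new` flattens so that `/old` points straight at `/new`.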

5. Non-web documents

Links in Microsoft Word documents, slide decks, or PDFs do not pass referrer information. By default, users who click these links will appear in your reports as direct traffic. Clicks from native mobile apps (particularly those with embedded "in-app" browsers) are similarly prone to stripping out referrer data.

To a degree, this is unavoidable. Much like so-called “dark social” visits (discussed in detail below), non-web links will inevitably result in some quantity of direct traffic. However, you also have an opportunity here to control the controllables.

If you publish whitepapers or offer downloadable PDF guides, for example, you should be tagging the embedded hyperlinks with UTM campaign parameters. You’d never even contemplate launching an email marketing campaign without campaign tracking (I hope), so why would you distribute any other kind of freebie without similarly tracking its success? In some ways this is even more important, since these kinds of downloadables often have a longevity not seen in a single email campaign. Here’s an example of a properly tagged URL which we would embed as a link:

https://builtvisible.com/embedded-whitepaper-url/?..._medium=offline_document&utm_campaign=201711_utm_whitepaper

The same goes for URLs in your offline marketing materials. For major campaigns it’s common practice to select a short, memorable URL (e.g. moz.com/tv/) and design an entirely new landing page. It’s possible to bypass page creation altogether: simply redirect the vanity URL to an existing page URL which is properly tagged with UTM parameters.

So, whether you tag your URLs directly, use redirected vanity URLs, or — if you think UTM parameters are ugly — opt for some crazy-ass hash-fragment solution with GTM (read more here), the takeaway is the same: use campaign parameters wherever it’s appropriate to do so.
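If you're tagging at any volume, it's worth generating the URLs programmatically rather than by hand. A minimal sketch using only Python's standard library; the parameter values shown are illustrative, not a recommendation for your taxonomy:

```python
# Build a campaign-tagged URL, preserving any existing query string
# and fragment. Values here are illustrative examples only.
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def tag_url(url: str, source: str, medium: str, campaign: str) -> str:
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = parse_qsl(query) + [
        ("utm_source", source),
        ("utm_medium", medium),
        ("utm_campaign", campaign),
    ]
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))
```

Calling `tag_url("https://example.com/guide/", "whitepaper", "offline_document", "201711_whitepaper")` yields a URL that GA will attribute to the campaign rather than bucketing the session as direct.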

6. “Dark social”

This is a big one, and probably the least well understood by marketers.

The term “dark social” was first coined back in 2012 by Alexis Madrigal in an article for The Atlantic. Essentially it refers to methods of social sharing which cannot easily be attributed to a particular source, like email, instant messaging, Skype, WhatsApp, and Facebook Messenger.

Recent studies have found that upwards of 80% of consumers’ outbound sharing from publishers’ and marketers’ websites now occurs via these private channels. In terms of numbers of active users, messaging apps are outpacing social networking apps. All the activity driven by these thriving platforms is typically bucketed as direct traffic by web analytics software.

People who use the ambiguous phrase “social media marketing” are typically referring to advertising: you broadcast your message and hope people will listen. Even if you overcome consumer indifference with a well-targeted campaign, any subsequent interactions are affected by their very public nature. The privacy of dark social, by contrast, represents a potential goldmine of intimate, targeted, and relevant interactions with high conversion potential. Nebulous and difficult to track though it may be, dark social has the potential to let marketers tap into the elusive power of word of mouth.

So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.

Checklist: Minimizing direct traffic

To summarize what we’ve already discussed, here are the steps you can take to minimize the level of unnecessary direct traffic in your reports:

  1. Migrate to HTTPS: Not only is the secure protocol your gateway to HTTP/2 and the future of the web, it will also have an enormously positive effect on your ability to track referral traffic.
  2. Manage your use of redirects: Avoid chains and eliminate client-side redirection in favour of carefully-mapped, single-hop, server-side 301s. If you use vanity URLs to redirect to pages with UTM parameters, be meticulous.
  3. Get really good at campaign tagging: Even amongst data-driven marketers I encounter the belief that UTM begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing silly things like tagging internal links. Control what you can, and your ability to carry out meaningful attribution will markedly improve.
  4. Conduct an Analytics audit: Data integrity is vital, so consider this essential when assessing the success of your marketing. It’s not simply a case of checking for missing tracking code: good audits involve a review of your measurement plan and rigorous testing at page and property level.

Adhere to these principles, and it’s often possible to achieve a dramatic reduction in the level of direct traffic reported in Analytics. One example I've seen involved an HTTPS migration, a GTM migration (as part of an Analytics review), and an overhaul of internal campaign tracking processes over the course of about six months.

But the saga of direct traffic doesn’t end there! Once this channel is “clean” — that is, once you’ve minimized the number of avoidable pollutants — what remains might actually be one of your most valuable traffic segments.

Analyze! Or: why direct traffic can actually be pretty cool

For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.

The number of potential avenues to explore is infinite, but here are some good starting points:

  • Build meaningful custom segments, defining a subset of your direct traffic based on their landing page, location, device, repeat visit or purchase behavior, or even enhanced e-commerce interactions.
  • Track meaningful engagement metrics using modern GTM triggers such as element visibility and native scroll tracking. Measure how your direct users are using and viewing your content.
  • Watch for correlations with your other marketing activities, and use them as an opportunity to refine your tagging practices and segment definitions. Create a custom alert which watches for spikes in direct traffic.
  • Familiarize yourself with flow reports to get an understanding of how your direct traffic is converting. By using Goal Flow and Behavior Flow reports with segmentation, it’s often possible to glean actionable insights which can be applied to the site as a whole.
  • Ask your users for help! If you’ve isolated a valuable segment of traffic which eludes deeper analysis, add a button to the page offering visitors a free downloadable ebook if they tell you how they discovered your page.
  • Start thinking about lifetime value, if you haven’t already — overhauling your attribution model or implementing User ID are good steps towards overcoming the indifference or frustration felt by marketers towards direct traffic.

I hope this guide has been useful. With any luck, you arrived looking for ways to reduce the level of direct traffic in your reports, and left with some new ideas for how to better analyze this valuable segment of users.

Thanks for reading!




The Complete Guide to Direct Traffic in Google Analytics posted first on https://moz.com/blog