Desktop, Mobile, or Voice? (D) All of the Above – Whiteboard Friday

Posted by Dr-Pete

We're facing more and more complexity in our everyday work, and the answers to our questions are about as clear as mud. Especially in the wake of the mobile-first index, we're left wondering where to focus our optimization efforts. Is desktop the most important? Is mobile? What about the voice phenomenon sweeping the tech world?

As with most things, the most important factor is to consider your audience. People aren't siloed to a single device — your optimization strategy shouldn't be, either. In today's Whiteboard Friday, Dr. Pete soothes our fears about a multi-platform world and highlights the necessity of optimizing for a journey rather than a touchpoint.

Desktop, Mobile, or Voice? All of the above.


Video Transcription

Hey, everybody. It's Dr. Pete here from Moz. I am the Marketing Scientist here, and I flew in from Chicago just for you fine people to talk about something that I think is worrying us a little bit, especially with the rollout of the mobile-first index recently, and that is the question of: Should we be optimizing for desktop, for mobile, or for voice? I think the answer is (d) All of the above. I know that might sound a little scary, and you're wondering how you do any of these. So I want to talk to you about some of what's going on, some of our misconceptions around mobile and voice, and some of the ways that maybe this is a little easier than you think, at least to get started.

The mistakes we make

So, first of all, I think we make a couple of mistakes. When we've talked about mobile over the last few years, we tend to go in and look at our analytics and do this (the numbers here are made up for illustration). We say, "Okay, about 90% of my traffic is coming from desktop, about 10% is coming from mobile, and nothing is coming from voice. So I'm just going to keep focusing on desktop and not worry about these other two experiences, and I'll be fine." There are two problems with this:

Self-fulfilling prophecy

One is that these numbers are kind of a self-fulfilling prophecy. You might not be getting mobile visitors because your mobile experience is terrible: people come to it, it's lousy, and they don't come back. In the case of voice, we might just not be getting that data yet. We have very little data. So this isn't telling us anything. All this may be telling us is that we're doing a really bad job on mobile and people have given up. We've seen that with Moz in the past. We didn't adapt to mobile as fast as maybe we should have. We saw that in the numbers, and we argued about it because we said, "You know what? This doesn't really tell us what the opportunity is or what our customers or users want. It's just telling us what we're doing well or badly right now, and it becomes a self-fulfilling prophecy."

Audiences

The other mistake I think we make is the idea that these are three separate audiences. There are people who come to our site on desktop, people who come to our site on mobile, people who come to our site on voice, and these are three distinct groups of people. I think that's incredibly wrong, and that leads to some very bad ideas and some bad tactical decisions and some bad choices.

So I want to share a couple of stats. There was a study Google did called The Multiscreen World, and this was almost six years ago, in 2012. They found that 65% of searchers, two-thirds of searchers, started a search on their smartphones. Sixty percent of those searches were continued on a desktop or laptop. Again, this was six years ago, and we know the adoption rate of mobile has only increased since then. So these are not people who only use desktop or who only use mobile. These are people on a search journey who move between devices, and I think in the real world it looks something more like this right now.

Another stat from the series was that 88% of people said that they used their smartphone and their TV at the same time. This isn't shocking to you. You sit in front of the TV with your phone and you sit in front of the TV with your laptop. You might sit in front of the TV with a smartwatch. These devices are being used at the same time, and we're doing more searches and we're using more devices. So one of these things isn't replacing the other.

The cross-device journey

So a journey could look something like this. You're watching TV. You see an ad and you hear about something. You see a video you like. You go to your phone while you're watching it, and you do a search on that to get more information. Then later on, you go to your laptop and you do a bit of research, and you want that bigger screen to see what's going on. Then at the office the next day, you're like, "Oh, I'll pull up that bookmark. I wanted to check something on my desktop where I have more bandwidth or something." Then you're like, "Oh, maybe I better not buy that at work. I don't want to get in trouble. So I'm going to go home, get back on my laptop, and make that purchase." So this purchase, this transaction, is one visitor moving along this chain, and I think we do this a lot right now, and that's only going to increase, where we operate between devices and this journey happens across devices.

So the challenge I would make to you is this: maybe you're looking at this and saying, "Only so many percent of our users are on mobile. Our mobile experience doesn't matter that much. It's not that important. We can just live with the desktop people. That's enough. We'll make enough money." But if searchers are really on this journey and not segmented like this, and you break the chain, what happens? You lose that person completely, and that was a person who also used desktop. So that person might be someone you bucketed in your 90%, but they never really got to the device of choice and they never got to the transaction, because by having a lousy mobile experience, you've broken the chain. So I want you to be aware of that, that this is a cross-device journey and not these segmented ideas.

Future touchpoints

This is going to get worse. This is going to get scarier for us. So look at the future. We're going to be sitting in our car and we're going to be listening — I still listen to CDs in the car, I know it's kind of sad — but you're going to be listening to satellite radio or your Wi-Fi or whatever you have coming in, and let's say you hear a podcast or you hear an author and you go, "Oh, that person sounds interesting. I want to learn more about them." You tell your smartwatch, "Save this search. Tell me something about this author. Give me their books." Then you go home and you go on Google Home and you pull up that search, and it says, "Oh, you know what? I've got a video. I can't play that because obviously I'm a voice search device, but I can send that to Chromecast on your TV." So you send that to your TV, and you watch that. While you're watching the TV, you've got your phone out and you're saying, "Oh, I'd kind of like to buy that." You go to Amazon and you make that transaction.

So it took this entire chain of devices. Again now, what about the voice part of this chain? That might not seem important to you right now, but if you break the chain there, this whole transaction is gone. So I think the danger is by neglecting pieces of this and not seeing that this is a journey that happens across devices, we're potentially putting ourselves at much higher risk than we think.

On the plus side

I also want to look at sort of the positive side of this. All of these devices are touchpoints in the journey, and they give us credibility. We found something interesting at Moz a few years ago, which was that a sale of our SaaS product took, on average, about three touchpoints. People didn't just hit the Moz homepage, do a free trial, and then buy it. They might see a Whiteboard Friday. They might read our Beginner's Guide. They might go to the blog. They might participate in the community. If they hit us with three touchpoints, they were much more likely to convert.

So I think the great thing about this journey is that if you're on all these touchpoints, even though to you that might seem like one search, it lends you credibility. You were there when they ran the search on that device. You were there when they tried to repeat that search on voice. The information was in that video. You're there on that mobile search. You're there on that desktop search. The more times they see you in that chain, the more that you seem like a credible source. So I think this can actually be good for us.

The SEO challenge

So I think the challenge is, "Well, I can't go out and hire a voice team and a mobile team and do a design for all of these things. I don't want to build a voice app. I don't have the budget. I don't have the buy-in." That's fine.
One thing I think is really great right now, and that we're encouraging people to experiment with, is featured snippets. We've talked a lot about these answer boxes that give you an organic result. One of the things Google is trying to do with this is they realize that they need to use the same core engine, the same core competency, across all devices. So the engine that powers search, they want that to run on a TV. They want that to run on a laptop, on a desktop, on a phone, on a watch, on Google Home. They don't want to write separate algorithms for all of these things.

So Google thinks of their entire world in terms of cards. You may not see that on desktop, but everything on desktop is a card. This answer box is a card. That's more obvious. It's got that outline. Every organic result, every ad, every knowledge panel, every news story is a card. What that allows Google to do, and will allow them to do going forward, is to mix and match and put as many pieces of information as it makes sense for any given device. So for desktop, that might be a whole bunch. For mobile, that's going to be a vertical column. It might be less. But for a watch or a Google Glass, or whatever comes after that, or voice, you're probably only going to get one card.

But one great thing right now, from an SEO perspective, is that these featured snippets, these questions and answers, fit on that big screen. We call it result number zero on desktop because you've got that box, and you've got a bunch of stuff underneath it. But that box is very prominent. On mobile, that same question and answer take up a lot more screen space. So there's still a SERP, but the snippet is very dominant, and then there's some stuff underneath. On voice, that same question-and-answer pairing is all you get, and we're seeing that a lot of the answers on voice, unless they're specialty results like recipes or weather or things like that, have this question-and-answer format, and those are also being driven by featured snippets.

So the good news, I think, and what will hopefully stay good news going forward, is that because Google wants all these devices to run off that same core engine, the things you do to rank well for desktop and to be useful for desktop users are also going to help you rank on mobile. They're going to help you rank on voice, and they're going to help you rank across all these devices. So I want you to be aware of this. I want you to try not to break that chain. But I think the things we're already good at will actually help us going forward, and I'd highly encourage you to experiment with featured snippets, to see how questions and answers appear on mobile and on Google Home, and to know that there's going to be an evolution where all of these devices benefit somewhat from the kinds of optimization techniques that, hopefully, we're already good at.
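If you want a concrete starting point for that experiment, a common pattern is to pose the question as a heading on the page and follow it immediately with a concise, self-contained answer. A minimal sketch (the question and copy here are purely illustrative, not a guarantee of winning the snippet):

    <h2>How does Google choose a featured snippet?</h2>
    <p>Google typically lifts a concise answer (a short paragraph, list,
    or table) from a page that already ranks well for the query and
    displays it above the traditional organic results.</p>
    <!-- Richer supporting content continues below the concise answer. -->

The short answer gives Google a card-sized block it can reuse on desktop, mobile, or voice, while the content beneath it gives searchers a reason to click through and keep the journey going on your site.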

Encourage the journey chain

So I also want to say that when you optimize for answers, the best answers leave searchers wanting more. So what you want to do is actually encourage this chain, encourage people to do more research, give them rich content, give them the kinds of things that draw them back to your site, that build credibility, because this chain is actually good news for us in a way. This can help us make a purchase. If we're credible on these devices, if we have a decent mobile experience, if we come up on voice, that's going to help us really kind of build our brand and be a positive thing for us if we work on it.

So I'd like you to tell me, what are your fears right now? I think we're a little scared of the mobile-first index. What are you worried about with voice? What are you worried about with IoT? Are you concerned that we're going to have to rank on our refrigerators, and what does that mean? It's getting into science fiction territory, but I'd love to talk about it more. I will see you in the comment section.

Video transcription by Speechpad.com





How to Optimize Car Dealership Websites

Posted by sherrybonelli

Getting any local business to rank high on Google is becoming more and more difficult, but one of the most competitive — and complex — industries for local SEO is car dealerships. Today’s car shoppers do much of their research online before they even step into a dealership showroom. (And many people don’t even know what type of car they want when they begin their search.)

According to Google, the average car shopper only visits two dealerships when they’re searching for a new vehicle. That means it’s more important than ever for car dealerships to show up high in local search results.

However, car dealerships are more complex than the average local business — which means their digital marketing strategies are more complex, as well. First, dealers often sell new vehicles from several different manufacturers with a variety of makes and models. Next, because so many people trade in their old cars when they purchase new cars, car dealers also sell a variety of used vehicles from even more manufacturers. Additionally, car dealerships have a service department that offers car maintenance and repairs — like manufacturer warranty work, oil changes, tire rotations, recall repairs, and more. (The search feature on a car dealer’s website alone is a complex system!)

Essentially, a car dealer is like three businesses in one: they sell new cars, they sell used cars, AND they do vehicle repairs. This means your optimization strategy must be multi-faceted, too.

Also, if you look at the car dealerships in your city, you will probably find at least one dealership with multiple locations. These multi-location dealership families may be in the same city or in surrounding cities.

Additionally, depending on that family of dealerships, they may have one website or they might have different websites for each location. (Many auto manufacturers require dealers to have separate websites if they sell certain competitors’ vehicles.)

So if you’re helping car dealers with SEO, you must be thinking about the various manufacturers, the types of vehicles being sold (new and used), the repair services being offered, the number of websites and locations you’ll be managing, manufacturer requirements — among other things.

So what are some of the search optimization strategies you should use when working with a car dealership? Here are some SEO recommendations.

Google My Business

Google My Business has been shown to correlate directly with local rankings — especially when it comes to showing up in the Google Local 3-Pack.

One important factor with Google My Business is making sure that the dealership’s information is correct and contains valuable information that searchers will find helpful. This is important for competitive markets — especially when only a handful of sites show up on the first page of Google search results. Here are some key Google My Business features to take advantage of:

Name, address, and phone number

Ensure that the dealership’s name, address, and phone number are correct. (If you have a toll-free number, make sure that your LOCAL area code phone number is the one listed on your Google My Business listing.) It’s important that this information is the same on all local online directories that the dealership is listed on.

Categories

Google My Business allows you to select categories (a primary category and additional categories) to describe what your dealership offers. Even though the categories you select affect local rankings, keep in mind that the categories are just one of many factors that determine how you rank in search results.

  • These categories help connect you with potential customers who are searching for what your car dealership sells. You can select a primary category and additional categories – but don’t go overboard. Be specific. Choose as few categories as possible to describe the core part of your dealership’s business.
  • If the category you want to use isn’t available, choose a general category that’s still accurate. You can’t create your own categories. Here are some example categories you could use:
    • Car Dealer
    • Used Car Dealer
    • BMW dealer
  • Keep in mind that if you’re not ranking as high as you want to rank, changing your categories may improve your rankings. You might need to tweak your categories until you get it right. If you add or edit one of your categories, you might be asked by Google to verify your business again. (This just helps Google confirm that your business information is accurate.)

Photos

Google uses photo engagement on Google My Business to help rank businesses in local search. Show photos of the new and used cars you have on your dealership’s lot — and be sure to update them frequently. After you make a sale, get a photo consent form signed and ask if you can take a picture of your happy customers with their new car to upload to Google My Business (and your other social media platforms).

If you’re a digital marketing agency or a sales manager at a dealership, getting your salespeople to upload photos to Google My Business can be challenging. Steady Demand’s LocalPics tool makes this easy by automatically sending text message reminders to your sales reps at whatever frequency you set.

All the sales reps have to do is save their customers’ photos to their phone. When the reminder text arrives, they upload their images directly through the text message, and the photos are automatically posted to the dealership’s Google My Business listing! (They can also text photos to Google My Business anytime they want — they don’t have to wait for the reminder messages.)

Videos

Google recently began allowing businesses to upload 30-second videos to their Google My Business listing. Videos are a great way to show off the uniqueness of your dealership. These videos auto-play on mobile devices — which is where many people do their car searching — so you should include several videos to showcase the cars and what’s going on at your dealership.

Reviews

Online reviews are crucial when people search for the right type of car AND the dealership they should purchase that car from. Make sure you ask happy customers to leave reviews on your Google My Business listing, and keep up by responding to every review that’s left there.

Questions & Answers

The Google My Business Q&A feature has been around for several months, yet many businesses still don’t know about it — or pay attention to it. It’s important that you constantly watch the questions being asked of your dealership and answer them promptly and accurately.

Just like most things on Google My Business, anyone can answer questions that are asked — and that means that it’s easy for misinformation to get out about your dealership and the cars on your lot. Make sure you have a person dedicated on your team to watch the Q&As being asked on your listing.

Also, be sure to frequently check your GMB dashboard. Remember, virtually anyone can make changes to your Google My Business listing. You want to check to make sure nobody has changed your information without you knowing.

Online directories (especially car directories)

If you’re looking for ways to improve your dealership’s rankings and backlink profile, online automotive directories are a great place to start. Submitting your dealership’s site to an online automotive directory or to an online directory that has an automotive category can help build your backlink profile. Additionally, many of these online directories show up on the first page of Google search results, so if your dealership isn’t listed on them, you’re missing out.

There are quite a few paid-for and free automotive online directories. Yelp, YellowPages, Bing, etc. are some of the larger general online directories that have dedicated automotive categories you can get listed on for free. Make sure your dealership’s name, address, and phone number (NAP) are consistent with the information that you have listed on Google My Business.

Online reviews

Online reviews are important. If your dealership has bad reviews, people are less likely to trust you. There are dedicated review sites for vehicle reviews and car dealership reviews. Sites like Kelley Blue Book, DealerRater, Cars.com, and Edmunds are just a few that make it easy for consumers to check out dealership reviews. DealerRater even allows consumers to list — and review — the employees they worked with at a particular dealership.

If they have a negative experience with your dealership — or one of your employees — you can bet that unhappy customer will leave a review. (And remember that reviews are not only left about your new and used car sales — they are also left about your repair shop as well!)

There are software platforms you can install on your dealership’s site that make it easier for customers to leave reviews for your dealership. These tools also make it simple to monitor reviews and deflect negative ones away from certain review websites. (It’s important to note that Google recently changed their policies and no longer allows “review gating” — using software to screen out unhappy customers so that their negative reviews never get posted to Google My Business.)

NOTE: Many automotive manufacturers offer dealerships co-op dollars that can be used for advertising and promotions; however, sometimes they make it easier for the dealers to get that money if they use specific turnkey programs from manufacturer-approved vendors. As an example, if you offer a reputation marketing software tool that can help the dealership get online reviews, the dealership may be incentivized to use DealerRater instead because it’s been “approved” by the manufacturer. (And this goes for other marketing and advertising as well — not just reputation marketing.)

Select long-tail keywords

Selecting the right keywords has always been a part of SEO. You want keywords that have high search frequency and mid-to-low competitiveness, that are directly relevant to your website’s content, and that potential car buyers are actually using to search for the cars and services your dealership offers.

When it comes to selecting keywords for your site’s pages, long-tail keywords (e.g. “2018 Ford Mustang GT features”) have a better chance of ranking highly in Google search results than short, generic keyword phrases like “Ford cars.”

Other car-related search keywords — like “MSRP” and “list prices” — are keywords you should add to your arsenal.

Optimize images

According to Google, searches for "pictures of [automotive brand]" are up 37% year-over-year. This means when you’re uploading various pictures of the cars for sale on your lot, be sure to include the words “pictures of” and the brand name, make, and model where appropriate.

For instance, if you’re showing the interior of the 2018 Dodge Challenger, you may want to name the actual image file “picture-of-dodge-challenger-2018-awd-front-seat-interior.png” and use the alt tag “Pictures of Dodge Challenger 2018 AWD Front Seat Interior for Sale in Cedar Rapids.”
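In the page’s HTML, that would look something like this (the file path is illustrative):

    <img src="/inventory/picture-of-dodge-challenger-2018-awd-front-seat-interior.png"
         alt="Pictures of Dodge Challenger 2018 AWD Front Seat Interior for Sale in Cedar Rapids">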

As with everything SEO-related, use discretion with the “pictures of” strategy. Don’t overdo it, but it should be a part of your image optimization strategy to a certain extent on specific car overview pages.

Optimize for local connections

One thing many car dealerships fail to realize is how important it is to make local connections — not only for local SEO purposes, but also for community trust and support. You should make a connection on at least one of the pages on your site that relates to what’s going on in your local community/city.

For instance, on your About Us page, you may want to include a link to a city-specific page that talks about what’s going on in your city. Is there a July 4th parade? And if so, are you having a float or donating a convertible for the town’s mayor to ride in? If you sponsor a local charity or belong to the Chamber of Commerce, it’d be great to mention it on one of these localized pages (mentioning your city’s name, of course) and talk about what your dealership’s role is and what you do. Is there an upcoming charity walk or do you donate to your local animal shelter? Share pictures (and be sure to use alt tags) and write about what you’re doing to help.

All of this information not only helps beef up your local SEO, because you’re using the name of the city you’re trying to rank for, but it also creates goodwill with future customers. Additionally, you can create links to these various charities and organizations and ask that they, in turn, create a link to your site. Local backlinking at its best!

Schema

If you want to increase the chances of Google — and the other search engines — understanding what your site’s pages are about, using schema markup will give you a leg-up over your competition. (And chances are your car dealership competitors aren’t yet using schema markup.)

You’ll want to start with the Vehicle schema type and then mark up each particular car using JSON-LD code. You can find the guidelines for using schema markup for cars on Schema.org. Below is an example of what JSON-LD schema markup looks like for a 2009 Volkswagen Golf:
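(The markup below is a representative sketch; the property values are illustrative, and Schema.org documents the full set of supported Vehicle and Car properties.)

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Car",
      "name": "2009 Volkswagen Golf",
      "brand": { "@type": "Brand", "name": "Volkswagen" },
      "model": "Golf",
      "vehicleModelDate": "2009",
      "itemCondition": "https://schema.org/UsedCondition",
      "mileageFromOdometer": {
        "@type": "QuantitativeValue",
        "value": "55000",
        "unitCode": "SMI"
      },
      "color": "Black",
      "fuelType": "Gasoline",
      "offers": {
        "@type": "Offer",
        "price": "8500",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>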

Listen to the SEO for Car Dealerships podcast episode to learn EVEN MORE!

If you want to learn even more information about the complexities of car dealerships and search optimization strategies, be sure to listen to my interview on MozPod’s SEO for Car Dealerships.

In this podcast we’ll cover even more topics like:

  • What NOT to include in your page’s title tag
  • How to determine if you really own your dealership’s website or not
  • How to handle it if your dealership moves locations
  • Why using the manufacturer-provided car description information verbatim is a bad idea
  • Does “family owned” really matter?
  • How to handle car dealers with multiple locations
  • How to get creative with your Car Service pages by showing off your employees
  • Why blogging is a must-do SEO strategy and some topic ideas to get you started
  • Ways to get local backlinks
  • Tips for getting online reviews
  • What other digital marketing strategies you should try and why
  • And more




What Do SEOs Do When Google Removes Organic Search Traffic? – Whiteboard Friday

Posted by randfish

We rely pretty heavily on Google, but some of their decisions of late have made doing SEO more difficult than it used to be. Which organic opportunities have been taken away, and what are some potential solutions? Rand covers a rather unsettling trend for SEO in this week's Whiteboard Friday.

What Do SEOs Do When Google Removes Organic Search?


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're talking about something kind of unnerving. What do we, as SEOs, do as Google is removing organic search traffic?

So for the last 19 or 20 years that Google has been around, every month (at least seasonally adjusted) Google has not only handled more searches, but sent more organic traffic than it did in that month the year before. So this has been on a steady incline. There has always been more opportunity in Google search until recently, and that's because of a bunch of moves: not that Google is losing market share, not that they're receiving fewer searches, but that they are doing things that make SEO a lot harder.

Some scary news

Things like...

  • Aggressive "answer" boxes. So you search for a question, and Google provides not just necessarily a featured snippet, which can earn you a click-through, but a box that truly answers the searcher's question, that comes directly from Google themselves, or a set of card-style results that provides a list of all the things that the person might be looking for.
  • Google is moving into more and more aggressively commercial spaces, like jobs, flights, products, all of these kinds of searches where previously there was opportunity and now there's a lot less. If you're Expedia or you're Travelocity or you're Hotels.com or you're Cheapflights and you see what's going on with flight and hotel searches in particular, Google is essentially saying, "No, no, no. Don't worry about clicking anything else. We've got the answers for you right here."
  • We also saw, for the first time, a seasonally adjusted drop in total organic clicks sent. That was between August and November of 2017, and we saw it in the Jumpshot dataset. It happened at least here in the United States; we don't know if it's happened in other countries as well. But that's certainly concerning, because that is not something we've observed in the past. There were fewer clicks sent than there were previously, and that makes us pretty concerned. It didn't go down very much, just a couple of percentage points, and there are still a lot more clicks being sent in 2018 than there were in 2013. So it's not like we've dipped below something, but it's concerning.
  • New zero-result SERPs. We absolutely saw those for the first time. Google rolled them back after rolling them out. But, for example, if you search for the time in London or a Lagavulin 16, Google was showing no results at all, just a little box with the time and then potentially some AdWords ads. So zero organic results, nothing for an SEO to even optimize for in there.
  • Local SERPs that remove almost all need for a website. Local SERPs have been getting more and more aggressively tuned so that you never need to click through to a website, and, in fact, Google has made it harder and harder to find the website in both mobile and desktop versions of local searches. So if you search for a Thai restaurant and you try to find the website of the one you're interested in, as opposed to just information about it in Google's local pack, that's frustratingly difficult. Google is making those local packs more and more aggressive and putting them further forward in the results.

Potential solutions for marketers

So, as a result, I think search marketers really need to start thinking about: What do we do as Google is taking away this opportunity? How can we continue to compete and provide value for our clients and our companies? I think there are three big paths we can pursue. I won't get into every detail of each, but here they are.

1. Invest in demand generation for your brand + branded product names to leapfrog declines in unbranded search.

The first one is pretty powerful and pretty awesome, which is investing in demand generation, rather than just demand serving, for your brand and branded product names. Why does this work? Well, because let's say, for example, I'm searching for SEO tools. What do I get? I get back a list of results from Google with a bunch of mostly articles saying these are the top SEO tools. In fact, Google has now made a little card-style list result up at the top, a carousel that shows different brands of SEO tools. I don't think Moz is actually listed in there, because I think they're pulling from the second or third list instead of the first one. Whatever the case, it's frustrating and hard to optimize for, and Google could take away the demand or the click-through rate opportunity from it.

But if someone performs a search for Moz, well, guess what? I mean, we can nail that sucker. We can definitely rank for that. Google is not going to take away our ability to rank for our own brand name. In fact, Google knows that, in the navigational search sense, they need to provide the website that the person is looking for front and center. So if we can create more demand for Moz than there is for "SEO tools" (and according to Google Trends, there's already something like 5 or 10 times more demand for Moz), that's a great way to go. You can do the same thing through your content, through your social media, and through your email marketing. Even through search itself you can create demand for your brand rather than for unbranded terms.

2. Optimize for additional platforms.

Second thing, optimizing across additional platforms. So we've looked, and YouTube and Google Images account for about half of the overall volume that goes to Google web search. So between these two platforms, you've got a significant amount of additional traffic that you can optimize for. Images has actually gotten less aggressive: right now they've taken away the "view image directly" link, so more people are visiting websites via Google Images. YouTube, obviously, is a great place to build brand affinity, to build awareness, to create demand, this kind of demand generation, and to get your content in front of people. So these two are great platforms for that.

There are also significant amounts of web traffic still on the social web — LinkedIn, Facebook, Twitter, Pinterest, Instagram, etc., etc. The list goes on. Those are places where you can optimize, put your content forward, and earn traffic back to your websites.

3. Optimize the content that Google does show.

Local

So if you're in the local space and you're saying, "Gosh, Google has really taken away the ability for my website to get the clicks that it used to get from Google local searches," go into Google My Business and optimize the information you provide so that people who perform that query are satisfied by Google's result. Yes, they won't get to your website, but they will still come to your business, because the content you've optimized, which Google is showing through Google My Business, makes those searchers want to engage with you. I think this sometimes gets lost in the SEO battle. We're trying so hard to earn the click to our site that we're forgetting that a lot of search experiences end right at the SERP itself, and we can optimize there too.

Results

In the zero-result SERPs, Google was still willing to show AdWords ads, which means if we have customer targets, we can use remarketing lists for search ads (RLSA), or we can run paid ads and still optimize for those. We could also try to claim some of the data that might show up in zero-result SERPs. We don't yet know what that will be after Google rolls them back out, but we'll find out in the future.

Answers

For answers, the answers that Google is giving, whether that's through voice or visually, those can be curated and crafted through featured snippets, through the card lists, and through the answer boxes. We have the opportunity again to influence, if not control, what Google is showing in those places, even when the search ends at the SERP.

All right, everyone, thanks for watching for this edition of Whiteboard Friday. We'll see you again next week. Take care.

Video transcription by Speechpad.com





The Minimum Viable Knowledge You Need to Work with JavaScript & SEO Today

Posted by sergeystefoglo

If your work involves SEO at some level, you’ve most likely been hearing more and more about JavaScript and the implications it has on crawling and indexing. Frankly, Googlebot struggles with it, and many websites today use modern JavaScript to load in crucial content. Because of this, we need to be equipped to discuss this topic when it comes up in order to be effective.

The goal of this post is to equip you with the minimum viable knowledge required to do so. This post won’t go into the nitty gritty details, describe the history, or give you extreme detail on specifics. There are a lot of incredible write-ups that already do this — I suggest giving them a read if you are interested in diving deeper (I’ll link out to my favorites at the bottom).

In order to be effective consultants when it comes to the topic of JavaScript and SEO, we need to be able to answer three questions:

  1. Does the domain/page in question rely on client-side JavaScript to load/change on-page content or links?
  2. If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?
  3. If not, what is the ideal solution?

With some quick searching, I was able to find three examples of landing pages that utilize JavaScript to load in crucial content.

I’m going to be using Sitecore’s Symposium landing page through each of these talking points to illustrate how to answer the questions above.

We’ll cover the “how do I do this” aspect first, and at the end I’ll expand on a few core concepts and link to further resources.

Question 1: Does the domain in question rely on client-side JavaScript to load/change on-page content or links?

The first step to diagnosing any issues involving JavaScript is to check if the domain uses it to load in crucial content that could impact SEO (on-page content or links). Ideally this will happen anytime you get a new client (during the initial technical audit), or whenever your client redesigns/launches new features of the site.

How do we go about doing this?

Ask the client

Ask, and you shall receive! Seriously though, one of the quickest/easiest things you can do as a consultant is contact your POC (or developers on the account) and ask them. After all, these are the people who work on the website day-in and day-out!

“Hi [client], we’re currently doing a technical sweep on the site. One thing we check is if any crucial content (links, on-page content) gets loaded in via JavaScript. We will do some manual testing, but an easy way to confirm this is to ask! Could you (or the team) answer the following, please?

1. Are we using client-side JavaScript to load in important content?
2. If yes, can we get a bulleted list of where/what content is loaded in via JavaScript?”

Check manually

Even on a large e-commerce website with millions of pages, there are usually only a handful of important page templates. In my experience, it should only take an hour max to check manually. I use the Web Developer plugin for Chrome, disable JavaScript from there, and manually check the important templates of the site (homepage, category page, product page, blog post, etc.)

In the example above, once we turn off JavaScript and reload the page, we can see that we are looking at a blank page.
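If you’d like a quick programmatic version of the same spot-check, you can fetch the raw HTML (which, like a browser with JavaScript disabled, executes no scripts) and search it for content you can see on the rendered page. Here’s a minimal sketch in Node (18+, where fetch is built in); the URL and phrase are placeholders:

    // Fetch the raw HTML (no JavaScript is executed) and check whether a
    // phrase that's visible in the browser actually exists in the source.
    const url = 'https://www.example.com/landing-page'; // placeholder URL
    const phrase = 'a sense of urgency'; // text visible on the rendered page

    fetch(url)
      .then((res) => res.text())
      .then((html) => {
        console.log(html.includes(phrase)
          ? 'Found in raw HTML: content is served server-side.'
          : 'Missing from raw HTML: likely loaded via client-side JavaScript.');
      })
      .catch((err) => console.error('Fetch failed:', err));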

As you make progress, jot down notes about content that isn’t being loaded in, is being loaded in wrong, or any internal linking that isn’t working properly.

At the end of this step we should know if the domain in question relies on JavaScript to load/change on-page content or links. If the answer is yes, we should also know where this happens (homepage, category pages, specific modules, etc.)

Crawl

You could also crawl the site (with a tool like Screaming Frog or Sitebulb) with JavaScript rendering turned off, and then run the same crawl with JavaScript turned on, and compare the differences with internal links and on-page elements.

For example, it could be that when you crawl the site with JavaScript rendering turned off, the title tags don’t appear. In my mind this would trigger an action to crawl the site with JavaScript rendering turned on to see if the title tags do appear (as well as checking manually).

Example

For our example, I went ahead and did a manual check. As we can see from the screenshot below, when we disable JavaScript, the content does not load.

In other words, the answer to our first question for this page is “yes, JavaScript is being used to load in crucial parts of the site.”

Question 2: If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?

If your client is relying on JavaScript on certain parts of their website (in our example they are), it is our job to try and replicate how Google is actually seeing the page(s). We want to answer the question, “Is Google seeing the page/site the way we want it to?”

In order to get a more accurate depiction of what Googlebot is seeing, we need to attempt to mimic how it crawls the page.

How do we do that?

Use Google’s new mobile-friendly testing tool

At the moment, the quickest and most accurate way to try and replicate what Googlebot is seeing on a site is by using Google’s new mobile friendliness tool. My colleague Dom recently wrote an in-depth post comparing Search Console Fetch and Render, Googlebot, and the mobile friendliness tool. His findings were that most of the time, Googlebot and the mobile friendliness tool resulted in the same output.

In Google’s mobile friendliness tool, simply input your URL, hit “run test,” and then once the test is complete, click on “source code” on the right side of the window. You can take that code and search for any on-page content (title tags, canonicals, etc.) or links. If they appear here, Google is most likely seeing the content.

Search for visible content in Google

It’s always good to sense-check. Another quick way to check if Googlebot has indexed content on your page is by simply selecting visible text on your page and doing a site:search for it in Google with quotation marks around said text.

In our example there is visible text on the page that reads…

"Whether you are in marketing, business development, or IT, you feel a sense of urgency. Or maybe opportunity?"

When we do a site:search for this exact phrase, for this exact page, we get nothing. This means Google hasn’t indexed the content.

Crawling with a tool

Most crawling tools have the functionality to crawl JavaScript now. For example, in Screaming Frog you can head to Configuration > Spider > Rendering, select “JavaScript” from the dropdown, and hit save. DeepCrawl and Sitebulb both have this feature as well.

From here you can input your domain/URL and see the rendered page/code once your tool of choice has completed the crawl.

Example:

When attempting to answer this question, my preference is to start by inputting the domain into Google’s mobile friendliness tool, copying the source code, and searching for important on-page elements (think title tag, <h1>, body copy, etc.) It’s also helpful to use a tool like Diff Checker to compare the rendered HTML with the original HTML (Screaming Frog also has a function that lets you do this side by side).

For our example, here is what the output of the mobile friendliness tool shows us.

After a few searches, it becomes clear that important on-page elements are missing here.

We also did the second test and confirmed that Google hasn’t indexed the body content found on this page.

The implication at this point is that Googlebot is not seeing our content the way we want it to, which is a problem.

Let’s jump ahead and see what we can recommend the client.

Question 3: If we’re confident Googlebot isn’t seeing our content properly, what should we recommend?

Now that we know the domain is using JavaScript to load in crucial content, and that Googlebot is most likely not seeing that content, the final step is to recommend an ideal solution to the client. Key word: recommend, not implement. It’s 100% our job to flag the issue to our client, explain why it’s important (as well as the possible implications), and highlight an ideal solution. It is 100% not our job to try to do the developer’s job of figuring out an ideal solution with their unique stack/resources/etc.

How do we do that?

You want server-side rendering

The main reason Google is having trouble seeing Sitecore’s landing page right now is that the page asks the user (us, Googlebot) to do the heavy work of loading the JavaScript. In other words, they’re using client-side JavaScript.

Googlebot is literally landing on the page, trying to execute the JavaScript as best it can, and then needing to leave before it has a chance to see any content.

The fix here is to instead have Sitecore’s landing page load on their server. In other words, we want to take the heavy lifting off of Googlebot, and put it on Sitecore’s servers. This will ensure that when Googlebot comes to the page, it doesn’t have to do any heavy lifting and instead can crawl the rendered HTML.

In this scenario, Googlebot lands on the page and already sees the HTML (and all the content).
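To make the contrast concrete, here’s a minimal sketch of server-side rendering using Node and Express (the content lookup is a stand-in for whatever data source actually powers the page):

    // In a client-side setup, the server returns an empty shell like
    //   <div id="app"></div><script src="bundle.js"></script>
    // and the visitor (or Googlebot) must execute JavaScript before any
    // content exists. With server-side rendering, the server assembles the
    // full HTML first, so crawlers see the content in the initial response.
    const express = require('express');
    const app = express();

    // Stand-in for a real data lookup (database, CMS, API, etc.)
    const loadPageContent = () => ({
      title: 'Event Landing Page',
      heading: 'Welcome to the Symposium',
      body: 'All of this copy is present in the very first HTML response.',
    });

    app.get('/', (req, res) => {
      const content = loadPageContent();
      res.send(`<!DOCTYPE html><html><head><title>${content.title}</title></head>` +
        `<body><h1>${content.heading}</h1><p>${content.body}</p></body></html>`);
    });

    app.listen(3000);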

There are more specific options (like isomorphic setups)

This is where it gets to be a bit in the weeds, but there are hybrid solutions. The best one at the moment is called isomorphic.

In this model, we're asking the client to load the first request on their server, and then any future requests are made client-side.

So Googlebot comes to the page, the client’s server has already executed the initial JavaScript needed for the page, sends the rendered HTML down to the browser, and anything after that is done on the client-side.
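As a rough sketch of what that looks like in practice, assuming a React-based stack (the component here is a toy example):

    // Shared component, used on both the server and in the browser bundle
    const React = require('react');
    const App = () =>
      React.createElement('h1', null, 'Rendered on the server, hydrated on the client');

    // Server: render the first request to an HTML string and embed it in the
    // page shell (e.g., inside <div id="root">) before responding.
    const ReactDOMServer = require('react-dom/server');
    const html = ReactDOMServer.renderToString(React.createElement(App));

    // Client (in the browser bundle): attach to the server-rendered HTML
    // instead of re-rendering from scratch; everything after this first
    // request runs client-side.
    //   ReactDOM.hydrate(React.createElement(App), document.getElementById('root'));

This way, Googlebot gets fully rendered HTML on the first request, while returning visitors get the responsiveness of a client-side app afterward.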

If you’re looking to recommend this as a solution, please read this post from the Airbnb team, which covers isomorphic setups in detail.

AJAX crawling = no go

I won’t go into details on this, but just know that Google’s previous AJAX crawling solution for JavaScript has since been discontinued and will eventually not work. We shouldn’t be recommending this method.

(However, I am interested to hear any case studies from anyone who has implemented this solution recently. How has Google responded? Also, here’s a great write-up on this from my colleague Rob.)

Summary

At the risk of severely oversimplifying, here's what you need to do in order to start working with JavaScript and SEO in 2018:

  1. Know when/where your client’s domain uses client-side JavaScript to load in on-page content or links.
    1. Ask the developers.
    2. Turn off JavaScript and do some manual testing by page template.
    3. Crawl using a JavaScript crawler.
  2. Check to see if Googlebot is seeing content the way we intend it to.
    1. Google’s mobile friendliness checker.
    2. Doing a site:search for visible content on the page.
    3. Crawl using a JavaScript crawler.
  3. Give an ideal recommendation to client.
    1. Server-side rendering.
    2. Hybrid solutions (isomorphic).
    3. Not AJAX crawling.

Further resources

I’m really interested to hear about any of your experiences with JavaScript and SEO. What are some examples of things that have worked well for you? What about things that haven’t worked so well? If you’ve implemented an isomorphic setup, I’m curious to hear how that’s impacted how Googlebot sees your site.





How to Diagnose Your SEO Client’s Search Maturity

Posted by HeatherPhysioc

One of the biggest mistakes I see (and am guilty of making) is assuming a client is knowledgeable, bought-in, and motivated to execute search work simply because they agreed to pay us to do it. We start trucking full-speed ahead, dumping recommendations in their laps, and are surprised when the work doesn’t get implemented.

We put the cart before the horse. It’s easy to forget that clients start at different points of maturity and knowledge levels about search, and even clients with advanced knowledge may have organizational challenges that create barriers to implementing the work. Identifying where your client falls on a maturity curve can help you better tailor communication and recommendations to meet them where they are, and increase the likelihood that your work will be implemented.

How mature is your client?

No, not emotional maturity. Search practice maturity. This article will present a search maturity model, and provide guidance on how to diagnose where your client falls on that maturity spectrum.

This is where maturity models can help. Originally developed for the Department of Defense, and later popularized by Six Sigma methodologies, maturity models are designed to measure the ability of an organization to continuously improve in a practice. They help you diagnose the current maturity of the business in a certain area, and help identify where to focus efforts to evolve to the next stage on the maturity curve. It’s a powerful tool for meeting the client where they are, and understanding how to move forward together with them.

There are a number of different maturity models you can research online that use different language, but most maturity models follow a pattern something like this:

  • Stage 1 - Ad Hoc & Developing
  • Stage 2 - Reactive & Repeatable
  • Stage 3 - Strategic & Defined
  • Stage 4 - Managed & Measured
  • Stage 5 - Efficient & Optimizing

For search, we can think about a maturity model two ways.

One is the actual technical implementation of search best practices — is the client implementing exceptional, advanced SEO, just the basics, nothing at all, or even operating counterproductively? This can help you figure out what kinds of projects make the most sense to activate.

The second way is the organizational maturity around search engine optimization as a marketing program. Is the client aligned to the importance of organic search, allocating budget and personnel appropriately, and systematically integrating search into marketing efforts? This can help you identify the most important institutional challenges to solve for that can otherwise block the implementation of your work.

Technical SEO capabilities maturity

First, let’s dive into a maturity model for search knowledge and capabilities.

SEO capabilities criteria

We measure an organization on several important criteria that contribute to the success of SEO:

  • Collaboration - how well relevant stakeholders integrate and collaborate to do the best work possible, including inside the organization, and between the organization and the service providers.
  • Mobility - how mobile-friendly and optimized the brand is.
  • Technical - how consistently foundational technical best practices are implemented and maintained.
  • Content - how integrated organic search is into the digital content marketing practice and process.
  • On-page - how limited or extensive on-page optimization is for the brand’s content.
  • Off-page - the breadth and depth of the brand’s off-site optimization, including link-building, local listings, social profiles and other non-site assets.
  • New technology - the appetite for and adoption of new technology that impacts search, such as voice search, AMP, even structured data.
  • Analytics - how data-centric the organization is, ranging from not managed and measured at all, to rearview mirror performance reporting, to entirely data-driven in search decision-making.

Search Capabilities Score Card


SEO capabilities maturity stages

We assign each of the aforementioned criteria to one of these stages:

  • Stage 0 (Counterproductive) - The client is engaging in harmful or damaging SEO practices.
  • Stage 1 (Nonexistent) - There is no discernible SEO strategy or tactical implementation, and search is an all-new program for the client.
  • Stage 2 (Tactical) - The client may be doing some basic SEO best practices, but it tends to be ad hoc inclusion with little structure or pre-planning. The skills and the work meet minimum industry standards, but work is fairly basic and perhaps not cohesive.
  • Stage 3 (Strategic) - The client is aligned to the value of SEO, and makes an effort to dedicate resources to implementing best practices and staying current, as well as bake it into key initiatives. Search implementation is more cohesive and strategic.
  • Stage 4 (Practice) - Inclusion of SEO is an expectation for most of the client’s marketing initiatives, if not mandatory. They are not only implementing basic best practices but actively testing and iterating new techniques to improve their search presence. They use performance of past initiatives to drive next steps.
  • Stage 5 (Culture) - At this stage, clients are operating as if SEO is part of their marketing DNA. They have resources and processes in place, and they are knowledgeable and committed to learning more, their processes are continually reviewed and optimized, and their SEO program is evolving as the industry evolves. They are seeking cutting-edge new SEO opportunities to test.

Search Capabilities Maturity Model


While this maturity model has been peer reviewed by a number of respected SEO peers in the industry (special thanks to Kim Jones at Seer Interactive, Stephanie Briggs at Briggsby, John Doherty at Credo, Dan Shure at Evolving SEO, and Blake Denman at Rickety Roo for your time and expertise), it is a fluid, living document designed to evolve as our industry does. If necessary, evolve this to your own reality as well.

You can download a Google Sheets copy of this maturity model here to begin using it with your client.


Why Stage 0?

In this search capabilities maturity model, I added an unconventional “Stage 0 - Counterproductive,” because organic search is unique: an organization can do real damage and start at a deficit, not just at a baseline of zero.

In a scenario like this, the client has no collaboration inside the company or with the partner agency to do smart search work. Content may be thin, weak, duplicative, spun, or over-optimized. Perhaps their mobile experience is nonexistent or very poor. Maybe they’re even engaging in black hat SEO practices, and they have link-related or other penalties.

Choosing projects based on a client’s capabilities maturity

For a client that is starting on the lower end of the maturity scale, you may not recommend starting with advanced work like AMP and visual search technology, or even detailed Schema markup or extensive targeted link-building campaigns. You may have to start with the basics like securing the site, cleaning up information architecture, and fixing title tags and meta descriptions.

For a client that is starting on the higher end of the maturity scale, you wouldn’t want to waste their time recommending the basics — they’ve probably already done them. You're better off finding new and innovative opportunities to do great search work they haven’t already mastered.

But we’re just getting started...

Technical capabilities and knowledge only begin to scratch the surface with clients. They start to solve for what you should implement, but don’t touch why it’s so hard to get your work implemented. The real problems tend to be a lot squishier, and aren’t as simple as checking some SEO best-practice boxes.

How mature is your client’s search practice?

The real challenges to implementation tend to be organizational, people, integration, and process problems. Conducting a search maturity assessment with your client can be eye-opening as to what needs to be solved internally before great search work can be implemented and the client can start reaping the rewards. Pair this with the technical capabilities maturity model above, and you have a powerhouse of knowledge and tools to help your client.

Before we dig in, I want to note one important caveat: While this maturity model focuses heavily on organizational adoption and process, I don’t want to suggest that process and procedure are substitutes for using your actual brain. You still have to think critically and make hard choices when you execute a best-in-class search program, and often that requires solving all-new problems that didn’t exist before and therefore don’t have a formal process.

Search practice maturity criteria

We measure an organization on several important criteria that contribute to the success of SEO:

  • Process, policy, or procedure - Do documented, repeatable processes for inclusion of organic search exist, and are they continually improving? Is it an organizational policy to include organic search in marketing efforts? This can mean that the process of including organic search in marketing initiatives is defined as a clear series of actions or steps taken, including both developing organic search strategy and implementing SEO tactics.
  • Personnel resources & integration - Does the necessary talent exist at the organization or within the service provider’s scope? Personnel resources may include SEO professionals, as well as support staff such as developers, data analysts, and copywriters necessary to implement organic search successfully. Active resources may work independently in a disjointed manner or collaboratively in an integrated manner.
  • Knowledge & learning - Because search is a constantly evolving field, is the organization knowledgeable about search and committed to continuously learning? Information can include existing knowledge, past experience, or training in organic search strategy and tactics. It can also include a commitment to learning more, possibly through willingness to undertake trainings, attendance of conferences, regular consumption of learning materials, or staying current in industry news and trends.
  • Means, capacity, & capabilities - Does the organization budget appropriately for and prioritize the organic search program? Means, capacity, and capabilities can include being scoped into a client contract, adequate budget being allocated to the work, adequate human resources being allocated to the work, the capacity to complete the work when measured against competing demands, and the prioritization of search work alongside competing demands.
  • Planning & preparation - Is organic search aligned to business goals, brand goals, and/or campaign goals? Is organic search proactively planned, reactive, or not included at all? This measure evaluates how frequently organic search efforts are included in marketing efforts for a brand. It also measures how frequently the work is included proactively and pre-planned, as opposed to reactively as an afterthought. Work may be aligned to or disconnected from the "big picture."

Organizational search maturity

Click the image to see the full-size version.

Search practice stages of maturity

Stage 1 - Initial & ad hoc

At this stage, the organization’s search practice may be nonexistent, unstable, or uncontrolled. There may be rare and small SEO efforts, but they are entirely ad hoc and inconsistent, retrofitted to the work after the fact at best. They tend to lack any discernible goal orientation. If SEO exists, it is disconnected from larger goals and not integrated with any other practices across the organization. The organization may be just beginning its search practice for the first time.

Stage 2 - Repeatable but reactive

These organizations are at least doing some search basics, though there is no rigorous use or enforcement of it. It is very reactive and in-the-moment while projects are being implemented; it is rarely pre-planned and often SEO is applied as an afterthought. They are executing only in the present or when it’s too late to do the highest caliber search work, but they are making an effort. SEO efforts may occasionally be going after goals, but it is unlikely to be tied to larger business goals. (Most of my client relationships have started here.)

Stage 3 - Defined & understood

These organizations have started to document their processes and are satisfactorily knowledgeable and competent in search. They have minimum standards for search best practices, and process is emerging. Many people inside and outside the organization understand that search is important and are taking steps to integrate. There is a clear search strategy that aligns to organizational goals and processes. Proactive search preparation and planning happens prior to activating projects.

Stage 4 - Managed & capable

These organizations have proactive, predictable implementation of search work. They have quality-focused rules for products and processes, and can quickly detect and correct missteps. They have clearly defined processes for integration, implementation and oversight, but are flexible enough to adapt to a range of conditions without sacrificing quality. These organizations consider search part of their “way of life.”

Stage 5 - Efficient & optimizing

Organizations at this stage have a strong mastery of search and implement it efficiently as a matter of policy. They have cross-organizational integration and proactively work to strengthen their search performance. They are always improving the process through incremental or innovative change, reviewing and analyzing their process and implementation to keep optimizing. These organizations could be considered market-leading or innovative.

Scorecard exercise

Click the image to see the full-size version.

You are here

Before you can know how to get where you want to go, you need to know where you are. It's important to understand where the organization stands, and then where they need to be in the future. Going through the quantitative exercise of diagnosing their maturity can help everyone align to where to start.

You can use these scorecards to assess factors like leadership alignment to the value of search, employee availability and involvement, knowledge and training, process and standardization, their culture (or lack thereof) of data-driven problem-solving and continuous improvement, and even budget.

A collaborative exercise

This should be a deeper exercise than just punching numbers into a spreadsheet, and it certainly shouldn’t be a one-sided assessment from you as an outsider. It is much more valuable to ask several relevant people at multiple levels across the client organization to participate in this exercise, and the assessment becomes much richer if you take the time to talk to people at various points in the process.

How to use the scorecard & diagnose maturity

Once you download the scorecards, follow these steps to begin the maturity assessment process.

  1. Client-side distribution - Distribute surveys to relevant stakeholders on the client's internal team. Ideally, these individuals serve at a variety of levels at the company and occupy a mix of roles relevant to the organic search practice. These could include CEO, CMO, Marketing VPs and directors, digital marketing coordinators, and in-house SEOs.
  2. Agency-side distribution - Distribute surveys to relevant stakeholders on the agency team. Ideally, these individuals serve at a variety of levels at the agency and occupy a mix of roles relevant to the organic search practice. These could include digital marketing coordinators, client engagement specialists, analysts, digital copywriters, or SEO practitioners.
  3. Assign a level of maturity to each criterion - Each survey participant can simply mark one "X" per category row in the column that most accurately reflects their perception of the brand organization as it pertains to organic search. (For example, if the survey respondent feels that SEO process and procedure are nonexistent based on the description, they can mark an “X” in the “Initial/Ad Hoc” column. Alternatively, if they feel the organization is extraordinarily advanced and efficient in its processes, they may mark the “X” in the “Efficient & Optimizing” column.)
  4. Collect the surveys - Assign a point value of 1, 2, 3, 4, or 5 to the responses from left to right in the scorecard. Average the points to get a final score for each criterion. (For example, if five client stakeholders score their SEO process and procedure as 3, 4, 2, 3, and 3 respectively, the average score for that criterion is 3. A quick scripted version of this averaging step is sketched just after this list.)
  5. Comparing client to agency perception - You may also choose to ask survey respondents to denote whether they are client-side or agency-side so you can look at the data both in aggregate, and by client and agency separately, to determine if there is alignment or disagreement on where the brand falls on the maturity curve. This can be great material for discussion with the client that can open up conversations about why those differences in perception exist.
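If it helps to operationalize step 4, here is a minimal sketch in Python of the averaging described above. The criterion names and scores are placeholders drawn from the worked example; adapt them to your own scorecard.

```python
# Minimal sketch: average scorecard responses per criterion.
# Points follow the 1-5 scale described above, where
# 1 = "Initial/Ad Hoc" and 5 = "Efficient & Optimizing".
from statistics import mean

# Placeholder responses: five stakeholders scoring three criteria.
responses = {
    "Process, policy, or procedure": [3, 4, 2, 3, 3],
    "Personnel resources & integration": [2, 3, 2, 2, 3],
    "Knowledge & learning": [4, 4, 3, 4, 5],
}

for criterion, scores in responses.items():
    print(f"{criterion}: {mean(scores):.1f}")
# "Process, policy, or procedure" averages 3.0, matching the example above.
```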

Screenshot of scorecard

To get your own scorecard, click the image and make a copy of the Google Sheet.

Choosing where to start

The goal is to identify together where to start working. This means finding the strengths to capitalize upon, areas of acceptability that can be nudged to a strength with a little work, weaknesses to improve upon, agreeing on areas to focus, and finally, how to get started tackling the first change together.

For a client that is starting on the low end of the maturity scale, it is unrealistic to expect that they have connected all the dots between important stakeholders, that they have a clearly defined and repeatable process, and that their search program is a well-oiled machine. If you don’t work together to solve the underlying problems like knowledge or adequate personnel resources first, you will struggle to get buy-in for the work or the resources to get it done, so it doesn’t matter what projects you recommend.

For a client that is advanced in a few areas, say process and planning, but weaker in others like knowledge and capacity, that might suggest you need to focus efforts on an education campaign to help the client prioritize the work and fit it into a busy queue.

For a client that is already advanced across the board, your role instead may be to keep the machine running while also helping them spot minor areas of improvement so they can keep iterating and perfecting the process. This client might also be ready for more advanced search strategies and tactical recommendations, or perhaps more robust integrations across additional disciplines.

One foot in front of the other

It’s rare that we live in a world of radical change where we overhaul everything en masse and see epic change overnight. We tweak, test, learn, and iterate. A maturity model is a continuum, and brands must evolve from one step to the next. Skipping levels is not an option. Some may also call this a “crawl, walk, run” approach.

Your goal as their trusted search advisor is not to help them leap from Stage 2 to Stage 5. Accomplishing that trajectory and speed of growth is exceedingly difficult and rare. Instead, focus your efforts on how the client can get to the next stage over the next 12 months. As they progress up the maturity model, the length of time it takes to unlock the next level may grow longer and longer.

Organizational Search Maturity

Click the image to see the full-size version.

Even when an organization reaches Stage 5, the work is not done. Master-level organizations continue to refine and optimize their processes and capabilities.

There is no finish line to search maturity

There is a French culinary phrase, “mise en place,” that refers to having everything — ingredients, tools, recipe — in its place to begin cooking most successfully. There are several key ingredients to any successful project implementation: buy-in, process, knowledge and skills, capacity, planning, and more.

As your client evolves up the maturity curve, you will see and feel a transition from thinking about these aspects only once a project is sliding off the rails, to including them reactively in real time, to anticipating them before every project and doing your due diligence to come prepared. Essentially, the client can move from not being able to spell “SEO” to making SEO a part of their DNA by moving up these maturity curves.

It is important to revisit the maturity model discussion periodically — I recommend doing so at least annually — to level-set and realign with the client. Conducting this exercise again can remind us to pause and reflect on all we have accomplished since the first scoring. It can also re-energize stakeholders to make even more progress in the upcoming year.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


How to Diagnose Your SEO Client's Search Maturity posted first on https://moz.com/blog

How to Rock MozCon 2018 Like the Marketing Superhero You Are

Posted by FeliciaCrawford

MozCon is just around the corner, meaning it’s time to share one of our absolute favorite posts of the year: the semi-official MozCon Guide to Seattle!

For those of you following the yellow brick road of I-5 into the heart of the Emerald City to spend three days absorbing all the SEO insight you can hold, this should help you plan both how you spend your time at the conference and outside of it. For those watching on the sidelines, scroll along and you’ll find a treasure trove of fun Seattle ideas and resources for future cons or trips you might make to this fair city by the sea.

And if you’ve been waffling on whether or not to take the plunge (to attend the conference — I wouldn’t recommend plunging into the Puget Sound; it’s quite cold), there may still be time:

Register for MozCon!

We’re now over 99% sold out, so act fast if you’ve got your heart set on MozCon 2018!

Official MozCon activities:

We know you’re here for a conference, but that’s only part of your day. After you’ve stuffed every inch of space in your brain with cutting-edge SEO insights, you’re going to want to give yourself a break — and that’s exactly why we’ve put together an assortment of events, activities, suggestions, and Seattle insider pro tips for how to fill your time outside of MozCon.

The MozCon kickoff party!

With day one behind you, we’re guessing you’ll be some mix of energized, inspired, and ready to relax just a bit. Celebrate the first day of MozCon at our Monday night kickoff party with a night of networking, custom cocktails, and good music at beautiful Block 41 in Belltown.

Meet with fellow marketers and the Mozzers that keep your SEO software shiny while you unwind after your first full day of conferencing. It’s our privilege and delight to bring our community together on this special night.

Our famously fun MozCon Bash

There ain’t no party like a MozCon party! We invite all MozCon attendees and Mozzers to join us on Wednesday night at the Garage Billiards in Seattle’s Capitol Hill neighborhood. From karaoke to photobooth, from billiards to shuffleboard, and peppered liberally with snacks and libations, the Wednesday Night MozCon Bash is designed to celebrate the completion of three days of jam-packed learning. This is the industry party of the year — you won’t want to miss it!

Birds of a Feather lunch tables

In between bites of the most delicious lunch you’ll find in the conference circuit, you’ll have the opportunity to connect with your fellow community members around the professional topics that matter most to you. Each day there will be seven-plus tables with different topics and facilitators; find one with a sign noting the topic and join the conversation to share advice, learn new tips and tricks, and discover new friends with similar interests.

Monday, July 9th

  • Google Analytics & Tag Management hosted by Ruth Burr Reedy at UpBuild
  • Content-Driven Link Building hosted by Paddy Moogan at Aira
  • Mobile App Growth hosted by Emily Grossman at Skyscanner
  • Content Marketing hosted by Casie Gillette at KoMarketing
  • Local SEO hosted by Mike Ramsey at Nifty Marketing
  • Podcasting hosted by Heidi Noonan-Mejicanos at Moz
  • Workflow Optimization hosted by Juan Parra at Accelo

Tuesday, July 10th

  • SEO A/B Testing hosted by Will Critchlow at Distilled
  • Community Speaker Connection hosted by Sha Menz at Moz
  • PPC + SEO Integration hosted by Jonathon Emery at Distilled
  • Meet Your Help Team hosted by Kristina Keyser at Moz
  • Agency Collaboration hosted by Yosef Silver at Fusion Inbound
  • Site Speed hosted by Jono Alderson at Yoast
  • Featured Snippets hosted by Rob Bucci at STAT Search Analytics
  • Voice Search hosted by Dr. Pete Meyers at Moz

Wednesday, July 11th

  • Content Marketing Q&A hosted by Kane Jamison at Content Harmony
  • Paid Search Marketing for High-Cost Keywords hosted by Trenton Greener at the Apex Training
  • SEO A/B Testing hosted by Will Critchlow at Distilled
  • Team Hiring, Retention, & Growth hosted by Heather Physioc at VML
  • Local Search hosted by Darren Shaw at Whitespark
  • Machine Learning & Advanced SEO hosted by Britney Muller at Moz
  • Reporting Q&A hosted by Dana DiTomaso at Kick Point

The delight is in the details

MozCon is literally brimming with things to do and ways to support our attendees when they need it. Aside from our hosted events and three days’ worth of talks, we’ve got things to fill in the cracks and make sure your MozCon experience is everything you’ve ever wanted from a conference.

Photobooth with Roger: Admit it — you see that cute, googly-eyed robot face and you just want to hug it forever. At MozCon, you can do just that — and memorialize the moment with a picture at the photobooth! Roger’s a busy bot, but his photobooth schedule will be posted so you can plan your hugs accordingly.

Ping pong play sesh: Don your sweat bands and knee-high socks and keep your paddle arm limber! During breaks, we’ll have ping pong tables available to burn some excess energy and invite a little casual competition.

The world map of MozCon: Ever play pin the tail on the donkey? Well, this is sort of like that, but the donkey is a world map and (thankfully) there’s no blindfold. You’ll place a pin from wherever in the world you traveled from. It’s amazing to see how far some folks come for the conference!

Local snacks galore: Starbucks, Piroshky Piroshky, Ellenos Yogurt, and Top Pot Donuts will happily make themselves acquainted with your tastebuds! Carefully chosen from local Seattle businesses, our snacks will definitely please your local palate and, if past feedback is to be believed, possibly tempt you to move here.

Stay charged: Pining for power? Panicking at that battery level of 15% at 10am? Find our charging sofas to fuel up your mobile device.

MozCon is for everyone

We want marketers of all stripes to feel comfortable and supported at our conference. Being “for everyone” means we’re working hard to make MozCon more accessible in many different ways. The Washington State Convention Center is fully ADA compliant, as are our other networking event venues. But it’s important for us to get even better, and we welcome your feedback and ideas.

Here are a few of the ways we’ve worked to make MozCon a welcoming event for everyone:

  • A ramp on the stage
  • Live closed captioning of the main event
  • Walkways for traffic flow
  • Menus featuring options or special meals (that actually taste good) for dietary restrictions
  • A nursing room
  • Gender-neutral bathroom options
  • Lots of signage
  • T-shirts that fit different body types
  • Visible staff to help make everyone’s experience the best possible
  • A proud partnership with 50/50 Pledge, furthering our commitment to better representation of women on stage
  • Strict enforcement of our Code of Conduct and TAGFEE

Bespoke city exploration — Get to know Seattle!

In years past, Tuesday nights were reserved for our MozCon Ignite event, where brave folks from myriad backgrounds would share stories in lightning-fast Ignite-style talks of five minutes each — the only rule being that they can’t be about marketing!

While MozCon Ignite has always been a much-loved and highly anticipated event, we’ve also listened closely to your feedback about wanting more time to network on your own, plan client dinners, go on outings with your team, and in general just catch your breath — without missing a thing. That’s why this year, we’re folding Ignite into the official MozCon schedule so everyone can benefit from the tales shared and enjoy a fun five-minute break between SEO talks.

Wondering what topics will be covered at Ignite this year?

  • The Ninja Kit to NOT Get Sick While Traveling by Dana Weber at Seer Interactive
  • My Everest: How 10 Years of Chasing Tornadoes Came Down to One Moment by Tom Romero at Uncommon Goods
  • Baseball Made Me a Better Engineer by Tom Layson at Moz
  • Trailblazer: How Reading One Book Changed My Life for Good by Lina Soliman at OSUWMC
  • Drag Queen Warlocks, Skateboarding Sorcerers, & Other Folks by Jay Ricciardi at Tableau
  • Voice Dialogue Therapy: Listening to the Voices Inside Your Head by Kayla Walker at Distilled

We’re opening up Tuesday night as your chance to explore the Emerald City. We’ll have a travel team onsite at the conference on Tuesday to help you and your friends plan an exciting Seattle adventure. Perhaps you’ve met a fantastic group of like-minded folks at a Birds of a Feather lunch table and would love to talk featured snippets over fresh fish n’ chips at Pike Place Market. Maybe you’ve always wanted to catch the view at the top of the Space Needle (recently renovated and reopened to provide even better views!). Or perhaps a quiet sunset picnic overlooking the water at Gasworks Park seems like the perfect way to relax after a long day of learning and networking. Whatever floats your boat, we encourage you to plan local meetups, invite your newfound and long-standing friends, and forge a few irreplaceable Seattle memories.

Wondering what there is to do, drink, eat, and see in Seattle?

Well, who better to ask than us Seattleites? Using tons of real suggestions from real Mozzers, we’ve put together a Google Map you can use to guide your exploration outside the confines of the event venue — check it out below!

Seattle’s got more to offer than we can name — get out there and discover the renowned Emerald City quirks and quaintness we’re famous for!

Travel options:

Seattle’s got a pretty solid transit system that can get you where you need to go, whether you’re traveling by bus or train. The city also has its share of rideshare services, as well as taxis, bikes, ferries, and water taxis, depending on where you're headed.

Public transportation

  • King County Metro Trip Planner: Traverse the city by bus! You can also download an app to get real-time bus info (I like the One Bus Away app, developed here in Seattle by University of Washington grads)
  • Light Rail: Connecting the north end to the south, the Light Rail can move you across Seattle quickly (and even drop you off right at SeaTac for your flight home!)
  • Water taxis and ferries can float you right across the Sound (and offer a lovely view while they’re at it)
  • A Transit Go ticket or ORCA card will happily power your public transit excursions
  • Bikeshare programs: As you wander the city, you may notice brightly colored bicycles patiently awaiting riders on the sidewalks. That rider could be you! If you’re feeling athletic, take advantage of the city’s bikeshare programs and see Seattle on two wheels.

Rideshares and taxis

  • Uber & Lyft can get you where you need to go
  • Moovn is a Seattle startup rideshare company
  • Two taxi services, Seattle Yellow Cab and Orange Cab, allow for online booking via their apps (or you can call ‘em the old-fashioned way!)

Are you ready to rock MozCon?!

If you’re already MozCon-bound come this July, make sure to download the app (must be on mobile) and join our Facebook group to maximize your networking opportunities, get to know fellow attendees, and stay up-to-date on conference news and activities.

If you’re thinking about grabbing a ticket last-minute, we still have a few left:

Grab a ticket while you can

And whether you’re going to be large, in charge, and live at the conference or just following along at home and eagerly awaiting the video release, follow along with the #MozCon hashtag on Twitter to indulge in the juicy tidbits and takeaways attendees will undoubtedly share.




How to Rock MozCon 2018 Like the Marketing Superhero You Are posted first on https://moz.com/blog

The Goal-Based Approach to Domain Selection – Whiteboard Friday

Posted by KameronJenkins

Choosing a domain is a big deal, and there's a lot that goes into it. Even with everything that goes into determining your URL, there are two essential questions to ask that ought to guide your decision-making: what are my goals, and what's best for my users? In today's edition of Whiteboard Friday, we're beyond delighted to welcome Kameron Jenkins, our SEO Wordsmith, to the show to teach us all about how to select a domain that aligns with and supports your business goals.

Goal-based Approach to Domain Selection

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, everyone. Welcome to this week's edition of Whiteboard Friday. My name is Kameron Jenkins, and I am the SEO Wordsmith here at Moz. Today we're going to be talking about a goals-based approach to choosing a domain type or a domain selection.

There are a lot of questions in the SEO industry right now about domains. I used to work at an agency, and a lot of times our clients would ask us, "Should I do a microsite? Should I do a subdomain? Should I consolidate all my sites?" There is a lot of confusion about the SEO impact of all of these different types of domain choices, and there certainly are SEO ramifications for each type, but today we're going to take a slightly different approach and focus on goals first. What are your business goals? What are your goals for your website? What are your goals for your users? Then we'll choose a domain that matches those goals. By the end, instead of "what's better for SEO," we'll hopefully have answered, "What best suits my unique goals?"

Before we start...define!

Before we start, let's launch into some quick definitions just so we all kind of know what we're talking about and why all the different terminology we're going to be using.

Main domain

Main domain: this is often called a root domain. That’s anything that precedes your dot-com or other TLD. So in YourSite.com, it’s the part that lives right before the ".com".

Subdomain

A subdomain is a third-level domain name for your domain. So, for example, Blog.YourSite.com would be a subdomain.

Subfolder

A subfolder (some people call this a subdirectory) is a folder trailing the dot-com. An example would be YourSite.com/blog. That /blog is the folder, so that’s a subfolder.

Microsite

A microsite, there's a lot of different terminology around this type of domain selection, but it's just a completely separate domain from your main domain. The focus is usually a little bit more niche than the topic of your main website.

That would be YourSite1.com and YourSite2.com. They're two totally, completely separate domains.
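As a quick aside from the transcript: the distinctions above are easy to check mechanically. Here is a rough Python sketch, with the simplifying assumption that the TLD is a single label (a real implementation would need a TLD-aware library such as tldextract); a microsite would simply be a different root domain entirely.

```python
# Rough sketch: label a URL as main domain, subdomain, or subfolder.
# Assumes a single-label TLD like ".com"; multi-label TLDs ("co.uk")
# would need a TLD-aware library such as tldextract.
from urllib.parse import urlparse

def describe(url: str) -> str:
    parts = urlparse(url)
    labels = parts.hostname.split(".")
    if len(labels) > 2:              # e.g. blog.yoursite.com
        return "subdomain"
    if parts.path.strip("/"):        # e.g. yoursite.com/blog
        return "subfolder on the main domain"
    return "main (root) domain"

for url in ["https://yoursite.com",
            "https://blog.yoursite.com",
            "https://yoursite.com/blog"]:
    print(url, "->", describe(url))
```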

Business goals that can impact domain structure

Next we're going to start talking about business goals that can impact domain structure. There are a lot of different business goals. You want to grow revenue. You want more customers. But we're specifically here going to be talking about the types of business goals that can impact domain selection.

1. Expand locations/products/services

The first one here that we're going to talk about is the business wants to expand their locations, their products, or their services. They want to grow. They want to expand in some way. An example I like to use is say this clothing store has two locations. They have two storefronts. They have one in Dallas and one in Fort Worth.

So they launch two websites — CoolClothesDallas.com and CoolClothesFortWorth.com. But the problem with that is if you want to grow, you're going to open stores in Austin, Houston, etc. You've set the precedent that you're going to have a different domain for every single location, which is not really future-proof. It's hard to scale. Every time you launch a brand-new website, that's a lot of work to launch it, a lot of work to maintain it.

So if you plan on growing and getting into new locations or products or services or whatever it might be, just make sure you select a domain structure that's going to accommodate that. In particular, I would say a main root domain with subfolders for the different products or services you have is probably the best bet for that situation. So you have YourSite.com/Product1, /Product2, and you talk about it in that sense because it's all related. It's all the same topic. It's more future-proof. It's easier to add a page than it is to launch a whole new domain.

2. Set apart distinct facets of business

So another business goal that can affect your domain structure would be that the business wants to set apart distinct facets within their business. An example I found that was actually kind of helpful is Apple.com, which has a subdomain at Trailers.Apple.com.

Now, I'm not Apple. I don't really know exactly why they do this, but I have to imagine that it was because there are very different intents and uses for those different types of content that live on the subdomain versus the main site. So Trailers has movie trailers, lots of different content, and Apple.com is talking more about their consumer products, more about that type of thing.

So the audiences are slightly different. The intents are very different. In that situation, if you have a situation like that and that matches what your business is encountering, you want to set it apart, it has a different audience, you might want to consider a subdomain or maybe even a microsite. Just keep in mind that it takes effort to maintain each domain that you launch.

So make sure you have the resources to do this. You could, if you didn't have the resources, put it all on the main domain. But if you want a little bit more separation, the different aspects of your business are very disparate and you don't want them really associated on the same domain, you could separate it out with a subdomain or a microsite. Just, again, make sure that you have the resources to maintain it, because while both have equal ability to rank, it's the effort that increases with each new website you launch.

3. Differentiate uniquely branded sub-departments

Three, another goal is to differentiate uniquely branded sub-departments. There is a lot of this I've noticed in the healthcare space. So the sites that I've worked on, say they have Joe Smith Health, and this is the health system, the umbrella health system. Then within that you have Joe Smith Endocrinology.

Usually those types of situations they have completely different branding. They're in a different location. They reach a different audience, a different community. So in those situations I've seen that, especially healthcare, they usually have the resources to launch and maintain a completely different domain for that uniquely branded sub-department, and that might make sense.

Again, make sure you have the resources. But if it's very, very different, whether in branding or audience or intent, than the content that's on your main website, then I might consider separating them. Another example of this is sometimes you have a parent company and they own a lot of different companies, but that's about where the similarities stop.

They're only owned by the parent company. All the different subcompanies don't have anything to do with each other. I would probably say it's wisest to separate those into their own unique domains. They almost certainly have unique branding; they're totally different companies that just happen to be owned by the same parent. In those situations it might make sense, again, to separate them, but just know that they're not going to have any ranking benefit for each other because they're completely separate domains.

4. Temporary or seasonal campaigns

The fourth business goal we're going to talk about is a temporary or a seasonal campaign. This one is not as common, but I figured I would just mention it. Sometimes a business will want to run a conference or sponsor an event or get a lot of media attention around some initiative that's separate from what their business does or offers, and it's just more of an events-based, seasonal type of thing.

In those situations it might make sense to do a microsite that's completely branded for that event. It's not necessary, though. For example, Moz has MozCon, and that's located on the subfolder Moz.com/MozCon. You don't have to do a microsite, but it certainly is an option if you want to uniquely brand the event.

It can also be really good for press. I've noticed just in my experience, I don't know if this is widely common, but sometimes the press tends to just link to the homepage because that's what they know. They don't link to a specific page on your site. They don't know always where it's located. It's just easier to link to the main domain. If you want to build links specifically for this event that are really relevant, you might want to do a microsite or something like that.

Just make sure that when the event is over, you don't just let it float out there and die. Especially if you build links and attention around it, make sure you 301 that back to your main website as long as that makes sense. So temporary or seasonal campaigns, that could be the way to go — microsite or subfolder. You have some options there.
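For illustration only, here is a hedged sketch of how you might verify that a retired campaign microsite is 301ing back to the main site. The domain is made up, and the snippet assumes the third-party requests package is installed.

```python
# Hedged sketch: confirm a retired event microsite 301s to the main site.
# "youreventsite.com" is a placeholder; requires the "requests" package.
import requests

resp = requests.get("https://youreventsite.com/",
                    allow_redirects=False, timeout=10)

if resp.status_code == 301:
    print("Permanent redirect to:", resp.headers.get("Location"))
else:
    print(f"Expected a 301 but got {resp.status_code}; "
          "links to the old microsite may not be passing to the main site.")
```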

5. Test out a new agency or consultant

Then finally the last goal we're going to be talking about that could impact domain structure is testing out a new agency or consultant.

Now this one holds a special place in my heart, having worked at an agency for almost seven years before this. It's actually really common, and I can empathize with businesses in this situation. They are about to hand over the keys to their domain to a brand-new company. They don't quite know if they trust them yet.

This is especially concerning if a business has a really strong domain that they've built up over time. It can be really scary to just let someone take over your domain. In some cases I have encountered, the business goes, "Hey, we'd love to test you out. We think you're great. However, you can't touch the main domain. You have to do your SEO somewhere else." That's okay, but we're kind of handcuffed in that situation.

You would have to, at that point, use a subdomain or a microsite, just a completely different website. If you can't touch the main domain, you don't really have many other options than that. You just have to launch on a brand-new thing. In that situation, it's a little frustrating, actually quite frustrating for SEOs because they're starting from nothing.

They have no authority inherited from that main domain. They're starting from square one. They have to build that up over time. While that's possible, just know that it kind of sets you back. You're way behind the starting line in that situation with using a subdomain or a microsite, not being able to touch that main domain.

If you find yourself in this situation and you can negotiate, just make sure that the company hiring you is giving you enough time to prove the value of SEO. It's tried-and-true for a reason: SEO is a marathon, not a sprint. It's not pay-to-play like paid advertising is.

Enough time is kind of dependent on how competitive the goals are. If they're asking you, "Hey, I'm going to test you out for this really, really competitive, high-volume keyword or group of keywords and you only have one month to do it," you're kind of set up to fail in that situation. Either ask them for more time, or I probably wouldn't take that job. So testing out a new agency or consultant is definitely something that can impact your ability to launch on one domain type versus another.

Pitfalls!

Now that we've talked about all of those, I'm just going to wrap up with some pitfalls. A lot of these are going to be repeat, but just as a way of review just watch out for these things.

Failing to future-proof

Like I said earlier, if you're planning on growing in the future, just make sure that your domain matches your future plans.

Exact-match domains

There's nothing inherently wrong with exact-match domains. It's just that you can't expect to launch a microsite with a bunch of business-relevant keywords in the domain, set it and forget it, and hope that the keywords in the domain alone are what's going to get it to rank. That doesn't work anymore; it hasn't worked for a while. You have to proactively add value to that microsite.

Maybe you've decided that that makes sense for your business. That's great. Just make sure that you put in the resources to make it valuable outside of just the keywords in the domain.

Over-fragmenting

One thing I like to say is, "Would you rather have 3 websites with 10 backlinks each, or 1 website with 30 backlinks?" That's just a way to illustrate that if you don't have the resources to equally dedicate to each of those domains or subdomains or microsites or whatever you decided to launch, it's not going to be as strong.

Usually what I see when I evaluate a customer or client's domain structure is one standout domain that has all of the content, all of the authority, all of the backlinks, while the other ones just kind of suffer; they're usually stronger together than they are apart. So while it is totally possible to do separate websites, just make sure that you don't fragment so much that you're spread too thin to do anything effective on the SEO front.

Ignoring user experience

Look at your websites from the eyes of your users. If someone is going to go to the search results page and Google search your business name, are they going to see five websites there? That's kind of confusing unless they're very differently branded, different intents. They'll probably be confused.

Like, "Is this where I go to contact your business? How about this? Is it this?" There are just a lot of different ways that can cause confusion, so just keep that in mind. Also if you have a website where you're addressing two completely different audiences within your website — if a consumer, for example, can be browsing blouses and then somehow end up accidentally on a section that's only for employees — that's a little confusing for user experience.

Make sure you either gate that or make it a subdomain or a microsite. Just separate them if that would be confusing for your main user base.

Set it and forget it

Like I said, I keep repeating this just because it's so, so important. Every type of domain has equal ability to rank. It really does.

It's just the effort that gets harder and harder with each new website. Just make sure that you don't just decide to do microsites and subdomains and then don't do anything with them. That can be a totally fine choice. Just make sure that you don't set it and forget it, that you actually have the resources and you have the ability to keep building those up.

Intent overlap between domains

The last one I'll talk about in the pitfall department is intent overlap between domains.

I actually see this one kind of a lot. Take a winery: they have Tastings.Winery.com or something like that. Their Tastings subdomain talks all about their wine tastings and their tasting room; it's very focused on that niche of their business. But then on Winery.com they also have extensive content about tastings. Well, you've got overlap there, and you're making yourself do more work than you have to.

I would choose one or the other and not both. Just make sure that there's no overlap there if you do choose to do separate domains, subdomains, microsites, that kind of thing. Make sure that there's no overlap and each of them has a distinct purpose.

Two important questions to focus on:

Now that we're to the end of this, I really want the takeaway to be these two questions. I think this will make domain selection a lot easier when you focus on these two questions.

What am I trying to accomplish? What are the goals? What am I trying to do? Just focus on that first. Then second of all, and probably most important, what is best for my users? So focus on your goals, focus on your users, and I think the domain selection process will be a lot easier. It's not easy by any means.

There are some very complicated situations, but I think, in the end, it's going to be a lot easier if you focus on your goals and your users. If you have any comments regarding domain selection that you think would be helpful for others to know, please share it in the comments below. That's it for this week's Whiteboard Friday, and come back next week for another one. Thanks everybody.

Video transcription by Speechpad.com




The Goal-Based Approach to Domain Selection - Whiteboard Friday posted first on https://moz.com/blog

Q&A: Lost Your Anonymous Google Reviews? The Scoop on Removal and Moving Forward

Posted by MiriamEllis

Did you recently notice a minor or major drop in your Google review count, and then realize that some of your actual reviews had gone missing, too? Read on to see if your experience of review removal was part of the action Google took in late May surrounding anonymous reviews.

Q: What happened?

A: As nearly as I can pinpoint it, Google began discounting reviews left by “A Google User” from total review counts around May 23, 2018. For a brief period, these anonymous reviews were still visible, but were then removed from display. I haven’t seen any official announcement about this, to date, and it remains unclear as to whether all reviews designated as being from “A Google User” have been removed, or whether some still remain. I haven’t been able to discover a single one since the update.

Q: How do I know if I was affected by this action?

A: If, prior to my estimated date, you had reviews that had been left by profiles marked “A Google User,” and these reviews are now gone, that’s the diagnostic of why your total review count has dropped.

Q: The reviews I’ve lost weren’t from “A Google User” profiles. What happened?

A: If you’ve lost reviews from non-anonymous profiles, it’s time to investigate other causes of removal. These could include:

  • Having paid for or incentivized reviews, either directly or via an unethical marketer
  • Reviews stemming from a review station/kiosk at your business
  • Getting too many reviews at once
  • URLs, prohibited language, or other objectionable content in the body of reviews
  • Reviewing yourself, or having employees (past or present) do so
  • Reviews left from your same IP (as in the case of free on-site Wi-Fi)
  • The use of review strategies/software that prohibit negative reviews or selectively solicit positive reviews
  • Any other violation of Google’s review guidelines
  • A Google bug, in which case, check the GMB forum for reports of similar review loss, and wait a few days to see if your reviews return; if not, you can take the time to post about your issue in the GMB forum, but chances are not good that removed reviews will be reinstated

Q: Is anonymous review removal a bug or a test?

A: One month later, these reviews remain absent. This is not a bug, and seems unlikely to be a test.

Q: Could my missing anonymous reviews come back?

A: Never say “never” with Google. From their inception, Google review counts have been wonky, and have been afflicted by various bugs. There have been cases in which reviews have vanished and reappeared. But, in this case, I don’t believe these types of reviews will return. This is most likely an action on Google’s part with the intention of improving their review corpus, which is, unfortunately, plagued with spam.

Q: What were the origins of “A Google User” reviews?

A: Reviews designated by this language came from a variety of scenarios, but are chiefly fallout from Google’s rollout of Google+ and then its subsequent detachment from local. As Mike Blumenthal explains:

As recently as 2016, Google required users to log in as G+ users to leave a review. When they transitioned away from +, they allowed users several choices as to whether to delete their reviews or to create a name. Many users did not make that transition. For the users that chose not to give their name and make that transition, Google identified them as “A Google User”… Also, certain devices like the old BlackBerrys could leave a review but not a name. Also, users left + and may have changed profiles at Google, abandoning their old profiles. Needless to say, there were many ways that these reviews became from “A Google User.”

Q: Is the removal of anonymous reviews a positive or negative thing? What’s Google trying to do here?

A: Whether this action has worked out well or poorly for you likely depends on the quality of the reviews you’ve lost. In some cases, the loss may have suddenly put you behind competitors, in terms of review count or rating. In others, the loss of anonymous negative reviews may have just resulted in your star rating improving — which would be great news!

As to Google’s intent with this action, my assumption is that it’s a step toward increasing transparency. Not their own transparency, but the accountability of the reviewing public. Google doesn’t really like to acknowledge it, but their review corpus is inundated with spam, some of it the product of global networks of bad actors who have made a business of leaving fake reviews. Personally, I welcome Google making any attempts to cope with this, but the removal of this specific type of anonymous review is definitely not an adequate solution to review spam when the livelihoods of real people are on the line.

Q: Does this Google update mean my business is now safe from anonymous reviews?

A: Unfortunately, no. While it does mean you’re unlikely to see reviews marked as being from “A Google User”, it does not in any way deter people from creating as many Google identities as they’d like to review your business. Consider:

  • Google’s review product has yet to reach a level of sophistication which could automatically flag reviews left by “Rocky Balboa” or “Whatever Whatever” as, perhaps, somewhat lacking in legitimacy.
  • Google’s product also doesn’t appear to suspect profiles created solely to leave one-time reviews, though this is a clear hallmark of many instances of spam
  • Google won’t remove text-less negative star ratings, despite owner requests
  • Google hasn’t been historically swayed to remove reviews on the basis of the owner claiming no records show that a negative reviewer was ever a customer

Q: Should Google’s removal of anonymous reviews alter my review strategy?

A: No, not really. I empathize with the business owners expressing frustration over the loss of reviews they were proud of and had worked hard to earn. I see actions like this as important signals to all local businesses: remember that you don’t own your Google reviews, and you don’t own your Google My Business listing/Knowledge Panel. Google owns those assets, and manages them in any way they deem best for Google.

In the local SEO industry, we are increasingly seeing the transformation of businesses from the status of empowered “website owner” to the shakier “Google tenant,” with more and more consumer actions taking place within Google’s interface. The May removal of reviews should be one more nudge to your local brand to:

  • Be sure you have an ongoing, guideline-compliant Google review acquisition campaign in place so that reviews that become filtered out can be replaced with fresh reviews
  • Take an active approach to monitoring your GMB reviews so that you become aware of changes quickly. Software like Moz Local can help with this, especially if you own or market large, multi-location enterprises. Even when no action can be taken in response to a new Google policy, awareness is always a competitive advantage.
  • Diversify your presence on review platforms beyond Google
  • Collect reviews and testimonials directly from your customers to be placed on your own website; don’t forget the Schema markup while you’re at it (a minimal markup sketch follows this list)
  • Diversify the ways in which you are cultivating positive consumer sentiment offline; word-of-mouth marketing, loyalty programs, and the development of real-world relationships with your customers are all things you directly control
  • Keep collecting those email addresses and, following the laws of your country, cultivate non-Google-dependent lines of communication with your customers
  • Invest heavily in hiring and training practices that empower staff to offer the finest possible experience to customers at the time of service — this is the very best way to ensure you are building a strong reputation both on and offline
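As a loose illustration of the Schema markup suggestion above, here is a minimal sketch that prints schema.org Review markup as JSON-LD. Every name and value is a placeholder, and you should check Google's current structured data guidelines before publishing anything like this.

```python
# Illustrative sketch: schema.org Review markup emitted as JSON-LD.
# All values below are placeholders, not real data.
import json

review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "LocalBusiness", "name": "Example Plumbing Co."},
    "author": {"@type": "Person", "name": "Jane Customer"},
    "reviewRating": {"@type": "Rating", "ratingValue": "5", "bestRating": "5"},
    "reviewBody": "Fast, friendly, and fixed the leak the same day.",
}

print('<script type="application/ld+json">')
print(json.dumps(review, indent=2))
print('</script>')
```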

Q: So, what should Google do next about review spam?

A: A Google rep once famously stated,

“The wiki nature of Google Maps expands upon Google’s steadfast commitment to open community.”

I’d welcome your opinions as to how Google should deal with review spam, as I find this a very hard question to answer. It may well be a case of trying to lock the barn door after the horse has bolted, and Google’s wiki mentality applied to real-world businesses is one with which our industry has contended for years.

You see, the trouble with Google’s local product is that it was never opt-in. Whether you list your business or not, it can end up in Google’s local business index, and that means you are open to reviews (positive, negative, and fallacious) on the most visible possible platform, like it or not. As I’m not seeing a way to walk this back, review spam should be Google’s problem to fix, and they are obliged to fix it if:

  • They are committed to their own earnings, based on the trust the public feels in their review corpus
  • They are committed to user experience, implementing necessary technology and human intervention to protect consumers from fake reviews
  • They want to stop treating the very businesses on whom their whole product is structured as unimportant in the scheme of things; companies going out of business due to review spam attacks really shouldn’t be viewed as acceptable collateral damage

Knowing that Alphabet has an estimated operating income of $7 billion for 2018, I believe Google could fund these safeguards:

  1. Take a bold step and resource human review mediators. Make this a new department within the local department. Google sends out lots of emails to businesses now. Let them all include clear contact options for reaching the review mediation department if the business experiences spam reviews. Put the department behind a wizard that walks the business owner through guidelines to determine if a review is truly spam, and if this process signals a “yes,” open a ticket and fix the issue. Don’t depend on volunteers in the GMB forum. Invest money in paid staff to maintain the quality of Google’s own product.
  2. If Google is committed to the review flagging process (which is iffy, at best), offer every business owner clear guidelines for flagging reviews within their own GMB dashboard, and then communicate about what is happening to the flagged reviews.
  3. Improve algorithmic detection of suspicious signals, like profiles with one-off reviews, the sudden influx of negative reviews and text-less ratings, global reviews within a single profile, and companies or profiles with a history of guideline violations. Hold the first few reviews left by any profile in a “sandbox,” à la Yelp. (A toy sketch of this kind of signal check follows this list.)
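To make suggestion 3 a little more concrete, here is a toy sketch, entirely my own and in no way Google's actual detection logic, that flags two of the suspicious signals named above. The thresholds are arbitrary placeholders.

```python
# Toy sketch: flag a couple of suspicious review signals.
# Arbitrary thresholds; purely illustrative, not Google's logic.
from collections import Counter

def suspicious_signals(reviews):
    """Each review is a dict: {"profile": str, "stars": int, "text": str}."""
    flags = []
    reviews_per_profile = Counter(r["profile"] for r in reviews)
    if any(count == 1 for count in reviews_per_profile.values()):
        flags.append("profiles with one-off reviews")
    textless_negative = [r for r in reviews
                         if r["stars"] <= 2 and not r["text"].strip()]
    if len(textless_negative) >= 3:
        flags.append("influx of text-less negative ratings")
    return flags

sample = [
    {"profile": "rocky.balboa", "stars": 1, "text": ""},
    {"profile": "whatever.w", "stars": 1, "text": ""},
    {"profile": "new.account", "stars": 1, "text": ""},
]
print(suspicious_signals(sample))  # both signals fire on this sample
```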

Now it’s your turn! Let’s look at Google’s removal of “A Google User” reviews as a first step in the right direction. If you had Google’s ear, what would you suggest they do next to combat review spam? I’d really like to know.




Q&A: Lost Your Anonymous Google Reviews? The Scoop on Removal and Moving Forward posted first on https://moz.com/blog

Moz’s Mid-Year Retrospective: Exciting Upgrades from the First Half of 2018

Posted by NeilCrist

Every year, we publish an overview of all the upgrades we’ve made to our tools and how those changes benefit our customers and Moz Community members. So far, 2018 has been a whirlwind of activity here at Moz — not only did we release a massive, long-awaited update to our link building tool, we've also been improving and updating systems and tools across the board to make your Moz experience even better. To that end, we’re sharing a mid-year retrospective to keep up with the incredible amount of progress we’ve made.

We receive a lot of amazing feedback from our customers on pain points they experience and improvements they’d like to see. Folks, we hear you.

We not only massively restructured some of our internal systems to provide you with better data, we also innovated new ways to display and report on that data, making the tools more accurate and more useful than ever before.

If you’ve been tasked with achieving organic success, we know your job isn’t easy. You need tools that get the job done, and done well. We think Moz delivered.

Check out our 2018 improvements so far:

Our new link index: Bigger, fresher, better than ever

Our link index underwent a major overhaul: it's now 20x larger and 30x fresher than it was previously. This new link index data has been made available via our Mozscape API, as well as integrated into many Moz Pro tools, including Campaigns, Keyword Explorer, the MozBar, and Fresh Web Explorer. But undoubtedly the largest and most-anticipated improvement the new link index allowed us to make was the launch of Link Explorer, which we rolled out at the end of April as a replacement for Open Site Explorer.

Link Explorer addresses and improves upon its predecessor by providing more data, fresher data, and better ways to visualize that data. Answering a long-standing feature request for OSE, Link Explorer includes historical metrics, and it also surfaces newly discovered and lost links:

Below are just a few of the many ways Link Explorer is providing some of the best link data available:

  • Link Explorer’s link index contains approximately 4.8 trillion URLs — that’s 20x larger than OSE and surpasses Ahrefs’ index (~3 trillion pages) and Majestic’s fresh index (~1 trillion pages).
  • Link Explorer is 30x fresher than OSE. All data updates every 24 hours.
  • We believe Link Explorer is unique in how accurately our link index represents the web, resulting in data quality you can trust.
  • Link Explorer has the closest robots.txt profile to Google among the three major link indexes, which means we get more of the links Google gets (a rough sketch of this idea follows this list).
  • We also improved Domain Authority, Page Authority, and Spam Score. The size and freshness of our index has allowed us to offer a more stable DA and PA score. Though it will still fluctuate as the index fluctuates (which has always been by design), it will not be as dramatic as it was in Open Site Explorer.

Explore your link profile

You can learn more about Link Explorer by reading Sarah Bird’s announcement, watching Rand’s Whiteboard Friday, or visiting our Link Explorer Help Guide. Even though it's still in beta, Link Explorer already blows away OSE's data quality, freshness, and capabilities. Look for steady improvements to Link Explorer as we continue to iterate on it and add more key features.

New-and-improved On-Page Grader

Moz’s On-Page Grader got a thorough and much-needed overhaul! Not only did we freshen up the interface with a new look and feel, but we also added new features and improved upon our data.

Inside the new On-Page Grader, you’ll find:

  • An updated metrics bar to show you Page Title, Meta Description, and the number of Keywords Found. No need to dig through source code!
  • An updated Optimization Score to align with the Page Optimization feature that’s inside Campaigns and in the MozBar. Instead of a letter grade (A–F), you now have Page Score (0–100) for a more precise measurement of page optimization performance.
  • On-page factors are now categorized so you can see what is hurting or helping your Page Score.
  • On-page factors are organized by importance so you can prioritize your efforts. Red indicates high importance, yellow indicates moderate importance, and blue indicates low importance.

On-Page Grader is a great way to take a quick look at how well a page is optimized for a specified keyword. Here’s how it works.

Input your page and the keyword you want that page to rank for…

… and On-Page Grader will return a list of suggestions for improving your on-site optimization.

Check it out!

Keyword ranking data now available for Canada, UK, and Australia

We're very excited to announce that, as of just last week, international data has been added to the Keywords by Site feature of Keyword Explorer! This will now allow Moz Pro customers to see which keywords they rank for and assess their visibility across millions of SERPs, now encompassing the US, Canada, the United Kingdom, and Australia! Keywords by Site is a newer feature within Keyword Explorer, added just last October to show which and how many keywords any domain, subdomain, or page ranks for.

Want to see which keywords your site ranks for in the US, UK, Canada, or Australia?

See what you rank for

It's easy to use — just select a country from the dropdown menu to the right. This will show you which keywords a domain or page is ranking for from a particular country.

On-Demand Crawl now available

We know it can be important to track your site changes in real time. That's why, on June 29th, we're replacing our legacy site audit tool, Crawl Test, with the new and improved On-Demand Crawl:

Whether you need to double-check a change you've made or need a one-off report, the new On-Demand Crawl offers an updated experience for Moz Pro customers:

  • Crawl reports are now faster and available sooner, allowing you to quickly assess your site, a new client or prospect’s, or the competition.
  • Your site issues are now categorized by issue type and quantity, making it easier to identify what to work on and how to prioritize:

  • Recommendations are now provided for how to fix each issue, along with resources detailing why it matters:

  • Site audit reports are now easier than ever to package and present with PDF exports.
  • An updated, fresh design and UX!

On-Demand Crawl is already available now in Moz Pro. If you’re curious how it works, check it out:

Try On-Demand Crawl

Improvements to tool notifications & visuals

Moz’s email notification system and tools dashboard didn't always sync up perfectly with the actual data update times. Sometimes, customers would receive an email or see updated dates on their dashboard before the data had rolled out, resulting in confusion. We've streamlined the process, and now customers no longer have to wonder where their data is — you can rest assured that your data is waiting for you in Moz Pro as soon as you're notified.

Rank Tracker is sticking around

While we had originally planned to retire Rank Tracker at the beginning of June, we've decided to hold off in light of the feedback we received from our customers. Our goal in retiring Rank Tracker was to make Moz Pro easier to navigate by eliminating the redundancy of having two options for tracking keyword rankings (Rank Tracker and Campaigns), but after hearing how many people use and value Rank Tracker, and after weighing our options, we decided to postpone its retirement until we had a better solution than simply shutting it down.

Right now, we’re focused on learning more from our community on what makes this tool so valuable, so if you have feedback regarding Rank Tracker, we’d love it if you would take our survey. The information we gather from this survey will help us create a better solution for you!

Updates from Moz Academy

New advanced SEO courses

In response to the growing interest in advanced and niche-specific training, Moz is now offering ongoing classes and seminars on topics such as e-commerce SEO and technical site audits. If there’s an advanced topic you’d like training on, let us know by visiting https://moz.com/training and navigating to the “Custom” tab to tell us exactly what type of training you’re looking for.

On-demand coursework

We love the fact that we have Moz customers from around the globe, so we’re always looking for new ways to accommodate those in different timezones and those with sporadic schedules. One new way we’re doing this is by offering on-demand coursework. Get training from Moz when it works best for you. With this added scheduling flexibility (and with added instructors to boot), we hope to be able to reach more people than ever before.

To view Moz’s on-demand coursework, visit moz.com/training and click on the “On-Demand” tab.

Certificate development

There’s been a growing demand for a meaningful certification program in SEO, and we’re proud to say that Moz is here to deliver. This coursework will include a certificate and a badge for your LinkedIn profile. We’re planning on launching the program later this year, so stay tuned by signing up for Moz Training Alerts!

Tell us what you think!

Have feedback for us on any of our 2018 improvements? Any ideas on new ways we can improve our tools and training resources? Let us know in the comments! We love hearing from marketers like you. Your input helps us develop the best tools possible for ensuring your content gets found online.

If you’re not a Moz Pro subscriber and haven’t gotten a chance to check out these new features yet, sign up for a free trial!



An 8-Point Checklist for Debugging Strange Technical SEO Problems

Posted by Dom-Woodman

Occasionally, a problem will land on your desk that's a little out of the ordinary. Something where you don't have an easy answer. You go to your brain and your brain returns nothing.

These problems can’t be solved with a little bit of keyword research and basic technical configuration. These are the types of technical SEO problems where the rabbit hole goes deep.

The very nature of these situations defies a checklist, but it's useful to have one for the same reason we have them on planes: even the best of us can and will forget things, and a checklist will provide you with places to dig.


Fancy some examples of strange SEO problems? Here are four examples to mull over while you read. We’ll answer them at the end.

1. Why wasn’t Google showing 5-star markup on product pages?

  • The pages had server-rendered product markup and they also had Feefo product markup, including ratings being attached client-side.
  • The Feefo ratings snippet rendered successfully in both Fetch & Render and the mobile-friendly tool.
  • When you put the rendered DOM into the structured data testing tool, both pieces of structured data appeared without errors.

2. Why wouldn’t Bing display 5-star markup on review pages, when Google would?

  • The review pages of client & competitors all had rating rich snippets on Google.
  • All the competitors had rating rich snippets on Bing; however, the client did not.
  • The review pages had correctly validating ratings schema on Google’s structured data testing tool, but did not on Bing.

3. Why were pages getting indexed with a no-index tag?

  • Pages with a server-side-rendered no-index tag in the head were being indexed by Google across a large template for a client.

4. Why did any page on a website return a 302 about 20–50% of the time, but only for crawlers?

  • A website was randomly throwing 302 errors.
  • This never happened in the browser, only in crawlers.
  • User agent made no difference; location or cookies also made no difference.

Finally, a quick note. It’s entirely possible that some of this checklist won’t apply to every scenario. That’s totally fine. It’s meant to be a process for everything you could check, not everything you should check.

The pre-checklist check

Does it actually matter?

Does this problem only affect a tiny amount of traffic? Is it only on a handful of pages and you already have a big list of other actions that will help the website? You probably need to just drop it.

I know, I hate it too. I also want to be right and dig these things out. But in six months' time, when you've solved twenty complex SEO rabbit holes and your website has stayed flat because you didn't re-write the title tags, you're still going to get fired.

But hopefully that's not the case, in which case, onwards!

Where are you seeing the problem?

We don’t want to waste a lot of time. Have you heard this wonderful saying? “If you hear hooves, it’s probably not a zebra.”

The process we’re about to go through is fairly involved and it’s entirely up to your discretion if you want to go ahead. Just make sure you’re not overlooking something obvious that would solve your problem. Here are some common problems I’ve come across that were mostly horses.

  1. You’re underperforming from where you should be.
    1. When a site is underperforming, people love looking for excuses. Weird Google nonsense can be quite a handy thing to blame. In reality, it’s typically some combination of a poor site, higher competition, and a failing brand. Horse.
  2. You’ve suffered a sudden traffic drop.
    1. Something has certainly happened, but this is probably not the checklist for you. There are plenty of common-sense checklists for this. I’ve written about diagnosing traffic drops recently — check that out first.
  3. The wrong page is ranking for the wrong query.
    1. In my experience (which should probably preface this entire post), this is usually a basic problem where a site has poor targeting or a lot of cannibalization. Probably a horse.

Factors that make it more likely you’ve got a more complex problem, one that requires you to don your debugging shoes:

  • A website that has a lot of client-side JavaScript.
  • Bigger, older websites with more legacy.
  • Your problem is related to a new Google property or feature where there is less community knowledge.

1. Start by picking some example pages.

Pick a couple of example pages to work with — ones that exhibit whatever problem you're seeing. No, this won't be representative, but we'll come back to that in a bit.

Of course, if it only affects a tiny number of pages then it might actually be representative, in which case we're good. It definitely matters, right? You didn't just skip the step above? OK, cool, let's move on.

2. Can Google crawl the page once?

First we’re checking whether Googlebot has access to the page, which we’ll define as a 200 status code.

We’ll check in four different ways to expose any common issues:

  1. Robots.txt: Open up Search Console and check in the robots.txt validator.
  2. User agent: Open Dev Tools and verify that you can open the URL with both the Googlebot and Googlebot Mobile user agents (a scripted version of this check follows the list).
    1. To get the user agent switcher, open Dev Tools.
    2. Check that the console drawer is open (the Escape key toggles it).
    3. Hit the … menu and open "Network conditions".
    4. Here, select your user agent!

  3. IP Address: Verify that you can access the page with the mobile testing tool. (This will come from one of the IPs used by Google; any checks you do from your computer won't.)
  4. Country: The mobile testing tool will visit from US IPs, from what I've seen, so we kill two birds with one stone. But Googlebot will occasionally crawl from non-American IPs, so it’s also worth using a VPN to double-check whether you can access the site from any other relevant countries.
    1. I’ve used HideMyAss for this before, but whatever VPN you have will work fine.
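
If you'd rather script the user-agent check than click through Dev Tools every time, here's a minimal sketch using Python's requests library. The URL is a placeholder, and the user-agent strings are ones Google has published; double-check them against Google's current documentation. Note that this only varies the user agent: the requests still come from your own IP, so it won't catch IP-based blocking (that's what the mobile testing tool is for).

    # A minimal sketch: check that a URL returns 200 for both Googlebot
    # user agents. This only varies the user agent; requests still come
    # from your own IP, so it won't catch IP-based blocking.
    import requests

    URL = "https://www.example.com/some-page"  # placeholder

    USER_AGENTS = {
        "Googlebot (desktop)": (
            "Mozilla/5.0 (compatible; Googlebot/2.1; "
            "+http://www.google.com/bot.html)"
        ),
        "Googlebot (smartphone)": (
            "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
            "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
            "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
            "+http://www.google.com/bot.html)"
        ),
    }

    for name, ua in USER_AGENTS.items():
        resp = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False)
        print(f"{name}: HTTP {resp.status_code}")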

We should now have an idea whether or not Googlebot is struggling to fetch the page once.

Have we found any problems yet?

If we can re-create a failed crawl with a simple check above, then it’s likely that Googlebot is failing consistently to fetch our page, and it’s typically for one of those basic reasons.

But it might not be. Many problems are inconsistent because of the nature of technology. ;)

3. Are we telling Google two different things?

Next up: Google can find the page, but are we confusing it by telling it two different things?

This is most commonly seen, in my experience, because someone has messed up the indexing directives.

By "indexing directives," I’m referring to any tag that defines the correct index status or page in the index which should rank. Here’s a non-exhaustive list:

  • No-index
  • Canonical
  • Mobile alternate tags
  • AMP alternate tags

An example of providing mixed messages would be:

  • No-indexing page A
  • Page B canonicals to page A

Or:

  • Page A has a canonical in a header to A with a parameter
  • Page A has a canonical in the body to A without a parameter

If we’re providing mixed messages, then it’s not clear how Google will respond. It’s a great way to start seeing strange results.

Good places to check for the indexing directives listed above are (a small checker sketch follows this list):

  • Sitemap
    • Example: Mobile alternate tags can sit in a sitemap
  • HTTP headers
    • Example: Canonical and meta robots can be set in headers.
  • HTML head
    • This is where you’re probably already looking; you’ll need this one for comparison.
  • JavaScript-rendered vs hard-coded directives
    • You might be setting one thing in the page source and then rendering another with JavaScript, i.e. you would see something different in the HTML source from the rendered DOM.
  • Google Search Console settings
    • There are Search Console settings for ignoring parameters and country localization that can clash with indexing tags on the page.
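
To make "mixed messages" concrete, here's a rough sketch (not an exhaustive checker) that compares the canonical in the HTTP header against the one in the HTML head, using requests and BeautifulSoup. The URL is a placeholder, and it reads the raw HTML only, so JavaScript-injected directives won't show up here.

    # A rough sketch: flag pages where the HTTP-header canonical and the
    # HTML-head canonical disagree (one of the "mixed messages" above).
    # Raw HTML only; JS-injected directives won't be seen.
    import re
    import requests
    from bs4 import BeautifulSoup

    URL = "https://www.example.com/some-page"  # placeholder

    resp = requests.get(URL)

    # Canonical from the HTTP header, e.g.
    # Link: <https://www.example.com/some-page>; rel="canonical"
    header_canonical = None
    match = re.search(r'<([^>]+)>\s*;\s*rel="?canonical"?',
                      resp.headers.get("Link", ""))
    if match:
        header_canonical = match.group(1)

    # Canonical from the HTML head
    tag = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    head_canonical = tag["href"] if tag else None

    print("Header canonical:", header_canonical)
    print("HTML canonical:  ", head_canonical)
    if header_canonical and head_canonical and header_canonical != head_canonical:
        print("Mixed messages! Google could honor either one.")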

A quick aside on rendered DOM

This page has a lot of mentions of the rendered DOM on it (18, if you’re curious). Since we’ve just had our first, here’s a quick recap about what that is.

When you load a webpage, the first request is the HTML. This is what you see in the HTML source (right-click on a webpage and click View Source).

This is before JavaScript has done anything to the page. This didn’t use to be such a big deal, but now so many websites rely so heavily on JavaScript that most people quite reasonably won’t trust the initial HTML.

Rendered DOM is the technical term for the page once all the JavaScript has run and all the page alterations have been made. You can see this in Dev Tools.

In Chrome, you can get to it by right-clicking and hitting Inspect (or Ctrl + Shift + I). The Elements tab will show the DOM as it’s being rendered. When it stops flickering and changing, you’ve got the rendered DOM!
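
If you want to capture both versions programmatically, here's a sketch using Playwright, a headless-browser library (any headless Chrome wrapper would do; Playwright is just my assumption here). Bear in mind it drives a current Chromium rather than the Chrome 41 renderer Google uses, so treat the output as an approximation.

    # A sketch: grab the raw HTML source and the rendered DOM for a URL.
    # Assumes Playwright is installed (pip install playwright, then run
    # "playwright install"). Uses a current Chromium, not Chrome 41.
    import requests
    from playwright.sync_api import sync_playwright

    URL = "https://www.example.com/some-page"  # placeholder

    raw_html = requests.get(URL).text  # pre-JavaScript HTML source

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(URL, wait_until="networkidle")  # let the JS settle
        rendered_dom = page.content()             # post-JavaScript DOM
        browser.close()

    # A crude first comparison: does a directive exist in one but not the other?
    for marker in ('rel="canonical"', 'name="robots"'):
        print(marker,
              "| in source:", marker in raw_html,
              "| in rendered DOM:", marker in rendered_dom)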

4. Can Google crawl the page consistently?

To see what Google is seeing, we're going to need to get log files. At this point, we can check to see how it is accessing the page.

Aside: Working with logs is an entire post in and of itself. I’ve written a guide to log analysis with BigQuery, and I’d also really recommend trying out Screaming Frog’s Log File Analyser, which has done a great job of handling a lot of the complexity around logs.

When we’re looking at crawling, there are three useful checks we can do (a log-parsing sketch follows this list):

  1. Status codes: Plot the status codes over time. Is Google seeing different status codes than you when you check URLs?
  2. Resources: Is Google downloading all the resources of the page?
    1. Is it downloading all your site-specific JavaScript and CSS files that it would need to generate the page?
  3. Page size follow-up: Take the max and min of all your pages and resources and diff them. If you see a difference, then Google might be failing to fully download all the resources or pages. (Hat tip to @ohgm, from whom I first heard this neat tip.)
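
As promised, here's a minimal log-parsing sketch for the first check: it tallies the status codes served to Googlebot per day from a combined-format (Apache/Nginx-style) access log. The path and format are assumptions, so check yours. Also remember that anyone can fake the Googlebot user agent; verify hits via reverse DNS if the numbers look odd.

    # A minimal sketch: tally status codes served to Googlebot by day,
    # from a combined-format access log. Path and format are assumptions.
    # Caveat: the Googlebot UA is easily faked; verify with reverse DNS.
    import re
    from collections import Counter

    LOG_PATH = "access.log"  # placeholder

    # e.g. 1.2.3.4 - - [10/Jul/2018:13:55:36 +0000] "GET /page HTTP/1.1" 200 ...
    LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "[A-Z]+ \S+[^"]*" (\d{3})')

    counts = Counter()
    with open(LOG_PATH) as f:
        for line in f:
            if "Googlebot" not in line:
                continue
            m = LINE_RE.search(line)
            if m:
                day, status = m.groups()
                counts[(day, status)] += 1

    for (day, status), n in sorted(counts.items()):
        print(f"{day}  {status}  {n}")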

Have we found any problems yet?

If Google isn't getting 200s consistently in our log files, but we can access the page fine when we try, then there are clearly still some differences between Googlebot and ourselves. What might those differences be?

  1. It will crawl more than us
  2. It is obviously a bot, rather than a human pretending to be a bot
  3. It will crawl at different times of day

This means that:

  • If our website is doing clever bot blocking, it might be able to differentiate between us and Googlebot.
  • Because Googlebot will put more stress on our web servers, it might behave differently. When websites have a lot of bots or visitors visiting at once, they might take certain actions to help keep the website online. They might turn on more computers to power the website (this is called scaling), they might also attempt to rate-limit users who are requesting lots of pages, or serve reduced versions of pages.
  • Servers run tasks periodically; for example, a listings website might run a daily task at 01:00 to clean up all its old listings, which might affect server performance.

Working out what’s happening with these periodic effects is going to be fiddly; you’re probably going to need to talk to a back-end developer.

Depending on your skill level, you might not know exactly where to lead the discussion. A useful structure for a discussion is often to talk about how a request passes through your technology stack and then look at the edge cases we discussed above.

  • What happens to the servers under heavy load?
  • When do important scheduled tasks happen?

Two useful pieces of information to enter this conversation with:

  1. Depending on the regularity of the problem in the logs, it is often worth trying to re-create the problem by crawling the website at the same speed/intensity that Google is using, to see if you can find/cause the same issues (a throttled-crawl sketch follows this list). This won’t always be possible depending on the size of the site, but for some sites it will be. Being able to consistently re-create a problem is the best way to get it solved.
  2. If you can’t, however, then try to provide the exact periods of time where Googlebot was seeing the problems. This will give the developer the best chance of tying the issue to other logs to let them debug what was happening.
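
The throttled-crawl sketch mentioned in the first point might look something like the following; the URLs, rate, and user agent are all placeholders you'd fill in from what your logs show Googlebot doing.

    # A sketch: re-crawl a list of URLs at a fixed requests-per-second
    # rate taken from your logs, and record any non-200s.
    import time
    import requests

    URLS = [
        "https://www.example.com/page-1",  # in reality, a list pulled
        "https://www.example.com/page-2",  # from your log files
    ]
    REQUESTS_PER_SECOND = 5  # match Googlebot's observed pace
    UA = "MyDebugCrawler/1.0 (you@example.com)"  # identify yourself politely

    session = requests.Session()
    session.headers["User-Agent"] = UA

    failures = []
    for url in URLS:
        resp = session.get(url, allow_redirects=False)
        if resp.status_code != 200:
            failures.append((url, resp.status_code))
        time.sleep(1 / REQUESTS_PER_SECOND)

    print(f"{len(failures)} failures out of {len(URLS)} URLs")
    for url, status in failures[:20]:
        print(status, url)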

If Google can crawl the page consistently, then we move onto our next step.

5. Does Google see what I can see on a one-off basis?

We know Google is crawling the page correctly. The next step is to try and work out what Google is seeing on the page. If you’ve got a JavaScript-heavy website you’ve probably banged your head against this problem before, but even if you don’t this can still sometimes be an issue.

We follow the same pattern as before. First, we try to re-create it once. The following tools will let us do that:

  • Fetch & Render
    • Shows: Rendered DOM in an image, but only returns the page source HTML for you to read.
  • Mobile-friendly test
    • Shows: Rendered DOM and returns rendered DOM for you to read.
    • Not only does this show you rendered DOM, but it will also track any console errors.

Is there a difference between Fetch & Render, the mobile-friendly testing tool, and Googlebot? Not really, with the exception of timeouts (which is why we have our later steps!). Here’s the full analysis of the difference between them, if you’re interested.

Once we have the output from these, we compare them to what we ordinarily see in our browser. I’d recommend using a tool like Diff Checker to compare the two.

Have we found any problems yet?

If we encounter meaningful differences at this point, then in my experience it’s typically either from JavaScript or cookies.

We can isolate each of these by:

  • Loading the page with no cookies. This can be done simply by loading the page in a fresh incognito session and comparing the rendered DOM there against the rendered DOM in our ordinary browser.
  • Using the mobile testing tool to see the page rendered with Chrome 41, and comparing that against the rendered DOM we normally see with Inspect Element.

Yet again we can compare them using something like Diff Checker, which will allow us to spot any differences. You might want to use an HTML formatter to help line them up better.
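
If you'd rather diff locally than paste into a website, Python's difflib does the same job, and prettifying both documents first plays the role of the HTML formatter. A sketch, with placeholder filenames:

    # A sketch: normalize two saved DOM dumps, then print a unified diff.
    # Prettifying first lines the documents up, like an HTML formatter.
    import difflib
    from bs4 import BeautifulSoup

    def normalized_lines(path):
        with open(path, encoding="utf-8") as f:
            soup = BeautifulSoup(f.read(), "html.parser")
        return soup.prettify().splitlines(keepends=True)

    # placeholders: e.g. one dump from the mobile-friendly tool,
    # one from your own browser's Inspect Element
    a = normalized_lines("rendered_googlebot.html")
    b = normalized_lines("rendered_browser.html")

    for line in difflib.unified_diff(a, b, fromfile="googlebot", tofile="browser"):
        print(line, end="")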

We can also see the JavaScript errors thrown using the Mobile-Friendly Testing Tool, which may prove particularly useful if you’re confident in your JavaScript.

If, using this knowledge and these tools, we can recreate the bug, then we have something that can be replicated and it’s easier for us to hand off to a developer as a bug that will get fixed.

If we’re seeing everything is correct here, we move on to the next step.

6. What is Google actually seeing?

It’s possible that what Google is seeing is different from what we recreate using the tools in the previous step. Why? A couple of main reasons:

  • Overloaded servers can have all sorts of strange behaviors. For example, they might be returning 200 codes, but perhaps with a default page.
  • JavaScript is rendered separately from pages being crawled and Googlebot may spend less time rendering JavaScript than a testing tool.
  • There is often a lot of caching in the creation of web pages and this can cause issues.

We’ve gotten this far without talking about time! Pages don’t get crawled instantly, and crawled pages don’t get indexed instantly.

Quick sidebar: What is caching?

Caching is often a problem if you get to this stage. Unlike JS, it’s not talked about as much in our community, so it’s worth some more explanation in case you’re not familiar. Caching is storing something so it’s available more quickly next time.

When you request a webpage, a lot of calculations happen to generate that page. If you then refreshed the page when it was done, it would be incredibly wasteful to just re-run all those same calculations. Instead, servers will often save the output and serve you the output without re-running them. Saving the output is called caching.

Why do we need to know this? Well, we’re already well out into the weeds at this point and so it’s possible that a cache is misconfigured and the wrong information is being returned to users.
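
A toy example makes the failure mode obvious. The sketch below caches a "rendered page" keyed by URL alone, so whoever hits the page first decides what everyone else gets; swap "user agent" for "cookie," "country," or "protocol" and you have the shape of most real-world cache bugs.

    # A toy cache with a classic misconfiguration: the key is the URL
    # only, so a page first generated for one kind of visitor is then
    # served to every other kind of visitor.
    cache = {}

    def render_page(url, user_agent):
        # stand-in for the expensive page-generation work
        if "Googlebot" in user_agent:
            return f"<html><!-- stripped-down bot page for {url} --></html>"
        return f"<html><!-- full page for {url} --></html>"

    def get_page(url, user_agent):
        if url not in cache:                           # BUG: the key should
            cache[url] = render_page(url, user_agent)  # include whatever
        return cache[url]                              # varies the output

    print(get_page("/widgets", "Googlebot/2.1"))  # bot version gets cached...
    print(get_page("/widgets", "Mozilla/5.0"))    # ...and real users now get it too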

There aren’t many good beginner resources on caching which go into more depth. However, I found this article on caching basics to be one of the more friendly ones. It covers some of the basic types of caching quite well.

How can we see what Google is actually working with?

  • Google’s cache
    • Shows: Source code
    • While this won’t show you the rendered DOM, it is showing you the raw HTML Googlebot actually saw when visiting the page. You’ll need to check this with JS disabled; otherwise, on opening it, your browser will run all the JS on the cached version.
  • Site searches for specific content
    • Shows: A tiny snippet of rendered content.
    • By searching for a specific phrase on a page, e.g. inurl:example.com/url “only JS rendered text”, you can see if Google has managed to index a specific snippet of content. Of course, it only works for visible text and misses a lot of the content, but it's better than nothing!
    • Better yet, do the same thing with a rank tracker, to see if it changes over time.
  • Storing the actual rendered DOM
    • Shows: Rendered DOM
    • Alex from DeepCrawl has written about saving the rendered DOM from Googlebot. The TL;DR version: Google will render JS and post to endpoints, so we can get it to submit the JS-rendered version of a page that it sees. We can then save that, examine it, and see what went wrong.

Have we found any problems yet?

Again, once we’ve found the problem, it’s time to go and talk to a developer. The advice for this conversation is identical to the last one — everything I said there still applies.

The other knowledge you should go into this conversation armed with: how Google works and where it can struggle. While your developer will know the technical ins and outs of your website and how it’s built, they might not know much about how Google works. Together, this can help you reach the answer more quickly.

The obvious sources for this are resources and presentations given by Google themselves. Of the various resources that have come out, I’ve found these two to be some of the more useful ones for giving insight into first principles:

But there is often a difference between statements Google will make and what the SEO community sees in practice. All the SEO experiments people tirelessly perform in our industry can also help shed some insight. There are far too many to list here, but here are two good examples:

7. Could Google be aggregating your website across others?

If we’ve reached this point, we’re pretty happy that our website is running smoothly. But not all problems can be solved just on your website; sometimes you’ve got to look to the wider landscape and the SERPs around it.

Most commonly, what I’m looking for here is:

  • Similar/duplicate content to the pages that have the problem.
    • This could be intentional duplicate content (e.g. syndicating content) or unintentional (competitors' scraping or accidentally indexed sites).

Either way, they’re nearly always found by doing exact searches in Google, i.e. taking a relatively specific piece of content from your page and searching for it in quotes.
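
If you're doing this across a lot of pages, a tiny helper to build those exact-match queries saves time. A sketch (using site: rather than inurl: for simplicity; the snippet is a placeholder):

    # A small helper sketch: build exact-match Google queries for a
    # snippet of on-page text, optionally restricted to one site.
    from urllib.parse import quote_plus

    def exact_search_url(phrase, site=None):
        query = f'"{phrase}"'
        if site:
            query = f"site:{site} {query}"
        return "https://www.google.com/search?q=" + quote_plus(query)

    snippet = "a relatively specific piece of content from your page"  # placeholder
    print(exact_search_url(snippet))                      # who else has this?
    print(exact_search_url(snippet, site="example.com"))  # is our copy indexed?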

Have you found any problems yet?

If you find a number of other exact copies, then it’s possible they’re causing issues.

The best description I’ve come up with for “have you found a problem here?” is: do you think Google is aggregating together similar pages and only showing one? And if it is, is it picking the wrong page?

This doesn’t just have to be on traditional Google search. You might find a version of it on Google Jobs, Google News, etc.

To give an example, if you are a reseller, you might find content isn’t ranking because there's another, more authoritative reseller who consistently posts the same listings first.

Sometimes you’ll see this consistently and straightaway, while other times the aggregation might be changing over time. In that case, you’ll need a rank tracker for whatever Google property you’re working on to see it.

Jon Earnshaw from Pi Datametrics gave an excellent talk on the latter (around suspicious SERP flux) which is well worth watching.

Once you’ve found the problem, you’ll probably need to experiment to find out how to get around it, but the easiest factors to play with are usually:

  • De-duplication of content
  • Speed of discovery (you can often improve this by putting up a 24-hour RSS feed of all the new content that appears)
  • Lowering syndication

8. A roundup of some other likely suspects

If you’ve gotten this far, then we’re sure that:

  • Google can consistently crawl our pages as intended.
  • We’re sending Google consistent signals about the status of our page.
  • Google is consistently rendering our pages as we expect.
  • Google is picking the correct page out of any duplicates that might exist on the web.

And your problem still isn’t solved?

And it is important?

Well, shoot.

Feel free to hire us…?

As much as I’d love for this article to list every SEO problem ever, that’s not really practical. So, to finish off, let’s go through two more common gotchas and principles that didn’t really fit in elsewhere, followed by the answers to the four problems we listed at the beginning.

Invalid/poorly constructed HTML

You and Googlebot might be seeing the same HTML, but it might be invalid or wrong. Googlebot (and any crawler, for that matter) has to provide workarounds when the HTML specification isn't followed, and those can sometimes cause strange behavior.

The easiest way to spot it is either by eye-balling the rendered DOM tools or using an HTML validator.

The W3C validator is very useful, but will throw up a lot of errors/warnings you won’t care about. The closest I can give to a one-line summary of which ones are useful:

  • Look for errors
  • Ignore anything to do with attributes (won’t always apply, but is often true).

The classic example of this is breaking the head.

An iframe isn't allowed in the head code, so Chrome will end the head and start the body. Unfortunately, it takes the title and canonical with it, because they fall after it — so Google can't read them. The head code should have ended in a different place.
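
You can watch this happen without opening a browser: the html5lib parser follows the same HTML5 parsing algorithm Chrome does. A small sketch (assuming beautifulsoup4 and html5lib are installed):

    # A sketch demonstrating "breaking the head." The <iframe> isn't
    # allowed in <head>, so a spec-compliant parser closes the head
    # early, and the title and canonical land in the <body>, where
    # Google won't treat the canonical as a directive.
    from bs4 import BeautifulSoup  # pip install beautifulsoup4 html5lib

    broken = """
    <html><head>
      <iframe src="https://example.com/ad-tech"></iframe>
      <title>My page</title>
      <link rel="canonical" href="https://example.com/page">
    </head><body><p>Hello</p></body></html>
    """

    soup = BeautifulSoup(broken, "html5lib")
    print("In <head>:", [t.name for t in soup.head.find_all(True)])
    print("In <body>:", [t.name for t in soup.body.find_all(True)])
    # Typically prints:
    # In <head>: []
    # In <body>: ['iframe', 'title', 'link', 'p']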

Oliver Mason wrote a good post that explains an even more subtle version of this in breaking the head quietly.

When in doubt, diff

Never underestimate the power of trying to compare two things line by line with a diff from something like Diff Checker. It won’t apply to everything, but when it does it’s powerful.

For example, if Google has suddenly stopped showing your featured markup, try to diff your page against a historical version either in your QA environment or from the Wayback Machine.


Answers to our original 4 questions

Time to answer those questions. These are all problems we’ve had clients bring to us at Distilled.

1. Why wasn’t Google showing 5-star markup on product pages?

Google was seeing both the server-rendered markup and the client-side-rendered markup; however, the server-rendered side was taking precedence.

Removing the server-rendered markup meant the 5-star markup began appearing.

2. Why wouldn’t Bing display 5-star markup on review pages, when Google would?

The problem came from the references to schema.org.

        <div itemscope="" itemtype="https://schema.org/Movie">
        </div>
        <p>  <h1 itemprop="name">Avatar</h1>
        </p>
        <p>  <span>Director: <span itemprop="director">James Cameron</span> (born August 16, 1954)</span>
        </p>
        <p>  <span itemprop="genre">Science fiction</span>
        </p>
        <p>  <a href="../movies/avatar-theatrical-trailer.html" itemprop="trailer">Trailer</a>
        </p>
        <p></div>
        </p>

We diffed our markup against our competitors and the only difference was we’d referenced the HTTPS version of schema.org in our itemtype, which caused Bing to not support it.

C’mon, Bing.

3. Why were pages getting indexed with a no-index tag?

The answer for this was in this post. This was a case of breaking the head.

The developers had installed some ad-tech in the head and inserted a non-standard tag, i.e. not one of:

  • <title>
  • <style>
  • <base>
  • <link>
  • <meta>
  • <script>
  • <noscript>

This caused the head to end prematurely and the no-index tag was left in the body where it wasn’t read.

4. Why did any page on a website return a 302 about 20–50% of the time, but only for crawlers?

This took some time to figure out. The client had an old legacy website with two servers, one for the blog and one for the rest of the site. The issue started occurring shortly after a migration of the blog from a subdomain (blog.client.com) to a subdirectory (client.com/blog/…).

At surface level everything was fine; if a user requested any individual page, it all looked good. A crawl of all the blog URLs to check they’d redirected was fine.

But we noticed a sharp increase of errors being flagged in Search Console, and during a routine site-wide crawl, many pages that were fine when checked manually were causing redirect loops.

We checked using Fetch and Render, but once again, the pages were fine.

Eventually, it turned out that when a non-blog page was requested very quickly after a blog page (which, realistically, only a crawler is fast enough to achieve), the request for the non-blog page would be sent to the blog server.

These would then be caught by a long-forgotten redirect rule, which 302-redirected deleted blog posts (or other duff URLs) to the root. This, in turn, was caught by a blanket HTTP to HTTPS 301 redirect rule, which would be requested from the blog server again, perpetuating the loop.

For example, requesting https://www.client.com/blog/ followed quickly enough by https://www.client.com/category/ would result in:

  • 302 to http://www.client.com - This was the rule that redirected deleted blog posts to the root
  • 301 to https://www.client.com - This was the blanket HTTPS redirect
  • 302 to http://www.client.com - The blog server doesn’t know about the HTTPS non-blog homepage and it redirects back to the HTTP version. Rinse and repeat.

This explained the periodic 302s, and it meant we could work with their devs to fix the problem.
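
The general lesson for reproducing bugs like this: fire requests as fast as a crawler would. A crude sketch, with hypothetical URLs standing in for the client's site:

    # A crude sketch: request a blog page, then a non-blog page
    # immediately afterwards (faster than any human browses), and
    # inspect the redirect chain of the second request.
    import requests

    session = requests.Session()

    for attempt in range(20):
        session.get("https://www.client-example.com/blog/")  # hypothetical
        try:
            resp = session.get("https://www.client-example.com/category/")
        except requests.TooManyRedirects:
            print(f"Attempt {attempt}: redirect loop!")
            continue
        if resp.history:  # any redirect on this URL is suspicious
            chain = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
            print(f"Attempt {attempt}: {chain} -> {resp.status_code} {resp.url}")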

What are the best brainteasers you've had?

Let’s hear them, people. What problems have you run into? Let us know in the comments.

Also credit to @RobinLord8, @TomAnthonySEO, @THCapper, @samnemzer, and @sergeystefoglo_ for help with this piece.

