How to Boost Bookings & Conversions with Google Posts: An Interview with Joel Headley

Posted by MiriamEllis

Have you been exploring all the ways you might use Google Posts to set and meet brand goals?

Chances are good you've heard of Google Posts by now: the micro-blogging Google My Business dashboard feature that instantly publishes content to your Knowledge Panel and individual listing. We're still only months into the release of this fascinating capability, which is theorized to have a potential impact on local pack rankings. When I recently heard Joel Headley describe his incredibly creative use of Google Posts to increase healthcare provider bookings, I knew it was something I'd be excited to share with the Moz community here.

Joel Headley

Joel Headley worked for over a decade on local and web search at Google. He’s now the Director of Local SEO and Marketing at healthcare practice growth platform PatientPop. He’s graciously agreed to chat with me about how his company increased appointment bookings by about 11% for thousands of customer listings via Google Posts.

How PatientPop used Google Posts to increase bookings by 11%

Miriam: So, Joel, Google offers a formal booking feature within their own product, but it isn’t always easy to participate in that program, and it keeps users within “Google’s walled garden” instead of guiding them to brand-controlled assets. As I recently learned, PatientPop innovated almost instantly when Google Posts was rolled out in 2017. Can you summarize for me what your company put together for your customers as a booking vehicle that didn’t depend on Google’s booking program?

Joel: PatientPop wants to provide patients an opportunity to make appointments directly with their healthcare provider. In that way, we're a white label service. Google has had a handful of booking products. In a prior iteration, there was a simpler product that was powered by schema and microforms, which could have scaled to anyone willing to add the schema.

Today, they are putting their effort behind Reserve with Google, which requires a much deeper API integration. While PatientPop would be happy to provide more services on Google, Reserve with Google doesn't yet allow most of our customers' business categories, according to its own policies. (However, the reservation service is marketed through Google My Business to those categories, which is a bit confusing.)

Additionally, when you open the booking widget, you see two logos: G Pay and the booking software provider. I'd love to see a product that allows the healthcare provider to be front and center in the entire process. A patient-doctor relationship is personal, and we'd like to emphasize you're booking your doctor, not PatientPop.

Because we can't get the CTAs unique to Reserve with Google, we realized that Google Posts can be a great vehicle for us to essentially get the same result.

When Google Posts first launched, I tested a handful of practices. The interaction rate was low compared to other elements in the Google listing. But, given there was incremental gain in traffic, it seemed worthwhile, if we could scale the product. It seemed like a handy way to provide scheduling with Google without having to go through the hoops of the Maps Booking (reserve with) API.

Miriam: Makes sense! Now, I’ve created a fictitious example of what it looks like to use Google Posts to prompt bookings, following your recommendations to use a simple color as the image background and to make the image text quite visible. Does this look similar to what PatientPop is doing for its customers and can you provide recommendations for the image size and font size you’ve seen work best?

Joel: Yes, that's pretty similar to the types of Posts we're submitting to our customer listings. I tested a handful of image types, ones with providers, some with no text, and the less busy image with actionable text is what performed the best. I noticed that making the image look more like a button, with button-like text, improved click-through rates too — CTR doubled compared to images with no text.

The image size we use is 750x750 with a 48-point font size. If you use the API, the image must be square-cropped when you create the post; the Google My Business interface, on the other hand, gives you an option to crop. The only issue I have with the published version of the image is that the cropping is uneven: sometimes it's center-cropped, but other times the bottom is cut off. That makes it hard to predict where on-image text will appear, but we keep the text in the center, which generally works pretty well.

Miriam: And, when clicked on, the Google Post takes the user to the client’s own website, where PatientPop software is being used to manage appointments — is that right?

Joel: Yes, the site is built by PatientPop. When selecting Book, the patient is taken directly to the provider's site where the booking widget is opened and an appointment can be selected from a calendar. These appointments can be synced back to the practice's electronic records system.

Miriam: Very tidy! As I understand it, PatientPop manages thousands of client listings, which makes automating this use of Google Posts a necessity. Without giving any secrets away, can you share a link to the API you used and explain how you templatized the process of creating Posts at scale?

Joel: Sure! We were waiting for Google to provide Posts via the Google My Business API, because we wanted to scale. While I had a bit of a heads-up that the API was coming — Google shared this feature with their GMB Top Contributor group — we still had to wait for it to launch to see the documentation and try it out. So, when the launch announcement went out on October 11, with just a few developers, we were able to implement the solution for all of our practices the next evening. It was a fun, quick win for us, though it was a bit of a long day. :)

In order to get something out that quickly, we created templates that could use information from the listing itself, like the business name, category, and location. That way, we were able to create a stand-alone Python script that grabs listings from Google; when you fetch a listing, all of its content comes along with it, including name, address, and category. These values are taken directly from the listing to create Posts, which are then submitted to Google. We host the images on AWS and reuse them by submitting the image URL with each post. The script runs as a cron job on a regular schedule. If you're new to the API, the really tricky part is authentication, but the GMB community can help answer questions there.

Miriam: Really admirable implementation! One question: Google Posts expire after 7 days unless they are events, so are you basically automating re-posting of the booking feature for each listing every seven days?

Joel: We create Posts every seven days for all our practices. That way, we can mix up the content and images used on any given practice. We're also adding a second weekly post for practices that offer aesthetic services. We'll be launching more Posts for specific practice types going forward, too.

Miriam: Now for the most exciting part, Joel! What can you tell me about the increase in appointments this use of Google Posts has delivered for your customers? And, can you also please explain what parameters and products you are using to track this growth?

Joel: To track clicks from listings on Google, we use UTM parameters. We can then track the authority page, the services (menu) URL, the appointment URL, and the Posts URL.

When I first did this analysis, I looked at the average of the last three weeks of appointments compared to the 4 days after launch. Over that period, I saw nearly an 8% increase in online bookings. I've since included the entire first week of launch. It shows an 11% average increase in online bookings.

Additionally, because we're tracking each URL in the knowledge panel separately, I can confidently say there's no cannibalization of clicks from other URLs as a result of adding Posts. While authority page CTR remained steady, services lost over 10% of the clicks and appointment URLs gained 10%. That indicates to me that not only are the Posts effective in driving appointments through the Posts CTA, it emphasizes the existing appointment CTA too. This was in the context of no additional product changes on our side.

Miriam: Right, so, some of our readers will be using Google’s Local Business URLs (frequently used for linking to menus) to add an “Appointments” link. One of the most exciting takeaways from your implementation is that using Google Posts to support bookings didn’t steal attention away from the appointment link, which appears higher up in the Knowledge Panel. Can you explain why you feel the Google Posts clicks have been additive instead of subtractive?

Joel: The "make appointment" link gets a higher CTR than Posts, so it shouldn't be ignored. However, since Posts include an image, I suspect it might be attracting a different kind of user, one who is more primed to interact with images. And because we're so specific about the type of interaction we want (appointment booking), both with the CTA and the image, it seems to convert well. And, as I stated above, it seems to help the appointment URLs too.

Miriam: I was honestly so impressed with your creativity in this, Joel. It’s just brilliant to look at something as simple as this little bit of Google screen real estate and ask, “Now, how could I use this to maximum effect?” Google Posts enables business owners to include links labeled Book, Order Online, Buy, Learn More, Sign Up, and Get Offer. The “Book” feature is obviously an ideal match for your company’s health care provider clients, but given your obvious talent for thinking outside the box, would you have any creative suggestions for other types of business models using the other pre-set link options?

Joel: I'm really excited about the events feature, actually, because you can create a long-lived post while adding a sense of urgency by leveraging a time-bound context. Events can include limited-time offers, like a sale on a particular product, or signups for a newsletter that will include a coupon code. You can use all the link labels you've listed above for any given event. And I think using the image-as-button philosophy can really drive results. I'd like to see an image with the text "Use coupon code XYZ546 now!" paired with the Get Offer button. I imagine many business types, especially retail, can use Posts to highlight their limited-time deals without paying other companies to advertise their coupons and deals.

Miriam: Agreed, Joel, there are some really exciting opportunities for creative use here. Thank you so much for the inspiring knowledge you’ve shared with our community today!

Ready to get the most from Google Posts?

Reviews can be a challenge to manage. Google Q&A may be a mixed blessing. But as far as I can see, Posts are an unalloyed gift from Google. Here’s all you have to do to get started using them right now for a single location of your business:

  • Log into your Google My Business dashboard and click the “Posts” tab in the left menu.
  • Determine which of the options, labeled "Buttons," is the right fit for your business. It could be "Book," or it could be something else, like "Sign up" or "Buy." Click the "Add a Button" option in the Google Posts wizard. Be sure the URL you enter includes a UTM parameter for tracking purposes (see the example URL just after this list).
  • Upload a 750x750 image. Joel recommends using a simple-colored background and a highly visible 48-point font for turning this image into a CTA button-style graphic. You may need to experiment with cropping the image.
  • Alternatively, you can create an event, which will cause your post to stay live through the date of the event.
  • Text has a minimum 100-character and maximum 300-character limit. I recommend writing something that would entice users to click to get beyond the cut-off point, especially because it appears to me that there are different display lengths on different devices. It’s also a good idea to bear in mind that Google Posts are indexed content. Initial testing is revealing that simply utilizing Posts may improve local pack rankings, but there is also an interesting hypothesis that they are a candidate for long-tail keyword optimization experiments. According to Mike Blumenthal:
“...If there are very long-tail phrases, where the ability to increase relevance isn't up against so many headwinds, then this is a signal that Google might recognize and help lift the boat for that long-tail phrase. My experience with it was it didn't work well on head phrases, and it may require some amount of interaction for it to really work well. In other words, I'm not sure just the phrase itself but the phrase with click-throughs on the Posts might be the actual trigger to this. It's not totally clear yet.”
  • You can preview your post before you hit the publish button.
  • Your post will stay live for 7 days. After that, it will be time to post a new one.
  • If you need to implement at scale across multiple listings, re-read Joel’s description of the API and programming PatientPop is utilizing. It will take some doing, but an 11% increase in appointments may well make it worth the investment! And obviously, if you happen to be marketing health care providers, checking out PatientPop’s ready-made solution would be smart.
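A couple of concrete, purely illustrative examples may help if you decide to go the automated route. First, a UTM-tagged appointment URL for a Post's "Book" button might look something like this (the parameter values are placeholders, not PatientPop's actual tagging scheme):

https://www.example-practice.com/appointments/?utm_source=google&utm_medium=organic&utm_campaign=gmb-post

Second, here's a minimal sketch of the kind of scheduled script Joel describes above, not PatientPop's actual code. It assumes you've already worked through OAuth (the tricky part he mentions) and have an access token; the endpoint and field names follow the v4 Google My Business API's localPosts resource as documented at launch, and the account ID, template text, and image URL are placeholders.

import requests

ACCESS_TOKEN = "ya29.your-oauth-token"   # placeholder: obtain via OAuth 2.0
ACCOUNT = "accounts/1234567890"          # placeholder GMB account resource name
BASE = "https://mybusiness.googleapis.com/v4"
HEADERS = {"Authorization": "Bearer " + ACCESS_TOKEN}

# Grab all listings on the account; each location comes back with its name,
# address, category, website, etc. (A real script would also handle pagination.)
locations = requests.get(BASE + "/" + ACCOUNT + "/locations", headers=HEADERS).json().get("locations", [])

for loc in locations:
    business_name = loc.get("locationName", "our practice")
    post = {
        "languageCode": "en-US",
        # Template text filled in from the listing data itself
        "summary": "Book an appointment with {} today.".format(business_name),
        "callToAction": {
            "actionType": "BOOK",
            # Appointment page on the provider's own site, tagged for tracking
            "url": loc.get("websiteUrl", "https://example.com").rstrip("/")
                   + "/appointments/?utm_source=google&utm_medium=organic&utm_campaign=gmb-post",
        },
        # Reusable 750x750 "button-style" image hosted on S3
        "media": [{"mediaFormat": "PHOTO",
                   "sourceUrl": "https://example-bucket.s3.amazonaws.com/book-now.png"}],
    }
    # loc["name"] is the full "accounts/.../locations/..." resource path
    requests.post(BASE + "/" + loc["name"] + "/localPosts", headers=HEADERS, json=post)

Run something like this from a weekly cron job, rotating the summary templates and images, and you have the basic shape of the setup described above.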

Nobody likes a ball-hog

I’m watching the development of Google Posts with rapt interest. Right now, they reside on Knowledge Panels and listings, but given that they are indexed, it’s not impossible that they could eventually end up in the organic SERPs. Whether or not that ever happens, what we have right now in this feature is something that offers instant publication to the consumer public in return for very modest effort.

Perhaps even more importantly, Posts offer a way to bring users from Google to your own website, where you have full control of messaging. That single accomplishment is becoming increasingly difficult as rich-feature SERPs (and even single results) keep searchers Google-bound. I wonder if school kids still shout “ball-hog” when a classmate refuses to relinquish ball control and be a team player. For now, for local businesses, Google Posts could be a precious chance for your brand to handle the ball.


A Step-by-Step Guide to Setting Up and Growing Your YouTube Presence

Posted by AnnSmarty

When was the last time you saw a video on YouTube? I bet you've seen one today. YouTube is too huge and too popular for marketers to ignore.

If you don't have a YouTube channel, now's the time to start one.

If you have a channel and you never got it off the ground, now's the time to take action.

This article will take you through the process of setting up your YouTube presence, listing steps, tools, and important tips to get you started and moving forward.

1. Define your goals

If your goal is to become a YouTube star, you might be a bit late to the party: it's really hard to get noticed these days — too competitive. Stardom will take years of hard work to achieve because of the number of channels users have to choose from.

Even back in 2014, when I was reading about YouTube celebrity bloggers, one quote really stood out to me:

“We think, if we were coming to YouTube today, it would be too hard. We couldn't do it.”

That’s not to say, however, that you cannot achieve other, more tangible goals on YouTube. It's an excellent venue for business owners and marketers.

Here are three achievable goals that make more sense than fame from a business perspective:

1.1. YouTube for reputation management

Here's one thing about reputation management on Google: You’re never finished.

Even if your reputation is fabulous and you love every single result that comes up in the SERPs for your business name, you may still want to publish more content around your brand.

The thing is, for reputation management purposes, the more navigational queries you can control, the better:


YouTube is the perfect platform for reputation management. YouTube videos rank incredibly well in Google, especially when it comes to low-competition navigational queries that include your brand name.

Furthermore, YouTube videos almost always get that rich snippet treatment (meaning that Google shows the video thumbnail, author, and length of the video in the SERPs). This means you can more easily attract attention to your video search result.

That being said, think about putting videos on YouTube that:

  • Give your product/service overview
  • Show happy customers
  • Visualize customer feedback (for example, visual testimonials beautifully collected and displayed in a video)
  • Offer a glimpse inside your team (show people behind the brand, publish videos from events or conferences, etc.)

1.2 YouTube videos for improved conversions

Videos improve conversions for a clear reason: They offer a low-effort way for your customer to see why they need your product. Over the years, there have been numerous case studies proving the point:

  • An older study (dating back to 2011) states that customers are 144% more likely to add products to a shopping cart after watching the product video
  • Around 1 in 3 millennials state they have bought a product directly as a result of watching a how-to video on it
  • This Animoto survey found that almost all the participants (96%) considered videos "helpful when making purchasing decisions online"
  • Wistia found that visitors who engage with a video are much more likely to convert than those who don't

That being said, YouTube is a perfect platform to host your video product overviews: it's free, it offers the additional benefit of ranking well in Google, and it provides additional exposure to your products through their huge community, allowing people to discover your business via native search and suggested videos.

1.3 YouTube for creating alternative traffic and exposure channels

YouTube has huge marketing potential that businesses in most niches just cannot afford to ignore: it serves as a great discovery engine.

Imagine your video being suggested next after your competitor's product review. Imagine your competitors' customers stumbling across your video comparison when searching for an alternative service on YouTube.

Just being there increases your chances of getting found.

Again, it's not easy to reach the YouTube Top 10, but for specific low-competition queries it's quite doable.

Note: To be able to build traffic from inside your YouTube videos, you need to build up your channel to 10,000 public overall views to qualify to become a YouTube partner. Once approved, you'll be able to add clickable links to your site from within your videos using cards and actually build up your own site traffic via video views.

2. Develop a video editorial calendar

As with any type of content, video content requires a lot of brainstorming, organizing, and planning.

My regular routine when it comes to creating an editorial calendar is as follows:

  1. Start with keyword research
  2. Use question research to come up with more specific ideas
  3. Use seasonality to come up with timing for each piece of content
  4. Allocate sufficient time for production and promotion

You can read about my exact editorial process here. Here's a sample of my content roadmap laying out a major content asset for each month of the year, based on keyword research and seasonality:

Content roadmap

For keyword and question research I use Serpstat because they offer a unique clustering feature. For each keyword list you provide, they use the Google search results page to identify overlapping and similar URLs, evaluate how related different terms in your list are, and based on that, cluster them into groups.

Keyword clustering

This grouping makes content planning easier, allowing you to see the concepts behind keyword groups and put them into your roadmap based on seasonality or other factors that come into play (e.g. is there a slot/gap you need to fill? Are there company milestones or events coming up?).

Depending on how much video content you plan to create, you can set up a separate calendar or include videos in your overall editorial calendar.

When creating your roadmap, keep your goals in mind, as well. Some videos, such as testimonials and product reviews, won't be based on your keyword research but still need to be included in the roadmap.

3. Proceed to video production

Video production can be intimidating, especially if you have a modest budget, but these days it's much easier and more affordable than you'd imagine.

Keeping lower-budget campaigns in mind, here are a few types of videos and tools you can try out:

3.1 In-house video production

You can actually handle much of your video production in-house without the need to set up a separate room or purchase expensive gadgets.

Here are a few ideas:

  • Put together high-quality explanatory videos using Animatron (starts at $15/month): Takes a day or so to get to know all the available tools and options, but after that the production goes quite smoothly
  • Create beautiful visual testimonials, promo videos, and visual takeaways using Animoto ($8/month): You don’t need much time to learn to use it; it's very easy and fun.
  • Create video tutorials using iMovie (free for Mac users): It will take you or your team about a week to properly figure out all its options, but you'll get there eventually.
  • Create video interviews with niche influencers using Blue Jeans (starts at $12.49/month)
  • Create (whiteboard) presentations using ClickMeeting (starts at $25/month): Host a webinar first, then use the video recording as a permanent brand asset. ClickMeeting will save your whiteboard notes and let you reuse them in your article. You can brand your room to show your logo and brand colors in the video. Record your entire presentation using presentation mode, then upload the recording to your channel.


3.2 How to affordably outsource video production

The most obvious option for outsourcing video production is a site like Fiverr. Searching its gigs will actually give you even more ideas as to what kinds of videos you might create. While you may get burned there a few times, don’t let it discourage you — there are plenty of creative people who can put together awesome videos for you.

Another great idea is to reach out to YouTube bloggers in your niche. Some of them will be happy to work for you, and as a bonus you'll be rewarded with additional exposure from their personal branding and social media channels.

I was able to find a great YouTube blogger to work for my client for as low as $75 per video; those videos were of top quality and upload-ready.

There's lots of talent out there: just spend a few weeks searching and reaching out!

4. Optimize each video page

When uploading your videos to YouTube, spend some time optimizing each one. Add ample content to each video page, including a detailed title, a detailed description (at least 300–500 characters), and a lot of tags.

  • Title of the video: Generally, a more eye-catching and detailed title (see the sample title after this list) including:
    • Your core term/focus keyword (if any)
    • Product name and your brand name
    • The speaker's name when applicable (for example, when you post interviews). This may include their other identifiable personal brand elements, such as their Twitter handle
    • Event name and hashtag (when applicable)
    • City, state, country (especially if you're managing a local business)
  • Description of the video: The full transcript of the video. This can be obtained via services such as Speechpad.
  • A good readable and eye-catching thumbnail: These can be created easily using a tool like Canva.
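To make the title guidance concrete, here's a purely hypothetical example that pulls those elements together (the practice, person, handle, and location below are all placeholders):

How to Whiten Teeth at Home: Interview with Dr. Jane Doe (@drjanedoe) | Bright Smile Dental | Austin, TX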

Use a checklist:

Youtube SEO checklist

5. Generate clicks and engagement

Apart from basic keyword matching using video title and description, YouTube uses other video-specific metrics to determine how often the video should be suggested next to related videos and how high it should rank in search results.

Here's an example of how that might work:

The more people that view more than the first half of your video, the better. If more than 50% of all your video viewers watched more than 50% of the video, YouTube would assume your video is high quality, and so it could pop up in "suggested" results next to or at the end of other videos. (Please note: These numbers are examples, made up using my best judgment. No one knows the exact percentage points YouTube is using, but you get the general idea of how this works.)

That being said, driving "deep" views to your videos is crucial when it comes to getting the YouTube algorithm to favor you.

5.1 Create a clickable table of contents to drive people in

Your video description and/or the pinned comment should have a clickable table of contents to draw viewers into the video. This will improve deep views into the video, which are a crucial factor in YouTube rankings.
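As a concrete illustration: YouTube automatically turns timestamps in a description or pinned comment into clickable jump links, so a table of contents might look something like this (the times and topics are placeholders):

0:00 Intro
0:45 Why video improves conversions
2:10 Building a video editorial calendar
5:30 Affordable production tools
9:15 Optimizing the video page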

Table of contents

5.2 Use social media to generate extra views

Promoting your videos on social media is an easy way to bring in some extra clicks and positive signals.

5.2.1 First, embed the video to your site

Important: Embed videos to your web page and promote your own URL instead of the actual YouTube page. This approach has two important benefits:

  • Avoid auto-plays: Don't screw up your YouTube stats! YouTube pages auto-play videos by default, so if you share a YouTube URL on Twitter, many people will click and immediately leave (social media users are mostly lurkers). However, if you share your page with the video embedded on it, it won't play until the user clicks to play. This way you'll ensure the video is played only by people who seriously want to watch it.
  • Invest time and effort into your own site promotion instead of marketing the page: Promoting your own site URL with the video embedded on it, you can rest assured that more people will keep interacting with your brand rather than leave to watch other people's videos from YouTube suggested results.

There are also plenty of ways to embed YouTube videos naturally in your blog and offer more exposure. Look at some of these themes, for example, for ideas to display videos in ways that invite views and engagement.

Video sharing Wordpress

5.2.2 Use tools to partially scale social media promotion

For better, easier social media exposure, consider these options:

  • Invest in paid social media ads, especially Facebook ads, as they work best for engagement
  • Use recurring tweets to scale video promotion. There are a few tools you can try, such as DrumUp. Schedule the same update to go live several times on your chosen social media channels, generating more YouTube views from each repeated share. This is especially helpful for Twitter, because the lifespan of a tweet is just several minutes (between two and ten minutes, depending on how active and engaged your Twitter audience is). With recurring tweets, you'll make sure that more of your followers see your update.

  • A project I co-founded, Viral Content Bee, can put your videos in front of niche influencers on the lookout for more content to share on their social media accounts.

5.3 Build playlists

By sorting your videos into playlists, you achieve two important goals:

  • Keeping your viewers engaged with your brand videos longer: Videos within one playlist keep playing on autopilot until stopped
  • Creating separate brand assets of their own: Playlist URLs are able to rank both in YouTube and Google search results, driving additional exposure to your videos and brand overall, as well as allowing you to control more of those search results:


Using playlists, you can also customize the look and feel of your YouTube channel more effectively to give your potential subscribers a glimpse into additional topics you cover:

Customize Youtube channel

Furthermore, by customizing the look of your YouTube channel, you transform it into a more effective landing page, highlighting important content that might otherwise get lost in the archives.

6. Monitor your progress

6.1 Topvisor

Topvisor is the only rank tracker I am aware of that monitors YouTube rankings. You'll have to create a new project for each of your videos (which is somewhat of a pain), but you can monitor multiple keywords you're targeting for each video. I always monitor my focus keyword, my brand name, and any other specific information I'm including in the video title (like location and the speaker's name):


6.2 YouTube Analytics

YouTube provides a good deal of insight into how your channel and each individual video is doing, allowing you to build on your past success.

  • You'll see traffic sources, i.e. where the views are coming from: suggested videos, YouTube search, external (traffic from websites and apps that embed your videos or link to them on YouTube), etc.
  • The number of times your videos were included in viewers' playlists, including favorites, for the selected date range, region, and other filters. This is equal to additions minus removals.
  • Average view duration for each video.
  • How many interactions (subscribers, likes, comments) every video brought.

Youtube Analytics

You can see the stats for each individual video, as well as for each of your playlists.
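If you'd rather pull some of these numbers programmatically (for example, to feed the kind of dashboard described in the next section), the YouTube Analytics API exposes them. Here's a minimal sketch, assuming you've already created OAuth client credentials (client_secret.json) for the channel's Google account; the date range and metric list are just examples.

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow

# Read-only access to YouTube Analytics for the authorizing channel
SCOPES = ["https://www.googleapis.com/auth/yt-analytics.readonly"]
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
credentials = flow.run_local_server(port=0)

analytics = build("youtubeAnalytics", "v2", credentials=credentials)

# Views, average view duration, and interactions for the channel, day by day
report = analytics.reports().query(
    ids="channel==MINE",
    startDate="2018-01-01",
    endDate="2018-03-31",
    metrics="views,averageViewDuration,likes,comments,subscribersGained",
    dimensions="day",
    sort="day",
).execute()

for row in report.get("rows", []):
    print(row)

Each row comes back with the day first, followed by the metrics in the order you requested them.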

6.3 Using a dashboard for the full picture

If you produce at least one video a month, you may want to set up a dashboard to get an overall picture of how your YouTube channel is growing.

Cyfe (disclaimer: as of recently, Cyfe is a content marketing client of mine) is a tool that offers a great way to keep you organized when it comes to tracking your stats across multiple platforms and assets. I have a separate dashboard there which I use to keep an eye on my YouTube channels.

Cyfe Youtube


Building a YouTube channel is hard work. You're likely to see little or no activity for weeks at a time, maybe even months after you start working on it. Don’t let this discourage you. It's a big platform with lots of opportunity, and if you keep working consistently, you'll see your views and engagement steadily growing.

Do you have a YouTube channel? What are you doing to build it up and increase its exposure? Let us know in the comments.


Where Clickbait, Linkbait, and Viral Content Fit in SEO Campaigns – Whiteboard Friday

Posted by randfish

When is it smart to focus on viral-worthy content and clickbait? When is it not? To see fruitful returns from these kinds of efforts, they need to be done the right way and used in the right places. Rand discusses what kind of content investments make sense for this type of strategy and explains why it works in this week's Whiteboard Friday.

Where clickbait, linkbait, and viral content fit in SEO campaigns

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about when and where you might use clickbait and linkbait and viral-focused content as compared to other types for your SEO-driven campaigns.

There's a lot of savvy sort of folks at the intersection of SEO and content marketing who are practicing things like this right now. We've actually spoken to a few agencies who are specifically focused on this, and they have really solid businesses because many brands understand that these types of investments can produce significant returns. But you have to apply them in the right scenarios and the right spaces. So let's walk through that.

Content investments

Let's say that you're a payroll software provider. Your goal is to increase traffic and conversions, and so you're considering what types of content investments you and your consultant or agency or in-house team might be making on the content front. That could be things like what we've got here:

A. Viral, news-worthy linkbait

I don't necessarily love the word "linkbait," but it still gets a lot of searches, so we're putting it in the title of the Whiteboard Friday because we practice what we preach here, baby.

So this might be something like "The Easiest and Hardest Places to Start a Company." Maybe it's countries, maybe it's states, regions, whatever it is. So here are the easy ones and the hard ones and the criteria, and you go out to a bunch of press and you say, "Hey, we produced this list. We think it's worth covering. Here's the criteria we used." You go out to a bunch of companies. You go out to a bunch of state governments. You go out to a bunch of folks who cover this type of space, and hopefully you can get some clickbait, some folks actually clicking, some folks linking.

It doesn't necessarily have the most search volume. Folks aren't necessarily interested in, "Oh, what are the hardest places to start a company? Or what are the hardest versus easiest places to start a company?" Maybe you get a few, but it's not necessarily going to drive direct types of traffic that this payroll software provider can convert into customers.

B. Searcher-focused solutions

But there are other options for that, like searcher-focused solutions. So they might say, "Hey, we want to build some content around how to set up payroll as an LLC. That gets a lot of searches. We serve LLCs with our payroll solution. Let's try and target those folks. So here's how to set up payrolls in LLCs in six easy steps. There are the six steps."

C. Competitor comparison content

They see that lots of people are looking for them versus other competitors. So they set up a page that's "QuickBooks versus Gusto versus Square: Which Software is Right for Your Business?" so that they can serve that searcher intent.

D. Conversion-funnel-serving content

So they see that, after searching for their brand name, people also search for, "Can I use this for owner employees, businesses that have owner employees only?" So no employees who are not owners. What's the payroll story with them? How do I get that sorted out? So you create content around this.

All of these are types of content that serve SEO, but this one, this viral-focused stuff is the most sort of non-direct. Many times, brands have a tough time getting their head around why they would invest in that. So do SEOs. So let's explain that.

If a website's domain authority (its overall link equity at the domain level) is already high, if they've got lots and lots of links going to lots of places on the site, and if additional links that don't go to the conversion-focused pages they're specifically trying to rank for their focused keyword targets aren't really required, then really B, C, and D are where you should spend your time and energy. A is not a great investment. It's not solving the problem you want to solve.

If the campaign needs...

  • More raw brand awareness - People knowing who the company is, they haven't heard of them before. You're trying to build that first touch or that second touch so that people in the space know who you are.
  • Additional visitors for re-targeting - You're trying to get additional visitors who might fit into your target audience so that you can re-target and remarket to them, reach them again;
  • You have a need for more overall links really to anywhere on the domain - Just to boost your authority, to boost your link equity so that you can rank for more stuff...

Then A, that viral-focused content makes a ton of sense, and it is a true SEO investment. Even though it doesn't necessarily map very well to conversions directly, it's an indirect path to great potential SEO success.

Why this works:

Why does this work? Why is it that if I create a piece of viral content on my site that earns a lot of links and attention and awareness, the other pieces of content on my site will suddenly have a better opportunity to rank? That's a function of how Google operates fundamentally, well, Google and people.

So, from Google's perspective, it works because of cases like this: Google sees Domain X, which has lots of pages earning many, many different links from all around the web, and Domain Y, which may be equally relevant to the search query and maybe has just as good content, but has few links pointing to it. Maybe the same number of links are pointing to the specific pages targeting a specific keyword, but overall across the domain, X is just much, much greater than Y. Google interprets that as more links spread across the content on X, which makes the search engine believe that X is more authoritative and potentially even more relevant than Y is. This content has been referenced more, in more different ways, from more places; therefore its relevance and authority are perceived as higher. If Y can go ahead and make a viral content investment that draws in lots and lots of new links, it can potentially compete much better against X.

This is true for people and human beings too. If you're getting lots and lots of visitors all over Domain X, but very few on Domain Y, even if they're going in relatively similar proportion to the product-focused pages, the fact that X is so much better known by such a broader audience means that conversions are likely to be better. People know them, they trust them, they've heard of them before, therefore, your conversion rate goes up and Domain X outperforms Domain Y. So for people and for search engines, this viral-focused content in the right scenario can be a wonderful investment and a wise one to make to serve your SEO strategy.

All right, everyone. Look forward to your comments below. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by


Zero-Result SERPs: Welcome to the Future We Should’ve Known Was Coming

Posted by Dr-Pete

On Wednesday, Google launched a large-scale experiment, removing organic results from a small set of searches with definitive answers such as this one for "What time is it in Seattle?":

These SERPs display a Knowledge Card with a "Show all results" button and no additional organic results or SERP features. Danny Sullivan wrote on Twitter that this is currently limited to a small set of answers, including calculators, unit conversions, and some time/date queries. Here's another one, converting yesterday's MozCast temperature ("108 degrees in celsius"):

At first glance, this is a startling development, but it shouldn't be entirely surprising. So, let's get to the hard questions — is this a sign of things to come, and how quickly do we need to adapt?

For today, don't panic

First off, preliminary data suggests that these really are isolated cases. Across the 10,000 searches that MozCast tracks daily, one search (0.01%) currently displays zero results: "1 gigabit to gigabyte." This change is not impacting most high-volume, competitive queries or even the vast majority of results with Knowledge Cards.

Second, we have to face the reality that Knowledge Cards, even paired with organic results, already dramatically impact search user behavior. Thanks to Russ Jones, we've pulled some data from an internal CTR study we're currently working on at Moz. In that study, SERPs with 10 blue links have a roughly 79% organic click-through rate (overall). Add just a Knowledge Card, with no other features, and that drops to 25%. That's a 68% drop-off (54 points lost from a 79-point baseline), a loss of over two-thirds of organic clicks. Google has tested this change and likely found that showing organic links on these particular searches provided very little additional value.

This isn't new (part 1)

I'm going to argue that this change is one that we in the industry should've seen coming, and I'm going to do it in two parts. First, we know that Knowledge Cards and other answers (including Featured Snippets) power SERPs on devices where screen size is at a minimum or non-existent.

Take for example, a search for "Where was Stephen Hawking born?" Even though the answer is definitive (there is one factual answer to this question), Google displays a rich Knowledge Card plus a full set of organic SERPs. On mobile, though, that Knowledge Card dominates results. Here's a full-screen image:

The Knowledge Card extends below the fold and dominates the mobile screen. This assumes I see the SERP at all. Even as I was typing the question, Google tried to give me the answer...

If the basic information is all I need, and if I trust Google as a source for that information, why would I need to even click at this point?

On mobile, I at least have the option to peruse organic results. On Google Home, if I ask the same question ("Where was Stephen Hawking born?"), I get no SERP at all, just the answer:

"Stephen Hawking was born in Oxford, United Kingdom."

Obviously, this is born of necessity on a voice-only device like Google Home, but we get a similarly truncated result with voice searches through Google Assistant. This is the same answer on my phone (the same phone as the previous screenshots), but using voice search instead of text search...

Google's push toward voice UI and mobile-first design means that these considerations sometimes move back up the chain of devices. If the answer is enough for voice and mobile, maybe it's enough for desktop.

This isn't new (part 2)

Over the past couple of years, I've talked a lot about how SERPs have expanded well beyond 10 blue links. What we talk about less is the flip-side, that SERPs are also shrinking. Adding SERP features is, in some cases, a zero-sum game, at the cost of organic results.

Each of the following features take up one organic position:

  • Full site-links (each row)
  • Image results
  • Top Stories
  • In-depth articles (3 articles = 1 organic)
  • Tweets (carousel)
  • Tweets (single)

Across the 10,000 SERPs in our data set, over half (51%) had fewer than 10 traditional organic results. While very low counts are rare, over one-fourth of page-one SERPs fell into the range of 5–8 organic results.

While the zero-result SERP is certainly a new and extreme case, the removal of organic results in favor of other features has been happening (and expanding) for quite some time now. SERPs with as few as 3–4 page-one organic results have been appearing in the wild for well over a year.

In some cases, you might not even realize that a result isn't organic. Consider, for example, the following set of results on desktop. Can you spot the In-depth Articles?

On desktop results, there are no visual markers separating In-depth Articles from organic results, even though these results are powered by two different aspects of the algorithm. From the source code markers, we can see that the answer is #2–#5, three results which displace one organic result:

Another example is Twitter results. You've probably seen the Twitter carousel, which is a visually distinct format with three tweets, but have you seen a result like this one (on a search for "cranberry")?

At first glance, it looks organic (except for the Twitter icon), but this result is a vertical result pulled directly from the Twitter data feed. It is not subject to traditional organic optimization and ranking factors.

All of this is to say that organic real estate has been shrinking for quite a while, giving way to vertical results, Knowledge Graph results, and other rich features. Google will continue to experiment, and we can expect that some SERPs will continue to shrink. Where the data suggests that one answer is enough, we may only see one answer, at the cost of organic results.

Search intent vs. opportunity

It's easy to let our imaginations run wild, but we have to consider intent. The vast majority of searches are never going to have one definitive answer, and some queries aren't even questions, in the traditional sense.

From an SEO and content standpoint, I think we have to expand our idea of informational search intent (vs. transactional or navigational, using the classic model). Some questions are factual, and can be answered by the ever-expanding Knowledge Graph. As of today, a search like "When is Pi Day?" still shows organic results, but the Knowledge Card gives us a definitive answer...

Here, organic opportunity is very limited. Think of this as a "closed informational" search.

On the other hand, open-ended questions still rely very much on a variety of answers, even when Google tries to choose one of those answers. Consider the search "What is the best pie?", which returns the following Featured Snippet (a hybrid of organic result and answer box)...

No one answer will ever suffice for this question. Even the author of this post had the decency to say "Go ahead and let me have it in the comments," knowing the disagreement would soon flow like cherry filling.

Think of these searches as "open informational" searches. Even if we have to compete for the Featured Snippet (especially on voice results), there will be organic/SEO opportunity here for the foreseeable future.

Ultimately, we have to adapt, and we have to get smarter about the searchers we target. Where Google can answer a question, they will try to answer that question, and if organic results add no measurable value (regardless of whether you agree with how Google measures value), they will continue to shrink.


Getting Around the “One Form” Problem in Unbounce

Posted by R0bin_L0rd

What is Unbounce?

Unbounce is a well-known and well-regarded landing page creation tool designed to allow HTML novices to create impressive and impactful landing pages, while offering scope for more experienced coders to have a bit more fun.

In this post, I’m going to list some solutions to what I refer to as the “one form” problem of Unbounce, their strengths and weaknesses, and which I personally prefer.

What is the "one form" problem?

As with any system that tries to take complex processes and make them simple to implement, there’s a certain amount of nuance and flexibility that has to be sacrificed.

One limitation is that each landing page on Unbounce can only have one embedded form (there are a few community articles discussing the topic, for instance: 1, 2, 3). While there’s a definite risk of call-to-action fatigue if you bombard your visitors with forms, it’s a reasonable requirement to want to provide easy access to your form at more than one point.

For example, you could lead with a strong call to action and the form at the top of the page, then follow up further down the page when users have had time to absorb more information about your offering. A simple example of this is the below Teambit landing page, which was featured in Hubspot’s 16 of the Best Landing Page Design Examples You Need to See in 2017.

The top of this Teambit page features a simple email collection form

The form is repeated at the bottom of the page once visitors have had a chance to read more.

Potential solutions to the one-form issue

Now that we’ve established the problem, let’s run through some solutions, shall we?

Fortunately, there are a few possible ways to solve this problem, either using built-in Unbounce tools or by adding code through open HTML, CSS, and JavaScript inputs.

It’s worth bearing in mind that one solution is to not have the form on your page at all, and have your call-to-action buttons linking to other pages with forms. This is the approach Unbounce uses in the example below. While that’s a perfectly valid approach, I wouldn’t call it so much a solution to this problem as a completely different format, so I haven’t included it in the list below.

Here Unbounce use two CTAs (the orange buttons), but don’t rely on having the form on the page.

1. Scrolling anchor button

This is potentially the simplest solution, as it’s natively supported by Unbounce:

  1. Create a button further down the page, where you want your second form to be.
  2. Edit that button and go to the "Click Action" section of the right-hand button settings panel, where you would normally put the URL you are linking to.
  3. Instead of a URL, add in the unique ID code for the box that holds your form (you can find that ID by editing the box and scrolling to the bottom of the right-hand panel to "Element Metadata").
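To make that concrete: if the box holding your form happened to have the Element Metadata ID lp-pom-box-54 (your page's IDs will differ; this one is just borrowed from the CSS examples later in this post), the button's click action would point at that ID, e.g. #lp-pom-box-54, rather than at a URL.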

Register button

“Click Action” section of right-hand button settings panel

“Element Metadata” section at bottom of right-hand element setting panel


Pros: Quick and easy to implement; little direct JavaScript or HTML manipulation needed.


Cons: There are far more seamless ways to achieve this from the user perspective. Even with smooth scrolling (see "bonus points" below), the experience can be a little jarring for users, particularly if they want to go back to check information elsewhere on a page.

Bonus points

Just adding that in as-is will mean a pretty jarring experience for users. When they click the button, the page will jump back to your form as though it’s loaded a new page. To help visitors understand what’s going on, add smooth scrolling through JavaScript. Unbounce has how-to information here.

Double bonus

The link anchors work by aligning the top of your screen with the top of the thing you’ve anchored. That can leave it looking like you’ve undershot a bit, because the form is almost falling off the screen. You can solve this simply by putting a tiny, one-pixel-wide box a little bit above the form, with no fill or border, positioning it how you want, and linking to the ID of that box instead, allowing a bit of breathing room above your form.

Without and with the one-pixel-wide box for headroom

2. iFrames

Unbounce allows free <HTML> blocks, which you can use to embed a form from another service or even another Unbounce page that consists of only a form. You’ll need to drag the “Custom HTML” block from the left bar to where you want the form to be and paste in your iFrame code.

The “Custom HTML” block in the left-hand bar

Blank HTML box that pops up

How HTML blocks look in the editor


Pros: This will allow for multiple forms, for each form to be positioned differently on the page, to function in a different way, and for entries to each form to be tagged differently (which will offer insight on the effectiveness of the page).

This solution will also allow you to make the most of functionality from other services, such as Wufoo (Unbounce has documented the process for that here).


Cons: Having chosen Unbounce as a one-stop shop for creating landing pages, breaking out of that to use external forms could be considered a step away from the original purpose. This also introduces complications in construction, because you can't see how the form will look on the page in the editing mode. So your workflow for changes could look like:

  1. Change external form
  2. Review page and see styling issues
  3. Change layout in Unbounce editor
  4. Review page and see that the external form isn’t as readable
  5. Change external form
  6. Etc.

Bonus points

Unbounce can’t track conversions through an iFrame, so even if you use another Unbounce page as the form you draw in, you’re going to be breaking out of Unbounce’s native tracking. They have a script here you can use to fire external tracking hits to track page success more centrally so you get more of a feel for whether individual pages are performing well.

Double bonus

Even if you’re using an identical Unbounce page to pull through the same form functionality twice, tag the form completions differently to give you an idea of whether users are more likely to convert at the top of the page before they get distracted, or lower down when they have had time to absorb the benefits of your offering.

3. Sticky form (always there)

An option that will keep everything on the same page is a sticky form. You can use CSS styling to fix it in place on a screen rather than on a page, then when your visitor scrolls down, the form or CTA will travel with them — always within easy reach.

This simple CSS code will fix a form on the right-hand side of a page for screen widths over 800px (that being where Unbounce switches from Desktop to Mobile design, meaning the positioning needs to be different).

Each ID element below corresponds to a different box which I wanted to move together. You’ll need to change the “lp-pom-box-xxx” below to match the IDs of what you want to move down the page with the user (you can find those IDs in the “Element Metadata” section as described in the Scrolling Anchor Button solution above).

@media (min-width: 800px) {  
    #lp-pom-box-56{ position:fixed; left:50%; margin-left: 123px; top:25%; margin-top:-70px}  
    #lp-pom-form-59{ position:fixed; left:50%; margin-left: 141px; top:25%; margin-top:60px}  
    #lp-pom-box-54{ position:fixed; left:50%; margin-left: 123px; top:25%; margin-top:50px}}


Pros: This allows you to keep tracking within Unbounce. It cuts out a lot of the back and forth of building the form elsewhere and then trying to make that form, within an iFrame, act on your page the way you want it to.


Cons: The problem with this is that users can quickly become blind to a CTA that travels with them, and adding some kind of regular attention-seeking effect is likely to just annoy them. The solution here is to have your call to action or form obscured during parts of the page, only to reappear at other, more appropriate times (as in the next section).

It can be difficult to see exactly where the form will appear because your CSS changes won’t take effect in the editor preview, but you will be able to see the impact when you save and preview the page.

4. Sticky form (appearing and disappearing)

The simplest way to achieve this is using z-index. In short, the z-index is a way of communicating layers in HTML: an image with a z-index of 2 will be interpreted as closer to the user than a box with a z-index of 1, so when viewing the page it'll look like the image is in front of the box.

For this method, you’ll need some kind of opaque box in each section of your page. The box can be filled with a color, image, gradient — it doesn’t matter as long as it isn’t transparent. After you’ve put the boxes in place, make a note of their z-index, which you can find in the “Meta Data” section of the right-hand settings bar, the same place that the Element ID is shown.

This box has a z-index of 31, so it’ll cover something with an index of 30

Then use CSS to select the elements you’re moving down the page and set their z-index to a lower number. In the below lines I’ve selected two elements and set their z-index to 30, which means that they’ll be hidden behind the box above, which has a z-index of 31. Again, here you’ll want to replace the IDs that start #lp-pom-box-xxxx with the same IDs you used in the Sticky Form (Always There) solution above.

    #lp-pom-box-133{z-index: 30; }  
    #lp-pom-box-135{z-index: 30; }

When you're choosing the place where you want your form to be visible again, just remove any items that might obscure the form during that section. It’ll scroll into view.


Pros: This will allow you to offer a full form for users to fill out, at different points on the page, without having to worry about it becoming wallpaper or whether you can marry up external conversions. Using only CSS also means that you don't have to worry about users with JavaScript turned off (while the bonus points below rely on JavaScript, this will fall back gracefully if JavaScript is turned off).


Cons: Unlike the iFrame method, this won't allow you to use more than one form format. It also requires a bit more CSS knowledge (and the bonus points will require at least a bit of trial and error with JavaScript).

Bonus points

Use JavaScript to apply and remove CSS classes based on your scrolling position on the page. For example, you can create CSS classes like these, which make elements fade in and out of view.


@media (min-width: 800px) {
    /* make the opacity of an element 0 where it has this class */
    .hide {
        opacity: 0;
    }
    /* instead of applying an effect immediately, apply it gradually over 0.2 seconds */
    .transition {
        -webkit-transition: all 0.2s ease-in-out;
        -moz-transition: all 0.2s ease-in-out;
        -o-transition: all 0.2s ease-in-out;
        transition: all 0.2s ease-in-out;
    }
}

You could then use this JavaScript to apply the .hide class when the user scrolls past certain points, and remove it when they get to the points where you want them to see the form. This can be used for finer-grained control of where the form appears, without having to just cover it up. As before, you’ll need to update the #lp-pom-box-xxx strings to match the IDs in your account.


// This script applies the "hide" class, which makes opacity zero, to certain elements when we scroll more than 100 pixels away from the top of the page. Effectively, if we scroll down the page these items will fade away.
$(window).scroll(function() {
    if ($(window).scrollTop() > 100 ){
        $('#lp-pom-box-133, #lp-pom-box-135').addClass('transition hide');
    }
// This section removes the hide class if we're less than 500 pixels from the bottom of the page or scroll back up to be less than 100 from the top. This means that those elements will fade back into view when we're near the bottom of the page or go back to the top.
    if ($(document).height() - ($(window).height() + $(window).scrollTop()) < 500 ||
        $(window).scrollTop() < 100 ){
        $('#lp-pom-box-133, #lp-pom-box-135').removeClass('hide');
    }
});

Double bonus

You could consider using JavaScript to selectively hide or show form fields at different points. That would allow you to show a longer form initially, for example, and a shorter form when it appears the second time, despite it actually being the same form each time.

For this, you’d just add to your .scroll JavaScript function above:

    if ($(document).height() - ($(window).height() + $(window).scrollTop()) < 75){
// This part hides the "full name" part of the form (replace the selector with the ID of the field you want to hide), moves the submit button up and reduces the size of the box when we scroll down to less than 75 pixels away from the bottom of the page
        $('#full-name-field').hide();
        $('#lp-pom-box-54').stop().animate({height: "200px"}, 200);
        $('.lp-pom-button-60-unmoved').animate({top: '-=75'}, 200).toggleClass('lp-pom-button-60-unmoved lp-pom-button-60-moved');
    } else {
// This part adds the "full name" part back in to the form, moves the submit button back down and increases the size of the box if we scroll back up.
        $('#full-name-field').show();
        $('#lp-pom-box-54').stop().animate({height: "300px"}, 200);
        $('.lp-pom-button-60-moved').animate({top: '+=75'}, 200).toggleClass('lp-pom-button-60-moved lp-pom-button-60-unmoved');
    }

When scrolling within 75px of the bottom of the page, our JavaScript hides the Full Name field, reduces the size of the box, and moves the button up. This could all happen when the form is hidden from view; I’ve just done it in view to demonstrate.


Below, I’ve pulled together a quick list of the different solutions and their strengths and weaknesses.

Scrolling anchor button
Strengths: Easy implementation, little coding needed
Weaknesses: Jarring user experience

Form in an iFrame
Strengths: Allows multiple different forms
Weaknesses: Requires building the form elsewhere and introduces some styling and analytics complexity to the workflow

Sticky form (always there)
Strengths: Keeps design and tracking within one Unbounce project
Weaknesses: CTA fatigue, uses up a lot of page space

Sticky form (appearing and disappearing)
Strengths: The benefits of a sticky form, plus avoiding the CTA fatigue and large space requirement
Weaknesses: CSS knowledge required, can only use one form

Personally, my favorite has been the sticky form (appearing and disappearing) option, as it reduces the need to integrate external tools, but if I had to use multiple different forms I could definitely imagine using an iFrame.

Which is your favorite? Have I missed any cool solutions? Feel free to ping me in the comments.


8 Common Website Mistakes Revealed Via Content Audits

Posted by AlliBerry3

One of the advantages of working for an agency is the volume of websites we get to evaluate. The majority of clients who sign up for ongoing SEO and/or content services will receive a content audit. Similar to a technical SEO audit, the results of the content audit should drive the strategies and priorities of the next stages of content work. Without the audit, you can’t create an effective strategy because you first need to know what types of content you’ve got, what content you’re missing, and what content you’ve got too much of.

While there are many posts out there about how to perform a content audit (and I encourage you to check out these posts: How to Do a Content Audit and 5 Lessons Learned from a Content Audit), I am going to be focusing on what my common findings have been from recently conducting 15 content audits. My aim is to give you more of a framework on how you can talk to clients about their content or, if you are the client, ways you can improve your website content to keep users on the site longer and, ultimately, convert.

Mistake #1: No clear calls-to-action

I have yet to complete a content audit where creating clearer calls-to-action wasn’t a focus. The goal of a page should be obvious to any visitor (or content auditor). What is it that you want a visitor who lands on this page to do next? Many of our clients are not e-commerce, so it may feel less obvious; however, assuming you want someone to stay on your website, what’s next?

Even if the answer is “I want them to visit my store,” make it easy for them. Add a prominent “Visit Our Store” button. If it’s a simple blog page, what are the next blog articles someone should read based on what they just read? Or do you have a relevant e-book you’d like them to download? You got them to the end of your post — don’t lose the visitor because they aren’t sure what to do next!

Mistake #2: A lack of content for all stages of the customer journey

One thing we often do when conducting content audits is track where in the sales funnel each page is aimed (awareness, consideration, purchase, or retention). What we sometimes find is that clients tend to have a disproportionate amount of content aimed at driving a purchase, but not enough for awareness, consideration, and retention. This isn’t always the case, particularly if they have a blog or resources hub; however, the consideration and retention stages are often overlooked. While the buyer cycle is going to be different for every product, it’s still important to have content that addresses each stage, no matter how brief the stage is.

Retention is a big deal too! It is far more cost-efficient and easier to upsell and cross-sell to current customers than to bring in new ones. Your customers are also less price-sensitive because they know your brand is worth it. You definitely want to provide content for this audience to keep them engaged with the brand and help them find new uses for your products. Plus, you’ve already got their contact information, so delivering content to them is much easier than delivering it to a prospect.

Here are some examples of content for each stage:

Awareness: Blog posts (explainers, how-tos, etc), e-books, educational webinars, infographics

Consideration: Product comparisons, case studies, videos

Purchase: Product pages, trial offers, demos, coupons

Retention: Blog posts (product applications, success stories, etc), newsletters, social media content

Mistake #3: Testimonials aren’t used to their full potential

There are so many pages dedicated solely to testimonials out there on the Interwebs. It’s painful. Who trusts a testimonials page over reviews on third-party sites like Yelp, Google My Business, or Tripadvisor? No one. That being said, there is a place for testimonials. It’s just not on a testimonials page.

The best way to use a testimonial is to pair it with the appropriate copy. If it’s a testimonial about how easy and fast a customer received their product, use that on a shipping page. If it’s a testimonial about how a product solved a problem they had, use it on that product page. This will enhance your copy and help to alleviate any anxieties a prospective customer has with their decision to purchase.

Testimonials can also help you improve your local relevance in search. If you have a storefront that is targeting particular cities, ask for a customer’s city and state when you gather testimonials. Then, include relevant testimonials along with their city and state on the appropriate location page(s). Even if your store is in Lakewood, Colorado, collecting testimonials from customers who live in Denver and including them on your location page will help both search engines and users recognize that Denver people shop there.

Mistake #4: Not making content locally relevant (if it matters)

If location matters to your business, you should not only use testimonials to boost your local relevance, but your content in general. Take the auto dealership industry, for example. There are over 16,000 car dealerships in the United States and they all (presumably) have websites. Many of them have very similar content because they are all trying to sell the same or similar models of cars.

The best car dealership websites, however, are creating content that matters to their local communities. People who live in Denver, for example, care about what the best cars are for driving in the mountains, whereas people in the Los Angeles area are more likely to want to know which cars get the best highway gas mileage. Having your sales team take note of common questions they get asked and addressing them in your content can go a long way toward improving local relevance and gaining loyal customers.

Mistake #5: Not talking about pricing

Many companies, B2B companies in particular, do not want to list pricing on their website. It’s understandable, especially when the honest answer to “how much does your service cost?” is “it depends.” The problem with shying away from pricing altogether, though, is that people are searching for pricing information. It’s a huge missed opportunity not to have any content related to pricing, and it annoys prospective customers who would rather know your cost range before giving you a call or submitting a form for follow up.

It’s mutually beneficial to have pricing information (or at least information on how you determine pricing) on your website because it’ll help qualify leads. If a prospect knows your price range and they still reach out for more information, they're going to be a much better lead than someone who is reaching out to get pricing information. This saves your sales team the trouble of wasting their time on bad leads.

Having pricing information on your website also helps establish trust with the prospect. If you aren’t transparent about your pricing, it looks like you charge as much as you can get away with. The more information you provide, the more trustworthy your business looks. And if all of your competitors are also hiding their pricing, you’re the first one they’ll likely reach out to.

Mistake #6: Getting lost in jargon

There are a lot of great companies out there doing great work. And more often than not, their website does not reflect it as well as it could. It isn’t uncommon for those tasked with writing web copy to be quite close to the product. What sometimes happens is jargon and technical language dominates, and the reason why a customer should care gets lost. When it comes to explaining a product or service, Joel Klettke said it best at MozCon 2017. A web page should include:

  • What is the product and why should a prospect care about it?
  • How will this product make the prospect’s life easier/better?
  • What’s the next step? (CTA)

It’s also important to include business results, real use cases, and customer successes with the product on your website too. This establishes more trust and supports your claims about your products. Doing this will speak to your customers in a way that jargon simply will not.

Mistake #7: Page duplication from migration to HTTPS

With more sites getting an SSL certificate and moving to HTTPS, it’s more important than ever to make sure you have 301 redirects set up from the HTTP version to the HTTPS version to prevent unintentional duplication of your entire website. Duplicate content can hurt search rankings as search engines struggle to decide which version of a page is more relevant to a particular search query. We’ve been seeing quite a few sites with an entire duplicate site, or isolated pages, that didn’t get redirects put in place during their migrations. We also keep seeing sites that serve www and non-www versions of pages without 301 redirects. Running regular crawls will help you stay on top of this kind of duplicate content.
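
As an illustration only (not a drop-in fix), on an Apache server both redirects might look roughly like this in an .htaccess file, assuming mod_rewrite is enabled and “example.com” stands in for your canonical non-www HTTPS hostname; nginx and other servers have their own equivalents, and you should adapt and test the rules for your setup:

RewriteEngine On
# Send any HTTP request to the HTTPS version of the same URL with a 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
# Send www requests to the non-www (canonical) hostname with a 301
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]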

Mistake #8: Poor internal linking and site architecture

How content is organized on a site can be just as important as what the content is. Without proper organization, users can struggle to surf a website successfully and search engines have a difficult time determining which pages are considered most important. Making sure your most important pages are structured to be easy to find, by listing them in your navigation, for example, is a good user experience and will help those pages perform better.

Part of making important pages easy to find is through internal linking. Web content is often created on an ongoing basis, and being smart about internal linking requires taking the time to look holistically at the site and figuring out which pages make the most sense to link to and from. I keep encountering blog content that does not link back to a core page on the site. While you don’t want product to be the focus of your blog, it should be easy for a user to get to the core pages of your site if they want to do so. As you’re auditing a site, you’ll find pages that relate to one another that don’t link. Make notes of those as you go so you can better connect pages both in copy and with your calls to action.

Wrapping up

What I find most interesting about content audits is how subjective they are. Defining what makes content good or bad is gray in a way that identifying whether or not a page has, say, a canonical tag is not. For that reason, I have found that what content auditors focus most heavily on tends to be a reflection of the background of the person doing the audit. And the most common content mistakes I have touched on here reflect my background perfectly, which is a meld of SEO and content marketing.

So, I’m curious: what do you look for and find in your content audits? What would you add to my list?


The Moz Year in Review 2017

Posted by SarahBird

Yay! We’ve traversed another year around the sun. And I’m back with another Moz year-in-review post that promises to be as boring as its predecessors. Reading it feels like being locked in your tin can space capsule through lightyears of empty space. If you’re a little odd and like this kind of thing, do please continue.

Before we begin our odyssey, I invite you to check out previous reports: 2016, 2015, 2014, 2013, 2012. Transparency is a Moz core value. Putting detailed financial and customer data on the blog is one of the ways we live our values. We’re a little weird like that.

Image: Tesla's red car in outer space, floating past Earth with Rocketman at the wheel

Okay spacepeople: take your protein pills and put your helmets on.

Launch to your favorite parts:

Part 1: TL;DR
Commencing countdown, engines on

Part 2: SO MANY wins
Now it's time to leave the capsule if you dare

Part 3: Customer metrics
You've really made the grade

Part 4: Financial performance
And the stars look very different today

Part 5: Inside Moz HQ
The papers want to know whose shirts you wear

Part 6: Into the future
I think my spaceship knows which way to go

Part 1: TL;DR

Commencing countdown, engines on

What a year! 2017 was a time of doing new things differently — new teams, new goals, and new ways of operating. I’m so proud of Mozzers; they can do anything. If I’m sent to a far-off space colony, I want them with me.

In spite of (and because of!) all this change, we grew revenue and became significantly EBITDA and cash flow positive. Nice! We have a nice economic engine primed and ready to make some more big investments in 2018. Stay tuned.

These positive results were not from one single thing. Rather, iterative product and operations improvements helped improve both our top and bottom line. Plus, we made a bunch of longer-term investments that don’t show up yet in the 2017 report but will bear fruit in 2018 and beyond.

Part 2: Ch-ch-ch... Changes!

Now it's time to leave the capsule if you dare

Here’s a little more detail on some of the changes I talked about.

We launched Keywords By Site, relaunched our crawler (a major technical undertaking), sunsetted two products (Moz Content and Followerwonk), built a bunch of new developer tools and standardized on some dev frameworks, and improved our local data distribution network. Check out Adam Feldstein’s post for a lot more detail on our 2017 major product accomplishments!

We’ve got another exciting launch on the way, too. We’ve invested a ton of blood, sweat, and tears into it during 2017 and can’t wait to share it with everyone.

All of these changes support our 2016 strategy of “more wood behind fewer arrows.” We choose to focus our energy on being the best place to learn and do SEO. Our mission is to simplify SEO for everyone through software, education, and community.

Image: an arrow with the text "More wood behind fewer arrows"

For those of you worried about Followerwonk, it’s going to be okay. Better than okay. Our beloved “Work Dad” Marc Mims is now the proud father of Followerwonk! Marc’s dedication to the success of Followerwonk has never wavered over the many, many years he’s been building and maintaining it. We already miss his compassion, humor, and bike stories around the Mozplex. We wish him and Followerwonk the best! We bought that product because we loved it then; we love it even now. Sadly, though, it never quite fit with our mission as well as we'd hoped.

We created new programs to help people get the SEO help that’s right for them. We completely rebuilt our SEO Learning Center with fresh educational content. There’s a brand-new SEO podcast, MozPod, for you to check out.

We also began experimenting with and are now expanding SEO training workshops delivered by experts we trust and admire. I’m so excited about this because it’s a new way for Moz to have impact; it’s personal, live, interactive, and immediate in a way that most of our SEO education work can’t be. We won’t stop doing free, scalable education. It’s core to our beliefs. But it is fun to deliver custom, live training sessions in the mix too.

1340 Training registrations. Training programs offer live webinars taught by real SEO instructors.

5574 Walkthroughs completed. Book your walkthrough to get 1:1 time with a Moz expert and learn how the tools can help you achieve your SEO goals.

Many of our accomplishments are behind the scenes, and will deliver long-term positive impact.

Our investments in retiring tech debt, improving monitoring, investing in our development platforms, and nurturing our engineering culture have resulted in the most stable and performant software in Moz history. Our hard work and ingenuity are paying off.

We’ve also rebuilt most of our customer stack: new Salesforce implementation, HubSpot launch, new internal data warehouse, new CMS (Craft), and more! Phew! That’s a lot! In Q1 2018, we started with Terminus for Account-Based Marketing, and partnered with third-party data vendors, like Full Contact, to supplement our data warehouse. These big changes are going to set us up really well for the years ahead. And we’ve got more internal tools launching soon!

Image: an animated gif of rockets boosting

We are on a roll with internal improvements and momentum.

Part 3: Customer metrics

You've really made the grade

We could ship and launch until our circuits go dead, but at the end of the day all our work is in service of meeting your needs.

Image: stats about community & customer numbers

Image: graph showing +9% increase in organic traffic to

We know you can hear us! You’re following us now more than ever before.

Image: stats about social media followers

Part 4: Financial performance

And the stars look very different today

Check out the infographic view of our data barf.

I’m proud of what we accomplished in 2017, especially considering the incredible amount of change in strategy and team structure. More revenue while spending less = magic! Also, the economic strength we’ve built will allow us to place some nice-sized bets this year. Boom!

We made $47.4 million in GAAP revenue in 2017, an increase of 11% from 2016.

$47.4 million. 2017 revenue

We brought our overall expenses way down in 2017. Cost of Revenue increased slightly to $11.8 million. We reduced operating expenses aggressively. Curious about what we spend money on and how that’s trending? Check out this breakdown of our major expenses (OpEx and Cost of Revenue) as a percentage of annual revenue:

Image: Major expenses graphed for 2013 through 2017

We generated cash, positive EBITDA, and for the first time in recent Moz history, we were positive net income.

$5.5 million EBITDA gain. EBITDA is one of our superstar metrics at $5.5 million, an $11.2 million improvement over 2016!

$1.5 million cash added. Did you say "cashflow positive?!" Yes. Yes we did.

That’s quite a turnaround from 2016, in which we closed the year negative EBITDA of $5.7 million! We flipped EBITDA! We have adopted a cash-flow-neutral-to-positive operating philosophy right now to be ready for some future investments. We may decide to go cash flow negative again to fund further growth.

Part 5: Inside Moz HQ

The papers want to know whose shirts you wear

So, who is behind the wheel here?

We ended 2017 with roughly the same number of Mozzers as we began. It was a conscious choice to remain approximately headcount neutral in 2017; we only opened up new positions after ensuring rigorous conversations took place around the business need for the role. This discipline is hard to live under, but we like the results. We’re working smarter, and getting more rigorous in our decision-making.

Let me be clear: WE ARE HIRING! These are just 5 of our currently open positions:

See more at our Careers page!

A graph: "Inside Moz HQ" detailing the number of Mozzers from 2007-2017. 2007 has 9 employees, 2017 has 150.

Here’s where we need YOU: Moz is committed to bringing more women into tech. There is a dire lack of diversity in the technology industry. This past year we added 6% more women to the company overall and 9% to engineering specifically. We must and will do better. We need more women in engineering and leadership roles here. Check out those jobs above and join the team!

Stats on women in tech roles at Moz

Moz partners with some fantastic organizations focused on getting more women into the tech pipeline. Ada Academy, Year Up, Ignite Worldwide, and Techbridge all encourage women and girls to pursue STEM careers early in their lives. Our newest partner, Unloop, enables people who have been in prison to develop skills and succeed in careers in tech. It is our responsibility to ensure that all people have opportunity and access to participate in STEM fields.

Generosity comes in many forms. One way in which we support the generosity of Mozzers is to match charitable donations to 501c3 organizations by 150%.

Moz Charitable Donation Match: $65,810 donated to charity. We have a generous 150% charity donation match program.

We also donated our space 35 times to various organizations in the community requesting to use the Mozplex as a venue for their meetups. Check out our event brochure and take a 360 tour of the Mozplex!

Mozzers also donate a ton of time to causes they are passionate about. We also offer a deeply discounted price for nonprofits, which we’re happy many folks take advantage of. We’re passionate about communities and helping folks.

303 Coaching sessions in 2017

Moz partnered with Halo Partners to provide professional coaching to all employees. 54 Mozzers received coaching. 27 Mozzers used this benefit for the first time! I’m a huge believer in coaching and training. Beginner’s mind is how we grow and become the best versions of ourselves.

Through it all, we made sure to have some fun. Moz offers a Paid Paid Vacation benefit, reimbursing employees $3k per year in vacation costs. Yes, that’s right. You get your regular pay, plus another $3k a year to spend on your trip! It’s bonkers!

Paid, paid vacation: $456,728.40. Paid vacation isn't enough. We also pay for Mozzers' airfare, hotels, and food while they vacation.

Mozzers visited 6 of the 7 continents last year!

We also had 7 Mozling babies last year. Luv those babies.

Part 6: Into the future

I think my spaceship knows which way to go

2017 was a strengthening year for Moz. We went through a lot of change and made some important investments. Mozzers are dynamic, helpful, smart, and hardworking. They have a service orientation and build for the long term. The investments we made in 2017 will bear fruit in the years ahead. And we’re poised to make some ambitious moves in the coming months.

While I’m proud of what we’ve accomplished, I believe we have higher mountains still to climb. We have had triumphs and tribulations, heartbreaks and happy dances. These many years later, the SEO industry is healthy, growing, and dynamic. Many organizations are still struggling with basic web presence, let alone thoughtful SEO strategy. Moz is still teensy-tiny compared to the opportunity. I believe the opportunity for SEO expertise is vast.

I want to close on a note of gratitude.

First, a bunch of folks helped pull together the metrics for our 2017 report, and I am deeply grateful for their help. This post is kind of a bear! Thank you Jess, Felicia, Christian, Kevin, Susan, Michael, Jeremy, and anyone else who pulled data and helped get this post off the ground!

Second, thank you to this community. It’s because of you that we are here. This community would be nothing if it wasn’t for your care, attention, and feedback. We will continue to work hard to make your work lives more enjoyable and successful. We want to be your favorite resource for doing great SEO. If we’re not there yet, trust that we will keep working to be. Thank you for the opportunity to serve.

Gratitude also to David Bowie for inspiring this post and so much more. We miss you. <3


Discover Featured Snippet Opportunities – Whiteboard Friday

Posted by BritneyMuller

Winning featured snippets is one of the best ways to get visibility on page one of Google's SERPs. It's a competitive environment, though, and there are tons of specific considerations when it comes to increasing your chances of earning that spot. Today's Whiteboard Friday, number one of an upcoming three-part series, is brought to you by Moz's resident SEO and mini-pig advocate, Britney Muller. She covers the keyword research you'll need to do, evaluating your current ranking, and recording relevant data in a spreadsheet with the help of a couple useful tools.

Discover Featured Snippet Opportunities

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans, welcome to another edition of Whiteboard Friday. Today we're going over all things discovering featured snippet opportunities. So this is the first part to three videos. So this will be the discover, but we're also going to have a target and a measure video as well. So really, really excited. It's going to be a ton of fun. I'm doing it with you, so you're not going to be alone. It's going to be this cool thing we all do together.

Part 1 of 3: Discover, target, measure

So for those of you that don't know what a featured snippet is, it is one of those answer boxes at the top of search results. So let's say you do a search like, oh, I don't know, "Are teacup pigs healthy?" which they're not, super sad. I love pigs. But you'll get a featured snippet box like this that tells you like, "No, they're actually starved." It gives you all this information. So it's different than something like "People also ask..." boxes or your typical search results.

They're particularly appealing because of the voice search component. So most voice searches come from those featured snippet boxes as well as it just being really appealing in general to have that top spot.

#1 Keyword research

So this process is pretty straightforward. You're going to start with just your basic keyword research. So you're also going to focus on questions. A.J. Ghergich did this incredible study write-up, on SEMrush, about featured snippets, where he discovered that around 41% of all featured snippets come from questions, which makes sense. The how questions are really interesting because those results are most likely to result in lists.

Now, lists take both the form of numerical as well as bullets. So something to kind of keep in mind there. But what's interesting about these lists is that they tend to be a little bit easier to truncate. So if you know that Google is showing 8 results, maybe you go back to your page and you make sure that you have 10. That way it lures people in to click to see the full list. Really interesting there.

#2 Evaluate your current ranking

You also want to evaluate your current ranking for these particular keywords. You want to prioritize keywords based on ones that you rank on page one for. It tends to be much easier to grab a featured snippet or to steal one if you're also on page one.

#3 Put data into a spreadsheet

From there, we're going to put all of this data and more data into a big spreadsheet so that we can really analyze and prioritize some of these opportunities. So some of the metrics I came up with — feel free to share some ideas below — are your keyword, average monthly search volume, current featured snippet URL, that's this guy over here. What is that domain authority and page authority? You want to make note of those. Is it a featured snippet hub? This is such a cool term, that A.J. came up with in his article, that essentially coins a featured snippet URL that ranks for 10 or more featured snippet boxes. You probably won't know this right away, so this might stay blank. But once you start seeing more and more of those same URLs, you might think it's one of those hubs. It's kind of cool.

Featured snippet type. Is it a paragraph, a list, or a table? Is there any markup? Is there schema markup? What's going on, on the page in general? Just sort of scope all that out. What's your rank? This is actually supposed to be over here. So, again, you want to see if you're ranking 10 or under on a particular page, hopefully on page 1.

Then is there an image? So the featured snippet images are really interesting, because Google likes to swap them out and remove them and test them and do all this crazy stuff. I got to talk about these images and the tests that I've been doing on them on the Two Peas podcast with Brian Childs, part of his MozPod podcast series. It was super fun. I share some secrets in there, so go check it out.

Then what's the type of image? So typically, you can start to see a theme of a particular niche or industry in their featured snippet images. Maybe they're all stock photos, or maybe they're all informational type photos. Maybe they all have the same color. Really interesting to sort of keep an eye on those.

What's your desired featured snippet URL? This will typically be whatever URL is ranking. But maybe not. Make note of that.

Other notes, you can mention where Google is pulling the featured snippet from that page. I think that stuff is super interesting. You can do all sorts of fun stuff.

Research tools to use

So two primary tools to do all of this research within are Moz Keyword Explorer and SEMrush. Both have some caveats. Moz Keyword Explorer is great because you can do a bunch of keyword research and save them into lists. However, you can't do keyword research only viewing the keywords that have featured snippets. You can't do that. You have to do all the keyword research, put it into a list, and then we give you that data.

With SEMrush, it's pretty cool. You get to filter only by featured snippet keywords. So that, off the bat, is awesome.

However, once you get a keyword list put together in Keyword Explorer, not only do you get that information of whether or not there's a featured snippet, but right within your list of keywords, you have the ability to add your website and immediately see your rank for all of those particular keywords in your list, making this super, super easy.

I tried to do this with SEMrush. I know they have all of the features necessary to do so. However, it's just not as easy. You have to use a combination of their different tools within their tool. I hit a couple different limits within Keyword Analyzer, and then by the time I got to position tracking, I lost my search volume from Keyword Magic tool, which was super frustrating.

There might be a better, easier way to do that. Maybe their API is pulling some stuff a little bit differently. Feel free to comment down below. Maybe there's a better way than either of these. I don't know. You could also do it pretty manually too. You could use Google Keyword Planner and look some of this stuff up yourself.

But I hope you enjoyed this. Thank you so much for joining me on this edition of Whiteboard Friday. I look forward to seeing you all soon. Thanks.


The Website Migration Guide: SEO Strategy & Process

Posted by Modestos

What is a site migration?

A site migration is a term broadly used by SEO professionals to describe any event whereby a website undergoes substantial changes in areas that can significantly affect search engine visibility — typically substantial changes to the site structure, content, coding, site performance, or UX.

Google’s documentation on site migrations doesn’t cover them in great depth and downplays the fact that they so often result in significant traffic and revenue loss, which can last from a few weeks to several months — depending on the extent to which search engine ranking signals have been affected, as well as how long it may take the affected business to roll out a successful recovery plan.

Quick access links

Site migration examples
Site migration types
Common site migration pitfalls
Site migration process
1. Scope & planning
2. Pre-launch preparation
3. Pre-launch testing
4. Launch day actions
5. Post-launch testing
6. Performance review
Appendix: Useful tools

Site migration examples

The following section discusses how both successful and unsuccessful site migrations look and explains why it is 100% possible to come out of a site migration without suffering significant losses.

Debunking the “expected traffic drop” myth

Anyone who has been involved with a site migration has probably heard the widespread theory that it will result in de facto traffic and revenue loss. Even though this assertion holds some truth for some very specific cases (i.e. moving from an established domain to a brand new one) it shouldn’t be treated as gospel. It is entirely possible to migrate without losing any traffic or revenue; you can even enjoy significant growth right after launching a revamped website. However, this can only be achieved if every single step has been well-planned and executed.

Examples of unsuccessful site migrations

The following graph illustrates a big UK retailer’s botched site migration where the website lost 35% of its visibility two weeks after switching from HTTP to HTTPS. It took them about six months to fully recover, which must have had a significant impact on revenue from organic search. This is a typical example of a poor site migration, possibly caused by poor planning or implementation.

Example of a poor site migration — recovery took 6 months!

But recovery may not always be possible. The below visibility graph is from another big UK retailer, where the HTTP to HTTPS switchover resulted in a permanent 20% visibility loss.

Another example of a poor site migration — no signs of recovery 6 months on!

In fact, it is entirely possible to migrate from HTTP to HTTPS without losing so much traffic for such a long period, aside from the first few weeks where there is high volatility as Google discovers the new URLs and updates search results.

Examples of successful site migrations

What does a successful site migration look like? This largely depends on the site migration type, the objectives, and the KPIs (more details later). But in most cases, a successful site migration shows at least one of the following characteristics:

  1. Minimal visibility loss during the first few weeks (short-term goal)
  2. Visibility growth thereafter — depending on the type of migration (long-term goal)

The following visibility report is taken from an HTTP to HTTPS site migration, which was also accompanied by significant improvements to the site’s page loading times.

The following visibility report is from a complete site overhaul, which I was fortunate to be involved with several months in advance and supported during the strategy, planning, and testing phases, all of which were equally important.

As commonly occurs on site migration projects, the launch date had to be pushed back a few times due to the risks of launching the new site prematurely and before major technical obstacles were fully addressed. But as you can see on the below visibility graph, the wait was well worth it. Organic visibility not only didn’t drop (as most would normally expect) but in fact started growing from the first week.

Visibility growth one month after the migration reached 60%, whilst organic traffic growth two months post-launch exceeded 80%.

Example of a very successful site migration — instant growth following new site launch!

This was a rather complex migration as the new website was re-designed and built from scratch on a new platform with an improved site taxonomy that included new landing pages, an updated URL structure, lots of redirects to preserve link equity, plus a switchover from HTTP to HTTPS.

In general, introducing too many changes at the same time can be tricky because if something goes wrong, you’ll struggle to figure out what exactly is at fault. But at the same time, leaving major changes for a later time isn’t ideal either as it will require more resources. If you know what you’re doing, making multiple positive changes at once can be very cost-effective.

Before getting into the nitty-gritty of how you can turn a complex site migration project into a success, it’s important to run through the main site migration types as well as explain the main reason so many site migrations fail.

Site migration types

There are many site migration types. It all depends on the nature of the changes that take place to the legacy website.

Google’s documentation mostly covers migrations with site location changes, which are categorised as follows:

  • Site moves with URL changes
  • Site moves without URL changes

Site move migrations


These typically occur when a site moves to a different URL due to any of the below:

Protocol change

A classic example is when migrating from HTTP to HTTPS.

Subdomain or subfolder change

Very common in international SEO where a business decides to move one or more ccTLDs into subdomains or subfolders. Another common example is where a mobile site that sits on a separate subdomain or subfolder becomes responsive and both desktop and mobile URLs are uniformed.

Domain name change

Commonly occurs when a business is rebranding and must move from one domain to another.

Top-level domain change

This is common when a business decides to launch international websites and needs to move from a ccTLD (country code top-level domain) to a gTLD (generic top-level domain) or vice versa, e.g. moving from a country-specific domain to .com, or from .com to a country-specific domain, and so on.

Site structure changes

These are changes to the site architecture that usually affect the site’s internal linking and URL structure.

Other types of migrations

There are other types of migration which are triggered by changes to the site’s content, structure, design, or platform.


Replatforming

This is the case when a website is moved from one platform/CMS to another, e.g. migrating from WordPress to Magento or just upgrading to the latest platform version. Replatforming can, in some cases, also result in design and URL changes because of technical limitations that often occur when changing platforms. This is why replatforming migrations rarely result in a website that looks exactly the same as the previous one.

Content migrations

Major content changes such as content rewrites, content consolidation, or content pruning can have a big impact on a site’s organic search visibility, depending on the scale. These changes can often affect the site’s taxonomy, navigation, and internal linking.

Mobile setup changes

With so many options now available for a site's mobile setup, changes such as enabling app indexing, building an AMP site, or building a PWA can also be considered partial site migrations, especially when an existing mobile site is being replaced by an app, AMP, or PWA.

Structural changes

These are often caused by major changes to the site’s taxonomy that impact the site navigation, internal linking, and user journeys.

Site redesigns

These can vary from major design changes in the look and feel to a complete website revamp that may also include significant media, code, and copy changes.

Hybrid migrations

In addition to the above, there are several hybrid migration types that can be combined in practically any way possible. The more changes that get introduced at the same time the higher the complexity and the risks. Even though making too many changes at the same time increases the risks of something going wrong, it can be more cost-effective from a resources perspective if the migration is very well-planned and executed.

Common site migration pitfalls

Even though every site migration is different there are a few common themes behind the most typical site migration disasters, with the biggest being the following:

Poor strategy

Some site migrations are doomed to failure way before the new site is launched. A strategy that is built upon unclear and unrealistic objectives is much less likely to bring success.

Establishing measurable objectives is essential in order to measure the impact of the migration post-launch. For most site migrations, the primary objective should be the retention of the site’s current traffic and revenue levels. In certain cases the bar could be raised higher, but in general anticipating or forecasting growth should be a secondary objective. This will help avoid creating unrealistic expectations.

Poor planning

Coming up with a detailed project plan as early as possible will help avoid delays along the way. Factor in additional time and resources to cope with any unforeseen circumstances that may arise. No matter how well thought out and detailed your plan is, it’s highly unlikely everything will go as expected. Be flexible with your plan and accept the fact that there will almost certainly be delays. Map out all dependencies and make all stakeholders aware of them.

Avoid planning to launch the site near your seasonal peaks, because if anything goes wrong you won’t have enough time to rectify the issues. For instance, retailers should avoid launching a site close to September/October to avoid putting the busy pre-Christmas period at risk. In this case, it would be much wiser launching during the quieter summer months.

Lack of resources

Before committing to a site migration project, estimate the time and effort required to make it a success. If your budget is limited, make a call as to whether it is worth going ahead with a migration that is likely to fail in meeting its established objectives and cause revenue loss.

As a rule of thumb, try to include a buffer of at least 20% more resource than you initially think the project will require. This additional buffer will later allow you to quickly address any issues as soon as they arise, without jeopardizing success. If your resources are too tight or you start cutting corners at this early stage, the site migration will be at risk.

Lack of SEO/UX consultation

When changes are taking place on a website, every single decision needs to be weighed from both a UX and SEO standpoint. For instance, removing great amounts of content or links to improve UX may damage the site’s ability to target business-critical keywords or result in crawling and indexing issues. In either case, such changes could damage the site’s organic search visibility. On the other hand, having too much text copy and few images may have a negative impact on user engagement and damage the site’s conversions.

To avoid risks, appoint experienced SEO and UX consultants so they can discuss the potential consequences of every single change with key business stakeholders who understand the business intricacies better than anyone else. The pros and cons of each option need to be weighed before making any decision.

Late involvement

Site migrations can span several months, require great planning and enough time for testing. Seeking professional support late is very risky because crucial steps may have been missed.

Lack of testing

In addition to a great strategy and thoughtful plan, dedicate some time and effort for thorough testing before launching the site. It’s much more preferable to delay the launch if testing has identified critical issues rather than rushing a sketchy implementation into production. It goes without saying that you should not launch a website if it hasn’t been tested by both expert SEO and UX teams.

Attention to detail is also very important. Make sure that the developers are fully aware of the risks associated with poor implementation. Educating the developers about the direct impact of their work on a site’s traffic (and therefore revenue) can make a big difference.

Slow response to bug fixing

There will always be bugs to fix once the new site goes live. However, some bugs are more important than others and may need immediate attention. For instance, launching a new site only to find that search engine spiders have trouble crawling and indexing the site’s content would require an immediate fix. A slow response to major technical obstacles can sometimes be catastrophic and take a long time to recover from.

Underestimating scale

Business stakeholders often do not anticipate site migrations to be so time-consuming and resource-heavy. It’s not uncommon for senior stakeholders to demand that the new site launch on the planned-for day, regardless of whether it’s 100% ready or not. The motto “let's launch ASAP and fix later” is a classic mistake. What most stakeholders are unaware of is that it can take just a few days for organic search visibility to tank, but recovery can take several months.

It is the responsibility of the consultant and project manager to educate clients, run them through all the different phases and scenarios, and explain what each one entails. Business stakeholders are then able to make more informed decisions and their expectations should be easier to manage.

Site migration process

The site migration process can be split into six main essential phases. They are all equally important and skipping any of the below tasks could hinder the migration’s success to varying extents.

Phase 1: Scope & Planning

Work out the project scope

Regardless of the reasons behind a site migration project, you need to be crystal clear about the objectives right from the beginning because these will help to set and manage expectations. Moving a site from HTTP to HTTPS is very different from going through a complete site overhaul, hence the two should have different objectives. In the first instance, the objective should be to retain the site’s traffic levels, whereas in the second you could potentially aim for growth.

A site migration is a great opportunity to address legacy issues. Including as many of these as possible in the project scope should be very cost-effective because addressing these issues post-launch will require significantly more resources.

However, in every case, identify the most critical aspects for the project to be successful. Identify all risks that could have a negative impact on the site’s visibility and consider which precautions to take. Ideally, prepare a few forecasting scenarios based on the different risks and growth opportunities. It goes without saying that the forecasting scenarios should be prepared by experienced site migration consultants.

Including as many stakeholders as possible at this early stage will help you acquire a deeper understanding of the biggest challenges and opportunities across divisions. Ask for feedback from your content, SEO, UX, and Analytics teams and put together a list of the biggest issues and opportunities. You then need to work out what the potential ROI of addressing each one of these would be. Finally, choose one of the available options based on your objectives and available resources, which will form your site migration strategy.

You should now be left with a prioritized list of activities which are expected to have a positive ROI, if implemented. These should then be communicated and discussed with all stakeholders, so you set realistic targets, agree on the project scope, and set the right expectations from the outset.

Prepare the project plan

Planning is equally important because site migrations can often be very complex projects that can easily span several months. During the planning phase, each task needs an owner (i.e. SEO consultant, UX consultant, content editor, web developer) and an expected delivery date. Any dependencies should be identified and included in the project plan so everyone is aware of any activities that cannot be fulfilled due to being dependent on others. For instance, the redirects cannot be tested unless the redirect mapping has been completed and the redirects have been implemented on staging.

The project plan should be shared with everyone involved as early as possible so there is enough time for discussions and clarifications. Each activity needs to be described in great detail, so that stakeholders are aware of what each task would entail. It goes without saying that flawless project management is necessary in order to organize and carry out the required activities according to the schedule.

A crucial part of the project plan is getting the anticipated launch date right. Ideally, the new site should be launched during a time when traffic is low. Again, avoid launching ahead of or during a peak period because the consequences could be devastating if things don’t go as expected. One thing to bear in mind is that as site migrations never go entirely to plan, a certain degree of flexibility will be required.

Phase 2: Pre-launch preparation

These include any activities that need to be carried out while the new site is still under development. By this point, the new site’s SEO requirements should have been captured already. You should be liaising with the designers and information architects, providing feedback on prototypes and wireframes well before the new site becomes available on a staging environment.

Wireframes review

Review the new site’s prototypes or wireframes before development commences. Reviewing the new site’s main templates can help identify both SEO and UX issues at an early stage. For example, you may find that large portions of content have been removed from the category pages, which should be instantly flagged. Or you may discover that some high traffic-driving pages no longer appear in the main navigation. Any radical changes in the design or copy of the pages should be thoroughly reviewed for potential SEO issues.

Preparing the technical SEO specifications

Once the prototypes and wireframes have been reviewed, prepare a detailed technical SEO specification. The objective of this vital document is to capture all the essential SEO requirements developers need to be aware of before working out the project’s scope in terms of work and costs. It’s during this stage that budgets are signed off on; if the SEO requirements aren’t included, it may be impossible to include them later down the line.

The technical SEO specification needs to be very detailed, yet written in such a way that developers can easily turn the requirements into actions. This isn’t a document to explain why something needs to be implemented, but how it should be implemented.

Make sure to include specific requirements that cover at least the following areas:

  • URL structure
  • Meta data (including dynamically generated default values)
  • Structured data
  • Canonicals and meta robots directives
  • Copy & headings
  • Main & secondary navigation
  • Internal linking (in any form)
  • Pagination
  • XML sitemap(s)
  • HTML sitemap
  • Hreflang (if there are international sites)
  • Mobile setup (including the app, AMP, or PWA site)
  • Redirects
  • Custom 404 page
  • JavaScript, CSS, and image files
  • Page loading times (for desktop & mobile)

The specification should also cover the areas of CMS functionality that allow users to:

  • Specify custom URLs and override default ones
  • Update page titles
  • Update meta descriptions
  • Update any h1–h6 headings
  • Add or amend the default canonical tag
  • Set the meta robots attributes to index/noindex/follow/nofollow
  • Add or edit the alt text of each image
  • Include Open Graph fields for description, URL, image, type, sitename
  • Include Twitter Open Graph fields for card, URL, title, description, image
  • Bulk upload or amend redirects
  • Update the robots.txt file

It is also important to make sure that when updating a particular attribute (e.g. an h1), other elements are not affected (i.e. the page title or any navigation menus).

Identifying priority pages

One of the biggest challenges with site migrations is that the success will largely depend on the quantity and quality of pages that have been migrated. Therefore, it’s very important to make sure that you focus on the pages that really matter. These are the pages that have been driving traffic to the legacy site, pages that have accrued links, pages that convert well, etc.

In order to do this, you need to:

  1. Crawl the legacy site
  2. Identify all indexable pages
  3. Identify top performing pages

How to crawl the legacy site

Crawl the old website so that you have a copy of all URLs, page titles, meta data, headers, redirects, broken links etc. Regardless of the crawler application of choice (see Appendix), make sure that the crawl isn’t too restrictive. Pay close attention to the crawler’s settings before crawling the legacy site and consider whether you should:

  • Ignore robots.txt (in case any vital parts are accidentally blocked)
  • Follow internal “nofollow” links (so the crawler reaches more pages)
  • Crawl all subdomains (depending on scope)
  • Crawl outside start folder (depending on scope)
  • Change the user agent to Googlebot (desktop)
  • Change the user agent to Googlebot (smartphone)

Pro tip: Keep a copy of the old site’s crawl data (in a file or on the cloud) for several months after the migration has been completed, just in case you ever need any of the old site’s data once the new site has gone live.

How to identify the indexable pages

Once the crawl is complete, work on identifying the legacy site’s indexable pages. These are any HTML pages with the following characteristics:

  • Return a 200 server response
  • Either do not have a canonical tag or have a self-referring canonical URL
  • Do not have a meta robots noindex
  • Aren’t excluded from the robots.txt file
  • Are internally linked from other pages (non-orphan pages)

The indexable pages are the only pages that have the potential to drive traffic to the site and therefore need to be prioritized for the purposes of your site migration. These are the pages worth optimizing (if they will exist on the new site) or redirecting (if they won’t exist on the new site).
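
If your crawler can export its results to CSV, this filtering step can be scripted rather than done by hand. Below is a minimal Python sketch; the crawl_export.csv file and its "URL," "Status Code," "Canonical URL," and "Meta Robots" columns are hypothetical placeholders for whatever your crawler produces, and the robots.txt and orphan-page checks still need the crawler's own reports.

import csv

indexable = []
with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        canonical = row["Canonical URL"].strip()
        is_200 = row["Status Code"].strip() == "200"
        self_canonical = canonical in ("", row["URL"])
        not_noindex = "noindex" not in row["Meta Robots"].lower()
        if is_200 and self_canonical and not_noindex:
            indexable.append(row["URL"])

# Save the indexable URLs so they can be prioritized for optimization or redirects.
with open("indexable_pages.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(indexable))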

How to identify the top performing pages

Once you’ve identified all indexable pages, you may have to carry out more work, especially if the legacy site consists of a large number of pages and optimizing or redirecting all of them is impossible due to time, resource, or technical constraints.

If this is the case, you should identify the legacy site’s top performing pages. This will help with the prioritization of the pages to focus on during the later stages.

It’s recommended to prepare a spreadsheet that includes the below fields:

  • Legacy URL (include only the indexable ones from the crawl data)
  • Organic visits during the last 12 months (Analytics)
  • Revenue, conversions, and conversion rate during the last 12 months (Analytics)
  • Pageviews during the last 12 months (Analytics)
  • Number of clicks from the last 90 days (Search Console)
  • Top linked pages (Majestic SEO/Ahrefs)

With the above information in one place, it’s now much easier to identify your most important pages: the ones that generate organic visits, convert well, contribute to revenue, have a good number of referring domains linking to them, etc. These are the pages that you must focus on for a successful site migration.
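
Combining these exports doesn’t have to be manual. The sketch below merges a few hypothetical CSV exports (the file names and column layouts are placeholders for your own Analytics, Search Console, and link data) into a single prioritization sheet keyed by URL.

import csv
from collections import defaultdict

# Hypothetical export files; the first column is the URL, the second the metric value.
sources = {
    "organic_visits": "analytics_organic_landing_pages.csv",
    "clicks_90_days": "search_console_pages.csv",
    "referring_domains": "link_data_top_pages.csv",
}

pages = defaultdict(dict)
for metric, filename in sources.items():
    with open(filename, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        for row in reader:
            if len(row) >= 2:
                pages[row[0].strip()][metric] = row[1].strip()

with open("priority_pages.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["URL"] + list(sources))
    for url, metrics in sorted(pages.items()):
        writer.writerow([url] + [metrics.get(m, "0") for m in sources])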

The top performing pages should ideally also exist on the new site. If for any reason they don’t, they should be redirected to the most relevant page so that users requesting them do not land on 404 pages and the link equity they previously had remains on the site. If any of these pages cease to exist and aren’t properly redirected, your site’s rankings and traffic will be negatively affected.

Benchmarking

Once the launch of the new website is getting close, you should benchmark the legacy site’s performance. Benchmarking is essential, not only to compare the new site’s performance with the previous one but also to help diagnose which areas underperform on the new site and to quickly address them.

Keyword rank tracking

If you don’t track the site’s rankings frequently, you should do so just before the new site goes live. Otherwise, you will later struggle figuring out whether the migration has gone smoothly or where exactly things went wrong. Don’t leave this to the last minute in case something goes awry — a week in advance would be the ideal time.

Spend some time working out which keywords are most representative of the site’s organic search visibility and track them across desktop and mobile. Because monitoring thousands of head, mid-, and long-tail keyword combinations is usually unrealistic, the bare minimum you should monitor are keywords that are driving traffic to the site (keywords ranking on page one) and have decent search volume (head/mid-tail focus).

If you do get traffic from both brand and non-brand keywords, you should also decide which type of keywords to focus on more from a tracking POV. In general, non-brand keywords tend to be more competitive and volatile. For most sites it would make sense to focus mostly on these.

Don’t forget to track rankings across desktop and mobile. This will make it much easier to diagnose problems post-launch should there be performance issues on one device type. If you receive a high volume of traffic from more than one country, consider rank tracking keywords in other markets, too, because visibility and rankings can vary significantly from country to country.

Site performance

The new site’s page loading times can have a big impact on both traffic and sales. Several studies have shown that the longer a page takes to load, the higher the bounce rate. Unless the old site’s page loading times and site performance scores have been recorded, it will be very difficult to attribute any traffic or revenue loss to site performance related issues once the new site has gone live.

It’s recommended that you review all major page types using Google’s PageSpeed Insights and Lighthouse tools. You could use summary tables like the ones below to benchmark some of the most important performance metrics, which will be useful for comparisons once the new site goes live.

[Benchmark summary table 1: optimization scores for the category, subcategory, and product pages]

[Benchmark summary table 2: optimization scores for the category, subcategory, and product pages]
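
If you prefer to collect these benchmarks programmatically, the PageSpeed Insights API can be queried for each main page template. The sketch below is a minimal Python example against the current v5 endpoint (the metrics the tool exposes have changed over the years); the example.com URLs are placeholders, and for regular use you would append an API key to the request.

import json
import urllib.parse
import urllib.request

# Hypothetical template URLs to benchmark; replace with your own.
pages = {
    "Homepage": "https://www.example.com/",
    "Category page": "https://www.example.com/category/",
    "Product page": "https://www.example.com/category/product/",
}

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

for label, url in pages.items():
    for strategy in ("mobile", "desktop"):
        query = urllib.parse.urlencode({"url": url, "strategy": strategy})
        with urllib.request.urlopen(f"{API}?{query}") as response:
            data = json.load(response)
        # The Lighthouse performance score is reported on a 0-1 scale.
        score = data["lighthouseResult"]["categories"]["performance"]["score"]
        print(f"{label} ({strategy}): performance score {score * 100:.0f}")
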
Old site crawl data

A few days before the new site replaces the old one, run a final crawl of the old site. Doing so could later prove invaluable, should there be any optimization issues on the new site. A final crawl will allow you to save vital information about the old site’s page titles, meta descriptions, h1–h6 headings, server status, canonical tags, noindex/nofollow pages, inlinks/outlinks, level, etc. Having all this information available could save you a lot of trouble if, say, the new site isn’t well optimized or suffers from technical misconfiguration issues. Try also to save a copy of the old site’s robots.txt and XML sitemaps in case you need these later.

Search Console data

Also consider exporting as much of the old site’s Search Console data as possible; much of it is only available for the last 90 days, and chances are it will disappear sooner or later once the new site goes live. Data worth exporting includes:

  • Search analytics queries & pages
  • Crawl errors
  • Blocked resources
  • Mobile usability issues
  • URL parameters
  • Structured data errors
  • Links to your site
  • Internal links
  • Index status
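
The search analytics data in particular can be pulled in bulk via the Search Console API, which returns far more rows than the web interface export. A minimal sketch using the google-api-python-client library is shown below; the service-account.json credential and the example.com property are hypothetical, and the service account must have been granted access to the property.

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("service-account.json", scopes=SCOPES)
service = build("webmasters", "v3", credentials=creds)

request = {
    "startDate": "2017-08-01",   # the API only holds a limited history, so export early
    "endDate": "2017-10-31",
    "dimensions": ["page", "query"],
    "rowLimit": 5000,
}
response = service.searchanalytics().query(siteUrl="https://www.example.com/", body=request).execute()

with open("gsc_queries_pages.csv", "w", encoding="utf-8") as out:
    out.write("page,query,clicks,impressions\n")
    for row in response.get("rows", []):
        page, query = row["keys"]
        out.write(f"{page},{query},{row['clicks']},{row['impressions']}\n")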

Redirects preparation

The redirects implementation is one of the most crucial activities during a site migration. If the legacy site’s URLs cease to exist and aren’t correctly redirected, the website’s rankings and visibility will simply tank.

Why are redirects important in site migrations?

Redirects are extremely important because they help both search engines and users find pages that may no longer exist, have been renamed, or moved to another location. From an SEO point of view, redirects help search engines discover and index a site’s new URLs quicker but also understand how the old site’s pages are associated with the new site’s pages. This association will allow for ranking signals to pass from the old pages to the new ones, so rankings are retained without being negatively affected.

What happens when redirects aren’t correctly implemented?

When redirects are poorly implemented, the consequences can be catastrophic. Users will either land on Not Found pages (404s) or irrelevant pages that do not meet the user intent. In either case, the site’s bounce and conversion rates will be negatively affected. The consequences for search engines can be equally catastrophic: they’ll be unable to associate the old site’s pages with those on the new site if the URLs aren’t identical. Ranking signals won’t be passed over from the old to the new site, which will result in ranking drops and organic search visibility loss. In addition, it will take search engines longer to discover and index the new site’s pages.

301, 302, JavaScript redirects, or meta refresh?

When the URLs between the old and new version of the site are different, use 301 (permanent) redirects. These will tell search engines to index the new URLs as well as forward any ranking signals from the old URLs to the new ones. Therefore, you must use 301 redirects if your site moves to/from another domain/subdomain, if you switch from HTTP to HTTPS, or if the site or parts of it have been restructured. Despite some of Google’s claims that 302 redirects pass PageRank, indexing the new URLs would be slower and ranking signals could take much longer to be passed on from the old to the new page.

302 (temporary) redirects should only be used in situations where a redirect does not need to live permanently and therefore indexing the new URL isn’t a priority. With 302 redirects, search engines will initially be reluctant to index the content of the redirect destination URL and pass any ranking signals to it. However, if the temporary redirects remain for a long period of time without being removed or updated, they could end up behaving similarly to permanent (301) redirects. Use 302 redirects when a redirect is likely to require updating or removal in the near future, as well as for any country-, language-, or device-specific redirects.

Meta refresh and JavaScript redirects should be avoided. Even though Google is getting better and better at crawling JavaScript, there are no guarantees these will get discovered or pass ranking signals to the new pages.

If you’d like to find out more about how Google deals with the different types of redirects, please refer to John Mueller’s post.

Redirect mapping process

If you are lucky enough to work on a migration that doesn’t involve URL changes, you can skip this section. Otherwise, read on to find out how to map any legacy pages that won’t be available on the same URL after the migration so they can be redirected.

The redirect mapping file is a spreadsheet that includes the following two columns:

  • Legacy site URL –> a page’s URL on the old site.
  • New site URL –> a page’s URL on the new site.

When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.

Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.

Increasing efficiencies during the redirect mapping process

Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
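
As a rough illustration of this approach, the Python sketch below builds the mapping automatically by matching a unique identifier shared between the two crawls (an SKU column in this hypothetical example; the file and column names are placeholders). URLs without a match are flagged for manual mapping.

import csv

def urls_by_sku(filename):
    # Hypothetical crawl export with "URL" and "SKU" columns.
    with open(filename, newline="", encoding="utf-8") as f:
        return {row["SKU"].strip(): row["URL"] for row in csv.DictReader(f) if row["SKU"].strip()}

legacy = urls_by_sku("legacy_crawl.csv")
new_site = urls_by_sku("staging_crawl.csv")

unmatched = []
with open("redirect_mapping.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["Legacy URL", "New site URL"])
    for sku, legacy_url in legacy.items():
        if sku in new_site:
            writer.writerow([legacy_url, new_site[sku]])
        else:
            unmatched.append(legacy_url)  # map these manually, e.g. to the parent category

print(f"Mapped automatically: {len(legacy) - len(unmatched)}, needing manual review: {len(unmatched)}")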

Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping. There’s nothing riskier than mapping URLs that will be updated before the new site goes live. When URLs are updated after the redirect mapping is completed, you may have to deal with undesired situations upon launch, such as broken redirects, redirect chains, and redirect loops. A content-freeze should be placed on the old site well in advance of the migration date, so there is a cut-off point for new content being published on the old site. This will make sure that no pages will be missed from the redirect mapping and guarantee that all pages on the old site get redirected.

Don’t forget the legacy redirects!

You should get hold of the old site’s existing redirects to ensure they’re considered when preparing the redirect mapping for the new site. Unless you do this, it’s likely that the site’s current redirect file will get overwritten by the new one on the launch date. If this happens, all legacy redirects that were previously in place will cease to exist and the site may lose a decent amount of link equity, the extent of which will largely depend on the site’s volume of legacy redirects. For instance, a site that has undergone a few migrations in the past should have a good number of legacy redirects in place that you don’t want getting lost.

Ideally, preserve as many of the legacy redirects as possible, making sure these won’t cause any issues when combined with the new site’s redirects. It’s strongly recommended to eliminate any potential redirect chains at this early stage, which can easily be done by checking whether the same URL appears both as a “Legacy URL” and “New site URL” in the redirect mapping spreadsheet. If this is the case, you will need to update the “New site URL” accordingly.

For example, suppose that:

URL A redirects to URL B (legacy redirect)

URL B redirects to URL C (new redirect)

Which results in the following redirect chain:

URL A –> URL B –> URL C

To eliminate this, amend the existing legacy redirect and create a new one so that:

URL A redirects to URL C (amended legacy redirect)

URL B redirects to URL C (new redirect)

Pro tip: Check your redirect mapping spreadsheet for redirect loops. These occur when the “Legacy URL” is identical to the “New site URL.” Redirect loops must be eliminated because they result in infinitely loading pages that are inaccessible to users and search engines, making them instant traffic, conversion, and ranking killers!
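
Both the chain and loop checks can be automated against the mapping spreadsheet. A minimal Python sketch follows, assuming a hypothetical redirect_mapping.csv with "Legacy URL" and "New site URL" columns.

import csv

with open("redirect_mapping.csv", newline="", encoding="utf-8") as f:
    mapping = {row["Legacy URL"]: row["New site URL"] for row in csv.DictReader(f)}

for legacy, target in mapping.items():
    if legacy == target:
        print(f"Redirect loop: {legacy}")
    elif target in mapping:
        # The destination is itself being redirected, so point the legacy URL
        # straight at the final destination instead.
        print(f"Redirect chain: {legacy} -> {target} -> {mapping[target]}")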

Implement blanket redirect rules to avoid duplicate content

It’s strongly recommended to try working out redirect rules that cover as many URL requests as possible. Implementing redirect rules on a web server is much more efficient than relying on numerous one-to-one redirects. If your redirect mapping document consists of a very large number of redirects that need to be implemented as one-to-one redirect rules, site performance could be negatively affected. In any case, double check with the development team the maximum number of redirects the web server can handle without issues.

Regardless, there are some standard redirect rules that should be in place to avoid generating duplicate content issues:

  • HTTP URLs redirect to their HTTPS equivalents (or vice versa, depending on the live protocol)
  • Non-www URLs redirect to the www versions (or vice versa, depending on the preferred host)
  • URLs with trailing slashes redirect to the non-trailing-slash versions (or vice versa, whichever is the canonical format)
  • Uppercase or mixed-case URLs redirect to their lowercase equivalents

Even if some of these standard redirect rules exist on the legacy website, do not assume they’ll necessarily exist on the new site unless they’re explicitly requested.

Avoid internal redirects

Try updating the site’s internal links so they don’t trigger internal redirects. Even though search engines can follow internal redirects, these are not recommended because they add additional latency to page loading times and could also have a negative impact on search engine crawl time.

Don’t forget your image files

If the site’s images have moved to a new location, Google recommends redirecting the old image URLs to the new image URLs to help Google discover and index the new images quicker. If it’s not easy to redirect all images, aim to redirect at least those image URLs that have accrued backlinks.

Phase 3: Pre-launch testing

The earlier you can start testing, the better. Certain things need to be fully implemented to be tested, but others don’t. For example, user journey issues could be identified from as early as the prototypes or wireframes design. Content-related issues between the old and new site or content inconsistencies (e.g. between the desktop and mobile site) could also be identified at an early stage. But the more technical components should only be tested once fully implemented: things like redirects, canonical tags, or XML sitemaps. The earlier issues get identified, the more likely it is that they’ll be addressed before launching the new site. Identifying certain types of issues at a later stage isn’t cost effective, requires more resources, and causes significant delays. Poor testing and not allowing the time required to thoroughly test all the building blocks that can affect SEO and UX performance can have disastrous consequences soon after the new site has gone live.

Making sure search engines cannot access the staging/test site

Before making the new site available on a staging/testing environment, take precautions to ensure search engines do not index it. There are a few different ways to do this, each with different pros and cons.

Site available to specific IPs (most recommended)

Making the test site available only to specific (whitelisted) IP addresses is a very effective way to prevent search engines from crawling it. Anyone trying to access the test site’s URL won’t be able to see any content unless their IP has been whitelisted. The main advantage is that whitelisted users could easily access and crawl the site without any issues. The only downside is that third-party web-based tools (such as Google’s tools) cannot be used because of the IP restrictions.

Password protection

Password protecting the staging/test site is another way to keep search engine crawlers away, but this solution has two main downsides. Depending on the implementation, it may not be possible to crawl and test a password-protected website if the crawler application doesn’t make it past the login screen. The other downside: password-protected websites that use forms for authentication can be crawled using third-party applications, but there is a risk of causing severe and unexpected issues. This is because the crawler clicks on every link on a page (when you’re logged in) and could easily end up clicking on links that create or remove pages, install/uninstall plugins, etc.

Robots.txt blocking

Adding the following lines of code to the test site’s robots.txt file will prevent search engines from crawling the test site’s pages.

User-agent: *
Disallow: /

One downside of this method is that even though the content that appears on the test server won’t get indexed, the disallowed URLs may appear on Google’s search results. Another downside is that if the above robots.txt file moves into the live site, it will cause severe de-indexing issues. This is something I’ve encountered numerous times and for this reason I wouldn’t recommend using this method to block search engines.

User journey review

If the site has been redesigned or restructured, chances are that the user journeys will be affected to some extent. Reviewing the user journeys as early as possible and well before the new site launches is difficult due to the lack of user data. However, an experienced UX professional will be able to flag any concerns that could have a negative impact on the site’s conversion rate. Because A/B testing at this stage is hardly ever possible, it might be worth carrying out some user testing and trying to get feedback from real users. Unfortunately, user experience issues can be some of the harder ones to address because they may require sitewide changes that take a lot of time and effort.

On full site overhauls, not all UX decisions can always be backed up by data and many decisions will have to be based on best practice, past experience, and “gut feeling,” hence getting UX/CRO experts involved as early as possible could pay dividends later.

Site architecture review

A site migration is often a great opportunity to improve the site architecture. In other words, you have a great chance to reorganize your keyword targeted content and maximize its search traffic potential. Carrying out extensive keyword research will help identify the best possible category and subcategory pages so that users and search engines can get to any page on the site within a few clicks — the fewer the better, so you don’t end up with a very deep taxonomy.

Identifying new keywords with decent traffic potential and mapping them into new landing pages can make a big difference to the site’s organic traffic levels. On the other hand, enhancing the site architecture needs to be done thoughtfully. It could cause problems if, say, important pages move deeper into the new site architecture or there are too many similar pages optimized for the same keywords. Some of the most successful site migrations are the ones that allocate significant resources to enhance the site architecture.

Meta data & copy review

Make sure that the site’s page titles, meta descriptions, headings, and copy have been transferred from the old to the new site without issues. If you’ve created any new pages, make sure these are optimized and don’t target keywords that have already been targeted by other pages. If you’re re-platforming, be aware that the new platform may have different default values when new pages are being created. Launching the new site without properly optimized page titles or any kind of missing copy will have an immediate negative impact on your site’s rankings and traffic. Do not forget to review whether any user-generated content (i.e. user reviews, comments) has also been uploaded.

Internal linking review

Internal links are the backbone of a website. No matter how well optimized and structured the site’s copy is, it won’t be sufficient to succeed unless it’s supported by a flawless internal linking scheme. Internal links must be reviewed throughout the entire site, including links found in:

  • Main & secondary navigation
  • Header & footer links
  • Body content links
  • Pagination links
  • Horizontal links (related articles, similar products, etc)
  • Vertical links (e.g. breadcrumb navigation)
  • Cross-site links (e.g. links across international sites)

Technical checks

A series of technical checks must be carried out to make sure the new site’s technical setup is sound and to avoid coming across major technical glitches after the new site has gone live.

Robots.txt file review

Prepare the new site’s robots.txt file on the staging environment. This way you can test it for errors or omissions and avoid experiencing search engine crawl issues when the new site goes live. A classic mistake in site migrations is when the robots.txt file prevents search engine access using the following directive:

Disallow: /

If this gets accidentally carried over into the live site (and it often does), it will prevent search engines from crawling the site. And when search engines cannot crawl an indexed page, the keywords associated with the page will get demoted in the search results and eventually the page will get de-indexed.

But if the robots.txt file on staging is populated with the new site’s robots.txt directives, this mishap could be avoided.

When preparing the new site’s robots.txt file, make sure that:

  • It doesn’t block search engine access to pages that are intended to get indexed.
  • It doesn’t block any JavaScript or CSS resources search engines require to render page content.
  • The legacy site’s robots.txt file content has been reviewed and carried over if necessary.
  • It references the new XML sitemap(s) rather than any legacy ones that no longer exist.

Canonical tags review

Review the site’s canonical tags. Look for pages that either do not have a canonical tag or have a canonical tag pointing to another URL, and question whether this is intended. Don’t forget to crawl the canonical tags to find out whether they return a 200 server response. If they don’t, you will need to update them to eliminate any 3xx, 4xx, or 5xx server responses. You should also look for pages that have a canonical tag pointing to another URL combined with a noindex directive, because these two are conflicting signals and you’ll need to eliminate one of them.
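
The status-code part of this check can be scripted by requesting each canonical URL found in the staging crawl. Below is a minimal Python sketch using the requests library; the staging_crawl.csv export and its column names are hypothetical.

import csv
import requests

with open("staging_crawl.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    canonical = row["Canonical URL"].strip()
    if not canonical:
        print(f"Missing canonical tag: {row['URL']}")
        continue
    # A canonical should resolve directly with a 200, not via a 3xx/4xx/5xx response.
    response = requests.head(canonical, allow_redirects=False, timeout=10)
    if response.status_code != 200:
        print(f"Canonical returns {response.status_code}: {row['URL']} -> {canonical}")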

Meta robots review

Once you’ve crawled the staging site, look for pages with the meta robots properties set to “noindex” or “nofollow.” If this is the case, review each one of them to make sure this is intentional and remove the “noindex” or “nofollow” directive if it isn’t.

XML sitemaps review

Prepare two different types of sitemaps: one that contains all the new site’s indexable pages, and another that includes all the old site’s indexable pages. The former will help make Google aware of the new site’s indexable URLs. The latter will help Google become aware of the redirects that are in place and the fact that some of the indexed URLs have moved to new locations, so that it can discover them and update search results quicker.

You should check each XML sitemap to make sure that:

  • It validates without issues
  • It is encoded as UTF-8
  • It does not contain more than 50,000 rows
  • Its size does not exceed 50MB when uncompressed

If there are more than 50K rows or the file size exceeds 50MB, you must break the sitemap down into smaller ones. This prevents the server from becoming overloaded if Google requests the sitemap too frequently.
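
If you generate the sitemaps yourself, the splitting can be scripted. The following Python sketch writes sitemaps of up to 50,000 URLs each plus a sitemap index referencing them; the new_site_urls.txt input (one URL per line) and the example.com host are placeholders.

from xml.sax.saxutils import escape

with open("new_site_urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

CHUNK = 50000
sitemap_files = []
for i in range(0, len(urls), CHUNK):
    filename = f"sitemap-{i // CHUNK + 1}.xml"
    entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in urls[i:i + CHUNK])
    with open(filename, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>'
                  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' + entries + "</urlset>")
    sitemap_files.append(filename)

# A sitemap index file keeps the individual sitemaps discoverable from one location.
index_entries = "".join(f"<sitemap><loc>https://www.example.com/{name}</loc></sitemap>" for name in sitemap_files)
with open("sitemap-index.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>'
              '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' + index_entries + "</sitemapindex>")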

In addition, you must crawl each XML sitemap to make sure it only includes indexable URLs. Any non-indexable URLs should be excluded from the XML sitemaps, such as:

  • 3xx, 4xx, and 5xx pages (e.g. redirected, not found pages, bad requests, etc)
  • Soft 404s. These are pages with no content that return a 200 server response, instead of a 404.
  • Canonicalized pages (apart from self-referring canonical URLs)
  • Pages with a meta robots noindex directive
<!DOCTYPE html>
<meta name="robots" content="noindex" />
  • Pages with a noindex X-Robots-Tag in the HTTP header
HTTP/1.1 200 OK
Date: Fri, 10 Nov 2017 17:12:43 GMT
X-Robots-Tag: noindex
  • Pages blocked from the robots.txt file

Building clean XML sitemaps can help monitor the true indexing levels of the new site once it goes live. If you don’t, it will be very difficult to spot any indexing issues.

Pro tip: Download and open each XML sitemap in Excel to get a detailed overview of any additional attributes, such as hreflang or image attributes.

HTML sitemap review

Depending on the size and type of site that is being migrated, having an HTML sitemap can in certain cases be beneficial. An HTML sitemap that consists of URLs that aren’t linked from the site’s main navigation can significantly boost page discovery and indexing. However, avoid generating an HTML sitemap that includes too many URLs. If you do need to include thousands of URLs, consider building a segmented HTML sitemap.

The number of nested sitemaps as well as the maximum number of URLs you should include in each sitemap depends on the site’s authority. The more authoritative a website, the higher the number of nested sitemaps and URLs it could get away with.

For example, the NYTimes HTML sitemap consists of three levels, each of which includes over 1,000 URLs per sitemap. These nested HTML sitemaps aid search engine crawlers in discovering articles published since 1851 that would otherwise be difficult to discover and index, as not all of them would have been internally linked.

The NYTimes HTML sitemap (level 1)

The NYTimes HTML sitemap (level 2)

Structured data review

Errors in the structured data markup need to be identified early so there’s time to fix them before the new site goes live. Ideally, you should test every single page template (rather than every single page) using Google’s Structured Data Testing tool.

Be sure to check the markup on both the desktop and mobile pages, especially if the mobile website isn’t responsive.

Google’s Structured Data Testing Tool in action

The tool will only report existing errors, not omissions. For example, if your product page template does not include the Product structured data schema, the tool won’t report any errors. So, in addition to checking for errors, you should also make sure that each page template includes the appropriate structured data markup for its content type.
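
For instance, if the product template should carry Product markup, a JSON-LD block along these lines would be expected in the page source. The Python sketch below simply generates such a block; every field value is hypothetical, and the required and recommended properties should be checked against Google’s documentation for the content type.

import json

product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",
    "image": "https://www.example.com/images/example-product.jpg",
    "description": "Short example description.",
    "sku": "EX-12345",
    "offers": {
        "@type": "Offer",
        "price": "49.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Print the snippet ready to be embedded in the product page template.
print('<script type="application/ld+json">')
print(json.dumps(product_jsonld, indent=2))
print("</script>")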

Please refer to Google’s documentation for the most up to date details on the structured data implementation and supported content types.

JavaScript crawling review

You must test every single page template of the new site to make sure Google will be able to crawl content that requires JavaScript parsing. If you’re able to use Google’s Fetch and Render tool on your staging site, you should definitely do so. Otherwise, carry out some manual tests, following Justin Briggs’ advice.

As Bartosz Góralewicz’s tests showed, even though Google is able to crawl and index JavaScript-generated content, that doesn’t mean it handles every major JavaScript framework equally well. His findings showed that some JavaScript frameworks are not SEO-friendly, with AngularJS being the most problematic of all at the time of testing.

Bartosz also found that other search engines (such as Bing, Yandex, and Baidu) really struggle with indexing JavaScript-generated content, which is important to know if your site’s traffic relies on any of these search engines.

Hopefully, this is something that will improve over time, but with the increasing popularity of JavaScript frameworks in web development, this must be high up on your checklist.

Finally, you should check whether any external resources are being blocked. Unfortunately, this isn’t something you can control 100% because many resources (such as JavaScript and CSS files) are hosted by third-party websites which may be blocking them via their own robots.txt files!

Again, the Fetch and Render tool can help diagnose this type of issue that, if left unresolved, could have a significant negative impact.

Mobile site SEO review

Assets blocking review

First, make sure that the robots.txt file isn’t accidentally blocking any JavaScript, CSS, or image files that are essential for the mobile site’s content to render. This could have a negative impact on how search engines render and index the mobile site’s page content, which in turn could negatively affect the mobile site’s search visibility and performance.

Mobile-first index review

In order to avoid any issues associated with Google’s mobile-first index, thoroughly review the mobile website and make sure there aren’t any inconsistencies between the desktop and mobile sites in the following areas:

  • Page titles
  • Meta descriptions
  • Headings
  • Copy
  • Canonical tags
  • Meta robots attributes (i.e. noindex, nofollow)
  • Internal links
  • Structured data

A responsive website should serve the same content, links, and markup across devices, and the above SEO attributes should be identical across the desktop and mobile websites.

In addition to the above, you must carry out a few further technical checks depending on the mobile site’s set up.

Responsive site review

A responsive website must serve all devices the same HTML code, which is adjusted (via the use of CSS) depending on the screen size.

Googlebot is able to automatically detect this mobile setup as long as it’s allowed to crawl the page and its assets. It’s therefore extremely important to make sure that Googlebot can access all essential assets, such as images, JavaScript, and CSS files.

To signal to browsers that a page is responsive, a meta viewport tag should be in place within the <head> of each HTML page.

<meta name="viewport" content="width=device-width, initial-scale=1.0">

If the meta viewport tag is missing, font sizes may appear in an inconsistent manner, which may cause Google to treat the page as not mobile-friendly.

Separate mobile URLs review

If the mobile website uses separate URLs from desktop, make sure that:

  1. Each desktop page has a rel="alternate" tag pointing to the corresponding mobile URL.
  2. Each mobile page has a rel="canonical" tag pointing to the corresponding desktop URL.
  3. When desktop URLs are requested on mobile devices, they’re redirected to the respective mobile URL.
  4. Redirects work across all mobile devices, including Android, iPhone, and Windows phones.
  5. There aren’t any irrelevant cross-links between the desktop and mobile pages. This means that internal links found on a desktop page should only link to desktop pages, and those found on a mobile page should only link to other mobile pages.
  6. The mobile URLs return a 200 server response.

Dynamic serving review

Dynamic serving websites serve different code to each device, but on the same URL.

On dynamic serving websites, review whether the Vary HTTP header (i.e. Vary: User-Agent) has been correctly set up. This is necessary because dynamic serving websites alter the HTML for mobile user agents, and the Vary HTTP header helps Googlebot discover the mobile content.

Mobile-friendliness review

Regardless of the mobile site set-up (responsive, separate URLs or dynamic serving), review the pages using a mobile user-agent and make sure that:

  1. The viewport has been set correctly. Using a fixed width viewport across devices will cause mobile usability issues.
  2. The font size isn’t too small.
  3. Touch elements (i.e. buttons, links) aren’t too close.
  4. There aren’t any intrusive interstitials, such as ads, mailing list sign-up forms, or app download pop-ups. To avoid any issues, you should either use a small HTML or image banner instead.
  5. Mobile pages aren’t too slow to load (see next section).

Google’s mobile-friendly test tool can help diagnose most of the above issues:

Google’s mobile-friendly test tool in action

AMP site review

If there is an AMP website and a desktop version of the site is available, make sure that:

  • Each non-AMP page (i.e. desktop, mobile) has a rel="amphtml" tag pointing to the corresponding AMP URL.
  • Each AMP page has a rel="canonical" tag pointing to the corresponding desktop page.
  • Any AMP page that does not have a corresponding desktop URL has a self-referring canonical tag.

You should also make sure that the AMPs are valid. This can be tested using Google’s AMP Test Tool.

Mixed content errors

With Google pushing hard for sites to be fully secure and Chrome becoming the first browser to flag HTTP pages as not secure, aim to launch the new site on HTTPS, making sure all resources such as images, CSS, and JavaScript files are requested over secure HTTPS connections. This is essential in order to avoid mixed content issues.

Mixed content occurs when a page that’s loaded over a secure HTTPS connection requests assets over insecure HTTP connections. Most browsers either block dangerous HTTP requests or just display warnings that hinder the user experience.

Mixed content errors in Chrome’s JavaScript Console

There are many ways to identify mixed content errors, including the use of crawler applications, Google’s Lighthouse, etc.

Image assets review

Google crawls images less frequently than HTML pages. If migrating a site’s images from one location to another (e.g. from your domain to a CDN), there are ways to help Google discover the migrated images quicker. Building an image XML sitemap will help, but you also need to make sure that Googlebot can reach the site’s images when crawling the site. The tricky part with image indexing is that both the web page an image appears on and the image file itself have to get indexed.

Site performance review

Last but not least, measure the old site’s page loading times and see how these compare with the new site’s when this becomes available on staging. At this stage, focus on the network-independent aspects of performance such as the use of external resources (images, JavaScript, and CSS), the HTML code, and the web server’s configuration. More information about how to do this is available further down.

Analytics tracking review

Make sure that analytics tracking is properly set up. This review should ideally be carried out by specialist analytics consultants who will look beyond the implementation of the tracking code. Make sure that Goals and Events are properly set up, e-commerce tracking is implemented, enhanced e-commerce tracking is enabled, etc. There’s nothing more frustrating than having no analytics data after your new site is launched.

Redirects testing

Testing the redirects before the new site goes live is critical and can save you a lot of trouble later. There are many ways to check the redirects on a staging/test server, but the bottom line is that you should not launch the new website without having tested the redirects.

Once the redirects become available on the staging/testing environment, crawl the entire list of redirects and check for the following issues:

  • Redirect loops (a URL that infinitely redirects to itself)
  • Redirects with a 4xx or 5xx server response.
  • Redirect chains (a URL that redirects to another URL, which in turn redirects to another URL, etc).
  • Canonical URLs that return a 4xx or 5xx server response.
  • Canonical loops (page A has a canonical pointing to page B, which has a canonical pointing to page A).
  • Canonical chains (a canonical that points to another page that has a canonical pointing to another page, etc).
  • Protocol/host inconsistencies e.g. URLs are redirected to both HTTP and HTTPS URLs or www and non-www URLs.
  • Leading/trailing whitespace characters. Use trim() in Excel to eliminate them.
  • Invalid characters in URLs.
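
Most crawler applications can run these checks for you, but the idea can also be illustrated with a short script. The Python sketch below uses the requests library, a hypothetical redirect_mapping.csv, and a hypothetical staging.example.com hostname, and flags missing redirects, chains, error destinations, and unexpected final URLs.

import csv
import requests

with open("redirect_mapping.csv", newline="", encoding="utf-8") as f:
    mapping = list(csv.DictReader(f))

for row in mapping:
    # Point the legacy URL at the staging host so the new redirects can be tested pre-launch.
    test_url = row["Legacy URL"].replace("www.example.com", "staging.example.com")
    try:
        response = requests.get(test_url, allow_redirects=True, timeout=10)
    except requests.TooManyRedirects:
        print(f"Redirect loop: {test_url}")
        continue
    hops = len(response.history)
    if hops == 0:
        print(f"No redirect in place: {test_url}")
    elif response.status_code >= 400:
        print(f"Redirects to an error page ({response.status_code}): {test_url}")
    elif hops > 1:
        print(f"Redirect chain ({hops} hops): {test_url} -> {response.url}")
    elif response.url.rstrip("/") != row["New site URL"].rstrip("/"):
        print(f"Unexpected destination: {test_url} -> {response.url}")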

Pro tip: Make sure each of the old site’s URLs redirects to the correct URL on the new site. At this stage, because the new site doesn’t exist yet, you can only test whether the redirect destination URL is the intended one, but it’s definitely worth it. The fact that a URL redirects does not mean it redirects to the right page.

Phase 4: Launch day activities

When the site is down...

While the new site is replacing the old one, chances are that the live site is going to be temporarily down. The downtime should be kept to a minimum, but while this happens the web server should respond to any URL request with a 503 (service unavailable) server response. This will tell search engine crawlers that the site is temporarily down for maintenance so they come back to crawl the site later.

If the site is down for too long without serving a 503 server response and search engines crawl the website, organic search visibility will be negatively affected and recovery won’t be instant once the site is back up. In addition, while the website is temporarily down it should also serve an informative holding page notifying users that the website is temporarily down for maintenance.

Technical spot checks

As soon as the new site has gone live, take a quick look at:

  1. The robots.txt file to make sure search engines are not blocked from crawling
  2. Top pages redirects (e.g. do requests for the old site’s top pages redirect correctly?)
  3. Top pages canonical tags
  4. Top pages server responses
  5. Noindex/nofollow directives, in case they are unintentional

The spot checks need to be carried out across both the mobile and desktop sites, unless the site is fully responsive.

Search Console actions

The following activities should take place as soon as the new website has gone live:

  1. Test & upload the XML sitemap(s)
  2. Set the Preferred location of the domain (www or non-www)
  3. Set the International targeting (if applicable)
  4. Configure the URL parameters to tackle any potential duplicate content issues early.
  5. Upload the Disavow file (if applicable)
  6. Use the Change of Address tool (if switching domains)

Pro tip: Use the “Fetch as Google” feature for each different type of page (e.g. the homepage, a category, a subcategory, a product page) to make sure Googlebot can render the pages without any issues. Review any reported blocked resources and do not forget to use Fetch and Render for desktop and mobile, especially if the mobile website isn’t responsive.

Blocked resources prevent Googlebot from rendering the content of the page

Phase 5: Post-launch review

Once the new site has gone live, a new round of in-depth checks should be carried out. These are largely the same ones as those mentioned in the “Phase 3: Pre-launch Testing” section.

However, the main difference during this phase is that you now have access to a lot more data and tools. Don’t underestimate the amount of effort you’ll need to put in during this phase, because any issues you encounter now directly impact the site’s performance in the SERPs. On the other hand, the sooner an issue gets identified, the quicker it will get resolved.

In addition to repeating the same testing tasks that were outlined in the Phase 3 section, in certain areas things can be tested more thoroughly, accurately, and in greater detail. You can now take full advantage of the Search Console features.

Check crawl stats and server logs

Keep an eye on the crawl stats available in the Search Console, to make sure Google is crawling the new site’s pages. In general, when Googlebot comes across new pages it tends to accelerate the average number of pages it crawls per day. But if you can’t spot a spike around the time of the launch date, something may be negatively affecting Googlebot’s ability to crawl the site.

Crawl stats on Google’s Search Console

Reviewing the server log files is by far the most effective way to spot any crawl issues or inefficiencies. Tools like Botify and On Crawl can be extremely useful because they combine crawls with server log data and can highlight pages search engines do not crawl, pages that are not linked to internally (orphan pages), low-value pages that are heavily internally linked, and a lot more.
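
If a dedicated log analysis tool isn’t available, even a rough script can show what Googlebot is requesting and which responses it gets. The Python sketch below parses a combined-format access.log; the file name and regular expression are assumptions (log formats vary by server), and strictly verifying Googlebot would also require a reverse DNS lookup.

import re
from collections import Counter

log_line = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) .*Googlebot')

paths = Counter()
statuses = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as f:
    for line in f:
        match = log_line.search(line)
        if match:
            paths[match.group("path")] += 1
            statuses[match.group("status")] += 1

print("Googlebot responses by status code:", dict(statuses))
print("Most crawled paths:", paths.most_common(20))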

Review crawl errors regularly

Keep an eye on the reported crawl errors, ideally daily during the first few weeks. Downloading these errors daily, crawling the reported URLs, and taking the necessary actions (i.e. implement additional 301 redirects, fix soft 404 errors) will aid a quicker recovery. It’s highly unlikely you will need to redirect every single 404 that is reported, but you should add redirects for the most important ones.

Pro tip: In Google Analytics you can easily find out which are the most commonly requested 404 URLs and fix these first!

Other useful Search Console features

Other Search Console features worth checking include the Blocked Resources, Structured Data errors, Mobile Usability errors, HTML Improvements, and International Targeting (to check for hreflang reported errors).

Pro tip: Keep a close eye on the URL parameters in case they’re causing duplicate content issues. If this is the case, consider taking some urgent remedial action.

Measuring site speed

Once the new site is live, measure site speed to make sure the site’s pages are loading fast enough on both desktop and mobile devices. With site speed being a ranking signal across devices, and because slow pages lose users and customers, comparing the new site’s speed with the old site’s is extremely important. If the new site’s page loading times appear to be longer, you should take some immediate action; otherwise your site’s traffic and conversions will almost certainly take a hit.

Evaluating speed using Google’s tools

Two tools that can help with this are Google’s Lighthouse and PageSpeed Insights.

The PageSpeed Insights tool measures page performance on both mobile and desktop devices and shows real-world page speed data based on user data Google collects from Chrome. It also checks to see if a page has applied common performance best practices and provides an optimization score. The tool includes the following main categories:

  • Speed score: Categorizes a page as Fast, Average, or Slow using two metrics: The First Contentful Paint (FCP) and DOM Content Loaded (DCL). A page is considered fast if both metrics are in the top one-third of their category.
  • Optimization score: Categorizes a page as Good, Medium, or Low based on performance headroom.
  • Page load distributions: Categorizes a page as Fast (fastest third), Average (middle third), or Slow (bottom third) by comparing against all FCP and DCL events in the Chrome User Experience Report.
  • Page stats: Can indicate if the page might be faster if the developer modifies the appearance and functionality of the page.
  • Optimization suggestions: A list of best practices that could be applied to a page.

Google’s PageSpeed Insights in action

Google’s Lighthouse is very handy for mobile performance, accessibility, and Progressive Web Apps audits. It provides various useful metrics that can be used to measure page performance on mobile devices, such as:

  • First Meaningful Paint: measures when the primary content of a page is visible.
  • Time to Interactive: the point at which the page is ready for a user to interact with.
  • Speed Index: shows how quickly a page is visibly populated.

Both tools provide recommendations to help improve any reported site performance issues.

Google’s Lighthouse in action

You can also use this Google tool to get a rough estimate of the percentage of users you may be losing from your mobile site’s pages due to slow page loading times.

The same tool also provides an industry comparison so you get an idea of how far you are from the top performing sites in your industry.

Measuring speed from real users

Once the site has gone live, you can start evaluating site speed based on the users visiting your site. If you have Google Analytics, you can easily compare the new site’s average load time with the previous one.

In addition, if you have access to a Real User Monitoring tool such as Pingdom, you can evaluate site speed based on the users visiting your website. The below map illustrates how different visitors experience very different loading times depending on their geographic location. In the below example, the page loading times appear to be satisfactory to visitors from the UK, US, and Germany, but to users residing in other countries they are much higher.

Phase 6: Measuring site migration performance

When to measure

Has the site migration been successful? This is the million-dollar question everyone involved would like to know the answer to as soon as the new site goes live. In reality, the longer you wait the clearer the answer becomes, as visibility during the first few weeks or even months can be very volatile depending on the size and authority of your site. For smaller sites, a 4–6 week period should be sufficient before comparing the new site’s visibility with the old site’s. For large websites you may have to wait for at least 2–3 months before measuring.

In addition, if the new site is significantly different from the previous one, users will need some time to get used to the new look and feel and acclimatize themselves to the new taxonomy, user journeys, etc. Such changes can initially have a significant negative impact on the site’s conversion rate, which should improve after a few weeks as returning visitors get used to the new site. In any case, drawing conclusions about the new site’s UX from early data can be risky.

But these are just general rules of thumb and need to be taken into consideration along with other factors. For instance, if a few days or weeks after the new site launch significant additional changes were made (e.g. to address a technical issue), the migration’s evaluation should be pushed further back.

How to measure

Performance measurement is very important and even though business stakeholders would only be interested to hear about the revenue and traffic impact, there are a whole lot of other metrics you should pay attention to. For example, there can be several reasons for revenue going down following a site migration, including seasonal trends, lower brand interest, UX issues that have significantly lowered the site’s conversion rate, poor mobile performance, poor page loading times, etc. So, in addition to the organic traffic and revenue figures, also pay attention to the following:

  • Desktop & mobile visibility (from SearchMetrics, SEMrush, Sistrix)
  • Desktop and mobile rankings (from any reliable rank tracking tool)
  • User engagement (bounce rate, average time on page)
  • Sessions per page type (i.e. are the category pages driving as many sessions as before?)
  • Conversion rate per page type (i.e. are the product pages converting the same way as before?)
  • Conversion rate by device (i.e. has the desktop/mobile conversion rate increased/decreased since launching the new site?)

Reviewing the below could also be very handy, especially from a technical troubleshooting perspective:

  • Number of indexed pages (Search Console)
  • Submitted vs indexed pages in XML sitemaps (Search Console)
  • Pages receiving at least one visit (analytics)
  • Site speed (PageSpeed Insights, Lighthouse, Google Analytics)

It’s only after you’ve looked into all the above areas that you could safely conclude whether your migration has been successful or not.

Good luck and if you need any consultation or assistance with your site migration, please get in touch!

Appendix: Useful tools

Crawlers

  • Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
  • Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
  • Deep Crawl: Cloud-based crawler that can crawl staging sites, allows for comparisons between different crawls, and copes well with large websites.
  • Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
  • On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.

Handy Chrome add-ons

  • Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
  • User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
  • Ayima Redirect Path: A great header and redirect checker.
  • SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
  • Scraper: An easy way to scrape website data into a spreadsheet.

Site monitoring tools

  • Uptime Robot: Free website uptime monitoring.
  • Robotto: Free robots.txt monitoring tool.
  • Pingdom tools: Monitors site uptime and page speed from real users (RUM service)
  • SEO Radar: Monitors all critical SEO elements and fires alerts when these change.

Site performance tools

  • PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
  • Lighthouse: Handy Chrome extension for performance, accessibility, Progressive Web Apps audits. Can also be run from the command line, or as a Node module.
  • WebPagetest: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.

Structured data testing tools

  • Google’s Structured Data Testing Tool: Validates structured data markup and reports any errors.

Mobile testing tools

  • Google’s Mobile-Friendly Test: Checks whether a page is mobile-friendly and reports usability issues.
  • Google’s AMP Test Tool: Validates AMP pages.

Backlink data sources

  • Majestic SEO and Ahrefs: Backlink data sources, useful for identifying the legacy site’s top linked pages.



Declining Organic Traffic? How to Tell if it’s a Tracking or Optimization Issue

Posted by andrewchoco

Picture this scenario. You’re a new employee that has just been brought in to a struggling marketing department (or an agency brought on to help recover lost numbers). You get access to Google Analytics, and see something like this:

(Actual screenshot of the client I audited)

This can generate two types of emotional response: excitement or fear (or both). The steady decline in organic traffic excites you because you have so many tactics and ideas that you think can save this company from spiraling downward out of control. But there’s also the fear that these tactics won’t be enough to correct the course.

Regardless of whether these new tactics would work or not, it’s important to understand the history of the account and determine not only what is happening, but why.

The company may have an idea of why the traffic is declining (i.e. competitors have come in and made ranking for keywords much harder, or they did a website redesign and have never recovered).

Essentially, this boils down to two things: 1) either you’re struggling with organic optimization, or 2) something was off with your tracking in Google Analytics that has since been corrected, and the change was never caught.

In this article, I’ll go over an audit I did for one of my clients to help determine if the decline we saw in organic traffic was due to actual poor SEO performance, an influx in competitors, tracking issues, or a combination of these things.

I’ll be breaking it down into five different areas of investigation:

  1. Keyword ranking differences from 2015–2017
    1. Did the keywords we were ranking for in 2015 change drastically in 2017? Did we lose rankings and therefore lose organic traffic?
  2. Top organic landing pages from 2015–2017
    1. Are the top ranking organic landing pages the same currently as they were in 2015? Are we missing any pages due to a website redesign?
  3. On-page metrics
    1. Did something happen to the site speed, bounce rate, page views, etc.?
  4. SEMrush/Moz keyword, traffic, and domain authority data
    1. Looking at the SEMrush organic traffic cost metric as well as Moz metrics like Domain Authority and competitors.
  5. Goal completions
    1. Did our conversion numbers stay consistent throughout the traffic drop? Or did the conversions drop in correlation with the traffic drop?

By the end of this post, my goal is that you’ll be able to replicate this audit to determine exactly what’s causing your organic traffic decline and how to get back on the right track.

Let’s dive in!

Keyword ranking differences from 2015–2017

This was the starting point for my audit. I started here specifically because the most obvious explanation for a decline in traffic is a decline in keyword rankings.

I wanted to look at what keywords we were ranking for in 2015 to see if we significantly dropped in the rankings or if the search volume had dropped. If the company you’re auditing has had a long-running Moz account, start by looking at the keyword rankings from the initial start of the decline, compared to current keyword rankings.

I exported keyword data from both SEMrush and Moz, and looked specifically at the ranking changes of core keywords.

March was a particularly strong month across the board, so I narrowed it down and exported the keyword rankings in:

  • March 2015
  • March 2016
  • March 2017
  • December 2017 (so I could get the most current rankings)

Once the keywords were exported, I went in and highlighted in red the keywords that we were ranking for in 2015 (and driving traffic from) that we were no longer ranking for in 2017. I also highlighted in yellow the keywords we were ranking for in 2015 that were still ranking in 2017.
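
If you’re comfortable scripting, the same comparison can be made straight from the rank tracker exports instead of highlighting rows by hand. A minimal Python sketch follows; the file and column names are hypothetical placeholders for your own exports.

import csv

def load_keywords(filename):
    # Hypothetical export with "Keyword" and "Position" columns.
    with open(filename, newline="", encoding="utf-8") as f:
        return {row["Keyword"].strip().lower(): row["Position"] for row in csv.DictReader(f)}

kw_2015 = load_keywords("rankings_2015_03.csv")
kw_2017 = load_keywords("rankings_2017_12.csv")

lost = sorted(set(kw_2015) - set(kw_2017))   # ranked in 2015, nowhere to be found in 2017
kept = sorted(set(kw_2015) & set(kw_2017))   # ranked in both periods

print(f"Keywords lost since 2015 ({len(lost)}):", lost[:20])
for keyword in kept[:20]:
    print(f"{keyword}: position {kw_2015[keyword]} -> {kw_2017[keyword]}")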

2015 keywords:

2017 keywords:

(Brand-related queries and URLs are blurred out for anonymity)

One thing that immediately stood out: in 2015, this company was ranking for five keywords, including the word “free.” They have since changed their offering, so it made sense that in 2017, we weren’t ranking for those keywords.

After removing the free queries, we pulled the “core” keywords to look at their differences.

March 2015 core keywords:

  • Appointment scheduling software: position 9
  • Online appointment scheduling: position 11
  • Online appointment scheduling: position 9
  • Online scheduling software: position 9
  • Online scheduler: position 9
  • Online scheduling: position 13

December 2017 core keywords:

  • Appointment scheduler: position 11
  • Appointment scheduling software: position 10
  • Online schedule: position 6
  • Online appointment scheduler: position 11
  • Online appointment scheduling: position 12
  • Online scheduling software: position 12
  • Online scheduling tool: position 10
  • Online scheduling: position 15
  • SaaS appointment scheduling: position 2

There were no particular red flags here. While some keywords moved down one or two spots, new ones jumped up. These small shifts didn’t explain the nearly 30–40% drop in organic traffic, so I checked this off my list and moved on to organic landing pages.

Top organic landing page changes

Since the dive into keyword rankings didn’t explain the decline in traffic, the next thing I looked at was the organic landing pages. I knew this client had switched CMS platforms in early 2017 and had done a few small redesign projects over the past three years.

After exporting our organic landing pages for 2015, 2016, and 2017, we compared the top ten (by organic sessions) and got the following results.

2015 top organic landing pages:

2016 top organic landing pages:

2017 top organic landing pages:

Because of the redesign, you can see that the subfolders changed from 2015/2016 to 2017. What really got my attention, however, was the /get-started page. In 2015/2016, the Get Started page accounted for nearly 16% of all organic traffic; in 2017, it was nowhere to be found.
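Eyeballing top-ten screenshots works for a handful of pages, but if you want to surface every page that used to carry organic traffic and no longer does, it’s worth scripting the comparison across the yearly exports. Here’s a minimal sketch, again with hypothetical file names and assumed "Landing Page" / "Sessions" columns:

```python
import pandas as pd

# Organic landing page exports from Google Analytics (hypothetical file names).
# Assumed columns: "Landing Page" and "Sessions".
years = {
    "2015": "organic_landing_pages_2015.csv",
    "2016": "organic_landing_pages_2016.csv",
    "2017": "organic_landing_pages_2017.csv",
}

frames = []
for year, path in years.items():
    df = pd.read_csv(path)
    df["Share"] = df["Sessions"] / df["Sessions"].sum()  # page's share of that year's organic traffic
    df["Year"] = year
    frames.append(df)

share_by_year = (
    pd.concat(frames)
      .pivot_table(index="Landing Page", columns="Year", values="Share", fill_value=0)
)

# Pages that carried meaningful traffic in 2015 but are absent from the 2017 export.
missing = share_by_year[(share_by_year["2015"] > 0.01) & (share_by_year["2017"] == 0)]
print(missing.sort_values("2015", ascending=False))
```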

If you run into this problem and notice pages missing from your current top organic landing pages, a great way to uncover why is the Wayback Machine, a free tool that lets you see what a web page looked like at earlier points in time.

When we looked at the /get-started URL in the Wayback Machine, we noticed something pretty interesting:

In 2015, their /get-started page also acted as their login page. When people were searching on Google for “[Company Name] login,” this page was ranking, bringing in a significant amount of organic traffic.

Their current setup sends logins to a subdomain that doesn’t have a GA code (as it’s strictly used as a portal to the actual application).

That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure all redirects were handled properly. Regardless of whether your traffic has changed, if you’ve recently done a website redesign that changed URLs, it’s smart to pull your top organic landing pages from before the redesign and double-check that they redirect to the correct new pages.
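To audit those redirects in bulk rather than clicking through them one by one, a short script can report where each old URL lands and whether it gets there via a 301. This is a minimal sketch; the example.com URLs are placeholders for your own pre-redesign landing pages.

```python
import requests

# Old top organic landing pages from before the redesign (placeholder URLs).
old_urls = [
    "https://www.example.com/get-started",
    "https://www.example.com/features",
    "https://www.example.com/pricing",
]

for url in old_urls:
    # Follow the redirect chain; each hop is recorded in r.history.
    r = requests.get(url, allow_redirects=True, timeout=10)
    hops = " -> ".join(f"{h.status_code} {h.url}" for h in r.history) or "no redirect"
    flag = "OK" if (r.history and r.history[0].status_code == 301 and r.status_code == 200) else "CHECK"
    print(f"[{flag}] {url}\n        chain: {hops}\n        final: {r.status_code} {r.url}")
```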

While this helped explain some of the traffic loss, the next thing we looked at was the on-page metrics to see if we could spot any obvious tracking issues.

Comparing on-page engagement metrics

Looking at the keyword rankings and organic landing pages provided a little bit of insight into the organic traffic loss, but it was nothing definitive. Because of this, I moved to the on-page metrics for further clarity. As a disclaimer, when I talk about on-page metrics, I’m talking about bounce rate, page views, average page views per session, and time on site.

Looking at the same top organic pages, I compared the on-page engagement metrics.

2015 on-page metrics:

2016 on-page metrics:

2017 on-page metrics:

While the overall engagement metrics changed slightly, the biggest and most interesting discrepancy I saw was in the bounce rates for the home page and Get Started page.

According to a number of different studies (like this one, this one, or even this one), the average bounce rate for a B2B site is around 40–60%. Seeing the home page with a bounce rate under 20% was definitely a red flag.

This led me to look into some other metrics as well. I compared key metrics between 2015 and 2017, and was utterly confused by the findings:

Looking at the organic sessions (overall), we saw a decrease of around 80,000 sessions, or 27.93%.

Looking at the organic users (overall) we saw a similar number, with a decrease of around 38,000 users, or 25%.

When we looked at page views, however, we saw a much more drastic drop:

For the entire site, we saw a 50% decrease in pageviews, or a decrease of nearly 400,000 page views.

This didn’t make much sense: even if we still had those extra 38,000 users, and each averaged roughly 2.49 pages per session (as shown above), that would account for, at most, about 100,000 more page views, leaving roughly 300,000 page views unaccounted for.
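To make that gap concrete, here’s the back-of-the-envelope arithmetic as a tiny script, using the rounded figures quoted above (and treating each lost user as roughly one lost session, as the estimate does):

```python
# Sanity check on the pageview gap, using the rounded figures quoted above.
lost_users = 38_000
pages_per_session = 2.49
observed_pageview_drop = 400_000

expected_drop = lost_users * pages_per_session        # roughly 95,000 pageviews
unexplained = observed_pageview_drop - expected_drop  # roughly 305,000 pageviews

print(f"Drop explained by lost users: ~{expected_drop:,.0f} pageviews")
print(f"Drop left unexplained:        ~{unexplained:,.0f} pageviews")
```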

This led me to believe that there was definitely some sort of tracking issue. The inflated page view count and unusually low bounce rate made me suspect that hits were being double-counted; a tracking snippet that fires twice on every page, for example, roughly doubles pageviews and pushes bounce rate toward zero, because almost no session registers as a single interaction.

However, to confirm these assumptions, I took a look at some external data sources.

Using SEMrush and Moz data to exclude user error

If you have a feeling that your tracking was messed up in previous years, a good way to confirm or deny this hypothesis is to check external sources like Moz and SEMrush.

Unfortunately, this particular client was fairly new, so their Moz campaign data didn’t go back to the high-traffic period in 2015. If it had, a good place to start would be the Search Visibility metric (as long as the primary keywords have stayed the same). If that metric has dropped drastically over the years, it’s a good indicator that your organic rankings have slipped.

Another good thing to look at is Domain Authority and the Page Authority of your core pages. If your site has been through a few redesigns, moved URLs, or anything like that, it’s important to make sure Domain Authority has carried over and that the Page Authority of your core pages hasn’t dropped. If these are much lower than they were before the organic traffic slide, there’s a good chance your redirects weren’t set up properly and authority isn’t being passed to the new URLs.

If, like me, you don’t have Moz data that dates back far enough, a good thing to check is the organic traffic cost in SEMrush.

Organic traffic cost can change because of a few reasons:

  1. Your site is ranking for more valuable keywords, making the organic traffic cost rise.
  2. More competitors have entered the space, making the keywords you were ranking for more expensive to bid on.

Usually it’s a combination of both of these.
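SEMrush doesn’t publish its exact formula, but conceptually, organic traffic cost approximates what your current organic clicks would cost if you paid for them: estimated clicks per ranked keyword multiplied by that keyword’s CPC, summed across your keywords. The sketch below is purely illustrative; the CTR-by-position curve, keywords, volumes, and CPCs are made-up numbers, not SEMrush data.

```python
# Illustrative build-up of an "organic traffic cost" style metric:
# estimated monthly clicks per ranked keyword times that keyword's CPC.
# All numbers below are hypothetical, not pulled from SEMrush.
ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

ranked_keywords = [
    # (keyword, position, monthly search volume, CPC in USD)
    ("online appointment scheduling", 12, 6600, 9.50),
    ("appointment scheduling software", 10, 4400, 12.00),
    ("online scheduling tool", 10, 1900, 8.25),
]

traffic_cost = sum(
    volume * ctr_by_position.get(position, 0.01) * cpc  # long-tail positions get a token CTR
    for _, position, volume, cpc in ranked_keywords
)

print(f"Estimated organic traffic cost: ${traffic_cost:,.0f}/month")
```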

If our organic traffic really had been steadily decreasing over the past two years, we’d likely see a similar downward trendline in organic traffic cost. However, that’s not what we saw.

In March of 2015, the organic traffic cost of my client’s site was $14,300.

In March of 2016, the organic traffic cost was $22,200.

In December of 2017, the organic traffic cost spiked all the way up to $69,200. According to SEMrush, we also saw increases in keywords and traffic.

Looking at all of this external data reaffirmed the suspicion that something must have been off with our tracking.

However, as a final check, I went back to internal metrics to see if the conversion data had decreased at a similar rate as the organic traffic.

Analyzing and comparing conversion metrics

This seemed like a natural final step in unraveling the mystery of this traffic drop. After all, organic traffic alone doesn’t make your business money (though it’s a key component); the big revenue driver is goal completions and form fills.

This was a fairly simple procedure. I went into Google Analytics to compare goal completion numbers and goal completion conversion rates over the past three years.

If your company is like my client’s, there’s a good chance you’re taking advantage of the maximum of 20 goals that can be tracked simultaneously in Analytics. To keep things simple and consistent (since goal definitions can change over time), I looked only at buyer-intent conversions. In this case, that meant Enterprise, Business, and Personal edition form fills, as well as Contact Us form fills.

If you’re doing this on your own site, I’d recommend the same approach. Gated-content goal completions usually have a natural shelf life, and that slowdown can skew the data. Look at the most important conversions on your site (usually a contact us or demo form) and go strictly off those numbers.
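If you want to run the same comparison programmatically, here’s a minimal sketch assuming a flat export (hypothetical file and column names) with one row per goal per year; it filters down to the buyer-intent goals and computes completions and conversion rate by year.

```python
import pandas as pd

# Hypothetical export: one row per goal per year, columns Year, Goal, Completions, Sessions.
goals = pd.read_csv("goal_completions_by_year.csv")

buyer_intent = ["Contact Us", "Individual Edition", "Business Edition", "Enterprise Edition"]
goals = goals[goals["Goal"].isin(buyer_intent)]

summary = goals.groupby("Year").agg(
    completions=("Completions", "sum"),
    sessions=("Sessions", "sum"),
)
summary["conversion_rate"] = summary["completions"] / summary["sessions"]
summary["completions_yoy_change"] = summary["completions"].pct_change()

print(summary)
```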

For my client, you can see those goal completion numbers below:

(Table: goal completions for Contact Us, Individual Edition, Business Edition, and Enterprise Edition, 2015–2017)

Conversion rates:

(Table: conversion rates for the same four goals, 2015–2017)
This was pretty interesting. Although there was clearly some fluctuation in goal completions and conversion rates, nothing lined up with the nearly 40,000-user drop we saw from 2015 through 2017.

All of these findings further confirmed that we had been chasing an inaccurate goal. In fact, we spent the first three months working together trying to win back a 40% loss that, quite frankly, was never there in the first place.

Tying everything together and final thoughts

For this particular case, we had to go down all five of these roads in order to reach the conclusion that we did: Our tracking was off in the past.

However, this may not be the case for your company or your clients. You may start by looking at keyword rankings and realize you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you’ve quickly found your issue, and your game plan should be investing in those core pages to get them ranking again.

If your goal completions are way down (by a similar percentage as your traffic), that’s also a good clue that your declining traffic numbers are correct.

If you’ve looked at all of these metrics and still can’t figure out the reason for the decrease, and you’re blindly trying tactics while struggling to claw your way back up, this checklist is a great way to answer the looming question: is it a tracking issue or an optimization issue?

If you’re running into a similar issue, I hope this post helps you get to the root of the problem quickly and gets you one step closer to creating realistic organic traffic goals for the future!
