We may earn money or products from the companies mentioned in this post, at no additional cost to you. As an Amazon Associate, FTB earns from qualifying purchases.
In 2018 Google launched the beta version of its new Google Search Console (GSC). It officially left beta, and the main version rolled out to everyone in 2019. The last day you can use the old GSC is March 28th, so if you're clinging to the old version, it's time to let go and embrace the new. The new version is simpler, easier to understand, and comes with a better UX design; everything is clickable, letting you dig deep into the analytics and troubleshoot any issues your site might be having. On top of that, Google has made efforts to clarify what the various errors mean, and the resources in its webmaster tools are much easier to understand.
If you have no idea what Google Search Console is, don't worry. I'm going to walk you through the basics of what it is, what it is used for, and how to get it set up. If you are already set up and using Search Console, it is still worth reviewing your configuration to make sure you set yourself up correctly the first time before digging into the new features.
- 1 What Is Google Search Console?
- 2 Why Is Google Search Console Important?
- 3 Getting Set-up on the New Google Search Console
- 4 Understanding Your New Search Console
- 5 Vocab
- 6 In Summary: Setting Up and Using the New Google Search Console in 2019
What Is Google Search Console?
Google Search Console is a tool created by Google to help you understand technical search-related data for your website. Search Console shows you the keywords people use to find your content, whether those keywords lead to people just seeing your content or actually clicking on it, the root domains that link to your website, whether your site is mobile-friendly, whether your content is appearing in Google Search, and any issues that may be preventing you from optimizing your search presence. It is a powerful tool.
It is different from Google Analytics (GA): GA tracks website traffic and user flow, while GSC tracks how your site appears on the web and helps you optimize it for SERP (Search Engine Results Page) ranking.
Why Is Google Search Console Important?
Google has been publicly pushing back against third-party tools that claim to provide in-depth looks at its algorithms and ranking system. Read more about that in my latest post about DA. However, third-party companies are often better at marketing than Google, so many bloggers think these tools will give them better insight into how they are doing. Bloggers should first be using GSC for the data and analytics it provides and supplementing their SEO game with third-party tools, rather than relying purely on third-party tools and ignoring GSC.
GSC has amazing data on the root domains that link to your site, the search terms people use to see your content, and how those impressions convert to visits, making it a tool that should be used and taken seriously if you want to increase your SERP ranking. This is incredibly helpful for optimizing old posts with terms people actually use to find your site, taking the guesswork out of which keywords to target.
Additionally, GSC indexes your site and keeps a sitemap on file, which is really the foundation of being seen on Google in the first place. If a URL is not indexed, it will most likely not show up in search results, no matter how good your SEO strategy is. If anything is preventing your site from appearing in search, GSC gives you the information and tools you need to problem-solve on your own.
Understanding and getting yourself set up on Google Search Console is the first step to developing SEO and ranking on page one. If you ask me, without this tool you are doing yourself and your blog a serious disservice.
Getting Set-up on the New Google Search Console
If you've already been using Google Search Console, all of your current data should have transferred from the old GSC to the new one, so if you set it up properly the first time, you don't need to worry about this step. BUT make sure you went through all the right motions the first time, like linking your GSC to GA, indexing your site, and adding a sitemap. Just registering your domain is not enough to make the most of this tool.
Adding Your Property to Search Console
This is the first thing you need to do to start using Google Search Console. In the new version, this step looks very different and gives you two options: you can register your domain or a direct URL. It is important to note that Google considers http://, https://, www, and non-www (and every combination of these) as separate properties.
While it really shouldn’t matter which option you choose as long as you set it up properly, we suggest using the new domain registry, but there are some different pros/cons for each.
- Domain: I recommend this option to avoid any further confusion over whether your site includes www, no www, http, or https. It is a one-stop solution that ensures data from all versions of your site is collected in one place, so you don't need to worry about http vs. https; it covers you for anything and everything. The trade-off is that it is a bit harder to verify: you must verify through your host, such as SiteGround, and that is your only verification option.
- URL Prefix: Use this if you want to verify a specific URL of your domain. The con is that if you don't set it up properly, you might collect too little data. However, you have several easy verification options, such as the Google Analytics tracking code you most likely already have on your site, or uploading a file to your site. Ideally, you have chosen a canonical URL (we suggest the form https://femaletravelbloggers.com as an example, as we find www unnecessary and you should already have SSL security set up on your website) and you use it across all Google platforms, including GSC and GA. Make sure you have the proper redirects set up with your host so that, regardless of what people type into search, they land on your chosen domain URL.
Verifying Your Site
As soon as you add your domain or URL to GSC, it automatically starts collecting data. However, in order to access this data, you need to verify that you are in fact the owner. To verify a domain, follow Google's guidelines and use the TXT record provided to you in GSC. To verify a URL prefix, we suggest using your (hopefully already verified) Google Analytics for easy verification. You can also use any of the other methods listed here.
Once your site is verified, wait at least 48 hours for Google to generate data on your site; GSC always runs about 48 hours behind, unlike GA, which provides real-time data. To confirm you have done this properly, go to Settings, where you should see a green checkmark saying "You are a verified owner."
Connecting Your Search Console to Google Analytics
Linking both Google Analytics and Google Search Console is going to give you the ultimate power tool to understand your users, what they are searching for, where they find value in your site, and any issue preventing them from accessing your valuable content.
The first step is to open your Google Analytics. From there go to your admin panel and select settings. Then choose property settings and scroll down until you find the Search Console. Follow the steps to connect your GA and your GSC.
Once you have successfully connected the two sites, you now have access to GSC search term data and statistics in your GA. You can find this new data under Acquisition –> Search Console. Remember this data will have the same 2-day delay as GSC.
What is different in the new GSC? This data only goes back as far as GSC retains it, which used to be three months. Thankfully, you now have access to 16 months of search data.
Adding Your Site Map and Indexing your Site
The last and maybe most important step in setting up your GSC is adding a sitemap and indexing your site. Thankfully, the new GSC makes this easier than ever. Just head to Sitemaps under Index in the left-hand navigation menu. From there, add your sitemap URL and hit Submit.
Check whether you have a sitemap by going to example.com/sitemap.xml. If something generates, you have a sitemap and can add sitemap.xml in GSC and hit Submit. If nothing generates, you need to make a sitemap; you can do this with the Yoast SEO plugin. Once you have generated a sitemap with the plugin, add the URL ending it provides. From there your site should begin the indexing process with Google, which is a surefire way to know your content will be seen on Google with the right SEO strategy. Without this indexing step, it is unlikely Google will surface your content for related keywords. Think of it like putting every one of your URLs and categories into a library's card catalog: if you're not in the catalog, the librarian (Google) can't find you and will serve up another related site instead.
You should see a nice green checkmark that says Site Map Index Processed Properly.
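For context, the sitemap file you just submitted is plain XML that lists your URLs. Here is a minimal Python sketch (the URLs are hypothetical, for illustration only) showing what Google actually reads from a sitemap:

```python
import xml.etree.ElementTree as ET

# A minimal sitemap.xml with two hypothetical URLs, in the format
# generated by plugins like Yoast SEO
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/paris-guide/</loc></url>
</urlset>"""

# The sitemap protocol uses this XML namespace on every element
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(SITEMAP)
# Collect every <loc> entry: these are the pages exposed to crawlers
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
print(urls)
```

If a post's URL never makes it into this file, Google has no "card catalog" entry for it, which is why re-submitting the sitemap after big site changes matters.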
Understanding Your New Search Console
As I mentioned before, the new Search Console is easy to use and makes problem-solving something you can do on your own. However, it is quite different from the old version, and this brief overview of the different tools will help kick-start your SEO game.
You’ll notice at the very top of your new GSC is a search option which is the URL inspection tool. This can also be found in the left sidebar under URL inspection. You can add any URL attached to your domain into this search and instantly get feedback on this URL.
The first thing you will see after you inspect a URL is if it is indexed and showing up on Google or not. Look under availability for more information such as:
If it is indexed and the URL is on Google. Good job: this URL is showing up in search results and you are potentially bringing in traffic. If you have blocked this URL for any reason, it will not show up in search even if it is indexed and on Google.
Remember: even if a URL is indexed on Google, if you make massive changes or an SEO update, you need to request indexing again so the new version of the post appears in search instead of an old cached version.
URL is unknown to Google. This means it is not indexed and you need to request indexing. Alternatively, it may be an alternate page with a proper canonical tag that can't be crawled. You will need to run a live inspection, troubleshoot, and then submit for indexing. If it is an alternate page with a canonical tag, you may need to inspect the original URL for issues.
If it is not indexed/on Google, but available for indexing. You will get a notification saying that the URL is not indexed but is available for indexing, and you can then request indexing. There is a lot of back and forth about whether you should manually index a new post: some say wait for Google to crawl it, others say index right away. The following is my personal opinion: if my URL isn't indexed within 24-48 hours, I manually request indexing through GSC. I've had posts start ranking and bringing in traffic within 24 hours, so I want them appearing in search as soon as possible. Why wait for Google to crawl you when you can request indexing immediately? The example I showed you was from a post I published on FTB almost a week ago; I came back to check, saw it was not indexed, and went through the process of indexing it.
If it is not indexed/on Google, and there are indexing errors. This means an error is preventing your URL from being indexed, and simply asking Google to index it will not solve the problem. You need to troubleshoot by clicking on the index coverage section and following the notifications. You can read more here.
URL is indexed/on Google, but has issues. This might be the most complicated issue. It means that your URL is on Google, but not appearing like it should be due to some technical issues. You will need to read the warnings or errors and proceed with troubleshooting.
URL is not on Google. Most likely you chose for the URL not to appear, by adding a noindex tag, hiding it behind a password-protected page, or marking it as the alternate version of original content (duplicate content with a proper canonical link). If this URL is being blocked by a noindex or a robots.txt rule that you did not intend, try requesting indexing; if that fails, troubleshoot to ensure nothing is blocking your URL from being indexed, for example by allowing it in robots.txt via GSC or removing the noindex from your page or post.
Once your URL is indexed, make sure you test the live URL and ensure that what Google sees is what your page should look like visually to your readers. If it does not look the same you may have a more complicated issue in your source code.
Lastly, the URL inspection tool will tell you if your URL is mobile-friendly. If it is not, troubleshoot by solving the issues listed in the mobile section of the URL inspection tool. This is critical to fix, as Google is prioritizing mobile-first content, but more on that later.
Hint: sometimes the mobile errors aren't valid. For example, Google was telling me I had a mobile error regarding a tag that I had deleted from my site. This was fixed simply by clicking Validate Fix.
Performance

This has always been my favorite part of GSC, and probably the most useful for practical SEO. Using this section properly means you can always refresh old posts with keywords people are actually searching for. It also gives you useful data like how many people saw your content and, of those, how many actually clicked to read more.
When you first open the Performance tab in the left-hand menu you will see a line chart with four statistics:
Clicks: This is how many people actually clicked and landed on your site.
Impressions: This is how many people typed a keyphrase into Google Search and saw a page or post of yours.
Avg CTR: This is the percentage of people who saw your site in search and actually clicked on it. The higher this percentage, the better: it means people searching for something feel your post or page will give them an answer, so they click. If the percentage is low, you can strategize ways to make your result more appealing, perhaps by improving your meta description or title, or by revamping posts with better-targeted keywords to appear in a higher position. A few sources say 2% is an average CTR for Google Ads, but I can't find any data for non-ads; my personal travel blog is between 5-10%.
Avg Position: This is the average position of your site in search results. Remember that a page of Google results has about 7-9 positions, so positions 1-7 put you on page one, 8-15 might be page two, and so on. A lower number means you appear higher in the SERP, and your CTR is most likely going to increase as a result.
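To make the relationship between these metrics concrete, here is a small Python sketch (the figures are made up for illustration) of how average CTR is derived from the clicks and impressions GSC reports:

```python
def avg_ctr(clicks: int, impressions: int) -> float:
    """Average click-through rate: clicks divided by impressions,
    expressed as a percentage."""
    if impressions == 0:
        # No one saw the result, so there is no meaningful CTR
        return 0.0
    return round(100 * clicks / impressions, 2)

# Hypothetical month of Performance data for one post:
# 1,000 people saw it in search, 50 clicked through
print(avg_ctr(50, 1000))  # 5.0, i.e. a 5% CTR, within the 5-10% range mentioned above
```

If your CTR sits well below numbers like these, that is the signal to rework titles and meta descriptions for the queries that post already ranks for.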
You can then scroll down and look at the clicks and impressions filtering by position and CTR for individual pages, countries, queries and other statistics.
Queries: This is what people are searching for to find your content. It is basically the ultimate keyword and keyphrase generator, but it is not a keyword research tool like Keysearch. We suggest using Keysearch to start ranking, then using queries to update and refresh. I go through this data monthly to see what new key phrases people are using to find my content and which have the best click-to-impression ratio. I then update some posts, adding new headers and terms to increase my CTR for those keywords. I try to include the best five key phrases in all relevant posts.
Pages: This gives you a look at which of your pages are being shown on Google. Target any pages that are being seen on Google with impressions, but aren’t getting many clicks and work on optimizing those posts to bring in more traffic.
New to Google Search Console in 2019 is the Discover feature, which can bring you lots of traffic with a really high CTR. Mine is currently at 10%, about 5% higher than my regular search traffic, and the average time spent on a featured page is about a minute longer than normal, an average increase in time on post of about 3%.
So, what is the Discover feature on Google? Google has what it calls a "Discover Page." It acts almost like Google Search, but it auto-generates topics it knows you are interested in, based on your previous searches and your general likes and interests. This is why the CTR is so high: if your post is shown to someone in their Discover feed, chances are they are interested in it and are likely to click.
If you don’t see this feature in your Google Search Console then that means you haven’t been featured on the discover page, but don’t despair, you can get there too!
So, how do you get featured on the Discover page? There is no real strategy for getting featured, which makes it both frustrating and liberating. You can't really write content specifically to be featured, but you can write content for your audience. If you write relevant content that your readers or other Google users might be interested in, you have a chance of being featured. Once a post is indexed and meets Google's content requirements, it has the potential to be featured.
Google also likes to feature images that are high quality and appealing. Think front-page-of-Google-worthy: if you have low-quality, grainy featured images with messy text, chances are you aren't going to be featured. Google prefers high-quality images that are 1200 px wide.
The index coverage report in the new Google Search Console is more or less the same, but MUCH easier to understand. Here you can find any indexing and crawling issues and fix them on the spot to ensure all the content you want appearing online actually is. Over time, as you build content on your site, you should see the number of indexed pages increase. The first time you look at this you may have many errors; over time you want to see those errors decrease. You can click on all four sections to present all the data in one chart, or deselect some to show only part of the data.
Errors: Errors are generally no-nos and should be fixed as soon as possible. They include things such as 404s, redirect errors, robots.txt errors, server errors, and a few others. To find which pages have errors, click on the error and a list of URLs where it occurs will generate. From there you have access to four troubleshooting options by clicking on an individual URL.
Fixing 404s and redirects is fairly straightforward. Usually you need to set up a 301 redirect to avoid a 404. Or perhaps you've deleted pages, posts, tags, or categories from your site with no intention of bringing them back, and Google still thinks they are part of your site because of an old index that included them. In this case, you can re-submit your sitemap so Google will crawl your site and see that those URLs, tags, and categories are no longer part of it. You can then select Validate Fix; Google takes about 48-72 hours to validate, and you should get an email update.
Robots.txt issues are a little more complicated and confusing, because you have probably never heard of or used a robots.txt file before. Basically, it is a file that blocks Google from crawling (and therefore usually from indexing) a page, post, tag, or category. You shouldn't really have many of these blocks, because if you don't want something indexed you should use a noindex instead. To fix these, click on the individual error blocked by robots.txt; a menu will pop up on the right-hand side where you can test the robots.txt rule. A new window will appear in which you can change disallowed entries to allowed in the file. At the bottom of the screen, also make sure the URL is allowed for Googlebot. After that, you can request that Google validate your fix.
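If you want to check a robots.txt rule yourself before asking Google to validate, Python's standard library can evaluate the same rules a crawler would. A minimal sketch (the rules and URLs here are hypothetical examples):

```python
from urllib import robotparser

# A hypothetical robots.txt that blocks tag archives: the kind of rule
# that shows up as "blocked by robots.txt" in GSC's coverage report
RULES = """\
User-agent: *
Disallow: /tag/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Tag archive pages are blocked for all crawlers, including Googlebot
print(rp.can_fetch("Googlebot", "https://example.com/tag/paris/"))   # False
# Regular posts are still crawlable
print(rp.can_fetch("Googlebot", "https://example.com/paris-guide/")) # True
```

If a URL you want indexed comes back False here, the fix is to remove or narrow the Disallow rule rather than repeatedly requesting indexing in GSC.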
Valid with Warnings: This is similar to the above and usually means a URL is valid but is being blocked by robots.txt. Use the settings to switch to a noindex instead of a robots.txt block, or ensure Googlebot is allowed to crawl, and then validate your fixes. This is not as big an issue as an error, and it is OK to have a few of these.
Valid: Green means good! You have no issues on these URLs and don't need to do anything here. As I mentioned before, this number should increase as you add more content to your site. You may see notifications here such as "Indexed, but not submitted" or "Indexed, marked as canonical." In that case you may need to re-submit your sitemap, or ensure you do not have duplicate content on your site without a canonical link.
Excluded: This is for anything that is valid but should not be indexed due to a noindex or the use of a page removal tool. You should see most of your tags here. Most SEO experts agree that you do not need to index tags, as it means you are potentially competing with yourself for keywords. If you ask me, tags are outdated and you honestly don't need to use them. Categories are fine to index, and if you see them excluded you may choose to include them.
For more information on coverage reports, read this.
Links

A lot of people complain about Moz never updating or reflecting the actual links pointing back to their site. I have an easy solution for that: use GSC's new link data to find out who is linking to your site, create a strategy for internal linking, and find spammy links you need to disavow or people who stole your content.
New in 2019: Export your link data into a spreadsheet to help you track links and develop a strategy to secure a new root domain. You can also click on every link to find the source of the link, down to the very page you will find it on!
Top External Links: The more external links pointing to your site, the better for SEO. It means a larger number of people treat you as an authority and recommend you to their readers.
Here you can analyze what type of content you publish that receives a lot of links. Do you notice, perhaps, that your Paris content gets a lot of links? Then maybe you can focus on publishing more Paris content to increase your backlinks.
Who is linking back to you? One link from a high-quality site is better than 10 links from a lower-quality site. Also, the number of new root domains linking back to you matters more than the total number of links: it is better to have 10 links from 10 different sites than 10 links from one site.
Top Internal Links: This helps you understand your own internal linking habits. This is a good place to make sure you have strong cornerstone content linking to relevant content on your site, and to ensure you are creating a well-rounded internal link building strategy. Are there posts that have no links to them? Why not? Are there posts linked to too many times? Are your internal links relevant?
Top Linking Sites: This helps you better understand the root domains that are linking back to you. The more sites that link back to you the better. Here you can ensure that there are no spammy sites that link back to you and you can do research on the type of site that links back to you. If all the sites are small underwhelming sites you might want to create a strategy to bring in some bigger hitting sites.
Top Linking Text: This is a great area to help you determine whether people are linking to you relevantly. If someone links to your Paris post from a post about scuba diving in Thailand, that probably isn't a quality link, and you may choose to ask for it to be removed. It is also a tool to help you determine if you have been hacked. Is the top linking text porn-related? Then you might need to disavow some links and clean up your security.
Importance of Mobile Visibility
Google has made it loud and clear that if you do not have a mobile-friendly site, you will lose ranking. Believe it or not, most of the world uses mobile devices for web surfing, and if your travel blog is not user-friendly on mobile, Google may already be penalizing you, or you may have received an email from them (if you're already linked to GSC). This part of the new Google Search Console lets you see which aspects are not user-friendly and troubleshoot them. Sometimes this can be fixed with a simple Validate Fix; other times it might mean a complete overhaul of your theme.
Make sure you read about choosing which theme is right for you, including themes that are optimized for SEO.
If you have any issues you can click on each URL and view as a search result to see how Google views your content. You can also just hop on a mobile device and look at your website and make sure there is nothing funky going on that you need to change.
Vocab

Index: Your content has been crawled by Google and appears in search results.
Noindex: Your content does not appear in search results, even though Google may have crawled it, usually because you have chosen not to index it.
Robots.txt: A file used to tell search engine crawlers what they may or may not do. If a robots.txt rule blocks your page from being crawled, Google will not index it. Google prefers you use noindex, but having robots.txt rules isn't the end of the world, and not all robots.txt blocks are bad; typically you do not want to index tags.
SERP: Search Engine Results Page is the page that is generated when you type a search query or keyphrase into a search engine. If you appear in a SERP you appear on a page based on what someone has searched. The ultimate goal of SEO is to rank higher on a SERP.
Googlebot: The crawler that visits pages and posts to learn what information is in your web content, so Google can surface you on a SERP in response to search queries. You can use code to tell it to crawl and index your content, or to crawl but not index it.
Canonical: Duplicate content is OK as long as you mark it with a canonical link. Google needs to know where to find the original content so it knows the content is not being stolen. If you cross-post content, make sure you use a canonical link so Google knows how to index it.
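In practice, a canonical is just a link tag in the page head that Google reads while crawling. This Python sketch (the page and URL are hypothetical) uses the standard-library HTML parser to pull out the canonical URL the way a crawler would:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Finds the href of a <link rel="canonical"> tag in an HTML page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# A hypothetical cross-posted page declaring where the original lives
page = ('<html><head>'
        '<link rel="canonical" href="https://example.com/paris-guide/">'
        '</head><body>Cross-posted article</body></html>')

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)
```

If the tag is missing on duplicated content, Google has to guess which copy is the original, which is exactly the situation a canonical link exists to prevent.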
In Summary: Setting Up and Using the New Google Search Console in 2019
The new Google Search Console, fully released in 2019, is a powerful tool that is now easier to use and understand. Used properly, it lets bloggers find out what people are searching for to reach their content and maximize reach by doubling down on what is working and restructuring what is not. It also gives you more control to troubleshoot and validate fixes on your own.
Remember: if you get stuck or confused, Google has a ton of resources to walk you through things. When in doubt, everything in the new Google Search Console is clickable, and I suggest you spend some time clicking around to help you troubleshoot. Not all errors are bad errors; focus on critical issues and on ensuring your main posts and pages are indexed and showing up in search.
We hope you get set up and using the new Search Console to its full potential and let us know what you think of it and if you have any questions!
Susanna Kelly is an adrenaline junkie from Alaska, on a quest to explore the great outdoors.
However, she openly admits to being a total geek at heart. Her blog, the Wandering Chocobo, focuses on adventure travel and eco-tourism, while hitting pause for what she’s defining as hipster city travel. Her hipster city guides explore craft cocktail bars, boutique hotels, markets, local businesses, and geek hideouts.
When she’s not creating content for her travel blog or freelance ventures, she likes to work on her fiction novel, LARPing and gaming, volunteering and getting to level 99 in life. She currently lives in Munich, Germany.
Connect with Susanna at her site Wandering Chocobo.