Learn More About Google Search Console: A Guide for SEO Professionals

Learn more about Google Search Console and how it can help your site perform better in searches. 

Google Search Console offers data, available nowhere else, that is required to track website performance in search and enhance search rankings. 

That makes it essential for publishers and online businesses that want to succeed in search. 

Using the free tools and reports makes it simpler to take control of your search presence. 

Table of Contents: 

• What Is Google Search Console?
• How To Get Started
• How to Confirm Website Ownership
• Troubleshooting With GSC
• Utilizing Features of the GSC
• Search Console Benefits SEO

What Is Google Search Console? 

Google Search Console is a free web tool from Google that lets publishers and search marketing experts monitor their site's general health and performance in Google Search. 

It provides a summary of indicators relating to user experience and search performance to assist publishers in enhancing their websites and boosting traffic. 

Google also uses Search Console to notify site owners when it finds security issues (such as evidence of hacking) or when the search quality team applies a manual action penalty. 

Important features: 

• Identify and fix errors.
• Monitor indexing and crawling.
• Overview of search performance.
• Request indexing of updated pages.
• Review internal and external links. 

Using Search Console is not required, nor is it a ranking factor. 

However, its value makes it essential for enhancing search performance and increasing website traffic. 

Some site hosting platforms limit what can be uploaded and require a specific way to verify site owners.

But, that’s becoming less of an issue as many hosted site services have an easy-to-follow verification process, which will be covered below.

How To Get Started 

Verifying site ownership is the first step to using Search Console. 

Google offers a variety of verification methods, depending on whether you're verifying a website, a domain, a Google site, or a Blogger-hosted site. 

When a domain is registered with Google, Search Console immediately verifies it. 

Most people will use one of four techniques to verify their websites: 

• Google Tag Manager.
• Meta tag.
• Google Analytics tracking code.
• HTML file upload.

How to Confirm Website Ownership 

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site. 

• HTML file upload.
• Meta tag. 

If you verify a site using either of these two techniques, you will select the URL-prefix property type. 

Let's acknowledge right now that, to anyone other than the Googler who coined it, the phrase "URL-prefix property" means absolutely nothing. 

Don't let that give you the impression that you are about to enter a maze while wearing blinders. It's simple to verify a website with Google.

HTML File Upload Method 

Step 1: Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Step 2: In the pop-up labeled Select Property Type, enter the URL of the site then click the Continue button.

Step 3: Select the HTML file upload method and download the HTML file. 

Step 4: Upload the HTML file to the root of your website. 

Step 5: Finish the verification process by clicking Verify back in the Search Console. 

Verifying a normal website that uses its own domain on a platform like Wix or Weebly follows a similar procedure, except that you use the meta tag method and add the verification meta tag to your site. 
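The verification meta tag itself has a standard format and goes inside the head section of the home page. Here is a sketch; the content value is a placeholder for the token Search Console generates for your site:

```html
<head>
  <!-- Verification tag copied from Search Console; the content value is a placeholder -->
  <meta name="google-site-verification" content="YOUR-TOKEN-FROM-SEARCH-CONSOLE" />
</head>
```

Once the tag is live on the home page, clicking Verify in Search Console completes the process.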

Duda has a straightforward approach that uses a Search Console app to quickly verify the website and get users up and running. 

Troubleshooting With GSC 

How well Google can crawl and index a website affects how it ranks in search results. 

The Search Console URL Inspection Tool flags crawling and indexing problems before they become serious and cause pages to drop out of the search results. 

URL Inspection Tool 

The URL inspection tool shows whether a URL is indexed and is eligible to be shown in a search result. 

For each submitted URL a user can: 

• Check mobile usability status.
• Check enhancements like breadcrumbs.
• Request indexing for a recently updated webpage.
• View how Google discovered the webpage (sitemaps and referring internal pages).
• View the last crawl date for a URL.
• Check if Google is using a declared canonical URL or is using another one. 


The coverage section has three subsections: Discovery (how Google found the URL), Crawl (if Google successfully crawled the URL and, if not, why), and Enhancements (structured data status).

The coverage section can be reached from the left-hand menu. 

Coverage Error Reports 

Even though these reports are labeled as errors, there may not actually be a problem. Sometimes they simply indicate that indexing can be improved. 

For instance, in one example report, Google showed a 403 Forbidden server response for about 6,000 URLs. 

When a server returns a 403 error, it signifies that access to those URLs is forbidden, so they cannot be crawled.

In this example, the errors occurred because an online forum's member pages are not crawlable by Googlebot. 

Every forum user has a member page with a list of their most recent posts and other information. 

A list of the URLs that are causing the error is included in the report.

Clicking one of the listed URLs opens a panel on the right with the option to view the affected URL. 

Additionally, there is an Inspect URL option in the contextual menu that appears to the right of the URL itself, represented by a magnifying glass icon.

Clicking on the Inspect URL reveals how the page was discovered. 

It also shows the following data points: 

• Last crawl.
• Crawled as.
• Crawl allowed?
• Indexing allowed?

There is also information about the canonical used by Google: 

• User-declared canonical.
• Google-selected canonical. 

In the forum example above, the crucial diagnostic data can be found in the Discovery section. 

This section lists the pages where Googlebot found links to the member profiles. 

With this knowledge, the publisher can write a PHP statement that hides the links to the member pages whenever a search engine bot crawls the site. 

Writing a new entry in the robots.txt file to prevent Google from attempting to crawl these pages is another option to solve the issue. 
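For example, if the member pages all live under a path such as /members/ (a hypothetical path for illustration), the robots.txt entry could look like this:

```
User-agent: Googlebot
Disallow: /members/
```

This tells Googlebot not to request any URL under that path, which stops the 403 errors from accumulating in the coverage report.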

Fixing this 403 error frees Googlebot to spend its crawling time indexing the rest of the website. 

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them. 

Fixing 404 Errors 

In addition to notifying a publisher of 404 and 500 series error responses, the coverage report can also confirm that everything is fine. 

A 404 server response is called an error only because the browser or crawler requested a page that doesn't exist. 

It doesn’t mean that your site is in error. 

The coverage report will display a 404 response if another website (or an internal link) points to a page that doesn't exist. 

You can find out which pages (or sitemaps) refer to the nonexistent page by clicking one of the affected URLs and using the Inspect URL tool. 

From there, you can decide whether a broken internal link needs to be fixed or, in the case of an external link from another website, whether the URL should be redirected to the right page. 

It's also possible that the page never existed and the person who linked to it made a mistake. 

It's acceptable to display a 404 response if the page is either no longer available or has never existed. 

Utilizing Features of the GSC 

The Performance Report 

The top section of the Search Console Performance Report provides extensive detail on a site's performance in search, including search features such as featured snippets. 

There are four search types that can be explored in the Performance Report: 

• Image.
• Video.
• Web.
• News. 

By default, Search Console displays the web search type.

Click the Search Type button to switch the search type that is displayed. A menu pops up from which you can select a different search type to view.

A handy feature is the ability to compare the performance of two search types within the same graph. 

Four metrics are prominently displayed at the top of the Performance Report: 

• Total Impressions.
• Total Clicks.
• Average position.
• Average CTR (click-through rate). 

The Total Clicks and Total Impressions metrics are chosen by default. 

One can select which metrics to display on the bar chart by clicking within the tabs for each metric. 


The impressions metric indicates how often a website appears in the search results. An impression is counted whenever a user could view the URL, even without clicking it. 

Additionally, even if a URL appears at the bottom of the page and the visitor doesn't scroll down to see it, an impression is still recorded. 

A high impression count is a positive sign, since it means Google is displaying the website in the search results. 

However, the Clicks and Average Position metrics add significance to the impressions measure. 


The clicks metric shows how often users clicked through from the search results to the website. A lot of clicks on top of a lot of impressions is a good sign. 

A low click-through rate combined with a high impression rate is less desirable but still acceptable. It implies that changes may be necessary to the site in order to increase visitors. 

When average CTR and average position are taken into account, the clicks metric has additional meaning. 

Average CTR 

Average CTR is a percentage showing how often people who saw the site in the search results clicked through to it. 

If the CTR is low, something must be improved in order to increase visits from the search results. 

A higher CTR means the site is performing well. 

When paired with the Average Position metric, this number becomes more meaningful. 
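To make the relationship concrete, average CTR is simply clicks divided by impressions, expressed as a percentage. A minimal sketch with made-up numbers:

```python
def average_ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage of impressions."""
    if impressions == 0:
        return 0.0  # no impressions means no measurable CTR
    return clicks / impressions * 100

# Example: 150 clicks out of 10,000 impressions is a 1.5% CTR
print(average_ctr(150, 10_000))
```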

Average Position 

Average Position shows the average position at which a website appears in the search results. 

An average in the range of 1 to 10 is excellent. 

An average position in the twenties (20-29) means the site tends to show up on page two or three of the search results. That's not so bad; it simply means the website needs more work to move up into the top 10. 

Average positions beyond 30 generally indicate that the site could use major improvement. 

Or, it's possible that the website ranks for a lot of low-quality keyword phrases and a select few excellent keywords that rank extraordinarily well. 

In either scenario, it may mean paying closer attention to the content. It might be a sign that the website has a content gap: the material ranking for particular keywords isn't strong enough and might require a dedicated page for that keyword phrase to rank better. 

When evaluated collectively, the four indicators (Impressions, Clicks, Average CTR, and Average Position) provide a useful summary of the website's performance. 

The main takeaway is that the Performance Report can serve as a springboard for quickly understanding how well a website performs in search. 

It’s like a mirror that reflects back how well or poorly the site is doing. 

Performance Report Dimensions 

Scrolling down to the second section of the Performance page reveals several dimensions of a website's performance statistics. 

There are six dimensions: 

1. Queries: displays the most popular search terms along with the quantity of clicks and impressions connected to each keyword phrase. 

2. Pages: Shows the top-performing web pages (plus clicks and impressions). 

3. Devices: Shows the top devices, segmented into mobile, desktop, and tablet. 

4. Countries: Top countries (plus clicks and impressions).

5. Dates: The clicks and impressions are arranged chronologically on the dates tab. It is possible to sort the clicks and impressions in either descending or ascending order. 

6. Search Appearance: Displays the various rich result types the site appeared in. It also indicates whether Google displayed the website in Web Light or video results, along with the related click and impression data. Web Light results are versions of pages optimized for extremely slow devices. 


The keywords are shown in the Queries dimension of the Performance Report (as noted above). The Queries report displays the top 3,000 search terms that led to traffic. 

The underperforming queries are of particular importance. 

Some of those queries show low traffic volumes because they are uncommon; this type of traffic is referred to as long-tail traffic. 

However, other queries point to pages that may need improvement, may require additional internal links, or may indicate that the keyword phrase deserves its own webpage. 

It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic. 
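The idea of spotting quick wins can be sketched in a few lines: filter the query data for terms with many impressions but a low CTR. The rows and thresholds below are hypothetical:

```python
# Hypothetical rows, shaped like the Performance Report's Queries dimension.
rows = [
    {"query": "blue widgets", "impressions": 12000, "clicks": 60},
    {"query": "widget repair", "impressions": 300, "clicks": 45},
    {"query": "buy widgets online", "impressions": 8000, "clicks": 500},
]

def quick_wins(rows, min_impressions=1000, max_ctr=2.0):
    """Return queries that are seen often in search but rarely clicked."""
    wins = []
    for row in rows:
        ctr = row["clicks"] / row["impressions"] * 100
        if row["impressions"] >= min_impressions and ctr < max_ctr:
            wins.append(row["query"])
    return wins

print(quick_wins(rows))  # ['blue widgets']
```

A query like this ranks well enough to be seen but attracts few clicks, which often points to a weak title or snippet.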


Links Report 

Search Console provides a list of all links pointing to the website. 

It's important to note that the Links report doesn't identify which links are actually supporting the site's ranking. 

It simply lists every link pointing to the website. 

This means some of the links on the list may not be helping the site's ranking at all, which is why links with a nofollow attribute can appear in the report. 

The Links report can be accessed from the bottom of the left-hand menu. 

Internal Links and External Links are the two columns in the Links report. 

External links are those that direct users to a website from elsewhere. 

Internal links are those that start on one page of a website and point to another page on the same website. 

The External links column has three reports: 

• Top linking sites.
• Top linked pages.
• Top linking text. 

The Top Linked Pages are listed in the Internal Links report. 

Each report (top linked pages, top linking sites, etc.) has a link to more results that can be clicked to view the expanded report for that type. 

For instance, the Top Target Pages, or the pages from the site that are most linked to, are displayed in the expanded report for Top Linked Pages. 

When a URL is clicked, the report is updated to show every external domain that links to that particular page. 

The report displays only the domain of the external site that links in, not the precise page containing the link. 


Sitemaps Report 

A sitemap, usually an XML file listing URLs, helps search engines discover the webpages and other content on a website.

Sitemaps are especially useful for huge websites, for sites that are challenging to crawl, and for sites that frequently add new content. 

A sitemap does not guarantee crawling or indexing. The extent to which a site is crawled and its pages are indexed depends on factors including page quality, overall site quality, and links. 

Sitemaps do nothing more than make it simple for search engines to find certain pages. 
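A minimal XML sitemap looks like the following; the URLs are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Each loc entry is one URL you want search engines to discover; lastmod is optional.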

Creating a sitemap is simple, because most CMSs, plugins, and website platforms generate one automatically.
Some hosting services create sitemaps for every website hosted on their platform and update them whenever the website changes. 

Publishers can upload a sitemap through Search Console, which also includes a sitemap report. 

Click the link in the menu on the left to access this feature. 

Any sitemap errors will be reported in the sitemap section. 

Search Console lets you remove a sitemap from the reports. However, it's important to also delete the sitemap from the website itself; otherwise, Google may remember it and crawl it again. 

After a sitemap has been submitted and processed, the Coverage report will populate a sitemap section, which helps troubleshoot any issues with URLs submitted through sitemaps. 

Page Experience Report 

The page experience report provides information on the website user experience in relation to site speed. 

Information about Core Web Vitals and Mobile Usability can be seen in Search Console. 

This is a nice place to start if you want a general overview of how fast the site is. 

Rich Results Status Reports 

Search Console provides feedback on rich results through the Performance Report. Search Appearance is one of the six dimensions listed below the graph at the top of the page. 

The clicks and impressions data for the various types of rich results shown in the search results can be found by selecting the Search Appearance tab. 

This report explains the value of rich results traffic to the website and can assist in determining the cause of particular trends in website traffic. 

The Search Appearance report can aid in the diagnosis of structured data-related problems. 

For instance, a decline in traffic to rich results could indicate that Google has modified its requirements for structured data and that the structured data needs to be updated. 

It’s a starting point for diagnosing a change in rich results traffic patterns. 

Search Console Benefits SEO 

Beyond the advantages described above, publishers and SEOs can use Search Console to upload link disavow files, address penalties (manual actions), and handle security incidents like site hacks, all of which help their search presence. 

Every web publisher worried about search visibility should make use of this beneficial service.

