How to Use Google Search Console: The Complete GSC SEO Guide
Google Search Console (GSC) is a free tool from Google that gives you a direct window into how Googlebot sees your website. It shows which search queries drive traffic to your pages, which pages are indexed, what crawl errors exist, how your pages score on Core Web Vitals, and whether Google has applied any manual actions. Whether you are a beginner launching your first site or an experienced SEO professional managing hundreds of pages, this Google Search Console guide covers everything you need to use the tool confidently and effectively.
This Search Console tutorial walks through every major report, the key workflows, and the most common mistakes, with practical step-by-step instructions throughout. By the end you will know how to use GSC not just for monitoring, but as an active driver of keyword research, content optimisation, and technical SEO improvements.
What Is Google Search Console?
Google Search Console, previously known as Google Webmaster Tools, is a property management and diagnostic platform within Google Search. It is completely free and available to any verified owner or delegated user of a website. Unlike Google Analytics, which tracks user behaviour after a visit, GSC focuses on what happens before and during the Google crawl — how Googlebot discovers, crawls, and indexes your content, and how that content then performs in the Search Engine Results Pages (SERPs).
Understanding what GSC reports is important. All data inside GSC is based on Google's own crawl data and search data. This makes it the single most authoritative source for:
- Seeing exactly which search queries triggered impressions or clicks to your site
- Confirming which URLs Google has indexed — and why others are excluded
- Identifying crawl errors, redirect issues, and server errors that Google encountered
- Monitoring Core Web Vitals from real-user (field) data as Google measures them
- Receiving notifications about security issues and manual actions
No third-party SEO tool can replicate this data directly — it comes straight from Google. That is why setting up GSC should be the first thing you do after launching any new website. Learn the fundamentals in our SEO glossary if any terms below are unfamiliar.
Setting Up Google Search Console
Before you can use any of the reports in this GSC SEO guide, you need to add your website as a property and verify that you own it. This section covers both property types and all four verification methods.
Step 1: Choose a Property Type
When you visit search.google.com/search-console and click "Add property", you are prompted to choose between two property types:
- Domain property — covers all URLs across all subdomains (www, blog, shop, m, etc.) and all protocols (http and https). You verify this via a DNS TXT record only. This is the recommended option for most sites because it gives a unified view of all your Google Search traffic in one place.
- URL-prefix property — covers only the exact URL prefix you enter (e.g., `https://www.example.com/`). HTTP and HTTPS are treated separately. Multiple verification methods are available. Use this if you only want to monitor a specific subdomain or subdirectory, or if you cannot add DNS records.
Step 2: Verify Ownership (4 Methods)
Google requires proof that you own or control the property before showing you its data. There are four main verification methods for URL-prefix properties and one for Domain properties.
Method 1: DNS TXT Record (Recommended for Domain Properties)
- In GSC, select "Domain property" and enter your root domain (e.g., `example.com` without www).
- Google displays a TXT record value such as `google-site-verification=aBcDeFgHiJkLmNoPqRsTuV`.
- Log in to your domain registrar's DNS management panel (GoDaddy, Namecheap, Cloudflare, etc.).
- Create a new TXT record: set the host/name to `@` (representing the root domain) and paste the value Google provided.
- Save the record. DNS propagation can take anywhere from a few minutes to 48 hours, though it is usually under an hour.
- Return to GSC and click "Verify".
Method 2: HTML File Upload
- Download the HTML verification file Google provides (e.g., `googleXXXXXXXXXXXXXXXX.html`).
- Upload it to the root of your website so it is accessible at `https://www.example.com/googleXXXXXXXXXXXXXXXX.html`.
- Do not rename the file or add content to it.
- Click "Verify" in GSC.
- Keep the file in place permanently — removing it will cause verification to fail.
Method 3: HTML Meta Tag
- Google provides a meta tag like `<meta name="google-site-verification" content="XXXXXX" />`.
- Paste it into the `<head>` section of your homepage's HTML, before the `</head>` tag.
- In WordPress, use a plugin like Yoast SEO or Rank Math, which have a dedicated field for this, or add it via your theme's header.php.
- Once published, click "Verify" in GSC.
- Like the HTML file method, this tag must remain in the page permanently.
Method 4: Google Analytics or Google Tag Manager
If you already have Google Analytics (GA4) or Google Tag Manager (GTM) implemented on your site with the same Google account, you can use these to verify GSC ownership instantly:
- Google Analytics: Your GA tracking code must be in the `<head>` section and the GA property must be associated with the same Google account you are using in GSC.
- Google Tag Manager: Your GTM container snippet must be published and the Google account must match.
This is the fastest method if you already use GA4 or GTM. The verification remains active as long as the tracking code is present on your site.
Step 3: Add Users and Permissions
Once verified, navigate to Settings > Users and permissions to give access to developers, SEO agencies, or team members. Permission levels are:
- Owner — full access including adding/removing users and managing the property.
- Full user — can view all data and take most actions but cannot manage users.
- Restricted user — read-only access to most reports.
Submitting Your Sitemap
After verifying your property, the first action to take is submitting your XML sitemap. A sitemap is a file that lists all the important URLs on your site so Google can discover and crawl them efficiently. This is especially valuable for new sites, large sites, or sites with pages that are not well linked internally.
- In GSC, navigate to Indexing > Sitemaps in the left sidebar.
- In the "Add a new sitemap" field, enter the path to your sitemap. Common locations are `sitemap.xml`, `sitemap_index.xml`, or `sitemap.xml.gz`. If using WordPress with Yoast, the sitemap is usually at `/sitemap_index.xml`.
- Click "Submit".
- GSC will show the submission status: "Success", "Couldn't fetch", "Has errors", or "Pending".
- After a successful fetch, GSC displays the number of URLs discovered from the sitemap versus the number actually indexed. A large gap between these two numbers is a signal worth investigating.
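If you want to sanity-check the discovered count yourself, you can count the URLs in a sitemap file locally before comparing it against the indexed figure in GSC. A minimal Python sketch using only the standard library; the example sitemap here is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Standard sitemaps declare this XML namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text: str) -> int:
    """Count <loc> entries in a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))

# Hypothetical two-URL sitemap for demonstration.
example = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

print(count_sitemap_urls(example))  # 2
```

In practice you would read the file from disk (or fetch your live sitemap) and compare the count against the "discovered" number GSC reports.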
For a full walkthrough of creating and submitting sitemaps, see our guide on how to submit your sitemap to Google.
The Performance Report
The Performance report is the most-used section of Google Search Console and the heart of any GSC SEO workflow. It shows aggregated data on how your website performs in Google Search over a selected time period. The default view covers the last 3 months.
The Four Core Metrics
| Metric | Definition | What to Watch For |
|---|---|---|
| Total Clicks | Number of times users clicked through to your site from Google Search results | Declining clicks despite stable impressions signals a CTR problem |
| Total Impressions | How many times your URLs appeared in Google search results | Rising impressions without clicks may mean you rank but not in the top 3 |
| Average CTR | Clicks divided by impressions, expressed as a percentage | CTR below 2% for positions 1-5 suggests weak title tags or meta descriptions |
| Average Position | Mean ranking position across all queries that triggered an impression | Averages can mask individual query movement — always drill into queries |
Queries Tab
The Queries tab is where this Search Console tutorial delivers the most value for keyword research. Switch to the Queries view to see every search term that generated an impression or click to your site. Key actions:
- Sort by Impressions to find queries where you appear frequently but get few clicks — these are optimisation opportunities.
- Filter by Position using the "Position" filter to isolate queries ranked 4–20 (page one bottom and page two). These are your "striking distance" keywords — small improvements to the page could move them into the top three, dramatically increasing clicks.
- Compare date ranges using the date filter to spot queries that lost or gained position after a content update or Google algorithm update.
- Export to CSV for further analysis in a spreadsheet. GSC shows a maximum of 1,000 rows in the UI but the export includes up to 50,000 rows, giving far more data for large sites.
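Once you have the CSV export, the striking-distance filter described above is easy to automate. A Python sketch, assuming the column headings of a standard GSC Queries export (Top queries, Clicks, Impressions, CTR, Position); the sample rows are invented for illustration:

```python
import csv
import io

def striking_distance(csv_text: str, lo: float = 4.0, hi: float = 20.0):
    """Return rows ranked between `lo` and `hi`, sorted by impressions.

    Assumes the column names used by a GSC Queries export:
    Top queries, Clicks, Impressions, CTR, Position.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    hits = [r for r in rows if lo <= float(r["Position"]) <= hi]
    return sorted(hits, key=lambda r: int(r["Impressions"]), reverse=True)

# Hypothetical export snippet for demonstration.
export = """Top queries,Clicks,Impressions,CTR,Position
gsc guide,120,4000,3%,2.1
submit sitemap,40,9000,0.4%,8.5
url inspection tool,10,2500,0.4%,14.2
"""

for row in striking_distance(export):
    print(row["Top queries"], row["Position"])
```

Running this on a real export surfaces the high-impression, page-one-bottom queries worth optimising first.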
Pages Tab
Switch to the Pages tab to see performance broken down by URL rather than query. This view answers questions like "How many clicks does my homepage get?" and "Which blog posts drive the most organic traffic?". Click any individual page to see which queries are driving impressions and clicks to that specific URL — essential for content audits.
Countries Tab
The Countries tab breaks down your performance by the country where searchers are located. Useful for:
- Identifying unexpected international traffic that might justify creating localised content or implementing hreflang tags.
- Checking whether a geo-targeted site is performing in its target country.
- Comparing position differences across markets for the same queries.
Devices Tab
The Devices tab splits data across Desktop, Mobile, and Tablet. Most sites see 50–70% of impressions from mobile. If your mobile CTR is significantly lower than desktop CTR at the same positions, this points to a mobile experience issue — either your titles/descriptions are truncating poorly on mobile, or the page experience discourages clicks.
Search Type Filter
At the top of the Performance report you can switch between Web, Image, Video, News, and Discover. The default view is Web search. If your site uses structured data for recipes, products, or articles, check the Image and Discover views to understand additional traffic sources.
Date Ranges and Comparisons
GSC stores 16 months of data. Use the date picker to compare two periods side by side (e.g., this month versus last month, or this quarter versus the same quarter last year). This is the fastest way to identify whether a content change, a site migration, or a Google core update caused a traffic shift.
The Pages (Indexing) Report
Understanding which pages Google has indexed — and which it has deliberately or accidentally excluded — is a foundational part of technical SEO. The Pages report (found under Indexing > Pages in the left sidebar) shows this in detail.
The report is split into two groups:
- Indexed — URLs that Google has crawled and included in its index. These are eligible to appear in search results.
- Not indexed — URLs Google knows about but has excluded from its index, grouped by reason.
For more background on what indexing means and why it matters, read our explainer on what is indexing in SEO.
Common "Not Indexed" Reasons and How to Fix Them
| Status | Meaning | Action |
|---|---|---|
| Crawled – currently not indexed | Google crawled the page but chose not to index it (often thin content) | Improve content quality and depth, then request re-indexing |
| Discovered – currently not indexed | Google knows the URL exists but has not yet crawled it | Improve internal links to the page; submit sitemap; use URL Inspection |
| Excluded by 'noindex' tag | A noindex directive on the page told Google not to index it | Remove the noindex tag if the page should be indexed |
| Blocked by robots.txt | Your robots.txt is preventing Googlebot from crawling the page | Update robots.txt to allow crawling if the page should be indexed |
| Page with redirect | The URL redirects to another URL | Normal for redirected pages; ensure destination page is indexed |
| Soft 404 | Page returns a 200 status but appears to have no useful content | Return a proper 404/410 or add meaningful content |
| Duplicate, Google chose different canonical than user | Google selected a different version as the canonical | Review canonical tags and consolidate duplicate content |
| Alternate page with proper canonical tag | Page is intentionally treated as a duplicate; canonical points elsewhere | Normal if set up intentionally; verify canonical tag is correct |
| Not found (404) | Page returns a 404 error | Either recreate the page, redirect to a relevant page, or leave if content is obsolete |
To verify whether a specific URL is indexed right now, use the URL Inspection tool (covered below). You can also check individual URLs at scale using our how to check if a page is indexed by Google guide.
Debugging Indexing Issues Step by Step
- Open the Pages report and click on a specific "Not indexed" reason to see which URLs are affected.
- Export the URL list for that reason to CSV.
- For a sample of URLs, run the URL Inspection tool to get a detailed crawl report.
- Check the page's rendered HTML (available in URL Inspection) to confirm Google can see the content.
- Review the page's canonical tag, noindex tag, and robots.txt allowance.
- Fix the issue, then either resubmit the URL via URL Inspection or resubmit your sitemap.
- Monitor the Pages report over the following weeks to confirm the fix is working.
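The robots.txt check in step 5 can be scripted for a batch of URLs using Python's standard-library `urllib.robotparser`. In this sketch the robots.txt content is a hypothetical string parsed directly, so nothing is fetched over the network:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
robots_txt = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

urls = [
    "https://www.example.com/blog/post",
    "https://www.example.com/private/page",
]
for url in urls:
    # can_fetch applies the rules for the given user-agent to the URL path.
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "blocked by robots.txt")
```

Any URL reported as blocked here but expected to be indexed points to a robots.txt fix, matching the "Blocked by robots.txt" status in the Pages report.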
A thorough indexing audit is part of any full technical review. Run a site audit to catch indexing-related issues across your whole site at once.
The URL Inspection Tool
The URL Inspection tool is one of the most powerful features in GSC. It lets you inspect any individual URL in your property to see exactly what Google knows about it. Access it by clicking "URL Inspection" in the left sidebar, or by clicking on any URL within the Pages report.
What URL Inspection Shows
- Index status — whether the URL is indexed, and if not, why.
- Crawl details — the last time Googlebot crawled the page, which Googlebot user-agent was used, and the crawl source.
- Canonical URL — both the user-declared canonical (from your `<link rel="canonical">` tag) and Google's selected canonical. Mismatches here explain why some pages are not being indexed as expected.
- Page fetch and render — clicking "Test Live URL" fetches the current live version of the page as Googlebot and shows you the rendered HTML. This is invaluable for diagnosing JavaScript rendering issues where content that you can see in a browser is not visible to Googlebot.
- Referring sitemaps — shows which sitemaps list this URL, helping you confirm sitemap coverage.
- Enhancements — lists any eligible rich result types detected on the page (structured data).
Requesting Indexing via URL Inspection
After fixing an issue on a page — whether it was a noindex tag, a broken canonical, a thin content problem, or anything else — use URL Inspection to request that Google re-crawl and re-index the updated page:
- Enter the URL in the URL Inspection bar at the top of the screen.
- Wait for the inspection to complete.
- Click "Request Indexing".
- Google adds the URL to a priority crawl queue. Indexing usually happens within hours to a few days, though there is no guarantee of timing.
Core Web Vitals Report
Core Web Vitals are a set of real-world performance metrics that Google uses as page experience signals. They are measured using field data from the Chrome User Experience Report (CrUX) — meaning they reflect actual user experience, not just lab conditions. The three metrics are:
- Largest Contentful Paint (LCP) — measures loading performance. The LCP should occur within 2.5 seconds of when the page starts loading. LCP is commonly the main image, hero banner, or largest block of text.
- Interaction to Next Paint (INP) — measures responsiveness to user interactions. A good INP score is 200 milliseconds or less. INP replaced First Input Delay (FID) as a Core Web Vital in 2024.
- Cumulative Layout Shift (CLS) — measures visual stability. A good CLS score is 0.1 or less. High CLS is often caused by images without explicit dimensions, dynamically injected content above the fold, or late-loading web fonts.
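The three thresholds above can be expressed as a small classification helper. A sketch that buckets a 75th-percentile field value into the same Good / Needs improvement / Poor groups the GSC report uses (metric names and units as listed above):

```python
# Google's published thresholds per metric: (good_max, poor_min).
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def classify(metric: str, value: float) -> str:
    """Bucket a field-data value the way the Core Web Vitals report does."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "Good"
    if value <= poor_min:
        return "Needs improvement"
    return "Poor"

print(classify("LCP", 2.1))   # Good
print(classify("INP", 350))   # Needs improvement
print(classify("CLS", 0.3))   # Poor
```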
For deeper background on these metrics and how to improve them, read our guide to what are Core Web Vitals.
Reading the Core Web Vitals Report
The Core Web Vitals report in GSC (found under Experience > Core Web Vitals) shows URLs classified as:
- Good — all three metrics pass their thresholds.
- Needs improvement — one or more metrics fall in the middle range.
- Poor — one or more metrics fail their thresholds.
GSC groups URLs by issue type (e.g., "LCP issue: longer than 4s") and shows representative URLs for each group. This grouping is important: you do not need to fix every affected URL individually. Fixing the underlying performance issue (such as optimising the server response time or compressing images) typically resolves the issue across the entire group.
Steps to address Core Web Vitals issues:
- Open the Core Web Vitals report and click into the "Poor" URLs section.
- Click on a specific issue to see which URLs are affected.
- Click a representative URL and run "Test Live URL" via URL Inspection to see the rendered page.
- Use Google PageSpeed Insights or Lighthouse to get specific recommendations for the affected pages.
- Implement fixes (image optimisation, deferred JavaScript, explicit image dimensions, font-display settings, etc.).
- Click "Validate Fix" in GSC after deploying changes. GSC will re-check a sample of URLs over the following 28 days and update the status.
The Links Report
The Links report (found under Links in the left sidebar) shows two categories of links Google has discovered on your site:
External Links
External links (also called backlinks) are links from other domains pointing to your site. The GSC Links report shows:
- Top linked pages — your most-linked URLs, sorted by the number of linking domains. These are usually your strongest pages in terms of authority.
- Top linking sites — which domains link to you most frequently.
- Top linking text — the anchor text used in links pointing to your site. A natural link profile has diverse anchor text; if you see over-optimised exact-match anchor text, this can be a risk signal.
Internal Links
Internal links are links between pages within your own site. GSC shows how many internal links point to each of your pages. Pages with very few internal links are harder for Googlebot to discover and are typically weaker in terms of PageRank distribution. Use this data to:
- Identify important pages that are "orphaned" (receiving zero or very few internal links) and add links to them from relevant content.
- Prioritise internal linking to pages that are stuck in position 5–15 in the Performance report — boosting internal link equity to them can improve rankings.
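Orphan detection is a simple set comparison once you have two lists: the URLs you expect to be linked (for example, from your sitemap) and the internal link counts exported from this report. A sketch with invented example data:

```python
# Hypothetical inputs: URLs from your sitemap, and internal link counts
# as exported from the GSC Links report.
sitemap_urls = {
    "https://www.example.com/",
    "https://www.example.com/pricing",
    "https://www.example.com/blog/old-post",
}
internal_link_counts = {
    "https://www.example.com/": 54,
    "https://www.example.com/pricing": 12,
}

# Orphans: listed in the sitemap but receiving zero internal links.
orphans = sorted(u for u in sitemap_urls if internal_link_counts.get(u, 0) == 0)
print(orphans)  # ['https://www.example.com/blog/old-post']
```

Each orphan found this way is a candidate for new internal links from relevant, well-linked content.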
Manual Actions
Manual Actions are penalties applied by human reviewers at Google when a site violates Google's spam policies. Unlike algorithmic penalties (such as those from the Helpful Content system or Panda/Penguin updates), Manual Actions are applied deliberately and are listed explicitly in GSC.
Find them under Security & Manual Actions > Manual Actions. If the report shows "No issues detected", your site has no active manual actions — which is the normal, healthy state. If an issue is listed, you will see:
- The type of violation (e.g., "Unnatural links to your site", "Pure spam", "Thin content with little or no added value", "Cloaking and/or sneaky redirects")
- Whether it affects the whole site or specific pages
- A description of what was found
To resolve a Manual Action:
- Read the specific policy violation described in the notice carefully.
- Make the necessary corrections to your site (remove spammy content, clean up unnatural links, fix cloaking, etc.).
- Submit a Reconsideration Request via the Manual Actions report. Explain what the problem was, what you did to fix it, and evidence that the issues are resolved.
- Google typically responds within a few weeks. If the review is successful, the manual action is removed and rankings can recover. If denied, the response will explain what further action is needed.
Security Issues
Adjacent to Manual Actions, the Security Issues report (under Security & Manual Actions > Security Issues) alerts you if Google detects that your site has been hacked or is serving malware. Common types include:
- Hacked content (spam pages injected by attackers)
- Malware (malicious code injected into your site)
- Social engineering / phishing pages
If Google detects a security issue, it may show a "This site may be hacked" or "Dangerous site" warning in search results, which devastates click-through rates. Resolving security issues quickly is critical. Clean the infection, check all files and the database, update all passwords and access credentials, then use the Security Issues report to request a review once the site is clean.
The Removals Tool
The Removals tool (found under Indexing > Removals) allows you to temporarily hide URLs from Google Search results. It is a blunt instrument and should only be used in specific circumstances:
- Temporary removal — hides a URL from search results for approximately six months. This is useful for content that needs to be taken down urgently (e.g., accidentally published private data) while you implement a permanent solution. After six months, if the page is still live and indexed, it will reappear in results.
- Outdated content — lets you request removal of cached versions of a page that no longer exists, or removal of content in snippets that has been updated on the live page.
The Removals tool does not permanently remove pages from the index. For a permanent solution, either delete the page and return a 410 status code, add a noindex tag, or use a canonical tag pointing to a preferred URL.
Using GSC for Keyword Research
Many SEOs overlook that GSC is one of the best free keyword research tools available — because it shows real queries your site already ranks for, with click, impression, and position data that no third-party tool can access.
Finding Keyword Opportunities with GSC
- Open the Performance report and select the maximum date range (16 months).
- Switch to the Queries tab and sort by Impressions descending.
- Add a filter: Position > 4 AND Position < 20. This isolates queries where you rank on page one but outside the top three, or on page two — the highest-leverage opportunities.
- Look for high-impression queries in positions 5–15 where your CTR is below the expected rate. These pages are "almost there" and targeted content improvements can produce meaningful ranking gains.
- Click on any query to see which specific URLs it is driving impressions to, then click through to that page in the Pages tab to see its full query mix.
Identifying Content Gaps and Cannibalization
In the Pages tab, filter by a specific page and look at its Queries. If a single page ranks for hundreds of loosely related queries spanning very different intents, it may be diluting its authority. Consider splitting content into dedicated pages for each distinct topic. Conversely, if two different pages on your site are both ranking for the same core query, you may have keyword cannibalization — consolidate the content onto one authoritative page and redirect the weaker one.
Tracking Branded vs. Non-Branded Traffic
Use the query filter to segment branded queries (queries containing your brand name) from non-branded queries. Non-branded organic traffic represents demand that is not already aware of your brand — this is typically the most commercially valuable segment to grow. Track the non-branded click trend over time to measure genuine SEO progress.
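The branded/non-branded split can also be reproduced offline on an exported query list. A Python sketch; the brand pattern (including a spaced variant) and the click counts are hypothetical:

```python
import re

# Hypothetical brand pattern; include spacing variants and misspellings.
BRAND = re.compile(r"ranknibbler|rank nibbler", re.IGNORECASE)

# query -> clicks, invented for illustration.
queries = {
    "ranknibbler rank checker": 300,
    "how to submit a sitemap": 120,
    "google search console guide": 90,
}

branded = {q: c for q, c in queries.items() if BRAND.search(q)}
non_branded = {q: c for q, c in queries.items() if not BRAND.search(q)}

print("branded clicks:", sum(branded.values()))        # 300
print("non-branded clicks:", sum(non_branded.values()))  # 210
```

Tracking the non-branded total over successive exports gives the trend line described above.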
Using GSC for Content Optimisation
Once you identify which pages rank but underperform, GSC gives you the data to optimise them precisely.
Improving Click-Through Rate
A page in position 3 with a 1% CTR is being outclicked by competitors. Improve CTR by:
- Rewriting the title tag to be more specific, include the target keyword, and add a compelling reason to click.
- Updating the meta description to act as ad copy — include a call to action, highlight a benefit, or mention something unique.
- Adding structured data (FAQ schema, HowTo schema, review schema) to earn rich snippets that increase visual real estate in the SERPs.
Refreshing Content That Is Losing Position
- In the Performance report, compare the current 3-month period to the same period last year.
- Filter by Pages and sort by "Difference in clicks" or "Difference in position" to see pages that have declined.
- Export the declining pages list.
- For each declining page, review the queries driving impressions. Are there new sub-topics or questions that competitors are covering that you are not?
- Update the content to cover gaps, refresh statistics and dates, improve the page structure, and add relevant internal links.
- After updating, use URL Inspection to request re-indexing.
Monitoring Ranking Changes
One of the most practical ongoing uses of GSC is monitoring position changes over time — particularly after Google algorithm updates, which now happen multiple times per year.
Setting Up a Ranking Monitoring Workflow
- Weekly: Check the Performance report for significant traffic drops. Filter by the previous 7 days compared to the 7 days before. A drop of 20%+ in clicks without an obvious seasonal explanation is worth investigating.
- After algorithm updates: Google announces core updates through its Search Status dashboard. When a core update rolls out, compare your traffic for the update period against the same period before the update using the date comparison feature.
- After content changes: Any time you significantly update a page, note the date and monitor that page's position trend in the Performance report. GSC annotations are not available, so keep a separate log of when major changes were made.
Distinguishing Algorithm Drops from Technical Issues
When traffic drops suddenly, there are two possible causes: a technical issue (your pages become inaccessible or un-indexed) or an algorithmic quality change. To distinguish them:
- If impressions drop sharply alongside clicks, it is likely a technical or indexing issue (pages disappeared from the index). Check the Pages report immediately.
- If impressions are stable but clicks dropped, it is likely a CTR problem (lower rankings or changes to SERP features).
- If positions dropped gradually across many pages, it is likely an algorithm quality update.
- Use the URL Inspection tool to confirm your most important pages are still indexed and rendering correctly.
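These rules of thumb can be written down as a rough triage function. This is a sketch, not a substitute for reading the reports; the 20% and 5% cut-offs are illustrative (the 20% figure echoes the weekly-review threshold mentioned earlier):

```python
def diagnose_drop(impr_change: float, click_change: float) -> str:
    """Rough first-pass triage of a sudden traffic drop.

    Changes are fractional week-over-week deltas: -0.5 means a 50% decline.
    Thresholds are illustrative heuristics, not official GSC guidance.
    """
    if impr_change <= -0.2 and click_change <= -0.2:
        return "Likely technical/indexing issue: check the Pages report"
    if impr_change > -0.05 and click_change <= -0.2:
        return "Likely CTR problem: check positions and SERP features"
    return "No sharp anomaly: review gradual position trends"

print(diagnose_drop(-0.45, -0.50))
print(diagnose_drop(-0.01, -0.30))
```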
The GSC API: An Overview
For developers and advanced SEOs, the Google Search Console API provides programmatic access to the same data you see in the GSC interface, enabling automation and integration into custom dashboards or reporting tools.
What the GSC API Can Do
- Pull Performance data (queries, pages, countries, devices, clicks, impressions, CTR, position) for any date range up to 16 months.
- Query the URL Inspection API to check the index status of individual URLs programmatically.
- Submit sitemaps via the API.
- Access the Sitemaps list to check submitted sitemap status.
Getting Started with the GSC API
- Go to the Google Cloud Console and create a project.
- Enable the Google Search Console API for that project.
- Create OAuth 2.0 credentials (for user-delegated access) or a Service Account (for automated server-to-server access).
- Grant the Service Account access to your GSC property by adding its email as a user in GSC Settings > Users and permissions.
- Use the official Python client library (`google-api-python-client`) or direct HTTP requests to the REST endpoint at `https://searchconsole.googleapis.com/`.
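As a concrete starting point, the request body for the Search Analytics endpoint is just a small JSON-style object. A sketch that builds one; the field names follow the `searchanalytics.query` method, and the commented-out line shows how it would be sent with an authorised client (credentials setup omitted):

```python
from datetime import date, timedelta

def performance_query(days: int = 28, row_limit: int = 25000) -> dict:
    """Build a request body for searchanalytics.query().

    Dates use YYYY-MM-DD; the end date is pulled back a few days
    because GSC performance data lags by roughly 2-3 days.
    """
    end = date.today() - timedelta(days=3)
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,
    }

body = performance_query()
# With an authorised client built via google-api-python-client, e.g.:
# service.searchanalytics().query(
#     siteUrl="sc-domain:example.com", body=body).execute()
print(body["dimensions"], body["rowLimit"])
```

Storing the returned rows in a database on a schedule is what lets you keep query-level history beyond GSC's 16-month window.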
The most common use case is pulling Performance data for all queries and pages into a database for long-term trend analysis beyond GSC's 16-month limit. You can also use the API to build automated weekly ranking reports delivered to Slack, Google Sheets, or email.
Common GSC Mistakes to Avoid
| Mistake | Why It Matters | What to Do Instead |
|---|---|---|
| Not verifying the property as a Domain property | URL-prefix properties miss traffic from www vs non-www and http vs https variations | Use Domain property with DNS verification to capture all traffic |
| Ignoring the "Crawled – currently not indexed" URLs | These pages consume crawl budget but contribute nothing to rankings | Review for thin content; improve or noindex them |
| Submitting a sitemap that includes noindex pages | Sends conflicting signals to Google about what to index | Only include pages in your sitemap that you want indexed |
| Using "Request Indexing" for hundreds of URLs | Exceeds the daily quota and does not scale | Resubmit the sitemap for bulk indexing requests |
| Misreading average position | Averages blend high-volume and low-volume queries, masking real movement | Always filter to specific queries or pages for meaningful position data |
| Only checking the last 3 months | Misses seasonal trends and year-over-year comparisons | Use the full 16-month range and year-over-year comparisons regularly |
| Ignoring the Links report | Low internal link counts to important pages suppress their rankings | Use the internal links data monthly to improve link equity distribution |
| Treating GSC as a set-and-forget tool | Issues accumulate silently between check-ins | Set a recurring weekly review of Performance and monthly review of all reports |
GSC and Technical SEO: Working Together
GSC is most powerful when combined with other technical SEO processes. Use it alongside:
- A regular site audit to catch broken links, redirect chains, missing canonical tags, and duplicate content issues that GSC may not surface explicitly.
- Google Analytics 4 to correlate organic search traffic patterns with on-site user behaviour (bounce rate, session depth, conversions).
- A keyword rank tracker for daily position monitoring on your target keywords — GSC position data is an average over the reporting period, not a point-in-time rank.
- Log file analysis to see exactly which URLs Googlebot is crawling, how often, and in what order — this complements the GSC crawl data with much more granular detail.
Frequently Asked Questions About Google Search Console
How long does it take for GSC to show data after setup?
GSC will begin showing data as soon as Google starts crawling your property. For new sites, this can take several days to a few weeks. Performance data typically appears within 2–3 days of crawling, but it may take a few weeks for meaningful volumes to accumulate. The Pages report updates as Google indexes your content — which can happen within hours for well-linked pages or take weeks for new sites with few backlinks.
Why are some of my pages not showing in GSC?
Pages may not appear in GSC because they have not been crawled yet, they are blocked by robots.txt, they carry a noindex directive, they return an error status code, or Google has chosen a different canonical URL for them. Use the URL Inspection tool to check the specific status of any URL, and check the Pages report for a full list of excluded URLs with reasons.
What is the difference between clicks and sessions in GSC vs Google Analytics?
GSC counts a click each time a user clicks a link to your site from Google Search results, regardless of what happens after. Google Analytics counts sessions, which begin when a user arrives on your site. Discrepancies between the two figures are normal and expected. They occur because of bot filtering, same-session multiple clicks, users clicking and immediately bouncing before GA fires, and differences in data sampling methodology.
Can I see keyword data for individual pages?
Yes. In the Performance report, click the Pages tab, click on a specific page URL, then switch to the Queries tab. This shows all the queries that generated impressions and clicks to that specific page. This is one of the most useful workflows in the tool for content-level keyword analysis.
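The same page-to-queries drilldown is available programmatically via the Search Analytics API. As a sketch, this builds the request body for the searchanalytics.query method; authentication (OAuth or a service account) and the API client call itself are omitted, and the URL and dates are placeholders.

```python
# Hypothetical helper that builds a searchanalytics.query request body
# filtered to a single page, returning one row per query. Sending it
# requires an authenticated API client, which is omitted here.
def page_queries_request(page_url, start_date, end_date, row_limit=1000):
    return {
        "startDate": start_date,          # "YYYY-MM-DD"
        "endDate": end_date,
        "dimensions": ["query"],          # one row per query for this page
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": page_url,
            }]
        }],
        "rowLimit": row_limit,
    }
```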
Does GSC show all backlinks to my site?
No. The Links report shows a sample of backlinks that Google has discovered, not a complete list. It is useful for identifying your most-linked pages and top linking domains, but for a comprehensive backlink audit you will need a third-party tool like Ahrefs, Majestic, or Semrush.
How do I check if my site has been penalised by Google?
Check the Manual Actions report under Security & Manual Actions in the left sidebar. If you see "No issues detected", you have no manual penalties. For algorithmic penalties (like those from core updates), there is no explicit notification — you will see it as a traffic drop in the Performance report correlated with a known update date.
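That before/after comparison can be automated against exported daily click data. The sketch below flags a suspicious drop around a known update date; the 20% threshold is an arbitrary illustration, not a Google-defined value.

```python
from datetime import date

# Rough sketch: flag a possible algorithmic hit by comparing average daily
# clicks before and after a known core-update date. The 20% threshold is
# an arbitrary example, not an official figure.
def traffic_drop(daily_clicks, update_date, threshold=0.20):
    # daily_clicks: list of (date, clicks) pairs, e.g. (date(2026, 3, 1), 100)
    before = [c for d, c in daily_clicks if d < update_date]
    after = [c for d, c in daily_clicks if d >= update_date]
    if not before or not after:
        return False
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return (avg_before - avg_after) / avg_before > threshold
```

Run it against the dates Google confirms for core updates; a drop that coincides with one is a strong signal the update, not a technical issue, is the cause.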
What does "average position" actually mean in GSC?
Average position is the mean ranking position for a given query or page across all the impressions recorded in the selected time period. Because position varies by user, device, location, and query variations, the same page can appear in different positions for different users searching the same term. The GSC average is a useful directional indicator but should not be treated as a precise real-time rank. For point-in-time rank checking, see our RankNibbler rank checker tools.
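To make the averaging concrete, here is the impression-weighted mean that GSC is effectively reporting: a page shown 90 times at position 3 and 10 times at position 12 averages out to position 3.9.

```python
# Illustration of an impression-weighted average position.
def average_position(samples):
    # samples: list of (position, impressions) pairs
    total_impressions = sum(n for _, n in samples)
    return sum(p * n for p, n in samples) / total_impressions

average_position([(3, 90), (12, 10)])  # → 3.9
```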
How often should I check Google Search Console?
At minimum, check the Performance report weekly to catch traffic drops early, and check the Pages, Core Web Vitals, and Links reports monthly. After major site changes (migrations, redesigns, bulk content updates), check GSC daily for the first two weeks. Also check immediately after Google announces a core algorithm update.
Can I use GSC for a client's website?
Yes. The property owner can grant you access as a "Full user" or "Restricted user" via Settings > Users and permissions in GSC. You will see the property in your own GSC dashboard without needing the owner's login credentials. Agencies typically request "Full user" access to be able to submit sitemaps and request indexing.
What is the GSC Search Performance export limit?
The GSC interface shows a maximum of 1,000 rows per report view, and the "Export" button (top right of the Performance report) exports the rows the interface can display to CSV or Google Sheets. For programmatic access to larger datasets, use the Search Analytics API, which returns up to 25,000 rows per request and supports pagination via the startRow parameter.
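A pagination loop over the API might look like the sketch below. Here `fetch_page` stands in for an authenticated searchanalytics.query call, and 25,000 reflects the API's documented per-request row limit.

```python
# Sketch of paging through Search Analytics API results via startRow.
# `fetch_page` is a stand-in for an authenticated searchanalytics.query
# call that returns a list of rows for the given offset and limit.
def fetch_all_rows(fetch_page, row_limit=25000):
    rows, start = [], 0
    while True:
        batch = fetch_page(start_row=start, row_limit=row_limit)
        rows.extend(batch)
        if len(batch) < row_limit:   # short page means we've reached the end
            break
        start += row_limit
    return rows
```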
Does submitting a sitemap guarantee Google will index my pages?
No. Submitting a sitemap tells Google where your pages are, but whether Google indexes them depends on their quality, relevance, crawl budget, and whether they meet Google's content guidelines. Low-quality, thin, or duplicate pages are frequently crawled but not indexed even when submitted via sitemap. Sitemap submission speeds up discovery but does not guarantee indexing. See our guide on how to submit a sitemap to Google for best practices.
What should I do if my Core Web Vitals report shows "Poor" URLs?
Click into the "Poor" group in the Core Web Vitals report, identify the specific metric causing the failure (LCP, INP, or CLS), and note the representative URLs listed. Run those URLs through Google PageSpeed Insights for specific improvement recommendations. After implementing fixes, use the "Validate Fix" button in GSC. Our guide on Core Web Vitals explains the metrics and common fixes in detail.
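For reference, these are the published thresholds behind the report's Good / Needs improvement / Poor buckets, expressed as a small classifier (LCP in seconds, INP in milliseconds, CLS unitless):

```python
# Published Core Web Vitals thresholds: (good_max, poor_min) per metric.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def classify(metric, value):
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs improvement"
    return "Poor"
```

Note that GSC buckets URLs by the 75th percentile of real-user measurements, so a page passes only if most visitors have a good experience.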
Consistent use of Google Search Console — combined with regular site audits and a solid understanding of SEO fundamentals — is one of the highest-leverage activities any website owner can undertake. The data is free, it comes directly from Google, and it surfaces issues and opportunities that would otherwise remain invisible.
Last updated: April 2026