How to Get Your Website on Google

Getting your website on Google is the first step to receiving organic search traffic. Google will eventually find most websites on its own, but you can speed up the process significantly by taking a few simple steps.

Step 1: Create a Google Search Console Account

Go to Google Search Console, add your website, and verify ownership. This gives you direct communication with Google about your site.

Step 2: Submit Your Sitemap

Submit your sitemap in Search Console. If you do not have a sitemap, most CMS platforms (WordPress, Shopify, Squarespace) generate one automatically. Check if yours exists at yoursite.com/sitemap.xml.

Step 3: Check Your Technical SEO

Run a free RankNibbler audit on your homepage and key pages to make sure there are no issues blocking Google, such as blocked crawling, noindex tags, or broken pages.

Step 4: Build Internal Links

Google discovers new pages by following links. Make sure every important page is linked from at least one other page on your site. See our internal linking guide.

How Long Does It Take?

New pages can be indexed within hours to a few days if you have submitted a sitemap. Without a sitemap, it can take weeks. Regularly updated sites with good internal linking get crawled more frequently.

Check your site: Run a free audit on the RankNibbler homepage — 30+ SEO checks with no signup required.

Last updated: March 2026

Step-by-Step: From New Site to Google-Indexed

Step 1: Verify Your Site in Google Search Console

Search Console is Google's direct line to your site. Until you're verified, you're flying blind. The verification process:

  1. Go to search.google.com/search-console
  2. Click "Add Property" and enter your domain
  3. Choose "Domain" property (covers all subdomains and protocols) or "URL prefix" (specific version only)
  4. Verify using DNS TXT record (preferred), HTML file upload, meta tag, or Google Analytics
  5. Once verified, Search Console begins collecting data immediately

Step 2: Submit Your XML Sitemap

An XML sitemap is a machine-readable list of every URL on your site you want indexed. Most modern CMSs (WordPress with Yoast, Shopify, Webflow, Ghost) generate one automatically at /sitemap.xml.

  1. Verify your sitemap loads: visit yourdomain.com/sitemap.xml in a browser
  2. In Search Console, go to Sitemaps in the sidebar
  3. Enter the sitemap URL and click Submit
  4. Google begins crawling within hours

Also reference the sitemap in your robots.txt: Sitemap: https://yourdomain.com/sitemap.xml
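If your platform doesn't generate a sitemap for you, a minimal one is easy to build by hand. Here is a sketch using only Python's standard library; the page list is a hypothetical placeholder you would replace with your real URLs:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Hypothetical page list -- replace with your site's real URLs.
pages = [
    "https://yourdomain.com/",
    "https://yourdomain.com/about",
    "https://yourdomain.com/blog/first-post",
]
print(build_sitemap(pages))
```

Save the output as sitemap.xml at your domain root and submit that URL in Search Console.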

Step 3: Ensure robots.txt Allows Crawling

Your robots.txt file at the root of your domain tells crawlers what they can access. A common launch mistake is leaving a Disallow: / rule from staging, which blocks everything. Verify your robots.txt looks like this at minimum:

User-agent: *
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
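You can sanity-check your rules programmatically with Python's built-in robots.txt parser. This sketch feeds the rules in as a string, so it works offline; in practice you would fetch your live /robots.txt first:

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt, url):
    """Return True if the given robots.txt rules allow Googlebot to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

good = "User-agent: *\nAllow: /"
bad = "User-agent: *\nDisallow: /"   # leftover staging config

print(googlebot_allowed(good, "https://yourdomain.com/"))  # True
print(googlebot_allowed(bad, "https://yourdomain.com/"))   # False
```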

Step 4: Check for Noindex Tags

Verify no pages have accidental noindex meta tags. In WordPress, check Settings → Reading for "Discourage search engines" — if checked, uncheck it. In Yoast or Rank Math, check each page's advanced settings.
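For a quick spot-check outside your CMS, you can scan a page's HTML for a robots meta tag directly. A minimal sketch using Python's standard HTML parser (view-source or fetch the page to get the HTML):

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags <meta name="robots" content="...noindex..."> tags."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

blocked = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(blocked))  # True
```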

Step 5: Submit Individual URLs via URL Inspection

While waiting for Google to crawl naturally, you can push urgent pages:

  1. In Search Console, use the URL Inspection tool (top of page)
  2. Paste the URL you want indexed
  3. Click "Request Indexing"
  4. Google prioritises the URL for crawling, typically within hours to days

Quota: around 10 requests per day. Use for your most important pages; rely on sitemap for the rest.

Step 6: Build Initial Backlinks (Any At All)

Google finds pages primarily by following links. A new site with zero backlinks may not be discovered for weeks beyond what Search Console submits. Kickstart discovery by listing the site in relevant directories, sharing it from your social profiles, and linking to it from any existing sites you control.

Step 7: Monitor in Search Console

Within days of submitting, check the Coverage report in Search Console. You should see the number of indexed pages climbing as Google works through your sitemap.

Full indexing of a new site typically takes 2–8 weeks. Patience is required. If after 4 weeks you see no indexed pages, something is blocking access — investigate immediately.

Common Reasons Google Doesn't Index Your Site

Robots.txt Blocks Everything

A Disallow: / in robots.txt blocks all crawlers. Common at launch from staging config. Fix: update robots.txt to allow crawling.

Noindex Tag Site-Wide

A global noindex meta tag or X-Robots-Tag header tells Google explicitly not to index. Check page source, HTTP headers, and CMS settings.
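A noindex delivered via HTTP header is easy to miss because it never appears in the page source. This sketch checks a dict of response headers, such as what `urllib.request.urlopen(url).headers.items()` would give you:

```python
def header_blocks_indexing(headers):
    """Return True if an X-Robots-Tag response header contains noindex.

    `headers` is a plain dict of HTTP response header names to values.
    """
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False

print(header_blocks_indexing({"Content-Type": "text/html",
                              "X-Robots-Tag": "noindex"}))  # True
```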

Authentication or Password Required

If your site requires login, Googlebot cannot access pages. Use password protection only for truly private pages, not site-wide during launch.

Duplicate Content with Canonical to Another URL

If every page's canonical tag points to a different URL (a common migration error), Google indexes that canonical destination instead of your actual URLs. Verify canonicals point to the correct URLs.
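A canonical mismatch is detectable by comparing the tag's href against the URL the page actually lives at. A sketch using the standard HTML parser; the example page and URLs are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of a <link rel="canonical"> tag if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def canonical_mismatch(html, page_url):
    """Return the canonical URL if it points somewhere other than page_url."""
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.canonical and finder.canonical.rstrip("/") != page_url.rstrip("/"):
        return finder.canonical
    return None

# Hypothetical migration leftover: canonical still points at the old domain.
page = '<html><head><link rel="canonical" href="https://olddomain.com/page"></head></html>'
print(canonical_mismatch(page, "https://yourdomain.com/page"))
```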

Thin Content

Pages with fewer than 300 words of unique content may be crawled but not indexed; Search Console reports these as "Crawled – currently not indexed". Expand content; this guide has a companion word count checker.
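A rough word count can be pulled straight from a page's HTML. This sketch extracts visible text (skipping scripts and styles) and flags pages under the 300-word mark; the threshold is a rule of thumb, not a Google-documented cutoff:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html):
    extractor = TextExtractor()
    extractor.feed(html)
    return len(" ".join(extractor.parts).split())

sample = "<html><body><p>Just a short stub page.</p></body></html>"
count = word_count(sample)
print(count, "thin" if count < 300 else "ok")
```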

New Domain With Zero Authority

Brand-new domains take longer to fully index. Google is cautious about investing crawl budget in unproven sites. Time and backlinks are the solution.

Server Errors (5xx)

If your site returns server errors when Googlebot crawls, pages won't be indexed. Check Search Console's Coverage report for server errors and fix hosting issues.

JavaScript-Heavy Rendering Problems

Single-page apps and JavaScript-rendered content sometimes don't render correctly for Googlebot. Use server-side rendering, static generation, or pre-rendering to ensure content is in the initial HTML response.
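A quick way to catch this: fetch the raw HTML the server sends (what Googlebot sees before any JavaScript runs) and check that your key content is actually in it. The SPA shell below is a hypothetical example of the failure case:

```python
def content_in_initial_html(raw_html, key_phrase):
    """Check whether a phrase from the rendered page exists in the raw
    HTML response -- fetch raw_html with e.g.
    urllib.request.urlopen(url).read().decode() to see what crawlers get.
    """
    return key_phrase.lower() in raw_html.lower()

# A typical SPA shell: the headline only exists after JavaScript runs.
spa_shell = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'
print(content_in_initial_html(spa_shell, "Welcome to our product"))  # False
```

If this returns False for your main headline, Googlebot is likely seeing an empty shell, and server-side rendering or pre-rendering is worth the effort.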

Indexing Timeline Expectations

Site Type | First Page Indexed | Full Site Indexed
Established domain with new content | Hours | Days
Brand new domain, well-linked | 1–7 days | 2–4 weeks
Brand new domain, no backlinks | 1–4 weeks | 1–3 months
New domain with technical issues | Never (until fixed) | Never

Beyond Indexing: Actually Ranking

Being indexed is necessary but not sufficient. To actually appear for useful searches, you also need content that matches real queries, solid on-page optimisation, and backlinks that build authority.

Indexing is a zero-traffic baseline. Ranking for meaningful queries takes months and ongoing effort.