© 2026 Octo

SEO Fundamentals
1. How Search Engines Work
2. Keyword Research
3. On-Page SEO
4. Technical SEO Basics
5. Content SEO & Topical Authority
6. Link Building & Off-Page SEO
7. Local SEO
8. SEO Analytics & Measurement
9. SEO Strategy: Putting It All Together
Module 4 · ~20 min

Technical SEO Basics

The best content in the world can't rank if Google can't access it. Here's what technical SEO is, what actually matters, and what you can fix today.

The e-commerce site that couldn't be found

A UK fashion brand launched in 2020 with a beautifully designed website, strong product photography, and a solid social media following. They published content consistently. They got backlinks from fashion blogs.

Eighteen months in, their organic traffic was 400 sessions per month. Their competitor — similar product range, similar domain age — was at 85,000.

An SEO audit found the problem: the site was accidentally blocking Googlebot via the robots.txt file. It had been misconfigured during the initial site build. For 18 months, Google had been unable to crawl and index the site's content.

One file. One misconfiguration. 18 months of wasted effort.

Technical SEO isn't glamorous. It's not the reason most people get into digital marketing. But it's the foundation everything else sits on — and when it breaks, nothing else works.

What technical SEO covers

Technical SEO is the practice of ensuring search engines can efficiently crawl, index, and understand your website. Where on-page SEO controls what's on each page, technical SEO controls how the site itself functions as infrastructure.

The technical checks that matter most

1. Crawlability: can Google get in?

Robots.txt: A file at yourdomain.com/robots.txt that tells search engines which pages to crawl and which to avoid. A misconfiguration here can accidentally block your entire site.

Check it right now: visit yourdomain.com/robots.txt. If you see Disallow: / under User-agent: Googlebot or User-agent: * — your site is blocking Google from crawling everything.

Healthy robots.txt for most sites:

User-agent: *
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml

You may legitimately block: admin pages, thank-you pages, duplicate content, internal search results.
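If you want to script this check, Python's standard library ships a robots.txt parser. A minimal sketch (the rule strings below are illustrative examples, not pulled from any real site):

```python
from urllib.robotparser import RobotFileParser

def googlebot_blocked(robots_txt: str, url: str = "https://example.com/") -> bool:
    """Return True if this robots.txt would block Googlebot from crawling `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", url)

# A healthy configuration: everything allowed.
healthy = "User-agent: *\nAllow: /\n"
# The misconfiguration from the story: everything disallowed.
broken = "User-agent: *\nDisallow: /\n"

print(googlebot_blocked(healthy))  # False: Googlebot can crawl
print(googlebot_blocked(broken))   # True: the entire site is blocked
```

The same function works on a live file if you fetch yourdomain.com/robots.txt first and pass its text in.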

Noindex tags: A <meta name="robots" content="noindex"> tag tells Google not to index a specific page. Useful for thin pages, duplicate pages, or paginated archives. Dangerous when accidentally placed on important pages you want to rank.

Check your most important pages: right-click → View Page Source → search for "noindex." If you find it on pages you want indexed — remove it.
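The view-source check can also be automated across many pages. A minimal sketch using Python's built-in HTML parser; it looks only for the meta robots tag described above:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags a <meta name="robots" content="...noindex..."> tag in page source."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True: this page is asking Google not to index it
```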

Crawl errors: Google Search Console → Coverage report shows you pages that have crawl errors. Common types:

  • 404 (Not Found): Page was crawled but doesn't exist. Fix by restoring the page or redirecting to the correct URL.
  • 5xx (Server errors): Server failed to deliver the page. Usually a hosting or caching problem.
  • Soft 404: Page returns a 200 status but has no useful content. Google treats it as a 404.
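These categories can be expressed as a small triage helper. A sketch; note that the 100-word cutoff used to approximate a "soft 404" is an illustrative assumption, not a threshold Google publishes:

```python
def classify_crawl_result(status: int, word_count: int = 0) -> str:
    """Rough triage of a crawled URL, mirroring the categories above.

    A 'soft 404' is a page that returns 200 but has no real content;
    the 100-word cutoff here is an illustrative stand-in."""
    if status == 404:
        return "404: restore the page or 301-redirect it"
    if 500 <= status <= 599:
        return "5xx: investigate hosting or caching"
    if status == 200 and word_count < 100:
        return "soft 404: add real content or remove the page"
    if status == 200:
        return "ok"
    return f"other ({status}): inspect manually"

print(classify_crawl_result(404))
print(classify_crawl_result(200, word_count=12))
```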

2. Indexability: are pages worthy of indexing?

Duplicate content: When two URLs return substantially identical content, Google struggles to determine which to rank. Common causes:

  • http vs https versions of your site both accessible
  • www vs non-www both accessible
  • URL parameters creating variations (/blog?sort=date vs /blog)
  • Paginated archives being indexed

Canonical tags: The solution to duplicate content. The canonical tag tells Google which version of a URL is the "master" that should be indexed.

<link rel="canonical" href="https://www.yourdomain.com/blog/original-post/" />
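The same idea, sketched in code: a simplified normaliser that maps parameter variations back to one canonical URL. Real canonicalisation rules vary per site (www vs non-www, trailing slashes, case); this sketch handles only the scheme and query string:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Normalise a URL to a simplified canonical form: force https,
    drop the query string and fragment, keep host and path as-is."""
    parts = urlsplit(url)
    return urlunsplit(("https", parts.netloc, parts.path, "", ""))

print(canonical_url("http://www.yourdomain.com/blog?sort=date"))
# https://www.yourdomain.com/blog
```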

Thin content: Pages with very little meaningful content (under 300 words, or content that doesn't provide real value) may be excluded from Google's index or actively penalised. Audit for thin pages and either improve them, consolidate them into better pages, or noindex them.

3. Page speed: how fast is fast enough?

Page speed affects both user experience and rankings. Google has used page speed as a ranking factor on desktop since 2010 and on mobile since 2018, and the Core Web Vitals — a set of three metrics — have been an official ranking factor since the page experience update rolled out (mobile: June 2021; desktop: February 2022).

Core Web Vitals (the current metrics, as of 2024–2025):

| Metric | Measures | Good threshold |
|---|---|---|
| LCP (Largest Contentful Paint) | How quickly the main content loads | Under 2.5 seconds |
| INP (Interaction to Next Paint) | How quickly the page responds to interactions | 200 ms or less |
| CLS (Cumulative Layout Shift) | How much the page "jumps" during loading | Under 0.1 |
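If you pull metric values from PageSpeed Insights or your own monitoring, checking them against these "good" thresholds is mechanical. A minimal sketch:

```python
def cwv_passes(lcp_s: float, inp_ms: float, cls: float) -> dict:
    """Check Core Web Vitals against Google's 'good' thresholds."""
    results = {
        "LCP": lcp_s <= 2.5,   # seconds
        "INP": inp_ms <= 200,  # milliseconds
        "CLS": cls <= 0.1,     # unitless layout-shift score
    }
    results["all_good"] = all(results.values())
    return results

print(cwv_passes(lcp_s=2.1, inp_ms=180, cls=0.05))
```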

How to check your page speed:

  • PageSpeed Insights (pagespeed.web.dev) — Google's free tool. Enter any URL. Get a score from 0–100 and specific recommendations.
  • Google Search Console → Core Web Vitals report — shows aggregate CWV data across your site.

The most impactful speed improvements:

  1. Compress images — uncompressed images are the #1 cause of slow pages. Use WebP format and compress all images before uploading.
  2. Use a CDN (Content Delivery Network) — serves your site from servers closest to each visitor. Many hosting providers include this.
  3. Enable browser caching — lets returning visitors load your site from their local cache instead of your server.
  4. Minify CSS and JavaScript — removes whitespace and comments from code files.
  5. Choose good hosting — a cheap shared hosting plan with slow server response time is a technical SEO problem money can fix quickly.
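Browser caching (item 3 above) shows up directly in HTTP response headers, as does text compression. A sketch that flags the gaps; the input is a plain dict of response headers from any HTTP client, and the checks are deliberately rough:

```python
def audit_response_headers(headers: dict) -> list:
    """Flag missing speed-related response headers. HTTP header names
    are case-insensitive, so normalise them before checking."""
    h = {k.lower(): v for k, v in headers.items()}
    issues = []
    if "cache-control" not in h:
        issues.append("no Cache-Control header: browser caching not configured")
    if h.get("content-encoding") not in ("gzip", "br", "zstd"):
        issues.append("no gzip/brotli compression on this response")
    return issues

print(audit_response_headers({"Content-Type": "text/html"}))
```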

4. Mobile-friendliness

Google uses mobile-first indexing — it primarily uses the mobile version of your site for ranking and indexing decisions. A site that looks great on desktop but is broken or difficult to use on mobile will rank worse than it should.

Check your mobile experience:

  • Google Search Console → Mobile Usability report
  • Simply open your site on your phone and try to use it as a visitor would

Common mobile issues:

  • Text too small to read without zooming
  • Buttons and links too close together to tap accurately
  • Content wider than the screen (horizontal scrolling)
  • Pop-ups that cover content on mobile

5. HTTPS

HTTPS (encrypted connections) has been a Google ranking signal since 2014. All modern sites should use HTTPS — most hosting providers now offer free SSL certificates via Let's Encrypt.

Check: does your URL start with https://? If it starts with http:// — this needs fixing.

💭 There Are No Dumb Questions

"Do I need to know how to code to do technical SEO?"

For most technical SEO tasks — no. Checking robots.txt, reading a coverage report, running a PageSpeed test, checking the Mobile Usability report in Search Console — these require a browser and a few free tools. Where code is needed (implementing canonical tags, fixing redirect chains, adding structured data), most CMS platforms (WordPress, Shopify, Squarespace, Webflow) handle this through plugins or settings panels. If you encounter something that requires developer implementation, knowing what needs to be done is the SEO job — the developer does the coding.

"How much does page speed actually matter for rankings?"

It's a real ranking signal but not an overwhelming one. A very slow page with excellent content and strong backlinks will still outrank a very fast page with mediocre content and no links. Speed becomes a differentiator when other signals are equal — which, among competing pages targeting the same keyword, they often are. Fix obvious speed problems; don't obsess over moving a PageSpeed score from 78 to 95.

Redirects: the SEO implications of moving pages

When you move or delete a page, the URL changes. Anyone who had bookmarked, linked to, or ranked for the old URL loses their connection.

301 redirect: A permanent redirect. Tells browsers and search engines that a page has permanently moved to a new URL. Google transfers most of the ranking equity from the old URL to the new one. Use this when permanently moving or renaming a page.

302 redirect: A temporary redirect. Tells Google to keep the old URL in its index because the move is temporary. Use it only for genuinely short-lived situations (A/B testing, maintenance pages).

The redirect mistake: Deleting pages without redirecting them creates 404 errors. Every 404 that receives links is wasted link equity — the authority of those links is lost. Audit for 404 errors regularly (Google Search Console → Coverage report) and redirect them to relevant existing pages.

Redirect chains: Avoid A → B → C → D. Each additional redirect in a chain reduces the amount of authority passed. Keep redirects to one hop: A → D directly.
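Chain-flattening is mechanical enough to script. A sketch that collapses a redirect map to single hops and flags loops (the URLs are placeholders):

```python
def collapse_redirects(redirects: dict) -> dict:
    """Collapse redirect chains so every old URL points at its final
    destination in one hop (A -> B -> C becomes A -> C and B -> C).
    A URL caught in a redirect loop maps to None."""
    collapsed = {}
    for start in redirects:
        seen, current = {start}, start
        while current in redirects:
            current = redirects[current]
            if current in seen:        # redirect loop: flag it
                current = None
                break
            seen.add(current)
        collapsed[start] = current
    return collapsed

chain = {"/old-a": "/old-b", "/old-b": "/old-c", "/old-c": "/final"}
print(collapse_redirects(chain))
# {'/old-a': '/final', '/old-b': '/final', '/old-c': '/final'}
```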

Structured data: helping Google understand your content

Structured data (schema markup) is code you add to your pages to explicitly tell Google what kind of content a page contains. It doesn't directly affect rankings, but it can enable rich results — enhanced search result displays that include ratings, prices, FAQs, event dates, and more.

Common schema types:

| Schema type | Enables | Best for |
|---|---|---|
| Article | Author bylines in Google News | Blog posts and articles |
| FAQ | Expandable FAQ sections in results | FAQ pages |
| HowTo | Step-by-step instructions in results | Tutorial content |
| Product | Price, availability, reviews in results | E-commerce product pages |
| LocalBusiness | Map integration, hours, contact | Local businesses |
| Review | Star ratings in search results | Review sites |
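As a concrete example of what schema markup looks like: FAQ structured data is usually embedded as JSON-LD. A minimal sketch that builds an FAQPage object following the schema.org vocabulary (the question and answer strings are placeholders):

```python
import json

def faq_jsonld(qa_pairs: list) -> str:
    """Build FAQPage structured data (JSON-LD) from (question, answer) pairs."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }
    # The result goes in the page inside <script type="application/ld+json">.
    return json.dumps(data, indent=2)

print(faq_jsonld([("Do I need to code for technical SEO?",
                   "For most tasks, no.")]))
```

Validate the output with Google's Rich Results Test before shipping it.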

Rich results increase click-through rates — a result with star ratings gets significantly more clicks than a plain text result at the same position.

Adding structured data without coding: Most CMS plugins (Yoast SEO for WordPress, RankMath) add structured data automatically or semi-automatically. For custom implementations, use Google's Structured Data Markup Helper.

Test your structured data: Google's Rich Results Test (search.google.com/test/rich-results) checks whether your structured data is correctly implemented.

⚡

Technical SEO Audit

25 XP
Run a technical SEO audit on a website you own or a competitor's site. Use only free tools.

Checklist:

1. **Robots.txt:** Visit `/robots.txt`. Is anything being blocked that shouldn't be?
2. **HTTPS:** Does the site use HTTPS? Does `http://` redirect to `https://`?
3. **Page speed:** Run the homepage through PageSpeed Insights. What score does it get? What are the top 3 recommendations?
4. **Mobile:** Check Google Search Console → Mobile Usability report. Any issues?
5. **Coverage:** If you have Search Console access, check the Coverage report for errors.
6. **Canonical:** View source on a key page. Is there a canonical tag? Does it point to itself (correct) or somewhere else?
7. **Indexing:** Search Google for `site:yourdomain.com`. How many pages are indexed?

Produce: a prioritised list of 3 technical issues to fix, ordered by likely impact.

_Most sites have some technical issues. The goal of this audit isn't perfection — it's finding the 3 things that, if fixed, would have the most meaningful impact on how Google can crawl and index the site._

The technical SEO toolkit (all free)

| Tool | Use |
|---|---|
| Google Search Console | Crawl errors, indexing status, Core Web Vitals, mobile usability |
| PageSpeed Insights | Page speed scores and recommendations |
| Screaming Frog SEO Spider (free up to 500 URLs) | Crawl your whole site, find broken links, identify thin content, audit redirects |
| Google's Rich Results Test | Validate structured data implementation |
| Google's URL Inspection Tool (in Search Console) | See what Google knows about any specific URL |

⚡

Fix One Technical Issue

25 XP
Using the audit you completed in the previous challenge, pick the highest-priority technical issue and fix it.

Document:

1. What the issue was
2. What you did to fix it
3. How you verified the fix worked

If you don't have a site to work on: use a free Webflow, Wix, or WordPress.com site. Publish 3 pages, run the technical audit, find the issues, and practice fixing them in the site's settings panel.

_Technical SEO skills are best learned by doing. Reading about redirects is different from configuring one and verifying it works. The muscle memory of "I've done this before" is what makes technical SEO feel manageable rather than intimidating._

Back to the UK fashion brand

The fix took less than an hour. A developer opened the robots.txt file, removed the Disallow: / line that had been blocking Googlebot since launch, and replaced it with the correct configuration.

Within days, Google's crawlers were finally able to access the site. Within weeks, pages started appearing in search results. Within two months, organic traffic had recovered to where it should have been 18 months earlier.

The backlinks they'd earned, the content they'd published, the social following they'd built — all of it was waiting to produce results. One misconfiguration had been holding everything back. Great content cannot rank if Google cannot reach it. Check your robots.txt before you check anything else.


Key takeaways

  • Technical SEO is the foundation. Great content and strong backlinks can't overcome a site that Googlebot can't properly access and index.
  • The biggest technical risks: Accidentally blocking Googlebot in robots.txt; noindex tags on pages you want ranked; 404 errors from deleted pages without redirects; duplicate content confusing Google about which version to rank.
  • Page speed is a real ranking signal — especially for mobile. Fix the obvious issues (compress images, choose good hosting). Don't obsess over moving from 85 to 100.
  • Mobile-first indexing means your mobile experience is your ranking experience. Google's primary view of your site is on mobile.
  • Use Google Search Console as your technical SEO dashboard. It surfaces crawl errors, indexing problems, Core Web Vitals issues, and mobile usability problems — all for free.

?

Knowledge Check

1. A website's Google Search Console Coverage report shows 200 pages as 'Valid' (indexed) and 150 pages as 'Excluded — noindex tag.' The site owner didn't intentionally add noindex tags to those pages. What is the most likely cause and what should they do?

2. An e-commerce site has product pages accessible at both 'example.com/product/running-shoes' and 'example.com/product/running-shoes?ref=homepage'. What technical SEO problem does this create, and what is the solution?

3. A blog deletes 30 old posts that had accumulated backlinks from other websites. No redirects are set up. What is the SEO consequence?

4. A website's PageSpeed Insights score is 41 out of 100 on mobile. The largest contributing factors are uncompressed images (4–8MB each) and no browser caching configured. The site ranks page 2 for competitive keywords with strong content. What is the most strategic response?
