Technical SEO is often overlooked, but it's the foundation that makes all other SEO efforts possible. You can create the best content in the world, but if search engines can't crawl, index, or understand your site, you're invisible.
After auditing hundreds of websites at Appzivo, I've found that most sites have the same technical SEO issues. The good news? These issues are fixable, and addressing them can significantly improve your search rankings.
This guide covers the 10 most common technical SEO issues I encounter and how to fix them.
Why Technical SEO Matters
Technical SEO ensures that:
- Search engines can crawl your site
- Your pages get indexed properly
- Your site loads quickly
- Your site provides a good user experience
- Your site is accessible to all users
Without proper technical SEO, you're fighting an uphill battle. Here are the issues to fix:
1. Missing or Incorrect robots.txt
Your robots.txt file tells search engines which pages they can and cannot crawl. A missing or incorrect robots.txt can prevent important pages from being indexed or allow search engines to crawl pages you don't want indexed.
The Problem:
- No robots.txt file
- Blocking important pages
- Allowing crawling of admin/private pages
- Incorrect syntax
The Fix:
Create a proper robots.txt file in your site's root directory:
# Allow all crawlers
User-agent: *
Allow: /
# Disallow admin and private areas
Disallow: /admin/
Disallow: /api/
Disallow: /private/
# Bot-specific rules if needed (note: a named group replaces the * group for that bot, so repeat any Disallow rules you still want)
User-agent: Googlebot
Allow: /
# Sitemap location
Sitemap: https://yoursite.com/sitemap.xml
For Next.js:
# public/robots.txt
User-agent: *
Allow: /
Disallow: /api/
Sitemap: https://yoursite.com/sitemap.xml
Common Mistakes:
- Blocking CSS/JS files (breaks rendering)
- Blocking important pages
- Using wildcards incorrectly
- Not including sitemap location
2. No XML Sitemap or Incorrect Sitemap
An XML sitemap helps search engines discover and index all your pages. Without one, search engines might miss important pages.
The Problem:
- No sitemap.xml
- Sitemap not submitted to Google Search Console
- Sitemap contains errors
- Sitemap missing important pages
The Fix:
Create a Sitemap:
For Next.js, create a sitemap:
// pages/sitemap.xml.tsx
import { GetServerSideProps } from 'next';
// Note: in a real app, build the <url> entries from your routes or CMS data
function generateSiteMap() {
return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://yoursite.com</loc>
<lastmod>${new Date().toISOString()}</lastmod>
<changefreq>daily</changefreq>
<priority>1.0</priority>
</url>
<url>
<loc>https://yoursite.com/about</loc>
<lastmod>${new Date().toISOString()}</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>
</urlset>
`;
}
export const getServerSideProps: GetServerSideProps = async ({ res }) => {
const sitemap = generateSiteMap();
res.setHeader('Content-Type', 'text/xml');
res.write(sitemap);
res.end();
return { props: {} };
};
export default function SiteMap() {
return null;
}
Submit to Google Search Console:
1. Go to Google Search Console
2. Navigate to Sitemaps
3. Add your sitemap URL
4. Monitor for errors
Best Practices:
- Update sitemap when content changes
- Include all important pages
- Set appropriate priorities
- Use lastmod dates
- Keep sitemap under 50,000 URLs (use a sitemap index file if you exceed this; see the example below)
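If your site grows past that limit, the standard approach is a sitemap index file that points to several smaller sitemaps. A minimal sketch (the child sitemap names are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yoursite.com/sitemap-posts.xml</loc>
    <lastmod>2025-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://yoursite.com/sitemap-products.xml</loc>
    <lastmod>2025-01-15</lastmod>
  </sitemap>
</sitemapindex>
Submit the index URL in Search Console and it will discover the child sitemaps from there.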
3. Missing or Incorrect Meta Tags
Meta tags provide important information to search engines. Missing or incorrect meta tags can hurt your SEO.
The Problem:
- Missing title tags
- Missing meta descriptions
- Duplicate title tags
- Missing Open Graph tags
- Missing Twitter Card tags
The Fix:
Essential Meta Tags:
<head>
<!-- Title (50-60 characters) -->
<title>Your Page Title | Your Brand</title>
<!-- Meta Description (150-160 characters) -->
<meta name="description" content="Your compelling page description that encourages clicks.">
<!-- Open Graph (for social sharing) -->
<meta property="og:title" content="Your Page Title">
<meta property="og:description" content="Your page description">
<meta property="og:image" content="https://yoursite.com/image.jpg">
<meta property="og:url" content="https://yoursite.com/page">
<meta property="og:type" content="website">
<!-- Twitter Card -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Your Page Title">
<meta name="twitter:description" content="Your page description">
<meta name="twitter:image" content="https://yoursite.com/image.jpg">
<!-- Canonical URL -->
<link rel="canonical" href="https://yoursite.com/page">
</head>
For Next.js:
import Head from 'next/head';
export default function Page() {
return (
<>
<Head>
<title>Your Page Title | Your Brand</title>
<meta name="description" content="Your page description" />
<meta property="og:title" content="Your Page Title" />
<meta property="og:description" content="Your page description" />
<meta property="og:image" content="https://yoursite.com/image.jpg" />
<link rel="canonical" href="https://yoursite.com/page" />
</Head>
{/* Your content */}
</>
);
}
4. Duplicate Content Issues
Duplicate content confuses search engines and can hurt your rankings. Search engines don't know which version to index.
The Problem:
- Same content on multiple URLs
- www vs non-www versions
- HTTP vs HTTPS versions
- Trailing slash inconsistencies
- Parameter variations
The Fix:
Use Canonical Tags:
<link rel="canonical" href="https://yoursite.com/page">
Redirect Duplicates:
// Next.js redirects in next.config.js
module.exports = {
async redirects() {
return [
{
source: '/old-page',
destination: '/new-page',
permanent: true, // 301 redirect
},
];
},
};
Choose Preferred Domain:
Redirect www to non-www (or vice versa):
// Redirect www to non-www
module.exports = {
async redirects() {
return [
{
source: '/:path*',
has: [{ type: 'host', value: 'www.yoursite.com' }],
destination: 'https://yoursite.com/:path*',
permanent: true,
},
];
},
};
Handle Trailing Slashes:
// Consistent trailing slash handling
module.exports = {
trailingSlash: true, // or false, but be consistent
};
5. Slow Page Speed
Page speed is a ranking factor and affects user experience. Slow pages rank lower and have higher bounce rates.
The Problem:
- Large images
- Render-blocking resources
- No compression
- Slow server response
- Too many HTTP requests
The Fix:
See the previous article on website speed optimization. Key fixes (a quick Next.js sketch follows this list):
- Optimize images
- Enable compression
- Use CDN
- Minimize HTTP requests
- Optimize server response time
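As a rough sketch of what two of these fixes look like in a Next.js project (the image path and dimensions are placeholders):
// components/Hero.tsx — responsive, lazy-loaded images via next/image
// (the image path and dimensions below are placeholders)
import Image from 'next/image';

export default function Hero() {
  return (
    <Image
      src="/images/hero.jpg"
      alt="Descriptive alt text"
      width={1200}
      height={630}
    />
  );
}

// next.config.js — gzip compression is on by default in Next.js; shown explicitly here
module.exports = {
  compress: true,
  images: {
    formats: ['image/avif', 'image/webp'], // serve modern formats where supported
  },
};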
Measure Performance:
- Google PageSpeed Insights
- Lighthouse
- Core Web Vitals
6. Mobile Usability Issues
With mobile-first indexing, mobile usability is critical. Issues can prevent your site from ranking well.
The Problem:
- Text too small to read
- Clickable elements too close together
- Content wider than screen
- Viewport not configured
- Slow mobile load times
The Fix:
Configure Viewport:
<meta name="viewport" content="width=device-width, initial-scale=1.0">
Responsive Design:
/* Mobile-first approach */
.container {
width: 100%;
padding: 1rem;
}
@media (min-width: 768px) {
.container {
max-width: 1200px;
margin: 0 auto;
}
}
Test Mobile Usability:
- Google Search Console Mobile Usability report
- Test on real devices
- Use responsive design mode in browsers
7. Broken Links and 404 Errors
Broken links hurt user experience and can waste crawl budget. Too many 404 errors can signal a poorly maintained site.
The Problem:
- Internal broken links
- External broken links
- Redirect chains
- Soft 404s (pages that return 200 but show 404 content)
The Fix:
Find Broken Links:
# Use tools like:
# - Screaming Frog
# - Ahrefs Site Audit
# - Google Search Console
Fix or Redirect:
// Redirect broken links
module.exports = {
async redirects() {
return [
{
source: '/broken-page',
destination: '/working-page',
permanent: true,
},
];
},
};
Custom 404 Page:
// pages/404.tsx
import Link from 'next/link';
export default function Custom404() {
return (
<div>
<h1>404 - Page Not Found</h1>
<p>The page you're looking for doesn't exist.</p>
<Link href="/">Go back home</Link>
</div>
);
}
8. Missing Structured Data (Schema Markup)
Structured data helps search engines understand your content and can enable rich snippets in search results.
The Problem:
- No structured data
- Incorrect schema markup
- Missing important schema types
- Schema validation errors
The Fix:
Add Structured Data:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "Article",
"headline": "Your Article Title",
"author": {
"@type": "Person",
"name": "Your Name"
},
"datePublished": "2025-01-15",
"dateModified": "2025-01-15",
"description": "Your article description",
"image": "https://yoursite.com/image.jpg"
}
</script>
Common Schema Types:
- Article
- Organization
- Person
- Product
- LocalBusiness
- BreadcrumbList
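For Next.js, one common pattern is to serialize the schema object and inject it through next/head; a minimal sketch (the article fields are placeholders):
// pages/blog/[slug].tsx (excerpt)
import Head from 'next/head';

export default function BlogPost() {
  const articleSchema = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: 'Your Article Title',
    datePublished: '2025-01-15',
    author: { '@type': 'Person', name: 'Your Name' },
  };

  return (
    <Head>
      {/* JSON.stringify keeps the markup in sync with your data */}
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(articleSchema) }}
      />
    </Head>
  );
}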
Validate Schema:
Use Google's Rich Results Test to validate your structured data.
9. Missing HTTPS/SSL Certificate
HTTPS is required for modern SEO. Browsers like Chrome mark HTTP sites as "not secure," and HTTPS is a ranking signal.
The Problem:
- Site not using HTTPS
- Mixed content (HTTP resources on HTTPS page)
- Expired SSL certificate
- Incorrect SSL configuration
The Fix:
Get SSL Certificate:
- Use Let's Encrypt (free)
- Use your hosting provider's SSL
- Use Cloudflare (free SSL)
Force HTTPS:
// Redirect HTTP to HTTPS
module.exports = {
async redirects() {
return [
{
source: '/:path*',
has: [{ type: 'header', key: 'x-forwarded-proto', value: 'http' }],
destination: 'https://yoursite.com/:path*',
permanent: true,
},
];
},
};
Fix Mixed Content:
Ensure all resources (images, scripts, stylesheets) use HTTPS URLs.
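If legacy HTTP references are hard to track down, one stopgap (a sketch, assuming Next.js) is a Content-Security-Policy header that tells browsers to upgrade insecure subresource requests:
// next.config.js — ask browsers to upgrade http:// subresources to https://
module.exports = {
  async headers() {
    return [
      {
        source: '/:path*',
        headers: [
          { key: 'Content-Security-Policy', value: 'upgrade-insecure-requests' },
        ],
      },
    ];
  },
};
Fixing the underlying URLs is still the cleaner long-term solution; the header mainly covers stragglers.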
10. Poor URL Structure
Clean, descriptive URLs help both users and search engines understand your content.
The Problem:
- Long, complex URLs
- URLs with unnecessary parameters
- Non-descriptive URLs
- Inconsistent URL structure
The Fix:
Best Practices:
- Use descriptive keywords
- Keep URLs short
- Use hyphens to separate words
- Avoid unnecessary parameters
- Be consistent
Good URLs:
https://yoursite.com/blog/seo-best-practices
https://yoursite.com/products/web-development-services
Bad URLs:
https://yoursite.com/page?id=123&cat=abc&ref=xyz
https://yoursite.com/p/12345/67890/abc
For Next.js:
Use descriptive file names and folder structure:
pages/
blog/
[slug].tsx # Dynamic routes
products/
web-development.tsx
Technical SEO Audit Checklist
Use this checklist to audit your site:
Crawlability:
- [ ] robots.txt configured correctly
- [ ] XML sitemap exists and submitted
- [ ] No crawl errors in Search Console
- [ ] Important pages are crawlable
Indexability:
- [ ] No duplicate content issues
- [ ] Canonical tags implemented
- [ ] Proper redirects in place
- [ ] noindex/nofollow tags used correctly
On-Page SEO:
- [ ] Title tags optimized (unique, descriptive)
- [ ] Meta descriptions written
- [ ] Heading structure (H1, H2, H3) correct
- [ ] Alt text on images
- [ ] Internal linking structure
Technical:
- [ ] HTTPS implemented
- [ ] Mobile-friendly
- [ ] Fast page speed
- [ ] Structured data implemented
- [ ] Clean URL structure
Monitoring:
- [ ] Google Search Console set up
- [ ] Google Analytics configured
- [ ] Regular SEO audits scheduled
- [ ] Performance monitoring in place
Tools for Technical SEO
Free Tools:
- Google Search Console
- Google PageSpeed Insights
- Google Rich Results Test
- Screaming Frog (free version)
- Lighthouse
Paid Tools:
- Ahrefs Site Audit
- SEMrush Site Audit
- Screaming Frog (paid version)
- Sitebulb
Conclusion
Technical SEO might not be glamorous, but it's essential. These 10 issues are the most common problems I see, and fixing them can have a significant impact on your search rankings.
Start with the basics:
1. Fix robots.txt and sitemap
2. Add proper meta tags
3. Resolve duplicate content
4. Improve page speed
5. Ensure mobile usability
Then move to advanced:
6. Fix broken links
7. Add structured data
8. Ensure HTTPS
9. Optimize URL structure
10. Set up monitoring
Remember: Technical SEO is an ongoing process. Regular audits and monitoring help catch issues before they hurt your rankings. Don't let these common issues prevent your great content from being found.