SEO Settings

Configure technical SEO settings to help search engines discover, crawl, and index your content.

Overview

SEO settings control how search engines interact with your site. From sitemaps to automatic indexing, these tools help ensure your content gets discovered and ranked.

Where to find it

Navigate to Settings → SEO in your dashboard.

Related documentation

For a comprehensive guide to SEO best practices, see the main SEO Documentation.

Site Verification

Verify your site ownership with Google Search Console to access advanced SEO tools and performance data.

Verification Code

Enter your Google verification code

Find this in Google Search Console → Settings → Ownership verification.

1. Go to Google Search Console: visit search.google.com/search-console and add your property.

2. Choose HTML tag verification: select the HTML tag verification method from the options.

3. Copy the verification code: copy the meta tag content, i.e. only the value after content= (for example, in a tag like <meta name="google-site-verification" content="AbC123" />, copy just AbC123).

4. Paste and save: paste the code into the verification field and save your settings.

XML Sitemap

Your sitemap is automatically generated and updated whenever you publish or modify content. It lists all your public pages for search engines to crawl.

Sitemap Status

Your sitemap is automatically managed

Sitemap Active
Auto-generated

yourblog.com/sitemap.xml

Last updated: 5 minutes ago • 42 URLs indexed

No action needed

Your sitemap is created and updated automatically. Just submit the URL yourblog.com/sitemap.xml to Google Search Console once.
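If you want to double-check what your sitemap contains, you can parse it with Python's standard library. A minimal sketch (the sample document below is an illustration in the standard sitemap format, not your actual sitemap):

```python
import xml.etree.ElementTree as ET

# All sitemaps share this XML namespace (sitemaps.org protocol).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Return the page URLs listed in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

# A tiny sample in the standard format:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourblog.com/</loc></url>
  <url><loc>https://yourblog.com/hello-world</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```

Fetching yourblog.com/sitemap.xml and passing the response body to the same function would list every URL search engines will see.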

Auto Indexing

When enabled, new posts are automatically submitted to Google for indexing, helping them appear in search results faster.

Auto Indexing Toggle

Automatically notify Google of new content

Submit new posts to Google automatically

Requires Google connection

Auto indexing requires connecting your Google account. See the Service Account section below.

Benefits of Auto Indexing

  • New posts appear in search results within minutes, not days
  • Updated posts are re-indexed automatically
  • Deleted posts are removed from the index
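Behind the scenes, these notifications go to the Google Indexing API, which accepts a small JSON body per URL. A sketch of what such a notification looks like (hypothetical helper for illustration, not Postlyo's actual code):

```python
# Publish endpoint of the Google Indexing API.
INDEXING_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def indexing_notification(url: str, deleted: bool = False) -> dict:
    """Build the JSON body for one notification: URL_UPDATED for new or
    edited posts, URL_DELETED for removed ones."""
    return {"url": url, "type": "URL_DELETED" if deleted else "URL_UPDATED"}

print(indexing_notification("https://yourblog.com/new-post"))
```

Each request must also carry an OAuth token derived from your service account credentials, which is why auto indexing requires the Google connection described below.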

Robots.txt

The robots.txt file tells search engine crawlers which pages they can and cannot access. A default robots.txt is automatically generated for your site.

Robots.txt Preview

Your current robots.txt configuration

User-agent: *
Allow: /

Sitemap: https://yourblog.com/sitemap.xml

Default configuration

The default robots.txt allows all search engines to crawl all public pages. This is typically what you want for maximum SEO visibility.
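You can confirm the default rules behave as described using Python's standard-library robots.txt parser:

```python
from urllib.robotparser import RobotFileParser

# The default configuration shown above.
robots_txt = """User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any crawler may fetch any public page under this configuration.
print(parser.can_fetch("Googlebot", "https://yourblog.com/any-post"))
```

Swapping `Allow: /` for `Disallow: /` would flip this result and block crawlers site-wide, which is why the default is usually left alone.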

Service Account

A Google service account is required for advanced features like auto indexing and the Analytics Dashboard. This account allows your blog to communicate directly with Google's APIs.

Service Account Setup

Upload your service account JSON

Upload service account JSON file

Your credentials are encrypted and stored securely.

1. Create a Google Cloud project: go to console.cloud.google.com and create a new project.

2. Enable required services: enable the services Postlyo needs, namely Indexing, Search Console, and Analytics.

3. Create a service account: go to IAM & Admin → Service Accounts → Create Service Account.

4. Generate and download a JSON key: create a key for the service account and download the JSON file.

5. Upload to Postlyo: upload the JSON file in your SEO settings.
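Before uploading, you can sanity-check that you downloaded the right file: every Google service account key is a JSON document with a fixed set of fields. A quick check you could run yourself (this helper and the sample values are illustrative, not part of Postlyo):

```python
import json

# Fields present in every Google service account key file.
REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def looks_like_service_account(raw: str) -> bool:
    """Return True if the JSON has the shape of a service account key."""
    try:
        data = json.loads(raw)
    except ValueError:
        return False
    return data.get("type") == "service_account" and REQUIRED_KEYS <= data.keys()

# Sample with placeholder values:
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-blog-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "postlyo@my-blog-project.iam.gserviceaccount.com",
    "token_uri": "https://oauth2.googleapis.com/token",
})
print(looks_like_service_account(sample))
```

If the check fails, you most likely downloaded an OAuth client secret instead of a service account key; go back to IAM & Admin → Service Accounts and generate the key there.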

Detailed guide

For step-by-step instructions with screenshots, see the Analytics Documentation.