How to block dynamic URLs with robots.txt in Blogger

URL structure in Blogger is a restricted feature. It cannot be modified or deleted, especially where post and page URLs are concerned. Every post URL includes a fixed date segment, and every page you create on Blogspot gets a fixed /p/ prefix, as shown in the examples below.

Post permalink

Page permalink
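For illustration, typical Blogger permalinks look like this (example.blogspot.com is a placeholder domain):

```
Post permalink:  https://example.blogspot.com/2024/01/post-title.html
Page permalink:  https://example.blogspot.com/p/page-title.html
```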

According to Google's policy, these permalinks cannot be changed by users; unlike WordPress, the structure is fixed by default.

That is why it becomes a major issue in terms of SEO. A unique article with a date in its URL can still generate good traffic, but as far as I know, articles on sites without dates in their URLs tend to win over time, no matter how well you write or present your content.

Read more: Optimize Blogger SEO by Choosing right meta description
Read more: Basic Blogger SEO Tricks And Tips Forever | Best SEO Help For Your Blog/Website

Dynamic and static URLs

In simple words, an auto-generated link whose structure includes special characters like "?", "#" and "=" is called a dynamic URL; the rest are static.

Dynamic link

Static link
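For illustration, here is what each kind looks like (example.blogspot.com is a placeholder domain, and the query string is just one common Blogger pattern):

```
Dynamic link:  https://example.blogspot.com/search?updated-max=2024-01-01
Static link:   https://example.blogspot.com/2024/01/post-title.html
```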

According to a Google blog post, they now recommend that you do not rewrite dynamic URLs just because they look ugly on your blog.

This is not mandatory, however; it mainly applies to websites that offer selling, trading and other services, such as online games, tools and tips based on user queries.

Google now has a smarter solution for both static and dynamic links.

Dynamic URL in blogger

As mentioned above, there are certain limitations in Blogger, especially with permalinks. You can only modify a post's link once, while writing it for the first time, before it is published.

Editable post Permalink in Blogger before published

Fixed Link after post published

Dynamic links are generated whenever a user comments on your blog or a spammer hits it with unusual queries. They will not harm your website directly, but they will affect its reputation, because they throw a 404 error when a user clicks on them.

Once a dynamic URL is generated, it cannot be deleted or modified later. So, to protect your website's reputation and completely block these unwanted links, we need to configure the robots.txt file.

The robots.txt configuration gives us control over web crawlers.

Block every ugly URL with robots.txt

The Disallow directive prevents matching links from being fetched by web crawlers, so by applying certain rules we can use it on Blogger as well.

For example, if you need a page to be hidden from search engines, the following rules will stop it from being fetched by web crawlers:
Disallow: /search
Disallow: /*?*
Disallow: /*=*

Here, Disallow: /search blocks all links that include /search; similarly, the second and third lines block all links that contain the "?" and "=" characters.
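To see how these wildcard patterns behave, here is a small Python sketch. The matching logic is illustrative only; Google's actual robots.txt matcher is more elaborate, and Python's built-in urllib.robotparser does not understand the "*" wildcard extension, which is why we match with a small regex instead.

```python
import re

def is_blocked(path: str, rules=("/search", "/*?*", "/*=*")) -> bool:
    """Minimal sketch of Google-style robots.txt pattern matching,
    where "*" matches any run of characters."""
    for pattern in rules:
        # Turn each "*" into ".*" and escape every other character literally.
        regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
        if re.match(regex, path):  # robots.txt rules anchor at the path start
            return True
    return False

print(is_blocked("/2024/01/post-title.html"))  # False: a clean static URL
print(is_blocked("/search?q=seo"))             # True: hits /search and /*?*
print(is_blocked("/p/page.html?m=1"))          # True: contains "?"
```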

Thus no dynamic URL will ever be fetched by search engines or appear in any search results. A wrong configuration, however, can affect your entire website; it may cause traffic loss or get your site ignored by search engines, so be careful while working with robots.txt.
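Putting the rules together, a complete Blogger robots.txt might look like this. This is a sketch: example.blogspot.com is a placeholder domain, and the Mediapartners-Google section is the AdSense crawler exception that Blogger's default file includes.

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /*?*
Disallow: /*=*
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```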

You can check and validate it in Google Webmaster Tools:
Go to Crawl > robots.txt Tester

Test your sitemap there and it will give you an overall idea of whether everything is fine.

After successful implementation, you can see the effect in the Crawl Errors section of Google Webmaster Tools: the 404 errors will decrease dramatically. This will also improve your overall search engine optimization score and your website's reputation.