From what I’ve read and seen, CloudFront does not consistently identify itself in requests. But you can get around this problem by overriding robots.txt at the CloudFront distribution.
1) Create a new S3 bucket that only contains one file: robots.txt. That will be the robots.txt for your CloudFront domain.
2) Go to your distribution settings in the AWS Console and click Create Origin. Add the bucket.
3) Go to Behaviors and click Create Behavior. Set the Path Pattern to robots.txt and the Origin to your new bucket.
4) Set the robots.txt behavior to a higher precedence (lower number).
5) Go to Invalidations and invalidate /robots.txt. (Steps 1 and 5 can also be scripted; see the sketch after these steps.)
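If you prefer to script the bucket and invalidation parts, here is a minimal boto3 sketch. The bucket name, distribution ID, and the Disallow-all robots.txt body are placeholders/assumptions to replace with your own values; steps 2-4 are left to the console.

```python
import time
import boto3

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

BUCKET = "my-cloudfront-robots"    # hypothetical bucket name
DISTRIBUTION_ID = "E1ABCDEFGHIJK"  # hypothetical distribution ID

# Step 1: create the bucket and upload a robots.txt that blocks all
# crawling of the CloudFront domain (example policy only).
s3.create_bucket(Bucket=BUCKET)
s3.put_object(
    Bucket=BUCKET,
    Key="robots.txt",
    Body=b"User-agent: *\nDisallow: /\n",
    ContentType="text/plain",
)

# Steps 2-4 (adding the origin and the robots.txt behavior) are easiest
# in the console; scripting them means fetching and re-submitting the
# whole distribution config, which is omitted here.

# Step 5: invalidate /robots.txt so CloudFront stops serving the cached copy.
cloudfront.create_invalidation(
    DistributionId=DISTRIBUTION_ID,
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/robots.txt"]},
        "CallerReference": str(time.time()),
    },
)
```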
Now domainname.cloudfront.net/robots.txt will be served from the bucket and everything else will be served from your domain. You can choose to allow/disallow crawling at either level independently.
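To confirm the split is working, fetch robots.txt from both hostnames and compare the responses. The hostnames below are placeholders for your CloudFront domain and your own domain:

```python
from urllib.request import urlopen

# The two URLs should now return different robots.txt files.
for host in ("dxxxxxxxxxxxx.cloudfront.net", "www.example.com"):
    with urlopen(f"https://{host}/robots.txt") as resp:
        print(host, "->", resp.read().decode()[:80])
```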
Another domain/subdomain will also work in place of a bucket, but why go to the trouble?