How to achieve an environment-specific robots.txt
vikram.bishnoi
Spryker Solution Partner Posts: 15 🧑🏻🚀 - Cadet
Hi Team,
Can you please advise how we can create an environment-specific robots.txt?
The requirement is to prevent crawling of the lower environments and allow crawling only on the PROD environment.
Please suggest.
Thanks
Vikram
Answers
fsmeier Senior Software Engineer & Developer Enablement Advocate Sprykee Posts: 1,075 ⚖️ - Guardians (admin)
Heyhey,
How are you creating the current robots.txt? And why is the robots.txt of a hopefully protected staging/testing environment publicly reachable by crawlers in the first place?
All the best,
Florian
Hi Florian,
We have created two versions of robots.txt. In the UAT/Staging robots.txt we have added:
User-agent: *
Disallow: /
This way the lower environments are not crawlable.
For PROD, we maintain a different version of robots.txt in our PROD/master branch.
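For illustration, a permissive PROD robots.txt following this approach could look like the sketch below; the sitemap URL is a placeholder, not something taken from this thread:

```
# PROD robots.txt: allow all crawlers to index everything
User-agent: *
Disallow:

# Optional: advertise the sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

An alternative to keeping branch-specific files is to serve a blocking robots.txt from the web server (or CDN) on non-production hosts, so a single codebase can be deployed to all environments unchanged.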
Thanks