Description
Have you run into obstacles while creating or editing the robots.txt file on your website?
Bisteinoff SEO Robots.txt is an easy-to-use plugin that helps you generate and configure a correct robots.txt file, which is essential for search engine optimization (SEO). This file defines crawling rules for search engine bots such as Google, Bing, Yahoo!, Yandex, and others.
The plugin works whether a robots.txt file already exists or has never been created. Once installed, it generates an optimized robots.txt file that includes rules common to WordPress websites. You can then add further customization specific to your own website if needed.
If the plugin detects one or more XML sitemap files, it includes them in the robots.txt file.
No FTP access, manual coding, or file editing is required, which makes managing the settings easy and convenient!
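For illustration, a generated robots.txt for a typical WordPress site might look like the following (the exact rules depend on your site and settings; the domain and sitemap URL are placeholders, not the plugin's literal output):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml
```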
Key Features
- Automatic generation of optimized robots.txt with WordPress-specific rules
- Special rules for Google and Yandex search engines
- Custom rules support for any search engine bot
- Automatic sitemap detection and inclusion
- WooCommerce compatibility with specific rules
- Multisite support
- Easy-to-use admin interface
- Modern PHP architecture with namespaces for conflict-free operation
Installation
- Upload the db-robotstxt folder to the /wp-content/plugins/ directory
- Activate the plugin through the 'Plugins' menu in WordPress
- The plugin will automatically create a virtual robots.txt file
- Go to Settings > SEO Robots.txt to customize rules
FAQ
Will it conflict with any existing robots.txt file?
No, it will not. If a robots.txt file is found in the root folder, it will not be overridden. Instead, the Settings page will show a notification with two options: rename or delete the existing robots.txt file. The plugin provides this functionality directly in the admin interface.
Could I accidentally block all search robots?
Once the plugin is installed, it will work fine for all search engine robots. If you are not familiar with the rules for fine-tuning a robots.txt file, it is better to leave the file as is, or first read a corresponding manual to learn more about robots.txt directives.
Note: each of the following directives blocks the corresponding search robot(s) from the entire site:
Disallow: /
Disallow: *
Disallow: /*
Disallow: */
You should use any of these directives only if you do not want any page of your website to be accessible for crawling.
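For contrast, a selective rule blocks only part of a site while leaving the rest crawlable. A minimal sketch (the /private/ path is purely illustrative, not a rule the plugin generates):

```
User-agent: *
Disallow: /private/
```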
Where can I read the up-to-date guide on robots.txt?
What happens when I update to version 4.0?
For regular users: Nothing changes! The plugin will automatically migrate all your settings. Everything continues to work exactly as before.
For developers: Version 4.0 introduces a complete code refactoring with modern PHP classes and namespaces. If you have custom code that references this plugin’s functions, constants, or options, please review the migration information below.
Migration to v.4.0 – Information for Developers
If you have custom code that integrates with this plugin, please note these changes:
Checking for deprecation notices: All deprecated elements will trigger _doing_it_wrong() notices when WP_DEBUG is enabled. Enable debug mode to identify any issues:
define( 'WP_DEBUG', true );
Changed option names:
- db_robots_custom → bisteinoff_plugin_robots_custom
- db_robots_custom_google → bisteinoff_plugin_robots_custom_google
- db_robots_if_yandex → bisteinoff_plugin_robots_enable_yandex
- db_robots_custom_yandex → bisteinoff_plugin_robots_custom_yandex
- db_robots_custom_other → bisteinoff_plugin_robots_custom_other
Note: Options are migrated automatically. Old option names are removed from the database after successful migration.
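If your own code reads these options directly, it should switch to the new names. A minimal PHP sketch (the fallback to the legacy name is an illustrative precaution, not part of the plugin; it assumes the code runs inside WordPress, where get_option() is available):

```php
// Prefer the new option name introduced in v4.0.
$custom_rules = get_option( 'bisteinoff_plugin_robots_custom' );

// Fall back to the legacy name in case the automatic
// migration has not run yet (illustrative only).
if ( false === $custom_rules ) {
    $custom_rules = get_option( 'db_robots_custom' );
}
```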
Changed constants:
- DB_PLUGIN_ROBOTSTXT_VERSION → BISTEINOFF_PLUGIN_ROBOTS_VERSION
- DB_PLUGIN_ROBOTSTXT_DIR → BISTEINOFF_PLUGIN_ROBOTS_DIR
Note: Old constants remain defined for backward compatibility.
Changed functions (now deprecated):
- publish_robots_txt() → use \Bisteinoff\Plugin\RobotsTXT\Generator::generate() instead
- db_robots_admin() → use \Bisteinoff\Plugin\RobotsTXT\Admin::add_menu_page() instead
- db_robotstxt_admin_settings() → use \Bisteinoff\Plugin\RobotsTXT\Admin::render_settings_page() instead
- db_settings_link() → use \Bisteinoff\Plugin\RobotsTXT\Loader::add_settings_link() instead
Note: Deprecated functions continue to work for backward compatibility.
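For example, a call to the deprecated generator function would be updated as follows (a sketch based on the mapping above; it assumes the plugin's classes are loaded):

```php
// Before (deprecated; triggers _doing_it_wrong() when WP_DEBUG is on):
publish_robots_txt();

// After (v4.0+ namespaced API):
\Bisteinoff\Plugin\RobotsTXT\Generator::generate();
```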
Action required:
Update your custom code to use the new naming conventions. All deprecated elements will be removed after February 16, 2027.
Changelog
4.0
- MAJOR UPDATE: Complete code refactoring with modern PHP architecture
- Compatible with WordPress 6.9
- Compatible with WordPress Theme Bisteinoff 2.4+
- Compatible with PHP 7.0 through PHP 8.4 (no deprecated PHP features used)
- Feature: Modern PHP namespaces (Bisteinoff\Plugin) to prevent conflicts with other plugins
- Feature: Seamless integration with Bisteinoff WordPress themes and plugins
- Feature: Efficient class-based architecture with lazy loading
- Feature: Automatic migration system for settings and options
- Fix: Undefined variable warnings for $db_renamed and $db_deleted
- Backward Compatibility: All old function names preserved until at least February 16, 2027
- Backward Compatibility: Old constant names (DB_PLUGIN_ROBOTSTXT_*) preserved
- Backward Compatibility: Options automatically migrated from old to new names
- For Developers: See FAQ section “Migration to v.4.0” for detailed technical information
3.12
- Compatible with WordPress 6.7
- Rewrote code that used deprecated and discouraged functions
- Fixed security issues
3.11
- Design of the Settings page in admin panel
3.10
- Custom rules for WooCommerce if the plugin is installed and activated
- Fixed handling of the ampersand symbol
3.9
- Fixed security issues
3.8
- Compatible with WordPress 6.5
3.7
- Fixed security issues
3.6
- Compatible with WordPress 6.3
- Fixed security issues
3.5
- Compatible with multisite installations
3.4.2
- Corrected errors in the functions for translation of the plugin
3.4.1
- Translations are now automatically downloaded from https://translate.wordpress.org/projects/wp-plugins/db-robotstxt/ If there is no translation into your language, please don't hesitate to contribute!
3.4
- Compatible with GlotPress
3.3
- New options to rename or delete the existing robots.txt file
3.2
- New option to disable the rules for Yandex
- Design of the Settings page in admin panel
3.1
- New basic regular rules for Googlebot and Yandex
- Now more possibilities to manage your robots.txt: you can add custom rules for Googlebot, Yandex and other User-agents
- More information about your robots.txt on the settings page
3.0
- Added a settings page in admin panel for custom rules
2.3
- Tested with WordPress 6.2.
- The code is optimized
- Added the robots directives for new types of images WebP, Avif
2.2
- Fixed Sitemap option
2.1
- Tested with WordPress 5.5.
- Added wp-sitemap.xml
2.0
- Tested with WordPress 5.0.
- The old Host directive has been removed, as it is no longer supported by Yandex.
- The robots directives are improved and updated.
- Added robots directives preventing indexing of duplicate links with UTM, Openstat, From, GCLID, YCLID, and YMCLID parameters
1.0
- Initial release.



