Stott.Optimizely.RobotsHandler
This is an admin extension for Optimizely CMS 12+ for managing robots content. Stott Robots Handler is a free-to-use module; however, if you want to show your support, you can buy me a coffee on Ko-fi.
Robots.txt Management
Robots.txt content can be managed on a per-site and per-host-definition basis. A host of "default" applies to all unspecified hosts within a site, while specific host definitions only apply to their specific host.
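For example, a site's "default" host definition could serve a permissive file while a specific host definition serves a blocking one (the hostname staging.example.com below is a hypothetical placeholder; the content is illustrative only):

    # Served for the "default" host definition (all unspecified hosts)
    User-agent: *
    Disallow:

    # Served only for the staging.example.com host definition
    User-agent: *
    Disallow: /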
Environment Robots
Introduced in version 4.0.0.
Environment Robots allows you to configure the meta robots tag and X-Robots-Tag header for all page requests within the current environment. This provides the ability to prevent search engine robots from scanning and indexing a site that is in a lower-level environment, or a production environment that is not yet ready for general consumption.
Options will always exist for Integration, Preproduction, Production and the current environment name. This allows you to preconfigure a lower environment when cloning content from production to lower environments.
When a configuration is active, a Meta Tag Helper will look for and update the meta robots tag, while a middleware will include the X-Robots-Tag header. It is best in this case that your solution always renders the meta robots element, allowing the Meta Tag Helper to either override it or remove it where needed.
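For example, your layout could always render a default element; the MetaRobots property below is a hypothetical stand-in for however your solution exposes page-level robots values:

    <head>
        <!-- Always rendered; the tag helper overrides or removes it as required. -->
        <meta name="robots" content="@Model.MetaRobots" />
    </head>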
The meta tag helper will execute for any meta tag with a name attribute. The logic within the robots tag helper will only execute where the name attribute has a value of "robots". In such a circumstance it will perform one of the following actions:
- When name is "robots":
  - If the page has a robots definition and an environment configuration is not set, then the page robots value will be used.
  - If the page has a robots definition and an environment configuration is set, then the environment configuration replaces the page definition.
  - If the page does not have a robots definition and an environment configuration is not set, then the meta tag will be removed.
  - If the page does not have a robots definition and an environment configuration is set, then the meta tag will use the environment configuration.
- When name is not "robots":
  - The existing state is preserved.
Examples:

Page Robots | Environment Robots | Result |
---|---|---|
noindex,nofollow | noindex,nofollow,noimageindex | noindex,nofollow,noimageindex |
noindex,nofollow | - | noindex,nofollow |
- | noindex,nofollow,noimageindex | noindex,nofollow,noimageindex |
- | - | meta robots tag is removed |
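As markup, the first row of the table above plays out like this (illustrative HTML only, not captured from the package):

    <!-- Rendered by the page -->
    <meta name="robots" content="noindex,nofollow" />

    <!-- Emitted once the environment configuration is applied -->
    <meta name="robots" content="noindex,nofollow,noimageindex" />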
Installation
Install the Stott.Optimizely.RobotsHandler package into your website.
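For example, using the .NET CLI:

    dotnet add package Stott.Optimizely.RobotsHandler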
Startup.cs
You need to ensure the following lines are added to the startup class of your solution:
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddRobotsHandler();
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        app.UseRobotsHandler();
        app.UseEndpoints(endpoints =>
        {
            endpoints.MapContent();
            endpoints.MapControllers();
        });
    }
The call to services.AddRobotsHandler() sets up the dependency injection requirements for the RobotsHandler solution and is required to ensure the solution works as intended. This follows the service extensions pattern defined by Microsoft.
The call to app.UseRobotsHandler() sets up the middleware required to create the X-Robots-Tag header.
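For example, when an environment configuration of noindex,nofollow is active, page responses would carry a header like this (illustrative output):

    X-Robots-Tag: noindex,nofollow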
The call to endpoints.MapControllers() ensures that the routing for the administration page, its assets and robots.txt is correctly mapped.
Razor Files
In the _ViewImports.cshtml file you will need to add the following line to include the meta robots tag helper:
    @addTagHelper *, Stott.Optimizely.RobotsHandler
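A typical _ViewImports.cshtml might then look like this (the @using namespace is a hypothetical placeholder for your own project):

    @using MyWebsite
    @addTagHelper *, Microsoft.AspNetCore.Mvc.TagHelpers
    @addTagHelper *, Stott.Optimizely.RobotsHandler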
Program.cs
As this package includes static files such as JS and CSS within the Razor Class Library, your solution must be configured to use Static Web Assets. This is done by adding webBuilder.UseStaticWebAssets(); to your Program.cs as follows:
    Host.CreateDefaultBuilder(args)
        .ConfigureCmsDefaults()
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
            webBuilder.UseStaticWebAssets();
        });
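If your solution uses the minimal hosting model rather than a Startup class, the equivalent call can be made on the builder's WebHost property (a sketch assuming the standard ASP.NET Core minimal hosting APIs):

    var builder = WebApplication.CreateBuilder(args);
    // Enable serving static web assets from Razor Class Libraries.
    builder.WebHost.UseStaticWebAssets();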
You can read more about shared assets in Razor Class Libraries here: Create reusable UI using the Razor class library project in ASP.NET Core
Adding Robots Admin to the menus
This solution also includes an implementation of IMenuProvider which ensures that the Robots Handler administration pages are included in the CMS Admin menu under the title of "Robots". You do not have to do anything to make this work, as Optimizely CMS will scan for and action all implementations of IMenuProvider.
Authorisation Configuration
The configuration of the module can be modified by providing options in the service extension method. The authorizationOptions argument in the following example is optional.
Example:
    services.AddRobotsHandler(authorizationOptions =>
    {
        authorizationOptions.AddPolicy(RobotsConstants.AuthorizationPolicy, policy =>
        {
            policy.RequireRole("WebAdmins");
        });
    });
If authorizationOptions is not provided, then any one of the following roles will be required by default:
- CmsAdmins
- Administrator
- WebAdmins
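In other words, the default behaviour is roughly equivalent to the following configuration (an illustrative sketch, not the package's actual internal code):

    services.AddRobotsHandler(authorizationOptions =>
    {
        authorizationOptions.AddPolicy(RobotsConstants.AuthorizationPolicy, policy =>
        {
            // Any one of these roles grants access to the admin interface.
            policy.RequireRole("CmsAdmins", "Administrator", "WebAdmins");
        });
    });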
Authentication With Optimizely Opti ID
If you are using the new Optimizely Opti ID package for authentication into Optimizely CMS and the rest of the Optimizely One suite, then you will need to define the authorizationOptions for this module as part of your application startup. This should be a simple case of adding policy.AddAuthenticationSchemes(OptimizelyIdentityDefaults.SchemeName); to the authorizationOptions as per the example below.
    services.AddRobotsHandler(authorizationOptions =>
    {
        authorizationOptions.AddPolicy(RobotsConstants.AuthorizationPolicy, policy =>
        {
            policy.AddAuthenticationSchemes(OptimizelyIdentityDefaults.SchemeName);
            policy.RequireRole("WebAdmins");
        });
    });
Contributing
I am open to contributions to the code base. The following rules should be followed:
- Contributions should be made by Pull Requests.
- All commits should have a meaningful message.
- All commits should have a reference to your GitHub user.
- Ideally, all new changes should include appropriate unit test coverage.
Contributors
Thanks go to the following members of the community for their feedback and contributions:
Contributor | Bug Reports | Pull Requests |
---|---|---|
Paul Mcgann | 1 | 1 |
Ellinge | 1 | 1 |
Tomas Hensrud Gulla | - | 1 |
Anish Peethambaran | 1 | - |
Deepa V Puranik | 1 | - |
Mahdi Shahbazi | 1 | - |
Praveen Soni | 1 | - |
jhope-kc | 1 | - |