Reduce boilerplate and maintain project consistency
Before reading: if you work in a small company with only one or two projects, this post may not be relevant to you.
I've worked for companies of various sizes, from small startups to organizations with thousands of developers. While small companies typically have one or two projects, large companies can have hundreds or thousands of repositories. As organizations grow, keeping all projects aligned and consistent becomes challenging. It's easy to use a template to start a new project, but how do you keep existing projects aligned over time? How do you ensure that every project adopts the latest standards for coding style, CI/CD, security, observability, and more?
I've focused on developer experience and productivity for about 10 years, mostly with .NET. I've contributed to different topics such as coding standards, static analysis, CI/CD, build machines, observability, libraries, productivity tools, and more. I've learned a lot from my experiences, and I can tell you there is no single solution. Each company faces unique challenges, but I want to share some ideas and strategies that can help improve consistency across projects.
The solution I prefer is to remove as much boilerplate code and configuration as possible from each project and replace them with implicit defaults or references that can be updated using tools such as Dependabot or Renovate. This approach provides a baseline configuration that can be updated automatically. It's also important to allow extensibility points for project-specific settings.
Advantages:
- Focus on your product: Developers can focus on building features instead of configuring and maintaining shared code.
- Greater organizational impact: Each additional project referencing a shared library amplifies its value.
- Accelerated delivery: Reusing existing components speeds up the journey from idea to production.
- Consistency: Promotes uniform practices and standards across projects.
- Reduced maintenance: Less duplicated code means fewer things to maintain and test.
- Enhanced collaboration: Facilitates knowledge sharing and teamwork.
- Less boilerplate code: Codebases are easier to read and maintain because you don't repeat the same configuration in multiple projects. It also becomes clearer what is actually specific to each project.
- Updatable: New features and bug fixes can be delivered without requiring changes in every project.
Most of the time, developers only think about creating packages for code. But sharing by reference goes beyond just code libraries. It can also include configuration files, coding standards, CI/CD pipelines, and more.
Also, before you start your journey, be prepared for pushback from developers and product managers who may not immediately see the value of these efforts. The benefits are rarely immediate, but they compound over time. You'll need to advocate and evangelize the benefits of sharing by reference. Start small, with a few projects, and demonstrate the benefits. Once you have a few success stories, it will be easier to convince others to join!
You also need to understand that it's not about generic libraries that anyone can use. It's about creating shared components specific to your company and projects. It's definitely opinionated, as you want to provide the right defaults to set developers up for success. You can make assumptions about the technology stack.
Let's look at some strategies to share by reference. Some are .NET specific, but the principles apply to any technology stack.
# .NET Coding standards and static analysis
I've already written about sharing coding standards and Roslyn analyzers across projects. The idea is to create a NuGet package that contains the `.editorconfig` files, analyzers, and MSBuild configuration. This package provides:
- A common coding style across all projects that reference it
- A set of Roslyn analyzers and a baseline configuration to enforce static analysis for best practices and security
- MSBuild properties, items, and targets to configure the project, such as enabling nullable reference types, enforcing code style in the build, treating warnings as errors on CI, enabling NuGet auditing, enabling source link, and more
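To give an idea of what such a package injects, here's a minimal sketch of a props file it could ship. The properties below are standard MSBuild/SDK properties, but the selection and grouping are illustrative, not the actual content of the package:

```xml
<!-- Illustrative sketch of a props file a coding-standard package could ship.
     All properties are standard MSBuild/SDK properties; the exact set is an assumption. -->
<Project>
  <PropertyGroup>
    <Nullable>enable</Nullable>
    <AnalysisLevel>latest-all</AnalysisLevel>
    <EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
    <NuGetAudit>true</NuGetAudit>
    <!-- Only fail the build on warnings in CI, to keep the local inner loop fast -->
    <TreatWarningsAsErrors Condition="'$(ContinuousIntegrationBuild)' == 'true'">true</TreatWarningsAsErrors>
  </PropertyGroup>
</Project>
```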
This package allows you to remove hundreds of lines of boilerplate code and configuration from each project:
- C# and VB.NET sections from the `.editorconfig` (~100 lines)
- Configuration files like `Directory.Build.props` and `Directory.Build.targets` (~70 lines)
- Analyzer configuration (up to a few hundred lines)
As it's a NuGet package, it's easy to update when a new version is published. All projects can benefit from the latest coding standards and analyzers, and it prevents configuration drift. With central package management, a single `GlobalPackageReference` in `Directory.Packages.props` applies it to every project:
<GlobalPackageReference Include="Meziantou.DotNet.CodingStandard" Version="1.0.175" />
If you're curious about the package, you can find it on NuGet and the source code on GitHub. As a bonus, it's also possible to write tests to ensure all the settings work as expected. You'll find some tests in the source code of the package.
At Workleap, we also add some opt-in features to the coding standard package, such as preventing people from using `Newtonsoft.Json` types in their projects. Opting in only requires adding a new property in the project file:
<PropertyGroup>
<BanNewtonsoftJsonSymbols>true</BanNewtonsoftJsonSymbols>
</PropertyGroup>
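For context, one plausible way to implement such an opt-in (my assumption, not necessarily how the Workleap package does it) is to wire the property to `Microsoft.CodeAnalysis.BannedApiAnalyzers` from the package's targets file:

```xml
<!-- Hypothetical targets fragment: when the opt-in flag is set, feed a
     BannedSymbols.txt file (e.g. "T:Newtonsoft.Json.JsonConvert;Use System.Text.Json")
     to Microsoft.CodeAnalysis.BannedApiAnalyzers via AdditionalFiles. -->
<Project>
  <ItemGroup Condition="'$(BanNewtonsoftJsonSymbols)' == 'true'">
    <AdditionalFiles Include="$(MSBuildThisFileDirectory)BannedSymbols.Newtonsoft.txt" />
  </ItemGroup>
</Project>
```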
What's great is that you now have a single location to provide features for .NET developers, and it's updatable. This solution has been greatly appreciated by developers. It allows them to focus on writing code instead of configuring their projects and reduces discussions about coding standards. Apply what the package says, and you're good to go.
If you're using .NET, I strongly recommend reading about MSBuild as it allows you to do a lot! Also, MSBuild SDKs go further than simple NuGet packages. A good example is MSTest.Sdk.
# Renovate configuration
Renovate is a tool that automates dependency updates. It can be configured using a `renovate.json` file. This configuration file can be quite large, especially if you have many dependencies or want to customize Renovate's behavior, and it gets duplicated if each project has its own `renovate.json`. Also, each team may configure Renovate differently, leading to inconsistencies. For instance, teams may use different pull request title formats, making it harder to create inbox rules to filter Renovate pull requests.
When a package changes its license in an update, you may want to block that update company-wide until the legal team reviews the new license. If each project has its own `renovate.json` file, you have to update every one of them, which is tedious and error-prone.
A better solution is to create a shared Renovate configuration file and reference it in each project. This is one reason I prefer Renovate over Dependabot, the other being its flexibility.
Here's what a Renovate configuration file looks like in my projects:
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": ["github>meziantou/renovate-config"], // import the shared configuration
"packageRules": [
// project specific rules
]
}
As a bonus, at least for GitHub, Renovate automatically uses the shared configuration if it lives in a repository named `renovate-config` in the same organization. This means you can remove the `renovate.json` file from each project and let Renovate pick up the shared configuration when you don't need to override it.
More info about creating a shared Renovate configuration. This configuration can be up to a few hundred lines, depending on the complexity of your dependencies.
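As a sketch, a shared preset could centralize the pull request title format and the license-review blocking described above. The package name, version range, and prefix below are placeholders:

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "commitMessagePrefix": "chore(deps):",
  "packageRules": [
    {
      "description": "Placeholder: block an update pending legal review of a license change",
      "matchPackageNames": ["SomePackage"],
      "allowedVersions": "<2.0.0"
    }
  ]
}
```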
# ServiceDefaults package
If you create multiple web APIs or workers, you may find yourself repeating the same configuration for each project. For instance, you have to add Azure App Configuration, configure OpenTelemetry, logging, health checks, HTTP client resiliency, and other common services. Instead of duplicating this configuration across all projects, you can create a shared `ServiceDefaults` package that contains the common configuration and use it in each project. The main benefit of using a package is the ability to update it easily over time. If you need to change how OpenTelemetry is configured, you can do it in the package, and all projects that reference it will benefit.
If you're familiar with the .NET Aspire template, you may have seen the ServiceDefaults project. It provides a set of common services and configurations that can be reused across multiple projects in a solution. But why not create a NuGet package that can be used in any project instead of having it per solution?
After adding the package reference, here's what the code looks like in the `Program.cs` file of a web API project:
<PackageReference Include="Meziantou.AspNetCore.ServiceDefaults" Version="1.0.5" />
var builder = WebApplication.CreateBuilder(args);
builder.UseMeziantouConventions(); // Register common services (e.g. OpenTelemetry, logging, JSON serialization options, etc.)
// Application-specific services
builder.Services.AddSingleton<CatalogService>();
var app = builder.Build();
app.MapMeziantouDefaultEndpoints(); // Register common endpoints (e.g. health checks, OpenAPI, SwaggerUI, etc.)
// Application-specific endpoints
app.MapGet("api/item", async (CatalogService catalog) => "...");
app.Run();
By calling the two methods `UseMeziantouConventions` and `MapMeziantouDefaultEndpoints`, I know my application will have the most common services configured, and I can update them in the future without changing the project's code. Here's a list of services you can add by default, and you can customize them if needed:
- Standard configuration sources
- OpenTelemetry including the expected activity sources
- Logging
- JSON serialization options (e.g. `System.Text.Json` with `JsonStringEnumConverter`)
- Health checks with standard endpoints
- OpenAPI generation + UI
- Exception handling
- Resilience for `HttpClient` (circuit breaker, retry, etc.) and a `User-Agent` header for all HTTP requests
- HTTPS redirection and HSTS
- Feature flags
- Authentication
It also simplifies project dependencies. The `ServiceDefaults` package already references many required packages, reducing the number of pull requests needed to update them. Keeping project dependencies up to date is often a challenge, as most developers don't see the value in doing it regularly. With fewer dependencies, you have fewer update pull requests and a lower risk of outdated dependencies.
You should also provide a way to configure the services if needed. But keep in mind that this library is opinionated, so configuration should be minimal and limited to specific services. For instance, if you want to disable the HTTPS features (HTTPS redirection, HSTS) or register additional tracing sources for OpenTelemetry, you can do it by passing options to the `UseMeziantouConventions` method:
builder.UseMeziantouConventions(options =>
{
options.Https.Enabled = false;
options.OpenTelemetry.ConfigureTracing = tracing => tracing.AddSource("CustomSource");
});
As you can see, the configuration is declarative, not imperative: you tell the package what you want, not how to do it. This is a key principle that gives the package the flexibility to change its implementation without breaking the code that uses it. For instance, the package can change how OpenTelemetry is wired up without requiring any change in the projects that reference it.
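To illustrate the declarative shape, here's a hypothetical sketch of how such an options object could be modeled. The names mirror the usage above, but this is an assumption on my part, not the actual package's API (the real package exposes callbacks such as `ConfigureTracing`; this sketch keeps plain data to highlight the principle):

```csharp
using System.Collections.Generic;

// Hypothetical options shape: plain data the package interprets.
// The package, not the call site, decides how to apply each setting.
public sealed class ServiceDefaultsOptions
{
    public HttpsOptions Https { get; } = new();
    public OpenTelemetryOptions OpenTelemetry { get; } = new();
}

public sealed class HttpsOptions
{
    // HTTPS redirection and HSTS are on by default; projects opt out declaratively.
    public bool Enabled { get; set; } = true;
}

public sealed class OpenTelemetryOptions
{
    // Declaring extra activity sources lets the package decide how to register them.
    public IList<string> AdditionalActivitySources { get; } = new List<string>();
}
```

A project only states facts ("HTTPS is disabled", "this source exists"); what that implies (skipping the redirection middleware, calling `AddSource`, etc.) stays inside the package and can evolve.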
# MSBuild tasks
MSBuild is an extensible build system that allows you to create custom tasks and targets. nuget.org has a few packages that insert MSBuild tasks into your project. For instance, you can automatically set the version of your project using `MinVer`, `GitVersion.MsBuild`, or `Nerdbank.GitVersioning`. At Workleap, we created a package to help with OpenAPI generation and validation. The `Workleap.OpenApi.MSBuild` package integrates into the build process (`dotnet build`) to ensure the OpenAPI specification file is up-to-date with the code, compliant with company standards (using Spectral), and free of breaking changes in the API.
# File fragments
Sometimes, you cannot remove the boilerplate code. In this case, a strategy is to wrap it in a region that can easily be identified and updated. The shared part of the file can be versioned and updated using a tool like Renovate.
If you look at the `.editorconfig` file in my projects, you can see some of these regions. The shared parts of files are surrounded by the comments `# reference:URL` and `# endreference`. This way, you can easily find the shared part of the file and update it if needed. The shared part can be versioned using tags and updated using Renovate. Note that Renovate allows running custom tools after updating a reference if you use a self-hosted instance, so Renovate can bump the version in the reference URL and then run a tool to refresh the content of the section.
# reference:https://raw.githubusercontent.com/meziantou/Meziantou.DotNet.CodingStandard/refs/tags/1.0.0/.editorconfig
root = true
[*]
indent_style = space
trim_trailing_whitespace = true
end_of_line = lf
[*.{csproj,vbproj,vcxproj,vcxproj.filters,proj,projitems,shproj}]
indent_size = 2
...
# endreference
# Project specific settings
[*.cs]
dotnet_diagnostic.CA1008.severity = none
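Assuming a self-hosted Renovate instance, a custom regex manager can track the tag inside the `# reference:` URL. The capture groups follow Renovate's regex-manager conventions; the exact `matchStrings` pattern is illustrative:

```json
{
  "customManagers": [
    {
      "customType": "regex",
      "fileMatch": ["\\.editorconfig$"],
      "matchStrings": [
        "# reference:https://raw\\.githubusercontent\\.com/(?<depName>[^/]+/[^/]+)/refs/tags/(?<currentValue>[^/]+)/"
      ],
      "datasourceTemplate": "github-tags"
    }
  ]
}
```

When a new tag is published, Renovate updates `currentValue` in the URL, and a post-upgrade task can then re-download the referenced content into the region.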
# CI/CD pipelines
Pipelines are another good candidate to be shared across projects. You can have different strategies to share pipelines:
- For compliance pipelines (e.g. SAST, DAST, dependency scanning, etc.), you can move them to the organization level. This ensures all projects have the same compliance checks and you can update them in a single place. It also avoids updating each project when you want to change the compliance checks.
- GitHub provides flexible ways to share parts of your CI/CD pipelines, such as reusable workflows and composite actions.
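On GitHub, for example, a caller repository can delegate most of its pipeline to a reusable workflow maintained centrally. The organization, repository, and input names below are placeholders:

```yaml
# Placeholder org/repo and inputs; the reusable workflow lives in a central repository.
name: CI
on: [push]
jobs:
  build:
    uses: my-org/shared-workflows/.github/workflows/dotnet-build.yml@v1
    with:
      dotnet-version: "9.0.x"
```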
In my previous job, a CD pipeline for an API or a worker looked like the following (GitLab). It works well because there are assumptions about the technology stack (e.g. .NET, Docker, Kubernetes) and the pipeline is opinionated. For instance, it expects a Dockerfile somewhere in the repository. It also has knowledge about the registry and the Kubernetes cluster to deploy to. That's why it only asks for a few specific pieces of information to configure the pipeline.
include: https://registry.example.com/k8s-recipe/k8s-recipe-1.0.0.yaml
variables:
CHART_VALUES_DEFAULT: default-1.0.0.yaml
CHART_VALUES_STAGING: my-app-staging-1.0.0.yaml
CHART_VALUES_PRODUCTION: my-app-production-1.0.0.yaml
HOST_NAME: "my-app.com"
DOCKER_IMAGE_NAME: app-name
# Dockerfile
Do you really need a Dockerfile? Some technologies, such as .NET, can create Docker images without one. For instance, you can use `dotnet publish` to build your image and push it to a registry. It's not only about removing boilerplate: it also improves performance and avoids many common Dockerfile issues, such as inefficient layer caching.
# no Dockerfile needed
dotnet publish -p:PublishProfile=DefaultContainer
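The image can be customized from the project file. The properties below come from the .NET SDK's built-in container support; the repository name and tags are placeholders:

```xml
<PropertyGroup>
  <EnableSdkContainerSupport>true</EnableSdkContainerSupport>
  <ContainerRepository>my-team/app-name</ContainerRepository>
  <ContainerImageTags>1.2.3;latest</ContainerImageTags>
</PropertyGroup>
```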
# Helm charts
Because you can set standards using MSBuild packages or SDKs, you can rely on those standards in your Helm charts. For instance, you can set the readiness and liveness probes in a shared Helm chart. You can also add parameters specific to your infrastructure, such as certificates, autoscaling, or Azure pod identity. This way, you remove boilerplate from your Helm charts and make them reusable across projects. Most projects only need to set the image name and the CPU and memory limits; the other parameters should be common to most projects.
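A project consuming such a shared chart could then supply only the few values that actually vary. The file below is an illustrative `values.yaml`; everything not listed (probes, autoscaling, identity) would come from the chart's defaults:

```yaml
# Illustrative per-project values for a hypothetical shared chart.
image:
  repository: registry.example.com/app-name
  tag: "1.0.0"
resources:
  limits:
    cpu: 500m
    memory: 512Mi
```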
# PowerShell modules
PowerShell is a powerful scripting language that can be used to automate tasks and manage systems. It's very common on CI or for local operations. I use it more than Bash scripts, as it provides a better debugging experience and is cross-platform. PowerShell modules are a great way to share scripts and functions across projects. You can create a module that contains common functions and scripts and reuse it in multiple projects using `Import-Module -Name <ModuleName> -RequiredVersion <ModuleVersion>`. Modules are versioned and published as NuGet packages, so you can easily update them when needed.
- Publish a module: `Publish-Module`
- Consume a module: `Install-Module`
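In practice, the publish/consume flow looks like the following; the module name, version, and API key variable are placeholders:

```powershell
# Publish from CI (placeholder module name and API key variable)
Publish-Module -Path ./MyCompany.DevTools -NuGetApiKey $env:GALLERY_API_KEY

# Consume in a project or pipeline script
Install-Module -Name MyCompany.DevTools -RequiredVersion 1.4.0 -Scope CurrentUser
Import-Module -Name MyCompany.DevTools -RequiredVersion 1.4.0
```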
# MCP server to share AI prompts
With the rise of AI, developers add more and more prompts to their projects. Some prompts are generic and can be reused across projects. One way to share them is an MCP server: MCP servers can expose tools, resources, and prompts, so you can share AI prompts and resources without duplicating them in each project.
Note that GitHub allows setting a custom prompt for Copilot in the organization settings. This is a good way to share a prompt that can be used by all developers in the organization.
It's very easy to create an MCP server using the ModelContextProtocol.AspNetCore package.
using System.ComponentModel;
using Microsoft.Extensions.AI;
using ModelContextProtocol.Server;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddMcpServer()
    .WithHttpTransport() // ASP.NET Core hosting uses the HTTP transport
    .WithPromptsFromAssembly();
var app = builder.Build();
app.MapMcp(); // expose the MCP endpoints
app.Run();
[McpServerPromptType]
public static class SamplePrompts
{
    [McpServerPrompt, Description("Sample prompt.")]
    public static ChatMessage SamplePrompt() =>
        new(ChatRole.User, "Sample prompt content");
}
# Update existing projects
Providing shared libraries and configuration is great, but you also need to update existing projects to use them. This can be a challenge, as you may have hundreds of projects to update. The best way to do this is to create a tool that updates the projects automatically. You can check Meziantou.ProjectUpdater to see how I do it. The tool clones all repositories, applies the migration to each of them, and creates a pull request with the changes. Some migrations are simple and can be written in a deterministic way. For more complex migrations, the migration script can rely on AI to help.
Note that you may want to apply a throttling strategy to avoid overwhelming the Git server with too many pull requests at once, and the build system with too many jobs. Otherwise, a large migration could prevent development teams from working on other tasks while it runs.
# Conclusion
Take a look at your projects and see if there are other things that can be shared by reference. For instance, you can share:
- IDE configurations (e.g. `.vscode`, `.vsconfig` for Visual Studio)
- Git hooks
- Dev tools needed to run the project (e.g. dev containers, GitHub Codespaces, `winget configure`, `devbox`, etc.)

In fact, look at everything that is not business code and see if there is a way to remove it from the project while improving the overall experience.
Also, I recommend reading The InnerSource Commons. It's a great resource to learn more about inner source and how to apply it in your organization. Sharing code is one aspect of inner source, but you also need to share knowledge about it, so developers can discover existing solutions. Developer catalogs can help, but they require maintenance and updates to stay relevant. Internal meetups or hackathons can also help promote inner source and break silos, but they are not enough. Experiment and find what works best for your organization.
Do you have a question or a suggestion about this post? Contact me!