The Greener Web: How Smarter Digital Marketing is Better for the Planet
Sustainability is becoming a priority for businesses across every sector. But while companies focus on supply chains, office energy use and carbon offsetting, there’s a quieter conversation that the digital industry needs to have with itself.
Written by: Amber, SEO Manager
17/04/2026
4 min read
The web has an environmental cost, and how we build and manage digital presences has a very real impact on it.
The good news is that a greener approach to digital marketing doesn’t require compromise. In fact, the practices that reduce your environmental footprint are almost always the same ones that improve performance, user experience and search visibility. Here’s how Northern Media do things with WordPress development and SEO.
The Carbon Cost of the Web
Before we get into specifics, it’s worth grounding this in some data. The internet accounts for roughly 2–4% of global carbon emissions. To put that into context, that’s comparable to the aviation industry – and the figure is growing. Every page view, every server request, every unnecessary script loaded in the background consumes energy. Data centres, network infrastructure and the devices used to access your site all draw power.
When someone visits a webpage, data travels from a server to their browser. The heavier the data, the more energy is consumed. What this means in practice is that a bloated, slow website isn’t just a bad user experience – it’s an inefficient use of energy, replicated across every single site visit.
Why Custom Builds are Better for the Environment
Page builders and pre-packaged WordPress themes have made websites more accessible to small or lower-budget businesses. But they come with a hidden cost that goes well beyond clunky editing features and slower load times.
Junk Code and Inflated Page Weight
Template-based page builders like Elementor, WPBakery, Divi and Beaver Builder come pre-packed with features that your site doesn’t need, generating enormous amounts of extra code. That could be styling rules for components you’re not using, or scripts loaded globally when they’re only needed on one page. All of this inflates the size of every page your visitors load, increasing the data transferred and the energy consumed with every request.
A custom-built WordPress site, by contrast, serves only the code it actually needs and nothing more. That leanness translates directly into smaller page sizes, fewer server round-trips and a meaningfully lower energy footprint per visit.
Plugin Bloat and Generated Pages
Many popular plugins generate pages – archives, indexes and the like – that you never asked for, never use and probably don’t know exist. Page builder plugins register template libraries and landing page drafts in the background, some of which are needed for certain features to work. Each of these represents a unique URL: a page that browsers can request, servers must respond to and crawlers must process, even if it’s not visible on the frontend of your site. And because page builders often rely on extra plugins, those plugins tend to bring pages of their own. Examples we’ve seen include empty pages generated by menu plugins and duplicated image pages generated by gallery plugins.
Custom development means we can control exactly what exists. A custom-built gallery page, for example, would only load exactly the code needed for it to function on that specific page. No empty pages, no redundant endpoints and no bloated plugin junk inflating your site.
Faster Sites Use Less Energy
Page speed and environmental impact are directly linked. A site that loads in 1.5 seconds is going to use much less energy than one that takes 4 seconds. That’s because it’s doing less work: fewer requests, less data transferred, less processing time on both the server and the visitor’s device.
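To make the link concrete, here’s a small sketch of the arithmetic. The energy-intensity figure is an assumption for illustration, loosely in line with published web-carbon models – real figures vary widely by study and year, so treat the numbers as indicative, not precise:

```python
# Rough sketch of the relationship between page weight and energy use.
# KWH_PER_GB is an ASSUMED energy-intensity figure (network + data
# centre + device) for illustration only; published estimates vary.

KWH_PER_GB = 0.81


def energy_per_visit_kwh(page_weight_mb: float) -> float:
    """Estimate the energy consumed by a single page view."""
    return (page_weight_mb / 1024) * KWH_PER_GB


def annual_saving_kwh(before_mb: float, after_mb: float, monthly_visits: int) -> float:
    """Energy saved per year by slimming a page from before_mb to after_mb."""
    per_visit = energy_per_visit_kwh(before_mb) - energy_per_visit_kwh(after_mb)
    return per_visit * monthly_visits * 12


# Example: trimming a 3 MB page to 1 MB on a site with 50,000 visits/month
print(round(annual_saving_kwh(3.0, 1.0, 50_000), 1))  # → 949.2 kWh/year
```

Even with generous error bars on the intensity figure, the direction of travel is clear: lighter pages, multiplied across every visit, add up.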
Google’s own performance metrics, Core Web Vitals, can be used alongside other SEO and marketing efforts to help secure better visibility in search engines. Businesses are often happy to spend time and budget on SEO services to improve them, but don’t always realise how much a cheaper site build can hold them back in the long run.
While a bespoke web build means a higher upfront cost, making the same investment in clean code as you would in other areas of your marketing benefits the planet and your visibility in search. This isn’t a trade-off. It’s alignment.
SEO and Crawl Budget
Search engine optimisation has a sustainability factor that rarely gets discussed. Every time Google, Bing or any other search engine crawls your website, it deploys automated bots that consume computing resources. That infrastructure requires energy to run. How efficiently your site can be crawled directly affects how much of those resources are used on your behalf.
This is what SEOs call crawl budget: the number of pages a search engine will crawl on your site within a given period. Larger sites with strong authority get more frequent crawl visits while smaller sites tend to get fewer. Wasting that budget on pages that shouldn’t be crawled is a lose-lose – bad for your SEO and bad for the environment.
So, How Can We Make SEO Greener?
Tech giants like Google have ambitious carbon-free goals, which is likely to mean fewer crawling resources and longer processing times. So how can we reduce the environmental impact through SEO?
Reduce the Number of Pages that Need to be Crawled
The most impactful thing you can do is have fewer low-value pages on your site. Thin content, duplicate entries, boilerplate category pages with minimal original text, auto-generated tag archives – all of these consume crawl budget without returning any meaningful value for you or your visitors. Auditing and consolidating your content architecture so that every URL earns its place is both an SEO win and a sustainability one.
Fix Redirect Chains
A redirect chain occurs when URL A redirects to URL B, which then redirects to URL C. Every hop in that chain requires an additional server request. For a crawl bot, this means more processing time and more energy per URL resolved. For users, it means slower page loads. Resolving chains down to a single direct redirect or, better still, making sure all links resolve to the final URL, eliminates those unnecessary requests.
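As a minimal sketch of the fix, suppose your redirects are held as a simple from-to mapping (for example, exported from a redirect plugin or server config – the URLs below are hypothetical). Flattening the chains means rewriting every entry to point straight at its final destination:

```python
# Sketch: collapse redirect chains, assuming redirects are stored as a
# simple {from_url: to_url} mapping. URLs are hypothetical examples.

def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Rewrite every redirect to point directly at its final destination."""
    flat = {}
    for start in redirects:
        seen = {start}
        target = redirects[start]
        # Follow the chain until the URL no longer redirects,
        # guarding against redirect loops.
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flat[start] = target
    return flat


chain = {
    "/old-page": "/interim-page",
    "/interim-page": "/new-page",
}
print(flatten_redirects(chain))
# Both URLs now point directly at /new-page: one hop instead of two.
```

Updating internal links to use the final URL directly removes even that single hop.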
Resolve Duplicate Content
Duplicate content is one of the most common sources of crawl waste. The same content appearing at multiple URLs forces search engines to crawl, process and compare multiple versions of essentially the same page. Whether it’s caused by URL parameters, trailing slashes, HTTP/HTTPS or www/non-www variants, cleaning up that duplication reduces the number of URLs that need to be processed.
Noindex Low-Value Pages
Not every page on your site needs to be in the index. Pages that your site needs to function but that search engines don’t need to find – form submission and thank-you pages, test pages, internal search results and some types of WordPress tag or archive pages – can all be noindexed. A noindex directive tells search engines not to include a page in their index. Note that indexing is only one step of the process: these pages will still use crawl resources, but they’ll be treated as less important and crawled less frequently.
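For illustration, a noindex directive is typically added as a robots meta tag in the page’s head (most WordPress SEO plugins can set this for you, and the same directive can also be sent as an X-Robots-Tag HTTP header):

```html
<!-- In the page's <head>: keep the page out of the index,
     but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```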
Block Pages from Being Crawled Entirely
Blocking a page in the robots.txt file prevents crawlers from accessing it at all. In some cases, this is the most efficient outcome: no server resources consumed, no crawl budget spent, no energy used processing a URL that was never going to help anyone. This approach is often used for admin or login areas or to block default generation like /feed/ pages.
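As an example, a lean WordPress robots.txt might look something like the sketch below. The paths are illustrative – check what your own site actually serves before blocking anything, as a wrong Disallow rule can hide pages you want found:

```txt
# Illustrative robots.txt for a WordPress site
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /feed/
Disallow: /*?s=
```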
A clean, well-structured robots.txt combined with a lean site architecture means search engines spend their limited resources on the pages that matter. That’s better for rankings, and it’s better for the environment.
How to Make Digital Marketing Greener
While we’ve gone into more detail on WordPress and SEO, taking a greener approach to all digital marketing is still possible:
Email marketing – Sending to unengaged lists isn’t just bad for deliverability. It’s a waste of server resources and energy. Regularly review email lists to remove old or bounced email addresses.
Digital advertising – If you’re still using multiple tracking and retargeting scripts, you’re wasting resources. Consolidating everything through Google Tag Manager is a good practice.
Video and media – Hosting large, uncompressed videos on your own server or website is energy-intensive. Properly compress all images and video before uploading and consider hosting large videos on YouTube or Vimeo and embedding on your website instead.
Clean Code is a Business Decision
For businesses and in-house teams thinking about how to make their digital practice more sustainable, the answer isn’t complicated. It’s the same answer we’d give for performance, for SEO and for user experience: do less, but do it better.
Invest in custom builds where you can. Audit and clean your URL architecture. Fix technical issues and think critically about what actually needs to exist on your website before you publish a new page.
None of this requires a sustainability budget or a carbon offsetting programme. It just requires understanding about how the web impacts the environment and what changes you can make when working with digital agencies or marketing teams.
Interested in auditing the environmental footprint of your current website? Get in touch – we’d love to help.