Category: Sobriété du web

12 reasons not to eco-design a digital service

Reading Time: 6 minutes

The ecodesign of digital services has been gaining momentum in recent years. However, this is not happening without a hitch. The aim of this article is to examine the obstacles to the adoption of ecodesign. It draws heavily on Perrine Croix’s article (12 excuses de l’inaction en accessibilité et comment y répondre [FR]), which is itself adapted from an article by Cambridge University (Discourses of climate delay [PDF]).

The excuses

1. It costs too much

That’s true, and it’s to be expected, at least for the time it takes to build up skills and embed the approach until it becomes second nature. But ecodesign is also a way of reducing costs: de-prioritizing functionalities considered not very useful, used or usable because of their environmental impacts, producing a digital service that’s easier to maintain, and reducing the infrastructure needed to run and store it. This is where DevGreenOps meets FinOps.

Last but not least, ecodesign is a quality issue, which has the advantage of reducing the total cost of ownership: the service can be used on a wider range of devices, is less costly to maintain, and so on.

2. It takes too much time

Yes, implementing ecodesign takes time, but continuous improvement and automation can help (a lot). Similarly, prioritization based on environmental aspects can reduce the time needed to design the digital service by limiting it to what is strictly necessary for the user: removing functionalities from the scope, simplifying user paths, etc.

3. The team doesn’t have the necessary skills

It’s indeed possible that some employees don’t yet have all the ecodesign skills they need. However, as we often hear, much of what we need to know is based on common sense, or has already been covered for accessibility or performance. Online resources abound, whether in the form of articles, tools or sets of guidelines. What’s more, this upskilling often enhances the motivation of the people concerned, and may even attract those wishing to join an organization more committed to this trajectory.

Of course, you also have the option of hiring experts in the field to help you and your employees improve their skills.

4. There’s no consensus on estimating the environmental impacts of a digital service

That’s true, and today there are several models of environmental projection that co-exist. ADEME (the French Environment and Energy Management Agency) is currently tackling the subject from the angle of RCP (methodological guidelines on how to communicate about environmental impacts). However, the absence of consensus should not be a brake on action. Even if they have known limitations, tools and guidelines do exist. And in the broader context of climate change, action is urgently needed.

But beware: depending on the methodology and tools used, results can vary widely, and certain areas of over-consumption (and potential for optimization) may go undetected. See this article: https://greenspector.com/en/reduce-the-weight-of-a-web-page-which-elements-have-the-greatest-impact/

5. No need for ecodesign, we’re already considering performance

Ecodesign meets performance in many respects, especially when it comes to technical optimization. If you’re already heavily involved in the performance of digital services, chances are you’re helping to reduce their environmental impacts. However, there are also some areas of divergence between the two approaches. Performance is sometimes improved by delaying the loading or execution of some resources, or by anticipating their loading in case the user performs certain actions. This improvement in efficiency and perceived performance should not make us lose sight of the need for sobriety (doing less of the same thing: fewer images, for example) and frugality (giving up on something: a conversational bot based on artificial intelligence, for example).

It remains questionable to try to make a component faster if doing so degrades accessibility, ecodesign or any other aspect of Responsible Digital.

Performance is one indicator of sustainability, but it shouldn’t be the only one.

6. The best ecodesigned site is the one that doesn’t exist

That’s true, but it’s not necessarily the one that best meets the needs of your users. To make sure of this, the RGESN (the French general framework for the ecodesign of digital services) [PDF] proposes, in its first criterion, to assess the environmental impacts in light of the actual need for the service (criterion 1.1, notably via the Designers Éthiques questionnaire [FR]). It is also important to ask yourself these questions and to take measurements before considering the redesign of a digital service (whose environmental impact could otherwise end up worse).

7. An ecodesigned site is bound to be ugly

This issue is also often raised for accessible websites. Both are first and foremost design constraints that can enhance creativity, and there are a growing number of examples of this. An article on this subject is available: https://greenspector.com/en/does-a-sober-site-have-to-be-ugly/

8. It’s a matter for developers

While developers are indeed concerned, they are not the only ones. Their scope of action is often limited to technical efficiency and optimization. They will often be constrained by decisions that do not depend on them: limiting the number of media or third-party services, choosing a hosting provider with less impact, removing some functionalities, etc. The impact of code is far smaller than that of design choices. The developer’s skills remain essential to the success of the ecodesign approach, but the approach is much broader than that, both in terms of the digital service’s lifecycle and the roles involved.

In addition, developers may have a duty to advise on overconsumption linked to functionality, ergonomics, graphic choices or even published content.

9. The environmental impacts of digital technology are negligible compared to other sectors

Studies published in recent years tend to prove the contrary, for example: https://presse.ademe.fr/2023/03/impact-environnemental-du-numerique-en-2030-et-2050-lademe-et-larcep-publient-une-evaluation-prospective.html [FR]

The environmental impact of digital technology appears non-negligible, particularly due to the impact on user terminals, the manufacture of which is critical from this point of view. What’s more, given the urgency of the climate change situation, we need to act as holistically as possible. All the more so as the benefits of ecodesign for the user experience are increasingly well documented.

10. Greenwashing

There are two aspects to consider here. For some people, ecodesign of digital services is greenwashing, because the impact of this approach is negligible. To convince you otherwise, I refer you to the previous point on the environmental impacts of digital. As far as the results of this approach are concerned, you can find testimonials and feedback on the Greenspector website: https://greenspector.com/en/resources/success-stories/

The second aspect concerns organizations that fear being accused of greenwashing if they communicate about the ecodesign of their digital services. This may be because these organizations are also responsible for much greater environmental impacts, for example in their core business. To avoid this, it’s essential to pay close attention to communication. This means drawing on existing, recognized reference frameworks, and making the approach, the tools used and the action plan as explicit as possible. The aforementioned RGESN can be used to structure your approach and build your ecodesign declaration. Furthermore, the ecodesign of digital services must not replace other efforts to reduce the environmental impact of the organization. Lastly, you’ll need to rely on ADEME’s RCP (see above) when they become available, in order to comply with the suggested elements for appropriate communication on the subject.

11. My customers aren’t interested

Users are increasingly sensitive not only to environmental issues, but also to the way in which their digital services are designed. Ecodesign is also about improving the user experience, which is an essential but often unexpressed need for most of them. For example, taking action to limit phone battery discharge is an important lever for ecodesign, all the more so as battery level is increasingly a source of anxiety for users (https://www.counterpointresearch.com/insights/report-nomophobia-low-battery-anxiety-consumer-study/). Similarly, ecodesign can improve the user experience on a degraded connection or an old terminal, or even help improve accessibility. Conversely, beware of actions sometimes taken to improve the user experience without consulting users (and without taking the consequences on environmental impacts into account), such as adding auto-play videos or carousels to make a page more attractive. More generally, communicating on ecodesign also serves to highlight expertise and a commitment to product quality.

12. In the absence of legal constraints on ecodesign, we prefer to prioritize accessibility

Improving the accessibility of your digital services is essential, especially as it often helps to reduce their environmental impacts. The reinforcement of obligations linked to the accessibility of digital services at European and French level marks a turning point for many structures. In particular, the scope of application has been extended, and financial penalties have been increased.

To date, there is no similar system for the ecodesign of digital services. ARCEP (Autorité de Régulation des Communications Electroniques, des Postes et de la Distribution de la Presse) is responsible for the aforementioned RGESN, but the subject of legal obligations has been taken up directly at European level via BEREC (Body of European Regulators for Electronic Communications). It is to be hoped, therefore, that the subject will progress rapidly so that legal obligations can emerge; perhaps some countries will take the initiative. In any case, there’s no need to wait for this type of mechanism before embarking on an ecodesign approach. Its benefits are becoming better known, documented and even measured. The aim is not only to improve the user experience, but also to enhance the skills of teams, and even to increase the attractiveness of the company for recruitment and potential customers. While ecodesign has been a differentiating factor for some years, its increasingly widespread adoption tends to turn it into a baseline expectation. As more and more organizations take up the subject, lagging behind is undesirable, whatever the reasons.

Conclusion

There may be many reasons not to engage in ecodesign, not least because the subject may seem intimidating or low-priority. It’s important to bear in mind that ecodesign improves not only the user experience, but also other aspects of the digital service. Its integration into a project should be gradual, the essential point being to adopt a continuous improvement approach. Results won’t be perfect overnight, but the first step can be simple and effective, and results and skills improve over time.

Analysis of overconsumptions on a light website

Reading Time: 7 minutes

In May 2024, on the Designers Ethiques Slack, Julien-Antoine Boyaval of the web agency Konfiture shared a site created for Leroy Merlin. This single-page site is presented as ecodesigned: https://lesdesignersdedemain.com/

At first glance (via web browser tools), the site does indeed appear rather light. However, certain elements catch my eye. More on that later.

As usual, I launched a benchmark with Greenspector Studio to take things a step further.

Analyzing the site’s overconsumption

The measurements were carried out on a Samsung Galaxy S9 phone, over WiFi (3 iterations).

After measurements, the results confirm the initial suspicions:

  • EcoScore: 59/100 (Network: 82, Client: 35)
  • Data transferred: 292 kB
  • Total battery discharge: 5.28 mAh
  • CPU usage: 1.11%

Data transfers are indeed low and, as a result, the score on the Network side is very good.

Original site results via Greenspector Studio: EcoScore 61/100

On the other hand, the Client-side score is low, which correlates with high battery discharge and high CPU impact (especially for such a light, static page). Generally speaking, this can be due to third-party services, animations or even calculations (mainly JS) performed in a loop.

Let’s start by looking at what happens when the user is inactive, via Greenspector Studio:

Observation via Greenspector Studio of CPU and data transferred over a pause stage: 3 peaks of data transferred, several CPU-related peaks

We noticed 3 data peaks that are probably directly linked to Chrome (which collects usage metrics and regularly checks the functionalities offered by the browser version).

This hypothesis was then investigated using a web proxy (as the requests in question did not appear in the browser). This confirmed that these requests were indeed linked to Chrome.

On a heavier site, these requests may go unnoticed, but not here.

The methodology used is based on that described here: https://greenspector.com/en/how-to-audit-android-mobile-application-requests/

But above all, we need to question the strong fluctuations in CPU activity. There are a few animations on the site, but most of them are only triggered by scrolling, so they shouldn’t have a direct impact on the CPU when the user is inactive and the animations are not running.

We therefore turn to the Performance tool in Chrome’s developer tools: https://developer.chrome.com/docs/devtools/performance

Observation of a pause stage via Chrome’s Performance tool: several solicitations due to animations

Looking at 10 seconds of inactivity, we can see that the processor is kept very busy, with a large number of events processed continuously. This turns out to be caused by a large number of JS listeners and observers waiting for user interactions in order to trigger animations.

All this is managed by a widely-used library: GSAP.
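To make the problem more concrete, here is a deliberately simplified sketch (not the site’s actual code, which relies on GSAP) of the kind of JS-driven animation loop that keeps the processor busy even while the user is idle: a requestAnimationFrame callback that reschedules itself and recomputes styles on every frame, visible or not.

```typescript
// Illustrative sketch only (not the site's actual GSAP code): a self-rescheduling
// animation loop that recomputes styles roughly 60 times per second, whether or
// not the animated elements are on screen and whether or not the user is active.
const animated = Array.from(document.querySelectorAll<HTMLElement>(".animated"));

function tick(time: number): void {
  for (const el of animated) {
    // Work done on every frame, even during "inactivity".
    const offset = Math.sin(time / 1000) * 10;
    el.style.transform = `translateY(${offset}px)`;
  }
  requestAnimationFrame(tick); // reschedules itself: the loop never rests
}

requestAnimationFrame(tick);
```

Add scroll listeners and observers on top of such a loop and you get the continuous event processing seen in the Performance tool.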

Having reached this point, and before going any further, I contacted Julien-Antoine directly to schedule a time to present my findings to his team.

After a few exchanges, it appeared interesting to work together on this subject. The aim is to see how we can reduce the impact of the page through analysis and action. To do this, we decided to proceed in an iterative way: proposing an initial list of recommendations and applying them one by one, so as to be able to estimate the impact of each one through measurement.

Experimentation around the site

First of all, we need to make sure that the badge displayed on the site, taken from Website Carbon Calculator, is not to blame (which would be rather ironic). To check this, the badge is integrated into an otherwise empty HTML page and measured with a benchmark.

The EcoScore is 95, the data transferred is very low (a simple JS script of less than 2 kB retrieves everything needed for display in a single operation) and the impact on the processor is negligible (around 0.25% CPU load).

The badge is therefore found not guilty.

At the same time, the Konfiture team deployed the site under study on a separate server, which hosts the different versions produced. An initial measurement was carried out to set the baseline for the rest of the project, as certain metrics may vary depending on the site’s hosting conditions.

The first version measured removes the Lenis library, which partly manages animations.

Version 1.0.2 corresponds to the further optimization of SVG (vector graphics). The result is a slight reduction in transferred data.

Version 1.0.3 adds native progressive loading for SVGs, as well as the implementation of a CDN and compression (brotli) of text files (including SVGs). The result is a significant reduction in data transfer.
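The article doesn’t detail how the CDN and Brotli compression were set up (the “native progressive loading” presumably corresponds to the browser-native loading="lazy" attribute on image elements). As a rough sketch of the same idea, text assets, including SVGs, can be pre-compressed at build time with Node’s built-in zlib module; the file paths below are hypothetical.

```typescript
// Sketch only: pre-compressing text assets (including SVGs) with Brotli at build
// time, using Node's built-in zlib module. The real site serves Brotli through a
// CDN; the file list below is purely illustrative.
import { readFileSync, writeFileSync } from "node:fs";
import { brotliCompressSync, constants } from "node:zlib";

const textAssets = ["dist/index.html", "dist/styles.css", "dist/hero.svg"]; // hypothetical paths

for (const path of textAssets) {
  const source = readFileSync(path);
  const compressed = brotliCompressSync(source, {
    params: { [constants.BROTLI_PARAM_QUALITY]: 11 }, // maximum compression, fine for static files
  });
  writeFileSync(`${path}.br`, compressed);
  console.log(`${path}: ${source.length} B -> ${compressed.length} B`);
}
```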

Version 1.0.5 removes all animations. For the end customer, this is not an option, as animations are considered essential to make the site more attractive. But once the other elements have been optimized, this measure gives us a target to aim for. Here, we can see a reduction in data transfer (less JS required), but above all in CPU usage (which remains one of the metrics most affected by animations, due to the calculations required).

To go further on this subject, I refer you to two other articles on this blog:

Version 1.0.6 does away with JS code for managing animations. The problem is that the animations then run continuously. Even if, technically, this approach has less impact on the processor (which can easily be verified with Chrome’s Performance tool), it degrades the user experience and poses an accessibility problem.

After discussion, this point proved to be prohibitive: while CSS-only animation management is a good compromise for environmental impacts, degrading accessibility must be avoided.

Initial results do not correspond exactly to expectations. After analysis, it appeared that having continuous animations hindered the detection of inactivity during measurements and artificially prolonged the scroll time.

As a result, version 1.0.7 already offers a first option: use the browser’s prefers-reduced-motion setting to, at the very least, disable animations for users who have asked for it. Failing the ability to disable automatic playback of animations, it would be necessary (to be compliant) to reduce their duration to less than 5 seconds (or even 4 seconds, to comply with criterion 4.1 of the RGESN [FR]: https://www.arcep.fr/mes-demarches-et-services/entreprises/fiches-pratiques/referentiel-general-ecoconception-services-numeriques.html#c36264) and/or propose a means of control to pause them. This point is still under discussion.
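As an illustration of the prefers-reduced-motion option (this is not Konfiture’s actual 1.0.7 implementation), JS-driven animations can simply be skipped when the user has asked the system for reduced motion; the same media query can also be used directly in CSS.

```typescript
// Minimal sketch (not the actual 1.0.7 implementation): honor the user's
// prefers-reduced-motion setting before initializing JS-driven animations.
const reducedMotion = window.matchMedia("(prefers-reduced-motion: reduce)");

function initAnimations(): void {
  if (reducedMotion.matches) {
    return; // the user asked for reduced motion: leave the page static
  }
  // ...initialize GSAP or other animation code here...
}

initAnimations();
// Re-evaluate if the preference changes while the page is open.
reducedMotion.addEventListener("change", initAnimations);
```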

To take things a step further, version 1.0.8 seeks to reconcile ecodesign and accessibility. To this end, it was decided to limit the duration of animations and to trigger them only on scroll, so that they remain visible when they play.
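The article doesn’t show how version 1.0.8 triggers its animations; a generic way to implement “short animations triggered only on scroll” is to gate them behind an IntersectionObserver, as in this sketch (class names and thresholds are invented):

```typescript
// Generic sketch of the 1.0.8 idea (not Konfiture's actual code): play a short,
// bounded animation only when an element scrolls into view, then stop observing
// it so that no JS work remains once the animation has run.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      entry.target.classList.add("animate-once"); // CSS animation kept under 5 s (see RGESN 4.1)
      obs.unobserve(entry.target);                // no further callbacks for this element
    }
  },
  { threshold: 0.25 } // fire once a quarter of the element is visible
);

document.querySelectorAll(".animated").forEach((el) => observer.observe(el));
```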

Results

The following results have been obtained from measurements taken over time:

Measurement results for the different versions: the page without animations is the least impactful, followed by the one where animations are triggered on scroll

Environmental projections for the different versions: the ranking remains more or less the same as for the measurements, with the most advantageous option being to avoid animations altogether

First of all, it’s worth remembering that the impact on CPU, memory and battery discharge depends heavily on the model of device used for the measurement, and can even vary between two devices of the same model. For this reason, each measurement also includes a reference step, not shown here. For web pages, this reference step consists of measuring what happens when the user is inactive on a Chrome tab displaying an entirely black page (minimal energy impact, especially compared with an empty Chrome tab, which is very bright and therefore more impactful on a device with an OLED screen).

Results for the final version of the site (EcoScore 70/100)

Measurements on such light sites are often more complicated, as deviations and overconsumption may be slight, or even hard to distinguish from measurement artifacts. Sometimes it’s possible to get around this by adapting the methodology: to measure a very light component, for example, we integrate it 100 or 1,000 times on the page and proceed in the same way with the other components we want to compare.
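As a concrete illustration of that amplification trick (element and container names are invented), the component under test can simply be instantiated N times on a bench page so that its consumption rises above the measurement noise:

```typescript
// Sketch of the amplification trick described above: instantiate the component
// under test N times so its consumption becomes measurable. Names are invented.
const N = 1000;
const container = document.getElementById("bench-area"); // hypothetical test container

function renderComponentUnderTest(): HTMLElement {
  const el = document.createElement("div");
  el.className = "component-under-test";
  el.textContent = "sample component";
  return el;
}

if (container) {
  for (let i = 0; i < N; i++) {
    container.appendChild(renderComponentUnderTest());
  }
}
```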

Applying the animations continuously lengthened the scroll time considerably (17 seconds instead of 6), which directly increases the energy and environmental impact.

For such lightweight sites, Chrome’s “parasitic” requests (telemetry, variant checking) appear all the more impactful, even if they only transfer a few kB or tens of kB of data.

In our case, the best solution for limiting the impact of animation integration is version 1.0.8. This benefits from the implementation of the following best practices:

  • Extensive SVG optimization (including compression and lazy-loading)
  • Limiting the duration of animations, stopping them for users who choose to do so, and triggering them only on scroll.

Overall, in terms of the number of requests and transferred data, the gains are undeniable (even if the site was originally very light).

In terms of battery discharge speed, the gains are not negligible. Even if environmental impact and energy consumption appear to be the same overall, or even slightly higher (due to the increase in scroll time), the results are encouraging.

Conclusion

As already emphasized in the article on sober sites, estimating a site’s sobriety is a complex task, since it takes into account many factors as well as a specific methodology. Even on a site announced as sober, there are often improvements to be made (even if not all of them are worthwhile).

Once again, the subject of animations comes up. Sometimes used to compensate for a reduction in the number of images, they very often have an impact, even if free tools hide it (by concentrating on the data transferred during page loading). When we want to go further and integrate them as efficiently as possible, the results to date are not necessarily conclusive. The priority should be frugality (getting rid of animations), then sobriety (reducing their number) and finally efficiency (optimizing their integration). In any case, for accessibility reasons in particular, their use must at the very least remain under the user’s control (see criterion 4.1 of the RGESN).

As for the efficient integration of animations, everything remains to be done. This is a very complex area to tackle, as the metrics to be taken into account are numerous and complex to measure and compare (CPU, GPU, battery discharge, etc.). Add to this the risks of impact transfers (opting for CSS rather than JS or vice versa) and you end up with a technical subject that is thorny, to say the least. However, here we note that limiting their duration, coupled with simple logic for triggering them, brings the best results.

Today’s standards and knowledge allow us to set out how to make an animation compliant from the point of view of accessibility. For ecodesign, however, this is not yet the case (even if the RGESN suggests a few insights). To my knowledge, there is no universal solution for proposing animations that do not lead to over-consumption.

So, from a very pragmatic point of view, it’s best to return to a simple but important approach: avoid integrating animations whenever possible, for reasons of accessibility as well as ecodesign (and more generally, user experience).  

RGESN / REEN law: what are we talking about?

Reading Time: 9 minutes

The subject of the environmental impact of digital technology has been gaining momentum in recent years, particularly in France, where it benefits from the rapid establishment of a structuring legal framework. This topic was discussed in another article on the Greenspector blog: https://greenspector.com/fr/le-cadre-legislatif-de-lecoconception-de-services-numeriques/

As a company seeking to reduce the environmental and societal impacts of digital technology, Greenspector is keen to explore this subject in detail. Here, we’d like to take a brief look at the REEN law (Reducing the Environmental Footprint of Digital Services), before moving on to the RGESN (Référentiel général d’écoconception de services numériques).

REEN law framework

The REEN law requires towns and cities with more than 50,000 inhabitants to define their Responsible Digital strategy by 2025. This necessarily includes elements linked to the ecodesign of digital services. However, local authorities often run into a first obstacle: the ecodesign of digital services is still a relatively recent subject. As a result, it can be difficult to find one’s way around, whether choosing a measurement tool, a guide or a reference framework that will enable effective progress on the subject.

This is why another aspect of the REEN law is eagerly awaited by many: the definition of legal obligations for the eco-design of digital services. This should take the form of 2 items:

  • The RGESN, which we’ll look at in more detail in this article.
  •  An implementing decree that defines who is subject to these obligations, and with what constraints (what types of digital services, what deadlines for implementation, what deliverables are expected, etc.).

The reference to bind them all together: the RGESN

Its origins

In 2020, the INR (Institut du Numérique Responsable) brought together a hundred (!) experts to work on a reference framework for the eco-design of digital services. The aim: to offer recommendations covering all types of digital services, at all stages of the lifecycle and for everyone involved. In short, a holistic approach. A colossal project, it neared completion in the summer of 2021 and gave rise to GR491, which currently comprises 61 recommendations and 516 criteria. It is due to be updated once again in the near future. To date, it is a unique reference worldwide.

Just before the framework went online, DINUM (Direction interministérielle du numérique) stepped in. Its objective was simple, and entirely relevant: to build on the work already done and create its own reference framework. This is how, in autumn 2021, two frameworks came into being: GR491 and the RGESN.

There have already been two versions of the RGESN: the first proposed by DINUM, then a new version put out to public consultation by ARCEP (Autorité de régulation des communications électroniques, des postes et de la distribution de la presse) at the end of 2023.

The final version is scheduled for release in early 2024, and may already have been released by the time you read this.

Its role

The existing versions of the RGESN already highlight its specific features. For accessibility, the RGAA (Référentiel général d’amélioration de l’accessibilité) makes it possible to check the accessibility of a digital service, based on criteria derived from the WCAG (Web Content Accessibility Guidelines) issued by the W3C (World Wide Web Consortium). The French legal framework also requires compliance to be demonstrated by means of an accessibility declaration, as well as the publication of a multi-year digital accessibility plan for the entity. All these elements can be consulted here: https://accessibilite.numerique.gouv.fr/

In the case of the RGESN, the notion of an ecodesign declaration is included directly in the framework, and its content is detailed throughout the criteria. However, this framework is not based on an international standard. Indeed, the WSG (Web Sustainability Guidelines (WSG) 1.0 [EN]) were published by the W3C after the RGESN. As a result, the WSG are partly based on the RGESN and not vice versa.

In the case of the RGESN, the ambition is not so much to “verify” that a digital service is eco-designed, as to check that an eco-design approach has indeed been implemented. This makes it possible to involve all stakeholders in the process (including the host and third-party service providers, as well as questioning the strategy and even the business model), and to adopt a continuous improvement approach. This approach is ambitious, but it is also linked to the fact that it is complicated, if not impossible, to establish factually (via purely technical criteria) whether a digital service is eco-designed or not. Rather, it’s a matter of ensuring that it is part of an eco-design approach.

Contents

V1 (the DINUM version)

In its first version, the RGESN proposes 79 recommendations divided into 8 families.

Each recommendation takes the following form:

  • Objective
  • Implementation 
  • Means of testing or checking

So, for example, the first recommendation of the standard is entitled “1.1 Has the digital service been favorably evaluated in terms of utility, taking into account its environmental impacts?”

  • Its “Objective” is to ensure that the digital service we are seeking to eco-design does indeed contribute to the Sustainable Development Goals (SDGs).
  • To this end, the “Implementation” section suggests a few ways of checking this, as well as the elements to be specified in the ecodesign declaration.
  • The “Means of testing or checking” section summarizes what to look for to ensure that this criterion is met.

Here we come to one of the limits of this version of the standard: the objective is laudable, but it lacks concrete means of verification and implementation.

Other points have been raised by experts in the field, but the tool remains important, and many are taking it up to test it in the field.

The framework defines a number of elements for structuring the eco-design approach, in particular:

  • Appointment of a referent
  • Drawing up an ecodesign declaration (with full details of its content)
  • Implementation of a measurement strategy, in particular the definition of an environmental budget, aiming among other things at wider compatibility of the service in terms of browsers, operating systems, terminal types and connectivity

The tools that accompany the framework (a browser extension, Excel spreadsheet templates used as audit grids) are welcome, but sometimes insufficient in the field. This is particularly true when carrying out multiple audits on different digital services, or building a comprehensive action plan.

It is to address all this that ARCEP proposed its own version of the RGESN [PDF, 1.6 MB].

V2 (ARCEP’s version)

This version was put out to public consultation two years after the first version.

It introduces a number of significant changes:

  • The number of criteria has risen from 79 to 91, notably thanks to the addition of a “Learning” section (relating to machine learning) which introduces 5 new criteria.
  • In addition to “Objective”, “Implementation” and “Means of test or control”, 3 new attributes appear:
    • Difficulty level
    • Priority level
    • Non-applicability criteria

As a result of the addition of the priority level, the recommendations are first grouped by priority. 20 of them have been identified as priorities, in particular all those related to the new Learning section.

Beyond these contributions, the new version differs from the previous one in being more operational: it aims to provide concrete elements to facilitate the implementation of recommendations.

For example, we find the same 1.1 criterion presented in a more complete way:

  • Action identified as a priority and easy to implement, no cases of non-applicability
  • Objective more or less identical
  • More contextual information to go further in the process of verifying the contributions of the digital service in terms of environmental (and societal) impacts.
  • Concrete control tools: the Designers Éthiques questionnaire and the consequence tree as formalized by ADEME (Agence de l’Environnement et de la Maîtrise de l’Energie). This consequence tree is used again later, in Criteria 2.1, as part of design reviews.

The criterion relating to the ecodesign declaration has disappeared. The ecodesign declaration is nonetheless essential, and its content has been defined in various recommendations.

Another element emerging from this new version is the implementation of a measurement strategy via the definition of environmental indicators (at least primary energy, greenhouse gas emissions, blue water consumption and depletion of abiotic resources), together with a strategy for reducing them and an environmental budget defined through thresholds. This measurement strategy should also include checks that the digital service works correctly on older terminals and operating systems (or even older browsers) and over degraded connections. Through the changes made to recommendation 4.4, this measurement strategy should be extended to cover user paths.
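The RGESN doesn’t prescribe a format for these indicators and thresholds; purely as a hypothetical illustration, an environmental budget for a single user path could be encoded along these lines (every value below is invented):

```typescript
// Purely hypothetical illustration of an environmental budget per user path,
// covering the indicators listed above. All threshold values are invented.
interface EnvironmentalBudget {
  userPath: string;                    // the journey being measured
  maxPrimaryEnergyKJ: number;          // primary energy
  maxGhgEmissionsGCO2e: number;        // greenhouse gas emissions
  maxBlueWaterConsumptionL: number;    // blue water consumption
  maxAbioticDepletionKgSbEq: number;   // depletion of abiotic resources
  minSupportedAndroidVersion: number;  // check on older terminals / OS versions
  degradedNetwork: "2G" | "3G" | "4G"; // connectivity condition to test under
}

const homeToProductPage: EnvironmentalBudget = {
  userPath: "home page -> product page",
  maxPrimaryEnergyKJ: 30,
  maxGhgEmissionsGCO2e: 1.5,
  maxBlueWaterConsumptionL: 0.3,
  maxAbioticDepletionKgSbEq: 1e-6,
  minSupportedAndroidVersion: 8,
  degradedNetwork: "3G",
};
```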

This is where Greenspector can help, both in developing the strategy and in implementing it. This includes not only the measurements themselves, but also the definition of environmental indicators and their calculation, as well as the definition of user paths, terminals and connection conditions. Today, this approach can be applied to websites, mobile applications and connected objects alike.

Some of the new criteria make the link with the RGPD (Règlement général sur la protection des données), the RGS (Référentiel général de sécurité), the IoT (Internet of Things) and open source. Recommendation 2.6 also requires that the environmental impact of software components such as AI and blockchain be taken into account. That said, this recommendation could have been placed directly in the Strategy section.

The Content section provides a wealth of information on content compression formats and methods, enabling us to go even further into the technical aspects of a sober editorial approach.

New criteria also provide information on blockchain, as well as on the asynchronous launch of complex processes.

This is clearly a step in the right direction. There’s no doubt that the public consultation will have yielded an enormous amount of input for an excellent repository, as well as the tools that must accompany it (by improving the browser extension, but above all the Excel template for conducting compliance audits and monitoring them over time via an action plan).

It is already clear from these additions and clarifications that carrying out an RGESN audit will take longer than with V1, which is important in order to take account of the criteria as a whole and thus remove ambiguities as far as possible. While the intentions of RGESN V1 were already good, V2 provides the elements needed to facilitate its adoption and implementation. This version also reflects a high degree of maturity on the subject, making it a resource worth reading in its own right to help teams build up their skills.

What to expect next?

First of all, the final version of the RGESN is eagerly awaited (which is in itself a very positive sign).

It will undoubtedly be an essential tool for structuring eco-design initiatives for digital services. This will enable everyone’s practices to evolve in this area.

The accompanying tools are also eagerly awaited, as they should facilitate audits as well as compliance monitoring over time, notably through the definition of an action plan.

Among other things, the framework requires the publication of a complete ecodesign declaration, which not only raises awareness more widely, but also makes it possible to compare practices and, in doing so, to help this field of expertise evolve.

The big unknown remains the forthcoming application decree, which will set out the framework for the application of the REEN law, based on the RGESN. There are still several unknowns in this respect. Based on what is being done for accessibility (and in particular following the decree of October 2023), questions indeed remain unanswered:

  • Will the use of RGESN be limited to the web or extended to other types of digital services (mobile applications, street furniture, etc.)? At the very least, it would be important to include mobile applications in addition to web sites and applications.
  • What will the penalties be?
  • How long will it take to implement?
  • Which structures will be concerned? Public structures will be the first to be affected, but as with accessibility, it would be interesting to target businesses too. In fact, some of them have already begun to take up the subject, recognizing the value of this reference framework in guiding their eco-design initiatives for digital services.
  • What means will be officially put in place to facilitate the adoption of the RGESN (training, guides, tools, etc.)?

Other, more general questions arise. In particular, how will certain companies and professionals evolve their practices and offers, perhaps in some cases by moving towards auditor roles (or even by training future auditors)? It is also to be hoped that a more complete definition of the eco-design of digital services will lead to the emergence of training courses leading to certification (i.e., skills frameworks validated by France Compétences).

One point of concern remains the declarative nature of the recommendations. The advantage of the RGAA is that it offers a technical, even factual approach (even if certain criteria are sometimes open to interpretation). In the case of the RGESN, the criteria are less factual and less easy to verify, which means they can sometimes rest on the auditor’s judgment. The question of defining methods for validating certain criteria through measurement also remains open.

It will also be interesting to see how all these elements will find an echo beyond France, and how the RGESN will fit in with the possible introduction of new standards and other reference frameworks.

Where does Greenspector fit into all this?

The RGESN is an unprecedented, but above all indispensable, basis for improving our own practices and providing our customers with the best possible support. All the more so as they will soon be obliged to use these standards.

To this end, a number of actions have been carried out:

  • Integrate V1 of the RGESN into our own internal repository of best practices. As the time between V2 and the final version has been announced as being rather short, we have decided to wait for the final version before implementing the modifications. However, this does not prevent us from incorporating these changes into our day-to-day practices, and from taking V2’s contributions further.
  • Incorporate the RGESN into the training courses we offer: present the standard and its context, and propose activities based on it, notably via the rapid and supervised implementation of an RGESN audit. Other standards are also presented for comparison purposes, as well as their use cases.
  • We regularly carry out RGESN audits on behalf of our customers, and centralize information that enables us to track compliance rates and their evolution over time. What’s more, these audits enable us to develop our use of RGESN.
  • We systematically rely on the RGESN during audits and design reviews. Our Ecobuild offer is also evolving. The original aim of this offer was to support a project team from the outset, through training, design reviews, audits, monitoring and, more broadly, expertise. We are now proposing to back up this offer with the RGESN, enabling us to go even further in setting up or consolidating our customers’ eco-design approach.
  • In addition to using the RGESN to audit and improve a site, we also use it as part of our support for a site creation solution, in order to have more global levers, but also to start thinking about which RGESN criteria can be taken into account directly at this level. This reasoning could later be extended to other tools such as WordPress, Drupal and other CMS. The interest here is manifold:
    • Raising customer and user awareness of the RGESN
    • Reassuring customers by taking responsibility for part of the criteria, which could ultimately be a differentiator (we can imagine customers opting for “RGESN-compliant” solutions to meet their legal obligations more easily)
    • Providing the means for users and customers to create less impactful sites

Conclusion 

The RGESN has already established itself as an essential tool not only for the eco-design of digital services, but also for structuring eco-design approaches. As such, it should help everyone to develop their skills in this area. It remains to be seen how the legal framework will facilitate this evolution and, in time, bring about what we hope will be far-reaching changes in the structures concerned.

Environmental analysis of analytics tools: use, impact and responsible choice

Reading Time: 11 minutes

Context

The constant evolution of regulations, such as the RGPD (General Data Protection Regulation) and the REEN (Reducing the Digital Environmental Footprint) law, highlights a paradigm shift in the digital world. Companies and organizations are increasingly aware of the importance of regulatory compliance and the need to reduce their environmental impact. This has far-reaching implications for the tools and technologies used, particularly when it comes to web analytics solutions.

Today, these tools are used on a massive scale to monitor our behavior, and their impact is often underestimated compared with other subjects, such as advertising. The challenges are significant, as tracking is omnipresent across the paths and pages of digital services. At the same time, analyzing the areas users actually visit makes it possible to target the points they pass through most often, and therefore the service’s main impacts. Tracking also helps determine the usefulness of functionalities, encouraging the deactivation of unused functional elements. In this way, judicious use of analytics can bring environmental benefits by avoiding widespread impacts. Optimization and moderation in its use are crucial to minimize systemic impacts.

Choosing the right tools and adopting a good tracking strategy therefore seems to be a key element in the Digitally Responsible approach of your digital service.

In this article, we’ll explore the environmental impact of different solutions for web page tracking, to give you some idea of the impact generated by tracking, and to help you make an informed choice about which solutions to implement, based on their level of sobriety.

Why use Analytics?

Web tracking, also known as web monitoring, is the activity of collecting data on users’ interactions on the Internet, including website visits, clicks, browsing behavior and much more. It enables companies and organizations to analyze and understand users’ online behavior, measure the effectiveness of their marketing campaigns and personalize user experiences.

Web analytics focuses on the measurement and interpretation of website usage data, giving operators a detailed insight into the online activity of their visitors. This practice encompasses a wide range of information, such as:

  • Number of visitors over time, distinguishing between regular visitors and newcomers, as well as the duration of their visit and the pages consulted
  • Traffic sources: whether direct (when a user enters the site address directly), from other websites, from advertising or via search engines
  • Geographical location of visitors
  • Technical details, such as visitors’ operating system, screen resolution and web browser version
  • And much more, depending on the tool used

The initial idea behind web analytics is to collect and analyze this information for a number of reasons:

  • Personalizing the user experience: by gathering data collected in user profiles, these are then used to personalize ads. Instead of showing random ads to users, their profile information, such as age, gender and the sites they have visited in the past, is used to select content that matches their interests. This enables advertisers to focus their budgets on consumers who are likely to be influenced.
  • Security: law enforcement and intelligence agencies can use web tracking technologies to spy on individuals. The unique identification of individuals on the Internet is important in the fight against identity theft and for the prevention of credit card fraud, for example. This subject remains closely linked to the notion of privacy, because of the potential for abuse.
  • Web application usability testing or understanding user behavior: by observing the steps taken by an individual when trying to solve a certain task on a web page, usability problems can be discovered and corrected.
  • Measuring performance and objectives: the aim is to maximize revenues, for example by evaluating which pages generate the most revenue, which banner ads generate the most traffic, or which stages of the order process lose customers.

These motivations support data-driven decision-making. Indeed, the data collected through web tracking helps companies or other entities to make decisions based on proven statistics. Information on user behavior helps to identify potential problems, spot opportunities for improvement and guide decisions on marketing investments, user experience and other aspects of online activity. In particular, this is how the impact of SEO (Search Engine Optimization) or SEA (Search Engine Advertising) can be assessed.

However, retrieving such a mass of information not only generates data traffic and storage for daily or long-term analysis, but also involves processing on the user’s side, whether or not they use the digital service in question. This also involves the risk of temporarily blocking the loading of a website, or failing to respect the user’s consent.

As a site owner/operator, you need to think about the economic, social and environmental impact of these tracking solutions.

While it’s important to collect digital service usage data, you need to keep it to the essentials (which is in line with the RGPD: General Data Protection Regulation).

All the more so as external services tend to weigh down sites, notably via unwanted scripts that collect user data. Examples include Google Analytics, Google Recaptcha (bot detection), Google Maps and FontAwesome.

What criteria should you use to make your choice?

So what criteria should you take into account when choosing an analytics tool? Which solutions can help you make this informed collection?

We won’t go into all the user-requirement criteria (ergonomics, technical support, functionality, etc.), even though they are of prime importance in making the right choice; they simply differ from one organization to another.

It’s important to prioritize tools that rigorously comply with data protection regulations, such as the RGPD. Sensitive user data must be secure and treated confidentially.

When selecting analytics tools, it’s crucial to maintain a smooth user experience that’s accessible to all users.

It’s also important to consider the tool’s ecological footprint. Does the data collected correspond to the stated need? The tool must also be able to evolve with technological advances and changes in the analytics landscape. Do servers and data centers have renewable energy sources and are they managed sustainably?

We’ve also published an article on the environmental commitments of web hosting offers.

It can be difficult to have access to all this information, but it can help refine the search for more respectful solutions. If the tool is transparent about how it collects, processes and uses data, this reflects a commitment to the company’s values. Users need to have a clear understanding of how their data is used.

Selection of solutions and definition of measurement scope

We’ve selected 3 analytics tools that are available free of charge. Here is our selection:

  • Google Analytics 
  • Matomo
  • Plausible

Methodology

Choice of solutions studied

The choice of solutions to be analyzed was made taking into account several key criteria, such as market popularity and cost. The aim was to select solutions representative of the current web analytics landscape, in order to obtain relevant and significant results.

It should be noted that this experimental study is not intended to promote a specific solution, but rather to provide an objective assessment based on concrete data. The results of this study can serve as a reference and decision-making tool for digital players seeking to optimize their web analytics while taking into account environmental and privacy issues.

According to usage statistics provided by HTTP Archive and Patrick Hulce’s third-party service identification tool, Google Analytics, Matomo and Plausible are the most popular web analytics solutions.

                    Google Analytics   Matomo   Plausible
Usage occurrences   9,887,783          11,610   17,628

Study preparation

As part of this comparative study of web analytics solutions, a necessary step is to measure the performance of a reference page that has no web analytics solution implemented, and to measure this same page with pages implementing web tracking solutions. This approach enables us to assess the specific impact of each solution in terms of page performance and consumption (energy, data, etc.). It’s important to note that we’ve deliberately excluded more advanced uses such as Tag Manager or advanced configuration of collected data. In addition, we have taken into account as far as possible the reality of the impact of server-side processing and storage of collected data, as projected by our model detailed in this article. Also excluded is the administrative part of these tools and the analysis of dashboards.

It’s worth noting that Matomo also offers a server-side only solution, which avoids worries about the RGPD (General Data Protection Regulation) in addition to reducing the environmental impact on the client side. We have not evaluated this solution.

We deployed a simple reference web page as well as 3 identical pages on which we implemented the 3 respective solutions. The reference page is a black screen with a standard text font and no script.

User path definition

To measure the activity of Analytics tools, we have established the following path:

  • Step 1: launch browser application
  • Step 2: launch url of page to be measured
  • Step 3: pause (30 sec)
  • Step 4: page scroll

The path consists of launching the browser application (here, Chrome) and entering the URL of the page to be measured (the reference page or one with a solution implemented). The process then pauses for 30 seconds to measure what happens when the user is inactive. Finally, a scroll is performed to detect any additional requests describing the user’s behavior.
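For readers who want to reproduce a similar path on a desktop machine, here is a purely illustrative stand-in using Puppeteer; the actual measurements were taken with Greenspector Studio on a physical Samsung S7, which this script does not replicate.

```typescript
// Illustrative stand-in only: replays the same four-step path (launch browser,
// load URL, 30-second pause, scroll) with Puppeteer on desktop Chrome. The real
// measurements were made with Greenspector Studio on a physical Samsung S7.
import puppeteer from "puppeteer";

async function runPath(url: string): Promise<void> {
  const browser = await puppeteer.launch();                    // step 1: launch the browser
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });         // step 2: load the page to measure
  await new Promise((resolve) => setTimeout(resolve, 30_000)); // step 3: 30-second pause (user inactive)
  await page.evaluate(() =>                                    // step 4: scroll the page
    window.scrollBy({ top: document.body.scrollHeight, behavior: "smooth" })
  );
  await browser.close();
}

runPath("https://example.org/reference-page").catch(console.error); // hypothetical measured page
```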

Measurement context

  • Samsung S7, Android 10
  • Network: 3G, used here to lengthen loading times and enable more measurement points
  • Brightness: 50%.
  • Tests carried out over at least 5 iterations to ensure reliability of results

Assumptions used for environmental projections

  • User location: 2% France, 98% Worldwide
  • Server location: 100% worldwide (when not available for each application)
  • Devices used: 60% smartphone, 38% PC, 2% tablet

The environmental footprint depends on the location of the application’s servers, their type, the location of users and the type of devices they use. We have chosen to study all users, which corresponds to a breakdown of 2% in France and 98% for the rest of the world. This ratio is taken from We are Social’s Digital report. The global report states that 5.16 billion people are Internet users, and the French edition indicates that 53.96 million French people are Internet users.

For the overall breakdown of devices used, the previous year’s report stated a split of around 60% for smartphones, 38% for PCs and 2% for tablets.

What’s the environmental impact?

By carrying out our actual environmental impact measurements for each of our web analytics solutions, we can directly calculate the unit impact of the tool alone on a visit (loading, pausing and scrolling) from which we have subtracted the impact of the reference page. The unit impact shown below is the delta between the black page presented with analytics and the black reference page without analytics implemented.

Solution           Unit impact per visit (g CO2e)   Impact for 10 visits/day per instance over one year
Google Analytics   0.069                            2,490 t CO2e
Matomo             0.012                            508 kg CO2e
Plausible          0.039                            2.5 t CO2e

For each of the analytics solutions, we have assumed that each of the sites using the solutions has a visit frequency of 10 per day.

Google Analytics, which produces 0.069 g CO2e per visit, generates almost 2,500 tonnes of CO2e per year across its 9,887,783 occurrences.

Plausible has a unit impact of 0.039 g CO2e per visit, i.e. 2.5 t CO2e over one year across its 17,628 occurrences.

Finally, Matomo, with 11,610 occurrences and an impact of 0.012 g CO2e per visit, produces 508 kg CO2e per year.
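These annual figures follow directly from the unit impacts, assuming 10 visits per day per site over 365 days; the short calculation below reproduces the orders of magnitude given above.

```typescript
// Reproducing the annual projections above:
// unit impact per visit (g CO2e) x 10 visits/day x 365 days x number of sites.
const tools = [
  { name: "Google Analytics", unitGCO2e: 0.069, sites: 9_887_783 },
  { name: "Matomo",           unitGCO2e: 0.012, sites: 11_610 },
  { name: "Plausible",        unitGCO2e: 0.039, sites: 17_628 },
];

for (const { name, unitGCO2e, sites } of tools) {
  const annualTonnes = (unitGCO2e * 10 * 365 * sites) / 1e6; // grams -> tonnes
  console.log(`${name}: ${annualTonnes.toFixed(2)} t CO2e per year`);
}
// -> Google Analytics ~2,490 t, Matomo ~0.51 t (508 kg), Plausible ~2.5 t
```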

It’s worth noting that the differences are small because the pages measured are very sober. Even so, there is surprisingly little difference between a very business-oriented solution like Google Analytics and Plausible, which is positioned as a lighter alternative in terms of environmental impact. The biggest driver of impact is the sheer volume of use of these analytics solutions.

While the difference in unit impact is very small, at the same utilization rate, some solutions are much more environmentally sober.

It is therefore in our interest to limit the use of these solutions and to favor those with the lowest impact.

For example, if the web services using Google Analytics switched their analytics to Matomo, the environmental impact would be greatly reduced: while the nearly 10 million occurrences of Google Analytics have an impact of 2,490 t CO2e, the same usage with Matomo would amount to 433 t CO2e. That’s almost 6 times less than the impact of Google Analytics!

Especially as Matomo offers a server-side solution. Apart from the privacy benefits of having no intermediary at data collection level and improved performance for website visitors, greenhouse gas emissions are also reduced.

For comparison

Gerry McGovern, user experience expert and author of several books on digital design, including World Wide Waste, calculates the environmental cost of using Google Analytics.

He estimates that:

  • 21.6 kB of data are transferred to Google per visit
  • 50 M sites use Google Analytics according to Marketing Land in 2015 (which does not correspond to our estimates)

For an estimated total of 10 visits per day per website using Google Analytics, this represents 500 million page views per day, and therefore nearly 10,800 GB transferred per day, or close to 4 million GB per year.

According to his research, 1 GB = 4.2 g CO2e. The pollution attributable to Google Analytics would therefore amount to around 16,556 kg CO2e per year.
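His order of magnitude can be reconstructed from those assumptions as follows:

```typescript
// Reconstructing Gerry McGovern's estimate from his own assumptions.
const kBPerVisit = 21.6;               // data sent to Google per visit
const sites = 50_000_000;              // sites using Google Analytics (2015 figure)
const visitsPerDayPerSite = 10;

const gbPerDay = (kBPerVisit * sites * visitsPerDayPerSite) / 1e6; // ~10,800 GB/day
const gbPerYear = gbPerDay * 365;                                  // ~3.9 million GB/year
const kgCO2ePerYear = (gbPerYear * 4.2) / 1000;                    // 4.2 g CO2e per GB -> ~16,556 kg/year
console.log({ gbPerDay, gbPerYear, kgCO2ePerYear });
```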

So, for the simplest use of the tool on a very sober page, Gerry McGovern’s estimates are very low compared to the impact we’ve measured.

However, this estimate is made by taking into account only the weight of the data to make a carbon impact projection, which differs from our methodology.

To go further…

Beyond general considerations of environmental impact, an in-depth technical analysis of the requests generated by analytics tools can provide information on how these solutions operate and interact with websites (request weight, delayed loading, third-party services, etc.).

Here are the measurement values for the path (loading, pause, scroll) of the 3 web pages from which we have subtracted the reference values:

                   Performance (s)   Battery discharge rate (µAh/s)   Mobile data (kB)
Google Analytics   2.3               21,955                           145.9
Plausible          1.6               3,604                            29.1
Matomo             0.4               15,272                           9.2

Unsurprisingly, Google Analytics consumes the most and is the least efficient, followed by Plausible and Matomo. Of the roughly 150 kB of data exchanged over the path, the JavaScript file responsible for sending the request to the Google server weighs over 90 kB. That’s 66 times more than Plausible’s script. Matomo’s script, for its part, weighs over 40 kB.

Page with GA implemented – Firefox Inspector, Network tab

On the other hand, this suggests that the larger the JS file, the more information it retrieves about the user, even if this is not necessarily a direct correlation. Other factors, such as client-side processing or code optimization, can also influence performance and data collection.

Here, a large volume of data is transmitted to the Google Tag Manager platform, even though Tag Manager is not implemented in the page’s code. The difference is obvious with Matomo, which transfers a much smaller volume of data than its competitor.

What’s more, both Google Analytics and Matomo set cookies.

Cookies were originally designed for a simple purpose: storing a user’s log-in information on a given site. They are not problematic in themselves, but they now serve many advertising, marketing and other needs, enabling more targeted content based on user behavior.

So it’s important to look at the size and expiration date of these cookies. Google’s cookies are easily identified by their _ga prefix, while Matomo’s can be identified by their _pk prefix. Google’s cookies have a total size of 80 bytes and expire only 13 months later, which corresponds to the expiration date of advertising cookies. Matomo’s cookies account for 56 bytes, and one of the 2 cookies set expires the same day. In both cases, the relevance of these cookies on such sober pages is questionable.
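As a quick way to see these cookies for yourself (expiry dates are only visible in the browser’s devtools or in the Set-Cookie headers), their names and approximate sizes can be listed from the console:

```typescript
// Quick console check: list analytics cookies (_ga for Google, _pk for Matomo)
// and their approximate size. Expiry dates must be checked in the devtools.
const analyticsCookies = document.cookie
  .split("; ")
  .filter((c) => c.startsWith("_ga") || c.startsWith("_pk"));

for (const cookie of analyticsCookies) {
  const [name] = cookie.split("=");
  console.log(name, `${cookie.length} bytes`); // length in characters, roughly bytes for ASCII values
}
```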

As we’ve seen, Google Analytics is the least efficient and most ecologically damaging solution, especially as the request to Google Analytics is loaded asynchronously. Although asynchronous loading is a common performance practice to avoid delaying page display, it can actually mask the real environmental impact of this solution.

In our measurement process, we sought to obtain a complete view of Google Analytics loading. It’s important to note that Google has implemented various strategies to minimize its impact on website performance. However, despite these efforts, our measurement data reveals that the impact in terms of energy and data transfer remains higher for GA than for its competitors.

The limits of our study

The results of our study have a number of limitations. Firstly, the pages measured are very simple in terms of functionality and visuals, which also implies a simple scenario, which is not necessarily representative of websites equipped with analytics tools. What’s more, due to their sobriety, these pages are very light, and the measurements taken may therefore fall within the margin of error of our measurement tool. Finally, we have very little information on the varying factors of environmental impact (server location, for example).

To conclude

In conclusion, our study of the various web analytics tools highlights some interesting nuances in terms of their environmental impact. It’s important to note that our analyses were carried out on a sober page and a very basic use case, which considerably limits the differences in impact. Even in this context, however, we observe high data volumes, with loading and efficiency techniques that differ from one tool to another. All this for ever more analysis of user behavior, with a significant environmental impact to boot.