Pages have weight, some more than others, and the median size is growing.
Measuring emissions associated with page weight is a relatively new practice and, given the number of factors to be controlled for, any values should be regarded as estimates.
But estimates are sufficient to compare pages across sites, and for development teams to compare versions of their pages over time.
Keeping an eye on emissions also means being aware of the individual requests that make up a page. That number runs into the hundreds for sites that pull in data from many sources and employ third-party tracking and advertising.
The emissions tracker lets visitors view requests and emissions in real time, as the page loads.
Architecture
The extension relies on the service worker to monitor and process requests and the side panel to display them.
The service worker has access to indexedDB (the browser’s built-in database) but not the DOM.
The side panel has access to the Window object and Chrome's APIs.
When a visitor first opens the extension by clicking on the icon, they are asked to refresh the page to begin tracking emissions. Thereafter one of three scenarios takes place.
- Tracking: the extension keeps tracking emissions as new requests come in.
- Reload: when the page is reloaded, we:
  - Clear the indexedDB store
  - Reset the display in the side panel
- Navigation: when the visitor navigates to a new site, we:
  - Clear the indexedDB store
  - Close the side panel
  - Disable request listeners
- Initial state: Every time the visitor opens the side panel by clicking on the extension icon, we show the message "Please reload the page to capture website emissions". This is necessary because we don't have access to 'historical' requests, i.e. requests made before the extension's service worker was initialised.
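The lifecycle rules above amount to a small decision table. Here is a hedged sketch of that table; the event and step names are hypothetical labels for the behaviour described, not Chrome API names:

```javascript
// Which cleanup steps each lifecycle event triggers.
// 'reload' and 'navigate' are hypothetical labels for the two teardown cases.
function stepsForEvent(event) {
  switch (event) {
    case 'reload':
      // Page reloaded: start a fresh capture in the same panel.
      return ['clearStore', 'resetPanel'];
    case 'navigate':
      // Visitor moved to a new site: tear everything down.
      return ['clearStore', 'closePanel', 'disableListeners'];
    default:
      // New requests on the same page: keep tracking.
      return [];
  }
}
```

In the real extension these steps would be dispatched from tab-update listeners; the table only makes the three scenarios explicit.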
Service worker
The extension service worker is responsible for intercepting requests, storing them, processing them (sorting, etc.) and messaging the side panel.
- `fetch(url)`: The service worker intercepts requests using the fetch API.
- `response.clone()`: Clone the response because a Response object can only be consumed once. Cloning keeps the original response intact while a copy is processed.
- `indexedDB`: Save request information to indexedDB so that the data can be read back and processed, e.g. sorted and grouped.
- `chrome.runtime.sendMessage`: The service worker messages the side panel after every request with fresh data.
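The four steps above can be sketched as follows. This is an illustration under stated assumptions, not the extension's actual code: `toRecord` and the commented-out `saveToIndexedDB` wrapper are hypothetical names; only `fetch`, `Response.clone()` and `chrome.runtime.sendMessage` are real APIs.

```javascript
// Turn a fetched response into the record we would persist.
function toRecord(url, response, bodyBytes) {
  // Prefer the declared content-length; fall back to the measured body size.
  const contentLength = Number(response.headers.get('content-length')) || 0;
  return {
    url,
    type: response.headers.get('content-type') || 'unknown',
    bytes: contentLength || bodyBytes,
    status: response.status,
  };
}

async function track(url) {
  const response = await fetch(url);
  // A Response body can only be consumed once, so clone before reading it.
  const bodyBytes = (await response.clone().arrayBuffer()).byteLength;
  const record = toRecord(url, response, bodyBytes);
  // await saveToIndexedDB(record); // hypothetical wrapper around indexedDB
  if (typeof chrome !== 'undefined' && chrome.runtime) {
    chrome.runtime.sendMessage({ type: 'request-tracked', record });
  }
  return record;
}
```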
Side panel
- `chrome.runtime.onMessage.addListener`: The side panel listens for messages from the service worker. It is responsible for displaying a summary (bytes, request count, associated emissions and whether the site is green hosted), request details grouped by type, and a list of failed requests.
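A minimal sketch of the side panel's side of the exchange; `summarise` is a hypothetical helper showing how the displayed totals could be derived from the tracked records, and the record fields are assumed names:

```javascript
// Reduce tracked records to the figures the panel displays.
function summarise(records) {
  const summary = { bytes: 0, requests: records.length, failed: [], byType: {} };
  for (const r of records) {
    summary.bytes += r.bytes;
    if (r.status >= 400) summary.failed.push(r.url); // list failed requests
    (summary.byType[r.type] ??= []).push(r);         // group details by type
  }
  return summary;
}

// Real API: the side panel subscribes to service worker messages.
if (typeof chrome !== 'undefined' && chrome.runtime) {
  chrome.runtime.onMessage.addListener((message) => {
    // e.g. re-render the summary from the freshly received records
  });
}
```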
Request size (bytes transferred)
Unless the server responding to a cross-origin request has set the `Access-Control-Expose-Headers` header, the content length reads as 0. To find the bytes transferred, we look at the content length and then, if compression has been used, apply a compression estimate.
- Performance API: We cannot use the Performance API because `transferSize` is 0 for cross-origin requests where the appropriate CORS headers have not been set, e.g. `Timing-Allow-Origin: https://developer.mozilla.org`
- `fetch(url)`: Where available we use the `content-length` header and, if it is missing, the byte length of the response body. Since this value is the original, uncompressed size of the resource, we then estimate its compressed size.
- Compression strategy: Given the size in bytes of the original resource, the resource type (CSS, JavaScript, etc.) and the encoding (brotli, gzip, etc.), we estimate the compressed, transferred size of the resource. Compression rates vary between sites due to server behaviour, so we select compression levels, and rates, that are representative. For some sites the resulting byte value may therefore be high or low.
```javascript
const compressionRate = getCompressionStrategy(requestType, encoding, uncompressedBytes).rate
```
- Practice: Due to uncertainty around compression rates and emissions calculations, the tracker is most useful for comparing values between different versions of the same site.
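As an illustration of how such a lookup could work, here is a hedged sketch. The table shape, the rate values and the `estimatedBytes` field are all placeholders, not the extension's real strategy or figures:

```javascript
// Illustrative rates only: rate = compressed size / uncompressed size.
const compressionStrategies = {
  script: { br: { rate: 0.25 }, gzip: { rate: 0.33 }, identity: { rate: 1 } },
  style:  { br: { rate: 0.2 },  gzip: { rate: 0.3 },  identity: { rate: 1 } },
};

function getCompressionStrategy(requestType, encoding, uncompressedBytes) {
  const byType = compressionStrategies[requestType];
  // Unknown type/encoding pairs fall back to "no compression".
  const strategy = (byType && byType[encoding]) || { rate: 1 };
  return {
    ...strategy,
    estimatedBytes: Math.round(uncompressedBytes * strategy.rate),
  };
}
```

Keeping the strategies in a plain object literal is what makes the override feature discussed below straightforward.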
How emissions are calculated
The Green Web Foundation (TGWF)
Once we have a byte count, we can calculate its associated emissions. TGWF provides a useful set of methods and a clear methodology. We recommend using Ecograder for a comparative analysis.
- In order to use the methods provided by TGWF, we forked the repo and copied the build files generated by running `npm run build`. We imported `hosting` and `co2` into the project.
- A call to `hosting.check(domain)` returns a boolean value for `greenHosting`, displayed as "green hosted".
- A call to `co2.perByte(bytes, greenHosting)` returns a value for emissions in grams (g) of CO2.
Features
- Comparison with previous: If the visitor previously saved data (in the side panel's local storage), the current data are compared against it. A red up arrow indicates a rise (in emissions, requests or bytes), and a green down arrow a fall.
- Exporting data: Clicking the "Export data" button offers the option to save the summary and the requests grouped by type to a local file.
- Overriding compression rates: There is a pending feature request to override the default rates with custom rates better tailored to the visitor's needs (the compression rates actually used on a site). Since the compression strategy is an object literal, overrides could be exported in JSON format.
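As a hypothetical illustration of that idea, an override file might map resource types and encodings to custom rates; the keys and values below are placeholders, not the extension's actual strategy shape:

```json
{
  "script": { "br": { "rate": 0.22 }, "gzip": { "rate": 0.3 } },
  "style": { "gzip": { "rate": 0.28 } }
}
```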
Purpose
The cost of our actions is of general interest and the emissions tracker provides a simple way to see the impact of a web page.
The emissions tracker is also an inducement for development teams to spend more time looking at how their websites and apps behave in the browser, where they are used and seen by others.
At People and Code we spend a lot of time in the browser and build tools to help us there.