FMCG Digital Maturity: Scoring Rationale

Find out how our research team carried out their research and testing to uncover the digital maturity of the UK’s top 100 FMCG brands

The top 100 FMCG brands have been chosen for this report based on sales figures reported in The Grocer's "Britain's Biggest Brands 2018" report.

To measure the digital effectiveness of the FMCG brands, we looked at the organisations' customer-facing branded websites. A minority of brands did not have their own UK-focused website; some have only a section within a parent company's website, or no website at all. A full list of websites tested can be seen below. For further details about the research methodology, please see the full rationale or get in touch.


For Kagool to understand the digital maturity of UK FMCG brands, a wide range of data was captured using a number of third-party tools. The data was then analysed by our team of digital marketing analysts. The data for this report was collated in February 2019 and all information is accurate to the research period.

Our aim was to collate and analyse information on each of the UK’s top 100 FMCG brands to learn how each performs digitally. By measuring key metrics in 11 core aspects of the organisations’ digital marketing activity, we were able to calculate an overall digital maturity score (%).

The categories measured include:

  • Mobile Optimisation
  • Customer Service
  • Content
  • Email Marketing
  • Search Marketing
  • Social Media
  • Website Speed
  • Ecommerce
  • Video
  • App
  • Data Protection
  • Brand Popularity

Each company was given a score for each category. The scores were then aggregated and converted into a percentage to give an overall view of how digitally mature the organisation is.
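As an illustration, the aggregation step can be sketched as follows. The category names and example scores here are hypothetical; the real maximum points per category are described in the sections below.

```python
# Illustrative sketch of the scoring aggregation (hypothetical example scores).
# Per-category scores are summed and the total is expressed as a percentage
# of the maximum available points.

category_max = {
    "mobile": 10, "customer_service": 10, "content": 10, "email": 10,
    "search": 10, "social": 10, "speed": 10, "ecommerce": 10,
    "video": 10, "app": 5, "data_protection": 5,
}

def maturity_percent(scores):
    """Aggregate per-category scores into an overall maturity percentage."""
    total = sum(scores.values())
    maximum = sum(category_max[c] for c in scores)
    return 100 * total / maximum

example = {c: m / 2 for c, m in category_max.items()}  # half marks everywhere
print(maturity_percent(example))  # → 50.0
```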

The key digital marketing aspects included within the scope of this work were:

1. Mobile Optimisation

We used Google's 'Mobile-Friendly Test' tool to test how easily a visitor can use a web page on a mobile device. Each website was given a score based on the test results:

  • 0 errors = 10 points
  • 1 error = 7 points
  • 2 errors = 5 points
  • 3 errors = 3 points
  • 4+ errors = 0 points
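The error-to-points mapping above can be expressed as a simple lookup (a minimal sketch of the scoring rule, not Kagool's actual tooling):

```python
def mobile_score(error_count):
    """Map the number of Mobile-Friendly Test errors to a score out of 10."""
    points = {0: 10, 1: 7, 2: 5, 3: 3}
    return points.get(error_count, 0)  # 4 or more errors score 0

print(mobile_score(0))  # → 10
print(mobile_score(5))  # → 0
```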

2. Customer Service

Each organisation was given a score up to a maximum of 10. They were awarded 2 points for each of the following calls to action that were applicable to their site:

  • A link to contact page on the primary navigation
  • Live chat functionality
  • A phone number on the main contact page or homepage
  • Contact form
  • FAQs section

3. Content

Each organisation was awarded a score out of 10. They were given two points for each of the following:

  • News content – Do they have dedicated news or press release pages where news content/articles are provided?
  • Blog content – Do they have a dedicated blog or insight section (separate from news)? This could include recipes or how-to guides.
  • Anchor links – Are there links in news or blog content to other sections of the website within the main body of the text?
  • Products – Does the site have a dedicated products page?
  • About us – Does the site have an about us section telling the brand's story?

4. Email Marketing

Each company was awarded points up to a maximum of 10. They were awarded points for the following:

  • 5 points if there was a newsletter / email sign up or subscription on website
  • 2 points if a welcome email was received within 24 hours of registration
  • 3 points if you could customise your subscription (i.e. choose the types of content you subscribed to or the number of emails you received).

5. Search Marketing

We used SEMrush, a competitive intelligence suite for online marketing, to look at 9 different variables which would affect an organisation’s search marketing.

  • Google UK organic traffic
  • Google UK keywords
  • Organic traffic cost
  • Google UK paid traffic
  • Google UK paid keywords
  • Paid traffic cost
  • Total domain backlinks
  • Referring domains
  • Referring IPs

For each variable, companies ranking in the top 33 positions received 1 point, the next 33 companies received 0.5 points, and the bottom 34 companies received 0 points.

The points for each company’s variables were then added up to a possible score of 9. The top company overall was given a score of 10.
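The tiered scoring for a single variable can be sketched as follows (an illustration under our own naming; `tier_points` and the `brand_…` labels are hypothetical):

```python
def tier_points(ranked_companies):
    """Award 1 / 0.5 / 0 points by rank position for one search variable.

    `ranked_companies` is a list of 100 company names ordered best-first:
    the top 33 get 1 point, the next 33 get 0.5, the bottom 34 get 0.
    """
    points = {}
    for position, company in enumerate(ranked_companies):
        if position < 33:
            points[company] = 1.0
        elif position < 66:
            points[company] = 0.5
        else:
            points[company] = 0.0
    return points

ranking = [f"brand_{i}" for i in range(100)]
scores = tier_points(ranking)
print(scores["brand_0"], scores["brand_40"], scores["brand_99"])  # → 1.0 0.5 0.0
```

Summing each company's points across the nine variables then yields its search marketing score out of a possible 9.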

6. Social Media

We looked at how each organisation uses 4 social media platforms (Facebook, LinkedIn, Instagram and Twitter). For each platform a company was given 1 point if they used the channel (where possible we used the channels the brands linked to from their sites). We then counted the number of times a company posted per month on each channel, as well as the number of followers. The 33 companies that posted the most content on a channel were awarded 0.5 points, the next 33 companies received 0.25 points, and the final 34 were awarded 0 points. For followers, the 25 companies with the most received 1 point, the next 25 received 0.75 points, the third 25 received 0.5 points, and the final 25 received 0.25 points. Any company with 0 followers, or that didn't post in the last month, received 0 points. A maximum of 10 points was available per company.

7. Website Speed

Google's website speed tool was used to test the mobile speed of each website on a 3G connection. Each company's home page was tested twice and the speeds averaged. Each company started with a score of 10, and one point was deducted for every second a site took to load. For example, if a site took 7 seconds to load it was scored 3. Any site that took 10 seconds or more to load was given 0 points, and its load time was noted to provide an accurate ranking.
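The deduction rule above amounts to a simple formula (a sketch assuming load times are measured in whole seconds, which the report does not state explicitly):

```python
def speed_score(avg_load_seconds):
    """Start at 10 and subtract one point per second of load time.

    Assumption: load times are treated as whole seconds; scores floor at 0.
    """
    return max(0, 10 - int(avg_load_seconds))

print(speed_score(7))   # → 3
print(speed_score(12))  # → 0
```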

8. Ecommerce

Each brand was tested and scored as follows:

  • 10 points for selling directly to the consumer
  • 3 points if they directed you to a third-party site to buy their product
  • 2 points if they offered vouchers / discount codes to buy the product.

A maximum of 10 points was available.

9. Video

The brands were scored on their YouTube content. Each brand was scored on:

  • 2 points were awarded if they had a dedicated YouTube channel (where possible this was found using a link from the brand's website).
  • The number of subscribers they had was counted. The list was split, with the channels with the most subscribers getting 2 points and the least 0.5 points. A channel with no subscribers received 0.
  • How many views, on average, their last 5 videos had. The list was split, with the channels with the most views getting 2 points and the least 0.5 points. A channel with no views received 0.
  • How many videos a month they posted. The list was split, with the channels posting the most videos getting 2 points and the least getting 1 point. A channel with no videos posted received 0.
  • Using YouTube's own search engine, the brand name was searched. 2 points were awarded if the brand's own content appeared first, 1 point if branded content shared by another user was top, and 0 if the content was unrelated to the brand.

A maximum of 10 points was available for Video.

10. App

Each brand was scored on whether they had an app on either the Apple App Store or the Google Play Store. Each brand was awarded 2.5 points for each channel they had a presence on. A maximum of 5 points was available.

11. Data Protection

For data protection, each site was tested to see if it had a privacy policy and whether it asked for prior consent for cookies. 2.5 points were awarded for each.

Additional information

Maturity percentages were worked out to three decimal places and then rounded to the nearest whole percent. This means that while two companies may appear to have the same score, their maturity rank may differ. For example, companies with maturity percentages of 68.125 and 67.5 will both show as 68%, but one will rank above the other.
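The display rounding can be illustrated as below. This sketch assumes conventional round-half-up behaviour; the report does not state which rounding convention was used.

```python
from decimal import Decimal, ROUND_HALF_UP

def display_percent(maturity):
    """Round a maturity percentage to the nearest whole percent (half rounds up)."""
    return int(Decimal(str(maturity)).quantize(Decimal("1"), rounding=ROUND_HALF_UP))

# Both example scores display as 68%, but the underlying values still
# determine the ranking: 68.125 ranks above 67.5.
print(display_percent(68.125))  # → 68
print(display_percent(67.5))    # → 68
```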

A small number of companies had multiple sites under the same brand. In these cases, the website tested was the first UK-facing result to appear in Google when searching for the brand.

9 companies didn't have their own consumer-facing brand website in the UK; these usually had pages on their parent companies' sites. As they didn't have their own website, we could not test them for speed, search, mobile optimisation, customer service, content or eCommerce. During the first two weeks of February the Kettle Foods UK website was temporarily unavailable, which means it couldn't be tested.

To ensure the data and results were as accurate as possible, we followed a rigorous and scientific process throughout the study. This included having secondary researchers follow the same rationale for a selection of organisations to confirm that the same score was reached upon re-testing.

Please note, this is a research piece. Not all of the top 100 UK FMCG brands' websites have been designed and developed by Kagool. Any imagery taken from the websites and used as part of the report was captured in February 2019.

List of websites tested: