The UK’s data protection watchdog has clarified that even companies which cease to offer their services within the country can still be subject to regulatory scrutiny for prior infringements.

The Information Commissioner’s Office (ICO) issued a brief statement after image-sharing platform Imgur blocked access to its website from the UK yesterday.

“Imgur’s decision to restrict access in the UK is a commercial decision taken by the company,” it stated.

A statement on the Imgur site now reads: “From September 30, 2025, access to Imgur from the United Kingdom is no longer available. UK users will not be able to log in, view content, or upload images. Imgur content embedded on third-party sites will not display for UK users.”


It’s still unclear whether the decision was influenced by the ICO’s investigation into Imgur, TikTok and Reddit, which the regulator announced in March.

The ICO had expressed concerns over the way these sites use children’s personal information, with its own analysis claiming that Imgur:

  • Did not ask users to state their age when signing up
  • Did not use any techniques to estimate or verify users’ ages during account setup
  • Did not ensure children’s profiles were set to high privacy by default
  • Did show adverts to children, including targeted ads based on age and location

Tim Capel, ICO interim executive director – regulatory supervision, revealed that the privacy watchdog issued a notice of intent to fine Imgur parent company MediaLab on September 10.

“Our findings are provisional and the ICO will carefully consider any representations from MediaLab before taking a final decision whether to issue a monetary penalty,” he added.

“We have been clear that exiting the UK does not allow an organization to avoid responsibility for any prior infringement of data protection law, and our investigation remains ongoing.”

Following the Children’s Code

The notice of intent was issued in line with the requirements of the ICO’s Children’s Code – a data protection code of practice for online service providers whose services are likely to be accessed by children.

Known officially as the Age Appropriate Design Code, it is designed to help these businesses understand what they need to do to comply with the relevant sections of the UK GDPR.

Other MediaLab offerings, such as the Kik messaging app, remain available in the UK and are compliant with the Online Safety Act, another law with major implications for how social media companies handle children’s data. This suggests that the Imgur decision was indeed a commercial one.