The act definitely looks like bad, lazy legislation, as others have said. I can see why the guy behind LFGSS is pulling the plug, but in reality I think the risk of action against sites like this is very slim - although at the very least, the act does seem to require some fairly laborious 'risk assessment' work. I think this from the Guardian is useful:
'Ofcom is supposed to focus on the rules the platforms themselves set, and monitor whether they are doing what they say they will. In theory, this means a platform that wants to be minimally censorious can be, provided it doesn’t give users a false sense of security by claiming otherwise.'
The main thing seems to be that a site owner has to assess the risk of illegal content appearing on their platform.
According to Ofcom:
'One way to comply with your duties is to implement applicable safety measures set out in Ofcom’s illegal content Codes of Practice for user-to-user services...such as measures around content moderation, reporting and complaints, user settings and tools.'
I don't think the act requires, for example, every uploaded image to be filtered somehow (although the LFGSS guy seems to suggest it does?). I would think the current rules and moderation on this site already mean it complies.
I really hope so; otherwise this act really is an absolutely ridiculous overreach.