The Online Safety Bill Committee has been taking oral evidence. TikTok was discussed, and Ofcom gave evidence as well:
committees.parliament.uk/oralevidence/2934/pdf/
Ofcom: "Enormous amounts of time are spent by children under the age of 13 on TikTok and so on. Our data suggests that half of 10 year-olds in the UK are on TikTok; whether they have accounts or not, they are on TikTok. We have to get real about some of that."
An exchange between Ofcom and the Chair:
"Ofcom: ...it is about how users are able to make things more viral, and about how the companies encourage that through things like recommended algorithms and so on, as you say. In the end, the test for me is: what is it like from the user perspective? That is the thing that we really need to keep coming back to, and to shift the culture on to. It is not, “I’m here in a company, I’ve designed it this way and this should all be fine”. When you put it together and you are a user, particularly a younger or more vulnerable user, what do you experience when all those features come together? That is the real test for me about whether or not harm is being properly managed.
The Chair: I think that is right in terms of what the user sees, but it has to be clear as well in the definition of this. The role of the algorithm is not just allowing other users to expose their content to a bigger audience, although that does happen. There is clearly evidence of people understanding how you game the system through groups and advertising to drive content through it. The platforms are also making decisions themselves as to the content they think is likely to have a bigger audience. In the case of a company like TikTok, it is doing that purely on data profiling rather than even initial social interaction.
Ofcom: I agree with what you are saying there; absolutely."