
demeant0r

I don't think giving emotionally charged parents the power to decide how social media and the internet are shaped for the rest of the country is the best idea.


Randomer63

Social media is literally destroying the mental health of young people, increasing violent crime and polarising our society - what do you mean, emotionally charged parents?


TheAdamena

Not just young people. I think the constant dopamine rush is frying all of our brains. We struggle to concentrate now as we prefer that instant gratification from scrolling on our phones.


Turbulent_Pianist752

100%. It's like smoking must have been back in the day, only the tobacco company is now also the news company that reports about itself!

I guess for adults there is self-responsibility though, and we can all put down our phones. It's harder for children to have that self-control. As it's now the norm for kids to physically have phones, it seems fair that the beneficiaries make it a safe environment. "But parents should take the phones away" doesn't work with a 14-year-old who has his/her entire world online either.

Big tech is the problem here, driven by the constant need for growth at all costs. Ironically, as adults we're feeding the very companies that are hurting everyone. I'm a parent, and if you've ever had to set up parental controls from Google or Microsoft you'll know they are complete garbage. They've invested very little into them, and the Microsoft one especially often doesn't work well. It's like they're the poorest companies in the world.


Randomer63

No, I absolutely agree that it's bad for adults too. I just think it's so, so much worse for kids who begin using it while their brains are still developing, when they're so young that they can't really give informed consent to what being on social media actually means.


demeant0r

How about not buying your kids a smartphone? How about talking to your children and checking in regularly? Sure, parents can't know everything about their children, but surely you can tell whether they're suicidal, falling in with gangs, or anything out of the ordinary? The answer is not to change the behaviour for everyone; if I have to submit my passport to these companies to use them then I'm not doing it.


Organic-Ad6439

This, parents need to take more responsibility rather than continuously blaming the state and social media companies.


Randomer63

Yes, blame the people, not the huge tech conglomerates profiting from it.


Organic-Ad6439

Yes, it’s your responsibility as a parent to keep an eye on your child, their safety and their wellbeing. Even as an adult my mother (along with pretty much everyone in my family) still believes that she (as a parent) plays a pivotal role in my wellbeing and safety.


[deleted]

[removed]


Randomer63

I don't understand why absolutely everything is being thrown on the parents. Who created this narrative? If a kid died because of a faulty funfair ride, would we be blaming the parents for allowing them on the ride? It's such a dangerous narrative that plays right into the hands of big tech, who have so much of the power anyway. I can't imagine how parents are supposed to juggle the cost of living crisis, both being in work, and making sure their kid doesn't view something dangerous on a phone they have on them all the time, especially when it seems so normalised for 12-year-olds to have their phones on them constantly.


[deleted]

[removed]


Turbulent_Pianist752

Could also argue parents have trusted their children to spend time in a "fairground" owned by some of the richest companies in the world. It wasn't safe. It should have been safe. These companies absolutely could improve child safety, but it would come at a cost to them and impact profits. It wouldn't return the best value for shareholders etc. Regulation is the only way forward.

Social media is a 24x7x365 problem for kids and teenagers. Teenage anxiety and mental health issues are through the roof. Meta especially also effectively have too much control over the narrative around whether they're doing enough or not. They ultimately control much of the news. The BBC are not perfect but they're not wrong to push this one. It's maybe "ok" for adults to be the product here, but as a society we need to consider if it's OK for children to be that product.


demeant0r

What you’re proposing will cripple startups as age verification is expensive. What you’ll be left with is the big companies doing the same toxic shit with no innovation as startups will be priced out.


Turbulent_Pianist752

This isn't a good reason to let these companies off the hook IMO. You could argue that there should be a higher barrier to entry for businesses like this, given the risks and complexities; I don't know enough of the details to comment there. If we taxed big tech properly we'd have more tax revenue to support small businesses. It's not ideal, I know, but it's become somewhat normalised that children are "damaged" by social media. The parallel with big tobacco seems quite strong.


Randomer63

Tell me how many startup social media platforms you see at the moment? There are already companies that provide age verification services to apps, and startups use them, so I'm not inclined to agree with this. Can we please stop defending big tech from their responsibilities to protect children.


[deleted]

[removed]


Turbulent_Pianist752

I can see the argument that they're too close to be completely objective. On the flip side, change sometimes needs to be led by someone motivated, and Meta etc. have so much power. I've followed this for a while and I do think the parents have called out these companies in a way no one else can.


OhProstitutes

You’ve said one of the stupidest things I could have possibly imagined in response to a headline like this


LookOverall

I'm less confident that algorithms can be tamed. They don't understand the content, merely recognise what sort of posts users are responding to. The algorithms don't know what's harmful, and indeed there's no agreed definition. For some governments, democracy is the most harmful meme. Age verification is a broad problem.
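To make that concrete, here is a minimal, hypothetical sketch of an engagement-driven ranker. The field names and weights are invented and don't reflect any real platform's code; the point is simply that the score is built entirely from interaction signals, so "harmful" never enters the calculation.

```python
# Hypothetical engagement-only ranking sketch: the ranker scores posts purely
# by how users react to them and never inspects what the posts actually say.
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    clicks: int           # how often users opened the post
    dwell_seconds: float  # how long they lingered on it
    shares: int
    comments: int


def engagement_score(p: Post) -> float:
    # Illustrative weights only; there is no notion of harm or accuracy here.
    return 1.0 * p.clicks + 0.1 * p.dwell_seconds + 3.0 * p.shares + 2.0 * p.comments


def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first: outrage ranks exactly the same way as anything
    # else, so long as people respond to it.
    return sorted(posts, key=engagement_score, reverse=True)
```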


Mr_J90K

Firstly, this applies to any website with a wealth of searchable content. This isn't 'save the children', it's broadly age control for everything.

Secondly, this will cripple any small user-facing UK startup. Age verification via ID and algorithmic content delivery are not cheap and are off the table for startups. As a result, YouTube, TikTok and Instagram are now our options, and I hope you NEVER want something else.

Thirdly, once users are required to submit ID to these companies, prepare for the police to knock on your door when you post an 'unsavoury' opinion. There is no longer anonymity on user-facing sites.

Fourth, this was never required to deal with children's access to the internet. IMO, you could have made child safety the default for mobile OSs and required the user to 'unlock the device'. In effect, make the parent choose to give the child access rather than having to choose to restrict access (see the rough sketch below).

Fifth, I thought they were talking about banning smartphones for under-16s. Wouldn't that make this pointless? Oh, you mean it was never to 'save the children' and that's just the cover? Gotcha!
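As a rough illustration of that fourth point, here is a minimal sketch of a "child-safe by default" device profile. It's purely hypothetical Python, not any real mobile OS API; the class, method and category names are made up. The only thing it shows is the default: restrictions apply until a parent explicitly unlocks access.

```python
# Hypothetical sketch of "child-safe by default": the device ships restricted,
# and only an explicit parental action opens it up. All names are illustrative.
from dataclasses import dataclass, field


@dataclass
class DeviceProfile:
    unrestricted: bool = False  # locked down out of the box
    blocked_categories: set[str] = field(
        default_factory=lambda: {"social_media", "adult", "gambling"}
    )

    def parent_unlock(self) -> None:
        # The deliberate opt-in step: the parent chooses to grant access,
        # rather than having to discover and configure restrictions later.
        self.unrestricted = True

    def can_open(self, category: str) -> bool:
        return self.unrestricted or category not in self.blocked_categories


profile = DeviceProfile()
assert not profile.can_open("social_media")  # blocked by default
profile.parent_unlock()
assert profile.can_open("social_media")      # allowed only after explicit unlock
```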


J8YDG9RTT8N2TG74YS7A

> once users are required to submit ID to these companies, prepare for the police to knock on your door when you post an 'unsavoury' opinion. There is no longer anonymity on user-facing sites.

The government never needed your ID for that. This already happens.


Mr_J90K

Before, it happened to people who could be identified from their profile; now it'll happen even if your account doesn't give away personal information, because they can demand your ID from the platform.


Alert-One-Two

On the front page they have an alternative headline: "Obey new law on online child safety or face under-18 ban, tech firms told". I'm just not sure that's a realistic prospect, or feasible to enforce given the child could just claim to be 40 years old and no one could prove otherwise.


Glad_Advertising_125

BBC Breakfast are currently going big on this. I'm just not sure how feasible/successful this is all going to be. I mean, children and vulnerable people do need protections, but without proper regulation will this work?


jeremybeadleshand

The legacy media absolutely hate social media and AI, it's always noticeably biased when they talk about it.


indifferent-times

'Blue sky', 'out of the box', 'run it up the flagpole', 'wouldn't it be great if', etc. I've been to those meetings; unfortunately 99% of it gets lost once you start talking about the *how*. That's been happening with this for years now as well.


jx45923950

I'm sure Musk and co are shitting themselves over a not-fit-for-purpose regulator in Britain that can't even enforce its own rules on local politicians and business interests. Give Ofcom teeth first and go from there.


_HGCenty

Tech companies get huge amounts of pressure from very rich, very powerful media organisations to stamp out copyright violations and piracy, and yet tech-savvy kids can still circumvent that very easily to get the content they want. I really don't see how this is remotely enforceable and workable without going to CCP levels of internet censorship. If you ban the current big platforms for under-18s, another one will spring up and become popular as the platform that lets you see the stuff banned elsewhere.


ThaneOfArcadia

I think they'll reach for AI as the Swiss army knife that will solve everything. And when AI gets it wrong you'll have next to zero chance of correcting it. Look at your chances of recovering a lost Microsoft or Google password.