[THREAD]
1/ Last week @TechFreedom explained to Congress how the Kids Online Safety Act endangers free speech *and* kids: https://techfreedom.org/wp-content/uploads/2022/12/Kosa-Letter-December-6-2022.pdf
Yesterday, its sponsors introduced revisions: https://aboutbgov.com/52Q
It's better, but still pretty bad.
#KOSA simply doesn't belong in an omnibus spending bill.
2/ Age verification: We explained how KOSA's mandates would unconstitutionally require age verification for a huge part of the Internet.
The revised text tries to avoid this, primarily by narrowing its application to where a service knows *or should know* that a user is a minor.
3/ But what constitutes constructive knowledge that a user is a minor? Is it enough that the service “should know” that some users are minors? The bill doesn’t say.
So the FTC/state AGs may allege that a service's knowledge that a certain % of its users are minors creates such knowledge as to *every* user.
Take TikTok, for example: estimates indicate that more than half of the platform’s US users may be minors. Does that mean that TikTok “should know” that *any* user is more likely than not to be a minor?
4/ If audience composition (>X% of users are minors) can create constructive knowledge, then services above some threshold will *still* have to age-verify users to protect themselves from liability under #KOSA, with all the attendant free speech problems.
5/ This could be prevented by explicitly stating that audience composition doesn't create constructive knowledge that users are minors.
Otherwise, services popular with minors will be pressured to age-verify, and adults may find the Internet Disneylandified for them, too.
6/ Duty of care: The revised bill pays lip service to our concerns that KOSA would cut minors off from entirely legitimate, protected material, by “allowing” platforms to let minors seek out information for themselves.
But this doesn't actually fix the underlying problem.
7/ Sure, services won't have to prevent minors from *searching* for content.
But they would still be subject to a broad, vague duty of care to prevent/mitigate harms allegedly caused by *seeing* or *encountering* often legitimate, constitutionally protected content.
8/ As we explained, the clearest risk-averse option is to block minors from even legitimate content, like drug policy reform/harm reduction discourse, or anti-eating disorder content. Blocking minors from valuable, informational content does not serve their best interests.
9/ Another problem left unfixed: KOSA's duty doesn't require a harm to result *from* a minor's usage of the service.
So services may have to censor content shared between adults, if there's some conceivable way a child could be harmed, no matter how tenuous the connection.
10/ Parental Consent: KOSA §5 requires services to provide notices to a parent of any user under 17, and obtain an acknowledgment from the parent, before the minor uses the service.
This might make sense for young kids, but it burdens the First Amendment rights of older teens.
Congress removed very similar requirements from COPPA, the 1998 kids' privacy law, after the Center for Democracy & Technology explained how they would impair teens' ability to access sensitive information.
11/ And teens in abusive, hostile, or unsupportive situations are likely to find themselves unable to access a massive amount of information, or to seek help or community online.
At-risk minors deserve more thoughtfulness than this.
12/ Default safeguards: KOSA §4 requires services to provide safeguards, including the ability to control/limit algorithmic recommendations that use personal information.
For minors, the default must be the "most protective level of control."
That may sound all well and good at first, but consider that algorithmic recommendations can actually help ensure that kids see only age-appropriate content.
13/ This provision could require YouTube to, by default, turn off algorithms that recommend other kid-friendly videos (or that prevent non-kid-friendly videos from being recommended) based on a user's age (personal information). What sense does that make?
14/ If you weren't convinced already that #KOSA would be bad for minors and adults, #gamers—listen up.
KOSA explicitly applies to video #gaming, and a required safeguard is limiting features that increase, sustain, or extend use, such as...rewards for time spent on the game.
And the default setting when a platform knows or should know a user is a minor must be the most restrictive.
15/ Do you know what increases, sustains, or extends use, and rewards for time spent? Scoring. And leveling up. And plenty of other core video game elements.
16/ If #KOSA passes as is, you may find that every game ships with these critical features disabled—making the games...not fun...until you figure out all of the settings you have to toggle to really play them.
It's a great bill for people who think video games are evil, I guess.
17/ It's good that #KOSA's sponsors acknowledged that the bill raised serious concerns. But lipstick will not save this pig; it's still a ham-fisted attempt that will wreck the Internet for everyone--and harm kids in the process.