
This just seems ripe for selective enforcement if not codified in law. I agree the algorithm they use can be addictive, but it's because it's simply good at providing content the user wants to consume.

Besides a general 'don't be too good,' I'm really not sure what companies should do about it. It just seems like it'll lead to some judges allowing rulings against companies they don't like.

Television's goal was always viewer retention too; broadcasters were just never able to target as precisely as you can on the internet.



I see it as similar to the public health crisis created when protonated nicotine salts made their way into vapes, along with flavors: an innovation that allowed 2-10x more nicotine to be delivered and made Juul so popular with children.

The subsequent effects - namely that the product was easier to consume and more addictive - eventually resulted in legislation catching up and restrictions on what Juul could do. Its being "too good" a product parallels what we're seeing in social media seven years later.

Like most [all?] public health problems, we see individualization of responsibility touted as a solution. If individualization worked, it would have already succeeded. Nothing prevents individualization except its failure of efficacy.

What does work is systems-level thinking and considering it an epidemiological problem rather than a problem of responsibility. Responsibility didn't work with the AIDS crisis, it didn't work on Juul, and it's not going to work on social media.

It is ripe for public health strategies. The biggest impediment to this is people who mistakenly believe that negative effects represent a personal moral failure.


> it's because it's simply good at providing content the user wants to consume.

Well, a drug addict wants to consume his drug, because the drug is good at keeping withdrawal at bay, and tolerance probably hasn't built up to the point where the addict can no longer feel its "positive" effects.

The user feels an impulse to consume the content, but whether they actually want it we can only know by asking them. They can lie, consciously or unconsciously, but there is no better way to measure the desire to consume. When it comes to doomscrolling, I've never met a person who said they wanted to do it, yet there are people who do it nevertheless.

> This just seems ripe for selective enforcement if not codified in law.

I agree. I'm not sure how they define "addiction" or how they measure "addictiveness". That's the most important detail in this story.


Companies that sell products to the public have managed this for a hundred years. Some are good at it, some are not, some completely disregarded their obligations. This is not all that new.


Let's just be honest: if you make enough money, it's legal in America.

Unless you hurt children, then it's mostly legal and a slap on the wrist.


that's the point


Nukes are the same as knives, just different in magnitude. Should one have special rules?


I think in America the Second Amendment makes it legal to own a nuke.


> I'm really not sure what companies should do about it

dismantle the intentionally addictive features they built into their platforms to maximise engagement and revenue at the cost of their users' mental health.



