- cross-posted to:
- [email protected]
About 59% of Americans say TikTok is a threat to the national security of the United States, according to a new survey of U.S. adults.
I’ve come to the conclusion that it’s the algorithms that have become evil. There was a thread where someone was asking for help stopping YouTube from radicalizing their mother through the videos it would suggest to her.
I use stuff like NewPipe and FreeTube to try to get away from these personalized attempts at content, since there is still good content on YouTube. It’s just that so many sites try to keep you there as long as possible and then start feeding you content that can warp people. But algorithms don’t understand the impact of that, since to them it’s just a binary: the user stays or the user leaves.
If you can be radicalized by videos from YouTube, it isn’t the algorithm, it’s you
https://lemmy.world/post/1184403
Yeah that’s entirely on her.
The world doesn’t exist in an individual vacuum. The people negatively influenced by disinformation go on to take a role in society and interact with others, affecting the people they encounter either negatively or positively. Congratulations on your individual resilience, but the world is not a population consisting of only you, with you alone determining the impact other people have on the world.
Yeah and, once again, those people are the problem.
Unless you want to ban any food that isn’t fruits and vegetables, cars, not sleeping enough, not getting enough exercise, etc., at some point you have to accept that people do in fact make their own choices.
I’m not for banning things because some people are idiots.
I feel bad that you’ve been radicalized into thinking this way.
I’m not concerned with your feelings.
It’s not just all you or all YouTube. Both matter. It’s harmfully reductionist to act like it’s one or the other rather than both.
Neither really matters, since adults have a right to choose to consume any content they’d like.
If your grandma finds Q fascinating, that’s on your grandma
It’s also the fault of those producing Q content.
Algorithms can’t “become evil” any more than your toaster can. They’re being directed and programmed by people who know exactly what they’re intending to achieve.
I’m saying that algorithms, despite having no intent to be evil, have led to negative impacts because they have no care for the context of a recommendation. Someone can go in searching for health information, then go down a rabbit hole of being recommended pseudo-health advice, then flat earth, and so on. Not because the algorithm wants to turn people a certain way, but because it’s just recommending videos that users who liked similar videos might find interesting. It’s just broad categories.
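
As a rough sketch of what that kind of context-free, similarity-based recommendation looks like (this is an invented toy example, not YouTube’s actual system; the video titles and co-watch counts are hypothetical):

```python
# Minimal sketch of item-based "viewers who watched X also watched Y"
# recommendation. Illustrative only: the video titles and co-watch counts
# are invented, and real recommender systems are far more sophisticated.
from collections import Counter

# Hypothetical co-watch counts: how many viewers watched both videos.
co_watch = Counter({
    ("home workouts", "vitamin basics"): 40,
    ("vitamin basics", "miracle cure detox"): 55,
    ("miracle cure detox", "big pharma exposed"): 70,
    ("big pharma exposed", "flat earth proof"): 80,
})

def recommend(video: str) -> str:
    """Return the video most often co-watched with `video`."""
    scores = Counter()
    for (a, b), count in co_watch.items():
        if a == video:
            scores[b] += count
        elif b == video:
            scores[a] += count
    return scores.most_common(1)[0][0]

# Each step only asks "what did similar viewers also watch?" and never
# looks at what the content actually claims, so an innocuous starting
# point can drift step by step toward fringe material.
current = "vitamin basics"
for _ in range(3):
    nxt = recommend(current)
    print(f"{current} -> {nxt}")
    current = nxt
```

Run it and the chain goes vitamin basics → miracle cure detox → big pharma exposed → flat earth proof, purely from overlapping audiences, which is the rabbit-hole effect I mean.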
I wasn’t implying algorithms are sentient. At least not yet, until AI integration happens.