I'm skeptical of this. Even largely the same is not the same. What is the difference besides censorship by the Chinese government? Are there other differences?
I agree that it should be based on their likes and viewing habits, but the current methodology is proving too effective and addictive. Constantly showing people what they like and what aligns with their viewing habits creates echo chambers; the personalized algorithms themselves become echo chambers. I understand that mental health is never prioritized in this country, so maybe I'm alone on this hill, but these apps are causing a host of mental health issues for the people who use them. And you are correct that all these platforms send people down political and social echo chamber rabbit holes, further and further until they hit the extremes. That's how you radicalize people. What I'd want is an independent agency that regulates social media algorithms precisely to prevent that sort of bias: users still get personalized content, but every so often they're shown content that slightly misaligns with their current viewing habits (see the sketch below). But then, as you'll point out, how do we prevent corruption within that independent agency? Most people aren't going to think critically about the things they are shown and will only seek out views that align with their own. Giving people content outside their echo chambers from time to time will at least give them a chance to reassess their views.
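For what it's worth, here's a minimal sketch of what that rule could look like, written in Python. Everything here is hypothetical (the `epsilon` knob, the function names, the numbers); it's just to show the idea of reserving a small fraction of feed slots for out-of-profile content:

```python
import random

# Hypothetical sketch: a feed that mostly personalizes but occasionally
# surfaces content outside the user's usual interests. "epsilon" is the
# knob a regulator could mandate a minimum value for.
EPSILON = 0.1  # fraction of slots reserved for out-of-profile content

def next_item(user_interests, catalog, epsilon=EPSILON):
    """Pick the next feed item.

    user_interests: set of topic tags the user already engages with.
    catalog: list of (item_id, topic) pairs available to recommend.
    """
    in_profile = [v for v in catalog if v[1] in user_interests]
    out_of_profile = [v for v in catalog if v[1] not in user_interests]

    # Every so often, show something that misaligns with viewing habits.
    if out_of_profile and random.random() < epsilon:
        return random.choice(out_of_profile)
    # Otherwise, ordinary personalized selection.
    return random.choice(in_profile or catalog)

# A gardening-focused user still sees ~10% unrelated topics over time.
catalog = [("v1", "gardening"), ("v2", "gardening"), ("v3", "politics"),
           ("v4", "cooking"), ("v5", "history")]
print(next_item({"gardening"}, catalog))
```

The agency's job would then reduce to something auditable: verify the platform's epsilon-equivalent sits at or above the mandated floor, rather than trying to police individual content decisions.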
You're arguing the content shouldn't be forced on them, but couldn't we say the content is already forced on them? On these endless-scroll apps you don't choose the content that comes next; the app presents a suggestion and you scroll past if you don't like it. Companies already control the algorithm for their own agendas anyway, to maximize engagement and revenue. The problem is they don't factor in the user's own wellbeing, just what they want.
This would probably work if they were transparent about what each algorithm does and the differences between them. At least then we'd have some semblance of choice in how our content is curated for us. But you'd never get them to admit the true downsides of the most addictive algorithms.
My understanding is the algorithms are largely the same in that they curate content relevant to the user's interests, but they also weight educational and upskilling content more heavily due to local laws. The U.S. and other markets default to more art and entertainment, which most other social media platforms do as well. It's not just what content is curated but what content is allowed on the platform due to local laws; some content that is legal to share in other countries is illegal or against moderation policies set by local regulatory bodies.
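As a rough illustration of what "largely the same algorithm, different weights" could mean in practice (every category name and number below is invented; the real weights aren't public):

```python
# Hypothetical sketch: one ranking function, different per-market weights.
# The weight values are made up for illustration; real systems are far
# more complex and their actual parameters aren't disclosed.
MARKET_WEIGHTS = {
    "cn": {"educational": 2.0, "upskilling": 1.8, "entertainment": 0.8},
    "us": {"educational": 1.0, "upskilling": 1.0, "entertainment": 1.3},
}

def score(relevance, category, market):
    """Same relevance model everywhere; only the category weight differs."""
    return relevance * MARKET_WEIGHTS[market].get(category, 1.0)

# The same video ranks differently depending on the market's weights.
print(score(0.7, "educational", "cn"))  # 1.4
print(score(0.7, "educational", "us"))  # 0.7
```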
The thing is, no one can say they can't find educational or upskilling content on TikTok; they can, and there are many users who do just that.
I don't think the government should be choosing what content users view online via regulating algorithms, because that can easily escalate to regulating what they make as well. I much prefer that individuals make their own choices and the algorithm acts accordingly based on observed behavior.
I didn't say anything about echo chambers in general; I was referring to the already observed and, in my view, indisputable conclusion that many social media platforms are algorithmically geared toward steering users down an alt-right pipeline of content, because anger increases engagement and ad views. My issue is that no matter what benign interest a user starts off with, the algorithms have been shown to ultimately steer them toward alt-right political content. A user could start off with an interest in planting and, just a few video recommendations later, end up watching alt-right propaganda (the toy simulation below shows the mechanic). The problem is these companies don't tell their users this can or will happen. That's why I proposed a marketplace of algorithms users can choose from instead. Companies would reject that idea because some algorithms would be more profitable than others, and they all know what makes the most money for them: outrage. Hence the right-wing propaganda on every social media site.
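Here's a toy simulation of that pipeline, not a claim about any real platform's code; the topics, adjacency map, and engagement numbers are all made up to show the mechanic:

```python
# Toy simulation of the drift described above, not any platform's real
# code: every topic, adjacency link, and engagement number is invented.
# The ranker greedily picks the highest-engagement unseen item "near" the
# user's current topic; because angrier content always engages more, the
# feed drifts one notch further each step.
ADJACENT = {
    "gardening": {"gardening", "outrage"},
    "outrage":   {"outrage", "extreme"},
    "extreme":   {"extreme"},
}
CATALOG = [
    ("houseplant care",  "gardening", 0.50),
    ("gardens RUINED?!", "outrage",   0.70),  # rage-bait adjacent to the hobby
    ("who's to blame",   "outrage",   0.80),
    ("the real enemy",   "extreme",   0.95),
]

def next_video(current_topic, seen):
    """Highest predicted engagement among unseen items adjacent to the
    current topic. User wellbeing appears nowhere in the objective."""
    candidates = [v for v in CATALOG
                  if v[1] in ADJACENT[current_topic] and v[0] not in seen]
    return max(candidates, key=lambda v: v[2]) if candidates else None

topic, seen = "gardening", {"houseplant care"}  # benign starting interest
while (video := next_video(topic, seen)):
    print(video[0], "--", video[1])
    seen.add(video[0])
    topic = video[1]
# Prints:
#   who's to blame -- outrage
#   the real enemy -- extreme
# Two recommendations from houseplants to extremist content.
```

The point is that "wellbeing" never appears anywhere in the objective; a ranker that only maximizes predicted engagement drifts toward whatever engages most, one adjacent step at a time.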
I don't support any nanny-state position with regard to online platforms. Platforms should put in reasonable measures like content warning labels and guides, but if they have content a user is interested in, they should show it to them. What's the point of hosting content that a platform's algorithm will ignore when a user requests it?
Anything effective can be construed as addictive. Good books can turn people into addicted readers.