CEO Assassin Luigi Faces Terrorism Charge

cyndaquil

Lv 100 Bold natured
Joined
Sep 2, 2014
Messages
7,309
Reputation
1,413
Daps
24,992
Reppin
JOHTO REGION
how should the algorithms be designed? :jbhmm:
That's an extremely good question. Now this is the territory we should really be having detailed discussions on. What are the proper constraints to design an ethical algorithm for displaying social media content to an end user?

Right now, companies are using PhD-level psychologists, mathematicians, and data scientists to analyze human brain chemistry and behavior to create the most addictive applications possible, without any regard for the deleterious ramifications on the lives of the users and the good of the general public. There have been a few whistleblowers who have come forward about this as well.

We need to have an ongoing conversation in the tech space on the standards of ethical design of these algorithms, as well as a regulatory agency that enforces those standards across the industry. I know it will be difficult to get companies to allow their systems to be analyzed, both out of fear of leaking their proprietary tools and because of the risk of corruption of the regulating body itself. But something has to be done.

The Chinese government realizes this with TikTok, hence they have a different app over there with an entirely different set of algorithms compared to those used on TikTok in the United States, since they knew how detrimental it would be to their own citizenry. It's an advantage when your government holds many STEM individuals in high positions and actually listens to their advice. :manny:
 

bnew

Veteran
Joined
Nov 1, 2015
Messages
57,368
Reputation
8,499
Daps
160,077
That's an extremely good question. Now this is the territory we should really be having detailed discussions on. What are the proper constraints to design an ethical algorithm for displaying social media content to an end user?

Right now, companies are using PhD-level psychologists, mathematicians, and data scientists to analyze human brain chemistry and behavior to create the most addictive applications possible, without any regard for the deleterious ramifications on the lives of the users and the good of the general public. There have been a few whistleblowers who have come forward about this as well.

We need to have an ongoing conversation in the tech space on the standards of ethical design of these algorithms, as well as a regulatory agency that enforces those standards across the industry. I know it will be difficult to get companies to allow their systems to be analyzed, both out of fear of leaking their proprietary tools and because of the risk of corruption of the regulating body itself. But something has to be done.

The Chinese government realizes this with TikTok, hence they have a different app over there with an entirely different set of algorithms compared to those used on TikTok in the United States, since they knew how detrimental it would be to their own citizenry. It's an advantage when your government holds many STEM individuals in high positions and actually listens to their advice. :manny:

The Chinese government didn't realize anything with their version of TikTok, called Douyin; Douyin simply follows the laws and regulations that all online platforms in China have to follow. The algorithms on Douyin are largely the same, except that Chinese government regulation weighs in on what content should be promoted and when. The TikTok algorithm is considered one of the best because it shows content relevant to a user's interests and lets new creators gain an audience faster than any other social media platform.

Many Americans will absolutely oppose the government designating what content users should be steered toward, on free speech grounds. TikTok is "guilty" of letting users choose what content they want to be exposed to based on their likes and viewing habits. That's how social media should work, rather than companies claiming not to be biased while sending people down a right-wing echo chamber when they have no interest in politics, or no interest in right-wing politics.

Imagine if you turned on your TV and it automatically tuned to the Discovery Channel, National Geographic, the History Channel, or PBS. That's what your ideal algorithm sounds like, and it would suck, because unless a user sought out that content, it shouldn't be forced on them. It would set a very bad precedent, with people wanting to control the algorithm for their own agendas. Algorithms should work to adhere to the user's agenda.

One thing I'd probably support is regulators mandating a marketplace of algorithms that users can choose from, for platforms of a certain size.
 

cyndaquil

Lv 100 Bold natured
Joined
Sep 2, 2014
Messages
7,309
Reputation
1,413
Daps
24,992
Reppin
JOHTO REGION
The Chinese government didn't realize anything with their version of TikTok, called Douyin; Douyin simply follows the laws and regulations that all online platforms in China have to follow. The algorithms on Douyin are largely the same
I'm skeptical of this. Even "largely the same" is not the same. What is the difference? Besides censorship by the Chinese government, are there other differences?
Many Americans will absolutely oppose the government designating what content users should be steered toward, on free speech grounds. TikTok is "guilty" of letting users choose what content they want to be exposed to based on their likes and viewing habits. That's how social media should work, rather than companies claiming not to be biased while sending people down a right-wing echo chamber when they have no interest in politics, or no interest in right-wing politics.
I agree that it should be based on their likes and viewing habits, but the current methodology is proving to be too effective and addictive. Constantly showing people what they like and what aligns with their viewing habits creates these echo chambers; the personalized algorithms themselves become echo chambers. I understand that mental health is never prioritized in this country, so maybe I'm just going to be alone on this hill, but these apps are causing a host of mental health issues for the people who use them. And you are correct that all these platforms send people down political and social echo-chamber rabbit holes, further and further until they hit the extremes. That's how you radicalize people. What I'm proposing is an independent agency that regulates social media algorithms precisely to prevent that sort of bias, making sure users still get personalized content but are also shown content every so often that slightly misaligns with their current viewing habits. But then, as I mentioned earlier, how do we prevent corruption within that independent agency? Most people aren't going to think critically about the things they are shown and will only seek out the views that align with their own. Giving people content outside their echo chambers from time to time will at least allow them a chance to reassess their views.
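To make that "every so often show something that misaligns" idea concrete, here's a rough sketch of how it could sit inside a feed ranker. Everything in it is made up for illustration (the Item/Profile types, the toy relevance score, the 15% explore rate); it's not any platform's actual ranking code, just the shape of the idea:

```python
import random
from dataclasses import dataclass, field

# Hypothetical stand-ins for platform data; a real system would use far
# richer signals than one topic label and a watch-history score.
@dataclass
class Item:
    title: str
    topic: str

@dataclass
class Profile:
    # Share of recent watch time per topic, e.g. {"gaming": 0.8, "sports": 0.2}
    topic_affinity: dict = field(default_factory=dict)

def relevance(item: Item, profile: Profile) -> float:
    """Toy engagement score: how much this user already watches the topic."""
    return profile.topic_affinity.get(item.topic, 0.0)

def build_feed(candidates, profile, feed_size=10, explore_rate=0.15):
    """Rank mostly by personal relevance, but reserve a small share of slots
    for topics the user rarely watches (the 'slight misalignment' idea)."""
    ranked = sorted(candidates, key=lambda it: relevance(it, profile), reverse=True)
    familiar = [it for it in ranked if relevance(it, profile) > 0]
    unfamiliar = [it for it in ranked if relevance(it, profile) == 0]

    feed = []
    while len(feed) < feed_size and (familiar or unfamiliar):
        # Roughly 15% of slots go to out-of-profile content, the rest stay personalized.
        explore = unfamiliar and (not familiar or random.random() < explore_rate)
        pool = unfamiliar if explore else familiar
        feed.append(pool.pop(0))
    return feed

if __name__ == "__main__":
    profile = Profile(topic_affinity={"gaming": 0.8, "sports": 0.2})
    catalog = [Item(f"clip {i}", t) for i, t in enumerate(
        ["gaming", "sports", "news", "science", "gaming", "politics"] * 4)]
    for item in build_feed(catalog, profile):
        print(item.topic, "-", item.title)
```

The user still mostly gets what they like; the regulator's job would basically be arguing over that explore rate and what counts as "outside" someone's bubble.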
Imagine if you turned on your TV and it automatically tuned to the Discovery Channel, National Geographic, the History Channel, or PBS. That's what your ideal algorithm sounds like, and it would suck, because unless a user sought out that content, it shouldn't be forced on them. It would set a very bad precedent, with people wanting to control the algorithm for their own agendas. Algorithms should work to adhere to the user's agenda.
You're arguing the content shouldn't be forced on them, but couldn't we say that content is already forced on them? On these endless-scroll apps you don't choose what comes next; the app presents a suggestion and you scroll past if you don't like it. Companies already control the algorithm for their own agendas anyway, to maximize engagement and revenue. The problem is they don't factor in the user's own wellbeing, just what the company wants.
One thing I'd probably support is regulators mandating a marketplace of algorithms that users can choose from, for platforms of a certain size.
This would probably work if they were transparent about what each algorithm does and the differences between them. At least then we'd have some semblance of choice in how our content is curated for us. But you'd never get them to admit the true downsides of the most addictive algorithms.
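For what it's worth, here's a rough sketch of how a marketplace like that could force the transparency piece: every ranker has to register a plain-language disclosure of what it optimizes for before users can pick it. All the names and fields here (RANKER_REGISTRY, predicted_watch_seconds, etc.) are invented for illustration, not any real platform's API:

```python
from typing import Callable, Iterable

# Each entry pairs a ranking function with a mandatory plain-language
# disclosure of what it optimizes for.
RANKER_REGISTRY: dict[str, tuple[str, Callable]] = {}

def register_ranker(name: str, disclosure: str):
    """Decorator: a ranker cannot be listed without a disclosure string."""
    def wrap(fn: Callable) -> Callable:
        RANKER_REGISTRY[name] = (disclosure, fn)
        return fn
    return wrap

@register_ranker("chronological",
                 "Shows posts from accounts you follow, newest first. No engagement optimization.")
def chronological(posts: Iterable[dict]) -> list[dict]:
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

@register_ranker("engagement",
                 "Prioritizes posts predicted to keep you watching and scrolling the longest.")
def engagement(posts: Iterable[dict]) -> list[dict]:
    return sorted(posts, key=lambda p: p["predicted_watch_seconds"], reverse=True)

def list_choices() -> None:
    # What a user-facing "pick your algorithm" screen could surface.
    for name, (disclosure, _) in RANKER_REGISTRY.items():
        print(f"{name}: {disclosure}")

def build_feed(choice: str, posts: list[dict]) -> list[dict]:
    _, ranker = RANKER_REGISTRY[choice]
    return ranker(posts)

if __name__ == "__main__":
    posts = [
        {"timestamp": 3, "predicted_watch_seconds": 5},
        {"timestamp": 1, "predicted_watch_seconds": 40},
    ]
    list_choices()
    print(build_feed("chronological", posts))
```

Whether the disclosures would be honest is exactly the problem you're pointing at, which is why a regulator or independent auditor would have to be the one signing off on them.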
 

Rosecrans

CPT
Supporter
Joined
Apr 27, 2014
Messages
10,362
Reputation
5,503
Daps
49,997
 

Pazzy

Superstar
Joined
Jun 11, 2012
Messages
27,780
Reputation
-6,990
Daps
44,431
Reppin
NULL
Feds are literally turning Luigi into a martyr with the death penalty charge. They're overcharging him at this point. More copycats to follow.
 

DetroitEWarren

Superstar
Joined
Jul 15, 2012
Messages
19,318
Reputation
7,016
Daps
61,948
Reppin
Detroit You bytch Ass nikka
I'm saying. This point is flying over so many people's heads. You notice how so many people in America don't want to take any level of self-accountability and actually act to push back against the system when they feel it is being unjust, the way citizens were actually willing to do in the 60s? They're just cheering for Luigi hoping somebody else crashes out, not achieving or impacting anything. This is why shyt won't change.

The same people cheering for Luigi were busy trying to intimidate everybody and tell them not to vote for Jill Stein or a third-party candidate, which would have been a curveball for the two-party establishment running Washington and enabling the healthcare system where UnitedHealthcare is able to move the way it does.
Man, shut your dumbass up, my dawg. You are one of the dumbest nikkas I have ever read, and you should be completely absent from any form of serious conversation.
 