Can you feel it? We are constantly being analyzed and predicted by algorithms.
Did you watch that video or movie because Xigua Video or iQiyi recommended it to you? Did you add a friend on Kuaishou or Douyin from the "People You May Know" list? Do JD.com and Taobao keep pushing products that tug at your heart?
These platforms are driven by algorithms that rank content based on our data and push the highest-priority recommendations to us.
If we make decisions based on what these algorithms show us, are those decisions really made of our own free will?
An algorithm is essentially a recipe: a list of rules that processes a set of data to produce a result. For technology companies, that result is often revenue: persuading us to buy something, or keeping us scrolling past more ads. And of course, attracting more user traffic in turn attracts more advertisers.
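To make the idea concrete, here is a minimal sketch of what such a ranking rule might look like. All the field names and weights below are hypothetical illustrations invented for this example, not any real platform's formula.

```python
# A toy feed-ranking algorithm: score each post by hypothetical
# engagement signals, then show the highest-scoring posts first.
# Weights and fields are invented for illustration only.

def engagement_score(post: dict) -> float:
    """Score a post by how likely it is to keep a user engaged."""
    return (2.0 * post["likes"]
            + 3.0 * post["shares"]
            + 1.5 * post["watch_seconds"]
            + 5.0 * post["matches_user_interests"])

def rank_feed(posts: list) -> list:
    """Sort candidate posts so the highest-scoring appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "likes": 10, "shares": 1, "watch_seconds": 30, "matches_user_interests": 0},
    {"id": "b", "likes": 2,  "shares": 0, "watch_seconds": 5,  "matches_user_interests": 1},
]
ranked = rank_feed(posts)
print([p["id"] for p in ranked])  # highest engagement first
```

The key point is not the particular weights but the structure: a fixed list of rules turns raw behavioral data into a ranked feed, and whoever chooses the weights chooses what you see first.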
When we are online, the data generated by our behavior, intentional or not, is captured, analyzed, and exploited by algorithms. Every time you like a post, watch a video, or buy something, your data can be used to predict your next move.
These algorithms can affect us even when we are unaware of them. If a recommendation algorithm is left unguided and uncontrolled, it may drive viewers toward increasingly extreme content, leading to online radicalization.
The algorithms sort content to keep us engaged on the platform. This feeds a phenomenon known as "emotional contagion": seeing positive posts makes us more likely to write positive posts ourselves, while seeing negative posts makes us more likely to write negative ones.
What the industry calls "dark patterns" are designs meant to trick us into sharing more, or spending more on shopping sites. These are website design tricks, such as hiding the unsubscribe button or displaying how many people are buying the product you are looking at, that subconsciously push you to do what the site wants you to do.
"Cookies" are small blocks of data that track us across different websites. They are records of online actions stored in the browser, such as links clicked and pages visited. Combining this with data from multiple other sources, including large-scale data breaches, is called "data enrichment": it links our personal data, such as email addresses, to other information, such as our level of education.
Internet technology companies often use this data to build profiles of us and predict our future behavior.
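A toy sketch of what "data enrichment" means in practice: records from different sources are merged into a single profile using a shared key, such as an email address. All the data and field names below are invented for illustration.

```python
# Hypothetical illustration of data enrichment: merging records
# from separate sources into one profile per email address.

browsing_log = [
    {"email": "alice@example.com", "pages_visited": ["/shoes", "/sale"]},
]
leaked_records = [
    {"email": "alice@example.com", "education": "MSc"},
]

def enrich(sources: list, key: str = "email") -> dict:
    """Merge records from multiple sources into one profile per key."""
    profiles = {}
    for source in sources:
        for record in source:
            # Start an empty profile for new keys, then fold in fields.
            profiles.setdefault(record[key], {}).update(record)
    return profiles

profiles = enrich([browsing_log, leaked_records])
print(profiles["alice@example.com"])
```

Each source on its own may seem harmless; it is the join on a stable identifier that turns scattered traces into a detailed profile.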
So how much of our behavior can be predicted by data-based algorithms?
In last year's Nature Human Behaviour, a U.S. research team explored this question by examining how much information about a person is contained in their friends' social media posts.
Using Twitter data, the researchers estimated how accurately a person's tweets could be predicted from their friends' data alone. They found that data from just eight or nine friends was enough to predict someone's tweets about as well as viewing that person's own feed directly (with better than 50 percent accuracy). In fact, 95% of the potential predictive accuracy that machine learning algorithms could achieve was attainable from friends' data alone.
In other words, with the data of about 8-9 of your friends, an algorithm can predict your future posts almost as accurately as if it had access to your own data.
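The intuition behind the finding can be shown with a toy example. This is a deliberate simplification, not the researchers' actual model: it simply guesses a user's most likely topics from the word frequencies in their friends' posts, on the assumption that friends tend to talk about similar things.

```python
from collections import Counter

# Toy illustration (not the study's actual model): predict a user's
# likely topics purely from the words in their friends' posts.

friends_posts = [
    "great match last night football football",
    "new camera lens photography",
    "football transfer rumours again",
]

def predict_topics(posts: list, top_n: int = 2) -> list:
    """Rank words by how often they appear across friends' posts."""
    counts = Counter(word for post in posts for word in post.split())
    return [word for word, _ in counts.most_common(top_n)]

print(predict_topics(friends_posts))
```

A real model uses far richer features than word counts, but the principle is the same: your friends' data carries a statistical shadow of you, whether you post or not.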
The result means that even if you delete your Facebook account (as many did after the Cambridge Analytica scandal in 2018), you can still be profiled through the social relationships that remain. These ties make it difficult for anyone to be truly removed from the network.
The researchers also found that profiles of non-users, known as "shadow profiles," can be built from the contact lists that users upload to the platform. Even if you have never used Facebook, a shadow profile of you can be constructed if your friends have.
On social media platforms such as Facebook and Twitter, privacy is no longer limited to individuals, but is linked to the entire network.
Is there no way to carve out a private space online, or to be forgotten? If you do delete your account, the information contained in your social relationships with friends becomes stale over time. The researchers found that your predictability then gradually drops to a low level, and your privacy and anonymity are eventually restored.
Although it may seem that algorithms are eroding our ability to think independently, this is not quite the case. When it comes to spreading information, people matter just as much as algorithms. On online platforms, how much you engage with different perspectives depends closely on your social group, not just on how the platform presents news to you. And while "fake news" may spread faster than facts on some platforms, it is mainly humans, not bots, who spread it.
Of course, content creators also exploit the algorithms of social media platforms such as Kuaishou and Douyin to promote their content, not just the other way around.