It’s common for ads and posts to appear in our social network feeds because companies track our travels in cyberspace.
Congress wants to regulate the tools social media companies use to monitor us.
The Filter Bubble Transparency Act is a bipartisan bill that would require internet companies such as YouTube and Facebook to let users opt out of AI-driven content recommendation systems that rely on personal data to serve them content and advertisements.
Some social media platforms, including Facebook and Twitter, already allow users to switch from algorithm-driven news feeds to feeds that display posts chronologically. That means what you see is based on when pages published content, not on what the platform thinks you may want to see.
YouTube users can also disable the autoplay feature, which queues up suggested videos based on personal data.
Newsy interviewed algorithm experts, who agreed that the legislation is a step in the right direction but said the regulations are not yet refined and would be difficult to implement.
Robin Burke, a professor at the University of Colorado, said that while users may not actually want to view things in chronological order, the option could give people more control over, or at least more insight into, how the system works.
“I would love to be able to do this kind of semi-incognito, where I just say, ‘Hey, Facebook, turn off’ — not the recommendation algorithm, but ‘turn off the data reading,’” said Professor Noah Giansiracusa of Bentley University. “I won’t remain anonymous, but none of the engagement actions I take will then be used to feed back into the algorithm. And when it’s ready, I can turn the semi-incognito back off, and it’ll take my data.”
Some entrepreneurs have already begun offering an algorithm-free experience. MeWe CEO Mark Weinstein said the platform was designed for people looking for a social media experience that relies on neither membership fees nor marketing driven by personal data.
He said the legislation would not stop companies from supporting their data-driven business models.
“This legislation would attempt to mandate a transparent, non-opaque experience — if the user wanted it — but the social network would still be collecting massive amounts of data, and any time the user toggled it back, it would then use all that data,” Weinstein said.
Experts offered a possible solution: create tools that slow the spread of false content and conspiracy theories.
Filippo Menczer of Indiana University’s Observatory on Social Media told Newsy that his research has shown people are more inclined to share lower-quality, less verifiable content when they see it has a lot of likes, comments, and shares.
“We’ve moved to a frictionless communication system,” Menczer said. “We must increase friction to make the ecosystem more resistant to manipulation and to help us deal with the flood. When we have information overload, we are less able to discern quality information from junk.”
This story was originally reported by Tyler Adkisson of Newsy.