About a week ago, Frances Haugen, a former Facebook product manager, testified before the Senate with disturbing revelations. She showed how Facebook has hidden internal research finding that teenagers feel worse after using its apps, and that the company is willing to show hateful content to keep people coming back. Of course, Zuckerberg fought back, saying the evidence is false or taken out of context, yet senators insisted on the bombshell status of the information the whistleblower revealed. Facebook’s response? To make it harder for employees to leak: the company has made internal communication more private company-wide and assigned certain employees to be in charge of security.
This disturbing testimony only further proved how the media has been putting profits before all else. Facebook prides itself on being an app that connects people, yet it is really just exploiting them to keep the company afloat. As we learned in gov this week, that’s exactly what most TV news stations are doing now too - putting the most entertaining news up front to keep viewers coming back. Online media has become an entertainment business first and foremost.
Questions:
What do you believe is the correct plan of action for the Senate to take? What changes should Facebook make?
Where is the line between the freedom of private companies and the mental-health dangers they may be inflicting?
Senate testimony:
https://www.nytimes.com/2021/10/05/technology/facebook-whistle-blower-hearing.html
Facebook’s response:
https://www.nytimes.com/2021/10/13/technology/facebook-workplace-transparency-leaks.html
10 comments:
Link to article: https://nypost.com/2021/10/09/how-facebook-can-and-should-be-tamed-by-the-government/
I believe the government should pass strong legislation that limits what big tech companies can do with personal data and surveillance. Facebook already has too much power, with access to roughly half the world’s population -- so by going after the way Facebook makes its profit, legislation has the potential to be far more influential in the long run. By forcing Facebook to delete personal data (with an exception for public health), tactics like ad targeting and the algorithmic bubble created around users would become much less precise. There could also be a new government agency tasked with overseeing these huge tech companies, created so that there is a direct line of communication with the government about how these companies can be regulated for the public good -- that means answering questions like whether banning social media for minors, or imposing severe restrictions when minors do use it, would benefit the public.
There really isn’t a question anymore about whether Facebook is prioritizing profits over the public good -- Frances Haugen's report and thousands of pages of Facebook’s internal research essentially prove that, along with plenty of other examples from the recent past. It’s pathetic to see Facebook doing damage control to prevent future whistleblowers, especially since opinion within Facebook about Haugen’s report is divided.
The way that Adam Mosseri, the head of Facebook’s Instagram service, tried to defend Facebook by comparing it to automobiles is telling. He said, “We know that more people die than would otherwise because of car accidents, but by and large, cars create way more value in the world than they destroy…And I think social media is similar.” (sourced from the article)
And I think Tech columnist Kevin Roose’s tweet sums up the problem with Adam Mosseri’s analogy pretty well: “Like if Chipotle was getting criticized for having salmonella in its guac or whatever and the CEO’s response was like, ‘Well, scaled food production has had many benefits for humanity, including freeing us from being hunter-gatherers.’ ”
But it brings up a good point as well: Facebook is alive because it has become centerpiece to the way that people communicate now, which has strengthened with the pandemic. So any regulations on Facebook have the possibility of messing up the way that millions of people communicate nowadays -- it’s definitely something to consider.
I believe that Facebook should not have hidden that information, and the fact that they did says something about their overall trustworthiness and how they value money over their consumers’ mental health. I don’t believe the Senate has the ability to do much about the situation, since it doesn’t look like Facebook broke any law or will admit to anything without a confrontation. Also, it is everyone's personal choice to use Facebook and willingly expose themselves to the negative emotions they feel when using the app, and this can’t be controlled by Facebook. Something the Senate can do to lessen the negative emotional impact is to educate people beforehand about the possible consequences and let them decide. I agree with Anthony that companies should prioritize keeping users’ information safe and making sure their products are secure. Private social media companies should be stopped from providing faulty or unsafe products, and should keep their platforms free of libel and mudslinging, but they shouldn’t be responsible for users’ decisions on the app unless those decisions harm others.
I agree with Anthony that a government agency or task force should be created to focus solely on social media companies, especially considering how large these corporations have become and the influence they have in the modern world. The main concern lies with people's personal information and the data collected by Facebook and other social media sites that exploit users. While I don't disagree with Stephanie's perspective that people should be better educated, I believe social media has become such a common part of modern life that education alone is not enough to limit the damage these companies cause.
Addressing the question of mental health and social media company responsibility, I believe that social media is inherently bad for people's mental health. Comparing ourselves to other people is an intrinsic part of social media, and also its most dangerous aspect for mental health. I believe the solution lies in regulating social media corporations, as they have been free to exploit users and their data. Furthermore, there should be rules in place that set standards to protect people as much as possible from the dangers of social media. How that would be done is a difficult question to answer, but something can likely be done, such as stricter posting guidelines. Ultimately, Facebook and other social media corporations have grown out of control, and action should definitely be taken to prevent anything dangerous from occurring.
I believe there needs to be more government regulation of big social media/tech companies. I totally agree with what Anthony said, especially that Facebook’s exploitation of users and prioritization of profits is not even a question anymore - it’s pretty clear. To protect the interests of people using Facebook, regulation is the best course. You never know whether the information found on Facebook could harm a person, perhaps dramatically, and mental health is very important. Of course, there is going to be fierce resistance; Facebook is a big company that can use its power and influence to combat what is being said. If the evidence is clear, then it is only right to rein Facebook in with regulations. A good way to do this would be a specific department specializing in regulating social media. I know that government regulation of social media has been a hot debate for a while. I think full regulation may not be good, since it is important to have the freedom of speech to criticize the government if needed, but some regulation is required. It is important to get rid of hate speech and clearly detrimental content.
Private companies should have some freedom, but not to the point where they are clearly damaging mental health. In my opinion, mental health should be put above a private company's freedom, especially if that company, Facebook, is clearly prioritizing profits over customer health. The fact that they made it harder for employees to leak information makes it even worse. If I were Facebook, I would censor hateful content and maybe even show that happening publicly so people know Facebook is doing something good. This could improve Facebook's image and be a win-win for both the customers and the company.
Private companies are entitled to their freedom, but as mentioned in multiple comments above, there needs to be some sort of regulation. Unfortunately, the media isn't here for our benefit - it's here to sell us something. We allow the media to keep making money, so it makes sense why companies such as Facebook hide certain information. I’m not sure if social media companies intend for their sites to become places of competition, but it definitely benefits them that they have. People are always trying to outdo one another; to get the best photo or the most attention, many people will do whatever it takes. Constantly seeing unattainable lifestyles that everyone wants but few can “get” is detrimental to mental health. Luckily for Facebook and other companies, this is just another cycle that makes them more money for exploiting their users.
I do agree with Stephanie that users of social media are there of their own free will, but the thing is that younger users aren’t as aware that they’re just dollar signs to these companies. For this reason, there need to be stricter regulations, especially for protecting the youth, but as Anthony mentioned, it can be hard to regulate one thing while protecting other aspects of the platform, like communication.
It’s extremely difficult to hear about all the mental and emotional damage social media companies like Facebook are causing. Other peers brought up very insightful comments about needing more regulation on what can get posted and how social media companies should monitor what’s on their sites. My only concern is how effective this regulation would truly be. People are always finding ways around rules and regulations. For example, most social media companies say you have to be 13 years old or older to join, but a lot of children under 13 create fake emails and lie about their birthdays to gain access. That seems extremely hard to monitor. Having stricter posting guidelines seems like a good idea, but they might not be as effective as we hope: simply changing a font or replacing letters with symbols could make it easy for people to bypass the regulations. There obviously does need to be better monitoring of extreme posts. However, another way to help the mental health of social media users is to boost the posts that are positive and promote good mental health. If Facebook built software that boosts positive messages, or posted more uplifting messages itself, this could positively impact users’ mental health. It seems like a difficult process, and it’s hard to say that one solution will fix everything. But if Facebook tried to fix the problem from multiple angles, there could be a better outcome than if they focused on only one solution.
I agree with Elizabeth about the difficulties of regulating online activity. Even though content online can appear to be self-selected, algorithms like those on Instagram or TikTok are dedicated to pushing content that will keep you on the platform the longest. This can lead to a lot of mental-health issues (overall, it's not good to spend hour after hour scrolling), and the content itself can exploit users' insecurities. Especially on apps where the content is almost entirely visual, like Instagram, this can reinforce unhealthy expectations and beauty standards.
These algorithms have also been observed to be pathways for political radicalization. Posts that are the most outrageous often get the most engagement and are pushed out more, which can spread dangerous ideas and misinformation that should definitely be regulated on these platforms.
This isn't as related to my earlier points, but I feel like social media is having a big impact on consumer culture. We are consuming a lot more now, especially when it comes to clothing. Apps like TikTok especially, as well as Instagram, contribute to a rapid turnover of trend cycles. This leads to people spending a lot on fast fashion in order to fit in (thanks to the insecurities created by the algorithms discussed earlier). Fast fashion relies on worker exploitation, and these clothes are often cheaply made and donated or thrown out after a few months. People are consuming more and using items for less time, which is having a huge impact on the environment.
Relating to the media chapter, I feel like the media that we consume every day is having a large impact on our lifestyle choices and our mental health, as well as political opinions. Especially after being stuck inside for so long with only a phone and computer screen, media is increasingly intertwined with every part of our lives (what we think, watch, and consume).
I think the Senate should try to pass major legislation on how much data and influence large tech companies can have. I hope legislation can be passed to bring Big Tech's power under control, but I worry it may not, given the amount of lobbying these companies do. According to an article from Public Citizen, in the 2020 election cycle alone Big Tech spent around 124 million dollars on lobbying and campaign donations. I do somewhat understand that regulating online activity can be hard, especially on Facebook, where people can create their own groups - essentially their own echo chambers for sharing dangerous and often false information. Facebook as a whole has honestly caused many of the problems we face in modern society, and correcting it will take more than a new "metaverse" name. In the past few years, we have learned about the hidden influence Facebook and other big tech companies have on US elections, which should be essential to monitor for the sake of the health of our country. On top of that, Facebook's entire goal is to keep people addicted to its platforms and bring in large profits. While this may not relate exactly to the questions, I found an interesting article from the New York Times: https://www.nytimes.com/2021/10/16/technology/instagram-teens.html It talks about how Facebook's new goal for Instagram is to hook young teens on its sites by doing whatever it takes to keep them using its products - essentially gaining a generation of consumers. These things should be extremely important to our government, and regulating them is a necessity.
Here is the link to a very interesting article on how much influence large companies especially big tech have on the US government:
https://www.citizen.org/article/big-tech-lobbying-update/
Many other comments have called out Big Tech on its horrific actions, and I completely agree that these companies need to be held accountable. However, what's the answer? Shutting down Facebook? Snapchat? Instagram? The truth is, we use these apps daily if not hourly. These apps DO serve as a source of connection to people around the world, and have touched lives and even saved them, particularly during the isolating time of quarantine. I think these companies need to step forward and admit to their flaws: publish the staggering mental-health numbers, then publish the actions they plan to take to address them. Mental health matters more than money, even if these companies don't view it that way. Publishing these numbers, especially to teenagers, would be a sharp reminder to use these apps in moderation. Not everything you see on them is real, and it's important to take them with a grain of salt. In this way, people can benefit from the pros of social media while taking a step back to re-evaluate the flaws, and companies can do the same, taking crucial steps to mitigate them. The government is in a position to mandate this through legislation and should, but I think removing these apps or fining the companies is unrealistic, as neither would be effective in the long run.
Facebook's platform centers around advertisement revenue, and to earn that money, it needs to continue generating clicks. Yes, there is a moral dilemma about how far is too far to go for profit, but I think the current level is acceptable. First, Facebook plays into the difficulties that come with being a teenager. There is always pressure to look, act, and talk a certain way, and Facebook and its products play into that. However, these "standards" never really change later in life: people at any age want to fit in, and Facebook/Instagram/Snapchat promote that. Big Tech connects people to each other and keeps isolated communities in touch with the world; demonizing it ignores all the good it does. I don't think the Senate is really in a place to do anything here. If anything, Facebook should implement "reminders" after users scroll through a certain number of posts/stories, encouraging them to take a break or asking if they've walked the dog or done the dishes - a nudge not to become absorbed in the app. Companies shouldn't promote bad mental health, but if a teen's mental health is damaged by seeing their peers post about their lives on Facebook, then it is up to the user to detach when they need to. It is also just as important for the youth to develop good habits with technology; it's here to stay, and learning how to engage with social media positively is crucial.