Earlier this month, leaked internal documents revealed that Facebook fails to protect its users. Today, October 25, 2021, Mark Zuckerberg responded: “my view is that what we are seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company.” But do the documents really paint a false picture?
Facebook, the biggest social media company, knows how to make a profit. It strategizes to profit from its sites’ users. Social media is not here for us; we are here for it. We are the dollars it makes. In America, Facebook has made an effort to curb the spread of false information, but the rest of the world doesn’t receive the same “treatment.”
Take India, for example. Two years ago, a couple of employees created a fake account to see what its feed would look like. The profile was written from the point of view of a young woman in India. This was during an exceptionally tense period between Pakistan and India, and her news feed quickly filled with violent Indian and Pakistani content.
Even though Facebook tries to prevent this kind of situation in the States, it neglects other parts of the world. It is hard to moderate the platform when it is being used in a different language, yet those are exactly the places where the risk of disinformation grows highest.
Questions:
Facebook is known for its lack of user safety. Do you think there is any way to protect users beyond what Facebook is doing?
Facebook has the resources to ensure better protection for its users. Why doesn’t it?
Should Facebook be held more accountable?
Sources:
https://www.cnbc.com/2021/10/25/facebook-earnings-zuckerberg-slams-press-coverage.html
https://www.cnn.com/2021/10/25/tech/facebook-earnings-q3/index.html
https://www.washingtonpost.com/technology/2021/10/24/india-facebook-misinformation-hate-speech/
8 comments:
I believe that there could be more ways for Facebook to protect its users' information, but that the company doesn't give out specifics, to avoid revealing details that could be used to further exploit its security flaws. It should be in Facebook's best interest to keep its users safe and their information secure, not just from a moral standpoint but from a financial one as well. Facebook's stock price dropped by a substantial amount when news of this incident broke. Data leaks are never good for business; an example from another company would be T-Mobile's data leak, from which its stock price still hasn't recovered. I believe that protecting users beyond what Facebook is already doing is up to the users themselves: keeping as little information as possible linked to Facebook would lessen the damage from these data leaks. Facebook should also be held more accountable for its site as a whole. It currently exists as one of the most toxic places on the internet, and not just in foreign nations; certain groups in the United States act as a plague on the platform as well. Facebook is a global company with money to spare, and if it wants to keep its stock price up, it has to stay accountable for what is put on its site(s) and increase its security efforts.
I certainly agree with Cody that there are many ways Facebook could protect its users' data and personal information, and that it should want to do so in order to preserve its reputation. However, the current issue lies with the environment Facebook has allowed its users and algorithm to create, and with the misinformation allowed to spread within it. The reason Facebook doesn't do more on its own to prevent misinformation and a toxic environment is that these isolated online communities make it money, for example through ads directed toward radical right-wing supporters. As with so many seemingly illogical situations, money is the answer: Facebook has no incentive to prevent misinformation from spreading unless there is a legal or PR reason to do so. For Facebook to become a safer site to use, from both a data and a misinformation perspective, it must be held accountable, whether that means legal regulation or people deciding not to use the site and thus reducing its profits. Ultimately, Facebook has a multitude of issues related to misinformation, data, and its monopolistic business practices, and there is little reason to trust that it can resolve them alone. Unless a truly drastic scandal destroys the company's value, it has little incentive to put user safety and information standards over profits and site traffic.
I also think that Facebook needs to be held more accountable. If the government won’t hold Facebook accountable for its harmful actions, then the people need to step up. As Cody mentioned, people could stop using Facebook. They can switch to other social media sites permanently or simply boycott Facebook until the company starts protecting its users better. There are so many other social media sites and ways to connect with people that users can find alternatives that work for them fairly easily. Once the company sees a majority of its user base no longer on the app, it may begin to respond and listen to people’s concerns. The company’s name change isn’t going to do anything to make people actually feel safer using Facebook; the company needs to create policies and algorithms that protect its users and rebuild user faith in the site.
Obviously we need to take action so that corporations like Facebook are held accountable for what they do. I found a really interesting article from The Atlantic about how we should think of Facebook not as a company but as a separate nation. Facebook has around 2.9 billion monthly active users, almost 30% of the world's population, which means that whatever Facebook does or doesn't do has an enormous influence on the entire world. In just the first part of 2021, advertisers spent around $54 billion to market on Facebook alone. In only the past five years, Facebook has caused huge and almost irreparable damage to the US through elections, misinformation, and division. But this is not only a US issue: more of Facebook's users live in India than anywhere else. There needs to be large-scale government action on corporations like Facebook before there is no way back to normal life.
https://www.theatlantic.com/magazine/archive/2021/11/facebook-authoritarian-hostile-foreign-power/620168/
Here is an interesting segment from John Oliver about how misinformation is such a huge issue, especially outside of the US, and what companies like Facebook are allowing:
https://www.youtube.com/watch?v=l5jtFqWq5iU
As mentioned by Cody and Alex, Facebook is ultimately a corporation whose main drive is to make money. Facebook lacks the will to ensure better protection because, I assume, it would rather spend its money on new ways to market and expand its reach. To expand on Cody’s point about the safety of users being in Facebook’s best interest on both a moral and a financial level, it’s important to consider Facebook’s effect on other social media platforms. Rival social media apps want to stand out from Facebook by portraying themselves as more unique and safe. Interestingly, an article from Politico mentions how Zuckerberg worried that smaller companies would view attacks on his big corporation as discouraging; as a result, other platforms would presumably run away from their responsibility to improve their platforms and provide for users. (In my opinion, Zuckerberg appears hypocritical here, as Facebook is still ultimately running away from its own responsibility, not to mention that other sites like TikTok are apparently striving for better protection anyway.) Because Facebook is such a wide-reaching platform, it should be held more accountable, since it ultimately serves as a guide for smaller organizations. Whether by directly responding to public demands or by making public statements, Facebook needs to take action to maintain a distinguished status as soon as possible; otherwise it will be stereotyped as an insecure platform even if changes are eventually made.
https://www.politico.com/news/2021/10/26/facebook-social-media-protecting-kids-517282
Facebook has access to an abundance of private user information, and it should be held responsible for protecting that privacy. As a big platform with the ability to safeguard this information, it makes no sense that it doesn’t exercise that ability. This is even more striking considering Zuckerberg’s vision for the Metaverse, an online platform that will let you interact with friends virtually. It would involve “sensors in people’s homes, collecting vast amounts of data,” which would further increase Facebook’s access to private information. Without secure protection, the risk of private information being exposed will grow enormously if Zuckerberg follows through with his plans.
https://www.forbes.com/sites/kateoflahertyuk/2021/11/13/why-facebooks-metaverse-is-a-privacy-nightmare/?sh=48d184336db8
I believe Facebook should be held accountable. Users on social media apps have the freedom to post whatever they want to share with the public, within some restrictions, and they trust that things like this won’t happen. This is a complete breach of trust between Facebook and its users. A leading issue from this incident is that it can put people in harm’s way and put their safety at risk. In the future, Facebook needs to strengthen the security and verification of users’ accounts. There was a lack of concern for these people’s safety, but at least the incident can help other up-and-coming social media platforms open their eyes and make sure this kind of exploit does not happen again.
This article connects well to the media seminar we conducted in class, which covered topics such as the spread of disinformation and whether these newer media forms are driving polarization, among others. Algorithmic boosting, the software built into platforms that shapes each user’s experience, should be examined as a way to prevent the spread of false information and issue-specific news (oversight boards have been established, but it is impossible to monitor the sheer quantity of comments). Passive sharing exacerbates the problems created by large media companies such as Facebook, so attempts to dismiss this concern about the platform should be shut down. The private information of users needs to be protected by the companies that gather it, as is expected of them. Over time, users build up archives in the system based on the things they click on and choose to interact with. Some of the information users provide to these sites can also be very sensitive, and should it fall into the wrong hands, such as hackers who can then sell it to buyers around the world, it could prove detrimental and even dangerous to the user. In this digital age I think there needs to be more priority placed on user safety and information privacy, and companies such as Facebook need to be criticized so that action is taken to strengthen cybersecurity and information-storage systems.