
Criticism of Facebook

Article snapshot taken from Wikipedia, available under the Creative Commons Attribution-ShareAlike license.



Revision as of 07:17, 7 January 2021



Criticism of Facebook has led to international media coverage and significant reporting of its legal troubles and the outsize influence it has on the lives and health of its users and employees, as well as its influence on the way media, specifically news, is reported and distributed. Notable issues include Internet privacy, such as use of a widespread "like" button on third-party websites tracking users, possible indefinite records of user information, automatic facial recognition software, and its role in the workplace, including employer-employee account disclosure. The use of Facebook can have negative psychological effects that include feelings of jealousy and stress, a lack of attention, and social media addiction that in some cases is comparable to drug addiction.

Facebook's operations have also received coverage. The company's electricity usage, tax avoidance, real-name user requirement policies, censorship policies, handling of user data, and its involvement in the United States PRISM surveillance program have been highlighted by the media and by critics. Facebook has come under scrutiny for 'ignoring' or shirking its responsibility for the content posted on its platform, including copyright and intellectual property infringement, hate speech, incitement of rape and terrorism, fake news, Facebook murder, crimes, and violent incidents live-streamed through its Facebook Live functionality.

The company and its employees have also been subject to litigation cases over the years, with its most prominent case concerning allegations that CEO Mark Zuckerberg broke an oral contract with Cameron Winklevoss, Tyler Winklevoss, and Divya Narendra to build the then-named "HarvardConnection" social network in 2004, instead allegedly opting to steal the idea and code to launch Facebook months before HarvardConnection began. The original lawsuit was eventually settled in 2009, with Facebook paying approximately $20 million in cash and 1.25 million shares. A new lawsuit in 2011 was dismissed. Some critics predict the end of Facebook based on the problems they identify.

Facebook has been banned by several governments, including those of Syria, China, and Iran, for various reasons.

On August 13, 2019, it was revealed that the company had enlisted contractors to create and obtain transcripts of users' audio messages.

Privacy issues

Main article: Privacy concerns of Facebook

Widening exposure of member information 2011–2012

In 2010, the Electronic Frontier Foundation identified two personal information aggregation techniques called "connections" and "instant personalization". They demonstrated that anyone could get access to information saved to a Facebook profile, even if the information was not intended to be made public. A "connection" is created when a user clicks a "Like" button for a product or service, either on Facebook itself or an external site. Facebook treats such relationships as public information, and the user's identity may be displayed on the Facebook page of the product or service.

Instant Personalization was a pilot program which shared Facebook account information with affiliated sites, such as sharing a user's list of "liked" bands with a music website, so that when the user visits the site, their preferred music plays automatically. The EFF noted that "For users that have not opted out, Instant Personalization is instant data leakage. As soon as you visit the sites in the pilot program (Yelp, Pandora, and Microsoft Docs) the sites can access your name, your picture, your gender, your current location, your list of friends, all the Pages you have Liked—everything Facebook classifies as public information. Even if you opt out of Instant Personalization, there's still data leakage if your friends use Instant Personalization websites—their activities can give away information about you, unless you block those applications individually."

On December 27, 2012, CBS News reported that Randi Zuckerberg, sister of Facebook founder Mark Zuckerberg, criticized a friend for being "way uncool" in sharing a private Facebook photo of her on Twitter, only to be told that the image had appeared on a friend-of-a-friend's Facebook news feed. Commenting on this misunderstanding of Facebook's privacy settings, Eva Galperin of the EFF said "Even Randi Zuckerberg can get it wrong. That's an illustration of how confusing they can be."

Issues during 2007

In August 2007, the code used to generate Facebook's home and search page as visitors browse the site was accidentally made public. A configuration problem on a Facebook server caused the PHP code to be displayed instead of the web page the code should have created, raising concerns about how secure private data on the site was. A visitor to the site copied, published and later removed the code from his web forum, claiming that Facebook had served him with legal notice and threats. Facebook's response was quoted by the site that broke the story:

A small fraction of the code that displays Facebook web pages was exposed to a small number of users due to a single misconfigured web server that was fixed immediately. It was not a security breach and did not compromise user data in any way. Because the code that was released powers only the Facebook user interface, it offers no useful insight into the inner workings of Facebook. The reprinting of this code violates several laws and we ask that people not distribute it further.
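This class of misconfiguration is easy to detect mechanically: a correctly executed PHP page never contains raw PHP tags, so their presence in a response body suggests the server returned the script's source instead of running it. A minimal illustrative sketch (the function name and sample bodies are invented for illustration):

```python
def looks_like_php_source(body: str) -> bool:
    # An executed PHP page should never contain raw "<?php" or "<?=" tags;
    # their presence suggests the server served the script's source text.
    return "<?php" in body or "<?=" in body

# A leaked script versus a normally rendered page:
looks_like_php_source("<?php echo render_home($user); ?>")       # True
looks_like_php_source("<html><body>Welcome home</body></html>")  # False
```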

In November, Facebook launched Beacon, a system (discontinued in September 2009) where third-party websites could include a script by Facebook on their sites, and use it to send information about the actions of Facebook users on their site to Facebook, prompting serious privacy concerns. Information such as purchases made and games played was published in the user's news feed. An informative notice about this action appeared on the third party site and allowed the user to cancel it. The user could also cancel it on Facebook. Originally if no action was taken, the information was automatically published. On November 29 this was changed to require confirmation from the user before publishing each story gathered by Beacon.
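The mechanism described above, a partner page reporting a user's action back to a central tracker via an embedded script, can be sketched roughly as a beacon request whose query string carries the action details. A hypothetical reconstruction, not Beacon's actual protocol (endpoint and parameter names are invented):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def beacon_request_url(endpoint: str, action: dict) -> str:
    # The partner site embeds a tiny script/image request; the user's
    # action travels back to the tracking host in the URL's query string.
    return endpoint + "?" + urlencode(action)

url = beacon_request_url(
    "https://beacon.example/track",
    {"user": "12345", "action": "purchase", "item": "movie-ticket"},
)
# The tracker recovers the action server-side by parsing the query string:
parse_qs(urlparse(url).query)["action"]  # ['purchase']
```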

On December 1, Facebook's credibility in regard to the Beacon program was further tested when it was reported that The New York Times "essentially accuses" Mark Zuckerberg of lying to the paper and leaving Coca-Cola, which is reversing course on the program, with a similar impression. A security engineer at CA, Inc. also claimed in a November 29, 2007, blog post that Facebook collected data from affiliate sites even when the consumer opted out and even when not logged into the Facebook site. On November 30, 2007, the CA security blog posted a Facebook clarification statement addressing the use of data collected in the Beacon program:

When a Facebook user takes a Beacon-enabled action on a participating site, information is sent to Facebook in order for Facebook to operate Beacon technologically. If a Facebook user clicks 'No, thanks' on the partner site notification, Facebook does not use the data and deletes it from its servers. Separately, before Facebook can determine whether the user is logged in, some data may be transferred from the participating site to Facebook. In those cases, Facebook does not associate the information with any individual user account, and deletes the data as well.

The Beacon service ended in September 2009 along with the settlement of a class-action lawsuit against Facebook resulting from the service.

News Feed and Mini-Feed

On September 5, 2006, Facebook introduced two new features called "News Feed" and "Mini-Feed". The first of the new features, News Feed, appears on every Facebook member's home page, displaying recent Facebook activities of the member's friends. The second feature, Mini-Feed, keeps a log of similar events on each member's profile page. Members can manually delete items from their Mini-Feeds if they wish to do so, and through privacy settings can control what is actually published in their respective Mini-Feeds.

Some Facebook members still feel that the ability to opt out of the entire News Feed and Mini-Feed system is necessary, as evidenced by a statement from the Students Against Facebook News Feed group, which peaked at over 740,000 members in 2006. Reacting to users' concerns, Facebook developed new privacy features to give users some control over information about them that was broadcast by the News Feed. According to subsequent news articles, members have widely regarded the additional privacy options as an acceptable compromise.

In May 2010, Facebook added privacy controls and streamlined its privacy settings, giving users more ways to manage status updates and other information broadcast to the public News Feed. Among the new privacy settings is the ability to control who sees each new status update a user posts: Everyone, Friends of Friends, or Friends Only. Users can now hide each status update from specific people as well. However, a user who presses "like" or comments on the photo or status update of a friend cannot prevent that action from appearing in the news feeds of all the user's friends, even non-mutual ones. The "View As" option, used to show a user how privacy controls filter out what a specific given friend can see, only displays the user's timeline and gives no indication that items missing from the timeline may still be showing up in the friend's own news feed.

Cooperation with government requests

Government and local authorities rely on Facebook and other social networks to investigate crimes and obtain evidence to help establish a crime, provide location information, establish motives, prove and disprove alibis, and reveal communications. Federal, state, and local investigations have not been restricted to profiles that are publicly available or willingly provided to the government; Facebook has willingly provided information in response to government subpoenas or requests, except with regard to private, unopened inbox messages less than 181 days old, which would require a warrant and a finding of probable cause under federal law, the Electronic Communications Privacy Act (ECPA). One 2011 article noted that "even when the government lacks reasonable suspicion of criminal activity and the user opts for the strictest privacy controls, Facebook users still cannot expect federal law to stop their 'private' content and communications from being used against them".
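The ECPA distinction described above (a warrant for unopened messages less than 181 days old, lesser process otherwise) reduces to a simple rule. A simplified sketch, not legal advice; the function name is invented for illustration:

```python
from datetime import date

def process_required(sent: date, today: date, opened: bool) -> str:
    # Per the ECPA rule described above: private, unopened messages less
    # than 181 days old require a warrant and a finding of probable cause;
    # older or opened content could be sought with a subpoena or request.
    age_days = (today - sent).days
    if not opened and age_days < 181:
        return "warrant"
    return "subpoena or request"

process_required(date(2011, 1, 1), date(2011, 3, 1), opened=False)  # 'warrant'
process_required(date(2010, 1, 1), date(2011, 3, 1), opened=False)  # 'subpoena or request'
```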

Facebook's privacy policy states that "We may also share information when we have a good faith belief it is necessary to prevent fraud or other illegal activity, to prevent imminent bodily harm, or to protect ourselves and you from people violating our Statement of Rights and Responsibilities. This may include sharing information with other companies, lawyers, courts or other government entities". Since the U.S. Congress has failed to meaningfully amend the ECPA to protect most communications on social-networking sites such as Facebook, and since the U.S. Supreme Court has largely refused to recognize a Fourth Amendment privacy right to information shared with a third party, no federal statutory or constitutional right prevents the government from issuing requests that amount to fishing expeditions and there is no Facebook privacy policy that forbids the company from handing over private user information that suggests any illegal activity.

The 2013 mass surveillance disclosures identified Facebook as a participant in the U.S. National Security Agency's PRISM program. Facebook now reports the number of requests it receives for user information from governments around the world.

Complaint from CIPPIC

On May 31, 2008, the Canadian Internet Policy and Public Interest Clinic (CIPPIC), per Director Phillipa Lawson, filed a 35-page complaint with the Office of the Privacy Commissioner against Facebook based on 22 breaches of the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA). University of Ottawa law students Lisa Feinberg, Harley Finkelstein, and Jordan Eric Plener, initiated the "minefield of privacy invasion" suit. Facebook's Chris Kelly contradicted the claims, saying that: "We've reviewed the complaint and found it has serious factual errors—most notably its neglect of the fact that almost all Facebook data is willingly shared by users." Assistant Privacy Commissioner Elizabeth Denham released a report of her findings on July 16, 2009. In it, she found that several of CIPPIC's complaints were well-founded. Facebook agreed to comply with some, but not all, of her recommendations. The Assistant Commissioner found that Facebook did not do enough to ensure users granted meaningful consent for the disclosure of personal information to third parties and did not place adequate safeguards to prevent unauthorized access by third party developers to personal information.

Data mining

There have been some concerns expressed regarding the use of Facebook as a means of surveillance and data mining.

Two Massachusetts Institute of Technology (MIT) students used an automated script to download the publicly posted information of over 70,000 Facebook profiles from four schools (MIT, NYU, the University of Oklahoma, and Harvard University) as part of a research project on Facebook privacy published on December 14, 2005. Since then, Facebook has bolstered security protection for users, responding: "We've built numerous defenses to combat phishing and malware, including complex automated systems that work behind the scenes to detect and flag Facebook accounts that are likely to be compromised (based on anomalous activity like lots of messages sent in a short period of time, or messages with links that are known to be bad)."

A second clause that brought criticism from some users allowed Facebook the right to sell users' data to private companies, stating "We may share your information with third parties, including responsible companies with which we have a relationship." This concern was addressed by spokesman Chris Hughes, who said, "Simply put, we have never provided our users' information to third party companies, nor do we intend to." Facebook eventually removed this clause from its privacy policy.

In the United Kingdom, the Trades Union Congress (TUC) has encouraged employers to allow their staff to access Facebook and other social-networking sites from work, provided they proceed with caution.

In September 2007, Facebook drew criticism after it began allowing search engines to index profile pages, though Facebook's privacy settings allow users to turn this off.

Concerns were also raised on the BBC's Watchdog program in October 2007 when Facebook was shown to be an easy way to collect an individual's personal information to facilitate identity theft. However, barely any personal information is presented to non-friends: if users leave the privacy controls on their default settings, the only personal information visible to a non-friend is the user's name, gender, profile picture and networks.

An article in The New York Times in February 2008 pointed out that Facebook does not actually provide a mechanism for users to close their accounts, and raised the concern that private user data would remain indefinitely on Facebook's servers. As of 2013, Facebook gives users the options to deactivate or delete their accounts. Deactivating an account allows it to be restored later, while deleting it will remove the account "permanently", although some data submitted by that account ("like posting to a group or sending someone a message") will remain.

Onavo and Facebook Research

Main article: Onavo

In 2013, Facebook acquired Onavo, a developer of mobile utility apps such as Onavo Protect VPN, which is used as part of an "Insights" platform to gauge the use and market share of apps. This data has since been used to influence acquisitions and other business decisions regarding Facebook products. Criticism of this practice emerged in 2018, when Facebook began to advertise the Onavo Protect VPN within its main app on iOS devices in the United States. Media outlets considered the app to effectively be spyware due to its behavior, adding that the app's listings did not readily disclose Facebook's ownership of the app and its data collection practices. Facebook subsequently pulled the iOS version of the app, citing new iOS App Store policies forbidding apps from performing analytics on the usage of other apps on a user's device.

Since 2016, Facebook has also run "Project Atlas"—publicly known as "Facebook Research"—a market research program inviting teenagers and young adults between the ages of 13 and 35 to have data such as their app usage, web browsing history, web search history, location history, personal messages, photos, videos, emails, and Amazon order history analyzed by Facebook. Participants would receive up to $20 per month for participating in the program. Facebook Research is administered by third-party beta testing services, including Applause, and requires users to install a Facebook root certificate on their phone. After a January 2019 report by TechCrunch on Project Atlas, which alleged that Facebook bypassed the App Store by using an Apple enterprise program for apps used internally by a company's employees, Facebook disputed the article but later announced its discontinuation of the program on iOS.

On January 30, 2019, Apple temporarily revoked Facebook's Enterprise Developer Program certificates for one day, which caused all of the company's internal iOS apps to become inoperable. Apple stated that "Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple", and that the certificates were revoked "to protect our users and their data". US Senators Mark Warner, Richard Blumenthal, and Ed Markey separately criticized Facebook Research's targeting of teenagers, and promised to sponsor legislation to regulate market research programs.

Inability to voluntarily terminate accounts

Facebook had allowed users to deactivate their accounts but not actually remove account content from its servers. A Facebook representative explained to a student from the University of British Columbia that users had to clear their own accounts by manually deleting all of the content including wall posts, friends, and groups. The New York Times noted the issue and raised a concern that emails and other private user data remain indefinitely on Facebook's servers. Facebook subsequently began allowing users to permanently delete their accounts in 2010. Facebook's Privacy Policy now states, "When you delete an account, it is permanently deleted from Facebook."

Memorials

A notable ancillary effect of social-networking websites is the ability for participants to mourn publicly for a deceased individual. On Facebook, friends often leave messages of sadness, grief, or hope on the individual's page, transforming it into a public book of condolences. This particular phenomenon has been documented at a number of schools. Facebook originally held a policy that profiles of people known to be deceased would be removed after 30 days due to privacy concerns. Due to user response, Facebook changed its policy to place deceased members' profiles in a "memorialization state". Facebook's Privacy Policy regarding memorialization says, "If we are notified that a user is deceased, we may memorialize the user's account. In such cases we restrict profile access to confirmed friends and allow friends and family to write on the user's Wall in remembrance. We may close an account if we receive a formal request from the user's next of kin or other proper legal request to do so."

Some of these memorial groups have also caused legal issues. Notably, on January 1, 2008, one such memorial group posted the identity of murdered Toronto teenager Stefanie Rengel, whose family had not yet given the Toronto Police Service their consent to release her name to the media, and the identities of her accused killers, in defiance of Canada's Youth Criminal Justice Act, which prohibits publishing the names of the under-age accused. While police and Facebook staff attempted to comply with the privacy regulations by deleting such posts, they noted difficulty in effectively policing the individual users who repeatedly republished the deleted information.

Customization and security

In July 2007, Adrienne Felt, an undergraduate student at the University of Virginia, discovered a cross-site scripting (XSS) hole in the Facebook Platform that could inject JavaScript into profiles. She used the hole to import custom CSS and demonstrated how the platform could be used to violate privacy rules or create a worm.
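The class of hole Felt found arises when user-supplied text is embedded into a page unescaped. A minimal sketch of the standard defense, not Facebook's actual fix (the function name is hypothetical):

```python
import html

def render_profile_bio(user_input: str) -> str:
    # Escaping <, >, & and quotes before embedding user-supplied text
    # prevents injected <script> tags from executing in viewers' browsers.
    return '<div class="bio">' + html.escape(user_input) + "</div>"

render_profile_bio("<script>document.location='https://evil.example'</script>")
# The tags come out neutralized as &lt;script&gt;...&lt;/script&gt;
```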

Not totally in users' control

When posting, Facebook users can choose who can view each post: only friends; friends and friends of friends; everyone; or a custom selection of which friends can and cannot see it. Even though these options exist, third parties may still gain access to posts. For example, a photo posted only for friends to see becomes visible to the friends of anyone tagged in it. Likewise, someone who comments on a private post receives no notification if the post is later made public. There is also concern about posted photos of people: anyone can post a photo of someone without their knowledge, so users have no idea whether, or how many, photos of them circulate on Facebook. Research has also suggested that a harmful photograph can do more damage than a lost password. These are some of the ways in which privacy is not totally in users' hands.

Quit Facebook Day

Quit Facebook Day was an online event which took place on May 31, 2010 (coinciding with Memorial Day), in which Facebook users stated that they would quit the social network due to privacy concerns. It was estimated that 2% of Facebook users coming from the United States would delete their accounts. However, only 33,000 (roughly 0.0066% of its roughly 500 million members at the time) users quit the site. The number one reason for users to quit Facebook was privacy concerns (48%), followed by a general dissatisfaction with Facebook (14%), negative aspects regarding Facebook friends (13%), and the feeling of getting addicted to Facebook (6%). Facebook quitters were found to be more concerned about privacy, more addicted to the Internet, and more conscientious.
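The participation figure quoted above checks out arithmetically:

```python
quitters = 33_000
members = 500_000_000             # Facebook's approximate size at the time
share_percent = quitters / members * 100
round(share_percent, 4)           # 0.0066, i.e. roughly 0.0066% of members
```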

Photo recognition and face tagging

Facebook enabled an automatic facial recognition feature in June 2011, called "Tag Suggestions", a product of a research project named "DeepFace". The feature compares newly uploaded photographs to those of the uploader's Facebook friends, to suggest photo tags.

National Journal Daily claims "Facebook is facing new scrutiny over its decision to automatically turn on a new facial recognition feature aimed at helping users identify their friends in photos". Facebook has defended the feature, saying users can disable it. Facebook introduced the feature on an opt-out basis. European Union data-protection regulators said they would investigate the feature to see if it violated privacy rules. Naomi Lachance stated in a web blog for NPR, All Tech Considered, that Facebook's facial recognition is right 98% of the time compared to the FBI's 85% out of 50 people. However, the accuracy of Facebook searches is due to its larger, more diverse photo selection compared to the FBI's closed database. Mark Zuckerberg expressed no concern when speaking about Facebook's AIs, saying, "Unsupervised learning is a long-term focus of our AI research team at Facebook, and it remains an important challenge for the whole AI research community" and "It will save lives by diagnosing diseases and driving us around more safely. It will enable breakthroughs by helping us find new planets and understand Earth's climate. It will help in areas we haven't even thought of today".

Investigation by the Irish Data Protection Commissioner, 2011–2012

In August 2011, the Irish Data Protection Commissioner (DPC) started an investigation after receiving 22 complaints by europe-v-facebook.org, which was founded by a group of Austrian students. In its first reactions, the DPC stated that it is legally responsible for privacy on Facebook for all users within the European Union and that it would "investigate the complaints using his full legal powers if necessary". The complaints were filed in Ireland because all users who are not residents of the United States or Canada have a contract with "Facebook Ireland Ltd", located in Dublin, Ireland. Under European law Facebook Ireland is the "data controller" for facebook.com, and therefore, facebook.com is governed by European data protection laws. Facebook Ireland Ltd. was established by Facebook Inc. to avoid US taxes (see Double Irish arrangement).

The group 'europe-v-facebook.org' made access requests at Facebook Ireland and received up to 1,222 pages of data per person in 57 data categories that Facebook was holding about them, including data that was previously removed by the users. The group claimed that Facebook failed to provide some of the requested data, including "likes", facial recognition data, data about third party websites that use "social plugins" visited by users, and information about uploaded videos. Currently the group claims that Facebook holds at least 84 data categories about every user.

The first 16 complaints target different problems, from undeleted old "pokes" all the way to the question of whether sharing and new functions on Facebook should be opt-in or opt-out. The second wave of 6 more complaints targeted further issues, including one against the "Like" button. The most severe may be a complaint claiming that the privacy policy, and users' consent to it, are void under European laws.

In an interview with the Irish Independent, a spokesperson said that the DPC will "go and audit Facebook, go into the premises and go through in great detail every aspect of security". He continued by saying: "It's a very significant, detailed and intense undertaking that will stretch over four or five days." In December 2011 the DPC published its first report on Facebook. This report was not legally binding but suggested changes that Facebook should undertake by July 2012. The DPC planned to review Facebook's progress in July 2012.

Changes

In spring 2012, Facebook had to undertake many changes (e.g., an extended download tool that should allow users to exercise the European right to access all stored information, and an update of the worldwide privacy policy). europe-v-facebook.org regarded these changes as insufficient to comply with European law; the download tool does not, for example, allow access to all data. The group launched our-policy.org to suggest improvements to the new policy, which it saw as a setback for privacy on Facebook. Because the group managed to gather more than 7,000 comments on Facebook's pages, Facebook had to hold a worldwide vote on the proposed changes. Such a vote would only have been binding if 30% of all users had taken part. Facebook did not promote the vote, resulting in only 0.038% participation, with about 87% voting against Facebook's new policy. The new privacy policy took effect on the same day.
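The binding-vote rule reduces to a turnout comparison; at 0.038% turnout, the 30% threshold was missed by roughly three orders of magnitude. A small sketch:

```python
def vote_is_binding(turnout_fraction: float, threshold: float = 0.30) -> bool:
    # The vote bound Facebook only if at least 30% of all users took part.
    return turnout_fraction >= threshold

vote_is_binding(0.00038)  # False: 0.038% turnout is far below 30%
vote_is_binding(0.31)     # True: a hypothetical 31% turnout would bind
```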

Tracking of non-members of Facebook

An article published by USA Today in November 2011 claimed that Facebook creates logs of pages visited both by its members and by non-members, relying on tracking cookies to keep track of pages visited.
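
The tracking mechanism described above can be sketched in a few lines. This is a hedged illustration only: the cookie name, URLs, and handler function are hypothetical, not Facebook's actual implementation. The key idea is that every page embedding a third-party widget triggers a request back to the widget's servers, and the browser attaches both the platform's cookie and the embedding page's URL, letting the platform link visits across unrelated sites.

```python
# Hypothetical sketch of third-party cookie tracking across sites.
# All identifiers and URLs below are made up for illustration.
visit_log = []

def handle_plugin_request(cookie_id, referer):
    # A page embedding the social plugin causes the browser to request
    # plugin assets from the platform's servers; the request carries the
    # platform's cookie plus the embedding page's URL, which gets logged.
    visit_log.append((cookie_id, referer))

# The same browser (same cookie) visits two unrelated sites:
handle_plugin_request("cookie=abc123", "https://news.example.com/article-1")
handle_plugin_request("cookie=abc123", "https://shop.example.com/item-9")

# The shared cookie ID now links the visits into one browsing history:
pages = [ref for cid, ref in visit_log if cid == "cookie=abc123"]
print(len(pages))  # -> 2
```

Note that no login is required for this to work, which is why non-members can be tracked as well.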

In early November 2015, Facebook was ordered by the Belgian Privacy Commissioner to cease tracking non-users, citing European laws, or risk fines of up to £250,000 per day. As a result, instead of removing tracking cookies, Facebook banned non-users in Belgium from seeing any material on Facebook, including publicly posted content, unless they sign in. Facebook criticized the ruling, saying that the cookies provided better security.

Stalking

According to statistics, 63% of Facebook profiles are set to be "visible to the public" by default, meaning anyone can access the profiles that users have updated. Facebook also has its own built-in messaging system, through which anyone can send a message to any other user unless the recipient has restricted messages to friends only. Stalking is not limited to the social networking site itself, but can lead to further "in-person" stalking: nearly 25% of real-life stalking victims reported that it started with online instant messaging (e.g., Facebook chat).

Performative surveillance

Performative surveillance is the notion that people are very much aware that they are being surveilled on websites, like Facebook, and use the surveillance as an opportunity to portray themselves in a way that connotes a certain lifestyle—of which, that individual may, or may not, distort how they are perceived in reality.

2010 application privacy breach

In 2010, the Wall Street Journal found that many of Facebook's top-rated apps—including apps from Zynga and Lolapps—were transmitting identifying information to "dozens of advertising and Internet tracking companies" like RapLeaf. The apps used an HTTP referer that exposed the user's identity and sometimes their friends' identities. Facebook said that "While knowledge of user ID does not permit access to anyone’s private information on Facebook, we plan to introduce new technical systems that will dramatically limit the sharing of User ID’s". A blog post by a member of Facebook's team further stated that "press reports have exaggerated the implications of sharing a user ID", though still acknowledging that some of the apps were passing the ID in a manner that violated Facebook's policies.
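
The leak mechanism at issue can be illustrated with a short sketch. The URL, parameter name, and ID below are invented for illustration; the point is simply that when a page URL contains an identifier, any third-party resource loaded from that page typically receives the full URL in the HTTP Referer header and can parse the identifier out.

```python
# Hypothetical illustration of a user ID leaking via the Referer header.
from urllib.parse import urlparse, parse_qs

# An app canvas page whose URL carries the viewer's user ID
# (URL and parameter name are made up for illustration):
referring_page = "https://apps.example.com/game/?fb_user_id=1234567890"

# When that page loads an ad or tracking pixel, the browser usually sends
# the page URL as the Referer header of the request to the third party:
referer_header = referring_page  # what the tracker receives

# The third party can then trivially extract the identifier:
params = parse_qs(urlparse(referer_header).query)
user_id = params["fb_user_id"][0]
print(user_id)  # -> 1234567890
```

Mitigations since adopted across the web (referrer policies, opaque per-app identifiers) work precisely by keeping identifiers out of URLs or truncating what the Referer header carries.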

Facebook and Cambridge Analytica data scandal

Main article: Facebook and Cambridge Analytica data scandal

In 2018, Facebook admitted that an app made by Global Science Research and Alexandr Kogan, related to Cambridge Analytica, was able in 2014 to harvest personal data of up to 87 million Facebook users without their consent, by exploiting their friendship connection to the users who sold their data via the app. Following the revelations of the breach, several public figures, including industrialist Elon Musk and WhatsApp cofounder Brian Acton, announced that they were deleting their Facebook accounts, using the hashtag "#deletefacebook".

Facebook was also criticized for allowing the 2012 Barack Obama presidential campaign to analyze and target select users by providing the campaign with friendship connections of users who signed up for an application. However, users signing up for the application were aware that their data, but not the data of their friends, was going to a political party.

Employer-employee privacy issues

In an effort to surveil the personal lives of current or prospective employees, some employers have asked employees to disclose their Facebook login information. This has resulted in the passing of a bill in New Jersey making it illegal for employers to ask potential or current employees for access to their Facebook accounts. Although the U.S. government has yet to pass a national law protecting prospective employees and their social networking sites from employers, the Fourth Amendment of the U.S. Constitution can protect prospective employees in specific situations. Many companies examine the Facebook profiles of job candidates looking for reasons not to hire them. According to a survey of hiring managers by CareerBuilder.com, the most common deal breakers they found on Facebook profiles include references to drinking, poor communication skills, inappropriate photos, and lying about skills and/or qualifications.

Facebook requires employees and contractors working for them to give permission for Facebook to access their personal profiles, including friend requests and personal messages.

Users violating minimum age requirements

A 2011 study in the online journal First Monday examines how parents consistently enable children as young as 10 years old to sign up for accounts, directly violating Facebook's policy banning young visitors. This policy is in compliance with a United States law, the 1998 Children's Online Privacy Protection Act, which requires minors aged under 13 to gain explicit parental consent to access commercial websites. In jurisdictions where a similar law sets a lower minimum age, Facebook enforces the lower age. Of the 1,007 households surveyed for the study, 76% of parents reported that their child joined Facebook at an age younger than 13, the minimum age in the site's terms of service. The study also reported that Facebook removes roughly 20,000 users each day for violating its minimum age policy. The study's authors also note, "Indeed, Facebook takes various measures both to restrict access to children and delete their accounts if they join." The findings of the study raise questions primarily about the shortcomings of United States federal law, but also implicitly continue to raise questions about whether or not Facebook does enough to publicize its terms of service with respect to minors. Only 53% of parents said they were aware that Facebook has a minimum signup age; 35% of these parents believe that the minimum age is merely a recommendation or thought the signup age was 16 or 18, not 13.

Student-related issues

Student privacy concerns

Students who post illegal or otherwise inappropriate material have faced disciplinary action from their universities, colleges, and schools, including expulsion. Others posting libelous content relating to faculty have also faced disciplinary action. The Journal of Education for Business states that "a recent study of 200 Facebook profiles found that 42% had comments regarding alcohol, 53% had photos involving alcohol use, 20% had comments regarding sexual activities, 25% had seminude or sexually provocative photos, and 50% included the use of profanity." Negative or incriminating Facebook posts can affect how alumni and potential employers perceive students, which can greatly impact students' relationships, ability to gain employment, and ability to maintain school enrollment. The desire for social acceptance leads individuals to share the most intimate details of their personal lives, including illicit drug use and binge drinking. Too often, these portrayals of their daily lives are exaggerated or embellished to attract other like-minded people.

Effect on higher education

On January 23, 2006, The Chronicle of Higher Education continued an ongoing national debate on social networks with an opinion piece written by Michael Bugeja, director of the Journalism school at Iowa State University, entitled "Facing the Facebook". Bugeja, author of the Oxford University Press text Interpersonal Divide (2005), quoted representatives of the American Association of University Professors and colleagues in higher education to document the distraction of students using Facebook and other social networks during class and at other venues in the wireless campus. Bugeja followed up on January 26, 2007 in The Chronicle with an article titled "Distractions in the Wireless Classroom", quoting several educators across the country who were banning laptops in the classroom. Similarly, organizations such as the National Association for Campus Activities, the Association for Education in Journalism and Mass Communication, and others have hosted seminars and presentations to discuss ramifications of students' use of Facebook and other social-networking sites.

The EDUCAUSE Learning Initiative has also released a brief pamphlet entitled "7 Things You Should Know About Facebook" aimed at higher education professionals that "describes what it is, where it is going, and why it matters to teaching and learning".

Some research on Facebook in higher education suggests that there may be some small educational benefits associated with student Facebook use, including improving engagement which is related to student retention. 2012 research has found that time spent on Facebook is related to involvement in campus activities. This same study found that certain Facebook activities like commenting and creating or RSVPing to events were positively related to student engagement while playing games and checking up on friends was negatively related. Furthermore, using technologies such as Facebook to connect with others can help college students be less depressed and cope with feelings of loneliness and homesickness.

Effect on college student grades

As of February 2012, only four published peer-reviewed studies have examined the relationship between Facebook use and grades. The findings vary considerably. Pasek et al. (2009) found no relationship between Facebook use and grades. Kolek and Saunders (2008) found no differences in overall grade point average (GPA) between users and non-users of Facebook. Kirschner and Karpinski (2010) found that Facebook users reported a lower mean GPA than non-users. Junco's (2012) study clarifies the discrepancies in these findings. While Junco (2012) found a negative relationship between time spent on Facebook and student GPA in his large sample of college students, the real-world impact of the relationship was negligible. Furthermore, Junco (2012) found that sharing links and checking up on friends were positively related to GPA while posting status updates was negatively related. In addition to noting the differences in how Facebook use was measured among the four studies, Junco (2012) concludes that the ways in which students use Facebook are more important in predicting academic outcomes.

Phishing

See also: Facebook malware

Phishing refers to a scam used by criminals to trick people into revealing passwords, credit card information, and other sensitive information. On Facebook, phishing attempts occur through message or wall posts from a friend's account that was breached. If the user takes the bait, the phishers gain access to the user's Facebook account and send phishing messages to the user's other friends. The point of the post is to get the users to visit a website with viruses and malware.

Unpublished photo disclosure bug

In September 2018, a software bug meant that photos that had been uploaded to Facebook accounts, but that had not been "published" (and which therefore should have remained private between the user and Facebook), were exposed to app developers. Approximately 6.8 million users and 1,500 third-party apps were affected.

Sharing private messages and contacts' details without consent

In December 2018, it emerged that Facebook had, during the period 2010–2018, granted access to users' private messages, address book contents, and private posts, without the users' consent, to more than 150 third parties including Microsoft, Amazon, Yahoo, Netflix, and Spotify. This had been occurring despite public statements from Facebook that it had stopped such sharing years earlier.

Denial of location privacy, regardless of user settings

In December 2018, it emerged that Facebook's mobile app reveals the user's location to Facebook, even if the user does not use the "check in" feature and has configured all relevant settings within the app so as to maximize location privacy.

E-commerce and drop shipping scams

In April 2016, BuzzFeed published an article exposing drop shippers who were using Facebook and Instagram to swindle unsuspecting customers. Located mostly in China, these drop shippers and e-commerce sites would steal copyrighted images from larger retailers and influencers to gain credibility. After luring a customer with a low price for an item, they would deliver a product that was nothing like what was advertised, or deliver no product at all.

Health data from apps sent to Facebook without user consent

In February 2019, it emerged that a number of Facebook apps, including Flo, had been sending users' health data such as blood pressure and ovulation status to Facebook without users' informed consent. New York governor Andrew Cuomo called the practice an "outrageous abuse of privacy", ordered New York's department of state and department of financial services to investigate, and encouraged federal regulators to step in.

International lobbying against privacy protections

In early 2019, it was reported that Facebook had spent years lobbying extensively against privacy protection laws around the world, such as the General Data Protection Regulation (GDPR).

The lobbying included efforts by Sandberg to "bond" with female European officials including Enda Kenny (then Prime Minister of Ireland, where Facebook's European operations are based), to influence them in Facebook's favor. Other politicians reportedly lobbied by Facebook in relation to privacy protection laws included George Osborne (then Chancellor of the Exchequer), Pranab Mukherjee (then President of India), and Michel Barnier.

Unencrypted password storage

In March 2019, Facebook admitted that it had mistakenly stored "hundreds of millions" of passwords of Facebook and Instagram users in plaintext (as opposed to being hashed and salted) on multiple internal systems accessible only to Facebook engineers, dating as far back as 2012. Facebook stated that affected users would be notified, but that there was no evidence that this data had been abused or leaked.
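
For context on what "hashed and salted" storage means, the standard approach can be sketched with Python's standard library. This is a minimal illustration, not a description of Facebook's systems, and the iteration count and salt length are illustrative choices rather than recommendations: only the salt and the derived key are stored, so the plaintext password never touches disk.

```python
# Minimal sketch of salted password hashing with Python's stdlib.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, key); only these are stored, never the password."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per password
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, key

def verify_password(password, salt, stored_key):
    # Re-derive the key with the stored salt and compare in constant time.
    _, key = hash_password(password, salt)
    return hmac.compare_digest(key, stored_key)

salt, key = hash_password("hunter2")
print(verify_password("hunter2", salt, key))  # -> True
print(verify_password("wrong", salt, key))    # -> False
```

Because each password gets its own random salt, identical passwords produce different stored keys, and a leak of the stored values does not directly reveal the passwords, unlike the plaintext storage described above.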

In April 2019, Facebook admitted that its subsidiary Instagram also stored millions of unencrypted passwords.

Promotion of service as "free"

In December 2019, the Hungarian Competition Authority fined Facebook around US$4 million for false advertising, ruling that Facebook cannot market itself as a "free" (no cost) service because the use of detailed personal information to deliver targeted advertising constituted a compensation that must be provided to Facebook to use the service.

Providing ads through "snooping"

Concerns exist over whether Facebook targets advertising by listening to user conversations. Facebook has denied that any surreptitious listening takes place.

Oculus Antitrust Investigation

Oculus is a trademark of Facebook Technologies, LLC (formerly Oculus VR, LLC), which produces virtual reality headsets. In March 2014, Facebook Inc. purchased Oculus for US$2.3 billion.

On October 13, 2020, Oculus released its new model, the Oculus Quest 2, which requires linking to a Facebook account. This raised concerns not only at the individual level but also at the governmental level. In Germany, the Federal Cartel Office launched an investigation into whether the requirement violates competition law: given Facebook's powerful position in social media and its large influence on virtual reality devices, requiring users to link a personal social media account may constitute anti-competitive behavior. The outcome of the investigation is not yet known; Oculus and Facebook are awaiting a hearing in the Düsseldorf Higher Regional Court on March 26, 2021.

In the summer of 2020, it was announced that the latest Oculus model must be linked to a specific person's Facebook account. By January 1, 2023, owners of older Oculus models must also link a Facebook account or lose the full experience: new games will no longer be available and existing games may stop working. A personal Facebook account is required for full functionality, and accounts that do not use the user's real name and correct date of birth may result in a ban on access to the Oculus headset. A person without a Facebook account must therefore create one, and a fake account does not count. People thus need to be willing to give Facebook their data for the benefit of their gaming habits. In addition to the information available on social media, this allows Facebook to analyze even more sensitive data, such as a person's biometric responses to VR games and the entertainment being viewed.

Psychological/sociological effects

See also: Digital media use and mental health

Facebook addiction

See also: Mobile phones and driving safety and Problematic social media use

The "World Unplugged" study, which was conducted in 2011, claims that for some users quitting social networking sites is comparable to quitting smoking or giving up alcohol. Another study, conducted in 2012 by researchers from the University of Chicago Booth School of Business in the United States, found that social networking sites could be even more addictive than drugs like alcohol and tobacco. A 2013 study in the journal CyberPsychology, Behavior, and Social Networking found that some users decided to quit social networking sites because they felt they were addicted. In 2014, the site went down for about 30 minutes, prompting several users to call emergency services.

In April 2015, the Pew Research Center published a survey of 1,060 U.S. teenagers ages 13 to 17 who reported that nearly three-quarters of them either owned or had access to a smartphone, 92 percent went online daily with 24 percent saying they went online "almost constantly." In March 2016, Frontiers in Psychology published a survey of 457 post-secondary student Facebook users (following a face validity pilot of another 47 post-secondary student Facebook users) at a large university in North America showing that the severity of ADHD symptoms had a statistically significant positive correlation with Facebook usage while driving a motor vehicle and that impulses to use Facebook while driving were more potent among male users than female users.

In June 2018, Children and Youth Services Review published a regression analysis of 283 adolescent Facebook users in the Piedmont and Lombardy regions of Northern Italy (that replicated previous findings among adult users) showing that adolescents reporting higher ADHD symptoms positively predicted Facebook addiction, persistent negative attitudes about the past and that the future is predetermined and not influenced by present actions, and orientation against achieving future goals, with ADHD symptoms additionally increasing the manifestation of the proposed category of psychological dependence known as "problematic social media use."

Self-harm and suicide

Main article: Social media and suicide

Research shows that people who are feeling suicidal use the internet to search for suicide methods. Websites provide graphic details and information on how to take your own life. This cannot be right. Where this content breaches the policies of internet and social media providers it must be removed.

— Matt Hancock, Health Secretary of the United Kingdom

I do not think it is going too far to question whether even you, the owners, any longer have any control over content. If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.

— Anne Longfield, Children's Commissioner for England

In January 2019, both the Health Secretary of the United Kingdom, and the Children’s Commissioner for England, urged Facebook and other social media companies to take responsibility for the risk to children posed by content on their platforms related to self-harm and suicide.

Envy

Facebook has been criticized for making people envious and unhappy due to the constant exposure to positive yet unrepresentative highlights of their peers. Such highlights include, but are not limited to, journal posts, videos, and photos that depict or reference such positive or otherwise outstanding activities, experiences, and facts. This effect is caused mainly by the fact that most users of Facebook usually only display the positive aspects of their lives while excluding the negative, though it is also strongly connected to inequality and the disparities between social groups as Facebook is open to users from all classes of society. Sites such as AddictionInfo.org state that this kind of envy has profound effects on other aspects of life and can lead to severe depression, self-loathing, rage and hatred, resentment, feelings of inferiority and insecurity, pessimism, suicidal tendencies and desires, social isolation, and other issues that can prove very serious. This condition has often been called "Facebook Envy" or "Facebook Depression" by the media.

A joint study conducted by two German universities demonstrated Facebook envy and found that as many as one out of three people actually feel worse and less satisfied with their lives after visiting the site. Vacation photos were found to be the most common source of feelings of resentment and jealousy. After that, social interaction was the second biggest cause of envy, as Facebook users compare the number of birthday greetings, likes, and comments to those of their friends. Visitors who contributed the least tended to feel the worst. "According to our findings, passive following triggers invidious emotions, with users mainly envying happiness of others, the way others spend their vacations; and socialize," the study states.

A 2013 study by researchers at the University of Michigan found that the more people used Facebook, the worse they felt afterwards.

Narcissistic users who display excessive grandiosity evoke negative emotions in viewers and cause envy, which may in turn cause viewers' loneliness. Viewers sometimes terminate relationships with such users to avoid these negative emotions; however, this avoidance acts as reinforcement and may itself lead to loneliness. The study describes this cyclical pattern as a vicious circle of loneliness and avoidance coping.

Divorce

Social networks, like Facebook, can have a detrimental effect on marriages, with users becoming worried about their spouse's contacts and relations with other people online, leading to marital breakdown and divorce. According to a 2009 survey in the UK, around 20 percent of divorce petitions included references to Facebook. Facebook has provided a new platform for interpersonal communication, and researchers have proposed that high levels of Facebook use can result in Facebook-related conflict and breakup or divorce. Previous studies have shown that romantic relationships can be damaged by excessive Internet use, Facebook jealousy, partner surveillance, ambiguous information, and online portrayal of intimate relationships. Excessive Internet users reported greater conflict in their relationships: their partners feel neglected, and there is lower commitment and lower feelings of passion and intimacy in the relationship. Researchers suspect that Facebook may contribute to an increase in divorce and infidelity rates in the near future due to the amount of, and ease of access to, connections with past partners.

Stress

Research performed by psychologists from Edinburgh Napier University indicated that Facebook adds stress to users' lives. Causes of stress included fear of missing important social information, fear of offending contacts, discomfort or guilt from rejecting user requests or deleting unwanted contacts or being unfriended or blocked by Facebook friends or other users, the displeasure of having friend requests rejected or ignored, the pressure to be entertaining, criticism or intimidation from other Facebook users, and having to use appropriate etiquette for different types of friends. Many people who started using Facebook for positive purposes or with positive expectations have found that the website has negatively impacted their lives.

In addition, the increasing number of messages and social relationships embedded in SNSs also increases the amount of social information demanding a reaction from SNS users. Consequently, SNS users perceive that they are giving too much social support to other SNS friends. This dark side of SNS usage is called "social overload". It is caused by the extent of usage, number of friends, subjective social support norms, and type of relationship (online-only vs. offline friends), while age has only an indirect effect. The psychological and behavioral consequences of social overload include perceptions of SNS exhaustion, low user satisfaction, and high intentions to reduce or stop using SNSs.

Narcissism

Main article: Narcissistic personality disorder

See also: Law of effect and Problematic social media use

In July 2018, a meta-analysis published in Psychology of Popular Media found that grandiose narcissism positively correlated with time spent on social media, frequency of status updates, number of friends or followers, and frequency of posting self-portrait digital photographs, while a meta-analysis published in the Journal of Personality in April 2018 found that the positive correlation between grandiose narcissism and social networking service usage was replicated across platforms (including Facebook). In March 2020, the Journal of Adult Development published a regression discontinuity analysis of 254 Millennial Facebook users investigating differences in narcissism and Facebook usage between the age cohorts born from 1977 to 1990 and from 1991 to 2000 and found that the later born Millennials scored significantly higher on both. In June 2020, Addictive Behaviors published a systematic review finding a consistent, positive, and significant correlation between grandiose narcissism and the proposed category of psychological dependence called "problematic social media use". Also in 2018, social psychologist Jonathan Haidt and FIRE President Greg Lukianoff noted in The Coddling of the American Mind that former Facebook president Sean Parker stated in a 2017 interview that the Like button was consciously designed to prime users receiving likes to feel a dopamine rush as part of a "social-validation feedback loop".

Non-informing, knowledge-eroding medium

See also: Confirmation bias

Facebook is a Big Tech company with over 2.7 billion monthly active users as of the second quarter of 2020 and therefore has a meaningful impact on the masses that use it. Big Data algorithms are used in personalized content creation and automatization; however, this method can be used to manipulate users in various ways. The problem of misinformation is exacerbated by the educational bubble, users' critical thinking ability, and news culture. Based on a 2015 study, 62.5% of Facebook users are oblivious to any curation of their News Feed. Furthermore, scientists have started to investigate algorithms with unexpected outcomes that may lead to antisocial political, economic, geographic, racial, or other discrimination. Facebook has provided little transparency into the inner workings of the algorithms used for News Feed curation. These algorithms use past activity as a reference point for predicting users' tastes in order to keep them engaged, but this leads to the formation of a filter bubble that isolates users from diverse information, leaving them with a worldview skewed by their own preferences and biases.

Facebook has, at least in the political field, a counter-effect on being informed: in two studies from the US with a total of more than 2,000 participants, the influence of social media on general knowledge of political issues was examined in the context of two US presidential elections. The results showed that the frequency of Facebook use was moderately negatively related to general political knowledge, even after controlling for demographic and political-ideological variables and previous political knowledge. This suggests a causal relationship: the higher the Facebook use, the more general political knowledge declines. In 2015, researchers from Facebook published a study indicating that the Facebook algorithm perpetuates an echo chamber among users by occasionally hiding content from individual feeds that users would potentially disagree with: for example, the algorithm removed one in every 13 items of diverse content from news sources for self-identified liberals. In general, the results of the study indicated that the Facebook ranking algorithm caused approximately 15% less diverse material in users' content feeds and a 70% reduction in the click-through rate of the diverse material. In 2018, social psychologist Jonathan Haidt and FIRE President Greg Lukianoff argued in The Coddling of the American Mind that the filter bubbles created by the News Feed algorithm of Facebook and other platforms are one of the principal factors amplifying political polarization in the United States since 2000 (when a majority of U.S. households first had at least one personal computer, and then internet access the following year), and Haidt and Tobias Rose-Stockwell suggested in The Atlantic in December 2019 that increased support in the United States among Millennials and Generation Z for communism and socialism stems from ignorance about the economic history of the 20th century.

Other psychological effects

Many students have admitted to experiencing bullying on the site, which leads to psychological harm. High school students face the possibility of bullying and other adverse behavior over Facebook every day. Many studies have attempted to discover whether Facebook has a positive or negative effect on children's and teenagers' social lives, and many of them have come to the conclusion that there are distinct social problems that arise with Facebook usage. British neuroscientist Susan Greenfield drew attention to the issues that children encounter on social media sites, saying that they can rewire the brain; this caused some hysteria over whether or not social networking sites are safe. She did not back up her claims with research, but she did prompt quite a few studies on the subject. When a user's sense of self is broken down by others through badmouthing, criticism, harassment, criminalization or vilification, intimidation, demonization, demoralization, belittlement, or attacks over the site, it can cause envy, anger, or depression.

Sherry Turkle, in her book Alone Together: Why We Expect More from Technology and Less from Each Other, argues that social media brings people closer and further apart at the same time. One of the main points she makes is that there is a high risk in treating people online hastily, like objects. Although people are networked on Facebook, their expectations of each other tend to be lessened. According to Turkle, this could cause a feeling of loneliness in spite of being together.

Between 2016 and 2018, the number of 12- to 15-year-olds who reported being bullied over social media rose from 6% to 11% in the region covered by Ofcom.

User influence experiments

Academic and Facebook researchers have collaborated to test if the messages people see on Facebook can influence their behavior. For instance, in "A 61-Million-Person Experiment in Social Influence And Political Mobilization," during the 2010 elections, Facebook users were given the opportunity to "tell your friends you voted" by clicking on an "I voted" button. Users were 2% more likely to click the button if it was associated with friends who had already voted.

Much more controversially, a 2014 study of "Emotional Contagion Through Social Networks" manipulated the balance of positive and negative messages seen by 689,000 Facebook users. The researchers concluded that they had found "some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network", although "the effect sizes from the manipulations are small".

Unlike the "I voted" study, which had presumptively beneficial ends and raised few concerns, this study was criticized for both its ethics and methods/claims. As controversy about the study grew, Adam Kramer, a lead author of both studies and member of the Facebook data team, defended the work in a Facebook update. A few days later, Sheryl Sandburg, Facebook's COO, made a statement while traveling abroad. While at an Indian Chambers of Commerce event in New Delhi she stated that "This was part of ongoing research companies do to test different products, and that was what it was. It was poorly communicated and for that communication we apologize. We never meant to upset you."

Shortly thereafter, on July 3, 2014, USA Today reported that the privacy watchdog group Electronic Privacy Information Center (EPIC) had filed a formal complaint with the Federal Trade Commission claiming that Facebook had broken the law when it conducted the study on the emotions of its users without their knowledge or consent. In its complaint, EPIC alleged that Facebook had deceived users by secretly conducting a psychological experiment on their emotions: "At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes. Facebook also failed to inform users that their personal information would be shared with researchers."

Beyond the ethical concerns, other scholars criticized the methods and reporting of the study's findings. John Grohol, writing for Psych Central, argued that despite its title and claims of "emotional contagion", this study did not look at emotions at all. Instead, its authors used an application (called "Linguistic Inquiry and Word Count" or LIWC 2007) that simply counted positive and negative words to infer users' sentiments. He wrote that a shortcoming of the LIWC tool is that it does not understand negations. Hence, a status update such as "I am not happy" would be scored as positive: "Since the LIWC 2007 ignores these subtle realities of informal human communication, so do the researchers." Grohol concluded that, given these subtleties, the effect size of the findings is little more than a "statistical blip."

As Grohol put it: "Kramer et al. (2014) found a 0.07%—that's not 7 percent, that's 1/15th of one percent!!—decrease in negative words in people's status updates when the number of negative posts on their Facebook news feed decreased. Do you know how many words you'd have to read or write before you've written one less negative word due to this effect? Probably thousands."
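Grohol's point about negation can be illustrated with a toy example. The sketch below is not the actual LIWC 2007 tool; the word lists and the function name are purely illustrative. A counter that only tallies positive and negative words scores "I am not happy" as positive, because it matches "happy" and never sees the "not".

```python
# Illustrative sketch of word-count sentiment scoring in the style
# Grohol describes; the word lists are hypothetical, not LIWC's.
POSITIVE = {"happy", "great", "good"}
NEGATIVE = {"sad", "terrible", "bad"}

def naive_sentiment(text):
    """Positive minus negative word count; > 0 is scored 'positive'."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos - neg

# Negation is invisible to a pure word counter:
print(naive_sentiment("I am not happy"))  # 1, i.e. scored positive
print(naive_sentiment("I am sad"))        # -1, scored negative
```

Handling negation requires at least phrase-level context (for example, flipping the score of a sentiment word preceded by "not"), which is precisely the subtlety Grohol argued the study's tooling lacked.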

Whether the controversy will result in FTC or court proceedings remains to be seen, but it did prompt an "Editorial Expression of Concern" from the study's publisher, the Proceedings of the National Academy of Sciences, as well as a blog posting from OkCupid titled "We experiment on human beings!" In September 2014, law professor James Grimmelmann argued that the actions of both companies were "illegal, immoral, and mood-altering" and filed notices with the Maryland Attorney General and the Cornell Institutional Review Board.

In the UK, the study was also criticized by the British Psychological Society which said, in a letter to The Guardian, "There has undoubtedly been some degree of harm caused, with many individuals affected by increased levels of negative emotion, with consequent potential economic costs, increase in possible mental health problems and burden on health services. The so-called 'positive' manipulation is also potentially harmful."

Tax avoidance

See also: Ireland as a tax haven

Facebook uses a complicated series of shell companies in tax havens to avoid paying billions of dollars in corporate tax. According to The Express Tribune, Facebook is among the corporations that "avoided billions of dollars in tax using offshore companies." For example, Facebook routes billions of dollars in profits using the Double Irish and Dutch Sandwich tax avoidance schemes to bank accounts in the Cayman Islands. The Dutch newspaper NRC Handelsblad concluded from the Paradise Papers published in late 2017 that Facebook pays "practically no taxes" worldwide.

For example, Facebook paid:

  • In 2011, £2.9m tax on £840m profits in the UK;
  • In 2012 and 2013 no tax in the UK;
  • In 2014 £4,327 tax on hundreds of millions of pounds in UK revenues which were transferred to tax havens.

According to Paul Tang, an economist and member of the PvdA delegation within the Progressive Alliance of Socialists & Democrats (S&D) in the European Parliament, between 2013 and 2015 the EU lost an estimated €1,453m to €2,415m in tax revenue to Facebook. Compared with countries outside the EU, the EU taxes Facebook at a rate of only 0.03% to 0.1% of its revenue (around 6% of its EBT), whereas this rate is near 28% in countries outside the EU. Even if a rate between 2% and 5% had been applied during this period, as suggested by the ECOFIN Council, avoidance of tax at that rate by Facebook would have meant a loss to the EU of between €327m and €817m.

Revenues, profits, tax and effective tax rates, Facebook Inc., 2013–2015 (amounts in millions of EUR; EBT figures in parentheses are losses):

Year  Scope           Revenue   EBT     Tax    Tax/EBT  Tax/Revenue
2013  Total           5,720     2,001   911    46%      15.93%
      EU              3,069     (4)     3      n.a.     0.10%
      Rest of world   2,651     2,005   908    45%      34.25%
2014  Total           10,299    4,057   1,628  40%      15.81%
      EU              5,017     (20)    5      n.a.     0.09%
      Rest of world   5,282     4,077   1,623  40%      30.73%
2015  Total           16,410    5,670   2,294  40%      13.98%
      EU              8,253     (43)    3      6%       0.03%
      Rest of world   8,157     5,627   2,291  41%      28.09%
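The effective rates in the table's total columns can be recomputed directly from the revenue, EBT, and tax figures. The sketch below uses the table's figures; the helper name is illustrative, and note that the table rounds Tax/EBT to whole percents (e.g. 45.53% appears as 46%).

```python
# Recompute Facebook Inc.'s effective tax rates from the table's
# total columns; amounts in millions of EUR.
rows = [
    # (year, revenue, EBT, tax)
    (2013, 5_720, 2_001, 911),
    (2014, 10_299, 4_057, 1_628),
    (2015, 16_410, 5_670, 2_294),
]

def rates(revenue, ebt, tax):
    """Return (tax/EBT, tax/revenue) as percentages, 2 decimals."""
    return round(100 * tax / ebt, 2), round(100 * tax / revenue, 2)

for year, revenue, ebt, tax in rows:
    tax_over_ebt, tax_over_revenue = rates(revenue, ebt, tax)
    print(f"{year}: {tax_over_ebt}% of EBT, {tax_over_revenue}% of revenue")
```

The Tax/Revenue results (15.93%, 15.81%, 13.98%) match the table's total column exactly.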

On July 6, 2016, the U.S. Department of Justice filed a petition in the U.S. District Court in San Francisco, asking for a court order to enforce an administrative summons issued to Facebook, Inc., under Internal Revenue Code section 7602, in connection with an Internal Revenue Service examination of Facebook's year 2010 U.S. Federal income tax return.

In November 2017, the Irish Independent reported that for the 2016 financial year, Facebook had paid €30 million of Irish corporation tax on €12.6 billion of revenues that were routed through Ireland, giving an Irish effective tax rate of under 1%. The €12.6 billion of 2016 Facebook revenues routed through Ireland was almost half of Facebook's global revenues. In April 2018, Reuters wrote that all of Facebook's non–U.S. accounts were legally housed in Ireland for tax purposes, but were being moved due to the EU's GDPR taking effect in May 2018.

In November 2018, the Irish Times reported that Facebook routed over €18.7 billion of revenues through Ireland (almost half of all global revenues), on which it paid €38 million of Irish corporation tax.

Treatment of employees and contractors

Moderators

See also: Cognizant § Working conditions and mental health issues, and Arvato § Facebook content moderation

Facebook hires some employees through contractors, including Accenture, Arvato, Cognizant, CPL Resources, and Genpact, to serve as content moderators, reviewing potentially problematic content posted to both Facebook and Instagram. Many of these contractors face unrealistic expectations, harsh working conditions, and constant exposure to disturbing content, including graphic violence, animal abuse, and child pornography. Contractor employment is contingent on achieving and maintaining a score of 98 on a 100-point scale on a metric known as "accuracy". Falling below a score of 98 can result in dismissal. Some have reported posttraumatic stress disorder (PTSD) stemming from lack of access to counseling, coupled with unforgiving expectations and the violent content they are assigned to review.

Content moderator Keith Utley, who was employed by Cognizant, suffered a heart attack during work in March 2018; the office lacked a defibrillator, and Utley was transported to a hospital where he died. Selena Scola, an employee of contractor Pro Unlimited, Inc., sued her employer after she developed PTSD as a result of "constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace". In December 2019, former CPL Resources employee Chris Gray began legal action in the High Court of Ireland, claiming damages for PTSD suffered as a moderator, the first of an estimated 20+ pending cases. In February 2020, employees in Tampa, Florida filed a lawsuit against Facebook and Cognizant alleging they developed PTSD and related mental health impairments as a result of constant and unmitigated exposure to disturbing content.

In February 2020, European Union Commissioners criticized Facebook's plans for dealing with the working conditions of those contracted to moderate content on the social media platform.

Facebook agreed to settle a class action lawsuit for $52 million on May 12, 2020, which included a $1,000 payment to each of the 11,250 moderators in the class, with additional compensation available for the treatment of PTSD and other conditions resulting from the jobs.

Employees

Plans for a Facebook-owned real estate development known as "Willow Village" have been criticized for resembling a "company town", which often curtails the rights of residents, and encourages or forces employees to remain within an environment created and monitored by their employer outside of work hours. Critics have referred to the development as "Zucktown" and "Facebookville" and the company has faced additional criticism for the effect it will have on existing communities in California.

Misleading campaigns against competitors

In May 2011, emails were sent to journalists and bloggers making critical allegations about Google's privacy policies; however, it was later discovered that the anti-Google campaign, conducted by PR giant Burson-Marsteller, was paid for by Facebook in what CNN referred to as "a new level of skullduggery" and which The Daily Beast called a "clumsy smear". While taking responsibility for the campaign, Burson-Marsteller said it should not have agreed to keep its client's (Facebook's) identity a secret. "Whatever the rationale, this was not at all standard operating procedure and is against our policies, and the assignment on those terms should have been declined", it said in a statement.

In December 2020, Apple Inc. announced an anti-tracking initiative (an opt-in tracking policy) to be introduced to its App Store services. Facebook quickly reacted and began criticizing the initiative, claiming that Apple's privacy-focused anti-tracking change would have a "harmful impact on many small businesses that are struggling to stay afloat and on the free internet that we all rely on more than ever". Facebook also launched a "Speak Up For Small Businesses" page. Apple responded that "users should know when their data is being collected and shared across other apps and websites — and they should have the choice to allow that or not". Apple was also backed by the Electronic Frontier Foundation (EFF), which stated that "Facebook touts itself in this case as protecting small businesses, and that couldn't be further from the truth".

Content

[Image caption: An example of a Facebook post censored due to an unspecified conflict with "Community Standards".]
[Image caption: Error message generated by Facebook for an attempt to share a link to a website that is censored due to Community Standards in a private chat. Messages containing certain links will not be delivered to the recipient.]

Facebook has been criticized for removing or allowing various content (posts, photos and entire groups and profiles).

Intellectual property infringement

Facebook has also been criticized for lax enforcement of third-party copyrights for videos uploaded to the service. In 2015, some Facebook pages were accused of plagiarizing videos from YouTube users and re-posting them as their own content using Facebook's video platform, in some cases achieving higher levels of engagement and views than the original YouTube posts. Videos hosted by Facebook are given higher priority and prominence within the platform and its user experience (including direct embedding within the News Feed and pages), placing links to the original external source at a disadvantage. In August 2015, Facebook announced a video-matching technology aiming to identify reposted videos, and also stated its intention to improve its procedures to remove infringing content faster. In April 2016, Facebook implemented a feature known as "Rights Manager", which allows rights holders to manage and restrict the upload of their content onto the service by third parties.

Violent content

In 2013, Facebook was criticized for allowing users to upload and share videos depicting violent content, including clips of people being decapitated. Having previously refused to delete such clips under the guideline that users have the right to depict the "world in which we live", Facebook changed its stance in May, announcing that it would remove reported videos while evaluating its policy. The following October, Facebook stated that it would allow graphic videos on the platform, as long as the intention of the video was to "condemn, not glorify, the acts depicted", further stating that "Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence. When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it." However, Facebook once again received criticism, with the Family Online Safety Institute saying that such videos "crossed a line" and can potentially cause psychological damage among young Facebook users, and then-Prime Minister of the United Kingdom David Cameron calling the decision "irresponsible", citing the same concerns regarding young users. Two days later, Facebook removed a video of a beheading following "worldwide outrage", and while acknowledging its commitment to allowing people to upload gory material for the purpose of condemnation, it also stated that it would be further strengthening its enforcement to prevent glorification. The company's policies were also criticized as part of these developments, with some drawing particular attention to Facebook's permission of graphic content but potential removal of breastfeeding images. In January 2015, Facebook announced that new warnings would be displayed on graphic content, requiring users to explicitly confirm that they wish to see the material.

War crimes

Facebook has been criticized for failing to take down violent content depicting war crimes in Libya. A 2019 investigation by the BBC found evidence of alleged war crimes in Libya being widely shared on Facebook and YouTube. The BBC found images and videos on social media of the bodies of fighters and civilians being desecrated by fighters from the self-styled Libyan National Army. The force, led by General Khalifa Haftar, controls a swathe of territory in the east of Libya and is trying to seize the capital, Tripoli. BBC Arabic found almost one hundred images and videos from Libya shared on Facebook and YouTube, in violation of the companies' guidelines. The UK Foreign Office said it took the allegations extremely seriously and was concerned about the impact the recent violence was having on the civilian population.

In 2017, a Facebook video of Libyan National Army (LNA) special forces commander Mahmoud al-Werfalli was uploaded showing him shooting dead three captured fighters. The video was then shared on YouTube over ten thousand times. The International Criminal Court used it as evidence to indict al-Werfalli for the war crime of murder. The BBC found the original video was still on Facebook two years after his indictment, and also discovered videos showing the bodies of civilians being desecrated. These were taken in Ganfouda, a district of Benghazi which was under siege by the LNA between 2016 and 2017. More than 300 people, including dozens of children, died during the siege. A video uncovered by BBC Arabic showed soldiers mocking a pile of corpses of dead civilians and trampling on bodies. Among them was a 77-year-old woman, Alia Hamza. Her son, Ali Hamza, had five family members killed in Ganfouda.

Ali Hamza told BBC Arabic, "I sent links to lawyers to send to the ICC in the Hague against Khalifa Haftar and his military commanders regarding the massacres of civilians." In the video, the LNA soldiers label the civilians as terrorists. Human rights lawyer and war crimes specialist Rodney Dixon QC reviewed the evidence BBC Arabic found. "If groups are using those platforms to propagate their campaigns then those platforms should seriously look at their role because they could then be assisting in that process of further crimes being committed," he said. After the BBC presented its findings to Facebook, the company removed all the videos showing a suspected war crime taking place, but opted not to suspend any of the accounts the BBC found linked to the images. Erin Saltman, Facebook's policy manager for counterterrorism in Europe, the Middle East and Africa, told BBC Arabic, "Sometimes there are very conflicting narratives of whether or not the victim is a terrorist, or whether it's a civilian over who's committing that act, we cannot be the pure arbiters of truth." However, Facebook's and YouTube's own community guidelines explicitly prohibit content that promotes or depicts acts of violence.

Facebook Live

Facebook Live, introduced in August 2015 for celebrities and gradually rolled out for regular users starting in January 2016, lets users broadcast live videos; Facebook intended the feature for presenting public events or private celebrations. However, the feature has been used to record multiple crimes, deaths, and violent incidents, attracting significant media attention.

Facebook has received criticism for not removing videos faster, and Facebook Live has been described as a "monster [Facebook] cannot tame" and "a gruesome crime scene for murders". In response, CEO Mark Zuckerberg announced in May 2017 that the company would hire 3,000 people to review content and invest in tools to remove videos faster.

Pro-anorexia groups

In 2008, Facebook was criticized for hosting groups dedicated to promoting anorexia. The groups promoted dramatic weight loss programs, shared extreme diet tips, and posted pictures of emaciated girls under "Thinspiration" headlines. Members reported having switched to Facebook from Myspace, another social networking service, due to a perceived higher level of safety and intimacy at Facebook. In a statement to BBC News, a Facebook spokesperson stated that "Many Facebook groups relate to controversial topics; this alone is not a reason to disable a group. In cases where content is reported and found to violate the site's terms of use, Facebook will remove it."

Pro-mafia groups' case

In Italy in 2009, the discovery of pro-mafia groups, one of them claiming Bernardo Provenzano's sainthood, caused an alert in the country and prompted the government to rapidly issue a law that would force Internet service providers to deny access to entire websites in the case of refusal to remove illegal content. The amendment was passed by the Italian Senate and now needs to be passed unchanged by the Chamber of Deputies to become effective.

Facebook criticized the government's efforts, telling Bloomberg that it "would be like closing an entire railway network just because of offensive graffiti at one station", and that "Facebook would always remove any content promoting violence and already had a takedown procedure in place."

Trolling

On March 31, 2010, The Today Show ran a segment detailing the deaths of three separate adolescent girls and trolls' subsequent reactions to their deaths. Shortly after the suicide of high school student Alexis Pilkington, anonymous posters began trolling for reactions across various message boards, referring to Pilkington as a "suicidal CUSS", and posting graphic images on her Facebook memorial page. The segment also included an exposé of a 2006 accident, in which an eighteen-year-old student out for a drive fatally crashed her father's car into a highway pylon; trolls emailed her grieving family the leaked pictures of her mutilated corpse.

There have been cases where Facebook "trolls" were jailed for their communications on Facebook, particularly memorial pages. In Autumn 2010, Colm Coss of Ardwick, Britain, was sentenced to 26 weeks in jail under s127 of the Communications Act 2003 of Great Britain, for "malicious communications" for leaving messages deemed obscene and hurtful on Facebook memorial pages.

In April 2011, Bradley Paul Hampson was sentenced to three years in jail after pleading guilty to two counts of using a carriage service (the Internet) to cause offense, for posts on Facebook memorial pages, and one count each of distributing and possessing child pornography when he posted images on the memorial pages of the deceased with phalluses superimposed alongside phrases such as "Woot I'm dead".

Rape pages

A series of pro-rape and "rape joke" pages on Facebook drew attention from the media and women's groups. Rape Is No Joke (RINJ), a group opposing the pages, argued that removing "pro-rape" pages from Facebook and other social media was not a violation of free speech in the context of Article 19 of the Universal Declaration of Human Rights and the concepts recognized in international human rights law in the International Covenant on Civil and Political Rights. RINJ repeatedly challenged Facebook to remove the rape pages. RINJ then turned to advertisers on Facebook, telling them not to let their advertising be posted on Facebook's "rape pages".

Following a campaign that involved the participation of Women, Action and the Media, the Everyday Sexism Project and the activist Soraya Chemaly, who were among 100 advocacy groups, Facebook agreed to update its policy on hate speech. The campaign highlighted content that promoted domestic and sexual violence against women, and used over 57,000 tweets and more than 4,900 emails to create outcomes such as the withdrawal of advertising from Facebook by 15 companies, including Nissan UK, House of Burlesque and Nationwide UK. The social media website initially responded by stating that "While it may be vulgar and offensive, distasteful content on its own does not violate our policies", but then agreed to take action on May 29, 2013 after it had "become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate".

Child abuse images

In June 2015, the UK National Society for the Prevention of Cruelty to Children raised concerns about Facebook's apparent refusal when asked to remove controversial video material which allegedly showed a baby in emotional distress.

In March 2017, BBC News reported in an investigation that Facebook only removed 18 of the 100 groups and posts it had reported for containing child exploitation images. The BBC had been granted an interview with Facebook policy director Simon Milner under the condition that they provide evidence of the activity. However, when presented with the images, Facebook canceled the interview, and told the BBC that it had been reported to the National Crime Agency for illegally distributing child exploitation images (the NCA could not confirm whether the BBC was actually being investigated). Milner later stated to the BBC that the investigation had exposed flaws in its image moderation process that have since been addressed, and that all of the reported content was removed from the service.

Objectification of women

In July 2017, GMA News reported that "a number" of secret Facebook groups that had been engaging in illegal activity of sharing "obscene" photos of women had been exposed, with the Philippine National Bureau of Investigation warning group members of the possibility of being liable for violating child pornography and anti-voyeurism laws. Facebook stated that it would remove the groups as violations of its community guidelines. A few days later, GMA News had an interview with one of the female victims targeted by one of the groups, who stated that she received friend requests from strangers and inappropriate messages. After reporting to authorities, the Philippine National Police's anti-cybercrime unit promised to take action in finding the accounts responsible. Senator Risa Hontiveros responded to the incidents with the proposal of a law that would impose "stiff penalties" on such group members, stating that "These people have no right to enjoy our internet freedom only to abuse our women and children. We will not allow them to shame our young women, suppress their right to express themselves through social media and contribute to a culture of misogyny and hate".

Anti-Semitism

Facebook has been suspected of having a double standard when it comes to pages and posts regarding the Arab–Israeli conflict. When it comes to alleged incitement, Facebook has been accused of being unfair, removing only posts and pages that attack Palestinians, while turning a blind eye to similar posts that are violently anti-Semitic. The NGO Shurat Hadin-Israel Law Center conducted an experiment over the incitement issue, which sought to expose what it viewed as double standards regarding anti-Israel sentiment vis-a-vis the simultaneous launch of two Facebook pages: "Stop Palestinians" and "Stop Israel". Following the launch of the two nearly identical pages, the NGO posted hateful content simultaneously on both pages. Next, Shurat Hadin reported both faux-incitement pages to Facebook to see which, if either, would be removed. According to them, despite featuring nearly identical content, only one was removed from the online platform. They said the page inciting against Palestinians was closed by Facebook (on the same day that it was reported) for "containing credible threat of violence" which "violated our community standards", but not the page inciting against Israelis. Shurat Hadin said that Facebook claimed that this page was "not in violation of Facebook's rules". Shurat Hadin's staged anti-Israel group "Stop Israel" still remains active on Facebook. ProPublica stated in September 2017 that a website was able to target ads at Facebook users who were interested in "how to burn Jew" and "Jew hater". Facebook removed the categories and said it would try to stop them from appearing to potential advertisers.

In March 2019, Facebook subsidiary Instagram declined to remove an anti-Semitic image posted by right-wing conspiracy theorist Alex Jones, saying that it did not violate its community standards.

Incitement of violence against Israelis

Facebook has been accused of being a public platform that is used to incite violence. In October 2015, 20,000 Israelis claimed that Facebook was ignoring Palestinian incitement on its platform and filed a class-action suit demanding that Facebook remove all posts "containing incitement to murder Jews".

Israeli politicians have complained that Facebook does not comply or assist with requests from the police for tracking and reporting individuals when they share their intent to kill or commit any other act of violence on their Facebook pages. In June 2016, following the murder of Hallel Ariel, 13, by a terrorist who posted on Facebook, Israeli Minister of Public Security Gilad Erdan charged that "Facebook, which has brought a positive revolution to the world, has become a monster...The dialogue, the incitement, the lies of the young Palestinian generation are happening on the Facebook platform." Erdan accused Facebook of "sabotaging the work of Israeli police" and "refusing to cooperate" when Israel Police turns to the site for assistance. It also "sets a very high bar" for removing inciteful content.

In July 2016, a civil action for $1 billion in damages was filed in the United States District Court for the Southern District of New York on behalf of the victims and family members of four Israeli-Americans and one US citizen killed by Hamas militants since June 2014. The victims and plaintiffs in the case are the families of Yaakov Naftali Fraenkel, a 16-year-old who was kidnapped and murdered by Hamas operatives in 2014; Taylor Force, a 29-year-old American MBA student and US Army veteran killed in a stabbing spree in Jaffa in 2016; Chaya Braun, a three-month-old thrown from her stroller and slammed into the pavement when a Hamas attacker drove his car into a light rail station in Jerusalem in October 2014; 76-year-old Richard Lakin, who was killed in the October 2015 shooting and stabbing attack on a Jerusalem bus; and Menachem Mendel Rivkin, who was seriously wounded in a January 2016 stabbing attack in Jerusalem. The plaintiffs claimed that Facebook knowingly provided its social media platform and communication services to Hamas in violation of provisions of US anti-terrorism laws, which prohibit US businesses from providing any material support, including services, to designated terrorist groups and their leaders. The government of the United States has designated Hamas as a "Foreign Terrorist Organization" as defined by US law. The suit claims that Hamas "used and relied on Facebook's online social network platform and communications services to facilitate and carry out its terrorist activity, including the terrorist attacks in which Hamas murdered and injured the victims and their families in this case". The legal claim was rejected; the court found that Facebook and other social media companies are not considered to be the publishers of material users post when digital tools used by the company match content with what the tool identifies as interested consumers.

In August 2016, Israel's security service, the Shin Bet, reported that it had arrested nine Palestinians who had been recruited by the Lebanon-based Hezbollah terrorist organization. Operatives of Hezbollah in Lebanon and Gaza Strip recruited residents of the West Bank, Gaza and Israel through Facebook and other social media sites. After recruiting cell leaders on Facebook, Hezbollah and the recruits used encrypted communications to avoid detection, and the leaders continued to recruit other members. The terror cells received Hezbollah funding and planned to conduct suicide bombings and ambushes and had begun preparing explosive devices for attacks, said the security service, which claimed credit for preventing the attacks. The Shin Bet said it also detected multiple attempts by Hezbollah to recruit Israeli Arabs through a Facebook profile.

Currently, legislation is being prepared in Israel allowing fines of 300,000 shekels for Facebook and other social media companies such as Twitter and YouTube for every post that incites or praises terrorism, is not removed within 48 hours, and could possibly lead to further acts of terrorism.

Countermeasure efforts

In June 2017, Facebook published a blog post, offering insights into how it detects and combats terrorism content. The company claimed that the majority of the terrorism accounts that are found are discovered by Facebook itself, while it reviews reports of terrorism content "urgently", and, in cases of imminent harm, "promptly inform authorities". It also develops new tools to aid in its efforts, including the use of artificial intelligence to match terrorist images and videos, detecting when content is shared across related accounts, and developing technologies to stop repeat offenders. The company stated that it has 150 people dedicated to terrorism countermeasures, and works with governments and industries in an effort to curb terrorist propaganda. Its blog post stated that "We want Facebook to be a hostile place for terrorists."

Employee data leak

In June 2017, The Guardian reported that a software bug had exposed the personal details of 1,000 Facebook workers involved in reviewing and removing terrorism content, by displaying their profiles in the "Activity" logs of Facebook groups related to terrorism efforts. In Facebook's Dublin, Ireland headquarters, six individuals were determined to be "high priority" victims of the error, after the company concluded that their profiles were likely viewed by potential terrorists in groups such as ISIS, Hezbollah and the Kurdistan Workers' Party. The bug itself, discovered in November 2016 and fixed two weeks later, was active for one month, and had also been retroactively exposing censored personal accounts since August 2016. One affected worker had fled Ireland, gone into hiding, and only returned to Ireland after five months due to a lack of money. Suffering from psychological distress, he filed a legal claim against Facebook and CPL Resources, an outsourcing company, seeking compensation. A Facebook spokesperson stated that "Our investigation found that only a small fraction of the names were likely viewed, and we never had evidence of any threat to the people impacted or their families as a result of this matter", and Craig D'Souza, Facebook's head of global investigations, said: "Keep in mind that when the person sees your name on the list, it was in their activity log, which contains a lot of information ... there is a good chance that they associate you with another admin of the group or a hacker". Facebook offered to install a home-alarm monitoring system, provide transport to and from work, and provide counseling through its employee assistance program. As a result of the data leak, Facebook is reportedly testing the use of alternative, administrative accounts for workers reviewing content, rather than requiring workers to sign in with their personal profiles.

On October 26, 2018, Facebook announced that it had deleted 82 accounts created in Iran that included posts on divisive issues such as race, immigration, and U.S. President Donald Trump.

Fake news

Main article: Fake news website

Facebook has been criticized for not doing enough to limit the spread of fake news stories on its site, especially after the 2016 United States presidential election; some have claimed that Donald Trump would not have won if Facebook had not helped spread what they considered to be fake stories biased in his favor. Mark Zuckerberg began taking steps to reduce the prevalence of fake news on Facebook as a result of criticism of Facebook's influence on the presidential election. At a conference called Techonomy, Zuckerberg stated in regard to Donald Trump: "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news". Zuckerberg affirmed the idea that people do not stray from their own ideals and political leanings. He stated, "I don't know what to do about that" and, "When we started, the north star for us was: We're building a safe community".

Zuckerberg has also been quoted in his own Facebook post: "Of all the content on Facebook, more than 99 percent of what people see is authentic". In addition, the Pew Research Center stated that "62% of Americans obtain some, or all, of their news on social media, the bulk of it from Facebook". A former editor at Facebook leaked information about the website's trending algorithm, pointing to falsehoods and bias in the news curated within Facebook. Although Facebook initially denied claims of issues with fake news stories and its algorithms, it fired the entire trending team involved with a fake news story about Megyn Kelly being a "closeted liberal".

Incitement of violence in Sri Lanka

Sri Lankan telecommunications minister Harin Fernando stated that Facebook had been too slow removing content and banning users who were using its platforms to facilitate violence during the 2018 anti-Muslim riots in Sri Lanka. Facebook stated that it is increasing the number of Sinhalese-speakers it employs to review content.

Myanmar abuses

See also: Rohingya genocide § Facebook's role in the genocide

The chairman of the U.N. Independent International Fact-Finding Mission on Myanmar said Facebook played a "determining role" in the Rohingya genocide. Facebook has been criticized for enabling Islamophobic content targeting the Rohingya people to spread. The United Nations Human Rights Council has called the platform "a useful instrument for those seeking to spread hate".

In response, Facebook removed accounts owned by the Myanmar Armed Forces for inciting hatred against the Rohingya people, and "engaging in coordinated inauthentic behavior."

Blue tick

Facebook grants a blue tick to the verified accounts of public personalities, brands, and celebrities (including politicians and artists). It has no policy for cases in which an individual with a verified blue tick account is convicted in a serious criminal case. In one recent case in India, a politician was convicted and sentenced to 10 years in jail in a serious bribery case, but his Facebook page still continues to be verified.

Neo-Nazi and white supremacist content

From c. 2018 until 27 March 2019, Facebook's internal policy was to permit "white nationalist" content but not "white supremacist" content, despite advice stating that there is no distinction. In practice, it hosted a great deal of white supremacist and neo-Nazi content. On 27 March 2019, Facebook backtracked and stated that white nationalism "cannot be meaningfully separated from white supremacy and organized hate groups".

Technical

Real-name policy controversy and compromise

Main article: Facebook real-name policy controversy

Facebook has a real-name system policy for user profiles. The real-name policy stems from the position "that way, you always know who you're connecting with. This helps keep our community safe." The real-name system does not allow adopted names or pseudonyms, and in its enforcement has suspended accounts of legitimate users, until the user provides identification indicating the name. Facebook representatives have described these incidents as very rare. A user claimed responsibility via the anonymous Android and iOS app Secret for reporting "fake names" which caused user profiles to be suspended, specifically targeting the stage names of drag queens. On October 1, 2014, Chris Cox, Chief Product Officer at Facebook, offered an apology: "In the two weeks since the real-name policy issues surfaced, we've had the chance to hear from many of you in these communities and understand the policy more clearly as you experience it. We've also come to understand how painful this has been. We owe you a better service and a better experience using Facebook, and we're going to fix the way this policy gets handled so everyone affected here can go back to using Facebook as you were."

On December 15, 2015, Facebook announced in a press release that it would be providing a compromise to its real name policy after protests from groups such as the gay/lesbian community and abuse-victims. The site is developing a protocol that will allow members to provide specifics as to their "special circumstance" or "unique situation" with a request to use pseudonyms, subject to verification of their true identities. At that time, this was already being tested in the U.S. Product manager Todd Gage and vice president of global operations Justin Osofsky also promised a new method for reducing the number of members who must go through ID verification while ensuring the safety of others on Facebook. The fake name reporting procedure will also be modified, forcing anyone who makes such an allegation to provide specifics that would be investigated and giving the accused individual time to dispute the allegation.

Deleting users' statuses

There have been complaints of user statuses being mistakenly or intentionally deleted for alleged violations of Facebook's posting guidelines. Especially for non-English-speaking writers, Facebook lacks a support system capable of genuinely reading the content and making decisions. Sometimes the content of a status contained no "abusive" or defaming language, but it was nevertheless deleted because a group of people had covertly reported it as "offensive". For languages other than English, Facebook has so far been unable to identify this coordinated reporting approach, which has been used to vilify humanitarian activism. In another incident, Facebook had to apologize after it deleted a free speech group's post about the abuse of human rights in Syria. In that case, a spokesman for Facebook said the post was "mistakenly" removed by a member of its moderation team, which receives a high volume of take-down requests.

Enabling of harassment

Facebook instituted a policy by which the site is now self-policed by the community of Facebook users. Some users have complained that this policy allows abusive users to harass them by submitting reports that flag even benign comments and photos as "offensive" or "in violation of Facebook Rights and Responsibilities". Enough such reports can result in the harassed user's account being blocked for a predetermined number of days or weeks, or even deactivated entirely.

Facebook UK policy director Simon Milner told Wired Magazine that "Once the piece of content has been seen, assessed and deemed OK, (Facebook) will ignore further reports about it."

Lack of customer support

Facebook lacks live support, making it difficult to resolve issues that require the services of an administrator or are not covered in the FAQs, such as the enabling of a disabled account. The automated emailing system used when filling out a support form often refers users back to the help center or to pages that are outdated and cannot be accessed, leaving users at a dead end with no further support available. Further, a person who has lost access to Facebook has no easy way to find an email address through which to contact the company regarding an account deletion.

Downtime and outages

Facebook has had a number of outages and downtime large enough to draw some media attention. A 2007 outage resulted in a security hole that enabled some users to read other users' personal mail. In 2008, the site was inaccessible for about a day from many locations in many countries. In spite of these occurrences, a report issued by Pingdom found that Facebook had less downtime in 2008 than most social-networking websites. On September 16, 2009, Facebook started having major problems with loading when people signed in. On September 18, 2009, Facebook went down for the second time in 2009; the first outage occurred when a group of hackers deliberately tried to drown out a political speaker who had been continuously speaking out against the Iranian election results on social networking sites.

In October 2009, an unspecified number of Facebook users were unable to access their accounts for over three weeks.

Tracking cookies

Facebook has been criticized heavily for 'tracking' users even when they are logged out of the site. Australian technologist Nik Cubrilovic discovered that when a user logs out of Facebook, the cookies from that login are still kept in the browser, allowing Facebook to track users on websites that include "social widgets" distributed by the social network. Facebook denied the claims, saying it has 'no interest' in tracking users or their activity, and promised after the discovery to remove the cookies from the site. A group of users in the United States has sued Facebook for breaching privacy laws.

As of December 2015, to comply with a court order citing violations of the European Union Directive on Privacy and Electronic Communications (which requires users to consent to the tracking and storage of data by websites), Facebook no longer allows users in Belgium to view any content on the service, even public pages, without being registered and logged in.

Email address change

In June 2012, Facebook removed all existing email addresses from user profiles, and added a new @facebook.com email address. Facebook claimed this was part of adding a "new setting that gives people the choice to decide which addresses they want to show on their timelines". However, this setting was redundant to the existing "Only Me" privacy setting which was already available to hide addresses from timelines. Users complained that the change was unnecessary, that they did not want an @facebook.com email address, and that they did not receive adequate notification that their profiles had been changed. The change in email address was synchronized to phones due to a software bug, causing existing email address details to be deleted. The @facebook.com email service was retired in February 2014.

Safety Check bug

On March 27, 2016, following a bombing in Lahore, Pakistan, Facebook activated its "Safety Check" feature, which allows people to let friends and loved ones know they are okay following a crisis or natural disaster, but sent notifications to people who were never in danger or even close to the Pakistan explosion. Some users as far away as the US, UK and Egypt received notifications asking if they were okay.

Censorship

The warning box that appears when Internet users try to view censored or blocked content on Facebook

Search function

Facebook's search function has been accused of preventing users from searching for certain terms. Michael Arrington of TechCrunch has written about Facebook's possible censorship of "Ron Paul" as a search term. MoveOn.org's Facebook group for organizing protests against privacy violations could not, for a time, be found by searching. The very word "privacy" was also restricted.

Censorship of conservative news

In May 2016, Facebook was accused by a former employee of excluding conservative topics from the trending bar. Although Facebook denied these allegations, the site said it planned to improve the trending bar.

In August 2018, Facebook deleted videos posted to it by PragerU. Facebook later reversed its decision and restored the PragerU content, saying the content had been falsely reported as hate speech.

As a result of the perception that conservatives are not treated neutrally on Facebook, alternative social media platforms have been established. This perception has led to a reduction of trust in Facebook, and a reduction in usage by those who consider themselves to be conservative.

In July 2020, Congressman Matt Gaetz filed a criminal referral against Facebook, citing that evidence produced by Project Veritas demonstrated that Facebook CEO Mark Zuckerberg had made materially false statements to Congress while under oath in hearings which occurred in April 2018. Congressman Gaetz claimed that the evidence provided demonstrated that Zuckerberg's claims that the website did not engage in bias against conservative speech were false.

Competing social networks

In October 2018, Facebook and Facebook Messenger were said to be blocking URLs to minds.com, a social network website that is a competitor of Facebook. Users complained that Facebook marked links to the competitor as "insecure" and required them to complete a captcha before sharing with other users. In 2015, Facebook was accused of banning rival network Tsu.co in a similar manner.

Content critical of Facebook

Newspapers regularly report stories of users who claim they've been censored on Facebook for being critical of Facebook itself, with their posts removed or made less visible. Examples include Elizabeth Warren in 2019 and Rotem Shtarkman in 2016.

Facebook has systems to monitor specific terms and keywords and trigger automatic or semi-automatic action. In the context of media reports and lawsuits from people formerly working on Facebook content moderation, a former employee has claimed that specific rules existed to monitor and sometimes target posts that are anti-Facebook or criticize Facebook for some action, for instance by matching the keywords "Facebook" or "DeleteFacebook".

Image censorship

Facebook has a policy of removing photos which they believe violate the terms and conditions of the website. Images have been removed from user pages on topics such as breastfeeding, nudes in art, apparent breasts, naked mannequins, kisses between persons of the same sex and family photos.

In September 2016, Norwegian author Tom Egeland published Nick Ut's iconic napalm girl photo on his Facebook page. He was banned for publishing "a picture of a nude child". A few weeks later, the newspaper Aftenposten published an open letter to Zuckerberg after the banning of "Napalm Girl", a Pulitzer Prize-winning documentary photograph from the Vietnam War made by Nick Ut. Half of the ministers in the Norwegian government shared the famous Nick Ut photo on their Facebook pages, among them prime minister Erna Solberg from the Conservative Party (Høyre). But after only a few hours, several of the Facebook posts, including the Prime Minister's post, were deleted by Facebook.

As a reaction to the letter, Facebook reconsidered its opinion on this picture and republished it, recognizing "the history and global importance of this image in documenting a particular moment in time".

Breastfeeding photos

Facebook has been repeatedly criticized for removing photos uploaded by mothers breastfeeding their babies, treating photos that show an exposed breast as violations of its decency code even when the baby covered the nipple. By contrast, Facebook took several days to respond to criticism and deactivate a paid advertisement for a dating service that used a photo of a topless model.

The breastfeeding photo controversy continued following public protests and the growth in membership of a Facebook group titled "Hey, Facebook, breastfeeding is not obscene! (Official petition to Facebook)." In December 2011, Facebook removed photos of mothers breastfeeding and after public criticism, restored the photos. The company said it removed the photos they believed violated the pornographic rules in the company's terms and conditions. During February 2012, the company renewed its policy of removing photos of mothers breastfeeding. Founders of a Facebook group "Respect the Breast" reported that "women say they are tired of people lashing out at what is natural and what they believe is healthy for their children."

Censorship of editorial content

On February 4, 2010, a number of Facebook groups against the Democratic Alliance for the Betterment and Progress of Hong Kong (DAB) were removed without any reason given. The DAB is one of the largest pro-Beijing political parties in Hong Kong. The affected groups have since been restored.

Censorship of the word "moskal"

Around July 1, 2015, Facebook started to automatically ban accounts that use the word "moskal", a widely used historical slang term for people of Russia (known as Moskovia until 1721), which may be seen as offensive by some individuals. However, the use of similar words such as "khokhol", which are widely used by Russian nationalists against Ukrainians, as well as insulting uses of "ukrop" (literally, dill), was not penalized. In an experiment, journalist Max Kononenko posted the poem "Моя родословная" ("My Genealogy") by Alexander Pushkin, and his account was banned automatically within minutes. Posts by Max Ksenzov, a deputy head of Roskomnadzor, were similarly automatically deleted. Ksenzov accused Facebook of censorship and double standards and removed his account in protest.

Censorship on the Kashmir Freedom Movement

In 2016, Facebook banned and also removed content regarding the Kashmir dispute, triggering a response from The Guardian, BBC and other media groups on Facebook's policies on censorship. Facebook censorship policies have been criticized especially after the company banned the posts about the Indian army's attack on protesters, including children, with pellet guns. A human rights group superimposed pellet injuries similar to those inflicted on Kashmiri people on the faces of popular Indian actors, famous people including Facebook founder Mark Zuckerberg and even Prime Minister Narendra Modi as a response, which went viral.

Kurdish opposition censorship

Facebook has a policy of censoring anything related to Kurdish opposition to Turkey, such as maps of Kurdistan, flags of Kurdish armed groups (such as the PKK and YPG), and criticism of Mustafa Kemal Atatürk, the founder of Turkey.

Censorship of 'blasphemous' content

Facebook has worked with the Pakistani government to censor 'blasphemous' pages and speech inside Pakistan.

Censorship of anti-immigrant speech

In Germany, Facebook actively censors anti-immigrant speech.

In May 2016, Facebook and other technology companies agreed to a new "code of conduct" by the European Commission to review hateful online content within 24 hours of being notified, and subsequently remove such content if necessary. A year later, Reuters reported that the European Union had approved proposals to make Facebook and other technology companies tackle hate speech content on their platforms, but that a final agreement in the European Parliament is needed to make the proposals into law. In June 2017, the European Commission praised Facebook's efforts in fighting hateful content, having reviewed "nearly 58 percent of flagged content within 24 hours".

Third-party responses to Facebook

Government censorship

Main article: Censorship of Facebook

Several countries have banned access to Facebook, including Syria, China, and Iran. In 2010, the Office of the Data Protection Supervisor, a branch of the government of the Isle of Man, received so many complaints about Facebook that it deemed it necessary to provide a "Facebook Guidance" booklet (available online as a PDF file), which cited (amongst other things) Facebook policies and guidelines and included an elusive Facebook telephone number. This number, when called, proved to provide no telephone support for Facebook users, playing back only a recorded message advising callers to review Facebook's online help information.

In 2010, Facebook reportedly hosted a page deemed by the Islamic Lawyers Forum (ILF) to be anti-Muslim. The ILF filed a petition with Pakistan's Lahore High Court. On May 18, 2010, Justice Ijaz Ahmad Chaudhry ordered Pakistan's Telecommunication Authority to block access to Facebook until May 31. The offensive page had provoked street demonstrations in Muslim countries due to visual depictions of the Prophet Muhammad, which are regarded as blasphemous by Muslims. A spokesman said the Pakistan Telecommunication Authority would move to implement the ban once the order had been issued by the Ministry of Information and Technology. "We will implement the order as soon as we get the instructions", Khurram Mehran told AFP. "We have already blocked the URL link and issued instructions to Internet service providers yesterday", he added. Rai Bashir told AFP that "We moved the petition in the wake of widespread resentment in the Muslim community against the Facebook contents". The petition called on the government of Pakistan to lodge a strong protest with the owners of Facebook, he added. Bashir said a PTA official told the judge his organization had blocked the page, but the court ordered a total ban on the site. People demonstrated outside the court in the eastern city of Lahore, Pakistan, carrying banners condemning Facebook. Larger-scale protests took place in Pakistan after the ban and widespread news of the objectionable page. The ban was lifted on May 31 after Facebook reportedly assured the Lahore High Court that it would remedy the issues in dispute.

In 2011, a court in Pakistan was petitioned to place a permanent ban on Facebook for hosting a page called "2nd Annual Draw Muhammad Day May 20th 2011".

Organizations blocking access

Ontario government employees, federal public servants, MPPs, and cabinet ministers were blocked from access to Facebook on government computers in May 2007. When employees tried to access Facebook, a warning message appeared: "The Internet website that you have requested has been deemed unacceptable for use for government business purposes". The same warning appears when employees try to access YouTube, MySpace, gambling or pornographic websites. However, innovative employees have found ways around such protocols, and many claim to use the site for political or work-related purposes.

A number of local governments including those in the UK and Finland imposed restrictions on the use of Facebook in the workplace due to the technical strain incurred. Other government-related agencies, such as the US Marine Corps have imposed similar restrictions. A number of hospitals in Finland have also restricted Facebook use citing privacy concerns.

Schools blocking access

The University of New Mexico (UNM) in October 2005 blocked access to Facebook from UNM campus computers and networks, citing unsolicited emails and a similar site called UNM Facebook. After a UNM user signed into Facebook from off campus, a message from Facebook said, "We are working with the UNM administration to lift the block and have explained that it was instituted based on erroneous information, but they have not yet committed to restore your access." UNM, in a message to students who tried to access the site from the UNM network, wrote, "This site is temporarily unavailable while UNM and the site owners work out procedural issues. The site is in violation of UNM's Acceptable Computer Use Policy for abusing computing resources (e.g., spamming, trademark infringement, etc.). The site forces use of UNM credentials (e.g., NetID or email address) for non-UNM business." However, after Facebook created an encrypted login and displayed a precautionary message not to use university passwords for access, UNM unblocked access the following spring semester.

The Columbus Dispatch reported on June 22, 2006, that Kent State University's athletic director had planned to ban the use of Facebook by athletes and gave them until August 1 to delete their accounts. On July 5, 2006, the Daily Kent Stater reported that the director reversed the decision after reviewing the privacy settings of Facebook. As long as they followed the university's policies of online conduct, they could keep their profiles.

Closed social networks

Several websites concerned with social networking, such as Plugtodo.com and Salesforce.com, have criticized the lack of information that users get when they share data. Advanced users cannot limit the amount of information anyone can access in their profiles, while Facebook promotes the sharing of personal information for marketing purposes, leading to the promotion of the service using personal data from users who are not fully aware of this. Facebook exposes personal data without supporting open standards for data interchange. According to several communities and authors, closed social networking, on the other hand, promotes data retrieval from other people while not exposing one's personal information.

Openbook was established in early 2010 both as a parody of Facebook and a critique of its changing privacy management protocols.

Litigation

Further information: List of lawsuits involving Facebook

Terms of use controversy

While Facebook originally made changes to its terms of use, or terms of service, on February 4, 2009, the changes went unnoticed until Chris Walters, a blogger for the consumer-oriented blog The Consumerist, noticed the change on February 15, 2009. Walters complained that the change gave Facebook the right to "Do anything they want with your content. Forever." The section under the most controversy is the "User Content Posted on the Site" clause. Before the changes, the clause read:

You may remove your User Content from the Site at any time. If you choose to remove your User Content, the license granted above will automatically expire, however you acknowledge that the Company may retain archived copies of your User Content.

The "license granted" refers to the license that Facebook holds over one's "name, likeness, and image" for use in promotions and external advertising. The new terms of use deleted the phrase stating that the license would "automatically expire" if a user chose to remove content. By omitting this line, Facebook's license extends over users' content perpetually and irrevocably, even years after the content has been deleted.

Many users of Facebook voiced opinions against the changes to the Facebook Terms of Use, leading to an Internet-wide debate over the ownership of content. The Electronic Privacy Information Center (EPIC) prepared a formal complaint with the Federal Trade Commission. Many individuals were frustrated with the removal of the controversial clause. Facebook users, numbering more than 38,000, joined a user group against the changes, and a number of blogs and news sites have written about this issue.

After the change was brought to light in Walters's blog entry, Zuckerberg addressed the issues concerning the recently made changes to Facebook's terms of use in a blog post on February 16, 2009. Zuckerberg wrote, "Our philosophy is that people own their information and control who they share it with." In addition to this statement, Zuckerberg explained the paradox created when people want to share their information (phone number, pictures, email address, etc.) with the public, but at the same time desire to remain in complete control of who has access to this information.

To calm criticism, Facebook returned to its original terms of use. However, on February 17, 2009, Zuckerberg wrote in his blog, that although Facebook reverted to its original terms of use, it is in the process of developing new terms to address the paradox. Zuckerberg stated that these new terms will allow Facebook users to "share and control their information, and it will be written clearly in language everyone can understand." Zuckerberg invited users to join a group entitled "Facebook Bill of Rights and Responsibilities" to give their input and help shape the new terms.

On February 26, 2009, Zuckerberg posted a blog entry updating users on the progress of the new Terms of Use. He wrote, "We decided we needed to do things differently and so we're going to develop new policies that will govern our system from the ground up in an open and transparent way." Zuckerberg introduced two new additions to Facebook: the Facebook Principles and the Statement of Rights and Responsibilities. Both additions allow users to vote on changes to the terms of use before they are officially released. Because "Facebook is still in the business of introducing new and therefore potentially disruptive technologies", Zuckerberg explained, users need to adjust and familiarize themselves with the products before they can adequately show their support.

This new voting system was initially applauded as Facebook's step to a more democratized social network system. However, the new terms were harshly criticized in a report by computer scientists from the University of Cambridge, who stated that the democratic process surrounding the new terms is disingenuous and significant problems remain in the new terms. The report was endorsed by the Open Rights Group.

In December 2009, EPIC and a number of other U.S. privacy organizations filed another complaint with the Federal Trade Commission (FTC) regarding Facebook's Terms of Service. In January 2011, EPIC filed a subsequent complaint claiming that Facebook's new policies of sharing users' home addresses and mobile phone information with third-party developers were "misleading and fail to provide users clear and privacy protections", particularly for children under age 18. Facebook temporarily suspended implementation of its policy in February 2011, but the following month announced it was "actively considering" reinstating the third-party policy.

Interoperability and data portability

Facebook has been criticized for failing to offer users a feature to export their friends' information, such as contact information, for use with other services or software. The inability of users to export their social graph in an open standard format contributes to vendor lock-in and contravenes the principles of data portability. Automated collection of user information without Facebook's consent violates its Statement of Rights and Responsibilities, and third-party attempts to do so (e.g., via web scraping) have resulted in litigation, as in the case of Power.com.
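To make the portability point concrete, the following is a minimal sketch (in Python, with made-up contact data) of the kind of open-standard export critics wanted: rendering a friends list as vCard 3.0, a format any address-book application can import. Facebook offered no such feature; this is purely illustrative.

```python
# Illustrative sketch only: serializing a contact list to the open vCard 3.0
# format, the kind of portable export critics argued Facebook should offer.
# The contact data below is invented for demonstration.

def to_vcard(contacts):
    """Render a list of {'name': ..., 'email': ...} dicts as vCard 3.0 text."""
    cards = []
    for c in contacts:
        cards.append(
            "BEGIN:VCARD\n"
            "VERSION:3.0\n"
            f"FN:{c['name']}\n"
            f"EMAIL:{c['email']}\n"
            "END:VCARD"
        )
    return "\n".join(cards)

friends = [
    {"name": "Alice Example", "email": "alice@example.com"},
    {"name": "Bob Example", "email": "bob@example.com"},
]
print(to_vcard(friends))
```

Because vCard is an open standard, output like this could be imported into any competing service, which is precisely the lock-in-avoiding behavior data-portability advocates called for.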

Facebook Connect has been criticized for its lack of interoperability with OpenID.

Lawsuits over privacy

Facebook's strategy of making revenue through advertising has created controversy among its users, with some arguing that it is "a bit creepy… but it is also brilliant." Some Facebook users have raised privacy concerns because they do not like that Facebook sells users' information to third parties. In 2012, users sued Facebook for using their pictures and information in a Facebook advertisement. Facebook gathers user information by keeping track of the pages users have "Liked" and of the interactions users have with their connections. It then creates value from the gathered data by selling it. In 2009, users also filed a lawsuit over Facebook's privacy invasion through the Facebook Beacon system. Facebook's team believed that through the Beacon system people could inspire their friends to buy similar products; however, users did not like the idea of sharing certain online purchases with their Facebook friends, and objected to that private information being shared with the world. Facebook users became more aware of Facebook's behavior with user information in 2009, when Facebook launched its new Terms of Service. In its terms of service, Facebook admits that user information may be used for some of its own purposes, such as sharing a link to a user's posted images or for its own commercials and advertisements.

As Dijck argues in his book, "the more users know about what happens to their personal data, the more inclined they are to raise objections." This created a battle between Facebook and its users, described as the "battle for information control." Facebook users have become aware of Facebook's intentions, and people now see Facebook "as serving the interests of companies rather than its users." In response to Facebook's selling of user information to third parties, concerned users have resorted to obfuscation: purposely hiding their real identity and providing Facebook with false information that makes the collected data less accurate. By obfuscating information through tools such as FaceCloak, Facebook users have regained a measure of control over their personal information.

Better Business Bureau review

As of December 2010, the Better Business Bureau gave Facebook an "A" rating.

As of December 2010, the 36-month running count of complaints about Facebook logged with the Better Business Bureau is 1136, including 101 ("Making a full refund, as the consumer requested"), 868 ("Agreeing to perform according to their contract"), 1 ("Refuse to adjust, relying on terms of agreement"), 20 ("Unassigned"), 0 ("Unanswered") and 136 ("Refusing to make an adjustment").

Security

Facebook's software has proven vulnerable to likejacking. On July 28, 2010, the BBC reported that security consultant Ron Bowes used a piece of code to scan Facebook profiles and collect data from 100 million profiles; the data collected was not hidden by the users' privacy settings. Bowes then published the list online. This list, which has been shared as a downloadable file, contains the URL of every searchable Facebook user's profile, along with their name and unique ID. Bowes said he published the data to highlight privacy issues, but Facebook claimed it was already public information.

In early June 2013, The New York Times reported that an increase in malicious links related to the Trojan horse malware program Zeus had been identified by Eric Feinberg, founder of the advocacy group Fans Against Kounterfeit Enterprise (FAKE). Feinberg said that the links were present on popular NFL Facebook fan pages and, following contact with Facebook, was dissatisfied with the corporation's "after-the-fact approach". Feinberg called for oversight, stating, "If you really want to hack someone, the easiest place to start is a fake Facebook profile—it's so simple, it's stupid."

Rewards for vulnerability reporting

On August 19, 2013, it was reported that a Facebook user from the Palestinian Autonomy, Khalil Shreateh, found a bug that allowed him to post material to other users' Facebook Walls. Users are not supposed to be able to post material to the Facebook Walls of other users unless they are approved friends of those users. To prove that he was telling the truth, Shreateh posted material to the wall of Sarah Goodin, a friend of Facebook CEO Mark Zuckerberg. Following this, Shreateh contacted Facebook's security team with proof that his bug was real, explaining in detail what was going on. Facebook has a bounty program that pays a fee of $500 or more for reporting bugs instead of exploiting them or selling them on the black market. However, it was reported that instead of fixing the bug and paying Shreateh the fee, Facebook originally told him that "this was not a bug" and dismissed him. Shreateh then tried a second time to inform Facebook, but was dismissed yet again. On the third try, Shreateh used the bug to post a message to Mark Zuckerberg's Wall, stating "Sorry for breaking your privacy ... but a couple of days ago, I found a serious Facebook exploit" and that Facebook's security team was not taking him seriously. Within minutes, a security engineer contacted Shreateh, questioned him on how he performed the move, and ultimately acknowledged that it was a bug in the system. Facebook temporarily suspended Shreateh's account and fixed the bug after several days. However, in a move that was met with much public criticism and disapproval, Facebook refused to pay out the fee to Shreateh, responding that by posting to Zuckerberg's account, Shreateh had violated one of its terms of service policies and therefore "could not be paid." The Facebook team also strongly censured Shreateh over his manner of resolving the matter. In closing, they asked that Shreateh continue to help them find bugs.

On August 22, 2013, Yahoo News reported that Marc Maiffret, a chief technology officer of the cybersecurity firm BeyondTrust, was urging fellow hackers to help raise a $10,000 reward for Khalil Shreateh. On August 20, Maiffret stated that he had already raised $9,000 in his efforts, including the $2,000 he himself contributed. He and other hackers have denounced Facebook for refusing Shreateh compensation. Maiffret said: "He is sitting there in Palestine doing this research on a five-year-old laptop that looks like it is half broken. It's something that might help him out in a big way." Facebook representatives have since responded, "We will not change our practice of refusing to pay rewards to researchers who have tested vulnerabilities against real users." Facebook representatives also claimed they had paid out over $1 million to individuals who have discovered bugs in the past.

Environmental impacts

See also: Green computing

In 2010, Prineville, Oregon, was chosen as the site for Facebook's new data center. However, the center has been met with criticism from environmental groups such as Greenpeace because the power utility company contracted for the center, PacifiCorp, generates 60% of its electricity from coal. In September 2010, Facebook received a letter from Greenpeace containing half a million signatures asking the company to cut its ties to coal-based electricity.

On April 21, 2011, Greenpeace released a report showing that of the top ten big brands in cloud computing, Facebook relied the most on coal for electricity for its data centers. At the time, data centers consumed up to 2% of all global electricity and this amount was projected to increase. Phil Radford of Greenpeace said "we are concerned that this new explosion in electricity use could lock us into old, polluting energy sources instead of the clean energy available today".

On December 15, 2011, Greenpeace and Facebook announced together that Facebook would shift to use clean and renewable energy to power its own operations. Marcy Scott Lynn, of Facebook's sustainability program, said it looked forward "to a day when our primary energy sources are clean and renewable" and that the company is "working with Greenpeace and others to help bring that day closer".

Advertising

Click fraud

In July 2012, startup Limited Run claimed that 80% of its Facebook clicks came from bots. Limited Run co-founder Tom Mango told TechCrunch that they "spent roughly a month testing this" with six web analytics services, including Google Analytics and in-house software. Limited Run said it concluded that the clicks were fraudulent after running its own analysis: most of the clicks for which Facebook was charging it came from computers that were not loading JavaScript, the programming language that allows web pages to be interactive. Almost all web browsers load JavaScript by default, so the assumption is that if a click comes from a browser that is not loading it, the click is probably not from a real person but from a bot.
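Limited Run's heuristic, as described above, can be sketched as follows. The log format, field names, and data are illustrative assumptions for this sketch, not Limited Run's actual code:

```python
# Sketch of the JavaScript-loading heuristic: count the ad clicks whose
# browser never fired a JavaScript beacon back to the server, and treat
# that share as the suspected-bot rate. Fields and data are illustrative.

def suspected_bot_rate(clicks):
    """clicks: list of {'ip': ..., 'js_beacon': bool} records from server logs."""
    if not clicks:
        return 0.0
    no_js = sum(1 for c in clicks if not c["js_beacon"])
    return no_js / len(clicks)

log = [
    {"ip": "203.0.113.1", "js_beacon": True},   # likely a real browser
    {"ip": "203.0.113.2", "js_beacon": False},  # never loaded JavaScript
    {"ip": "203.0.113.3", "js_beacon": False},
    {"ip": "203.0.113.4", "js_beacon": False},
    {"ip": "203.0.113.5", "js_beacon": False},
]
print(f"{suspected_bot_rate(log):.0%} of clicks look automated")  # prints "80% of clicks look automated"
```

The heuristic is imperfect (some real users disable JavaScript), but because nearly all browsers enable it by default, a large share of script-less clicks is a strong signal of automation.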

Like fraud

Facebook offers an advertising tool for pages to get more "likes". According to Business Insider, this advertising tool is called "Suggested Posts" or "Suggested Pages", allowing companies to market their page to thousands of new users for as little as $50.

Global Fortune 100 firms are increasingly using social media marketing tools, as the number of "likes" per Facebook page has risen by 115% globally. Biotechnology company Comprendia investigated Facebook "likes" generated through advertising by analyzing the life science pages with the most likes. It concluded that as much as 40% of "likes" on company pages are suspected to be fake. According to Facebook's annual report, an estimated 0.4% to 1.2% of active users are undesirable accounts that create fake likes.

Small companies such as PubChase have publicly testified against Facebook's advertising tool, claiming that legitimate advertising on Facebook generates fraudulent "likes". In May 2013, PubChase decided to build up its Facebook following through Facebook's advertising tool, which promises to "connect with more of the people who matter to you". After the first day, the company grew suspicious of the increased likes, as it ended up with 900 likes from India. According to PubChase, none of the users behind the "likes" seemed to be scientists, and statistics from Google Analytics indicated that India is not in the company's main user base. PubChase further stated that Facebook offers no interface to delete the fake likes; rather, page owners must manually delete each follower themselves.

In February 2014, Derek Muller used his YouTube account Veritasium to upload a video titled "Facebook Fraud". Within three days, the video had gone viral with more than a million views (it had reached 2,521,614 views as of June 10, 2014). In the video, Muller illustrates how, after he paid US$50 for Facebook advertising, the "likes" on his fan page tripled in a few days and soon reached 70,000, compared with his original 2,115 likes before the advertising. Despite the significant increase in likes, Muller noticed that his page had actually decreased in engagement: fewer people were commenting on, sharing, and liking his posts and updates. Muller also noticed that the users who "liked" his page were users who liked hundreds of other pages, including competing pages such as AT&T and T-Mobile. He theorizes that such users purposely click "like" on any and every page to divert attention from the pages they were paid to "like". Muller claims, "I never bought fake likes, I used Facebook legitimate advertising, but the results are as if I paid for fake likes from a click farm".

In response to the fake "likes" complaints, Facebook told Business Insider:

We're always focused on maintaining the integrity of our site, but we've placed an increased focus on abuse from fake accounts recently. We've made a lot of progress by building a combination of automated and manual systems to block accounts used for fraudulent purposes and Like button clicks. We also take action against sellers of fake clicks and help shut them down.

Undesired targeting

On August 3, 2007, several British companies, including First Direct, Vodafone, Virgin Media, The Automobile Association, Halifax and Prudential pulled advertising in Facebook after finding that their ads were displayed on the page of the British National Party, a far-right political party.

Facilitation of housing discrimination

Facebook has faced allegations that its advertising platforms facilitate housing discrimination by means of internal functions for targeted advertising, which allowed advertisers to target or exclude specific audiences from campaigns. Researchers have also found that Facebook's advertising platform may be inherently discriminatory, since ad delivery is also influenced by how often specific demographics interact with specific types of advertising—even if they are not explicitly determined by the advertiser.

Under the United States' Fair Housing Act, it is illegal to show a preference for or against tenants based on specific protected classes (including race, ethnicity, and disabilities), when advertising or negotiating the rental or sale of housing. In 2016, ProPublica found that advertisers could target or exclude users from advertising based on an "Ethnic Affinity"—a demographic trait which is determined based on a user's interests and behaviors on Facebook, and not explicitly provided by the user. This could, in turn, be used to discriminate based on race. In February 2017, Facebook stated that it would implement stronger measures to forbid discriminatory advertising across the entire platform. Advertisers who attempt to create ads for housing, employment, or credit (HEC) opportunities would be blocked from using ethnic affinities (renamed "multicultural affinities" and now classified as behaviors) to target the ad. If an advertiser uses any other audience segment to target ads for HEC, they would be informed of the policies, and be required to affirm their compliance with relevant laws and policies.

However, in November 2017, ProPublica found that automated enforcement of these new policies was inconsistent. They were also able to successfully create housing ads that excluded users based on interests and other factors that effectively imply associations with protected classes, including interests in wheelchair ramps, the Spanish-language television network Telemundo, and New York City ZIP codes with majority minority populations. In response to the report, Facebook temporarily disabled the ability to target any ad with exclusions based on multicultural affinities.

In April 2018, Facebook permanently removed the ability to create exclusions based on multicultural affinities. In July 2018, Facebook signed a legally binding agreement with the State of Washington to take further steps within 90 days to prevent the use of its advertising platform for housing discrimination against protected classes. The following month, Facebook announced that it would remove at least 5,000 categories from its exclusion system to prevent "misuse", including those relating to races and religions. On March 19, 2019, Facebook settled a lawsuit over the matter with the National Fair Housing Alliance, agreeing to create a separate portal for HEC advertising with limited targeting options by September 2019, and to provide a public archive of all HEC advertising.

On March 28, 2019, the U.S. Department of Housing and Urban Development (HUD) filed a lawsuit against Facebook, having filed a formal complaint against the company on August 13, 2018. The HUD also took issue with Facebook's tendency to deliver ads based on users having "particular characteristics most likely to engage with the ad".

Fake accounts

In August 2012, Facebook revealed that more than 83 million Facebook accounts (8.7% of total users) were fake. These fake profiles consist of duplicate profiles, accounts created for spamming purposes, and personal profiles for businesses, organizations, or non-human entities such as pets. As a result of this revelation, Facebook's share price dropped below $20. There has since been much effort to detect fake profiles using automated means; in one such work, machine learning techniques are used to detect fake users.
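As an illustration of the automated-detection idea (not Facebook's or any published system's actual method), a toy machine-learning classifier might flag accounts from simple behavioral features. Here a perceptron is trained on made-up examples in which fake accounts have few friends, spam-like posting rates, and no profile photo; all features, data, and labels are invented for the sketch:

```python
# Toy sketch of automated fake-profile detection: a perceptron trained on
# invented per-account features (friends/100, posts per day, profile-photo
# flag). Purely illustrative; not any real platform's detection system.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Classic perceptron learning rule over the training set."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Features: [friends / 100, posts per day, has profile photo (0/1)]
X = [[3.0, 0.5, 1], [2.5, 1.0, 1], [0.1, 40.0, 0], [0.2, 55.0, 0]]
y = [0, 0, 1, 1]  # 1 = fake (few friends, spam-like posting, no photo)

w, b = train_perceptron(X, y)
print(predict(w, b, [0.1, 30.0, 0]))  # prints 1: flagged as likely fake
```

Real systems use far richer features and models, but the structure is the same: learn a decision boundary from labeled accounts, then score new accounts against it.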

Facebook initially refused to remove a “business” page devoted to a woman’s anus, created without her knowledge while she was underage, due to other Facebook users having expressed interest in the topic. After Buzzfeed published a story about it, the page was finally removed. The page listed her family’s former home address as that of the “business”.

User interface

Upgrades

September 2008

In September 2008, Facebook permanently moved its users to what they termed the "New Facebook" or Facebook 3.0. This version contained several different features and a complete layout redesign. Between July and September, users had been given the option to use the new Facebook in place of the original design, or to return to the old design.

Facebook's decision to migrate their users was met with some controversy in their community. Several groups started opposing the decision, some with over a million users.

October 2009

In October 2009, Facebook redesigned the news feed so that the user could view all types of things that their friends were involved with. In a statement, they said,

... your applications generate can show up in both views. The best way for your stories to appear in the News Feed filter is to create stories that are highly engaging, as high quality, interesting stories are most likely to garner likes and comments by the user's friends.

This redesign was explained as:

News Feed will focus on popular content, determined by an algorithm based on interest in that story, including the number of times an item is liked or commented on. Live Feed will display all recent stories from a large number of a user's friends.

The redesign was met immediately with criticism from users, many of whom did not like the amount of information coming at them. This was compounded by the fact that users could not select what they saw.

November/December 2009

In November 2009, Facebook issued a proposed new privacy policy, and adopted it unaltered in December 2009. They combined this with a rollout of new privacy settings. This new policy declared certain information, including "lists of friends", to be "publicly available", with no privacy settings; it was previously possible to keep access to this information restricted. Due to this change, the users who had set their "list of friends" as private were forced to make it public without even being informed, and the option to make it private again was removed. This was protested by many people and privacy organizations such as the EFF.

The change was described by Ryan Tate as Facebook's Great Betrayal, forcing user profile photos and friends lists to be visible in users' public listing, even for users who had explicitly chosen to hide this information previously, and making photos and personal information public unless users were proactive about limiting access. For example, a user whose "Family and Relationships" information was set to be viewable by "Friends Only" would default to being viewable by "Everyone" (publicly viewable). That is, information such as the gender of the partner the user is interested in, relationship status, and family relations became viewable even to those without a Facebook account. Facebook was heavily criticized for both reducing its users' privacy and pushing users to remove privacy protections. Groups criticizing the changes included the Electronic Frontier Foundation and the American Civil Liberties Union. Mark Zuckerberg, CEO, had hundreds of personal photos and his events calendar exposed in the transition. Facebook has since re-included an option to hide friends lists from being viewable; however, this preference is no longer listed with other privacy settings, and the former ability to hide the friends list from selected people among one's own friends is no longer possible. Journalist Dan Gillmor deleted his Facebook account over the changes, stating he "can't entirely trust Facebook", and Heidi Moore at Slate's Big Money temporarily deactivated her account as a "conscientious objection". Other journalists have been similarly disappointed and outraged by the changes. Defending the changes, founder Mark Zuckerberg said "we decided that these would be the social norms now and we just went for it". The Office of the Privacy Commissioner of Canada launched another investigation into Facebook's privacy policies after complaints following the change.

January 2018

Following a difficult 2017, marked by accusations of relaying fake news and revelations about groups close to Russia that tried to influence the 2016 U.S. presidential election (see Russian interference in the 2016 United States elections) via advertisements on its service, Mark Zuckerberg announced in his traditional January post:

“We're making a major change to how we build Facebook. I'm changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions”.

— Mark Zuckerberg

Following surveys of Facebook users, this desire for change took the form of a reconfiguration of the News Feed algorithms to:

  • Prioritize content of family members and friends (Mark Zuckerberg January 12, Facebook: “The first changes you'll see will be in News Feed, where you can expect to see more from your friends, family and groups”.)
  • Give priority to news articles from local sources considered more credible

The recent changes to the News Feed algorithm (see News Feed § History) are expected to improve "the amount of meaningful content viewed". To this end, the new algorithm is supposed to determine the publications around which a user is most likely to interact with his friends, and make them appear higher in the News Feed instead of items from, for example, media companies or brands. These are posts "that inspire back-and-forth discussion in the comments and posts that you might want to share and react to". But, as even Mark Zuckerberg admitted, he "expect[s] the time people spend on Facebook and some measures of engagement will go down. But I also expect the time you do spend on Facebook will be more valuable". The less public content a Facebook user sees on their News Feed, the less brands are able to reach consumers, which is arguably a major loss for advertisers and publishers.

This change, which might seem to be just another update of the social network, has been widely criticized because of the heavy consequences it might lead to: "In countries such as the Philippines, Myanmar and South Sudan and emerging democracies such as Bolivia and Serbia, it is not ethical to plead platform neutrality or to set up the promise of a functioning news ecosystem and then simply withdraw at a whim". Indeed, in such countries, Facebook held the promise of a reliable and objective platform from which people could hope for raw information. Independent media companies tried to fight censorship through their articles and were, in a way, promoting citizens' right to know what is going on in their countries.

The company's way of handling scandals and criticism over fake news by diminishing its media-company image has even been described as "potentially deadly" with regard to the poor and fraught political environments, such as Myanmar or South Sudan, drawn in by the social network's "Free Basics" program. Serbian journalist Stevan Dojcinovic goes further, describing Facebook as a "monster" and accusing the company of "showing a cynical lack of concern for how its decisions affect the most vulnerable". Facebook had experimented with withdrawing media companies' news from users' News Feeds in a few countries, such as Serbia. Dojcinovic then wrote an article explaining how Facebook had helped independent media "to bypass mainstream channels and bring stories to hundreds of thousands of readers". The fact that the rule about publishers does not apply to paid posts raised the journalist's fears about the social network "becoming just another playground for the powerful", by letting them, for example, buy Facebook ads. Criticism is also visible from other media companies, which depict the private company as the "destroyer of worlds". LittleThings CEO Joe Speiser stated that the algorithm shift "took out roughly 75% of LittleThings' organic traffic while hammering its profit margins", compelling the company to close its doors because it relied on Facebook to share content.

Net neutrality

"Free basics" controversy in India

In February 2016, TRAI ruled against differential data pricing for limited services from mobile phone operators, effectively ending zero-rating platforms in India. Zero rating provides access to a limited number of websites at no charge to the end user. Net-neutrality supporters in India (SaveTheInternet.in) highlighted the negative implications of Facebook's Free Basics program and spread awareness among the public. The Free Basics program was a collaboration between Facebook and Reliance Communications to launch Free Basics in India. The TRAI ruling against differential pricing marked the end of Free Basics in India.

Earlier, Facebook had spent US$44 million on advertising, imploring all of its Indian users to send an email to the Telecom Regulatory Authority to support its program. TRAI later asked Facebook to provide specific responses from the supporters of Free Basics.

Treatment of potential competitors

In December 2018, details of Facebook's behavior against competitors surfaced when UK Member of Parliament Damian Collins released files from a court case between Six4Three and Facebook. According to those files, the social media company Twitter released its app Vine in 2013, and Facebook blocked Vine's access to its data.

In July 2020, Facebook, along with fellow tech giants Apple, Amazon, and Google, was accused of maintaining harmful market power and anti-competitive strategies to quash potential competitors. The CEOs of the respective firms appeared via teleconference before lawmakers of the United States Congress on July 29, 2020.

See also

References

  1. Duncan, Geoff (June 17, 2010). "Open letter urges Facebook to strengthen privacy". Digital Trends. Retrieved June 3, 2017.
  2. Paul, Ian (June 17, 2010). "Advocacy Groups Ask Facebook for More Privacy Changes". PC World. International Data Group. Retrieved June 3, 2017.
  3. Aspen, Maria (February 11, 2008). "How Sticky Is Membership on Facebook? Just Try Breaking Free". The New York Times. Retrieved June 3, 2017.
  4. Anthony, Sebastian (March 19, 2014). "Facebook's facial recognition software is now as accurate as the human brain, but what now?". ExtremeTech. Ziff Davis. Retrieved June 3, 2017.
  5. Gannes, Liz (June 8, 2011). "Facebook facial recognition prompts EU privacy probe". CNET. Retrieved June 3, 2017.
  6. Friedman, Matt (March 21, 2013). "Bill to ban companies from asking about job candidates' Facebook accounts is headed to governor". The Star-Ledger. Advance Digital. Retrieved June 3, 2017.
  7. "How Facebook Breeds Jealousy". Seeker. Group Nine Media. February 10, 2010. Retrieved June 3, 2017.
  8. Matyszczyk, Chris (August 11, 2009). "Study: Facebook makes lovers jealous". CNET. Retrieved June 3, 2017.
  9. Ngak, Chenda (November 27, 2012). "Facebook may cause stress, study says". CBS News. Retrieved June 3, 2017.
  10. Smith, Dave (November 13, 2015). "Quitting Facebook will make you happier and less stressed, study says". Business Insider. Axel Springer SE. Retrieved June 3, 2017.
  11. Bugeja, Michael J. (January 23, 2006). "Facing the Facebook". The Chronicle of Higher Education. Archived from the original on February 20, 2008. Retrieved June 3, 2017.
  12. Hough, Andrew (April 8, 2011). "Student 'addiction' to technology 'similar to drug cravings', study finds". The Daily Telegraph. Retrieved June 3, 2017.
  13. "Facebook and Twitter 'more addictive than tobacco and alcohol'". The Daily Telegraph. February 1, 2012. Retrieved June 3, 2017.
  14. Wauters, Robin (September 16, 2010). "Greenpeace Slams Zuckerberg For Making Facebook A 'So Coal Network' (Video)". TechCrunch. AOL. Retrieved June 3, 2017.
  15. Neate, Rupert (December 23, 2012). "Facebook paid £2.9m tax on £840m profits made outside US, figures show". The Guardian. Retrieved June 3, 2017.
  16. ^ Grinberg, Emanuella (September 18, 2014). "Facebook 'real name' policy stirs questions around identity". CNN. Retrieved June 3, 2017.
  17. Doshi, Vidhi (July 19, 2016). "Facebook under fire for 'censoring' Kashmir-related posts and accounts". The Guardian. Retrieved June 3, 2017.
  18. Arrington, Michael (November 22, 2007). "Is Facebook Really Censoring Search When It Suits Them?". TechCrunch. AOL. Retrieved June 3, 2017.
  19. Wong, Julia Carrie (March 18, 2019). "The Cambridge Analytica scandal changed the world – but it didn't change Facebook". The Guardian. Retrieved May 2, 2019.
  20. Greenwald, Glenn; MacAskill, Ewen (June 7, 2013). "NSA Prism program taps in to user data of Apple, Google and others". The Guardian. Retrieved June 3, 2017.
  21. Setalvad, Ariha (August 7, 2015). "Why Facebook's video theft problem can't last". The Verge. Retrieved June 3, 2017.
  22. "Facebook, Twitter and Google grilled by MPs over hate speech". BBC News. BBC. March 14, 2017. Retrieved June 3, 2017.
  23. Toor, Amar (September 15, 2015). "Facebook will work with Germany to combat anti-refugee hate speech". The Verge. Retrieved June 3, 2017.
  24. Sherwell, Philip (October 16, 2011). "Cyber anarchists blamed for unleashing a series of Facebook 'rape pages'". The Daily Telegraph. Retrieved June 3, 2017.
  25. "20,000 Israelis sue Facebook for ignoring Palestinian incitement". The Times of Israel. October 27, 2015. Retrieved June 3, 2017.
  26. "Israel: Facebook's Zuckerberg has blood of slain Israeli teen on his hands". The Times of Israel. July 2, 2016. Retrieved June 3, 2017.
  27. Burke, Samuel (November 19, 2016). "Zuckerberg: Facebook will develop tools to fight fake news". CNN. Retrieved June 3, 2017.
  28. "Hillary Clinton says Facebook 'must prevent fake news from creating a new reality'". The Daily Telegraph. June 1, 2017. Retrieved June 3, 2017.
  29. Fiegerman, Seth (May 9, 2017). "Facebook's global fight against fake news". CNN. Retrieved June 3, 2017.
  30. Grinberg, Emanuella; Said, Samira (March 22, 2017). "Police: At least 40 people watched teen's sexual assault on Facebook Live". CNN. Retrieved June 3, 2017.
  31. Grinberg, Emanuella (January 5, 2017). "Chicago torture: Facebook Live video leads to 4 arrests". CNN. Retrieved June 3, 2017.
  32. Sulleyman, Aatif (April 27, 2017). "Facebook Live killings: Why the criticism has been harsh". The Independent. Retrieved June 3, 2017.
  33. Farivar, Cyrus (January 7, 2016). "Appeals court upholds deal allowing kids' images in Facebook ads". Ars Technica. Retrieved June 3, 2017.
  34. Levine, Dan; Oreskovic, Alexei (March 12, 2012). "Yahoo sues Facebook for infringing 10 patents". Reuters. Retrieved June 3, 2017.
  35. Wagner, Kurt (February 1, 2017). "Facebook lost its Oculus lawsuit and has to pay $500 million". Recode. Retrieved June 3, 2017.
  36. Brandom, Russell (May 19, 2016). "Lawsuit claims Facebook illegally scanned private messages". The Verge. Retrieved June 3, 2017.
  37. Tryhorn, Chris (July 25, 2007). "Facebook in court over ownership". The Guardian. Retrieved June 3, 2017.
  38. Michels, Scott (July 20, 2007). "Facebook Founder Accused of Stealing Idea for Site". ABC News. ABC. Retrieved June 3, 2017.
  39. Carlson, Nicholas (March 5, 2010). "How Mark Zuckerberg Hacked Into Rival ConnectU In 2004". Business Insider. Axel Springer SE. Retrieved June 3, 2017.
  40. Arthur, Charles (February 12, 2009). "Facebook paid up to $65m to founder Mark Zuckerberg's ex-classmates". The Guardian. Retrieved June 3, 2017.
  41. Singel, Ryan (April 11, 2011). "Court Tells Winklevoss Twins to Quit Their Facebook Whining". Wired. Retrieved June 3, 2017.
  42. Stempel, Jonathan (July 22, 2011). "Facebook wins dismissal of second Winklevoss case". Reuters. Retrieved June 3, 2017.
  43. Oweis, Khaled Yacoub (November 23, 2007). "Syria blocks Facebook in Internet crackdown". Reuters. Retrieved June 3, 2017.
  44. Wauters, Robin (July 7, 2009). "China Blocks Access To Twitter, Facebook After Riots". TechCrunch. AOL. Retrieved June 3, 2017.
  45. "Iranian government blocks Facebook access". The Guardian. May 24, 2009. Retrieved June 3, 2017.
  46. Frier, Sarah (August 13, 2019). "Facebook Paid Contractors to Transcribe Users' Audio Chats". Bloomberg News.
  47. "Facebook paid hundreds of contractors to transcribe users' audio". Los Angeles Times. August 13, 2019. Retrieved May 8, 2020.
  48. Haselton, Todd (August 13, 2019). "Facebook hired people to transcribe voice calls made on Messenger". CNBC. Retrieved May 8, 2020.
  49. "A Handy Facebook-to-English Translator | Electronic Frontier Foundation". Eff.org. April 28, 2010. Retrieved June 11, 2013.
  50. "Zuckerberg family pic stirs Facebook privacy debate". CBS News. December 27, 2012. Retrieved June 4, 2012.
  51. Hoffman, Harrison (August 12, 2007). "Facebook's source code goes public". CNET News.com.
  52. Richards, Jonathan (August 14, 2007). "Facebook Source Code Leaked Onto Internet". Fox News Channel. Archived from the original on May 29, 2013. Retrieved August 21, 2007.
  53. "Facebook's PHP leak SNAFU". Szinf.com. July 6, 2015. Archived from the original on July 7, 2015. Retrieved July 6, 2015.
  54. Cubrilovic, Nik (August 11, 2007). "Facebook Source Code Leaked". TechCrunch.com.
  55. Ortutay, Barbara (September 21, 2009). "Facebook to end Beacon tracking tool in settlement". USA Today. Retrieved December 8, 2010.
  56. Henry Blodget (December 1, 2007). "NYT: Facebook's Zuckerberg Misled Us; Coke: Ditto - Silicon Alley Insider". Alleyinsider.com. Archived from the original on January 31, 2009. Retrieved June 11, 2013.
  57. Stefan Berteau (November 29, 2007). "Facebook's Misrepresentation of Beacon's Threat to Privacy: Tracking users who opt out or are not logged in". CA Security Advisor Research Blog. Archived from the original on December 17, 2007. Retrieved December 24, 2007.
  58. Stefan Berteau (November 30, 2007). "Update: A Statement From Facebook". CA Security Advisor Research Blog. Archived from the original on November 28, 2010. Retrieved December 8, 2010.
  59. Rosmarin, Rachel (September 5, 2006). "Facebook's Makeover". Forbes. Archived from the original on October 5, 2006. Retrieved April 29, 2015.
  60. "Facebook CEO: 'We Really Messed This One Up'". NBC11.com. September 8, 2006. Archived from the original on January 28, 2007. Retrieved February 21, 2007.
  61. Kirkpatrick, David (2010). The Facebook Effect: The Inside Story of the Company That Is Connecting the World. New York City: Simon & Schuster. p. 191. ISBN 978-1-4391-0211-4.
  62. Jesdanun, Anick (2006). "Facebook offers new privacy options". Associated Press. Archived from the original on December 13, 2010. Retrieved September 8, 2006.
  63. "Making Control Simple". Retrieved December 8, 2010.
  64. "Controlling How You Share". Retrieved December 8, 2010.
  65. "John Lynch & Jenny Ellickson, U.S. Dept. of Justice, Computer Crime and Intellectual Property Section, Obtaining and Using Evidence from Social Networking Sites: Facebook, MySpace, LinkedIn, and more" (PDF). Retrieved June 11, 2013.
  66. Semitsu, Junichi P. (2011). "From Facebook to Mug Shot: How the Dearth of Social Networking Privacy Rights Revolutionized Online Government Surveillance". Pace Law Review. 31 (1).
  67. "Rapport over verzoeken tot gegevensverstrekking van internationale overheden". Facebook. Retrieved September 4, 2013.
  68. "ap.google.com, Canada launches privacy probe into Facebook". Archived from the original on June 3, 2008.
  69. "Privacy Commissioner's Findings in the case of CIPPIC against Facebook" (PDF). Retrieved January 15, 2010.
  70. Jones, Harvey & Soltren, José Hiram (2005). "Facebook: Threats to Privacy" (PDF). Cambridge, Massachusetts: MIT (MIT 6.805/STS085: Ethics and Law on the Electronic Frontier - Fall 2005).
  71. "Facebook Security Response". TheIndyChannel.com. Archived from the original on April 17, 2012. Retrieved December 8, 2010.
  72. Peterson, Chris (February 13, 2006). "Who's Reading Your Facebook?". The Virginia Informer.
  73. "Facebook Privacy Policy". Retrieved December 8, 2010.
  74. Buckley, Christine (August 30, 2007). "Get a life and allow your staff to use Facebook, TUC tells bosses". The Times. London. Retrieved March 5, 2008.
  75. "Facebook Opens Profiles to Public". BBC. September 7, 2007.
  76. "Facebook security". BBC. October 24, 2007. Archived from the original on February 20, 2008. Retrieved March 5, 2008.
  77. "Controlling How You Share". Retrieved December 8, 2010.
  78. Aspan, Maria (February 11, 2008). "How Sticky Is Membership on Facebook? Just Try Breaking Free". The New York Times. Retrieved September 23, 2014.
  79. "Information we receive about you". Retrieved June 11, 2013 – via Facebook.
  80. Lunden, Ingrid (October 13, 2013). "Facebook Buys Mobile Data Analytics Company Onavo, Reportedly For Up To $200M… And (Finally?) Gets Its Office In Israel". TechCrunch.
  81. Morris, Betsy; Seetharaman, Deepa (August 9, 2017). "The New Copycats: How Facebook Squashes Competition From Startups". The Wall Street Journal. ISSN 0099-9660. Retrieved August 15, 2017.
  82. "The New Copycats: How Facebook Squashes -2-". Fox Business. August 9, 2017. Retrieved August 15, 2017.
  83. "Facebook knew about Snap's struggles months before the public". Engadget. Retrieved August 15, 2017.
  84. Perez, Sarah. "Facebook is pushing its data-tracking Onavo VPN within its main mobile app". TechCrunch. Retrieved February 14, 2018.
  85. "Facebook's Protect security feature is essentially Spyware". IT PRO. Retrieved February 14, 2018.
  86. "Apple removed Facebook's Onavo from the App Store for gathering app data". TechCrunch. Retrieved August 23, 2018.
  87. "Facebook will pull its data-collecting VPN app from the App Store over privacy concerns". The Verge. Retrieved August 23, 2018.
  88. Grothaus, Michael (August 23, 2018). "Apple makes Facebook pull its spyware(ish) VPN from the App Store". Fast Company. Retrieved September 3, 2018.
  89. Newton, Casey (January 30, 2019). "Facebook will shut down its controversial market research app for iOS". The Verge. Retrieved January 30, 2019.
  90. Constine, John (January 29, 2019). "Facebook pays teens to install VPN that spies on them". TechCrunch. Retrieved January 30, 2019.
  91. Wagner, Kurt (January 30, 2019). "Apple says it's banning Facebook's research app that collects users' personal information". Recode. Retrieved January 30, 2019.
  92. Warren, Tom (January 30, 2019). "Apple blocks Facebook from running its internal iOS apps". The Verge. Retrieved January 30, 2019.
  93. Isaac, Mike (January 31, 2019). "Apple Shows Facebook Who Has the Power in an App Dispute". The New York Times. ISSN 0362-4331. Retrieved February 2, 2019.
  94. Constine, Josh (January 30, 2019). "Senator Warner calls on Zuckerberg to support market research consent rules". TechCrunch. Retrieved January 31, 2019.
  95. Lapowsky, Issie (January 30, 2019). "By Defying Apple's Rules, Facebook Shows It Never Learns". Wired. ISSN 1059-1028. Retrieved January 31, 2019.
  96. "Net generation grieves with Facebook postings". News Observer. Archived from the original on August 20, 2007. Retrieved March 5, 2008.
  97. Batista, Sarah (November 21, 2005). "UVA Student Remembered". Charlottesville Newsplex. Archived from the original on January 19, 2008. Retrieved April 10, 2006.
  98. Bernhard, Stephanie (January 25, 2006). "Community mourns death of Pagan '06". Brown Daily Herald. Retrieved April 10, 2006.
  99. Kelleher, Kristina (February 22, 2007). "Facebook profiles become makeshift memorials". The Brown Daily Herald. Archived from the original on March 21, 2008. Retrieved March 5, 2008.
  100. Hortobagyi, Monica (May 8, 2007). "USA Today article". USA Today. Retrieved April 30, 2010.
  101. Drudi, Cassandra (January 5, 2008). "Facebook proves problematic for police". The Globe and Mail. Toronto. Retrieved March 5, 2008.
  102. "Angry Facebook Users Illegally Leaked the Names of Accused Underage Murderers". Digital Journal. January 5, 2008. Retrieved March 5, 2008.
  103. "Defacing Facebook". July 27, 2007. Retrieved August 17, 2007.
  104. Günel, B.; Şahin, S.; Kogias, D.; Patrikakis, C. (2019). "Privacy issues in post dissemination on Facebook". Turkish Journal of Electrical Engineering and Computer Science. 27 (5): 3417–3432. Retrieved from https://dergipark.org.tr/en/pub/tbtkelektrik/issue/50810/662311
  105. Vishwanath, A.; Xu, W.; Ngoh, Z. (2018). "How people protect their privacy on Facebook: A cost‐benefit view". Journal of the Association for Information Science and Technology. 69: 700–709. doi:10.1002/asi.23894
  106. Paul, Ian (May 31, 2010). "It's Quit Facebook Day, Are You Leaving? - PCWorld". PC World. Retrieved May 31, 2010.
  107. Woollacott, Emma (May 31, 2010). "Quit Facebook Day set to be a flop". TG Daily. Retrieved May 31, 2010.
  108. Jemima Kiss (June 1, 2010). "Facebook: Did anyone really quit?". The Guardian. London.
  109. Stieger, Stefan; Burger, Christoph; Bohn, Manuel; Voracek, Martin (2013). "Who Commits Virtual Identity Suicide? Differences in Privacy Concerns, Internet Addiction, and Personality Between Facebook Users and Quitters". Cyberpsychology, Behavior, and Social Networking. 16 (9): 629–634. doi:10.1089/cyber.2012.0323. PMID 23374170.(subscription required)
  110. "Facebook's facial recognition software is now as accurate as the human brain, but what now? | ExtremeTech". Extremetech.com. Retrieved June 13, 2014.
  111. Facebook Taking Hits Over Facial Recognition Feature. Washington: Atlantic Media, Inc., 2011. ProQuest. Web. December 6, 2016.
  112. "Facebook facial recognition raises eyebrows in Germany, EU". Deutsche Welle. Retrieved June 13, 2011.
  113. Milian, Mark. "Facebook lets users opt out of facial recognition". CNN International. Retrieved June 13, 2011.
  114. Gannes, Liz. "Facebook facial recognition prompts EU privacy probe". Cnet News. Retrieved June 13, 2011.
  115. "Facebook's Facial Recognition Software Is Different From The FBI's. Here's Why". NPR. Retrieved December 16, 2018.
  116. "Facebook's Mark Zuckerberg: 'We should not be afraid of AI'". Express Computer (2016). ProQuest. Web. December 6, 2016.
  117. "Was Facebook über User weiß". Orf.at. November 27, 2011. Retrieved June 11, 2013.
  118. "Sound file" (MP3). Europe-v-facebook.org. Retrieved December 16, 2018.
  119. "An Coimisineir Cosanta Sonrai (Data Protection Commissioner) letter" (PDF). August 24, 2011. Retrieved June 13, 2014.
  120. Drucker, Jesse (October 21, 2010). "Google 2.4% Rate Shows How $60 Billion Lost to Tax Loopholes". Bloomberg L.P. Retrieved May 21, 2013.
  121. "Facebook's Data Pool". Europe-v-facebook.org.
  122. "Removed content" (PDF). August 22, 2011. Retrieved June 13, 2014.
  123. "Facebook Data Categories" (PDF). April 3, 2012. Retrieved June 13, 2014.
  124. "Legal Procedure against 'Facebook Ireland Limited'". Europe-v-facebook.org.
  125. "Facebook won't 'like' its 17th complaint". Irish Examiner. August 27, 2011. Retrieved June 11, 2013.
  126. "Our-Policy.org - Annuity Payments Policies and Regulations". Our-policy.org. Archived from the original on May 25, 2018. Retrieved March 24, 2018.
  127. "Europe versus Facebook". Europe-v-facebook.org. Retrieved June 11, 2013.
  128. Achohido, Byron (November 15, 2011). "Facebook tracking is under scrutiny". USA Today. Archived from the original on November 16, 2011. Retrieved June 18, 2017.
  129. "Belgian court orders Facebook to stop tracking non-members". The Guardian. November 10, 2015. Retrieved June 18, 2017.
  130. Baraniuk, Chris (December 2, 2015). "Facebook bows to Belgian privacy ruling over cookies". BBC News. BBC. Retrieved June 18, 2017.
  131. Statt, Nick (December 2, 2015). "After privacy ruling, Facebook now requires Belgium users to log in to view pages". The Verge. Retrieved June 18, 2017.
  132. Anson, Alexander (November 12, 2012). "Facebook Stalking Statistics 2012". ansonalex.com. Anson, Alexander. Retrieved October 26, 2014.
  133. "Stalking Statistics". Violence Prevention and Action Center. John Carroll University. Retrieved October 26, 2014.
  134. Westlake, E. J. (2008), "Friend Me if You Facebook: Generation Y and Performative Surveillance", The Drama Review, 52 (4): 21–40, doi:10.1162/dram.2008.52.4.21, S2CID 57572210
  135. Steel, Emily; Fowler, Geoffrey A. (October 18, 2010). "Facebook in Privacy Breach". The Wall Street Journal. Retrieved June 4, 2017.
  136. Takahashi, Dean (October 17, 2010). "WSJ reports Facebook apps — including banned LOLapps games — transmitted private user data". VentureBeat. Retrieved June 4, 2017.
  137. "Suspending Cambridge Analytica and SCL Group from Facebook | Facebook Newsroom". Retrieved March 20, 2018.
  138. "How Facebook Made Its Cambridge Analytica Data Crisis Even Worse". Bloomberg L.P. March 20, 2018. Retrieved March 20, 2018.
  139. "Academic behind Facebook breach says political influence was..." Reuters. March 21, 2018. Retrieved March 21, 2018.
  140. Solon, Olivia (April 4, 2018). "Facebook says Cambridge Analytica may have gained 37m more users' data". The Guardian. Retrieved April 6, 2018.
  141. Wong, Julia Carrie (March 23, 2018). "Elon Musk joins #DeleteFacebook effort as Tesla and SpaceX pages vanish". The Guardian. Retrieved March 24, 2018.
  142. Green, Dr. Jemma. "#DeleteFacebook Highlights The Benefits Of Blockchain". Forbes. Retrieved March 24, 2018.
  143. Grind, Kirsten (March 22, 2018). "Next Worry for Facebook: Disenchanted Users". The Wall Street Journal. Retrieved March 25, 2018.
  144. Tobias, Manuela (March 22, 2018). "Comparing Facebook data use by Obama, Cambridge Analytica". PolitiFact. Retrieved May 2, 2018.
  145. Schouten, Fredreka (March 20, 2018). "Obama 2012 team: We didn't break Facebook rules in our campaign". USA Today. Retrieved May 2, 2018.
  146. Rogers, James (March 20, 2018). "Obama 2012 campaign 'sucked' data from Facebook, former official says". Fox News Channel. Retrieved May 2, 2018.
  147. Sullivan, Mark (March 20, 2018). "Obama Campaign's "Targeted Share" App Also Used Facebook Data From Millions Of Unknowing Users". Fast Company. Retrieved May 2, 2018.
  148. Rutenberg, Jim (June 20, 2013). "The Obama Campaign's Digital Masterminds Cash In". The New York Times. Retrieved May 2, 2018.
  149. Friedman, Matt (March 21, 2013). "Bill to ban companies from asking about job candidates' Facebook accounts is headed to governor". The Star-Ledger. Advance Digital. Retrieved June 8, 2017.
  150. N. Landers, Richard (2016). Social Media in Employee Selection and Recruitment: theory, practice, and current challenges. Switzerland: Springer international publishing. pp. 19–20. ISBN 9783319299891.
  151. "Fourth Amendment Activities". uscourts.gov. Retrieved March 24, 2018.
  152. Awl, Dave (2011). Facebook Me!: A Guide to Socializing, Sharing, and Promoting on Facebook (2nd ed.). Berkeley, CA: Peachpit Press. ISBN 9780321743732. OCLC 699044722.
  153. "Why Parents Help Their Children Lie to Facebook About Age: Unintended Consequences of the Children's Online Privacy Protection Act". Journalist's Resource.org.
  154. Schweitzer, Sarah (October 6, 2005). "Fisher College expels student over website entries". Boston Globe.
  155. O'Toole, Catie (January 24, 2010). "Seventh-grade North Syracuse student suspended, 25 others disciplined for Facebook page about teacher". The Post-Standard. Retrieved January 25, 2010.
  156. Peluchette, Joy; Karl, Katherine (2010). "Examining Students' Intended Image on Facebook: 'What were they thinking?!'". Journal of Education for Business. 85 (1): 30–7. doi:10.1080/08832320903217606. S2CID 44233400.
  157. Bugeja, Michael (January 3, 2006). "Facing the Facebook". The Chronicle of Higher Education. Archived from the original on February 20, 2008. Retrieved October 6, 2006.
  158. Bugeja, Michael J. (January 26, 2007). "Distractions in the Wireless Classroom". Chronicle Careers. The Chronicle of Higher Education. Retrieved June 26, 2007.
  159. National Association of Campus Activities (July 12, 2006). "Facing the Facebook". Archived from the original on June 27, 2006. Retrieved October 6, 2006.
  160. Association for Education in Journalism and Communication (2006). "Facing the Facebook: Administrative Issues Involving Social Networks". Archived from the original on October 8, 2007. Retrieved October 6, 2006.
  161. EDUCAUSE Learning Institute (2006). "7 Things You Should Know About Facebook". Archived from the original on September 16, 2006. Retrieved October 6, 2006.
  162. Junco, R (2012). "Too much face and not enough books: The relationship between multiple indices of Facebook use and academic performance" (PDF). Computers in Human Behavior. 28 (1): 187–198. doi:10.1016/j.chb.2011.08.026.
  163. Junco, R (2012). "The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement" (PDF). Computers & Education. 58 (1): 162–171. doi:10.1016/j.compedu.2011.08.004.
  164. Heiberger, Greg and Harper, Ruth (2008). Have you Facebooked Astin lately? In Reynol Junco and Dianne M. Timm (Eds). Using Emerging Technologies to Enhance Student Engagement. San Francisco: Jossey-Bass.
  165. Cotten, Shelia R. (2008). Students' technology use and the impacts on well-being. In Reynol Junco and Dianne M. Timm (Eds). Using Emerging Technologies to Enhance Student Engagement. San Francisco: Jossey-Bass.
  166. Kirschner, P. A.; Karpinski, A. C. (2010). "Facebook and academic performance". Computers in Human Behavior. 26 (6): 1237–1245. doi:10.1016/j.chb.2010.03.024. hdl:10818/20216. Archived from the original on December 27, 2011. Retrieved October 31, 2017.
  167. Kolek, E. A. & Saunders, D. (2008). "Online disclosure: An empirical examination of undergraduate Facebook profiles". NASPA Journal. 45 (1): 1–25.
  168. Hargittai, Eszter; More, Eian; Pasek, Josh (April 26, 2009). "Facebook and academic performance: Reconciling a media sensation with data". First Monday. 14 (5). doi:10.5210/fm.v14i5.2498. Retrieved January 30, 2019.
  169. Hern, Alex (December 14, 2018). "Facebook admits bug allowed apps to see hidden photos". The Guardian. Retrieved December 15, 2018.
  170. Dance, Gabriel J. X.; LaForgia, Michael; Confessore, Nicholas (December 18, 2018). "As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants". The New York Times.
  171. Hern, Alex (December 19, 2018). "Facebook users cannot avoid location-based ads, investigation finds". The Guardian.
  172. "Say No To The Dress". BuzzFeed News. Retrieved January 22, 2019.
  173. "Facebook reportedly received users' sensitive health data from apps: "It's incredibly dishonest"". CBS News. Retrieved February 23, 2019.
  174. Doward, Jamie; Soni, Raj (February 23, 2019). "Facebook attacked over app that reveals period dates of its users". The Guardian. Retrieved February 23, 2019.
  175. Schechner, Sam; Secada, Mark (February 22, 2019). "You Give Apps Sensitive Personal Information. Then They Tell Facebook". The Wall Street Journal. Retrieved February 23, 2019.
  176. Statt, Nick (February 22, 2019). "App makers are sharing sensitive personal information with Facebook but not telling users". The Verge. Retrieved February 23, 2019.
  177. Reuters (February 23, 2019). "'Outrageous abuse of privacy': New York orders inquiry into Facebook data use". The Guardian. Retrieved February 23, 2019.
  178. "Revealed: Facebook's global lobbying against data privacy laws". The Guardian. March 2, 2019. Archived from the original on March 2, 2019. Retrieved March 3, 2019.
  179. Laura Kayali (January 29, 2019). "Inside Facebook's fight against European regulation". Politico Europe. Retrieved May 3, 2019.
  180. "Facebook Stored Millions of Passwords in Plaintext—Change Yours Now". Wired. March 21, 2019. Retrieved March 23, 2019.
  181. Hern, Alex (March 21, 2019). "Facebook stored hundreds of millions of passwords unprotected". The Guardian. Retrieved March 22, 2019.
  182. "Facebook now says its password leak affected 'millions' of Instagram users". TechCrunch. April 18, 2019. Retrieved April 18, 2019.
  183. "Hungary competition authority fines Facebook $4 million". The Seattle Times. December 6, 2019. Retrieved December 14, 2019.
  184. "Is Facebook listening to me? Why those ads appear after you talk about things". USA Today. June 28, 2019. Retrieved June 28, 2019.
  185. "Facebook isn't secretly listening to your conversations, but the truth is much more disturbing". NEWS ATLAS. September 6, 2019. Retrieved September 6, 2019.
  186. https://www.oculus.com/legal/privacy-policy/
  187. https://kotaku.com/facebook-buys-oculus-rift-for-2-billion-1551487939
  188. https://www.slashgear.com/oculus-quest-2-facebook-account-demand-sparks-an-antitrust-investigation-10650568/amp/?fbclid=IwAR0lvOrN1o527XCkYrH8XX00djUlI-1bNJ9PCtRe_dNJyg9Soy-cbk6axoo
  189. https://www.oculus.com/blog/a-single-way-to-log-into-oculus-and-unlock-social-features/
  190. https://www.roadtovr.com/fake-facebook-account-oculus-headset-community-standards/
  191. https://www.tubefilter.com/2020/12/11/insights-facebook-antitrust-lawsuit/
  192. Hough, Andrew (April 8, 2011). "Student 'addiction' to technology 'similar to drug cravings', study finds". London.
  193. "Facebook and Twitter 'more addictive than tobacco and alcohol'". London. February 1, 2012.
  194. Edwards, Ashton (August 1, 2014). "Facebook goes down for 30 minutes, 911 calls pour in". Fox13. Retrieved August 2, 2016.
  195. Lenhart, Amanda (April 9, 2015). "Teens, Social Media & Technology Overview 2015". Pew Research Center. Retrieved July 8, 2020.
  196. Turel, Ofir; Bechara, Antoine (2016). "Social Networking Site Use While Driving: ADHD and the Mediating Roles of Stress, Self-Esteem and Craving". Frontiers in Psychology. 7: 455. doi:10.3389/fpsyg.2016.00455. PMC 4812103. PMID 27065923.
  197. Settanni, Michele; Marengo, Davide; Fabris, Matteo Angelo; Longobardi, Claudio (2018). "The interplay between ADHD symptoms and time perspective in addictive social media use: A study of adolescent Facebook users". Children and Youth Services Review. 89. Elsevier: 165–170. doi:10.1016/j.childyouth.2018.04.031.
  198. Savage, Michael (January 26, 2019). "Health secretary tells social media firms to protect children after girl's death". The Guardian. Retrieved January 30, 2019.
  199. Adams, Richard (January 30, 2019). "Social media urged to take 'moment to reflect' after girl's death". The Guardian. Retrieved January 30, 2019.
  200. "Potential for Facebook addiction and consequences". July 15, 2012.
  201. "The Anti-Social Network". Slate. January 26, 2011.
  202. "How Facebook Breeds Jealousy". Discovery.com. February 10, 2010.
  203. "Study: Facebook makes lovers jealous". CNET. August 11, 2009.
  204. "Jealous much? MySpace, Facebook can spark it". NBC News. July 31, 2007.
  205. "Facebook Causes Jealousy, Hampers Romance, Study Finds". University of Guelph. February 13, 2007.
  206. "Facebook jealousy sparks asthma attacks in dumped boy". USA Today. November 19, 2010.
  207. Hanna Krasnova; Helena Wenninger; Thomas Widjaja; Peter Buxmann (January 23, 2013). "Envy on Facebook: A Hidden Threat to Users' Life Satisfaction?" (PDF). 11th International Conference on Wirtschaftsinformatik, February 27 – March 1, 2013, Leipzig, Germany. Archived from the original (PDF) on June 1, 2014. Retrieved June 13, 2014.
  208. "Facebook use 'makes people feel worse about themselves'". BBC News. August 15, 2013. Retrieved September 4, 2013.
  209. Myung Suh Lim; Junghyun Kim (June 4, 2018). "Facebook users' loneliness based on different types of interpersonal relationships: Links to grandiosity and envy". Information Technology & People. doi:10.1108/ITP-04-2016-0095. ISSN 0959-3845.
  210. "Divorce cases get the Facebook factor". MEN Media. January 19, 2011. Retrieved March 13, 2012.
  211. "Facebook's Other Top Trend of 2009: Divorce". Network World. December 22, 2009. Archived January 12, 2012, at the Wayback Machine. Retrieved March 13, 2012.
  212. "Facebook to Blame for Divorce Boom". Fox News Channel. April 12, 2010. Archived from the original on April 15, 2010. Retrieved January 3, 2012.
  213. "Facebook is divorce lawyers' new best friend". MSNBC. June 28, 2010. Retrieved March 13, 2012.
  214. "Facebook flirting triggers divorces". The Times of India. January 1, 2012.
  215. Clayton, Russell B.; Nagurney, Alexander; Smith, Jessica R. (June 7, 2013). "Cheating, Breakup, and Divorce: Is Facebook Use to Blame?". Cyberpsychology, Behavior, and Social Networking. 16 (10): 717–720. doi:10.1089/cyber.2012.0424. ISSN 2152-2715. PMID 23745615.
  216. Utz, Sonja; Beukeboom, Camiel J. (July 1, 2011). "The Role of Social Network Sites in Romantic Relationships: Effects on Jealousy and Relationship Happiness". Journal of Computer-Mediated Communication. 16 (4): 511–527. doi:10.1111/j.1083-6101.2011.01552.x. ISSN 1083-6101.
  217. Tokunaga, Robert S. (2011). "Social networking site or social surveillance site? Understanding the use of interpersonal electronic surveillance in romantic relationships". Computers in Human Behavior. 27 (2): 705–713. doi:10.1016/j.chb.2010.08.014.
  218. Muise, Amy; Christofides, Emily; Desmarais, Serge (April 15, 2009). "More Information than You Ever Wanted: Does Facebook Bring Out the Green-Eyed Monster of Jealousy?". CyberPsychology & Behavior. 12 (4): 441–444. doi:10.1089/cpb.2008.0263. ISSN 1094-9313. PMID 19366318.
  219. Kerkhof, Peter; Finkenauer, Catrin; Muusses, Linda D. (April 1, 2011). "Relational Consequences of Compulsive Internet Use: A Longitudinal Study Among Newlyweds" (PDF). Human Communication Research. 37 (2): 147–173. doi:10.1111/j.1468-2958.2010.01397.x. hdl:1871/35795. ISSN 1468-2958.
  220. Papp, Lauren M.; Danielewicz, Jennifer; Cayemberg, Crystal (October 11, 2011). ""Are We Facebook Official?" Implications of Dating Partners' Facebook Use and Profiles for Intimate Relationship Satisfaction". Cyberpsychology, Behavior, and Social Networking. 15 (2): 85–90. doi:10.1089/cyber.2011.0291. ISSN 2152-2715. PMID 21988733.
  221. "Does Facebook Stress You Out?". Webpronews.com. February 17, 2010. Archived from the original on February 18, 2011.
  222. Maier, C.; Laumer, S.; Eckhardt, A.; Weitzel, T. (2012). "Online Social Networks as a Source and Symbol of Stress: An Empirical Analysis". Proceedings of the 33rd International Conference on Information Systems (ICIS), Orlando, FL.
  223. Maier, C.; Laumer, S.; Eckhardt, A.; Weitzel, T. (2014). "Giving too much Social Support: Social Overload on Social Networking Sites". European Journal of Information Systems. 24 (5): 447–464. doi:10.1057/ejis.2014.3. S2CID 205122288.
  224. McCain, Jessica L.; Campbell, W. Keith (2018). "Narcissism and Social Media Use: A Meta-Analytic Review". Psychology of Popular Media Culture. 7 (3). American Psychological Association: 308–327. doi:10.1037/ppm0000137. S2CID 152057114. Retrieved June 9, 2020.
  225. Gnambs, Timo; Appel, Markus (2018). "Narcissism and Social Networking Behavior: A Meta-Analysis". Journal of Personality. 86 (2). Wiley-Blackwell: 200–212. doi:10.1111/jopy.12305. PMID 28170106.
  226. Brailovskaia, Julia; Bierhoff, Hans-Werner (2020). "The Narcissistic Millennial Generation: A Study of Personality Traits and Online Behavior on Facebook". Journal of Adult Development. 27 (1). Springer Science+Business Media: 23–35. doi:10.1007/s10804-018-9321-1. S2CID 149564334.
  227. Casale, Silvia; Banchi, Vanessa (2020). "Narcissism and problematic social media use: A systematic literature review". Addictive Behaviors Reports. 11. Elsevier: 100252. doi:10.1016/j.abrep.2020.100252. PMC 7244927. PMID 32467841.
  228. Lukianoff, Greg; Haidt, Jonathan (2018). The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure. New York: Penguin Press. p. 147. ISBN 978-0735224896.
  229. "Facebook MAU worldwide 2020". Statista. Retrieved January 6, 2021.
  230. Harari, Yuval Noah (2017), "Danksagung", Homo Deus, Verlag C.H.BECK oHG, pp. 539–540, ISBN 978-3-406-70402-4, retrieved January 6, 2021
  231. Reviglio, Urbano (2017), "Serendipity by Design? How to Turn from Diversity Exposure to Diversity Experience to Face Filter Bubbles in Social Media", Internet Science, Cham: Springer International Publishing, pp. 281–300, ISBN 978-3-319-70283-4, retrieved January 6, 2021
  232. Eslami, Motahhare; Rickman, Aimee; Vaccaro, Kristen; Aleyasen, Amirhossein; Vuong, Andy; Karahalios, Karrie; Hamilton, Kevin; Sandvig, Christian (April 18, 2015). ""I always assumed that I wasn't really that close to [her]": Reasoning about Invisible Algorithms in News Feeds". Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. Seoul, Republic of Korea: ACM: 153–162. doi:10.1145/2702123.2702556. ISBN 978-1-4503-3145-6.
  233. Adee, Sally (November 2016). "Burst the filter bubble". New Scientist. 232 (3101): 24–25. doi:10.1016/S0262-4079(16)32182-0.
  234. Lee, Sangwon; Xenos, Michael (2019). "Social distraction? Social media use and political knowledge in two U.S. Presidential elections". Computers in Human Behavior. 90: 18–25. doi:10.1016/j.chb.2018.08.006.
  235. Tufekci, Zeynep (2015). "Facebook said its algorithms do help form echo chambers, and the tech press missed it". New Perspectives Quarterly. 32: 9–12 – via Wiley Online Library.
  236. Bakshy, Eytan; Messing, Solomon; Adamic, Lada A. (2015). "Exposure to ideologically diverse news and opinion on Facebook". Science. 348: 1130–1132 – via American Association for the Advancement of Science.
  237. Lukianoff, Greg; Haidt, Jonathan (2018). The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure. New York: Penguin Press. pp. 126–132. ISBN 978-0735224896.
  238. File, Thom (May 2013). Computer and Internet Use in the United States (PDF) (Report). Current Population Survey Reports. Washington, D.C.: U.S. Census Bureau. Retrieved February 11, 2020.
  239. Haidt, Jonathan; Rose-Stockwell, Tobias (2019). "The Dark Psychology of Social Networks". The Atlantic. Vol. 324, no. 6. Emerson Collective. pp. 57–60. Retrieved June 11, 2020.
  240. Gregory, Andy (November 7, 2019). "More than a third of millennials approve of communism, YouGov poll indicates". The Independent. Independent Digital News & Media Ltd. Retrieved June 11, 2020.
  241. Saad, Lydia (November 25, 2019). "Socialism as Popular as Capitalism Among Young Adults in U.S." Gallup. Retrieved June 11, 2020.
  242. Bromley, Alanna (2011). "Are social networking sites breeding antisocial young people?" (PDF). Journal of Digital Research and Publishing.
  243. "Students Take On Cyberbullying" – via YouTube.
  244. Baron, Naomi S. (2007). "My Best Day: Presentation of Self and Social Manipulation in Facebook and IM" (PDF). Archived from the original (PDF) on May 23, 2013.
  245. "A new addiction for teacher candidates: social networks" (PDF). The Turkish Online Journal of Educational Technology. 11 (3). 2012.
  246. Turkle, Sherry (2011): Alone Together. Why We Expect More from Technology and Less from Each Other. New York: Basic Books.
  247. Robert M. Bond; Christopher J. Fariss; Jason J. Jones; Adam D. I. Kramer; Cameron Marlow; Jaime E. Settle; James H. Fowler (2012). "A 61-million-person experiment in social influence and political mobilization". Nature. 489 (7415): 295–298. doi:10.1038/nature11421. PMC 3834737. PMID 22972300.
  248. Robert Booth (2014). "Facebook reveals news feed experiment to control emotions". The Guardian. Retrieved June 30, 2014.
  249. Adam D. I. Kramer, Jamie E. Guillory. Jeffrey T. Hancock (2014). "Experimental evidence of massive-scale emotional contagion through social networks". Proceedings of the National Academy of Sciences of the United States of America. 111 (24): 8788–8790. doi:10.1073/pnas.1320040111. PMC 4066473. PMID 24889601.
  250. "Facebook update". Retrieved July 14, 2019.(subscription required)
  251. David Goldman (July 2, 2014). "Facebook still won't say 'sorry' for mind games experiment". CNNMoney. Retrieved July 3, 2014.
  252. Guynn, Jessica (July 3, 2014). "Privacy watchdog files complaint over Facebook study". USA Today. Retrieved July 5, 2014.
  253. Grohol, John. "Emotional Contagion on Facebook? More Like Bad Research Methods". Psych Central. PsychCentral. Retrieved July 12, 2014.
  254. Rudder, Christian (July 28, 2014). "We experiment on human beings". okcupid.com. Archived from the original on January 23, 2015. Retrieved July 14, 2019.
  255. Grimmelmann, James (September 23, 2014). "Illegal, immoral, and mood-altering: How Facebook and OkCupid broke the law when they experimented on users". Retrieved September 24, 2014.
  256. "Facebook's 'experiment' was socially irresponsible". The Guardian. July 1, 2014. Retrieved August 4, 2014.
  257. Neate, Rupert (December 23, 2012). "Facebook paid £2.9m tax on £840m profits made outside US, figures show". The Guardian. Retrieved October 25, 2016.
  258. "Paradise Papers reveal hidden wealth of global elite". The Express Tribune. November 6, 2017.
  259. van Noort, Wouter (November 11, 2017). "Belastingontwijking is simpel op te lossen" [Tax avoidance can easily be solved]. NRC Handelsblad (in Dutch). Retrieved July 14, 2019. The quote, as heading of the article, comes from the French economist Gabriel Zucman.
  260. "Facebook paid £4,327 corporation tax in 2014". BBC. October 12, 2015. Retrieved October 25, 2016.
  261. ^ Tang, Paul (September 2017). "EU Tax Revenue Loss from Google and Facebook" (PDF).
  262. 26 U.S.C. § 7602.
  263. Seth Fiegerman, "Facebook is being investigated by the IRS," July 7, 2016, CNN, at .
  264. United States of America v. Facebook, Inc. and Subsidiaries, case no. 16-cv-03777, U.S. District Court for the Northern District of California (San Francisco Div.).
  265. "Facebook paid just €30m tax in Ireland despite earning €12bn". Irish Indepdenent. November 29, 2017.
  266. "Facebook Ireland pays tax of just €30m on €12.6bn". Irish Examiner. November 29, 2017.
  267. David Ingram (April 18, 2018). "Exclusive: Facebook to put 1.5 billion users out of reach of new EU privacy law". Reuters.
  268. Peter Hamilton (November 28, 2018). "Facebook Ireland pays €38m tax on €18.7 billion of revenue channeled through Ireland in 2017". The Irish Times. The social media giant channelled €18.7 billion in revenue through its Irish subsidiary, an increase of 48 per cent from the €12.6 billion recorded in 2016. While gross profit amounted to €18.1 billion, administrative expenses of €17.8 billion meant profit before tax increased 44 per cent to €251 million.
  269. ^ Newton, Casey (February 25, 2019). "THE TRAUMA FLOOR: The secret lives of Facebook moderators in America". The Verge. Retrieved February 25, 2019.
  270. ^ O'Connell, Jennifer (March 30, 2019). "Facebook's dirty work in Ireland: 'I had to watch footage of a person being beaten to death'". The Irish Times. Retrieved June 21, 2019.
  271. ^ Newton, Casey (June 19, 2019). "Three Facebook moderators break their NDAs to expose a company in crisis". The Verge. Retrieved June 21, 2019.
  272. Wong, Queenie (June 19, 2019). "Murders and suicides: Here's who keeps them off your Facebook feed". CNET. Retrieved June 21, 2019.
  273. Eadicicco, Lisa (June 19, 2019). "A Facebook content moderator died after suffering heart attack on the job". San Antonio Express-News. Retrieved June 20, 2019.
  274. Maiberg, Emanuel; Koebler, Jason; Cox, Joseph (September 24, 2018). "A Former Content Moderator Is Suing Facebook Because the Job Reportedly Gave Her PTSD". Vice. Retrieved June 21, 2019.
  275. Gray, Chris; Hern, Alex (December 4, 2019). "Ex-Facebook worker claims disturbing content led to PTSD". The Guardian. Retrieved February 25, 2020.
  276. "Facebook sued by Tampa workers who say they suffered trauma from watching videos". Tampa Bay Times. Retrieved May 8, 2020.
  277. Leprince-Ringuet, Daphne. "Facebook's approach to content moderation slammed by EU commissioners". ZDNet. Retrieved February 19, 2020.
  278. Newton, Casey (May 12, 2020). "Facebook will pay $52 million in settlement with moderators who developed PTSD on the job". The Verge. Retrieved June 1, 2020.
  279. Allyn, Bobby (May 12, 2020). "In Settlement, Facebook To Pay $52 Million To Content Moderators With PTSD". NPR. Retrieved June 1, 2020.
  280. Paul, Kari (May 13, 2020). "Facebook to pay $52m for failing to protect moderators from 'horrors' of graphic content". The Guardian. Retrieved June 1, 2020.
  281. Streitfeld, David (March 21, 2018). "Welcome to Zucktown. Where Everything Is Just Zucky". The New York Times. Retrieved February 25, 2019.
  282. Pepitone, Julianne. "Facebook vs. Google fight turns nasty". CNNMoney. Retrieved February 23, 2019.
  283. Setalvad, Ariha (August 7, 2015). "Why Facebook's video theft problem can't last". The Verge. Retrieved May 29, 2017.
  284. Oremus, Will (July 8, 2015). "Facebook's Piracy Problem". Slate. The Slate Group. Retrieved May 29, 2017.
  285. Luckerson, Victor (August 28, 2015). "Facebook to Crack Down on Online Video Piracy". Time. Retrieved May 29, 2017.
  286. Constine, Josh (April 12, 2016). "Facebook launches video Rights Manager to combat freebooting". TechCrunch. AOL. Retrieved May 29, 2017.
  287. Kelion, Leo (May 1, 2013). "Facebook U-turn after charities criticise decapitation videos". BBC News. BBC. Retrieved June 3, 2017.
  288. ^ Winter, Michael (October 21, 2013). "Facebook again allows violent videos, with caveat". USA Today. Retrieved June 3, 2017.
  289. ^ "Facebook pulls beheading video". The Daily Telegraph. October 23, 2013. Retrieved June 3, 2017.
  290. Harrison, Virginia (October 23, 2013). "Outrage erupts over Facebook's decision on graphic videos". CNNMoney. CNN. Retrieved June 3, 2017.
  291. Gibbs, Samuel (January 13, 2015). "Facebook tackles graphic videos and photos with 'are you sure?' warnings". The Guardian. Retrieved June 3, 2017.
  292. Kelion, Leo (January 13, 2015). "Facebook restricts violent video clips and photos". BBC News. BBC. Retrieved June 3, 2017.
  293. "Libya 'war crimes' videos shared online". BBC News. Retrieved September 23, 2019.
  294. ^ Libyan conflict: Suspected war crimes shared online - BBC Newsnight, retrieved September 23, 2019
  295. Express, Libyan (May 1, 2019). "BBC: War crimes committed by Haftar's forces shared on Facebook, YouTube". Libyan Express. Retrieved October 31, 2020.
  296. https://www.icc-cpi.int/CaseInformationSheets/al-werfalliEng.pdf
  297. "Community Standards | Facebook". Retrieved September 23, 2019 – via Facebook.
  298. Mangalindan, JP (August 5, 2015). "Facebook launches live streaming, but only for famous people". Mashable. Retrieved June 3, 2017.
  299. Barrett, Brian (January 28, 2016). "Facebook Livestreaming Opens Up to Everyone With an iPhone". Wired. Retrieved June 3, 2017.
  300. Newton, Casey (January 28, 2016). "Facebook rolls out live video streaming to everyone in the United States". The Verge. Retrieved June 3, 2017.
  301. Newton, Casey (December 3, 2015). "Facebook begins testing live video streaming for all users". The Verge. Retrieved June 3, 2017.
  302. Chrisafis, Angelique; Willsher, Kim (June 14, 2016). "French police officer and partner murdered in 'odious terrorist attack'". The Guardian. Retrieved June 3, 2017.
  303. Madden, Justin (June 17, 2016). "Chicago man shot dead while live streaming on Facebook". Reuters. Retrieved June 3, 2017.
  304. Chaykowski, Kathleen (July 7, 2016). "Philando Castile's Death On Facebook Live Highlights Problems For Social Media Apps". Forbes. Retrieved June 3, 2017.
  305. McLaughlin, Eliott C.; Blau, Max; Vercammen, Paul (September 30, 2016). "Police: Man killed by officer pointed vaping device, not gun". CNN. Retrieved June 3, 2017.
  306. Berman, Mark; Hawkins, Derek (January 5, 2017). "Hate crime charges filed after 'reprehensible' video shows attack on mentally ill man in Chicago". The Washington Post. Nash Holdings. Retrieved June 3, 2017.
  307. Steele, Billy (March 22, 2017). "Dozens watched a Facebook Live stream of sexual assault (updated)". Engadget. AOL. Retrieved June 3, 2017.
  308. Gibbs, Samuel (April 25, 2017). "Facebook under pressure after man livestreams killing of his daughter". The Guardian. Retrieved June 3, 2017.
  309. Solon, Olivia (January 27, 2017). "Why a rising number of criminals are using Facebook Live to film their acts". The Guardian. Retrieved June 3, 2017.
  310. Solon, Olivia; Levin, Sam (January 6, 2017). "Facebook refuses to explain why live torture video wasn't removed sooner". The Guardian. Retrieved June 3, 2017.
  311. Krasodomski-Jones, Alex (January 9, 2017). "Facebook has created a monster it cannot tame". CNN. Retrieved June 3, 2017.
  312. Bhattacharya, Ananya (June 18, 2016). "Facebook Live is becoming a gruesome crime scene for murders". Quartz. Retrieved June 3, 2017.
  313. Gibbs, Samuel (May 3, 2017). "Facebook Live: Zuckerberg adds 3,000 moderators in wake of murders". The Guardian. Retrieved June 3, 2017.
  314. Murphy, Mike (May 3, 2017). "Facebook is hiring 3,000 more people to monitor Facebook Live for murders, suicides, and other horrific video". Quartz. Retrieved June 3, 2017.
  315. Ingram, David (May 3, 2017). "Facebook tries to fix violent video problem with 3,000 new workers". Reuters. Retrieved June 3, 2017.
  316. Peng, Tina (November 22, 2008). "Pro-anorexia groups spread to Facebook". Newsweek. Retrieved June 13, 2017.
  317. "Pro-anorexia site clampdown urged". BBC News. BBC. February 24, 2008. Retrieved June 13, 2017.
  318. Masciarelli, Alexis (January 9, 2009). "Anger at pro-Mafia groups on Facebook". France 24. Archived from the original on September 6, 2009. Retrieved June 13, 2017.
  319. Donadio, Rachel (January 20, 2009). "Italian authorities wary of Facebook tributes to Mafia". The New York Times International Edition. Archived from the original on January 24, 2009. Retrieved June 13, 2017.
  320. Pullella, Philip (January 12, 2009). "Pro-mafia Facebook pages cause alarm in Italy". Reuters. Retrieved June 13, 2017.
  321. Krangel, Eric (February 11, 2009). "Italy Considering National Ban On Facebook, YouTube In Plan To Return To Dark Ages". Business Insider. Axel Springer SE. Retrieved June 13, 2017.
  322. Kington, Tom (February 16, 2009). "Italian bill aims to block mafia Facebook shrines". The Guardian. Retrieved June 13, 2017.
  323. Nicole, Kristen (February 12, 2009). "Mafia Bosses Could Cause Italy's Blocking of Facebook". Adweek. Beringer Capital. Retrieved June 13, 2017.
  324. Oates, John (February 12, 2009). "Facebook hits back at Italian ban". The Register. Situation Publishing. Retrieved June 13, 2017.
  325. "Trolling: The Today Show Explores the Dark Side of the Internet", March 31, 2010. Retrieved April 4, 2010. Archived June 8, 2010, at the Wayback Machine
  326. s127 of the Communications Act 2003 of Great Britain. Retrieved July 13, 2011.
  327. Murder victim-mocking troll jailed, The Register, November 1, 2010. Retrieved July 13, 2011.
  328. Jade Goody website 'troll' from Manchester jailed, BBC, October 29, 2010. Retrieved July 13, 2011.
  329. Facebook troll Bradley Paul Hampson seeks bail, appeal against jail term, The Courier-Mail, April 20, 2011. Retrieved July 13, 2011.
  330. Facebook urged to ban teens from setting up tribute pages, The Australian, June 5, 2010. Retrieved July 13, 2011.
  331. Sherwell, Philip (October 16, 2011). "Cyber anarchists blamed for unleashing a series of Facebook 'rape pages'". The Daily Telegraph. London. Retrieved May 22, 2012.
  332. "Facebook 'rape page' whitelisted and campaign goes global". Womensviewsonnews.org. Meanwhile, campaigns in other countries have begun, most notably in Canada with the Rape is no joke (RINJ) campaign, which has not only campaigned fiercely but has also put together a YouTube video.
  333. "Facebook Refuses To Remove Rape Pages..." Albuquerque Express. October 23, 2011. Archived from the original on September 3, 2017. Retrieved May 22, 2012.
  334. "Facebook Refuses to Remove 'Rape Pages' Linked to Australian, British Youth". International Business Times. October 18, 2011. Archived from the original on July 17, 2012. Retrieved May 22, 2012. O'Brien said the campaign is now focusing on Facebook advertisers telling them not to let their advertisements be posted on the "rape pages."
  335. Sara C Nelson (May 28, 2013). "#FBrape: Will Facebook Heed Open Letter Protesting 'Endorsement Of Rape & Domestic Violence'?". The Huffington Post UK. Retrieved May 29, 2013.
  336. Rory Carroll (May 29, 2013). "Facebook gives way to campaign against hate speech on its pages". The Guardian UK. London. Retrieved May 29, 2013.
  337. "Facebook criticised by NSPCC over baby ducking video clip". BBC News. June 5, 2015.
  338. "Facebook failed to remove sexualised images of children". BBC News. Retrieved March 9, 2017.
  339. "Facebook, Twitter and Google grilled by MPs over hate speech". BBC News. Retrieved March 14, 2017.
  340. Layug, Margaret Claire (July 3, 2017). "'Pastor Hokage' FB groups trading lewd photos of women exposed". GMA News. Retrieved July 8, 2017.
  341. Layug, Margaret Claire (July 5, 2017). "Victim of 'Pastor' FB reports harassment, indecent proposals". GMA News. Retrieved July 8, 2017.
  342. De Jesus, Julliane Love (July 6, 2017). "Hontiveros wants stiff penalties vs 'Pastor Hokage' FB groups". Philippine Daily Inquirer. Retrieved July 8, 2017.
  343. "When it comes to incitement, is Facebook biased against Israel? - Arab-Israeli Conflict - Jerusalem Post". The Jerusalem Post. Retrieved December 16, 2018.
  344. "Facebook tightens ad policy after 'Jew hater' controversy — J". Jweekly.com. Jewish Telegraphic Agency. September 27, 2016. Retrieved September 29, 2017.
  345. Gagliardo-Silver, Victoria (March 29, 2019). "Instagram refuses to remove Alex Jones' anti-semitic post". The Independent. Retrieved March 30, 2019.
  346. "20,000 Israelis sue Facebook for ignoring Palestinian incitement". The Times of Israel. October 27, 2015. Retrieved July 15, 2016.
  347. "Israel: Facebook's Zuckerberg has blood of slain Israeli teen on his hands". The Times of Israel. July 2, 2016. Retrieved July 15, 2016.
  348. ^ Wittes, Benjamin; Bedell, Zoe (July 12, 2016). "Facebook, Hamas, and Why a New Material Support Suit May Have Legs". Lawfare.
  349. ^ Pileggi, Tamar (July 11, 2016). "US terror victims seek $1 billion from Facebook for Hamas posts". The Times of Israel. Retrieved July 15, 2016.
  350. Dolmetsch, Chris (July 31, 2019). "Facebook Isn't Responsible as Terrorist Platform, Court Says". Bloomberg. Retrieved August 7, 2019.
  351. "Facebook Defeats Appeal Claiming It Aided Hamas Attacks". Law360. July 31, 2019. Retrieved August 6, 2019.
  352. "Hezbollah created Palestinian terror cells on Facebook, Israel says after bust". Jewish Telegraphic Agency. August 16, 2016. Retrieved August 17, 2016.
  353. Zitun, Yoav (August 16, 2016). "Shin Bet catches Hezbollah recruitment cell in the West Bank". Ynet News. Retrieved August 17, 2016.
  354. Gross, Judah Ari (August 16, 2016). "Hezbollah terror cells, set up via Facebook in West Bank and Israel, busted by Shin Bet". The Times of Israel. Retrieved August 17, 2016.
  355. "Knesset approves Facebook bill in preliminary vote". July 20, 2016. Retrieved July 24, 2016.
  356. Lecher, Colin (June 15, 2017). "Facebook says it wants 'to be a hostile place for terrorists'". The Verge. Retrieved June 16, 2017.
  357. "Facebook using artificial intelligence to fight terrorism". CBS News. June 15, 2017. Retrieved June 16, 2017.
  358. Solon, Olivia (June 16, 2017). "Revealed: Facebook exposed identities of moderators to suspected terrorists". The Guardian. Retrieved June 18, 2017.
  359. Wong, Joon Ian (June 16, 2017). "The workers who police terrorist content on Facebook were exposed to terrorists by Facebook". Quartz. Retrieved June 18, 2017.
  360. "Facebook Deletes Iran-Linked Accounts Followed By 1 Million In U.S., Britain". RFE/RL. Retrieved December 15, 2018.
  361. Shahani, Aarti (November 17, 2016). "From Hate Speech To Fake News: The Content Crisis Facing Mark Zuckerberg". NPR.
  362. Burke, Samuel (November 19, 2016). "Zuckerberg: Facebook will develop tools to fight fake news". CNN Money. Retrieved November 22, 2016.
  363. Shahani, Aarti. Zuckerberg Denies Fake News on Facebook had Impact on the Election. Washington: NPR, 2016. ProQuest.
  364. Kravets, David. Facebook, Google Seek to Gut Fake News Sites’ Money Stream. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 5, 2016.
  365. Kravets, David. Facebook, Google Seek to Gut Fake News Sites’ Money Stream. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 6, 2016.
  366. Newitz, Annalee. Facebook Fires Human Editors, Algorithm Immediately Posts Fake News. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 6, 2016.
  367. ^ Safi, Michael (March 14, 2018). "Sri Lanka accuses Facebook over hate speech after deadly riots". The Guardian.
  368. Fisher, Amanda Taub and Max. "Where Countries Are Tinderboxes and Facebook Is a Match". Retrieved November 28, 2018.
  369. "U.N. investigators cite Facebook role in Myanmar crisis".
  370. "In Myanmar, Facebook struggles with a deluge of disinformation". The Economist. ISSN 0013-0613. Retrieved October 27, 2020.
  371. "Report of the independent international fact-finding mission on Myanmar" (PDF).
  372. Stecklow, Steve. "Why Facebook is losing the war on hate speech in Myanmar". Reuters. Retrieved December 15, 2018.
  373. "Facebook bans Myanmar military accounts for 'enabling human rights abuses'". Social.techcrunch.com. Retrieved December 15, 2018.
  374. "Some in Myanmar Fear Fallout From Facebook Removal of Military Pages". Radio Free Asia. Retrieved December 15, 2018.
  375. "Facebook Removes More Pages And Groups Linked to Myanmar Military". Radio Free Asia. Retrieved January 30, 2019.
  376. "'Person of eminence' tag on FB for convict Ajay Chautala". December 17, 2018.
  377. ^ Beckett, Lois (March 27, 2019). "Facebook to ban white nationalism and separatism content". The Guardian. Retrieved March 28, 2019.
  378. Dearden, Lizzie (March 24, 2019). "Neo-Nazi groups allowed to stay on Facebook because they 'do not violate community standards'". The Independent. Retrieved March 28, 2019.
  379. Copley, Caroline (March 4, 2016). "German court rules Facebook may block pseudonyms". Reuters. Retrieved June 3, 2017.
  380. ^ Ortutay, Barbara (May 25, 2009). "Real users caught in Facebook fake-name purge". San Francisco Chronicle. Hearst Communications. Retrieved June 3, 2017.
  381. Levy, Karyne (October 1, 2014). "Facebook Apologizes For 'Real Name' Policy That Forced Drag Queens To Change Their Profiles". Business Insider. Axel Springer SE. Retrieved March 23, 2017.
  382. Crook, Jordan (October 1, 2014). "Facebook Apologizes To LGBT Community And Promises Changes To Real Name Policy". TechCrunch. AOL. Retrieved June 3, 2017.
  383. Osofsky, Jason; Gage, Todd (December 15, 2015). "Community Support FYI: Improving the Names Process on Facebook". Facebook Newsroom. Retrieved December 16, 2015 – via Facebook.
  384. AFP (December 16, 2015). "Facebook modifies 'real names' policy, testing use of assumed names". CTV News. Retrieved December 16, 2015.
  385. Holpuch, Amanda (December 15, 2015). "Facebook adjusts controversial 'real name' policy in wake of criticism". The Guardian. Retrieved March 23, 2017.
  386. Halliday, Josh (July 6, 2013). "Facebook apologises for deleting free speech group's post on Syrian torture". The Guardian. London. Retrieved June 4, 2013.
  387. "Jealous Wives Are Getting Courtney Stodden Banned on Facebook - Softpedia". News.softpedia.com. October 14, 2011. Retrieved July 31, 2012.
  388. "When good lulz go bad: unpicking the ugly business of online harassment". Wired. January 27, 2014. Retrieved August 23, 2017.
  389. "Niet compatibele browser". Archived from the original on June 13, 2010. Retrieved August 7, 2010 – via Facebook.
  390. "Caroline McCarthy, "Facebook outage draws more security questions", CNET News.com, ZDNet Asia, August 2, 2007". Zdnetasia.com. August 2, 2007. Archived from the original on May 31, 2008. Retrieved March 23, 2010.
  391. "David Hamilton, "Facebook Outage Hits Some Countries", Web Host Industry Review, Jun. 26, 2008". Thewhir.com. Archived from the original on April 2, 2010. Retrieved March 23, 2010.
  392. "K.C. Jones, "Facebook, MySpace More Reliable Than Peers", Information Week, February 19, 2009". InformationWeek. Retrieved March 23, 2010.
  393. "Facebook Outage and Facebook Down September 18 2009". Archived from the original on August 9, 2010. Retrieved August 30, 2010.
  394. McCarthy, Caroline (October 8, 2009). "Facebook's mounting customer service crisis | The Social - CNET News". CNET. Retrieved December 13, 2009.
  395. McCarthy, Caroline (October 10, 2009). "Downed Facebook accounts still haven't returned | The Social - CNET News". CNET. Retrieved December 13, 2009.
  396. "Facebook Outage Silences 150,000 Users". PC World. October 13, 2009. Retrieved December 13, 2009.
  397. Gaudin, Sharon (October 13, 2009). "Facebook deals with missing accounts, 150,000 angry users". Computerworld. Retrieved December 13, 2009.
  398. Reisinger, Don (May 18, 2012). "Facebook sued for $15 billion over alleged privacy infractions". CNET. Retrieved February 23, 2014.
  399. "After privacy ruling, Facebook now requires Belgium users to log in to view pages". The Verge. Retrieved December 17, 2015.
  400. Gordon, Whitson. "Facebook Changed Everyone's Email to @Facebook.com; Here's How to Fix Yours". Lifehacker.com. Retrieved October 25, 2016.
  401. Johnston, Casey (July 2, 2012). "@facebook.com e-mail plague chokes phone address books". Ars Technica. Retrieved June 14, 2017.
  402. Hamburger, Ellis (February 24, 2014). "Facebook retires its troubled @facebook.com email service". The Verge. Retrieved October 25, 2016.
  403. "Facebook mistakenly asked people if they were in Pakistan following a deadly explosion". Tech Insider. Retrieved March 27, 2016.
  404. "Facebook's Safety Check malfunctions after Pakistan bombing". CNET. Retrieved March 27, 2016.
  405. Michael Arrington, Is Facebook Really Censoring Search When It Suits Them?, TechCrunch, November 22, 2007
  406. Bowles, Nellie; Thielman, Sam (May 9, 2016). "Facebook accused of censoring conservatives, report says". The Guardian. Retrieved May 25, 2016.
    Nunez, Gizmodo (May 9, 2016). "Former Facebook Workers: We Routinely Suppressed Conservative News". Gizmodo.com. Retrieved September 8, 2018.
  407. Hunt, Elle (May 24, 2016). "Facebook to change trending topics after investigation into bias claims". The Guardian. Retrieved May 25, 2016.
  408. "Facebook apologises for blocking Prager University's videos". BBC. August 20, 2018. Retrieved August 22, 2018.
  409. Zhou, Marrian (August 21, 2018). "Facebook apologizes for removing conservative PragerU videos". CNET. Retrieved August 22, 2018.
  410. Schwartz, Jason (March 29, 2018). "Conservative outlets take on Facebook". Politico. Retrieved September 8, 2018.
  411. Flood, Brian (September 5, 2018). "Conservatives ditching Facebook over trust issues and fears of political bias, study shows". Fox News Channel. Archived from the original on September 5, 2018. Retrieved September 8, 2018.
  412. ^ "Congressman Matt Gaetz Files Criminal Referral Against Facebook CEO Mark Zuckerberg". Congressman Matt Gaetz. July 27, 2020. Retrieved July 28, 2020.
  413. "Matt Gaetz Files Criminal Referral Against Facebook CEO Mark Zuckerberg, Urges William Barr To Investigate". Florida Daily. Retrieved July 28, 2020.
  414. Dube Dwilson, Stephanie (October 13, 2018). "Yes, Facebook Is Blocking Minds Links as 'Unsecure'". Heavy.com. Retrieved October 21, 2018.
  415. Klint, Finley (November 11, 2015). "Facebook is blocking an upstart rival - but it's complicated". Wired. Retrieved October 21, 2018.
  416. Kelly, Makena (March 11, 2019). "Facebook proves Elizabeth Warren's point by deleting her ads about breaking up Facebook". The Verge. Retrieved February 25, 2020.
  417. Yaron, Oded (August 23, 2016). "Is Facebook Censoring Posts Critical of the Social Media Giant?". Haaretz. Retrieved February 25, 2020.
  418. Beckett, Lois (March 27, 2019). "Facebook to ban white nationalism and separatism content". The Guardian. Retrieved February 25, 2020.
  419. Hern, Alex (February 26, 2019). "Facebook moderators tell of strict scrutiny and PTSD symptoms". The Guardian. Retrieved February 25, 2020.
  420. Hern, Alex (December 4, 2019). "Ex-Facebook worker claims disturbing content led to PTSD". The Guardian. Retrieved February 25, 2020.
  421. Nycyk, Michael. Facebook: Exploring the Social Network and its Challenges.
  422. "Facebook Censored Breastfeeding. Sadly, I Wasn't Surprised". HuffPost. August 17, 2015. Retrieved May 8, 2020.
  423. Tijou, Sarah (March 20, 2017). "Naked mannequin photographer banned from Facebook". BBC Newsbeat. Retrieved May 8, 2020.
  424. Spanish newspaper El País, Estas son las imágenes que Facebook no quiso que vieras Ana Marcos, March 16, 2013. Retrieved March 17, 2015
  425. Norway newspaper aftenposten, Dear Mark. I am writing this to inform you that I shall not comply with your requirement to remove this picture. Espen Egil Hansen, September 9, 2016
  426. Norway newspaper aftenposten, Norway's prime minister and several government members censored by Facebook Kristin Jonassen Nordby, September 9, 2016
  427. Kafka, Peter (September 9, 2016). "Facebook changes its mind, and says it's okay to publish an iconic war photo, after all". Recode.net. Retrieved October 25, 2016.
  428. ^ "Protests mount over Facebook ban on breast-feeding photos; bigger turnout online than in Palo Alto". Mercury News. December 27, 2008.
  429. ^ McGinty, Bill (December 30, 2011). "Facebook apologizes for removing breastfeeding photo". WCNC.COM. Archived from the original on April 10, 2012. Retrieved February 17, 2012.
  430. McGinty, Bill (February 16, 2012). "Photos on breastfeeding Facebook page removed again". WCNC.COM. Archived from the original on April 10, 2012. Retrieved February 17, 2012.
  431. + name + (January 1, 1970). "組員逾八萬 疑有人眼寃不斷施壓 facebook鏟走反民建聯群組 | 蘋果日報 | 要聞港聞 | 20100205". Apple Daily (in Chinese). Hong Kong. Retrieved February 23, 2014.
  432. "Ответил за Пушкина". Livejournal.com. July 6, 2015. Archived from the original on July 8, 2015.
  433. "Журналист объяснил публикацию слова "хохол" в Facebook экспериментом". Lenta.Ru. July 7, 2015.
  434. "Колумнист Кононенко объяснил пост со словом "хохол" в Facebook желанием поэкспериментировать". Govoritmoskva.ru. July 7, 2015.
  435. "Кононенко заявил о блокировке аккаунта в Facebook за отрывок из Пушкина". RBC.ru. July 6, 2015.
  436. Photoshopped celebrities used for Kashmir pellet gun campaign. BBC News, July 28, 2016.
  437. Doshi, Vidhi. 2016. Facebook under fire for 'censoring' Kashmir-related posts and accounts. The Guardian, July 19, 2016.
  438. Lakshmi, Rama. 2016. Facebook is censoring some posts on Indian Kashmir. The Washington Post, July 27, 2016.
  439. Who removes Kashmir posts on Facebook?. Daily Dawn, July 28, 2016.
  440. Adamczyk, Ed. 2016. Kashmir activist campaign shows Facebook CEO Zuckerberg shot in face. United Press International, July 29, 2016.
  441. "Facebook's Kurdish problem?". Al Jazeera. August 24, 2013. Retrieved June 18, 2017.
  442. Livesay, Christopher (October 7, 2015). "After battling ISIS, Kurds find new foe in Facebook". Public Radio International. WGBH Educational Foundation. Retrieved June 18, 2017.
  443. "Facebook censored 54 posts for 'blasphemy' in Pakistan in second half of 2014 - The Express Tribune". The Express Tribune. Retrieved March 1, 2016.
  444. Faiola, Anthony (January 6, 2016). "Germany springs to action over hate speech against migrants". The Washington Post. Retrieved June 4, 2017.
  445. Bender, Rush; Schechner, Sam (September 14, 2015). "Facebook Outlines Measures to Combat Racist and Xenophobic Content". The Wall Street Journal. Retrieved June 4, 2017.
  446. Toor, Amar (September 15, 2015). "Facebook will work with Germany to combat anti-refugee hate speech". The Verge. Retrieved June 4, 2017.
  447. Toor, Amar (May 31, 2016). "Facebook, Twitter, Google, and Microsoft agree to EU hate speech rules". The Verge. Retrieved June 4, 2017.
  448. Hern, Alex (May 31, 2016). "Facebook, YouTube, Twitter and Microsoft sign EU hate speech code". The Guardian. Retrieved June 4, 2017.
  449. Dillet, Romain (May 31, 2016). "Facebook, Twitter, YouTube and Microsoft agree to remove hate speech across the EU". TechCrunch. AOL. Retrieved June 4, 2017.
  450. Fioretti, Julia (May 23, 2017). "EU states approve plans to make social media firms tackle hate speech". Reuters. Retrieved June 4, 2017.
  451. Toor, Amar (May 24, 2017). "EU close to making Facebook, YouTube, and Twitter block hate speech videos". The Verge. Retrieved June 4, 2017.
  452. Toor, Amar (June 2, 2017). "Facebook earns EU praise for combatting hate speech, as Twitter and YouTube lag behind". The Verge. Retrieved June 4, 2017.
  453. Macdonald, Alastair; Fioretti, Julia (June 1, 2017). "Social media firms have increased removals of online hate speech: EU". Reuters. Retrieved June 4, 2017.
  454. Yacoub Oweis, Khaled (November 23, 2007). "Syria blocks Facebook in Internet crackdown". Reuters. Retrieved March 5, 2008.
  455. "China's Facebook Status: Blocked". ABC News. July 8, 2009. Archived from the original on July 11, 2009. Retrieved July 13, 2009.
  456. "Facebook Faces Censorship in Iran". American Islamic Congress. August 29, 2007. Archived from the original on April 24, 2008. Retrieved April 30, 2008.
  457. ODPS (2010). "Isle of Man ODPS issues Facebook Guidance booklet" (PDF). Office of the Data Protection Supervisor. Archived from the original (PDF) on November 2, 2012. Retrieved May 1, 2013.
  458. "Pakistan court orders Facebook ban". The Belfast Telegraph.
  459. Crilly, Rob (May 19, 2010). "Facebook blocked in Pakistan over Prophet Mohammed cartoon row". The Daily Telegraph. London.
  460. "Pakistan blocks YouTube, Facebook over 'sacrilegious content'". CNN. May 21, 2010.
  461. "Pakistan blocks YouTube over blasphemous material". GEO.tv. May 20, 2010. Retrieved August 7, 2010.
  462. "Home - Pakistan Telecommunication Authority". Pta.gov.pk. Retrieved August 7, 2010.
  463. "LHC moved for ban on Facebook". The News International. Retrieved December 16, 2018.
  464. "Permanently banning Facebook: Court seeks record of previous petitions". The Express Tribune. May 6, 2011. Retrieved December 16, 2018.
  465. "Organizations blocking Facebook". CTV news.
  466. Benzie, Robert (May 3, 2007). "Facebook banned for Ontario staffers". Toronto Star. Retrieved March 5, 2008.
  467. "Ontario politicians close the book on Facebook". Blog Campaigning. May 23, 2007. Archived from the original on March 14, 2008. Retrieved March 5, 2008.
  468. "Facebook banned for council staff". BBC News. September 1, 2009. Retrieved February 2, 2010.
  469. "Tietoturvauhan poistuminen voi avata naamakirjan Kokkolassa" (in Finnish). Archived from the original on February 22, 2012. Retrieved February 2, 2010.
  470. "Immediate Ban of Internet Social Networking Sites (SNS) On Marine Corps Enterprise Network (MCEN) NIPRNET". Archived from the original on December 25, 2009. Retrieved February 2, 2010.
  471. "Facebook kiellettiin Keski-Suomen sairaanhoitopiirissä" (in Finnish). Archived from the original on October 25, 2009. Retrieved February 2, 2010.
  472. "Sairaanhoitopiirin työntekijöille kielto nettiyhteisöihin" (in Finnish). Archived from the original on July 20, 2011. Retrieved February 2, 2010.
  473. Fort, Caleb (October 12, 2005). "CIRT blocks access to Facebook.com". Daily Lobo (University of New Mexico). Retrieved April 3, 2006.
  474. "Popular web site, Facebook.com, back online at UNM". University of New Mexico. January 19, 2006. Archived from the original on February 12, 2007. Retrieved April 15, 2007.
  475. Loew, Ryan (June 22, 2006). "Kent banning athlete Web profiles". The Columbus Dispatch. Retrieved October 6, 2006.
  476. "The Summer Kent Stater 5 July 2006 — Kent State University". dks.library.kent.edu. Retrieved October 8, 2020.
  477. "Closed Social Networks as a Gilded Cage". August 6, 2007. Archived from the original on October 29, 2013. Retrieved February 23, 2009.
  478. NSTeens video about private social networking. Archived March 10, 2010, at the Wayback Machine.
  479. Lapeira (October 16, 2008). "Three types of social networking".
  480. "Openbook - Connect and share whether you want to or not". Youropenbook.org. May 12, 2010. Archived from the original on August 3, 2010. Retrieved August 7, 2010.
  481. "Niet compatibele browser". Retrieved August 7, 2010 – via Facebook.
  482. "Facebook Privacy Change Sparks Federal Complaint". PC World. Retrieved March 5, 2009.
  483. "Facebook's New Terms Of Service: "We Can Do Anything We Want With Your Content. Forever."". Consumerist. Consumer Media LLC. Archived from the original on October 8, 2009. Retrieved February 20, 2009.
  484. "Improving Your Ability to Share and Connect". Retrieved March 5, 2009 – via Facebook.
  485. Haugen, Austin (October 23, 2009). "facebook DEVELOPERS". Archived from the original on December 23, 2009. Retrieved October 25, 2009 – via Facebook.
  486. "Facebook Town Hall: Proposed Facebook Principles". Archived from the original on February 27, 2009. Retrieved March 5, 2009 – via Facebook.
  487. "Facebook Town Hall: Proposed Statement of Rights and Responsibilities". Archived from the original on February 27, 2009. Retrieved March 5, 2009 – via Facebook.
  488. "Governing the Facebook Service in an Open and Transparent Way". Retrieved March 5, 2009 – via Facebook.
  489. "Rewriting Facebook's Terms of Service". PC World. Retrieved March 5, 2009.
  490. "Democracy Theatre on Facebook". University of Cambridge. Retrieved April 4, 2009.
  491. "Facebook's theatrical rights and wrongs". Open Rights Group. Archived from the original on April 6, 2009. Retrieved April 4, 2009.
  492. "Complaint, Request for Investigation, Injunction, and Other Relief" (PDF). Epic.org. Retrieved December 16, 2018.
  493. "Supplemental Materials in Support of Pending Complaint and Request for Injunction, Request for Investigation and for Other Relief" (PDF). Epic.org. Retrieved December 16, 2018.
  494. Puzzanghera, Jim (March 1, 2011). "Facebook reconsiders allowing third-party applications to ask minors for private information". Los Angeles Times.
  495. Electronic Privacy Information Center. "EPIC - Facebook Resumes Plan to Disclose User Home Addresses and Mobile Phone Numbers". epic.org.
  496. Baker, Gavin (May 27, 2008). "Free software vs. software-as-a-service: Is the GPL too weak for the Web?". Free Software Magazine. Archived from the original on May 17, 2013. Retrieved June 29, 2009.
  497. "Statement of Rights and Responsibilities". May 1, 2009. Retrieved June 29, 2009 – via Facebook.
  498. Calore, Michael (December 1, 2008). "As Facebook Connect Expands, OpenID's Challenges Grow". Wired. Retrieved June 29, 2009. Facebook Connect was developed independently using proprietary code, so Facebook's system and OpenID are not interoperable. ... This is a clear threat to the vision of the Open Web, a future when data is freely shared between social websites using open source technologies.
  499. Thompson, Nicholas. "What Facebook Can Sell". The New Yorker. Retrieved May 18, 2014.
  500. Barnett, Emma (May 23, 2012). "Facebook Settles Lawsuit With Angry Users". The Telegraph. London. Retrieved May 18, 2014.
  501. Dijck 2013, p. 47.
  502. Farber, Dan. "Facebook Beacon Update: No Activities Published Without Users Proactively Consenting". ZDNet. Retrieved May 18, 2014.
  503. Sinker, Daniel (February 17, 2009). "Face/Off: How a Little Change in Facebook's User Policy is Making People Rethink the Rights They Give Away Online". HuffPost. Retrieved May 28, 2014.
  504. Dijck 2013, p. 48.
  505. Brunton, Finn. "Vernacular Resistance to Data Collection and Analysis: A Political Theory of Obfuscation". First Monday. Retrieved May 18, 2014.
  506. "BBB Review of Facebook". Retrieved December 12, 2010.
  507. "TrustLink Review of Facebook". Archived from the original on June 13, 2010. Retrieved May 5, 2010.
  508. Emery, Daniel (July 29, 2010). "Details of 100m Facebook users collected and published". BBC. Retrieved August 7, 2010.
  509. Perlroth, Nicole (June 3, 2013). "Bits: Malware That Drains Your Bank Account Thriving on Facebook". The New York Times. Retrieved June 9, 2013.
  510. Bort, Julie (April 20, 2011). "Researcher: Facebook Ignored the Bug I Found Until I Used It to Hack Zuckerberg". Yahoo! Finance. Retrieved August 19, 2013.
  511. "Zuckerberg's Facebook page hacked to prove security exploit". CNN. May 14, 2013. Retrieved August 19, 2013.
  512. Tom Warren (August 1, 2013). "Facebook ignored security bug, researcher used it to post details on Zuckerberg's wall". The Verge. Retrieved August 19, 2013.
  513. "Hacker who exposed Facebook bug to get reward from unexpected source". Yahoo! Finance. Reuters. August 20, 2013. Archived from the original on August 21, 2013. Retrieved August 22, 2013.
  514. Rogoway, Mike (January 21, 2010). "Facebook picks Prineville for its first data center". The Oregonian. Retrieved January 21, 2010.
  515. Kaufman, Leslie (September 17, 2010). "You're 'So Coal': Angling to Shame Facebook". The New York Times.
  516. Albanesius, Chloe (September 17, 2010). "Greenpeace Attacks Facebook on Coal-Powered Data Center". PC Magazine.
  517. "Facebook update: Switch to renewable energy now Greening Facebook from within". Greenpeace. February 17, 2010.
  518. Tonelli, Carla (September 1, 2010). "'Friendly' push for Facebook to dump coal". Reuters. Archived from the original on October 13, 2010. Retrieved February 23, 2014.
  519. "Dirty Data Report Card" (PDF). Greenpeace. Retrieved August 22, 2013.
  520. "Facebook and Greenpeace settle Clean Energy Feud". Techcrunch. Retrieved August 22, 2013.
  521. "Facebook Commits to Clean Energy Future". Greenpeace. Retrieved August 22, 2013.
  522. "Startup Claims 80% Of Its Facebook Ad Clicks Are Coming From Bots". TechCrunch.com. January 4, 2011. Retrieved July 31, 2012.
  523. Rodriguez, Salvador (July 30, 2012). "Start-up says 80% of its Facebook ad clicks came from bots". Los Angeles Times. Retrieved July 31, 2012.
  524. Sengupta, Somini (April 23, 2012). "Bots Raise Their Heads Again on Facebook". Bits.blogs.nytimes.com. Retrieved July 31, 2012.
  525. Hof, Robert. "Stung By Click Fraud Allegations, Facebook Reveals How It's Fighting Back". Forbes. Retrieved December 16, 2018.
  526. "Guide to the Ads Create Tool". Retrieved June 11, 2014 – via Facebook.
  527. "Facebook Advertisers Complain Of A Wave Of Fake Likes Rendering Their Pages Useless". Business Insider. February 11, 2014. Retrieved June 11, 2014.
  528. Kirtiş, A. Kazım; Karahan, Filiz (October 5, 2011). "Efficient Marketing Strategy". Procedia - Social and Behavioral Sciences. 24: 260–268. doi:10.1016/j.sbspro.2011.09.083.
  529. "Are 40% Of Life Science Company Facebook Page 'Likes' From Fake Users?". Comprendia. Retrieved June 7, 2014.
  530. "Facebook, Inc. Form 10K". United States Securities and Exchange Commission. January 28, 2014. Retrieved June 7, 2014.
  531. "What Do Facebook "likes" of Companies Mean?". PubChase. January 23, 2014. Archived from the original on July 3, 2014. Retrieved June 7, 2014.
  532. "Facebook Fraud". February 10, 2014. Retrieved June 11, 2014 – via YouTube.
  533. "Firms withdraw BNP Facebook ads". BBC News. August 3, 2007. Retrieved April 30, 2010.
  534. "Facebook halts ads that exclude racial and ethnic groups". USA Today. Retrieved March 29, 2019.
  535. Brandom, Russell (March 28, 2019). "Facebook has been charged with housing discrimination by the US government". The Verge. Retrieved March 29, 2019.
  536. Angwin, Julia; Tobin, Ariana (November 21, 2017). "Facebook (Still) Letting Housing Advertisers Exclude Users by Race". ProPublica. Retrieved March 29, 2019.
  537. Robertson, Adi (April 4, 2019). "Facebook's ad delivery could be inherently discriminatory, researchers say". The Verge. Retrieved April 8, 2019.
  538. Angwin, Julia; Parris, Terry Jr. (October 28, 2016). "Facebook Lets Advertisers Exclude Users by Race". ProPublica. Retrieved March 29, 2019.
  539. "Improving Enforcement and Promoting Diversity: Updates to Ads Policies and Tools". Retrieved March 29, 2019 – via Facebook.
  540. Statt, Nick (July 24, 2018). "Facebook signs agreement saying it won't let housing advertisers exclude users by race". The Verge. Retrieved March 29, 2019.
  541. Statt, Nick (August 21, 2018). "Facebook will remove 5,000 ad targeting categories to prevent discrimination". The Verge. Retrieved March 29, 2019.
  542. "Facebook agrees to overhaul targeted advertising system for job, housing and loan ads after discrimination complaints". The Washington Post. March 19, 2019. Retrieved March 29, 2019.
  543. Madrigal, Alexis C. (March 20, 2019). "Facebook Does Have to Respect Civil-Rights Legislation, After All". The Atlantic. Retrieved March 29, 2019.
  544. Yurieff, Kaya. "HUD charges Facebook with housing discrimination in ads". CNN. Retrieved March 29, 2019.
  545. "Facebook: About 83 million accounts are fake". USA Today. August 3, 2012. Retrieved August 4, 2012.
  546. "Unreal: Facebook reveals 83 million fake profiles". The Sydney Morning Herald. Retrieved August 4, 2012.
  547. Rushe, Dominic (August 2, 2012). "Facebook share price slumps below $20 amid fake account flap". The Guardian. London. Retrieved August 4, 2012.
  548. Gupta, Aditi (2017). "Towards detecting fake user accounts in facebook". 2017 ISEA Asia Security and Privacy (ISEASP). pp. 1–6. doi:10.1109/ISEASP.2017.7976996. ISBN 978-1-5090-5942-3. S2CID 37561110.
  549. "Facebook Takes 4 Years to Remove A Woman's Butthole as a Business Page".
  550. "The Facebook Blog - Moving to the new Facebook".
  551. "Facebook Newsroom". newsroom.fb.com.
  552. "Petition against Facebook redesign fails as old version disabled". Archived from the original on September 12, 2012.
  553. "Facebook's New Privacy Changes: The Good, The Bad, and The Ugly | Electronic Frontier Foundation". Eff.org. December 9, 2009. Retrieved August 7, 2010.
  554. "Gawker.com". Gawker.com. December 13, 2009. Archived from the original on May 17, 2013. Retrieved June 11, 2013.
  555. "What Does Facebook's Privacy Transition Mean for You? | ACLUNC dotRights". Dotrights.org. December 4, 2009. Archived from the original on December 12, 2009. Retrieved December 13, 2009.
  556. "Facebook faces criticism on privacy change". BBC News. December 10, 2008. Retrieved December 13, 2009.
  557. "ACLU.org". Secure.aclu.org. Archived from the original on February 24, 2012. Retrieved June 11, 2013.
  558. "Facebook CEO's Private Photos Exposed by the New 'Open' Facebook". Gawker.com. Archived from the original on December 14, 2009. Retrieved December 13, 2009.
  559. McCarthy, Caroline. "Facebook backtracks on public friend lists | The Social - CNET News". CNET. Retrieved December 13, 2009.
  560. "Mediactive.com". Mediactive.com. December 12, 2009. Retrieved June 11, 2013.
  561. Oremus, Will. "TheBigMoney.com". TheBigMoney.com. Retrieved June 11, 2013.
  562. "ReadWriteWeb.com". ReadWriteWeb.com. Archived from the original on January 13, 2010. Retrieved June 11, 2013.
  563. Evangelista, Benny (January 27, 2010). San Francisco Chronicle http://www.sfgate.com/cgi-bin/blogs/techchron/detail?&entry_id=56175. Retrieved February 23, 2014.
  564. Deppa, Seetharaman (January 11, 2018). "Facebook to Rank News Sources by Quality to Battle Misinformation". The New York Times. Retrieved March 5, 2018.
  565. Mark Zuckerberg, , Facebook, January 12, 2018
  566. Isaac, Mike (January 11, 2018). "Facebook Overhauls News Feed to Focus on What Friends and Family Share". The New York Times. Retrieved March 5, 2018.
  567. Mosseri, Adam (January 11, 2018). "News Feed FYI: Bringing People Closer Together". Facebook newsroom. Retrieved March 5, 2018.
  568. Engel Bromwich, Jonah; Haag, Matthew (January 12, 2018). "Facebook Is Changing. What Does That Mean for Your News Feed?". The New York Times. Retrieved March 5, 2018.
  569. Bell, Emily (January 21, 2018). "Why Facebook's news feed changes are bad news for democracy". The Guardian. Retrieved March 11, 2018.
  570. Dojcinovic, Stevan (November 15, 2017). "Hey, Mark Zuckerberg: My Democracy Isn't Your Laboratory". The New York Times. Retrieved March 11, 2018.
  571. Shields, Mike (February 28, 2018). "Facebook's algorithm has wiped out a once flourishing digital publisher". The New York Times. Retrieved March 12, 2018.
  572. "The top 10 facts about FreeBasics". December 28, 2015. Archived from the original on March 2, 2016.
  573. "Free Basics by Facebook". Internet.org.
  574. "TRAI Releases the 'Prohibition of Discriminatory Tariffs for Data Services Regulations, 2016'" (PDF). TRAI. February 8, 2016. Archived from the original (PDF) on February 8, 2016.
  575. "How India Pierced Facebook's Free Internet Program". Backchannel. February 1, 2016.
  576. "TRAI letter to Facebook" (PDF). Archived from the original (PDF) on February 19, 2016.
  577. "Trai to Seek Specific Replies From Facebook Free Basic Supporters". Press Trust of India. December 31, 2015.
  578. Brühl, Jannis; Tanriverdi, Hakan (2018). "Gut für die Welt, aber nicht für uns". Süddeutsche Zeitung (in German). ISSN 0174-4917. Retrieved December 10, 2018.
  579. "Tech bosses grilled over claims of 'harmful' power". BBC News. July 30, 2020. Retrieved July 30, 2020.
  580. Fung, Brian. "Congress grilled the CEOs of Amazon, Apple, Facebook and Google. Here are the big takeaways". CNN. Retrieved July 30, 2020.
