Facebook has been fined 100,000 euros in Germany after failing to follow orders regarding clearer privacy terms and conditions for users.
The regional court of Berlin ruled that the company did not sufficiently alter the wording of an intellectual property clause in its terms and conditions, despite being told to do so following a complaint filed by the Federation of German
Consumer Organizations. The federation's head, Klaus Mueller, said that Facebook keeps attempting to evade consumer law in Germany and across Europe.
In March 2012, a German court originally ruled that the company's terms and conditions were vague about how far it could go with users' data and intellectual property, implying Facebook could license its users' photos and videos to third
parties for business reasons. However, the authorities' primary concern was Facebook's cooperation with the US government in providing data for its mass surveillance programs. After Edward Snowden's revelations about the US government's spying programs
and how the tech industry complies, the issue has gained more gravity.
While Facebook complied with the ruling four years ago, the Berlin court now concludes that it merely changed the wording of the clause in question without changing the message it conveyed. Meanwhile, the company defended itself, saying that
it had complied with the original ruling and was fined because it could not implement the changes quickly enough.
In a ruling of particular interest to those working in the adult entertainment biz, a German court has ruled that Facebook's real name policy is illegal and that users must be allowed to sign up for the service under pseudonyms.
The opinion comes from the Berlin Regional Court and was disseminated by the Federation of German Consumer Organizations, which filed the suit against Facebook. The Berlin court found that Facebook's real name policy was a covert way of obtaining
users' consent to share their names, one of many pieces of information for which the court said Facebook had not properly obtained users' permission.
The court also said that Facebook didn't provide a clear-cut choice to users for other default settings, such as to share their location in chats. It also ruled against clauses that allowed the social media giant to use information such as
profile pictures for commercial, sponsored or related content.
Facebook told Reuters it will appeal the ruling, but also that it will make changes to comply with European Union privacy laws coming into effect in June.
Facebook has been ordered by a court in Belgium to stop tracking people without consent. The company has been told to delete all the data it had gathered on people who do not use Facebook. The court ruled the data was gathered illegally.
Belgium's privacy watchdog said the website had broken privacy laws by placing tracking code on third-party websites.
Facebook said it would appeal against the ruling.
The social network faces fines of 250,000 euros a day if it does not comply.
The ruling is the latest in a long-running dispute between the social network and the Belgian commission for the protection of privacy (CPP). In 2015, the CPP complained that Facebook tracked people when they visited pages on the site or clicked
like or share, even if they were not members.
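The mechanism the CPP complained about can be sketched in a few lines. This is a hypothetical illustration, not Facebook's actual code: every third-party page embedding a widget (a like button or pixel) triggers a request to the tracker's domain, and a tracker-set cookie ties all those requests to one browser, member or not.

```python
# Hypothetical sketch of third-party tracking via an embedded widget; the
# Tracker class and names are invented for illustration.
import uuid

class Tracker:
    def __init__(self):
        self.visits = {}  # cookie id -> pages where the widget was loaded

    def handle_pixel_request(self, cookie, referer):
        """Handle one widget/pixel load; assign a cookie on first sight."""
        if cookie is None:
            cookie = uuid.uuid4().hex  # an id is minted even for non-members
        self.visits.setdefault(cookie, []).append(referer)
        return cookie  # sent back as Set-Cookie, so the id follows the browser

tracker = Tracker()
cid = tracker.handle_pixel_request(None, "news.example/article")
tracker.handle_pixel_request(cid, "shop.example/cart")
# tracker.visits[cid] now lists both third-party pages for one browser
```

The point of the sketch is that the browsing history accumulates on the tracker's side across unrelated sites, with no account or login involved.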
For years, privacy advocates have been shouting about Facebook, and for years the population as a whole didn't care. Whatever the reason, the ongoing Cambridge Analytica saga seems to have temporarily burst this sense of complacency, and people
are suddenly giving the company a lot more scrutiny.
When you delete Facebook, the company provides you with a compressed file with everything it has on you. As well as every photo you've ever uploaded and details of any advert you've ever interacted with, some users are panicking that Facebook
seems to have been tracking all of their calls and texts. Details of who you've called, when and for how long appear in an easily accessible list -- even if you don't use Facebook-owned WhatsApp or Messenger for texts or calls.
Although it has been put around that Facebook has been logging calls without your permission, this is not quite the case. In fact Facebook does follow its own settings and permissions, and does not track your calls if you don't give
permission. The issue is that people do not realise quite how wide the permissions are that they grant when they tick the permission boxes.
Facebook seemed to confirm this in a statement in response:
You may have seen some recent reports that Facebook has been logging people's call and SMS (text) history without their permission. This is not the case. Call and text history logging is part of an opt-in feature for people using Messenger or
Facebook Lite on Android. People have to expressly agree to use this feature. If, at any time, they no longer wish to use this feature they can turn it off.
So there you have it: if you use Messenger or Facebook Lite on Android, you have indeed given the company permission to snoop on ALL your calls, not just those made through Facebook apps.
UK Culture Secretary Matt Hancock met Facebook executives to warn them that the social network is not above the law.
Hancock told US-based Vice President of Global Policy Management Monika Bickert, and Global Deputy Chief Privacy Officer Stephen Deadman he would hold their feet to the fire over the privacy of British users.
Hancock pressed Facebook on accountability, transparency, micro-targeting and data protection. He also sought assurances that UK citizens' data was no longer at risk and that Facebook would be giving citizens more control over their data going forward.
Following the talks, Hancock said:
Social media companies are not above the law and will not be allowed to shirk their responsibilities to our citizens. We will do what is needed to ensure that people's data is protected and don't rule anything out - that includes further
regulation in the future.
Grovelling to the Senate Judiciary and Commerce Committees, Mark Zuckerberg apologised that Facebook had not taken a broad enough view of its responsibility for people's public information. He said:
It was my mistake, and I'm sorry. I started Facebook, I run it, and I'm responsible for what happens here.
Zuckerberg said its audit of third-party apps would highlight any misuse of personal information, and said the company would alert users instantly if it found anything suspicious.
When asked why the company did not immediately alert the 87 million users whose data may have been accessed by Cambridge Analytica (CA) when first told about the improper usage in 2015, Zuckerberg said Facebook considered it a closed case after
CA said it had deleted it. He apologised:
In retrospect it was clearly a mistake to believe them.
Zuckerberg's profuse apologies seem to have been a hit at the stock exchange but techies weren't impressed when he clammed up when asked for details on how Facebook snoops on users (and non-users).
Here is an update on the Facebook app investigation and audit that Mark Zuckerberg promised on March 21.
As Mark explained, Facebook will investigate all the apps that had access to large amounts of information before we changed our platform policies in 2014 -- significantly reducing the data apps could access. He also made clear that where we had
concerns about individual apps we would audit them -- and any app that either refused or failed an audit would be banned from Facebook.
The investigation process is in full swing, and it has two phases. First, a comprehensive review to identify every app that had access to this amount of Facebook data. And second, where we have concerns, we will conduct interviews, make requests
for information (RFI) -- which ask a series of detailed questions about the app and the data it has access to -- and perform audits that may include on-site inspections.
We have large teams of internal and external experts working hard to investigate these apps as quickly as possible. To date thousands of apps have been investigated and around 200 have been suspended -- pending a thorough investigation into
whether they did in fact misuse any data. Where we find evidence that these or other apps did misuse data, we will ban them and notify people via this website. It will show people if they or their friends installed an app that misused data before
2015 -- just as we did for Cambridge Analytica.
There is a lot more work to be done to find all the apps that may have misused people's Facebook data -- and it will take time. We are investing heavily to make sure this investigation is as thorough and timely as possible. We will keep you
updated on our progress.
Add a phone number I never gave Facebook for targeted advertising to the list of deceptive and invasive ways Facebook makes money off your personal information. Contrary to user expectations and Facebook representatives' own previous statements,
the company has been using contact information that users explicitly provided for security purposes--or that users never provided at all--for targeted advertising.
A group of academic researchers from Northeastern University and Princeton University, along with Gizmodo reporters, have used real-world tests to demonstrate how Facebook's latest deceptive practice works. They found that Facebook harvests
user phone numbers for targeted advertising in two disturbing ways: two-factor authentication (2FA) phone numbers, and shadow contact information.

Two-Factor Authentication Is Not The Problem
First, when a user gives Facebook their number for security purposes--to set up 2FA, or to receive alerts about new logins to their account--that phone number can become fair game for advertisers within weeks. (This is not the first time
Facebook has misused 2FA phone numbers.)
But the important message for users is: this is not a reason to turn off or avoid 2FA. The problem is not with two-factor authentication. It's not even a problem with the inherent weaknesses of SMS-based 2FA in particular. Instead, this is a
problem with how Facebook has handled users' information and violated their reasonable security and privacy expectations.
There are many types of 2FA. SMS-based 2FA requires a phone number, so you can receive a text with a second-factor code when you log in. Other types of 2FA--like authenticator apps and hardware tokens--do not require a phone number to work.
However, until just four months ago, Facebook required users to enter a phone number to turn on any type of 2FA, even though it offers its authenticator as a more secure alternative. Other companies--Google notable among them--also still
follow that outdated practice.
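To see why authenticator apps need no phone number at all, here is a minimal sketch of the standard TOTP scheme (RFC 6238) that such apps implement: the code is derived locally from a shared secret and the clock, so nothing is ever sent over SMS.

```python
# Minimal RFC 6238 TOTP sketch, using only the standard library.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, at=None, digits=6, step=30):
    """Return the TOTP code for a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in base32):
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # -> 287082
```

The server stores the same secret and runs the same computation, which is why a phone number is simply not part of the protocol.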
Even with the welcome move to no longer require phone numbers for 2FA, Facebook still has work to do here. This finding has not only validated users who are suspicious of Facebook's repeated claims that we have complete control over our own
information, but has also seriously damaged users' trust in a foundational security practice.
Until Facebook and other companies do better, users who need privacy and security most--especially those for whom using an authenticator app or hardware key is not feasible--will be forced into a corner.

Shadow Contact Information
Second, Facebook is also grabbing your contact information from your friends. Kash Hill of Gizmodo provides an example:
...if User A, whom we'll call Anna, shares her contacts with Facebook, including a previously unknown phone number for User B, whom we'll call Ben, advertisers will be able to target Ben with an ad using that phone number, which I call shadow
contact information, about a month later.
This means that, even if you never directly handed a particular phone number over to Facebook, advertisers may nevertheless be able to associate it with your account based on your friends' phone books.
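The linking described above can be sketched as follows. This is a hypothetical illustration of the general technique, not Facebook's actual pipeline; the function names, and the assumption that matching happens via an email address in the uploaded contact entry, are invented for the example.

```python
# Hypothetical sketch of "shadow" number linking: if a contact entry in
# someone else's uploaded address book shares an identifier (here, an email)
# with an existing account, the entry's phone number can be attached to that
# account -- even though the account holder never provided it.

def normalize(number):
    """Reduce a phone number to bare digits so formatting variants match."""
    return "".join(ch for ch in number if ch.isdigit())

def link_shadow_numbers(uploaded_contacts, accounts_by_email):
    """Map account ids to phone numbers learned from other people's uploads.

    uploaded_contacts: dicts with 'email' and 'phone' from an uploader's phone.
    accounts_by_email: emails the platform already ties to account ids.
    """
    shadow = {}
    for contact in uploaded_contacts:
        account = accounts_by_email.get(contact["email"].lower())
        if account is not None:
            shadow.setdefault(account, set()).add(normalize(contact["phone"]))
    return shadow
```

In Hill's scenario, Anna is the uploader and Ben's account is the match: one upload by Anna is enough to make Ben's number targetable.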
Even worse, none of this is accessible or transparent to users. You can't find such shadow contact information in the contact and basic info section of your profile; users in Europe can't even get their hands on it despite explicit requirements
under the GDPR that a company give users a right to know what information it has on them.
As Facebook attempts to salvage its reputation among users in the wake of the Cambridge Analytica scandal, it needs to put its money where its mouth is. Wiping 2FA numbers and shadow contact data from non-essential use would be a good start.
Facebook has filed a patent that describes a method of using the devices of Facebook app users to identify various wireless signals from the devices of other users.
It explains how Facebook could use those signals to measure exactly how close the two devices are to one another and for how long, and analyse that data to infer whether it is likely that the two users have met. The patent also suggests the app
could record how often devices are close to one another, and the duration and time of meetings, and could even use the phones' gyroscopes and accelerometers to analyse movement patterns--for example, whether the two users may be going for a jog, smooching or
catching a bus together.
Facebook's algorithm would use this data to analyse how likely it is that the two users have met, even if they're not friends on Facebook and have no other connections to one another. This might be based on the pattern of inferred meetings, such
as whether the two devices are close to one another for an hour every Thursday, and an algorithm would determine whether the two users meeting was sufficiently significant to recommend them to each other and/or friends of friends.
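The inference described can be sketched in a few lines. This is a hypothetical illustration only: the thresholds, the simple log-distance signal model, and all the names are invented for the example, not taken from the patent text.

```python
# Hypothetical sketch of proximity inference from observed signal strength.

def rssi_to_metres(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Rough distance estimate from received signal strength (log-distance model)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def count_meetings(sightings, max_distance_m=2.0, min_duration_s=600):
    """Count observation intervals close enough and long enough to be 'meetings'.

    sightings: (start_ts, end_ts, rssi_dbm) tuples, one per time the other
    user's device was observed. A recurring count (say, every Thursday)
    could then feed a friend-suggestion score.
    """
    meetings = 0
    for start, end, rssi in sightings:
        if end - start >= min_duration_s and rssi_to_metres(rssi) <= max_distance_m:
            meetings += 1
    return meetings
```

A brief sighting or a weak signal is discarded; only sustained, close co-presence counts, which is what lets the algorithm distinguish an hour-long meeting from two strangers passing on the street.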
I don't suppose that Facebook can claim this patent though as police and the security services have no doubt been using this technique for years.
Parliament's fake news inquiry has published a cache of seized Facebook documents including internal emails sent between Mark Zuckerberg and the social network's staff. The emails were obtained from the chief of a software firm that is suing the
tech giant. About 250 pages have been published, some of which are marked highly confidential.
Facebook had objected to their release.
Damian Collins MP, the chair of the parliamentary committee involved, highlighted several key issues in an introductory note. He wrote that:
Facebook allowed some companies to maintain "full access" to users' friends data even after announcing changes to its platform in 2014/2015 to limit what developers could see. "It is not clear that there was any user consent for
this, nor how Facebook decided which companies should be whitelisted," Mr Collins wrote.
Facebook had been aware that an update to its Android app that let it collect records of users' calls and texts would be controversial. "To mitigate any bad PR, Facebook planned to make it as hard as possible for users to know that this
was one of the underlying features," Mr Collins wrote.
Facebook used data provided by the Israeli analytics firm Onavo to determine which other mobile apps were being downloaded and used by the public. It then used this knowledge to decide which apps to acquire or otherwise treat as a threat.
There was evidence that Facebook's refusal to share data with some apps caused them to fail.
There had been much discussion of the financial value of providing access to friends' data.
In response, Facebook said that the documents had been presented in a very misleading manner and required additional context.
Facebook has been fined €10m (£8.9m) by Italian authorities for misleading users over its data practices.
The two fines issued by Italy's competition watchdog are some of the largest levied against the social media company for data misuse.
The Italian regulator found that Facebook had breached the country's consumer code by:
Misleading users in the sign-up process about the extent to which the data they provide would be used for commercial purposes.
Emphasising only the free nature of the service, without informing users of the "profitable ends that underlie the provision of the social network", and so encouraging them to make a decision of a commercial nature that they would not
have taken if they were in full possession of the facts.
Forcing an "aggressive practice" on registered users by transmitting their data from Facebook to third parties, and vice versa, for commercial purposes.
The company was specifically criticised for the default setting of the Facebook Platform services, which, in the words of the regulator, prepares the transmission of user data to individual websites/apps without express consent from users.
Although users can disable the platform, the regulator found that its opt-out nature did not provide a fully free choice.
The authority has also directed Facebook to publish an apology to users on its website and on its app.