Over the years, Facebook and its parent company Meta have been involved in a long list of privacy incidents. The company has repeatedly been accused of mishandling user data, and according to internal documents, we now know why: its systems were designed that way.
The team at Vice got its hands on an internal Facebook document written by the company's privacy engineers. The document was prepared to brief leadership on incoming regulation of how the company uses its users' data, and overall it doesn't paint a pretty picture. The engineers note that Facebook was "caught off guard" by new rules in the EU and India restricting its use of first-party data, describing the new regulations as "creating a global regulatory push towards consent for 1P [first-party] data use in ads." Yes, imagine that.
The document notes that Facebook's policy enforcement had previously been adequate only for "third-party concerns," meaning outside entities that seek access to user data, such as a law enforcement agency. That sets up a problem for Facebook: with stricter rules likely coming down the pike, how would it respond? The short answer is that it can't, because its systems weren't built that way. From the document:
"We do not have an adequate level of control and explainability over how our systems use data, and thus we can't confidently make controlled policy changes or external commitments such as 'we will not use X data for Y purpose.' And yet, this is exactly what regulators expect of us, increasing our risk of mistakes and misrepresentation."
The engineers then offer a vivid analogy for how it all works (you know, in case it needed spelling out for the C-suite). The document states:
"We've built systems with open borders. The result of these open systems and open culture is well described with an analogy: Imagine you hold a bottle of ink in your hand. This bottle of ink is a mixture of all kinds of user data (3PD, 1PD, SCD, Europe, etc.). You pour that ink into a lake of water (our open data systems; our open culture) … and it flows … everywhere. How do you put that ink back in the bottle? How do you organize it again, such that it only flows to the allowed places in the lake?" For reference, 3PD is third-party data, 1PD is first-party data, and SCD is sensitive categories data.
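The engineering problem behind the ink analogy resembles what's sometimes called taint or lineage tracking: once values from different sources are combined, every derived value inherits all of their origin labels, and the mixing is effectively irreversible. Here's a minimal, purely hypothetical sketch of that idea; it is not Facebook's system, and all names and labels are illustrative.

```python
# Hypothetical data-lineage ("taint") tracking sketch, illustrating why
# the ink is hard to put back in the bottle: any value derived from
# mixed-origin inputs carries all of their source labels forever.
from dataclasses import dataclass


@dataclass(frozen=True)
class Tagged:
    """A value annotated with the set of data sources it came from."""
    value: object
    sources: frozenset  # e.g. {"1PD"}, {"3PD"}, {"SCD"}


def combine(a: Tagged, b: Tagged, op):
    # Any computation over tagged inputs unions their source labels.
    return Tagged(op(a.value, b.value), a.sources | b.sources)


home_city = Tagged("Berlin", frozenset({"1PD"}))          # first-party data
purchase = Tagged("running shoes", frozenset({"3PD"}))    # third-party data

ad_signal = combine(home_city, purchase, lambda c, p: (c, p))
print(sorted(ad_signal.sources))  # ['1PD', '3PD'] -- the ink has mixed
```

In a system built this way from day one, a commitment like "we will not use X data for Y purpose" is a simple label check at the point of use; retrofitting it onto an "open borders" system means re-deriving labels for every value already in the lake.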
A Facebook spokesperson pushed back on the story in a statement issued Friday, saying that "similar, baseless allegations" had been made about Facebook before. "New privacy regulations around the world introduce different requirements, and this document reflects the technical solutions we are building to meet them. We are developing data-management tools and scaling up existing measures to meet our obligations," the spokesperson told Vice. The representative added that the document was being taken out of context, because it "does not describe our extensive processes and controls for complying with privacy regulations."
As an example of the scale of the problem, the engineers say Facebook's advertising model draws on 15,000 features. Building a single feature, such as "user_home_city_moved", can require six thousand data points. Multiply that across Facebook's nearly three billion users, and you'll see the scale of the problem.
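The multiplication above is worth doing explicitly. A back-of-envelope sketch, using only the figures the engineers quote (treating "six thousand data points per feature" as applying uniformly, which is an assumption for illustration):

```python
# Back-of-envelope arithmetic using the figures quoted in the document.
# The uniform per-feature cost is an illustrative assumption.
features_per_model = 15_000       # features in the advertising model
points_per_feature = 6_000        # data points to build one feature
users = 3_000_000_000             # roughly Facebook's user base

points_per_user = features_per_model * points_per_feature
total_points = points_per_user * users

print(f"{points_per_user:,} data points per fully-featured profile")
print(f"{total_points:.2e} data points across all users")
```

That lands in the hundreds of quadrillions of data points, which is why "just audit where every piece of data flows" is not a quick fix.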
Whatever the size of the job, Facebook will need to find a way to keep a much closer eye on its data. The situation is a bit of a companion piece to the warning the EU sent Elon Musk this week: following his acquisition of Twitter, he was reminded of the recently passed Digital Services Act, which imposes much tighter control over how big tech companies handle content moderation. The regulatory landscape for these companies outside the United States is changing, and in a big way. It's a situation the document's authors sum up plainly: "We are facing a tsunami of inbound regulations that all carry a great deal of uncertainty."