ByteDance reorganization offers glimpse into TikTok parent’s future

ByteDance has long been celebrated as an “app factory” for its proven model of churning out apps and monetizing them through a robust backend of shared resources, from engineering to marketing support. The result is a roster of household apps — Douyin and Toutiao in China and TikTok in the rest of the world.

Meanwhile, the firm has prided itself on its “flat” internal organization with a raft of self-governing products. As of September 2020, founder Zhang Yiming had 14 executives reporting to him, according to data compiled by The Information.

But as the firm continues to flourish, its stewards recognized that a structural shakeup was needed to fit its ballooning size. That change has come. ByteDance will group its apps and operations under six new “business units”, according to an internal document seen by TechCrunch Tuesday.

Notably, Shou Zi Chew, currently chief executive at TikTok, will no longer be ByteDance’s chief financial officer. Chew, a banking executive who previously worked at Xiaomi as CFO, joined ByteDance in March as its CFO and was made TikTok’s CEO in May.

At the time of his appointment as the new finance boss, speculation was rife that Chew was brought in to work on ByteDance’s initial public offering.

But the derailment of Ant Group’s planned IPO and, later, the regulatory overhaul of Didi have dimmed the prospects of Chinese internet firms seeking public listings. The other and perhaps more difficult question for ByteDance is: Which of the company’s units will be listed, and where?

Rubo Liang, a co-founder of ByteDance, has taken over from Zhang Yiming as the firm’s CEO.

The six newly minted business units are useful indicators of ByteDance’s strategic focuses in the foreseeable future. They are:

TikTok: This unit will manage the video-sharing app and any business spawned by it, such as the firm’s e-commerce operations outside China.

Douyin: The eponymous app is the Chinese version of TikTok and is now officially the name of a standalone business unit overseeing ByteDance’s lucrative ad-powered content businesses in China. Xigua, which features longer videos, and Toutiao, the firm’s popular news aggregator, will be folded under the unit.

Dali Education: Dali was created in 2020 as ByteDance’s foray into the online learning sector. It now oversees the firm’s vocational learning, education hardware (like a lamp that lets busy parents remotely keep their kids company during homework time), and campus learning initiatives.

Lark: A workplace collaboration suite, Lark is ByteDance’s attempt to combine Slack and G Suite in one product, and part of the company’s B2B bet.

BytePlus: This is essentially the infrastructure piece of ByteDance’s B2B endeavor. The unit sells AI and data tools to enterprise clients.

Nuverse: It’s ByteDance’s game development and publishing unit, which also manages titles intended for overseas markets. Gaming companies in China are increasingly seeking growth abroad amid regulatory uncertainties.

TikTok’s identity

The inception of a TikTok business unit is noteworthy. ByteDance has been distancing TikTok from the rest of its Chinese businesses ever since concerns rose in the West over the video app’s links to the Chinese government. TikTok says it stores all its data in the U.S. with backup servers in Singapore, rather than in Beijing, where its parent company is headquartered.

These measures haven’t been enough to allay U.S. regulators’ worries. In its first-ever congressional hearing in the United States, TikTok faced tough questions and was repeatedly asked to clarify its semantics.

Senator Ted Cruz pressed the firm to address whether Beijing ByteDance Technology is “part of” TikTok’s “corporate group,” a term used in TikTok’s privacy policy, which states that the app “may share all of the information we collect with a parent, subsidiary, or other affiliate of our corporate group.”

TikTok’s representative maintained that TikTok has “no affiliation” with Beijing ByteDance Technology, ByteDance’s Chinese entity in which the government took a stake and board seat this year. That wording did not satisfy the Senator.

Australia sues Facebook over Cambridge Analytica, fine could scale to $529BN

Australia’s privacy watchdog is suing Facebook over the Cambridge Analytica data breach — which, back in 2018, became a global scandal that wiped billions off the tech giant’s share price yet only led to Facebook picking up a $5BN FTC fine.

Should Australia prevail in its suit against the tech giant, the monetary penalty could be orders of magnitude larger.

Australia’s Privacy Act sets out a provision for a civil penalty of up to $1,700,000 to be levied per contravention — and the national watchdog believes there were 311,074 local Facebook users in the cache of ~86M profiles lifted by Cambridge Analytica. So the potential fine here is circa $529BN. (A very far cry from the £500k Facebook paid in the UK over the same data misuse scandal.)
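
For those checking the back-of-the-envelope math, here is a minimal sketch of that headline figure (assuming, as the watchdog’s framing implies, one contravention per affected user; figures in Australian dollars):

max_penalty_per_contravention = 1_700_000  # Privacy Act civil penalty ceiling
affected_users = 311_074                   # Australian users the OAIC says were caught up

# Theoretical maximum if every affected user counts as one contravention
theoretical_max = max_penalty_per_contravention * affected_users
print(f"${theoretical_max:,}")  # $528,825,800,000, i.e. circa $529BN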

In a statement published on its website today the Office of the Australian Information Commissioner (OAIC) says it has lodged proceedings against Facebook in a federal court alleging the company committed serious and/or repeated interferences with privacy.

The suit alleges the personal data of Australian Facebook users was disclosed to the This is Your Digital Life app for a purpose other than that for which it was collected — thereby breaching Australia’s Privacy Act 1988. It further claims the data was exposed to the risk of being disclosed to Cambridge Analytica and used for political profiling purposes, and passed to other third parties.

This is Your Digital Life was an app built by a developer called GSR, which was hired by Cambridge Analytica to obtain and process Facebook users’ data for political ad targeting purposes.

The events from which the suit stems took place on Facebook’s platform between March 2014 and May 2015, when user data was being siphoned off by GSR under contract with Cambridge Analytica — which worked with US political campaigns, including Ted Cruz’s presidential campaign and, later, that of now-president Donald Trump.

GSR was co-founded by two psychology researchers, Aleksandr Kogan and Joseph Chancellor. And in a still unexplained twist in the saga, Facebook hired Chancellor in about November 2015 — soon after some of its own staffers had warned internally about the “sketchy” business Cambridge Analytica was conducting on its ad platform. Chancellor never spoke to the press and subsequently departed Facebook as quietly and inexplicably as he arrived.

In a concise statement summing up its legal action against Facebook, the OAIC writes:

Facebook disclosed personal information of the Affected Australian Individuals. Most of those individuals did not install the “This is Your Digital Life” App; their Facebook friends did. Unless those individuals undertook a complex process of modifying their settings on Facebook, their personal information was disclosed by Facebook to the “This is Your Digital Life” App by default. Facebook did not adequately inform the Affected Australian Individuals of the manner in which their personal information would be disclosed, or that it could be disclosed to an app installed by a friend, but not installed by that individual.

Facebook failed to take reasonable steps to protect those individuals’ personal information from unauthorised disclosure. Facebook did not know the precise nature or extent of the personal information it disclosed to the “This is Your Digital Life” App. Nor did it prevent the app from disclosing to third parties the personal information obtained. The full extent of the information disclosed, and to whom it was disclosed, accordingly cannot be known. What is known, is that Facebook disclosed the Affected Australian Individuals’ personal information to the “This is Your Digital Life” App, whose developers sold personal information obtained using the app to the political consulting firm Cambridge Analytica, in breach of Facebook’s policies.

As a result, the Affected Australian Individuals’ personal information was exposed to the risk of disclosure, monetisation and use for political profiling purposes.

Commenting in a statement, Australia’s information commissioner and privacy commissioner, Angelene Falk, added: “All entities operating in Australia must be transparent and accountable in the way they handle personal information, in accordance with their obligations under Australian privacy law. We consider the design of the Facebook platform meant that users were unable to exercise reasonable choice and control about how their personal information was disclosed.

“Facebook’s default settings facilitated the disclosure of personal information, including sensitive information, at the expense of privacy. We claim these actions left the personal data of around 311,127 Australian Facebook users exposed to be sold and used for purposes including political profiling, well outside users’ expectations.”

Reached for comment, a Facebook spokesperson sent this statement:

We’ve actively engaged with the OAIC over the past two years as part of their investigation. We’ve made major changes to our platforms, in consultation with international regulators, to restrict the information available to app developers, implement new governance protocols and build industry-leading controls to help people protect and manage their data. We’re unable to comment further as this is now before the Federal Court.

Senators attempt to force Twitter to ban Iranian leadership

Four Senators, including Ted Cruz (R-TX), have asserted that, as a consequence of sanctions placed on Iran, Twitter must cease providing its services to Ayatollah Khamenei and other leaders in the country. “The Ayatollah enjoys zero protection from the United States Bill of Rights,” they wrote in a letter to the company.

Although the move comes as relations between Iran and the U.S. grow ever more strained following a series of violent incidents connected with the country, it is also clearly an attempt to exert government power over tech companies that have resisted the yoke of federal regulation.

In a letter (PDF) sent to Twitter, the U.S. Attorney for Northern California, and others, the Senators explained the rationale for their demand. The Obama administration created rules in 2014 that specifically made an exception to export rules, allowing free messaging and social media type services to be offered to Iranians. The idea was that, though Twitter and many other such apps are mostly banned in Iran, it could not hurt to offer tools for free expression and communication to its citizens.

But there are exceptions even to exceptions, and this is what Cruz et al. claim now apply to Twitter. Specifically, they say that following Trump’s executive order in June imposing additional sanctions on Iran, Khamenei and foreign minister Javad Zarif have lost the protection the law previously offered.

“All Americans — including you and Twitter — are prohibited from ‘the making of any contribution or provision of…goods[] or services’ to them,” the letter reads. “While the First Amendment protects the free speech rights of Americans… the Ayatollah and any American companies providing him assistance are entirely subject to U.S. sanctions laws.”

Not being an expert in import/export law myself, I can’t judge the merits of this argument, though on its face it seems sound. But it may not be a question of whether Twitter can or can’t “offer services” to persons blacklisted by the federal government. There is the possibility that Twitter choosing to offer the use of its platform to others is itself a protected act of Free Speech.

After all, the White House could just as easily have issued an E.O. blacklisting the leaders of the countries subject to the travel ban. Should that be a possibility? Is it the right of a U.S. company to extend its platform for free speech to anyone in the world, regardless of their legal status in the eyes of the government?

Senators Ted Cruz (R-TX), Tom Cotton (R-AR), Marsha Blackburn (R-TN) and Marco Rubio (R-FL) think otherwise. It is unlikely Twitter will simply let the matter rest. I have contacted the company for comment and will update this post if I hear back.

Facebook data misuse and voter manipulation back in the frame with latest Cambridge Analytica leaks

More details are emerging about the scale and scope of disgraced data company Cambridge Analytica’s activities in elections around the world — via a cache of internal documents that’s being released by former employee and self-styled whistleblower, Brittany Kaiser.

The now-defunct data modelling company, which infamously used stolen Facebook data to target voters for President Donald Trump’s campaign in the 2016 U.S. election, was at the center of the data misuse scandal that, in 2018, wiped billions off Facebook’s share price and contributed to a $5BN FTC fine for the tech giant last summer.

However, plenty of questions remain, including where, for whom and exactly how Cambridge Analytica and its parent entity, SCL Elections, operated — as well as how much Facebook’s leadership knew about the dealings of the firm that was using its platform to extract data and target political ads, helped by some of Facebook’s own staff.

Certain Facebook employees were referring to Cambridge Analytica as a “sketchy” company as far back as September 2015 — yet the tech giant only pulled the plug on platform access after the scandal went global in 2018.

Facebook CEO Mark Zuckerberg has also continued to maintain that he only personally learned about CA from a December 2015 Guardian article, which broke the story that Ted Cruz’s presidential campaign was using psychological data based on research covering tens of millions of Facebook users, harvested largely without permission. (It wasn’t until March 2018 that further investigative journalism blew the lid off the story — turning it into a global scandal.)

Former Cambridge Analytica business development director Kaiser, who had a central role in last year’s Netflix documentary about the data misuse scandal (The Great Hack), began her latest data dump late last week — publishing links to scores of previously unreleased internal documents via a Twitter account called @HindsightFiles. (At the time of writing Twitter has placed a temporary limit on viewing the account — citing “unusual activity”, presumably as a result of the volume of downloads it’s attracting.)

Since becoming part of the public CA story Kaiser has been campaigning for Facebook to grant users property rights over their data. She claims she’s releasing new documents from her former employer now because she’s concerned this year’s US election remains at risk of the same type of big-data-enabled voter manipulation that tainted the 2016 result.

“I’m very fearful about what is going to happen in the US election later this year, and I think one of the few ways of protecting ourselves is to get as much information out there as possible,” she told The Guardian.

“Democracies around the world are being auctioned to the highest bidder” is the tagline on the Twitter account Kaiser is using to distribute the previously unpublished documents — more than 100,000 of which are set to be released over the coming months, per the newspaper’s report.

The releases are being grouped by country — with documents to date covering Brazil, Kenya and Malaysia. There is also a themed release dealing with issues pertaining to Iran, and another covering CA/SCL’s work for Republican John Bolton’s Political Action Committee in the U.S.

The releases look set to underscore the global scale of CA/SCL’s social media-fuelled operations, with Kaiser writing that the previously unreleased emails, project plans, case studies and negotiations span at least 65 countries.

A spreadsheet of associate officers included in the current cache lists SCL associates in a large number of countries and regions including Australia, Argentina, the Balkans, India, Jordan, Lithuania, the Philippines, Switzerland and Turkey, among others. A second tab listing “potential” associates covers political and commercial contacts in various other places including Ukraine and even China.

A UK parliamentary committee which investigated online political campaigning and voter manipulation in 2018 — taking evidence from Kaiser and CA whistleblower Chris Wylie, among others — urged the government to audit the PR and strategic communications industry, warning in its final report how “easy it is for discredited companies to reinvent themselves and potentially use the same data and the same tactics to undermine governments, including in the UK”.

“Data analytics firms have played a key role in elections around the world. Strategic communications companies frequently run campaigns internationally, which are financed by less than transparent means and employ legally dubious methods,” the DCMS committee also concluded.

The committee’s final report highlighted election and referendum campaigns in around thirty countries that SCL Elections (and its myriad “associated companies”) had been involved in. But per Kaiser’s telling, its activities — and/or ambitions — appear to have been considerably broader, even global, in scope.

Documents released to date include a case study of work that CA was contracted to carry out in the U.S. for Bolton’s Super PAC — where it undertook what is described as “a personality-targeted digital advertising campaign with three interlocking goals: to persuade voters to elect Republican Senate candidates in Arkansas, North Carolina and New Hampshire; to elevate national security as an issue of importance and to increase public awareness of Ambassador Bolton’s Super PAC”.

Here CA writes that it segmented “persuadable and low-turnout voter populations to identify several key groups that could be influenced by Bolton Super PAC messaging”, targeting them with online and Direct TV ads — designed to “appeal directly to specific groups’ personality traits, priority issues and demographics”. 

Psychographic profiling — derived from CA’s modelling of Facebook user data — was used to segment U.S. voters into targetable groups, including for serving microtargeted online ads. The company badged voters with personality-specific labels such as “highly neurotic” — targeting individuals with customized content designed to prey on their fears and/or hopes based on its analysis of voters’ personality traits.
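
CA’s actual models have never been published, but the mechanics of trait-based segmentation described above are simple to illustrate. Here is a deliberately toy sketch in Python; every voter, score, label and threshold is hypothetical:

# Toy illustration of personality-based segmentation; not CA's code.
# Trait scores would come from a psychographic model built on profile data.
voters = [
    {"id": 1, "neuroticism": 0.91, "turnout_likelihood": 0.20},
    {"id": 2, "neuroticism": 0.35, "turnout_likelihood": 0.85},
    {"id": 3, "neuroticism": 0.78, "turnout_likelihood": 0.40},
]

def segment(voter):
    # Badge the voter with a personality label, then flag whether they
    # fall into the "persuadable / low-turnout" population.
    label = "highly neurotic" if voter["neuroticism"] > 0.7 else "emotionally stable"
    persuadable = voter["turnout_likelihood"] < 0.5
    return label, persuadable

for v in voters:
    label, persuadable = segment(v)
    if persuadable:
        # In the playbook described above, this is where a fear- or
        # hope-themed ad matched to the label would be served.
        print(f"voter {v['id']}: persuadable, badge '{label}'")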

The process of segmenting voters by personality and sentiment was made commercially possible by access to identity-linked personal data — which puts Facebook’s population-scale collation of identities and individual-level personal data squarely in the frame.

It was a cache of tens of millions of Facebook profiles, along with responses to a personality quiz app linked to Facebook accounts, which was sold to Cambridge Analytica in 2014, by a company called GSR, and used to underpin its psychographic profiling of U.S. voters.

In evidence to the DCMS committee last year GSR’s co-founder, Aleksandr Kogan, argued that Facebook did not have a “valid” developer policy at the time, since he said the company did nothing to enforce the stated T&Cs — meaning users’ data was wide open to misappropriation and exploitation.

The UK’s data protection watchdog also took a dim view. In 2018 it issued Facebook with the maximum fine possible, under relevant national law, for the CA data breach — and warned in a report that democracy is under threat. The country’s information commissioner also called for an “ethical pause” of the use of online microtargeting ad tools for political campaigning.

No such pause has taken place.

Meanwhile, since the Cambridge Analytica scandal snowballed into global condemnation of its business, Facebook has made loud claims to be ‘locking down’ its platform — including saying it would conduct an app audit and “investigate all apps that had access to large amounts of information”; “conduct a full audit of any app with suspicious activity”; and “ban any developer from our platform that does not agree to a thorough audit”.

However, close to two years later, there’s still no final report from the company on the upshot of this self-“audit”.

And while Facebook was slapped with a headline-grabbing FTC fine on home soil, there was in fact no proper investigation; no requirement for it to change its privacy-hostile practices; and blanket immunity for top execs — even for any unknown data violations in the 2012 to 2018 period. So, um…

In another highly curious detail, GSR’s other co-founder, a data scientist called Joseph Chancellor, was in fact hired by Facebook in late 2015. The tech giant has never satisfactorily explained how it came to recruit one of the two individuals at the center of a voter manipulation data misuse scandal which continues to wreak hefty reputational damage on Zuckerberg and his platform. But being able to ensure Chancellor was kept away from the press during a period of intense scrutiny looks pretty convenient.

Last fall, the GSR co-founder was reported to have left Facebook — as quietly, and with as little explanation given, as when he arrived on the tech giant’s payroll.

So Kaiser seems quite right to be concerned that the data industrial complex will do anything to keep its secrets — given it’s designed and engineered to sell access to yours. Even as she has her own reasons to want to keep the story in the media spotlight.

Platforms whose profiteering purpose is to track and target people at global scale — which function by leveraging an asymmetrical ‘attention economy’ — have zero incentive to change or have change imposed upon them. Not when the propaganda-as-a-service business remains in such high demand, whether for selling actual things like bars of soap, or for hawking ideas with a far darker purpose.

Apple’s China stance makes for strange political alliances, as AOC and Ted Cruz slam the company

In a rare instance of bipartisanship overcoming the rancorous discord that’s been the hallmark of the U.S. Congress, senators and representatives issued a scathing rebuke to Apple for its decision to take down an app at the request of the Chinese government.

Signed by Senators Ron Wyden, Tom Cotton, Marco Rubio, Ted Cruz, and Congressional Representatives Alexandria Ocasio-Cortez, Mike Gallagher and Tom Malinowski, the letter was written to “express… strong concern about Apple’s censorship of apps, including a prominent app used by protestors in Hong Kong, at the request of the Chinese government.”

In 2019, it seems the only things that can unite America’s clashing political factions are the decisions made by companies in one of its most powerful industries.

At the heart of the dispute is Apple’s decision to take down an app called HKMaps that was being used by citizens of the territory to track police activity.

For several months protestors have been clashing with police in the tiny territory over what they see as the undue influence being exerted by China’s government in Beijing over the governance of Hong Kong. Citizens of the former British colony have enjoyed special privileges and rights not afforded to mainland Chinese citizens since the United Kingdom returned sovereignty over the region to China on July 1, 1997.

“Apple’s decision last week to accommodate the Chinese government by taking down HKMaps is deeply concerning,” the authors of the letter wrote. “We urge you in the strongest terms to reverse course, to demonstrate that Apple puts values above market access, and to stand with the brave men and women fighting for basic rights and dignity in Hong Kong.”

Apple has long positioned itself as a defender of human rights (including privacy and free speech)… in the United States. Abroad, the company’s record is not quite as spotless, especially when it comes to pressure from China, which is one of the company’s largest markets outside of the U.S.

Back in 2017, Apple capitulated to a request from the Chinese government that it remove all virtual private networking apps from the App Store. Those applications allowed Chinese users to circumvent the “Great Firewall” of China, which limits access to information to only that which is approved by the Chinese government and its censors.

Over 1,100 applications have been taken down by Apple at the request of the Chinese government, according to the organization GreatFire (whose data was cited in the Congressional letter). They include VPNs, and applications made for oppressed communities inside China’s borders (like Uighurs and Tibetans).

Apple isn’t the only company that’s come under fire from the Chinese government as part of its overall response to the unrest in Hong Kong. The National Basketball Association and the gaming company Blizzard have had their own run-ins, resulting in self-censorship, after public positions taken by employees or individuals affiliated with the sports franchises or gaming communities these companies represent.

However, Apple is the largest of these companies, and therefore the biggest target. The company’s stance indicates a willingness to accede to pressure in markets that it considers strategically important no matter how it positions itself at home.

The question is what will happen should regulators in the U.S. stop writing letters and start making legislative demands of their own.

Facebook really doesn’t want you to read these emails

Oh hey y’all, it’s Friday! It’s August! Which means it’s a great day for Facebook to drop a little news it would prefer you don’t notice. News that you won’t find a link to on the homepage of Facebook’s Newsroom — which is replete with colorfully illustrated items it does want you to read (like the puffed up claim that “Now You Can See and Control the Data That Apps and Websites Share With Facebook”.)

The blog post Facebook would really prefer you didn’t notice is tucked away in a News sub-section of this website — where it’s been confusingly entitled: Document Holds the Potential for Confusion. And has an unenticing grey image of a document icon to further put you off — just in case you happened to stumble on it after all. It’s almost as if Facebook is saying ‘definitely don’t click here’…

So what is Facebook trying to bury in the horse latitudes of summer?

An internal email chain, starting September 2015, which shows a glimpse of what Facebook’s own staff knew about the activity of Cambridge Analytica prior to The Guardian’s December 2015 scoop — when the newspaper broke the story that the controversial (and now defunct) data analytics firm, then working for Ted Cruz’s presidential campaign, had harvested data on millions of Facebook users without their knowledge and/or consent, and was using psychological insights gleaned from the data to target voters.

Facebook founder Mark Zuckerberg’s official timeline of events about what he knew when vis-à-vis the Cambridge Analytica story has always been that his knowledge of the matter dates to December 2015 — when the Guardian published its story.

But the email thread Facebook is now releasing shows internal concerns being raised almost two months earlier.

This chimes with previous (more partial) releases of internal correspondence pertaining to Cambridge Analytica — which have also come out as a result of legal actions (and which we’ve reported on previously here and here).

If you click to download the latest release, which Facebook suggests it ‘agreed’ with the District of Columbia Attorney General to “jointly make public”, you’ll find a redacted thread of emails in which Facebook staffers raise a number of platform policy violation concerns related to the “political partner space”, writing September 29, 2015, that “many companies seem to be on the edge- possibly over”.

Cambridge Analytica is first identified by name — when it’s described by a Facebook employee as “a sketchy (to say the least) data modelling company that has penetrated our market deeply” — on September 22, 2015, per this email thread. It is one of many companies the staffer writes are suspected of scraping user data — but is also described as “the largest and most aggressive on the conservative side”.

On September 30, 2015, a Facebook staffer responds to this, asking for App IDs and app names for the apps engaging in scraping user data — before writing: “My hunch is that these apps’ data-scraping is likely non-compliant”.

“It would be very difficult to engage in data-scraping activity as you described while still being compliant with FPPs [Facebook Platform Policies],” this person adds.

Cambridge Analytica gets another direct mention (“the Cambridge app”) on the same day. A different Facebook staffer then chips in with a view that “it’s very likely these companies are not in violation of any of our terms” — before asking for “concrete examples” and warning against calling them to ask questions unless “red flags” have been confirmed.

On October 13, a Facebook employee chips back into the thread with the view that “there are likely a few data policy violations here”.

The email thread goes on to discuss concerns related to additional political partners and agencies using Facebook’s platform at that point, including ForAmerica, Creative Response Concepts, NationBuilder and Strategic Media 21. Which perhaps explains Facebook’s lack of focus on CA — if potentially “sketchy” political activity was apparently widespread.

On December 11 another Facebook staffer writes to ask for an expedited review of Cambridge Analytica — saying it’s “unfortunately… now a PR issue”, i.e. as a result of the Guardian publishing its article.

The same day a Facebook employee emails to say Cambridge Analytica “is hi pri at this point”, adding: “We need to sort this out ASAP” — a month and a half after the initial concern was raised.

Also on December 11 a staffer writes that they had not heard of GSR, the Cambridge-based developer CA hired to extract Facebook user data, before the Guardian article named it. But other Facebook staffers chip in to reveal personal knowledge of the psychographic profiling techniques deployed by Cambridge Analytica and GSR’s Dr Aleksandr Kogan, with one writing that Kogan was their postdoc supervisor at Cambridge University.

Another says they are friends with Michal Kosinski, the lead author of a personality modelling paper that underpins the technique used by CA to try to manipulate voters — which they described as “solid science”.

A different staffer also flags the possibility that Facebook has worked with Kogan — ironically enough “on research on the Protect & Care team” — citing the “Wait, What thread” and another email, neither of which appear to have been released by Facebook in this ‘Exhibit 1’ bundle.

So we can only speculate on whether Facebook’s decision — around November 2015 — to hire Kogan’s GSR co-founder, Joseph Chancellor, appears as a discussion item in the ‘Wait, What’ thread…

Putting its own spin on the release of these internal emails in a blog post, Facebook sticks to its prior line that “unconfirmed reports of scraping” and “policy violations by Aleksandr Kogan” are two separate issues, writing:

We believe this document has the potential to confuse two different events surrounding our knowledge of Cambridge Analytica. There is no substantively new information in this document and the issues have been previously reported. As we have said many times, including last week to a British parliamentary committee, these are two distinct issues. One involved unconfirmed reports of scraping — accessing or collecting public data from our products using automated means — and the other involved policy violations by Aleksandr Kogan, an app developer who sold user data to Cambridge Analytica. This document proves the issues are separate; conflating them has the potential to mislead people.

It has previously also referred to the internal concerns raised about CA as “rumors”.

“Facebook was not aware that Kogan sold data to Cambridge Analytica until December 2015. That is a fact that we have testified to under oath, that we have described to our core regulators, and that we stand by today,” it adds now.

It also claims that an engineer who responded to the scraping concerns and looked into them was not able to find any such evidence. “Even if such a report had been confirmed, such incidents would not naturally indicate the scale of the misconduct that Kogan had engaged in,” Facebook adds.

The company has sought to dismiss the privacy litigation brought against it by the District of Columbia over the Cambridge Analytica scandal — but has been unsuccessful in derailing the case thus far.

The DC complaint alleges that Facebook allowed third-party developers to access consumers’ personal data, including information on their online behavior, in order to offer apps on its platform, and that it failed to effectively oversee and enforce its platform policies by not taking reasonable steps to protect consumer data and privacy. It also alleges Facebook failed to inform users of the CA breach.

The suit, filed in Washington, DC by Attorney General Karl Racine, has also alleged lax oversight and misleading privacy standards on Facebook’s part.

‘The Great Hack’: Netflix doc unpacks Cambridge Analytica, Trump, Brexit and democracy’s death

It’s perhaps not for nothing that The Great Hack – the new Netflix documentary about the connections between Cambridge Analytica, the US election and Brexit, out on July 23 – opens with a scene from Burning Man. There, Brittany Kaiser, a former employee of Cambridge Analytica, scrawls the name of the company onto a strut of ‘the temple’ that will eventually get burned in that fiery annual ritual. It’s an apt opening.

There are probably many of us who’d wish quite a lot of the last couple of years could be thrown into that temple fire, but this documentary is the first I’ve seen to expertly unpick the real-world dumpster fire in which social media, dark advertising and global politics have become inextricably, and often fatally, combined.

The documentary is also the first that you could plausibly recommend to those of your relatives and friends who don’t work in tech, as it explains how social media – specifically Facebook – is now manipulating our lives and society, whether we like it or not.

As New York Professor David Carroll puts it at the beginning, Facebook gives “any buyer direct access to my emotional pulse” – and that included political campaigns during the Brexit referendum and the Trump election. Privacy campaigner Carroll is pivotal to the film’s story of how our data is being manipulated and essentially kept from us by Facebook.

The UK’s referendum decision to leave the European Union, in fact, became “the petri dish” for a Cambridge Analytica experiment, says Guardian journalist Carole Cadwalladr. She broke the story of how the political consultancy, led by Eton-educated CEO Alexander Nix, applied techniques normally used by ‘psyops’ operatives in Afghanistan to the democratic operations of the US and UK, and many other countries, over a chilling 20+ year history. Watching this film, you start to wonder if history has been warped towards a sickening dystopia.

The petri-dish of Brexit worked. Millions of adverts, explains the documentary, targeted individuals, exploiting fear and anger to switch them from ‘persuadables’, as CA called them, into passionate advocates — first for Brexit in the UK, and then for Trump later on.

Switching to the US, the filmmakers show how CA worked directly with Trump’s “Project Alamo” campaign, spending a million dollars a day on Facebook ads ahead of the 2016 election.

The film expertly explains the timeline of how CA had first worked on Ted Cruz’s campaign, and nearly propelled that lackluster candidate into first place in the Republican nominations. It was then that the Trump campaign picked up on CA’s military-like operation.

Having loaded up the psychographic survey information CA had obtained from Aleksandr Kogan, the Cambridge University academic who orchestrated the harvesting of Facebook data, the firm found the world had become its oyster. Or, perhaps more accurately, its oyster farm.

Back in London, Cadwalladr notices triumphant Brexit campaigners fraternizing with Trump and starts digging. There is a thread connecting them to Breitbart owner Steve Bannon. There is a thread connecting them to Cambridge Analytica. She tugs on those threads and, like that iconic scene in ‘The Hurt Locker’ where all the threads pull up unexploded mines, she starts to realize that Cambridge Analytica links them all. She needs a source, though. That came in the form of former employee Chris Wylie, a brave young man who was able to unravel many of the CA threads.

But the film’s attention is often drawn back to Kaiser, who had worked first on US political campaigns and then on Brexit for CA. She had been drawn to the company by smooth-talking CEO Nix, who begged: “Let me get you drunk and steal all of your secrets.”

But was she a real whistleblower? Or was she trying to cover her tracks? How could someone who’d worked on the Obama campaign switch to Trump? Was she a victim of Cambridge Analytica, or one of its villains?

British political analyst Paul Hilder manages to get her to come to the UK to testify before a parliamentary inquiry. There is high drama as her part in the story unfolds.

Kaiser appears in various guises, ranging from idealistically naive to stupid, from knowing to manipulative. It’s almost impossible to know which. But hearing about her revelation as to why she made the choices she did… well, it’s an eye-opener.

Both she and Wylie have complex stories in this tale, where not everything is as it seems — reflecting our new world, where truth is increasingly hard to determine.

Other characters come and go in this story. Zuckerberg makes an appearance in Congress, and we learn of Facebook’s casual attitude to its complicity in these political earthquakes. Although if you’re reading TechCrunch, then you will probably know at least part of this story.

The film was created for Netflix by Jehane Noujaim and Karim Amer, the Egyptian-American filmmakers behind “The Square”, about the Egyptian revolution of 2011. To them, the way Cambridge Analytica applied its methods to online campaigning was just as much a revolution as Egyptians toppling a dictator from Cairo’s iconic Tahrir Square.

For them, the huge irony is that ‘psyops’ — psychological operations used on Muslim populations in Iraq and Afghanistan after the 9/11 terrorist attacks — ended up being used to influence Western elections.

Cadwalladr stands head and shoulders above all as a bastion of dogged journalism, even as she is attacked from all quarters, and still is to this day.

What you won’t find out from this film is what happens next. For many, questions remain on the table: What will happen now that Facebook is entering cryptocurrency? Will that mean it could be used for dark election campaigning? Will people be paid for their votes next time, not just in Likes? Kaiser has a bitcoin logo on the back of her phone. Is that connected? The film doesn’t comment.

But it certainly unfolds like a slow-motion car crash, where democracy is the car and you’re inside it.

Democrats draw up bill that would require tech platforms to assess algorithmic bias

Democratic lawmakers have proposed a bill to address the algorithmic biases lurking under the surface of tech’s biggest platforms. The bill, known as the Algorithmic Accountability Act, was introduced by Senators Ron Wyden (D-OR) and Cory Booker (D-NJ); Representative Yvette Clarke (D-NY) will sponsor parallel legislation in the House.

The bill is well timed. Over the last month alone, Facebook found itself settling over discriminatory practices that affected job ads as well as drawing civil charges from the Department of Housing and Urban Development over similar issues with its housing ad targeting tools. The present bill targets companies that make more than $50 million a year, though any company holding data on more than one million users would be subject to its requirements.

Like yesterday’s proposed Senate bill addressing dark pattern design, the Algorithmic Accountability Act (PDF) routes its regulatory specifics through the Federal Trade Commission. Under the bill, the FTC could require companies to perform “impact assessments” on their own algorithmic decision-making systems. Those assessments would evaluate potential consequences for “accuracy, fairness, bias, discrimination, privacy and security” within automated systems, and companies would be required to correct any issues they uncovered during the process.
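
The bill leaves the mechanics of an “impact assessment” for the FTC to define. As a flavor of what one ingredient might look like, here is a minimal sketch of a common group-fairness check, demographic parity difference; the data, metric choice and tolerance are illustrative assumptions, not language from the bill:

# Sketch of one fairness check an impact assessment might include:
# the gap in positive-outcome rates between two demographic groups.
decisions = [
    # (group, automated decision: 1 = approved, 0 = denied)
    ("A", 1), ("A", 1), ("A", 0), ("A", 1),
    ("B", 0), ("B", 1), ("B", 0), ("B", 0),
]

def positive_rate(group):
    outcomes = [d for g, d in decisions if g == group]
    return sum(outcomes) / len(outcomes)

gap = abs(positive_rate("A") - positive_rate("B"))
print(f"demographic parity difference: {gap:.2f}")  # 0.50 on this toy data

if gap > 0.2:  # illustrative tolerance, not a statutory figure
    print("flag system for review and correction")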

In a statement on the proposed legislation, Booker denounced discriminatory tech practices that lead to “houses that you never know are for sale, job opportunities that never present themselves, and financing that you never become aware of.”

“This bill requires companies to regularly evaluate their tools for accuracy, fairness, bias, and discrimination,” Booker said.

Bias on tech’s major platforms is a hot topic right now, though the political parties are approaching the issue from very different vantage points. Just today, the Senate Judiciary Subcommittee on the Constitution held a hearing chaired by Senator Ted Cruz, who led Republicans in repeating recent unsubstantiated allegations that Facebook and Twitter disproportionately punish users on the right.

Democrats, for their part, have been more interested in the off-platform implications of algorithmic bias.

“By requiring large companies to not turn a blind eye towards unintended impacts of their automated systems, the Algorithmic Accountability Act ensures 21st Century technologies are tools of empowerment, rather than marginalization, while also bolstering the security and privacy of all consumers,” Sen. Clarke said.