As we enter Industry 4.0, where technologies and organizations are more interconnected than ever, the need for accountability is unavoidable.
As in the recent Equifax case, Facebook finds itself at the mercy of critics calling for more transparency and accountability. The latest information about Russian-backed ads during the 2016 U.S. election revives many already pressing questions.
Could this moral quandary have an AI solution?
Facebook Missteps Stack Up
Facebook has a history of somewhat bloated metrics and a dubious transparency policy. Some went so far as to call 2016 the year that Facebook became “the bad guy.”
We won’t go that far, but as we move toward the realization of artificial intelligence, autonomous cars, and other advancements, businesses have to evolve, too.
But for Facebook, where does that evolution lead? After all, it’s been described as a “social utility” more than a way to connect with people (despite what the ads say). The company has access to millions of people, mountains of personal information, and media.
As a company, Facebook can manipulate that information. It can now embed ads in its Messenger app if it wants to.
Bigger questions loom beyond Facebook’s potential moral obligations to its users. As a result of the proliferation of tech and social media, vigilante justice is no longer confined to the pages of comic books or darkly lit streets.
So one has to wonder, is our Industry 4.0 also the “age of accountability”?
Facebook: A Third Party Platform for Presidential Election Espionage?
There are reports that Facebook posts, shares, and ads may well have affected the outcome of the 2016 presidential election. We don’t need to mention the buzzwords surrounding the subject. Well, maybe just for funsies.
#fakenews
#fakebook
#infowars
#splitlevelABtesting
#like&subscribe
#thisjokeisunique
#IdidNOTstealthisjokefromJohnOliver
#feminism
So what’s the deal with these Russian targeted ads?
Essentially, Facebook allowed Russian government-backed groups to target users with ads. That might seem innocuous, but it occurred during the 2016 presidential election. According to Facebook Chief Security Officer Alex Stamos, the ads didn’t reference the election directly. Instead, their goal was to amplify existing prejudices and fan the already burning flames of hot-button issues.
The Russian-backed ads targeted ideological flashpoints such as Second Amendment rights, LGBT community concerns, and race-related topics. One-quarter of these ads involved specific geo-targeting, as well. Stamos goes on to mention authentic activity, machine learning, and reducing “clickbait” headlines.
Deep learning and artificial intelligence could be a fantastic solution as we move into 2018. But we mustn’t forget what happened the last time Facebook tried to integrate AI into its processes.
Accountability, Capitalism, & You
No need for a tarot card reading in this case. This is bad for Facebook and Mark Zuckerberg. Despite the company having turned over nearly 3,000 of the influential Russian ads, critics have urged further scrutiny.
Trevor Potter, former chairman of the Federal Election Commission, heads the nonpartisan election-law group Campaign Legal Center. He requested that the social media giant release the ads to the public. The company declined, citing the ongoing investigations as the reason for its secrecy.
As raised in this Phys.org article, it does seem odd that a company whose empire sits atop other people’s personal information doesn’t want to share information of this nature. A fair point. However, Facebook now has serious obligations to its stakeholders and even to the investigation into U.S. President Donald Trump led by Special Counsel Robert Mueller.
Democratized Digital Justice

This reticence toward increased transparency and accountability could be remedied by a few low-tech solutions: increased user responsibility, for instance, or another bottom-up rather than top-down approach. Regardless of the solution, the question remains: what is Facebook’s culpability?
If someone targets a small audience with a politically charged ad on Facebook, is Facebook responsible for the fallout of that ad? Does it only matter if that fallout is revealed after the fact?
You can see that we have an “if a tree falls in the forest” scenario on our hands, one that only opens up more questions.