The media carefully analyzed the words of Mark Zuckerberg, CEO of Facebook, in Washington last week. Over two days of marathon sessions, he was grilled by lawmakers feigning shock that the social media giant had been inappropriately using the data of Facebook users.

While many Facebook users outside the media, advertising and technology industries may be legitimately shocked and concerned about how their personal information was used and manipulated by Cambridge Analytica, the surprise expressed by sophisticated journalists and lawmakers is hard to believe; the scrutiny itself is long overdue. While many have touted what a fantastic job Zuckerberg did in answering lawmakers' questions, there were glaring holes in his testimony. He represented his company well, but the substance missing underneath has paved the way for future privacy-protection laws that could have implications for boards and c-suite executives across industries.

While not under oath, Zuckerberg, clearly rehearsed and briefed by lawyers, repeatedly told both the House and Senate committees that users have control over their data and that Facebook doesn't sell data. But, to quote Zuckerberg's beloved Star Wars, that's only true "from a certain point of view."

Technically, users can control who sees their photos or messages with those "in line" controls he repeatedly referenced, but that doesn't mean Facebook isn't tracking and using that data for its own purposes (that is, to make money). Additionally, when it comes to consenting to the use of data by Facebook and other apps (like the one that served as a conduit to Cambridge Analytica), the reality is that users must opt out rather than opt in. Most default settings allow for use, and where consent is required, the pop-up box makes it much easier to click yes than no. Turning all those settings off takes time and diligence that the average person simply won't invest.

The unspoken reality is that Facebook is a multi-billion-dollar global organization using a mind trick on the young, naive and weak-minded. Even Zuckerberg stated that if you turn off advertising you won't get as good an experience, which is a euphemism for being manipulated. And he feebly attempted to sell the theory that those who merely lurk don't get a good experience, while those who actively engage in the social platform by sharing and posting are happier.

We know, of course, that plenty of studies fully dispute this theory and document the potential harms, particularly to the young, that overuse of social media can cause. The constant comparison of curated and filtered photos and the perceived "better life" syndrome are pervasive among social media users. It's a 24-7 personal gossip channel that truly preys upon the weak-minded. And, of course, what was discussed only on the surface was the ease of manipulating people through divisive ads and posts by the Russians (or by companies selling products) on this "free" platform where manipulation is baked into the "experience."

What concerned me most in watching the hearings was that Zuckerberg masterfully evaded the tough questions by pivoting to rehearsed answers about his lofty aspirations for the social media behemoth to bring the world closer together through its tools. Some of the difficult questions and observations were simply never addressed. For example, despite his statements that users can leave anytime, Facebook goes to great lengths to discourage you from leaving and to lure you back if you don't log in for a period of time, using provocative emails about your "friends" to encourage you to return. How is this predatory practice not manipulating people with the very data Facebook collects, by instilling a fear of missing out (often referred to as FOMO)? How can he possibly sit there and say they don't sell data when they use it to sell advertising and to manipulate users into spending more time on the platform and adding more friends, so Facebook can sell still more advertising to them? Even Sean Parker, the company's founding president, said last fall that "Facebook founders purposefully created something addictive . . . with each like and comment, Facebook is exploiting human psychology on purpose to keep users hooked on a social validation feedback loop."

What is even more irresponsible is that Facebook launches services like Facebook Live with its "move fast and break things" approach, but when things get broken, it doesn't fix them or take them down. There is one headline after another about young people killing themselves on Facebook Live and about people committing violent acts against others on the service. While Facebook's policies require that this content be taken down (something Zuckerberg said repeatedly), these images are often seen by millions before they are removed; by then the damage is done. If Facebook can't get such content taken down faster, it should pull the service until it can manage it properly.

Here's the next big question that no one really pushed: if you make your money by selling advertising, how are you not a media company? Zuckerberg clearly said in his testimony that Facebook's business model is selling ads. If NBC or CBS allowed a video of a violent suicide or a crime to be posted and didn't take it down immediately, they would be subject to potential fines and regulation by the FTC and/or FCC. If they ran manipulative ads placed by a foreign government during a hotly contested election, they would be held accountable. If they ran false advertising, they could be subject to investigation and potential penalties. The time has come for Facebook, Twitter, Snap and all of these companies to be held accountable for what they are: media companies with the data and power to manipulate our citizens.

And these are just a few of the issues. Some in Congress used their four or five minutes to question him about ads selling illegal drugs, posts about violence or terrorism seen by millions before being taken down, the exclusion of certain users from housing or mortgage advertising in violation of discrimination laws, and the reality that Facebook has simply been an order taker for Russian political ads without seemingly knowing it. All of this supports the conclusion that Facebook can't manage this on its own; it needs a governing authority to force it to. Zuckerberg himself suggested it might be time for regulation, which raises the question of why he can't get this done on his own and bolsters the reality that we have reached a tipping point.

It's not going to get any easier in the future. However technology evolves through the internet of things, artificial intelligence, blockchain and so many other "smart" technologies, the human factor is never removed. Manipulating behavior to sell advertising is built into all of these technology companies' revenue models, and that is really what consumers need protection from: themselves. Even with informed, meaningful consent and opting in, consumers will likely still choose the better experience over protecting their privacy. Some protection is still needed, though, if only to ensure that these tech companies play by the same rules as other media companies.

There has been an uneven playing field for a very long time, and these new tech companies have profited from it. Would Uber have become the "fastest growing company in the world" if it had to treat drivers as employees, pay taxes the same way or obtain local licenses for driving people around? Probably not, yet it claims to be a technology company, not a transportation company. Would Facebook have become what it is if it had to carefully disclose upfront to each and every person how it was using and monetizing their privacy, and to manage advertising like every other broadcaster? Perhaps, but likely not as fast or at such scale as it did while seemingly able to operate without any scrutiny.

As for Facebook, it likely violated the 2011 FTC consent decree by allowing a third party to sell data to Cambridge Analytica (whether knowingly or merely negligently). Facebook has an army of lawyers and lobbyists, so a settlement will likely be reached before the rest of us even know what really happened. So after all of this, on the cusp of GDPR taking effect in Europe next month, what should you, in the boardroom or c-suite, be doing?

Some technology businesses, such as Craigslist, have already been affected by the new Fight Online Sex Trafficking Act with regard to personal ads online. Many states already have laws requiring notice to consumers if their data is sold or breached (laws Facebook likely violated). Federal legislation of some kind, similar to GDPR, will likely be introduced this year. If you sell ads or products online, it's time to think about your future business model, because your operating costs are going to increase. For every other company, there are still some important takeaways:

  • Every company is tracking data about businesses or individuals across the digital landscape. You need to understand what data you have, how permission to obtain it is granted, and how you manage it, manipulate it, make money from it (directly or indirectly) and safeguard it. Data mapping and Digital Mapping® are important exercises for establishing at least a baseline of where compliance begins for you under existing and future laws.
  • The pending cybersecurity disclosure act, consent act, cybersecurity for medical devices act and other bills would impose hefty penalties for failing to protect consumer privacy and failing to notify individuals in the event of a breach. These bills would also hold boards accountable for having the right expertise to manage these burgeoning issues (either by hiring someone or by adding someone to the board). Regulation with even heftier penalties is on the horizon; don't wait for the laws to be enacted to start planning. It will take twice as long as you think to build and implement a compliance program that satisfies the expectations. Multiple layers of compliance will be required, with clear checks and balances. Much as the Enron scandal resulted in separating accounting and audit functions, the same is likely coming here: you can't have your own people auditing your readiness for cybersecurity and privacy breaches. Independent and objective analysis will be the future standard.
  • There is a deep intersection between privacy protection and cybersecurity that needs to be understood and carefully managed. It's not enough to name one person Chief Information Security Officer and another Chief Privacy Officer. The board needs to understand how these two important functions fit into a holistic strategy and how they work with the Chief Legal Officer and the Chief Marketing or Product Officer, so that checks and balances are in place to ensure no important issue is missed. For example, how did Equifax not have an open-source compliance program, which would have prevented its breach? Was that the role of the law department, software development, security or privacy? How did such a simple fix fall through all of those cracks, and how do you ensure that doesn't happen to you? All of these functions need to work together and hold each other accountable so you don't find yourself in a precarious situation.
  • Ever since the Johnson & Johnson Tylenol crisis of the 1980s, the golden rule of public relations has been to say you're sorry and take responsibility. But simply saying you're sorry when you have failed to act responsibly won't cover it forever. We are nearly at the point where even the best crisis communication strategy may not be able to undo a failure to act responsibly.

If you’d like more information about Digital Mapping® and how to assess your potential exposure under current and pending cybersecurity and privacy laws, contact Jen Wolfe for a complimentary and confidential conversation at