This week’s Trust Insights welcomes our Trust Council members who joined us in addressing the following question:
Is the Apple/Google Contact Tracing Plan Worthy of our Trust?
by Barbara Brooks Kimmel, Founder, Trust Across America-Trust Around the World
During these trying times, Apple and Google claim to have temporarily placed their corporate competitiveness on hold to begin collaborating on at least one very large data project. It’s called contact tracing, “the process of tracking down the people with whom infected patients have interacted, and making sure they get tested or go into quarantine,” according to this recent NPR article. The Apple/Google “alliance” will expand the reach of existing contact tracing capabilities. This initiative has raised many questions and more than a few eyebrows, not only among our trust and ethics subject matter expert community, but also among the general public, and for good reasons. For example (a simplified sketch of the broadcast-and-match approach behind the plan appears after these questions):
- Why should the public now trust the tech giants with their data when these companies have not proven themselves trustworthy in the past?
- Should all trust concerns be set aside in the interest of global health?
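For readers unfamiliar with how such a system can work, here is a deliberately simplified, hypothetical sketch of the broadcast-and-match idea behind decentralized Bluetooth contact tracing. It is not the actual Apple/Google Exposure Notification specification (which uses HKDF and AES); the function names and parameters below are invented purely for illustration.

```python
import hashlib
import hmac
import secrets

# Toy sketch of decentralized Bluetooth proximity tracing.
# NOT the actual Apple/Google Exposure Notification spec; it only
# illustrates the general idea: phones broadcast rotating identifiers
# derived from a secret daily key, and matching against diagnosed
# users' published keys happens on the device itself.

def daily_key() -> bytes:
    """A fresh secret key generated on the phone each day."""
    return secrets.token_bytes(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived identifier to broadcast over Bluetooth LE.
    Rotating the identifier (e.g. every 10-15 minutes) makes individual
    broadcasts hard to link back to one person."""
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A broadcasts rolling IDs; phone B records the ones it hears nearby.
key_a = daily_key()
heard_by_b = {rolling_id(key_a, i) for i in (10, 11, 12)}  # a brief encounter

# If A later tests positive, A's daily key is published. B re-derives all
# possible identifiers locally and checks for overlap -- no central server
# ever learns who met whom.
published_keys = [key_a]
exposed = any(
    rolling_id(k, i) in heard_by_b
    for k in published_keys
    for i in range(96)  # 96 fifteen-minute intervals in a day
)
print("Possible exposure detected:", exposed)
```

The privacy argument for the real design is the same as in this toy version: only rotating, anonymized identifiers leave the phone, and the matching happens on the device rather than on a central server.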
Who better to ask than Trust Across America’s Trust Council? Our council comprises senior members of our Trust Alliance who are among the world’s leading trust subject matter experts.
What we already know about trusting the tech giants
Bart Alexander shared a quick retrospective on the state of tech’s visibility into our private lives:
Providers such as Apple and Google already have comprehensive information about our location. Even with location services (GPS) off, they have visibility into the relative strength of every Wi-Fi signal and cell signal. From years of data collection, including through Google’s fleet of Street View cars, they can correlate that triangulated signal data with GPS coordinates. With other databases, they can determine whether we are at home, at a shop, or even at a medical facility. Google recently reached a $13 million settlement over its Street View cars’ collection of MAC addresses going back a decade. This kind of information is used for targeted marketing to the public. Adding a permission-based app that supplements this with Bluetooth technology is a rather minor addition to the existing privacy concerns, and at least it has a public health purpose.
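To make that mechanism concrete, here is a minimal, hypothetical sketch of how a device’s position could be estimated from the relative strength of nearby Wi-Fi signals, assuming a database of known access point locations like the one built up through Street View collection; the coordinates, signal values, and path-loss constants are invented for illustration.

```python
# Toy illustration of Wi-Fi positioning: estimate a device's location from
# the signal strength (RSSI) of nearby access points whose positions are
# already known from a provider's database. Every number here is invented.

# Hypothetical access point positions (x, y in meters) and observed RSSI (dBm).
observations = [
    {"ap": (0.0, 0.0),  "rssi": -45},
    {"ap": (30.0, 0.0), "rssi": -60},
    {"ap": (0.0, 40.0), "rssi": -70},
]

def rssi_to_distance(rssi: float, tx_power: float = -40.0, path_loss_exp: float = 2.5) -> float:
    """Rough log-distance path-loss model: a stronger signal implies a closer AP."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

def estimate_position(obs):
    """Weighted centroid: each AP pulls the estimate toward itself,
    weighted by how close the device appears to be."""
    weights = [1.0 / max(rssi_to_distance(o["rssi"]), 0.1) for o in obs]
    total = sum(weights)
    x = sum(w * o["ap"][0] for w, o in zip(weights, obs)) / total
    y = sum(w * o["ap"][1] for w, o in zip(weights, obs)) / total
    return round(x, 1), round(y, 1)

print("Estimated device position (m):", estimate_position(observations))
```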
Natalie Doyle Oldfield, who spent twenty years working in IT before turning her attention to organizational trust, added a bit more historical perspective:
As history has shown, wars vastly expand governments’ powers to regulate, to collect data, and to introduce new measures. For example, income tax was introduced under a wartime measures act in the interest of public welfare. At the same time, strict policies to protect personal income data were enacted. Census taking provides another historical example of data collection.
Banks, health care professionals, lawyers, accountants and other professionals must follow established confidentiality rules and codes of ethics to keep our personal data secure and private. For the most part, the regulatory bodies have put safeguards in place to ensure these professions do not abuse our privacy. And if they do, there are repercussions. Medical professionals can lose their licenses to practice and lawyers can be disbarred.
The question is whether “Big Tech” will demonstrate that they too not only can but WILL voluntarily meet the highest ethical standards. Can they provide sound answers to the following questions? Specifically: What data will be collected, and who will have access to it? Are they committing to practicing privacy and security by design? What about HIPAA certification? Will they do what’s ethical and in the public’s best privacy interest, or only what’s regulated, understanding that tech regulation lags far behind other industries like finance and health care?
Personal Trust vs. Societal Health
Charlie Green’s response is one of “Roll the dice trust.”
Personal trust inevitably comes into conflict with tech privacy and security concerns. After all, the most stringent privacy and security models in tech are called “zero trust” for a reason: they have nothing to do with personal trust.
I think the trust issue in this case is that we need to trust Apple and Google and each other, adding some clear transparency bumpers, to do something potentially tremendously positive in the face of a pandemic.
Randy Conley sits in the camp of “cautious optimism.”
I think technology can play a tremendously helpful role in public health or disaster management situations like this, AND we have to be cognizant of the personal privacy issues involved. I believe South Korea has leveraged personal technology to a large degree in its successful management of the COVID-19 virus. The reality is that we live with an illusion of privacy. Despite our safeguards, we don’t have as much privacy as we think we do. If nefarious actors in Big Tech or any skilled hacker want information on us, they can get it.
Linda Fisher Thornton considers the trade-offs:
“The challenge we face is balancing the benefits of surveillance during the COVID-19 pandemic, which potentially include saving lives, with the costs in terms of lost privacy and autonomy. The surveillance approach puts the safety of the masses ahead of the privacy and autonomy of individuals. For surveillance to be effective, a strong majority will need to allow access to their location and health status data. To convince them to do that, tech companies will need to demonstrate trustworthy intentions, a clear plan, full disclosure, and implementation that includes privacy protections.”
Bob Whipple adds that, with any tech solution, we should remember that anything made by people can be hacked by other people, so the potential for abuse in electronic tracing is immense.
Pandemics Aside, Trust is ALWAYS a Function of Leadership
Bob Vanourek, a former CEO of several large public companies, reminds us that:
Good leaders go first in extending trust and scale up or down afterwards depending on the behavior of the other.
This pandemic is a huge Black Swan (or perhaps a “known-unknown”) event that will change much of our world forever. Some would argue that using such tech will help save lives and is, therefore, worthwhile. Others will argue the privacy invasion issues are scary, and we can’t take a step down this potentially slippery slope.
Like many ethical issues, there are legitimate pros and cons on both sides of the argument. Should the government pass a law outlawing this technology and behavior? I think not. Should we blindly accept the tech companies to handle this without close scrutiny? I think not.
Stephen M.R. Covey’s “smart trust” applies here alongside Jim Kouzes’ “go first” dictum. Let’s extend Google and Apple smart trust and closely monitor what they are doing, adjusting accordingly.
Wrapping up
Getting back to Bart Alexander:
In 1988, Shoshana Zuboff wrote in “In the Age of the Smart Machine” that increasing automation can be used either to empower or to control us at work and beyond. Even in that pre-internet era, the key moral issue of surveillance had emerged: for whom and for what purpose are we giving up our privacy?
I’ve argued (in the work I did for the U of Denver Institute for Enterprise Ethics) that these moral issues should not and cannot be resolved by engineers. We need sociologists and ethicists to struggle with what otherwise are just technical problems to be overcome. I would add that public health officials will always err on the side of protection over personal freedoms, as embodied in the precautionary principle. They may often be right, but neither their decisions nor the software engineers’ solutions should go without scrutiny.
Finally, as the Founder of Trust Across America-Trust Around the World, I’ll add my perspective. I do not believe that these two tech giants will receive adequate voluntary public buy-in to reach the scale they had hoped for. They simply haven’t earned the public trust required of such a large initiative. That being said, something tells me that Apple and Google already have all the technology and data they need to go forward, with or without permission, while other competing interests attempt to play catch up.
One member of our Trust Council shared this quote from the often controversial Winston Churchill: “In wartime, truth is so precious that she should always be attended by a bodyguard of lies.”
Trust Across America-Trust Around the World, along with members of its Trust Alliance, offers both online and in-person workshops to help leaders, teams and organizations build their trust competency. These are some samples of recent engagements.
Catch up on our 2020 Trust Insights series at this link.
Barbara Brooks Kimmel is an award-winning communications executive and the CEO and Cofounder of Trust Across America-Trust Around the World, whose mission is to help organizations build trust. Barbara has consulted with many Fortune 500 CEOs and their firms, and also runs the world’s largest global Trust Alliance. She is the editor of the award-winning TRUST INC. book series and TRUST! Magazine. Barbara holds a BA in International Affairs and an MBA.
Copyright 2020, Next Decade, Inc.