The Challenges for Strengthening Trust
It goes without saying that the data-driven economy is increasingly complex. Its rewards and risks are an emerging phenomenon beyond the control of any one actor.1 Against this fluid backdrop, the World Economic Forum’s global dialogue has coalesced around three pillars: delivering meaningful transparency, strengthening accountability, and empowering the individual.
The global anxiety over how personal data is used stems from the fact that we are all somewhat in the dark. According to author David Brin: “We’re in a fog of data ignorance.” Personal data moves fluidly between jurisdictions, organizations and functions, and its movement exceeds our ability to completely understand it. As noted privacy scholar Helen Nissenbaum writes, “The realm is in constant flux, with new firms entering the picture, new analytics and new back-end contracts continually being forged. We are dealing with a recursive capacity that is infinitely extensible.”2
When it comes to transparency, restoring trust demands balance. Either too little transparency or too much can undermine the larger goals.
Transparency is more than access, which on its own is a one-way street: “outbound only”, reducing individuals to spectators of how their data is used. Meaningful transparency requires institutions to listen, to have “inbound” capacities that give individuals the ability to influence outcomes.
Fully 78% of consumers think it is hard to trust companies when it comes to use of their personal data
– Orange, The Future of Digital Trust, 2014
A growing movement is afoot to strengthen meaningful transparency and information sharing. Terms of service agreements are being simplified, standardized and put into machine-readable formats. Personal data dashboards are growing in number and functionality on a global basis.
But transparency cannot be shouldered by individuals alone. The focus of transparency needs to expand beyond the front end of the value chain. Organizations and institutions need to align more effectively (and communicate to individuals) on shared norms of acceptable data use. Ecosystem-wide transparency about the business-to-business processes of data handling needs to be strengthened.
The current momentum for greater engagement is oriented towards customer-facing entities and their efforts to strengthen the relationships they hold with their customers. When it comes to the back end of the data value chain, behind the scenes, there is less progress. The back office of the “data-industrial complex”, and the ways that data flows out the back door of customer-facing institutions, remains largely opaque to individuals. Competing incentives, supply chain complexity and a lack of technical interoperability are major points of friction within the ecosystem. “Who has access to what data?” remains a nearly impossible question to answer.
While positive steps are being taken by large and highly resourced organizations within the business-to-business context, these steps do not fully address the operational challenges that smaller, yet increasingly influential, commercial entities may face. A more coherent and coordinated set of actions and liabilities needs to be defined for the fragmented supply chain so it can manage its failures as well as its successes.
Figure 3: Concerns related to online privacy
Source: 2014 World Economic Forum The Internet Trust Bubble: Global Values,
Beliefs and Practices (William H Dutton, Ginette Law, Gillian Bolsover and Soumitra Dutta)
Absent broader adoption of trust networks, business risks to particular members of the supply chain will be offloaded onto consumers. Industry leaders within the digital advertising sector have noted this, stating that “the supply chain by which digital advertising is created, delivered, measured and optimized is so porous and perilous that it jeopardizes consumer trust and business growth. The risk is so severe that the underlying innovativeness of the internet itself is in danger of grinding to a halt”.3 This challenge will only increase with the growing adoption of wearable technologies (e.g. Google Glass), through which individuals themselves will be able to collect, store, analyse and share information that is increasingly intimate.
The underlying tensions of transparency centre on the incentives to either facilitate or create friction for individuals. There are approaches which can be labelled as “user-centred” and those that are “user-centric”. User-centred approaches are collaborative in nature and focused on all stakeholders working to facilitate data flows which empower individuals in meaningful transactions and experiences that are consistent with their expectations. User-centric approaches, in contrast, place all the decisions and responsibilities at the feet of individuals to manage by themselves. From this perspective, individuals are responsible for managing the data flows and permissions related to them, often with only limited capacities and tools for making decisions that preserve their interests.
Another factor fuelling the transparency challenge relates to the redistribution of power. Transparency creates social, political and economic risk, particularly for incumbents. These power dynamics serve to frame the narrative for many of the digital dilemmas shaping the personal data ecosystem. Debates on collection vs usage, anonymous vs identified, freedom vs security, and public vs private are generally all framed by incumbent interests from a highly concentrated set of powerful actors.
The power dynamics can be acutely seen in the narrative between freedom of expression and national security. It goes without saying that the nature and number of technology-related threats will grow as the digital economy expands. The impact of technology is never neutral. Yet the rhetoric of fear and uncertainty too often dominates the conversation. A one-dimensional debate persists where the interests of privacy are traded off against public safety and security.
Global leaders are recognizing the need to expand the dimensions of these conversations. As Ann Cavoukian, Information and Privacy Commissioner of Ontario, Canada, writes: “It’s not a zero-sum game. Privacy and counter-terrorism measures can co-exist, with both values being respected, instead of being positioned as opposing forces requiring unnecessary trade-offs and false dichotomies.”4
The challenges facing leaders today regarding accountability are essentially the same as 30 years ago: “How can we ensure data protection while enabling the personal and societal benefits that come from its use?”5 Despite a commitment on the part of industry, regulators and civil society to greater accountability, this principle has proved elusive to uphold fully in practice. The principles and rights which have served as the foundation for the data ecosystem remain vitally important; ensuring they can be effectively applied is the challenge.
The Article 29 Working Party of the European Union describes accountability as “showing how responsibility is exercised and making this verifiable”.6 Along with the need for organizations to maintain effective privacy programmes with specific individuals who are answerable for their ongoing management and monitoring, a fundamental element of accountability is evidence; there needs to be verifiable evidence that appropriate measures are being taken.
The need for verifiable evidence presents a core challenge to the personal data ecosystem. There are structural limits on the tools and capacities to monitor, measure and enforce discrete uses of personal data. Accounting for the complex realities of today’s data flows in a precise and granular manner remains a grand challenge for accountability. There is a need to develop systems and legal frameworks that recognise context and do so in a way that simplifies rather than adds to the complexity of the environment.7 However, going down the path where every data interaction is context-dependent and requires its own set of rules will overwhelm the system in complexity. Approaches which simplify complexity and look to a broad set of conditions applying to a general range of interactions have been identified as a good way to simplify, automate and facilitate trustworthy data flows.8
The tensions regarding accountability stem from an underlying pivot away from pre-emptive and generally prescriptive interventions to those which still require protection measures in advance but are more adaptive, contextual and evidence-based. Additionally, the scope of concerns is not isolated to the domain of privacy. A number of sectors now face significant governance issues in the use of personal data. National security, disaster response, automotive, health, education, retail, logistics and financial services are just some of the sectors struggling with how to balance the innovations which arise from the use of data with the need to protect the rights and claims of individuals.
As wearable technologies and the Internet of Things achieve scale, the origination of passively generated data will increase. Additionally, with billions of individuals from emerging economies connecting to the digital economy, the complexity of the issue will only multiply and accelerate.9
A third challenge undermining trust is the lack of empowerment among individuals. As mentioned, the current system reflects an asymmetry in power that broadly favours institutions (both public and private). Large institutions have greater resources to orient notice and consent agreements to advance their interests. As legal scholar Professor Ryan Calo writes: “We are only beginning to understand how vast asymmetries of information coupled with the unilateral power to design the legal and visual terms of the transaction could alter the consumer landscape.”10
The tensions fuelling the issue of individual empowerment can be viewed along two dimensions. There are a set of issues stemming from the relationships between individuals and the institutions which use data (i.e. the notice and consent challenges). There are another set of concerns based on individuals being able to use “their own data” for their own purposes. This emerging “bottom up” alternative model looks at the ways that data could be used as a utility by or with the individual.11
Figure 6: Personal data management services: A mapping of the market
Source: Mapping the Market for Personal Data Management Services, Ctrl-Shift, 2014.
The issue of empowerment is most acutely seen regarding the issue of consent and purpose specification. With an increasing proportion of personal data now being passively collected by sensors or synthetically generated by algorithms, engaging individuals for consent to use data they know nothing about (and for purposes which are yet to be defined) remains problematic. Similarly, ex-ante limits on the ways that data can be used restrict innovation and growth. Combined, these two challenges create a Gordian knot that is highly complex and will continue to destabilize the ecosystem if left unchecked.
Despite the complexity of the issue, it is clear that individuals are taking additional steps to control their data. A fall 2013 study from the Pew Research Internet Project found that more than half of the Americans surveyed “are concerned about the amount of personal data on the internet” and that “86% of internet users have taken steps online to remove or mask their digital footprints—ranging from clearing cookies to encrypting their email, from avoiding using their name to using virtual networks that mask their internet protocol (IP) address.” Individuals are using increasingly sophisticated privacy-enhancing technologies which provide visibility into how their online activity is being monitored, block ad tracking, encrypt messages and generally hide their online activities.
The second dimension of empowerment — individuals having access to their data to be used for their own purposes — is where the power dynamics come into play. As writer and computer scientist Jaron Lanier notes: “The dominant principle of the new economy, the information economy, has lately been to conceal the value of information.”12 A meaningful and multistakeholder dialogue on “fair value” exchange is just beginning to emerge. Disciplines such as behavioural economics and neuroscience can provide insights into these issues and also help understand how users are motivated in a society where data can be valued in multiple ways.
Figure 7: Perceptions of trust through the eyes of Internet users
Source: 2014 World Economic Forum The Internet Trust Bubble: Global Values,
Beliefs and Practices (William H Dutton, Ginette Law, Gillian Bolsover and Soumitra Dutta)
From a commercial innovation perspective, there is growing momentum in the area of Personal Data Management Services (PDMS) which can help individuals assert more control over how personal data is leveraged and value distributed. From January 2013 to January 2014 more than one new personal data service was launched per week. Areas of particular activity included data storage and management, anonymization services, identity management and personal analytics.13
Important distinctions on individual empowerment arise across different industry sectors. As new models of healthcare emerge and new strategies are identified, patient engagement is central for policies targeted to improving health and cost outcomes.
There is increasing recognition of the role of the individual as both contributor and consumer of data. In that light, a greater sense of data literacy among individuals is essential to facilitate the sharing of data for health and wellness outcomes. The role of context and the relative control of the individual are centrally important. Medical research, particularly in the area of genomics, requires large-scale data sets of uniquely identifiable and sensitive data where missing individuals can alter the findings. For medical treatment, individuals can unintentionally put themselves (and patient communities) at medical risk with too much control over the flow of health data.
Figure 8: The impact of context on acceptable uses of data
Source: World Economic Forum 2014, Trust and Context in User-centered Data Ecosystems
The increasing adoption of digital fitness tracking devices presents a new level of complexity and highlights the importance of context for the degree of individual control. While there is an opportunity to combine and commingle these new intimate, high-resolution, activity-based health data with other data sets to provide a daily health dashboard for individuals, there are a range of new uncertainties on the data quality and how these combined data sets could be used for non-health related uses.14
Personal data generated from health tracking devices present new challenges for sharing data.
- Can such data be combined with traditional medical records for research and treatment?
- Is the device reliable and accurate?
- Can the data be authenticated and linked to only one person?
- Can insurance companies use the data in their coverage decisions?
World Economic Forum, Rethinking Personal Data: From Collection to Usage, 2013
The need for primary research
A deeper understanding of individuals’ sensitivities toward personal data is needed. What levels of transparency and control are needed to establish trust? What types of data carry greater sensitivities? Which personal risks are the most sensitive? What are the proper metrics to help answer these questions? The need for open and coordinated research about individuals and how they relate to data has become a top priority for sustaining trust.
Focusing on understanding the complex needs of individuals as they relate to empowerment and personal data management, a collaborative research project with more than 9,500 global respondents was established with Microsoft (full details are available in Appendix 1 of this report). The study identified seven separate variables which can influence individuals’ perceptions of trust within a given context. The four factors which had the most impact were collection method, data usage, trust and value exchange. Overall, individuals want trustworthy behaviours throughout the ecosystem that extend beyond protecting privacy to encompass data security, data accuracy, the purpose for which data are used and any “code of ethics” that helps determine “appropriate” uses.15
To assess globally the attitudes, beliefs and practices of Internet users towards the use of data, an additional collaborative research project was conducted with the World Economic Forum, Oxford Internet Institute, INSEAD and Cornell University. Results from 11,000 online respondents found broad support for the freedom of expression the Internet enables, as well as concerns related to privacy, security, trust and perceptions of governmental surveillance.