As more and more of our lives are captured digitally, big data analytics is going to become increasingly ubiquitous. As a society, we have the opportunity to choose whether we want the insights it generates to be used to exploit and manipulate us, or to enrich our lives. How we approach privacy will have the single greatest impact on that outcome.
Let's imagine a future where every piece of information you see is filtered by an algorithm: you only see what it wants you to see. Your favorite social networking site, which you log into today to check your news feed, needs to grow its advertising revenue from diet products. To do this, it primes the pump by feeding you articles tailored to reinforce your body image issues. Then, when it observes a decrease in your self-confidence, it pushes targeted advertising at you, to grab you while you're at a low ebb.
I kid you not. The social networking site gets its advertising revenue, its clients sell lots of diet products, and you end up feeling bad about yourself. And it could be a lot worse, I promise you. But it doesn't have to be like this, and I don't think we want to live in a world where it is.
Let's imagine instead that, rather than manipulating you without your knowledge, the system was open and transparent with you. It shared the insights it had generated and gave you options to choose from. Maybe you still end up buying diet products, or maybe you sign up for cooking classes, join a gym, or do nothing at all. Whatever you do, it would have been your conscious choice.
Social, mobile and cloud technologies are connecting us as never before. And if we want to benefit from being part of this new global human network, then we need to accept that much of our data is out there and is out of our immediate control. The reaction of many governments is to suggest that we simply turn off the data tap, either by preventing data from being gathered or wrapping it in complicated regulatory frameworks.
The problem with this approach is that in this interconnected world, data leaks between people. The data someone else shares can generate an insight about you, and it may be an insight that you really don't want somebody else to have or to use. Data ownership is utterly meaningless.
It's who the insight refers to that counts. Defining data as public or private is also a challenge, because analytics can take public data, like social media or web data, to derive a private insight.
For example, modeling your daily habits in order to predict your physical location. On the other hand, analytics can take private data, such as your GPS location, to generate a public insight that never exposes uniquely identifiable information. So what do we do in this new interconnected world, where we're faced with privacy spaghetti? There is no easy solution to privacy, and different scenarios require different approaches.
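To make the first direction concrete, here is a deliberately toy sketch, with entirely hypothetical data, of how seemingly innocuous public information (the times and places of someone's public check-ins) can yield a private insight (a prediction of where that person will be at a given hour):

```python
from collections import Counter, defaultdict

# Hypothetical public check-ins: (hour of day, location).
# Nothing here is sensitive on its own, but together they reveal a routine.
checkins = [
    (8, "gym"), (9, "office"), (12, "cafe"), (9, "office"),
    (12, "cafe"), (8, "gym"), (9, "office"), (18, "home"),
]

# Count how often each location is observed at each hour of the day.
by_hour = defaultdict(Counter)
for hour, place in checkins:
    by_hour[hour][place] += 1

def predict_location(hour):
    """Most frequent location seen at this hour, or None if unseen."""
    if hour not in by_hour:
        return None
    return by_hour[hour].most_common(1)[0][0]

print(predict_location(9))  # prints "office"
```

A real habit model would be far more sophisticated, but even this frequency count illustrates the point: the private insight was never shared by anyone; it was inferred.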
However, knowing what is possible with data analytics, the types of insight we can infer from the seemingly innocuous, and how we can fill in missing data, I believe a key challenge we need to address is transparency. In much the same way as consumers have decided that it's socially unacceptable to buy from companies that pollute the environment, we could choose, as individuals and as a society, whether we really want to engage with organizations that are not open and transparent about how they collect, analyze and use information about us. And a culture of transparency could not only give us access to these insights, but also enable us to exercise appropriate control over them.
So, how would transparency work in the real world? I'd like to share with you a project I've been working on within IBM for the last couple of years. IBM has been using social and collaboration technologies since long before they were popular.
So we're sitting on probably one of the largest and longest-standing enterprise social networks on the planet. Our challenge is that on the one hand we have an engaged, active network of employees, all looking to maximise the benefits they get from their social and collaboration investment, and on the other a management team keen to understand what the network is saying about their employees and the business as a whole. When I was asked to build a system that would analyze our enterprise social network, I decided to take a privacy-by-design approach, and before we wrote a line of code, we defined the philosophy that would guide our subsequent design decisions.
It had three principles. The first was privacy and personal autonomy: at the heart of every decision we make and everything we do will be our commitment to openness and transparency with our employees. The second was simplicity and ease of use.
All analytics will be clearly described, simply presented, and understandable to every employee. The third was personal empowerment: knowledge is power, and we will put actionable insight into the hands of all employees. So employees get access to these new insights, and they choose whether to share them with anyone else. Management gets access to aggregated analysis
that allows them to drill down to subsets of the network, but never to a uniquely identifiable individual. While this approach may seem restrictive, and some people thought I was insane when I first suggested it ("we're doing all this analysis, but we're not going to show you"), the upside has been really significant for us on a number of fronts. First, by defining very simple principles that don't require a law degree to decipher, we've
built a trusting relationship with our employees. By being open and transparent with them, we've been able to generate dialogue and completely change the conversation around how to use and generate value from social and collaboration data. For me, that has been one of the most rewarding parts of the project, because I've seen employees who would naturally be suspicious of such an analytics system not only proactively requesting to join, but offering to share more data and really engaging in the conversation.
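The rule that management can drill down to subsets of the network but never to an individual can be sketched as a query layer that suppresses any cohort below a minimum size. The threshold, field names, and data below are illustrative assumptions, not IBM's actual implementation:

```python
MIN_COHORT = 5  # illustrative threshold; any real value is a policy decision

# Hypothetical per-employee activity records: (department, posts this month).
records = [
    ("sales", 12), ("sales", 7), ("sales", 3), ("sales", 9),
    ("sales", 5), ("research", 20), ("research", 14),
]

def average_activity(department):
    """Average activity for a department, or None when the cohort is
    too small to report without risking re-identification."""
    values = [n for dept, n in records if dept == department]
    if len(values) < MIN_COHORT:
        return None  # suppress: drilling down must never reach an individual
    return sum(values) / len(values)

print(average_activity("sales"))     # 7.2
print(average_activity("research"))  # None (only 2 employees, so suppressed)
```

Suppression thresholds alone are a blunt instrument (overlapping queries can still leak), but they capture the spirit of the design: aggregates yes, individuals never.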
The other thing is that by putting employees in control of the analytics, we've demonstrated respect. And that combination of engagement, respect and trust has meant that we can create new relationships with our employees. Corporate programs now have access to new insights, but in a way that is respectful of and sensitive to employees. To give an example: a few months ago, an advocacy program within IBM reached out to me wanting access to the analytics we generate.
When I explained that the analytics were private to each employee and I couldn't share them, they were initially disappointed. However, when we looked in more detail at what their program needed, we recognized that we could give them so much more than analytics: we could give them an opportunity to really engage and build relationships with the IBMers they wanted to recruit. Our system could accurately match the needs of our users with the needs of their program, but by reaching back to employees first, before we shared anything, we not only demonstrated that this program respected employees and their privacy, we also ensured that the recommendations we gave included only IBMers who really wanted to be part of the program. So this team bravely didn't look at privacy as a barrier to innovation.
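The "reach back to employees first" workflow amounts to consent-gated matching: the system can score how well every employee fits a program, but only those who have agreed to be contacted are ever surfaced. This is a minimal sketch with hypothetical names and fields, not the actual system:

```python
# Hypothetical employee profiles. "opted_in" records whether the employee,
# when contacted first, agreed to be recommended to the program.
employees = {
    "alice": {"interests": {"advocacy", "writing"}, "opted_in": True},
    "bob":   {"interests": {"advocacy", "design"},  "opted_in": False},
    "carol": {"interests": {"testing"},             "opted_in": True},
}

def recommend(program_needs, min_overlap=1):
    """Employees whose interests match the program's needs AND who have
    consented to be recommended. Non-consenting matches are never exposed."""
    matches = []
    for name, profile in employees.items():
        overlap = len(profile["interests"] & program_needs)
        if overlap >= min_overlap and profile["opted_in"]:
            matches.append(name)
    return sorted(matches)

print(recommend({"advocacy", "writing"}))  # ['alice']
```

Note that bob also matches the program's needs, but because he never opted in, the program team never learns that; the filtering happens before anything is shared.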
Instead, they saw it as an opportunity to innovate, and I believe their program is better because of it. So could this approach, this culture of transparency, work in the external social sphere? I personally believe it could, and more importantly, I believe it should. Just as we have given our employees access to, and control over, the analytics we generate about them, why should a consumer expect any less transparency from the services they use? I know that I, personally, would like to know what assumptions my favorite retailer might be making about me and what they're doing with those insights.
Data analytics is going to be key to our future, and if we want society to really benefit, we need to take this journey together. If we embrace privacy instead of fighting it, and instead of looking for the easy solution we look for the best solution for all participants, then maybe we can avoid this tug-of-war between the citizen on the one hand, who doesn't want to be digitally stalked and manipulated, and the organization on the other, whose very survival may depend on its ability to harvest and generate value from this data. Demonstrating openness and transparency builds trust, and it allows our users to engage more openly and freely with us and to share more data.
And more data means more value for them and for us. It's a virtuous circle. Thank you.