We leave behind a daily trail of ‘digital footprints’ that are closely followed by the likes of Google, Facebook, Twitter and even the government. That information is captured, analyzed and translated into metrics used to direct the choices we make, the relationships we form and, in some ways, how we conduct our lives generally. It sounds like science fiction, but it’s not.
Our mundane day-to-day musings on Twitter, photos on Instagram and Facebook ‘likes’ create a profile of us that can be analyzed and put to use. No doubt there are ways to put all of this information to good use, but experts also argue that harnessing this behavioural data erodes our freedom, our privacy and, in many ways, our autonomy.
Rebecca Lemov, an associate professor at Harvard University, has written an excellent piece, “Big data is people!”, which emphasizes that part of the problem is that a purely data-centric approach ignores the ‘human factor’. Limited awareness of the true nature of this data breeds a lackadaisical attitude, which in turn permits organizations to make up their own rules about how they use this information. Perhaps it’s time to focus more on the people at the core of this data, and less on how best to capitalize on it.
Whether the approach will actually shift from data-centric to human-centric is uncertain. In the meantime, the fact remains that big data is a business asset. Businesses that build the capacity to leverage that asset should be building privacy protection into their operations. Privacy in a big data environment is a complicated, but necessary, element of a responsible information governance program.