
Will Big Data kill Vendor Relationship Management?

I have just finished reading Doc Searls’ The Intention Economy. And about time too. The book has been out about two years and it is widely recognised as being a Very Important Book. In my defence, I have been following the Vendor Relationship Management (VRM) thing anyway and have even had some marginal contact with the good Doc himself on the issue. So it was more a case of filling in the gaps. For those not already in the know, VRM is positioned as the counterpoint to CRM (Customer Relationship Management). CRM is how brands use data about their customers to define the relationship the brand decides it wants to have with the customer; VRM proposes that customers should own and control the data about themselves so that they can define the relationship they want to have with brands.

I can validate that it is indeed a Very Important Book because it not only defines this new and potentially interesting area (VRM) but also strays into a wider analysis of the history and operation (and philosophy) of the internet. The issues it raises here are becoming increasingly important as pressures build to manage, regulate and appropriate the internet in order to make it conform to political or commercial vested interests. In fact, this wider analysis could turn out to be the most important aspect of the book, or perhaps a valid subject for a new book.

The Intention Economy and VRM is something I would very much like to believe in. Trouble is, for me VRM is a bit like God: something I would like to believe in if only I could get the evidence and reality to stack up. There seem to be just too many reasons why VRM (like God) doesn’t or won’t exist. At one level, VRM appears to be overly reliant on a code-based answer. This is probably because Doc Searls himself and many of the current VRM gang come from that world. But the concept I found most interesting in the book was the idea of the things Doc calls ‘fourth parties’. Fourth parties are organisations that can aggregate customer intentions and thus create leverage and scale efficiencies. This takes us into the realm of community, which rings bells with me since I believe that within a few years almost all relationships between individuals and brands will be mediated by some form of community. In fact, this would be my own take on how the Intention Economy might actually come into being. I think it is the ability to connect individual customers, rather than empower them as individuals, that is likely to present the greatest opportunity to change the rules of the game – as things like TripAdvisor or even Airbnb are starting to demonstrate. However, fourth parties get relatively short shrift in the book, perhaps because they are not a code-based answer.

But my greatest area of scepticism, or perhaps fear, for the future of the customer and citizen, stems from the emerging world of Big Data and algorithms. As outlined in my previous post, algorithms suck the power out of the idea of having a personal data repository and make the ownership of this, from a government, brand, customer or citizen perspective, largely irrelevant. In the world of the algorithm, your personal data file (i.e. your life) becomes little more than personal opinion. To all intents and purposes your ‘real’ identity is defined by the algorithm, and the algorithm’s decision about who you are and how you shall be treated will pay scant attention to any information that is personal to you, other than to use it as a faint, initial signal to acquire ‘lock-on’.

The problem with algorithms is that (like tanks) they favour governments and corporations. It is hard for a citizen to get a hold of, or be able to use, an algorithmic tank. And if you are standing in front of an algorithmic tank, giving you the rifle and flak-jacket of your own data isn’t much protection. It is why Wall Street is the first place that the world of the algorithm has really taken hold – it could afford the best geeks. And as Wall Street is showing, the world of the algorithm tends towards a very dark and opaque sort of place – about as far removed from the sun-lit commons of open-source code sharing as it is possible to be.

However, create the opportunity to connect a million people with rifles and flak-jackets to confront one algorithmic tank, and the odds get better. You may even be able to form a fourth party which can create its own tank, or at least some effective anti-tank weapons.

So, I guess my message to Doc Searls and the VRM gang would be: don’t lose faith in the idea of VRM and the Intention Economy as a destination, but think again about the route. Build on the idea of fourth parties and focus on community and connection, rather than tools and code, and recognise that CRM is about to be swept away as brands and governments learn how to roll out the algorithmic tanks.

Privacy: let’s have the right conversation

The whole social media, Big Data, privacy thing is getting an increasing amount of air time. This is good, because it is a very important thing to start getting our heads around. However, I don’t think we are really yet having the right conversation.

The predominant conversation out there seems to be focused on the issues concerned with the potential (and reality) of organisations (businesses or governments) ‘spying’ on citizens or consumers by collecting data on them, often without their knowledge or permission.

Our privacy is therefore being ‘invaded’.

But this is an old-fashioned, small-data definition of privacy. It assumes that the way to gain an understanding of an individual, which can then be used in a way which has consequences for that individual, is by collecting the maximum amount of information possible about them: it is about creating an accurate and comprehensive personalised data file. The more comprehensive and accurate the file is, the more useful it is. From a marketing perspective, it is the CRM way of looking at things (it is also the VRM way of looking at things, where the individual has responsibility for managing this data file). It is also a view that gives permission to the idea that if you detach the person from the data (i.e. make it anonymous), the data can no longer be used in a way which has consequences for the individual concerned and is therefore ‘cleared’ for alternative usage.

But this is not the way that Big Data works. The ‘great’ thing about Big Data (or, more specifically, algorithms) is that it requires almost no information about an individual in order to arrive at potentially very consequential decisions about that individual’s identity. Instead, algorithms use ‘anonymised’ information gathered from everyone else. And increasingly this information is not just coming from other people, it is coming from things (see the Internet of Things). The great thing about things is that they have no rights to privacy (yet) and they can produce more data than people.

The name of the game in the world of the algorithm is to create datafied (not digitised) maps of the world. I don’t mean literally geographical maps (although they can often have a geographical / locational component): from a marketing perspective it can be a datafied map of a product sector, or of a form of consumer behaviour. These maps are three-dimensional in that they comprise a potentially limitless number of data layers. These layers can be seemingly irrelevant, inconsequential or in no way related to the sector or behaviour that is being mapped. The role of the algorithm is to stitch these layers together, so that a small piece of information in one layer can be related to all the other layers and thus find its position upon the datafied map.

In practical terms, this can mean that you can be refused a loan based on information concerning your usage of electrical appliances, as collected by your ‘smart’ electricity meter. This isn’t a scary, down-the-road sort of thing. Algorithmic lending is already here and the interesting thing about the layers in the datafied maps of algorithmic lenders is the extent to which they don’t rely on traditional ‘consequential’ information such as credit scores and credit histories. As I have said many times before, there is no such thing as inconsequential data anymore: all data has consequences.
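To make the mechanism concrete, here is a deliberately toy sketch in Python of the kind of scoring described above. Everything in it is invented for illustration – the feature names, the weights, the threshold – and it is not the method of any actual lender; the point is simply that the decision is driven by behavioural data layers (smart-meter style signals) rather than by anything in the applicant’s own credit file.

```python
# A toy illustration of "algorithmic lending" as described above.
# All feature names, weights and thresholds are invented for this sketch;
# they are not drawn from any real lender's model.

# Weights that would, hypothetically, be learned from "anonymised" data
# about everyone else. None of them reference the applicant's credit history.
BEHAVIOURAL_WEIGHTS = {
    "kettle_uses_per_day": -0.8,
    "night_time_appliance_hours": 1.2,
    "weekend_energy_spike": 0.5,
}
BASELINE_SCORE = 0.3


def risk_score(signals: dict) -> float:
    """Combine behavioural data layers into a single risk score (higher = riskier)."""
    score = BASELINE_SCORE
    for feature, weight in BEHAVIOURAL_WEIGHTS.items():
        score += weight * signals.get(feature, 0.0)
    return score


def loan_decision(signals: dict, threshold: float = 1.0) -> str:
    """Decide on the loan. The applicant's own 'personal data file' never enters."""
    return "refuse" if risk_score(signals) > threshold else "approve"


# Example: a refusal driven entirely by smart-meter style signals.
applicant_signals = {
    "kettle_uses_per_day": 2,
    "night_time_appliance_hours": 3,
    "weekend_energy_spike": 1,
}
print(loan_decision(applicant_signals))  # prints "refuse" (score 2.8 > 1.0)
```

Swap in a different set of layers and the same machinery produces an equally confident decision about something else entirely – which is rather the point.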

Or to put it another way, your identity is defined by other people’s (or things’) data: your personal data file (i.e. your life) is simply a matter of personal opinion. It has little relevance to how the world will perceive you, no matter how factually correct or accurate it is. You are who the algorithm says you are, even if the algorithm itself has no idea why you are this (and cannot explain it if anyone comes asking) and has come to this conclusion based, in no small part, on the number of times you use your kettle every day.

The world of the algorithm is a deeply scary place. That is why we need the conversation. But it needs to be the right conversation.

Doc Searls, Michael Wolff and The Facebook Fairy

I spend a fair bit of time puncturing organisations’ belief in The Facebook Fairy.  This is the belief that having ‘conversations’ and ‘engagement’ with a handful of your customers or consumers is a sensible thing to do because The Facebook Fairy will sprinkle some magic dust such that this ‘engagement’ will spread to all of your customers or consumers who also happen to be on Facebook.  (Note: this doesn’t mean that you shouldn’t have conversations with your consumers in Facebook – just not the type of conversations that are predicated on creating a business benefit via the ability to spread a small conversation to lots of people).

I have also been critical of Facebook’s long-term viability because its business model is based on the idea that it is a form of media, when in fact it is an infrastructure (or even a form of behaviour).  However, until this point I had never really questioned its utility as an advertising platform, albeit a platform that would never be able to fulfil the revenue expectations its current valuation suggests.  But then I read this piece by Doc Searls and also the article by Michael Wolff that Searls references.  The Michael Wolff piece is a withering assessment of the viability of the ad-supported web, largely based on research into the decreasing effectiveness of on-line advertising, whereas Doc Searls’ piece probes more deeply into the idea that advertising decreases in effectiveness as it becomes more targeted and personal – important given that increased personalisation is assumed to be the on-line salvation of advertising.  He cites two posts by Don Marti looking at the phenomenon.  Marti says in one of these pieces:

The more targeted that advertising is, the less effective that it is. Internet technology can be more efficient at targeting, but the closer it gets to perfectly tracking users, the less profitable it has to become.

The profits are in advertising that informs, entertains, or creates a spectacle—because that’s what sends a signal. Targeting is a dead end. Maybe “Do Not Track” will save online advertising from itself.

Marti also suggests that the value of advertising lies in the fact that it is, well – advertising: something that puts a single message in front of lots of people.  Its value as a statement – a signal as Marti puts it – derives from its scale and lack of personalisation.  Or to put it another way, not only does The Facebook Fairy of network influence not exist, neither does the Facebook Fairy Godmother of on-line advertising.

This idea that advertising only really works as a way of talking to lots of people seems to make eminent sense.  I am always telling people to recognise that traditional media and social media are different – what works in one doesn’t work in the other.  Social media really only works when you are dealing with small groups of people – therefore, as a business, you need to create a benefit other than that which is derived from reaching a significant proportion of your audience.  You can’t rely on the Facebook Fairy to spread your Facebook activity across the Facebook world, as though it were a media platform or channel.  These benefits have to be based on the ability to consult with, or respond to, your audience (something which an ad cannot do), and this response has to be based around what people are doing (behaviour), not who they are or what channels they are using.

This brings us onto the second part of Doc Searls’ piece.  His contention is that the role for organisations that wish to act as some sort of intermediary between individual consumers, or between institutions and consumers, lies in the area of Vendor Relationship Management (VRM).  This is the idea that the business opportunity lies in providing customers or consumers with the tools and data they need to manage their relationships with brands / organisations: basically a reversal of the current approach, where value is assumed to lie in the ability for brands to hold the data in order to control the consumer relationship.  It is an approach supported by Sir Tim Himself.  VRM is indeed a fascinating subject – but I am not yet convinced this is the way to go.  While it is almost certain that one of the principal shifts inherent in ‘The Social Media Revolution’ is the ability for individuals (consumers or citizens) to connect with each other to either manage, or bypass, their relationships with institutions (governments or business), this doesn’t necessarily mean that the response from institutions should be to co-opt, or even support, this process.

Rather than become involved in the business of helping consumers connect with each other, I think business has to start from a recognition that consumers have become, or will become, connected, and deal with the challenges and opportunities that this presents.  Increasingly I think the real ‘paradigm shift’ businesses need to make is to break from thinking about channels, tools and messages (and to a large extent VRM is still a channel / tool based approach) and think instead about identifying and responding to behaviours (see these recent posts in response to Altimeter’s Dynamic Customer Journey and also digital influence).