I’m in the process of conducting interviews with a range of media, advertising, and audience measurement industry professionals on the changes in the field of audience measurement, and their potential implications. And increasingly, not only through these interviews, but through associated research I’ve been consuming (such as this recent report from the Winterberry Group, “From Information to Audiences”), I’ve found myself wondering if perhaps we need to consider the possibility that we’re in the midst of what might best be termed a “data bubble.”
In this context, the kind of bubble I’m referring to is similar to, yet different from, what we saw in the housing market, or the late-90s Internet, at least in terms of the possible underlying causes. In the case of audience data, I’m wondering if it is possible that, a few years down the line, we will find that any or all of the following have come to pass:
1) That the scope, depth, and constant flow of data so far eclipse the resources and abilities of organizations to make effective use of it that the market for such data ends up contracting dramatically (basically, the supply of data so far outstripping analytical resources and abilities — essentially information overload).
2) That the organizations that presumably could benefit the most from such data ultimately do not undertake the massive changes in organizational culture and personnel necessary to fully institutionalize the use of such data in their operations (thus, that it essentially wasn’t possible for the influx of data to be utilized to optimal effect given the circumstances surrounding this data influx; I see the causes here as slightly different from those associated with item 1).
3) Or, perhaps, a few years down the line, the growing array of data sources is being mined to its fullest potential and, it turns out, the benefits that accrue from such efforts don’t quite justify the substantial costs (in this scenario, folks were basically paying more for the data than the data turned out to be worth).
I’m just speculating here, of course. And this speculation was prompted primarily by the pattern I’m seeing, where there’s lots of data being produced and purchased, but not always with a clear sense on the part of the purchaser (at least as far as I can tell) of exactly how these data are going to be integrated into their business. Similarly, some data providers seem to be aggregating certain types of data simply because they can — and don’t always have (or can’t clearly articulate) a clear idea of exactly how these data would be useful, and to whom.
I’m reminded of the late-90s Internet bubble, to some extent, where everyone knew that they needed to be online, but weren’t really quite sure what to do once they got there.
That Internet bubble would probably fit pretty well within the notion of technological determinism, a term used to reflect the belief that technology drives social, cultural, and institutional change; as opposed to technologies being somewhat subservient to the social, cultural, and institutional conditions into which they are introduced.
I guess what I’m getting at, then, is do we need to worry about the presence of “data determinism”? That is, is it possible that the media sector, broadly defined, is assuming that data are capable of exerting greater power and influence over organizational performance than extant cultural, institutional, and economic circumstances will ultimately allow?
The Winterberry Group study, for instance, finds substantial “process and data structure challenges,” as well as challenges related to “rigid ‘silos’ and the paucity of data-savvy marketing operations, IT and sales talent.”
Is it a foregone conclusion that issues such as these get resolved? And if they don’t, do we see a contraction from what currently seems to be a gold rush mentality surrounding the supply and demand dynamics of audience data?