Market behaviour is easier to regulate when those markets are narrowly defined. Service value and worth, however, can only be understood in the broader context of user application.
Even the sharpest regulatory bodies are so inclined – and Ofcom’s approach to broadband is no exception. The narrowing of definitions, with an intense, laser-like focus on the perceived performance of specific components, stands in stark contrast to the more complex perceptions of customers.
It is entirely legitimate, even required, that market regulators should monitor dominant players and act to correct deviation from acceptable standards, such as with recent interventions in UK insurance markets to ensure proper disclosure of price increases. Ofcom is similarly right to be concerned about any sign of lax standards in line provisioning delays, complaint handling or repair times.
QoS poverty versus QoE honesty
The familiar quality of service (QoS) measures that supposedly underpin contracts are often reduced to a commitment to ‘best efforts’ with some compensation leeway for the most egregious failures. Those QoS measures are for the majority of consumers – whether business, public sector or domestic – fairly irrelevant for most of the time. What matters more is their quality of experience (QoE). However, most of those experiences are way beyond the ability of component providers or their regulators to assess.
Take, for example, the consumption of online video. The QoE will be influenced by the functionalities of diverse access/viewing devices and the adequacy of delivery networks in a complex compound of components where the consumer is primarily interested in the overall outcome. It is pointless having an optimal home WiFi network if the capacity of the serving broadband line is inadequate to stream BBC iPlayer radio without lengthy buffering delays.
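The point about the broadband line undermining an otherwise fast home set-up can be sketched in code. The figures below are hypothetical, chosen only to illustrate that end-to-end experience is bounded by the slowest component in the delivery chain:

```python
# A minimal sketch (with illustrative, hypothetical figures) of why end-to-end
# QoE is limited by the weakest link, not by any single well-performing component.

def effective_throughput_mbps(link_capacities_mbps):
    """A stream can never run faster than the slowest component in its path."""
    return min(link_capacities_mbps)

# Hypothetical home set-up: fast Wi-Fi, fast device, but a constrained line.
components = {
    "home_wifi": 300.0,       # Mbps
    "viewing_device": 1000.0, # Mbps
    "broadband_line": 2.0,    # Mbps - the constraint that causes buffering
}

bottleneck = effective_throughput_mbps(components.values())
print(f"Effective throughput: {bottleneck} Mbps")  # limited by the broadband line
```

However capable the Wi-Fi or the television, the experienced throughput here is 2 Mbps – which is the consumer's point of view in miniature.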
This contrast between the relative poverty of QoS metrics and a more meaningful, some would say honest, approach to QoE is hardly new. Tennis fans are enriched by on-screen metrics of player performance in much the same way as American football fans appreciate “balanced scorecards” that reveal much more about the run of play. Car drivers are not unaware of engine and system performance. Some may complain that automation has gone too far, but fleeing the scene of an accident is rendered pointless when the car itself reports the incident.
Who has not sent an error report to the software creator when the laptop has unexpectedly frozen? Smartphone users willingly volunteer their device and cellular network performance to independent assessors such as Open Signal and make use of such data when considering a new purchase.
Self-reporting collaborative qualities
The ability of product design teams to embed self-monitoring functionality is gathering pace. Soon your new television will know and report the Netflix versus YouTube download times, the speed of channel switching, the latency of interactive gaming, audio and video glitches, packet losses and network congestion – and, moreover, attribute component causality for failings.
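The kind of embedded self-monitoring described above can be sketched as follows. Every metric name, threshold and attribution rule here is an illustrative assumption – no real device exposes this exact interface – but it shows how a television could move from raw measurements to attributing component causality:

```python
# A hedged sketch of embedded QoE self-monitoring with crude causal attribution.
# All metric names and thresholds are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class SessionMetrics:
    startup_delay_s: float      # time from selection to first frame
    rebuffer_events: int        # playback stalls during the session
    packet_loss_pct: float      # loss measured on the local network path
    line_throughput_mbps: float # measured broadband line capacity

def attribute_failing(m: SessionMetrics) -> str:
    """Decide which component most plausibly caused a poor experience."""
    if m.packet_loss_pct > 2.0:
        return "home network (packet loss)"
    if m.line_throughput_mbps < 3.0:
        return "broadband line (insufficient capacity)"
    if m.rebuffer_events > 0 or m.startup_delay_s > 5.0:
        return "content delivery network"
    return "no failing component detected"

# A session with clean local network but a slow line: blame lands on the line.
print(attribute_failing(SessionMetrics(1.2, 4, 0.1, 2.1)))
```

Even a rule set this crude already produces the attribution of component causality that neither the component providers nor their regulators currently collect.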
But all of this rich and very real QoE measurement is, until such time as it is routinely collected and analysed, beyond the purview of market regulators and even further beyond the providers of narrowly defined components – even when those providers are trying to sell bundled services that compound content and connectivity.
This is not simply a challenge for broadband providers and their market regulators – though it is undeniably exacerbated by an unwillingness to provide future-proofed fibre unconstrained by the limitations of legacy copper.
Businesses of any type in any sector are increasingly required to be collaborative. But how can one judge their current and future capacity to collaborate?
Conventional indicators of chronic business instability, such as cash-flow constraints, are difficult for prospective partners, suppliers and clients to research and evaluate. The relative lack of corporate open data, in contrast to public sector open data, is gradually being tackled, not least via recent studies that showed how more open organisations have less difficulty in attracting investment.
This shift is, however, still seen as counter-intuitive by older business managers and their legal advisors who hold outmoded views of aggressive competitive advantage that do not sit comfortably with this greater need for collaboration. Meanwhile, market regulators that take a narrow view of their remit pass up opportunities to spur their market players and investors towards greater openness and honesty. This is a recipe for increased regulatory irrelevance.
Enlightened product and service designers will increasingly enable citizens and organisations to find value and worth in the context of their own application environments and will also enable them to become more vigilant in the pursuit of component supplier failings.
Will market regulators be able to adapt to a more holistic view of consumer expectations and national imperatives? Will regulators demand something better than QoS? Or will the need for regulation be displaced by openly available and undeniable performance evidence?
David Brunnen is editor of Groupe Intellex, director of the UK’s Foundation for Information Society Policy and an RSA fellow.