What the Uber and net neutrality debates have in common
It was sad and shocking to see some French taxi drivers resorting to violence in their demonstrations against Uber. These events, and the response of the French government, are the expected consequence of bad regulatory design. Regulation in the sector, as originally conceived and still enforced in many European cities, is simply no longer justified. Applications like Uber effectively address the market failures that used to provide a rationale for it. As an enthusiastic and regular user of the service in both London and Brussels, my impression is that Uber is in fact more effective, more responsive and more protective of the consumer than any local rules I have seen (and I have lived and used taxis in a few European cities).
The problem at the heart of the current troubles relates not so much to whether legacy regimes are justified (which they are not), but to the way in which they could be reformed to accommodate disruptive technologies. Some taxi drivers use violence because the legislator had created the expectation that existing regulation would apply forever. This expectation is reflected in the hefty prices paid for taxi licences in many cities. A government (local or national) has to be very courageous to change such a system. It is infinitely easier to prohibit a new service – no matter how good for the public at large – than to confront technological reality.
Why do I discuss all of this, which is well known to readers of this blog? Because if the Uber debate teaches one lesson, it is that regulation should be carefully crafted so that it can adapt seamlessly and effectively to technological change. Unless flexibility and adaptability are enshrined in the regime itself, change is unlikely to occur (or unlikely to occur at the pace required by the underlying economic and technological reality).
The European Commission had this lesson in mind when the Regulatory Framework for electronic communications was proposed in 1999. The telecommunications sector was rapidly changing, and it was already clear that technology alone would progressively address many of the concerns that were deemed to justify intervention at the time.
Thus, the Regulatory Framework was not conceived as a collection of rules imposing precise requirements on operators but as a set of broad principles that national authorities would follow when considering the need for intervention in a particular market. Because administrative action is subject to regular review, remedies are only imposed insofar as, and for as long as, they are necessary to advance the objectives of the regime.
The Regulatory Framework was an impressive legislative achievement that, alas, has been progressively undermined. Maybe it is true that good things never last. The most recent nail in the coffin is the political agreement to introduce net neutrality rules at the EU level (which will apply at least to what the press release calls the ‘open Internet’). As far as I can gather, the new rules will provide for an unconditional ban on some practices. Such prohibitions would be directly enshrined in the Framework. This is the very regulatory technique that the Commission considered inappropriate back in 1999.
I have never seen anything close to a theory providing a convincing case for net neutrality. But this is not really the issue here. What I find worrying is that, because of the regulatory device chosen by the European legislator, net neutrality is here to stay, and is likely to stay even if it becomes clear that it does more harm than good. The Regulatory Framework was conceived as a ‘future-proof’ instrument. It is ironic that, over time, it has evolved to become more rigid and less evidence-based.