Friday, July 08, 2011

More on profiling broadband usage and the 2Mbps universal service commitment (USC)


Analysys Mason's report for the Broadband Stakeholder Group on the potential of wireless and satellite to deliver next generation broadband, published last October, makes for interesting reading.

The report considers wireless and satellite provision in the light of three usage scenarios up to 2016 (or one year beyond the government's target for the UK to have the best superfast broadband in Europe):
  • Scenario A (mobile broadband evolution): "represents demand in a world in which the retail business model for satellite and terrestrial wireless broadband access is similar to mobile broadband today. Demand is constrained by the existence of prepaid subscriptions and relatively stringent usage caps in monthly pricing plans."
  • Scenario B (fixed broadband evolution): "represents demand in a world in which the retail business model is similar to fixed broadband today. Demand is less constrained than in Scenario A due to large (or unlimited) usage caps and predominantly pay-monthly subscriptions." (This is in Analysys Mason's view the most likely traffic growth scenario.)
  • Scenario C (accelerated IP-video evolution): "also represents demand in a world in which the retail business model is similar to fixed broadband today. However, Scenario C considers the impact of an even greater change in consumer behaviour, with a large proportion of the content viewed being on-demand video delivered over IP networks. Almost all TV content is delivered in HD."
...which offers some parallels with the FCC's profiling of broadband usage, which I've covered previously. What's particularly interesting is the extrapolation of the maximum bandwidth required per home from each of these scenarios:
"There are large differences in the average busy-hour bandwidth required per home in our three scenarios: Scenario A requires 85kbit/s, Scenario B requires 700kbit/s while Scenario C requires 1.5Mbit/s. These differences reflect the current uncertainty over future demand that exists within the broadband community. However, the peak bandwidth demand per home in all of our scenarios is assumed to be driven by the number of simultaneous video streams that a household may consume. We have assumed that the maximum average bandwidth requirement per home is that which is needed to deliver 2.3 video streams. Scenario A assumes that all streamed services are in standard definition (SD) which gives a maximum bandwidth of 4.6Mbit/s per household. Scenarios B and C assume that viewing is in high definition (HD), which gives a maximum bandwidth of 18.9Mbit/s per home. We therefore believe that, despite the uncertainty over the average bandwidth required per home, there is no pressing need to implement technologies that can deliver significantly in excess of 20Mbit/s peak bandwidth per home before 2016."
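For anyone wanting to check the arithmetic, here's a minimal sketch of how those peak figures fall out of the 2.3-stream assumption. The per-stream bitrates are my own back-calculation from the quoted 4.6Mbit/s and 18.9Mbit/s peaks, not figures taken directly from the report:

```python
# Reconstruction of the per-home peak bandwidth arithmetic quoted above.
# Only the 2.3 simultaneous streams figure comes from the report; the
# per-stream bitrates are assumptions inferred from the quoted peaks.

SIMULTANEOUS_STREAMS = 2.3
SD_STREAM_MBPS = 2.0   # assumed bitrate of one standard-definition stream
HD_STREAM_MBPS = 8.2   # assumed bitrate of one high-definition stream

def peak_per_home(per_stream_mbps: float, streams: float = SIMULTANEOUS_STREAMS) -> float:
    """Peak bandwidth needed by a household watching `streams` concurrent video streams."""
    return per_stream_mbps * streams

print(f"Scenario A (all SD):  {peak_per_home(SD_STREAM_MBPS):.1f} Mbit/s per home")   # ~4.6
print(f"Scenarios B/C (HD):   {peak_per_home(HD_STREAM_MBPS):.1f} Mbit/s per home")   # ~18.9
```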
The figures of 85Kbps, 700Kbps and 1.5Mbps per home provide an interesting comparison with the calculations mentioned in the conclusions report from BDUK's theoretical exercise last year:
"All networks are shared resources, engineered to accommodate busy-period loads. Operators size networks on the basis of a backhaul allocation per customer, which enables them to offer service levels for anticipated customer experience during the day, e.g. “2Mbps available 90% of the time during peak three hours of the day”. A typical backhaul allowance from suppliers when designing solutions was 30-60Kbps per customer, which is consistent with mass-market products available today. However, as the choice of backhaul infrastructure in these solutions was influenced by environmental, customer density and economic constraints, BDUK saw a wide spread in allocations: e.g. from <20Kbps for a wireline connection where backhaul would be very expensive to provision, to >200Kbps for wireless backhaul where the low density of customers on a mast meant that a significant per-customer backhaul allowance was available. These examples would result in a significantly different customer experience in peak hours of the day, but this experience is also dependent on the allowance for data transport between the operator’s point of handover and the internet as well although this is generally a commercial decision for the CP."
This also provides an interesting counterpoint to the current debate over the appropriateness of FTTC vs FTTH services, though the report does later acknowledge that technologies other than wireless may provide a greater degree of future-proofing:
"It is important to note that, although in Scenario B the cost of deploying terrestrial wireless technology in rural areas looks attractive compared to FTTC/VDSL, the latter may provide a greater degree of future-proofing. Our hypothetical terrestrial wireless networks have been dimensioned to support exactly the amount of traffic expected in each scenario in 2016. If the bandwidth required by each household continues to grow then new base stations would need to be added continuously to keep up with demand. A network based on FTTC/VDSL, by contrast, is likely to offer a certain amount of headroom to support future traffic growth depending on the lengths of the VDSL sub-loops. If the sub-loops are capable of supporting higher speeds than the 20Mbit/s peak bandwidth required in Scenarios B and C it may be that once FTTC/VDSL has been deployed in a particular area, further investment will not be required for a considerable number of years. If this is the case, it may be more cost-effective in the long term to deploy FTTC/VDSL in some areas where our 2016 snapshot implies that terrestrial wireless is a lower-cost option."
So the technologies considered in the report are pretty much reaching their limits in terms of bandwidth provision by 2016, if Analysys Mason's annual traffic growth projections from 2010 to 2016 (28% for scenario A, 40% for scenario B and 50% for scenario C) are correct. Analysys Mason also predict that demand will remain asymmetric, with fully symmetric services only being required by a small minority of residential users.
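As a quick sanity check on what those growth rates imply cumulatively (my own arithmetic, not a figure from the report), compounding them over the six years from 2010 to 2016 gives roughly a 4.4x, 7.5x and 11.4x increase in traffic respectively:

```python
# Cumulative effect of Analysys Mason's annual traffic growth projections,
# compounded over the six years from 2010 to 2016.

YEARS = 6
growth_rates = {"Scenario A": 0.28, "Scenario B": 0.40, "Scenario C": 0.50}

for scenario, rate in growth_rates.items():
    multiple = (1 + rate) ** YEARS
    print(f"{scenario}: {rate:.0%} per year -> traffic grows ~{multiple:.1f}x by 2016")
```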

The report also confirms that wireless and satellite technologies are potentially a cost-effective way to deliver the 2Mbps universal service commitment (USC):
"At the time of writing the technical definition of the universal service agreement (USC) was still being agreed by industry. Consequently, we have not considered the USC in detail. The current suggestion for a USC download service is “Access offering throughput of at least 2Mbps for 90% of the time during the busiest 3 hour period daily”. We understand that this requirement refers to a 90% chance of a particular user being able to receive 2Mbit/s during the busy hour. We believe that the performance of the networks we have modelled is likely to be commensurate with this requirement (the level of over-provisioning we included in our Erlang C calculation is sufficient to ensure a 98% probability of an on-demand video stream starting with 5 seconds). Furthermore, we believe that the average bandwidth per home in our lowest wireless demand scenario for 2016 is higher than the average bandwidth provided by a typical fixed broadband network in 2010."
Erlang C calculations are a new one on me; it seems this is a metric which can be used to determine the bandwidth and contention needed to deliver the required services acceptably:
"We have used an Erlang C calculation as a way to approximate the over-dimensioning that would be necessary to support the type of traffic that we have modelled. Erlang C is typically applied to call-centre dimensioning. The inputs include: average demand (calls per hour); average duration of calls; total number of call-centre agents. Outputs include: average time to wait before a call is answered; proportion of calls answered within a specified time.  We have used an iterative approach to modify the calculation so that it can be applied to video streaming. The inputs are the average number of concurrent video streams and the average duration of a video stream. The parameters are the average delay from the time a video is requested to the time that the stream starts, and proportion of streams that start within a specified time of requesting. The output is the total capacity required to the specified inputs given the specified parameters (which we use as an estimate of the over-dimensioning factor).  We have assumed the following parameters for acceptable quality of services:  average time to wait before a stream starts is 1 second; 98% of streams start within 5 seconds of requesting."
Useful metrics to build into the service level agreements underpinning deployments based on BDUK funding allocations, perhaps, especially in relation to ensuring satisfactory delivery of the USC? There's more detail on the USC in BDUK's request for information for last year's theoretical exercise:
6 BDUK baseline definition of 2Mbps USC
In completing their responses, suppliers or supplier teams are asked to identify the theoretical improvement in speed of individual post‐codes. As well as quantifying the improvement in maximum potential speeds, suppliers will need to confirm in their responses whether the service meets this technical definition, or identify any variation from the standard and the impact on the customer experience. 
6.1 Baseline consumer definition 
BDUK has created a customer‐facing definition for overall experience of broadband provided under the USC, to explain to the public what they should expect from the policy.
The customer experience of USC‐defined broadband is expressed as: 
  • enabling users to conduct effective home working, for example:
      • watching good‐quality (i.e. low level of interruption) Standard Definition video stream, e.g. iPlayer, most of the time
      • providing acceptable basic video conferencing, e.g. Skype, most of the time
  • enabling users sufficient access to online Government services, e.g. tax self‐assessment form
The following assumptions are made: 
  • in‐home wiring is not a limiting factor
  • service delivered on an up‐to‐date computer and up‐to‐date browser and driver software
  • no other active network devices in‐use within the household
  • access and data transport network is not subject to contention and loading in excess of that anticipated through prudent network planning and management
6.2 Baseline technical definition 
BDUK has also developed market‐facing definitions for USC. Different technologies have different access and data transport capabilities, so currently BDUK is maintaining several variations on the definition, until it is able to determine the minimum acceptable standard required to meet the policy objective. 
In this document, BDUK presents its baseline definition for 2Mbps USC: 
  • Connection capable of at least 2Mbps download speed measured at CPE
  • Access offering throughput of at least 2Mbps for 90% of the time during the busiest 3 hour period daily
  • Access offering throughput of at least 256Kbps upload speed for 90% of the time during the busiest 3 hour period daily
  • Access to an ISP such that end‐to‐end latency, jitter (packet delay variation), and packet‐loss between CPE and ISP's internet gateway is adequately controlled to maintain 2Mbps throughput and customer experience for different packet sizes under peak‐time loading.
  • Access to an ISP product whose data volume limit, throttling and packet prioritisation policy is made transparent and is in line with current market practice, but in any case no less than 5GB per month.
  • Access to an ISP product with an install and monthly charge comparable with typical, comparable retail reference price, e.g. BT Retail.
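Coming back to the earlier point about service level agreements: the throughput elements of that definition are at least easy to express as a mechanical compliance check. A minimal sketch, assuming throughput samples collected during the busiest three-hour period; the sampling regime, and the latency/jitter and pricing bullets, are left out because the definition above doesn't pin them down:

```python
# Sketch of a compliance check for the throughput parts of the USC baseline
# definition above. Assumes `samples` are throughput measurements (kbit/s)
# taken during the busiest 3-hour period; the sampling method is an
# assumption, not part of BDUK's definition.

def meets_usc_throughput(download_kbps_samples: list[float],
                         upload_kbps_samples: list[float],
                         download_floor_kbps: float = 2000.0,
                         upload_floor_kbps: float = 256.0,
                         required_fraction: float = 0.90) -> bool:
    """True if both throughput floors are met for the required fraction of samples."""
    def fraction_at_or_above(samples: list[float], floor: float) -> float:
        return sum(1 for s in samples if s >= floor) / len(samples)

    return (fraction_at_or_above(download_kbps_samples, download_floor_kbps) >= required_fraction
            and fraction_at_or_above(upload_kbps_samples, upload_floor_kbps) >= required_fraction)

# Example: 36 five-minute samples covering a 3-hour busy period (hypothetical data)
downloads = [2100] * 33 + [1500] * 3   # ~92% of samples at or above 2Mbps
uploads = [300] * 34 + [200] * 2       # ~94% of samples at or above 256Kbps
print(meets_usc_throughput(downloads, uploads))  # True
```

The hard part in practice is agreeing how and where the samples are taken - at the CPE, as the first bullet requires, or further into the network - rather than the arithmetic itself.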
In conclusion, Analysys Mason report that wireless and satellite are potentially cost-effective options for rural broadband, but in many ways a more interesting conclusion is this one:
"Although there are huge uncertainties about the level of demand in 2016, under three credible scenarios the peak demand for the average household is under 20Mbit/s. We think it unlikely that new residential applications requiring significantly in excess of 20Mbit/s will emerge before 2016. We therefore believe that the economic case for delivering higher bandwidths in the next five years is uncertain. We believe that private-sector investment in fibre, terrestrial wireless and satellite technologies will deliver incremental increases in bandwidth over the next five years that reflect the underlying demand from consumers. Given that the lack of clarity over what the average level of demand will be in 2016, and the complex interplay of other factors which ultimately determine which technology is most cost-effective for a particular location, we believe that a cautious approach to public intervention is required."
Food for thought? I think this kind of analysis is extremely important, as it moves the broadband debate from the theoretical to the specific. For more of my thoughts on this, see these previous posts on telehealth and domestic broadband requirements.

Thursday, July 07, 2011

In defence of frameworks


An interesting outcome from Adrian Wooster's recent survey of opinions on BDUK's framework for wholesale broadband infrastructure - not a popular approach, it would seem. I was one of the 14% of Adrian's respondents who think that BDUK's proposed framework is a good idea, and, in keeping with Adrian's request on his blog, here are my reasons.

I think a key advantage of frameworks is the opportunity they present to simplify and speed up public procurements, as well as offering value for money. See this recent example for how JANET's Telecommunications Framework saved the London Grid for Learning (LGfL) a significant amount of time and money.

Having been involved in several full EU procurements in my time at Becta (we established a number of procurement frameworks for the education sector, with the outcomes of the recent James Review acknowledging the importance of our ICT Services Framework - see page 62 of Sebastian James' report; the irony of this endorsement coming in the week following Becta's final closure was not lost on me), I've seen how frameworks offer a means to short-circuit what can be an extremely lengthy and expensive process for purchasers. Which could mean significantly quicker deployments once BDUK's framework is in place.

Which isn't to say that frameworks are without their problems. While a framework can assist a potential purchaser, the purchaser still needs to approach any framework properly. A framework is simply a pre-selection exercise. Purchasers still need to draw up their requirements properly and tender them through the framework - it's certainly not a case of "just pick one of these suppliers and all will be fine". So a framework procurement should be approached like any other procurement, in terms of the amount of preparation and planning the customer should undertake to draw up their requirements, and the diligence with which the customer should scrutinise suppliers' proposals.

I fully acknowledge the concerns about frameworks discriminating against smaller companies. This is something we were confronted with at Becta regularly. Unfortunately, I don't know of any easy way to mitigate this. Prime contractors need to be of sufficient stature to be able to undertake the level of business that it's envisaged will go through the framework. Our approach at Becta was to encourage smaller suppliers to work with framework providers as a route to market, and vice-versa to encourage innovation, but in truth both we (and our framework suppliers) could have done a lot more to facilitate this.

The role of the informed customer is crucial in this context. The customer can create a situation where framework providers can (are required to?) work with smaller local concerns for mutual advantage. Unfortunately, because provision on this scale is complex and difficult, it's not surprising that telcos can and do play the "don't you worry your pretty little head about all this, let us do it all for you...just sign here" card, which can be very hard for local authorities and bodies struggling to get to grips with this area to resist.

However, for authorities that have grasped the nettle of broadband provision, and understand its opportunities as well as complexities, ways can be found to work together. Kent seem to be particularly progressive in this regard, in my opinion. The JANET LLU reports I've mentioned previously on this blog are an excellent example of how telecoms provision can be de-mystified with a little work. Local bodies and authorities need to approach the market by saying "this is what we want from you" (commissioning), rather than by asking the market simply "what are you prepared to sell us?"

In my view, frameworks are good at delivering commodity but not very good at encouraging innovation. Which is why the informed customer role is so important, to provide a bridge between the framework suppliers, local circumstances and smaller, innovative concerns who have a lot to offer in this space, as numerous projects around the country demonstrate.

It comes down to how well a framework is conceived in the first place and subsequently how well it's used by those purchasing through it. Like any tool, a framework can be designed and wielded well or badly. If the tool is well designed, great. But if the tool doesn't reflect the requirements of its potential users, that's a problem.