Amazon Tech Support


Sunday, 18 December 2011

2012 Winners and Losers

Posted on 03:04 by Unknown
It's time for the annual merry-go-round of analyst predictions, and so I'll throw my hat into the ring as usual. Some of these will come as "no surprise" to my regular readers & clients; others represent a bit of a change of stance.

I'm hideously busy before I head off for the holidays, so I'm afraid these are just bullet points. More detail & explanation will crop up here over time, or on my new subscription @DApremium Twitter channel.


Winners:
  • Telco OTT services & business models - This won't surprise many regular readers, as it's a theme I've talked about for many years. It's a complex and nuanced area though, and we'll see as many failures as successes, if not more. Nevertheless, trial & experimentation are essential: "you have to be in it to win it".
  • Loyalty business models for telcos - I think much of the future revenue growth opportunity for operators comes from clever pricing and marketing, assisted by network technology where needed. "Top up your account by $10+ by Wednesday and get a free 20MB of data and a discounted movie ticket".
  • Charging/billing-related policy engines - the key enabler of loyalty schemes and other similar offers. The policy "intelligence" will primarily reside in the IT side of the operator, with enforcement on the network side. We'll also see on-device policy clients extending beyond today's simple dashboard / quota apps.
  • Enhanced telephony & SMS services or functions - Making existing 20-100 year old services better should be prioritised above creating new ones. So-called HD voice is a start, but it's shameful that the industry abdicated its responsibility, letting Apple create Visual Voicemail and Siri, and Palm and others develop threaded SMS. Unfortunately, the obsession with standards and interoperability seems to stop the telco industry itself from improving the user experience of basic functions.
  • Non-seamless operator wifi (offload, onload & other models). "Frictionless" will be a particular winner, not "seamless" (ie the user will remain in control most or all of the time). A regular topic of mine: see various blog posts and this white paper I wrote on Carrier WiFi.
  • iPads & Kindle Fires as complementary devices - OK, I'll admit I was unduly pessimistic about the impact of the iPad when it first launched. However, I still see it largely as an *extra* device for people rather than a substitute. I still don't buy this "post PC era" rhetoric. Netbooks were extra devices too, sold to people wanting a cheap portable second computing device - and iPads fit in the same category, with added coolness and new use-cases. I see the Fire as doing something similar - it looks like something new and fun, but again is incremental not substitutive.
  • Microsoft & Nokia smartphones - I'll take a flyer on this and say that I expect MicroNokia to be a surprise success, although I think the buyout rumours are bunk. That said, it's critical we see continued developer traction - my main reason for not getting a Lumia at the moment is lack of support for some critical apps for me (BTFon for WiFi, Onavo for roaming data-saving, not sure if the Barclays London bike hire scheme has one yet).
  • Mobile broadband data plans based on speed, location, time, device & user - all of these criteria can be determined accurately and easily (unlike application type). Ever more sophisticated pricing plans will be developed around them, supported by realtime information about network congestion, and perhaps ways to offer "happy hours" or similar.
  • On-device network-intelligence for connectivity, policy & control (including both telco-driven innovations & more user-centric apps). Another theme of mine recently: trying to manage the network just from the core won't work - there needs to be intelligence out at the real edge to understand what's going on. Trying to guess about "video optimisation" when you don't even know if the user is watching a foreground player, or downloading in a hidden window, makes no sense, for example. To know, you need a footprint on the phone. The same goes for managing WiFi access, and maybe policy, in accordance with user wishes & expectations.
  • Consent-based mobile video delivery eg based on Operator CDNs, adaptive bitrate streaming & close cooperation with video publishers. Where the operator works directly with a content company, they get "consent" to reformat or otherwise manage a video stream. That is going to be much more acceptable than trying to use boxes in the network to "optimise" it without permission or transparency. I wrote about the idea here (and comments) and I spoke about this at a conference recently - watch for an upcoming white paper.
  • Wholesale 3G / 4G networks - yes, LightSquared has problems with GPS. But the general model of wholesale-centric LTE networks is a very strong one. Depending on the market, some will see disruptor new entrants, some will see government intervention, and others may see existing tier-2/3 operators opt for structural separation.
  • Freemium Twitter / LinkedIn business models for analysts, consultants & other professional services companies. Sooner or later, someone clever is going to find a way to make money out of Twitter *coughs modestly*
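The "speed, location, time, device & user" pricing bullet above lends itself to a toy rules-engine sketch. Everything below is invented purely for illustration - the field names, rates and rules are hypothetical, not any operator's actual tariff logic - but it shows how criteria the network *can* determine accurately might be combined into a realtime price:

```python
from dataclasses import dataclass

# Hypothetical session context. Every field here is a criterion the
# network can determine accurately and easily (unlike application type).
@dataclass
class Session:
    hour: int             # local time of day, 0-23
    cell_congested: bool  # realtime congestion flag from the RAN
    device: str           # eg "smartphone", "dongle"

def price_per_mb(s: Session) -> float:
    """Toy tariff: off-peak 'happy hour' discount, congestion surcharge,
    lower rate for bulk-usage devices. All numbers are invented."""
    rate = 0.10                     # hypothetical base rate, $/MB
    if 2 <= s.hour < 6:             # off-peak happy hour
        rate *= 0.5
    if s.cell_congested:            # congestion-based surcharge
        rate *= 2.0
    if s.device == "dongle":        # bulk-usage devices priced lower
        rate *= 0.8
    return round(rate, 3)

print(price_per_mb(Session(hour=3, cell_congested=False, device="smartphone")))  # 0.05
print(price_per_mb(Session(hour=12, cell_congested=True, device="dongle")))      # 0.16
```

The point of the sketch is that the policy "intelligence" is just a handful of IT-side rules over network-observable facts - no deep packet inspection or application guessing required.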

Losers:

  • RCSe - It was dead. It's now still dead, but shambling around like a zombie. I'm hoping to publish a report shaped like a wooden stake during 2012.
  • NFC payments - classic case of a solution looking for a problem, with the added bonus of squabbling over who controls the wreckage. Massively overhyped.
  • QR codes - They're everywhere it seems. But have you ever actually seen somebody use one in public? Just give me the URL: I'm not going to "engage" with your marketing until *I* decide to.
  • LTE in Europe - Spectrum, devices, voice, business case. Oh and flattening demand for data as well.
  • SMS revenues - they've been a sitting duck for years. 20 years ago, sending 160 characters of text was quite a feat. Now, not so much. I expect massive price erosion (even within bundles) as a response to the continued viral adoption of iMessage, BBM, WhatsApp & whatever 2012 brings us.
  • Cellular-enabled tablets - Poor fit of subscription "data plan" model with computing devices that tend to get used in unpredictable and ad-hoc fashion, often in places with WiFi anyway. Add in the idea that you'd be happy to let an operator policy-manage your computing/Internet experience, and you've got a recipe for disaster. I told you so, 3 years ago.
  • Telco control of user identity - I think the idea of SIM=identity will break down irretrievably in the next 24 months. Operators *do* have a role to play in ID and authentication, but it needs to be much more nuanced and user-centric. Some of the stuff that my associates at Telco 2.0 are doing with the World Economic Forum is interesting, but the industry still has too much of an arrogant belief that it can "own" more customer knowledge than it will get in practice.
  • App-based policy control & charging - Another familiar theme of mine. The network doesn't see apps the way that users do. HTML5 makes the problem even worse - how do you charge for "Facebook" when there's 800m different versions, created on the fly in the server?
  • Eurozone-exposed telecom companies - I'm not an economist, but delving into the topic with Telco 2.0 for a recent paper made me realise some of the problems that next year might bring. Hopefully I won't be writing the 2013 predictions from a cave, looking at the smouldering ruins of civilisation. All things considered though, I'm quite glad to be living and working this side of the Channel, despite the Monty-Python inspired taunting from our friends in Paris.
  • ANDSF, IFOM, I-WLAN & seamless wifi offload - There's some good stuff in ANDSF (eg advising the user which WiFi access point is best), but it's a bit buried in the general belief that operators can control users' WiFi overall, forcing connectivity and enforcing policy. They can't, except in a few corner-case instances. Attempting over-control will backfire: telcos can either control 30% of user WiFi, or zero. I suspect they'll aim for 100%, and the push-back will mean they'll get zero.
  • Operator API balance-of-payments vs. Internet companies - Telcos are still pushing "capability exposure" via initiatives like OneAPI and Telefonica's commendable BlueVia project. But they're still not doing the right APIs in many cases (eg "does the user have a suitable data plan?", or "does the user have good coverage & network speeds?"). Meanwhile, Google, Facebook and others are starting to charge for bulk access to *their* APIs: telcos are often major users. Result: the net API balance of payments won't be pretty.
  • Full mobile network outsourcing to "under the floor" players - another theme I've talked about for a while: in some cases, vendors' "UTF" managed services offers are larger strategic threats to telco business models than so-called "OTT" providers. Loss of control over the network might mean Opex savings, but it introduces massive opportunity costs from lower flexibility. Conference presentation here - and watch for an upcoming report I've written for Telco 2.0 on the subject.

Question marks

  • NFC non-payment applications - interactions, not transactions. More likelihood of success, as it could be driven by a million developers doing cool new things, rather than people wanting a slice of transaction values. But dependent on Apple actually launching NFC as a catalyst, mostly.
  • Mobile VoIP as a telephony replacement - we're getting closer, but we're still not *quite* there yet. Still issues with data network coverage, user interaction, numbering, ringing and so forth. Maybe we'll get the real viral trigger in 2012. Hear the issues in more depth at a Future of Voice workshop (and sign up for the newsletter!)
  • Operator reported "voice" revenues - driven more by price bundling & accounting than VoIP at this stage. Some of the recent discussions I've had suggest that operator execs think they're inevitably losing the voice revenue battle. Add in some dry-but-important accounting changes and they're probably right.
  • Development of simultaneous dual-radio LTE + GSM/UMTS phones - In my view, a sensible alternative to CSFB or VoLTE for LTE voice, but it might be a good idea too late (a bit like VoLGA). See this post & comment stream. One sneaking suspicion - the first such device might be a TD-LTE / GSM phone as I suspect that we're a long way behind implementing both CSFB and VoLTE on TDD networks.
  • Android tablets - can't see the point really, unless you get something much cheaper and more interesting like the Kindle Fire.
  • Blackberry resurgence/demise - 2011 has been an awful year for RIM. A big outage. Storm over encryption. The PlayBook is a cool device lacking an essential capability (email). It hasn't capitalised on its BBM franchise well enough either and is being "outcooled" in its core teenage market, while being sidelined by BYOD policies in its legacy enterprise customers. Can it survive 2012 as an independent? I'm a lot more pessimistic now than a year ago. Clearly some organisational issues to deal with, but at least it's actually still making some profit.
  • Possible network overcapacity as "data explosion" fizzles out - Let's watch the stats over the next 6 months. I've a feeling that the Cisco VNI numbers have created a self-denying prophecy - the industry has developed 20+ solutions to the "data explosion" problem, and it seems that at least 10 have worked - perhaps too well.
  • HTML5 impact on mobile networks if (& that's a big if) web apps take over from downloaded apps. Nobody seems to have thought about signalling, overall traffic load etc. I spoke about this at a recent Telco 2.0 event in Singapore, and I'm writing a research note for them for early 2012.
I hope you've found this interesting & thought-provoking. As always, I'd love to hear your opinions or disagreements on any of this. 
I'd also like to suggest that you challenge your own company - I do numerous workshops and advisory projects for operators, investors and vendors on "Disruption". Some of this focuses on individual issues mentioned here (eg "What's the negative story around RCSe?"), while sometimes I'm asked for an across-the-board wake-up call to provoke thought and discussion. Please get in touch via information AT disruptive-analysis DOT com.

Wednesday, 14 December 2011

New! Disruptive Analysis Premium Twitter subscription stream @DApremium - sign up today at an introductory promotional rate

Posted on 10:59 by Unknown

What am I doing? 

I am launching the first analyst paid channel for Twitter: a subscription for "premium" exclusive tweets of my analysis, opinion and market observations.



Why am I doing this?

I've been using Twitter for over a year and a half, posting as @disruptivedean. I was a hold-out for a long time before grudgingly signing up. Although I've got a fairly good "reach" from it, I've been vocal in criticising it. I'd much prefer to just use something more useful like LinkedIn or Quora for "status" updates and more detailed discussions instead.

But unfortunately, Twitter is like tax - as an analyst, you have to grit your teeth and do it, painful, time-consuming and distasteful as it is. I end up spending time on Twitter that could be more profitably spent writing posts on this blog, advising clients or taking briefings. It adds cost, but brings little in the way of value or revenue.

There's a temporary warm feeling of "being part of something", and some undeniable self-validation in counting followers. But that's emotion, not business (and I don't use Twitter for my personal life).

One of the problems is that the main beneficiaries of Twitter are the Fourth Estate - journalists, and their digital equivalents and entourage: bloggers [no, I'm not one], PRs, marketing folk and analyst relations people. It does a lot of their job for them. Unsurprisingly, this group of users then promotes it as being the best thing since sliced bread, since for them, it probably is. As a result, it's become table-stakes, and because I've had to do it, I've tried to do it well.

I spoke recently at an IIAR (Institute of Industry Analyst Relations) debate about social media, and described Twitter's realtime platform as "seductive but irrelevant" to the work of an analyst, with little or no real value. 
 
So today, I'm hoping to change that. To use a telecoms metaphor, I'm "monetising the opinion pipe". I'm launching a premium Twitter channel, open only to paying subscribers or my existing clients. I'm keeping @disruptivedean but adding @DApremium.


How will it work?

Ideally, Twitter (or, preferably, one of its better rivals) would have had a ready-made freemium + revshare platform, but at present I'm having to work it out with a mix of "protected" Tweets and offline payment through other channels. Looking around online, it seems there were a couple of experiments with paid Twitter streams around 2009, but I'm not aware of anything that currently does what I want.

This is experimental. It's genuinely disruptive. I'm not aware of anyone else doing this successfully. It's possible that Twitter may take a dim view and try & shut it down. Or it's possible that the model becomes more widely successful and I get bragging rights in a couple of years' time, albeit wishing I'd set it up as a software platform instead & sold it for a huge sum.

Clearly, most analyst firms offer various subscription models for their content and advice. This is not something that Disruptive Analysis has done to date, preferring to use a mix of free content via this blog, as well as standalone published reports, papers and advisory services. A paid subscription Twitter stream, at a low price, fits into the trend towards providing analyst material in short, pithy, concise bursts.

I will maintain this blog at its current frequency, for longer & more analytical pieces. Depending on the response to @DApremium, I may add extra pay-walled material for subscribers in 2012. I'll also keep my free Twitter stream, especially for discussions and interactions, and as a marketing tool.


What's the deal?

So... what does @DApremium offer?
  • My best insights and discoveries, delivered immediately. My existing blog & Twitter readers will recognise the regular "I told you so" moments. Future epiphanies will be on @DApremium
  • 1000+ exclusive tweets per year (not retweetable, no forwarding allowed)
  • A bias towards tweets which name & analyse specific vendors / organisations / operators
  • The bulk of any conference coverage on @DApremium
  • Answers to specific subscriber questions on @DApremium where feasible
  • Additional content & benefits for @DApremium subscribers
  • The ability to interact with a narrower group of high-level industry figures, with a much better "signal to noise" ratio than the open Twitter. In future, I may set up a separate forum on LinkedIn or elsewhere for fuller debate & longer-format discussions.
(Incidentally, I'm willing to be quoted about @DApremium, as an example of business model innovation in social media, and monetising supposedly free services)

What are some examples?

I only set up @DApremium on 13th December, and I am putting up this page a day later. The first 20 tweets have been (from first to last):

1. The premium Twitter stream of @disruptivedean, exclusively for subscribers & clients. Vendor analysis, event streams, unique insights etc

2. RT @disruptivedean BBC iPlayer 3G streaming approach points to death of non-consensual video optimisation model bbc.in/uufHaA

3. My dislike of Twitter is well known, but if DApremium enables monetisation of the "opinion pipe", I'll admit to seeing the value in it.

4. 2 telcos clearly embracing Telco-OTT concepts most strategically today: Telefonica & Telenor. Most others have smaller tactical activities

5. EverythingEverywhere had 326TB data traffic in last year bit.ly/uTPsFq but 3UK claims 137TB per DAY bit.ly/vcsBKn

6. I know that 3UK has lot of USB dongles & is still unlimited, but find it hard to believe 50x or 100x data traffic of EE. Miscalculation?

7. BBC using HLS adaptive bitrate streaming: self-optimising content. Will be more important than in-network optimisers see.sc/GHBBrI

8. Ah, seems EverythingEverywhere 326TB traffic just applies to the sharing/roaming deal T-Mo/Orange, not "native" data on each network

9. Still a disparity though: 3UK=137TB with 6m subs, EE=maybe 50TB/day with 27m subs. Big difference is down to market share of 3G USB modems

10. Telcos likely to be net losers on API balance-of-trade with Google, Facebook etc. eg GMaps charging coverage checkers see.sc/JtEH3b

11. Trying to work out whether there are actually any real-world implementations LIPA or SIPTO offload standards yet. Some vendor support.

12. Interesting presentation on RCS-e: suggests Spain launch now early 2012, not 2011 after all. Embarassing for GSMA see.sc/c97hGw

13. Congrats 3GPP for saying user WIFi prefs take priority over MNO steering. From OPIIS docs "Solution shall allow UE to override rules"

14. Thinking a lot about impact of HTML5 on networks. Totally kills app-based policy - there could be 200m different versions of Facebook Mobile

15. 2012 is likely to see dampening of NFC hype. Where it happens, it will be driven by non-financial interactions, not transactions or purchase

16. Wondering if the attempt by network depts & 3GPP to "own" online charging infrastr is the most damaging strategic mistake by mobile industry

17. Nokia Lumia devices seem well-received & even selling out in some places. But is that just because nobody had the confidence to order many?

18. Rogers in Canada extending its subs' numbers & voice services over the web via an OTT-style CounterPath platform t.co/igvIaiz

19. Ericsson's slow & grudging embrace of small cells seems to reflect fears over cannibalisation & commoditisation http://see.sc/sAsqoE

20. Uploaded my intro preso on LTE ("A contrarian's view") that I gave at last week's Layer123 LTE/EPC event in London www.scribd.com/doc/75687759

I can send across some more recent tweets via email if you get in contact with me.


What does it cost?

Pricing for a new service like this is difficult. There are few benchmarks. Obviously, full analyst subscriptions can cost $10s of thousands per year, but with much more content. Disruptive Analysis' own research reports (such as January's forthcoming Telco-OTT report) vary from $hundreds to $thousands. At the other end of the spectrum, consumers getting a daily horoscope via SMS can cost $140 a year.

Now, I'd like to think that my insight is worth rather more than an astrologer's, even in volume terms - you can expect at least 3x the number of messages per year.



How can I sign up / pay?

Given the relatively small transaction size, I'm hoping that the bulk of subscribers will use credit cards or PayPal.


EDIT 1st Feb 2012: The introductory discount pricing period has now finished. 
Please go to the new price-plan page here

If for some reason this doesn't work (sometimes PayPal can be a bit flaky), contact me via information AT disruptive-analysis DOT com and I can issue an invoice or payment email request.

You may want to send me an email anyway, or add me on LinkedIn so I can get to know my community a bit better. If you're an existing Disruptive Analysis client, please let me know and I can give you a complimentary subscription.


If you sign up - thanks for valuing my input, and recognising the value in a new form of analyst interaction & advisory delivery model. I'll share feedback as we go along, and I hope you find the service useful & thought-provoking.



The fine print

A full set of terms & conditions is available on request. Key terms include:

- @DApremium Tweets may not be retweeted, copied/pasted into other Twitter streams, forwarded via email or otherwise distributed.
- Subscriptions are per-user, with the expectation that a maximum of 3 people have access to any individual Twitter account. Corporations [eg company-wide accounts @XYZcorp] will need to contact Disruptive Analysis for access
- Unauthorised distribution will result in removal of the subscriber's ability to access @DApremium
- @DApremium subscriptions will last for 12 months from the date of payment. Renewal reminders will be sent 1 month and 1 week before expiry.
- Subscribers recognise the limitations of Twitter's 140-character posts - this means that abbreviations, approximations & concise statements are typical. Complex ideas, trademarks etc may need to be condensed, and nuances may be lost. Any issues with accuracy, clarity or disputes should be raised with Disruptive Analysis, which will use either Twitter, blog or other channels to make clarifications or corrections if necessary.
- Subscribers recognise that paid Twitter streams represent an innovative business model and may be subject to change because of circumstances beyond Disruptive Analysis' control
- In particular, Twitter may change its terms of service, block or delete the @DApremium account, or require a new payment mechanism
- If there are problems or interruptions to the @DApremium service, Disruptive Analysis will attempt to find alternative channels for delivering status updates. These could include LinkedIn, private blogs, other social networks or, if necessary, email lists
- I'll be travelling over the Xmas and New Year period until around Jan 10th, so posts over the next few weeks will be less frequent than during normal periods. This is part of the rationale behind making the intro subscriptions valid until 31st Jan 2013

Thursday, 8 December 2011

Telco-OTT Strategies & Case Studies: Pre-publication discount on forthcoming Disruptive Analysis report

Posted on 09:04 by Unknown
**UPDATE! DISRUPTIVE ANALYSIS' REPORT ON TELCO-OTT STRATEGIES NOW PUBLISHED - PRICING AND CONTENTS DETAILS here **

I've discussed for several years (via private advisory work and this blog) the prospect of operators running their own "access independent" services. This means mimicking the major Internet players' role by offering what I'm calling "Telco-OTT" services. In other words, decoupling the network from the application - something which is highly controversial, especially for telecom traditionalists.

Google, Facebook, Skype and others have shown the power and value of standalone applications and services. Finally, operators are exploiting the scale of the multi-billion user Internet, the flexibility of PCs, and the ease and virality of the smartphone app paradigm, to go beyond the narrow confines of their own access customer base. In other words, they are embracing the doctrine of "If you can't beat 'em, join 'em". But while compelling, it is not easy - there will be failures as well as successes.

I'm now in the final stages of putting together a landmark Disruptive Analysis strategy report on Telco-OTT services. It will span both fixed and mobile operators, and a broad set of applications and case-studies. It will be published in January, but I'm giving advance notice here, a taster of some of the thinking in the report, and offering this blog's readers a promotional discount for pre-orders.

This will be the first analyst report to focus on, describe and identify the trend - and "give it a name".

There are two broad strategies I see emerging for operators' own inhouse OTT activities:
  • Add value to existing access subscribers, by adding service "accessibility" via 3rd-party networks and devices
  • Extend brand and service/content reach to non-access subscribers
A prime example in the first category is fixed-broadband IPTV or cable providers, which are increasingly offering their subscribers a mobile app to watch content while "on the go" - typically on smartphones or tablets connected via other operators' networks. In the communications arena, Telefonica is trialling its O2 Connect VoIP service (based on Jajah) with an initial focus on existing O2 subscribers connecting via WiFi. Even RCSe believers see value in a "broadband access" option for OTT linkage into a user's service, via a PC client and generic Internet connection.

The second class is a more "full" vision of OTT-style services by operators, reaching new users who do not have a "subscription" at all. Orange's ON Voicefeed (a third-party visual voicemail) is a good example, as is T-Mobile US Bobsled (a Facebook-based communications app). Many operators also have traditional web portal or video/audio businesses, selling or delivering content to the Internet at large.

The first strategy is generally lower-risk and easier to monetise; the second is much more difficult (essentially on a par with any other Internet startup) but ultimately could generate more value. The report considers both paths, as well as the different options for organic development and acquisition. It also looks at strategic pitfalls and mistakes for Telco-OTT services, such as the creation of custom hardware (eg Vodafone 360 handsets).

There are also various other offerings that fall into the Telco-OTT envelope - classical enterprise VPN remote access services, femtocells connected over another operator's broadband lines, even WiFi "onload" used as a way for one operator to gain a service "pipe" onto another operator's smartphone fleet. New markets as diverse as M2M and social networking also yield opportunities and examples.

**NEW! DISRUPTIVE ANALYSIS' REPORT ON TELCO-OTT STRATEGIES NOW PUBLISHED - PRICING AND CONTENTS DETAILS here **

Wednesday, 7 December 2011

What is the cost of VoLTE? Is it a huge strategic error for the mobile industry?

Posted on 03:12 by Unknown
I'm sitting at the Layer123 conference on EPC and LTE in London.

I've just heard a joint presentation on VoLTE from the GSMA and the MSF. The latter was about the recent interop trial, and the former was a general flag-waving / propaganda pitch. Apparently "everyone" is now agreed on VoLTE deployment. (I assume that comes from the same GSMA dictionary as "ubiquitous" used in discussions on RCS / RCSe). The next presentation is from German consultancy Detecon, and the previous one was from BT Wholesale.

Some outputs so far on LTE Voice:

- General agreement that CSFB still reduces QoE by increasing call setup time and dropping the LTE data connection
- "Pure" VoLTE should be available without handoff when the user moves out of LTE coverage in late 2012 / early 2013, after trials in early/mid 2012
- E911 support is a bit of a hassle for the US and a short-term kludge will be needed
- VoLTE with SRVCC (for handoff to 2G/3G at the edge of coverage) is late and complex
- Massmarket VoLTE with SRVCC "should" be available in 2014. I think that's both (a) delayed from initial discussions, and (b) very optimistic. I think 2016 is more likely when you take into account various other factors like devices

But the elephant in the room is the growing realisation in the industry that mobile voice telephony revenues are declining. A survey of strategists at the London Telco 2.0 event I attended a few weeks ago suggested 20-30% revenue erosion for operators on telephony in the next 3 years. The Asian equivalent in Singapore last week was even more bearish. Private conversations I've had with telco voice and strategy executives also demonstrate pessimism - sometimes to the extent that they see telephony revenues evaporating almost entirely, under the onslaught from MVNOs, Google, Skype and the regulators.

So let me get this straight:

- Telephony revenues are declining because of external (and unstoppable) factors
- CSFB gives a worse user experience than circuit telephony
- VoLTE is only useful where there's actually LTE coverage
- Without SRVCC, VoLTE will drop calls (worse QoE than circuit)

Now in theory, VoIP means more calls can be squeezed into each Hz of spectrum - so there are efficiencies on the radio and on backhaul capacity. But all the complexity of policy management, QoS and legacy interworking needs to be factored in.
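To put rough numbers on that radio-efficiency claim, here's a back-of-envelope sketch. The GSM side uses standard figures (200 kHz carriers, 8 full-rate timeslots); the LTE voice-throughput-per-MHz figure is purely an assumed round number of mine, since real VoLTE capacity depends heavily on scheduling, header compression and cell conditions:

```python
# Illustrative voice-capacity arithmetic - assumed figures, not measurements.

GSM_CARRIER_MHZ = 0.2            # one GSM carrier occupies 200 kHz
GSM_FR_CALLS_PER_CARRIER = 8     # 8 full-rate timeslots per carrier

VOLTE_CODEC_KBPS = 12.65         # AMR-WB 12.65 mode, speech payload only
LTE_VOICE_KBPS_PER_MHZ = 700     # hypothetical usable voice throughput per MHz

gsm_calls_per_mhz = GSM_FR_CALLS_PER_CARRIER / GSM_CARRIER_MHZ
volte_calls_per_mhz = LTE_VOICE_KBPS_PER_MHZ / VOLTE_CODEC_KBPS

print(f"GSM full-rate:   {gsm_calls_per_mhz:.0f} calls/MHz")
print(f"VoLTE (assumed): {volte_calls_per_mhz:.0f} calls/MHz")
```

Even if my assumed LTE figure is generous, the per-MHz gain over GSM full-rate is modest - and half-rate GSM narrows it further - which is why the IMS, SRVCC and policy complexity weighs so heavily on the "production cost" question.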

Can anyone honestly say they believe that VoLTE will offer a lower "production cost" for telephony than GSM? Or are the costs per unit about to go up, just when the revenues are going down? And CSFB probably means higher costs AND lower quality than today's GSM.

That doesn't make economic sense - what operator will want to invest in more-expensive technology in order to provide a worse service in a declining market?

EDIT - a couple of people have pointed out that CS voice infrastructure isn't that cheap either, and neither are the legacy billing/OSS systems. True, but modern developing world 2G/3G networks show that CS can be really cheap if deployed today. I think a CS refresh (or full OTT-style service with a chance of Future of Voice type new revenue streams) may be a better bet than VoLTE.



Monday, 28 November 2011

Controlling telephony IN and supplementary features by app

Posted on 06:06 by Unknown
This is a very quick post, before I dash to the airport. I'll be in Singapore for the rest of this week at Telco 2.0's New Digital Economics event, speaking on both Mobile Broadband, and also on the telco-world implications of HTML5. More to follow on that topic another time.

I was out with a friend last night, and he bemoaned the fact that he can't easily adjust the number of times his phone rings before it diverts to voicemail. (He's using an Android handset on a major operator). He said he'd been able to do something via some obscure code like *7529# , but it had taken him ages to find it - and he's a serious geek as well.
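For what it's worth, the standard GSM MMI syntax (defined in 3GPP TS 22.030) for the "call forwarding on no reply" condition does let you set the ring time before divert - which only underlines the point that the capability exists but is buried. The string below is the generic documented form with a placeholder number from the UK's reserved fictional range, not my friend's actual code or any real voicemail mailbox:

```text
**61*<voicemail number>**<ring time in seconds, 5-30, steps of 5>#   then press Call

eg   **61*+447700900123**30#
```

That this has to be typed blind into a dialler, rather than exposed as a slider in an app, is exactly the problem.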

It struck me that this is the sort of thing that should be done via an app, linked to the voicemail server. (I don't know if Apple's visual voicemail allows this, or 3rd-party consumer options like ON VoiceFeed, but this is just about the normal operator-provided vmail).

More broadly, there's a ton of supplementary services and other legacy IN stuff around in the telephony space that never really gets properly thought about, except in terms of how much of a pain it is to ensure backward-compatibility. Why? Why isn't there a decent operator-provided app for voicemail, 3-way calling or whatever other features they've got? Who cares if it only works on some people's phones - and who cares if it emulates *# codes or hooks directly into the server?

It's this type of thing that Martin Geddes and I mean when we say that the basic telephony service hasn't evolved. You don't need to go to HD Voice or even application-embedded voice to make a difference - nobody seems to have sat back and thought "how can we make telephony 1.0 work better, given the tools we've got at our disposal?". Yes, I'm sure there are security issues - but sort them!

Anyone got an answer to why there's no easy "configure my voicemail" app? Or even a web interface through the operator self-care portal?

[or maybe there are examples, but I'm not aware of them. In which case, an alternative question is "why don't you tell anyone about this?"]

Saturday, 26 November 2011

The top 10 assumptions the telecoms industry makes about communications services

Posted on 07:18 by Unknown

This is a slightly-edited repeat of a post I made last year. I think it is more relevant than ever, especially as the telecoms industry sleepwalks into another generation of mistakes, notably RCSe.

Disruptive Analysis' tagline is "don't assume". It is worth quickly stepping back to understand some of the basic premises and unspoken assumptions of the telecom industry “establishment” around personal communications. These are themes and concepts burnt into the mind of the "telecoms old guard", especially standards bodies, telecoms academics and traditional suppliers; they are seen as unwritten laws.

But are they really self-evident and unquestionable? Maybe not. It needs to be remembered that the telecom industry has grown up around the constraints and artificial boundaries of 100-year old technology (numbering, for example, or linking the length of a conversation with value). Many of those unnatural constraints no longer apply in an Internet or IP world - it is possible to more accurately replicate society's interactions - and extend them way beyond normal human modes of communication.

For any telecoms company wanting a continuing role for the next 100 years of the industry, it is worth going back to first principles. It is critical that everything previously taken for granted is reassessed in the light of real human behaviour - because we now have the tools to make communications systems work the way that people do, rather than forcing users to conform to technology's weird limitations.

For instance, we currently see the weird phenomenon of companies pushing so-called “HD” (high-definition) voice, as if it’s an amazing evolution. Actually, they mean “normal voice”, as it comes out of our mouths when we speak. We’ve only been using low-def voice because the older networks and applications weren’t capable of living up to our everyday real-life communications experience. Shockingly, this has taken decades to make a “big jump”, rather than evolving gradually and gracefully as technology improved.

So, in no particular order, these are the assumptions Disruptive Analysis believes are unwarranted.
  1. A “subscription” is assumed to be the most natural way to engage with, or pay for, communications services.
  2. The most basic quantum of human communication is assumed to be “a session”.
  3. It is entirely rational to expect people to want a single presentation layer or interface, for all their various modes of communication (ie “unified” communications or messaging).
  4. Communications capabilities are best offered as “services” rather than being owned outright, or as features of another product.
  5. A phonebook is assumed to be the best metaphor for aggregating all of a person’s contacts and affiliations.
  6. A phone call (person A sets up a 2-way voice channel with person B for X minutes) is an accurate representation of human conversation & interaction, and not just a 100-year old best effort.
  7. People always want to tell the truth (presence, name, context) to others that wish to communicate with them.
  8. People are genuinely loyal to communications service providers, rather than merely grudgingly tolerant.
  9. Ubiquity is always more important than exclusivity or specialisation.
  10. The quality of a communications function or service is mostly determined by the technical characteristics of the network.
These are the types of issue discussed in much more depth in my research reports, private advisory engagements, and in the Future of Voice workshops run by myself and Martin Geddes.

If you'd like more detailed explanation of these assumptions - and what they mean for next-generation communications business models, please get in touch. I'm at information AT disruptive-analysis DOT com

European Court of Justice ruling on ISP filtering of P2P has wider implications for DPI & policy

Posted on 03:17 by Unknown
There has already been quite a lot of online discussion of this week's ruling by the European Court of Justice (ECJ) on whether a Belgian ISP can be forced to filter out P2P traffic on its network. (A quick news article is here and more detailed analysis here  or here)

From my point of view, the most interesting thing about this judgement is that the court has taken a very dim view of the possibility of "false positives" from the DPI or whatever other system might be used to monitor the traffic. (It also has implications for the privacy aspects of data monitoring by telcos - it was that, rather than more general "neutrality" concerns, that led to the Dutch Net Neutrality law)

The term "false positive" comes (I think) from the healthcare and pharmaceutical industry, where a "false positive" is a wrong diagnosis, such as telling someone that tests show they've got a disease, when actually they don't. The opposite (false negative) is arguably even worse - that means telling them that the tests show they're clear of the disease, when actually they DO have it all along. False positives/negatives also crop up regularly in discussions of security technology (eg fingerprint recognition or lie-detectors).
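To pin down the vocabulary in classifier terms, here is a minimal sketch with invented numbers - the counts are purely illustrative, not real DPI figures:

```python
# Hypothetical DPI results over 1000 flows: is each flow "P2P" or not?
results = [
    # (actually_p2p, detected_as_p2p, count)
    (True,  True,  270),  # true positives: P2P, correctly flagged
    (True,  False,  30),  # false negatives: P2P that slipped through
    (False, True,   50),  # false positives: legitimate traffic wrongly flagged
    (False, False, 650),  # true negatives: legitimate, correctly left alone
]

tp = sum(n for actual, det, n in results if actual and det)
fp = sum(n for actual, det, n in results if not actual and det)
fn = sum(n for actual, det, n in results if actual and not det)
tn = sum(n for actual, det, n in results if not actual and not det)

accuracy  = (tp + tn) / (tp + tn + fp + fn)  # overall hit rate: 0.92
precision = tp / (tp + fp)  # of flows flagged as P2P, how many really were: ~0.84
recall    = tp / (tp + fn)  # of real P2P flows, how many were caught: 0.90
```

Note that a headline "92% accurate" still means 50 innocent flows got blocked - and it's exactly those 50 that the ECJ cares about.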

In other words, DPI tends to work by blindly testing for a specific "class" or other group of traffic flows, using "signatures" to help it detect what's happening. This works OK, up to a point, and we see various ISPs blocking or throttling P2P traffic. It can also distinguish between particular web or IP addresses and various other parameters. But the issue with the ECJ judgement is that this process can't distinguish between "bad" P2P (illegal content piracy) and "good" P2P (legitimate use for distributing free content or other purposes).

A DPI system should be able to spot BitTorrent traffic, but likely wouldn't know if the content being transported was an illicit copy of New Order's True Faith, or a recording of a really terrible karaoke version with a plinky-plonky backing track you'd done yourself and released into the public domain. (To be fair, if your singing is as bad as mine, blocking its transmission is probably a service to humanity, but that's not the point here).

If the network is actually congested, the operator can probably claim it is reasonable to block or slow down all P2P traffic. And there is possibly a legal argument about the ISP working to limit piracy, if costs are low enough for implementation (again, a separate discussion). But if the network is not congested, it's definitely not reasonable to slow down the "false-positive" legal P2P data.

The interesting thing for me is how this could apply to other DPI use cases - for example application-based prioritisation. What is the legal (and/or consumer protection) stance when the DPI or PCRF makes a mistake? Let's say for some reason, I use a video codec, player and streaming server to transmit non-video material - animation perhaps, or some form of machine-readable codes. If the DPI leaps to the conclusion that it's "video" and prioritises/degrades/"optimises"/charges extra for it, is that a false positive and therefore illegal?

According to Wikipedia "Video is the technology of electronically capturing, recording, processing, storing, transmitting, and reconstructing a sequence of still images representing scenes in motion". There are definitely applications of video-type technology for non-video content.

There are plenty of other false-positive and false-negative risks stemming from mashups, encryption, application obfuscation or simple poor definition or consumer understanding of what constitutes "Facebook". (This is especially true where a network DPI / optimisation box / TDF works *without* the direct collaboration of the third party).

Now, in my experience, DPI vendors have been very cagey about disclosing their rate of "false positives", simply saying that overall, the situation is improved for the operator. I've heard from some sources that an accuracy of 85-90% is considered typical, but I imagine it varies quite a bit based on definition (eg # of bits vs. # of flows). It's also unclear exactly how you'd measure it from a legal point of view.
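That definitional point matters more than it looks: the same classifier can score very differently depending on whether you weight by flows or by bytes. A toy illustration, with entirely made-up numbers:

```python
# Hypothetical traffic mix: (num_flows, avg_bytes_per_flow, correctly_classified)
traffic = [
    (900, 50_000,    True),   # many small flows (web, chat), classified correctly
    (100, 2_000_000, False),  # a few big flows (eg encrypted video), misclassified
]

total_flows   = sum(f for f, _, _ in traffic)
correct_flows = sum(f for f, _, ok in traffic if ok)

total_bytes   = sum(f * b for f, b, _ in traffic)
correct_bytes = sum(f * b for f, b, ok in traffic if ok)

print(f"accuracy by flows: {correct_flows / total_flows:.0%}")  # 90%
print(f"accuracy by bytes: {correct_bytes / total_bytes:.0%}")  # 18%
```

Same box, same traffic - and the vendor's "90% accurate" claim quietly becomes "we misclassified 82% of the bytes".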

But the ECJ judgement would seem to suggest that's not good enough.

I'm quite glad that most of the vendors' product and marketing people don't work in the pharmaceuticals industry. Wrongly diagnosing 15% of patients would probably not be acceptable.... Even in other technology areas (eg anti-spam software) it would be near-useless. As I've said before about video, I think there are some very interesting use-cases for consent-based policy with involvement of the user and upstream provider, but it seems that the network *acting on its own* cannot be trusted to detect applications with sufficient accuracy, at least under stricter legal supervision from bodies like the ECJ.

Thursday, 24 November 2011

My thoughts on Ofcom's Net Neutrality statement

Posted on 08:12 by Unknown
Ofcom has finally put out its statement on how it views Net Neutrality today. I'm quite impressed - it seems like a sensible compromise between a ton of vested interests and ideologies. If I'm honest, I'm also quite pleased because it has a broadly similar tone to my own submission to the consultation last year and my proposed "code of conduct" on this blog.

Some highlights:

  • The distinction between vanilla best-efforts "Internet Access" and "Managed Services", which can be prioritised
  • "If a service does not provide full access to the Internet, we would not expect it to be marketed as Internet Access"
  • The stipulation that the Internet Access portion should remain neutral, unless absolutely necessary for traffic management - in other words, Internet Access as a product means "The Real Internet" (TM)
  • Sensible discussion of two-sided business models
  • Recognition of the value of the "open Internet"
  • General view that competition seems to work quite well in broadband access, as long as it remains easy enough to switch between providers
  • Ofcom has seemingly booted the idea of a generalised two-sided "traffic tax" out of consideration. (Luckily, the ludicrous telco-sponsored ATKearney report on a "viable future model of the Internet" seems to have been laughed out of court - pure #telcowash that ATK should have been ashamed of)
  • Making the point that any traffic management applied to Internet Access should be non-discriminatory between similar services (eg throttle one source of video, throttle them all). This is similar to the situation the Israeli regulator on my panel at the BBTM conference last week was expounding (ie "fair" rather than "reasonable" traffic management).
  • That traffic management practices need to be made very clear at point of sale and on an ongoing basis
  • Reservation of capacity for managed services should not unnecessarily impede the QoS of best-efforts Internet Access. No hard-and-fast rules, but a clear "we're watching you" with an implicit threat of more specific regulation if operators misbehave.
  • Marketing should be based on average speeds rather than theoretical peaks
  • It seems that traffic management is deemed OK for dealing with congestion, but not for dealing with variable costs on uncongested networks ("traffic management may be necessary in order to manage congestion on networks"). That means that various of the video optimisation use cases I discussed in my post this morning would be forbidden.
  • Ofcom has washed its hands of dealing with "public service" content and how that should be dealt with. Basically, that's kicking the problem of the massively successful BBC iPlayer upstairs, for the politicians to decide about.
One thing to note here is that I expect all operators to have to sell "Internet Access" as part of their offers, if not by law then in practical terms. If I buy a phone contract from an 18yo trainee on a Saturday morning and ask "does it come with Internet access?" then I'll have grounds for complaints about mis-selling if it subsequently turns out that I've been sold "processed Internet-like substitute", or "I can't believe it's not Internet". Ofcom's left it a bit vague about selling Internet Access with caveats, which suggests that burying the detail in the fine print won't be acceptable either.

Overall, this strikes me as a good compromise. If operators' and vendors' QoS systems can genuinely improve on what today's best-effort Internet can do, without damaging the utility of that vanilla Internet Access for other customers, then it's entirely reasonable for them to monetise it. Making operators sell the vanilla service in terms of average speeds (ideally average latency as well) means that the minimum QoS floor is defined when the customer buys the service. If I buy an average 2Mbit/s connection, then I don't really care if someone else gets 5Mbit/s priority turbo-boosts or whatever.

That said, as always the Devil will be in the detail. An average 2Mbit/s is pretty useless if it's really spiky. But Ofcom has reserved the right to give harder targets for QoS in future, which will likely be punitive if operators transgress what's reasonable. It seems to suggest that enough capacity to watch Internet video - and therefore, implicitly, VoIP - is likely a reasonable compromise for best-effort access.
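The "spiky" problem is easy to show with a toy sketch - all the numbers below are invented, but the two connections advertise the same headline average while delivering utterly different experiences:

```python
from statistics import mean, pstdev

# Throughput in Mbit/s, sampled once per second, for two hypothetical connections
steady = [2.0] * 10
spiky  = [8.0, 0.1, 0.1, 8.0, 0.1, 0.1, 0.1, 3.2, 0.2, 0.1]

USABLE = 0.5  # Mbit/s: rough floor for, say, SD video or decent VoIP

for name, samples in [("steady", steady), ("spiky", spiky)]:
    usable = sum(1 for s in samples if s >= USABLE)
    print(f"{name}: mean {mean(samples):.1f} Mbit/s, "
          f"stdev {pstdev(samples):.1f}, "
          f"{usable}/{len(samples)} usable seconds")
```

Both report "average 2Mbit/s", but the spiky line is usable for video only 3 seconds in 10 - which is why an average-speed rule without some latency or consistency floor is only half an answer.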

This fits with my general view that operators are not going to be able to "monetise" stuff that works perfectly well now, just by putting a tollgate in the way. They'll have to add value and make it better - implicitly meaning that network QoS has to fight its corner for content companies and developers, against other options like better software/codecs/UI. In other words, if operators want a revenue-share from Google or Facebook, they're going to have to help them earn MORE money than they're already doing, not try and take a share of their existing business.

I suspect that the disparaging term "OTT" might finally disappear when T-Mobile's account director sits down with his YouTube counterpart to agree a new affiliate deal....


[Please sign up for my blog email distribution list - see form on top-right of the web page. If you're reading on RSS, then please click through in the browser!]


Mobile video optimisation - different perspectives

Posted on 00:46 by Unknown
I had a briefing call with one of the mobile video optimisation companies yesterday, discussing the role played by network boxes that can compress, rate-limit and otherwise change a downloaded video stream.

It's an area I've been critical of for a while - especially where the network 'transparently' alters video content, without the explicit consent of either video publisher or end-user. I recognise that's a useful short-term fix for congested networks as it can shave 20-30% off of data throughput, but I also think that it's not a viable model in the long term.

Basically, publishers don't like the idea that some device in the data path "messes with our content", and in many cases users and regulators won't like it either. Some of the vendors suggest that reducing video "stalling" by optimisation actually improves the quality of user experience (QoE), but there are also other approaches to that which are more "consensual".

It was interesting to read the other day that Verizon's Video arm (formerly V-Cast) is working with its network teams to pre-define properly "optimal" formats for video depending on network connection, device type (ie screen & processor) and even user dataplan. In my mind, that's totally acceptable, as the content publisher/aggregator is working hand-in-hand with the network team to create a balance between QoE, network integrity and "artistic integrity". Collectively, they've thought about the trade-offs for both network and user in terms of quality, cost and performance. Presumably they also looked at the role of CDNs, adaptive bitrate streaming, on-device caching and assorted other clever mobile-video tech. According to a related article, they can't use WiFi offload because their content rights agreement is for 3G/4G only - another good illustration of the complexities here. All in all, it sounds like a great example of "holistic" traffic management, working with the various constraints imposed. I assume they have a roadmap for further evolution as it all matures.

But there's a million miles between that philosophy, and the type of non-consensual optimisation that's increasingly common for normal Internet video traffic destined for mobile devices, where the network unilaterally decides to alter the content. We're seeing solutions getting a bit smarter here - for example, only acting if the cell is actually busy, but there are various other use cases where it's more directly about the operator's costs.

The vendor I spoke to yesterday mentioned scenarios like shared networks (where each operator pays a share of costs based on traffic volumes), or where the backhaul is obtained from a third-party fixed operator at variable cost. Another scenario I've heard (usually more for caching / CDNs) is around international transit in parts of the world with expensive connectivity. But another use case was where the network is uncongested but high-quality video is being downloaded at high speed. In that instance "we can take 20% of the data stream off the top, and they won't notice".

Here is where we differ. In my view, if I (as a customer) pay for Internet Access, then I expect that the bits and bytes that come out of the server are the same bits and bytes that arrive on my device. The server owner thinks the same. Yes, in certain circumstances I'll accept a trade-off if it improves my QoE, as long as I am told what's happening and opt in: imagine an icon appearing, indicating "optimiser on", or a switchable user-controlled optimiser like Onavo's for reducing roaming traffic. But I'm not prepared to accept a lower quality just because the operator has been stupid enough to agree contracts for a shared network, with terms that don't take account of the types of retail broadband service it's selling to its customers.

"Oh, I'm sorry Mr Bubley, we know you paid £1000 for the flight, but we've downgraded you to economy class because it's a codeshare flight, and we have to pay our partner airline £500 extra per-seat for business class passengers". I'm sorry, but that's your problem and not mine.

And as for the notion that the Internet connection can arbitrarily decide that "I won't notice" if it chops up & reformats my data? Hello? How do you know that? Maybe I've encoded something into it steganographically in the background? Maybe I've PAID for content of a specific resolution, and I have a contract with the publisher? Maybe I'm willing to take the trade-off of throttled, buffered throughput, because it's going to a background app for viewing later? If I've paid for my 2GB per month, I'll use it the way I like, thank you very much. If the network's busy, fine - tell me (and/or the publisher) & I'll use WiFi or download it later, or *I* will make the decision to drop the quality if I want instant gratification.

I've got a lot of sympathy for operators that have genuine peaks and troughs of demand, or that are constrained in terms of supply by spectrum shortages or difficulties obtaining cell sites. But I've got less sympathy for companies that sell a product ("2GB of mobile Internet access at up to 10Mbit/s") and then realise they can't deliver it, because it doesn't match up to their network's capabilities or cost structure.


But it struck me that the reason that telecom operators (especially mobile) think that this is OK, is that degrading quality is at the core of their business proposition. For the last 100 years or so, we've had to deal with the fact that our natural, wideband human speech has been squished into 3KHz pipes for transport across the network. It's "optimised" (ie downgraded) at the microphone on your handset. Of course, now we have service providers trying to monetise HD voice - and expecting you to pay a premium just for transporting what your vocal cords have been putting out all along.

I'd love to know if there's internal transfer-pricing going on at Verizon for the video service. Is VZW actually "monetising" its optimisation capabilities and charging the Video department for the privilege? Or is it being done for mutual benefit, without money changing hands? I suspect the latter, which is the model I see most (but maybe not all) consensual video optimisation working out.

Oh, and if I ever go for a drink with this particular optimisation company exec, I'm going to have a word with the barman. I'll tell him to pour 80% of his pint of beer as normal, but top up the other 20% with coloured water. I'm sure he won't notice.

Tuesday, 22 November 2011

Another reason why application-based charging for mobile data won't work

Posted on 01:36 by Unknown
I was at the Broadband Traffic Management conference in London last week, one of the largest events in the calendar for 3G data networks and policy/traffic management and charging solutions. I spoke to a wide range of vendors and operators, and moderated an afternoon stream about dealing with mobile video.

I came away from the event with a number of my beliefs about policy, WiFi offload, video optimisation and operator "politics" strengthened, plus a number of new learnings and perspectives that I'll be sharing either on this blog, or in a report in early 2012. This particular post covers a couple of things about "service-based" or "application-based" charging and policy.

(As an aside: I'm going to be boycotting the BBTM event in 2012, for numerous reasons, not least of which was the ridiculous decision to host it in a place with no decent cellular coverage and £20 / day WiFi. I know from organising my own events that organisers have a lot of negotiating power with venues about the "delegate package". If the venue refuses because it has 3rd-party run WiFi with an inflexible contract [this venue used Swisscom] then go somewhere else. It's inexcusable).

I've said on numerous occasions (eg here, here and here) before that I don't believe that operators can (in general) successfully design mobile data or broadband services around application-specific policies and pricing. Despite continued hype from the industry and standards bodies, the network cannot and never will be able to accurately detect and classify traffic, applications or "services" on its own. With explicit cooperation from third parties, or sophisticated on-device client software hooked into the policy engine, there's a bit more of a chance.

But I continue to hear rhetoric from the network-centric side of the policy domain about creating "a Facebook data plan", or "charging extra for video", or "zero-rating YouTube". I'm a serious skeptic of this model, believing instead that policy will be more about location, time, speed, user, device, congestion and other variables, but not an attempt to decode packets/streams etc. in an effort to guess what the user is doing. However, lots of  DPI and PCRF vendors have spent a lot of money on custom silicon and software to crunch through "traffic" and have promoted standards like 3GPP's new "Traffic Detection Function", and are now determined to justify the hype.

Much of the story fits with the usual attitude of punishing (or "monetising") the so-called OTT providers of applications and content, by enabling the network to act as a selective tollgate. On a road, it's easy to differentiate charges based on the number of wheels or axles a vehicle has, as you can count them. Not so for mobile data - some of the reasons I'm skeptical include mashups, encryption, obfuscation, offload, Web 2.0, M&A between service providers and so on. (And obviously, national or international laws on Net Neutrality, privacy, copyright, consumer protection and probably other bits of legislation).

But during the BBTM, I came to a neat way to encapsulate the problem: timing.

Applications change on a month-by-month or week-by-week basis. The Facebook app on my iPhone looks different to me (and the network) when it gets upgraded via the AppStore. I talked to a network architect last night about the cat-and-mouse game he plays with Skype and its introduction of new protocols. Not only that, but different versions of the app, on different devices, on different OSs, all act differently. And according to someone I met at BBTM, different countries' versions of different phones might interact differently with the network too. And the OS might get updated every few months as well.

Operators can't work on timescales of weeks/months when it comes to policy and charging. The business processes can't flex enough, and neither can customer-facing T's and C's. How do you define "Facebook"? Does it include YouTube videos or web pages shared by friends viewed *inside the app*? What about plug-ins? Who knows what they're going to launch next week? What if they shift CDN providers so the source of data changes? 

Unless you've got a really strong relationship with Facebook and hear about all upcoming changes under NDA, you'll only find out after it happens. And then how long will it take you to change your data plans, and/or change the terms of ones currently in force? What's the customer service impact when users realise they're charged extra for data they thought was zero-rated?
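A toy sketch of why static rules rot: a zero-rating rule keyed to hostnames silently stops matching when the app shifts CDN. The rule table and hostnames below are hypothetical, just to show the failure mode:

```python
# An operator's hypothetical zero-rating rule table, keyed on hostname suffix
ZERO_RATED = {"m.facebook.com", "fbcdn.net"}

def is_zero_rated(hostname: str) -> bool:
    """Naive suffix match against the rule table - the kind of static rule
    a dataplan might be built on."""
    return any(hostname == d or hostname.endswith("." + d) for d in ZERO_RATED)

print(is_zero_rated("scontent.fbcdn.net"))             # True today...
print(is_zero_rated("scontent.newcdn-partner.example"))  # ...False after a CDN switch
```

Nothing broke, no alarm fired - the content simply started billing against the user's quota, and the first the operator hears of it is the customer-service complaint.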

And if you think that's bad, wait a year or two.

As we move towards HTML5 apps, I'd expect them to become more personalised. My Facebook app and your Facebook app might be completely different, just as my PC Facebook web page is. Maybe I've got encryption turned on, or maybe Mark Zuckerberg sets up the web server to put video ads on my wall, but not yours. Maybe I'm one of 5 million people out of 800 million who's testing a subtly different version of the app? Or that has a Netflix integration? Websites do that all the time - they can compare responsiveness or stickiness and test alternative designs on the real audience. And because it's all web-based, or widget-based, much of that configuration may be done on the server, on the fly.

How are you going to set up a dataplan & DPI that copes with the inherent differences between dean.m.facebook.com and personX.m.facebook.com? Especially when it changes on a day-by-day or session-by-session basis?

Yes, there will still be fairly static, "big chunks" of data that will remain understandable and predictable. If I download a 2GB movie, it's going to be fairly similar today and tomorrow. Although if I stream it using adaptive bitrate techniques, then maybe the network will find it harder to drive policy.

EDIT - another "gotcha" for application-based pricing is: How do you know that apps don't talk to each other? Maybe Facebook has a deal with Netflix to dump 8GB of movie files into my phone's memory (via the branded Facebook app & servers), which the Netflix app then accesses locally on the device? This is likely to evolve more in the future - think about your PC, and the way the applications can pass data to each other.

One last thing from the BBTM conference: several speakers said (and I heard several private comments) that the big pain is still signalling load of various types, not application data "tonnage". I've yet to hear a DPI vendor talk convincingly about charging per-app or per-service based on signalling, especially as much of the problem is "lower down" the network and outside the visibility of boxes at the "back" of the network.

Yes, we'll continue to see experiments like the Belgian zero-rating one I mentioned recently. But I expect them to crumble under the realities of what applications - defined in the user's eyes, not the network's - really are, and how fast they are evolving.

UNSUBTLE SALES PITCH: if you want a deeper understanding of how application changes will impact network policy, or the fit of traffic management with WiFi offload, CDNs, optimisation, devices and user behaviour, get in touch to arrange a private workshop or in-depth advisory project with Dean Bubley of Disruptive Analysis . Email information AT disruptive-analysis DOT com