

Wednesday, 22 December 2010

Uploaded - recent free white papers on mobile data policy management

Posted on 04:29 by Unknown
A month or so ago, I announced that I'd written and published three detailed white papers, plus an introductory document, covering new use cases and concepts for what Disruptive Analysis referes to as "holistic" mobile data policy management. These look beyond the simplistic early 'silo' approaches to DPI, policy, optimisation and offload, many of which have been arbitrary and often user-unfriendly.

These are now available for download via Scribd.

The introduction is available here:
http://www.scribd.com/doc/45784870/Mobile-Policy-Management-White-Paper-Series-Intro
  • Device-aware: not just what brand and model, but over time much more granular detail about OS version, firmware, connection managers, security, power management and the ability to communicate about network status and policy with the user. Increasingly, network vendors and operators will need to link network infrastructure boxes to on-device clients. This also ties in with application awareness - particularly around dealing with mashups, VPNs and so forth. See & download from here
  • Bearer-aware: the policy infrastructure will need to be much more informed about the status of the radio connection(s) - what technology, femto vs. macro cells, whether WiFi is available and suitable, what is happening with signalling load, whether congestion is actually occurring at a given time/cell and so on. See & download from here
  • Offload-aware: whether data is being (or should be) routed via WiFi, femtocells, RAN offload and so on - and whether this should be managed or unmanaged. There are many variables here, and many use cases, such as the ability to use multiple networks simultaneously, "selective offload" (SIPTO / LIPA) and so on. See & download from here
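The "holistic" idea above - folding device, bearer and offload context into a single policy decision - can be sketched as a simple data structure. This is purely illustrative: the field names and the decision rules are my own invention, not any vendor's schema.

```python
from dataclasses import dataclass

@dataclass
class PolicyContext:
    # Device-aware inputs
    os_version: str
    has_connection_manager: bool
    can_notify_user: bool          # can the device show network/policy status?
    # Bearer-aware inputs
    radio_tech: str                # e.g. "HSPA", "LTE"
    cell_type: str                 # "macro" or "femto"
    cell_congested: bool           # measured now, not assumed from time of day
    # Offload-aware inputs
    wifi_available: bool
    offload_managed: bool

def policy_action(ctx: PolicyContext) -> str:
    """Illustrative 'minimum necessary' policy: only intervene when
    congestion is actually occurring, and prefer offload to throttling."""
    if not ctx.cell_congested:
        return "no-action"                 # quiet cell: no throttling justified
    if ctx.wifi_available and ctx.has_connection_manager:
        return "suggest-wifi-offload"      # steer traffic rather than degrade it
    if ctx.can_notify_user:
        return "notify-and-deprioritise"   # transparent, proportionate control
    return "deprioritise"

quiet = PolicyContext("2.3", True, True, "HSPA", "macro", False, True, True)
print(policy_action(quiet))
# A quiet cell yields no intervention at all, echoing the proportionality
# point below: blanket policies applied regardless of congestion fail this test.
```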
Regulators such as the FCC and Ofcom seem to be moving towards policies on traffic management & Net Neutrality along the lines of "minimum necessary" or "reasonable" control of traffic by operators. This means that any policy enforcement will need to be reasonable, proportionate and context-specific. Arguably, there is no justification for unnecessary throttling or compression at quiet times / cells, unless you live on a Pacific island and IP transit is expensive. There is certainly not likely to be justification for much arbitrary discrimination between websites or applications, especially if this is not done with full transparency.

Each of these issues is covered in a separate white paper, plus there is an overview introduction.

This is an area I cover in a lot of depth. If you are interested in an internal workshop, advisory consulting project, or need an external speaker for an event, please get in touch at information AT disruptive-analysis DOT com

(Note: The papers have been kindly sponsored by the folks at Continuous Computing, who have given me a completely free rein to write about topics that are interesting, and which hopefully push forward the industry thinking about how better to control & monetise mobile data). 

Tuesday, 21 December 2010

Why application-specific policy and charging will (almost) always fail

Posted on 23:42 by Unknown
Over the last couple of days, the technology blog and Twitter world has reacted with righteous anger to this slide and the underlying sentiments around Net Neutrality, especially given the FCC's announcements recently.

I don't think Openet and Allot have done themselves any favours with that webinar chart. Not only was it guaranteed to cause a storm of controversy, it's all made 100x worse by the fact that that sort of app-centric policy / charging mechanism simply will not work, whether technically, commercially or (probably) legally. There are so many workarounds, pitfalls, gotchas and risks that any carrier would be crazy to even try it, especially in developed countries.

On the other hand, it's the sort of slide that will make some mobile operator marketing people salivate, especially the more gullible and technically-ignorant among them, so I guess I can cut the two vendors some slack. In all probability, there are some operators which will try to do this sort of thing anyway, so if they're going to waste their money, then I can understand the vendors thinking "well, they might as well waste it with us rather than the other guys".

[Sidenote - if you work for an operator and want to get a detailed view of why this is a *really bad* idea, please get in touch with me to arrange a workshop]

I'm supposed to be on holiday & am waiting for a rebooked flight, so I'm not going to give the long list of problems. But just to get the debate going, let's start with:

  • Customer dissatisfaction & churn
  • Likely regulator disapproval & censure
  • Stuff inside VPNs
  • Mashups (what happens to the YouTube video embedded in a Facebook page?)
  • Who pays for *uploads*?
  • Acceptable level of "false positives" and the legal liability associated with that
  • Need for network ops guys to *really* understand applications & how they evolve on a daily basis
  • Lack of anybody with the job title "policy manager" who grasps and has authority over all the necessary bits of the puzzle - radio, core network, app trends, devices, user behaviour, legal issues, PR, partnership etc
  • Multiple fraud possibilities
  • Fit with CDN architecture & difficulty of charging Akamai
  • Probability of key Internet companies turning the tables on the operator and saying "no, you pay *us* or else we're going to block access to your subscribers, suggest they churn, and give free adverts to your competitors"
  • It gives implicit permission for fixed operators to charge your customers to use your femtocells on their DSL/cable
  • Doesn't reflect signalling load in the RAN
  • Difficulty in predicting future direction of applications. Is Facebook's new email service going to be classified as Facebook, or email? What happens if Facebook buys Skype?
  • Tons of arbitrage opportunities
  • Lack of decently flexible upstream billing & reconciliation systems
  • Need to have perfect coverage before charging for incremental services
And plenty more.
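To make the mashup bullet concrete, here's a toy sketch of hostname-keyed classification, the crude end of what DPI does. Every hostname, rate and traffic figure below is invented for illustration; real systems are more sophisticated, but the structural problem is the same.

```python
# Toy DPI-style classifier keyed on hostname, to illustrate the mashup
# problem. Rates are invented: one "partner" app zero-rated, one charged.
APP_RATES = {"facebook.com": 0.0,    # zero-rated partner, say
             "youtube.com": 0.01}    # charged per MB, say
DEFAULT_RATE = 0.02                  # everything unrecognised

def classify(host: str) -> str:
    for known in APP_RATES:
        if host.endswith(known):
            return known
    return "unknown"

# A Facebook page embedding a YouTube video generates flows to BOTH services.
page_flows = [("facebook.com", 0.2), ("static.facebook.com", 1.1),
              ("youtube.com", 45.0)]   # MB per flow, invented numbers

charge = sum(APP_RATES.get(classify(h), DEFAULT_RATE) * mb
             for h, mb in page_flows)
print(f"£{charge:.2f}")   # the user "only visited Facebook"...
```

The user never left Facebook, yet gets billed for 45 MB of "YouTube". And a VPN collapses every flow into a single "unknown" bucket, which is the first two bullets above in four lines of code.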

Now most of the anger is around the general philosophy of Net Neutrality and whether operators should be "allowed" to do this sort of thing. I'm actually fairly, er, neutral on that. It's simply that this sort of approach is woefully flawed, and in any half-competitive market will lead to immense customer dissatisfaction and churn.

It's not wrong or evil. It's just naive.

Now... that doesn't mean that *all* application-based policy use cases are going to fail. There are some which I can see having value, or which are less prone to outright failure. I can see "zero-rating" of specific partner apps being valuable, because that will encourage the provider to collaborate with the telco. Although it's probably easier to just put those in a separate APN rather than sift through the main data-stream. I can also see the arguments for rate-limiting of P2P traffic in congested cells, especially uplink. In the fixed world, I can certainly see the role of this for specialised services which use the broadband connection, but which are not Internet-based (eg IPTV or home security or specific cloud apps).

But the idea that an end-user will pay extra per-application, for major Internet services like Skype or Facebook, on their broadband access? Utter nonsense. As I wrote in my 2011 predictions earlier this week, there's a huge opportunity on the horizon for Policy and DPI. But the slide on Engadget is 100% the wrong way to attempt to monetise it.

Monday, 20 December 2010

The inevitable 2011 predictions - successes and failures

Posted on 03:49 by Unknown
It's the time of the year for snow-induced travel chaos and long lists of predictions. So before I chance my luck at Heathrow later today, I thought I'd put my stake in the ground with an assessment of winners and losers for 2011.

I'm focusing here on my main research themes at the moment:

  • Mobile broadband and 3G/4G, technology and business models
  • Mobile data traffic management and the network core
  • New telco strategies and revenue opportunities
  • Next-gen communications services
  • Devices and smartphones

HSPA/+ and smartphones for data - Really, the big story in 2011 is going to be "more of the same". More smartphones at both low and high ends, more dongles, and slow but certain growth in new device categories. Despite the fervour over LTE, it's going to be the growing maturity of "normal" 3.5G which is going to be driving the industry, if not the headlines. This will have a number of knock-on effects for both infrastructure and device vendors. Increasing densification of 3G networks in urban areas, along with network-sharing deals, will be top of the list for many operations and strategy departments. Policy (discussed below) will continue to be an issue.

Shifting demographics and usage patterns (eg cheap Android phones, new & unpredictable app adoption) will drive the need for coverage and capacity in unexpected places. A growing volume of prepay data users in Europe, Latin America and Asia will start using 3G, and rapidly ramp up their data consumption both via smartphones and featurephones. Many of those incremental users will be using "vanilla" unsubsidised devices, which are more difficult for operators to control.

Infrastructure vendors will continue pitching both HSPA equipment and managed services, while talking-up LTE. However, operators in developed markets will start considering whether fully-outsourced networks - while offering predictable Opex - might actually reduce their strategic flexibility. Disruptive Analysis believes that "under the floor" players have the potential to be even greater threats to telcos than the much-maligned so-called OTT providers, hollowing out operators' core skillsets, source of value and key differentiator - the network itself. 2011 success rating: 10/10


Evolved policy management - Linked to the growth of mobile data, and the shifting revenue/cost equations of mobile broadband networks, Disruptive Analysis expects to see a continued intense focus on traffic and policy management solutions. A particular theme will be a move towards more "holistic" approaches, shifting from the easy-but-crude approaches to managing content at the GGSN towards much more RAN-centric and device-centric approaches. Using an optimisation server stuck in the core of the network to squash down video content will be viewed as a primitive solution - not least by regulators, who will likely flex their stance on Net Neutrality to allow greater levels of active traffic management, but only when it is actually essential. Applying blanket policies in the network core, irrespective of actual or predicted congestion levels in the radio, and without a user's-eye view of QoE (no, not QoS) will start to become a competitive disadvantage.

As part of this, expect to see more refined pricing / charging strategies. New ways of tiering charges by volume, or by speed / time / device are very likely. However, we are still unlikely to see successful deployment of the mythical gold/silver/bronze tiers of service beyond speed caps. Ultimately, the largest gating factor on QoE is the radio network, and no amount of QoS-twiddling is going to make a big difference if your house is on the edge of the cell. Device-specific pricing will have far more success than attempts to charge differentially for certain Internet applications. Despite DPI vendor hype, the network does not really understand applications, just traffic types. Some broad categories might be applied (eg P2P), but attempts to try to charge differentially for VoIP or social networking or even video are doomed to failure. There will also be growing recognition that the "tonnage" way of metering mobile data by GB is not really a useful or appropriate way to align usage with actual costs of running the network. 2011 success rating: 9/10
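The mismatch between "tonnage" and actual network cost can be shown with a deliberately crude model. The cost function and all the figures here are mine, purely for illustration: the point is that per-GB billing is blind to the one thing that actually costs money, which is busy-hour load in congested cells.

```python
# Two users with identical monthly "tonnage" but very different network cost.
# Cost is modelled (crudely, and purely illustratively) as GB carried during
# the busy hour in congested cells - which flat per-GB billing ignores.
def network_cost(sessions):
    # sessions: list of (gb, at_busy_hour, cell_congested) tuples
    return sum(gb for gb, busy, congested in sessions if busy and congested)

off_peak_user = [(1.0, False, False)] * 5   # 5 GB, all off-peak, quiet cells
peak_user     = [(1.0, True, True)] * 5     # 5 GB, all busy-hour, congested

print(network_cost(off_peak_user), network_cost(peak_user))
# Same 5 GB bill under tonnage pricing; radically different cost to serve.
```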


Own-brand operator OTT services: This is going to be the big surprise of 2011. I've been talking about access-independent services for some time, but it's still been a taboo among many operators. The idea of Operator #1 launching services targeting Operator #2 and #3's access customers has sat very uneasily with the clubby, collaborative, "we're all in this together against Google and Facebook" interoperability mentality of the telcos. This ethos is now fraying. Various collaborative efforts (think RCS....) have ended in failure as coalitions of the losers. Most operators, either publicly or privately, through major efforts or skunkworks, are now thinking about decoupling access and service, and attempting to play the Internet companies at their own game. This is absolutely the right approach, and one which is currently underestimated by many - who will find out to their cost when they lose first-mover advantage.

Now the devil here is in the detail - Vodafone 360 had great vision but has suffered from poor execution. Going forward, there should be some serious innovation around VoIP (both telephony and other voice applications), social networking, OTT-style social TV for fixed broadband, and various enterprise / cloud offerings. There will be out-of-region expansion (perhaps AT&T in Europe, or Orange in the US) via smartphone apps, or on-PC clients. Third-party hardware is also an option - femtocells already work over other telcos' networks, and that model is likely to extend to set-top boxes and maybe tablets as well. 2011 success rating: 8/10  [Note: this one is success relative to general expectations & consensus (ie none) - clearly it's not going to change the industry overnight, but it will point the inevitable way to the future]

Femtocells - Various people have been asking me if 2010 was the "year of the femto", which I've been hesitant to agree to, as I see market evolution more as a steady progression. Nevertheless, the femto concept has been hugely validated by a number of high-profile launches, and we are likely to see continued growth in 2011. In particular, we will see a move from a coverage-led proposition to one more based around capacity and offload. There will also be an early move towards use of femtos in public spaces or "macro femtos" for capacity enhancement in the street, perhaps attached to street furniture or cable plant. However, obstacles still exist - not least the femto model for prepaid users who are using mobile broadband as a substitute for DSL or cable contracts (a lack of prepaid deals for fixed broadband is an ongoing omission of the industry, leaving money on the table). We will also see some early interest in LTE femtos, although the concept of building a completely "inside-out" network does not seem to have legs immediately. Disruptive Analysis also remains skeptical of the market for enterprise femtocells, except simple installations for SMEs. 2011 success rating: 7/10



LTE for data - Although HSPA will continue to go from strength to strength, there is a groundswell of both optimism and activity on LTE. Not only that, but a stream of spectrum auctions - especially 800MHz digital dividends - is coming through, which will almost inevitably be used for LTE deployment. Disruptive Analysis does not expect to see a wholesale switch to LTE, even in markets where it is deployed, but certain operators will be very aggressive, especially CDMA operators that are struggling to evolve their EVDO networks to match the reality of HSPA+. Other markets will use early LTE deployments as flagships and testbeds, especially for PC users. LTE smartphones will start to emerge during 2011, but will likely gain minimal uptake outside of the US and Japan. There will be lots of noise about LTE M2M, but remarkably little action. 2011 success rating: 6/10

Tablets - I've been somewhat surprised by the iPad's success, but I still don't see tablets as a major game-changing trend, beyond the "Apple Effect". I still think that the iPad is primarily a nice (and for some groups of people also very useful) gadget which complements their PC/Mac and smartphone usage. I don't see massmarket Android platforms fulfilling the same roles and I certainly don't see tablets heralding some sort of mythical "post-PC" era. I do see them as becoming important for "social TV" use cases in the home, though. I expect to see a declining % of tablets with embedded/activated 3G radios going forward - the bulk will be WiFi-only. 2011 success rating: 5/10



Two-sided operator business models - I spend a lot of my time examining realistic ways that operators might monetise "non-user" sources of revenue. At the moment, I'm seeing something of a mixed bag. Certain niches such as M2M are showing signs of growth. Others such as (some) operator app-stores have potential. Mobile advertising is growing but still small at an absolute level. There is growing interest in the "personal information economy" around identity and authentication, but it is early days and operators have various commercial, technical and regulatory challenges.

The willingness of telcos to enable "comes with data" models for new devices has been slow, largely because of the constraints of the legacy "subscription mindset". The SIM/subscription model is extremely ill-suited to most consumer electronics products, for example. Telcos have also been very slow at releasing and marketing developer-centric APIs on a commercial basis, focusing too much on capabilities well-served by alternative solutions, such as SMS or location. Instead, they should focus on more useful & unique information sets for developers, such as network congestion APIs, or mechanisms for communicating device capabilities, status and customer tariff plan.  

Another area that operators are in danger of failing on is around broadband prioritisation and attempts to derive revenue from "upstream" players such as Netflix or Google. The wrong way to do this is via confrontation or trying to enforce some sort of "pay per GB" toll. The ridiculous operator appeals to the EU for a Google tax earlier in 2010 are classic examples of what not to do: "They're clever, we're dumb, please punish them". Frankly, Google and Netflix have more of a moral (and perhaps legal) argument to charge telcos, not vice versa. Instead, MNOs need to focus on "success-based" broadband value-adds, with measurable SLAs, not just vague promises of QoS. Cablecos pay for good TV content for their access customers - telcos may be prepared to pay for good Internet services for their broadband subs, as they have more to lose in a "switching war". Another option would be a Google-style affiliate model, where clear telco-enabled uplifts to advertising (by QoS, customer data targeting etc) would be rev-shared. However, I'm doubtful that we'll see a switch from aggression to pragmatism in many cases. 2011 success rating: 4/10

NFC - I've been a long-term critic of NFC, especially for mobile payments. The noise around the technology has risen to a crescendo recently, with Orange launching a major initiative in 2011, and a three-way project by AT&T, Verizon and T-Mobile. Apple and Google are also rumoured to be joining the fray. I'm still very skeptical, especially when it comes to the more ridiculous notions of "bill this to my phone account", or attempts to integrate Oyster-style travel passes or hotel room keys onto handsets. None of these make obvious sense as operator business models. However, I wouldn't discount the chance of Apple or Google in particular doing something clever, such as an NFC-based way of authorising music purchases, or perhaps accessing specific services in particular locations. 2011 success rating: 3/10

LTE for voice - Despite talks of trials of VoLTE, Disruptive Analysis does not expect to see successful massmarket voice services launched on LTE for several years, with the possible exception of NTT DoCoMo. There are huge barriers to getting any form of LTE voice working well, especially to the levels that can be experienced with a $20 GSM phone. There are likely to be many, many challenges in getting primary telephony working well - we'll probably have to go back to the drawing board in terms of voice quality as well as the network side of things. Echo cancellation, packet loss concealment, all the clever stuff the Internet VoIP guys have been doing for years will need to be replicated in LTE mobile. It is conspicuous that the GSMA VoLTE initiative focuses on the middle of the network (interoperability) and much less the factors that drive customer experience (battery life, on-device call quality and so forth). In many ways, VoLTE is mis-named - it's really just ToLTE (telephony over LTE), and not a generic multi-app voice architecture, starting from first principles of "what are the use cases for voice in future, and how should these be supported?"

The network might be good at minimising packet loss and jitter, but the really difficult stuff in mobile VoIP is what to do with the data stream when packets are lost. Cure is actually more important than prevention, especially where RAN coverage will inevitably introduce problems. As for CSFB (circuit-switched fallback), it's a disaster in the making: call setup times, breaking LTE data connections and so on. VoLGA? Maybe, but it needs either handset vendor support or perhaps a good OTT-style client implementation. Disruptive Analysis maintains that the best short-term solution for voice with LTE is one of the afore-mentioned $20 phones, and a strip of Velcro.   2011 success rating: 2/10

I'm struggling to pick out a single outright loser for the 1/10 spot (or even zero/10), so instead a quick list of next year's biggest turkeys:

  • IMS RCS. 'Nuff said.
  • Any new operator-specified handset OS.
  • Federated operator-run cloud services (although single-operator clouds have good potential). Likely to get hyped as a triumph for interoperability, and therefore the best candidate for a new losers' coalition.
  • Subscription-based Mobile TV (as always). Yes, there's a handful of iPad video addicts. And yes Japan uses terrestrial TV-to-the-phone. But massmarket TV on massmarket phones? No, just as unlikely as in 2006 when it was last hyped.
  • It's about time for the FourSquare-style "check in" model to implode under its own preposterousness. The only good thing about FourSquare & peers is that it's a great way to know where not to go, if you want to avoid meeting dull social-media people.
  • Paid apps for Android, Symbian and BlackBerry. There's a strong correlation between a willingness to spend $$$ on apps and willingness to give Steve Jobs $300 gross margin per device. Everyone else counts their pennies & will go for cheap/free wherever possible.

And I know I'm normally opinionated about everything, but I also have a few question marks, where I'm really not sure at the moment:

  • Operator-run appstores or app-malls. Might work out in certain contexts, but I can't see them worrying Apple. 
  • Large-scale M2M. I'm still unconvinced that a zillion sensors and fridges are going to get networked up. And if they do, they also guarantee that GSM/GPRS will be around for the next 30 years, and we won't be refarming much 2G spectrum any time soon.
  • New variations of the SIM card (eg OTA IMSI-updating, multi-IMSI, wholesale & switching etc)
  • Augmented reality
  • Whether Nokia can stage a return to its glory days with MeeGo
  • Will a major Internet / IT company buy into cellular, either via acquisition or as an MVNO
  • Satellite-based mobility

I'm going to be away for the next couple of weeks, so probably won't have a chance to review and respond to many comments - but I will try to get online at various points to review any feedback.

Happy New Year to all....

Now, let's see if I'm one of the favoured few to get out of LHR this afternoon.

Thursday, 9 December 2010

Falling smartphone prices = problems for operator control?

Posted on 00:16 by Unknown
Conventional wisdom has it that operators are looking forward to smartphone prices dropping, as it means they will need to spend less on subsidy to acquire and retain customers.

Leaving aside the likelihood of iPhones continuing to stay at the top of the range, it's certainly becoming true that prices at the lower end of the Android device range, as well as for Symbian, Bada and BlackBerry, are falling precipitously, already in some cases to below $150-200.

Now while there will always be the tech enthusiasts wanting the ultimate 2GHz, super high-res screen, OS version 9 smartphones, it's possible that Joe or Joanne Average will be happy with something with a decent browser, fairly responsive screen, OK camera, good Facebook client and a screen big enough to watch kittens doing backflips on YouTube. Some will want a QWERTY keyboard.

That price point is likely to reach $100 in the not-too-distant future. At that point, the world changes a bit. It's a discretionary purchase, like a new shirt or a pair of mid-range trainers, or a nice meal. It becomes subsidisable down to zero by a whole host of companies, not just mobile operators who want to pad out their ARPU with repayments from a thinly-disguised consumer credit loan.

And this will bring some interesting issues - especially whether  people continue to opt to buy the "operator-ised" version of phones, or end up getting them "vanilla" via Amazon or (almost) free with their breakfast cereal. You'll certainly get smartphones given away free with other purchases - maybe "Buy a new Samsung TV and get a Samsung Android Remote Control Smartphone free!!!".

At the moment, about 50% of the world's phones go through mobile operator-controlled channels. But a higher % of smartphones take that path, because they tend to be predominantly bought by post-paid customers, often bundled with a subsidy or a fixed "plan".

As smartphones become more accessible to prepaid users or basic SIM-only contract customers (who usually buy phones for "full retail" price, or re-use an old handset), there becomes less incentive to get them through the operator channels.

This then means less incentive to get the operator software-load (own-brand apps, appstores, connection manager / offload client etc). Coupled with Apple's reluctance to let operators customise iPhones, this is likely to mean that operator "control points" will, in a growing % of smartphones, be relegated to the SIM card or the network, unless the telco can persuade users to download their branded clients and install/configure them.

As I'm writing this, I just received a press release from ST-Ericsson about hardware platforms for the growing demand for dual-SIM phones. They will *obviously* be sold as vanilla devices as well.

At the moment, most of the world's unlocked phones are low-end devices for voice/SMS customers.

What's unclear is what happens when there's a large enough population of unlocked smartphones as well, with 3G and WiFi and lots of memory. What applications and user experiences go viral? Will we see auto-tethering and connection-sharing emerge? Wide use of a cool new VoIP service? Content-sharing? Applications which work out how many minutes / GB you've got left at the end of the month, so they can resell them or share them with your friends?

The bottom line is that lower phone prices might lead to a reduction in subsidy expenditure by operators. But ultimately this may just further erode control over the user as well. If a phone only costs $100, why would you buy it from someone who takes a strict line in telling you how you may use it?

Friday, 26 November 2010

Will advertisers be made to pay for mobile data use?

Posted on 04:02 by Unknown
Here's a fun question for all the DPI and mobile optimisation vendors to try and answer....

What percentage of mobile data traffic is made up of adverts? And is it rising or falling?  And why isn't the implicit cost of transporting adverts being paid by their originators or agencies?

In the fixed world, the incremental amount of advertising traffic is probably relatively small, even including embedded Flash adverts in web pages, a few seconds of pre-roll video on some clips, banners, emails and so on. I'd be surprised if ADSL users get more than 1GB of ads per month, perhaps even just a small fraction of that. Set against a typical capped home broadband service with 20, 50, 100 or 250GB that's not really moving the needle.

But if you're using broadband on a USB modem with a 1GB or 3GB cap, I suspect that advert traffic starts to build up quite heavily as a component. And on a smartphone with 500MB a month, it's quite possible that advertising (across all inventory types) is 20%+ of total traffic, especially if you include email spam and browse "real Internet" web pages.
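The back-of-envelope arithmetic behind that 20%+ figure is easy to sketch. Every input below is my own guess rather than a measurement, but it shows how quickly advert traffic eats into a small cap:

```python
# Back-of-envelope advert-traffic estimate, with invented but plausible inputs.
monthly_cap_mb = 500         # typical smartphone data cap
pages_per_day = 60           # assumed browsing habit
ad_kb_per_page = 50          # banners, beacons, embedded Flash - assumed
spam_mb_per_month = 20       # email spam, assumed

ad_mb = pages_per_day * 30 * ad_kb_per_page / 1024 + spam_mb_per_month
print(f"{ad_mb:.0f} MB of ads, roughly {100 * ad_mb / monthly_cap_mb:.0f}% "
      f"of a {monthly_cap_mb} MB cap")
```

On these (guessed) numbers the user burns over a fifth of their allowance carrying content they never asked for, which is the nub of the argument below.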

Some of the discussion last week at the Broadband Traffic Management event concerned whether (and when) it was appropriate or legal to optimise content, or even block applications or services. Would the user (or regulator) be irked if you compressed video streams, or changed the codec or frame rate? Highly contentious and a discussion for another post.

But it surprises me that the compression & optimisation specialists (Bytemobile, Openwave, Vantrix, Acision et al) don't target adverts and spam as "low hanging fruit". Ad-blocking as a service would reduce users' bills, speed up their browsing and could be configurable / opt-in like the PC browser add-ons.

Users would likely complain or churn if operators blocked or degraded YouTube or Facebook. But there's unlikely to be much of an outcry if intrusive adverts are binned by the network, as long as there was the equivalent of a "spam folder" to check there were no incidents of miscategorisation.
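The "spam folder" safeguard could work something like this sketch: a pass-through filter plus a reviewable quarantine log. The hostnames and figures are placeholders I've invented, and a real deployment would obviously sit in a network proxy rather than a script.

```python
# Sketch of opt-in network-side ad filtering with a reviewable quarantine,
# mirroring the "spam folder" safeguard. Hostnames are illustrative only.
AD_HOSTS = {"ads.example.net", "tracker.example.org"}

quarantine = []   # record of what was blocked, so users can spot false positives

def filter_request(host: str, size_mb: float) -> bool:
    """Return True if the request should be passed through to the user."""
    if host in AD_HOSTS:
        quarantine.append((host, size_mb))
        return False
    return True

traffic = [("facebook.com", 3.0), ("ads.example.net", 0.8),
           ("youtube.com", 12.0), ("tracker.example.org", 0.2)]
saved = sum(mb for h, mb in traffic if not filter_request(h, mb))
print(f"Saved {saved} MB; {len(quarantine)} items quarantined for review")
```

The quarantine list is the key design choice: it turns a contentious blocking decision into an auditable, reversible one, which is what keeps this on the right side of the Net Neutrality debate.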

People might even pay for the service, if it improves overall QoE.

I bet that in some cases, the capex / opex savings on the network from lower traffic levels would outweigh the telco's revenues generated from their own mobile advertising business. The poacher should turn into a gamekeeper. If the telcos don't do it, then third-party proxy services should offer it as a service, especially when the user is roaming. (Companies like Vircado already offer roaming optimisation).

I often get involved in research & consulting about the possibilities of telcos charging "upstream" content or application providers for access or QoS, notably through my work with Telco 2.0. It's often difficult to imagine Skype or YouTube or NetFlix paying "cold hard cash" - I generally think that business-focused cloud computing, or perhaps cloud gaming are the most likely to be prepared to pay. But actually, advertising agencies and ad-serving companies should probably go to the top of the list, as soft targets.

Neither the telco NOR the user wants (much of) this traffic clogging the network. It's also an area in which public attitudes to Net Neutrality could be swayed to see the positive aspects. There is also the chance that Google ends up footing much of the bill. Where it will get much more difficult to enforce is around in-app advertising, though.

Ultimately, my belief is that the advertising industry has had a free rein for far too long. It's fat, lazy and complacent. Why should advertisers assume they have the rights to "use my retina for free"? Sure, there might be some brands or campaigns I'd want to opt-in to, but in an ideal world, we'd have digital contact lenses and advertisers would have to pay us cash to occupy parts of our field of view. And I'd gladly do a rev-share with a telco or other service provider who managed the billing & collection of fees on my behalf.

Sunday, 21 November 2010

The SIM card. The single point of failure for the mobile industry as we know it.

Posted on 11:14 by Unknown
After writing my post about Apple, SIMs and NFC at the weekend, I've been having a look around at some of the other material that's been published recently about all this. Rudolf van der Berg, whom I met at the Telco 2.0 event the other week, has published a great article over at GigaOm (his normal blog is here, where he has written a lot about the need for a wholesale market in mobile numbering for M2M).

I've talked before about what I term the "Tyranny of the SIM card". While SIMs are absolutely invaluable for many mobile use cases, they are downright awful for others. The problem is shown up by the acronym itself - Subscriber Identity Module. There are plenty of situations in mobile communications for which subscriptions are utterly inappropriate as a means of customer engagement.

This is something I've talked about before - the telecom industry's myopic belief that customers can have "service in any colour they like, as long as it's a subscription". Sure, there are many circumstances for which a monthly payment makes a lot of sense - and plenty more where it doesn't, especially as we move away from the phone as the device of choice.

For a telephone, fixed or mobile, a subscription is a great model, as long as licensed service providers hold a monopoly on issuing numbers. Users want to keep the same number so that they can receive incoming calls and texts, so it makes absolute sense to have an ongoing relationship with the company that controls it.

But for other devices, there is much less requirement for an ongoing contractual relationship. The hypothetical M2M-connected toaster won't be receiving inbound calls: it's likely to be an uplink-only device (eg for maintenance or whatever) or at least one which will always be initiating any connection. Same deal for a PC - which is why you don't need a single permanent identity to use WiFi.

The problem is that the SIM enables operators to count - watch the headline 5 billion number carry on to 6 billion, despite the fact that many people now have 2, 3 or 8 SIM cards. Selling SIMs is easy to do, and easy to award sales targets and bonuses for. But how do you count temporary or transactional users? Or embedded devices which send a 100-byte message once a year? Much trickier, with fewer PR-worthy numbers. How many WiFi users are there on the planet? Nobody is really sure, and in any case the figure isn't especially useful. The mobile industry doesn't really want to shift to counting sessions, or users, or something similarly intangible - even if it makes more sense.

But the real problem is around lock-in, and the fact that the operator retains ownership of the SIM, even when inserted into a device that users own outright. From a telco viewpoint, instantiating its control in such a tangible form is hugely powerful. But from a user or third-party vendor point of view (ie Apple) that control steps across a dividing line, when used to extend the operator's domain into new areas such as data or payments. As always, where the device is subsidised, the operator has more moral authority to tell the customer how it may be used. But if an individual (say) buys a smartphone for full retail cost, then tying certain features to the SIM [eg NFC payments] could be seen as unacceptable.

Even for voice, the writing on the wall has been visible for some time, especially for roaming. Various niche operators have been playing with multi-IMSI and downloadable-IMSI SIM cards for a while, notably Truphone's Local Anywhere service for voice anti-roaming. [Disclosure: they're a client]

Then, we've had the dual-SIM, even triple-SIM phones that are especially popular in Asia. Apple has been signalling its displeasure since 2007 with the hard-to-remove card in the iPhone needing a pin inserted to extract it. The utter failure of SIM-embedded laptops to gain meaningful market traction has been unsurprising - customers have preferred cheaper & more flexible USB dongles and MiFis.

In my view, international data roaming is the straw that has broken the SIM camel's back. We have had an array of mobile operators and industry associations attempting to defend the indefensible. When you are travelling, there is absolutely no value or justification for routing all traffic back to the home network, and paying 10x or 100x the local price of Internet connectivity. I can concede an argument for a small premium, like using your ATM card in a foreign bank's machine. But a multiple, rather than a percentage, is totally egregious.
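The difference between a percentage premium and a multiple is worth making concrete. A quick back-of-envelope calculation (all prices invented purely for illustration):

```python
# Compare an ATM-style percentage surcharge with the 10x/100x multiples
# typical of roaming. The 1p/MB local price is an assumed figure.
local_price_per_mb = 0.01                        # assumed local price (GBP/MB)
atm_style_premium = local_price_per_mb * 1.03    # a 3% surcharge
roaming_10x = local_price_per_mb * 10
roaming_100x = local_price_per_mb * 100

gb = 1024  # MB per GB
print(f"1GB locally:       £{local_price_per_mb * gb:.2f}")   # £10.24
print(f"1GB at 3% premium: £{atm_style_premium * gb:.2f}")    # £10.55
print(f"1GB at 10x:        £{roaming_10x * gb:.2f}")          # £102.40
print(f"1GB at 100x:       £{roaming_100x * gb:.2f}")         # £1024.00
```

A percentage adds pennies per gigabyte; a multiple adds an order of magnitude or two, which is the whole point of the complaint.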

We all know it. They all know it. The European Commission knows it. Operators have been making whining excuses about prices falling (from astronomical to merely stratospheric).

Yet to mix the metaphor, it is nevertheless a golden goose, laying charge-by-the-MB eggs. Nobody has been prepared to kill it. Instead, we've had half-hearted attempts to cap wholesale prices and instigate €50 thresholds, we've seen a few isolated examples of common sense such as the proposed Spain/Portugal free-roaming idea. And we've had operators and roaming hubs pretending to do their customers a favour with special bundles, which might only be 5x ripoffs rather than 50x.

I'll remind you of my own debacle with Vodafone's completely duplicitous approach to roaming pricing.

Ultimately, for all the convenience that SIM cards bring in basic phone services, they utterly relinquish that benefit, when it is set against the iron control (almost cartel-like extortion) they allow operators to exert over data.

Unsurprisingly, some of the technology industry's finest minds have been working flat-out to find ways to subvert this control. The much-rumoured iPhone embedded SIM is an example - but there have also been signs from Google as well: remember that patent from 2 years ago, for a system in which different operators would bid to carry each call? There was some speculation about a Google SIM for the Nexus One as well.

The risk now - and I think the operators and the GSMA are also waking up to this - is that the sheer blatant effrontery around roaming (and M2M network switching) is about to result in a shift in power. Apple and others are going to find a workaround - with billions of dollars to throw at solutions if necessary.

And if solving the data and roaming problems for SIMs also impacts the basic voice & text business models, well..... that's just collateral damage. It's not as if this has come out of the blue.

The question is whether the industry's attempts to retain centralised control with the SIM have inadvertently resulted in the creation of a single point of failure for the mobile operator ecosystem. In trying to win the battle to protect roaming and data revenues, the operators may well have Pyrrhically lost the war.

And this is all happening at a critical point:
  • Android & Nokia & BlackBerry smartphones are almost cheap enough to buy without subsidy
  • Apple can do its own financing for people who want iPhones
  • Most people have alternative IDs that are better than phone numbers (Skype, Google, Facebook)
  • HSPA networks are often good enough for VoIP
  • Forthcoming LTE networks will rely on VoIP-over-data for voice
  • Voice usage is stagnating anyway, in favour of various forms of messaging
  • WiFi is everywhere and in all high-end devices
LTE is the killer here, to be honest. Stupidly, there is no SIM-free option for LTE data, something I've been advocating for a long time.

This means that if the operator stranglehold on SIMs for data is broken by Apple or others, then they will probably lose voice on LTE as well. And you can certainly bet that any future Apple or Android voice service won't be based on IMS or the I-SIM application on the UICC.

Now none of this affects the billions of ordinary mobile users, at least just yet. It will initially be confined to non-phone devices and a few top-end smartphones I expect. But I have a feeling that we're about to see the end of SIM Tyranny in one form or another.

There is a wider lesson here for the mobile industry - large pockets of profit that are based on control, rather than genuine innovation, are extremely tempting to outsiders. Almost without exception, they will be disrupted. If telcos want to retain and grow profitability, they need to invent new and valuable services and propositions continually - simply hiding ostrich-like in the sand, hoping that nobody notices areas of unrealistically high margins, is not a recipe for sustained success.

Saturday, 20 November 2010

Apple, embedded SIMs, NFC and mobile payments - some speculation

Posted on 01:48 by Unknown
I wonder if I've just managed to join up the dots on something rather important:

- Recent reports suggest that Apple is intending to use NFC chips in iPhones
- Other recent reports suggest Apple wants to use an embedded SIM in iPhones

The NFC rumour tends to suggest that Apple is interested in mobile payments. However, that raises the question of whether the mobile operators are involved in the payment value chain, or if it is simply a secure extension of Apple's existing 100m+ credit card iTunes base out to the handset.

Presumably Apple has some pretty good relationships already with Visa, Mastercard, Amex et al, given iTunes' $4bn run-rate. But that's not enough, on its own, to get those companies to push new card-reader equipment out to thousands of retailers if it just supports iPhone payments and nothing else. If they were going to go down the NFC path, they would probably want the EPOS readers to work with a variety of NFC implementations and business models, not just one.

I've had a number of criticisms of NFC over the years - including the cost to the handset manufacturers, the unclear role of the mobile operators, the difficulty of getting merchants to adopt costly new readers, the willingness of consumers to entrust payments to new providers, whether customers actually need a new payment method to replace cash or existing cards, the unsuitability of charge-to-my-phonebill models, failure modes such as theft or handset crashing and so forth.

In short, I've been unconvinced by the "phone is your wallet" argument.

Now, there have been various approaches to NFC which have looked like cutting the operators out of the equation - most notably the NFC stickers that can be attached to the back of the phone. Other ideas have involved extending the role of not-quite-NFC contactless cards used in other applications such as London's Oyster, or Visa's PayWave.

But operators (and bodies such as ETSI and GSMA) have been pushing hard for the version of the NFC architecture which links the NFC chip to the SIM card [technically, it's called the UICC card], where the "secure element" of NFC is stored on the SIM itself, and accessed via the single-wire protocol (SWP).

A good overview of the NFC/SIM/SWP approach is here. There's a diagram on page 10.

But a core problem is the lack of incentive for manufacturers to support the cost of putting the NFC hardware and software in the device, especially if it is based only on SWP use cases. This would essentially mean that the NFC chip would be useless without the SIM, and that therefore the operator could insert itself into all possible applications, not just payment, unless the device vendor put a second non-SIM secure element somewhere else in the phone.

For the manufacturer, this adds to the bill of materials, increases testing complexity, risks support and return costs, may delay time-to-market, yet may not generate extra revenues either from customers directly, or operators via subsidy or some sort of revenue-share. Especially for markets which sell unsubsidised 'vanilla' phones which could end up with non-NFC supporting SIMs, why go to the cost of putting in an NFC chip, rather than say a better camera, or more memory?

Yet at the same time, the non-SIM/SWP implementations of NFC are looking even more tenuous in acceptability. Why bother with a sticker (or trust it) on the back of your phone?

So we have an impasse:

  • Handset vendors don't really want to be forced to support implementations of NFC where all the control and most of the value resides in the (operator-owned and issued) SIM card, even if they can put secondary applications onto it.
  • The non-SIM implementations of NFC will have problems scaling and getting publicity, especially given the operators' indifference to selling handsets supporting this, and the handset vendors' general lack of clout with the credit card companies.
  • End users seem (largely) indifferent to both a new form of payment, or all the other "near field" applications like waving your phone at a billboard. They also tend to push back against perceived operator lock-in.
  • The merchants don't want to buy new terminals, especially if it's unclear what the new payment value chain will look like.
Catch-22.

Or maybe not....

Let's revisit the two phrases "technically it's called a UICC card" and "operator-owned and issued SIM card".

Now let's just be a bit clearer about the terms. From Wikipedia :

A Universal Subscriber Identity Module is an application for UMTS mobile telephony running on a UICC smart card which is inserted in a 3G mobile phone. There is a common misconception to call the UICC itself a USIM, but the USIM is merely a logical entity on the physical card.

And from Zahid Ghadialy's blog

CONFIDENTIAL APPLICATION MANAGEMENT IN UICC FOR THIRD PARTIES
The security model in the UICC has been improved to allow the hosting of confidential (e.g. third party) applications. This enhancement was necessary to support new business models arising in the marketplace, with third party MVNOs, M-Payment and Mobile TV applications. These new features notably enable UICC memory rental, remote secure management of this memory and its content by the third party vendor, and support new business models supported by the Trusted Service Manager concept.


Now... I am not currently an expert on the full inner workings of UICCs and SIM technology. I will read up when I get a chance. But I have a suspicion that this might sum up what's going on:

  • Today, operators issue (but still own) physical UICC cards, which include the SIM functionality for secure authentication to the radio network, and also other applications such as SIM Toolkit and the NFC secure-element functions. They can "rent space" to third parties for other applications, acting as a Trusted Services Manager.
  • Tomorrow, some other third parties may issue physical UICC cards, or embed them into devices rather than distributing them through retail stores. And then those third parties (Apple, for instance) can perhaps "rent space" to operators for applications such as "secure authentication to the radio network".
In other words, perhaps we move to a world in which the operators' SIM connectivity function becomes just software running on someone else's physical card. Whether that (removable or embedded) card is owned by the end-user or by the manufacturer is another question.
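The role reversal in the two bullets above can be made concrete with a toy model (plain Python; the `Uicc` class, `rent_space` method and all names here are invented for illustration - real cards use GlobalPlatform/ETSI interfaces, not Python objects). Whoever issues the card acts as Trusted Services Manager and rents isolated slots to tenants, one of which may happen to be an operator's network-authentication applet:

```python
# Toy model of a UICC as a multi-tenant smart card. Purely illustrative.
class Uicc:
    def __init__(self, issuer):
        self.issuer = issuer    # whoever owns the physical card
        self.applets = {}       # slot name -> (tenant, applet)

    def rent_space(self, tenant, name, applet):
        """Issuer, acting as Trusted Services Manager, installs a
        tenant's applet into an isolated slot on the card."""
        self.applets[name] = (tenant, applet)

    def invoke(self, name, *args):
        tenant, applet = self.applets[name]
        return applet(*args)

# Today: the operator issues the card and rents space to third parties.
card = Uicc(issuer="OperatorX")
card.rent_space("OperatorX", "network-auth", lambda imsi: f"authenticated {imsi}")
card.rent_space("BankY", "nfc-payment", lambda amount: f"paid {amount}")

# Tomorrow (the speculation above): a device vendor issues an embedded card,
# and the operator's SIM function becomes just another tenant.
embedded = Uicc(issuer="DeviceVendor")
embedded.rent_space("OperatorX", "network-auth", lambda imsi: f"authenticated {imsi}")

print(card.invoke("network-auth", "imsi-001"))   # authenticated imsi-001
print(embedded.issuer)                           # DeviceVendor
```

The code is identical in both cases; only the `issuer` changes - which is exactly why the power shift is so clean.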

There are a couple of other angles to this as well, which seem to tie in:

  • The GSMA announced its "Embedded SIM Initiative" the other day. I don't actually think it's specifically Apple-driven, but is more about the general M2M market. There are plenty of new business models which could be enabled by this, as well as plenty of problems to solve. However, it is possible that the Apple discussion has brought the issue to the fore.
  • Various operators are reportedly throwing a strop about Apple's plans to allow users to provision and activate services remotely. They are (quite reasonably in my view) threatening to stop subsidising iPhones if this occurs.
But so what? Really, a subsidy is just a loan by another name, cunningly designed to make it look like the consumer isn't really taking on more debt. It's quite a *safe* loan, to be fair, as the phone is useless if the operator cancels your number and SIM, and so most people won't default on their monthly fees. But we already know that Apple is friendly with the credit card companies. It already offers finance plans for Macs and iPads. So why can't it also sell "free iPhones", backed up by its own subsidy / credit arrangement? Whether this is charged to your iTunes credit card, set up as a separate agreement or whatever is merely detail. Because if Apple also owns the UICC card, it has a built-in anti-default mechanism just as good as the operators'.
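The subsidy-as-loan point is simple arithmetic. A rough sketch with invented numbers (the £400 subsidy and £20/month contract premium are assumptions, not market data):

```python
# Treat a handset subsidy as a loan repaid through the monthly contract.
subsidy = 400.0          # operator's upfront discount on the handset (GBP)
monthly_premium = 20.0   # assumed extra per month vs an equivalent SIM-only tariff
term_months = 24

total_repaid = monthly_premium * term_months                # 480.0
interest_paid = total_repaid - subsidy                      # 80.0
flat_rate = interest_paid / subsidy / (term_months / 12)    # simple annualised rate

print(f"Repaid £{total_repaid:.0f} on a £{subsidy:.0f} 'loan'")
print(f"≈ {flat_rate:.0%} a year in flat-rate interest")    # ≈ 10% a year
```

On these assumed figures the "free phone" is a loan at roughly 10% flat a year - precisely the kind of financing a company with Apple's balance sheet and card relationships could offer itself.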

For years, we have had people advocating the "Soft SIM". Intel worked on a project called the "Identity Capable Platform" back in 2006-7 - I remember seeing presentations about it. The operators (and GSMA) have been fairly vociferous in their condemnation of the concept.

I'm wondering if Apple has done an end-run around this, aiming to own a separate but embedded hardware SIM - and acting as a Trusted Services Manager itself - provisioning the operator's credentials as software on it. Add this to a way to escape from the "subsidy trap" by doing handset financing without scaring investors with risks of default, and we potentially have another disruption from Jobs.

Now.... a disclaimer. I may well be adding 2+2 and getting 7.3 here. But I'd be very surprised if some elements of the recent Apple SIM and NFC stories don't blend together.

Friday, 19 November 2010

Enforcing traffic management transparency

Posted on 03:35 by Unknown
I need to do a full write-up of this week's Traffic Management Conference, but one theme that came out loud and clear (especially from the regulator's panel) was that any non-neutral network policy management will need to be absolutely clear and explicit to the end user.

Although regulators have generally not had many official complaints, there have been many, many suggestions of illicit traffic-shaping or degradation. I had an IM chat last night about a particular operator's suspiciously-poor broadband performance with Skype, and how that might be proven or disproven as deliberate.

One of the slides in my presentation at the conference yesterday which gained the most attention was the one with the Monopoly-board image of "Go to jail". It's quite possible that telecom executives who allow broadband to be mis-sold may be legally liable, if it is found that secret policies are being applied.

I suggested that a certain company with millions of end-points, millions of servers, and proven data analytics capabilities should be able to spot any suspicious anomalies in traffic patterns, latencies and so forth. Any "monkeying-about" should stick out like a sore thumb, similar to a bank's anti-fraud systems.

So it's interesting that the BBC is perhaps the first major content provider to say specifically that it is looking at software to help keep the network honest, and inform users about who is to blame if there are glitches.

Of course, if you've been a regular reader of this blog, and customer of Disruptive Analysis' research and advisory services, none of this will come as much of a surprise to you, as it's been on the cards for more than three years - and indeed the EFF has had a tool available for some time to spot miscreant ISPs.

Bottom line is that telcos in markets with liberal attitudes to neutrality will need to be 100% upfront to their customers about policy and optimisation techniques, or else they will get "outed" mercilessly - and perhaps prosecuted as well.

Wednesday, 17 November 2010

Wallets don't crash

Posted on 23:41 by Unknown
Yesterday evening, I exchanged a couple of debating points with some others in the industry, after one of them reported hearing at a conference that Nokia would include NFC chips, allegedly in all its smartphones, next year.

(I doubt that's accurate - if you're racing to the bottom against Android on smartphone pricing in India, for example, you don't put a few $ of useless bill-of-materials in all your products, especially those sold in markets with no readers.)

We've also heard a lot of hype in the past couple of weeks about NFC chips supposedly going to be in the next iPhone and also the next Gingerbread-powered iterations of Android. It's also worth noting that the "official" NFC may sometimes get confused with other short-range RFID solutions.

Anyway, all that is outside the point of this post, except as context. And irony, given what happened next:

In short, my phone spontaneously turned into a brick. One minute I was taking a photo at an interesting event last night, then I switched it off. And it stayed off. Completely black - nothing happened when I held the power button or the home button, nor when I tried the usual trick of physically hitting it against the table. [It was on about 85% battery]

This was at about 8pm, just before the start of the dinner & event I was attending.

When I eventually got home, I attached it to my PC, looked up help & support on the (PC!) web, and reincarnated it by holding all the buttons down together for about 20 seconds.

I was very, very glad that it hadn't contained my wallet, my house-keys, or my Oyster London travelcard. Maybe an NFC chip might have worked with the phone dead. Maybe not. But would I have felt like taking a chance, and staying out until 11.30pm & having a really enjoyable evening, knowing I might need to call a locksmith when I got home? Or whether I'd even get home, if the tube ticket barriers rejected my defunct pseudo-Oyster?

Instead, I was just mildly grumpy that I'd have to reorder another phone, and that I'd lost a few weeks of photos, phone numbers and other stuff since I'd last backed up.

[Sidenote: would I pay for a network backup service even after this experience? No, probably not. But I am glad I've got the phone from an operator, on a subsidy, with a warranty, who I could have harassed for a replacement. And I will be syncing it with my PC more often]

Yes, I know that NFC is supposed to work when the battery is dead. But in this case the kicker was that it wasn't dead.... there was *something* going on in the phone, as when I breathed life back into it, it had dropped to 32% battery, and still felt slightly warm 5 minutes after I'd taken it out of my pocket. Will NFC work when the OS is stuck in a loop or some other software / firmware Hades?

Will any phone company want to take the risk that crashed phones render m-wallets and m-keys useless? What's the support cost of that? Could I have charged the locksmith to my telco, if I'd bought its phone-lock service? Or will they try and bill you extra for insurance?

The bottom line: I'm very glad that my phone isn't a "single point of failure for my life". Ironic that I had a wake-up call just after a discussion about NFC.

Tuesday, 16 November 2010

Three new white papers on Mobile Broadband Traffic Management

Posted on 01:01 by Unknown
A very quick heads-up, as I'm at the Informa Broadband Traffic Management conference today & for Weds and Thurs. Very busy here - about 250 people or so.

I've just published a series of three white papers on next-gen "holistic" traffic management, along with an introductory paper. These look beyond the simplistic early approaches to DPI, policy, optimisation and offload, many of which have been arbitrary and often user-unfriendly.

The papers have been kindly sponsored by the folks at Continuous Computing, who have given me a completely free rein to write about topics that are interesting, and which hopefully push forward the industry thinking about how better to control & monetise mobile data.

[EDIT - for downloads, please see the links embedded in this page]

In a nutshell, my belief is that any future implementations of mobile broadband traffic management will need to be:

  • Device-aware: not just what brand and model, but over time much more granular detail about OS version, firmware, connection managers, security, power management and the ability to communicate about network status and policy with the user. Increasingly, network vendors and operators will need to link network infrastructure boxes to on-device clients. This also ties in with application awareness - particularly around dealing with mashups, VPNs and so forth.
  • Bearer-aware: the policy infrastructure will need to be much more informed about the status of the radio connection(s) - what technology, femto vs. macro cells, whether WiFi is available and suitable, what is happening with signalling load, whether congestion is actually occurring at a given time/cell and so on.
  • Offload-aware: whether data is being (or should be) routed via WiFi, femtocells, RAN offload and so on - and whether this should be managed or unmanaged. There are many variables here, and many use cases, such as the ability to use multiple networks simultaneously, "selective offload" (SIPTO / LIPA) and so on.
Many regulators seem to be moving towards policies on traffic management & Net Neutrality along the lines of "minimum necessary" or "reasonable" control of traffic by operators. This means that any policy enforcement will need to be proportionate and context-specific. Arguably, there is no justification for unnecessary throttling or compression at quiet times / cells, unless you live on a Pacific island and IP transit is expensive. There is certainly not likely to be justification for much arbitrary discrimination between websites or applications, especially if this is not done with full transparency.
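As a caricature of how those three kinds of awareness, plus a regulator's "minimum necessary" test, might combine in a single policy decision (every field and function name here is invented for illustration; real PCRF/policy logic is vastly more involved):

```python
# Hypothetical 'holistic' policy check: only throttle when the full
# context actually justifies it. All names are invented.
def should_throttle(ctx):
    if not ctx["cell_congested"]:
        return False    # "minimum necessary": no congestion, no intervention
    if ctx["wifi_available"] or ctx["femto_available"]:
        return False    # offload-aware: steer traffic elsewhere instead
    if ctx["device_supports_notification"]:
        ctx["notify_user"] = True   # device-aware: tell the user what's happening
    return True         # bearer-aware: congestion is real and there's no alternative

quiet_night = {"cell_congested": False, "wifi_available": False,
               "femto_available": False, "device_supports_notification": True}
busy_cell = {"cell_congested": True, "wifi_available": False,
             "femto_available": False, "device_supports_notification": True}

print(should_throttle(quiet_night))  # False - throttling here would be hard to justify
print(should_throttle(busy_cell))    # True - and the user gets told why
```

The point of the sketch is the ordering: a blanket, context-free throttle fails the "reasonable and proportionate" test before it even starts.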

Each of these issues is covered in a separate white paper, plus there is an overview introduction.

[For downloads, please see the links embedded in this page]

This is an area I cover in a lot of depth. If you are interested in an internal workshop, advisory consulting project, or need an external speaker for an event, please get in touch at information AT disruptive-analysis DOT com

Saturday, 13 November 2010

Part of Nokia's problem - making Ovi compelling

Posted on 02:08 by Unknown
I've got a Nokia Ovi account somewhere. I signed up for it ages ago, to try out the Ovi Store when I had an N97 to play with, back when the store was a real exercise in frustration. Since then, my only interaction with it has been spam SMS (with no STOP opt-out) exhorting me to try other features, when I've put that SIM card into other phones.

In other words, I've not exactly had a compelling experience, and the SMS spam is a complete and utter red flag for any business. (I've stopped using Virgin Airlines whenever possible because of text spam - losing them maybe £50-100k lifetime value)

I don't know anyone who uses Ovi either, among friends or family. I never see @ovi.com email addresses, and none of my acquaintances has ever mentioned it, linked to its site from Facebook, or otherwise brought it to my attention.

Compare that to the number of times I have heard the words "Gmail", "iTunes" or "BBM".

Now, I certainly agree that a lot of people buy smartphones for their standalone capabilities (eg a good voice phone, great camera, browser and so on). Some people buy phones for apps - although I'm unconvinced that's as important as many seem to think.

But a lot of people get swayed in their decisions because of something server-side or cloud-related. Historically, BlackBerry grew because it was the best way to hook into Microsoft Exchange for businesses.

Now, we see other hooks:

  • If you've got a lot of music on iTunes, you'll want an iPhone
  • If you've got a lot of friends on BBM, you'll want a BlackBerry
  • If you're a heavy user of Gmail & other properties, you'll increasingly want an Android, although it's not quite there yet, as obviously you can get G-services on other phones too.
Nokia doesn't have a story here, certainly in the developed world, and I have seen little evidence of imminent viral explosion. Palm never had a server-side lock-in either.

In fact, it wouldn't surprise me if Microsoft was the next one to make the connection work, perhaps to Xbox or Kinect, as well as its corporate services and Azure cloud.

Maybe Nokia should swallow a bitter pill, and for developed markets drop Ovi services entirely and act as an ODM partner (or dual-brand supplier) to Facebook?

Thursday, 11 November 2010

Will there be legal pitfalls of policy management?

Posted on 06:25 by Unknown
It's a common enough theme that you should never post anything on a social network or web forum that you couldn't cope with becoming openly available. We all know that security breaks down, APIs are opened up, privacy rules change.

But do people take that seriously enough in the offline world? Increasingly, secrets and dubious behaviour get revealed. The UK government suffered a huge scandal over the leaking of questionable MPs' expenses claims last year. It resulted in resignations, arrests and helped to put the last administration out of power. A number of parliamentarians are now facing criminal charges.

Various other examples abound of businesses wilfully hiding the facts behind their actions, mis-selling products or actually committing fraud. The truth might only come out years later, but authorities are often prepared to pursue the executives responsible. US companies' chiefs are bound by Sarbanes-Oxley rules as well.

So the question I have is whether all those tasked with implementing network policies really think through the ramifications of their actions? Are all decisions cross-checked with what has actually been sold to customers, or how it was marketed? Yes, there are often woolly clauses in contracts about operators being able to do necessary management... but would these stand up in court, if some actions appear to go beyond what is strictly "necessary"?

And at what point do any "secret" policies (eg degrading a competitor's services or applications) step over the line to being anti-competitive or fraudulent? Forget about simple abuse of Net Neutrality laws, which can obviously be debated & appealed until we're blue in the face. This is about actually lying to customers: hard-and-fast concerns in terms of consumer protection, for which the law tends to have big & pointy teeth.

I'm not a lawyer, so I don't really have a clear view. But then neither are many of the people actually *implementing* the business rules and policies at a network level.

I've never met anyone with a business card title of "Network policy manager", who understands everything from the operations of the network, to the customer's viewpoint, to the nitty-gritty of sales and marketing, to various angles of regulation, to competition and contract law.

If telcos or their vendors think they can "get away with" dubious policies that are not made transparent, they may get a nasty surprise some time in the future. Sooner or later policies will get leaked, or reverse engineered. Normal ups & downs of network performance will look like "white noise". Any unnatural patterns (by user, by app, by location, by time, by device, by OS etc) will stand out a mile, correlated with the right software and enough processing clout. Then someone will do a compare & contrast with the details of what they've been sold - and if there are material differences, trouble is likely.
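To illustrate how easily such patterns stand out once correlated, here is a minimal anomaly check on synthetic throughput data (a toy z-score test with invented numbers; a real honesty tool would need far more careful baselining across time, location and device):

```python
import statistics

# Measured throughput (Mbps) per application on the same network at the
# same time of day. Synthetic data: one app is being quietly throttled.
samples = {
    "web":     [4.1, 3.9, 4.2, 4.0, 3.8],
    "youtube": [4.0, 4.3, 3.9, 4.1, 4.2],
    "skype":   [0.6, 0.5, 0.7, 0.6, 0.5],   # suspiciously slow
}

means = {app: statistics.mean(v) for app, v in samples.items()}
overall = statistics.mean(means.values())
spread = statistics.pstdev(means.values())

# Flag any app whose mean throughput sits far below the cross-app average.
suspects = [app for app, m in means.items()
            if (m - overall) / spread < -1.0]
print(suspects)   # ['skype']
```

With enough end-points feeding data like this, a per-app, per-operator deviation is exactly the "sore thumb" described above.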

Bottom line: don't enforce any network policies you wouldn't like to see published on the web tomorrow.

Tuesday, 9 November 2010

Mobile operators have lost their chance at owning the social graph

Posted on 03:38 by Unknown
I'm at the Telco 2.0 event in London today. We just saw the results of a live survey of the delegates on a range of questions, asking whether telcos were "in control" of particular areas of the industry - and if so, whether that control was a solid grip, or ephemeral and likely to be weakened over the next few years.

One question stuck out to me - whether telcos still "own" the user's addressbook and process of initiation of personal communications. The consensus in the room was that operators still have a couple of years' window, before they risk losing control of the much-discussed "social graph". (I actually hate the term, but agree with the general idea that it's valuable to understand an individual's affiliations and personal universe).

By coincidence, I happened to read this article about youths' behaviours on Facebook this morning. This paragraph leapt out to me:

I asked Shamika why she bothered with Facebook in the first place, given that she sent over 1200 text messages a day. Once again, she looked at me incredulously, pointing out that there’s no way that she’d give just anyone her cell phone number. Texting was for close friends that respected her while Facebook was necessary to be a part of her school social life. And besides, she liked being able to touch base with people from her former schools or reach out to someone from school that she didn’t know well.

This actually tallies with my own use of social networks - I've only got a small fraction of my Facebook and LinkedIn affiliations in my mobile phone's addressbook. And I wouldn't want the others included - especially those affiliations which are not people (events, groups, fan pages, things I "like" and so on).

Even if I had some telco-based cloud addressbook, it would only reach a fraction of my personal or business universe. An increasing percentage of my communications are conducted "off-phonebook", especially on Facebook or Skype, but also via Twitter and various email accounts. As I've written before, I have no need for some sort of converged addressbook, especially one controlled by a gatekeeper who wants to charge me for the privilege, and use its bottleneck position to stop me churning when I want in future. The notion that this is somehow going to be solved by clunky centralised solutions like IMS and RCS is an exercise somewhere between self-delusion and wishful thinking.

I notice that Telco 2.0 has also published this post on the role of operators in understanding and monetising personal information. My view is that they have three uphill challenges:
  • The growing fraction of personal communications & data that is invisible to the operators
  • The general happiness of end-users with fragmentation of their contacts / affiliations and communication channels. The "convergence layer" is in the brain, not the phone or the network. Users are increasingly capable multi-taskers, so this is not a problem for many of us.
  • The poor structuring and accessibility of the data that they *do* possess, spread across multiple databases and repositories.
Some of the "old school mobile" pundits are still clinging to the idea that mobile operators "know everything" about you. Nothing could be further from the truth, especially as growing numbers of people have multiple (unlinked) accounts, devices and SIMs from a range of operators.

If the semi-mythical "social graph" does turn out to exist, it's more likely to be Facebook, Google or Apple that owns it, at least in the developed and Internet/smartphone-centric world.

Saturday, 6 November 2010

What impact will security worries have on WiFi offload?

Posted on 06:33 by Unknown
I'm not normally too paranoid about WiFi security - although to be honest, I probably should be, given the amount of time I spend in weird countries using public hotspots, as well as an hour or two a day working in cafes in London. I take what I feel are sensible precautions, but I'm still aware that I could probably be more careful.

But what has scared me recently has been the fuss around FireSheep. To the uninitiated, I suggest a quick read-up on it. Basically it allows the easy hacking of someone's web access, especially to popular websites like Facebook, when using ordinary HTTP rather than the encrypted HTTPS option. Specifically, FireSheep enables people to snoop on their neighbours' access to various web services when using shared, open WiFi networks.
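For the technically curious, the weakness FireSheep exploits is simple to illustrate: a session cookie issued without the "Secure" attribute will be sent over plain HTTP as well as HTTPS, where anyone sharing an open WiFi network can capture and replay it. A hypothetical sketch (header values invented):

```python
# A session cookie set without the "Secure" attribute is transmitted over
# plain HTTP as well as HTTPS - exactly what FireSheep-style sniffing on
# an open WiFi network exploits.

def cookie_is_sniffable(set_cookie_header: str) -> bool:
    """True if the cookie may be sent over unencrypted HTTP."""
    attributes = [a.strip().lower() for a in set_cookie_header.split(";")]
    return "secure" not in attributes

# Hypothetical headers from two sites:
vulnerable = "session_id=abc123; Path=/; HttpOnly"
protected  = "session_id=abc123; Path=/; Secure; HttpOnly"

print(cookie_is_sniffable(vulnerable))   # True - capturable on open WiFi
print(cookie_is_sniffable(protected))    # False - HTTPS-only
```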

This post is not about the controversy, or the various software countermeasures to force more traffic to secure access paths, or squash the capability of the hacking tool to operate effectively.

I'm more thinking about what this does to mobile operators' 3G data offload strategies - specifically using public WiFi hotspots. There are various implications:
  • Legal folks at telcos probably want to have a good think about liability issues if their software forces (or automates) WiFi access, without at least warning users about the risks.
  • There is an opportunity for operators to differentiate and add value by putting VPN or other capabilities in their connection manager clients, or custom browser variants.
  • Some end-users are going to switch off WiFi or be hesitant about using it, and just stay on 3G
  • Public / hotspot femtos are going to start looking more attractive
  • UMA-style WiFi, or I-WLAN, which hooks back to the operator's core network via an IPsec tunnel, is going to look more attractive again
  • More WiFi APs in public hotspots will probably shift to WEP/WPA encryption, making logon and authentication more of a pain (expect more support calls from confused customers)
Overall, I think these issues haven't filtered through to the telecoms community as quickly as might be expected. A quick Google search doesn't show much for firesheep+offload.

This is too important to overlook, I think.

Friday, 29 October 2010

Apple - incremental products, incremental profit

Posted on 01:47 by Unknown
A very quick post...

Apple is now in an extremely happy and unusual place. It has worked out a magic formula (big fan base, good products, expectations of pricing), where it can pretty much guarantee that it can earn at least $200 of gross margin on any mid-to-high end product it sells - iPhones, Macs, iPads and so on.

So as long as there is a decent-sized market, at low enough risk, it can afford to treat certain new things as "projects" even though they may not be game-changing. If they do change the game, even better.

For example - the iPad. Even if it only sold 10 million, Apple would be up perhaps $2bn in gross profit. Even if the R&D upfront was $500m, that's still a pretty decent return.
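The back-of-envelope maths is trivial to check, using the purely illustrative figures above:

```python
units = 10_000_000            # conservative iPad unit sales (illustrative)
gross_margin_per_unit = 200   # assumed $ gross margin per device
upfront_rnd = 500_000_000     # assumed upfront R&D spend

gross_profit = units * gross_margin_per_unit   # $2.0bn gross
net_of_rnd = gross_profit - upfront_rnd        # $1.5bn after R&D
return_multiple = gross_profit / upfront_rnd   # 4x the R&D outlay

print(f"${gross_profit / 1e9:.1f}bn gross, {return_multiple:.0f}x the R&D spend")
```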

And so now, everyone is talking about a Verizon iPhone. A year or so ago, I would have been skeptical - a CDMA iPhone would have been a risky distraction. Now... with an extra year's traction, it seems like 10m units is pretty much a baseline. And I'll assume the cost/profit structure will look pretty similar to the HSPA ones.

In other words, it's money in the bank, assuming that nothing goes horribly wrong. The only argument against it might be the opportunity cost - could those engineers be doing something *even more profitable*? But I'd imagine the company has had the time & resources to get its hiring aligned with its business opportunities.

By the same token.... could Apple make & sell 10m LTE iPhones, at $200+ gross margin, at equivalent low risk next year? No. And probably not in 2012 either. There's not an installed base of existing customers to sell to, the technology isn't mature, the chipsets are expensive, and the user experience would likely have issues - so the probability of something "going horribly wrong" would be much higher.

If it can stop its margins creeping down, I'm sure there are plenty more alternative 10-30m unit segments that Apple can target for its next few billion, while it's waiting for LTE to make the cut.

Thursday, 28 October 2010

Sprint's mobile wallet sounds sensible

Posted on 00:40 by Unknown
I'm not generally a big believer in mobile payment solutions for developed-world countries.

Cash, cards and online payments already work perfectly well for me and other people, thanks. I don't need to "store value" in my phone, I don't want to scan it across an NFC reader, and I certainly don't want my operator bumping up inflation by taking a slice of everything I spend.

I especially don't buy the "bill it to my phone bill" concept of mobile payments - like many people, I have a very different relationship with my bank & my telco(s) and I'm quite happy to keep it that way. You'd have to be crazy to have a financial-services arrangement with an operator which tied it to an access account provision, although if it was access-independent it might make a more sensible proposition. There's also no way I'd exclusively use my phone for payment when travelling, unless all the transaction data was very explicitly zero-rated for roaming, and I had a guarantee of 100% coverage.

I'm also very happy with my existing payment mechanisms - Visa, Paypal, Mastercard, Amex and so forth. I might set up another, but I'd need a lot of persuasion.

But, Sprint's announcement of its mobile wallet solution is much more appealing - you get to keep all your existing accounts, but get access to them through your phone. Makes sense. Adds to what people already have, doesn't try to substitute it. Doesn't stop you carrying a physical wallet around as well as a virtual one if you choose. Doesn't try to bill things to your phone account. Maybe over time, if it's got a good UI and proves itself, you might change your approach to physical payments, some or all of the time. That's fair enough.

In other words, it doesn't force a behavioural change, but works with what people are already happy with. Which is good.

Now it's not 100% clear to me what the business model is, but in terms of "will this fly", my gut feel is that it has 100x the chances of all the various NFC and other mobile payments nonsense that's been trotted out in recent years.

Bottom line is that unlike most people in the industry, Sprint has actually bothered to look up the dictionary definition of a wallet: something that contains various different payment mechanisms from third parties.

Edit - looks like AT&T is also entering the fray. But they are going to go for the bill-to-the-phone approach. Let's see if people are actually prepared to wear that. My money's on "no" - although I'd rather not use an MNO-powered betting application & account....

Wednesday, 27 October 2010

Ray Ozzie, the so-called "post PC" era, and the naivete of the software industry

Posted on 00:33 by Unknown
A post on ForumOxford pointed me towards Ray Ozzie's monologue about Microsoft and the future direction of the IT industry, "Dawn of a New Day".

Beautifully-written, yes. And containing much wisdom and knowledge.

But displaying, once again, the arrogance of the software mindset which believes it has conquered physics.

Contrast this with Jobs' comment last week:
"We create our own A4 chip, our own software, our own battery chemistry, our own enclosure, our own everything"

The idea that software engineering has the capability of beating hard limits and challenges in RF, power management, silicon and so on is myopic. Apple understands this and works to juggle all of them appropriately. (As does RIM, incidentally). But then Apple values hardware at least as much as software, from milled aluminium shells to customised bits of radio front-ends.

Ozzie assumes that wireless networks will be fast and pervasive. But what the software crowd (whether it's in Redmond, Silicon Valley or London) fails to comprehend is that there are no 3G networks to connect tablets or clouds to, in most parts of the world. Nor the money to justify the build-out in the depth needed to realise Ozzie's vision. Nor the willingness to support subscription-based business models. Even in the developed world, ubiquitous, high-performance indoor networks would need fibre everywhere for WiFi or femtos.

And let's not even touch on the legal and regulatory hurdles. But good luck trying to persuade the GSMA to ditch roaming charges for data, especially for cloud devices. Maybe give the WTO a call and see if they can sort it out over the next decade or two? Until then, I'll keep my local storage and processing, thank you very much.

It's interesting that none of the best-known software billionaires talk about the "post-PC era". Maybe that's because, through their non-IT philanthropic work, they get exposure to true "ecosystems", ones based on far more complexity than Ozzie's filtered view of computing. We're not yet living in a post-malaria world, despite Gates' heroic efforts through his Foundation.

At a recent conference, I crossed swords with a well-known and outspoken financial analyst, who complained that PCs hadn't evolved in 20 years. I pointed out that sharks & crocodiles haven't evolved in over 100 million years. They are still occupying and controlling their own (literal) ecosystems rather better than humanity does with its.

Ozzie's comment about devices that are "instantly usable, interchangeable, and trivially replaceable without loss" also displays the overwhelming naivete of the software mindset.

It ignores the fact that end-users (you know, customers) like expensive, unique and tangible hardware. It performs many social and behavioural functions, not just acting as a cloud-services end-point. Software isn't loved and cherished, and neither will cloud services be. Software isn't polished, used as a status symbol, or oohed-and-aahed over. Nobody walks across a cafe to excitedly ask a stranger if they've *really* got the new OS7.3.

Yes, there will be some "trivially replaceable" devices (3G dongles, for example, already are), but anything truly personal will stay expensive. Again, Apple understands this - as does Nokia, squeezing out an extra few bucks at the low-end, differentiating hardware against no-brand competitors in the developing world.

Telecom dinosaurs refer to "dumb pipes". I predict that software/cloud dinosaurs will refer to "dumb devices". Both are wrong. (Yes, I know - Larry Ellison already got this one wrong in the 1990s)

Yes, cloud services will be more important and we'll see more devices accessing them, or optimised for them.

But the notion that this means that the world is somehow destined for a "post PC" era remains as risible now as it did a year or two ago, when the term was first coined.

Tuesday, 26 October 2010

Moving away from measuring mobile broadband "tonnage" (the mythical $/GB)

Posted on 01:19 by Unknown
There's a couple of charts on mobile broadband that I'm now heartily sick of seeing:

1) The "diverging curves" chart, with data traffic rising exponentially, and revenue only growing anaemically, if at all.

2) The "comparative $ per GB" for LTE vs. HSPA and so forth.

The first one made an interesting point and observation about two years ago, but is now well past its sell-by date. In particular, it always gets positioned as a "problem statement" when it's just an illustration.

I don't think I've ever heard a compelling argument for why it's actually an issue - not least because a similar chart is possible for virtually every technology industry out there. Imagine a similar chart of "computer processing cycles" vs. revenue spent on silicon chips. Or "packets processed" vs. router revenues. Or flight passenger-miles vs. airfares, and so forth.

Not only that, but it encapsulates price elasticity - average price goes down, so average usage goes up. Surprising?

A much more useful chart would be one which directly links weak revenue growth to stronger growth in costs for the operators (ideally combining aspects of capex + opex).

The problem is that the measure of mobile broadband by "tonnage" (ie GB of data) has become the single over-riding metric used by much of the industry, certainly in public. Sure, much more subtle calculations are used by the people who actually have to dimension and build the infrastructure, but the public persona of the "evil Gigabyte of traffic" as the key culprit dragging down the industry has a wide and damaging impact elsewhere.

Policymakers, marketers, vendors and investors often gloss over the details - and their opinions often have knock-on impacts based on this simplistic and often-meaningless analysis.

In particular, I see a risk that the "tonnage" argument is used spuriously in debates about Net Neutrality or other aspects of competition and regulatory affairs. That chart is a lobbyist's dream if used effectively - never mind if the message and true chain of causality get a bit blurred. It's a pretty good tool for "educating" users with simple messages about pricing and tiering, too. Only yesterday, I saw a presentation from a very well-known strategy consulting firm that pursued exactly this line of debate, with highly questionable conclusions and assertions.

Given that we already know the real problems are often to do with busy-hour / busy-cell issues and signalling load rather than tonnage - or uplink rather than downlink load - the trite comments about "20% of users consuming 80% of the capacity" are not meaningful.

On the other hand, if there was a way to prove that "20% of users are causing 80% of the problems, and driving 80% of the extra costs", that would be a very different - and much more compelling - story. But at the moment, the diverging-curves chart is simply an exercise in non-sequiturs.

(On a related topic - if you use the chart to assert that traffic should always be proportional to revenues, aren't you also just implicitly arguing that SMS prices should really be cut by 99.9% to fit your neat desired pattern?)
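For the sake of argument, here's that comparison worked through, with hypothetical but ballpark-plausible 2010-era prices:

```python
sms_price = 0.10            # $ per SMS - hypothetical
sms_payload = 140           # bytes of payload per SMS
data_price_per_gb = 10.0    # $ per GB of mobile data - hypothetical

GB = 1024 ** 3
sms_price_per_gb = sms_price * (GB / sms_payload)   # ~$767k per GB

print(f"SMS works out at ~${sms_price_per_gb:,.0f} per GB")
print(f"...roughly {sms_price_per_gb / data_price_per_gb:,.0f}x the data rate")
```

On those assumptions, SMS is priced tens of thousands of times higher per GB than ordinary data - which is precisely why "price proportional to traffic" is a slogan, not an economic argument.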

The fact is that the cost side of the equation for mobile broadband is very, very complex - which is why the second chart is misleading too. Yes, I absolutely agree that LTE can squeeze more bits per Hz per cell across the network, which makes it much more efficient than older WCDMA deployments - but surely, the baseline for most of today's MBB is using R5 or R6 or R7 HSPA? And yes, the flatter IP architecture should also be cheaper.

But then there's the question of whether we're talking about deploying LTE in 5, 10 or 20MHz channel widths, which spectrum band it's going to be used in, whether it can use the existing cell-site grid, whether it costs extra to have an overlay on top of the old 3G networks, and of course any incremental costs of devices, testing and so forth.

It's not much use having a chart which essentially says "all other things being equal....", when in the real world they're patently not equal. It's the same as arguing that hydrogen-powered cars are much more efficient than petrol, but conveniently forgetting about the infrastructure required to support them, or what to do with the rather large legacy installed base.

Not to mention, of course, that it again uses the mythical Gigabyte as the arbiter of all things to do with broadband economics. No mention, once again, of signalling, offload, spectrum, up/downlink and so forth. No consideration of any additional costs for supporting voice as well as data.

Plus, I'm pretty sure it's the same chart I've seen recycled through about 17 iterations, with little clarity on the original source, or how old it is. I think it pre-dates most of the discussion around small cells, SIPTO and various other innovations - and quite possibly also comes from before the finding that things like fast dormancy and RNC load are often limiting factors for smartphone-heavy networks.

If freight companies bought vehicles based on "tonnage", then a small container ship would cost more than a Boeing 747. There is very little rationale for telecom networks to be based on such a narrow metric either.

To me, the fact that these two charts are repeated almost as mantras points to this being another "tyranny of consensus". Obviously, tonnage measured in GB is an easy metric for the industry and end-users to understand. But that doesn't mean it's the right one.

Wednesday, 20 October 2010

Will telcos have a positive "balance of trade" in APIs?

Posted on 07:02 by Unknown
I had an interesting discussion today, about mobile cloud applications and syncing.

In part, it covered my regularly-expressed disdain for social network aggregation, and the metaphor of the extended address book on mobile phones. I remain unconvinced that users really want to have their contact lists heavily integrated with Facebook, Skype, Twitter or whatever.

Nevertheless, it highlighted an interesting angle I wasn't previously aware of. I'd recognised that most of the major Internet players had exposed APIs so that developers can hook into them for various mashup purposes - but I hadn't realised how the terms and conditions worked.

In particular, I didn't realise that if you're a large-scale user of some APIs (Facebook, say), then you have to pay. So an operator or handset vendor wanting to do a complex enhanced-phonebook with imported photos & status updates, for millions of users, is not only competing with the standalone downloadable Facebook client, they may also be writing a cheque for the privilege. Ditto if they want to give an MNO-customised variant to their users out-of-the-box.

I've been saying for a while that the power of Apple, Google, Facebook et al was such that operators play Net Neutrality games with them at their peril ("No, how about *you* pay *us*?"). I hadn't realised that they - or some of them, at least; I don't have details, which will undoubtedly be confidential - were already throwing their weight about.

Now, I've also been talking and writing about operator-provided APIs for a long time as well, including through my work with Telco 2.0. Initiatives like the GSMA's OneAPI, as well as telco-specific services like BT Ribbit and many others in the mobile world, point the way towards operators selling access to messaging, billing, authentication, voice call control and numerous other features and functions.

In theory. In the real world, telcos' commercial sales of API access have been evolving at a glacially-slow pace, hamstrung by painful back-end integration work and lots of design-by-committee delays. In the meantime, some supposedly valuable telco "assets" have now depreciated to being (essentially) worthless, such as location. I expect "identity" to be the next to be replicated and improved by Internet players.

So... the telcos' API revenue streams haven't yet materialised to a meaningful degree. But instead, they're starting to have to spend money on other companies' APIs in order to provide the services their customers actually want.

I wonder where we'll be in a few years' time, in terms of the "balance of trade" in APIs - will operators be "exporting" or "importing" more? In which direction will the net flow of cash be going? It will probably be difficult to measure, but it's certainly going to be an important analytical question to answer.

Webinar on Holistic approach to Mobile Broadband Traffic Management *tomorrow*

Posted on 04:02 by Unknown
A quick heads-up that I'm doing a webinar tomorrow morning (Thursday 21st) on holistic approaches to mobile broadband traffic management. I'll be covering various themes around Net Neutrality, the need for radio-, location- and device-awareness in mobile policy management, and the challenges of "silo" solutions such as video compression operating without understanding the user's and network's context.

I'll also be covering the role of what I'm terming a "Congestion API" as a goal to work towards - an eventual win/win/win for user, operator and content/app providers.

Sign-up details are here: http://bit.ly/9q9vLk

Disclosure: this webinar is being conducted in conjunction with (and sponsored by) a client of mine, CCPU

Monday, 18 October 2010

Upstream billing in two-sided telco models will face same challenges as normal user billing

Posted on 09:01 by Unknown
I just read this post on the Telco 2.0 blog. While it's an interesting idea (to try to create a business model for free broadband, using apps as a means to create cable-TV style service bundles and sell lots of advertising), at first glance I can see multiple flaws in the theory and approach.

I'm not going to do a forensic dissection of it for now, because it highlighted a specific set of problems likely to be generic across all "upstream"-paid telco models: how to do fair and secure billing to the advertisers / content / app provider, especially where there's some form of volume-based charging involved.

Let's assume for a moment that non-neutrality of access is permitted, at least to specific upstream walled gardens, rather than the full Internet. So to use that blog post's idea, maybe the user has a set of apps from "useful" services such as Facebook, YouTube, Salesforce or whatever, rather than the full open Internet through a browser. Each app's access profile can be analysed and accurately modelled, based on the various new app-analytics frameworks evolving.

So, in theory, each app's owner could be charged for the data consumed within its application - eg YouTube gets billed $x per GB and so on. It's a bit like freephone or freepost services, paid for by the company, not the user.

Sounds like the Holy Grail for monetising broadband, no?

No. I'm unconvinced, on several levels.

Firstly, what are the protections for the upstream provider? Are they exposed to a potentially unlimited bill, for example if a virus or unscrupulous competitor racks up huge downloads? I'm not sure how this works with Freephone, but presumably it's a lot easier to track usage and spot fraud and abuse - if a robot keeps calling your number, your call centre agents will spot it pretty fast. For freepost, I'm pretty sure that if you stuck the prepaid envelope to a brick or something else heavy, the recipient wouldn't get stung for a huge postage bill.

Next, how are errors and re-sends accounted for? Netflix is probably not going to be happy if a network outage or other glitch means resending large volumes of data and thus having to pay twice.

What are the appropriate metrics for billing, anyway? A simple cost per MB or GB of data downloaded is a spectacularly poor way of pricing data, especially on mobile, as it doesn't reflect the way that network costs build up. How are uploads priced compared to downloads? For mobile, how do you deal with small-volume / high-signalling apps? Do you charge apps that download during quiet hours the same amount as those that are oriented towards peak-hour commuters?
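To illustrate the sort of rating logic this implies - time-of-day pricing plus a liability cap to protect the upstream party - here's a minimal hypothetical sketch (all rates, hours and caps invented):

```python
from dataclasses import dataclass

PEAK_RATE = 0.50      # $ per MB during peak hours - hypothetical
OFFPEAK_RATE = 0.10   # $ per MB off-peak - hypothetical
PEAK_HOURS = range(7, 20)

@dataclass
class UpstreamAccount:
    name: str
    monthly_cap: float    # agreed liability cap, against runaway bills
    charged: float = 0.0

    def rate(self, hour: int) -> float:
        return PEAK_RATE if hour in PEAK_HOURS else OFFPEAK_RATE

    def bill_session(self, megabytes: float, hour: int) -> float:
        """Charge a session, never exceeding the agreed monthly cap."""
        charge = min(megabytes * self.rate(hour),
                     self.monthly_cap - self.charged)
        self.charged += charge
        return charge

video_co = UpstreamAccount("video_co", monthly_cap=1000.0)
print(video_co.bill_session(100, hour=18))  # peak: 100 MB * 0.50 = 50.0
print(video_co.bill_session(100, hour=3))   # off-peak: 100 MB * 0.10 = 10.0
```

Even this toy version has to answer the questions above - and it says nothing yet about re-sends, fraud detection or cascaded payments between providers.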

Then there are future "gotchas". How are mashups dealt with? What happens when one app discovers the other apps running concurrently on a PC or phone or set-top box, and routes some traffic through their exposed APIs? On the server side, is Facebook expected to pay for YouTube traffic displayed within its own app, without a way to "cascade" payments onward?

There are plenty more issues I could list.

The bottom line is that upstream charging is going to need just as much sophistication in terms of rating, realtime capabilities, anti-fraud, revenue assurance, policy and so forth, as we currently see in normal downstream billing. For some of these, even more sophistication will be needed as things like anti-fraud will need to be bi-directional.

Overall, this isn't going to be easy - and it's not obvious that operators or billing vendors have yet gone far enough down this path in terms of ready solutions.

Monday, 11 October 2010

New Report: Zero chance that IMS RCS will become a massmarket service, but niches may be possible

Posted on 21:21 by Unknown
I have just published a new research report on the many failings of the Rich Communications Suite (RCS), which is a proposed IMS-based service for enhanced mobile messaging, phonebooks and communications services. The industry effort is coordinated by the GSMA's RCS Initiative.

My belief is that RCS is not "fit for purpose" as a massmarket application on mobile devices. It is late, it is inflexible, and it has been designed with many flawed in-built assumptions. The report identifies at least 12 different reasons why it cannot and will not become a universal standard. I've written about RCS several times before, since its inception more than 2.5 years ago.

Although originally intended as an "epitaph" for a dead technology, I've tried to be open-minded and see if something might be salvaged. The report gives a few possible ways that RCS might be reincarnated.

I've been writing about mobile implementations of IMS for more than 4 years. In mid-2006, Disruptive Analysis published a report I authored, analysing the complete lack of useful standards defining what constituted an IMS-capable phone. I've subsequently talked about the failings of the MMtel standard for mobile VoIP in another report on VoIPo3G, and the recklessness of attempting to tie the LTE sports car to the IMS boat-anchor (or indeed, a dead parrot to its perch).

On the other hand, I've been broadly supportive of the GSMA's initiative for VoLTE, as it essentially does what I suggested in the 2007 VoIPo3G report - create a workable "bare bones" replacement for circuit voice over mobile IP. Although it doesn't preclude later adoption of some of the extra baggage that MMtel carries (video and so-called "multimedia", for example), it focuses on the here-and-now problem of making mobile VoIP phone calls, and interconnecting them.

If you look at why IMS has had moderate success in the fixed world, it's because it has been driven by direct PSTN-replacement, often with absolutely no additional features and gloss for the end-user. Yes, there can be more bells-and-whistles for corporate users, or people who want a fancy home screenphone. But it also works for seamless replacement of ordinary phone services for people who want to re-use their 1987-vintage handset plugged into a terminal adapter.

VoLTE also seems to be an attempt to "start simple" - essential, because there will be enough problems with optimising VoIP for mobile (coverage, battery, QoS, call setup, emergency calls etc), without layering on additional stuff that most customers won't want, initially at least. There will also, critically, be a ton of competition from other voice technologies, so speed and flexibility are critical.

Lastly, VoLTE can be deployed by individual LTE operators, without the need for all cellular providers (or handset manufacturers) to adopt it. VoLTE can interconnect quite easily with any other telco or Internet voice service, much as circuit voice, fixed VoIP or Skype can today.

RCS is another matter entirely. Rather than being a "bare-bones" way to migrate SMS (and, OK, MMS) to IP, in both network and handset, it attempts to redefine the messaging interface and phonebook UI at the same time. Rather than just getting SMS-over-mobile-IP to work as well as the original, it layers on additional complexities such as presence, IM and a reinvention of the handset's contacts list. It has been presented as enabling operators to create their own social network platforms, and interoperate amongst themselves.

All this extra functionality is intended to be a "core service" of future phones and networks, with an RCS software client functioning at the very heart of a handset's OS, especially on feature-phones. It is intended to ship (out of the box) in new handsets - although aftermarket RCS clients should also be available on Android and other devices.

Most of the initial thought seems to be an attempt to replicate a 2005-era MSN or Yahoo Messenger IM client. It certainly doesn't appear to have been invented for an era of downloadable smartphone apps, current-generation mobile browsers, mashups - or the harsh realities of a marketplace in which alternatives such as Facebook, BBM and various Google, Apple and Ovi services are already entrenched.

Much of the RCS Initiative effort is focused on interoperability. While this is very worthy, it is unfortunately demonstrating interoperation with the wrong things: other operators' RCS platforms, rather than the hundreds of millions of people happily using other services already. There are some hooks into Facebook and related services, but these fall into the already-failed effort to put a clunky layer of social network aggregation on top of already-refined services.

The net result is that if an interoperable RCS is going to do something *better* than Facebook, it needs to be available to *everyone* and be adopted by all their friends. In reality, this means that all operators in a country will need to deploy it (roughly) together. And it means RCS functionality needs to be in all handsets - even those bought unsubsidised by prepay users.

As a result, RCS is yet another of a long series of "coalitions of the losers". It has not been designed with either web mashups or 3rd-party developers in mind. (At the 2010 RCS developer competition, there were "almost 40" entries). It has not been designed with the idea that an individual operator could launch a blockbuster service across all networks. It comes from an era of mobile central-planning, where a roomful of people could determine a single universal architecture and service, despite a lack of awareness of users' actual behavioural needs.

Smartphones, and the emergence of Apple, Google, Nokia, RIM, Facebook and others as applications powerhouses, have now guaranteed that there will never again be another single, ubiquitous mobile service controlled solely by the operators. That ship has sailed, MMS being the last vessel limping from the port.

Let me ram that point home. There are now more users of open and flexible 3G smartphones than there were total cellular subscribers when MMS was first invented. Almost all of them know what a good IM or social network service looks like - and which ones their friends are on.

The report covers all the topics raised here in greater depth, and also looks in more detail at some of the other "minor" gotchas around RCS, such as its impact on radio network signalling and handset battery life. Or its complete lack of obvious business model. Or any enterprise focus.

As I mentioned above, the report does contain some suggestions about possible "salvage" use cases for RCS. It could be used to create single-operator niche services - maybe a music or movie fan service, perhaps.

Contents, pricing and online purchase of the report are available here, or contact me directly via information AT disruptive-analysis DOT com.

(Note: I'm trying out a new online payment / download service, please drop me a message if there are any problems and I can send the report by conventional invoice and email)