Technology trends for 2010

Whether it’s in mobile telecoms, fixed-line infrastructure, cloud computing or security, there will be changes and innovations throughout this year that carriers will have to assimilate and adapt to.

The days of the groundbreaking killer app or huge sea changes in technology direction may be behind us, but today more than ever consumers expect ubiquitous access to the internet, and take it for granted that they can do business online. Similarly, enterprises not only expect every link in their supply chain to be connected, but are also looking to the cloud to cut costs and reduce complexity, so it’s more important than ever that the telecoms business stays abreast of what’s coming out of the labs and even the most subtle changes in how technology is used.

So, what technologies are on the edge, what’s being deployed, and what should we be looking out for in the coming 12 months?

IP security from carriers

Through the 1990s and 2000s, the burden of coping with security threats fell on the user, either at home or in the enterprise. Now, by common consent, any user attempting to cope alone will be overwhelmed, and so some of the burden passes to the carrier.

While Symantec grew to be the fourth largest software company in the world on the back of selling packaged product, Messagelabs achieved much slower growth selling hosted security. When Symantec acquired Messagelabs for $695 million in 2008 it signalled, if not a changing of the guard, a changing of the emphasis.

Messagelabs, now being rebranded as Symantec Hosted Services, has a thriving carrier business providing either branded or white-label security to carriers and ISPs that find themselves with the responsibility not just to provide the connectivity, but to secure it as well. “There are now a number of things the carrier has to do. Isolating customer traffic that shares one infrastructure is essential. A carrier is in a good position to inspect packets. It can secure VoIP against eavesdropping. It needs to be aware of denial-of-service attacks,” says Paul Wood, senior intelligence analyst at Messagelabs.

The threat, Wood says, is much harder to spot than even in the recent past: the basic advice that enterprises “need a policy” is no longer enough. An enterprise also needs access to intelligence on systematic threats, the ability to predict threat patterns and the ability to analyse traffic patterns.

“It is much harder to recognise an attack when it is taking place. Industrial espionage is hard to spot, and traditional countermeasures cannot spot it,” Wood adds. Messagelabs has noted more organised security threats emerging: there have been increased attacks on companies that rely on intellectual property – pharmaceutical companies, for example – in an attempt to pre-empt their patent filings; botnets are increasing in sophistication, and “Fast Flux” botnets hide phishing and malicious websites behind constantly changing addresses, making them hard to trace; and specialised malware is created by gangs to target vulnerable devices.

One example of the latter was 2009’s attacks on ATMs, which showed detailed knowledge of their operation and security. Messagelabs warns that in 2010 this activity will spread to electronic voting systems, both commercial systems and those used in elections. The threat needs to be identified and eliminated in real time.
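
Fast Flux hosting illustrates why that real-time identification is hard: the addresses behind a malicious domain change every few minutes. The sketch below is illustrative only and uses Python’s standard library; it shows one crude heuristic, repeatedly resolving a domain and flagging an unusually large, shifting pool of IP addresses. Real detection systems also weigh DNS TTLs, the diversity of hosting networks and historical reputation data.

```python
import socket
import time

def observed_addresses(domain, samples=5, interval=2.0):
    """Resolve a domain several times and collect the distinct IPs returned."""
    seen = set()
    for _ in range(samples):
        try:
            for family, _, _, _, sockaddr in socket.getaddrinfo(domain, None):
                seen.add(sockaddr[0])
        except socket.gaierror:
            pass  # the domain may not resolve on every attempt
        time.sleep(interval)
    return seen

def looks_fast_flux(domain, distinct_ip_threshold=10):
    """Crude heuristic: an unusually large, shifting pool of IPs behind one
    domain over a short window is one indicator of fast-flux hosting."""
    return len(observed_addresses(domain)) >= distinct_ip_threshold
```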

The carrier isn’t just a natural partner in these efforts, it is the best-placed supplier to take the role. Every year Messagelabs publishes its security trends to watch in the year to come, and in 2010 the industrialisation of security threats demonstrates the need for a carrier to have two roles in security: to catch threats early, but also to be part of a wider effort to analyse and identify emerging threats. “The first wave of carriers are doing enough, but there are a group of laggards who still do the bare minimum,” Wood says. “The problem is there is no statutory requirement for carriers to provide security, and we have to ask if some are doing enough to deal with this problem.”

The dramatic and sudden rise in false answer supervision (FAS) fraud (Capacity, December 2009), affecting carriers and their enterprise customers alike, is a test case. In 2010 the False Answer Supervision Forum will convene and try to find enough carriers willing to share intelligence so that the threats can be identified. If all carriers act independently, the ability to cut FAS fraud is compromised. So it is with botnets, denial-of-service attacks and other organised security threats. In 2010 the increased requirement for carriers to provide security services will be a test not just of their ability to provide a managed service, but also of their willingness to cooperate for the benefit of the entire industry.
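
In FAS fraud, a call is signalled as answered before or without the called party actually picking up, so the originating carrier is billed for connections that never really happened. The sketch below is purely illustrative: the CDR fields and thresholds are assumptions, not any forum’s agreed method, but it shows the kind of pattern, implausibly high answer rates combined with a spike in very short “answered” calls, that shared cross-carrier baselines would help to confirm.

```python
from dataclasses import dataclass

@dataclass
class CallRecord:          # hypothetical, simplified CDR fields
    route: str             # terminating route or carrier
    answered: bool         # answer supervision received
    duration_seconds: int  # billed duration

def fas_suspect_routes(cdrs, asr_threshold=0.95, short_call_share=0.5,
                       short_seconds=10):
    """Flag routes whose answer-seizure ratio is implausibly high and where
    a large share of 'answered' calls last only a few seconds, a pattern
    commonly associated with false answer supervision."""
    by_route = {}
    for cdr in cdrs:
        stats = by_route.setdefault(cdr.route, {"calls": 0, "answered": 0, "short": 0})
        stats["calls"] += 1
        if cdr.answered:
            stats["answered"] += 1
            if cdr.duration_seconds <= short_seconds:
                stats["short"] += 1
    suspects = []
    for route, s in by_route.items():
        if s["answered"] == 0:
            continue
        asr = s["answered"] / s["calls"]
        short_ratio = s["short"] / s["answered"]
        if asr >= asr_threshold and short_ratio >= short_call_share:
            suspects.append(route)
    return suspects
```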

Mobile billing drives profit not ARPU

“The marketing department is finally understanding the value of the network that it has,” says Michael Manzo, chief marketing officer of billing platform provider Openet. After a decade of rapid technological innovation, 2010 may see more creativity in how those platforms are used to drive retention and service, as well as ARPU.

ARPU is an easy measure to calculate, but says nothing about the profitability or lifetime value of a customer. According to Manzo the newest data applications that operators are putting into place will drive not just ARPU, but profitability in markets where there is little headroom in what users will spend.
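
The distinction is easy to show with rough numbers. The toy calculation below uses invented figures to contrast ARPU with a crude customer lifetime value (monthly margin multiplied by expected lifetime), showing how two customers with identical ARPU can be worth very different amounts.

```python
def arpu(total_revenue, subscribers):
    """Average revenue per user: simple to calculate, but blind to cost and churn."""
    return total_revenue / subscribers

def customer_lifetime_value(monthly_revenue, monthly_cost_to_serve, monthly_churn):
    """A crude lifetime-value estimate: monthly margin multiplied by the
    expected customer lifetime (1 / churn rate)."""
    margin = monthly_revenue - monthly_cost_to_serve
    expected_lifetime_months = 1.0 / monthly_churn
    return margin * expected_lifetime_months

# Two customers with identical ARPU of 10 can have very different value:
print(customer_lifetime_value(10.0, 3.0, 0.05))  # loyal, cheap-to-serve user: 140.0
print(customer_lifetime_value(10.0, 6.0, 0.20))  # costly, churn-prone user: 20.0
```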

“Loyalty is especially relevant in low ARPU markets where there are a lot of prepay customers. With throwaway phones we are seeing no sense of loyalty. They are saying, give me what I want at a reasonable rate, and so at that point it becomes efficient to have alternative billing methods,” he says.

Manzo spotlights two trends. The first is the ability to use advanced billing to provide service passes for limited times or limited locations. “We have seen them in public areas and hotels, for example, but if we were to expand that we’re getting to something like an e-wallet. The user gets a range of data services on a range of devices, and can pay for them in many different ways.” In an era where users will care less about how their smartphone or netbook is connected to the network, and more about the access they get from it, the service pass gives them short-term access on a network of their choosing.
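
As a rough illustration of what such a pass involves, the sketch below models a time- and location-limited entitlement. The structure and field names are assumptions made for the example, not any vendor’s schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ServicePass:          # hypothetical structure, for illustration only
    user_id: str
    service: str            # e.g. "mobile_data"
    valid_from: datetime
    valid_until: datetime
    allowed_locations: set  # e.g. cell or venue identifiers; empty means anywhere

    def grants_access(self, service, location, at=None):
        """A pass grants access only for its service, inside its time window,
        and (if it is location-limited) at one of its allowed locations."""
        at = at or datetime.utcnow()
        if service != self.service:
            return False
        if not (self.valid_from <= at <= self.valid_until):
            return False
        if self.allowed_locations and location not in self.allowed_locations:
            return False
        return True

# A 24-hour data pass bought for a single hotel access point:
start = datetime.utcnow()
day_pass = ServicePass("user-42", "mobile_data", start,
                       start + timedelta(hours=24), {"hotel-lobby-ap"})
print(day_pass.grants_access("mobile_data", "hotel-lobby-ap"))  # True
print(day_pass.grants_access("mobile_data", "airport-ap"))      # False
```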

The second is the deployment of integrated charging and policy management on a mobile network: not just allowing users on to the network, but controlling the service they get. “Policy is evolving, and we can take it beyond who can use a network into what speeds they can access, which application types can be used, fair usage caps, even quality of service guarantees. You can have a number of tiers, or you can even provide a service that’s entirely a la carte,” Manzo says.
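
A minimal sketch of that kind of tiered policy decision follows. The tier names, caps and speeds are invented for illustration; a real integrated policy and charging function sits on dedicated network elements rather than application code.

```python
# Hypothetical tier definitions: policy controls speed, application types,
# fair-usage caps and QoS priority per subscriber tier.
TIERS = {
    "basic":    {"max_mbps": 2,   "blocked_apps": {"p2p"}, "monthly_cap_gb": 5,    "qos_priority": 3},
    "standard": {"max_mbps": 10,  "blocked_apps": set(),   "monthly_cap_gb": 20,   "qos_priority": 2},
    "premium":  {"max_mbps": 100, "blocked_apps": set(),   "monthly_cap_gb": None, "qos_priority": 1},
}

def policy_decision(tier, app_type, gb_used_this_month):
    """Return the treatment a session should get under the subscriber's tier:
    block, throttle past the fair-usage cap, or admit at the tier's speed."""
    rules = TIERS[tier]
    if app_type in rules["blocked_apps"]:
        return {"allow": False, "reason": "application type not permitted"}
    cap = rules["monthly_cap_gb"]
    if cap is not None and gb_used_this_month >= cap:
        return {"allow": True, "max_mbps": 0.5, "reason": "fair-usage cap reached"}
    return {"allow": True, "max_mbps": rules["max_mbps"],
            "qos_priority": rules["qos_priority"]}

print(policy_decision("basic", "video", 6))   # throttled: cap of 5GB already reached
print(policy_decision("premium", "p2p", 50))  # admitted at full speed, no cap
```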

These sophisticated customer acquisition and retention tools will be the keys to generating profitable revenues wherever in the world you are based, he says, as the methods can be adapted to provide tiers of service in any market – as long as you have the billing and mediation services in place to manage them. While five years ago it seemed that small, hungry operators were making inroads thanks to their lack of legacy technology and to innovative models such as the MVNO, much of the expertise to take the next step belongs to the operators who have experience in customer segmentation or in providing tiers of service in the fixed carrier world. “The ones we see doing this are the market leaders: AT&T, Verizon, Sprint, Orange,” says Manzo.

For competitors who are serious about using their networks to improve profitability and loyalty in 2010, the marketing department needs to move beyond a narrow focus on ARPU, but it also needs to work collaboratively with the engineers and with providers. “Five years ago the operators would have come to us saying: ‘We want a mediation and charging platform’. There wasn’t a lot of vision required. Today it’s a much more consultative engagement,” says Manzo. For smaller, more agile operators, a focus on innovation may be what they need to close the gap.

Flexibility and the public cloud

After the announcements comes the hype. After the hype comes the realisation that this is a business opportunity after all.

The frenzy to build data centres from 2006 to 2008 is, perhaps, the greatest investment in technology innovation the computing business has ever made, supported by the investments in networks that carriers made at the end of the 20th century and the beginning of the 21st.

If the first job of cloud computing providers like Microsoft, Google and Amazon was to provide flexibility in how enterprises do their processing, the second wave, arriving in 2010, will be how those resources are allocated and charged.

Virtualisation has created an extremely efficient market, giving enterprises the opportunity to create time-limited servers and to migrate running workloads live, not just from one server to the next but from one data centre to the next. Hybrid clouds will become much more of a focus in 2010: enterprises that are comfortable moving resources from local data centres into their private clouds will look for excess capacity in the public cloud, buying it where appropriate from public providers.
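
The scheduling decision at the heart of that hybrid model is simple to sketch. The toy example below, with invented workload names and capacity units, fills private capacity first and bursts the remainder to a public provider; real placement engines would also weigh data sensitivity, latency and egress costs.

```python
def place_workloads(workloads, private_capacity_units):
    """Toy scheduler for the hybrid model described above: fill the private
    cloud first, then burst whatever does not fit out to a public provider."""
    private, public = [], []
    remaining = private_capacity_units
    # Place the largest workloads privately first (simple greedy heuristic).
    for name, size in sorted(workloads.items(), key=lambda kv: kv[1], reverse=True):
        if size <= remaining:
            private.append(name)
            remaining -= size
        else:
            public.append(name)
    return private, public

demand = {"erp": 40, "web_front_end": 25, "nightly_batch": 50, "test_env": 10}
on_premise, burst = place_workloads(demand, private_capacity_units=80)
print(on_premise)  # ['nightly_batch', 'web_front_end'] fit in the private cloud
print(burst)       # ['erp', 'test_env'] are bought from a public provider
```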

This is a natural fit for carriers, who own not just the capacity but the ability to manage its allocation and the security of data as it travels from private to public and back again. But technology providers are challenging carriers’ ability to innovate in their business models by integrating concepts like service-oriented architecture (SOA) with the network’s capacity to deliver that service wherever enterprise customers need it. In this way the technology provider becomes the valued supplier, and the network is simply the connection.

“If you look at virtualisation as well as SOA, those were technologies that needed to mature before we could provide these services over the internet in an on-demand fashion,” says Werner Vogels, CTO at Amazon.com. When Amazon launched its enterprise cloud computing services in August 2006, many thought it a strange fit; yet by packaging and selling the flexible use of resources it had developed internally, Amazon has continued to innovate.

The latest idea from Amazon is Spot Instances: customers bid on unused Amazon EC2 (its public cloud service) capacity, and can run their instances for as long as their bid is greater than the spot price. For users with flexible start and stop times – for example those doing financial modelling or video conversion – this is a financially attractive way to get computing resources at the lowest market price. Those who want guaranteed capacity can still reserve instances or buy on demand.
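
A simplified sketch of that pricing model is below. It assumes hourly spot prices and charges the market price for each completed hour, which glosses over the finer points of EC2’s actual billing, but it captures the essential trade-off: cheap capacity in exchange for the risk of interruption.

```python
def run_spot_instance(bid, spot_prices):
    """Sketch of the Spot Instance model described above: the instance keeps
    running for each period in which the bid covers the spot price, and is
    interrupted as soon as the spot price rises above the bid."""
    hours_run = 0
    cost = 0.0
    for price in spot_prices:
        if price > bid:
            break              # capacity reclaimed: the instance is interrupted
        hours_run += 1
        cost += price          # charged the market spot price, not the bid
    return hours_run, cost

# A flexible batch job (e.g. video conversion) bids $0.10 per hour:
hours, cost = run_spot_instance(0.10, [0.04, 0.05, 0.08, 0.12, 0.06])
print(hours, round(cost, 2))   # runs for 3 hours at a cost of $0.17, then stops
```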

Aggressive innovation in pricing and service is the focus of all public cloud providers in 2010. With even the US Federal Government committing heavily to the public cloud as a future supplier of its processing resources, the challenge for carriers is to find partnerships that cement their role in the cloud revolution.

The message from cloud providers is that they will not stop innovating in the pricing of the services they provide. “It’s about how quickly, in how much of a granular basis, we can provide the service. Otherwise it’s the same bits and pieces we have always offered,” says John Engates, the CTO of Rackspace. And Hal Stern, SVP and CTO of global sales and services at Sun Microsystems, points out that simply allowing customers to use cloud computing is no longer enough: they need to be able to use it in the way that offers most economic benefit. “It’s not just about allowing more things to be run, it’s elasticity. You run things when you want them, and then give it back.”

Big switches: smart grids

It’s a different kind of network altogether: Cisco has always had designs on a larger goal than simply selling routers (CEO John Chambers calls the company’s side projects “market adjacencies”), but revolutionising the way that power is supplied is an ambitious project even for the Great Acquirer itself. No matter how many ambitious technologies the company develops, or how many visionary start-ups it buys, it cannot hope to succeed without the support of the industry. That’s why in 2010 Cisco’s “Smart Grid” will stand or fall on the activities of its partners. They will make Cisco’s concept of ruggedised and secure switches and routers, plus an open standard for the data passing through them, into a reality.

“Smart Grids”, as the jargon has already named them, will get much focus in 2010 from producers of power, anxious to adopt new regulation and drive efficiency, from consumers, anxious to cut costs – and from equipment providers like Cisco, which can make this a reality. In the US, for example, the grid loses 7% of power to transmission losses, theft and mechanical problems. Fixing these problems, Cisco estimates, is a $20 billion global market over the next five years.

No one can doubt the need for an effective way to make power grids as efficient as possible. “New requirements such as integrating renewable sources of energy… cannot be met with traditional approaches,” says Marthin de Beer, SVP Cisco emerging technologies group. The Smart Grid aims to create a secure environment in which a complicated patchwork of suppliers and consumers can be brought on or offline as required, and in which smart meters can optimise power demand. The data collected can be stored and analysed, unlike today’s power networks, which have expanded in scope but not complexity. “Traditional systems were designed for one-way information flow,” says Calvin Chai, Cisco’s senior manager for security solutions marketing.
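
The two-way flow Chai describes can be illustrated with a toy demand-response loop: meters report consumption upstream, and when total demand exceeds available capacity the controller signals the least critical loads to curtail first. The figures and load names below are invented, and a real grid controller would operate on far richer telemetry and market signals.

```python
def demand_response(meter_readings_kw, capacity_kw, shed_priority):
    """Toy two-way smart-grid loop: meters report consumption upstream, and
    when total demand exceeds available capacity the controller sends
    curtailment signals back down to the lowest-priority loads first."""
    total = sum(meter_readings_kw.values())
    to_curtail = []
    excess = total - capacity_kw
    if excess <= 0:
        return to_curtail                   # demand fits within supply
    for meter_id in shed_priority:          # least critical loads first
        if excess <= 0:
            break
        to_curtail.append(meter_id)
        excess -= meter_readings_kw[meter_id]
    return to_curtail

readings = {"office_hvac": 120.0, "ev_chargers": 80.0, "data_hall": 200.0}
print(demand_response(readings, capacity_kw=350.0,
                      shed_priority=["ev_chargers", "office_hvac"]))
# -> ['ev_chargers']: total demand of 400kW exceeds 350kW, so 80kW of charging is shed
```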

It’s not just a question of saving the planet: Smart Grids may help to save the businesses of equipment providers. Cisco was not the only one to suffer a sales drop in 2009. Standard & Poor’s points to a “sharp drop” in its investment review for communications equipment providers, but expects a better 2010 based on innovations like the Smart Grid. On the other hand, 2009 showed that even the most urgent investments can be delayed if the money is not there: so equipment investment programmes with regulatory or compliance push behind them offer more certainty than most.

“Expect industry profitability metrics to improve markedly,” says Standard & Poor’s, “[but] we remain wary of the poor sales visibility.”

Among Cisco’s partners also pinning their hopes for a better future, financially and ecologically, on the Smart Grid are Cable & Wireless Worldwide, Capgemini, GE, Verizon and Wipro.

Nancy Gofus, senior vice president of business products, Verizon, says that “just as IP has helped so many industries transform the way they do business, an IP-enabled Smart Grid has the power to improve how we manage and preserve energy resources.” Capgemini “is working already for a number of clients on both sides of the Atlantic,” says Colette Lewiner, the energy, utilities and chemicals global sector leader.
