Where the cloud lives
Thirty metres below Stockholm, in a former civil-defence nuclear shelter, Swedish ISP Bahnhof AB's Pionen Data Centre is the perfect storage facility for sensitive material - WikiLeaks' files are kept there. With its 40-cm-thick doors, Pionen has been compared to a Bond villain's lair.
Who's leading the market in revenue?
Rackspace: $1.025 billion (U.S.) (estimated)
Amazon Web Services: $1 billion (U.S.) (estimated)
Who has the most servers?
Google: 900,000+ (estimated)
Microsoft: 700,000+ (estimated)
What share of corporate computing is happening in the cloud?
61% in 2011
68% in 2014 (projected)
More than 50% of Global 1000 companies will have stored customer-sensitive data in the public cloud by the end of 2016
Where the cloud lives
Co-location centres like 151 Front St. West are third-party-owned spaces where a firm like Rogers can place its servers next to a competitor like Cogeco's. In addition to sharing security and maintenance costs, companies link (or "peer") with each other, which reduces connecting times for users.
Where the wired things are
The crossroads of Canada's telecommunications network is hiding in plain sight
In early August, one small link in Bell's vast national network of cables suffered a 30-minute outage that sporadically stopped customers from accessing some websites, proving again that, in order to function, the Internet - that diffuse network of networks - still relies on a huge collection of physical things.
One of these is a drab, brown building at 151 Front St. West in downtown Toronto, which a senior wireless carrier executive once joked would be the ideal site for a terrorist attack, since a strike would shut down much of the country's communications, both wired and wireless.
"When you walk by the building, you can't tell what we are," says Doug Riches, a British electrical engineer who manages the ultra-secure co-location facility. "We like it that way." Canadians should be glad for the tight security; if you access the Internet from within this country's borders, it's all but certain your data has flowed through one of the 100,000 strands of fibre-optic cable strewn throughout this building's telecommunications suites, which house millions of dollars' worth of equipment. Companies like Cogeco and Rogers co-locate and join their networks together in this neutral location, instead of going through the wider Internet. If you're using Google - and who isn't? - your data is sure to have passed through a corner of one suite in particular: the Toronto Internet Exchange, or Torix - a non-profit association that runs a caged-off area on the sixth and seventh floors at 151 Front. Here, about 160 companies plug into each other directly.
So why do companies want to co-locate? Getting a spot in one of these facilities speeds up the connection between two networks and reduces wait times, as well as minimizing interruptions on latency-sensitive services like voice over IP. Jon Nistor, a systems engineer and president of Torix, says that having a presence in Torix also increases the resiliency of a company's connection, should something go awry in the physical world.
Companies co-located at Torix sailed through Bell's outage unscathed. "Torix is basically an interconnection point," Nistor says, as he walks around the cage, gesturing at servers covered in labels on pieces of tape. Torix pays 151 Front for space, and charges companies for access as well as supplier and contractor costs, though its goal isn't profit; it's simply to improve the Internet.
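The latency and resiliency arguments for peering can be sketched as a shortest-path problem: a direct exchange port adds a cheap link between two networks, and the old transit route survives as a backup. The networks and millisecond figures below are purely hypothetical.

```python
import heapq

def cheapest_path(graph, src, dst):
    """Dijkstra's algorithm over per-link latencies (illustrative ms values)."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale entry
        for nxt, ms in graph.get(node, {}).items():
            nd = d + ms
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# Hypothetical topologies: reaching ISP_B via a transit provider alone,
# versus also having a 1-ms port into a shared exchange ("IX").
transit_only = {"ISP_A": {"Transit": 12}, "Transit": {"ISP_B": 12}}
with_peering = {"ISP_A": {"Transit": 12, "IX": 1},
                "Transit": {"ISP_B": 12},
                "IX": {"ISP_B": 1}}
```

Here `cheapest_path(transit_only, "ISP_A", "ISP_B")` costs 24 ms while the peered graph costs 2 ms; and if the exchange link fails, the transit path is still there, which is the resiliency Nistor describes.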
Cloudy with a chance of high-frequency trading
Inside NYSE Euronext's U.S. liquidity centre in New Jersey, where cloud computing meets HFT
(1) New York Stock Exchange seat holders can plug their trading servers into the NYSE's matching engine, the computer that pairs buyers and sellers in electronic trades. Each server is an equal distance from the matching engine; if one firm's server were closer, its orders might arrive milliseconds before the others - an advantage worth millions in the world of HFT.
(2) The matching engine ferries orders from traders to several NYSE Euronext exchanges, such as NYSE Amex and NYSE Arca.
(3) In addition to the seat holders' servers, the liquidity centre features several cloud-computing servers. For a fee, non-seat holders can run virtual trading programs that place orders just as quickly as seat holders can. This is the first time in NYSE history that non-seat holders have been allowed to trade directly on the exchange.
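The NYSE's actual matching engine is proprietary, but the pairing it performs is commonly described as price-time priority: the most competitive order wins, and among equal prices, the earliest arrival wins. A minimal, hypothetical sketch of that rule:

```python
class MatchingEngine:
    """Toy price-time priority matcher (illustrative, not NYSE's engine)."""

    def __init__(self):
        self.bids = []   # resting buy orders:  [price, arrival, qty]
        self.asks = []   # resting sell orders: [price, arrival, qty]
        self._seq = 0    # arrival counter for time priority

    def submit(self, side, price, qty):
        """Add an order, then match whatever crosses; returns trades."""
        self._seq += 1
        (self.bids if side == "buy" else self.asks).append([price, self._seq, qty])
        self.bids.sort(key=lambda o: (-o[0], o[1]))  # highest bid, oldest first
        self.asks.sort(key=lambda o: (o[0], o[1]))   # lowest ask, oldest first
        return self._match()

    def _match(self):
        trades = []
        while self.bids and self.asks and self.bids[0][0] >= self.asks[0][0]:
            bid, ask = self.bids[0], self.asks[0]
            qty = min(bid[2], ask[2])
            trades.append((ask[0], qty))  # trade at the ask (simplification)
            bid[2] -= qty
            ask[2] -= qty
            if bid[2] == 0:
                self.bids.pop(0)
            if ask[2] == 0:
                self.asks.pop(0)
        return trades
```

With a resting sell of 100 shares at $10.00, an incoming buy of 50 at $10.05 crosses immediately and trades 50 shares at $10.00 - and because every co-located server sits an equal cable length from the engine, no firm's orders reach this loop ahead of another's.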
of Americans believe stormy weather would interfere with their cloud computing (Citrix survey, August 2012)
Where the cloud lives
Unlike Google, with its rumoured dozen-plus data centres, Facebook rents much of its capacity from third parties. That's changing - Facebook's first data centre outside the U.S. is under construction in Luleå, Sweden, where sub-zero temperatures will help rein in the cost of cooling 28,000 square metres worth of servers.
Countries with frigid climates are becoming hot spots for energy-hogging data centres. Why isn't Canada?
If the Big Data revolution can be compared to the Industrial Revolution, then massive server farms are the factories where the next historical shift is taking place. These huge, humming buildings house tens of thousands of servers and allow people in nearby regions to access their stored information more quickly.
They also consume enormous amounts of energy, making them expensive to operate. Consumer-facing tech giants like Google, Microsoft and Facebook are trying to locate, power and cool their data centres in the most efficient, eco-friendly way possible - by sucking in outside air and piping in water to chill the rows of servers, as well as using hydroelectricity. Canada, a country with both a lot of water and a lot of cold, should be the perfect home for the type of high-tech facility that, say, Google just built in a repurposed Finnish paper mill. Right?
Guess again. No global tech giant is rushing into our country. Is our water not wet enough? Our cold not cold enough?
As always, it's a numbers game. Facebook, for example, has server farms in the United States in order to be close to that country's 309 million people, as well as the more than 108 million in Mexico. Those farms can service Canada's population of 34 million without straining any fibre.
"I don't really think 'Canada versus the U.S. versus Mexico'; I view North America as kind of North America," says Tom Furlong, vice-president of Facebook's site operations. When it comes to latency - the computing term for system delays such as the time that elapses between the moment a user types the address of a Facebook page in their Internet browser and the moment that page is sent to the user - Furlong says the trip your data takes across the 49th parallel is "not that relevant. ... When you start crossing large bodies of water, that latency starts to get felt."
Furlong says Facebook's new data centre in Sweden will noticeably improve performance for its European customers, who have had to make do with an undersea cable connecting them to Facebook's East Coast server farm.
But even if Furlong views North America as a single entity, others don't. Indeed, some companies look at the U.S. and see a country of angsty, national-security-obsessed yahoos - and want to host their sensitive data elsewhere. "There's a belief that U.S. legislation has a lower threshold for search and seizure," says Strahan McCarten, the director of hosting and data centre services at Bell Canada. Bell operates about seven facilities across the country, ranging from 10,000 to 200,000 square feet. Companies pay Bell to use its data centres for a number of reasons, including Canadian privacy laws.
Yet those server farms are almost all in or near Canada's biggest cities. That's because these buildings not only require 24-7 on-site security, but they also have to be close to skilled electrical engineers and plumbers, as well as specialists in IT-related fields; also, clients sometimes need to visit their data. Though you might save cash by putting a heat-producing mega-facility on an ice floe, there isn't much point if you have to fly someone in from Calgary or Edmonton every time you need to replace an LED. /I.M.
of people who think they aren't using the cloud actually are
are banking online
have shopped online
report using social networking sites
have played online games
store photos online
have stored music or videos online
use online file-sharing services
Cloud computing market size
$40.7 billion in 2011
$61 billion by end of 2012
$241 billion in 2020
2,259,998,000 kilowatt-hours used by Google in 2010 / That's enough to power 76,756 Canadian households for a year
SOURCES CITRIX CLOUD COMPUTING SURVEY; 451 RESEARCH GROUP; FORRESTER RESEARCH; GOOGLE; STATISTICS CANADA
Where the cloud lives
Containerized modular data centres like the models eBay installed on the roof of its eco-friendly Project Mercury data centre in Scottsdale, Arizona, are very efficient. While a data centre built from scratch takes years, containers can be installed quickly - and they use less energy, even when it's 48 C outside.
It's raining data
The big question for cybersecurity experts: How do you catch a cloud and lock it down?
The massive data centres dotting the planet guard their secrets closely. Many have 24-hour security, backup generators, biometric eye scans and perimeter fences. Inside its data centres, Google employs what it calls "The Crusher," which drives an oversized, rounded steel arrowhead through the middle of servers that have died. That twisted heap of metal is then fed through "The Shredder," whence it emerges as a pile of unreadable green-and-silver shards.
That might prevent secrets from being stolen by dumpster divers, but The Crusher is powerless to keep safe the terabytes of vital data that still live in the cloud. More and more confidential information is moving online every day; tax, health care and insurance data will go soon, if they're not there already.
"The trend is not going to reverse," says Nirav Mehta, director of identity and data protection at RSA. "There will be a migration." Vital information long ago moved out of locked filing cabinets and onto password-protected computers. Next stop: the cloud.
When it comes to security, not all cloud computing models are the same. Whereas in "public" cloud services like Gmail, anyone can set up an account, "private" cloud services use security precautions like RSA's to restrict access, usually just to employees. If you access your work network from home, there's a good chance you've used, or at least seen, RSA's SecurID two-factor authentication keys and software. SecurID keys generate a new numeric code every 60 seconds; to get in, users must enter the current code correctly, along with their personal passwords.
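SecurID's internal algorithm is proprietary, but the same idea - a code derived from a shared secret and the current time, rolling over every fixed interval - is standardized openly as TOTP (RFC 6238), which is what most software authenticator apps use. A minimal sketch:

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, t=None, step=30, digits=6):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant).

    Note: this is the open analogue of tokens like SecurID, not SecurID's
    own proprietary algorithm; SecurID rolls every 60 s, TOTP defaults to 30 s.
    """
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both the token and the server derive the code from the same secret and clock, a stolen password alone is useless - the attacker would also need the physical key.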
Many services are a hybrid of the two models, keeping the most sensitive information and tasks in the private cloud. No security is infallible, however, which is why execs considering how to operate in the cloud feel a chill when they hear about events like the hack that compromised 70 million Sony PlayStation users worldwide (see below).
"There are some data sets that are best kept inside the company," says Microsoft Canada's national technology officer John Weigelt. "There have been some high-profile failures."
So how do you decide what parts of your business ought to be in the cloud? The nature of the data determines where it fits best. For data that needs to be widely accessible, like large databases of passwords (think Sony or Google), RSA has come up with technology that allows companies to break apart the data and store chunks of it separately, so that a breach wouldn't yield anything usable. For other types of data, Mehta envisions "community clouds" between like-minded organizations, such as health care providers and insurance companies.
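The article doesn't detail RSA's splitting technology, but the simplest version of the idea is XOR-based secret splitting: store random-looking shares in separate locations, so that any breach short of all of them yields nothing usable. A minimal, hypothetical sketch:

```python
import secrets
from functools import reduce

def split(data: bytes, n: int = 3):
    """Split `data` into n shares; ALL n are needed to reconstruct.

    The first n-1 shares are pure random bytes; the last share XORs
    with them back to the data. Any n-1 shares are statistically
    indistinguishable from random noise.
    """
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    final = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(data, *shares))
    return shares + [final]

def combine(shares):
    """Recover the secret by XOR-ing every share together."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shares))
```

Storing each share in a different data centre means a thief who breaches one facility walks away with random bytes; commercial products typically use threshold schemes (any k of n shares suffice), but the all-or-nothing XOR version shows the principle.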
Ultimately, the level of security will depend on the sensitivity of the information. Proprietary data such as CEOs' e-mails are best locked down in private clouds - if they're accessible from outside the company's offices at all.
But too much security can get in the way of cloud computing's real purpose: making it easier to get things done. Mehta is always aware of the fine line. In June, when he tried to check into a hotel in Tokyo, his credit card was frozen. "I had called [my credit card company] in advance, and it still failed," Mehta says. "It took me two or three phone calls to solve it." /I.M.
WHEN THE CLOUD BURSTS: THREE HUGE #FAILS
April 20, 2011
Hackers down Sony's PlayStation Network for 44 days; the outage costs the company an estimated $171 million (U.S.). In a related attack, the same hacking group leaks the personal information of roughly 75,000 SonyPictures.com users.
April 21, 2011
Amazon Web Services goes down in a massive crash. Reddit, Hootsuite, Kickstarter, FourSquare, Evite and many other services suffer outages and/or malfunctions for three to five days, showing just how popular AWS has become.
Oct. 10, 2011
A three-day outage of RIM's BlackBerry service gives the Waterloo firm a black eye, prompting corporate customers to consider ditching their BlackBerrys. As one CIO told Reuters, "This has brought [the issue] to the front burner."
The other business that Amazon dominates
The fastest-growing segment of the cloud computing market is Infrastructure as a Service (IaaS). Clients like Netflix use Amazon Web Services (AWS) to outsource the messy and time-consuming job of building and maintaining server farms, so they can focus on their own products. Amazon CEO Jeff Bezos says AWS could soon become as big as the company's $25-billion-plus retail business.
IaaS market share in 2011
SOURCES (IAAS MARKET) 451 RESEARCH