Under the sea: Microsoft unveils results of first ever ocean-based data ‘cloud’
Microsoft has literally taken cloud computing to new depths. The software giant recently lifted a 38,000-pound container of data from the ocean floor and says the results of its first-ever undersea server tests are promising.
The team unveiled details of its new brainchild, called Project Natick, which may soon broaden global horizons in terms of data storage and data transmission.
Last August, Microsoft developers lowered a 10-foot by 7-foot container housing a datacenter onto the seafloor, approximately one kilometer off the Pacific coast of the US. The vessel was compact and needed no supervision beyond a diver's check once a month.
Once the vessel was put on the seafloor, the researchers monitored the container from their offices in Building 99 on Microsoft’s Redmond campus, according to a press release.
“A wild ocean adventure turned out to be a regular day at the office,” Norm Whitaker, who heads special projects for Microsoft Research NExT, said.
The team stayed in touch with the container remotely, using cameras and other sensors to record measurements such as temperature, humidity, the amount of power the system was drawing, and the speed of the current.
“When I see all of that, I see a real opportunity that this could work,” Microsoft’s Sean James said. “In my experience the trick to innovating is not coming up with something brand new, but connecting things we’ve never connected before, pairing different technology together.”
It was James’ naval service on a submarine that inspired him to suggest the project.
While the planning was quite complex, the idea was rather simple. It too drew inspiration from submarines, with designers creating a round container to house the datacenter.
“Nature attacks edges and sharp angles, and it’s the best shape for resisting pressure,” Ben Cutler, the project manager who led the team behind this experiment, explained.
It took 90 days to build the vessel that housed the experimental datacenter. Unlike land-based servers, which must be tailored to their terrain and environment, the underwater facility turned out to need no special adjustments.
Right away, the problem of cooling the facility and keeping the computers from overheating solved itself: the deeper the container sits, the colder the surrounding water. The approach is also less costly, more energy-efficient and eco-friendly.
“But what was really interesting to me, what really surprised me, was to see how animal life was starting to inhabit the system,” Christian Belady, general manager for datacenter strategy, planning and development at Microsoft, said.
“You think it might disrupt the ecosystem, but really, it’s just a tiny drop in an ocean of activity,” Whitaker added.
The underwater datacenter also reduced latency by shortening the distance to population centers and thereby speeding data transmission, which researchers say is a big advantage of the scheme.
Retrieved in November, the container is now sitting in the lot of one of Microsoft's buildings. The team is planning to submerge a vessel four times bigger with as much as 20 times the computing power. Microsoft is also evaluating test sites for the new container, which could stay underwater for at least a year.
“We’re learning how to reconfigure firmware and drivers for disk drives, to get longer life out of them. We’re managing power, learning more about using less. These lessons will translate to better ways to operate our datacenters. Even if we never do this on a bigger scale, we’re learning so many lessons,” says Peter Lee, corporate vice president of Microsoft Research NExT.