Microsoft is changing the data storage dynamic with a patent application that would disguise its experimental underwater data center as a natural seabed.
In August 2015, Microsoft lowered its experimental underwater data center into the ocean off the central coast of California to see if storing and running servers on the ocean floor was feasible. Project Natick, as it was dubbed, operated for 105 days from August to November to test the hypothesis that underwater data centers are not only possible but also more efficient than their land-based counterparts.
Microsoft declared the test a success, with no real issues; the trial even ran longer than first anticipated. The sealed pod was then returned to Microsoft's headquarters in Redmond, WA, for analysis.
For those unfamiliar with Project Natick, it was dreamt up in 2013 by Microsoft employee Sean James, a former member of the US Navy who had served on a submarine. James' experience operating beneath the waves inspired the idea of a water-tight capsule housing a cloud server unit, connected by fiber optic cables to the coast for data access. The server would be powered by renewable energy generated by the ocean waves themselves, and the cold depths would dissipate what little heat the servers generated.
The white paper describing what would become Project Natick, co-authored by James and fellow Microsoft engineer Todd Rawlings, caught the attention of senior leaders at Microsoft. One of those leaders was Norm Whitaker, who put the project team together in 2014 to begin planning and building the server capsule. Upon launch, the capsule was christened Leona Philpot and lowered into the ocean depths in 2015.
Microsoft's Underwater Data Center Gets More Aquatic
Now, almost two years later, Microsoft has filed a patent application for a new aesthetic change to the server capsule: the exterior will be disguised to look like a bed of coral, warding off data thieves, divers, and invasive aquatic animal life while encouraging non-interfering sea life to flourish. The underwater data centers will also be equipped with sensors that detect intrusion attempts and will wipe the data in case of a serious breach.
The idea behind building underwater data centers in the first place is a response to critical issues with current land-based data centers:
- Require lots of land - Data centers are massive and take up space that could be put to better use. Relocating data centers just offshore would free up precious land on the surface and, through Microsoft's patented disguise, encourage new growth for local sea life.
- Require lots of energy - Electricity must power not only the data centers themselves but also the cooling of the servers, which has proven time and again to be one of the most costly aspects of operation. Wave-generated power and the colder ocean depths would take care of both energy generation and heat dissipation, with minimal effect on the surrounding environment.
- Subject to physical data breaches - Like any standard building, data centers are always at risk of a physical break-in. Keeping data storage underwater and camouflaged makes physical breaches far more difficult.
- Built in remote places far from users - Large data farms are often built far from major population centers, which increases data access times. Since over half of the world's population lives near a coast, installing underwater data centers offshore would bring data closer to users and decrease access times.
Underwater data centers may become the norm one day, but how will they affect you?