
Friday, July 29, 2016

Positioning part of 'the cloud' underwater

by Mark Ollig

Copyright © 2016 Mark Ollig

We know much of the data we manage on our computing devices is accessed and stored in the “cloud.”

“The cloud,” of course, is a common metaphor for “the internet.”

If you’ve shopped online at Etsy or Amazon, you’ve used cloud computing.

For me, cloud computing goes beyond online shopping: it includes remotely accessing computer servers, software programs, and services, and storing and retrieving information from computer data servers located inside a building within the same city, or across the country.

This building is called a data center.

Major online players such as Google, Microsoft, and Amazon, along with internet service providers and telecommunications, cable, and wireless service providers, make use of their own data centers.

Companies sometimes share space inside a data center, leasing floor space and equipment from another provider; others control the entire operation of their own data center.

Corporations and businesses accessing their data center usually do so over a private, dedicated network connection.

The data centers are usually networked to each other via fiber-optic cables.

Early this morning, I uploaded some 200 photos I had stored to my Google Photos account, which is, of course, in the cloud.

Come to think of it, I believe my Amazon cloud app automatically uploads my photos and stores them in Amazon’s data center service located in the cloud.

I therefore have double-cloud redundancy – amazing.

Of course, I am placing much faith in the cloud, hoping my precious photos will not be accidentally deleted, lost, or hacked into by some foreign superpower.

Maybe I should just print them out on high-quality photography paper, and place them in the photo album on the coffee table in the living room like I did in the 1990s.

IBM describes cloud computing as “the delivery of on-demand computing resources – everything from applications to data centers – over the internet on a pay-for-use basis.”

Many years ago, companies maintained their own computer systems located within a dedicated, climate-controlled computer room.

It cost companies a lot of money, time, and resources to support their own internal computer departments.

With access to today’s high-speed broadband and dedicated networks, many companies have decided to do away with maintaining an internal computer department altogether, and are outsourcing their computing needs.

Enter the data center.

Data centers, or “data farms,” are physical buildings located throughout the country, with many right here in Minnesota.

A data center provides a company with controlled computing costs, dedicated network access, security protection of their information, outsourced maintenance responsibilities, and multiple power and data storage system redundancies.

An individual data center can take up thousands of square feet of floor space; some occupy more than a million.

It’s critically important to keep the inside of the data center cool, especially where the data servers and other computing components are located.

The electronics used within a data center are kept cool so they operate more efficiently and last longer.

With the great amount of heat released by power systems, computer systems, electronic components, and other hardware, many redundant “chillers,” or cooling systems, are used to remove, or “chill,” this heat.

As mentioned, this cooling removes the tremendous amount of heat generated by the many rows of shelving bays containing thousands of data servers.

The card slots in the individual bay shelves contain printed circuit boards embedded with numerous electronic and power components.

The cost of cooling, and the limited land available for data centers in coastal regions of the world, were two reasons one large, well-known computing company decided to install a cloud data center – underwater.

The folks at Microsoft are researching how to manufacture and operate an underwater data center.

They call it Project Natick.

An underwater cloud-computing data center is intended to service major populations living near large bodies of water.

Microsoft states 50 percent of the population resides near large bodies of water, which is one of the reasons it is researching underwater data centers.

I mentioned chillers, and how cooling plays a critical role in keeping data centers operating. Microsoft states: “Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.”

Microsoft is considering harnessing the energy generated by the motion of tides and currents to produce the electricity powering its underwater data center.

An underwater data center connects to the cloud network using undersea fiber-optic cables, which route to a land-based station hub with connections to the internet.

Microsoft said an underwater data center would have a lifespan of five years, after which it would be retrieved, and “reloaded with new computers, and redeployed.”

Although Project Natick is still in the early research stage, it does look promising.

So, what does Natick mean?

Microsoft says it is simply a code name with no special meaning.

However, your always-investigative columnist discovered there is a town in Massachusetts called Natick.

The only familiar tech-industry name I found associated with Natick, MA, is Phil Schiller, Apple Inc.’s senior vice president of worldwide marketing, who was born in Natick and is regularly seen onstage during Apple events.

I assume it is merely a coincidence that Microsoft chose a project name matching the town where a highly recognized Apple executive was born.

The website for Project Natick is: http://natick.research.microsoft.com.


Find my Twitter messages (kept above water) at the @bitsandbytes user handle.


Photo Source: http://natick.research.microsoft.com