Thursday, June 26, 2014

The federal cloud computing plan


by Mark Ollig

In 2009, the federal government began moving its computerized records from local data hubs to cloud-based data storage centers.

A public cloud-based data server stores information in a remote geographic location. The user accesses this information over a network, like the Internet.

There are also private, community, and hybrid cloud-based data servers.

Storing data files remotely in the cloud eliminates the need to keep them on a local computer’s internal or external hard drive.

Using cloud-based data storage also protects against losing data when a computer crashes or a hard drive fails.

Major cloud-based data storage centers back up user data multiple times, and encrypt the information being stored.
The data storage servers are also provisioned with redundant power.
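
For the curious, here is a minimal Python sketch of the client-side version of that encryption idea: scrambling data before it ever leaves for a cloud server. The third-party “cryptography” package and the sample data are my own choices for illustration, not any particular provider’s method.

```python
# Minimal sketch: encrypt data locally before uploading it to cloud storage.
# Uses the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate a symmetric key. In practice the key must be stored safely;
# losing it means losing access to the encrypted data.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"Sensitive records that should never travel unencrypted."

# Encrypt locally; the ciphertext is what would actually be uploaded.
ciphertext = cipher.encrypt(plaintext)

# After downloading, the same key decrypts the data.
assert cipher.decrypt(ciphertext) == plaintext
print(ciphertext[:40], b"...")
```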

One analogy for storing data on a local hard drive versus in the cloud is how we obtain our electricity from a commercial power utility.

Back in the 1800s, in Troy, NY, Henry Burden built a 70-foot-tall by 12-foot-wide iron and wood water wheel to harness the power of the water rushing out of a stream fed by a waterfall next to his factory.

Burden used this power for operating the factory’s machinery.

By the early 20th century, it became more cost-effective to operate factory machines using electricity from the commercial power grid, versus constructing and maintaining water wheels or other independent power generation systems.

Today, it is becoming not only more cost-effective, but also more practical and convenient, to use cloud-based solutions for storing and retrieving data, instead of purchasing and maintaining one’s own data storage devices.

One reason the Office of E-Government and Information Technology proposed the federal government’s shift to cloud-based storage was to help decrease information technology (IT) costs.

Another reason was the advantage of storing and retrieving cloud-based data from remote locations.

Other advantages include security, reliability, and the means to automatically stay up-to-date with the cloud’s rapid technical innovation.

In 2010, the US Chief Information Officer released the “25 Point Implementation Plan to Reform Federal Information Technology Management.”

The purpose of this plan was to improve the efficiency of federal technology resources.

One of its points was to enact a “Cloud First” policy.
Agencies were to reassess and modify their IT financial planning to include cloud computing.

This policy required federal agencies to put cloud-based solutions into service, as long as they were secure, economical, and reliable.

One concern some federal agencies raised was moving data they currently controlled to an outsourced cloud service provider.

It was obvious in the report I read that these agencies were having issues “trusting” the cloud, even though the data would be fully encrypted.

They were not alone in their hesitation.

I read that Steve Wozniak, co-founder of Apple Computer, expressed his apprehension about storing one’s private data in the cloud.

Of course, we have been storing data in the cloud for a while now.

Our private e-mail messages are stored in the cloud, using providers such as Gmail, Yahoo, and AOL.

Some of us create and store our word processing documents using online office products from Google, WordPress, and Microsoft, accessible via the cloud.

We’re uploading our personal photos to Photobucket, Instagram, and other photo-sharing service providers using storage servers located in the cloud.

Videos we upload from our smartphones and other devices to YouTube, Vimeo, or Facebook are being saved in the cloud, as well.

Nearly all of our social networking now takes place on cloud-based servers.

However, a number of online social networking sites are not cloud-based, or even on the Internet.

There are vintage computer BBSs (bulletin board systems) still operating out there, accessible only through telephone dial-up modems.

Some folks (including yours truly) enjoy the nostalgia of being logged into a local community BBS.

Most of the computing applications we use today on our mobile devices are stored and processed on cloud-based servers.

Where we create, retrieve, share, and store our data is steadily moving onto cloud-based service platforms.

A May 13 Office of Management and Budget (OMB) report states HP and Cisco, leading manufacturers of the data storage servers, routers, and other equipment used for cloud-based services, are investing over $1 billion during the next two years in cloud-computing products, services, and development.

This investment will go, in part, toward developing new low-power processors. These will become part of “The Internet of Everything,” which we will see embedded in many devices by 2020.

Google recently announced substantial price drops for its cloud computing services and storage.

Microsoft recently purchased a high-performance, cloud computing company called GreenButton.

I have no doubts: the “cloud wars” among service providers have officially begun.

Gartner Research shows annual US spending on public cloud services will increase approximately 20 percent this year, reaching $158 billion.

OMB states conversion to cloud-based services has increased, as has the construction of cloud-based data centers.

Some 60 percent of Fortune 500 companies will be deploying cloud-based “social-enabled solutions” via social media networks by 2016.

By 2017, 50 percent of mobile application development will use the cloud, and 14 million jobs worldwide will be created due to the increased growth of cloud computing.

A recent Microsoft study reports 45 percent of organizations have gone beyond the “pilot phase” for instituting cloud-based computing services.

The study said 32 percent of organizations have developed cloud computing plans as part of their IT and business strategy.

Some 24 percent of organizations presently consider themselves to be “heavy cloud users.”

Nearly 40 percent of executives surveyed expect parts of their organization’s services to be utilizing the cloud by 2015.

By 2020, it is said, cloud data centers will become cloud “ecosystems,” having incredibly fast response times, with their data traveling over high-speed broadband network connections to users.

The changeover to cloud-based computing services continues to expand, as the applications and services we use in the cloud keep growing.

I like to think of our transition to cloud-based services as the next technological evolution in how we access, create, share, apply, and store information.
 


Photo: Mark Ollig

Thursday, June 19, 2014

Military GPS replacement technology



by Mark Ollig



Global Positioning System (GPS) earth-orbiting satellite technology for guiding US military payloads began in the 1970s.

During the 1960s, US Navy ships used related technology for obtaining hourly fixes on their positions.

The technology used for the GPS had its early beginnings during the 1950s.

The Advanced Research Projects Agency (ARPA) began under the Eisenhower administration in 1958, and was instrumental in the creation of a new computing network originally called ARPAnet.

The ARPAnet evolved into what is known to all of us today as the Internet.

In 1972, ARPA was renamed DARPA (Defense Advanced Research Projects Agency).

Our good friends at DARPA say they are working every day “creating new technological solutions and transitioning them into practice.”

As the military’s use of GPS technology for navigation and precise guidance of weaponry grew, so did adversaries’ attempts at blocking the data being sent from GPS satellites.

In the last few years, reports have increased of GPS signal jamming by equipment that generates radio-frequency waves or signal noise to act as interference.

Signal noise can trick a GPS receiver into believing a GPS satellite is not available.

It is a violation of federal law to operate, market, or sell any type of jamming equipment (including GPS signal jammers).

The US Department of Defense has encountered situations in which GPS data was intentionally blocked using high-powered signal jammers.

Besides the intentional jamming, there are times when the GPS signal can be lost, such as when a device using GPS is inside a heavily walled area of a building, underwater, or underground.

In 2012, DARPA was working on a more reliable replacement technology to guide the military’s missiles, armaments, smart weapons, drones, and other payload packages without GPS.

DARPA hopes to build a single, 3D-structured computing chip able to perform all the functions accomplished by today’s earth-orbiting GPS satellites.

The program for replacing dependency on GPS is called Chip-Scale Combinatorial Atomic Navigator, or C-SCAN.

The advanced computing chip, to be used for navigation, will contain atomic and solid-state inertial sensors.

It will become a part of DARPA’s Microtechnology for Positioning, Navigation and Timing (Micro-PNT) program.

This program is designing computing sensor technology that gives equipped devices GPS-like capabilities, allowing them to operate where a GPS signal sometimes does not reach.

Because of recent breakthroughs in microfabrication and miniaturization techniques, a single computing chip package can contain the needed gyroscopes, accelerometers, magnetometers, and atomic clocks inside one self-contained Micro-PNT sensor.
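
For a rough feel of what those inertial sensors make possible, here is a simplified 2D dead-reckoning sketch in Python. It only illustrates the basic math of integrating gyroscope and accelerometer readings into a position; the sample readings are invented, and DARPA’s actual algorithms are far more sophisticated.

```python
import math

# Simplified 2D dead reckoning: integrate gyroscope (turn rate) and
# accelerometer (forward acceleration) readings to track position
# without any GPS fix. Real inertial navigation must also correct
# for sensor drift, which is why atomic clocks and self-calibration
# matter so much in DARPA's design.
def dead_reckon(samples, dt=0.1):
    x = y = heading = speed = 0.0
    for turn_rate, accel in samples:          # rad/s, m/s^2 (invented)
        heading += turn_rate * dt             # turn rate -> heading
        speed += accel * dt                   # acceleration -> speed
        x += speed * math.cos(heading) * dt   # velocity -> position
        y += speed * math.sin(heading) * dt
    return x, y

# Accelerate straight ahead, then coast through a gentle left turn.
samples = [(0.0, 1.0)] * 50 + [(0.2, 0.0)] * 50
print(dead_reckon(samples))
```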

The newly designed Micro-PNT sensor is physically small.

Its 8-10 cubic millimeter package consists of a dome-shaped “bubble” made from silica, and contains a “rate integrating gyroscope.”

This bubble appears to be nested inside a high-quality, military-grade hardened plastic.

An array of small, combined antennas radiates from the bubble.

I uploaded a futuristic-looking photo of what DARPA thinks a C-SCAN Micro-PNT chip package may end up looking like: http://tinyurl.com/bytes-pnt.

A few weeks ago, Northrop Grumman was awarded the contract from DARPA for developing the miniaturized, navigation-grade inertial system part of the C-SCAN program.

“This microsystem has the potential to significantly reduce the size, weight, power requirement and cost of precision navigation systems,” said Charles Volk, vice president, Advanced Navigation Systems, at Northrop Grumman.

The microsystem will use newly developed microtechnology, including self-contained, chip-scale inertial navigation and precision guidance.

The Micro-PNT technology will be advanced enough “for applications such as personnel tracking, handheld navigation, small diameter munitions, and small airborne platforms,” according to Dr. Andrei M. Shkel, DARPA program manager.

The ultimate goal of this new high-dynamic-range, low-power chip is continually knowing where on the planet a device is located, without having to rely on GPS satellites or maintain a “line-of-sight” with a communications satellite.

One of DARPA’s visions, seen during a presentation (approved for public release) by Dr. Shkel, is to have these devices use self-contained, active guidance systems to orient munitions to their targets during military missile engagements, without relying on GPS.

This is not a small technological task, to say the least.

Another description from Dr. Shkel’s presentation says the Micro-PNT uses “inertia of elastic waves, self-calibration/cross-calibration algorithms, and 3D fabrication.”

The price per Micro-PNT chip package may range from approximately $50,000 to $100,000.

Over the years, we have seen new military technology eventually introduced into civilian life, such as in 2000, when the US government stopped intentionally degrading the civilian GPS signal, giving the public full use of GPS technology.

Someday, instead of using GPS, we may be navigating with a Micro-PNT sensor built directly into our smartphones and other devices.

Roger L. Easton, who is given credit for designing and inventing GPS satellite technology, passed away May 8 of this year.

The official US Global Positioning System government website is: http://www.gps.gov.

Source: http://www.darpa.mil


Friday, June 13, 2014

From global to 'local' warming

by Mark Ollig


Yours truly understands it’s June, and we are just beginning to enjoy the warmth of the summer season.

However, for today’s column, let’s turn back the calendar a few months.

I promise we will recommence with June in just a few minutes. 

Ready? Good. 

Here we go. 

Think of the days when we were in the midst of our coldest weather. 

It’s wintry cold outside, and rather than heating an entire building, or a single room for that matter, why not instead provide warmth in only the area “surrounding” a person?

The research folks at the Massachusetts Institute of Technology, better known as MIT, have come up with an innovative system which tracks a person, and then envelops the area they occupy with radiated warmth.

They are calling the project Local Warming.

This system tracks people’s whereabouts throughout a building, using newly developed Wi-Fi-based location tracking.

Ceiling-mounted, dynamic heating elements respond in real time to the location information sent by the Wi-Fi tracking system.

The intelligence within these heating elements will track and “zero in” on one or more individuals, and create a personalized envelope of warmth around them.

The idea behind this is to save energy and lower heating costs by focusing the heat directly around a person or persons in a given area, instead of heating the entire square footage of a room or a large, open expanse.
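
As a rough illustration, here is a toy Python sketch of the kind of control logic such a system might use: given occupant positions reported by the Wi-Fi tracker, switch on only the nearby ceiling heating elements. The heater grid, coordinates, and two-meter warmth radius are all invented for this example; MIT has not published this as its code.

```python
import math

# Toy sketch of "Local Warming" control logic: given occupant positions
# reported by a Wi-Fi tracking system, switch on only the ceiling
# heating elements close enough to warm each person.
HEATERS = [(x, y) for x in range(0, 10, 2) for y in range(0, 10, 2)]
WARMTH_RADIUS = 2.0  # meters (invented for illustration)

def heaters_to_activate(occupants):
    active = set()
    for px, py in occupants:
        for hx, hy in HEATERS:
            # Activate any heater within the warmth radius of a person.
            if math.hypot(px - hx, py - hy) <= WARMTH_RADIUS:
                active.add((hx, hy))
    return active

# Two people tracked in the room; every other heater stays off.
print(heaters_to_activate([(1.0, 1.5), (7.5, 6.0)]))
```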

Granted, wrapping a specialized environment around a person as they move about sounds a bit like science fiction. However, this idea has seized the imagination of many in the technological community.

“Infrared heat is emitted to generate what are essentially spotlights of warmth centered on people a few meters away,” said Leigh Christie, MIT project engineer.

The first commercial use of Local Warming technology, according to a lead researcher, could be heating systems targeting warmth to people walking through the exposed, outdoor areas of a building complex.

Places where Local Warming systems could be installed include office buildings, malls, homes, large lobbies, lofts above industrial buildings, and areas of buildings people seldom occupy.

Energy and monetary savings could be realized by not having to waste resources maintaining a constant heated temperature, via climate control, throughout an empty or sparsely occupied building.

Local Warming technology installed in these types of buildings would track and provide for a person’s warmth needs only within the area they occupy, returning to the building’s previous climate-controlled temperature once the space is unoccupied.

Specific temperatures would be maintained while infrared beams followed the tenants throughout the building.

“Local Warming allows participants to engage with their climate directly and to enact a new type of efficient, localized climate control,” explained Miriam Roure, lead researcher of the Local Warming project. 

A Local Warming display area was set up, and successfully demonstrated, during the 14th Venice Architecture Biennale, held a couple of weeks ago in Venice, Italy.

One method of local warming yours truly utilizes has been benefiting people for years, uses no external power, and is very low-tech. 

When I need to create a personalized envelope of warmth around myself while at home, I take out the large comforter blanket I got for Christmas. Wrapped around me, it holds in the warmth I alone generate, thus saving heating energy resources, and money.

On a more serious note, I expect various-sized Local Warming systems to find applications in the future.

These systems will co-exist with other designs as a supplementary means of conserving energy and reducing a building’s climate-control costs during the colder months of the year.

We can now return to our regularly scheduled month of June.

Photo: Giulia Bruno


Thursday, June 5, 2014

Apple's 25th developer conference



by Mark Ollig


       
“Write the code. Change the world.”

This was the theme for the 2014 Worldwide Developers Conference (WWDC), hosted by Apple.

It’s the 25th year Apple has met with the software developers (programmers) who create the applications (apps) used on Apple’s operating system (OS) computing platforms.

In 1990, 1,300 developers gathered to talk about Apple’s System 7 OS.

Today, Apple has 9 million registered developers.

More than 1,000 Apple engineers were available to talk with the developers who came from 69 countries.

These app developers came to get an in-depth look at the latest in Apple’s iOS and OS X operating systems.

They use this information to create new and improved apps, which will benefit those of us who use devices running these operating systems.

Last week’s 2014 WWDC included more than 100 conference breakout sessions for app developers and Apple engineers.

The WWDC allowed programmers to check out Apple’s newest software designs and applications.

They also brought their own programming code to review with Apple programmers.

App developers took advantage of the 120 available labs to work with Apple engineers on improving their software coding techniques.

This year’s keynote address began with a smiling Tim Cook, Apple’s CEO, taking the stage to loud applause from the folks in the capacity-filled Moscone West, in San Francisco.

Cook pointed out the youngest app developer in the audience was 13 years old.

He also noted the current installed base of Apple Mac computers is now 80 million.

Craig Federighi, Apple’s senior VP of software engineering, later addressed the audience about the new Mac computer operating system.

It is called OS X 10.10 Yosemite.

Laughter erupted from the audience when Federighi jokingly said they almost named the new operating system “OS X WEED.”

Yours truly imagined this could have turned out to be a real “smoking” operating system for Apple.

Some of the features in Yosemite include a new translucent design, providing a distinct visual appearance of depth and transparency.

Another is its new system font, Helvetica Neue, a sans serif optimized for Retina displays. It’s thinner, rounder, easier on the eyes, and very clean looking.

This version has translucency, meaning the images, toolbars, content, and applications in the background, along with the desktop wallpaper, will faintly show through.

Using Dark mode, a user will see the menu bar change into a translucent, black shade in the background, instead of the brighter white theme normally seen.

The translucent background is a way to keep a computer user focused on their current program application.

The Notification Center will better utilize the calendar, allowing customization for weather and other apps, such as the ESPN ScoreCenter.

An improved Spotlight was featured. A single keystroke launches this search tool.

Spotlight appears in the middle of the screen; you then type what you’re searching for in its search bar.

Search results come from Bing, Wikipedia, the iTunes Store, Apple Maps, and other sources.

AirDrop now works between mobile iOS devices and desktop OS X computers using Continuity, which is a proximity-awareness technology.

Features of this technology include wirelessly transferring documents and phone information between an iPhone and a Mac.

It allows the iPhone caller ID information to be seen on your Mac. You will be able to answer a phone call ringing on the iPhone from your Mac computer.

The Mac could also be used as a speakerphone.

A new cloud-based user storage medium, iCloud Drive, works with Apple’s iCloud.

iCloud Drive is integrated into the Finder, allowing a Mac user to browse through, add, or remove files as if they were on the Mac’s physical hard drive.

The iCloud Drive will even be available to users of the Microsoft Windows OS platform.

Apple’s new Mac Yosemite OS has improved integration with their mobile devices using iOS, making for a more seamless use of applications between the two.

Yosemite will become available to Mac users this fall at no cost – it became available to app developers last Tuesday.

Federighi also presented Apple’s new iOS 8 for its smart mobile devices.

Here is the list of the Apple mobile smart devices iOS 8 can be used on when it becomes available this fall:

• iPhone 4s

• iPhone 5

• iPhone 5c

• iPhone 5s

• iPod touch 5th generation

• iPad 2

• iPad with Retina display

• iPad Air

• iPad mini

• iPad mini with Retina display

HomeKit, running on an iPhone, uses Siri to control the connected smart devices within a home.

There’s also HealthKit, for which Apple has partnered with the Mayo Clinic.

It monitors a person’s vital statistics as collected by third-party wellness apps.

The information HealthKit collects can be forwarded to your healthcare provider, if you choose to do so.

HealthKit also comes with the Health app, which includes a virtual dashboard where you can monitor all of your own pre-defined health metrics and obtain informative feedback.
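
To picture that dashboard idea, here is a toy Python sketch of pre-defined metrics checked against target ranges for simple feedback. The metric names and ranges are invented for illustration; this is not Apple’s actual HealthKit interface.

```python
# Toy sketch of a health "dashboard": user-defined metrics checked
# against target ranges for simple feedback. Metric names and ranges
# are invented; this is not Apple's HealthKit API.
TARGET_RANGES = {
    "resting_heart_rate": (50, 80),   # beats per minute
    "sleep_hours": (7, 9),
    "daily_steps": (8000, 20000),
}

def dashboard_feedback(readings):
    for metric, value in readings.items():
        low, high = TARGET_RANGES[metric]
        status = "in range" if low <= value <= high else "out of range"
        print(f"{metric}: {value} ({status}, target {low}-{high})")

dashboard_feedback({"resting_heart_rate": 72,
                    "sleep_hours": 6.2,
                    "daily_steps": 9500})
```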

The iOS keyboard was also overhauled with QuickType, a predictive typing suggestion feature.

QuickType is personalized: it learns “how” you type, suggesting words while you are texting.
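
As a rough illustration of that idea, here is a toy Python sketch of predictive typing using simple word-pair (bigram) counts learned from past messages. Apple’s QuickType is far more sophisticated; this only shows the flavor of the technique, and the sample messages are invented.

```python
# Toy sketch of predictive typing: learn word-to-word (bigram) counts
# from a user's past messages, then suggest likely next words.
from collections import Counter, defaultdict

def train(messages):
    model = defaultdict(Counter)
    for msg in messages:
        words = msg.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1   # count each observed word pair
    return model

def suggest(model, last_word, n=3):
    # Return up to n most frequent words seen after last_word.
    return [w for w, _ in model[last_word.lower()].most_common(n)]

history = ["see you at lunch", "see you tonight", "running late for lunch"]
model = train(history)
print(suggest(model, "you"))   # e.g. ['at', 'tonight']
```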

Apple continues its journey of slowly transitioning itself from the Steve Jobs era.