
Friday, May 29, 2015

We've become immersed in technology



by Mark Ollig


Has the Internet become the new society humans want to essentially live in?

I see you smiling out there, but I am serious.

It’s not just the Internet.

People’s minds are becoming absorbed inside their high-tech smartdevices – figuratively speaking, of course.

The millennials, and those who grew up with the Internet and new technology, easily embrace it, and are a bit more comfortable around it than those of us from the “boomer” generation.

We boomers, however, are excellent observers, innovators, thorough researchers, and are able to adapt; so never count us out.

While writing today’s column, I began thinking about how many of us have become, dare I say, “addicted” to our cellphones, laptops, iPads, and other smartdevices.

Truth time: How many of you, before going to sleep, place your iPad, computing tablet, or smartphone on the nightstand next to your bed?

I do, too.

If I wake up during the middle of the night, I’ll grab my smartphone and go to Facebook and Twitter.

I’ll also check for any instant messages, or see if there is any late-breaking news posted on the CNN website.

Once, around 2:30 a.m., I posted an update on my Facebook page, and it immediately received two likes.

This made me feel less guilty about being on social media at that hour, and I eventually went back to sleep.

In this day and age, more young people are immersing themselves in 3D (three-dimensional) virtual reality computer games.

They spend countless hours playing inside these virtual 3D games, sometimes competing with multiple users while connected to the Internet via devices such as Sony Corporation’s PlayStation console.

“The Matrix,” a movie from 1999, depicted humans living out their physical lives inside a shell-like pod, with their minds hard-wired to a computer-generated, surrealistic virtual reality.

Morpheus is a character from this movie; interestingly enough, the new virtual reality (VR) helmet Sony will begin selling in 2016 is also called Morpheus.

It seems wherever I go, people have become totally preoccupied with their smartdevices.

Last Saturday morning, I stopped at the local coffeehouse and ordered my usual coffee beverage: a depth charge (black coffee spiked with a shot of espresso), with added light roast, and heavy cream – not half-and-half.

I make sure the barista uses the cream out of the quart container labeled “Ultra-Pasteurized Heavy Cream.”

Yes, I know. Yours truly is such a stickler – but a splash of heavy cream does make a big difference in the taste of my favorite brew . . . and besides, it’s low-carb.

The person taking my order looked down while pressing keys on the touchscreen of the coffeehouse’s POS (Point of Sale) computer register terminal.

I imagine this register terminal was connected to the coffeehouse’s local computer, which was probably networked to their corporate headquarters computer.

Glancing back at the seating area, I noticed how everyone’s attention in the coffeehouse was focused on the particular electronic device each was using.

Some of the tables in the coffeehouse seated only one customer; others had two, but each person was using a laptop, tablet, or a smartphone.

It was exceptionally quiet; there were no verbal conversations going on, just the occasional “order up” call from a barista, and the sporadic clicks of someone typing on a laptop computer.

Two people seated across from each other at one table were smiling (but not talking) while typing away on their Apple laptops.

I would not have been surprised if they were actually text messaging with each other.

The baristas’ eyes were also fixated on computer display screens.

I noted how often they looked at the monitor display screens near the coffee/espresso machines.

These monitors showed customer coffee orders needing to be completed.

The customers waiting in line for their coffee were either text messaging, or interacting on an Internet social media site from a smartdevice.

No one was talking; it was eerily silent.

I suppose some people have just become more comfortable conversing using text messaging or social media, rather than directly speaking with each other.

After picking up my 20-ounce to-go cup filled with its precious coffee goodness, I approached the front door and observed a young person entering – he was staring fixedly down at his smartphone; appearing oblivious to anyone else.

I smiled at him, and cheerfully said, “Good morning!” He paused for a moment, slowly looked up at me, smiled back, and said, “Good morning to you, too.”

This brief verbal interaction broke the silence, confirming courteous, face-to-face human communication is still alive and well – at least in this instance.

As I walked outside of the coffeehouse, I paused to look up at the deep-blue morning sky, felt the slight cool breeze, and enjoyed the comforting sunshine while sipping my coffee.

Above: Sony's new Morpheus VR helmet.
Source: Sony Corporation website.

Thursday, May 21, 2015

Thanks Snoopy - are you still out there?



by Mark Ollig



This mission was the decisive dress rehearsal.

It was the last practice run for identifying and resolving any loose ends, and it happened just two months before one of our planet’s most memorable events.

A very young yours truly was seated on the floor of the family living room; my eyes were intently focused on the images being shown on the screen of the Zenith console color television.

It was May of 1969, and Walter Cronkite was reporting from the CBS News Apollo Headquarters in New York.

Cronkite usually provided much enthusiastic commentary during these televised NASA space missions.

The Apollo 10 mission was the crucial test flight needed before the scheduled July moon landing of Apollo 11.

Apollo 10 was to make a “dry run” of all the operations and maneuvers necessary for a lunar module to land on the moon, and then rendezvous with a lunar-orbiting command module.

Apollo 10’s lunar module would not physically land on the moon; this would happen during the mission of Apollo 11.

It was said the lunar module was going to be “snooping around” the moon’s surface, so they named the Apollo 10 lunar module “Snoopy.”

Of course, it then made sense to name the command module, the spacecraft all three astronauts rode to and from the moon in, “Charlie Brown.”

On May 22, 1969, Apollo 10 achieved orbit around the moon. Astronauts Eugene Cernan and Thomas Stafford entered the two-stage lunar module and undocked it from the command module.

With Snoopy now floating free in lunar orbit, they began their descent toward the grayish lunar surface.

The command module, piloted by astronaut John Young, would continue to orbit the moon.

Snoopy tested its guidance computer, landing radar, and radio communications with Charlie Brown and Mission Control in Houston, TX.

They practiced firing the lunar module’s reaction control system (RCS) thruster quad engines, tested the lower-stage descent propulsion system and radar, and completed other procedures needed to simulate landing on and taking off from the moon.

Snoopy also made a survey of the Sea of Tranquility, the designated landing site for Apollo 11.

Using a new color television camera system, they showed the moon’s surface to everyone back on Earth.

The one thing Snoopy would not do was actually land on the moon; however, during the practice landing, Snoopy’s descent rocket engine was fired, and the lunar module descended to about 9.5 miles above the moon’s surface.

I often wondered if the two astronauts aboard Snoopy were ever tempted to keep descending and land on the moon.

Cernan and Stafford had the lunar lander to do it with, and besides, they were very close to the moon’s surface.

I later learned things would not have ended well had they set Snoopy down on the surface of the moon.

If Snoopy had landed on the moon, there would not have been enough fuel remaining in the propellant tanks of the lunar module’s ascent stage (the upper portion of the lunar module containing the astronauts’ crew cabin) for them to take off and reach a high enough lunar orbit to rendezvous with the command module.

Yes, they could have taken the spotlight away from Apollo 11 by landing on the moon first, but they both would have ended up being marooned there.

On the other hand, John Young, orbiting the moon in the command module, would have been able to return to Earth.

Of course, Snoopy did not waver from the planned mission, and the astronauts carried out and successfully completed the practice moon landing with professionalism and skill.

Having achieved all of the low lunar orbit objectives, Stafford and Cernan fired Snoopy’s upper ascent stage rocket in order to gain altitude, and make a rendezvous with the command module.

Snoopy’s bottom platform lander section (descent stage), having already been released from the ascent stage, slowly fell towards and crashed onto the moon.

So, a part of Snoopy did, in fact, make it to the moon’s surface.

Approximately eight hours had elapsed since Cernan and Stafford began working inside the lunar module.

They were now in the proper lunar orbit for docking Snoopy’s ascent stage module with the command module.

After docking and boarding Charlie Brown, the astronauts jettisoned the abandoned Snoopy into space.

Once Snoopy’s ascent stage had drifted to a safe distance from the command module, NASA Flight Control in Houston remotely ignited its ascent rocket engine.

Snoopy’s ascent stage was programmed to journey into space, in order to drain its remaining fuel supply.

The Apollo 10 mission was a success, and on May 26, 1969, all three astronauts safely returned to Earth.

Today, the Science Museum in London is the current home of the Apollo 10 command module, Charlie Brown.

It can be seen here: http://tinyurl.com/m2lyqx6.

But where is Snoopy’s ascent stage?

Snoopy’s crew cabin, the ascent stage where Cernan and Stafford hovered over the moon, has been in a “heliocentric orbit” (traveling around the Sun) – for the last 46 years.

In fact, Snoopy is the only surviving Apollo lunar module ascent stage still voyaging through space.

Apollo 13’s lunar module, Aquarius, did not land on the moon, but instead was used as a lifeboat to get all three astronauts back to Earth in April 1970; it burned up in Earth’s atmosphere after being jettisoned from the Apollo 13 command module just before splashdown.

The Smithsonian National Air and Space Museum records the fate of all Apollo lunar modules here: http://tinyurl.com/nxwyrl6.

NASA’s website lists the present location of all the Apollo command modules at: http://tinyurl.com/6vdjr2c.

The website also includes video clips of Snoopy and Charlie Brown: http://tinyurl.com/k9z5uvt.

 

Thursday, May 14, 2015

IBM's first large-scale electronic computer



by Mark Ollig



Thomas J. Watson Jr., president of IBM, revealed to shareholders on April 29, 1952, that the company was constructing “the most advanced, most flexible high-speed computer in the world.”

What later became the IBM 701 Electronic Data Processing Machine was first known during its early development as the “Defense Calculator.”

It was named this because in 1950, at the start of the Korean War, IBM’s chairman, Thomas J. Watson Sr., had asked the US government how his company could be of assistance.

The government’s reply to him was to ask IBM to build a large, scientific computer.

The computer would need to be capable of designing aircraft, assisting in nuclear development, engineering of armaments, performing calculations, and probably other details yours truly is not privy to.

IBM “put pencil to paper” and began designing this new computer in January 1951.

“We convinced ourselves that by taking a giant step toward this far-out high-performance machine, we and our customers would benefit in many ways,” said Jerrier Haddad, who worked for IBM.

Haddad was in charge of the technical and executive responsibilities for the 701 development and design program.

The IBM 701 was constructed in IBM’s Poughkeepsie, NY plant, where its parts were put together using production-line assembly techniques. It was the world’s first mass-produced computer.

This electronic digital computer had an input and output system, assorted memory devices, digital arithmetic-processing components, and what I viewed to be a well-organized, human-operator control center.

The 701 computer’s cabinet-bay wiring was neatly cabled using modular, pluggable ends.

It also had various physical devices for storing and retrieving information.

I had not heard of using a cathode ray tube (CRT) for storing binary information, until I read about the electrostatic data-storage system glass tubes used in the 701 computer.

Memory devices of the IBM 701 included 72 electrostatic data-storage “Williams-Kilburn tubes,” each capable of holding 1,024 bits. This total of 73,728 bits gave a capacity of 2,048 words of 36 bits each; the addressing system could also treat each full word as two 18-bit half-words.
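
As a quick arithmetic check of those figures, here is a minimal sketch (using only the numbers quoted above):

```python
# Williams-Kilburn tube memory of the IBM 701, per the figures above.
tubes = 72
bits_per_tube = 1024

total_bits = tubes * bits_per_tube         # 73,728 bits in all
full_words = total_bits // 36              # 2,048 36-bit words
half_words = total_bits // 18              # 4,096 18-bit half-words

print(total_bits, full_words, half_words)  # 73728 2048 4096
```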

These electrostatic tubes held the data/information as a “charge pattern.”

The 701 memory also used four magnetic-coated cylinder drums, together capable of storing 81,920 digits.

The magnetic-coated cylinder drum memory could read and write at 8,000 digits per second.

In addition, four cabinet bays, each equipped with magnetic-tape drive units, held more than 8 million digits per individual tape-reel.

The magnetic-tape unit’s read and write speed was 12,500 digits per second.
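
Putting the drum and tape figures side by side, a small sketch (using only the capacities and speeds quoted above) shows roughly how long reading each device end-to-end would have taken:

```python
# Approximate end-to-end read times, from the quoted capacities and speeds.
drum_digits, drum_rate = 81_920, 8_000       # digits, digits per second
tape_digits, tape_rate = 8_000_000, 12_500   # digits per reel, digits per second

drum_seconds = drum_digits / drum_rate       # about 10 seconds for the drums
tape_minutes = tape_digits / tape_rate / 60  # about 10.7 minutes per reel

print(f"drums: {drum_seconds:.2f} s, tape reel: {tape_minutes:.1f} min")
```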

In addition to vacuum tubes, several thousand germanium-diode electronic components were used inside the IBM 701 computer system.

Output information obtained from the IBM 701 could be sent to a paper printer at a rate of 150 lines per minute.

As far as processing capabilities, this circa 1951 IBM 701 computer performed exceptionally well for this time period.

The computer’s internal processing operations were performed using the binary system; yes, those bits and bytes made up of 1’s and 0’s.

Information from IBM’s 701 webpage states this computer could perform an average of 14,000 mathematical operations a second.

The IBM 701 was managed from an operator’s control center panel using buttons, keys, and switches for input instruction entry of the memory, accumulator, and multiplier-quotient registers.

It also had numerous visual indicator lights showing machine activity.

Two IBM power frames, and an IBM power distribution unit, provided the electricity for the individual systems comprising the IBM 701 data processing computer.

In April 1952, an IBM 701 development model was assembled in an IBM office at Poughkeepsie, NY.

The individual equipment cabinets were neatly arranged inside the office; I noted the inter-machine cabinet cabling was smartly hidden from view.

Here is a photo of the IBM 701 in the Poughkeepsie office: http://tinyurl.com/mct2w67.

On May 21, 1952 – 63 years ago this week – IBM branch managers were informed that the Defense Calculator would now be referred to as the IBM Electronic Data Processing Machine.

There were 19 commercial IBM 701 Electronic Data Processing Machines manufactured.

The first was shipped to IBM World Headquarters in New York, NY, on Dec. 20, 1952.

The Los Alamos Scientific Laboratory, operated by the University of California in Los Alamos, NM, received the second IBM 701 computer March 23, 1953. It processed hydrodynamic calculations.

Some of the contractors and agencies receiving the IBM 701 included: Boeing, Lockheed, the US Navy, General Electric, and the National Security Agency – better known as the NSA.

On Jan. 8, 1954, IBM issued a press release about Russian being translated into English using an “electronic brain.” This electronic brain could translate Russian into English on an “automatic printer” at a speed of two-and-a-half lines per second.

This “electronic brain” was the IBM 701 computer.

The IBM website has an archival page with photos of the 701 computer, and its associated cabinet components at: http://tinyurl.com/lrms8u6.

The last IBM 701 computer (assembled from spare parts) was shipped to the US Weather Bureau in Washington, DC, Feb. 28, 1955.




 

Friday, May 8, 2015

Senior assistance via Apple iPads




by Mark Ollig



In 2000, senior citizens 65 and over represented 12.4 percent of our country’s population.

Statistics from the US Department of Health and Human Services say this percentage will rise to 19 percent in 15 years.

They calculate that by 2030, there will be about 72 million people over age 65, more than twice as many as there were in 2000.

Japan has been addressing this growing segment of their own population, by testing various robotic technologies to assist their increasingly elderly populace.

In fact, Robear, a new experimental 300-pound nursing care robot for humans, is now being tested in Japan.

With its cute-looking bear face, the strong Robear is able to gently lift a person out of a bed, help them to stand up, assist putting them into a chair, and even carry them.

Recently, tech giants IBM and Apple, along with Japan Post, Japan’s largest employer, which is focusing resources on its aging population, formed an alliance.

This alliance will assess technology focused on improving the lives of older adults.

The three companies jointly announced special senior citizen software applications (apps) will be designed and incorporated into Apple’s iPad devices.

A population projection from Japan’s National Institute of Population and Social Security Research showed their country’s 65 and over population in 2010 was approximately 29.5 million.

By 2040, this number will increase to almost 39 million.

The United States Census Bureau statistics showed that in 2010, there were 40.3 million Americans age 65 and better.

By 2040, the US will have about 81.3 million citizens age 65 and over, per current US Census estimations.

By 2050, the US Census predicts, those 65 and over will rise to 88.5 million, more than double the number in 2010.
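
The doubling claim is easy to verify with the Census figures cited above; a minimal sketch:

```python
# US population age 65 and over, in millions, per the Census figures above.
us_2010, us_2040, us_2050 = 40.3, 81.3, 88.5

print(f"2040 vs. 2010: {us_2040 / us_2010:.2f}x")  # 2.02x
print(f"2050 vs. 2010: {us_2050 / us_2010:.2f}x")  # 2.20x, more than double
```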

“The aging of the population will have wide-ranging implications for the country,” predicted the US Census in the report I was reading.

It went on to say: “The aging of the older population is noteworthy, as those in the oldest ages often require additional caregiving and support.”

One can understand then, why new technology is being developed for assisting our growing senior population.

It was disclosed during a press conference that Apple, IBM, and Japan Post will be using Japan as the testbed country for their experiment.

Special apps for assisting seniors’ day-to-day activities include: health monitoring, personalized medicine-taking with reminders, shopping from home, family member communications, and other yet-to-be created senior citizen apps.

These special apps will be pre-installed in the approximately 5 million iPads Japanese seniors will receive.

IBM will be providing special “cloud” services, including support for Apple’s mobile iOS platform and the creation of apps to be used on the iPads.

“Many countries will be facing the same situation soon,” said Taizo Nishimuro, CEO of Japan Post, during the press conference he attended with Apple CEO Tim Cook, and IBM CEO Virginia Rometty.

Cook gave an example of the benefits senior citizens can get from using an iPad for improving their lives, by mentioning Edith Kirchmaier, a 107-year-old who is the oldest known person using an iPad.

“I use the Internet when I’m interested in something, or someone. I find it very educational,” Kirchmaier said during an interview with The New York Daily Times in 2013.

Kirchmaier is also the oldest person using Facebook. She currently has 60,071 followers. Her Facebook page is: http://on.fb.me/1KOD6lf.

“Today is about reimagining life for what is the largest generation in human history – seniors,” Rometty said.

She also mentioned that every day, 10,000 people in this country turn 65, and that by 2050, one in five people in the world will be a senior citizen.

The special “senior-citizen equipped” Apple iPads will be distributed to Japanese elders over a five-year period, with Japan Post’s personnel handling their delivery and setup.

If this iPad-Seniors program is successful in Japan, it could be made available globally.

I wanted to get some feedback about this project, so I asked my mom what she thought of it.

Of course, I very much respect the opinion of my mother.

She listened intently while I somewhat competently explained how seniors would be using Apple iPads in Japan.

I told her these special iPad apps would have large, easy-to-read buttons, adjustable vision and audible settings, and specific features seniors would find useful.

Mom nodded, and then quickly said to me, “Oh, I’m too old to mess with all that.”

I paused, not knowing how to respond, as all of the built-up enthusiasm I felt while explaining this to her suddenly came crashing to the ground.

For some reason, I recalled the words spoken by the radio reporter Herb Morrison, who witnessed and described the famous Hindenburg airship crash of 1937.

His words: “Oh, it’s crashing to the ground!” piercingly sounded in my mind.

However, on the bright side, mom still likes the idea of robotic assistants.

A video by WXYZ TV channel 7 in Detroit shows Japan’s Robear robot in action. It can be seen here: http://tinyurl.com/p9q9vtd.