Oct. 31, 2011
by Mark Ollig
What well-known company created the first desktop office computer, navigable by using a mouse-driven graphical user interface?
Did I hear someone say Apple Computer’s Lisa or Macintosh computer?
The Lisa was available in January 1983, followed by the Macintosh one year later.
The Microsoft Windows 1.0 graphical user interface program came out in November 1985.
It was in April 1973 when Xerox Corporation’s Palo Alto Research Center (PARC) division completed work on a new desktop computer.
This computer contained a graphical user interface, navigated by using a 3-button mouse.
They called it the Xerox Alto.
I think of the Alto computer as the ancestor of today’s personal computer.
The name “Alto” was taken from the Palo Alto Research Center, where Xerox developed it.
Xerox is better known for its copier machines, but in the early 1970s, developers inside the company’s software development group began work on a unique computer graphical user interface design.
Xerox researchers believed future technology favored digital over analog, and so they designed a way to merge their copier machines with digital computing technology – which they began using within their organization.
The Alto computer’s graphical user interface offered a significant improvement over keying in text at a command line.
Alto users experienced a dramatic visual difference when manipulating the display screen’s graphical images, scrollbars, icons, windows, and file names using the 3-button mouse.
The Xerox Alto computer used a portrait-oriented, monochrome, 875-line, raster-scanned bitmap display screen.
“Bitmap” refers to how each pixel element on the display screen is mapped with one or more bits stored inside the computer’s video memory.
A bitmap display was essential in using the graphical user interface.
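To illustrate the idea, here is a minimal sketch in modern Python (hypothetical code for illustration, not anything that ran on the Alto) of a 1-bit monochrome bitmap, where each on-screen pixel corresponds to a single bit in memory:

```python
class Bitmap:
    """A minimal 1-bit-per-pixel bitmap: each pixel maps to one bit of memory."""

    def __init__(self, width, height):
        self.width = width
        self.height = height
        # One byte holds 8 pixels; round up so each row has whole bytes.
        self.row_bytes = (width + 7) // 8
        self.memory = bytearray(self.row_bytes * height)

    def set_pixel(self, x, y, on=True):
        """Turn the pixel at (x, y) on or off by flipping its bit."""
        byte_index = y * self.row_bytes + x // 8
        mask = 0x80 >> (x % 8)  # leftmost pixel is the high bit of the byte
        if on:
            self.memory[byte_index] |= mask
        else:
            self.memory[byte_index] &= ~mask

    def get_pixel(self, x, y):
        """Return True if the pixel at (x, y) is on."""
        byte_index = y * self.row_bytes + x // 8
        return bool(self.memory[byte_index] & (0x80 >> (x % 8)))
```

Because every dot on the screen is individually addressable this way, arbitrary graphics such as windows, icons, and proportionally spaced fonts become possible, rather than only fixed character cells.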
Alto’s programs were stored on 2.5 MB single-platter removable disk cartridges.
The “microcoded” processor was based on Texas Instruments’ 74181 Arithmetic Logic Unit (ALU) chip, and the machine was equipped with 128 kB of main memory, expandable to 512 kB.
The computer’s processing components, disk storage units, and related systems were encased inside a small cabinet the size of a compact refrigerator.
Alto computers were connected to Xerox’s LAN (Local Area Network) using Ethernet – which Xerox had developed at PARC.
The LAN allowed for the sharing of program files, documents, printers, office email, and other information.
The Alto computer included a 64-key QWERTY keyboard.
Another device for entering commands was a five-finger “chord keyset” device; however, this never became as popular with Alto users as did using the 3-button mouse.
The Alto was designed to be used with laser printers, which were also developed by Xerox.
Software used with the Alto included word processors Bravo and Gypsy.
Alto’s email software was called Laurel; someone with a sense of humor named the next version Hardy.
Other software included a File Transfer Protocol program, a user chat utility, and computer games such as Chess, Pinball, Othello, and Alto Trek.
Markup (a bitmap painting program) and Draw (a line-graphics program) were also used on the Alto.
Xerox initially built 80 Alto computers, each costing $10,000.
Alto computers were not sold to the general public; however, Xerox provided them to universities and government institutions, as well as using them within their corporate business offices.
By 1978, Xerox Alto computers had been installed in four test sites – including the White House.
I learned that by the summer of 1979, almost 1,000 Alto computers were being used by engineers, computer science researchers, and office personnel.
In December 1979, Steve Jobs, who co-founded Apple Computer in 1976, visited Xerox at PARC, and was given a demonstration of the Alto computer.
He was shown Ethernet-networked Alto computers using email, and an object-oriented programming language called Smalltalk.
However, what impressed Jobs the most was when he observed people operating the Alto computers by means of a working graphical user interface, instead of text commands.
“I thought it was the best thing I had ever seen in my life,” said Jobs.
He went on to say, “Within 10 minutes, it was obvious to me that all computers would work like this someday.”
In 1981, Xerox made available to the public a graphical user interface desktop business computer called the Xerox Star 8010 Information System.
That same year, recognized technology leader IBM came out with its own desktop computer, the IBM Personal Computer (model 5150).
Yours truly used this IBM PC model for several years.
IBM had a long-standing reputation in computing, and its personal computers became popular with businesses and the general public.
Throughout the 1980s, IBM, Apple, Microsoft, and other computer companies continued to develop and improve upon their own computer hardware and operating systems.
As the 1980s progressed, it became apparent that it was too late for Xerox to become a serious player in this newly emerging personal computer game.
The public identified Xerox more as a copier machine company than a computer company.
As for the Xerox Alto, its significance was the major role it played in influencing how we interact with computers.
“ . . . Xerox could have owned the entire computer industry today,” Steve Jobs said in 1996.
To see a picture of the Xerox Alto, go to: tinyurl.com/42x52qo.
About Mark Ollig:
Telecommunications and all things tech has been a well-traveled road for me. I enjoy learning what is new in technology and sharing it with others who enjoy reading my particular slant on it via this blog. I am also a freelance columnist for my hometown's print and digital newspaper.
Thursday, October 27, 2011
Thursday, October 20, 2011
Licklider's vision: an 'Intergalactic' computer network
Oct. 24, 2011
by Mark Ollig
Joseph Carl Robnett Licklider was a visionary, and an Internet pioneer.
I realize many folks may not have heard of Licklider; however, he deserves recognition for the innovative concepts that helped bring us the Internet.
Licklider was born March 11, 1915.
He developed an early interest in engineering, building model airplanes, and working on automobiles.
In 1937, he graduated from Washington University in St. Louis with a bachelor of arts degree, having majored in psychology, mathematics, and physics.
After receiving a PhD in psychoacoustics in 1942 from the University of Rochester, he went on to work at Harvard University’s Psycho-Acoustic Laboratory.
In 1950, Licklider went to MIT, where he was an associate professor.
During the early 1950s, Licklider was asked about working on creating a new technology for displaying computer information to human operators. This was for the purpose of improving US air defense capabilities.
It was during this time Licklider’s thoughts about human-computer interactions began.
The Advanced Research Projects Agency (ARPA) was established in February 1958 by President Dwight Eisenhower, in response to the Soviet Union’s Sputnik I satellite program, which yours truly recently wrote a column about.
Licklider would later write a book titled “Libraries of the Future,” published in 1965.
This book explained how people would be able to simultaneously access (from remote locations) what he called an “automated library,” located in a database inside a computer.
In 1960, Licklider wrote, “It seems reasonable to envision, for a time 10 or 15 years hence, a ‘thinking center’ that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval.”
He even began seriously speaking about interactive computers serving as automated assistants – and people were listening to him.
Licklider wrote what eventually became a seminal paper in March 1960, called “Man-Computer Symbiosis.”
In it he wrote, “It seems entirely possible that, in due course, electronic or chemical ‘machines’ will outdo the human brain in most of the functions we now consider exclusively within its province.”
Licklider wrote about the need for computer involvement in formative and real-time thinking.
He described a computer assistant that could perform simulation modeling which would graphically display the results. He also wrote about how a computer could determine solutions for new problems based on past experiences, and how a computer would be able to answer questions.
Even back in 1960, Licklider foresaw a close symbiotic, or interdependent relationship developing between computers and human beings.
His foresight was revealed in his writings regarding computerized interfaces with the human brain – which he believed was possible.
Licklider also wrote a series of memos involving a number of computers connected to each other within a “Galactic Network” concept.
This concept, Licklider wrote, allowed the data and programs stored within each computer to be accessed from anywhere in the world, by any of the computers connected to the network.
He accurately recognized the importance and potential of computer networks, explaining that by distributing numerous computers over a fast-moving electronic data network, each one could share its programs and informational resources with the other.
It seems as if Licklider was describing the basic foundation of the Internet.
Licklider, in collaboration with Welden E. Clark, released a 16-page paper in August 1962, titled “On-Line Man Computer Communication.”
In this paper, he described, in detail, the concepts of the future use of on-line networks, and how computers would play the role of a teacher for “programmed instruction” training purposes.
A quote from this 1962 paper says, “Twenty years from now [1982], some form of keyboard operation will doubtless be taught in kindergarten, and forty years from now [2002], keyboards may be as universal as pencils, but at present good typists are few.”
I, your humble columnist, have always considered myself a good typist.
In October 1962, Licklider was chosen as the first director of the Information Processing Techniques Office (IPTO) research program at the Advanced Research Projects Agency (ARPA), part of the US Department of Defense.
This is where he was successful in gaining acceptance and support regarding his computer networking ideas. Licklider also helped to guide the funding of several computer science research projects.
While at ARPA, Licklider was involved in the development of one of the first wide area computer networks used in the United States for a cross-country radar defense system.
This network system was connected to many Department of Defense sites, including Strategic Air Command (SAC) headquarters, and the Pentagon.
In 1963, Licklider obtained IPTO funding for the research needed to explore how time-sharing computers could be operated by communities of users, located in various geographic locations.
IPTO originally began development of the Advanced Research Projects Agency Network (ARPANET) in 1966, which led to today’s Internet.
Licklider had the foresight in the 1960s to accurately estimate millions of people being online by the year 2000, using what he called an “Intergalactic Computer Network.”
In December of 2000, the Internet reached 361 million users, according to the Internet World Stats website.
Joseph Carl Robnett Licklider, also known as J.C.R. or “Lick,” passed away at age 75 on June 26, 1990, in Arlington, MA.
Friday, October 14, 2011
Smell-O-Vision II: Coming soon to a nose near you
Oct. 17, 2011
by Mark Ollig
Remember those paper scratch-and-sniff stickers?
The 1981 movie “Polyester” featured numbered scratch-and-sniff cards which allowed the viewer (when prompted by a card number) to smell what was being shown on the movie screen.
It was promoted as Odorama.
Placing a fragrance coating on a piece of paper or cardboard is one thing; however, I had no idea of the long history of inventive devices used in dispersing smells while watching a movie.
Hans Laube and Michael Todd were involved in the creation of a device called Smell-O-Vision.
Laube went on to build a machine that discharged a variety of odors, scents, and smells which would coincide with the events happening during a theater movie or stage play.
Hans Laube was issued US Patent 2,813,452, “Motion Pictures with Synchronized Odor Emission,” on Nov. 19, 1957.
Laube’s device would disperse various mixtures and dilutions of liquid scented perfumes, and included one scent neutralizer.
Todd is credited with calling this device Smell-O-Vision.
“Scent of Mystery” was a 1960 movie using an updated version of Laube’s device that circulated up to 30 different smells into theater seats when prompted via specific signal markers on the movie’s film.
Disappointingly, it did not work very well, and no further movies were shown using the Smell-O-Vision device.
The one scent I fondly recall as a youngster, while seated inside my hometown’s local theater, was the addictive aroma drifting in from the popcorn machine in the front lobby.
One of the earliest attempts at combining a motion picture film and smells goes back to 1906, when Samuel Lionel Rothafel, working at The Family Theater in the mining town of Forest City, PA, came up with an idea.
While a motion picture newsreel film of what is believed to have been the 1906 Rose Bowl parade was being shown inside the theater, Rothafel took a wad of cotton wool soaked with rose oil, and placed it in front of an electric fan. This caused the smell of roses to be wafted throughout the theater and amongst the seated patrons.
A rose by any other name would smell as sweet.
It is interesting to note Samuel Lionel Rothafel was born right here in Minnesota, in the city of Stillwater in 1882.
An in-theater “smell system” was installed in Paramount’s Rialto Theater on Broadway in 1933. Blowers released various smells during the movie, but proved unpopular as it took hours (sometimes days) for the scent to finally clear out of the theater building.
There is quite a variety of aromas in this world – and countless opinions on the number of unique scents the human nose can distinguish.
Trygg Engen, a Brown University psychologist, wrote in 1982 that an untrained person can identify 2,000 odors, and an expert, 10,000.
“The human nose can detect and differentiate 350,000 smells; it’s just that we shouldn’t smell them at the same time because you get anosmia – nose fatigue,” according to Sue Phillips, a fragrance expert.
In the book, “The Future of the Body,” Michael Murphy cites his source as saying a real expert (smelling expert, I would assume) “must distinguish at least thirty thousand nuances of scent.”
Ernest Crocker, a chemical engineer and MIT graduate, used a mathematical rating system and came up with 10,000 as being the number of recognizable odors a human can detect.
Mixing smells with your favorite movies, gaming, and television programs is becoming a reality through a French company called Olf-Action.
No doubt the company name is a play on the word “olfactory” which relates to the recognition of smell.
Olf-Action uses Odoravision.
Odoravision is a copyrighted term used to describe the concept for the delivery of odors, or particular scents, in combination with motion picture films viewed in movie theaters.
This method of odor delivery has also been called smell-synchronization.
Olf-Action’s Odoravision System can administer 128 scents, with up to three simultaneous odors, over the course of one motion picture film.
One aroma diffuser I saw connected a video source to Olf-Action’s Olfahome model 4045 scent dispenser device.
The model 4045 is a 44-pound rectangular, box-like device which was attached to the ceiling approximately 10 feet in front of, and above, the movie viewers.
The diagram for the Olfacine/Olfahome model 4045 showed 40 individual, open-air nozzles.
The scents are stored inside cartridges.
Some of the scents listed included cakes, gasoline, flowers, roses, wood, sea water, smoke, candies, fabrics, trees, polluted city smells, and one I like: the smell of freshly cut grass.
Olf-Action listed several movie film titles available in Odoravision, including one many would like to see and smell: “Charlie and the Chocolate Factory.”
My concern is when we watch an Odoravision movie and tell people it stinks, they won’t know whether we meant the movie’s plot or the smells in it.
I can’t wait until Apple’s App Store starts selling the “iSmell” application.
Then we will be able to watch people sniffing their iPhones while they watch videos on them.
Do I hear laughter from some of my readers?
Folks, you just can’t make this stuff up.
Friday, October 7, 2011
Apple: No iPhone 5 - Hello iPhone 4S
Oct. 10, 2011
by Mark Ollig
“Let’s talk iPhone.”
This was the message inside the invitations sent out to members of the media from Apple Inc.
Most of us were anticipating Apple to announce the new iPhone 5 to the world.
Last Tuesday, Apple CEO Tim Cook took the stage at the company’s Cupertino, CA headquarters to announce what many had assumed, written, tweeted, and blogged would be the new iPhone 5.
Cook spent considerable time talking about Apple’s past achievements, until we were finally introduced to the new iPhone 4S by Phil Schiller, Apple’s senior vice president of Worldwide Product Marketing.
“So people have been wondering, how do you follow up a hit product like the iPhone 4? Well, I’m really pleased to tell you today all about the brand new iPhone 4S,” said Schiller.
There was respectable applause from the audience in attendance, while yours truly wondered what happened to the iPhone 5.
One of the new features I liked on the iPhone 4S was Siri.
Siri is the iPhone 4S’s built-in, voice-enabled personal assistant.
An iPhone 4S user speaks to Siri in ordinary, conversational dialogue.
Siri responds to verbal queries such as, “What is the weather like today in Minneapolis?”
Siri allows a person to work with the iPhone’s applications using normal, everyday voice conversation, and it replies to the user in kind.
During Apple’s demonstration, Siri placed phone calls, provided directions to the nearest restaurant, and reported on the weather.
Siri performs a variety of dictation tasks, from creating reminders and setting alarms to verbally notifying the iPhone 4S user about them.
If Siri needs more information in order to fulfill a request, it will verbally ask.
Siri can read incoming text messages to you, and it will compose a reply from your spoken responses, providing hands-free texting.
I find myself suddenly having flashbacks to the HAL 9000.
The iPhone 4S Retina display significantly improves the sharpness and quality of images and text seen on the display screen.
Pixels on the iPhone 4S display screen are just 78 micrometers wide.
Pixel density is 326 pixels per inch, which means the human eye cannot make out the individual pixels at a normal viewing distance, so web pages, photos, text in eBooks, and email will look very sharp.
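The 78-micrometer pixel width and the 326 pixels-per-inch density are two ways of stating the same thing; here is a quick back-of-the-envelope check (my own arithmetic, not Apple’s spec sheet):

```python
# Pixel pitch from pixel density: one inch is exactly 25.4 mm,
# so the width of one pixel is (25.4 mm / pixels-per-inch).
MM_PER_INCH = 25.4
ppi = 326  # iPhone 4S Retina display density

pitch_mm = MM_PER_INCH / ppi          # width of one pixel in millimeters
pitch_micrometers = pitch_mm * 1000   # convert to micrometers

print(round(pitch_micrometers))  # about 78 micrometers per pixel
```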
The Retina display utilizes IPS (in-plane switching) technology, the same technology used in the iPad.
Included with the Retina display is LED back-lighting, and an ambient light sensor that intelligently adjusts the brightness of the screen, providing the best viewing possible under most conditions.
The iPhone 4S includes the Apple A5 dual-core processor chip, which also powers the iPad 2. It is twice as fast as the processor used in the previous iPhone 4. Web pages will load much faster; and to all the gamers out there: your video graphics will render up to seven times faster on the iPhone 4S.
The iPhone 4S screen is a 3.5-inch (diagonal) widescreen Multi-Touch display.
Its camera uses an 8 megapixel sensor, which takes pictures at a resolution of 3264 x 2448 pixels.
The camera includes tap-to-focus, auto-focus, and an LED flash. Its optics include a five-element lens with an f/2.4 aperture, which lets in more light and provides better low-light performance.
The iPhone 4S captures 1080p video using video stabilization and records up to 30 frames-per-second with audio.
The iPhone 4S is used on 3G networks, and has a significantly improved antenna re-design from the previous iPhone 4.
Schiller stated how the iPhone 4S can attain data-rate speeds of up to 5.8 uploading and 14.4 Mbps downloading. However, to get these speeds, the 3G Carrier would need to be using a HSPA+ (Evolved High-Speed Packet Access) network.
With the iPhone 4S built-in rechargeable lithium-ion battery, talk-time battery life is said to be 8 hours, with 200 hours of standby time.
The battery provides 10 hours of video playback time, with up to 40 hours of audio/music playback.
When using the Internet, it provides six hours of usage with 3G enabled, and nine hours over Wi-Fi.
Containing both CDMA and GSM cellular phone standards, the iPhone 4S can be used world-wide. It also supports Bluetooth 4.0 wireless technology.
The iPhone 4S operates with the iOS 5 mobile operating system, integrates with the iCloud, and will be available in Apple retail stores Oct. 14.
The iPhone 4S will work over the AT&T and Verizon networks, and soon over the Sprint network.
The iPhone 4S models are priced at: $199 for 16GB, $299 for 32GB, and $399 for the 64GB iPhone 4S.
As this column was being sent to press, I learned of the passing of Apple Inc. co-founder and former CEO Steve Jobs.
While the news was breaking over the social networking sites, I came across one poignant Twitter message about Steve Jobs in reference to the new Apple iPhone 4S.
I was kindly given permission to use the following by the person who wrote it; “From now on, the 4S is going to stand for, ‘For Steve.’”
About Mark Ollig:
Telecommunications and all things tech has been a well-traveled road for me. I enjoy learning what is new in technology and sharing it with others who enjoy reading my particular slant on it via this blog. I am also a freelance columnist for my hometown's print and digital newspaper.
by Mark Ollig
“Let’s talk iPhone.”
This was the message inside the invitations sent out to members of the media from Apple Inc.
Most of us were anticipating Apple to announce the new iPhone 5 to the world.
Last Tuesday, Apple’s CEO Tim Cook took the stage at Apple’s Cupertino, CA headquarters to make the announcement of what many had assumed, written, tweeted, and blogged would be the new iPhone 5.
Cook spent considerable time talking about Apple’s past achievements, until we were finally introduced to the new iPhone 4S by Phil Schiller, Apple’s senior vice president of Worldwide Product Marketing.
“So people have been wondering, how do you follow up a hit product like the iPhone 4? Well, I’m really pleased to tell you today all about the brand new iPhone 4S,” said Schiller.
There was respectable applause from the audience in attendance, while yours truly wondered what happened to the iPhone 5.
One of the new features I liked on the iPhone 4S was Siri.
Siri is the phone's built-in, voice-enabled personal assistant.
An iPhone 4S user speaks to Siri in ordinary, conversational dialogue.
Siri responds to verbal queries such as, “What is the weather like today in Minneapolis?”
Siri allows a person to work with the iPhone's applications using normal, everyday conversation, and Siri replies to the user's voice in kind.
During Apple’s demonstration, Siri placed phone calls, provided directions to the nearest restaurant, and reported on the weather.
Siri performs a variety of dictation tasks, from creating reminders and setting alarms, to verbally notifying the iPhone 4S user about them.
If Siri needs more information in order to fulfill a request, it will verbally ask.
Siri can also read incoming text messages aloud and compose replies from your spoken responses, providing hands-free texting.
I find myself suddenly having flashbacks to the HAL 9000.
The iPhone 4S Retina display significantly improves the sharpness and quality of images and text seen on the display screen.
Pixels on the iPhone 4S display screen are just 78 micrometers wide.
Pixel density is 326 pixels per inch, which means the human eye cannot distinguish the individual pixels, so web pages, photos, text in eBooks, and email will look very sharp and focused.
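Those two figures agree with each other, as a quick back-of-the-envelope check shows (a small Python sketch of my own, not anything from Apple):

```python
# One inch is exactly 25.4 mm, i.e., 25,400 micrometers.
MICROMETERS_PER_INCH = 25_400

pixels_per_inch = 326  # the Retina display's pixel density

# Pixel pitch: the width of a single pixel in micrometers.
pixel_width_um = MICROMETERS_PER_INCH / pixels_per_inch
print(round(pixel_width_um, 1))  # 77.9, which rounds to the quoted 78 micrometers
```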
The Retina display uses IPS (in-plane switching), the same display technology used in the iPad.
The Retina display also includes LED backlighting, along with an ambient light sensor that automatically adjusts the screen's brightness, providing the best viewing possible under most conditions.
The iPhone 4S includes the Apple A5 dual-core processing chip, the same chip that operates in the iPad 2. It is twice as fast as the processor used in the previous iPhone 4, so web pages will load much faster; and to all the gamers out there: your video graphics will render seven times faster on the iPhone 4S.
The iPhone 4S screen is a 3.5-inch (diagonal) widescreen Multi-Touch display.
Its camera uses an 8 megapixel sensor, which takes pictures at a resolution of 3264 x 2448 pixels.
The camera includes tap-to-focus, auto-focus, and an LED flash. Its optics include a five-element lens with an f/2.4 aperture, which lets in more light and provides better low-light performance.
The iPhone 4S captures 1080p video with video stabilization, recording up to 30 frames per second with audio.
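The "8 megapixel" figure follows directly from the sensor's 3264 x 2448 still-photo resolution; here is the arithmetic as a short Python sketch:

```python
width, height = 3264, 2448  # still-photo resolution in pixels

# Total pixel count is simply width times height.
total_pixels = width * height
print(total_pixels)  # 7990272, just shy of 8 million

# Expressed in megapixels (millions of pixels), it rounds to 8.
print(round(total_pixels / 1_000_000))  # 8
```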
The iPhone 4S runs on 3G networks, and features a significantly redesigned antenna compared with the previous iPhone 4.
Schiller stated the iPhone 4S can attain data rates of up to 5.8 Mbps uploading and 14.4 Mbps downloading. However, to reach these speeds, the 3G carrier would need to be running an HSPA+ (Evolved High-Speed Packet Access) network.
With its built-in rechargeable lithium-ion battery, the iPhone 4S is said to provide 8 hours of talk time and 200 hours of standby time.
The battery provides 10 hours of video playback time, with up to 40 hours of audio/music playback.
For Internet use, it provides six hours of browsing over 3G, and nine hours over Wi-Fi.
Supporting both the CDMA and GSM cellular phone standards, the iPhone 4S can be used worldwide. It also supports Bluetooth 4.0 wireless technology.
The iPhone 4S runs the iOS 5 mobile operating system, integrates with iCloud, and will be available in Apple retail stores Oct. 14.
The iPhone 4S will work over the AT&T and Verizon networks, and soon over the Sprint network.
The iPhone 4S is priced at $199 for the 16 GB model, $299 for 32 GB, and $399 for 64 GB.
As this column was being sent to press, I learned of the passing of Apple Inc. co-founder and former CEO Steve Jobs.
While the news was breaking over the social networking sites, I came across one poignant Twitter message about Steve Jobs in reference to the new Apple iPhone 4S.
I was kindly given permission by the person who wrote it to quote it here: "From now on, the 4S is going to stand for, 'For Steve.'"
About Mark Ollig:
Telecommunications and all things tech have been a well-traveled road for me. I enjoy learning what is new in technology and sharing it with others who enjoy reading my particular slant on it via this blog. I am also a freelance columnist for my hometown's print and digital newspaper.