Friday, October 25, 2019

The internet turns 50


© Mark Ollig

On Feb. 7, 1958, the seeds of the internet took root when Department of Defense Directive 5105.15 established the Advanced Research Projects Agency (ARPA).

Four months earlier, the Soviet Union had launched a rocket carrying Sputnik 1, the world’s first artificial satellite to orbit the Earth.

Sputnik 1 caused fear and anxiety among those worried the next Soviet rocket might place a nuclear warhead, rather than a harmless beeping satellite, in orbit above the US.

It was the late 1950s, and the Cold War was at its height. The possibility that a single attack could destroy the US military’s core computer systems spurred the creation of a brand-new computer communications network.

To ensure military computers would be accessible in the event of a nuclear attack, ARPA designed a survivable computer communications network named ARPANET.

This network used packet-switching protocols over dedicated long-distance telephone lines to share resources among geographically separated computers across the country.

ARPANET would provide redundant connection paths between the military computers on its network and the remote teletype data terminals used by military personnel.

It was instrumental in laying the groundwork for the “network of networks,” known today as the internet.

The first attempt at a host-to-host message between two different models of computers over the ARPANET happened Oct. 29, 1969.

A Scientific Data Systems (SDS) Sigma-7 32-bit host computer node, located in 3420 Boelter Hall at the University of California, Los Angeles (UCLA), was linked over a dedicated telephone line to a second node: an SDS-940 time-sharing host computer located 350 miles north at the Stanford Research Institute (SRI) in Menlo Park, CA.

A small cable from the Sigma-7 computer terminated in an Interface Message Processor (IMP).

The IMP used serial-communication messaging protocols to send and receive data packets over the ARPANET to and from the SDS-940 computer’s IMP.

An IMP was, in essence, an early internet router: a device for connecting computing devices over networks and delivering packets of digital data.

The IMP used in 1969 was the size of a refrigerator.

The telephone lines served as the physical transport facilities to the second node in this network, the SRI host computer, which connected to its own IMP.

The ARPANET transport facilities functioned as the “internet” between the two IMPs.
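
For the technically curious, here is a toy Python sketch of the store-and-forward packet-switching idea. It is an illustration only, not the actual 1969 IMP protocol; the message text, node names, and packet size are all made up for the demo.

# Toy sketch of store-and-forward packet switching (illustration only).
MAX_PAYLOAD = 8  # bytes of message text per packet, kept small for the demo

def packetize(message, src, dst):
    """Split a message into small packets, each carrying its routing info."""
    chunks = [message[i:i + MAX_PAYLOAD] for i in range(0, len(message), MAX_PAYLOAD)]
    return [{"src": src, "dst": dst, "seq": n, "payload": c}
            for n, c in enumerate(chunks)]

def reassemble(packets):
    """Sort received packets by sequence number and rebuild the message."""
    return "".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize("LOGIN REQUEST FROM UCLA", src="UCLA-IMP", dst="SRI-IMP")
print(reassemble(packets))  # prints: LOGIN REQUEST FROM UCLA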

Charley Kline, a programmer and student at UCLA, was attempting to send the login command “LOG” to the SRI SDS-940 from his teletype terminal connected to the Sigma-7.

“I would type a character. It would go into my computer. My software would take it; wrap around it all the necessary software to send it to the IMP. The IMP would take it and say, ‘Oh, this is supposed to go up to SRI,’” Kline explained during a 2009 National Public Radio interview.

Kline typed the letters “L” and “O” from his teletype terminal. The Sigma-7 computer transmitted this data to the SRI SDS-940 computer over the ARPANET link connecting the two machines.

During this time, Kline was on the telephone with Bill Duvall, another programmer, who was operating the SDS-940 computer at SRI.

Duvall confirmed to Kline the SDS-940 computer received and recognized his typed characters “L” and “O.”

However, a problem occurred when Kline typed in the final letter, “G.”

The SDS-940 computer did not receive the “G.”

At SRI, Duvall discovered a memory buffer became full, causing the SDS-940 to stop working. In other words, it crashed.

And so, the first official word successfully transmitted over the original internet was “LO.”

So, “lo” and behold, we have a problem with the login.

Meanwhile, Duvall understood the circumstances causing the computer crash and knew how to prevent it from happening again.

He needed to increase the size of a memory buffer on the SRI SDS-940 computer, which took about an hour to complete.

Once Duvall completed the memory modifications, he asked Kline to retype the login command.

Kline typed “L-O-G” from his terminal at UCLA.

This time, the three-letter word was successfully recognized as a login command by the SDS-940 computer at the Stanford Research Institute.

The SDS-940 automatically filled in the “I-N” to complete the “L-O-G-I-N” command.

Kline was now remotely logged into the SDS-940.

He was able to type computing commands from his remote terminal into the SDS-940 and confirm the results.

Kline’s terminal in Los Angeles was, in effect, operating as a teletype terminal directly connected to the SDS-940 computer in Menlo Park.

This type of data communication exchange is known as a telnet session.
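
For readers who want to see what a telnet session looks like in code, here is a minimal sketch using Python’s standard telnetlib module. The hostname, login name, and prompt strings are placeholders, not a real system.

import telnetlib

# Connect to a (hypothetical) remote host on the standard telnet port.
tn = telnetlib.Telnet("example-host.local", 23, timeout=10)
tn.read_until(b"login: ")  # wait for the remote login prompt
tn.write(b"guest\n")       # send a login name, much as Kline sent "LOG"
tn.read_until(b"$ ")       # wait for the remote command prompt
tn.write(b"exit\n")        # log back out
print(tn.read_all().decode("ascii", errors="replace"))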

The importance of Kline’s and Duvall’s work was that users would no longer need a collection of uniquely programmed and designed terminals, each compatible with only a specific model of computer over a dedicated connection.

Now, a user with one computer terminal could (with proper clearance) connect, log in, perform commands, and access information from different models of computers connected on the internet.

Kline’s logbook shows the first message sent over the newly-born internet took place Oct. 29, 1969, at 10:30 p.m. His handwritten note is here: https://bit.ly/2Vh1pYo.

With new college and university computer nodes added, the expansion of the internet was well underway.

A photo of an Interface Message Processor is at https://bit.ly/2WnDovk.

The ARPANET diagram with the UCLA and SRI computer nodes can be viewed here: https://bit.ly/2V45E4w.

See the Department of Defense Directive 5105.15 at https://bit.ly/2pLunRy.

Tuesday is the recognized 50th birthday of the internet, so celebrate accordingly.

Friday, October 18, 2019

Portable jukebox put 1,000 songs in our pocket


© Mark Ollig

“We are living in a new digital lifestyle with an explosion of digital devices. It’s huge,” said Apple CEO Steve Jobs.

He made this statement Oct. 23, 2001, at Apple Computer’s campus in Cupertino, CA, during the public introduction of the first iPod music player.

Jobs went on to describe the new Apple iPod as a digital device that could put “1,000 songs in your pocket.”

He then reached into his pocket and held up a small white plastic box, the iPod, about the size of a deck of playing cards.

The iPod weighed 6.5 ounces and was ergonomically designed with easy-to-use features.

Earbud-style headphones for music listening came with the iPod. They plugged into the 3.5-mm stereo headphone jack.

By rotating the iPod’s unique scroll wheel with your thumb or finger, you could quickly maneuver through the songs stored digitally on its 1.8-inch, 5GB (gigabyte) hard drive.

Assuming 5MB (megabyte) per song, the iPod could hold up to 1,000 songs.

An optional 10GB hard drive model, introduced in 2002, stored 2,000 songs.
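
A quick back-of-the-envelope check of those numbers, using the same 5MB-per-song assumption, written as a few lines of Python:

SONG_MB = 5
for drive_gb in (5, 10):
    songs = drive_gb * 1000 // SONG_MB  # treating 1GB as roughly 1,000MB
    print(f"{drive_gb}GB drive: about {songs:,} songs")
# 5GB drive: about 1,000 songs
# 10GB drive: about 2,000 songs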

The iPod could play digital audio in MP3 compression format at bit rates of 160 to 320Kbps.

It featured a 2-inch diagonal monochrome LCD (liquid-crystal display) flat panel with LED (light-emitting diode) backlighting for viewing its menu information.

The CPU (central processing unit) used in the iPod was a PortalPlayer PP5002 system-on-a-chip with dual embedded 90 MHz ARM 32-bit microprocessors.

ARM is an acronym for Acorn RISC Machine. RISC stands for Reduced Instruction Set Computer.

Before the iPod, in the late 1980s, Apple was working on a PDA (personal digital assistant) device called the Newton MessagePad.

The Newton MessagePad was the first PDA to recognize handwriting from a stylus pen when pressed on its touch-sensitive display screen.

Apple’s engineering design called for using a low-power-consuming CPU with this PDA.

They established a contractual agreement with the British company Acorn Computers and the custom integrated-circuit manufacturer VLSI Technology to use the ARM CPU in the Newton MessagePad.

The Newton MessagePad weighed nearly 1 pound and measured 7.25 inches by 4.5 inches by 0.75 inches thick. It became publicly available in 1993.

I recall it being a bit pricey: $699 in 1993, or about $1,242 in today’s dollars.

Unfortunately, the Newton MessagePad failed to meet user expectations, and its highly-publicized handwriting feature was not reliable.

Apple discontinued manufacturing the Newton MessagePad March 4, 1994.

In 1996, US Robotics Corporation’s subsidiary Palm Inc. began selling a PDA called the Pilot, which was small enough to fit in a shirt pocket.

The Pilot was a handheld computing organizer advertised as, “The pocket-sized organizer that’s always in touch with your PC.”

It was designed to work as a portable peripheral that could sync data with a desktop or laptop computer running the Windows 95 or Windows 3.1 operating systems.

The Pilot’s features included a Calendar with appointments, an Address Book, a To-Do List, a Calculator, and a Memo Pad for note-taking.

A plastic stylus pen was used with the Pilot to input commands, draw, or make selections.

It included a four-shade monochrome screen to display information.

The Pilot could mirror the information contained on a user’s IBM-compatible computer using Palm’s HotSync synchronization software.

It could back up its data to a home or business computer. The Pilot became widely used by medical, law enforcement, business, and technical personnel.

During the late 1990s, I used a Palm Pilot at the telecommunications company where I worked.

Did you know the Pilot pen company sued Palm Inc. for using the name Pilot?

The Pilot pen folks alleged consumers would mistake the Palm Pilot solid plastic stylus pen for one of their hollow plastic tubes filled with ink.

Oh, there is more.

In 1998, Microsoft announced their new “Palm PC” product, and guess who sued Microsoft?

You are correct. Palm Incorporated’s parent company, US Robotics.

However, I digress; back to the iPod.

The iPod’s lithium-polymer battery played music for up to 10 hours before needing to be recharged from a Macintosh computer port, or an electrical outlet using an Apple power adapter.

Controls and customizable settings on the iPod included play, pause, skip, shuffle, repeat, startup volume, sleep timer, and multiple language selections.

Downloading songs to the iPod was accomplished by plugging it into the FireWire port of an Apple computer.

Apple has always been proprietary when it comes to its software and computers; the software bundled with the first-generation iPod could only be used with Apple’s Macintosh computers.

Computers running the Windows operating system needed third-party software, such as EphPod, CopyTrans, or XPlay, to manage or back up songs on an iPod.

The original Apple iPod 5GB model sold for $399, and the 10GB model sold for $499; adjusted for inflation in 2019, those prices would be $578 and $712, respectively.

In case you were curious, today’s Apple iPad Pro with 1TB of storage can hold 200,000 songs.

In May 2018, Gracenote, a Nielsen-owned media data specialist company, announced it was using artificial intelligence and machine learning to classify and sort through 90 million songs.

After much thought (and a calculator), I determined one would need an iPod with a 450TB (terabyte) hard drive to hold that many songs.
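
Here is that calculator work repeated in a few lines of Python, using the same rough 5MB-per-song estimate from earlier:

songs = 90_000_000                      # Gracenote's catalog
song_mb = 5                             # rough size of one song
total_tb = songs * song_mb / 1_000_000  # treating 1TB as roughly 1,000,000MB
print(f"about {total_tb:,.0f}TB needed")  # prints: about 450TB needed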

April 28, 2003, Apple discontinued production of our portable 1,000-song jukebox.

(Photos: the original Apple iPod, the US Robotics Palm Pilot, and the Apple Newton MessagePad)



Friday, October 11, 2019

It’s National Cybersecurity Awareness Month

© Mark Ollig



Along with October’s cooler temperatures, we continue to enjoy the beautiful brown, yellow, orange, and red fall foliage.

The good folks at the US Department of Homeland Security use the month of October to call our attention to National Cybersecurity Awareness Month (NCSAM).

It is the 15th anniversary of NCSAM.

The Department of Homeland Security began NCSAM in 2004 to remind us to keep our digital devices and online information safe and secure from cybercriminals.

“Internet users at home are not as safe online as they believe,” advised an Associated Press newspaper article published Oct. 25, 2004.

The article described some 600 pieces of spyware found on a single home computer, monitoring a family’s online activities, and explained how computer users could protect their privacy.

NCSAM is a collaborative effort between the Cybersecurity and Infrastructure Security Agency (CISA) and the National Cyber Security Alliance (NCSA), along with other organizations, both public and private.

Cisco Systems, Inc. defines cybersecurity as “the practice of protecting systems, networks, and programs from digital attacks.”

For me, cybersecurity is the protection of computing devices, websites, and networks from those attempting unauthorized entry to disrupt services or steal data.

Cyberattacks can happen over the networks of government agencies, utility companies, Fortune 500 corporations, small businesses, or the internet-connected devices of people in our local community.

Our internet-connected digital devices, including computers, smartphones, tablets, and digital assistants such as Google Home, Amazon Alexa, Apple’s Siri, and Microsoft’s Invoke, are susceptible to cyberattacks.

Also vulnerable to cyberattacks are internet-connected smart TVs, digital wearables, home appliances, home-internet connected security camera systems, and automobiles with an internet connection.

Also open to cyberattacks are IoT (Internet of Things) devices: smart electronic sensors and the assortment of digital gadgets that monitor, report, and share information over the internet from homes, businesses, utilities, and governmental municipalities.

A targeted cyberattack on an internet-connected device can disrupt its service, manipulate or take control of its operation, and steal the information the device contains.

The number of IoT devices connected to the internet continues to grow, and as far as I can determine, there is no end in sight.

By 2020, there will be 30 billion IoT devices networked over the internet, according to a recent report by Statista.

By 2025, I believe 5G wireless connectivity will be built into many digital gadgets; the number of IoT devices is projected to reach 75 billion.

Beyond 2025, the sky’s the limit on the number of IoT devices there will be.

This year’s NCSAM is Information Technology-themed: “Own IT. Secure IT. Protect IT.”

The National Initiative for Cybersecurity Careers and Studies (NICCS) is a US government website providing many consumer and business resources to defend against cyberattacks.

Cybersecurity includes using a long password or passphrase and limiting the personal information you post to social media sites.
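
If you are curious how a passphrase might be generated, here is a minimal Python sketch using the standard library’s secrets module. The short word list is a stand-in; a real generator would draw from a list of thousands of words, such as the EFF’s Diceware list.

import secrets

# Stand-in word list; a real one would have thousands of entries.
WORDS = ["maple", "river", "copper", "lantern", "orbit",
         "thistle", "gravel", "sparrow", "anchor", "velvet"]

def make_passphrase(num_words=4):
    """Join randomly chosen words using a cryptographically secure generator."""
    return "-".join(secrets.choice(WORDS) for _ in range(num_words))

print(make_passphrase())  # e.g. "orbit-maple-anchor-gravel"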

An example of a phishing attack is when cybercriminals lure you to “clickbait” links with messages such as “We suspect an unauthorized transaction on your account. To ensure that your account is not compromised, please click the link below and confirm your identity.”

Also, beware of fraudulent internet schemes, including identity theft, imposter deception scams, and debt collection swindlers.

Software apps on our smartphone can control many of the digital devices and home assistants we use. Be suspicious of unfamiliar apps running in the background or using permissions you don’t recall approving.

Other suggestions to defend against cyberattacks and theft of personal information include:

• Disconnect internet access for a computing device not in use.

• Limit conversation near voice monitors, audio recordable toys, and digital assistants.

• Cover toys, laptops, and monitoring devices’ cameras when not in use.

• Disable wireless connections for digital devices not in use.

Our home and business wireless routers are a primary entry point for cybercriminals to access our digital devices. Keep your Wi-Fi network secured by changing the router’s factory-set username and password.

A wealth of information is available for protecting your smart devices and personal data at the National Cybersecurity Awareness Month 2019 website. I created a shortened link to it: https://bit.ly/2m4zsTb.

The Department of Homeland Security has a cybersecurity toolkit filled with suggestions and help for students, parents, educators, industry, small business, law enforcement, older Americans, and young professionals at www.dhs.gov/stopthinkconnect-toolkit.

Use the Twitter hashtag #BeCyberSmart to share your comments and see what others are posting about cybersecurity.

Awareness and preventive action against cyberattacks are things we should be mindful of not only during October, but year-round.



Friday, October 4, 2019

A quantum leap in computing supremacy

© Mark Ollig


You may have heard about quantum computing. There is a lot of excitement about it, and this future generation of computer processing is in the news.

A quantum processor was reportedly used to solve a complex calculation: verifying the distribution of randomly generated numbers. The calculation took 200 seconds to complete.

According to Google, the same calculation would take 10,000 years for today’s fastest processing supercomputer to solve.
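
To put Google’s claim in perspective, here are a few lines of Python comparing the two times:

SECONDS_PER_YEAR = 365 * 24 * 60 * 60
supercomputer_s = 10_000 * SECONDS_PER_YEAR  # 10,000 years, in seconds
sycamore_s = 200                             # Sycamore's reported time
print(f"speedup: about {supercomputer_s / sycamore_s:,.0f} times faster")
# prints: speedup: about 1,576,800,000 times faster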

Whose quantum processor performed this astonishing feat?

Google’s 54-qubit quantum computing chip, called the Sycamore processor, was said to have accomplished the complex calculation. One qubit did not function properly, so the calculation was performed using 53 qubits.

“Quantum Supremacy Using a Programmable Superconducting Processor,” Google’s research paper describing their calculation accomplishment and quantum processor, appeared during August on NASA’s Ames Research Center website.

Surprisingly, this research paper has disappeared from the website. Why it is no longer available remains unanswered. Neither Google nor NASA has publicly commented about the disappearance.

IBM, a competitor in the quantum computer race, stated Google’s milestone was accomplished using a specialized system designed to solve a single problem.

They went on to say Google had not achieved quantum supremacy, which Google appeared to claim in their research paper.

As of today, the IBM Summit is recognized as the world’s fastest processing supercomputer.

IBM is also busy with quantum research, and recently announced its 53-qubit universal quantum computing system suitable for cloud computing applications.

“The single goal of this passionate community is to achieve what we call Quantum Advantage, producing powerful quantum systems that can ultimately solve real problems facing our clients that are not viable using today’s classical methods alone,” said Dario Gil, director of IBM Research.

Another company, called IonQ, headquartered in College Park, MD, is working on a 32-qubit quantum computer.

During a 1981 conference hosted by IBM and MIT, the late theoretical physicist Richard Feynman stressed the importance of constructing a quantum computer.

The next year, IBM began researching the technology needed to build a quantum computer.

On July 7, 2008, I wrote a column about quantum computing. In it, I used the analogy of searching through a telephone book for a specific name and telephone number.

A standard computer program looks at the names one at a time, searching each entry until it finds the correct name and telephone number.

Although a computer processes information quickly, it still performs each search operation sequentially.

By using the qubit processing power of a quantum computer, every name and associated telephone number in the phone book could, in effect, be examined simultaneously through superposition. A quantum search of this kind (Grover’s algorithm) needs only about the square root of the number of entries in lookups, so even an enormous telephone book could be searched in a comparative instant.
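
For illustration, here is a small Python sketch of the classical approach, with a note on how a quantum (Grover) search would scale. The phone book entries are made up.

import math

phone_book = {"Ada": "555-0100", "Grace": "555-0101", "Claude": "555-0102"}

def linear_search(book, target_name):
    """Classical search: check one entry at a time (up to N checks)."""
    steps = 0
    for name, number in book.items():
        steps += 1
        if name == target_name:
            return number, steps
    return None, steps

number, steps = linear_search(phone_book, "Claude")
print(f"found {number} after {steps} classical checks")

n = 1_000_000  # a big-city phone book
print(f"{n:,} names: up to {n:,} classical checks, "
      f"but only about {math.isqrt(n):,} Grover iterations")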

Today’s computers use digital logic bits; quantum computers use quantum-mechanical systems, such as the spin of an electron or a tiny superconducting circuit, as qubits.

A computing bit is the smallest unit of digital computation. A bit is in one of two states: a one or a zero, on or off.

Today’s standard computers and supercomputers are based on logical operations using binary digits of “1 and 0.” Being in a state of “1,” the bit is on; and in a state of “0,” the bit is off.

Computer circuitry built from logic gate components has its output determined by its binary input.

An individual binary bit is in one logical state at a time; either a “1” or “0.”

A qubit is a quantum bit that, through superposition, can represent a “1,” a “0,” or a weighted combination of both at the same time; qubits can also be entangled with one another, linking their states.
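
Here is a toy Python simulation of measuring a single qubit in superposition. The amplitude values are illustrative; the squares of the two amplitudes must sum to 1.

import random

alpha, beta = 0.6, 0.8   # 0.36 + 0.64 = 1.0, a valid qubit state
p_zero = alpha ** 2      # probability the measurement reads "0"

counts = {"0": 0, "1": 0}
for _ in range(10_000):  # each measurement collapses the qubit to 0 or 1
    outcome = "0" if random.random() < p_zero else "1"
    counts[outcome] += 1
print(counts)            # roughly {'0': 3600, '1': 6400}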

In some designs, such as trapped-ion machines, lasers are used to perform quantum logic gate operations by nudging a qubit into the required state; superconducting designs, like Google’s, use microwave pulses instead.

A quantum computer with a considerably large qubit processor would have incredible processing power.

Quantum computing hardware components and software are currently being researched and developed.

No existing quantum computer is yet ready for tackling the real-world applications today’s supercomputers are used for.

A stable, reliable quantum computer is still years away from replacing today’s supercomputers. We are, however, beginning to slowly transition from this current period of computer processing and into the quantum computing era.

As for Google’s missing research paper, fortunately, the 12-page document was saved and stored before being removed from the NASA website.

After much investigating, I found and read Google’s “Quantum Supremacy Using a Programmable Superconducting Processor” research paper. You can read it at this shortened link I created: https://bit.ly/2kuKJLU.

I believe future quantum computers will be superior to today’s computing devices, as much as today’s computing devices are superior to a 2,500-year-old abacus.

Like Sam Beckett, we may soon find ourselves experiencing a “Quantum Leap.”

(Photo used with the permission of IBM Research)