Friday, March 30, 2018

A message from the founder of the Web

©Mark Ollig


March 12 is recognized by many as the 29th birthday of the World Wide Web.

Tim Berners-Lee wrote the code we use to point-and-click our way through the myriad of website pages, text, videos, photos, and social media links.

Berners-Lee initially called the web a “global hypertext system.”

Many people assume the internet and the web are the same; they are not.

The internet is a network of interconnected computers, routers, gateways, border controllers, and cables, sending and receiving binary-coded packets of data.

Think of these data packets as discrete messages written on postcards delivered to specific addresses.

The computing devices and software on the internet pass and route the data packets through its network to their desired destinations.

The web, which I consider an overlay program running on top of the internet, would not work without the internet’s underlying network topology. That topology relies on high-level computer programming languages, along with the assorted hardware and electronic components needed to send and receive data packets using the internet’s transmission control protocols.

The internet delivers data packets using its transmission protocols to anywhere in the world with access to its network – and it does this very quickly.
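For readers who enjoy a peek under the hood, the postcard analogy can be sketched in a few lines of Python. Everything here is invented for illustration – the function names, the destination address, and the packet size – and real internet packets carry far more header information, but the idea of numbered, individually addressed pieces that can arrive out of order and still be reassembled is the same.

```python
# Illustrative sketch of the postcard analogy: a message is split
# into small, addressed, numbered "packets," which can arrive in
# any order and still be put back together by the receiver.

def to_packets(message: str, dest: str, size: int = 8) -> list:
    """Split a message into addressed, sequence-numbered packets."""
    return [
        {"dest": dest, "seq": i, "data": message[i:i + size]}
        for i in range(0, len(message), size)
    ]

def reassemble(packets: list) -> str:
    """Sort packets by sequence number and rejoin the message,
    just as the receiving computer does regardless of arrival order."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = to_packets("Hello from the World Wide Web!", dest="198.51.100.7")
packets.reverse()  # simulate packets arriving out of order
assert reassemble(packets) == "Hello from the World Wide Web!"
```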

Back in 1980, Tim Berners-Lee was managing software programs for storing computer information files.

In 1989, he was working at the European Particle Physics Laboratory near Geneva, Switzerland, also known as CERN – which I learned is the French abbreviation for Conseil EuropĂ©en pour la Recherche NuclĂ©aire (European Council for Nuclear Research).

While working at CERN, he submitted his now famous document, “Information Management: A Proposal.”

The proposal described, with diagrams, his software program hierarchy using a hypertext transfer protocol, in which an information file stored on a computer server within a shared network could be uniquely referenced and quickly accessed from any other networked computer via a Universal Document Identifier (UDI).

UDI is known today as the Uniform Resource Locator (URL), which is the specific address of a webpage.
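For the curious, Python’s standard library can pull a URL apart into the pieces that make up that address. The web address in this sketch is just a placeholder example, not one from the column:

```python
# Splitting a URL - the modern descendant of Berners-Lee's Universal
# Document Identifier - into its component parts.
from urllib.parse import urlparse

url = urlparse("https://www.example.com/pages/history.html")
print(url.scheme)  # "https" - the protocol to use
print(url.netloc)  # "www.example.com" - which server to contact
print(url.path)    # "/pages/history.html" - which document to fetch
```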

“We should work toward a universal linked-information system,” Berners-Lee wrote in his March 1989 proposal to CERN.

Berners-Lee’s original CERN proposal document and diagrams of what eventually became the World Wide Web are at https://bit.ly/1eBedal.

During 1990, using a late-1980s model NeXT workstation computer, Berners-Lee finished coding a client browser with a graphical user interface he named the “WorldWideWeb” program.

He also used the NeXT workstation as the web’s first hypertext server while writing the code for the first web browser.

A photo of the NeXT computer Berners-Lee used to create the web is at https://bit.ly/2I2stQf.

To see Berners-Lee’s screengrab of the world’s first hyperlinked website in his WorldWideWeb browser, visit https://bit.ly/2I3u34g.

On this 29th anniversary, Berners-Lee mentioned concerns he has about the web: “How do we get the other half of the world connected?” and, “Are we sure the rest of the world wants to connect to the web we have today?”

“While the problems facing the web are complex and large, I think we should see them as bugs: problems with existing code and software systems that have been created by people and can be fixed by people,” Tim Berners-Lee said.

The threats to the web are “real and many . . . from misinformation and questionable political advertising to a loss of control over our data. But I remain committed to making sure the web is a free, open, and creative space – for everyone,” Berners-Lee added.

He also suggested a plan of action: “Let’s assemble the brightest minds from business, technology, government, civil society, the arts, and academia to tackle the threats to the web’s future.”

“The web, as I envisaged it, we have not seen it yet. The future is still so much bigger than the past,” Tim Berners-Lee said in 2009.

Follow Sir Tim Berners-Lee on Twitter at @timberners_lee.

(Image Art right-to-use paid) 


Friday, March 23, 2018

Apollo 17 landing site to be visited using 4G technology

©Mark Ollig


July 20, 2019, will mark the 50th anniversary of the Apollo 11 lunar landing mission.

Some of us recall watching on our television sets the ghost-like image of astronaut Neil Armstrong as he descended the Lunar Module (LM) ladder, touched his boot to the lunar soil in one small step, and then took a giant leap onto the moon’s surface for all humankind.

Westinghouse Electric Corporation’s Aerospace Division developed and manufactured the Apollo Lunar Television Camera used on Apollo 11.

This camera operated under extreme lunar temperature variations; +250 degrees Fahrenheit during the day and -300 degrees Fahrenheit at night.

The lunar camera needed to function in the harsh space and lunar environment: near-vacuum pressure, no oxygen atmosphere, meteoroid strikes, particle radiation; and of course, moon dust.

The lunar camera’s format of 10 fps (frames per second) with a 320-line scan was selected because, in 1969, most home TVs received a 30 fps, 525-line scan; they could reproduce the camera’s 320 lines and still provide a satisfactory picture of Neil Armstrong and Buzz Aldrin hopping around on the moon’s surface.
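The line-rate arithmetic behind that design choice is easy to check. This little sketch compares only lines per second; it ignores the detail within each line and is my own back-of-the-envelope illustration, not a NASA figure:

```python
# Comparing the lunar camera's scan format with 1969 home broadcast TV,
# measured in scan lines transmitted per second.
lunar_camera = 320 * 10   # 320 lines per frame, 10 frames per second
home_tv      = 525 * 30   # 525 lines per frame, 30 frames per second

print(lunar_camera)                       # 3200 lines/second from the moon
print(home_tv)                            # 15750 lines/second for broadcast TV
print(round(home_tv / lunar_camera, 1))   # 4.9 - TV carried ~5x the line rate
```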

“Molecular integrated devices” made up 80 percent of the lunar camera’s solid-state circuitry.

The camera operated using filtered +6VDC and +8VDC supplied by the Inverter and Line Regulator’s variable power input circuitry.

The Apollo 11 lunar television camera also used technology not publicly known about in 1969.

The fiber-optic technology in the lunar television camera was considered “CLASSIFIED CONFIDENTIAL” in NASA’s “Lunar Television Camera Pre-installation Acceptance (PIA) Test Plan” document 28-105, dated March 12, 1968.

The Apollo 11 LM lunar surface camera was stowed inside a storage compartment on the descent stage of the LM called Eagle.

Once the camera was removed and activated, it transmitted live video images via a radio antenna on the Eagle, to tracking stations in Australia and California, 250,000 miles away.

To this day, the original lunar television camera remains on the surface of the moon, in the Sea of Tranquility, within the Apollo 11 landing site.

But I digress; back to today’s topic.

We can get excited about the moon, again.

Brand-new, HD (High Definition) video cameras will be sent to the moon’s surface using a SpaceX Falcon 9 rocket in 2019.

Plans call for two Audi Lunar Quattro moon rovers, equipped with HD cameras, to link up with a new 4G moon network, as part of the lunar video communications base station developed by Nokia Bell Labs.

A “space-grade network” will be operating on the moon using 4G technology by Vodafone and Nokia.

The new moon-based 4G broadband cellular network “is highly energy-efficient, compared to analog radio,” stated Vodafone.

5G technology is yet to be standardized and thus was not considered at this time.

Vodafone says its 4G broadband lunar network will provide live-streaming HD video from the surface of the moon to a global audience watching here on Earth.

The Audi Lunar Quattro moon rovers’ HD cameras will study, close up, the lunar rover vehicle used during the December 1972 Apollo 17 moon-landing mission, which was the last time humans walked on the moon.

NASA used lunar rovers during the Apollo 15, 16, and 17 missions.

What an exciting experience it will be for us to watch the live video feed from the Audi lunar rovers’ HD cameras as they approach the Apollo 17 landing site.

We will see, in High Definition video, Apollo 17’s battery-powered, four-wheeled, 463-pound Lunar Roving Vehicle (LRV), as it rests in the Taurus-Littrow Valley highlands.

“Vodafone testing indicates that the base station should be able to broadcast 4G using the 1800 MHz frequency band and send back the first-ever live HD video feed of the Moon’s surface,” read a statement released by Nokia.

The video, broadcast to Earth via a deep-space data link, will no doubt be widely viewed on various media venues and internet websites.

Images of the Apollo 17 landing site were photographed in 2011, by NASA’s Lunar Reconnaissance Orbiter (LRO) satellite, which took low-altitude, Narrow-Angle Camera images.

On one LRO Apollo 17 landing site photo, we can easily see astronauts Eugene Cernan and Harrison Schmitt’s footpath trails in the lunar soil to the left of the LM (Challenger) descent stage.

The LRO photo also shows the US flag, the Apollo Lunar Surface Experiments Package (ALSEP), and the lunar rover’s final resting place; just east of the LM, as well as the wheel tracks it created on the lunar surface.

I have wondered what it would be like to revisit, close up, an Apollo moon landing site after all these years; hopefully, the Audi moon rovers will not bump into the Apollo 17 lunar rover, flag, or LM descent stage.

Next year, we will celebrate the 50th anniversary of the Apollo 11 mission, along with seeing the up-close, high-definition, live video feed of the landing site and lunar rover from the last Apollo moon mission.


Credit: NASA/Goddard Space Flight Center/Arizona State

Sunday, March 18, 2018

Buying a computer in the ‘90s

©Mark Ollig

Back in the mid-1990s, when the excitement and wonderment of the internet, Web, and personal computing was popular, a local Winsted resident started writing a column in his hometown newspaper.

This person was able to dabble in his favorite pastime, learn some new and exciting technology, and share the information with his readers.

Of course, this fellow was me, and with my new 1995 HP Omnibook notebook personal computer, I set out to explore and explain this cyber-spaced landscape – from my perspective – to the reading masses.

Back in the ‘90s, trying to figure out what type of home computer a person should purchase was something one needed to research before aimlessly walking into the computer electronics store.

Knowledge is power, as the saying goes.

In 1995, I recall walking into a computer store just west of my hometown; it was quiet, except for the sounds of computer cooling fans and a dot matrix printer operating.

My eyes wandered as they took in all the modern computers on display; some were operating with information from a software program showing on their monitors.

It may sound corny, but there was a feeling of serenity in the air, as if I was looking into the future while surrounded by all those computers, monitors, and printers, along with the software disks and the hardware and software manuals on display.

Eventually, the salesperson broke my tranquil ambiance by asking, “Can I help you find anything?”

“I’m looking to buy a home computer,” I replied.

The salesperson smiled (I could see the dollar signs in his eyes), then politely inquired what I wanted to use the computer for.

I gave the standard response: I wanted the latest Windows operating system, which was Windows 3.0, along with a word processor, a spreadsheet program, games, communication software, and a dialup modem.

The salesperson then asked the “Do you want an internal or external modem?” question.

I was purchasing a tower PC model and thought it would be best to have an external modem in case I needed to replace it or set any of the modem’s dual in-line package (DIP) switches.

Having an external modem meant I needn’t open up the computer and plug in a printed wiring card.

Besides, I liked seeing what I was paying for.

Remember, this was 1995.

In addition to my home computer model, I needed a more portable computer I could take with me.

So, which notebook computer did I end up purchasing in 1995?

You’re right. It was the HP Omnibook 4000 equipped with the Intel 100 MHz – 486DX4 processor, 520 MB hard drive, 10.4-inch thin-film-transistor liquid-crystal display screen, tactile keyboard, internal modem, external floppy drive, a battery lasting around three hours, and a few ancillary connection ports and devices.

The price for the HP Omnibook 4000 was a little over $3,000, which, in 1995, was a lot of money – in fact, it’s still a lot of money.

For those of you out there not up on your mega: a 1 MHz processor runs at one million clock cycles per second.
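A quick sketch of that “mega” arithmetic, using the Omnibook’s 100 MHz processor as the example. (Whether one clock cycle equals one calculation is a simplification, so treat this strictly as back-of-the-envelope figuring.)

```python
# The Omnibook's 100 MHz 486DX4: 100 million clock cycles per second,
# which works out to 10 nanoseconds per cycle.
clock_hz = 100 * 1_000_000        # 100 MHz, in cycles per second
ns_per_cycle = 1e9 / clock_hz     # nanoseconds per clock cycle

print(ns_per_cycle)               # 10.0 - each cycle lasts 10 nanoseconds
```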

In 1995, I thought having a 520 MB (megabyte) hard drive installed in my computer was more than enough disk storage space.

I can see today’s millennials smiling at the thought of my having MB disk storage, and then taking another sip of their favorite beverage.

Even a 1 MHz processor was “Star Trek” compared to the onboard guidance computers used in the 1960s and early-’70s Apollo spacecraft.

In fairness, NASA did most of the heavy number-crunching here on Earth, using rows of IBM mainframe computers. This information was relayed by satellites to the spacecraft.

You might remember the IBM computer “Deep Blue” playing a few games of chess against a human grandmaster. At first, the human chess grandmasters proved why a computer should not be playing chess: the machine would lose.

Today, the computer wins because of better-written software with improved analytical processes, running on much faster processors than in years past.

Back in the late ‘90s, much computing news centered on developing faster computing processor chips for processing complex software programs.

In 2007, software programs used millions of lines of code for video gaming and computer movie-making graphics. Anyone else remember playing Pong, the early-1970s video game played on a console box connected to a television?

By 2007, CGI (computer-generated imagery) technology available on computers had created amazingly lifelike computer-animated visual effects in movies such as “The Polar Express” and “Sky Captain and the World of Tomorrow,” largely replacing physical sets with digital ones.

Although today’s computers are far more advanced than those from the ‘90s, I still miss my old 1995 HP Omnibook.
1995 Omnibook notebook personal computer



Friday, March 9, 2018

5G, high-tech showcased at Mobile World Congress


©Mark Ollig


“Creating a better future” was the technology theme for the Mobile World Congress 2018 (MWC) convention in Barcelona recently.

MWC is the mobile industry’s most highly-attended yearly conference, and is organized by the Global System for Mobile Communications Association (GSMA).

Women4Tech hosted several events during MWC 2018, including keynote addresses by prominent women in the technology industry and those leading and working within high-tech companies.

“It’s going to take progress through technology. We know the next set of changes are going to have to be the same kind of leapfrogging in improvements that we see here at this show,” explained Kathy Calvin, president and CEO of the United Nations Foundation, during her MWC keynote address.

As I mentioned in last week’s column, the upcoming 5G (fifth-generation) wireless broadband technology and its applications were discussed by many of the guest speakers during MWC.

Industry executives reportedly said North America and northeast Asia, including China, Japan, and South Korea, are leading in 5G development, and that Europe has “some catching up to do.”

The chairman of the Federal Communications Commission, Ajit Pai, announced during his MWC keynote address that the US intends to host an auction of radio spectrum in the 28 GHz band, starting in November.

Pai added that, following the 28 GHz auction, another will take place for radio spectrum in the 24 GHz band.

The mobile cellular attendees cheered Pai’s announcement, which will push the development of new 5G networking systems, smartphones, computing tablets, Internet of Things (IoT) devices, and more.

MWC 2018 took place last month, and saw more than 107,000 people from 205 countries in attendance, including 7,700 technology industry leaders and company executive officers.

Total official delegates representing various countries surpassed 2,000, and the number of press and media passes covering the MWC event was more than 3,500.

More than 2,400 exhibits were on display during MWC 2018.

Technology and social media leaders, such as IBM, Intel, AT&T, Facebook, and Google, along with SK Telecom, which demonstrated its 5G network during the recent Winter Olympics in PyeongChang, South Korea, also attended the MWC convention.

Demonstrations showed how 5G technologies would improve the operation of robotics, building security systems, autonomously driven cars connected to the internet via 5G, and the energy management systems within smart homes and businesses.

MWC demonstrated real-world applications of 5G technology in rural agriculture, such as remotely controlling farm machinery and using information-gathering aerial drones.

IoT, social media content usage, facial recognition devices, and internet online policy and regulations were topics discussed during MWC 2018.

MWC 2018 also called attention to graphene, the highly conductive, honeycomb-lattice carbon material, and its importance in future mobile devices and other electronic components.

An area within the MWC venue, called NEXTech Hall 8.0, featured virtual and augmented reality (VR/AR) technology, robotics, cognitive computing, and artificial intelligence (AI) exhibits.

The next Mobile World Congress convention will take place Feb. 25-28, 2019, in Barcelona, Spain.

Visit the MWC website at http://www.mobileworldcongress.com.

As we draw closer to 2020, let us explore how 5G and future technologies will enhance and create a better future for all of us living on this planet, and someday, beyond it.

(Image royalty license-to-use paid)

Thursday, March 1, 2018

Future 5G technologies showcased

©Mark Ollig


Barcelona, located in the autonomous region of Catalonia, Spain, played host to the 2018 Mobile World Congress (MWC).

MWC was full of excitement and anticipation regarding 5G, the mobile cellular network technology we will be using in our cellphones and smart devices in 2020.

We are still over a year-and-a-half away from the official 5G cellular mobile industry standard becoming available, yet events such as MWC are already eagerly showcasing it.

Industry network specifications for international mobile telecommunications (IMT) are established by the technical standards-setting International Telecommunication Union (ITU), a branch of the United Nations, headquartered in Geneva, Switzerland.

The current and most commonly used cellular technology is called fourth-generation long-term-evolution (4G LTE), which many of us use with our smartphones.

4G LTE has been available since 2010.

On average, it’s about every 10 years before the next generation of cellular technology is released.

We are about to turn the page forward into the next chapter of wireless mobile cellular networking technology.

5G will be used not only for smartphones, but for IoT (Internet of Things) devices and the particular services they provide.

IoT incorporates intelligent components within its electronic sensors.

IoT smart sensors are attached to, and gather information from, a wide variety of everyday electronic devices. These devices then become part of the “things” connected to the internet.

“If we had computers that knew everything there was to know about things; using data they gathered without any help from us, we would be able to track and count everything, and greatly reduce waste, loss, and cost. We would know when things needed replacing, repairing, or recalling, and whether they were fresh or past their best,” stated a well-worded explanation for the benefits of using IoT from Techopedia, a website founded by Dale and Cory Janssen.

IoT devices will make extensive use of 5G’s ultra-broadband network, which will fill in the data bandwidth and speed gaps of the current 4G network.

5G will reportedly be up to 40 times faster than 4G under optimal operating conditions.
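Here is a rough illustration of what “up to 40 times faster” could mean in practice. The speeds in this sketch are round numbers I have assumed for the arithmetic (say, 50 Mbps on a good 4G connection), not measured figures:

```python
# How long would a 5 GB HD movie take to download? Assumed speeds:
# 50 Mbps on 4G, and 40x that (2 Gbps) on 5G under ideal conditions.
movie_bits = 5 * 8 * 1000**3      # a 5 GB movie, expressed in bits
lte_bps    = 50 * 1000**2         # assumed 4G speed: 50 megabits/second
nr5g_bps   = 40 * lte_bps         # 40x faster: 2 gigabits/second

print(movie_bits / lte_bps)       # 800.0 seconds (over 13 minutes) on 4G
print(movie_bits / nr5g_bps)      # 20.0 seconds on 5G
```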

Everything we use our 4G smartphones for, from downloading web pages and video to messaging, running apps, and tethering to other devices, will happen at blazingly fast speeds using 5G.

Radio frequency bands above the 24 GHz radio spectrum will be used for 5G.

KT Corporation, South Korea’s primary telecommunications provider, showcased their 5G technology during the recent Winter Olympics in PyeongChang, South Korea.

The KT 5G network used 22 high-speed, fiber-optic data links connecting 10 separate locations. Its pre-industry standard 5G network operated in the 28 GHz radio spectrum.

Samsung and Ericsson were two of the equipment vendors assisting KT with its 5G network demonstration, by equipping test smartphones and tablet devices for 5G services.

I was surprised to learn one Samsung tablet device recorded peak data transfer speeds of up to 3.5 Gbps (gigabits per second). This is incredible speed for a smart device, and it gives all of us a look at the speeds we will see in the very near future using 5G.

Technology from Intel Corporation, including its FlexRAN and Mobile Edge Computing systems, was also a part of KT’s 5G network demonstration.

KT demonstrated how 5G would be used in cloud computing, artificial intelligence, IoT, and immersive media services; such as virtual reality and augmented reality.

5G networks will be used during sports events, virtually-attended concerts and meetings, directing autonomous smart transportation and factories, remote-assisted health care services, and much more.

During the Winter Olympics, KT’s 5G network handled 3.8 PB (Petabytes) of data, which is equal to 3,800 TB (Terabytes), or 3,800,000 GB (Gigabytes).
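Those unit conversions are easy to verify. A short Python sketch, using the decimal prefixes (1 PB = 1,000 TB; 1 TB = 1,000 GB):

```python
# Converting the 3.8 petabytes of Olympics data into terabytes
# and gigabytes, using decimal (SI) unit prefixes.
petabytes = 3.8
terabytes = petabytes * 1_000     # 1 PB = 1,000 TB
gigabytes = terabytes * 1_000     # 1 TB = 1,000 GB

print(f"{terabytes:,.0f} TB")     # 3,800 TB
print(f"{gigabytes:,.0f} GB")     # 3,800,000 GB
```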

If you watched the Winter Olympics, you will recall the memorable opening and closing ceremonies, where 300 drones flew in perfect formation high above the stadium crowd.

The drones were linked and operated using KT’s 5G network.

Right now, the tech industry is centered on 5G; businesses, our local city communities, and you and I should look forward to exciting things happening when 5G hits the airwaves in 2020.

Hurry 5G; there are 50 billion IoT devices, smartphones, and virtual devices anxiously waiting to be connected to your network.
(Clipart Of LLC. Royalty user-fee paid for by Mark Ollig)