Friday, May 28, 2021

‘Light guiding’ through water and glass

© Mark Ollig


LASER is an acronym for “light amplification by stimulated emission of radiation,” and should not be confused with the fictional PHASER (phased array pulsed energy projectile weapon) used on the television show “Star Trek.”

Of course, the once fictional technology from “Star Trek” continues to become a reality.

Not long ago, the US military demonstrated a powerful microwave weapon it calls a Phaser, but this is a topic for a future column.

In 1841, Swiss physicist Jean-Daniel Colladon demonstrated the ability to guide light along something other than a straight line.

Colladon showed how a ray of light traveling inside a curved arc of flowing water follows the bend of the stream, trapped by repeated reflections at the water’s surface, an effect now known as total internal reflection.

He called this event “light guiding,” and said it “was one of the most beautiful and most curious experiments that one can perform in a course on optics.”
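The light stays trapped because any ray meeting the water’s surface at a shallow, grazing angle is reflected back inside rather than escaping into the air; modern optical fiber relies on the same effect at the boundary between its glass core and cladding. As a rough illustration (a sketch using textbook refractive indices, not figures from Colladon’s notes), a few lines of Python give the critical angle beyond which light cannot escape:

    import math

    def critical_angle_deg(n_inside, n_outside):
        # Angle from the surface normal beyond which light is totally
        # internally reflected at the boundary of the denser medium.
        return math.degrees(math.asin(n_outside / n_inside))

    print(critical_angle_deg(1.33, 1.00))  # water jet in air: about 48.8 degrees
    print(critical_angle_deg(1.48, 1.46))  # fiber core/cladding: about 80.6 degrees

Light hitting the boundary more steeply than that critical angle escapes; light hitting it at a shallower, more grazing angle stays inside and follows the curve.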

My reliable Newton’s Telecom Dictionary describes how a laser is used for transmission: “By turning the laser light signal on and off quickly, you can transmit the ones and zeros of a digital communications channel.”

Laser light encodes the transmitted information using direct modulation, and multiple laser frequencies can be multiplexed onto the same strand; the modulated light guides the data through a transparent, fiber-optic glass strand.
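At its simplest, this is on-off keying: the light is switched on for a binary one and off for a binary zero, one pulse per bit period. A minimal sketch of the idea in Python (an illustration only, not how any particular transmitter is built):

    def to_light_pulses(bits):
        # Transmitter: emit light (1) for a one bit, stay dark (0) for a zero bit.
        return [1 if b == "1" else 0 for b in bits]

    def from_light_pulses(pulses):
        # Receiver: a detected pulse of light becomes a 1, darkness becomes a 0.
        return "".join("1" if p else "0" for p in pulses)

    message = "1011001"
    assert from_light_pulses(to_light_pulses(message)) == message

Real systems switch the laser billions of times per second and stack many such signals, each on its own wavelength, onto a single fiber.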

The two most common light sources for fiber-optic transmission are LEDs (light-emitting diodes) and laser diodes.

In 1976, AT&T (American Telephone & Telegraph) engineers successfully sent information encoded within laser light through a fiber-optic telephone cable installed at AT&T’s research facility in Atlanta, GA.

Also in 1976, the US Air Force replaced the copper wiring harness of an A-7 Corsair fighter jet with a fiber-optic link consisting of 13 individual fiber-optic cables totaling 250 feet and weighing 3.75 pounds.

The A-7 Corsair’s previous copper wiring harness included 300 individual copper wire circuits, totaling 4,133 feet and weighing 88 pounds.

Fiber optics installed in modern aircraft allow military pilots to make instantaneous decisions using high-resolution imaging systems, displays, and flight controls linked by fiber-optic networks that provide real-time data.

In 1977, AT&T’s first commercial application of transmitting voice communications over a fiber-optic cable took place in Chicago.

Many believed fiber-optic technology would not reach its bandwidth ceiling for years to come; however, this is no longer the case.

According to the University of the Witwatersrand in Johannesburg, South Africa, the growing amount of data being used, coupled with advances in information technology, will strain the current technology used to provide the bandwidth needed to transport data over fiber optics.

“Traditional optical communication systems modulate the amplitude, phase, polarization, color, and frequency of the light that is transmitted,” read a statement from the university.

Recently, researchers at this university, working in a laboratory, created a computer-generated holographic simulation encoded with more than 100 light pattern configurations in multiple colors.

These configurations individually passed through a special liquid crystal that combined all of the light patterns, acting as a single spatial light modulator, or focal point.

Each light configuration pattern was extracted (demultiplexed) and sent to its individually coded destination.

The university’s spatial light modulator experiment demonstrates how the capacity of currently used optical fiber can be expanded, improving its efficiency in sending voice, video, and data to their destinations.
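The underlying principle can be sketched numerically: give each channel its own light pattern, chosen so the patterns are mutually orthogonal, add the patterns together into one combined field, and later recover each channel by projecting the combined field back onto its pattern. The following Python lines are a simplified illustration of that idea, not the university’s actual method:

    import numpy as np

    rng = np.random.default_rng(0)

    # Four mutually orthogonal "light patterns" (stand-ins for spatial modes),
    # one per column of Q, each sampled at 64 points.
    Q, _ = np.linalg.qr(rng.standard_normal((64, 4)))

    data = np.array([0.7, -1.2, 0.4, 2.0])   # one value per channel
    combined_field = Q @ data                 # multiplex: sum the weighted patterns
    recovered = Q.T @ combined_field          # demultiplex: project onto each pattern

    print(np.allclose(recovered, data))       # True

Because the patterns do not interfere with one another mathematically, many independent channels can share the same strand of fiber.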

The following is something previously unknown to the public.

In 1969, NASA secretly used fiber-optic technology during the first Moon landing.

I learned that the lunar television camera used by the Apollo 11 astronauts on the moon’s surface in July 1969 included fiber-optic technology.

NASA once classified the camera’s fiber-optic technology as confidential in its “Lunar Television Camera Pre-installation Acceptance (PIA) Test Plan” document, dated March 12, 1968.

The Aerospace Division at Westinghouse Electric Corporation assisted in developing and manufacturing the 7.3-pound black-and-white Apollo Lunar Television Camera (part number 607R962).

NASA stowed the Apollo 11 lunar surface television camera inside a storage compartment on the descent stage of the lunar module, Eagle, during its mission to the moon.

Apollo 11 was the only moon mission during which astronauts used this camera on the lunar surface.

However, NASA brought the same type of camera on the Apollo 13, 14, 15, and 16 missions as a backup in case the newer color television cameras onboard encountered a problem.

Back in the day, this old telephone cable splicer was concerned about having enough wired copper pairs to provide a dial tone signal to the phones of the subscribers living miles away from the central telephone office.

Today, engineers and technicians concern themselves with having enough optical fibers to provide sufficient bandwidth to transmit streams of high-speed broadband data over the internet.

NASA’s March 12, 1968 booklet describing the lunar television camera is available at https://go.nasa.gov/2Rzilcd.

Inside this booklet, on page 1 of 10, Section 1.1, titled “Security Classification,” includes the statement: “The fiber optics portion of the camera is classified CONFIDENTIAL. The camera must be kept in a secured area when not in use.”

Over the last 180 years, we have progressed from Colladon’s “light guiding” through water to sending terabits per second of light-encoded information through the glass strands inside a fiber-optic cable.

What’s next? Stay tuned.