Thursday, February 12, 2026

How frequency modulation conquered static

@Mark Ollig 

In the 1920s, radio evolved from small experiments into a powerful new medium for sharing information and entertainment wirelessly across America.

Westinghouse station 8ZZ in Pittsburgh, the direct predecessor of KDKA, broadcast live the results of Warren G. Harding’s presidential win over James M. Cox at about 545 kHz Nov. 2, 1920, before newspapers could report them.

The station aired the first Major League Baseball play-by-play broadcast from Forbes Field in Pittsburgh Aug. 5, 1921, using a telephone as a microphone; in case you were interested, the Pittsburgh Pirates beat the Philadelphia Phillies 8 to 5.

A Feb. 12, 1922, article in the Sunday Times-Advertiser of Trenton, NJ, reported that “President Harding is the latest notable wireless fan,” noting that “a radio phone has been placed in the presidential study in the White House.”

“The President proposes to make excellent use of the wireless and is planning to give some part of each of his busy days to listening-in. A radiophone is also to be installed in the press room in the White House,” the article stated.

WLAG in Minneapolis began broadcasting Sept. 4, 1922, at 720 kHz.

It was acquired by Washburn‑Crosby Co. and became WCCO Oct. 2, 1924.

WCCO moved to 810 kHz in 1928 and transmitted at 50,000 watts by September 1932, reaching much of the Upper Midwest.

In 1941, it transferred to 830 kHz and was long promoted as “the station that serves the nation.”

Nationally, radio saw the launch of the National Broadcasting Company (NBC) in 1926 and the Columbia Broadcasting System (CBS) in 1927.

Radio owes much to Edwin Howard Armstrong, an American electrical engineer and inventor born Dec. 18, 1890, in New York City.

Armstrong was fascinated by gadgets from a young age, building a homemade antenna tower and conducting electrical experiments.

In 1912, while studying at Columbia University, Armstrong discovered that feeding some of a radio tube’s output back into its input could make weak signals regenerate into much stronger ones.

In 1914, Armstrong received US Patent 1,113,149, titled “Wireless receiving system,” for his regeneration design.

However, Lee de Forest, an American inventor and electrical engineer who had patented the Audion tube in 1907, claimed regeneration was an outgrowth of his own invention and argued that Armstrong should not own the rights or collect any royalties.

This argument over who really invented regeneration in a circuit ended in 1934 when the US Supreme Court sided with de Forest.

Many engineers believed Armstrong was the one who had actually turned regeneration into a practical, working radio circuit.

In 1918, while serving in the US Army Signal Corps during World War I, Armstrong developed the superheterodyne receiver.

His design mixed the incoming station signal with a locally generated signal and shifted the result to a fixed intermediate frequency that was easier to amplify and filter.

After amplification at that intermediate frequency, the receiver recovered the audio signal.

This approach produced clearer voices and music with less “bleed-over” from nearby stations, and it remains the basic architecture of most modern radio, television, radar, and cellular receivers.
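
The mixing step at the heart of the superheterodyne can be sketched numerically. The snippet below is an illustrative toy, not Armstrong's circuit; the frequencies are scaled-down values of my choosing. It shows that multiplying an incoming signal by a local oscillator produces sum and difference frequencies, and the fixed difference is the intermediate frequency a receiver amplifies and filters.

```python
import numpy as np

# Toy superheterodyne mixing: multiply the incoming signal by a
# local oscillator and look at the resulting spectrum.
fs = 1_000_000                 # sample rate, Hz (scaled-down example)
t = np.arange(0, 0.01, 1 / fs)
f_rf = 100_000                 # stand-in for the incoming station carrier
f_lo = 55_000                  # local oscillator frequency

rf = np.sin(2 * np.pi * f_rf * t)
lo = np.sin(2 * np.pi * f_lo * t)
mixed = rf * lo                # multiplication is frequency mixing

# The spectrum peaks at the difference (45 kHz, the "IF") and the
# sum (155 kHz); a real receiver filters out the sum and amplifies
# the difference.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]
print(sorted(int(f) for f in peaks))   # [45000, 155000]
```

Whatever station frequency comes in, retuning the local oscillator keeps the difference at the same intermediate frequency, which is why one fixed amplifier chain can serve the whole dial.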

Starting in the 1920s, home radios tuned into AM stations, typically from about 550 to 1,500 kHz, until the AM band was extended to 1,600 kHz in 1941.

In the early 1930s, Armstrong engineered wide-band frequency modulation, or FM, which changed the signal’s frequency rather than its amplitude, making it much less susceptible to electrical noise.

He tested the system at very high frequencies and launched the experimental station W2XMN in Alpine, NJ.

Assigned 42.8 MHz (megahertz) in 1938, the station began regular broadcasts in 1939, demonstrating that FM could deliver high-fidelity sound far superior to AM.

Minnesota engineers soon joined this high-frequency frontier.

In 1939, Minneapolis station W9XHW operated at 42.30 MHz with 50 watts from the Nicollet Hotel.

W9XHW was an experimental “Apex” station using amplitude modulation (AM) on very high radio frequencies above the standard AM band to provide wide audio bandwidth and higher‑fidelity sound.

Because it still relied on AM, it remained vulnerable to lightning and man‑made static, letting engineers see how Armstrong’s FM system improved on earlier high‑frequency AM.

Some Twin Cities broadcasters were also testing Apex signals.

KSTP’s high-frequency station W9XUP began “ultra-short-wave” Apex tests in 1938 at 25.95 MHz, then shifted to 26.15 MHz in 1939.

WTCN’s W9XTC, active in 1939, operated as an experimental FM transmitter on 26.05 MHz.

The Apex era of experimentation effectively ended when the Federal Communications Commission (FCC) eliminated the Apex band Jan. 1, 1941, reallocating its spectrum to the new FM broadcast band.

The FCC then used those former Apex channels to launch the first commercial FM band from 42 to 50 megahertz, replacing high‑frequency AM with Armstrong’s new static‑resistant system.

In March 1941, Nashville’s W47NV went on the air at 44.7 MHz as the first licensed commercial FM station in the United States.

In June 1945, the FCC moved FM broadcasting from 42 to 50 MHz up to 88 to 106 MHz.

The band was expanded in 1946 to 88 to 108 MHz.

The change disrupted many early FM stations, including Armstrong’s, but the core technology adjusted to the new allocation and eventually became the standard for high-quality broadcasting.

FM radio faced strong opposition from the Radio Corporation of America (RCA) and the AM radio establishment, which saw it as a threat to their existing investments.

In 1940, RCA offered Armstrong $1 million for a nonexclusive license to his FM patents with no royalties.
He refused, insisting that RCA pay royalties as other licensees did.

Years of lawsuits followed, with RCA President David Sarnoff applying financial and legal pressure while Armstrong fought to defend his work and reputation.

The struggle took a heavy toll on Armstrong, and by the early 1950s, he was financially strained and physically exhausted.

Edwin Howard Armstrong took his own life in New York City Feb. 1, 1954, at the age of 63.

His wife, Marion, carried on the litigation, securing several important settlements, including a significant deal with RCA.

In 1983, the US Postal Service paid tribute to Edwin Howard Armstrong with a 20-cent commemorative stamp featuring his portrait alongside an illustration of his frequency modulation invention.



Thursday, February 5, 2026

Farmers used barbed wire fences as phone lines

@Mark Ollig

During the 1880s and 1890s, most rural areas in Minnesota lacked commercial telephone service.
The Bell System, the national organization named for inventor Alexander Graham Bell, and other companies mainly served larger towns, leaving farmers without service.

Minnesota's Bell-licensed operating company was the Northwestern Telephone Exchange Company, incorporated in Minneapolis in 1878.

Farms were often miles from the nearest exchange, and pole lines across open land were costly to build and maintain, so commercial telephone companies commonly delayed rural service.

By the 1880s and 1890s, farmers in southern and western Minnesota had already installed miles of galvanized steel barbed wire fencing to manage their livestock.

Farmers realized the metal in their fences could carry electrical current, which sparked a creative idea.

They used these fences as improvised telephone lines, letting farmers talk with neighbors from their homes.

Many sources called them fence line telephone systems, which used a single-wire earth-return circuit, with the barbed wire serving as the line conductor.

A copper lead-in wire ran from the telephone’s L1 terminal and was clamped to the fence’s top strand.

The return path for speech and ringing current was the ground.

A second wire ran from the telephone’s L2 terminal to a ground rod, often an iron rod driven into the soil.

When you talked, current left your telephone, traveled along the fence to your neighbor’s telephone, went into their ground connection, passed through the earth, and returned through your ground rod to complete the circuit.

The fence line acted as a single conductor, carrying speech or ringing current, powered during conversation by dry-cell batteries inside the telephone or a nearby external battery box.

To place a call, the user turned a hand crank, generating a brief burst of higher-voltage alternating current from the magneto that rang the bells on every telephone connected to the line.

In some cases, as many as 20 telephones shared a single fence line circuit, so every party-line bell rang when someone called, unless neighbors used agreed-upon ring codes.
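
Those ring codes worked like a simple addressing scheme. The sketch below is purely hypothetical; the farm names and patterns are invented for illustration. But it captures the idea: every bell on the shared circuit rang, and each household answered only its own pattern.

```python
# Hypothetical ring codes for a shared fence line party line.
# Every telephone on the circuit rang for every call; the pattern
# of long and short rings told each household whether to answer.
ring_codes = {
    "Olson farm": "long-long",
    "Berg farm": "long-short",
    "Lind farm": "short-short-long",
}

def who_should_answer(pattern):
    """Return the households whose agreed code matches the cranked pattern."""
    return [farm for farm, code in ring_codes.items() if code == pattern]

print(who_should_answer("long-short"))  # ['Berg farm']
```

Of course, nothing stopped curious neighbors from quietly lifting their receivers anyway, which is why party lines were famous for eavesdropping.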

Improvised insulators, including glass bottle necks, leather straps, rubber hose pieces, and even corn cobs, kept the wire from touching damp wooden fence posts and leaking current to ground.

These homebuilt telephone systems created party lines that connected neighbors over long distances, allowing them to share news, organize farm work, and call for help in emergencies.

A Minnesota newspaper offered a detailed look at how fence line telephony worked.

The Freeborn County Times of Albert Lea published an article Aug. 31, 1900, describing how people were turning fence lines into working telephone lines.

The article explained that barbed wire fences were being used as conductors, allowing farmers to build low-cost telephone systems without erecting traditional pole lines.

It described a system in which the top strand of the fence line served as the main line, with insulation improvised at fence posts and crossings to prevent electrical leakage.

The paper reported that the fence line system extended about 15 miles and had been in operation since late December, with only brief interruptions, including one caused when a cow broke the wire after being struck by a train.

It said the entire line was built for less than $100, and that the telephone outfit itself cost no more than $10, highlighting how inexpensive these systems could be.

These fence line systems typically used wall-mounted, hand-cranked magneto telephones, the standard rural instruments of the era.

Dry-cell batteries inside the telephone supplied low-voltage direct current to the carbon transmitter for talking, while turning the hand crank generated a brief burst of higher-voltage alternating current to ring the bells on other phones sharing the line.

Because ground conditions affected signal quality, farmers learned that wet soil improved performance, while dry ground could weaken it, and some even joked that it was “time to go out and water the grounding rod.”

Barbed wire was not an ideal medium for carrying sound; voice quality faded over longer distances and during storms, and calls often contained some noise.

Even with these quirks, fence line telephones were still an important way for rural farms to stay connected.

Fence line telephones made it easier to get help with farm equipment, report fires, call for medical aid, share weather updates, and chat across miles of farmland in the evenings.

Linking farms along property lines, they formed local networks that served rural communities.

These homemade telephone systems brought basic communication to rural areas long before commercial telephone service reached many Minnesota farms.

By the early 1900s, fence line systems drew the attention of state lawmakers.

The St. Paul Globe reported Feb. 26, 1903, on two Minnesota House bills designed to settle disputes over fence line telephone connections.

The article said the bills were introduced by Rep. Charles H. Klein of District 25, representing Carver County, and Rep. Jacob D. Schroeder of District 14, representing Jackson County.

The proposals reflected growing pressure from rural users pushing back against Bell-affiliated telephone companies.

While both bills failed, the debate made clear that by 1903, farmers were making fence line telephone service a viable mode of communication in rural Minnesota.

An article in the Park Rapids Enterprise reported March 21, 1912, that Minnesota collected $163,052.28 in gross earnings taxes from 695 telephone companies in 1911.

Taxes ranged from 12 cents to $70,059.63.

The Carlisle Barbed Wire Telephone Company was highlighted for using fence lines instead of poles, paying 87 cents in taxes.

This shows that even fence line telephone systems were treated as taxable telephone operations under Minnesota law.

As the federal Rural Electrification Administration (REA) financed new power lines across the countryside in the late 1930s and 1940s, grounded distribution circuits often ran near older, single-wire, earth-return farm telephone fence lines.

That closeness let 60-cycle power fields (and their harmonics) couple into those fence line circuits, adding a loud hum that could make conversations extremely difficult.

Farmers began switching from fence line networks to telephone cooperatives or independent company networks, which provided clearer calls, fewer disruptions, better privacy, and connections to more people within Minnesota and other states’ networks.

Long before digital transmissions, fiber optics, and cell towers, the story of rural telecommunications in Minnesota was shaped by farmers who transformed barbed wire fences into makeshift telephone lines.



Friday, January 30, 2026

NASA’s Orion laser test from the moon

@Mark Ollig

The Orion Artemis II Optical Communications System (O2O) is an infrared laser payload for NASA’s Artemis II Orion spacecraft.

The Artemis II mission will demonstrate O2O by establishing a high-speed optical link between the moon and Earth.

The system is designed to send data to Earth at 20 to 260 megabits per second and receive at 10 to 20 megabits per second.

Put simply, it is like a high-speed fiber-optic link between the moon and Earth.

The O2O Space Terminal Element (STE) is installed on Orion’s Crew Module Adapter (CMA), the ring that connects the crew module to the European Service Module, analogous to the interface between Apollo’s Command Module (CM) and the attached Service Module (SM).

The Space Terminal Element (STE) includes a four-inch telescope on a two-axis gimbal; a modem module, which modulates and demodulates data for transmission over the laser link; and power-conversion and controller electronics integrated into the Crew Module Adapter (CMA).

Orion’s S-band radio link operates at two to four gigahertz, with about one to two megabits per second downlink from lunar distance for telemetry and data, and uplink for commands.

O2O also uses the Modular, Agile, Scalable Optical Terminal (MAScOT) inside the Space Terminal Element.

Developed by the Massachusetts Institute of Technology (MIT) Lincoln Laboratory, MAScOT is a compact laser communications terminal designed for spacecraft, including the Orion spacecraft.

NASA and MIT Lincoln Laboratory describe MAScOT as about the size of a house cat.

Its gimbal keeps the infrared beam locked onto an optical ground station across the Earth-moon span.

The data rate jump comes from moving from radio frequencies to light.

O2O employs a 1,550-nanometer infrared laser, which operates at approximately 193 terahertz. This light-wave frequency provides significantly more usable bandwidth than S-band.
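
That 193-terahertz figure follows directly from the wavelength, using the standard relation frequency equals the speed of light divided by wavelength:

```python
# Carrier frequency of a 1,550-nanometer infrared laser.
c = 299_792_458              # speed of light, meters per second
wavelength = 1550e-9         # 1,550 nm expressed in meters
freq_thz = c / wavelength / 1e12
print(round(freq_thz, 1))    # ≈ 193.4 THz, versus 2 to 4 GHz for S-band
```

A carrier tens of thousands of times higher in frequency than S-band is what makes the much wider usable bandwidth possible.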

One way to compare links is by daily data returns.

In a NASA comparison for a 10-day Artemis II mission, S-band alone returns about seven gigabytes per day.

That can leave hundreds of gigabytes still stored onboard at landing.

Add one hour per day of optical downlink at 80 megabits per second (Mbps), and the daily data return rises to about 36 gigabytes.

At 260 megabits per second, one hour moves about 117 gigabytes, and two hours clears more than 200 gigabytes.
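
Those totals check out with simple arithmetic in decimal gigabytes, ignoring protocol overhead; the helper function below is my own sketch, not NASA's:

```python
# Data moved by a downlink: rate in megabits per second, duration in hours.
# Divide by 8 to convert bits to bytes and by 1,000 to reach gigabytes.
def downlink_gb(rate_mbps, hours):
    return rate_mbps * hours * 3600 / 8 / 1000

print(downlink_gb(80, 1))    # 36.0 GB in one hour at 80 Mbps
print(downlink_gb(260, 1))   # 117.0 GB in one hour at 260 Mbps
print(downlink_gb(260, 2))   # 234.0 GB in two hours
```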

That changes what can be sent during a mission.

NASA briefings suggest O2O can support 4K video from lunar distance at 260 megabits per second, plus still images, procedures, flight plans, and other files.

Laser beams are more focused than radio beams, allowing signals to travel straight to Earth and arrive stronger.

The spacecraft and ground station need very accurate pointing and tracking.

For waveforms and coding, O2O follows the standards of the Consultative Committee for Space Data Systems (CCSDS).

It uses serially concatenated pulse-position modulation (SCPPM) to improve photon efficiency.

O2O employs CCSDS Blue Book standards for optical communications.

In everyday terms, O2O carries a steady stream of internet-style data between the spacecraft and Earth.

Ethernet frames are packaged, encoded, transmitted, and reconstructed on the ground, with higher-layer protocols and addressing unchanged.

To maintain a strong connection, NASA uses at least two ground stations to receive laser signals.

One is at White Sands Complex in New Mexico, and another is at Table Mountain in California. Having more than one site is important because clouds or bad weather can block the laser beam.

By spreading out the stations, NASA reduces the odds that every location will be clouded over at the same time.


This laser system, called O2O, is just one part of the larger network that helps send information between space and Earth.

Radio remains the all-weather workhorse for command and safety, while higher-capacity links handle bulk data.

Orion’s parachutes are packed in the top of the crew capsule, but O2O is mounted lower on the ring that connects the capsule to its service module.

The ring is jettisoned before reentry, so O2O is gone before the parachutes deploy.

Some Minnesota companies are involved in sending Orion around the moon.

Minnetonka-based Stratasys worked with Lockheed Martin to produce more than 100 Orion flight parts using Antero materials.

TE Connectivity in Shakopee provides radiation-hardened connectors for Orion’s deep space systems, and ToolDiscounter in Saint Paul supplies assembly tools for NASA.

PAR Systems in Shoreview is a listed NASA Orion supplier specializing in remote manipulators and robotics.

Honeywell supports Artemis through Lockheed Martin by providing Orion’s guidance, navigation, data handling, and flight software.

NASA created O2O using CCSDS optical protocols developed for lunar and deep-space laser communications, building on lessons from earlier optical demonstrations.

Think of it as an Ethernet “pass-through.” Orion hands off regular Ethernet traffic, the laser link carries it across the gap, and the ground side puts those same frames back onto the network.

Orion’s O2O payload is flying to validate higher-rate optical communications beyond Earth orbit.

Plus, it supports NASA’s longer-term work on space networking, including delay-tolerant methods used when links drop in and out.

Today, NASA says the Artemis II spacecraft, named Integrity, should be set to launch no earlier than Friday, Feb. 6, on an approximately 10-day mission.

On Jan. 17 of this year, NASA’s Space Launch System (SLS) rocket and Orion spacecraft, a 322-foot-tall integrated stack, rolled out of the Vehicle Assembly Building (VAB) to Launch Pad 39B at the Kennedy Space Center, FL.

Under NASA’s plan, the Artemis II wet dress rehearsal is set for early February to fuel the Space Launch System (SLS) rocket and Orion spacecraft and run the countdown to just before an actual liftoff.

March and April will be considered if the Friday, Feb. 6 launch date is unattainable.

Pending a “go for launch” after the wet dress rehearsal, Artemis II would be the first crewed trip around the moon since Apollo 17 in 1972.


Thursday, January 22, 2026

Consumer Electronics Show 2026 showcased ‘physical’ AI

@Mark Ollig

This year’s Consumer Electronics Show, or CES, hosted Jan. 6 to 9 in Las Vegas, NV, highlighted a shift in artificial intelligence (AI) away from purely digital applications.

AI technology is moving inside machines designed to operate in the physical world.

Much of the attention focused on the computing platforms and processors that power physical AI systems, including robots, vehicles, and industrial equipment.

NVIDIA, based in Santa Clara, CA, develops chips and software for high-end computing used in AI, scientific research, and gaming.

At CES, the company introduced Rubin, a new computing platform designed to support large-scale robotics, automation, and other real-world AI systems.

Named after astronomer Vera Florence Cooper Rubin (1928 to 2016), the platform consists of six tightly integrated chips, including the Vera central processing unit (CPU), which uses 88 custom processor cores.

NVIDIA said Rubin is designed to reduce the time and cost required to train and run advanced AI systems, addressing the rapidly growing demand for computing power.

The Rubin platform uses High Bandwidth Memory 4 (HBM4), delivering up to 22 terabytes per second of memory bandwidth.

Boston Dynamics used CES to unveil a fully electric, production-oriented version of its Atlas robot.

Atlas stands about 6.2 feet tall, has a reach of roughly 7.5 feet, can lift up to 110 pounds, and is designed for industrial environments.

LG Electronics, a South Korean electronics corporation, introduced the CLOiD humanoid home robot at CES this year.

The name combines LG’s CLOi brand with the letter D for dynamic and the suffix -oid for humanoid.

The wheeled robot has a human-like torso with articulated arms and five-fingered hands that mimic human dexterity.

Powered by vision-based physical AI, CLOiD navigates over household surfaces and can handle chores, such as folding laundry.

It can also open the refrigerator, identify various drinks or other items, and bring the correct one as requested by the user.

Unitree Robotics, a Chinese company known for its quadruped robots, showcased its humanoid lineup at CES 2026.

Its G1 model performed dance and boxing routines to demonstrate advanced motor control.

The G1 employs proprietary AI and is trained through imitation and reinforcement learning to maintain balance and perform complex movements.

Intel, the American technology company based in Santa Clara, unveiled its Core Ultra Series 3 processors, codenamed Panther Lake, which are the first consumer chips made with the Intel 18A manufacturing process.

The platform features a dedicated neural processing unit (NPU) that can perform up to 50 trillion operations per second (TOPS).

Combined with the CPU and integrated GPU, the total platform performance can reach 180 TOPS, enabling advanced capabilities like real-time voice processing and high-fidelity image generation, while powering next-generation AI computing applications.

Advanced Micro Devices (AMD), based in Santa Clara, showcased its Ryzen AI Max+ processor.

This processor supports up to 128 gigabytes of unified memory, enabling the CPU and GPU to share it and improving performance for compute-intensive applications.

AMD said some Ryzen AI laptop systems are expected to start at $499.

Qualcomm, an American technology corporation headquartered in San Diego, CA, introduced the Snapdragon X2 Plus processor, designed for lower-cost laptops.

The company said the chip can handle up to 80 trillion operations per second while using less power than earlier designs.

Wearable devices were also featured across the CES 2026 show floor.

Razer, a gaming hardware and software company based in Irvine, CA, displayed concept products such as Project AVA, a desktop companion with a holographic avatar for user interaction.

It also showed Project Motoko, a concept wireless headset that uses two forward-facing cameras to capture what the wearer sees, supporting on-device assistance features and research into machine vision and robotics.

Nirva AI Jewelry, a wearable technology company headquartered in San Francisco, CA, showed off a device that aims to spot activity patterns by listening to and tracking the wearer’s movement.

Lenovo, a global technology company incorporated in Hong Kong with headquarters in Beijing and North Carolina, displayed concept smart glasses designed to provide real-time translation and visual recognition.

In transportation and industrial technology, NVIDIA introduced Alpamayo, an AI model and development stack intended to support autonomous driving systems.

NVIDIA said the technology will first appear in the Mercedes-Benz CLA.

Ford Motor Co. plans to roll out a new “in-app helper” early this year through the Ford and Lincoln smartphone apps, reaching about eight million existing customers.

Ford says it will move into vehicle dashboards in 2027.

The app will pull from vehicle data and owner information to answer questions about tire pressure, oil life, and towing capacity.

Ford’s current BlueCruise is a level two hands-free driving feature that still requires the driver to watch the road.

Ford says its 2028 update is aimed at level three capability, allowing the driver to look away at times on approved highways under specific conditions.

Caterpillar Inc. introduced a suite of industrial tools developed in collaboration with NVIDIA.

The Cat AI Assistant allows equipment operators to interact directly with machinery to diagnose issues, plan maintenance, and locate replacement parts.

Caterpillar said the system uses real-time data from the equipment itself.

This year’s CES showcased how AI is moving beyond computing screens and data servers into robots, vehicles, and machines built to perform real-world work in a physical AI environment.



Thursday, January 15, 2026

How CES showcased technology

@Mark Ollig

The first Consumer Electronics Show (CES) was hosted June 25 to 28, 1967, in New York City, NY.

The Electronic Industries Association (EIA), which later became the Consumer Technology Association (CTA), organized the event at the New York Hilton and Americana hotels.

At the first CES, 117 exhibitors took part, and 17,500 people attended.
The 1967 show in New York highlighted the future of technology with compact devices such as transistor radios, stereos, and small black-and-white televisions.

One highlight was Sony’s New York exhibition during CES week, which showed a seven-inch color TV with a Chromatron tube and a one-inch black-and-white TV.

These were examples of advanced engineering that were not yet ready for consumers.

The June 19, 1967, issue of Merchandising Week also mentioned Sony’s portable videotape recorder and camera set.

It was a lightweight, battery-powered system for outdoor use, expected to cost about $1,000 (about $9,700 today) and to launch later that year.

By the mid-1970s, CES in Chicago became the site of a major format battle in home entertainment between Sony’s Betamax and JVC’s Video Home System (VHS).

In 1972, Philips introduced its N1500 videocassette recorder, one of the first consumer video cassette recorders (VCRs).

In the years that followed, Sony’s Betamax and JVC’s VHS competed, with recording time and price as the main factors.

While Betamax was often seen as technically superior, VHS offered two hours of recording time, enough for a movie or a sporting event, compared with Betamax’s original one hour.

JVC licensed its technology to other companies, which lowered costs and helped VHS take over the market.

CES became a place to see early video game consoles and home computers, as companies brought gaming and computing into people’s homes.

In 1976, while attending Brainerd High School, I took an audiovisual television class.

We could earn extra credit by recording educational TV programs in the evenings for the school library.

I used a Sony Betamax video recorder and 60-minute Betamax tapes from Sony and 3M.

The picture and sound quality were impressive at the time, and in case you were wondering, I did earn an “A” in the class, though I am not bragging.

CES was first hosted in Las Vegas, NV, in 1978, and the city later became the show’s main location.

In the 1980s and 1990s, CES introduced consumers to compact discs, early camcorders, and home video game systems such as the Nintendo Entertainment System.

The event also showed how the industry was moving from bulky picture tubes to flat-panel and high-definition TVs.

At the 1996 CES, the Digital Versatile Disc (DVD) was a major highlight. DVDs soon helped bring about the end of VHS tapes.

In the early 2000s, CES was the scene of another format battle, this time between Toshiba’s high-definition DVD (HD DVD) and Sony’s Blu-ray Disc.

The contest ended when movie studios chose Blu-ray over HD DVD.

Sarah Szabo served as public relations event manager for the Consumer Electronics Association during the 2007 CES, hosted Jan. 8 to 11 in Las Vegas.

In 2007, I had the chance to ask Szabo a few questions about that year’s show.

She said we could expect major advances in consumer electronics, with trends focused on better device connectivity, greater portability, and closer links between technology and content.

That column from Jan. 15, 2007, is available online at https://bit.ly/453PtLM.

In the 2000s, CES became a key venue for launching smartphones, tablets, and other mobile devices, demonstrating how computing, telecommunications, and consumer electronics were converging.

In the 2010s and 2020s, CES increasingly focused on automotive technology, digital health, and smart-home products.

The 2026 CES brought together more than 4,100 exhibitors and more than 148,000 attendees across approximately 2.6 million square feet at the Las Vegas Convention Center and The Venetian Resort.

More than 6,900 journalists, content creators, and industry analysts from around the world covered the event.

This year’s CES put the spotlight on advances in artificial intelligence (AI), digital health, robotics, and automotive technology.

Many of the event’s exhibitors presented how AI is finding its way into more real-world applications.

During this year’s CES, Intel unveiled its Core Ultra Series 3 “Panther Lake” processors, designed for everyday laptops and personal computers (PCs).

RCA, the Radio Corporation of America, also celebrated its 100th anniversary with a nostalgic exhibit of vintage analog televisions from the late 1940s and 1950s.

Visitors were said to be able to feel static electricity on the glass screens and see the warm glow of cathode-ray tube (CRT) technology.

A 1946 RCA 630-TS, the first mass-produced TV, and the 1946 General Electric (GE) 950, one of the first color models, were there, too.

It was the 59th CES, and even amid today’s AI headlines, those old TV screens reminded folks how far consumer technology has come.




Thursday, January 8, 2026

‘Classified’ fiber-optics used with Apollo 11’s camera

@Mark Ollig

In 1841, Swiss physicist Jean-Daniel Colladon at the University of Geneva demonstrated to his students that light could be guided through total internal reflection.

This phenomenon occurs when a ray of light repeatedly bounces within a transparent substance, following its contours instead of escaping.

Colladon illustrated this by focusing sunlight through a lens into a tank and then directing the beam into a jet of water flowing out of the tank.

The light remained trapped within the flowing water, guided along its curve by repeated internal reflections at the boundary between the water and the surrounding air.

The water appeared to glow like a luminous glass pipe, demonstrating how total internal reflection can guide light along a curved path.

Colladon later described the experiment, titled “On the reflections of a ray of light inside a parabolic liquid stream,” in a brief note to the French Academy of Sciences, published in Comptes Rendus Oct. 24, 1842 (Volume 15, pages 800 to 802).

In it, he wrote, “I managed to illuminate the interior of a stream in a dark space. I have discovered that this strange arrangement offers in results one of the most beautiful, and most curious experiments that one can perform in a course on Optics.”

Colladon’s experiment showed that light can be guided, the same basic principle behind today’s fiber-optic cables, where light travels through ultra-thin glass strands about the width of a human hair.
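For readers who enjoy the numbers, here is a short Python sketch of the critical-angle formula behind total internal reflection; the refractive indices are illustrative textbook values, not figures from Colladon's account:

```python
import math

# Total internal reflection occurs when light traveling inside a denser
# medium strikes the boundary at an angle (measured from the normal)
# greater than the critical angle, arcsin(n2 / n1).
n_water = 1.33  # approximate refractive index of water
n_air = 1.00    # refractive index of air

critical_angle = math.degrees(math.asin(n_air / n_water))
print(f"Critical angle at a water-air boundary: {critical_angle:.1f} degrees")

# Rays striking the boundary at more than this angle from the normal are
# reflected back inside, which is how Colladon's curving water jet kept
# the sunlight trapped along its length.
```

Roughly 49 degrees, under these assumed values; any ray grazing the water-air boundary more steeply than that stays inside the stream.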

By the 1960s, this light-guiding idea had moved from the lecture hall into space-age hardware.

When Apollo 11 touched down on the moon July 20, 1969, it brought along a Westinghouse television camera with a fiber-optic element inside its imaging tube, using total internal reflection to direct light.

Westinghouse both designed and manufactured this fiber-optic component for the Apollo 11 lunar TV camera.

The project was overseen at the company’s Defense and Space Center in Baltimore, MD, while the fiber-optic faceplate for the SEC (secondary electron conduction) tube was produced at Westinghouse’s electron tube division.

The Manned Spacecraft Center in Houston, TX, issued an acceptance test plan for the Lunar Television Camera March 12, 1968, classifying its fiber-optic section as confidential.

This designation was tied to security controls surrounding the SEC image tube technology in the camera, a proprietary low-light imaging design.

The test plan required that any camera containing the fiber-optic part be treated as classified if it was removed for repair or maintenance, and that it be kept in a secure place when not in use.

Because of the classified fiber-optic assembly inside the SEC image tube, only authorized and supervised staff could handle the camera.

The Apollo 11 lunar camera, weighing about 7.25 pounds, captured video on the moon that was transmitted to Earth receiving stations, where its signal was converted into a standard format for broadcasting.

In NASA photo AS11-40-5907, you can see the television camera set up on the lunar surface for the Apollo 11 mission.

The camera sent a slow-scan black-and-white picture at about 320 lines and 10 frames per second, which seems crude today but was state-of-the-art in 1969.

Because the lunar module’s radio system could not carry a standard broadcast television signal, the camera used this slow-scan format.

At NASA’s ground receiving stations, specialized scan-conversion equipment converted the slow-scan signal into a standard television format for broadcast to viewers on Earth.
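A rough, back-of-the-envelope Python sketch shows the gap between the two formats; the 320-line and 10-frame figures are approximate, and the comparison values are the nominal U.S. NTSC standard of the day:

```python
# Illustrative comparison of the Apollo slow-scan format with the
# NTSC broadcast standard used by U.S. ground stations in 1969.
slow_scan_lines, slow_scan_fps = 320, 10   # lunar camera (approximate)
ntsc_lines, ntsc_fps = 525, 30             # nominal U.S. broadcast standard

slow_rate = slow_scan_lines * slow_scan_fps  # scan lines per second
ntsc_rate = ntsc_lines * ntsc_fps

print(f"Slow-scan: {slow_rate} lines/s; NTSC: {ntsc_rate} lines/s")
print(f"NTSC carries roughly {ntsc_rate / slow_rate:.1f}x the line rate")
```

That several-fold difference is why the signal could not simply be rebroadcast; the scan converters had to rebuild each sparse, slow frame into the faster, denser format home receivers expected.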

On the Apollo 11 lunar module descent stage, the camera was stowed in the Modular Equipment Stowage Assembly, or MESA.

The MESA was a fold-down storage compartment on the lunar module that carried tools, scientific equipment, and the lunar surface television camera.

The Apollo 11 lunar module Eagle landed on the moon at 3:17:40 p.m. CDT (Central Daylight Time, Minnesota time) July 20, 1969.

That evening, astronaut Neil Armstrong opened the hatch of the lunar module and pulled a cord next to the ladder to release the MESA.

This step set up the television camera so it could show his descent live as he climbed down the ladder.

A TV cable linked the camera to the lunar module, providing power and sending the video signal through the lunar module’s communication system to be broadcast to Earth.

Inside the lunar module, astronaut Buzz Aldrin verified that the TV circuit breaker was in.

Later during the moonwalk, Armstrong removed the camera from its MESA mount, carried it about 30 feet from the lunar module, and set it on a tripod, keeping the cable connected so the signal could be sent back to Earth.

The following excerpts are from NASA’s official “Apollo 11 Technical Air-to-Ground Voice Transcription.”
Mission transcript times have been converted to Central time (CDT).

Key: CDR, Commander Neil A. Armstrong; LMP, Lunar Module Pilot Edwin E. Aldrin Jr.; CapCom, capsule communicator in Mission Control Center, Houston; TV, television camera in the lunar module’s Modular Equipment Stowage Assembly (MESA) bay.

9:52:58 p.m. CDR: “Okay. Can you pull the door open a little more?”

9:53:00 p.m. LMP: “All right.”

9:53:03 p.m. CDR: “Okay.”

9:53:07 p.m. LMP: “Did you get the MESA out?”

9:53:09 p.m. CDR: “I’m going to pull it now.”

9:53:18 p.m. CDR: “Houston, the MESA came down all right.”

9:53:22 p.m. CapCom: “This is Houston, Roger. We copy. And we’re standing by for your TV.”

9:53:39 p.m. CDR: “Houston, this is Neil. Radio check.”

9:53:42 p.m. CapCom: “Neil, this is Houston. Loud and clear. Break. Break. Buzz, this is Houston. Radio check, and verify TV circuit breaker in.”

9:53:54 p.m. LMP: “Roger. TV circuit breaker is in, and read you five square.”

9:54:00 p.m. CapCom: “Roger. We’re getting a picture on the TV.”

9:54:09 p.m. LMP: “You got a good picture, huh?”

9:54:11 p.m. CapCom: “There’s a great deal of contrast in it, and currently it’s upside down on our monitor, but we can make out a fair amount of detail.”

9:54:28 p.m. LMP: “Okay. Will you verify the position, the opening I ought to have on the camera?”

9:54:34 p.m. CapCom: “Stand by.”

9:54:48 p.m. CapCom: “Okay. Neil, we can see you coming down the ladder now.”

When the lunar landing mission ended, the camera was left at the Apollo 11 landing site, with its classified fiber-optic portion inside the camera’s housing.

Jean-Daniel Colladon was born Dec. 15, 1802, and died June 30, 1893.




Thursday, January 1, 2026

Birth of the television age: 1926

@Mark Ollig

A hundred years ago, electricity, telecommunications, and radio were shaping the world of technology.

A new broadcast medium was emerging from a small attic in London’s Soho district.

John Logie Baird, born Aug. 13, 1888, in Helensburgh, Scotland, became fascinated early on with sending images over wires.

After World War I, he moved to London and set up a laboratory in two attic rooms above a shop at 22 Frith Street.

There, he built a prototype television from whatever he could find, including bicycle wheels, biscuit tins, old lenses, and leftover radio parts.

His system used a spinning Nipkow disk to scan images one line at a time, turning them into flashes of light.

A photoelectric cell converted these flashes into electrical signals, which were then amplified by vacuum-tube circuits similar to those found in 1920s radios.

Baird achieved what is often considered his first clearly successful demonstration of television Oct. 2, 1925.

In 1925, others were also working on television, but Baird demonstrated a working system that transmitted recognizable moving images from a scanner to a receiver.

This breakthrough brought him significant recognition.

Baird used an early mechanical scan of about 30 lines at roughly five images per second to transmit a small, dim, high-contrast picture of a ventriloquist’s dummy head he called Stooky Bill.

At the receiving end, a synchronized disk and a modulated light source recreated the image on a ground-glass screen.

Curious to see how a human face would appear, Baird later asked 20-year-old office worker William Edward Taynton to sit under bright lamps in front of the scanning disk.

Taynton is often regarded as the first person televised in an identifiable, full-tone image.

Observers said the pictures were dim and sometimes shaky, but faces were still easy to recognize, complete with movement and expressions.

Photoelectric cells in the 1920s were not very light-sensitive, so human faces had to be lit intensely for the cell to produce a strong enough signal for recognizable features.

Baird gave a public demonstration of his working television system Jan. 26, 1926, in his upstairs laboratory at 22 Frith Street.

Although he had produced experimental images in 1925, the 1926 demonstration brought television into public view.

Baird named his electromechanical device the “Televisor.”

The demonstration showed that live moving pictures could be scanned, converted into an electrical signal, and sent to a receiver in the same room.

Baird’s improved 1926 system used a disk with about 30 holes arranged in a spiral, spinning at about 750 revolutions per minute to produce a 30-line image at roughly 12.5 images per second.
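The arithmetic behind those figures is simple; here is a brief Python sketch using the commonly cited disk speed and hole count:

```python
# One revolution of the spiral Nipkow disk scans one complete image,
# so the image rate is just the disk speed in revolutions per second.
rpm = 750    # disk speed cited for Baird's 1926 system
holes = 30   # one hole per scan line, arranged in a spiral

images_per_second = rpm / 60
lines_per_second = images_per_second * holes
print(f"{images_per_second} images/s, {lines_per_second:.0f} lines/s")
```

At 750 revolutions per minute, the disk turns 12.5 times each second, and each turn sweeps all 30 lines past the viewer, giving the roughly 12.5 images per second quoted above.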

Most sources do not list the exact operating voltages for Baird’s 1926 equipment, but vacuum-tube circuits of the period typically used low-voltage filaments, often about four to six volts.

Higher direct-current plate supplies could range from several tens to several hundred volts.

On the receiving end, a neon tube provided the light for the ground-glass viewing screen.

Neon lights respond quickly to changes in signal strength, making them well-suited for mechanical scanning.

The resulting image was small, dim, and tinted orange, a common characteristic of early neon-lit television pictures.

The Times of London reported Jan. 28, 1926, that the demonstration showed it was possible to transmit and instantly reproduce movement details.

The reporter wrote that the image was “faint and often blurred,” but it “substantiated a claim that through the ‘Televisor’ ... it is possible to transmit and reproduce instantly the details of movement, and such things as the play of expression on the face.”

By spring 1926, Baird had begun demonstrating that sound could accompany the images.

The Minneapolis Sunday Tribune reported May 23, 1926, “To demonstrate his televisor, Mr. Baird speaks into the transmitter of the device located in his laboratory.”

The paper added, “The inventor’s voice is heard and the picture of his face is seen very clearly, witnesses say.”

The Austin Daily Herald reported Aug. 27, 1926, that British radio authorities had granted Baird licenses for television, allowing him to build and sell televisors and establish a service that let owners “look in” as well as listen to programs.

That fall, newspapers continued searching for words to describe what audiences were seeing on Baird’s televisor.

The Bayonne Evening News in New Jersey referred to the technology as “radiovision” Nov. 19, 1926, while other newspapers described it as “seeing by radio.”

A report dated Sept. 11, 1926, in the Illustrated London News presented readers with a detailed technical diagram of Baird’s television system.

It displayed and labeled Baird’s image transmitting station, showing a large spinning disk, rows of lamps illuminating the subject, and amplifiers boosting the signal.

The receiver was depicted as a wooden cabinet with its own spinning shutter, controls, and a small glass viewing screen.

The newspaper called television a “new radio miracle” and predicted the technology would “eventually give us sight as well as sound by wireless.”

Baird continued testing the limits of his system.

Later in 1926, he demonstrated that television signals could travel beyond the laboratory, either over wires or by wireless radio on medium-wave frequencies of about 200 meters (about 1.5 MHz).
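A quick Python sketch of the standard wavelength-to-frequency conversion confirms that figure:

```python
# Frequency and wavelength are related by f = c / wavelength.
c = 299_792_458      # speed of light in a vacuum, meters per second
wavelength_m = 200   # medium-wave band Baird reportedly used

freq_mhz = c / wavelength_m / 1e6
print(f"{wavelength_m} m corresponds to about {freq_mhz:.2f} MHz")
```

A 200-meter wave works out to just about 1.5 MHz, near the top of what we now call the AM broadcast band.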

In late May 1927, Baird proved that television signals could be transmitted over specially conditioned telephone lines, sending images from London to Glasgow, about 438 miles through the long-distance telephone circuit.

The images were described as grainy, but the transmission over the telephone circuits worked.

Baird’s mechanical system established that moving images could be scanned, transmitted, and reassembled at a distance.

By the late 1930s, television broadcasting had fully shifted to electronic systems, leaving behind the spinning disks of the 1920s.

John Logie Baird died June 14, 1946, at age 57, in Bexhill, Sussex, England.

Today, the former laboratory at 22 Frith St. in Soho is home to Bar Italia, a cafe.

A plaque installed in January 2017 by the Institute of Electrical and Electronics Engineers marks the building as the site of the “First Public Demonstration of Television, 1926.”