Radio Astronomy: Listening to the Universe – Star Trails: A Weekly Astronomy Podcast
Episode 107
In this episode, we move beyond what the eye can see and into a universe that has been quietly speaking all along.
Radio astronomy has transformed our understanding of the cosmos, not by capturing images, but by detecting faint signals that have traveled across space for billions of years. From the
accidental discovery of radio waves from the Milky Way by Karl Jansky, to the detection of the afterglow of the Big Bang, this field has revealed a hidden layer of reality that optical astronomy alone could never uncover.
We’ll explore how radio telescopes actually work, from signal capture and amplification to digitization and frequency analysis using the Fast Fourier Transform. Along the way, we’ll break down concepts like interferometry, beamforming, and deconvolution, techniques that allow
astronomers to reconstruct images from incomplete data and even map the structure of our own galaxy using hydrogen emissions.
We’ll also take a look at one of the most remarkable achievements in modern astronomy: the first image of a black hole, created by the Event Horizon Telescope, a global network of
observatories that effectively turned Earth itself into a single telescope.
And we’ll connect these advanced techniques back to everyday life. The same math and signal processing used to study the universe are also at work in your phone, your Wi-Fi router, and
your headphones.
Later in the show, we reflect on a striking new image from the Artemis II mission, and step outside for a look at this week’s night sky, featuring dark skies, distant galaxies, and a subtle
planetary alignment for early risers.
Links
Transcript
Howdy stargazers and welcome to this episode of Star Trails. My name is Drew and I’ll be your guide to the night sky for the week of April 12th through the 18th.
This week we’re talking about radio astronomy and the processes that allow us to see everything from the afterglow of the Big Bang to the shape of black holes. Later in the show I’ll talk about some of the photography from the Artemis II mission, and we’ll check in with this week’s sky, which offers some gems for early risers.
Whether you’re tuning in from the backyard or the balcony, I’m glad you’re here. So grab a comfortable spot under the night sky, and let’s get started!
Hopefully you enjoyed the bonus episode I released last week about the David Bowie song, “Space Oddity.” I know that isn’t everyone’s cup of tea, which is why I didn’t release it as part of the “main sequence” of shows.
Anyway, I have news to report related to that episode: It looks like Bowie made it back into space. At the end of that bonus episode I mentioned that I didn’t know if the Artemis II astronauts were listening to “Space Oddity,” but as it turns out, they did listen to a song featuring Bowie. One of the tracks on their “wake-up” playlist published by NASA was “Under Pressure,” the 1981 collaboration between Queen and Bowie. Here’s another wild connection: Queen guitarist, Sir Brian May, has a doctorate in astrophysics.
These songs are selected by Mission Control specialists, and the wake-up songs are a tradition from the Apollo era to help astronauts stay connected to humanity. Other artists featured on the Artemis playlist included Chappell Roan, John Legend and more.
The historic Artemis II mission ended Friday after a successful splashdown in the Pacific. I’m looking forward to combing through the images they brought back. Later in the show, I’ll talk about one particular image they’ve already shared. In the meantime, I’ll include a link in the show notes to a playlist of their wake-up songs.
Speaking of sounds in space, let’s talk about radio astronomy.
For most of human history, astronomy meant one thing: looking up. Watching the sky with our eyes and later, with telescopes that extended those eyes, pulling distant light just a little closer. And through that simple act of looking, we’ve learned an extraordinary amount about the universe.
Visual astronomy has given us the structure of the cosmos. It’s how we mapped the stars, traced the motions of the planets, and discovered entire galaxies far beyond our own. It’s how we study nebulae, the birthplaces of stars, and the remnants of stellar death. And with techniques like spectroscopy, we’ve even learned what distant objects are made of, their temperatures, and even how fast they’re moving away from us.
Long-exposure imaging pushed that even further. A camera left open to the sky for minutes or hours reveals structures so faint they’re completely invisible to the human eye, swirling gas clouds, distant galaxies, delicate filaments of cosmic structure. In many ways, visual astronomy is the foundation of everything we know.
But it’s only part of the story.
Visible light, the tiny slice of the electromagnetic spectrum that our eyes can detect, is just that, a slice. The universe is constantly emitting energy across a vast range of wavelengths, from high-energy gamma rays to long, slow radio waves. And for most of human history, we were completely deaf to everything beyond that narrow band.
That changed in the 1930s, not in an observatory, but in a field in New Jersey. An engineer named Karl Jansky was working for Bell Telephone Laboratories, trying to track down sources of radio interference. He built an antenna that could sweep the sky, and at first, he found what he expected, thunderstorms, distant noise, human-made signals. But then there was something else: a faint hiss that repeated not every 24 hours, but every 23 hours and 56 minutes: a sidereal day. That’s the time it takes Earth to complete a rotation relative to distant stars.
Jansky realized that signal wasn’t coming from Earth at all. It was coming from the center of the Milky Way. For the first time in history, we had detected the universe not through light, but through radio waves.
A few decades later, Arno Penzias and Robert Wilson stumbled onto something even more profound: the Cosmic Microwave Background, the faint afterglow of the Big Bang itself.
This discovery is a cornerstone of modern cosmology. The cosmic microwave background is remarkably uniform, but not perfectly so. Tiny fluctuations in its temperature, variations of just a few millionths of a degree, encode information about the early universe. From those patterns, we’ve been able to determine the age of the universe, its composition, and even its large-scale geometry. The CMB is a snapshot of the universe when it was just 380,000 years old, and it still surrounds us today, filling all of space.
At that point, astronomy changed forever. We were no longer just looking at the universe. We were beginning to listen to it.
Radio astronomy is the study of the universe through radio waves: long-wavelength electromagnetic radiation that behaves very differently from visible light. Instead of lenses or polished mirrors, radio telescopes use large dishes that collect and focus signals onto a receiver. These signals are incredibly faint, buried in noise, but they carry information that optical telescopes simply can’t access.
Radio waves pass through dust clouds that block visible light, allowing us to see into star-forming regions and across the structure of our galaxy. They can be observed day or night, often through conditions that would make optical observing impossible. Radio astronomy reveals a hidden universe layered on top of the one we see.
Facilities like the Arecibo Observatory, the Very Large Array, and the FAST telescope have allowed us to map hydrogen gas, study pulsars (first identified by Jocelyn Bell Burnell in 1967), and probe the environments around black holes and distant galaxies.
But the real story of radio astronomy is how we turn those signals into something we can understand.
A radio telescope doesn’t take pictures. It measures voltage over time, tiny electrical fluctuations caused by incoming radio waves. When a signal reaches the dish, it’s focused onto a receiver and converted into an electrical waveform. That signal is amplified using low-noise amplifiers, then digitized into a stream of numbers.
At this point, we don’t have an image. We have data.
To make sense of that data, astronomers use an algorithm called the Fast Fourier Transform. The math behind the Fourier Transform dates back to Joseph Fourier in the early 1800s, who showed that complex signals can be broken down into simple sine and cosine waves.
The modern Fast Fourier Transform, published by James Cooley and John Tukey in 1965, made this process dramatically faster and practical for real-time computing.
Essentially, the FFT transforms a signal from the time domain into the frequency domain, revealing what frequencies are present and how strong they are. Without it, there would be no radio astronomy.
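If you want to see that time-to-frequency jump for yourself, here’s a minimal sketch using NumPy. The 50 Hz tone and the noise level are invented example values, not anything an actual telescope would record:

```python
import numpy as np

# Sample a signal for 1 second at 1,000 samples per second.
fs = 1000                      # sampling rate (Hz)
t = np.arange(fs) / fs         # time axis: 0.000 s to 0.999 s

# A "hidden" 50 Hz tone buried in random noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(fs)

# FFT: time domain -> frequency domain.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(fs, d=1 / fs)

# The strongest frequency bin sits at 50 Hz, even though the tone
# is invisible in the raw, noisy time-domain samples.
peak_freq = freqs[np.argmax(spectrum)]
print(peak_freq)   # 50.0
```

The time-domain samples look like pure static to the eye; the frequency domain makes the buried tone obvious. That’s exactly why radio astronomers live in the frequency domain.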
It’s essential because specific frequencies correspond to physical phenomena. One of the most important examples is the hydrogen line at 1420 megahertz. Neutral hydrogen atoms naturally emit radiation at this frequency, allowing astronomers to map vast clouds of gas across the galaxy. And thanks to the Doppler effect, shifts in that frequency reveal motion, rotation, expansion, and dynamics on a galactic scale.
In fact, much of what we know about the structure of the Milky Way comes from this exact technique. By mapping hydrogen emissions across the sky, and measuring how those signals shift, we’ve been able to trace out the spiral arms of our own galaxy, even though we’re embedded inside it.
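The velocity measurement behind that mapping is a short calculation. Here’s a sketch of the non-relativistic Doppler formula applied to the hydrogen line; the 0.2 MHz shift is an invented example value, not a real observation:

```python
# Radial velocity from a Doppler-shifted hydrogen line.
C = 299_792.458            # speed of light (km/s)
F_REST = 1420.40575        # rest frequency of the hydrogen line (MHz)

def radial_velocity_km_s(observed_mhz: float) -> float:
    """Non-relativistic Doppler: v = c * (f_rest - f_obs) / f_rest.
    Positive result = source receding (shifted to a lower frequency)."""
    return C * (F_REST - observed_mhz) / F_REST

# Example: a gas cloud observed 0.2 MHz below the rest frequency
# is receding at roughly 42 km/s.
v = radial_velocity_km_s(1420.20575)
print(round(v, 1))
```

Repeat that measurement across millions of points on the sky and you get a velocity map of the galaxy’s hydrogen, which is how the spiral arms were traced from the inside.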
But things get even more powerful when multiple telescopes are combined. In systems like the Very Large Array, each antenna captures its own signal. These signals are aligned in time and combined through correlation, producing what are known as visibilities, the raw data of interferometry.
From there, the process becomes even more computational.
Using inverse Fourier transforms and deconvolution techniques, astronomers reconstruct an image from incomplete data.
Let’s pause there, because that sounds complex, and it is. An inverse FFT lets us reconstruct a time-domain signal from its frequency-domain data. So, if the Fourier transform breaks a signal apart into its frequencies, the inverse Fourier transform puts it back together again.
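The forward and inverse transforms really are a lossless round trip, which a couple of lines of NumPy can demonstrate:

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.standard_normal(256)        # any time-domain signal

spectrum = np.fft.fft(signal)            # forward: time -> frequency
recovered = np.fft.ifft(spectrum).real   # inverse: frequency -> time

# The round trip is lossless up to floating-point error.
print(np.allclose(signal, recovered))    # True
```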
Deconvolution is a little harder to explain, but the idea is this: imagine you’ve made a voice recording in a big concert hall, and the size of that space adds a lot of reverb and echo. You can run your recording through a tool that strips out the reverb, leaving the so-called “dry” signal. In radio astronomy, we need to subtract the distortion introduced by the beam pattern of the radio telescope itself. And that’s deconvolution.
Basically, the telescope ‘colors’ the signal it receives, and astronomers have to mathematically remove that effect to recover what’s really out there.
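Here’s a toy sketch of that idea in one dimension. The “sky,” the “beam,” and their sizes are all invented for illustration, and real pipelines need regularization and far more sophisticated algorithms (like CLEAN), but the core move, dividing out the beam in the frequency domain, is the same:

```python
import numpy as np

# A "true sky": two sharp point sources.
sky = np.zeros(64)
sky[20], sky[45] = 1.0, 0.6

# The telescope's beam pattern smears each source (a simple 3-tap blur).
beam = np.zeros(64)
beam[:3] = [0.2, 0.6, 0.2]

# What the telescope records: sky convolved with the beam
# (circular convolution, done via FFTs).
observed = np.fft.ifft(np.fft.fft(sky) * np.fft.fft(beam)).real

# Deconvolution: divide in the frequency domain to undo the blur.
# (Safe here only because this toy beam's spectrum has no zero bins.)
recovered = np.fft.ifft(np.fft.fft(observed) / np.fft.fft(beam)).real

print(np.allclose(recovered, sky))   # True
```

Convolution in the time (or sky) domain is multiplication in the frequency domain, so “un-coloring” the signal becomes a division. Noise is what makes the real problem hard: dividing by tiny beam values amplifies it, which is why actual deconvolution methods are iterative and regularized.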
A process called “windowing” is used to smooth the edges of the signal and reduce artifacts introduced by the Fourier transform. Matched filtering helps detect extremely faint signals by comparing incoming data to known patterns. And a technology called beamforming allows arrays of antennas to electronically focus on a specific region of the sky without physically moving.
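Two of those ideas, windowing and matched filtering, fit in a short sketch. The “chirp” template, its position, and the noise level below are all invented example values; the windowing shows up as the Hann taper that smooths the template’s edges:

```python
import numpy as np

rng = np.random.default_rng(2)

# A known template we hope to find, tapered with a Hann window
# so its edges fade smoothly to zero (that's the windowing step).
template = np.sin(np.linspace(0, 16 * np.pi, 64)) * np.hanning(64)

# Bury the template in a long, noisy recording at sample 300.
data = 0.2 * rng.standard_normal(1024)
data[300:364] += template

# Matched filtering: slide the template along the data and correlate.
# The correlation peaks where the data best matches the known pattern.
score = np.correlate(data, template, mode="valid")
print(int(np.argmax(score)))   # peak lands at the hidden offset, ~300
```

That sliding correlation is the heart of matched filtering: you can pull a known pattern out of noise far too loud to see it in directly, which is how astronomers recover signals buried deep below the noise floor.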
And then there’s the problem of interference.
Radio Frequency Interference, or RFI, is everywhere. Signals from Earth-based technology can easily overwhelm the faint emissions from space. So astronomers use a combination of hardware shielding, remote observatory locations, and sophisticated filtering algorithms to remove unwanted noise and isolate the signals they care about.
And nowhere is all of this more dramatically demonstrated than in the first image of a black hole.
In 2019, the Event Horizon Telescope linked radio telescopes across the globe using very long baseline interferometry, effectively turning Earth itself into a single telescope.
The data was enormous, so large it had to be stored on physical drives and shipped for processing. And just like everything we’ve discussed, the image wasn’t directly observed. It was reconstructed.
Using Fourier transforms, correlation, and deconvolution algorithms, scientists assembled an image of the black hole’s shadow in the galaxy M87. Multiple independent teams used different reconstruction methods, and all arrived at the same fundamental structure: a bright ring surrounding a dark center.
There was discussion about how much of that image was “real” versus algorithmic. But that’s the nature of this kind of science. The structure is strongly supported by the data, even if the fine details vary.
In other words… we didn’t take a picture of a black hole. We solved for it.
More recently, radio telescopes have uncovered something even stranger: brief, intense flashes of energy known as fast radio bursts. These signals last just milliseconds, yet release enormous amounts of energy, often from galaxies billions of light-years away. They’re one of the most mysterious phenomena in modern astronomy, and we’re still working to understand what causes them.
Radio astronomy is the foundation for the Search for Extraterrestrial Intelligence program. It’s currently being used to study exoplanet magnetospheres, and we’ve used it to discover wild phenomena like quasar jets.
Sometimes, the most intriguing discoveries in astronomy don’t look like galaxies or nebulae, they look like numbers on a page.
In 1977, a signal detected by the Big Ear radio telescope stood out so clearly from the background noise that astronomer Jerry Ehman circled it and wrote a single word beside it: ‘Wow!’
This was dubbed the “Wow! signal,” and it was simply a string of characters on a printout representing signal intensity over time. The data indicated it was a very strong narrowband signal near the hydrogen line, rising and falling over about 72 seconds, which is what you’d expect from a fixed telescope as the Earth rotates.
To this day we don’t know what that signal was. It’s never been confirmed as extraterrestrial, and despite repeated follow-up searches, it’s never been detected again.
Behind every radio image is a pipeline of processing, filtering noise, correcting for Earth’s rotation, compensating for atmospheric effects, and assembling a coherent picture from fragments of signal. It’s less like taking a photograph, and more like assembling a picture from echoes.
And here’s the part that might surprise you.
The same fundamental techniques that make radio astronomy possible are quietly at work in your everyday life.
The amplification and digitization of signals, the use of Fourier transforms to break complex signals into frequencies, the filtering of noise, even techniques like beamforming, are built into the devices you use every day.
Your phone relies on them to maintain a clear connection. Your Wi-Fi router uses beamforming to direct signals toward your devices. Your noise-cancelling headphones filter out noise to isolate what you want to hear. Voice recognition is much like matched filtering.
Even the music you stream every day has been broken down into frequencies and rebuilt again, using the same kind of math astronomers use to study the universe. This is such a foundational technology to modern life.
Every time I edit a podcast, I’m using a fast Fourier transform, perhaps 1,000 times a second or more, depending on how many tracks and audio processors I’m using. The efficiency of modern computers is incredible when you think about it.
We’ll explore the computational side of astronomy in more detail in a future episode. But for now, it’s enough to say this: radio astronomy doesn’t just expand what we can observe, it changes how we think about observing.
The night sky isn’t silent. It’s alive with signals that are ancient, energetic, and constant. For most of human history, we simply didn’t have the tools to detect them.
But now, we’re listening. And what we’re hearing is a universe far richer than anything we could ever see with our eyes alone.
After a quick break we’ll be back to cover this week’s sky, and talk about a photograph from the Artemis II mission. Stay with us.
Welcome back.
You know, there’s something about the infamous “dark side of Earth” image from the Artemis II mission that’s had me thinking a lot about perspective. The ever-prickly internet photography community was in a tizzy last week over this image, focusing on the camera used and its settings, the lens choice, the ISO, and so on.
As a photographer, these factors interested me too, but ultimately what matters is something much simpler.
For the first time in a long time, we’re seeing the entire Earth as a complete sphere, just hanging there in space. And here’s the part that didn’t really dawn on me until recently:
Most astronauts never actually see that.
When you’re in low Earth orbit, like the crews aboard the International Space Station, you’re only about 250 miles up. That sounds high, but on a planetary scale, it’s nothing. From that distance, you see curvature. You see oceans, continents, weather systems stretching across thousands of miles.
But you don’t see the whole Earth. You’re too close. If you want to see an entire sphere, you have to be far enough away that it fits inside your field of view. It’s simple geometry.
For Earth, that distance turns out to be surprisingly large. In a low orbit, Earth fills your entire frame, and then some.
To step back far enough to see the full disk, you need to be not hundreds of miles away, but thousands. Roughly speaking, you need to get out to 6,000 miles or more before the entire Earth can comfortably fit into view as a complete circle.
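You can check that geometry yourself with the angular-size formula. This sketch uses the standard round numbers (Earth’s mean radius of about 3,959 miles, the Moon about 238,900 miles away); the specific altitudes are just illustrative:

```python
import math

R_EARTH_MI = 3959.0   # Earth's mean radius, in miles

def earth_angular_size_deg(altitude_mi: float) -> float:
    """Apparent angular diameter of Earth from a given altitude:
    theta = 2 * asin(R / (R + h))."""
    d = R_EARTH_MI + altitude_mi
    return math.degrees(2 * math.asin(R_EARTH_MI / d))

print(earth_angular_size_deg(250))      # ISS orbit: ~140 degrees, fills your view
print(earth_angular_size_deg(6000))     # ~47 degrees: the full disk fits comfortably
print(earth_angular_size_deg(238_900))  # near the Moon: ~1.9 degrees, a small globe
```

From the ISS, Earth spans around 140 degrees of your vision, far wider than you can take in at once. Out past a few thousand miles it shrinks to a sphere you can hold in a single glance.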
And that’s exactly what the Artemis mission did. They headed out toward the Moon, tens of thousands of miles away, finally giving us the distance needed to see our planet as a whole.
When you’re close, Earth feels endless. It fills your vision. It’s where you are. But when you step back far enough, it becomes something else entirely. A finite object. A sphere of oceans, clouds, and life suspended in black space.
We’ve seen this before, of course. Images from the Apollo 17 mission, the famous “Blue Marble,” gave us that same perspective more than 50 years ago. But for a long time, that view hasn’t been something we could experience in real time.
The famous “Pale Blue Dot” image does something similar. It’s not a technically great photo, just a blue pixel in a dark void, but it offers up something more valuable: scale, perspective, and humility.
And that’s why this new image matters. Not because it was shot with a 10-year-old Nikon camera. Not because it has some sensor noise, or could have been sharper. Not because some goof on the internet thinks they could have shot it better.
But because it reminds us of something easy to forget: You can’t see the whole Earth, until you leave it.
And now, let’s step outside.
This week offers one of the best observing windows of the month, and it comes down to one simple thing: darkness.
We’re moving into a new moon on April 17th, which means for much of this week, the sky will be free of bright moonlight.
In fact, early in the week you’ll catch a waning crescent Moon just before sunrise, hanging low in the southeastern sky. It’s a beautiful sight, especially if you look for earthshine—the faint glow on the dark portion of the Moon caused by sunlight reflecting off Earth.
By the end of the week, the Moon disappears entirely into the Sun’s glare, reaching new phase on the 17th. And then, just a day later, on the 18th, a razor-thin waxing crescent returns to the evening sky, only about one percent illuminated.
This is prime time for deep-sky observing.
April marks the heart of what astronomers often call “galaxy season.” With the Milky Way dipping lower in the evening sky, we’re looking outward—away from the dense star fields and into the vast expanse of intergalactic space.
Look toward the constellation Leo, now high in the evening sky, where you’ll find the Leo Triplet—three galaxies interacting with one another, appearing as faint smudges of ancient light. Nearby, in Ursa Major, you’ll find Messier 81 and Messier 82—two striking galaxies often seen together in the same field of view.
These are not bright objects. You’re seeing light that has traveled millions of years to reach your eyes.
The planets are putting on a show this week, but you’ll need to be an early riser to catch some of them. On the morning of April 18th, a rare alignment unfolds low on the eastern horizon. Mercury, Mars, and Saturn gather together in a tight grouping, with Neptune nearby for those with binoculars or a telescope.
This “planet parade” will be subtle, and you’ll need a clear, unobstructed horizon and a bit of patience.
Venus continues to dominate the early evening sky, shining brilliantly in the west just after sunset. It’s unmistakable, the brightest object in the sky after the Sun and Moon. Jupiter is still visible in the evening, high in the sky.
And finally, while it won’t peak until next week, the Lyrid meteor shower is beginning to ramp up. Active from around April 16th onward, you may catch a few early meteors streaking across the sky in the pre-dawn hours. With the Moon out of the way, conditions are ideal.
The Lyrids are associated with debris from Comet Thatcher, and are known for their bright meteors, occasional fireballs, and persistent trains. They are expected to peak on the 22nd.
That’s going to do it for this week. If you found this episode interesting, please share it with a friend who might enjoy it. The easiest way to do that is by sending folks to our website, startrails.show. And if you want to support the show, use the link on the site to buy me a coffee. It really helps!
Be sure to follow Star Trails on Bluesky and YouTube — links are in the show notes. Until we meet again beneath the stars … clear skies everyone!
Support the Show
Connect with us on Bluesky @startrails.bsky.social
If you’re enjoying the show, consider sharing it with a friend! Want to help? Buy us a coffee! Also, check out music made for Star Trails on our Bandcamp page!
Podcasting is better with RSS.com! If you’re planning to start your own podcast, use our RSS.com affiliate link for a discount, and to help support Star Trails.