Captured in infrared light by the Near-Infrared Camera (NIRCam) on NASA’s James Webb Space Telescope, this image reveals previously obscured areas of star birth. Called the Cosmic Cliffs, the region is actually the edge of a gigantic, gaseous cavity within NGC 3324, roughly 7,600 light-years away. (NASA via Bay City News)

Spectacles of gaseous cosmic cliffs, black holes, signs of water on exoplanets and the most ancient galaxies ever seen by human eyes – the James Webb Space Telescope this month unveiled a new space era to the world.

One million miles away from Earth, the JWST is answering 14-billion-year-old questions about the beginnings of the universe. Led by NASA and the European and Canadian space agencies, the $10 billion infrared observatory is the largest and most powerful space science telescope ever built.  

About the size of a tennis court when fully deployed, the James Webb observatory could solve mysteries ranging from distant worlds and stars to the Big Bang’s aftermath. Most importantly, it will open new findings, possibilities and conversations about the mysterious structures and origins of the universe, and humankind’s place in it.

The James Webb telescope involved some 20,000 collaborators across 14 countries and 29 U.S. states, and many Bay Area scientists played a part in JWST’s success.

Engineers and astrophysicists from NASA’s Ames Research Center in Silicon Valley and Lockheed Martin, alongside the University of Arizona, helped build Webb’s revolutionary near-infrared camera and mid-infrared instrument. Additionally, L-3 Communications Tinsley Laboratories (now Coherent Inc.) crafted the beryllium segments of the telescope’s 21-foot-wide primary mirror.

NASA technicians lifted the telescope using a crane and moved it inside a clean room at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. (NASA/Desiree Stover via Bay City News)

On July 12, the world was stunned by Webb’s first photo – at 150 million pixels, the deepest and sharpest infrared image of the universe ever taken. Named “Webb’s First Deep Field,” the image reveals galaxy cluster SMACS 0723, about 4.6 billion light-years away.

Thousands of galaxies are captured in this slice of the vast universe. Seen from the ground, the image covers a patch of sky approximately the size of a grain of sand held at arm’s length.

In an ever-expanding universe, light stretched to the far red end of the electromagnetic spectrum cannot be seen by human eyes or by visible-light telescopes. But Webb’s near-infrared camera (NIRCam) can detect red and infrared light over a wavelength range of 0.6 to 5 microns, including parts of the spectrum that the Hubble telescope, Webb’s predecessor, could not reach.

Webb’s NIRCam thus allows a clearer and deeper look at the earliest stars and galaxies, as well as young stars in the Milky Way and objects in the Kuiper Belt.
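
As a rough illustration with standard textbook numbers (an editorial aside, not figures from the article): the expansion of the universe multiplies the wavelength of light from a galaxy at redshift \(z\) by a factor of \(1+z\), so ultraviolet starlight emitted in the early universe arrives at Earth as near-infrared light inside NIRCam’s band.

\[
\lambda_{\text{obs}} \;=\; (1+z)\,\lambda_{\text{emit}}, \qquad (1+10)\times 0.12\ \mu\mathrm{m} \;\approx\; 1.3\ \mu\mathrm{m},
\]

so hydrogen’s ultraviolet Lyman-alpha line from a galaxy at redshift 10 lands near 1.3 microns, comfortably within NIRCam’s 0.6-to-5-micron range.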

But James Webb’s journey didn’t happen overnight. In fact, it took around 20 years to research, build and finally send the telescope into space. The infrared space observatory launched on Dec. 25, 2021, from the European Space Agency’s launch site in French Guiana.

In 30 days, JWST traveled one million miles to its permanent home, the second Lagrange point, where the telescope can function in a gravitationally stable location.  

There are five Lagrange points – areas where gravity from the sun and Earth balance the orbital motion of a satellite. The points can be used by a spacecraft, or in this case, JWST, to reduce fuel consumption needed to remain in position.  

The second Lagrange point, L2, lies on the far side of Earth from the sun, exactly in line with both the sun and the Earth.

JWST stays in line with Earth as both orbit the sun. All the while, this path lets the satellite’s large sunshield shield the telescope from the light and heat of the sun, Earth and moon.
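
For readers who want a back-of-the-envelope check on that one-million-mile figure (a rough editorial estimate using the standard restricted three-body approximation, not a number supplied by NASA), the Earth-to-L2 distance comes out to about

\[
r \;\approx\; R\left(\frac{M_\oplus}{3\,M_\odot}\right)^{1/3}
\;\approx\; 1.5\times10^{8}\ \mathrm{km}\times\left(\frac{6.0\times10^{24}\ \mathrm{kg}}{3\times 2.0\times10^{30}\ \mathrm{kg}}\right)^{1/3}
\;\approx\; 1.5\times10^{6}\ \mathrm{km},
\]

where \(R\) is the Earth-sun distance and \(M_\oplus\) and \(M_\odot\) are the masses of Earth and the sun. That is roughly 930,000 miles beyond Earth, consistent with the one million miles quoted above.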

The telescope’s mission has four main focus areas: first light in the universe; the assembly of galaxies in the early universe; the birth of stars and protoplanetary systems; and planets, including the origins of life.

In 2014, the NIRCam was carefully picked up in Palo Alto by FedEx’s special “white glove” team, then shipped to Maryland’s Goddard Space Flight Center, where it was integrated with the other instruments. 

From there, it went to Houston’s Johnson Space Center for testing, then to Northrop Grumman in Redondo Beach to be integrated into the spacecraft. Finally, the whole observatory traveled to South America for launch.

NASA’s hand in the NIRCam and MIRI instruments  

NASA’s Ames Research Center made significant contributions to Webb’s early mission concepts, technology development and modeling, especially for NIRCam and the Mid-Infrared Instrument, or MIRI.

Of the four instruments on the James Webb telescope, three work in the near-infrared – from colors, like red, that the eye can see to wavelengths about 10 times longer. However, there is also useful information at even longer wavelengths.

The NIRCam operates over a wavelength range of 0.6 to 5 microns, and MIRI operates over a wavelength range of 5 to 28 microns. (NASA via Bay City News)

MIRI has both a camera and a spectrograph that work at wavelengths longer than human eyes can observe – a range of 5 to 28 microns, to be specific. While NIRCam reveals cooler red stars and peers through dust that is transparent at its wavelengths, MIRI reveals planets, comets, asteroids, dust warmed by starlight, and discs of dense gas rotating around new stars.
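
A rough rule of thumb (an editorial illustration using Wien’s displacement law, with round numbers not drawn from the article) shows why such mildly warmed objects fall in MIRI’s territory: a body at temperature \(T\) glows most brightly near

\[
\lambda_{\text{peak}} \;\approx\; \frac{2898\ \mu\mathrm{m\cdot K}}{T} \;\approx\; \frac{2898\ \mu\mathrm{m\cdot K}}{300\ \mathrm{K}} \;\approx\; 10\ \mu\mathrm{m},
\]

so planets, asteroids and starlight-warmed dust near room temperature (around 300 Kelvin) radiate most strongly around 10 microns, well inside MIRI’s 5-to-28-micron band.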

Ames researchers are also leading and contributing to the mission’s current science investigations such as studying brown dwarfs, young stars, evolved stars, galaxies, molecules and worlds beyond the solar system – exoplanets.  

Collectively, Ames researchers will lead over 400 hours of observations in Webb’s first year of operations, according to a NASA Ames press release.  

Thomas Roellig, chief of the Astrophysics Branch at the Ames Research Center, has worked on the James Webb telescope mission for the past 20 years, contributing to the development of the NIRCam instrument. 

Roellig joined a team created by Marcia Jean Rieke, the principal investigator on Webb’s near-infrared camera and a regents’ professor of astronomy and associate department head at the University of Arizona.  

The team’s first proposal was submitted back when Webb was still called the “Next Generation Space Telescope.” In what Roellig describes as an “over the moon” feeling, NASA selected the proposal in 2002 and tasked the team with designing and building the camera. The hardware was largely built at the Lockheed Martin facility in Palo Alto, Roellig said.

Roellig said the “risky” yet very successful JWST mission will answer two major questions: what happened shortly after the Big Bang, and how humans can determine and refine the laws of physics using observations of the universe. Understanding the timescales, energies and temperatures found in the universe can have direct applications to life on Earth, such as in highly advanced technology, Roellig said.

Tracing the instruments’ journeys back to their origins, Bay City News interviewed the brains behind Webb to better understand how NIRCam, MIRI and the mirrors of the massive observatory came to life, and what JWST is sending back to Earth.

(Interviews are edited for length and clarity.)

Bay City News: In your own words, could you define and describe what James Webb is and what it does and its functions? 

Roellig: Plain curiosity. Ever since we were cavemen, we’ve always looked up at the stars and said, “What is up there?” “Where did we come from?” “What’s going on out there in the sky?” And so we’re trying to answer those questions for humans basically. First of all, I should start off by saying that James Webb does not observe light that we see with our eyes, or at least for the most part it doesn’t. It mostly observes in the infrared. Infrared is the longer wavelengths of light. Most animals can’t see it either. There are a few that can, like rattlesnakes. James Webb is different from Hubble in that respect. Hubble is mostly a visible light observatory. But we’re looking at a different range of light colors – different wavelengths. And there’s a lot of advantages for the infrared. So first of all, it’s an infrared telescope. It’s also a very large telescope. It’s by far the biggest space telescope that anyone on Earth has ever launched. And the advantages for a big telescope are kind of obvious. It gathers more light so you can see things that are dimmer. Also, according to the laws of physics, a bigger telescope that’s made precisely enough can see things a lot more sharply, and with much better detail than you could see with a smaller mirror.  

BCN: What are the advantages of the infrared?  

Roellig: The advantages of the infrared are basically threefold. First of all, if you want to do cosmology, if you want to look at the whole structure of the universe and see what happened back just in the earliest days of the Big Bang, you really need to look at the infrared for that. And the reason for that is that the Big Bang was a big explosion. So the stuff that is furthest away is also flying away the fastest. And when things fly away from you, if they’re emitting light like a star, the wavelength shifts and it gets redder. So a star that’s flying away from us is some distance away, and since we’re seeing light that took so long to get here, we’re looking back in time. That starlight doesn’t look to us like starlight, it looks like infrared light. So you need a telescope that looks in the infrared. The other reason that you might want to look in the infrared is that basically everything emits light of some sort. I mean, you and I emit light. We actually emit light in the infrared. Room temperature, body temperature gives off light in the infrared. Well, there’s a lot of other things in the universe that are kind of the same temperature as we are and also emit in the infrared – planets do; the Earth is mostly room temperature and it glows in the infrared. So do areas of star formation. When stars first start forming, they’re not very warm. So if you want to really look into the details of star formation, you want to look in the infrared. The third big advantage of the infrared is that it allows you to see through the dusty areas. And it turns out the universe is full of dust, and it blocks light from rather interesting places – stellar nurseries where new stars are forming are shrouded by dust. For example, the center of our galaxy is actually a fairly bright place. If there weren’t these dust clouds blocking the light from the center of the Milky Way, we would have enough light at night to actually read a newspaper. But in fact, the clouds of dust block that light – and infrared can see right through it.

BCN: What about aliens?  

Roellig: There’s actually a fourth advantage with infrared light, and that goes with exoplanets – planets that don’t orbit our sun but orbit other stars. And if you’re looking for light from molecules, or at least chemical signatures from those planets that might indicate that life is on them, the infrared is an ideal place to look for that. So, for example, if some other alien civilization were to look at Earth, they would immediately realize that something’s wrong with Earth. We have molecules in our atmosphere that should not be here. And the only reason they are here is because there’s life putting them there. One big example of that is methane, which is in our atmosphere only because it’s constantly being replenished by life, because it actually gets destroyed rather rapidly by sunlight. That’s one of the big things that Webb is going to want to do – actually look at exoplanets. Look at the atmospheres, trying to see those indications that something is out of balance and therefore can only be due to life on other planets.

Alongside Roellig at NASA Ames is Dr. Thomas Greene, an astrophysicist in the Space Science and Astrobiology Division and the co-investigator of the NIRCam and MIRI instruments. He conducts observational studies of exoplanets and young stars and spent the past 24 years developing and designing Webb’s technology. Now, he will study the atmospheres of exoplanets.  

Before joining NASA in 1998, Greene worked at Lockheed Martin and began a number of concept studies for Webb. Fast forward to 2014: Greene was assisting in the assembly of JWST at the Goddard Space Flight Center in Greenbelt, Maryland, which included putting the telescope together, then putting the instruments on the telescope.

NIRCam is shown on its rotation dolly after the two mirror-image modules were bolted together to form the complete instrument. (NASA and Lockheed Martin via Bay City News)

BCN: Tell us about the NIRCam and MIRI instruments.

Dr. Thomas Greene: The NIRCam is sort of the workhorse instrument of the observatory in a lot of ways. It’s used not only to take a lot of the beautiful images that were released in the last week, but also to align the telescope and to keep it calibrated as time goes on. Every couple of days we use the camera to assess the alignment of the telescope, and then there are little tweaks made to all the optics to keep it tuned up. We turned on that Near Infrared Camera just days after the telescope had finished all of its deployments because we needed to see how well the telescope was working. Ames was actually quite instrumental in guiding the development of the MIRI detectors.

BCN: What was the moment you realized all your years of hard work paid off on JWST? 

Greene: So I actually designed some of the components for the observatory. In particular, I designed some optics for NIRCam. And what that allows us to do is not just take pictures, but also spread the light out to see what the constituent molecules are. I’m going to use those for studying the atmospheres of exoplanets, for example. So when I saw those things working in Baltimore in February, I got very excited. I could tell that these things are really going to be useful. As team members (Dr. Roellig was part of this too), we saw the first images come out of the telescope when we turned it on. They were not aligned. They were not pretty, but it was quite impressive. We knew things were good, and every time we’ve made improvements, alignments and tweaks to the instruments, things have only gotten better. And the ones that were released last week were just – wow. We spent six months not taking science images, but tuning up the telescope. And then at the very end, we took a few science images. So think of this like going into an ice cream shop: you see all these delicious-looking flavors and you want to try them, so you get a little taste with those little spoons – that’s what these images are. You’ve gotten five little spoons and many different flavors. And the rest of the cone or cup is coming.

Lockheed Martin paves JWST’s path 

Malcolm Ferry, the current program manager of the NIRCam project at Lockheed Martin, joined the program as a systems engineer in the early 2000s. At the company’s Advanced Technology Center in Palo Alto, he witnessed and helped this powerful eye of James Webb grow from preliminary sketches and ideas into today’s coffee-table-sized instrument.

NIRCam started out in 2002, when the ATC won the contract to create the imager for Rieke, the principal investigator from the University of Arizona. Approximately 80 to 100 engineers were involved in the design and creation of NIRCam at the peak of the program. The camera left the Palo Alto center in late 2013 and was transported to several stations to go through various levels of testing and integration. Ferry became the fifth program manager of the NIRCam project in 2016.  

BCN: What was the most challenging part of the creation of NIRCam at ATC? 

Malcolm Ferry: From the technical perspective, I would say there were two big challenges. One was mounting the optical elements. The optical performance of the NIRCam instrument was really pushing the envelope of what’s possible. It became very critical that the optical elements were correctly made in terms of their shape and materials and were correctly positioned relative to each other. Because NIRCam operates in the infrared spectrum, the materials we used were not standard glass-type materials but optical materials that needed to be transmissive in the infrared region. So, we were using certain exotic materials, some of which were quite fragile. We had to hold them gently so we didn’t stress them, and also hold them so they couldn’t move at all. And they had to survive the rigors of launch. It’s an incredibly violent experience when you sit on top of one of these rockets; there is lots of vibration and lots of noise. And then, of course, there’s the huge temperature change.

That brings me to the second challenge. The optical bench and all the mechanisms had to operate at this 38 Kelvin temperature (about -391 degrees Fahrenheit), which is incredibly cold, almost absolute zero. However, we had to assemble it at normal human room temperature because it’s done by humans, and humans don’t operate well at 38 Kelvin. As things generally contract when they get colder, the challenge was determining how things were going to move over that temperature transition and making sure we put them in the right place at room temperature so that when we got to 38 Kelvin, everything came into alignment and worked perfectly.
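
To put that contraction in rough numbers (an illustrative back-of-the-envelope estimate using a ballpark effective contraction coefficient \(\bar{\alpha}\) for beryllium-like materials, not a figure from the interview): cooling from room temperature to 38 Kelvin shrinks the structure by roughly a tenth of a percent,

\[
\frac{\Delta L}{L} \;\approx\; \bar{\alpha}\,\Delta T \;\approx\; \left(5\times10^{-6}\ \mathrm{K}^{-1}\right)\times\left(293\ \mathrm{K}-38\ \mathrm{K}\right) \;\approx\; 1.3\times10^{-3},
\]

so a meter-scale optical bench moves by roughly a millimeter during cooldown – an enormous shift compared with the precision to which the optics must end up aligned.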

BCN: Where were you on July 12 when the James Webb images were released? Can you describe your feelings at that time since you had gone through such a tough and exciting part of it? 

Ferry: I was in our Palo Alto facility, and we had a little party in a conference room. Several people who had worked on NIRCam over the years were there along with other Lockheed Martin people who came and celebrated with us. It was very exciting, obviously. We were keen to see the pictures. We had no preview of them, so we had the same wow moment that everybody else did. We knew the pictures were going to be good. And obviously, we all got used to Hubble pictures over the years. But these were just something else: the level of detail and the clarity is just mind blowing.  

Webb’s primary mirror is 6.5 meters (21 feet 4 inches) in diameter. By comparison, Hubble’s mirror is 2.4 meters (7.8 feet). (NASA via Bay City News)

BCN: What have your everyday tasks been like since JWST launched in December? Is there a way to fix any problems when it’s a million miles away from the earth? 

Ferry: Since December, we’ve been going through the commissioning phase and are actually in the middle of the final commissioning readiness review right now. We see engineering types of images from our instrument and other instruments to check that everything’s working correctly. It all looks very good. Now we’re at the end of commissioning, and we’re going to hand this off to the scientists shortly. There’s no way to change anything on the instruments now. It’s not coming back, and no human is going out there, probably not in the lifetime of James Webb. So, the way we get around that is to build in a lot of redundancy. There are actually two totally separate NIR cameras. And so if we lose one, we’ve got the other one right there. There’s even more redundancy within the electronics systems. If we get an electronics failure, we can switch over to the backups, and multiple levels of redundancy will get us around that non-serviceability problem. The only thing we can change is the software that supports it. Once you let the scientists work on these things – the thousands of scientists across the world waiting to get their hands on this – they’ll come up with ideas to enhance its operation in some ways.

Mirrors reflect into the universe’s history 

On the other side of the instruments is the telescope’s primary mirror, composed of 18 hexagonal beryllium segments. The mirror is 21 feet across and captures the light from the universe’s most distant galaxies and stars. The 18 segments first came to life at beryllium mines in Utah and then moved across the country for processing and polishing.

A test station in Richmond was one of the stops. From about 2005 until the early 2010s, the mirrors were shipped there and polished to a smooth and exact shape. Patrick Johnson, a former engineer at L-3 Communications Tinsley Laboratories in Richmond, led the metrology group on the mirror measurement and polishing program at the time.

At Tinsley Labs, an optical test station was created that enabled the mirrors to be crafted to extreme accuracy. Tinsley utilized temperature-cycling ovens, sophisticated measurement systems and nine unique computer-controlled optical surfacing systems to polish the mirrors to a precision of 18 nanometers.

Patrick Johnson is standing in front of an optical test station at Tinsley Labs used to measure mirror surfaces. (NASA and L-3 Tinsley via Bay City News)

BCN: What was the main purpose of the manufacturing process in Tinsley Labs? And how many mirrors did you measure and polish in total? 

Patrick Johnson: In the optical design, you want an object to look a certain way or be shaped a certain way, which would be the prescription. Just like your glasses or contacts have a prescription, you must create manufacturing and test methods to achieve that prescription within some tolerance. That was the goal. We had a prescription, we had the size and shape geometries of these mirrors, and we needed to go figure out a measurement plan to achieve those tolerances. We had to build and develop those tests and then optimize them. And of course, we had to account for gravity and other deformations like material shape changes, because there is a coefficient of thermal expansion, so the shape of the material can change as the temperature changes.

For the 18 mirror segments, there was a prescription for the segments closer to the center, and it changed as it went outward. So, we ended up making about three different prescription types. And the intent was to make a couple of spares in total, like one spare of each type. So for the 18 mirrors, we ended up making about 21 or so. And that’s only talking about the primary. If you look at the telescope, you see the three-pronged mount holding the secondary mirror. And behind everything, there are more mirrors, including the tertiary mirror, the fold mirror and the steering mirror.  

BCN: What technical challenges did you encounter and how did you overcome them? 

Johnson: Most of the challenges were related to the actual use case. For example, the mirror material would change shape under different environmental conditions like temperatures and gravity. We had to work closely with our partners at Ball Aerospace to do cryogenic testing. We had to measure these at very, very cold temperatures and validate any modeling that was done. Then from there, it was getting the accuracy we needed for these measurements. It was also validating the shape we were making, the right geometry or the right prescription.  

BCN: How did it feel to have contributed to such an exciting program as you watched the JWST’s first images? 

Johnson: It’s a culmination of some really amazing people, bright minds and a lot of hard work and dedication. Those images are extremely impressive, and it’s cool to be a small part of it. It’s exciting to see where it is today, and I have a poster of it that hangs on my wall. It’s also exciting to start seeing the science coming from it, because we don’t know what we’ll learn.