Science Articles, Scientific Current Events | Popular Science
https://www.popsci.com/category/science/
Awe-inspiring science reporting, technology news, and DIY projects. Skunks to space robots, primates to climates. That's Popular Science, 145 years strong.

Check out some of the past year’s best close-up photography
https://www.popsci.com/technology/2023-best-close-up-photos/
Mon, 15 Jan 2024 18:00:00 +0000
A female fairy shrimp displays the colorful eggs inside her. © René Krekels | cupoty.com

The 5th annual Close-up Photographer of the Year competition celebrated detailed glimpses of the natural world. Here are a few of the finalists and winners.

The post Check out some of the past year’s best close-up photography appeared first on Popular Science.

There’s always a reason to stop and appreciate the smaller stuff in life. Since 2018, Tracy and Dan Calder have drawn attention to documenting daily minutiae with the Close-up Photographer of the Year competition, highlighting the past 12 months’ best images capturing nature, animal, underwater, and human subjects.

The 5th annual edition is no exception, with amazing glimpses of everything from slumbering frogs, to magnetic waves, to microscopic life, to rarely seen deep sea creatures. Across a wide range of categories, photographers around the world managed to snap some extremely striking images, making even some of the creepiest of crawlies look pretty cute for a change. Check out a few of our favorite finalists and winners of 2023 below, and remember to keep an eye out for the little things this year. They’re always there and worth seeing, even if you don’t have a camera in hand.

Close up of damselfly
Invertebrate Portrait Finalist: “Look Into My Eyes,” portrait of a damselfly covered in dew taken in May in Shropshire, UK © Pete Burford | cupoty.com
Ice chunk with twig frozen in it
Intimate Landscape 2nd Place Winner: “Ice Fossiel,” ‘In winter, many of the flooded wetlands in the Netherlands can be skated upon. The ice is often damaged, with pieces being chipped off. On one such occasion, I discovered a small chunk of ice stuck to a frozen twig that made me think of a prehistoric find.’ © Piet Haaksma | cupoty.com
Light captured in bottles to look like electric storm
Human Made Finalist: “Electric Storm in a Bottle,” Light captured in a pair of bottles to look like an electrical storm taken on November 6th in Hemel Hempstead, UK. © Rachel McNulty | cupoty.com
Dark brown globular springtail
Invertebrate Portrait Finalist: “Allacma Fucsca,” A dark brown globular springtail (Allacma fusca) taken on September 24th in Solingen, Germany. © Jacek Hensoldt | cupoty.com
Light through glass door creating electric effect
Human Made Finalist: “Magnetic Waves,” Light through the glass of a front door creates an ‘electric’ effect, taken on June 23rd in Stourbridge, UK. © Chris Mills | cupoty.com
Small slime mould with ice crown atop it
Fungi 1st Place Winner: “The Ice Crown,” ‘This 1mm tall slime mould (Didymium squamulosum) was found in leaf litter on a Buckinghamshire woodland floor in January. Attracted by the way the frost had formed a crown shape on top of the fruiting body, I had to be very careful not to breathe on it. During a previous attempt with another slime mould, my breath had melted the ice when I inadvertently got too close.’ © Barry Webb | cupoty.com
Two four-spotted skimmer dragonflies mating
Butterflies & Dragonflies 2nd Place Winner: “Letting Go,” ‘Capturing a Four-spotted skimmer dragonfly (Libellula quadrimaculata) mating is particularly difficult because they connect and mate in-flight without any warning and for only a few seconds. The moment captured in this photo is just after the male has finished depositing his sperm on the female’s eggs and they are disconnecting. She will then attempt to deposit the eggs in the water and he will hover near her to ward off other males who would like to also mate with her.’ © Steve Russell | cupoty.com
Elephant trunk gripping flowers from water
Animals Finalist: “Picking Flowers,” ‘An Elephant enjoys a nutritional meal of water lily flowers as it makes its way across the Chobe River, Botswana. As flood water reaches the Chobe river (all the way from its starting point in Angola) the waterways are transformed with a wave of flowers.’ © William Steel | cupoty.com
Two huntsman spiders
Animals Finalist: “Pandercetes Sp. Squared,” ‘I was observing a large huntsman spider (Pandercetes sp.) on a tree when it suddenly leapt and caught a moving subject next to it. Upon closer inspection, I realised that a smaller huntsman spider had caught its own prey and while feeding on it, it had attracted the attention of the larger spider. If you look closely, you can see the pools of venom secreting from its fangs. Cannibalism among spiders is quite common, but finding such beautiful spiders showing this behaviour was a highlight from my trip to Malaysia.’ © Peter Grob | cupoty.com
Two frogs and a toad
Animals Finalist: “Frogs and Toad Mating,” ‘As I was walking around my local lake looking for amphibians on a warm spring evening I began to hear the calls of frogs and toads coming from a small area around the roots of an Alder tree at the edge of the water. I watched the mass of amphibians until the light disappeared and noticed two frogs next to the water on the edge of the footpath. When I went to have a better look and take some images, I noticed that this pair had a common toad attempting to join!’ © Nathan Benstead | cupoty.com

See more at Cupoty.com.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

Earth isn’t the only planet with seasons, but they can look wildly different on other worlds
https://www.popsci.com/science/seasons-on-other-planet/
Sat, 13 Jan 2024 17:00:00 +0000
Jupiter’s iconic Great Red Spot and the surrounding turbulent zones, shown in shades of white and brown.
Jupiter’s iconic Great Red Spot and the surrounding turbulent zones, as seen by NASA’s Juno spacecraft. NASA

Nearby planets can affect how one planet ‘wobbles’ on its spin axis, which contributes to seasons.

The post Earth isn’t the only planet with seasons, but they can look wildly different on other worlds appeared first on Popular Science.

This article was originally featured on The Conversation.

Spring, summer, fall and winter–the seasons on Earth change every few months, around the same time every year. It’s easy to take this cycle for granted here on Earth, but not every planet has a regular change in seasons. So why does Earth have regular seasons when other planets don’t?

I’m an astrophysicist who studies the movement of planets and the causes of seasons. Throughout my research, I’ve found that Earth’s regular pattern of seasons is unique. The rotational axis that Earth spins on, which runs through the North and South poles, isn’t quite aligned with the vertical axis perpendicular to the plane of Earth’s orbit around the Sun.

That slight tilt has big implications for everything from seasons to glacier cycles. The magnitude of that tilt can even determine whether a planet is habitable to life.

Seasons on Earth

When a planet’s rotational axis is perfectly aligned with its orbital axis (the line perpendicular to the plane of its orbit), the amount of sunlight it receives is fixed as it travels around the Sun, assuming its orbit is a circle. Since seasons come from variations in how much sunlight reaches the planet’s surface, a perfectly aligned planet wouldn’t have seasons. But Earth isn’t perfectly aligned on its axis.

This small misalignment, called an obliquity, is around 23 degrees from vertical for Earth. So, the Northern Hemisphere experiences more intense sunlight during the summer, when the Sun is positioned more directly above the Northern Hemisphere.

Then, as the Earth continues to orbit around the Sun, the amount of sunlight the Northern Hemisphere receives gradually decreases as the Northern Hemisphere tilts away from the Sun. This causes winter.

The obliquity marks the difference between the Earth’s spin axis (blue) and the vertical from orbit (green). The Northern Hemisphere experiences summer when the tilt lines it up directly with light from the Sun. CREDIT: Gongjie Li.

Planets spinning on their axes as they orbit the Sun look kind of like spinning tops: they spin around and wobble because of the Sun’s gravitational pull. As a top spins, you might notice that it doesn’t stay perfectly upright and stationary. Instead, it may start to tilt or wobble slightly. This wobble is what astrophysicists call spin precession.

Because of these wobbles, Earth’s obliquity isn’t perfectly fixed. These small variations in tilt can have big effects on the Earth’s climate when combined with small changes to Earth’s orbit shape.

The wobbling tilt and any natural variations to the shape of Earth’s orbit can change the amount and distribution of sunlight reaching Earth. These small changes contribute to the planet’s larger temperature shifts over thousands to hundreds of thousands of years. This can, in turn, drive ice ages and periods of warmth.

Translating obliquity into seasons

So how do obliquity variations affect the seasons on a planet? Low obliquity, meaning the spin axis is nearly aligned with the planet’s orbital axis, leads to strong sunlight at the equator and weak sunlight near the poles, as on Earth.

On the other hand, a high obliquity, meaning the planet’s spin axis can point toward or away from the Sun, leads to extremely hot or cold poles. At the same time, the equator gets cold, as the Sun never sits high over the equator for much of the year. This leads to drastically varying seasons at high latitudes and low temperatures at the equator.

When a planet’s spin axis is tilted far from the vertical axis, it has a high obliquity. That means the equator barely gets any sunlight and the North Pole faces right at the Sun. CREDIT: Gongjie Li.

When a planet has an obliquity of more than 54 degrees, that planet’s equator grows icy and the pole becomes warm. This is called a reversed zonation, and it’s the opposite of what Earth has.

Basically, if a planet’s obliquity undergoes large and unpredictable variations, its seasons become wild and hard to predict. A dramatic, large swing in obliquity can even turn the whole planet into a snowball, entirely covered in ice.
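
The 54-degree threshold can be checked numerically. The sketch below is a rough toy model, not taken from the article: it assumes a circular orbit, no atmosphere, and uses the standard daily-insolation formula to average sunlight over a year at the equator and near the pole for a given tilt.

```python
import numpy as np

def annual_mean_insolation(lat_deg, obliquity_deg, s0=1.0, steps=5000):
    """Annual average of daily-mean insolation for a circular orbit.

    s0 is the solar constant in arbitrary units; lat_deg is latitude.
    """
    lat = np.radians(lat_deg)
    obl = np.radians(obliquity_deg)
    # Sweep the planet uniformly around its (circular) orbit.
    lam = np.linspace(0.0, 2.0 * np.pi, steps, endpoint=False)
    dec = np.arcsin(np.sin(obl) * np.sin(lam))  # solar declination
    # Sunset hour angle, clamped for polar day (h0 = pi) and night (h0 = 0).
    h0 = np.arccos(np.clip(-np.tan(lat) * np.tan(dec), -1.0, 1.0))
    daily = (s0 / np.pi) * (h0 * np.sin(lat) * np.sin(dec)
                            + np.cos(lat) * np.cos(dec) * np.sin(h0))
    return daily.mean()

# At an Earth-like 23.5-degree tilt, the equator out-sunbathes the pole.
# At 60 degrees, past the roughly 54-degree threshold, the ranking flips.
for obliquity in (23.5, 60.0):
    equator = annual_mean_insolation(0.0, obliquity)
    pole = annual_mean_insolation(89.9, obliquity)
    print(f"obliquity {obliquity}: equator {equator:.3f}, pole {pole:.3f}")
```

Sweeping over many obliquities puts the crossover, where the pole starts receiving more annual sunlight than the equator, near 54 degrees, consistent with the reversed zonation described in the text.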

Spin-orbit resonances

Most planets don’t orbit their stars alone. Their planetary siblings can disturb one another’s orbits, which can lead to variations in the shape of those orbits and in their orbital tilt.

So, planets in orbit look kind of like tops spinning on the roof of a car that’s bumping down the road, where the car represents the orbital plane. When the rate–or frequency, as scientists call it–at which the tops are precessing, or spinning, matches the frequency at which the car is bumping up and down, something called a spin-orbit resonance occurs.

The orbits of planets close by and the precession motion of a planet on its axis can affect seasonal patterns. CREDIT: Gongjie Li.

Spin-orbit resonances can drive these obliquity variations, in which a planet wobbles on its axis. Think about pushing a kid on a swing: when you push at just the right time, the resonant frequency, they’ll swing higher and higher.
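
The swing analogy is easy to simulate. The toy model below is purely illustrative (a frictionless driven oscillator, not any planet's actual dynamics): pushes timed to the swing's natural frequency pump the amplitude up, while mistimed pushes mostly cancel out.

```python
import math

def peak_amplitude(drive_freq, natural_freq=1.0, push=0.1,
                   dt=0.001, t_end=200.0):
    """Integrate x'' = push*cos(drive_freq*t) - natural_freq**2 * x
    with semi-implicit Euler and return the largest |x| reached."""
    x = v = t = peak = 0.0
    while t < t_end:
        a = push * math.cos(drive_freq * t) - natural_freq ** 2 * x
        v += a * dt  # update velocity first: stable for oscillators
        x += v * dt
        peak = max(peak, abs(x))
        t += dt
    return peak

resonant = peak_amplitude(drive_freq=1.0)   # pushes timed to the swing
mistimed = peak_amplitude(drive_freq=1.7)   # pushes mostly cancel
print(f"on resonance: {resonant:.2f}, off resonance: {mistimed:.2f}")
```

On resonance the amplitude grows steadily for as long as the driving continues; off resonance it stays small and bounded. The same frequency-matching logic underlies the obliquity wobbles described here, just with gravitational torques playing the role of the pushes.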

Mars wobbles more on its axis than Earth does, even though the two are tilted about the same amount, and that actually has to do with the Moon orbiting around Earth. Earth and Mars have a similar spin precession frequency, which matches the orbital oscillation–the ingredients for a spin-orbit resonance.

But Earth has a massive Moon, which pulls on Earth’s spin axis and drives it to precess faster. This slightly faster precession prevents it from experiencing spin-orbit resonances. So, the Moon stabilizes Earth’s obliquity, and Earth doesn’t wobble on its axis as much as Mars does.

Exoplanet seasons

Thousands of exoplanets, or planets outside our solar system, have been discovered over the past few decades. My research group wanted to understand how habitable these planets are, and whether these exoplanets also have wild obliquities, or whether they have moons to stabilize them like Earth does.

To investigate this, my group has led the first investigation on the spin-axis variations of exoplanets.

We investigated Kepler-186f, which is the first discovered Earth-sized planet in a habitable zone. The habitable zone is an area around a star where liquid water can exist on the surface of the planet and life may be able to emerge and thrive.

Unlike Earth, Kepler-186f is located far from the other planets in its solar system. As a result, these other planets have only a weak effect on its orbit and movement. So, Kepler-186f generally has a fixed obliquity, similar to Earth. Even without a large moon, it doesn’t have wildly changing or unpredictable seasons like Mars.

Looking forward, more research into exoplanets will help scientists understand what seasons look like throughout the vast diversity of planets in the universe.

Disclaimer: Gongjie Li receives funding from NASA.

Watch this rocket ‘eat’ its own body for fuel
https://www.popsci.com/technology/ouroboros-self-eating-rocket/
Fri, 12 Jan 2024 19:00:00 +0000
GIF of Ouroboros-3 test rocket igniting
Ouroboros-3 uses its own plastic fuselage as propellant. University of Glasgow

The Ouroboros-3 prototype is an autophage rocket engine designed for a fiery demise.

The post Watch this rocket ‘eat’ its own body for fuel appeared first on Popular Science.

As satellite constellations and space junk continue crowding orbital zones above Earth, researchers are searching for ways to prevent adding to the growing problem. According to one team of researchers, one solution may involve using the physical rocket to fuel its own launch.

Collaborators from the University of Glasgow say they have debuted the first successful, unsupported autophage (Latin for “self-eating”) rocket engine prototype. Revealed earlier this week during the American Institute of Aeronautics and Astronautics SciTech Forum, the Ouroboros-3—named after the ancient Egyptian symbol of a snake eating its own tail—utilizes its own body as an additional fuel source. In a video of the tests, the Ouroboros-3 can be seen shrinking in length as its body is burned away during a simulated launch.

Today’s conventional rocketry stores its fuel in separate stages that are ejected once depleted, either to burn up during atmospheric re-entry or to become yet another piece of orbital space junk. Ouroboros-3 leaves very little trace once it completes its duties, given that it would only be tasked with launching and delivering a small, unpiloted payload into orbit.

After a first ignition using a main propellant composed of gaseous oxygen and liquid propane, Ouroboros-3’s casing of high-density polyethylene plastic tubing subsequently adds to the propulsion as the rocket continues its burn. Much like a candle flame consuming its wax, the casing provided as much as one-fifth of the total necessary propellant. In test-firings, Ouroboros-3 generated as much as 100 newtons of thrust.

“A conventional rocket’s structure makes up between five and 12 percent of its total mass. Our tests show that the Ouroboros-3 can burn a very similar amount of its own structural mass as propellant,” University of Glasgow engineering professor and project lead Patrick Harkness said in a statement earlier this week. “If we could make at least some of that mass available for payload instead, it would be a compelling prospect for future rocket designs.”
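
The payoff of burning structure can be sketched with the classic Tsiolkovsky rocket equation. The mass budget and specific impulse below are made-up placeholder numbers, not Ouroboros-3's actual specifications; the comparison only illustrates why shedding structural mass as propellant helps.

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def delta_v(m_initial, m_final, isp=250.0):
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp * G0 * math.log(m_initial / m_final)

# Hypothetical mass budget in kg: small payload, structure, propellant.
payload, structure, propellant = 5.0, 10.0, 85.0
m0 = payload + structure + propellant

# Conventional rocket: the structure is carried, unburned, to burnout.
dv_conventional = delta_v(m0, payload + structure)

# Autophage rocket: suppose 80% of the structure is consumed as
# propellant, so far less mass remains at burnout.
dv_autophage = delta_v(m0, payload + 0.2 * structure)

print(f"conventional: {dv_conventional:.0f} m/s, "
      f"autophage: {dv_autophage:.0f} m/s")
```

Under these assumed numbers the autophage design gains roughly 40 percent more delta-v for the same launch mass; equivalently, the same delta-v could lift a heavier payload, which is the prospect the quote describes.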

Subsequent tests also demonstrated how the team can control their autophage rocket’s burn, allowing it to restart, pulse in an on/off pattern, or be throttled.

“These results are a foundational step on the way to developing a fully-functional autophage rocket engine,” Harkness continued.

[Related: The FCC just dished out their first space junk fine.]

Although still an early prototype, the team hopes to scale future iterations of Ouroboros-3 enough to support the delivery of payloads, such as nanosatellites, into orbit without further cluttering the atmosphere. Speaking with Gizmodo on Thursday, Harkness said the team intends to scale up the autophage rocket’s thrust by around two orders of magnitude; any more than that is probably unnecessary, since deliveries will likely be restricted to comparatively small payloads.

Still, autophage rockets could one day provide the space industry with an alternative to existing designs’ costly, cluttering problems. And besides, anything that helps avoid instigating a Kessler cascade is certainly good news.

This yeast loves light
https://www.popsci.com/science/yeast-light/
Fri, 12 Jan 2024 18:00:00 +0000
Green rhodopsin proteins inside the blue cell walls help these yeast grow faster when exposed to light. Anthony Burnetti/Georgia Institute of Technology

While it usually needs darkness to thrive, scientists have created a light-powered yeast by moving a single gene.

The post This yeast loves light appeared first on Popular Science.

Unlike some pretty metal plants that thrive in the darkness, yeast generally doesn’t function well in the light. This fungus turns carbohydrates into ingredients for beer or bread when left to ferment in the dark. It must be stored in dark, dry places, as exposure to light can keep fermentation from happening altogether. However, a group of scientists has engineered a strain of yeast that may actually work better with light, a change that could give these fungi an evolutionary boost in a simple way. The findings are described in a study published January 12 in the journal Current Biology.

[Related: The key to tastier beer might be mutant yeast—with notes of banana.]

“We were frankly shocked by how simple it was to turn the yeast into phototrophs (organisms that can harness and use energy from light),” study co-author and Georgia Institute of Technology cellular biologist Anthony Burnetti said in a statement. “All we needed to do was move a single gene, and they grew 2 percent faster in the light than in the dark. Without any fine-tuning or careful coaxing, it just worked.”

Giving yeast such an evolutionarily important trait may help us understand how phototrophy originated and how it can be used to study evolution, biofuel production, and cellular aging.

Give it some energy

Previous work on the evolution of multicellular life by this research group inspired the new study. In 2023, the group uncovered how a single-celled model organism called snowflake yeast could evolve multicellularity over 3,000 generations. However, one of the major limitations to their evolution experiments was a lack of energy.

“Oxygen has a hard time diffusing deep into tissues, and you get tissues without the ability to get energy as a result,” said Burnetti. “I was looking for ways to get around this oxygen-based energy limitation.”

Light is one of the ways organisms can get an energy boost without oxygen. However, from an evolutionary standpoint, an organism’s ability to turn light into usable energy can be complicated. The molecular machinery that allows plants to use light for energy requires numerous proteins and genes that are difficult to synthesize and transfer into other organisms. This is difficult in the lab and through natural processes like evolution. 

A simple rhodopsin

Plants are not the only organisms that can convert light into energy. Some non-plant organisms can also use light with the help of rhodopsins. These proteins can convert light into energy without any extra cellular machinery.

“Rhodopsins are found all over the tree of life and apparently are acquired by organisms obtaining genes from each other over evolutionary time,” study co-author and Georgia Tech Ph.D. student Autumn Peterson said in a statement.

[Related: Scientists create a small, allegedly delicious piece of yeast-free pizza dough.]

A genetic exchange like this is called a horizontal gene transfer, where genetic information is shared between organisms that are not closely related. A horizontal gene transfer can cause large evolutionary leaps in a short period of time. One example of this is how bacteria can quickly develop resistance to certain antibiotics. This can happen with all kinds of genetic information and is particularly common with rhodopsin proteins.

“In the process of figuring out a way to get rhodopsins into multi-celled yeast,” said Burnetti, “we found we could learn about horizontal transfer of rhodopsins that has occurred across evolution in the past by transferring it into regular, single-celled yeast where it has never been before.”

Under the spotlight

To see if they could give a single-celled organism a solar-powered rhodopsin, the team added a rhodopsin gene synthesized from a parasitic fungus to common baker’s yeast. This gene codes for a form of rhodopsin that is inserted into the cell’s vacuole, a part of the cell that can turn chemical gradients made by proteins like rhodopsin into needed energy.

With this vacuolar rhodopsin, the yeast grew roughly 2 percent faster when it was exposed to light. According to the team, this is a major evolutionary benefit, and the ease with which rhodopsins can spread across multiple lineages might be key.
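
To see why a 2 percent edge counts as a major benefit, a back-of-the-envelope calculation helps. The toy model below is illustrative only (the clean 1.02-per-generation factor and the generation counts are assumptions, not measurements from the study): a strain growing 2 percent faster per generation steadily takes over a mixed population.

```python
def phototroph_share(generations, advantage=1.02):
    """Fraction of the population made up of the faster-growing strain,
    starting from a 50/50 mix, after compounding a per-generation
    growth advantage."""
    lit = advantage ** generations  # growth relative to the plain strain
    return lit / (lit + 1.0)

for g in (10, 100, 500):
    print(f"after {g} generations: {phototroph_share(g):.1%} light-powered")
```

After a few hundred generations the light-powered strain dominates almost completely, which is fast on evolutionary timescales; small compounding advantages are exactly how traits like this sweep through lineages.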

“Here we have a single gene, and we’re just yanking it across contexts into a lineage that’s never been a phototroph before, and it just works,” said Burnetti. “This says that it really is that easy for this kind of a system, at least sometimes, to do its job in a new organism.”

Yeasts that function better in the light could also have a longer shelf life. Vacuolar function may contribute to cellular aging, so the group has started collaborating with other teams to study how rhodopsins may reduce aging effects in yeast. Similar solar-powered yeast is also being studied to advance biofuels. The team also hopes to study how phototrophy changes yeast’s evolutionary journey to multicellularity.

NASA finally pries open stuck Bennu asteroid sampler
https://www.popsci.com/science/nasa-opens-stuck-bennu-asteroid-sampler/
Fri, 12 Jan 2024 16:00:00 +0000
NASA’s OSIRIS-REx curation engineer, Neftali Hernandez, attaches one of the tools developed to help remove two final fasteners that prohibited complete disassembly of the Touch-and-Go Sample Acquisition Mechanism head that holds the remainder of material collected from asteroid Bennu. Engineers on the team, based at NASA’s Johnson Space Center in Houston, developed new tools that freed the fasteners on January 10. NASA/Robert Markowitz

It took about 3.5 months to safely access the container.

The post NASA finally pries open stuck Bennu asteroid sampler appeared first on Popular Science.

Even the brilliant minds at NASA sometimes have trouble opening up a tightly-sealed container. Engineers and scientists from Johnson Space Center finally opened a container of asteroid sample material, after two fasteners had been stuck for about 3.5 months. 

[Related: NASA’s OSIRIS mission delivered asteroid samples to Earth.]

On September 24, 2023, the agency received roughly 2.5 ounces of rocks and dust collected from Bennu, a 4.5-billion-year-old near-Earth asteroid. The regolith was dropped off by OSIRIS-REx in a Utah desert, capping the first United States mission to collect samples from an asteroid. The spacecraft traveled 1.4 billion miles from Earth to the asteroid Bennu and back again to drop off the asteroid dust. However, NASA announced in October that some of the material was out of reach inside the Touch-and-Go Sample Acquisition Mechanism (TAGSAM), a storage container at the end of the spacecraft’s robotic arm.

The asteroid samples must be analyzed in a specialized glovebox with a flow of nitrogen to prevent them from becoming contaminated. According to NASA, 35 fasteners held the sampler shut, and two of them were too difficult to open with any of the pre-approved methods for accessing containers of such precious samples. The team initially managed to collect some black dust and debris from the TAGSAM head when the aluminum head was first removed, and could access some of the material inside the canister with tweezers or a scoop while the TAGSAM head’s mylar flap was held down.

To pry open the stuck fasteners, NASA needed to develop new materials and specialized tools that minimize the risk that the precious space rock samples will be damaged or contaminated. These new tools include custom-fabricated bits built from a specific grade of surgical, non-magnetic stainless steel. This is the hardest metal approved for use in the container’s pristine curation gloveboxes. These techniques enabled the team to open the stuck fasteners. 

“In addition to the design challenge of being limited to curation-approved materials to protect the scientific value of the asteroid sample, these new tools also needed to function within the tightly-confined space of the glovebox, limiting their height, weight, and potential arc movement,” Johnson Space Center OSIRIS-REx curator Nicole Lunning said in a statement. “The curation team showed impressive resilience and did incredible work to get these stubborn fasteners off the TAGSAM head so we can continue disassembly. We are overjoyed with the success.”

After a few additional disassembly steps, the remainder of the sample will be fully visible. Image specialists will take ultra-high-resolution pictures of the sample while it is still inside TAGSAM’s head. After imaging, this portion of the sample will be removed and weighed, and the team will determine the total mass of the asteroid material captured by the mission.

Bennu dates back to the crucial first 10 million years of the solar system’s development. Its age offers scientists a window into what this time period looked like. The space rock is shaped like a spinning top and is about one-third of a mile across at its widest part–slightly wider than the Empire State Building is tall. It revolves around the sun between the orbits of Earth and Mars.

An analysis of Bennu’s dust conducted last fall revealed that the asteroid had a lot of water in the form of hydrated clay minerals. The team believes that signs of water on asteroids support the current theory of how water arrived on Earth.

[Related: NASA’s first asteroid-return sample is a goldmine of life-sustaining materials.]

OSIRIS-REx principal investigator Dante Lauretta told PopSci in October that asteroids like Bennu were likely responsible for all of Earth’s oceans, lakes, rivers, and rain. Water likely arrived when space rocks landed on our planet about 4 billion years ago. The asteroid Bennu has water-bearing clay with a fibrous structure, which was the key material that ferried water to Earth, according to Lauretta.

The Bennu sample also contained about 4.7 percent carbon. According to Daniel Glavin, the OSIRIS-REx sample analysis lead at NASA’s Goddard Space Flight Center, this sample has the highest abundance of carbon that a team from the Carnegie Institution for Science have measured in an extraterrestrial sample. Glavin told PopSci that when the team opened it, “There were scientists on the team going ‘Wow, oh my God!’ And when a scientist says that ‘Wow;’ that’s a big deal.”

In the spring, the curation team is scheduled to release a catalog of the OSIRIS-REx samples for the global scientific community to study. OSIRIS-REx, now renamed OSIRIS-APEX, is currently on its way to study a potentially hazardous asteroid named Apophis. That rendezvous is scheduled for sometime in 2029.

How hummingbirds switch gears at breakneck speeds
https://www.popsci.com/environment/hummingbirds-switch-gears/
Thu, 11 Jan 2024 20:00:00 +0000
A pink hummingbird sits on a perch inside of a tunnel. Green and black stripes are projected on the walls of the tunnel.
An experiment using a tunnel and various projections revealed that hummingbirds used two distinct strategies to control hovering and forward flight. Anand Varma

The agile avians rely on sensory strategies that change based on their flight method.

The post How hummingbirds switch gears at breakneck speeds appeared first on Popular Science.


Hummingbirds are some of the fastest and most agile birds on Earth. They can squeeze into incredibly small spaces to get nectar and endure forces as high as 9Gs during courtship dives without getting physically hurt. They also have very controlled methods of flight. Hummingbirds use two distinct sensory strategies to control how they fly, depending on whether they are moving forward or hovering. The findings are described in a study published January 10 in the journal Proceedings of the Royal Society B.

[Related: Hummingbirds have two creative strategies for flying through tight spaces.]

When flying forward, hummingbirds rely on an “internal forward model.” This model is an ingrained and intuitive autopilot that allows them to gauge speed while experiencing multiple visual stimuli. 

“There’s just too much information coming in to rely directly on every visual cue from your surroundings,” study co-author and University of British Columbia zoologist and comparative physiologist Vikram B. Baliga said in a statement. 

However, when the birds are hovering or handling cues that may require them to change their altitude, the team found that they use more real-time, direct visuals from their environment. 

To study these flight patterns, the team brought 11 wild adult male Anna’s hummingbirds (Calypte anna) into the lab. They prompted the birds to repeatedly fly from a perch to a feeder in a tunnel about 13 feet long and recorded videos of each flight. The team also projected patterns on the front and side walls of the tunnel to test how the hummingbirds reacted to this variety of visual stimuli.

University of British Columbia zoologists observed how hummingbirds reacted to a variety of visual stimuli in a tunnel they built in a lab. CREDIT: Roslyn Dakin.

In some flight scenarios, the researchers projected vertical stripes moving at various speeds on the side of the tunnel to mimic forward motion. Other times, they used horizontal stripes on the side to mimic a change in altitude. On the front wall, the team projected rotating swirls. These circular patterns were designed to create the illusion of a change in position.

“If the birds were taking their cues directly from visual stimuli, we’d expect them to adjust their forward velocity to the speed of vertical stripes on the side walls,” said Baliga. “But while the birds did change velocity or stop altogether depending on the patterns, there wasn’t a neat correlation.”

However, the team observed that the hummingbirds adjusted more directly to stimuli indicating a change in altitude while they were flying. When the birds were hovering, they also worked to adjust their position so they were closer to the shifting spirals on the front wall. 

[Related: This hybrid hummingbird’s colorful feathers are a genetic puzzle.]

“Our experiments were designed to investigate how hummingbirds control flight speed,” study co-author and University of British Columbia zoologist and comparative physiologist Doug Altshuler said in a statement. “But because the hummingbirds took spontaneous breaks to hover during their flights, we uncovered these two distinct strategies to control different aspects of their trajectories.”

The findings provide insight into how these speedy birds perceive the world when they transition their flight patterns. Data like this could also help engineers develop better onboard navigation systems for drones and hovering vehicles in the future.

Peregrine sent back this image from its now abandoned lunar mission https://www.popsci.com/science/peregrine-image-sent/ Thu, 11 Jan 2024 17:31:29 +0000 https://www.popsci.com/?p=598255
An image taken by a camera aboard the Peregrine spacecraft, with Earth in the top right.
An image taken by a camera aboard the Peregrine spacecraft, with Earth in the top right. Astrobotic

The lander is expected to lose power only five days after launching.

The post Peregrine sent back this image from its now abandoned lunar mission appeared first on Popular Science.


Following a successful launch aboard a Vulcan Centaur rocket on Monday, January 8, private space company Astrobotic has abandoned its attempt to land its Peregrine spacecraft on the moon. In an update from the Pittsburgh-based company, Astrobotic reported that Peregrine sent back a few images from the earlier parts of its journey. One of the images showed what Astrobotic described as a “curved sliver” that appears to be Earth. 

[Related: Peregrine lunar lander experiences ‘critical loss of propellant’ following successful launch.]

“Our flight dynamics team has confirmed that the curved sliver in this image taken on our first day of operations is, in fact, Earth! This image from our spacecraft simulator shows the camera’s view of Earth at the time the photo was taken,” the company wrote in the January 10 update.

Astrobotic also has gathered data from the payloads that were designed to communicate with the lander. “All 10 payloads requiring power have received it, while the remaining 10 payloads aboard the spacecraft are passive,” Astrobotic wrote in a January 11 update. “These payloads have now been able to prove operational capability in space and payload teams are analyzing the impact of this development now.”

What went wrong?

About seven hours after launch, Peregrine was unable to shift its solar panels towards the sun so that its batteries could charge. While the engineering team was able to turn the panels, more problems developed. 

Astrobotic believed that the root of the problem was a failure in the vehicle’s propulsion system that was causing a critical loss of propellant. The company shared the first image of the lander in space, with its outer insulation appearing very crinkled. 

The first image from Peregrine in space. The camera is mounted atop a payload deck and shows Multi-Layer Insulation (MLI) in the foreground.
The first image from Peregrine in space. The camera is mounted atop a payload deck and shows Multi-Layer Insulation (MLI) in the foreground. CREDIT: Astrobotic.

By Monday evening, Astrobotic announced that this fuel leak was causing the thrusters in the spacecraft’s attitude control system to “operate well beyond their expected service life cycles to keep the lander from an uncontrollable tumble.” The mission’s priority also became maximizing the data and scientific information that Peregrine could capture and send back to Earth. 

On Tuesday, January 9, Astrobotic said that the leak meant “there is, unfortunately, no chance of a soft landing on the moon.” By January 10, Peregrine was roughly 192,000 miles from Earth. The spacecraft was “stable and fully charged” and gathering “valuable data.” The company estimated that it would likely shut down sometime on Friday, January 12.

[Related: Inside NASA’s messy plan to return to the moon by 2024.]

What will happen to Peregrine?

Peregrine will join the estimated 29,000 pieces of space debris larger than 10 centimeters (3.9 inches). The spacecraft will also be a floating gravesite. Its payload contained DNA samples and portions of cremated remains of three former United States presidents, Star Trek creator Gene Roddenberry, and several members of the original cast of the groundbreaking sci-fi series. 

What is next for public-private lunar exploration?

With this mission, Astrobotic hoped to become the first private business to successfully land on the moon. This is a feat only four countries–Russia, China, India, and the United States–have accomplished. 

A second lander from Houston-based Intuitive Machines is scheduled to launch in February. NASA has given both of these companies millions of dollars to construct and fly their own lunar landers, so that the privately owned landers can explore landing sites and deliver critical technology and experiments before astronauts arrive. Astrobotic’s contract with NASA for the Peregrine lander was $108 million, with more to come.

[Related: NASA delays two crewed Artemis moon missions.]

This week, NASA leadership announced that it is delaying future missions to the moon, citing safety issues and delays in developing lunar landers and spacesuits. Originally scheduled to launch in November of this year, the Artemis II mission that will send four astronauts around the moon has been postponed to September 2025. Meanwhile, the moon-landing mission Artemis III will now aim for September 2026 instead of late 2025. The Artemis IV mission remains on track for September 2028.

Oldest known fossilized reptile skin was dug up in an Oklahoma quarry https://www.popsci.com/science/oldest-reptile-skin-fossil/ Thu, 11 Jan 2024 16:00:00 +0000 https://www.popsci.com/?p=598236
A visual collage of skin fossils described in the new study. The mummified skin specimen is shown sliced into two pieces in the center-left of the image. The surrounding specimen scans are of fossilized skin impressions.
A visual collage of skin fossils described in the new study. The mummified skin specimen is shown sliced into two pieces in the center-left of the image. The surrounding specimen scans are of fossilized skin impressions. Current Biology, Mooney et al.

Paleontologists believe the fossil is at least 285 million years old.

The post Oldest known fossilized reptile skin was dug up in an Oklahoma quarry appeared first on Popular Science.


Usually, fossilized animal remains consist of bits of bone or impressions left by a long-dead creature. But sometimes a specific place has just the right conditions to preserve even more. From Richards Spur, a long filled-in cave network and active quarry in southern Oklahoma, a group of paleontologists say they’ve identified and described the oldest fossilized reptile skin ever found. The soft tissue fossil is a rare find, enabled by a series of chance events. It offers a glimpse into a distant evolutionary past that pre-dates both mammals and the oldest dinosaurs.

The skin sample, about the size of a fingernail and literally paper thin, is described in a study published on January 10 in the journal Current Biology, along with other fossil findings. The ancient, reptilian skin flake is an estimated 286-289 million years old. That’s at least 21 million years older than the next oldest example and more than 130 million years older than the vast majority of comparable samples, which come from mummified dinosaurs that lived in the late Jurassic, says lead study author Ethan Mooney, a biology master’s student at the University of Toronto studying paleontology. 

The fossils’ estimated age is based on the site where they were found. Once, Richards Spur was an open limestone cave, but between 286 and 289 million years ago, it filled in with clay and mud deposits, according to Mooney and Maho. During that in-filling process, the cave stopped forming, so the youngest stalagmite rings present in the cave represent the approximate age of the sediments and the fossils they contain, according to previous research using Uranium-Lead radioisotope dating.

Roger Benson, a paleontologist and curator at the American Museum of Natural History who was not involved in the study, agrees these methods and assumptions are sound. “All the evidence, especially what sorts of fossil groups are present, is consistent with an early Permian age (around 300 to 273 million years ago),” he wrote in an email. 

In addition to the fossilized piece of “skin proper,” the researchers also documented multiple preserved impressions of skin–a much more common type of fossil to find. But unlike the impressions, which are essentially the outline of an animal pressed into stone, their most notable fossil let the researchers assess its cross section and identify layers and detail that would’ve otherwise been unknowable. 

“At first we thought they were broken pieces of bone,” says Tea Maho, a study co-author and a PhD student at the University of Toronto, of all the epidermal fossils. The skin, she says, “could have just been so easily disregarded until we looked under a microscope.” Then it became clear that they were seeing preserved soft tissue, an exceptionally rare thing among samples that old. 

Usually, soft tissue breaks down quickly before it can fossilize. But Richards Spur is a hotbed of paleontological discovery. The study suggests that oxygen-poor sediments and the presence of oil seeps in the cave system helped to preserve the prehistoric animals and carcasses that happened to fall in. In this environment, the skin was mummified–the technical term in paleontology for when organic matter dries out before decaying. Because the site is also an actively mined quarry, new layers of fossils are constantly being uncovered.

“The sheer chance for a soft tissue structure to be preserved, to survive until now–through the mining process–to have been found…and then described by us is quite an incredible story,” Mooney says. Especially given how fragile the preserved epidermis is. “If you were to have pressed it a little too hard, it would have just cracked,” says Maho. Thankfully, for our understanding of vertebrate history, the scientists were careful enough to keep the skin sample from becoming dust. 

Though they can’t say for certain what animal the skin specimen was from, Maho and Mooney have an idea. Captorhinus aguti was a lizard-like animal known to have been common in the region during the Permian Period. It had four legs, a tail, was about 10 inches long, and had an omnivorous diet–eating insects, small vertebrates, and occasionally plants. Fossilized skeletal remains of C. aguti have been found at the same site, and aspects of the skin sample are similar to features of those larger fossils. 

On top of being the oldest reptile skin ever discovered, Mooney notes the newly described specimen is also the oldest amniote skin ever found. Amniotes are the subcategory of animals that encompasses reptiles, birds, and mammals, and the finding offers insight into a key moment in animal biology. “It comes from a pivotal time in the evolution of life as we know it. It represents the first chapter of higher vertebrate evolution,” from fish and amphibians to creatures not reliant on aquatic habitats to survive or breed. Skin is the body’s largest organ and plays a major role in moisture regulation. Paleontologists have long assumed that good skin was a big deal for early terrestrial animals; now there’s additional fossil evidence for that view, he adds. 

Incredibly, the ~289 million-year-old specimen closely resembles present-day crocodile skin, according to the study. Both living crocodiles and the ancient bit of preserved epidermis have a pebble-like texture and a non-overlapping scale pattern. This similarity, Mooney says, is further proof of skin’s outsized role in adapting to life on land. “The fact that we have an example from one of these earliest reptiles and it’s quite consistent with what we see in modern reptiles underscores how important that structure was and how successful it was in doing its job.” 

Nearly 300 million years ago, “life would have looked very different,” Mooney says. But reptile skin might not have.

The best telescopes for deep space in 2024 https://www.popsci.com/gear/best-telescopes-for-deep-space/ Wed, 10 Jan 2024 21:00:00 +0000 https://www.popsci.com/?p=598022
Four of the best telescopes for deep space side by side on a plain background.
Amanda Reed

Gaze deep into the heavens with these powerful telescopes.

The post The best telescopes for deep space in 2024 appeared first on Popular Science.


We may earn revenue from the products available on this page and participate in affiliate programs. Learn more ›

Best overall: Celestron NexStar 8SE
SEE IT

It offers powerful magnification and a useful Go-To mount for finding celestial objects.

Best smart: Unistellar Equinox 2
SEE IT

Ponder the heavens and take photos of space from the comfort of your couch.

Best budget: Sky-Watcher 8" Flextube 200P
SEE IT

A budget-friendly option with plenty of magnification and light-gathering capabilities.

Telescopes for deep space allow you to gaze at the wonders of our universe in a way that wouldn’t be possible with other telescopes. These powerful devices use larger apertures to gather loads of light, illuminating what would be too dim to see otherwise. Many even offer motorized mounts that automatically move to the celestial objects you want to admire. Whether you are a new astronomer or a seasoned pro, the best telescopes for deep space will broaden your horizons.

How we chose the best telescopes for deep space

Telescopes for deep space have more specific requirements than general telescopes. Because of this, we selected powerful scopes with large apertures and plenty of magnification. Beyond those two factors, we also looked for options with and without motorized mounts. Finally, we assessed the quality of the optics, build quality, mount type, and any extra features. We selected based on a mix of hands-on telescope experience, expert insight, editorial reviews, and user feedback. 

The best telescopes for deep space: Reviews & Recommendations

If you want to see beyond planets and our moon, you’ll need a telescope for deep space. These powerful scopes will open up the ability to gaze at deep sky objects (DSOs), such as star clusters, nebulas, and galaxies, providing new opportunities for epic stargazing sessions.

Best overall: Celestron NexStar 8SE

Celestron

SEE IT

Specs

  • Optical design: Schmidt-Cassegrain
  • Mount: Computerized alt-azimuth  
  • Aperture: 203mm (8 inches)
  • Focal length: 2032mm
  • Eyepiece: 25mm (81x) Plössl
  • Weight: 24 pounds
  • Dimensions: 42.01 x 23.66 x 12.99 inches

Pros

  • Computerized mount makes tracking easy
  • Very sharp across entire field of view
  • Large aperture
  • Portable

Cons

  • Slewing results in some lag

Our best overall pick comes from one of the most trusted telescope manufacturers. The Celestron NexStar 8SE is a relatively portable Schmidt-Cassegrain scope. It weighs 24 pounds but is quite compact, so you can bring it to dark-sky locations if needed. 

This telescope for deep space offers a minimum useful magnification of 29x and a maximum useful magnification of 480x, making it a versatile tool for viewing a wide range of celestial objects, including planets. Plus, the eight-inch aperture captures plenty of light for DSOs. Provided you don’t have much light pollution, you’ll be able to clearly see nebulas, galaxies, and more.  

This type of telescope needs to be aligned with the sky before use, but Celestron’s SkyAlign makes the process quick and easy, even for beginners. Another plus for beginners and pros alike is the included alt-azimuth mount, a fully automated Go-To mount. You can select from a database of 40,000 objects, and the telescope will automatically find and track them across the sky for you. It also comes with a 25mm eyepiece, StarPointer finderscope, visual back, and mirror star diagonal. The NexStar 8SE is pricey, but it offers value for that price that is hard to beat.

Best smart: Unistellar Equinox 2

Unistellar

SEE IT

Specs

  • Optical design: Newtonian reflector
  • Mount: Computerized alt-azimuth  
  • Aperture: 114mm
  • Focal length: 450mm
  • Eyepiece: Not applicable (no eyepiece)
  • Weight: 19.8 pounds
  • Dimensions: 18.6 x 11.2 x 30.4 inches

Pros

  • Very easy to use
  • Can be used from a distance
  • Filters out light pollution 
  • Portable
  • Good battery life

Cons

  • Expensive
  • Lack of an eyepiece isn’t for everyone

For those who want to gaze into the heavens from the comfort of their couch, the Unistellar Equinox 2 is the way to go. This smart telescope is unique in that it doesn’t feature an eyepiece. Instead, you pair the scope with the easy-to-use Unistellar app and view from there. This won’t be everyone’s cup of tea, but it makes deep space observation easier for groups and kids. It also came in handy for those mosquito-ridden Florida nights, as my husband and I could stargaze from inside.

The Equinox 2 features a sturdy base with a computerized alt-az mount. After an easy alignment process, you can search a database of 5,000 objects, including DSOs. Then, the telescope will automatically find and track them across the night sky. Or, you can use the in-app joystick to browse on your own. 

One of the best features of the Equinox 2 is its Deep Dark Technology, which filters out light pollution. This opens up stargazing even to those living in cities. Though I don’t live in a large city, there is a lot of light pollution, and it was remarkable what I was able to see with this setting turned on. Should you want to travel to dark sky locations, the telescope is relatively compact and portable, and you can even purchase a bundle with a backpack for easier transportation. Want an even more up-to-date (but also more expensive) model? Check out the newly announced Odyssey Pro.

Best splurge: Celestron Advanced VX 8 Edge HD

Celestron

SEE IT

Specs

  • Optical design: Schmidt-Cassegrain
  • Mount: Computerized equatorial mount 
  • Aperture: 203mm (8 inches)
  • Focal length: 2032mm
  • Eyepiece: 12mm (150x) and 40mm (38x)
  • Weight: 61 pounds (full kit)
  • Dimensions: ‎9.1 x 9.1 x 9.1 inches

Pros

  • Very compact
  • Accurate Go-To mount
  • Extremely high-quality optics
  • Excellent for both astrophotography and observation

Cons

  • Mount isn’t sturdy enough for long-exposure astrophotography
  • Expensive

Our splurge pick is also one of the best telescopes for astrophotography. It features high-quality optics that fully correct for coma and field curvature, resulting in a truly flat field. Plus, the StarBright XLT coatings provide better light transmission for bright, sharp images. 

The included equatorial mount makes tracking objects easy, so you can make long observations or take long-exposure photos. It is computerized with Go-To functionality, making it easy to find and automatically track objects of interest. The mount even features ports for hand control, an autoguider, and two AUX ports for optional accessories. All those ports make it an ideal option for seasoned pros or for beginners who want something to grow into. 

Adding to the versatility of this scope is the ability to use three different f-stop configurations. You can attach a camera to the scope for f/10 or attach the eight-inch EdgeHD focal reducer to shoot at f/7. Finally, the EdgeHD is Fastar/Hyperstar compatible, making it possible to shoot at f/2. It is also quite compact, albeit fairly heavy, making it feasible to travel with. This is an expensive telescope for deep space, but you won’t be disappointed if you want something to last a long time or are looking for extremely high-quality optics.

Best compact: Vaonis Vespera

Vaonis

SEE IT

Specs

  • Optical design: Apochromatic (APO) quadruplet refractor
  • Mount: Computerized alt-azimuth  
  • Aperture: 50mm (2 inches)
  • Focal length: 200mm
  • Eyepiece: Not applicable
  • Weight: 11 pounds
  • Dimensions: 15 x 8 x 3.5 inches

Pros

  • Very compact and portable
  • Helps remove light pollution
  • Sleek, futuristic design
  • Ideal for group observations

Cons

  • Images aren’t very high-quality

The Vaonis Vespera is one of the best options if you love to travel and want a telescope for deep space to take along. Weighing only 11 pounds and measuring 15 by 8 by 3.5 inches, the Vespera is very compact for what it provides. It also features a futuristic design, which will look nice sitting in your home. 

Like the Unistellar telescope, this option doesn’t offer an eyepiece. It can pair with up to five smartphones or tablets via the Singularity app, making it a fun way to stargaze with friends. Also like the Unistellar, it can filter out light pollution so that you can view DSOs even in cities. The telescope and app are both easy to use, so you’ll have no issues if you are a complete novice. 

The Vespera uses a Sony IMX462 image sensor to produce images. Unfortunately, this isn’t a great option if you want high-quality images of celestial objects. It only offers a resolution of 1920 by 1080 pixels, and users report that images are a little on the soft side. But it uses your phone’s GPS to calibrate itself and automatically tracks objects, taking the work out of stargazing.

Best budget: Sky-Watcher 8″ Flextube 200P

Sky-Watcher

SEE IT

Specs

  • Optical design: Newtonian reflector
  • Mount: Dobsonian
  • Aperture: 203mm (8 inches)
  • Focal length: 1200mm
  • Eyepiece: 10mm (120x) and 25mm (48x)
  • Weight: 52 pounds (full kit)
  • Dimensions: Base: 29.5 x 20 inches

Pros

  • Included mount is very sturdy
  • Very large aperture for the price
  • Comes with two eyepieces and an eyepiece tray
  • Smooth movements for manual tracking

Cons

  • No motorized Go-To functionality

Telescopes for deep space are not cheap; there’s no getting around it. But the Sky-Watcher 8” Flextube 200P offers a much more budget-friendly option for deep-sky observation. Coming in well below $1,000 at the time of writing, this device provides a large eight-inch aperture for plenty of light gathering. 

The Flextube 200P comes with two eyepieces, offering more versatility. It also includes an eyepiece tray to keep your accessories organized. The 1200mm focal length offers plenty of reach for deep space viewing, with a maximum useful magnification of 400x. The high-quality mount allows for smooth movements as you scan the sky.

This is not a lightweight device at 52 pounds (the base and scope combined). But it is relatively compact so that it will fit well in smaller spaces. Should you need to transport the scope, it comes apart in two pieces to make it easier. The Flextube 200P also doesn’t feature a motorized mount, meaning it requires manual input for finding and tracking objects. Purists will appreciate that, but it may take some getting used to for novices. It comes with a 50mm finder, though, which makes it easier to find what you are after.

What to consider when buying the best telescopes for deep space

Telescopes for deep space have some specific requirements beyond most scopes. Add in all the highly technical jargon that surrounds telescopes, and it can be hard to know what actually matters when shopping. Below are some key features you’ll need to consider when choosing a new telescope to look out at the cosmos’ wonders. 

Aperture

The aperture is the most important aspect of a telescope for deep space (yes, this is even more important than magnification). A telescope’s aperture controls how much light is let in. It is measured in millimeters or inches. If you want to check out deep-space objects, you’ll need a telescope with a large aperture to gather as much light as possible. Broadly speaking, your best bet is to choose the largest aperture you can afford. 

The exact type of object you want to check out could also guide your decision. Celestron suggests a minimum of 5 inches (120mm) for open star clusters and at least 8-11 inches (200-280 mm) for galaxies. 
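Light-gathering power grows with the area of the aperture, not its diameter, so seemingly small jumps in inches pay off quickly. Here is a quick back-of-the-envelope sketch using the numbers above (the helper function is ours, for illustration only):

```python
import math

def light_grasp_ratio(diameter_a_mm: float, diameter_b_mm: float) -> float:
    """How much more light aperture A gathers than aperture B (area scales with diameter squared)."""
    area = lambda d: math.pi * (d / 2) ** 2
    return area(diameter_a_mm) / area(diameter_b_mm)

# An 8-inch (203mm) scope vs. the 5-inch (120mm) minimum suggested for star clusters:
print(round(light_grasp_ratio(203, 120), 1))  # about 2.9x the light
```

In other words, stepping up from the star-cluster minimum to the galaxy-viewing range nearly triples the light reaching your eye.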

Focal length & magnification

The focal length of your telescope is the distance, measured in millimeters, between the primary lens or mirror and the point where light comes into focus at the other end. Focal length matters because it is part of what determines a telescope’s magnification. 

While it might be somewhat counterintuitive, you don’t need crazy high magnification to view deep-space objects (DSOs). In fact, too much magnification may prevent you from seeing the object in the best light. Depending on what exactly you hope to check out, a focal length of 800 to 1250mm or so is best. 
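Magnification itself is just the telescope’s focal length divided by the eyepiece’s focal length, which is why the spec lists above pair each eyepiece with a power figure. A quick sketch of that arithmetic (the function name is ours, not from any vendor software):

```python
def magnification(telescope_focal_mm: float, eyepiece_focal_mm: float) -> float:
    """Magnification = telescope focal length / eyepiece focal length."""
    return telescope_focal_mm / eyepiece_focal_mm

# The NexStar 8SE's 2032mm focal length with its included 25mm eyepiece:
print(round(magnification(2032, 25)))  # 81x, matching the listed spec
```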

Eyepiece

The other piece of the magnification puzzle is the eyepiece. Most telescopes will come with two eyepieces, providing different levels of magnification for better versatility. You can also purchase eyepieces separately if you want more options. 

Keep in mind that each telescope will have a minimum and maximum useful magnification. If you choose an eyepiece with too much magnification for your telescope, objects will appear larger but no sharper, so you may not get a very clear image. On the other hand, if you go with too little magnification, there will be a vignette around your view, and you won’t see the entire field of view. 

Without getting too into the weeds, there are also multiple types of eyepieces, each with its pros and cons. The most common ones you’ll find are Barlow and Plössl. A Barlow lens is an add-on with optical elements that multiply an eyepiece’s magnification, typically by a factor of two or three. Plössl eyepieces offer a wider field of view, which makes them ideal for deep-sky viewing. 
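To see how a Barlow fits into the numbers: it multiplies whatever the eyepiece alone delivers, and you can check the result against a scope's maximum useful magnification. A simple sketch using the Sky-Watcher Flextube 200P's listed specs (the function is ours, for illustration):

```python
def eyepiece_power(telescope_focal_mm: float, eyepiece_focal_mm: float,
                   barlow: float = 1.0) -> float:
    """Magnification for an eyepiece, optionally boosted by a Barlow lens factor."""
    return (telescope_focal_mm / eyepiece_focal_mm) * barlow

# Flextube 200P (1200mm focal length) with its 25mm eyepiece, alone and with a 2x Barlow:
print(round(eyepiece_power(1200, 25)))            # 48x, as listed in the specs
print(round(eyepiece_power(1200, 25, barlow=2)))  # 96x, well under the 400x maximum
```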

Optical design

There are many types of telescopes, but broadly speaking, there are three categories for consumers: Refractor, reflector, and catadioptric.

Refractor telescopes use lenses (typically made of glass) to allow the light to travel in a straight path from the front objective lens to the eyepiece at the back of the telescope. These are easy to use, reliable, and require little to no maintenance. However, if you want a large aperture, they are quite long. They also get expensive, especially for high-quality refractors, since the glass lenses are pricey to produce well. 

Reflector telescopes, including Newtonian and Dobsonian types, use mirrors to bounce light inside the device, allowing for a shorter design than refractor scopes. Mirrors are cheaper to make than glass lenses, so reflector telescopes are generally more affordable than refractors of the same aperture. They can, however, be quite bulky and heavy, and they need to be collimated (the process of aligning the mirrors), which adds a step before you can use your scope. 

Finally, catadioptric telescopes use both mirrors and lenses, allowing for a compact, portable design. Schmidt-Cassegrains and Maksutov-Cassegrains are two types of catadioptric telescopes that you’ll encounter. 

Mount type

The type of mount your telescope uses will impact how you can use it. At the risk of sounding like a broken record, there are three primary types you’ll encounter: Dobsonian, alt-azimuth, and equatorial.

Alt-azimuth mounts (also called alt-az) are the simplest and, therefore, most affordable. They allow for altitude (vertical) and azimuth (horizontal) adjustments. High-quality alt-az mounts also provide smooth tracking abilities; some even feature a motor for automated tracking. 

Dobsonian mounts hold the telescope in a rocker box that swivels on a base much like a lazy Susan. Full-size models sit directly on the ground, while tabletop versions need a sturdy, flat surface such as a table. Either way, they provide excellent stability as long as you have a good place to put them. They are not very portable, though, so they are best suited for homes where you can set the scope up and leave it. 

Finally, equatorial mounts counteract the Earth’s rotation, allowing you to focus on a single object and track it across the night sky. As a result, they are the preferred choice for serious astrophotography and long observations of single celestial objects. 

FAQs

Q: What makes a telescope good for deep-space observation?

The most important feature of a telescope for deep-space observation is the aperture of the objective lens. Larger apertures collect more light, which is necessary for seeing faint, faraway objects. You’ll also need a sturdy mount to keep the telescope still throughout your viewing session. If you want to view a single object for long periods, go with an equatorial mount or a smart telescope that can track automatically for you.

Q: Can you buy used telescopes?

You can absolutely buy used telescopes. Most telescopes require little maintenance, so purchasing a used one is typically a safe bet. It can save you a lot of money, allowing you to get into stargazing for less while reducing your environmental impact in at least a small way.

Q: What can you see with a telescope for deep space?

Telescopes for deep space allow you to see beyond our solar system. Their targets are typically called deep-space objects, or DSOs, and include galaxies, nebulae, and star clusters. Keep in mind, however, that in order to view deep-space objects, you’ll need extremely dark skies. Almost any level of light pollution can prevent you from seeing distant objects. 

Final thoughts on the best telescopes for deep space

Telescopes for deep space have specific requirements that make them more expensive than basic scopes. But, if there’s room in your budget, they allow for epic stargazing, opening up new discoveries. And although they may be more advanced than cheap telescopes, most are still very beginner-friendly.

Why trust us

Popular Science started writing about technology more than 150 years ago. There was no such thing as “gadget writing” when we published our first issue in 1872, but if there was, our mission to demystify the world of innovation for everyday readers means we would have been all over it. Here in the present, PopSci is fully committed to helping readers navigate the increasingly intimidating array of devices on the market right now.

Our writers and editors have combined decades of experience covering and reviewing consumer electronics. We each have our own obsessive specialties—from high-end audio to video games to cameras and beyond—but when we’re reviewing devices outside of our immediate wheelhouses, we do our best to seek out trustworthy voices and opinions to help guide people to the very best recommendations. We know we don’t know everything, but we’re excited to live through the analysis paralysis that internet shopping can spur so readers don’t have to.

The post The best telescopes for deep space in 2024 appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

NASA delays two crewed Artemis moon missions https://www.popsci.com/science/nasa-delays-artemis-moon-missions/ Wed, 10 Jan 2024 19:00:00 +0000 https://www.popsci.com/?p=598186
NASA’s Space Launch System (SLS) rocket with the Orion spacecraft atop launches the agency’s Artemis I flight test on November 16, 2022. The Artemis I mission was the first integrated test of the agency’s deep space exploration systems. Steven Seipel/NASA

Astronauts won’t walk on the moon again until 2026 at the earliest.

On January 9, NASA leadership announced that it is delaying future missions to the moon. Originally slated to launch in November 2024, the Artemis II mission that will send four astronauts around the moon has been postponed to September 2025. Meanwhile, the moon-landing mission Artemis III will now aim for September 2026 instead of late 2025. The Artemis IV mission remains on track for September 2028. 

[Related: Inside NASA’s messy plan to return to the moon by 2024.]

The agency cited safety concerns with its spacecraft and development issues with the lunar landers and spacesuits, both of which are being made by private industry. The announcement came within hours of private space company Astrobotic abandoning its attempt to land a spacecraft on the moon due to a fuel leak. Peregrine Mission One launched on January 8 as part of NASA’s commercial lunar program, and the lander was intended to serve as a support scout for Artemis astronauts. 

When it eventually launches, Artemis II will not enter orbit around the moon the way that Apollo missions did. Instead, the Orion capsule will swing around the moon and use lunar gravity to sling the spacecraft back towards the Earth. The entire trip is expected to take about 10 days. In April 2023, NASA announced that the crew will be three of its astronauts—Victor Glover, Christina Koch, and Reid Wiseman—and Canadian astronaut Jeremy Hansen. 

NASA plans to land two astronauts on the moon near its south pole for the first time in its now rescheduled Artemis III mission. If successful, it will mark humanity’s first return to the lunar surface in over 50 years. 

“Safety is our top priority, and to give Artemis teams more time to work through the challenges with first-time developments, operations and integration, we’re going to give more time on Artemis II and III,” NASA Administrator Bill Nelson said in the livestreamed briefing. 

The officials cited several technical issues for the delay, including the electronics in the life support system that will need to sustain the astronauts inside the Orion and the heat shield on the capsule. 

According to Amit Kshatriya, deputy associate administrator for NASA’s Moon to Mars program, the heat shield issues that the Orion capsule experienced during the uncrewed Artemis I test flight around the moon in November and December 2022 have been a major concern as the team analyzed the data from that mission. They found that while Orion’s heat shield sufficiently protected the capsule, a large amount of the shield burned away from the spacecraft. 

[Related: Before the Artemis II crew can go to the moon, they need to master flying high above Earth.]

“We did see the off-nominal recession of some char that came off the heat shield, which we were not expecting,” Kshatriya said in the briefing. “Now, this heat shield is an ablative material—it is supposed to char—but it’s not what we were expecting, with some pieces of that char to be liberated from the vehicle.”

Over the past 10 years, NASA’s moon-landing effort has been delayed repeatedly. In December 2023, the Government Accountability Office reported that Artemis III’s targeted December 2025 lunar landing was unlikely. The accountability office cited an optimistic schedule for developing SpaceX’s Starship lunar lander and the spacesuits necessary for walking on the moon. In 2023, two Starship test launches failed to reach orbit. 

These delays have added billions of dollars to the cost of the program. According to the Associated Press, recent government audits project that it will cost $93 billion through 2025.

The post NASA delays two crewed Artemis moon missions appeared first on Popular Science.

NASA plans to unveil experimental X-59 supersonic jet on January 12 https://www.popsci.com/technology/x-59-supersonic-jet-unveil/ Wed, 10 Jan 2024 18:00:00 +0000 https://www.popsci.com/?p=598149
The livestream event will begin at 4pm on January 12 across multiple platforms and websites. NASA / Lockheed Martin

The cutting-edge plane aims to generate a 75 decibel ‘sonic thump’ instead of a sonic boom.

It may officially be Hollywood awards season, but NASA is also rolling out a red carpet of its own. On January 12 at 4pm EST, the agency will livestream the official public debut of its highly anticipated X-59 QueSST experimental aircraft. Designed alongside Lockheed Martin’s secretive Skunk Works division, the currently one-of-a-kind X-59 QueSST (short for Quiet SuperSonic Technology) is intended to demonstrate a potentially industry-shifting capability: supersonic passenger flight without the sonic boom.

A sonic boom’s trademark thunderclap has long been associated with vehicles traveling faster than Mach 1. As a plane’s velocity surpasses the speed of sound, the shockwave formed by its wake results in a percussive noise capable of startling nearby humans and animals, as well as shattering windows if loud enough.

[Related: This experimental NASA plane will try to break the sound barrier—quietly.]

While certain military aircraft are permitted to generate sonic booms, commercial flights above the US have been prohibited from doing so since the Concorde jet’s retirement in 2003. The cutting-edge X-59, in contrast, is designed to travel around 938 mph while creating only a “sonic thump” that is supposedly much quieter than an average sonic boom’s 110 decibels. NASA representatives previously estimated the X-59 will generate around 75 decibels of sound, or about as loud as slamming a car door.


Engineers have spent years creating and honing the X-59’s state-of-the-art design. The experimental craft to be showcased on Friday is far slimmer and more elongated than similar planes, measuring roughly 95 feet long and less than 30 feet wide. As New Scientist points out, that’s narrower than an F-16, but twice as long. The nose alone comprises nearly half the plane’s length, ensuring that shockwaves generated near the front do not merge with waves created in the rear and emit a deafening boom. Because of this, the plane’s pilot will rely on 4K video screens inside the cockpit for their visuals to guide the aircraft.

It’s highly unlikely that X-59 will publicly take to the skies on Friday. Instead, the ceremony is meant to mark the beginning of a multiyear testing phase that will see the X-59 speed above “several US communities” selected by NASA’s QueSST team, who will then gather data and assess public reactions to the supposedly “gentle” sonic thump.

“This is the big reveal,” Catherine Bahm, manager of NASA’s Low Boom Flight Demonstrator project overseeing the X-59’s development and construction, said in a separate announcement. “The rollout is a huge milestone toward achieving the overarching goal of the QueSST mission to quiet the sonic boom.”

To call a sonic thump “quiet” may be a bit of an oversell, however. According to a 2022 Government Accountability Office (GAO) report, many people aren’t exactly pleased with daily disruptions caused by existing subsonic air travel, so it’s hard to envision sonic thumps being quieter than the average passenger jet. And even if the X-59’s volume proves nominal, environmental advocates continue to voice concerns over the potentially dramatic increase in carbon emissions that a new era of hypersonic flights could generate. In a letter penned to NASA administrator Bill Nelson by Public Employees for Environmental Responsibility (PEER) last year, the watchdog organization argued increased supersonic travel would be a “climate debacle.”

[Related: Air Force transport jets for VIPs could have a supersonic future.]

“Because the QueSST mission is focused on the sonic boom challenge, the X-59 is not intended to be used as a tool to conduct research into other challenges of supersonic flight such as landing and takeoff noise, emissions and fuel burn. These challenges are being explored in other NASA research,” NASA representatives told The Register in July 2023.

Even if everything goes smoothly, however, it is unlikely that a fleet of X-59 jets will be zipping over everyone’s heads anytime soon. In 2021, a Lockheed Martin Skunk Works manager estimated that supersonic air travel won’t feasibly make its potential return until around 2035.

First, however, is Friday’s scheduled pomp and circumstance. Viewers can tune into NASA’s livestream of the event at 4pm ET on YouTube, as well as through the agency’s NASA+ streaming service, NASA app, and website.

The post NASA plans to unveil experimental X-59 supersonic jet on January 12 appeared first on Popular Science.

These extinct, nearly 10-foot-tall apes could not adapt to shifting seasons https://www.popsci.com/environment/extinct-10-foot-tall-apes/ Wed, 10 Jan 2024 16:01:43 +0000 https://www.popsci.com/?p=598137
An artist’s impression of a group of G. blacki within a forest in southern China. Four of these giant apes sit on the grass near a stream, while an orangutan hangs from a tree branch. They are brown with yellow-ish manes around their faces.
An artist’s impression of a group of G. blacki within a forest in southern China. They are believed to be the largest primates to ever live. CREDIT Garcia/Joannes-Boyau/Southern Cross University

A new study pinpoints that changes in climate likely led to Gigantopithecus blacki’s demise.


Beginning about 2.6 million years ago, giant primates almost 10 feet tall and weighing up to 551 pounds roamed the plains of southern China. Gigantopithecus blacki (G. blacki) towered about five feet over today’s largest living primates and is believed to be the largest primate ever to roam the Earth. However, it went extinct just as other primates–like orangutans–were thriving. 

[Related: These primate ancestors were totally chill with a colder climate.]

Now, a team of scientists from China, Australia, and the United States believe that this giant ape went extinct between 295,000 and 215,000 years ago because it could not adapt its food preferences and behaviors and was vulnerable to extreme changes in the planet’s climate. The findings are detailed in a study published January 10 in the journal Nature

“The story of G. blacki is an enigma in paleontology–how could such a mighty creature go extinct at a time when other primates were adapting and surviving? The unresolved cause of its disappearance has become the Holy Grail in this discipline,” Yingqi Zhang, study co-author and Institute of Vertebrate Palaeontology and Palaeoanthropology at the Chinese Academy of Sciences (IVPP) paleontologist, said in a statement

Seasonal shifts 

Roughly 700,000 to 600,000 years ago, the rich forest environment that G. blacki lived in began to change. The new study proposes that as Earth’s four seasons began to strengthen and G. blacki’s habitat saw more variability in temperature and precipitation, the structure of these forest communities began to change. 

In response, G. blacki’s close relatives the orangutans adapted their habitat preferences, behavior, and size over time. However, G. blacki was not quite as nimble. Based on its dental anatomy, these giant apes were herbivores that had adapted to eat fibrous foods like fruits. However, when its favorite food sources were not available, the team believes that G. blacki relied on a less nutritious backup source of sustenance, decreasing the diversity of its food. They likely suffered from a reduced geographic range for foraging, became less mobile, and saw chronic stress and dwindling numbers. 

G. blacki was the ultimate specialist, compared to the more agile adapters like orangutans,  and this ultimately led to its demise,” said Zhang. 

Homing in on a date

G. blacki left behind roughly 2,000 fossilized teeth and four jawbones that helped paleontologists piece together the story of its time on Earth, but more precise dating of these remains was needed to explain its extinction. To find definitive evidence, the team took on a large-scale project exploring 22 cave sites across a wide region of Guangxi Province in southern China. 

[Related: Nice chimps finish last—so why aren’t all of them mean?]

Determining the exact time when a species disappears from the fossil record helps paleontologists determine a timeframe that they can work to rebuild from other evidence. 

“Without robust dating, you are simply looking for clues in the wrong places,” Kira Westaway, a study co-author and geochronologist at Macquarie University in Australia, said in a statement

In the study, the team used six dating techniques on samples of cave sediments and fossil teeth. The techniques produced 157 radiometric ages, which were combined with eight sources of environmental and behavioral evidence. The team applied this analysis to 11 caves containing evidence of G. blacki and to 11 caves of a similar age range with no G. blacki remains.

Two paleontologists are seen digging into hard cemented cave sediments.
Digging into the hard cemented cave sediments containing a wealth of fossils and evidence of G. blacki. CREDIT: Kira Westaway/Macquarie University.

The primary technique that helped the team home in on a date range was luminescence dating, which measures a light-sensitive signal found in the burial sediments that encased the G. blacki fossils. Uranium-series and electron-spin resonance dating were also critical for dating the G. blacki teeth themselves. 

“By direct-dating the fossil remains, we confirmed their age aligns with the luminescence sequence in the sediments where they were found, giving us a comprehensive and reliable chronology for the extinction of G. blacki,” Renaud Joannes-Boyau, a study co-author and geochronologist at Southern Cross University in Australia, said in a statement. 

Building a world from teeth and pollen 

Researchers also used a detailed pollen analysis to reconstruct what the plant life looked like hundreds of thousands of years ago, a stable isotope analysis of the teeth, and a detailed analysis of the cave sediments to re-create the environmental conditions leading up to the time G. blacki went extinct. Trace element and dental microwear textural analysis of the apes’ teeth enabled the team to model what G. blacki’s behavior likely looked like when the species was flourishing, compared to its demise. 

[Related: An ‘ancestral bottleneck’ took out nearly 99 percent of the human population 800,000 years ago.]

“Teeth provide a staggering insight into the behavior of the species indicating stress, diversity of food sources, and repeated behaviors,” said Joannes-Boyau.

The dates of the fossils combined with the pollen and teeth analysis revealed that G. blacki went extinct between 295,000 and 215,000 years ago, earlier than scientists previously assumed. The team believes that studying its failure to adapt has implications for today’s changing climate and the need for adaptation. 

“With the threat of a sixth mass extinction event looming over us, there is an urgent need to understand why species go extinct,” said Westaway. “Exploring the reasons for past unresolved extinctions gives us a good starting point to understand primate resilience and the fate of other large animals, in the past and future.”

The post These extinct, nearly 10-foot-tall apes could not adapt to shifting seasons appeared first on Popular Science.

How video game tech, AI, and computer vision help decode animal pain and behavior https://www.popsci.com/science/computer-vision-mice-pain-behavior/ Wed, 10 Jan 2024 15:00:00 +0000 https://www.popsci.com/?p=598046
The Jackson Laboratory / Popular Science

Top neuroscience labs are adapting new and unexpected tools to gain a deeper understanding of how mice, and ultimately humans, react to different drug treatments.

Back in 2013, Sandeep Robert “Bob” Datta was working in his neurobiology lab at Harvard Medical School in Boston when he made the fateful decision to send his student Alex Wiltschko to the Best Buy up the street. Wiltschko was on a mission to purchase an Xbox Kinect camera, designed to pick up players’ body movements for video games like Just Dance and FIFA. He plunked down about $150 and walked out with it. The unassuming piece of consumer electronics would determine the lab’s direction in the coming decade and beyond. 

It also placed the team within a growing scientific movement at the intersection of artificial intelligence, neuroscience, and animal behavior—a field poised to change the way researchers use other creatures to study human health conditions. The Datta Lab is learning to track the intricate nuances of mouse movement and understand the basics of how the mammal brain creates behavior, untangling the neuroscience of different health conditions and ultimately developing new treatments for people. This area of research relies on so-called “computer vision” to analyze video footage of animals and detect behavior patterns imperceptible to the unaided eye. Computer vision can also be used to auto-detect cell types, addressing a persistent problem for researchers who study complex tissues in, for example, cancers and gut microbiomes.

In the early 2010s, Datta’s lab was interrogating how smell, “the sense that is most important to most animals” and the one that mice can’t survive without, drives the rodents’ responses to manipulations in their environment. Human observers traditionally track mouse behavior and record their observations—how many times a mouse freezes in fear, how often it rears up to explore its enclosure, how long it spends grooming, how many marbles it buries. Datta wanted to move beyond the movements visible to the unaided eye and use video cameras to track and compute whether a rodent avoids an odor (that of predator urine, for instance) or is attracted to it (like the smell of roses). The tools available at the time—overhead 2D cameras that tracked each animal as a single point—didn’t yield sufficiently detailed data.

“Even in an arena in the dark, where there’s no stimuli at all, [mice] just generate these incredible behavioral dynamics—none of which are being captured by, like, a dot bouncing around on the screen,” says Datta. So Wiltschko identified the Xbox Kinect camera as a potential solution. Soon after its introduction in 2010, people began hacking the hardware for science and entertainment purposes. It was fitting for Datta’s lab to use it to track mice: It can record in the dark using infrared light (mice move around much more when it’s darker) and can see in 3D when mounted overhead by measuring how far an object is from the sensor. This enabled Datta’s team to follow the subjects when they ran around, reared up, or hunkered down. As it analyzed its initial results, it realized that the Kinect camera recorded the animals’ movements with a richness that 2D cameras couldn’t capture.

“That got us thinking that if we could just somehow identify regularities in the data, we might be able to identify motifs or modules of action,” Datta says. Looking at the raw pixel counts from the Kinect sensor, even as compressed image files and without any sophisticated analysis, they began seeing these regularities. With or without an odor being introduced, every few hundred milliseconds, mice would switch between different types of movement—rearing, bobbing their heads, turning. For several years after the first Kinect tests, Datta and his team tried to develop software to identify and record the underlying elements of the basic components of movement the animals string together to create behavior.

But they kept hitting dead ends.

“There are many, many ways you can take data and divide it up into piles. And we tried many of those ways, many for years,” Datta recalls. “And we had many, many false starts.”

They tried categorizing results based on the animals’ poses from single frames of video, but that approach ignored movement—“the thing that makes behavior magic,” according to Datta. So they abandoned that strategy and started thinking about the smaller motions that last fractions of a second and constitute behavior, analyzing them in sequence. This was the key: the recognition that movement is both discrete and continuous, made up of units but also fluid. 

So they started working with machine learning tools that would respect this dual identity. In 2020, seven years after that fateful trip to Best Buy, Datta’s lab published a scientific paper describing the resulting program, called MoSeq (short for “motion sequencing,” evoking the precision of genetic sequencing). In this paper, they demonstrated their technique could identify the subsecond movements, or “syllables,” as they call them, that make up mouse behavior when they’re strung together into sequences. By detecting when a mouse reared, paused, or darted away, the Kinect opened up new possibilities for decoding the “grammar” of animal behavior.

MoSeq

Computer visionaries

In the far corner of the Datta Lab, which still resides at Harvard Medical School, Ph.D. student Maya Jay pulls back a black curtain, revealing a small room bathed in soft reddish-orange light. To the right sit three identical assemblies made of black buckets nestled inside metal frames. Over each bucket hangs a Microsoft Xbox Kinect camera, as well as a fiber-optic cable connected to a laser light source used to manipulate brain activity. The depth-sensing function of the cameras is the crucial element at play. Whereas a typical digital video captures things like color, the images produced by the Kinect camera actually show the height of the animal off the floor, Jay says—for instance, when it bobs its head or rears up on its hind legs. 

Microsoft discontinued the Xbox Kinect cameras in 2017 and has stopped supporting the gadget with software updates. But Datta’s lab developed its own software packages, so it doesn’t rely on Microsoft to keep the cameras running, Jay says. The lab also runs its own software for the Azure Kinect, a successor to the original Kinect that the team employs as well, though it, too, was discontinued, in 2023. Across the lab from the Xbox Kinect rigs sits a six-camera Azure setup that records mice from all angles, including from below, to generate either highly precise 2D images incorporating data from various angles or 3D images.

In the case of MoSeq and other computer vision tools, motion recordings are often analyzed in conjunction with manipulations to the brain, where sensory and motor functions are rooted in distinct modules, and neural-activity readings. When disruptions in brain circuits, either from drugs administered in the lab or edits to genes that mice share with humans, lead to changes in behaviors, it suggests a connection between the two. This makes it possible for researchers to determine which circuits in the brain are associated with certain types of behavior, as well as how medications are working on these circuits.

In 2023, Datta’s lab published two papers detailing how MoSeq can contribute to new insights into an organism’s internal wiring. In one, the team found that, for at least some mice in some situations, differences in mouse behavior are influenced way more by individual variation in the brain circuits involved with exploration than by sex or reproductive cycles. In another, manipulating the neurotransmitter dopamine suggested that this chemical messenger associated with the brain’s reward system supports spontaneous behavior in much the same way it influences goal-directed behaviors. The idea is that little bits of dopamine are constantly being secreted to structure behavior, contrary to the popular perception of dopamine as a momentous reward. The researchers did not compare MoSeq to human observations, but it performed comparably in another set of experiments in a paper that has yet to be published.

These studies probed some basic principles of mouse neurobiology, but many experts in this field say MoSeq and similar tools could broadly revolutionize animal and human health research in the near future. 

With computer vision tools, mouse behavioral tests can run in a fraction of the time that would be required with human observers. This tech comes at a time when multiple forces are calling animal testing into question. The United States Food and Drug Administration (FDA) recently changed its rules on drug testing to consider alternatives to animal testing as prerequisites for human clinical trials. Some experts, however, doubt that stand-ins such as organs on chips are advanced enough to replace model organisms yet. But the need exists. Beyond welfare and ethical concerns, the vast majority of clinical trials fail to show benefits in humans and sometimes produce dangerous and unforeseen side effects, even after promising tests on mice or other models. Proponents say computer vision tools could improve the quality of medical research and reduce the suffering of lab animals by detecting their discomfort in experimental conditions and clocking the effects of treatments with greater sensitivity than conventional observations.

Further fueling scientists’ excitement, some see computer vision tools as a means of measuring the effects of optogenetics and chemogenetics, techniques that use engineered molecules to make select brain cells turn on in response to light and chemicals, respectively. These biomedical approaches have revolutionized neuroscience in the past decade by enabling scientists to precisely manipulate brain circuits, in turn helping them investigate the specific networks and neurons involved in behavioral and cognitive processes. “This second wave of behavior quantification is the other half of the coin that everyone was missing,” says Greg Corder, assistant professor of psychiatry at the University of Pennsylvania. Others agree that these computer vision tools are the missing piece to track the effects of gene editing in the lab.

“[These technologies] truly are integrated and converge,” agrees Clifford Woolf, a neurobiologist at Harvard Medical School who works with his own supervised computer vision tools in his pain research.

But is artificial intelligence ready to take over the task of tracking animal behavior and interpreting its meaning? And is it identifying meaningful connections between behavior and neurological activity just yet?

These are the questions at the heart of a tension between supervised and unsupervised AI models. Machine learning algorithms find patterns in data at speeds and scales that would be difficult or impossible for humans. Unsupervised machine learning algorithms identify any and all motifs in datasets, whereas supervised ones are trained by humans to identify specific categories. In mouse terms, this means unsupervised AIs will flag every unique movement or behavior, but supervised ones will pinpoint only those that researchers are interested in.
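In code, the difference can be sketched with scikit-learn on toy “pose feature” data (the motifs, labels, and values below are invented for illustration, not drawn from MoSeq or any real tracker):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy "pose features" for three behavioral motifs a mouse might
# produce (rear, groom, dart) -- the values are purely illustrative.
rear = rng.normal([0.0, 0.0], 0.1, size=(100, 2))
groom = rng.normal([3.0, 0.0], 0.1, size=(100, 2))
dart = rng.normal([0.0, 3.0], 0.1, size=(100, 2))
X = np.vstack([rear, groom, dart])

# Unsupervised: flags every motif in the data without being told
# what any of them mean.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(len(set(clusters)))  # 3 -- all three motifs surface on their own

# Supervised: trained only on the motifs a researcher labeled, so a
# never-labeled motif ("dart") gets forced into a known category.
y = np.array([0] * 100 + [1] * 100)  # labels for rear and groom only
clf = RandomForestClassifier(random_state=0).fit(X[:200], y)
print(set(clf.predict(dart)))  # predictions come only from {0, 1}
```

The unsupervised model discovers all three motifs; the supervised one can only ever answer with the categories it was trained on.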

The major advantage of unsupervised approaches for mouse research is that people may not notice action that takes place on the subsecond scale. “When we analyze behavior types, we often actually are based on the experimenters’ judgment of the behavior type, rather than mathematical clustering,” says Bing Ye, a neuroscientist at the University of Michigan whose team developed LabGym, a supervised machine learning tool for mice and other animals, including rats and fruit fly larvae. The number of behavioral clusters that can be analyzed, too, is limited by human trainers. On the other hand, he says, live experts may be the most qualified to recognize behaviors of note. For this reason, he advocates transparency: publishing the training datasets (the data from which a supervised algorithm learns its classification parameters) with any studies. That way, if experts disagree with how a tool identifies behaviors, the publicly available data provide a solid foundation for scientific debate.

Mu Yang, a neurobiologist at Columbia University and the director of the Mouse NeuroBehavior Core, a mouse behavior testing facility, is wary of trusting AI to do the work of humans until the machines have proved reliable. She is a traditional mouse behavior expert, trained to detect the animals’ subtleties with her own eyes. Yang knows that the way a rodent expresses an internal state, like fear, can change depending on its context. This is true for humans too. “Whether you’re in your house or…in a dark alley in a strange city, your fear behavior will look different,” Yang explains. In other words, a mouse may simply pause or it may freeze in fear, but an AI could be hard-pressed to tell the difference. One of the other challenges in tracking the animals’ behaviors, she says, is that testing different drugs on them may cause them to exhibit actions that are not seen in nature. Before AIs can be trusted to track these novel behaviors or movements, machine learning programs like MoSeq need to be vetted to ensure they can reliably track good old-fashioned mouse behaviors like grooming. 

Yang draws a comparison to a chef, saying that you can’t win a Michelin star if you haven’t proved yourself as a short-order diner cook. “If I haven’t seen you making eggs and pancakes, you can talk about caviar and Kobe beef all you want, I still don’t know if I trust you to do that.”

For now, as to whether MoSeq can make eggs and pancakes, “I don’t know how you’d know,” Datta says. “We’ve articulated some standards that we think are useful. MoSeq meets those benchmarks.”

Putting the tech to the test

There are a couple of ways, Datta says, to determine benchmarks—measures of whether an unsupervised AI is correctly or usefully describing animal behavior. “One is by asking whether or not the content of the behavioral description that you get [from AI] does better or worse at allowing you to discriminate among [different] patterns of behavior that you know should occur.” His team did this in the first big MoSeq study: It gave mice different medicines and used the drugs’ expected effects to determine whether MoSeq was capturing them. But that’s a pretty low bar, Datta admits—a starting point. “There are very few behavioral characterization methods that wouldn’t be able to tell a mouse on high-dose amphetamine from a control.” 
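That low-bar check can be sketched as a simple discrimination test: given per-mouse behavioral summaries for two groups, ask whether a classifier can tell the groups apart better than chance. Everything below — the 10 “syllables,” the usage distributions, the group sizes — is hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical per-mouse "syllable usage" vectors: the fraction of
# time each of 10 behavioral syllables is used. Drug-group mice
# over-use syllables 0-2; all numbers here are invented.
control = rng.dirichlet(np.ones(10), size=40)
drug = rng.dirichlet(np.array([5, 5, 5, 1, 1, 1, 1, 1, 1, 1.0]), size=40)

X = np.vstack([control, drug])
y = np.array([0] * 40 + [1] * 40)

# The low-bar benchmark: can the behavioral description discriminate
# the two groups better than chance (0.5)?
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(round(acc, 2))  # well above chance for these separable groups
```

As Datta notes, passing a test like this is only a starting point: nearly any behavioral description separates groups this different.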

The real benchmark of these tools, he says, will be whether they can provide insight into how a mouse’s brain organizes behavior. To put it another way, the scientifically useful descriptions of behavior will predict something about what’s happening in the brain.

Explainability, the idea that machine learning will identify behaviors experts can link to expected behaviors, is a big advantage of supervised algorithms, says Vivek Kumar, associate professor at the biomedical research nonprofit Jackson Laboratory, one of the main suppliers of lab mice. His team used this approach, but he sees training supervised classifiers after unsupervised learning as a good compromise. The unsupervised learning can reveal elements that human observers may miss, and then supervised classifiers can take advantage of human judgment and knowledge to make sure that what an algorithm identifies is actually meaningful.
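Kumar’s compromise can be sketched as a three-step toy pipeline (the features, cluster count, and expert label mapping below are all hypothetical):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Illustrative 2D pose features for 450 video frames.
X = np.vstack([rng.normal(m, 0.1, size=(150, 2))
               for m in ([0.0, 0.0], [4.0, 0.0], [0.0, 4.0])])

# Step 1 (unsupervised): discover motifs with no human labels.
motifs = KMeans(n_clusters=3, n_init=10, random_state=2).fit_predict(X)

# Step 2 (human judgment): an expert reviews clips from each motif and
# names the meaningful ones; here the expert decides two discovered
# motifs are both "groom". The mapping is hypothetical -- KMeans
# cluster indices are arbitrary.
expert_names = {0: "groom", 1: "rear", 2: "groom"}
y = np.array([expert_names[m] for m in motifs])

# Step 3 (supervised): train a fast classifier on the vetted labels so
# future videos can be scored automatically.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(round(clf.score(X, y), 2))  # near-perfect on these toy blobs
```

The unsupervised pass surfaces structure a human might miss; the supervised pass bakes in the expert’s judgment about which structure matters.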

“It’s not magic”

MoSeq isn’t the first or only computer vision tool under development for quantifying animal behavior. In fact, the field is booming as AI tools become more powerful and easier to use. We already mentioned Bing Ye and LabGym; the lab of Eric Yttri at Carnegie Mellon University has developed B-SOiD; the lab of Mackenzie Mathis at École Polytechnique Fédérale de Lausanne has DeepLabCut; and the Jackson Laboratory is developing (and has patented) its own computer vision tools. Last year Kumar and his colleagues used machine vision to develop a frailty index for mice, an assessment that is notoriously sensitive to human error.

Each of these automated systems has proved powerful in its own way. For example, B-SOiD, which is unsupervised, identified the three main types of mouse grooming without being trained in these basic behaviors. 

“That’s probably a good benchmark,” Yang says. “I guess you can say, like the egg and pancake.”

Mathis, who developed DeepLabCut, emphasizes that carefully picking data sources is critical for making the most of these tools. “It’s not magic,” she says. “It can make mistakes, and your trained neural networks are only as good as the data you give [them].”

And while the toolmakers are still honing their technologies, even more labs are hard at work deploying them in mouse research with specific questions and targets in mind. Broadly, the long-term goal is to aid in the discovery of drugs that will treat psychiatric and neurological conditions. 

Some have already experienced vast improvements in running their experiments. One of the problems of traditional mouse research is that animals are put through unnatural tasks like running mazes and taking object recognition tests that “ignore the intrinsic richness” of behavior, says Cheng Li, professor of anesthesiology at Tongji University in Shanghai. His team found that feeding MoSeq videos of spontaneous rodent behavior along with more traditional task-oriented behaviors yielded a detailed description of the mouse version of postoperative delirium, the most common central nervous system surgical complication among elderly people. 

Meanwhile, LabGym is being used to study sudden unexpected death in epilepsy in the lab of Bill Nobis at Vanderbilt University Medical Center. After being trained on videos of mouse seizures, the program detects them “every time,” Nobis says.

Easing their pain

Computer vision has also become a major instrument for pain research, helping to untangle the brain’s pathways involved in different types of pain and treat human ailments with new or existing drugs. And despite the FDA rule change in early 2023, the total elimination of animal testing is unlikely, Woolf says, especially in developing novel medicines. By detecting subtle behavioral signs of pain, computer vision tools stand to reduce animal suffering. “We can monitor the changes in them and ensure that we’re not producing an overwhelming, painful situation—all we want is enough pain that we can measure it,” he explains. “We would not do anything to a mouse that we wouldn’t do to a human, in general.”

His team used supervised machine learning to track behavioral signatures of pain in mice and show when medications have alleviated their discomfort, according to a 2022 paper in the journal Pain. One of the problems with measuring pain in lab animals, rather than humans, is that the creatures can’t report their level of suffering, Woolf says. Scientists long believed that, proportional to body weight, the amount of medicine required to relieve pain is much higher in mice than in humans. But it turns out that if your computer vision algorithms can measure the sensation relatively accurately—and Woolf says his team’s can—then you actually detect signs of pain relief at much more comparable doses, potentially reducing the level of pain inflicted to conduct this research. Measuring pain and assessing pain medicine in lab animals is so challenging that most large pharmaceutical companies have abandoned the area as too risky and expensive, he adds. “We hope this new approach is going to bring them back in.”

Corder’s lab at the University of Pennsylvania is working on pain too, but using the unsupervised B-SOiD in conjunction with DeepLabCut. In unpublished work, the team had DeepLabCut visualize mice as skeletal stick figures, then had B-SOiD identify 13 different pain-related behaviors like licking or biting limbs. Supervised machine learning will help make his team’s work more reliable, Corder says, as B-SOiD needs instruction to differentiate these behaviors from, say, genital licking, a routine hygiene behavior. (Yttri, the co-creator of B-SOiD, says supervision will be part of the new version of his software.) 

As computer vision tools continue to evolve, they could even help reduce the number of animals required for research, says FDA spokesperson Lauren-Jei McCarthy. “The agency is very much aligned with efforts to replace, reduce, or refine animal studies through the use of appropriately validated technologies.”

If you build it, they will come

MoSeq’s next upgrade, which has been submitted to an academic journal and is under review, will try something similar to what Corder’s lab did: It will meld its unsupervised approach with keypoint detection, a computer vision method that highlights crucial points in an object like the body of a mouse. This particular approach employs a rig of six Azure Kinect cameras instead of the Datta lab’s classic Xbox Kinect camera rigs.

An advantage of this approach, Datta says, is that it can be applied to existing 2D video, meaning that all the petabytes of archival mouse data from past experiments could be opened up to analysis without the cost of running new experiments on mice. “That would be huge,” Corder agrees.
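A minimal sketch of the kind of preprocessing a keypoint-based pipeline applies to ordinary 2D video: subtract the animal’s centroid so the model sees posture rather than location, then derive movement features such as speed. The array shapes and values below are hypothetical, not the MoSeq implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical keypoint track from 2D video:
# (n_frames, n_keypoints, 2) pixel coordinates, the kind of output a
# pose tracker such as DeepLabCut produces.
n_frames, n_kpts = 200, 8
drift = np.cumsum(rng.normal(0.0, 1.0, size=(n_frames, 1, 2)), axis=0)
pose = rng.normal(0.0, 2.0, size=(n_frames, n_kpts, 2)) + drift

centroid = pose.mean(axis=1)              # animal's position per frame
egocentric = pose - centroid[:, None, :]  # posture with location removed
speed = np.linalg.norm(np.diff(centroid, axis=0), axis=1)  # px per frame

print(egocentric.shape, speed.shape)  # (200, 8, 2) (199,)
```

Because this works on flat pixel coordinates, the same transformation could in principle be run over archived 2D footage.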

Datta’s certainty increases as he rattles off some of his team’s accomplishments with AI and mouse behavior in the past few years. “Can we use MoSeq to identify genetic mutants and distinguish them from wild types?”—mice with genetics as they appear in nature. This was the subject of a 2020 paper in Nature Neuroscience, which showed that the algorithm can accurately discern mice with an autism-linked gene mutation from those with typical genetics. “Can we make predictions about neural activity?” The Datta Lab checked this off its bucket list just this year in its dopamine study. Abandoning the hedging so typical of scientists, he confidently declares, “All of that is true. I think in this sense, MoSeq can make eggs and pancakes.”

The post How video game tech, AI, and computer vision help decode animal pain and behavior appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

Why fruit bats can eat tons of sugar without getting diabetes https://www.popsci.com/science/fruit-bats-sugar-diabetes/ Tue, 09 Jan 2024 19:00:09 +0000 https://www.popsci.com/?p=597970
Fruit bats hanging on tree branches in daylight in Bangladesh on November 6, 2023.
Fruit bats hanging on tree branches in daylight in Bangladesh on November 6, 2023. Md Rafayat Haque Khan/Eyepix Group/Future Publishing via Getty Images

The answer could lie in their genes.

The post Why fruit bats can eat tons of sugar without getting diabetes appeared first on Popular Science.


Some fruit bats eat up to twice their body weight in sugary mangoes, bananas, or figs every day to not only survive, but thrive. Unlike humans, these flying mammals can indulge an essentially permanent sweet tooth without developing negative health consequences such as diabetes. A study published January 9 in the journal Nature Communications found that genetic adaptations have helped keep their sugary diets from becoming harmful.

[Related: How do bats stay cancer-free? The answer could be lifesaving for humans.]

The study could have future implications for treating diabetes, which affects an estimated 38 million Americans, according to the Centers for Disease Control and Prevention (CDC). It is the eighth leading cause of death in the United States and the leading cause of kidney failure, lower-limb amputations, and adult blindness.

“With diabetes, the human body can’t produce or detect insulin, leading to problems controlling blood sugar,” study co-author and University of California, San Francisco geneticist Nadav Ahituv said in a statement. “But fruit bats have a genetic system that controls blood sugar without fail. We’d like to learn from that system to make better insulin- or sugar-sensing therapies for people.”  

Fruit bats vs. insect bats

Every day, fruit bats wake up after about 20 hours of sleep and feast on fruit before returning to their caves, trees, or human-built structures to roost. To figure out how they can eat so much sugar and thrive, the team in this study focused on how the bat pancreas and kidneys evolved. The pancreas is an abdominal organ that controls blood sugar.

Researchers compared the Jamaican fruit bat with an insect-eating bat called the big brown bat. They analyzed the gene expression–which genes were switched on or off–and regulatory DNA that controls gene expression. To do this, the team measured both the gene expression and regulatory DNA present in individual cells. These measurements show which types of cells primarily make up the bat’s organs and also how these cells regulate the gene expression that manages their diet. 

They found that the compositions of the pancreas and kidneys in fruit bats evolved to accommodate their sugary diet. The pancreas had more cells to produce insulin, an essential hormone that tells the body to lower blood sugar. It also had more cells that produce another sugar-regulating hormone called glucagon. The fruit bat kidneys had more cells to trap scarce salts and electrolytes as they filter blood.  

Changes in DNA

Taking a closer look at the genetics behind this, the team saw that the regulatory DNA in those cells had evolved to switch the appropriate genes for fruit metabolism on or off. The insect-eating big brown bats had more cells that break down protein and conserve water and the gene expression in these cells was calibrated to handle a diet of bugs. 

[Related: Vampire bats socially distance when they feel sick.]

“The organization of the DNA around the insulin and glucagon genes was very clearly different between the two bat species,” study co-author and Menlo College biologist Wei Gordon said in a statement. “The DNA around genes used to be considered ‘junk,’ but our data shows that this regulatory DNA likely helps fruit bats react to sudden increases or decreases in blood sugar.” 

While some of the fruit bat’s biology resembled what is found in humans with diabetes, the bats are not known to have the same health effects.

“Even small changes, to single letters of DNA, make this diet viable for fruit bats,” said Gordon. “We need to understand high-sugar metabolism like this to make progress helping the one in three Americans who are prediabetic.” 

Studying bats for human health

Bats are one of the most diverse families of mammals, and everything from their immune systems to their very particular diets is considered by some scientists to be an example of evolutionary triumph. This study is one of several recent examples of how studying bats could have implications for human health, including cancer research and virus prevention.

For this study, Gordon and Ahituv traveled to Belize to participate in an annual Bat-a-Thon, where they took a census of wild bats and collected field samples. One of the Jamaican fruit bats captured at the Bat-a-Thon was used to study sugar metabolism.  

“For me, bats are like superheroes, each one with an amazing super power, whether it is echolocation, flying, blood sucking without coagulation, or eating fruit and not getting diabetes,” Ahituv said. “This kind of work is just the beginning.” 

This tiny sea creature builds a ‘snot palace’ to capture food https://www.popsci.com/science/snot-palace-water-pump/ Tue, 09 Jan 2024 16:30:00 +0000 https://www.popsci.com/?p=597911
Microscopic view of Oikopleura dioica
A microscopic view of Oikopleura dioica. University of Oregon

Oikopleura dioica’s feeding processes could inspire the design of new water pump systems.

The post This tiny sea creature builds a ‘snot palace’ to capture food appeared first on Popular Science.


When it’s time for a snack, the minuscule sea creature known as Oikopleura dioica gets gross. At barely a millimeter long, the filter-feeding larvacean excretes and encases itself in a jelly-like substance to form what biologists dub a “mucus house” or a “snot palace.” 

The tadpole-like O. dioica’s tiny, temporary abodes are biological wonders—using its tail, the larvacean creates its own pump-filtration system capable of capturing and propelling food particles towards its mouth. Now, researchers believe the snot palace’s interior fluid dynamics could inspire a new generation of artificial pump systems for wastewater treatment plants and air filtration systems.

[Related: These animals build palaces out of their own snot.]

“It’s so cool. It’s a pretty complex structure,” University of Oregon biology research assistant Terra Hiebert said in a January 8 profile.

Hiebert and collaborators detailed their work in a study recently published in the Journal of the Royal Society Interface. To better understand a snot palace’s inner workings, Hiebert’s team traveled to a larvacean breeding facility in Bergen, Norway to analyze the creatures’ movements using a high-speed video camera attached to a microscope. In reviewing the footage, researchers noticed how an O. dioica’s tail shifted responsibilities depending on whether or not it was time to eat. While simply swimming near the ocean’s surface, the tail wriggles side-to-side to push the creature forward through water, but it’s a different story once inside the mucus house.

Once encased in the gelatinous substance, O. dioica’s appendage actually touches the interior in multiple locations. When the tail wiggles in these moments, the animal doesn’t move nearly as much. Instead, the tail sticks and unsticks from the casing “like Velcro,” according to the University of Oregon, and the snot palace subsequently inflates like a balloon as nearby particles collect on the surface. Each movement pushes these particles along, eventually in the direction of the larvacean’s mouth. Once the mucus filtration system is too clogged to function, O. dioica simply sheds its makeshift restaurant, which then sinks into the ocean and eventually decomposes. Roughly every three to four hours, the larvacean repeats the process all over again.

Although O. dioica’s structure fits the bill for a peristaltic pump, it’s not the most common design. Usually, a peristaltic pump’s fluid motion originates through external pressure, such as contractions in your colon to push along waste. In a snot palace, however, the momentum derives from within the pump itself via the larvacean’s tail. Researchers believe designers could adapt this alternative setup for engineering new wastewater treatment plants or air filtration systems—hypothetically, locating any moving parts within the pump could protect the overall setup from wear-and-tear.

If this proves true, urban planners could have snot palaces to thank for cleaner, more efficient municipal water facilities. 

Scientists finally discover the enzyme that makes pee yellow https://www.popsci.com/science/what-hat-makes-pee-yellow/ Tue, 09 Jan 2024 15:36:55 +0000 https://www.popsci.com/?p=597926
A white toilet bowl in a bathroom with yellow brick walls is surrounded by multiple shelves holding plants and towels.
A bacterial enzyme called bilirubin reductase appears to be responsible for urine’s signature yellow color. Deposit Photos

Understanding bilirubin reductase could lead to better treatments for gallstones, jaundice, and inflammatory bowel disease.

The post Scientists finally discover the enzyme that makes pee yellow appeared first on Popular Science.


Animals urinate to get rid of liquid waste in their bodies. While the pee of a healthy person has a distinctly yellow color, what actually gives urine this hue has been unclear to scientists for centuries. Now, a team from the University of Maryland and the National Institutes of Health believes it has solved this mystery by pinpointing the microbial enzyme that makes our pee yellow. The findings are detailed in a study published January 3 in the journal Nature Microbiology.

[Related: Renaissance-era doctors used to taste their patients’ pee.]

According to study co-author and University of Maryland microbiologist Brantley Hall, the team built on decades of research going back to the 1960s and a difficult three-and-a-half-year-long lab experiment to find that an enzyme in the gut microbiome called bilirubin reductase is responsible for urine’s color.

“The gut microbiome is just full of incredible chemists. It’s so important to human physiology, all these molecules that the gut microbes are making,” Hall tells PopSci. “As we understand more about microbial chemistry in our gut, we’re going to understand the important things. But the first step for any of this is to figure out the enzymes responsible. If you don’t know what’s going on, you basically can’t even start with the research.”

Solving a microbial mystery 

Previously, scientists knew that the yellow color comes from how the body gets rid of old blood cells. Red blood cells typically reach the end of their life cycle after about 120 days, at which point they are degraded in the liver. A byproduct of this process called bilirubin is a bright orange substance that is secreted from the liver into the gut. Bacteria living in the gut then convert bilirubin into a colorless substance called urobilinogen. The urobilinogen is finally degraded into a yellow pigment molecule called urobilin that plays a part in the coloring. What scientists did not know was the bacterial enzyme responsible. 

Identifying this enzyme has long been a microbial mystery for two primary reasons. According to Hall, the first challenge is that culturing anaerobic microbes has historically been very difficult and expensive to do in the lab.

“The microbes that perform this function cannot live with atmospheric oxygen. They die within minutes or seconds,” says Hall. “And those definitely never grow.”

Hall and the team were able to harness scientific advances made over the past 15 years in culturing these microbes, which survive and thrive without oxygen.

The second challenge has been the lack of genome sequences for the microbes in the gut. Recent improvements in genetic sequencing meant that more sequences were available for the team to study how gut microbes work. 

“In our case, we identified microbes that reduced bilirubin and microbes that did not. And then we performed a comparative genomics analysis between the two and identify candidate genes,” says Hall.
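The presence/absence comparison Hall describes can be sketched with plain Python sets; the genomes and gene names below are invented for illustration:

```python
# Toy version of the comparative screen: find genes present in every
# bilirubin-reducing genome and absent from every non-reducer.
# All genome and gene names here are invented for illustration.
reducers = {
    "genome_A": {"geneX", "reductase_candidate", "geneY"},
    "genome_B": {"reductase_candidate", "geneZ"},
}
non_reducers = {
    "genome_C": {"geneX", "geneY"},
    "genome_D": {"geneZ"},
}

shared = set.intersection(*reducers.values())   # in all reducers
excluded = set.union(*non_reducers.values())    # in any non-reducer
candidates = shared - excluded
print(candidates)  # {'reductase_candidate'}
```

A real analysis compares thousands of gene families across many genomes and must tolerate noise, but the core presence/absence logic is the same.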

Into the gut microbiome

In the study, the team compared the genomes of the exact species of human gut bacteria that convert bilirubin into urobilinogen with those of the species that can’t. This helped them identify the specific gene that encodes bilirubin reductase. Next, they used Escherichia coli (E. coli) to test whether this enzyme could convert bilirubin into urobilinogen there as well as in other gut bacteria.

After searching for this gene in all known bacterial species, the team found that the enzyme is primarily produced by species belonging to Firmicutes, a large group of bacteria that dominates the gut microbiome. After genetically screening the gut microbiomes of over 1,000 adults for the pee-coloring gene, they found that 99.9 percent of people have gut bacteria that carry the gene for bilirubin reductase. 

[Related: Bees make more friends when they’re full of healthy gut bacteria.]

“I think the biggest surprise to me is how prevalent this function is in adult humans,” says Hall. “Basically, everyone’s urine is yellow, and everyone’s stool is brown, so we knew that there must be microbes that did it. There are actually not that many microbes that do it and they’re essentially prevalent in every person.”

Potential medical applications

The study also looked at whether this gene was present in adults with inflammatory bowel disease (IBD) and in infants with jaundice. Only about 68 percent of those with IBD had the gene, as did only about 40 percent of babies under three months old, a group at heightened risk for jaundice. While more research is needed, identifying these enzymes and genes could help researchers develop better treatments for IBD, jaundice, and even gallstones.

“People are so excited about gut health and I just love talking to people about gut health,” says Hall. “Everyone either has or knows someone who has gut issues and I just think there’s an enormous opportunity to really modulate the human gut microbiome and health in a positive way.”

Peregrine lunar lander experiences ‘critical loss of propellant’ following successful launch https://www.popsci.com/science/peregrine-launch-technical-anomaly/ Mon, 08 Jan 2024 17:36:30 +0000 https://www.popsci.com/?p=597783
Smoke billows out of two engines as United Launch Alliance's Vulcan Centaur, lifts off from Space Launch Complex 41d at Cape Canaveral Space Force Station in Cape Canaveral, Florida, on January 8, 2024. The new rocket is carrying Astrobotic's Peregrine Lunar Lander.
United Launch Alliance's Vulcan Centaur, lifts off from Space Launch Complex 41d at Cape Canaveral Space Force Station in Cape Canaveral, Florida, on January 8, 2024. The new rocket is carrying Astrobotic's Peregrine Lunar Lander. CHANDAN KHANNA/AFP via Getty Images

The lander was scheduled to reach the moon by mid-February, hoping to be the first United States moon landing mission in 50 years.

The post Peregrine lunar lander experiences ‘critical loss of propellant’ following successful launch appeared first on Popular Science.


On January 8 at 2:18 a.m. local time, the United Launch Alliance’s (ULA) new Vulcan Centaur rocket successfully launched from Cape Canaveral Space Force Station in Florida. The rocket separated from the lander after about an hour and sent Peregrine Mission One into space.

Several hours after the launch, Astrobotic, the company that built the Peregrine lander, announced that the spacecraft had experienced an “anomaly” that stopped Peregrine from pointing its solar panels stably at the sun. In a press release, Astrobotic stated that its engineers were working on the issue, but without the spacecraft’s ability to charge its battery, the plan for a soft landing on the moon is in jeopardy.

At 1:03 p.m. EST Astrobotic issued an update saying that the mission will likely not go on as planned, as the lunar lander is experiencing a failure within its propulsion system.

Later, Astrobotic announced that Peregrine is suffering a critical fuel leak and has less than two days of fuel left.  An image taken by the lander in space showed damaged insulation on the spacecraft, which indicates a leak in Peregrine’s propulsion system.

“An ongoing propellant leak is causing the spacecraft’s Attitude Control System (ACS) thrusters to operate well beyond their expected service life cycles to keep the lander from an uncontrollable tumble,” the company wrote.

On Tuesday, January 9, Astrobotic announced that it would be abandoning its attempt at a soft landing on the moon. The lunar lander had been slated to attempt the first soft landing on the moon by the United States since 1972. Peregrine’s mission is to study the lunar surface ahead of future human missions to the moon.

The launch also began a new chapter in the age of private space exploration. The United Launch Alliance is a joint venture between Boeing and Lockheed Martin, with the Vulcan rocket designed to replace two older rockets and compete with SpaceX. The private company owned by Elon Musk sent close to 100 rockets into orbit in 2023 alone. The United States Space Force is also counting on the Vulcan Centaur rocket to launch spy satellites and other spacecraft that Space Force believes are in the interest of national security. 

The Peregrine lander was built by Pittsburgh-based space robotics firm Astrobotic and aimed to become the first privately built spacecraft to land softly on the moon. It is also the first mission to fly under NASA’s Commercial Lunar Payload Services (CLPS) initiative, in which NASA pays private companies to send scientific equipment to the moon.

“It’s a dream … For 16 years we’ve been pushing for this moment today,” said Astrobotic CEO John Thornton during a webcast of the launch according to CNN. “And along the way, we had a lot of hard challenges that we had to overcome and a lot of people doubted us along the way. But our team and the people that supported us believed in the mission, and they created this beautiful moment that we’re seeing today.”

Peregrine has a total of 20 payloads on board: five for NASA and 15 others. They include five small moon rovers and the first Latin American scientific instruments attempting to reach the lunar surface. If successful, the technology on board will measure properties including radiation levels, magnetic fields, ice and water on the surface and subsurface, and a layer of gas called the exosphere. A better understanding of the exosphere and the moon’s surface is expected to help minimize risks when humans return, as early as 2025.

Several non-scientific payloads are aboard as well, including a lunar dream capsule with over 180,000 messages from children around the world, a chunk of Mount Everest, and a physical coin containing one bitcoin.

Controversially, Peregrine is carrying human remains on behalf of commercial space burial companies Celestis and Elysium Space. Celestis offers to carry ashes to the moon for prices starting at more than $10,000. The 265 capsules include remains of Star Trek creator Gene Roddenberry and members of the show’s original cast, as well as DNA samples from three former US presidents–George Washington, Dwight Eisenhower, and John F. Kennedy. Bringing human remains to the moon is strongly opposed by the Navajo Nation, as allowing them to touch the lunar surface would desecrate a body that many tribes consider sacred. In a statement on January 4, Navajo Nation president Buu Nygren said that NASA or other government officials should address the tribe’s concerns ahead of the launch.

“The moon holds a sacred place in Navajo cosmology,” Nygren wrote. “The suggestion of transforming it into a resting place for human remains is deeply disturbing and unacceptable to our people and many other tribal nations.”

[Related: The moon is 40 million years older than we thought, according to crystals collected by Apollo astronauts.]

According to The New York Times, NASA officials said in a news conference that they were not in charge of this mission and do not have a direct say on the payloads that were sold on Peregrine. “There’s an intergovernmental meeting being set up with the Navajo Nation that NASA will support,” deputy associate administrator for exploration at NASA Joel Kearns said on January 4.

Peregrine 1 was originally scheduled to touch down on the surface of the moon on February 23, near Sinus Viscositatis–or the Bay of Stickiness. This area is named for rock domes that were potentially created by viscous lava.

Update January 9, 2:39PM: Additional information from the company about the technical problems has been added.

The post Peregrine lunar lander experiences ‘critical loss of propellant’ following successful launch appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
These are the exciting space missions slated for launch in 2024 https://www.popsci.com/science/space-missions-2024/ Sun, 07 Jan 2024 17:00:00 +0000 https://www.popsci.com/?p=597640
Illustration of what the Europa Clipper spacecraft will look like flying by Europa, a moon of Jupiter.
Illustration of what the Europa Clipper spacecraft will look like flying by Europa, a moon of Jupiter. NASA/JPL-Caltech

From the Moon’s south pole to an ice-covered ocean world.

The post These are the exciting space missions slated for launch in 2024 appeared first on Popular Science.

]]>

This article was originally featured in The Conversation.

The year 2023 proved to be an important one for space missions, with NASA’s OSIRIS-REx mission returning a sample from an asteroid and India’s Chandrayaan-3 mission exploring the lunar south pole, and 2024 is shaping up to be another exciting year for space exploration.

Several new missions under NASA’s Artemis plan and Commercial Lunar Payload Services initiative will target the Moon.

The latter half of the year will feature several exciting launches: the Martian Moons eXploration mission in September, Europa Clipper and Hera in October, and Artemis II and VIPER to the Moon in November–if everything goes as planned.

I’m a planetary scientist, and here are six of the space missions I’m most excited to follow in 2024.

1. Europa Clipper

NASA will launch Europa Clipper, which will explore one of Jupiter’s largest moons, Europa. Europa is slightly smaller than Earth’s Moon, with a surface made of ice. Beneath its icy shell, Europa likely harbors a saltwater ocean, which scientists expect contains over twice as much water as all the oceans here on Earth combined.

With Europa Clipper, scientists want to investigate whether Europa’s ocean could be a suitable habitat for extraterrestrial life.

The mission plans to do this by flying past Europa nearly 50 times to study the moon’s icy shell, its surface’s geology and its subsurface ocean. The mission will also look for active geysers spewing out from Europa.

This mission will change the game for scientists hoping to understand ocean worlds like Europa.

The launch window–the period when the mission could launch and achieve its planned route–opens Oct. 10, 2024, and lasts 21 days. The spacecraft will launch on a SpaceX Falcon Heavy rocket and arrive at the Jupiter system in 2030.

2. Artemis II launch

The Artemis program, named after Apollo’s twin sister in Greek mythology, is NASA’s plan to go back to the Moon. It will send humans to the Moon for the first time since 1972, including the first woman and the first person of color. Artemis also includes plans for a longer-term, sustained presence in space that will prepare NASA for eventually sending people even farther–to Mars.

Artemis II is the first crewed step in this plan, with four astronauts planned to be on board during the 10-day mission.

The mission builds upon Artemis I, which sent an uncrewed capsule into orbit around the Moon in late 2022.

Artemis II will put the astronauts into orbit around the Moon before returning them home. It is currently planned for launch as early as November 2024. But there is a chance it will get pushed back to 2025, depending on whether all the necessary gear, such as spacesuits and oxygen equipment, is ready.

3. VIPER to search for water on the Moon


VIPER, which stands for Volatiles Investigating Polar Exploration Rover, is a robot the size of a golf cart that NASA will use to explore the Moon’s south pole in late 2024.

Originally scheduled for launch in 2023, NASA pushed the mission back to complete more tests on the lander system, which Astrobotic, a private company, developed as part of the Commercial Lunar Payload Services program.

This robotic mission is designed to search for volatiles, which are molecules that easily vaporize, like water and carbon dioxide, at lunar temperatures. These materials could provide resources for future human exploration on the Moon.

The VIPER robot will rely on batteries, heat pipes and radiators throughout its 100-day mission, as it navigates everything from the extreme heat of lunar daylight–when temperatures can reach 224 degrees Fahrenheit (107 degrees Celsius)–to the Moon’s frigid shadowed regions that can reach a mind-boggling -400 F (-240 C).

VIPER’s launch and delivery to the lunar surface is scheduled for November 2024.

4. Lunar Trailblazer and PRIME-1 missions

NASA has recently invested in a class of small, low-cost planetary missions called SIMPLEx, which stands for Small, Innovative Missions for PLanetary Exploration. These missions save costs by tagging along on other launches as what is called a rideshare, or secondary payload.

One example is the Lunar Trailblazer. Like VIPER, Lunar Trailblazer will look for water on the Moon.

But while VIPER will land on the Moon’s surface, studying a specific area near the south pole in detail, Lunar Trailblazer will orbit the Moon, measuring the temperature of the surface and mapping out the locations of water molecules across the globe.

Currently, Lunar Trailblazer is on track to be ready by early 2024.

However, because it is a secondary payload, Lunar Trailblazer’s launch timing depends on the primary payload’s launch readiness. The PRIME-1 mission, scheduled for a mid-2024 launch, is Lunar Trailblazer’s ride.

PRIME-1 will drill into the Moon–it’s a test run for the kind of drill that VIPER will use. But its launch date will likely depend on whether earlier launches go on time.

An earlier Commercial Lunar Payload Services mission with the same landing partner was pushed back to February 2024 at the earliest, and further delays could push back PRIME-1 and Lunar Trailblazer.

5. JAXA’s Martian Moons eXploration mission


While Earth’s Moon has many visitors–big and small, robotic and crewed–planned for 2024, Mars’ moons Phobos and Deimos will soon be getting a visitor as well. The Japanese Aerospace Exploration Agency, or JAXA, has a robotic mission in development called the Martian Moons eXploration, or MMX, planned for launch around September 2024.

The mission’s main science objective is to determine the origin of Mars’ moons. Scientists aren’t sure whether Phobos and Deimos are former asteroids that Mars captured into orbit with its gravity or if they formed out of debris that was already in orbit around Mars.

The spacecraft will spend three years around Mars conducting science operations to observe Phobos and Deimos. MMX will also land on Phobos’ surface and collect a sample before returning to Earth.

6. ESA’s Hera mission

Hera is a mission by the European Space Agency to return to the Didymos-Dimorphos asteroid system that NASA’s DART mission visited in 2022.

But DART didn’t just visit these asteroids, it collided with one of them to test a planetary defense technique called “kinetic impact.” DART hit Dimorphos with such force that it actually changed its orbit.

The kinetic impact technique smashes something into an object in order to alter its path. This could prove useful if humanity ever finds a potentially hazardous object on a collision course with Earth and needs to redirect it.
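The size of that nudge can be sketched with a back-of-the-envelope momentum calculation. This is a simplified illustration, not mission software: the masses, impact speed, and momentum-enhancement factor below are rough approximations of published DART/Dimorphos estimates, and the function name is invented for this example.

```python
def kinetic_impact_dv(impactor_mass_kg, impact_speed_ms, target_mass_kg, beta=1.0):
    """Velocity change of the target from a head-on kinetic impact.

    beta is the momentum-enhancement factor: it folds in the extra push
    from ejecta thrown off the surface, so beta = 1 means a perfectly
    'sticky' collision with no ejecta recoil.
    """
    return beta * impactor_mass_kg * impact_speed_ms / target_mass_kg

# Rough DART-like numbers (approximate published estimates):
dv = kinetic_impact_dv(
    impactor_mass_kg=580,    # spacecraft mass at impact
    impact_speed_ms=6_100,   # ~6.1 km/s closing speed
    target_mass_kg=4.3e9,    # Dimorphos
    beta=3.6,                # ejecta boosted the push several-fold
)
print(f"delta-v ≈ {dv * 1000:.1f} mm/s")
```

A few millimeters per second sounds tiny, but applied years before a predicted collision, even that small change in speed shifts an asteroid’s arrival time enough to turn a hit into a miss.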

Hera will launch in October 2024, making its way in late 2026 to Didymos and Dimorphos, where it will study physical properties of the asteroids.

Disclosure: Ali M. Bramson receives funding from NASA.

The post These are the exciting space missions slated for launch in 2024 appeared first on Popular Science.


]]>
Central American volcanoes offer clues to Earth’s geological evolution https://www.popsci.com/science/central-america-volcanoes/ Sat, 06 Jan 2024 17:00:00 +0000 https://www.popsci.com/?p=597584
The 3,763-meter-high Fuego volcano is located 35 kilometers southwest of Guatemala City. On June 3, 2018, a major eruption triggered an avalanche that swept through the community of San Miguel Los Lotes and part of a highway in the neighboring town of Alotenango, leaving many dead and missing.
The 3,763-meter-high Fuego volcano is located 35 kilometers southwest of Guatemala City. On June 3, 2018, a major eruption triggered an avalanche that swept through the community of San Miguel Los Lotes and part of a highway in the neighboring town of Alotenango, leaving many dead and missing. ALAIN BONNARDEAUX / UNSPLASH

Along 1,100 kilometers, from Mexico to Costa Rica, lies the Central American volcanic arc, where the variety of magma types make for a geological paradise.

The post Central American volcanoes offer clues to Earth’s geological evolution appeared first on Popular Science.

]]>

This article was originally featured on Knowable Magazine.

Over millions of years, the Earth’s upper layers have performed a dance that has created mountains, volcanoes, continents, ridges and ocean trenches.

Tectonic plates play a key role in this process. These huge, irregular slabs of the Earth’s crust—the solid rock surface where humans live—and the upper part of the underlying mantle “float” on a deeper, warmer layer of the mantle. When two plates meet, sometimes one gives way and ends up sinking little by little into the depths, in a process known as subduction.

If the phenomenon occurs along the entire length of the plate boundary, a line of volcanoes, known as a volcanic arc, forms. There are volcanic arcs in the Andes of South America, Tonga in the South Pacific Ocean, the Aleutian Islands of Alaska, the Philippine Islands and Central America, among others — all of them part of the Pacific Ring of Fire, where earthquakes and volcanoes are common.

The Central American arc is relatively small, just 1,100 kilometers long. But it contains an important variety of different types of magmas, some of which are unique on the planet. It is a “geological paradise” hiding secrets worthy of investigation, says Esteban Gazel, a geochemist at Cornell University in Ithaca, New York. “Central America has a rich combination of conditions that allow the comparison of different natural experiments in magma generation,” he and his two coauthors wrote in a review published in the 2021 Annual Review of Earth and Planetary Sciences.

Central America was born on what used to be an oceanic plate, as a consequence of the subduction of tectonic plates. About 150 million years ago, a slow process began that gradually allowed volcanic islands to grow between the continental masses of northern and southern America. About 3 million years ago, the area now comprising Costa Rica and Panama was finally joined to the north of present-day South America, creating a single landmass from Alaska to Tierra del Fuego.

In the last 11,000 years, 70 volcanoes have been active in this part of the Ring of Fire. Just as volcanoes have created fertile lands where agriculture and cattle-raising have flourished, they have also brought death and displaced populations. In 1902, for example, following a series of earthquakes in the region, the Santa Maria volcano in Guatemala recorded one of the largest and most explosive eruptions of the 20th century. In 1968, an eruption of the Arenal volcano in Costa Rica left 78 dead, and in 2018 the Fuego volcano in Guatemala erupted, killing more than 100.

Gazel carried out his first studies as a geochemist in Costa Rica, his home country. In this interview, edited for length and clarity, he explains how the geology of Central America helps us understand the evolution of our planet.

What is the Central American volcanic arc?

It is an alignment of dozens of volcanoes, not all of which are currently active, stretching from the border between Mexico and northern Guatemala to central Costa Rica. Among them are the volcanoes of Fuego and Tajumulco in Guatemala, Santa Ana in El Salvador, Masaya and Momotombo in Nicaragua, and Arenal and Poás in Costa Rica. This arc is the result of the subduction of the Cocos plate under the Caribbean plate. The Cocos plate is produced in a ridge in the Pacific Ocean. The ridges are submarine fractures through which magma emanates. When magma comes out at a ridge, it pushes the plates, cools, crystallizes and creates new crust.

The Central American volcanic arc is an alignment of dozens of volcanoes stretching from the border between Mexico and northern Guatemala to central Costa Rica. This arc is a product of the subduction of the Cocos plate under the Caribbean plate and generates a variety of different types of magmas, some of them found nowhere else on the planet.

When the plate subducts, it is filled with volatile elements, mainly water and carbon dioxide. At about 60 kilometers deep they become unstable. Because of the high pressure and temperature conditions at those depths, the minerals will break down and the volatiles will come out in a very special form: a hybrid between a liquid and a gas, which is known as a supercritical fluid or a melt. This fluid interacts with the rest of the materials and fuses the rocks of the mantle. This is the process that generates the magma that comes to the surface in the form of lava.

When did this process begin?

We have a history of volcanism in Central America going back many millions of years. The arc has been growing and evolving, creating different versions. It began to form about 150 million years ago, at the time of the dinosaurs.

The tectonic system moves and accommodates itself. For example, the Cocos plate does not have the same subduction angle along the arc’s 1,100 kilometers; the way in which it enters under the Caribbean plate—its entry angle—is different in different areas. This affects the formation of the arc.

When plates converge at subduction zones, the thin, dense oceanic crust sinks beneath the thick, buoyant continental crust. Volcanoes form when the subducting oceanic plate becomes hot enough to melt materials and create magma that rises to the surface as lava.

For example, some 20 million years ago, the arc was where the San Carlos plains are–north of what is now Costa Rica–and to the east of today’s Lake Nicaragua. It resembled today’s Mariana Islands [a group of islands aligned north to south, close to each other and located southeast of Japan]. The arc continued to where Chiapas is today, in Mexico.

Today, the arc is closer to the Pacific coast in northern Central America and toward central Costa Rica. Most of the volcanoes you see today are 500,000 to 250,000 years old. But there are also younger volcanoes. Nicaragua’s Cerro Negro is the youngest, dating from 1867.

The Earth evolves and leaves old versions. The pages of the evolution of the planet are written on these rocks.

What makes the Central American volcanic arc unique?

It is something incredible. The geochemical variations from Nicaragua to Costa Rica are the most extreme on the planet. Throughout Italy there are magmas similar in chemical composition to those of Costa Rica, but all those volcanoes in Italy are going to be very similar to each other.

In the Mariana Trench in the western Pacific Ocean, the magmas are very similar to those of Nicaragua, but all the volcanoes in the Marianas produce magmas very similar to each other.

Italy and the Mariana Trench are separated by thousands of kilometers, while Costa Rica and Nicaragua show important geochemical differences in their magmas even though they are very close. The extreme amount of variation in such a small area makes Central America the only place on the planet with these characteristics, and therefore a unique natural laboratory.

Why is there so much variation?

The regional variation is controlled by the angle of subduction and the composition of the subducted material. The local variations of Central American volcanoes are controlled by the volume of the volcano. Very large volcanoes have more molten material coming from the mantle and less variability. In contrast, smaller volcanoes have much more geochemical variation. Large volumes of magma homogenize the signal, and smaller volumes show us more extremes.

Central America also has a combination of denser oceanic crust and lighter continental crust. What does this mean and what does it teach us?

It is something very unique to Central America. In the Mariana Trench, the whole arc is developed in oceanic crust. In the Andes, in South America, the entire arc develops in continental crust. In Central America, the arc starts in continental crust in Chiapas and Guatemala, and when we enter Costa Rica, we enter the oceanic crust. All the magmas in Costa Rica are oceanic arc magmas that look very much like the Marianas, until about 15 million years ago when things start to change again, and the magmas in Costa Rica start to look more like continental crust. This tells us that this is a very dynamic zone.

Why does this happen?

Fifteen million years ago, a much younger crust began to reach what is now Costa Rica, with the Galapagos seamounts—submerged oceanic islands in the Pacific Ocean that once were like what are now the Galapagos Islands of Ecuador. That is what is known as a hot spot, a thermal anomaly that brings material from the Earth’s lower mantle. These islands travel with the plate, submerge, and eventually collide and enter the Costa Rica-Panama subduction zone, where the Cocos plate dives under the Caribbean plate.

This plate has a unique geochemical signature that shows up in the volcanoes of Costa Rica and Panama. The conclusion from a lot of my work over the years is that Costa Rica was not continental crust, and since about 15 million years ago one of the youngest continental crustal terranes [fragments consisting of a distinct and recognizable series of rock formations that has been transported by plate tectonic processes] on the entire planet has been formed by the melting of these seamounts in the subduction system. So it is also a natural laboratory.

Cocos Island [an island and national park located in the Pacific Ocean, some 500 kilometers off the Costa Rican coast] is part of a series of seamounts. These mountains are entering the subduction zone and are melting. As the crust evolved from oceanic to continental, its density was reduced, allowing Costa Rica and Panama to emerge from the sea. This process contributed to the closure of the Central American isthmus, which finally occurred about 3 million years ago. The birth of the volcanic arc helped to generate sediments in the area, and tectonic activity raised the surface until a complete closure occurred.

So, is Cocos Island getting closer to the mainland?

Yes. In several million years, it will subduct and then come out again as a volcano like Poás or Irazú in central Costa Rica today, where the Galapagos signature is extremely evident.

What are the tools you use to investigate volcanic arcs?

The main tools are the volcanic rocks, obtained after an eruption, because their chemical composition is controlled by these processes. We analyze them using electron microscopes, spectroscopes and mass spectrometry to infer their origin, age and evolution.

How has your field evolved over the years?

I started doing research in volcanology and geochemistry at the Nuclear Research Center of the University of Costa Rica. I started there because geology and volcanology at that time were very descriptive. The nuclear physics researchers implemented more quantitative measurements. When I started my career in the United States, my science evolved to be much more numerical and less descriptive.

In the last decade, and especially the last five years, as a community we have moved to a volcanology that is incredibly precise and quantitative.

That’s only going to continue to expand. We’re already at a point where to be a modern geologist you have to know about geochemistry equipment, about programming, about statistics. Now there is machine learning and artificial intelligence, and all those tools are being used along with field and laboratory data.

Geologically, what does the future hold for this region?

What we must understand is that in Central America we live in a tectonic and volcanically active zone. Prevention and good construction really make a significant difference to save lives in the face of eruptions and earthquakes. Because volcanoes are going to keep erupting and earthquakes are going to keep happening. It is the natural process of evolution.

What new research is coming up?

In January 2024, a group with professionals from different parts of the world will go to Poás volcano in Costa Rica to test new technologies to study volcanic processes and with the goal of someday having a better idea of how to forecast a volcanic eruption. The person organizing it from the United States was my post-doc mentor.

And my lab was just funded by the US National Science Foundation to study Plinian eruptions of mafic composition. Plinian eruptions are the highest-magnitude eruptions, like the Vesuvius eruption that destroyed Pompeii. They are generally felsic, meaning they are high in silica. However, there is a group of mafic eruptions that are high in magnesium and iron that also have that magnitude. In Central America, Nicaragua’s Masaya volcano holds a record for one such eruption.

How can Central America teach us about Earth’s formation more broadly?

If we want to understand how the continents were formed in the early part of the Earth’s evolution, one of the best places to work is Central America.

If we want to study the process of subduction, how it starts and how the composition of continents changes, we have that history in Central America.

If we want to study samples of materials that traveled to the interior of the Earth, that “visited” the mantle in a subduction zone and came up again, they are exposed in Guatemala.

If we want to understand the mechanisms that start very high-intensity eruptions, we have a record of eruptions in Nicaragua and Costa Rica.

So Central America is already a natural laboratory, and it will continue to be a place where many geological processes can be studied.

Additional examples where Costa Rica has something that is very unique are the exposed oceanic crust on the Nicoya Peninsula and the exposed mantle on the Santa Elena Peninsula. In Santa Elena, we have the terrestrial mantle exposed, and you see rocks, known as peridotites, which formed at depths of 50 to 70 kilometers and were brought to the surface by tectonic activity. These are incredibly unique and very interesting sections, which have been the focus of many years of research that have allowed us to better understand the structure, composition and history of the mantle.

Central America will continue to be studied for its diversity: a geological paradise.

Article translated by Debbie Ponchner

This story is part of the Knowable en español series on science that affects or is conducted by Latinos in the United States, supported by HHMI’s Science and Educational Media Group.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews. Sign up for the newsletter.

The post Central American volcanoes offer clues to Earth’s geological evolution appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
This is what Uranus and Neptune may really look like https://www.popsci.com/science/uranus-neptune-really-look-like/ Fri, 05 Jan 2024 16:00:00 +0000 https://www.popsci.com/?p=597561
Voyager 2/ISS images of Uranus and Neptune released shortly after the Voyager 2 flybys in 1986 and 1989, respectively, compared with a reprocessing of the individual filter images in this study to determine the best estimate of the true colors of these planets.
Voyager 2/ISS images of Uranus and Neptune released shortly after the Voyager 2 flybys in 1986 and 1989, respectively, compared with a reprocessing of the individual filter images in this study to determine the best estimate of the true colors of these planets. Patrick Irwin

New study shows that our solar system’s most distant planets' true colors are actually similar.

The post This is what Uranus and Neptune may really look like appeared first on Popular Science.

]]>

For decades, images of Neptune have shown the planet with a deep blue hue, while Uranus seemed more green. However, these two ice giants may actually look more similar to each other than astronomers previously believed. According to a study published January 5 in Monthly Notices of the Royal Astronomical Society, the true colors of our solar system’s farthest planets could both be similar pale shades of greenish blue.

[Related: The secret to Voyagers’ spectacular space odyssey.]

Images versus reality

NASA’s Voyager 2 remains the only spacecraft to have flown by both ice giants, and it gave us the first detailed images of these far-flung planets. Voyager 2 flew by Uranus in 1986, and its images revealed a planet with a pale cyan or blue color. The spacecraft flew by Neptune in 1989, and that imagery showed a planet with a rich blue color.

However, astronomers have long understood that most modern images of both planets don’t accurately reflect their true colors. Voyager 2 captured images of each planet through separate color filters, and these single-color images were then combined into composites. The composites were not always accurately balanced, particularly for Neptune, which was made to appear too blue. The contrast in the early Voyager images of Neptune was also strongly enhanced to better reveal the planet’s clouds and winds.

“Although the familiar Voyager 2 images of Uranus were published in a form closer to ‘true’ color, those of Neptune were, in fact, stretched and enhanced, and therefore made artificially too blue,” study co-author and University of Oxford astronomer Patrick Irwin said in a statement. “Even though the artificially-saturated color was known at the time amongst planetary scientists–and the images were released with captions explaining it–that distinction had become lost over time.”

Creating a more accurate view

In the new study, the team used data from the Hubble Space Telescope’s Space Telescope Imaging Spectrograph (STIS) and the Multi Unit Spectroscopic Explorer (MUSE) on the European Southern Observatory’s Very Large Telescope.

In both STIS and MUSE data, each pixel contains a continuous spectrum of colors, so the observations can be processed to determine each planet’s accurate color directly, rather than reconstructing it from images taken through individual filters.

The team used the data to rebalance the composite color images that were recorded by Voyager 2’s onboard camera and by the Hubble Space Telescope’s Wide Field Camera 3. The rebalancing revealed that both Uranus and Neptune are actually a similar pale shade of greenish blue. Neptune retains a slight hint of extra blue, which the model traced to a thin layer of haze on the planet.
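To give a rough sense of what rebalancing a composite involves (a toy example of our own, not the study’s actual pipeline), correcting an over-blue composite amounts to rescaling each color channel by a gain derived from better-calibrated measurements:

```python
# Toy illustration: rebalance an (R, G, B) composite pixel by per-channel gains.
# The pixel value and gains below are hypothetical, chosen only to show how
# boosting red/green and trimming blue shifts a deep blue toward a pale
# greenish blue.
def rebalance(rgb, gains):
    """Scale each 0-255 channel by its gain, clamping at 255."""
    return tuple(min(255, round(channel * gain)) for channel, gain in zip(rgb, gains))

over_blue_neptune = (40, 90, 200)   # hypothetical over-blue composite pixel
gains = (1.6, 1.4, 0.7)             # hypothetical calibration gains

print(rebalance(over_blue_neptune, gains))  # (64, 126, 140): a paler greenish blue
```

The same per-channel scaling, applied across every pixel, is the basic idea behind producing a color-balanced version of a filtered composite.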

The changing colors of Uranus

This research also provides a likely answer to why Uranus changes color slightly during its 84 year-long orbit around the sun. The team first compared images of Uranus to measurements of its brightness that were taken at blue and green wavelengths by the Lowell Observatory in Arizona from 1950 to 2016. These measurements showed that Uranus looks a little greener during its summer and winter solstices, when its poles are pointed towards the sun. However, during the equinoxes–when the sun is over the planet’s equator–it appears to have a more blue tinge. 

Animation of seasonal changes in color on Uranus during two Uranus years, running from 1900 to 2068 and starting just before southern summer solstice, when Uranus’s south pole points almost directly towards the Sun. The left-hand disc shows the appearance of Uranus to the naked eye, while the right-hand disc has been color stretched and enhanced to make atmospheric features clearer. In this animation, Uranus’s spin has been slowed down by over 3000 times so that the planetary rotation can be seen, with discrete storm clouds seen passing across the planet’s disc. As the planet moves towards its solstices a pale polar ‘hood’ of increasing cloud opacity and reduced methane abundance can be seen filling more of the planet’s disc leading to seasonal changes in the overall color of the planet. The changing size of Uranus’s disc is due to Uranus’s distance from the Sun changing during its orbit. Patrick Irwin/University of Oxford

One already established reason for the change is Uranus’ highly unusual spin. The planet spins almost on its side during orbit, so its north and south poles point almost directly towards the sun and Earth during its solstices. Any changes to the reflectivity of Uranus’ poles would have a major impact on the planet’s overall brightness when viewed from the Earth, according to the authors. What was less clear to astronomers was how and why this reflectivity differs. The team developed a model to compare the bands of colors of Uranus’s polar regions to those of its equatorial regions.

They found that the polar regions are more reflective at green and red wavelengths than at blue wavelengths. This is partly because methane gas absorbs red light, and methane is only about half as abundant near Uranus’ poles as it is at the equator.

[Related: Neptune’s bumpy childhood could reveal our solar system’s missing planets.]

However, this wasn’t enough to fully explain the color change so the researchers added a new variable to the model in the form of a ‘hood’ of gradually thickening icy haze which has previously been observed when Uranus moves from equinox to summer solstice. They believe that this haze is likely made up of methane ice particles.

After the team added this haze to the model, the simulated ice particles further increased the reflection at green and red wavelengths at the planet’s poles. This explained why Uranus looks greener at the solstice: there is less methane at the poles, combined with an increased thickness of methane ice particles.

“The misperception of Neptune’s color, as well as the unusual color changes of Uranus, have bedeviled us for decades,” Heidi Hammel, of the Association of Universities for Research in Astronomy, said in a statement. “This comprehensive study should finally put both issues to rest.” Hammel is not an author of the new study.

Filling in this gap between the public perception of Neptune and its reality shows how data can be manipulated to show off certain features of a planet or enhance visualizations. 

“There’s never been an attempt to deceive,” study co-author and University of Leicester planetary scientist Leigh Fletcher told The New York Times. “But there has been an attempt to tell a story with these images by making them aesthetically pleasing to the eye so that people can enjoy these beautiful scenes in a way that is, maybe, more meaningful than a fuzzy, gray, amorphous blob in the distance.”

The post This is what Uranus and Neptune may really look like appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Lexington, Kentucky sent a tourism ad to ‘extraterrestrials’ with a DIY laser rig https://www.popsci.com/technology/lexington-kentucky-alien-tourist-campaign/ Fri, 05 Jan 2024 15:08:57 +0000 https://www.popsci.com/?p=597425
Robert Lodder sends Lexington tourism data ad into space at evening launch event with horses in background
Robert Lodder prepares to send VisitLEX's tourism ad towards the Trappist-1 system in October 2023. Credit: VisitLEX

The city hopes any potential aliens in the TRAPPIST-1 system will learn bourbon, horses, and bluegrass are worth the 40 light-year journey, although the message might not survive the trip.

The post Lexington, Kentucky sent a tourism ad to ‘extraterrestrials’ with a DIY laser rig appeared first on Popular Science.

]]>

Signs of humanity have traveled through space ever since the very first radio signals left the Earth’s atmosphere. We even made concerted efforts to broadcast evidence of our existence through projects like the historic Voyager spacecraft recordings—but an official interstellar tourism campaign advertising alien vacations to the “Horse Capital of the World?” That’s a first.

[ Related: How scientists decide if they’ve actually found signals of alien life ]

The Lexington Convention and Visitors Bureau (VisitLEX) recently turned to University of Kentucky professor and longtime SETI advocate Robert Lodder to assemble experts from various disciplines, including linguistics, philosophy, and design, to attract a unique target audience: (potential) extraterrestrial lifeforms. More specifically, any extraterrestrial life possibly residing within the TRAPPIST-1 system.

Located approximately 40 light-years away in the constellation Aquarius, TRAPPIST-1 is by far the most studied planetary system outside of our own. There, seven rocky planets orbit a small red dwarf star, three of which reside within its “Goldilocks zone”—the region astrobiologists believe could be conducive to supporting life.

The VisitLEX campaign’s bitmap image with annotations from its designers. Credit: VisitLEX

“Many previous transmissions have employed the language of mathematics for communication, and our team did, too,” Lodder tells PopSci. “But we decided that extraterrestrials might be more interested in things unique to Planet Earth than Universal Truths like mathematics, so if we seek to attract visitors, it would be best to send something interesting and uniquely Earth.”

Collaborators ultimately decided on a package including black-and-white photographs of rolling Kentucky bluegrass hills, an audio recording of local blues legend Tee Dee Young, and an original bitmap illustration—a type of image in which a simple grid of shaded and unshaded blocks forms a rudimentary picture. Among other subjects, this bitmap art includes renderings of humans, horses, and the elements necessary for life (as we know it), alongside the chemical structures of ethanol and water—aka alcohol or, more specific to Kentucky, bourbon.
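To give a sense of the format (this tiny grid is our own illustration, not the VisitLEX artwork), a bitmap message is just rows of on/off cells that together form a crude picture:

```python
# Each string is one scan line of the bitmap; "1" is a filled cell, "0" is empty.
rows = [
    "00100",
    "01110",
    "11111",
    "00100",
    "00100",
]

# Render the grid with "#" for filled cells and "." for empty ones.
rendered = ["".join("#" if bit == "1" else "." for bit in row) for row in rows]
print("\n".join(rendered))  # prints a small upward arrow
```

Because the encoding is nothing more than a sequence of ones and zeros, the same grid can be sent as a one-dimensional stream of light pulses, with the grid width needed to reconstruct it.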

With the message’s contents compiled, Lodder’s team then converted their advertisement into a one-dimensional array of light pulses using a computer-laser interface aimed at TRAPPIST-1. On a clear, dark autumn evening, VisitLEX hosted researchers and local guests at Kentucky Horse Park to fire off their tourism package into space.

While lasers are increasingly replacing radio communications in space thanks to their higher data transmission rates and lower costs, transmissions must be strong enough to travel millions of miles without degrading. This requires equally capable equipment, such as the Deep Space Optical Communications array aboard NASA’s Psyche spacecraft.

VisitLEX’s laser is far weaker than NASA’s equipment, but Lodder believes that at least some of the transmission’s light photons “will almost certainly” reach TRAPPIST-1. That said, it’s difficult to know if there will be enough photons to fully decode their message.

“The alien receiving technology could be worse than ours, or much better,” says Lodder.

[ Related: JWST just scanned the skies of potentially habitable exoplanet TRAPPIST-1 b ]

Regardless, if ETs ever do make a pitstop in Lexington because of VisitLEX’s interstellar commercial, it likely won’t happen until at least the year 2103—40 years for the broadcast to reach TRAPPIST-1, followed by another 40 years for any travelers to make the approximately 235-trillion-mile trek back to Earth, assuming they’re capable of traveling at the speed of light. It all might sound like a lot, both logistically and technologically, but both VisitLEX and Lodder’s team swear it’s worth the planning.
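The back-of-envelope timeline works out as follows (assuming the message left Earth in 2023 and that any reply travels at light speed; the miles-per-light-year constant is approximate):

```python
MILES_PER_LIGHT_YEAR = 5.88e12  # approximate
DISTANCE_LY = 40                # distance to TRAPPIST-1 in light-years

one_way_miles = DISTANCE_LY * MILES_PER_LIGHT_YEAR  # roughly 235 trillion miles

launch_year = 2023
signal_arrival = launch_year + DISTANCE_LY     # broadcast reaches TRAPPIST-1 in 2063
earliest_visit = signal_arrival + DISTANCE_LY  # light-speed travelers arrive in 2103

print(f"{one_way_miles:.3g} miles one way; earliest possible visit: {earliest_visit}")
```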

[ Related: To set the record straight: Nothing can break the speed of light ]

If there’s anyone out there listening and able to pick up this kind of admittedly weak signal—and if they have a taste for oak barrel aged bourbon and/or horses—well…

Update 1/12/24 9:00am: PopSci received the following response from Jan McGarry, Next Generation Satellite Laser Ranging Systems Deputy Lead at the NASA Goddard Space Flight Center, and her retired colleague, John Degnan:
“The distance to the nearest star is 2 light years away or many orders of magnitude farther than the edge of our solar system (Pluto). Since the strength of a laser communications link is proportional to 1 divided by the distance squared, it is highly unlikely that a laser system would be able to transfer any meaningful amount of information over that distance let alone one 20 times farther away where the signal would be 400 times smaller.”
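The “400 times smaller” figure follows directly from the inverse-square relationship McGarry and Degnan describe: received signal strength falls as one over distance squared, so a target 20 times farther away returns a signal 20² = 400 times weaker.

```python
def relative_signal(distance_ratio: float) -> float:
    """Relative received signal for a target distance_ratio times farther away,
    per the inverse-square law (signal strength proportional to 1 / distance**2)."""
    return 1.0 / distance_ratio ** 2

print(relative_signal(20))  # 0.0025, i.e. 1/400 of the original signal
```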

The post Lexington, Kentucky sent a tourism ad to ‘extraterrestrials’ with a DIY laser rig appeared first on Popular Science.


]]>
The best binoculars for astronomy in 2024 https://www.popsci.com/gear/best-binoculars-for-astronomy/ Thu, 04 Jan 2024 21:00:00 +0000 https://www.popsci.com/?p=597338
The best binoculars for astronomy
Brandt Ranj / Popular Science

An excellent alternative to bulky telescopes, these binoculars are easy to store and transport while giving you access to all wonders in the night sky.

The post The best binoculars for astronomy in 2024 appeared first on Popular Science.

]]>

We may earn revenue from the products available on this page and participate in affiliate programs. Learn more ›

Best overall: Canon 12×36 IS III
SEE IT

If you want sharp, bright, colorful, stabilized, distortion-free images, you want Canon binoculars.

Best high power: Celestron SkyMaster 25X100
SEE IT

Essentially the equivalent of a telescope in your hands, these Celestron binoculars offer a whopping 25x magnification and an objective lens measuring 100mm so you’ll see epic details in the night sky.

Best budget: Celestron UpClose G2 10×50
SEE IT

While they require collimation—the alignment of the lenses—these budget binoculars use multi-coated optics, resulting in a quality image with good contrast and mostly accurate color once adjusted.

While telescopes are popular for stargazing, binoculars for astronomy offer a more portable way to gaze into the heavens. Binoculars are extremely versatile, working well for general terrestrial observations as well as celestial surveying. You can use them handheld or on a mount. Whether you want to observe the moon or casually stargaze, the best binoculars for astronomy are great to take into nature to bring you closer to the stars.

How we chose the best binoculars for astronomy

Binoculars for astronomy require more specific specs than general-purpose binoculars, so we prioritized options with larger objective lens size and higher magnification. We also aimed to select options at various price points suitable for everyone from beginners to expert stargazers. While binoculars with image stabilization are excellent for astronomy use, they are quite expensive, so we’ve included models both with and without stabilization. In making our selections, we considered optical quality, size and weight, eye relief, and build quality. 

The best binoculars for astronomy: Reviews & Recommendations

Binoculars for astronomy will allow you to gaze up at the moon, spot deep space objects, check out planets, and more. While these advanced optics can be used handheld, we’d recommend a tripod or mount of some variety to offer more stable, high-quality night sky views.

Best overall: Canon 12×36 IS III

Canon

SEE IT

Specs

  • Objective lens diameter: 36mm
  • Magnification: 12x
  • Field of view: 5 degrees
  • Eye relief: 0.57 inches (14mm)
  • Weight: 1.5 pounds
  • Dimensions: 5 x 6.85 x 2.76 inches

Pros

  • Effective image stabilization
  • Relatively lightweight and compact
  • Excellent optical quality

Cons

  • Pricey

Canon makes some of the best image-stabilized binoculars available, so it should be no surprise that our top pick comes from the imaging giant. The Canon 12×36 IS III binoculars for astronomy offer the company’s typical high-end optics and Porro II prisms, resulting in a sharp, bright, colorful image. It also features a double field flattener, which produces a distortion-free image.

The 36mm objective lens diameter is slightly lower than what is typically recommended for astronomy use. However, these still offer plenty of light gathering for stargazing. You’ll also be able to use them for things like bird watching, adding to their versatility. Plus, the smaller objective lens results in a more compact size ideal for most people, which is why it earned our top spot. These Canon binos provide 12x magnification, allowing you to see details on the moon’s surface. 

What really makes these optics stand out is the image stabilization. Canon built these with technology similar to what they use in their EF lenses, resulting in much sharper images when holding the binoculars. You’ll need two AA batteries for power, and they will typically get up to 12 hours of use. Simply put, once you use IS binoculars, you won’t want to go back to anything else. 

Best splurge: Canon 10×42 L IS WP

Canon

SEE IT

Specs

  • Objective lens diameter: 42mm
  • Magnification: 10x
  • Field of view: 6.5 degrees
  • Eye relief: 0.63 inches (16mm)
  • Weight: 2.4 pounds
  • Dimensions: 5.4 x 6.9 x 3.4 inches

Pros

  • Excellent stabilization
  • High-quality optics
  • Rugged build 
  • Plenty of eye relief

Cons

  • Expensive
  • Fairly heavy and bulky

If money is no object and you want the best of the best, the Canon 10×42 L IS WP binoculars are the way to go. These powerful binoculars for astronomy offer a large 42mm objective lens, capturing tons of light for viewing even dim celestial objects. The 10x magnification is plenty for most astronomical observations, and the generous 16mm eye relief accommodates a range of users.

Like the pair mentioned above, these feature Canon’s impressive image stabilization. It will almost look like you are using a tripod, giving you sharp, clear views. The ‘L’ in the name refers to Canon’s top-tier line of optics. These feature two ultra-low dispersion (UD) lens elements on each side, which effectively correct for chromatic aberration. Images will be sharp, bright, and vibrant, offering excellent views of the stars.

Of course, there are downsides to these binos. First, they are expensive. If you are a casual user, they will be overkill. Second, they are fairly bulky and heavy. You likely won’t want to hold them for long periods, and they will add weight to your pack if you are hiking. But this is the pair to get if you are serious about stargazing with your binoculars. 

Best high power: Celestron SkyMaster 25X100

Celestron

SEE IT

Specs

  • Objective lens diameter: 100mm
  • Magnification: 25x
  • Field of view: 3 degrees
  • Eye relief: 0.59 inches (15mm)
  • Weight: 8.75 pounds
  • Dimensions: 10.1 x 5.1 x 15.28 inches

Pros

  • Massive object lens diameter gathers tons of light
  • Lots of magnification
  • Comes with a tripod adapter
  • Excellent image quality

Cons

  • Very bulky and heavy
  • Not for handholding

Celestron is one of the top telescope producers, so it makes sense that the company would also produce top-notch binoculars for astronomy. The Celestron SkyMaster 25×100 is essentially the equivalent of a telescope in your hands. It offers a whopping 25x magnification and an objective lens measuring 100mm. That massive lens lets in tons of light. Paired with the high level of magnification, you’ll see epic details in the night sky, such as Jupiter’s belts, star clusters, and more.

These binos feature BaK-4 prisms and multi-coated lenses, enhancing contrast for superb viewing quality. They are ruggedly built with a water-resistant design. The SkyMaster also utilizes a rubber-armored housing, which protects them from damage and provides a better grip. 

Unfortunately, such power comes with great responsibility. In this case, that means lots of weight. The SkyMaster weighs 8.75 pounds and, naturally, is larger than any other option on our list. They also don’t offer any image stabilization. As a result, you won’t want to hold these by hand for very long. Luckily, it has a built-in tripod adapter, making it easier to hook up to a tripod for hands-free use. All of this also comes at a rather reasonable price, so you don’t have to break the bank to see craters on the moon. 

Best compact: Nikon PROSTAFF P7 10×42

Nikon

SEE IT

Specs

  • Objective lens diameter: 42mm
  • Magnification: 10x
  • Field of view: 7 degrees
  • Eye relief: 0.62 inches (15.7mm)
  • Weight: 1.3 pounds
  • Dimensions: 5.91 x 5.12 x 2.17 inches

Pros

  • Compact and lightweight
  • Waterproof and fogproof
  • Adjustable eyecups and long eye relief
  • Versatile

Cons

  • No tripod adapter

Weight is an important consideration when backpacking or hiking, even when you hope to take advantage of the dark skies. That’s where the Nikon PROSTAFF P7 binoculars come into play. They are very compact and lightweight, coming in at just 1.3 pounds and just under six inches long. They will be much easier to bring along on your trips, and easier to hold for longer viewing sessions as well.

The PROSTAFF P7 are also ruggedly built and suited for adventures. They are waterproof to 3.3 feet and nitrogen-filled for fogproof performance. The 0.62-inch eye relief works well for those who wear glasses, and the turn-and-slide eyecups are adjustable to work well for a group of people. A rubber-armored body protects from drops and bumps and makes them easier to hold. Nikon used a water- and oil-repellent coating on both the objective and eyepiece lenses, which helps keep them free of water and fingerprints. 

Although these are not specifically designed for stargazing, they will definitely do the job. The 10x magnification is enough for casual night sky viewing, and the 42mm objective lens will gather plenty of light. Nikon designed these with high-quality optics and Phase-Correction coating for superb image quality and clarity. It also features a dielectric high-reflective multilayer prism coating, which maximizes light transmission. Finally, the locking diopter ring, typically only found on much more expensive optics, keeps your setting locked in.

Best budget: Celestron UpClose G2 10×50 

Celestron

SEE IT

Specs

  • Objective lens diameter: 50mm
  • Magnification: 10x
  • Field of view: 6.8 degrees
  • Eye relief: 0.47 inches (12mm)
  • Weight: 1.69 pounds
  • Dimensions: 8 x 7 x 2.5 inches

Pros

  • Very affordable
  • Water-resistant
  • Rubber coating prevents slips
  • Good optical quality

Cons

  • Requires collimation
  • Not nitrogen-filled

You don’t have to spend a fortune to get started with binoculars for astronomy. This budget-friendly pair also happens to be great for beginner stargazers. They are compact and lightweight, making them ideal binoculars for hiking. Yet they still offer 10x magnification and a 50mm objective lens. Those specs will allow you to see the moon in all its glory easily, as well as some star clusters and more.

Celestron built these with a rugged design, including a rubber coating to protect from drops and improve grip. They are water resistant, so you won’t need to panic if you get caught in a rain shower. They are not nitrogen-filled, though, so they tend to fog up. 

The main downside to this budget set of binos is that they require collimation—the alignment of the lenses. While not difficult, it does take time away from your stargazing. The good news is that Celestron used multi-coated optics, which results in a quality image with good contrast and mostly accurate color. If you are just getting started or want some kid-friendly binoculars for astronomy, these will do a great job.

What to consider when shopping for the best binoculars for astronomy

Binoculars are, for the most part, rather simple devices without much in the way of fancy technology. But, there is some specific lingo that you should be aware of when shopping for binoculars for astronomy to ensure you pick the right optics for viewing the night sky.

Magnification 

All binoculars include two numbers in the name, such as 10×50. The first number refers to magnification. For stargazing, you’ll typically want at least 10x magnification. If you want to see the moon or planets in more detail or search for smaller deep space objects, 12x will be better. However, remember that more magnification will exaggerate movement while holding the binoculars. So, if you will only handhold the binos, we suggest sticking to 10x or lower.

Objective lens

The second number tells you the size of the objective lens, measured in millimeters. In our example above, that would be 50mm. The objective lens is the lens closest to the object you’re viewing, or the one opposite of the eyepieces. This number tells you how large the binoculars are and how much light they let in. 

Larger objective lenses collect more light, which is better for stargazing. But it also means larger binoculars, which makes them harder to handhold. As a result, you’ll need a balance unless you only plan on using a tripod or mount of some type. For astronomy use, you’ll want at least 40-50mm, though 50-60mm will allow you to see fainter celestial objects. 

Image stabilization

If you’ve ever spent time looking through binoculars, you may have noticed how hard it is to keep them steady. That movement gets even more dramatic in higher-powered binoculars for astronomy, which can make detailed observations quite challenging. If you want superb image quality and don’t always want to rely on a tripod, look for a pair of image-stabilized binoculars. 

There are different types of image stabilization in binos. Some offer passive stabilization (also called mechanical stabilization) with suspended prisms, which don’t require any batteries. Other types of stabilization include digital, optical, and hybrid stabilization (a combination of digital and optical). Each type has pros and cons, though hybrid stabilization offers the best results, albeit at the highest cost. 

Roof prism vs. Porro prism

There are two varieties of binocular design: Roof prisms and Porro prisms. In Porro prism binoculars, the objective lens is offset from the eyepiece, requiring the light to travel in a zig-zag pattern. This design can result in a higher-quality image, but Porro prism models are bulky and heavy compared to roof prism binos.

The prisms in Roof prism binoculars line up closely, allowing the objective lens to be in a straight line from the eyepiece. The Roof prism design results in a more compact, lightweight form factor. However, it is a more complicated design, which results in a much higher price tag compared to Porro prism binoculars. 

Exit pupil

Exit pupil refers to the round, bright image you see when looking through the eyepiece. The larger the diameter, the brighter the image, which is important for astronomy. To calculate this, divide the objective lens diameter by the magnification. So, for example, a 10×50 binocular would offer an exit pupil of 5mm. 

The key here is to find binoculars for astronomy with an exit pupil roughly the same size as the human pupil when dilated for darkness. In dark conditions, most pupils dilate to around 7mm. Opting for binoculars with an exit pupil of 2.5mm will make the image look quite dark.
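The exit pupil arithmetic above is simple enough to sketch in a few lines (the “10x50”-style spec string is the standard magnification-by-objective naming convention discussed earlier):

```python
def exit_pupil(spec: str) -> float:
    """Compute the exit pupil in mm from a binocular spec like '10x50'
    (magnification x objective lens diameter in mm)."""
    magnification, objective_mm = (float(part) for part in spec.lower().split("x"))
    return objective_mm / magnification

# A dark-adapted human pupil opens to roughly 7mm, so a larger exit pupil
# (up to about that figure) means a brighter view of the night sky.
for spec in ("10x50", "12x36", "25x100"):
    print(spec, "->", exit_pupil(spec), "mm")
```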

Eye relief

Eye relief is the distance from the eyepiece lens to the exit pupil, where the image is formed. Put simply, it is how far you can hold the binoculars away from your eyes and still see the full image without vignetting. If you wear glasses, you’ll need binoculars with longer eye relief. Be sure to go with an eye relief greater than 14mm if you use glasses.  

Weight

Weight might not be the first thing that comes to mind when choosing binoculars for astronomy. However, it can be incredibly important. If you plan on handholding your binoculars, look for a more compact, lightweight option. Otherwise your arms will tire quickly, but more importantly, they will be hard to hold steady. And if you can’t hold them steady, you won’t get a very good view of the night sky. 

If you opt for a heavier option or plan long observation sessions with high magnification, we recommend mounting the binoculars to a tripod. 

FAQs

Q: What size binoculars are best for astronomy?

Binoculars with 10x magnification and an objective lens of 50mm (10×50) are the most popular option for astronomy, thanks to the balance of size and magnification. However, if you want to see objects in more detail or hope to view faint deep space objects, something like 15×70 or larger is best.

Q: What night-sky objects can you see with binoculars?

Depending on the objective lens and magnification on your binoculars, you’ll be able to use them to view the moon, planets, star clusters, nebulae, and even some galaxies. 

Q: Can you use any binoculars for astronomy?

While you can certainly look up at the stars with any binoculars, not just any pair will allow for in-depth astronomy. For astronomy use, you’ll need optics that are able to gather plenty of light and offer higher magnification than general use. Budget and travel-friendly binoculars typically won’t make the cut as a result. 

Q: How much should I spend on binoculars for astronomy?

How much you should spend on binoculars for astronomy depends on how you plan on using them and what you hope to view. For beginners, a few hundred dollars is plenty. For those wanting epic night sky views, you’ll want to spend closer to $1,000 for high-quality optics, impressive image stabilization, and plenty of light-gathering abilities. 

Final thoughts on the best binoculars for astronomy

Binoculars for astronomy can serve as an excellent alternative to bulky telescopes. These optics allow you to view celestial objects on the go, making it a great choice for camping, hiking, or travel of any type. Binoculars are also easier to store, which is ideal for those living in smaller spaces. Despite their convenience, they still allow you to see plenty of wonders in the night sky.

Why trust us

Popular Science started writing about technology more than 150 years ago. There was no such thing as “gadget writing” when we published our first issue in 1872, but if there was, our mission to demystify the world of innovation for everyday readers means we would have been all over it. Here in the present, PopSci is fully committed to helping readers navigate the increasingly intimidating array of devices on the market right now.

Our writers and editors have combined decades of experience covering and reviewing consumer electronics. We each have our own obsessive specialties—from high-end audio to video games to cameras and beyond—but when we’re reviewing devices outside of our immediate wheelhouses, we do our best to seek out trustworthy voices and opinions to help guide people to the very best recommendations. We know we don’t know everything, but we’re excited to live through the analysis paralysis that internet shopping can spur so readers don’t have to.

The post The best binoculars for astronomy in 2024 appeared first on Popular Science.


]]>
NASA is headed for the moon next week, and it’s bringing lots of weird stuff https://www.popsci.com/science/nasa-vulcan-lunar-lander/ Thu, 04 Jan 2024 20:52:10 +0000 https://www.popsci.com/?p=597513
Rendering of Astrobotic Peregrine lunar lander on moon's surface
The Astrobotic Peregrine lander is scheduled to make its soft lunar landing in late February. Astrobotic

United Launch Alliance's unmanned spacecraft takes off on January 8, 2024, carrying new tools, tiny robots, and... Gene Roddenberry’s ashes.

The post NASA is headed for the moon next week, and it’s bringing lots of weird stuff appeared first on Popular Science.

]]>

A rocket stocked with scientific instruments, technological gadgets, and, quite literally, bitcoin is about to head for the moon’s surface. United Launch Alliance’s NASA-funded Vulcan Centaur is slated to lift off in the early hours of January 8 from Cape Canaveral, Florida, to begin its nearly two-month journey. After traveling roughly 238,900 miles, the 2,829-pound Peregrine lander, built by private space company Astrobotic, should arrive at the Gruithuisen Domes within the moon’s Sinus Viscositatis region. If successful, it will mark the first US landing on Earth’s satellite since NASA’s Apollo 17 mission in 1972.

As Gizmodo notes, more than 20 payloads from six countries will be aboard the Peregrine lander—some meant for research, others purely symbolic gestures ahead of Artemis astronauts’ planned touchdown later this decade.

[Related: Why scientists think it’s time to declare a new lunar epoch.]

The technology aboard

NASA intends to utilize a number of new tools and analysis tech aboard the lander, including a Near-Infrared Volatile Spectrometer System (NIRVSS) and Neutron Spectrometer System (NSS) meant for identifying substances such as water on the lunar surface. A Laser Retro-Reflector Array (LRA) will also provide incredibly precise distance measurements between the moon and Earth, while the Linear Energy Transfer Spectrometer (LETS) will assess lunar surface radiation to advance future astronauts’ safety.

Like LETS, Germany’s M-42 radiation detector will analyze potential mission dangers, while Mexico’s Colmena robot swarm will deploy and assemble itself into a solar panel. Not to be outdone, Carnegie Mellon University’s tiny, student-built Iris lunar rover could become the first US robot on the moon if all goes as planned. The university is also sending a MoonArk lightweight time capsule containing poems, music, nano-scale objects, Earth samples, and images.

Also, that

Despite the crypto industry’s many critics, a portion of Vulcan’s inventory will also center on cryptocurrency—namely, Bitcoin. Thanks to BitMEX and Bitcoin Magazine, a physical Bitcoin engraved with a private encryption key will be deposited on the lunar surface for “future explorers” to recover, along with a few other shiny crypto objects.

Stranger things

Although next week’s launch is primarily intended to signify humanity’s future on the moon, it also includes literal remnants of its past. Two memorial spaceflight companies, Celestis and Elysium Space, will have cargo aboard the Vulcan rocket: DNA from legendary science fiction author Arthur C. Clarke, as well as trace cremated ashes of several original Star Trek actors and show creator Gene Roddenberry.

And all that’s just a portion of the larger inventory list intended to travel in the Vulcan rocket next week. For a more detailed look at additional payload info, including a hunk of Mount Everest, head over to Gizmodo.

The post NASA is headed for the moon next week, and it’s bringing lots of weird stuff appeared first on Popular Science.


]]>
In Juneau, Alaska, a carbon offset project that’s actually working https://www.popsci.com/environment/juneau-alaska-carbon-offset/ Thu, 04 Jan 2024 15:00:00 +0000 https://www.popsci.com/?p=597383
Mendenhall Glacier
Some 1.5 million tourists visited Juneau aboard cruise ships this summer. Many of them visited Mendenhall Glacier, which has retreated about a mile in the past 40 years due to climate change. Wolfgang Kaehler/LightRocket via Getty Images

Visiting Alaska is an emissions-heavy prospect. An innovative program has tourists ease that by helping buy heat pumps for locals.

The post In Juneau, Alaska, a carbon offset project that’s actually working appeared first on Popular Science.

]]>
Mendenhall Glacier
Some 1.5 million tourists visited Juneau aboard cruise ships this summer. Many of them visited Mendenhall Glacier, which has retreated about a mile in the past 40 years due to climate change. Wolfgang Kaehler/LightRocket via Getty Images

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here.

When Kira Roberts moved to Juneau, Alaska, last summer, she immediately noticed how the town of 31,000 changes when the cruise ships dock each morning. Thousands of people pour in, only to vanish by evening. As the season winds down in fall, the parade of buses driving through her neighborhood slows, and the trails near her home and the vast Mendenhall Glacier no longer teem with tourists.

“That unique rhythm of Juneau is really striking to me,” she said. “It’s just kind of crazy to think that this is all a mile from my house.”

But Mendenhall is shrinking quickly: The 13-mile-long glacier has retreated about a mile in the past 40 years. Getting all those tourists to Juneau—some 1.5 million this summer by cruise ship alone—requires burning the very thing contributing to its retreat: fossil fuels.

In an effort to mitigate a portion of that CO2, some of those going whale watching or visiting the glacier are asked to pay a few dollars to counter their emissions. The money goes to the Alaska Carbon Reduction Fund, but instead of buying credits from some distant (and questionable) offset project, the nonprofit spends that cash installing heat pumps, targeting residents like Roberts who rely upon oil heating systems. 

Heat pumps are “a no-brainer” in Juneau’s mild (for Alaska) winters, said Andy Romanoff, who administers the fund. Juneau’s grid relies on emissions-free hydropower, so electricity is cheaper and less polluting than oil heat. They also save residents money—Roberts said she was paying around $500 a month on heating oil, and has seen her electricity bill climb just $30.

“The financial difference is huge,” she said.
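Roberts’ numbers give a rough sense of the economics. The sketch below is back-of-envelope arithmetic using only figures reported in this story (her $500 monthly oil bill, her $30 electricity increase, and the fund’s $7,000 average installation cost cited later in the piece); real savings vary by household and season, since heating costs are concentrated in winter.

```python
# Back-of-envelope economics of a Juneau heat pump conversion,
# using figures reported in this article (illustrative only).
oil_cost = 500          # dollars/month Roberts paid for heating oil
electric_increase = 30  # dollars/month rise in her electricity bill
monthly_savings = oil_cost - electric_increase
print(monthly_savings)  # 470

# The fund, not the resident, pays the install cost -- but comparing
# the two shows how quickly the investment recoups itself in fuel savings.
install_cost = 7_000    # the fund's average installation cost
payback_months = install_cost / monthly_savings
print(round(payback_months, 1))  # 14.9
```

In other words, at peak-season savings rates a $7,000 installation pays for itself in avoided fuel spending in a little over a year; spread over a milder heating year the payback stretches out, but stays well within the devices’ 15-year lifespan.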

Programs from Monterey, California, to Lancaster, Pennsylvania, have tried using similar models to finance local renewable or energy-efficiency projects, and carbon offsets for flying and other activities are nothing new. But most of the voluntary market for such things is run by large companies backing distant projects. The fund in Juneau is eager to capitalize on the massive tourist interest in its backyard.

The program, which until recently was called the Juneau Carbon Offset Fund, started in 2019 when members of the advocacy organization Renewable Juneau were discussing how to help Juneau achieve its goal of having renewables provide 80 percent of the city’s energy needs by 2045. The organization’s existing heat pump programs were reaching only the “low-hanging fruit,” Romanoff said: People who had money and were ready to switch for climate reasons alone. It envisioned the fund as a way to get the devices—and the fossil fuel reduction they provide—to more residents. 

Romanoff, who also is executive director of the nonprofit Alaska Heat Smart, is aware of the reputational hit carbon offsets have taken lately, but believes the fund’s focus on heat pumps, and working locally, provides transparency and accountability. “It’s a carbon cost that people could actually relate to and understand,” he said.

Many voluntary offset projects overestimate the emissions they’re preventing, sometimes by as much as five to 10 times, said Dr. Barbara Haya, director of the Berkeley Carbon Trading Project. “Project developers are making methodological choices that give them more credits instead of less,” she said, and those verifying the claims are not enforcing conservative estimates when there’s uncertainty.

The Alaska Carbon Reduction Fund uses three years of utility bills to determine how much oil a recipient was burning before getting a heat pump. It’s paid for 41 installations since 2019, at an average cost of $7,000, and estimates the devices will prevent 3,125 metric tons of carbon emissions over their 15-year lifespans. Those calculations, plus a subsidy from non-tourism donations, bring its carbon price to $46 a ton.
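The fund’s math can be checked from the figures above. The sketch below uses only the numbers reported in this story; the implied subsidy share is our inference, not the fund’s published accounting.

```python
# Back-of-envelope check of the Alaska Carbon Reduction Fund's carbon price,
# from figures reported in this article (illustrative only).
installs = 41
avg_cost = 7_000            # dollars per heat pump installation
total_cost = installs * avg_cost          # $287,000 spent since 2019
tons_avoided = 3_125        # estimated metric tons CO2 over 15-year lifespans

raw_price = total_cost / tons_avoided     # unsubsidized cost per ton
print(round(raw_price, 2))  # 91.84

# The fund quotes $46/ton, so non-tourism donations evidently cover
# roughly half the cost (an inference, not the fund's own accounting).
subsidy_share = 1 - 46 / raw_price
print(round(subsidy_share, 2))  # 0.5
```

That roughly $92-per-ton raw figure, halved by donations to $46, is what the next paragraph compares against typical voluntary credit prices.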

That’s more expensive than many voluntary credits, but in line with what Haya said are higher-quality projects. “That looks like the cost of real mitigation,” she said. A more fundamental issue is proving any offset project wouldn’t have happened on its own, Haya said. 

Romanoff believes the project meets that condition because the heat pumps go to residents who earn less than 80 percent of the local median income. One of the first recipients, Garri Constantine, was living on far less than that when his system was installed. In the three years since, Constantine has become an evangelist for the technology, in part because he no longer spends $300 a month on firewood, trading it for a $50 monthly increase in his electricity bill.

“I just don’t understand why these things haven’t taken off like wildfire,” he said.

The fund has $150,000 in the bank, Romanoff said, but the speed with which it can work is limited by a nationwide shortage of installers. Most of those donations came from the nearby gold mine, but Allen Marine, a regional tour operator, started pitching the fund to passengers this summer and offers an opt-in donation when booking online. It considered the fund an opportunity to “give back to the communities that we operate in,” said Travis Mingo, VP of operations. As part of the partnership, the carbon reduction fund agreed to start funding heat pumps in other Allen Marine destinations, like Ketchikan and Sitka.

A much smaller company, Wild Coast Excursions, includes the offset in its prices. When owner Peter Nave’s plan for summer tours on the local ski mountain fell through, he shifted to bear viewing and alpine hiking trips, some of which are far enough away to require helicopter rides. Climate change is especially visible for Nave, a Juneau native who’s seen the dramatic changes in Mendenhall up close and has worked as a state avalanche forecaster. He’s covering a 125 percent offset of the climate impact of those excursions, labeling his company “carbon-negative.” He estimates that will end up being about 1 percent of the price of each tour. In his mind, it’s simply a cost of doing business.

“I kind of rationalized that if I could offset more than we would use, then I could feel a little bit better about taking on [the helicopter] strategy,” he said.

He’s skeptical of offsets in general, but the tangibility of this program made a difference. “I could see the reduction happening, because I know the heat pumps work, my friends have them, people I know install them,” he said.

Wild Coast Excursions’ contribution to the carbon reduction fund in the first year is unlikely to cover even one heat pump, however. Including cruise ships or major airlines in the program would make a far more significant dent in Juneau’s emissions. Romanoff said his organization had an initial conversation with a local representative of a major cruise company, but was told it wouldn’t participate if the fund only benefits Juneau and the offsets weren’t verified by a third party.

The Alaska Carbon Reduction Fund began pursuing verification with Verra, the world’s largest certifier of voluntary credits by volume, but walked away because of the cost and its own discomfort over negative press coverage. “We could install five or six heat pumps with that money,” Romanoff said.

Offsets are one tool cruise companies consider “on a case-by-case basis,” to hit their own emissions goals, said Lanie Downs, a spokeswoman for Cruise Lines International Association Alaska. 

Carnival Plc, which owns three cruise companies operating in Alaska, said it will consider carbon offsets only if energy efficiency options have been exhausted. The other two major cruise lines that regularly dock in Juneau did not respond to requests for comment, but do list offset purchases in their annual sustainability reports.

While the city charges cruise lines a per-head passenger fee, that revenue can be used only for specific projects in the port area. Alexandra Pierce, Juneau’s tourism manager, said the city has “never formally proposed any emissions fees” on cruise ships, but pointed to the industry’s involvement in efforts to reduce cruise line emissions and install electric shore power, the marine equivalent of stopping idling emissions.

Allen Marine has “started discussions” about including an offset fee in its tours sold through cruise lines. “As we go through contract renewals, it will actually start to snowball effect the amount of money we’re able to receive for this program,” Mingo said. But ultimately, that leaves the bulk of tourists’ emissions—the cruises—unaccounted for.

Romanoff gets a few emails a year from people in other parts of Alaska and the Lower 48 interested in setting up their own offset fund. He thinks his organization’s model could be replicated in places with plenty of oil heating systems to replace. That said, a carbon price based on replacing gas-powered heat might be too expensive for most people, he said.

But in the Alaskan panhandle, he thinks a “groundswell” of support from small businesses could make a difference in getting the cruise lines on board. “Once we build that arsenal to a certain size, then I think that’ll speak pretty loud and clear,” he said. 

This article originally appeared in Grist at https://grist.org/energy/in-juneau-alaska-a-carbon-offset-project-thats-actually-working/. Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org