The thread for space cadets!

How the Morons got to Outer Space! (or perhaps not)

Pythom Space believes that exploration is about being brave, not clever, and is trying to get a human to Mars by 2024 or '26. The company has released a video of employees handling the Eiger rocket and its hypergolic propellants (furfuryl alcohol and nitric acid) with less than industry-standard care. At one point in the promotional video, a handful of employees can be seen running from an expanding cloud of dust and exhaust. I like the delicate care they use to lift the rocket onto its legs.

 
It was really just a mock rocket.

Only a token rocket inside with a token quantity of fuel.

It was mostly hollow; otherwise one person would never have been able to raise it with a rope.
 
Mega-map of Milky Way adds depth to stars’ motions

Astronomers’ main reference guide to the Milky Way just received a major update. The Gaia mission, a spacecraft that is tracking nearly two billion stars, has released a vastly improved map, which now includes the three-dimensional motions of tens of millions of stars and thousands of asteroids, detections of stellar ‘quakes’ and of possible extrasolar planets.

The Gaia team unveiled the trove at a press conference on 13 June, together with around 50 scientific papers, and made the full database available for public download. The European Space Agency (ESA) launched the 2-tonne probe in 2013. Like an earlier database released in 2020, the latest release consists of 34 months' worth of data collected between 2014 and 2017.
ESA Gaia page
Nature Review
Video explaining it


The Milky Way in four maps: data from the Gaia spacecraft show the speed at which stars move towards or away from us, known as radial velocity (top left); their radial velocity and proper motion, or how they move across the sky (bottom left); their chemical make-up (bottom right); and the interstellar dust (top right). Click image for 12Mb version
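The radial velocity and proper motion in these maps are exactly what's needed to reconstruct a star's full 3-D motion: proper motion plus parallax gives the tangential speed, and the radial velocity completes the vector. A minimal sketch of the arithmetic (the input numbers are invented example values, not Gaia data):

```python
import math

def space_velocity(parallax_mas, pm_total_mas_yr, radial_velocity_km_s):
    """Combine Gaia-style observables into a 3-D speed.

    parallax_mas: parallax in milliarcseconds (distance_pc = 1000 / parallax)
    pm_total_mas_yr: total proper motion in mas/yr
    radial_velocity_km_s: line-of-sight velocity in km/s
    """
    distance_pc = 1000.0 / parallax_mas
    # 4.74 km/s is the speed corresponding to 1 arcsec/yr seen at 1 parsec
    v_tangential = 4.74 * (pm_total_mas_yr / 1000.0) * distance_pc
    v_total = math.hypot(v_tangential, radial_velocity_km_s)
    return v_tangential, v_total

# Example: a star with 10 mas parallax (100 pc), 50 mas/yr proper motion,
# receding at 20 km/s
v_tan, v_tot = space_velocity(10.0, 50.0, 20.0)
print(f"tangential: {v_tan:.1f} km/s, total: {v_tot:.1f} km/s")
```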
 
I find very interesting there is such chemical differences along the galaxy. With such huge distances one would think different regions would become more or less homogeneous independently of location but obviously the whole galaxy can still be considered as a single object.
 

Really cool!

I wonder what those two bright blobs are in the bottom-right corner of the top-left picture.
 
Really cool!

I wonder what those two bright blobs are in the bottom-right corner of the top-left picture.

That would be the Magellanic Clouds, satellite galaxies of our own. As far as I can tell, most off-plane white pixels are galaxies or similar. Andromeda should be the single pixel in the bottom-left quadrant. It's about the size of the Milky Way, just much, much further away.
 
SpaceX one step closer to launching giant Starship

The Federal Aviation Administration concludes environmental review

SpaceX cleared a key hurdle Monday for its plan to launch a gigantic, futuristic rocketship into orbit from Texas.

The Federal Aviation Administration (FAA) concluded an environmental review of Elon Musk's Starship base. The agency saw no significant environmental concerns, but is requiring more than 75 actions to reduce impacts to the region.

It's no guarantee a launch license will be issued since other factors such as safety and financial responsibility requirements still must be met at the Boca Chica site, according to the FAA.

After the latest news, SpaceX tweeted: "One step closer to the first orbital flight test of Starship."

At nearly 400 feet (120 metres), Starship is the most powerful rocket ever built and is meant to carry people to the moon and Mars. NASA intends to use it for the agency's lunar landing of astronauts, planned for no earlier than 2025.

While SpaceX has launched Starship's bullet-shaped upper stage 10 kilometres into the air over the past year, resulting in some spectacular explosions, it has yet to fly it atop a Super Heavy booster.

https://www.cbc.ca/news/science/spacex-starship-1.6489349
 
Whatever hit the Moon in March, it left this weird double crater

When space junk crashed into the Moon earlier this year, it made not one but two craters on the lunar surface, judging from images revealed by NASA on Friday.

Astronomers predicted a mysterious object would hit the Moon on March 4 after tracking the debris for months. The object was large, and believed to be a spent booster from the China National Space Administration's Long March 3C rocket that launched the Chang'e 5-T1 spacecraft in 2014.

"The double crater was unexpected and may indicate that the rocket body had large masses at each end. Typically a spent rocket has mass concentrated at the motor end; the rest of the rocket stage mainly consists of an empty fuel tank. Since the origin of the rocket body remains uncertain, the double nature of the crater may indicate its identity," said NASA.
double_lunar_crater.jpg


What is the difficult bit of sending a probe to an asteroid? Writing the software, it seems!

Sadly for NASA's mission to the metal asteroid Psyche, software problems mean the spacecraft is going to miss its 2022 launch window.

The US space agency made the announcement on Friday: "Due to the late delivery of the spacecraft's flight software and testing equipment, NASA does not have sufficient time to complete the testing needed ahead of its remaining launch period this year, which ends on October 11."

NASA had already pushed the launch back from August 1 to no earlier than September 20 after compatibility issues cropped up with the software testbed simulators.

Last week's announcement is tantamount to a throwing in of the towel regarding Psyche's original mission. While launches are possible in 2023 and 2024, the orbital positions of Earth and the asteroid mean Psyche would not arrive at its destination until 2029 or 2030, respectively.
 
BepiColombo's images of Mercury

d41586-022-01784-y_23213828.jpg

DWx5AnSZHFKKzZSUNPEY8W-970-80.jpg.webp

Last flyby with annotations:
3A8SJqFy5JHnm9pQH3hZzE-970-80.jpg

The Mercury-bound probe BepiColombo took its second look at its target planet today during a super-close flyby designed to slow the spacecraft down and adjust its trajectory.

The June 23 flyby was BepiColombo's second at Mercury, following the probe's first encounter with the planet in October 2021. The probe made its closest approach to Mercury's surface at 5:44 a.m. EDT (0944 GMT), when it passed only 125 miles (200 kilometers) from Mercury's crater-riddled surface, closer than the two orbiters will operate once the mission begins in earnest.
 
I wonder whether 2022 technology doesn't allow taking colour images. Is it a bandwidth issue, or are b&w cameras sturdier, or something?
 
I wonder whether 2022 technology doesn't allow taking colour images. Is it a bandwidth issue, or are b&w cameras sturdier, or something?

The three cameras they're using for these pictures are intended more for checking navigation and inspecting the spacecraft itself than for imaging Mercury (that's why there are so many parts of the craft visible in the frame). I would guess the emphasis is on making them as simple and robust as possible, and colour isn't seen as particularly necessary for either purpose.

ESA gives a quick explanation on their site. BepiColombo does have a high-resolution camera, but it can't be used while the craft is still in transit to Mercury. This is because the high-res camera, along with most of the other delicate instruments, sits on the side of the craft that's currently attached to the transfer stage. That design makes sense, as it protects them from, e.g., micrometeorite damage during the seven-year trip. But it does mean they can only be used once BepiColombo enters a stable Mercury orbit in 2025 and the transfer stage is detached.
 
I wonder whether 2022 technology doesn't allow taking colour images. Is it a bandwidth issue, or are b&w cameras sturdier, or something?

https://www.straightdope.com/21344181/why-are-images-from-space-probes-always-in-black-and-white

Dear Cecil: Why are images from our space program always in grayscale instead of color? I know NASA needs to extract data from those images, and I also know the cameras aren’t $9.99 specials from the corner drugstore. But couldn’t NASA just stick a plain old color digital camera on board and send it to Mars along with the rest of the equipment? Buster Blocker, Bettendorf, Iowa

They’ve thought about it, actually. But the truth is, we’re probably better off the way things are.

To find out about space cameras, we got in touch with Noam Izenberg, a planetary scientist working on the MESSENGER probe, which is now circling Mercury taking pictures. He told us there are basically two reasons space photography is mostly in black and white. The first, as you rightly suppose, is that grayscale images are often more useful for research.

In principle, most digital cameras, including cheap Walmart models in addition to the custom-built jobs on space probes, are monochrome, or more accurately panachrome. Each of the pixel-sized receptors in a digital camera sensor is basically a light bucket; unmodified, their combined output is simply a grayscale image generated from all light in the visible spectrum and sometimes beyond.

To create a color image, each pixel on a typical earthbound camera has a filter in front of it that passes red, green, or blue light, and the camera’s electronics add up the result to create the image we see, similar to a color TV. In effect, filtering dumbs down each panachrome pixel so that it registers only a fraction of the light it’s capable of seeing. Granted, the human eye works in roughly the same way. The fact remains, in an earthbound camera, some information is lost.

Space cameras are configured differently. They’re designed to measure not just all visible light but also the infrared and ultraviolet light past each end of the visible spectrum. Filtering is used primarily to make scientifically interesting details stand out. “Most common planetary camera designs have filter wheels that rotate different light filters in front of the sensor,” Izenberg says. “These filters aren’t selected to produce ‘realistic’ color that the human eye would see, but rather to collect light in wavelengths characteristic of different types of rocks and minerals,” to help identify them.

True-color images — that is, photos showing color as a human viewer would perceive it — can be approximated by combining exposures shot through different visible-color filters in certain proportions, essentially mimicking what an earth camera does. However, besides not inherently being of major scientific value, true-color photos are a ***** to produce: all the variously filtered images must be separately recorded, stored, and transmitted back to Earth, where they’re assembled into the final product. An 11-filter color snapshot really puts the squeeze on storage space and takes significant transmission time.
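The reconstruction Izenberg describes, separate grayscale exposures shot through different colour filters and combined on the ground, can be sketched in a few lines. This is an illustrative toy, not mission code; the tiny 2x2 "exposures" are invented:

```python
import numpy as np

def stack_filters(red, green, blue):
    """Stack three grayscale exposures (taken through R, G, and B
    filters on a filter wheel) into one colour image, as a ground
    pipeline would. Inputs are 2-D arrays with values in 0..1."""
    rgb = np.stack([red, green, blue], axis=-1)
    return np.clip(rgb, 0.0, 1.0)

# Toy 2x2 exposures: a bright "red" pixel top-left, "blue" bottom-right
r = np.array([[1.0, 0.1], [0.1, 0.1]])
g = np.array([[0.1, 0.1], [0.1, 0.1]])
b = np.array([[0.1, 0.1], [0.1, 1.0]])
img = stack_filters(r, g, b)
print(img.shape)  # (2, 2, 3)
```

With an 11-filter instrument the same stacking idea applies, which is why each colour product costs several separately recorded and transmitted frames.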

Given limited opportunities, time, and bandwidth, a better use of resources often is a false-color image — for example, an infrared photo of rocks revealing their mineral composition. At other times, when the goal is to study the shape of the surface, measuring craters and mountains and looking for telltale signs of tectonic shifts or ancient volcanoes, scientists want black-and-white images at maximum resolution so they can spot fine detail.

Terrific, you say. But don’t scientists realize the PR value of a vivid color photo?

They realize it all right. But that brings up the second reason most NASA images aren’t in color. The dirty little secret of space exploration is that a lot of the solar system, and for that matter the cosmos, is pretty drab. “The moon is 500 shades of gray and black with tiny spatterings of greenish and orangish glass,” Izenberg says. “Mars is red-dun and butterscotch with white ice at the poles. Jupiter and glorious Saturn are white/yellowish/brown/reddish. Hubble’s starscapes are white or faintly colored unless you can see in the infrared and ultraviolet.”

As for Mercury, Izenberg’s bailiwick, NASA has posted on its website detailed color photos showing vast swaths of the planet’s surface. If the accompanying text didn’t tell you they were true-color, you’d never know.

False-color images are often a lot more interesting. The colors aren’t faked, exactly; rather, they’re produced by amplifying modest variations in the visible spectrum and adding in infrared and ultraviolet. Some of the less successful examples look like a Hare Krishna tract, but done skillfully the result can be striking. The spectacular full-color nebula images from the Hubble Space Telescope were all produced by black-and-white sensors with color filters.

For what it’s worth, some colleagues of Izenberg’s a few years ago floated the idea of doing as you suggest — putting an off-the-shelf digital camera on a probe in addition to the more expensive models. The idea didn’t get off the ground, as it were, partly out of concerns the camera wouldn’t survive the extreme temperatures of space. But chances are the raw results wouldn’t have been all that impressive anyway. Experience suggests a good space photo needs a little …eh, don’t call it show biz. Call it art.
 
Evidence of Aliens measured on Mars!!

Actually the news is that they have quantified the amount of organic carbon in rocks for the first time, 200 to 273 parts per million of organic carbon. There are plenty of non-living ways this carbon could have come about.

To make the measurement, Curiosity delivered the sample to its Sample Analysis at Mars (SAM) instrument, where an oven heated the powdered rock to progressively higher temperatures. This experiment used oxygen and heat to convert the organic carbon to carbon dioxide (CO2), the amount of which is measured to get the amount of organic carbon in the rocks. Adding oxygen and heat allows the carbon molecules to break apart and react with oxygen to make CO2. Some carbon is locked up in minerals, so the oven heats the sample to very high temperatures to decompose those minerals and release the carbon to convert it to CO2. The experiment was performed in 2014 but required years of analysis to understand the data and put the results in the context of the mission's other discoveries at Gale Crater. The resource-intensive experiment was performed only once during Curiosity's 10 years on Mars.
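The back-of-envelope arithmetic behind a ppm figure like this is simple: each CO2 molecule carries exactly one carbon atom, so the evolved CO2 mass converts directly to a carbon mass, which is then weighed against the rock sample. A sketch with made-up illustration values (these are not the actual SAM numbers):

```python
# Convert a measured quantity of evolved CO2 back to organic-carbon ppm.
M_C = 12.011    # g/mol, atomic mass of carbon
M_CO2 = 44.009  # g/mol, molecular mass of CO2

def organic_carbon_ppm(co2_micrograms, sample_milligrams):
    # One carbon atom per CO2 molecule:
    # carbon mass = CO2 mass * (M_C / M_CO2)
    carbon_micrograms = co2_micrograms * (M_C / M_CO2)
    sample_micrograms = sample_milligrams * 1000.0
    return carbon_micrograms / sample_micrograms * 1e6

# e.g. 33 ug of CO2 evolved from a 45 mg powdered-rock sample
print(round(organic_carbon_ppm(33.0, 45.0)))  # ~200 ppm
```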

This process also allowed SAM to measure the carbon isotope ratios, which help to understand the source of the carbon. Isotopes are versions of an element with slightly different weights (masses) due to the presence of one or more extra neutrons in the center (nucleus) of their atoms. For example, Carbon-12 has six neutrons while the heavier Carbon-13 has seven neutrons. Since heavier isotopes tend to react a bit more slowly than lighter isotopes, the carbon from life is richer in Carbon-12. "In this case, the isotopic composition can really only tell us what portion of the total carbon is organic carbon and what portion is mineral carbon," said Stern. "While biology cannot be completely ruled out, isotopes cannot really be used to support a biological origin for this carbon, either, because the range overlaps with igneous (volcanic) carbon and meteoritic organic material, which are most likely to be the source of this organic carbon."
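Isotope ratios like the ones Stern describes are conventionally reported as delta-13C: the per-mil deviation of a sample's 13C/12C ratio from the VPDB reference standard. A minimal illustration (the sample ratio below is an invented number, not a Curiosity measurement):

```python
# delta-13C: per-mil deviation of a sample's 13C/12C ratio
# from the Vienna Pee Dee Belemnite (VPDB) reference standard.
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB standard

def delta13C(ratio_sample):
    return (ratio_sample / R_VPDB - 1.0) * 1000.0

# Biology preferentially takes up the lighter 12C, so organic carbon
# typically shows a depleted (negative) delta-13C.
sample_ratio = 0.0109  # invented example value
print(f"{delta13C(sample_ratio):.1f} per mil")  # -30.0 per mil
```

The problem Stern points out is that this diagnostic is ambiguous on Mars: the measured range overlaps with what volcanic and meteoritic carbon would give.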

view-of-yellowknife-bay.jpg

From a position in the shallow "Yellowknife Bay" depression, NASA's Mars rover Curiosity used its right Mast Camera (Mastcam) to take the telephoto images combined into this panorama of geological diversity.
 