Newsworthy Science

LANL is brewing beer at the speed of sound
Tech transfer solving real-life problems
BY ERIC DAVIS STAFF SCIENTIST, LOS ALAMOS NATIONAL LABORATORY (LANL) [the guys who build big bombs]

As beer lovers sip their favorite IPAs on a weekend or after a long day’s work, they might be unaware — and who can blame them? — of the complexity of the brewing process and all the things that can go wrong in getting that tasty beverage from the recipe to the restaurant table. Especially in many of the more than 8,800 craft breweries in the United States, the art of brewing is still very much a manual process, relying on sampling, testing and tinkering.

As a scientist on an acoustic-science research team at Los Alamos National Laboratory, I have long been interested in acoustic sensor technology.
Developments in this science apply to the oil and gas sectors and other industries, but we saw the potential in the brewing industry. Visiting craft breweries in the Denver area last year, as well as canvassing many of the breweries here in New Mexico, we heard about their challenges and assessed the potential for acoustic-sensor technology in this fast-growing industry that nonetheless has complex technical challenges. It looked like a good fit of technology to a tasty application.
Brewing is a multi-step process. In the critical fermentation stage, yeast is added to a mixture of wort, which is an infusion of malt or other grains, and hops (a so-called slurry) within a stainless-steel vat. Inside the vat, yeast sets to work converting sugars in the wort to alcohol and carbon dioxide. Brewers often check progress manually by draining off the product to sample it.

As we talked to brew masters, they noted that monitoring fermentation was important for quality and efficiency. With batches sometimes containing hundreds of gallons in larger industrial operations, the fermentation stage can take weeks. That’s plenty of time for something to go wrong — at the expense of the batch, the labor to produce and monitor the product and the availability of the equipment occupied by the beer-in-progress. With a window into the problem, we returned to the lab and we embarked on developing a solution. Brewing beer proved to be a novelty at Los Alamos, but the physics of materials is a longstanding specialty. So in our unique pilot brewing operation, we developed the SoniView acoustic sensor system.

SoniView relies on a couple key pieces of technology. Vibrations from a transducer attached to the front, outer side of the vat send sound waves into the medium inside the tank. A sensor on the back, outer side of the vat picks up the vibration and converts it to an electrical signal. Fed into an oscilloscope and a computer, the system provides visualizations and data on the progress of fermentation. Knowing the distance from one side of the vat to the other, the “time of flight” it takes the sound wave to travel from the transducer to the back sensor helps determine the speed of sound through the liquid. That speed constantly changes as the beer ferments and the relative amounts of sugars, alcohol and other components change. Additionally, when the slurry is actively fermenting, the sound traveling through the liquid will be weaker because so much carbon dioxide is being produced.
Carbon dioxide production drops off the longer the wort ferments, a change the acoustic sensors pick up.
Eventually, the yeast dies off and drops to the bottom of the vat as trub, a layer of sediment. The sensors provide continuous, real-time insight into the entire process.
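To make the time-of-flight idea concrete, here is a rough sketch of the arithmetic described above; the vat diameter, transit times and amplitudes are invented illustrative values, not readings from the SoniView system.

```python
# Sketch of the time-of-flight arithmetic described above. All numbers are illustrative.

VAT_DIAMETER_M = 1.5  # hypothetical transducer-to-sensor distance across the vat

def speed_of_sound(time_of_flight_s: float) -> float:
    """Speed of sound through the liquid, from the transit time across the vat."""
    return VAT_DIAMETER_M / time_of_flight_s

# Hypothetical readings over a fermentation run:
# (hours elapsed, transit time in ms, received amplitude relative to the transmitted pulse)
readings = [
    (0,   1.02, 0.90),   # fresh wort: sugar-rich, little CO2
    (48,  1.00, 0.35),   # active fermentation: CO2 bubbles weaken the received signal
    (240, 0.97, 0.85),   # fermentation winding down: the signal strengthens again
]

for hours, tof_ms, amplitude in readings:
    v = speed_of_sound(tof_ms / 1000.0)
    print(f"t = {hours:>3} h   speed of sound = {v:6.1f} m/s   relative amplitude = {amplitude:.2f}")
```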


Staff scientists Eric Davis, left, and Vamshi Chillara examine the SoniView laboratory brewing station as part of the UC-LANL Entrepreneur Postdoc Accelerator program at Los Alamos National Laboratory. COURTESY OF LOS ALAMOS NATIONAL LABORATORY

Traditional brewing means periodically checking the fermentation by draining some product out during the process. SoniView’s monitoring capabilities provide the lynchpin for more efficient — and profitable — processes.
Now data can be acquired in real time, so labor that might otherwise be spent manually checking on fermentation is freed up for other tasks. SoniView also quickly and accurately detects the transition from yeast slurry to beer during the yeast dump and closes the dump valve at that point. This is especially important for larger breweries because slurry may require expensive handling and disposal. It also saves beer during the dump process. With this key technology monitoring arguably the most critical stage in brewing, each batch comes out according to plan: ready for filtering, packaging and eventual consumption.
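The dump-valve decision described above amounts to watching the acoustic reading for a sustained change as slurry gives way to beer. Below is a minimal sketch of that kind of threshold logic, with invented values and names; it is not the SoniView control code.

```python
# Minimal threshold logic for closing the yeast-dump valve once the acoustic
# reading looks like beer rather than slurry. Cutoff and readings are invented.

SLURRY_TO_BEER_SPEED_M_S = 1480.0  # hypothetical speed-of-sound cutoff

def should_close_valve(speeds: list[float], consecutive: int = 3) -> bool:
    """Close only after several consecutive beer-like readings, to ignore noise."""
    if len(speeds) < consecutive:
        return False
    return all(v >= SLURRY_TO_BEER_SPEED_M_S for v in speeds[-consecutive:])

stream = [1455.0, 1458.0, 1463.0, 1479.0, 1482.0, 1484.0, 1486.0]
for i in range(1, len(stream) + 1):
    if should_close_valve(stream[:i]):
        print(f"Close dump valve after reading {i}: last readings {stream[i-3:i]}")
        break
```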

Our experimental efforts also create a database of calculations that can provide useful information for brew masters as they prepare and perfect their batches. Getting this technology out of the laboratory and into businesses is a key goal, facilitated by the Feynman Center for Innovation at Los Alamos, which licenses technology to partners in the private sector. We’ve also partnered on a New Mexico Small Business Assistance project, teaming up with a Santa Fe area brewery to test the technology in a scaled-up, commercial setting.

That outreach means that New Mexico businesses might have a science-driven complement to the art of brewing, and an advantage in creating delicious beer that customers can enjoy — every batch, and every sip, just right.
Eric Davis is a staff scientist in the Materials Synthesis and Integrated Devices group at Los Alamos National Laboratory. Team members include Vamshi Chillara, Cristian Pantea and Craig Chavez, with support from Marc Witkowski at the Feynman Center. The work was supported through the UC-LANL Entrepreneur Postdoc Accelerator program. The Executive’s Desk is a guest column providing advice, commentary or information about resources available to the business community in New Mexico. To submit a column for consideration, email gporter@abqjournal.com.
 
Prevalence of Depression and Posttraumatic Stress Disorder in Flint, Michigan, 5 Years After the Onset of the Water Crisis

Question What are the long-term psychiatric outcomes of environmental disasters, such as the Flint water crisis?

Findings In this cross-sectional household probability sample survey of 1970 adults living in Flint, Michigan, during the water crisis, more than one-fifth met criteria for presumptive past-year depression, nearly one-quarter for past-year presumptive posttraumatic stress disorder, and more than one-tenth for both disorders 5 years after the onset of the water crisis. Only 34.8% were ever offered mental health services to assist with water-crisis–related psychiatric symptoms; most (79.3%) who were offered services utilized them.

Meaning These findings suggest that public-works environmental disasters such as the Flint water crisis have lasting psychological sequelae and may require expanded mental health services to meet long-term psychiatric need.


Spoiler: Legend
a Past-year depression prevalence rates for contextual comparison are from the Centers for Disease Control Behavioral Risk Factor Surveillance Data for Michigan in 2010 (the last year 12-month prevalence was assessed), the 2019 National Survey on Drug Use and Health for the United States in 2019,24 and Lim et al25 for the global average from 1994 to 2014. Lifetime (>1 year) depression prevalence rates are from the Centers for Disease Control Behavioral Risk Factor Surveillance Data for Michigan in 2019 and the United States in 2018 and Lim et al25 for the global average. The National Survey on Drug Use and Health does not report standard errors for depression statistics, therefore for the purposes of constructing 95% CIs these have been assumed to match those reported for other mental disorders in the same study (ie, substance use).

b Past-year posttraumatic stress disorder (PTSD) prevalence rates for contextual comparison are from Kang et al26 for postdeployment Gulf War veterans from 1995 to 1997, the National Epidemiologic Survey on Alcohol and Related Conditions-III for the United States from 2012 to 2013,27 and the World Health Organization World Mental Health Surveys for the global average from 2001 to 2012.28 Lifetime (>1 year) PTSD prevalence rates are from the National Vietnam Veterans Readjustment Study for male Vietnam-era veterans from 1986 to 1988,29 the National Epidemiologic Survey on Alcohol and Related Conditions-III for the United States,27 and the World Mental Health Surveys for the global average.28 Lifetime PTSD prevalence rates are higher among Vietnam veterans than those of more recent deployments.30
 
Also the Ig Nobels were the other day

Actually important - Contracts are hard to understand because the people who write them cannot write:

LITERATURE PRIZE [CANADA, USA, UK, AUSTRALIA]
for analyzing what makes legal documents unnecessarily difficult to understand.
REFERENCE: “Poor Writing, Not Specialized Concepts, Drives Processing Difficulty in Legal Language,” Eric Martínez, Francis Mollica, and Edward Gibson, Cognition, vol. 224, July 2022, 105070.

Spoiler: Seriously, are lawyers doing it on purpose to scam us?
Despite their ever-increasing presence in everyday life, contracts remain notoriously inaccessible to laypeople. Why? Here, a corpus analysis (n ≈10 million words) revealed that contracts contain startlingly high proportions of certain difficult-to-process features–including low-frequency jargon, center-embedded clauses (leading to long-distance syntactic dependencies), passive voice structures, and non-standard capitalization–relative to nine other baseline genres of written and spoken English. Two experiments (N=184) further revealed that excerpts containing these features were recalled and comprehended at lower rates than excerpts without these features, even for experienced readers, and that center-embedded clauses inhibited recall more-so than other features. These findings (a) undermine the specialized concepts account of legal theory, according to which law is a system built upon expert knowledge of technical concepts; (b) suggest such processing difficulties result largely from working-memory limitations imposed by long-distance syntactic dependencies (i.e., poor writing) as opposed to a mere lack of specialized legal knowledge; and (c) suggest editing out problematic features of legal texts would be tractable and beneficial for society at-large.
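As a toy illustration of the corpus-analysis idea, the snippet below counts crude proxies for two of the features the paper measures (non-standard capitalization and passive constructions) in an invented contract-style passage versus a plain-English paraphrase. The authors' pipeline uses proper linguistic annotation; these regexes are only a sketch.

```python
import re

# Crude proxies for two features the paper measures; illustrative only.

def all_caps_ratio(text: str) -> float:
    """Share of words written entirely in capitals (non-standard capitalization)."""
    words = re.findall(r"[A-Za-z]{2,}", text)
    return sum(w.isupper() for w in words) / len(words)

def passive_count(text: str) -> int:
    """Very rough passive-voice count: a form of 'to be' followed by an -ed/-en word."""
    return len(re.findall(r"\b(?:is|are|was|were|be|been|being)\s+\w+(?:ed|en)\b", text, re.I))

contract = ("The Lessee SHALL INDEMNIFY AND HOLD HARMLESS the Lessor. "
            "Notice is deemed given when it is delivered to the address specified herein.")
plain = ("You must cover the owner's losses. "
         "We will treat notice as given when we receive it at the listed address.")

for label, text in [("contract-style", contract), ("plain-style", plain)]:
    print(f"{label}: all-caps ratio {all_caps_ratio(text):.2f}, passive constructions {passive_count(text)}")
```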


Others:

APPLIED CARDIOLOGY PRIZE [CZECH REPUBLIC, THE NETHERLANDS, UK, SWEDEN, ARUBA]
for seeking and finding evidence that when new romantic partners meet for the first time, and feel attracted to each other, their heart rates synchronize.

MEDICINE PRIZE [POLAND]
for showing that when patients undergo some forms of toxic chemotherapy, they suffer fewer harmful side effects when ice cream replaces one traditional component of the procedure.

ENGINEERING PRIZE [JAPAN]
for trying to discover the most efficient way for people to use their fingers when turning a knob.

ART HISTORY PRIZE [THE NETHERLANDS, GUATEMALA, USA, AUSTRIA]
for the study “A Multidisciplinary Approach to Ritual Enema Scenes on Ancient Maya Pottery.”

PHYSICS PRIZE [CHINA, UK, TURKEY, USA] [AWARDED JOINTLY TO TWO GROUPS]
for trying to understand how ducklings manage to swim in formation.

BIOLOGY PRIZE [BRAZIL, COLOMBIA]
for studying whether and how constipation affects the mating prospects of scorpions.

PEACE PRIZE [CHINA, HUNGARY, CANADA, THE NETHERLANDS, UK, ITALY, AUSTRALIA, SWITZERLAND, USA]
for developing an algorithm to help gossipers decide when to tell the truth and when to lie.

ECONOMICS PRIZE [ITALY]
for explaining, mathematically, why success most often goes not to the most talented people, but instead to the luckiest.
[NOTE: This is the second Ig Nobel Prize awarded to Alessandro Pluchino and Andrea Rapisarda. The 2010 Ig Nobel Prize for Management was awarded to Alessandro Pluchino, Andrea Rapisarda, and Cesare Garofalo, for demonstrating mathematically that organizations would become more efficient if they promoted people at random.]

SAFETY ENGINEERING PRIZE [SWEDEN]
Magnus Gens, for developing a moose crash test dummy.
 
More on the talent vs. luck one:

The largely dominant meritocratic paradigm of highly competitive Western cultures is rooted on the belief that success is mainly due, if not exclusively, to personal qualities such as talent, intelligence, skills, smartness, efforts, willfulness, hard work or risk taking. Sometimes, we are willing to admit that a certain degree of luck could also play a role in achieving significant success. But, as a matter of fact, it is rather common to underestimate the importance of external forces in individual successful stories. It is very well known that intelligence (or, more in general, talent and personal qualities) exhibits a Gaussian distribution among the population, whereas the distribution of wealth — often considered as a proxy of success — follows typically a power law (Pareto law), with a large majority of poor people and a very small number of billionaires. Such a discrepancy between a Normal distribution of inputs, with a typical scale (the average talent or intelligence), and the scale-invariant distribution of outputs, suggests that some hidden ingredient is at work behind the scenes. In this paper, we suggest that such an ingredient is just randomness. In particular, our simple agent-based model shows that, if it is true that some degree of talent is necessary to be successful in life, almost never the most talented people reach the highest peaks of success, being overtaken by averagely talented but sensibly luckier individuals. As far as we know, this counterintuitive result — although implicitly suggested between the lines in a vast literature — is quantified here for the first time. It sheds new light on the effectiveness of assessing merit on the basis of the reached level of success and underlines the risks of distributing excessive honors or resources to people who, at the end of the day, could have been simply luckier than others. We also compare several policy hypotheses to show the most efficient strategies for public funding of research, aiming to improve meritocracy, diversity of ideas and innovation.​
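For anyone who wants to poke at this, here is a compact sketch of the kind of agent-based model the abstract describes: Gaussian talent, random lucky and unlucky events, capital doubling or halving. The parameters are rough approximations of the published setup, not the paper's exact values.

```python
import random

# Rough sketch of a "talent versus luck" agent-based model: talent is Gaussian,
# success (capital) evolves through random multiplicative events. Parameters are approximate.

random.seed(1)
N_AGENTS, N_STEPS, EVENT_PROB = 1000, 80, 0.25   # 80 half-year steps ~ a 40-year career

agents = [{"talent": min(max(random.gauss(0.6, 0.1), 0.0), 1.0), "capital": 10.0}
          for _ in range(N_AGENTS)]

for _ in range(N_STEPS):
    for a in agents:
        if random.random() < EVENT_PROB:            # an event hits this agent
            if random.random() < 0.5:               # lucky event...
                if random.random() < a["talent"]:   # ...which talent helps to seize
                    a["capital"] *= 2
            else:                                   # unlucky event
                a["capital"] /= 2

richest = max(agents, key=lambda a: a["capital"])
most_talented = max(agents, key=lambda a: a["talent"])
print(f"Most successful agent: talent {richest['talent']:.2f}, capital {richest['capital']:.1f}")
print(f"Most talented agent:   talent {most_talented['talent']:.2f}, capital {most_talented['capital']:.1f}")
```

Run it a few times without the fixed seed and the most successful agent is typically only somewhat above average in talent, while the most talented agent usually ends up with unremarkable capital, which is the paper's point.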

 

When an octopus kills, it is stealthy and calculating​

Researchers film octopuses ensnaring their prey and identify specific, intentional killing techniques

When an octopus entangles its prey in its swirling mass of tentacles, it may look to the untrained eye like chaos. But, in fact, it's a highly calculated hunt.

Researchers in Minnesota filmed octopuses killing prey in a tank, and found the creatures used specific, identifiable techniques to catch their dinner.

"For us, the main take-home was [that] it was really repeatable," co-author Trevor Wardill, an animal behaviour professor at the University of Minnesota, told As It Happens host Nil Köksal.

"They would come out in a very specific way and we found quite different approaches — strategies, if you like — of how they captured shrimp versus crabs."

The findings, published this week in the journal Current Biology, not only help us better understand the complex and mysterious sea creatures, but could also have implications for the future of robotics.

To observe how the cephalopods catch their dinner, the team put two-spot California octopuses in tanks with little dens for them to hide in.

When the creatures were happily ensconced in their dens, the scientists dropped live crabs and shrimps into the tank in front of the den's opening, then filmed the creatures as they lunged for the prey and gobbled it up.

As they observed the footage in slow motion, a pattern emerged.

For example, they quickly learned that when it comes to killing, octopuses don't use all eight legs equally. Instead, they favour what scientists call their "second" legs — the limbs on either side of the tentacle that emerges from the front and centre of their bodies.

Which of those two tentacles they use depends on which of their eyes spotted the prey.

"The surprising thing with octopuses that the sort of general public may not understand is they hunt with just one eye. So there's one eye looking out into the world in one direction, and one eye looking in the other direction," Wardill said.

"And so the eye that spies the food item … will then be directing arms towards the prey. And they'll always — and I mean always — use arms on the side that the eye is pointing towards the food item."

Zoologist Michael Vecchione, who specializes in deep-sea invertebrates, says it's not surprising that octopuses have specialized limbs for killing prey within their visual range. After all, male octopuses have specialized limbs for fertilizing females.

Vecchione, the curator of cephalopods at the National Museum of Natural History in Washington, D.C., says the study is interesting because it looks at a different kind of octopus hunting than scientists usually focus on.

Octopuses, he says, do most of their hunting by swimming around and sticking their limbs into various nooks and crannies to see what they can find. When they do that, he says, the limbs act almost independently, as each one has its own large nerve ganglion, or "mini brain."

"There's been a lot of emphasis lately on the fact that each of these arms can be doing its own thing, and now, you know, the central brain of the animal is just doing high level co-ordination kind of things," Vecchione said.

"This deals with the fact that they're also often very visual, and how they act when something is in sight. And the answer is very different, actually.

"If they're doing a visual attack, then they appear to have a preferred way of doing it, and they have learned to do it in different ways for different types of animals."

When crabs were dropped into their tanks, the octopuses favoured a "parachuting" approach — attacking from above and enclosing the poor, unsuspecting crustaceans. For the shrimp, they tended to approach stealthily, and quickly ensnare them with a single limb.

One of their shrimp-hunting techniques took the researchers by surprise. The critters would snake one tentacle out in front of the prey and wave it around before closing in for the kill.

The scientists suspect it's a way of distracting or confusing the shrimp, which, if it senses movement nearby, can very quickly dodge out of the hunter's reach.

"And first, when you see that you think this is weird. Isn't it going to scare away their prey? But they kind of wiggle it gently," Wedell said.

"Meanwhile, they're approaching quite slowly, their whole body, so that they get in reaching distance and then they'll strike at the prey."

Robots for surgery and exploration

Wardill says the findings could prove useful for the creation of highly dexterous, octopus-inspired robots.

"We now know [controlling] them is a little less erratic than what you would see if you just were a naive observer of an octopus," he said.

"And so we're hoping that that will inspire, you know, engineers to make fancier vehicles that maybe do underwater rescue or, you know, surgeons that could have a very highly co-ordinated arm system to do keyhole surgery or something like that."

Vecchione, meanwhile, says anything that helps us better understand how cephalopods think and behave is welcome, because they are so different from any other species we consider intelligent.

Not only does an octopus have a mini-brain in each tentacle, but its primary brain is ring-shaped, with its esophagus going right through the centre.

"If you think about intelligent animals, almost all of them that you might think of — you know, birds, parrots, dolphins or even fishes — they're all vertebrates, and their brains are all built in the same basic model…. But this is something that's developed in a completely different pathway," he said.

"So how it does these sorts of things is important as far as understanding the more general questions about how things become intelligent and how behaviours develop."
https://www.cbc.ca/radio/asithappens/octopus-hunt-study-1.6592315
 
Republicans and Democrats overestimate the extent to which the other side dehumanizes them by 50–300%, and debunking these misperceptions can reduce rates of animosity.

Abstract​
Rising partisan animosity is associated with a reduction in support for democracy and an increase in support for political violence. Here we provide a multi-level review of interventions designed to reduce partisan animosity, which we define as negative thoughts, feelings and behaviours towards a political outgroup. We introduce the TRI framework to capture three levels of intervention—thoughts (correcting misconceptions and highlighting commonalities), relationships (building dialogue skills and fostering positive contact) and institutions (changing public discourse and transforming political structures)—and connect these levels by highlighting the importance of motivation and mobilization. Our review encompasses both interventions conducted as part of academic research projects and real-world interventions led by practitioners in non-profit organizations. We also explore the challenges of durability and scalability, examine self-fulfilling polarization and interventions that backfire, and discuss future directions for reducing partisan animosity.
Conclusion​
Partisan animosity is a growing concern in the United States, prompting scientists and practitioners to examine its roots and potential solutions. We have attempted to synthesize this rich and quickly growing body of work. Although reducing partisan animosity may be difficult, we believe that it is useful for researchers and practitioners to ‘TRI’: aim to reduce partisan animosity by changing thoughts, building relationships and transforming institutions. Successful interventions help partisans gain more accurate perceptions of each other and recognize the similarities they share, teach them how to have productive conversations and create the conditions for encouraging cross-party interactions, and attempt to improve public discourse and transform political structures. To enact durable and scalable change, we also encourage practitioners to intervene to motivate and mobilize partisans to become actively involved in reducing partisan animosity. We hope that this review helps make sense of the variety of interventions and prompts future research in the field. Partisan animosity is powerful, but so is the potential for interdisciplinary work between scientists and practitioners to help overcome it.

Paper paywalled, try this link or perhaps go via nature "Quote of the day"


Six themes of interventions for reducing partisan animosity. Interventions range from thoughts (correcting misperceptions and highlighting commonalities) to relationships (building dialogue skills and fostering positive contact) to institutions (changing public discourse and transforming political structures). To transcend from one level to the next, people need to be motivated (thoughts to relationships) and then mobilized (relationships to institutions).
 
Migration, not conquest, drove Anglo-Saxon takeover of England

The genetic make-up of people in south and east England changed radically between the Iron Age and the early medieval period, suggesting that there was a huge influx of Anglo-Saxon settlers from northern Europe. Researchers did a genome-wide analysis of 460 people who lived between AD 200 and 1300 in what is now England, Ireland, the Netherlands, Germany and Denmark. Early on, almost all the English people’s ancestors were from the British Isles. Later, they derived an average of 76% of their ancestry from continental northern Europeans. The finding overturns a favoured hypothesis that Anglo-Saxon culture promulgated in Britain thanks mostly to small incursions of elite warriors. “We’re a million miles away from an invasion hypothesis — it’s not a bunch of blokes getting in boats with weapons and conquering territory,” says archaeologist and co-author Duncan Sayer. “Actually, the North Sea was a highway, where people were coming and going,” says archaeologist Catherine Hills.​


Spoiler: Legend
a, Ternary plot of present-day British–Irish populations as a three-way admixture between late Iron Age and Roman England (England LIA Roman) (n = 32), France IA (n = 26) and England EMA CNE (n = 109). b, Boxplot comparison of France IA ancestry proportions in 23 English PoBI sampling regions using either England LIA Roman (n = 32) or Worth Matravers (n = 16) as source for local British ancestry in qpAdm. The P value obtained from a two-sided paired Student’s t-test is shown. The bounds of the box represent the 25th and 75th percentile, the centre represents the median, and the whiskers represent the minimum and maximum values in the data. Dashed lines connect points from the same region. c, Geographical distribution of the England EMA CNE, ancestries based on the interpolation of 31 present-day population estimates. The coordinates of the sample collection districts approximate the centroids of the averaged birthplaces of the grandparents. d, Same as c, but for France IA.

Paper Writeup
 
NM startup on cusp of DNA sequencing breakthrough
Copyright © 2022 Albuquerque Journal
BY KEVIN ROBINSON-AVILA
JOURNAL STAFF WRITER

Technology to rapidly read long strands of DNA with accurate, molecular-level precision has eluded researchers for decades.
But an Albuquerque startup is close to achieving that breakthrough in genomic sequencing, now backed by $2 million in funding from the National Institutes of Health.

Armonica Technologies Inc. says it can accurately identify single molecules among millions of particles attached to strands of DNA as they shoot through measuring devices, based on new methods developed at the University of New Mexico. The technology could potentially open the floodgates to “personalized medicine,” allowing for much faster medical diagnostics and more effective treatments for things like cancer.

The company, which launched in 2017, has raised nearly $8 million to date to advance its technology, which has already proven effective in laboratory testing through UNM’s Center for High Technology Materials, or CHTM. But it must still develop standardized methods for manufacturing market-ready systems that work with the speed and accuracy required for medical diagnostics, something the new NIH grant will help Armonica develop, said company CEO Victor Esch.
“We’re currently working to implement manageable fabrication technology at CHTM,” Esch told the Journal. “It doesn’t yet scale well for standard manufacturing. It’s still not very reproducible, so we’re working to transition it into mainstream fabrication technology.” The NIH grant, approved in August, reflects significant enthusiasm about Armonica’s technology, said Waneta Tuttle of Tramway Venture Partners, which invested in the company.
“The NIH peer review panel expressed real excitement in the technology’s potential,” Tuttle told the Journal. “It’s a real vote of confidence.”
Santa Fe-based Cottonwood Technologies Fund and Sun Mountain Capital have also invested in Armonica. So did Hamamatsu Ventures USA, a subsidiary of global company Hamamatsu Photonics K.K., which is a world leader in photonics, or light-based, technologies. Hamamatsu was attracted to Armonica’s photonics-based innovation, which applies laser technology to identify the individual molecules in DNA strands using advanced analytics tools that illuminate the molecules under study to better capture and measure their characteristics.

“We basically fabricate ‘nanoantennas,’ or enhancement structures, which are super-tiny antennas that resonate with optical light to probe individual molecules,” Esch said. A full DNA genome encompasses more than three billion nucleotides, or base pairs, which constitute the individual building blocks of DNA. Researchers chop up those full genomes into smaller parts, or strands, of DNA to capture and identify the individual molecules in bite-sized chunks as they pass through laser-reading devices.
But with current technology, researchers have struggled to accurately detect and identify all the individual nucleotides in each DNA strand, something Armonica’s technology now enables. In addition, Armonica has developed nano-scale channels, or nanopores, to push DNA strands through laser readers. Those microscopic holes force the DNA strands to stretch out and slow down as they move through reading devices, permitting the lasers to accurately process them. And that, in turn, allows Armonica to use larger strands of DNA in the laser readers, permitting it to record a lot more nucleotides in a single sample, while also speeding the process of measuring all the chopped up strands of DNA that lead to sequencing a full genome.

That process acceleration could potentially reduce the time for full genomic sequencing from days or weeks today to perhaps minutes, according to Armonica. And, with accurate readings at the individual molecular level, Armonica’s technology can provide a lot more critical information for medical diagnostics that’s often missed with current sequencing systems.
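Some back-of-the-envelope arithmetic (illustrative numbers, not Armonica's specifications) shows why longer reads mean far fewer fragments to stitch back together per genome:

```python
# Back-of-the-envelope: reads needed to cover a human genome at a given depth
# with short versus long reads. All numbers are illustrative assumptions.

GENOME_BASES = 3_000_000_000   # roughly three billion nucleotides
COVERAGE = 30                  # a commonly cited target depth

for label, read_length in [("short reads", 150), ("long reads", 20_000)]:
    n_reads = GENOME_BASES * COVERAGE // read_length
    print(f"{label:>11}: {read_length:>6,} bases/read -> about {n_reads:,} reads at {COVERAGE}x")
```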



UNM created these nano-scale tunnels to stretch out strands of DNA for scanning under optical readers as they flow through the channels. Researchers form the nanochannels by stacking individual silicon carbide particles. COURTESY OF UNM

Researchers, for example, often miss “structural variants,” or individual nucleotides, that attach to DNA and change their function or characteristics, turning things on and off, Esch said.
“Many diseases like cancer are the function of structural variants,” he said. “It’s critical to capture and measure those variants in medical diagnostics.”
Accurately detecting and identifying those variants, known as “epigenetics,” could advance medical breakthroughs in many fields, such as studying the aging process.
“Longevity studies today look at how epigenetics influences the way the body ages,” Esch said. “It could lead to therapies that modify things that are causing the body to slowly break down.”
Hamamatsu senior associate Robert Warren said Armonica’s next-generation sequencing capability could significantly advance medical diagnostics.
“It can help to understand DNA at a much deeper level,” Warren told the Journal. “It can provide a full picture of what’s going on with each molecule.”
It may still take five to 10 years to fully develop Armonica’s technology and bring it to market, said Richard Oberreiter, managing director of Hamamatsu Ventures USA.
“Transformative technology takes time and money,” Oberreiter told the Journal. “… But these types of technological advances can lead to breakthroughs in personalized medicine and treating disease.”
 
Researchers, for example, often miss “structural variants,” or individual nucleotides, that attach to DNA and change their function or characteristics, turning things on and off, Esch said.
I really do not know what they are on about. I think the reporter is confused between three things:
  • Structural variants are lengths of sequence that have been deleted, replicated or moved around the genome. These can be a challenge for conventional sequencing, but are not generally individual nucleotides, and they do not "attach" to DNA
  • Individual nucleotide changes, which are the common thing studied and are well detected by conventional sequencing
  • Epigenetic changes, which are chemicals (usually methyl groups) that attach to the DNA and change its function or characteristics, turning things on and off
The tech sounds very like what Oxford Nanopore is doing, with little sequencers the size of a USB stick. It has applications, but the critical number for the sort of medical applications they are talking about is price per read depth, and Oxford Nanopore is not very good for that.
 
It is all over my head. My take was that they are figuring out how to "speed read" DNA at the molecular level.
 
Nature's got an article with a slightly clearer take on it here:

https://www.nature.com/articles/d41586-021-00462-9

the development of sequencing technologies that can read long stretches of DNA uninterrupted. Now Miga and her colleagues in the Telomere to Telomere (T2T) consortium are poised to complete the 20-year odyssey that began with the release of that first draft sequence. Their goal is to produce, for each chromosome, an end-to-end genome map that stretches from one telomere (the repetitive sequence elements that cap chromosomal ends) to the other.

Yet it’s still missing 5–10% of the genome, including all the centromeres and other challenging regions, such as the large collection of genes encoding the RNA sequences that form protein-producing organelles called ribosomes. These are present in long stretches of numerous, repeated gene copies. “That’s a large portion of the yet-to-be-closed gaps,” says Adam Phillippy, a bioinformatician at the US National Human Genome Research Institute in Bethesda, Maryland, and T2T co-chair. The genome is also peppered with hard-to-map stretches of near-identical DNA called segmental duplications — the product of ancient chromosomal rearrangements.

While I'm raising a dubious eyebrow to the claim that 5-10% of the genome is "missing", the existing sequencing technologies do struggle with sections of DNA with many repeats of the same sequence. Since current methods rely on chopping the DNA into 1000+ fragments, and then sticking them back together by matching up the overlapping ends of each section, it's possible to lose, say, some of the middle repeats. Sure, the sequencing will provide the sequence of the repeating unit, and which other gene sequences it's sandwiched between. But as to how many repeats there are, sometimes the most that can be said is "at least X", simply because that's the longest individual fragment composed entirely of those repeats. There can also be similar problems if near identical sequences appear in different parts of the genome.
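To see why a repeat count often comes back as "at least X", here is a toy demonstration (nothing like a real assembler) that short overlapping reads from a 5-copy run and a 9-copy run of the same unit can be identical, whereas reads long enough to span the whole run are not. The sequences and read lengths are made up.

```python
# Toy illustration (not a real assembler): short reads cannot count long repeat runs.

def reads(genome: str, read_len: int) -> set:
    """All overlapping substrings ('reads') of a given length."""
    return {genome[i:i + read_len] for i in range(len(genome) - read_len + 1)}

UNIT = "CAGT"                              # hypothetical repeat unit
flank_left, flank_right = "AAACCC", "GGGTTT"

g5 = flank_left + UNIT * 5 + flank_right   # genome with 5 copies of the unit
g9 = flank_left + UNIT * 9 + flank_right   # genome with 9 copies of the unit

# Reads shorter than the repeat run: both genomes yield exactly the same read set,
# so overlap-based assembly can only conclude "at least this many copies".
print(reads(g5, 8) == reads(g9, 8))    # True

# Reads long enough to span the whole run (plus flanking sequence) resolve the count.
print(reads(g5, 30) == reads(g9, 30))  # False
```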

The nanopore technology's selling point seems to be that it can read much longer individual fragments - indeed they seem to have a goal of a literal end to end scan of the genome in one go, although they're not there yet. In principle that would pin down issues like how many repeats there are, how much it varies from person to person (and even from cell to cell). There are some medical conditions which can be caused by abnormal numbers of repeats in these kinds of sections of DNA (usually excessive numbers). They're not just problematic for us to sequence, they're weak links for the body's own DNA handling systems for somewhat related reasons. Hence potentially pathologically relevant.

I agree with Samson that the reporter in the previous article has got horribly confused between these kind of problematic sequences, individual nucleotide variation, and (probably?) DNA methylation - based on the description of attaching things to DNA.
 
Eating early is good for you
  • Late eating increases waketime hunger and decreases 24-h serum leptin
  • Late eating decreases waketime energy expenditure and 24-h core body temperature
  • Late eating alters adipose tissue gene expression favoring increased lipid storage
  • Combined, these changes upon late eating may increase obesity risk in humans
Late eating has been linked to obesity risk. It is unclear whether this is caused by changes in hunger and appetite, energy expenditure, or both, and whether molecular pathways in adipose tissues are involved. Therefore, we conducted a randomized, controlled, crossover trial to determine the effects of late versus early eating while rigorously controlling for nutrient intake, physical activity, sleep, and light exposure. Late eating increased hunger (p < 0.0001) and altered appetite-regulating hormones, increasing waketime and 24-h ghrelin:leptin ratio (p < 0.0001 and p = 0.006, respectively). Furthermore, late eating decreased waketime energy expenditure (p = 0.002) and 24-h core body temperature (p = 0.019). Adipose tissue gene expression analyses showed that late eating altered pathways involved in lipid metabolism, e.g., p38 MAPK signaling, TGF-β signaling, modulation of receptor tyrosine kinases, and autophagy, in a direction consistent with decreased lipolysis/increased adipogenesis. These findings show converging mechanisms by which late eating may result in positive energy balance and increased obesity risk.



Spoiler: Graphs make it look like the effect is really big
Spoiler: Legend
Effect of late eating on energy intake regulation
Effects of late eating schedule on daily profiles in (A) self-reported hunger, (B) serum leptin concentration, (C) plasma acylated (active) ghrelin concentration, and (D) acylated ghrelin:leptin ratio. Data shown as mean ± SEM; each data point is expressed as the percentage of the mean of all time points collected during the early eating protocol for that same participant. Left panels, test day 1; middle panels, test day 2; right panels, effects of late eating (late eating schedule minus early eating schedule) averaged across test days with asterisks indicating significant differences (∗p < 0.05; ∗∗p < 0.01; ∗∗∗p < 0.001; ∗∗∗∗p < 0.0001). Vertical black dashed and red solid lines, timing of meals in early and late eating schedule, respectively. Horizontal black bars along x axes, scheduled sleep episodes. Gray bars, semi-recumbent posture.


Spoiler: Relevance of the ghrelin:leptin ratio
The increase in the ghrelin:leptin ratio during the waking hours and across 24 h was explained primarily by a decrease in leptin, without a significant change in ghrelin. This decrease in leptin could, at least partially, be responsible for the observed increase in hunger and appetite as well as for the decrease in energy expenditure and CBT

From The long road to leptin:
Leptin is an adipose tissue hormone that functions as an afferent signal in a negative feedback loop that maintains homeostatic control of adipose tissue mass. This endocrine system thus serves a critical evolutionary function by protecting individuals from the risks associated with being too thin (starvation) or too obese (predation and temperature dysregulation). Mutations in leptin or its receptor cause massive obesity in mice and humans, and leptin can effectively treat obesity in leptin-deficient patients. Leptin acts on neurons in the hypothalamus and elsewhere to elicit its effects, and mutations that affect the function of this neural circuit cause Mendelian forms of obesity. Leptin levels fall during starvation and elicit adaptive responses in many other physiologic systems, the net effect of which is to reduce energy expenditure.
 
I started skipping/minimizing dinner decades ago. This makes me feel ahead of the curve.
 
I try not to eat after dark, I used to be a big nighttime binge eater.

It's unlikely during our evolution that humans ate much late at night (except maybe on special occasions/feasts).
 

Footage shows pod of orcas killing a great white shark and devouring its liver​

It's the 1st 'irrefutable evidence' that orcas are feeding off great whites near South Africa, says biologist

Great white sharks can no longer claim the title of top predator — at least not off the coast of South Africa.

New drone and helicopter footage shows a pod of orcas ruthlessly pursuing a great white shark in Mossel Bay before going in for the kill. The grisly video culminates with one of the killer whales gobbling up a large chunk of the shark's liver.

Scientists have long suspected that killer whales have been hunting sharks off South Africa's coast and driving them from their natural habitat. Now they have "irrefutable evidence," says shark biologist Alison Towner.

"When I saw the footage, of course, it just confirmed everything," Towner, a PhD candidate at South Africa's Rhodes University, told As It Happens host Nil Köksal.

"And as haunting it is to see the behaviour, one can't help but be awed by it as well. It's really quite a novel piece of natural history to observe."

Towner is the co-author of a new paper analyzing the footage, which was published this week in the journal Ecology.

Towner says there's been mounting evidence of orcas hunting great whites off South Africa since 2017.

"We had carcasses that were washed up, ripped open, livers missing. We had sightings of orcas in the area, [followed by] disappearances of the white shark," she said.

Not only does the new footage give her the smoking-gun evidence she's been looking for, but it allows scientists to observe exactly how the hunt played out.

The video — a compilation of helicopter and drone footage of the same hunt on May 16 — shows five orcas working together to kill a shark.

As one orca homes in on its prey, instead of fleeing, the shark swims in tight circles, mimicking an evasive technique that turtles and seals use when trying to escape great whites.

It's a move that sometimes works out for a shark's prey — but it didn't help this great white escape its hunters. That's because orcas, unlike great whites, hunt in packs.

"I always refer to the orcas as being like the wolves of the ocean," Towner said. "They've kind of got the edge on this because, you know, they're co-ordinated and they've got teamwork, whereas the white sharks are on their own. They're caught by surprise and they're basically just panicking."

It's no easy feat to make a great white shark panic. But there's a reason that orcas are referred to as killer whales — even though they are, in fact, dolphins.

"What does everybody associate an orca with, typically? We all think of Free Willy, right? Or Sea World? And we think cute and cuddly, and obviously very charismatic," Towner said.

But orcas, like great whites, are considered apex predators of the ocean.

"They are prolific hunters. They're extremely efficient. They're rather savage actually, because they're so good at what they do," she said.

"White sharks have this very apex predator role, and they're quite feared. But now they've definitely been knocked off the top pedestal here."

Josh D. McInnes, a marine mammal researcher at the University of British Columbia who has studied orcas hunting large mammals, says there are a number of possible reasons the orcas are targeting sharks.

"Killer whales may kill white sharks based on competition for resources, perceived risk to young calves in a pod, or for food," he told CBC Radio in an email.

"The liver of sharks is rich in nutrients and oil, and makes up a large proportion of the shark's anatomy. It would make sense for killer whales to feed on this organ."

Jenny L. Atkinson, executive director of the Whale Museum in Washington, says that while this footage is novel, the behaviour likely isn't.

There have been reports of orcas killing great whites in California since the late '90s. And a 2019 study found that when orcas show up in shark-infested waters, great whites flee.

"With more people watching and video more accessible, we are going to be documenting more behaviours that seem new or unique to us — but it's likely not new to them," Atkinson said in an email.

Driving the sharks away​

Towner says the hunt caught on film is part of a broader pattern. In fact, she and her colleagues suspect that three other sharks were killed by that same orca pod that day.

And they believe it all started with two orcas known as Port and Starboard — the latter of whom appeared in the footage chomping down on the shark's liver.

The deadly duo, Towner said, appear to have been passing their shark-hunting techniques to other orcas. And the sharks are responding, she said, by getting the heck out of dodge.

After the video was filmed, no great whites were seen in the area for 45 days. They started to reappear in the summer, but then another mangled great white shark carcass washed ashore.

"Since then, there haven't been any white sharks back in Mossel Bay," she said.



Towner says she's worried what this will ultimately mean for the future of South Africa's great white shark population, and what their loss will mean for the ecosystem at large.

The coast of South Africa, she says, is the sharks' natural feeding ground. And it's unclear where the fleeing sharks have taken refuge.

"How many more times can it happen before the entire population is displaced?" she said. "It's really hard to predict what will happen."
https://www.cbc.ca/radio/asithappen...white-shark-and-devouring-its-liver-1.6610075
 
Also the Ig Nobels were the other day

ECONOMICS PRIZE [ITALY]
for explaining, mathematically, why success most often goes not to the most talented people, but instead to the luckiest.
[NOTE: This is the second Ig Nobel Prize awarded to Alessandro Pluchino and Andrea Rapisarda. The 2010 Ig Nobel Prize for Management was awarded to Alessandro Pluchino, Andrea Rapisarda, and Cesare Garofalo, for demonstrating mathematically that organizations would become more efficient if they promoted people at random.]

I waited to see if they would update their academic profiles, but they must have been busy with other things: both Pluchino and Rapisarda still list only the 2010 win among their achievements (though they have added the latest awards to their personal web pages).
I'm not sure when, but they also reunited with Garofalo, among other things, to create a theory proposing that a number of members of parliament should be chosen randomly.
 