Half-life (not the game)

MeteorPunch

Please don't turn this thread into Evolution vs. Creation :)

Could someone explain, or give a link explaining, how elements with billion-year half-lives can be measured accurately? Carbon-14 is somewhat accurate over thousands of years, but how can anything even be measured on such a minuscule scale that the result runs to billions? I assume this is done with math calculations, but a small variance would set it off by millions of years.
 
MeteorPunch said:
Could someone explain, or give a link explaining, how elements with billion-year half-lives can be measured accurately? Carbon-14 is somewhat accurate over thousands of years, but how can anything even be measured on such a minuscule scale that the result runs to billions? I assume this is done with math calculations, but a small variance would set it off by millions of years.
Bear in mind that when we are talking billions, a million is only a 1/1000 error. Obviously that's not very good over the short term, but it is fine for very big numbers.
 
Well, first of all, to measure the half-life of something you don't actually need to wait around for a billion years until half of it decays away. You can measure the rate at which it decays and, by applying a simple differential equation, work out how long it has been decaying. Not sure if that's what you were asking, though. Try searching on http://www.wikipedia.org , if you haven't already.

EDIT in light of Scuffer: Oh, that's what you were asking ^^
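To make that concrete, here is a minimal sketch of working a measured decay rate back to a half-life via A = kN and t½ = ln2/k. The sample size and count rate are made-up illustrative numbers, not data from any real measurement:

```python
import math

SECONDS_PER_YEAR = 3.156e7

def half_life_from_activity(activity_bq, num_atoms):
    """Estimate half-life in seconds from a measured decay rate.

    N(t) = N0 * exp(-k*t), so the activity is A = k*N and t_half = ln(2)/k.
    """
    k = activity_bq / num_atoms      # decay constant, per second
    return math.log(2) / k

# Made-up example: 1e20 atoms producing 2200 decays per second
t_half = half_life_from_activity(2200, 1e20)
print(t_half / SECONDS_PER_YEAR, "years")   # about a billion years
```

The point is that you never wait for the sample to halve; counting decays for a few minutes pins down k, and the half-life follows.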
 
Also keep in mind that the unit for measurement of molecules is the mole.

One mole is 6.02214199e23 particles.

That's 602 sextillion 214 quintillion 199 quadrillion, give or take about 50 quadrillion.

That number of molecules gives us quantities we can understand, like grams of stuff.

So if you have one mole of something with a billion year half life, you still get about 10 million decay events every second.
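As a sanity check on that figure, here is the arithmetic in a few lines of Python, using the Avogadro value quoted above:

```python
import math

AVOGADRO = 6.02214199e23      # particles per mole
SECONDS_PER_YEAR = 3.156e7

half_life_years = 1e9
k = math.log(2) / (half_life_years * SECONDS_PER_YEAR)  # decay constant, per second
activity = k * AVOGADRO       # decays per second from one mole
print(f"{activity:.2e} decays per second")  # on the order of 1e7
```

So even an absurdly slow decay is easily countable when you have a gram-scale sample.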

A nuclear decay event produces some combination of alpha and beta particles along with gamma radiation. Each of these can easily be measured as an individual event because they have high energies. An individual alpha particle typically has an energy in the range of 3 to 10 MeV, beta from 5 keV to 5 MeV. Gamma radiation has energies typically in the range of 20 keV to 10 MeV, but is a bit harder to detect because gamma rays are uncharged, massless EM radiation.

Now if we are talking about dating then we are talking about measuring the ratio between the 'mother' isotope and its daughters. Many times the daughters are radioactive too. I think you can work out the rest.

C14 is a special case because it has been produced at a known and relatively constant rate by cosmic rays in the upper atmosphere, and taken up at a known rate by organic matter (in equilibrium with the atmosphere). Thus you only need to know how much C14 is left.
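Given that equilibrium assumption, turning a measured C-14 fraction into an age is just inverting the decay law. A small sketch, using the commonly quoted modern half-life value:

```python
import math

C14_HALF_LIFE = 5730.0  # years, commonly quoted modern value

def radiocarbon_age(fraction_remaining):
    """Age in years from the fraction of the original C-14 still present."""
    return -C14_HALF_LIFE * math.log2(fraction_remaining)

print(radiocarbon_age(0.5))    # one half-life: 5730.0 years
print(radiocarbon_age(0.25))   # two half-lives: 11460.0 years
```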
 
Carbon is only usable for about 50,000 years into the past. Potassium-Argon dating can be used on basically anything (provided that there was potassium in the substance), as potassium-40 has a half-life of 1.3 billion years. The half-life is the time that it takes half of the material to decay. In this case, after 1.3 billion years, 5 moles of potassium-40 would be down to 2.5 moles, the other 2.5 moles having decayed into a mix of calcium-40 (by beta decay) and argon-40 (by electron capture).

According to wiki, the decay to argon happens mostly by electron capture, with a small positron-emission branch.
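The dating step itself can be sketched like so. This toy version ignores the branching between argon-40 and calcium-40 (which real K-Ar dating must correct for) and assumes no daughter product was present initially:

```python
import math

K40_HALF_LIFE = 1.3e9  # years

def parent_daughter_age(daughter_to_parent_ratio):
    """Age in years from the measured daughter/parent ratio.

    P(t) = P0 * exp(-k*t) and D(t) = P0 - P(t), so D/P = exp(k*t) - 1
    and t = ln(1 + D/P) / k.  Branching and initial daughter are ignored.
    """
    k = math.log(2) / K40_HALF_LIFE
    return math.log(1 + daughter_to_parent_ratio) / k

# Equal amounts of parent and daughter correspond to exactly one half-life
print(parent_daughter_age(1.0))   # 1.3e9 years
```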
 
Half-lives this long are calculated from the radioactive decay constant k.

A = kN

A = activity, which can be measured.
N = number of nuclei, which can be calculated as above from the mass and nucleon number.

Half-life: t = ln2/k
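Putting those two formulas together as a worked example: the activity figure below, roughly 12,400 decays per second per gram of U-238, is an approximate literature value used here only for illustration.

```python
import math

AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

def half_life_years(activity_bq, mass_g, nucleon_number):
    """t_half = ln(2)/k, with k = A/N and N from the mass and nucleon number."""
    n_nuclei = mass_g / nucleon_number * AVOGADRO
    k = activity_bq / n_nuclei          # decay constant, per second
    return math.log(2) / k / SECONDS_PER_YEAR

# One gram of U-238 gives roughly 12,400 decays per second
print(f"{half_life_years(12400, 1.0, 238):.2e} years")  # about 4.5e9
```

The recovered half-life comes out near the accepted 4.47-billion-year value for U-238, which is the whole trick: a lab-bench count rate determines a geological timescale.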
 
Yom said:
Potassium-Argon dating can be used basically for anything (provided that there was potassium in the substance), as it has a half-life of 1.3 billion years.

Using this one as an example, let's say the decay has been observed and measured for the past 50 years. Even *if* the rate of decay during that time is 100% constant, how can we be sure that events such as solar flares, an ice age, or meteorite dust blocking the sun haven't changed the rates for the old objects we date?

I'm just thinking that assuming a constant decay rate is asking a lot, really. 50/1,300,000,000 is a tiny fraction, and a small inconsistency could throw the results off by billions of years.
 
MeteorPunch said:
Using this one as an example, let's say the decay has been observed and measured for the past 50 years. Even *if* the rate of decay during that time is 100% constant, how can we be sure that events such as solar flares, an ice age, or meteorite dust blocking the sun haven't changed the rates for the old objects we date?

I'm just thinking that assuming a constant decay rate is asking a lot, really. 50/1,300,000,000 is a tiny fraction, and a small inconsistency could throw the results off by billions of years.

I'm not a physicist, nor did I study it. But my understanding of decay is that it's not dependent on outside forces. That is, if you have, oh, Uranium-238 sitting out in the open, it will decay at the same constant rate as U-238 buried deep underground.
 
Yom said:
Carbon is only used for about 50,000 years in the past. Potassium-Argon dating can be used basically for anything (provided that there was potassium in the substance), as it has a half-life of 1.3 billion years.
I even heard that you can go as far as Uranium Dating.
 
Sorry for going off-topic, but I have a question about half-lives that I don't think really deserves its own thread.

What happens when there's only a small number of atoms left in the substance? Let's say there's ten atoms of a radioactive element. After one half life, there should be five. But after two half lives, there can't be 2.5 atoms. So how many would be left after two half lives? Is radioactive decay just a matter of probability, and thinking of it in "half lives" only gives an approximation (and in this example, most likely there would be 2 or 3 atoms, although it could be anywhere from 0 to 5, with the more extreme values being less likely)?
 
WillJ said:
Sorry for going off-topic, but I have a question about half-lives that I don't think really deserves its own thread.

What happens when there's only a small number of atoms left in the substance? Let's say there's ten atoms of a radioactive element. After one half life, there should be five. But after two half lives, there can't be 2.5 atoms. So how many would be left after two half lives? Is radioactive decay just a matter of probability, and thinking of it in "half lives" only gives an approximation (and in this example, most likely there would be 2 or 3 atoms, although it could be anywhere from 0 to 5, with the more extreme values being less likely)?
The number will never reach zero, due to the exponential curve (IIRC) :)
 
WillJ said:
Sorry for going off-topic, but I have a question about half-lives that I don't think really deserves its own thread.

What happens when there's only a small number of atoms left in the substance? Let's say there's ten atoms of a radioactive element. After one half life, there should be five. But after two half lives, there can't be 2.5 atoms. So how many would be left after two half lives? Is radioactive decay just a matter of probability, and thinking of it in "half lives" only gives an approximation (and in this example, most likely there would be 2 or 3 atoms, although it could be anywhere from 0 to 5, with the more extreme values being less likely)?
It's probabilistic; if you have 10,000,000,000 particles, after one half-life there will be 5,000,000,000 particles, on average. It won't be exact (the fluctuations are on the order of the square root of the count), but with so many particles that's a negligible relative variation. With only 10 particles, though, half is 5, and a swing of 2 or 3 particles means anywhere from 8 to 2 could remain. So it doesn't exactly halve; it is, at the end of the day, a probability.

I did some basic experiments at uni this year with radioactivity, and the probabilistic nature makes it very interesting!

But since, as has been said, we are generally dealing with between ~10^20 and 10^26 particles, a small variation is nothing.
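That probabilistic picture is easy to simulate. In this sketch each atom independently survives one half-life with probability 1/2, so ten atoms over two half-lives end up anywhere from 0 to 10 survivors, averaging 2.5:

```python
import random

def surviving_atoms(n_atoms, half_lives):
    """Monte Carlo: each atom survives each half-life with probability 1/2."""
    for _ in range(half_lives):
        n_atoms = sum(1 for _ in range(n_atoms) if random.random() < 0.5)
    return n_atoms

random.seed(1)
# Ten atoms, two half-lives, repeated many times
trials = [surviving_atoms(10, 2) for _ in range(10000)]
print(sum(trials) / len(trials))     # close to the expected 2.5
print(min(trials), max(trials))      # zero survivors does happen now and then
```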
 
Turner_727 said:
I'm not a physicist, nor did I study it. But my understanding of decay is that it's not dependent on outside forces. That is, if you have, oh, Uranium-238 sitting out in the open, it will decay at the same constant rate as U-238 buried deep underground.

Carbon 14 dating is dependent on atmospheric pressure. I don't know what all the factors are.
 
MeteorPunch said:
Carbon 14 dating is dependent on atmospheric pressure. I don't know what all the factors are.
In what way is it dependent on atmospheric pressure?
 
This is the article the info is from.

INHERENT PROPERTIES AND ERRATA

Radiocarbon dating, especially using the Carbon 14 method, takes advantage of the radioactive decay of the isotope, which is seen as a constant. Every living thing takes in and expels Carbon 14 while it is alive, and a static level of the element is maintained. When the organism dies, the infusion is suspended, and the level is reduced according to the rate of decay, known as the "half-life." The amount of Carbon 14 in the artifact is measured and then compared to the presumed static level the organism maintained while alive; the comparison then yields the relative age of the specimen. Though this sounds very straightforward and scientific, there are several serious problems.

The first problem is seen in the very approach in the presumption that must be made in the level of Carbon 14 the organism had while living. Here we have a critical calculation that is based upon an assumption that an organism which lived thousands of years previous, of which there are no modern species to compare, developed a specific level of Carbon 14 from an environment we know nothing about. If for example, the presumption is inaccurate by only 10%, considering that it is the rate of decay that forms the mathematical constant, the inaccuracy of the calculation of age at the upper limit would be tens of thousands of years.

The very basis for the assumption above is another problem, and is perhaps the most embarrassing for the proponents of radiocarbon dating. To assume a particular level of Carbon 14 in an organism requires a precise determination of environmental (atmospheric) levels of the same. That is, to presume a particular level in a living thing requires a precise knowledge of the ambient amount of Carbon 14 in the air and environment. Scientists performing radiocarbon dating assume that the amount in the environment has not changed. This is compelling for several reasons, not the least of which is the convenience with which "science" apparently operates; we hear of massive changes in the earth, ice ages, catastrophic events that killed the dinosaurs, etc., but the environment never changed according to the same scientists.

Not only does the requisite level of assumption and presumption all but invalidate the accuracy of the claims of very old dating, but were there for example, an environmental phenomenon that affected the level of ambient Carbon 14, the results could be skewed exponentially. In fact, several such phenomena did indeed exist, proven by the same science that supports old-age radiocarbon dating! It would seem quite clear that some predisposition or predilection for particular findings in terms of dating artifacts is at work in this case. For example, consider that it is essentially accepted that an antediluvian water canopy existed surrounding the earth; this would have acted to either negate or at least significantly reduce the effect of cosmic, x-ray, and ultraviolet radiation in the upper atmosphere. Carbon 14 production would have been negligible, and therefore would not have been absorbed by living things; any organism living before the reduction of the canopy would in turn be dated exponentially older than it actually is. Or consider the effect a global atmospheric shield of dust created as a result of a meteor impact some scientists believe killed off the dinosaurs; levels of Carbon 14 in the atmosphere must certainly have been different, thereby invalidating the age/date test data. Isn't it funny how the same scientists who purport constant catastrophic changes in earth's history depend upon the inherent necessity that it was completely without any changes?

Moreover, it is established fact that the earth's magnetic field has been in a constant decline in strength [2], which would have vigorously protected the earth from the same radiation, all but negating the production of Carbon 14 and thereby minimizing the ambient amount available for absorption by living things. Yet these two facts are virtually unknown in modern society, and it seems never associated with radiometric dating, apparently since it would put such method (and indeed its findings) in doubt as to its reliability.

Another fact, which proves quite embarrassing to "old-age" proponents in regard to radiometric dating, is the half-life of Carbon 14 itself. Not only is the actual half-life length itself in some contention, but the effect it would have on the upper limits of its capability in dating illustrates clearly the level of fraud that has been foisted on an unsuspecting society. Consider that Carbon 14's half-life is around 5,630 years [3] (though estimates range from 5,300 to 5,700 years); in only ten cycles of this, there would be nothing left to measure in the extant specimen! This means that the absolute maximum age radiocarbon could date a specimen to would be around 56,300 years; yet daily society is barraged with reports that some new find was dated in the hundreds of thousands, and even millions of years using Carbon 14. Actually, after the sixth cycle or so, there would not be enough Carbon 14 in the sample to be measured; the upper limit then would be around 30,000 years.

This leads to yet another inherent problem in the use of radiometric dating which would seem virtually insurmountable, and is caused by the presence of environmental Carbon 14 itself, ironically, the phenomenon scientists exploit in the determination of date of origin. Simply stated, it is nearly impossible to preclude contamination that seriously affects the results of the measurement. The levels of Carbon 14 in any "old" artifact are extremely low; because of this, it is virtually impossible to prevent the test and measurement equipment from picking up residual or background environmental Carbon 14 not associated with the specimen. Further, most artifacts by their very nature are found in and around various forms of rock, which provide several sources of additional radiation. This has the concomitant effect of providing a source of neutrino radiation; Carbon 14 decay is accelerated in the presence of such bombardment, and again the effect would be to cause the specimen to appear much older than it actually is. This effect cannot be overstated in regard to the estimates of age: a less than 5% reduction in the extant amount of Carbon 14 in the specimen, owing to the "constant" of its half-life, will yield a factor of 5 times the actual age. Imagine the effect on science if an artifact dated at 45,000 years is actually only 9,000; the possibilities are staggering.

The foregoing is but a few examples of the problems with Carbon 14; many more examples could be given, as well as some documented, glaring failures such as live clams being dated at 1,500 years, and parchment documents from the 17th century being dated to the 4th. The point however, is that radiocarbon dating has serious problems in terms of reliability and veracity, and its use is at best quite limited. On the other hand, there is an obvious dichotomy in these problems and the lack of common knowledge regarding them; it would seem that there should be some explanation why the vast majority of society is so unaware of the spurious nature of the science behind radiocarbon dating. That is, since science is ostensibly clinical and without emotion, the most likely cause of the dearth of knowledge of the limitations, fallacies, and vulnerabilities in this method is man himself: a manifestation of his own biases and predilections. This is the subject of the next division.
 
I am a professional physicist. The article is full of nonsense. Half-lives are known accurately.

To answer an earlier point, they are probabilistic. That is, they predict the chance of decay. When the number of nuclei is very small, the situation is not unlike rolling a die.

Nothing external to the nucleus influences nuclear decay: not pressure, not temperature.
 