AR
Thanks for your concise post. I admit I was looking at it more from a Sun-climate point of view than from a carbon-dating one, although the basic points still apply. A little more detail to clarify what I mean, and a wee question at the end.
Radioactive isotopes record solar magnetic activity and related climate change for thousands of years. During periods of increased activity on the Sun, when the Earth was presumably warmer, the magnetic fields in the solar wind had a larger shielding effect on cosmic rays. This prevented the energetic charged particles from entering the Earth's atmosphere and producing radioactive isotopes. In contrast, high amounts of the radioactive elements were produced when the Sun was inactive and the climate was cold.
Carbon-14, dubbed radiocarbon and designated C14, was the first radioactive isotope used to reconstruct past solar activity. Radiocarbon is produced in the Earth's atmosphere by a nuclear reaction in which energetic neutrons interact with nitrogen, the most abundant constituent of our air. The neutrons are themselves the products of interactions between cosmic rays and the nuclei of air molecules.
Radiocarbon can be found in annual tree rings dating back eight thousand years. Each radiocarbon atom, C14, combines with oxygen, O2, in the air to produce a form of carbon dioxide, designated 14CO2, that is assimilated by living trees during photosynthesis and deposited in their outermost ring. The time of assimilation can be determined at any later date from the age of the annual ring: just count the number of rings formed since, at the rate of one ring per year.
The radiocarbon records confirm that the Maunder Minimum corresponded to a dramatic reduction in solar activity, and show that such prolonged periods of inactivity are a fairly common aspect of the Sun's behavior.
This time scale can be extended. Cores taken from the polar ice caps extend the tree-ring evidence for Sun-climate connections. The ice contains the radioactive isotope of beryllium, Be10, that was deposited there by snows. Like radiocarbon, Be10 is produced by reactions between energetic neutrons and the molecules of the air, a consequence of cosmic rays entering the atmosphere.
The point of all this is that it can be determined when the Sun was active or inactive, which then, presumably, affects carbon dating calculations. This is where I have a question. I understand that the rate of decay is vital, but I'm not totally clear in my mind why the amount of C14 in the atmosphere has a bearing on this. In other words, why do fluctuations in the atmospheric C14 content affect the apparent C14 date from a sample? What's the connection between the amount and the rate of decay? I'm assuming from your post that you can't calculate the loss by decay without knowing the starting amount, hence the calibration curves.
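To make my own question concrete, here is a rough sketch of how I understand the arithmetic (numbers and function names are my own, not from your post): the decay law only tells you what fraction of the starting C14 remains, so the age you compute depends entirely on what you assume the atmosphere held when the sample stopped taking up carbon. If the Sun was quiet and atmospheric C14 was a couple of percent higher than assumed, the raw age comes out wrong by over a century.

```python
import math

HALF_LIFE = 5730.0  # years, the modern (Cambridge) C14 half-life
MEAN_LIFE = HALF_LIFE / math.log(2)  # about 8267 years

def apparent_age(measured_ratio, assumed_initial_ratio=1.0):
    """Age in years from the decay law N = N0 * exp(-t / MEAN_LIFE).

    measured_ratio: sample C14/C12 relative to a modern standard.
    assumed_initial_ratio: what we assume the atmosphere held when
    the sample stopped exchanging carbon. Decay only gives the
    fraction lost, so the starting amount is an assumption -- this
    is exactly what the calibration curves correct for.
    """
    return MEAN_LIFE * math.log(assumed_initial_ratio / measured_ratio)

# A sample retaining half its assumed-modern C14 reads one half-life:
print(round(apparent_age(0.5)))  # 5730

# But if the atmosphere actually held 2% more C14 when the tree grew
# (an inactive-Sun period, more cosmic rays, more C14 produced),
# the same measurement implies a noticeably older true age:
shift = apparent_age(0.5, 1.02) - apparent_age(0.5)
print(round(shift))  # roughly 160-170 years
```

So, if I have this right, a small error in the assumed starting amount translates directly into a dating error, which is why the tree-ring and ice-core records of past C14 production matter.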
The past is a foreign country: they do things differently there.