Continuing from the previous post regarding the two dating processes, Carbon-14 dating vs. Geologic Column dates: as stated at the end of that post, "Unfortunately for those who believe in both concepts, they do not agree with one another!"
As an example, if two dates, that of Carbon-14 dating and that of the known index fossils, disagree, the radiocarbon date is thrown out and the index-fossil date is used. This is because, as stated earlier, fossils are dated not by radiocarbon dating but by their geologic position, that is, by the place in the column in which they are found. This presents some grave difficulties for adherents of the geologic column and of Carbon-14 dating.
Take the experience of the Hornton Quarries in England, where fossil wood was found alongside ammonite and belemnite index fossils considered to be 189 million years old. Three specimens of the wood were sent to the commercial Geochron Laboratories in Cambridge, Massachusetts, for testing. As a cross-check, a piece of the first sample was also sent to the Antares Mass Spectrometry Research Laboratory at the Australian Nuclear Science and Technology Organisation near Sydney, Australia. Both labs, across all three samples, returned dates ranging from 20,700 to 28,820 years before present, far short of the index-fossil date of 189 million years. Neither lab would normally have tested the wood at all had it known the samples came from among fossils indexed at 189 million years, since at that age no measurable Carbon-14 should remain in the wood; had they known of the index-fossil context, they would simply have regarded the wood as 189 million years old.
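To see why the labs would have expected no measurable Carbon-14 at 189 million years, consider the decay arithmetic. Below is a minimal sketch in Python, assuming the commonly cited 5,730-year half-life of Carbon-14; the figures are illustrative, not the labs' own calculations.

```python
HALF_LIFE = 5730.0  # commonly cited half-life of Carbon-14, in years

def remaining_fraction(age_years: float) -> float:
    """Fraction of the original Carbon-14 left after age_years of decay."""
    return 0.5 ** (age_years / HALF_LIFE)

# At the index-fossil age of 189 million years (roughly 33,000 half-lives),
# no detectable Carbon-14 should survive:
print(remaining_fraction(189e6))   # 0.0 (underflows; far below any detection limit)

# At the ages the labs actually reported, a measurable fraction remains:
print(remaining_fraction(20_700))  # ~0.082, about 8% of the original Carbon-14
print(remaining_fraction(28_820))  # ~0.031, about 3% of the original Carbon-14
```

By the method's own arithmetic, then, wood that still yields a measurable Carbon-14 reading cannot be anywhere near 189 million years old.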
How important is this? Why have we spent so much time on it?
The answer is simple: As we have written here several times, scientists use a technique called radiometric dating to estimate the ages of rocks, fossils, and the earth. Many people have been led to believe that radiometric dating methods have proved the earth to be billions of years old; in fact, this so-called "fact" has been taught in schools for more than 60 years, making ours a society of three generations of believers in both the process and its results. Ask any group of students in almost any high school or college today and they will tell you the Earth is billions of years old, 4.55 billion at the last published evolutionary estimate. This attitude has caused many normally religious people to reevaluate the biblical creation account, specifically the meaning of the word "day" in Genesis. This, in turn, has led at least two generations in our society (and the world) to doubt their religious heritage and turn away from that which has made our nation strong: a belief in, and conviction of, a supreme power, a Deity more powerful than ourselves, an omnipotent God.
Is that important?
One need only look around at human behavior, moral conduct, ethics, standards, social manners, charity, and overall common decency today, and compare them with what existed in the first half of the twentieth century. Of course there has always been anti-religious sentiment, moral bankruptcy, and social deterioration among groups in any society, but the numbers today are so obviously overbalanced toward this bankruptcy as to make it quite clear that society has been on a downward slope since the earliest inroads made by evolutionists and the anti-God dogmas of the early 20th century.
[Images: a high school classroom in 1900 (left) and in 2000 (right)]
While it is not politically correct today to talk about God, morality, and social behavior as they once existed, and while that is not the major emphasis of this blog-site, the corrupting of human nature goes hand-in-hand with the elimination of God from any society, and certain so-called "scientific advances" have led to this breakdown and all that such a societal problem entails.
Among other things, the pseudo-science of evolution, the Geologic Column, and Carbon-14 dating have taken us far from the truth in our understanding of the past. While archaeology and anthropology are meant to help man understand his past, when they are promoted at every educational level on the basis of untruths, lies, and downright misrepresentations, they do far more harm than merely providing generations with falsifications about their past and heritage.
While such teaching and false understanding can have numerous harmful consequences, in simplest form it keeps us from understanding what we read, what we see, and what we learn about past generations, whose encounters with life can teach us much about our own circumstances and difficulties. Wherever you want to place the Jaredite and Nephite nations, the point is that without understanding their surroundings, their challenges, and their successes in the light in which they occurred, we miss much that can be learned from such historical events.
Take the difference in dates as an example. When science claims that people lived in caves during the earliest centuries of man, that early man knew nothing of the finer points of life, that agriculture went undiscovered for hundreds or thousands of generations, or that metallurgy, textiles, and carpentry came along much later than they did, we lose touch with, and an understanding of, the peoples who built societies, nations, and industries. Moses tells us that seven generations from Adam, man was living in tents and herding livestock, inventing musical instruments, including the lyre, flute, harp, and organ, and forging implements of bronze and iron.
We "learn" from archaeology and anthropology that man occupied certain lands, cities, and regions in 30,000 B.C., or 10,000 B.C., and so on, yet the Flood occurred in 2344 B.C., which means those lands, cities, and regions could only have been populated after that time, whatever so-called radiocarbon dates may suggest to the uninformed.
Consequently, to more accurately understand the dating of past events, the building of ancient cities, and the development of early societies, we need to recognize that the dates archaeologists and anthropologists throw around based on Carbon-14 or radiometric dating are far from accurate.
Archaeologists claim that when they find a fossil in the dirt, they can test it for age; however, they cannot know in advance when it lived or died. That is a guess they have to make. The lab to which the specimen is sent for Carbon-14 testing does not know how much Carbon-14 the specimen had at the time of death, or how much was common in the atmosphere during its lifetime; it cannot know whether other factors were involved, such as contaminants in the ground, water, or air while it lived, or afterward over the hundreds or thousands of years before it was found. Nor can the lab know at what rate the existing Carbon-14 in the specimen decayed, or any number of other factors the dating program is not equipped to consider.
All the lab can do is measure how much Carbon-14 remains in the specimen and compare it to a table built upon a number of assumptions, such as those just stated, to provide an "age." As an example, animals that lived before the Flood would have had less Carbon-14 than those that lived after it, since pre-Flood conditions, such as ice or a moisture canopy, would have blocked much of the cosmic radiation that turns atmospheric nitrogen into Carbon-14, limiting how much Carbon-14 could form in the first place.
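To illustrate how strongly the computed "age" depends on that assumed starting level, here is a minimal sketch in Python of the standard decay-law calculation, t = ln(N0/N)/λ, again assuming the 5,730-year half-life; the 1/8 pre-Flood level below is a purely hypothetical figure for illustration, not a measured value.

```python
import math

HALF_LIFE = 5730.0                        # years
DECAY_CONSTANT = math.log(2) / HALF_LIFE  # lambda in the decay law

def radiocarbon_age(measured: float, assumed_initial: float = 1.0) -> float:
    """Age from the decay law t = ln(N0 / N) / lambda.

    measured:        Carbon-14 remaining, relative to today's atmospheric level
    assumed_initial: starting Carbon-14 level relative to today (the assumption)
    """
    return math.log(assumed_initial / measured) / DECAY_CONSTANT

measured = 0.05  # suppose the lab measures 5% of today's atmospheric Carbon-14

# Standard assumption: the specimen began with today's Carbon-14 level.
print(round(radiocarbon_age(measured)))       # ~24,765 years

# If the pre-Flood atmosphere held only 1/8 of today's Carbon-14 (hypothetical),
# the very same measurement implies a far younger specimen:
print(round(radiocarbon_age(measured, 1/8)))  # ~7,575 years
```

The measurement is identical in both cases; only the assumed starting level changes, and that assumption is exactly what the lab's table embodies.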
So how do science and the Carbon-14 dating process account for this variation? After all, there are tables for other factors, such as the decrease or increase of Carbon-14 in the atmosphere during certain pre-determined periods, such as atomic bomb testing, ice ages, etc. But what about the Flood?
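Mechanically, a table-based adjustment of the kind just described might look like the following toy sketch in Python; the offsets are invented purely for illustration and bear no relation to real calibration curves such as IntCal.

```python
# Hypothetical, greatly simplified calibration table: each entry maps a raw
# radiocarbon age threshold (years before present) to an additive offset
# standing in for known atmospheric variation. Real calibration curves are
# far more detailed; these numbers are invented for illustration only.
CALIBRATION_OFFSETS = [
    (0,      0),      # recent era, including bomb-spike corrections
    (5_000,  300),
    (10_000, 900),
    (20_000, 2_500),
]

def calibrate(raw_age: float) -> float:
    """Apply the offset of the highest threshold the raw age reaches."""
    offset = 0
    for threshold, delta in CALIBRATION_OFFSETS:
        if raw_age >= threshold:
            offset = delta
    return raw_age + offset

print(calibrate(12_000))  # 12900 under these invented offsets
```

Nothing in such a table, however, carries an entry for the Flood; that is the gap the next post takes up.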
(See the next post, "How Far Back Can We Measure Dates? Part III," for the way science compensates for pre- and post-Flood dating in Carbon-14 testing.)