JParker said:
Plasma cosmology is not a widely accepted scientific theory, and even its advocates agree that the explanations it provides are less detailed than those of conventional cosmology. Its development has been hampered, as has that of other alternatives to big bang cosmology, by the exclusive allocation of government funding to research in conventional cosmology. Most conventional cosmologists argue that this bias is due to the large amount of detailed observational evidence that validates the simple, six-parameter Lambda-CDM model of the big bang.

So plasma cosmology is less well funded than more mainstream theories because it isn't as well supported by the evidence as they are? This sounds to me like the scientific enterprise working as it should. Ideas and experiments that produce results and comport with the evidence are used as the foundation for further research, while those that don't fall by the wayside. What exactly is the problem?
Spetner wrote:In his book Not By Chance!, Lee Spetner presents an illuminating mathematical analysis of evolution. Spetner is a professor emeritus of physics from MIT who specialized in information theory. In this book, Spetner points out that, in order to build up information in small steps, each step must add information on average. But few if any mutations have ever been discovered that add information. Virtually all known beneficial mutations in bacteria, for example, reduce sensitivity to antibiotics by actually losing information. Spetner's mathematical analysis from first principles contrasts sharply with Dawkins' method of working back from a preconceived conclusion (or, equivalently, working forward from the premise of naturalism). And Spetner's analysis demonstrates that the Neo-Darwinian Theory of Evolution is not even close to mathematical viability. "The deck is stacked," as he puts it. But don't hold your breath waiting for evolutionists to concede that a mere mathematical analysis can trump their naturalistic dogma.
Spetner is a physicist, not a biologist, and the very title of his book betrays his ignorance of the subject. No biologist thinks that evolution is a chance or random process. It has a random component, but Spetner seems to be ignoring the role of natural selection, which was Darwin's key contribution. Evolution is the non-random survival of random mutations: "The combination of random mutations with a suitable law (for example, the non-random natural selection) can accelerate evolution by many orders of magnitude (as, for example, has been demonstrated by Dawkins) and this makes Spetner's probabilistic exercise immaterial" (from http://www.talkreason.org/articles/spetner.cfm#calculates).

How many generations, and how long, would it take for a particular multiple-nucleotide change in a germ cell to have an effect on Neo-Darwinian evolution? Here the mutation rate is about one per billion nucleotides per replication. Suppose we run this experiment with a population of a billion bacteria. Then, in one generation, there will be an average of one change in a particular base. A particular double base change has a probability of one per quintillion, or 10^-18. To get one of these would take a billion generations, or about 100,000 years. To get a triple change would take 10^14, or a hundred trillion, years. That is why a long waiting time cannot compensate for a low mutation rate. I've given numbers here for a laboratory experiment with bacteria. Many more mutations would be expected worldwide. But the same kind of thing has to happen under NDT with multicelled animals as well. With vertebrates, for example, breeding populations seldom exceed a few thousand, so multicelled animals would have many fewer mutations than those cited above for bacteria.
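The back-of-envelope arithmetic above can be reproduced in a few lines. This is only a sketch of the post's own numbers: the mutation rate, population size, and implied generations-per-year figure are taken from the text, and the assumption that a k-base change has probability u**k is the post's naive independence assumption, not an endorsed model.

```python
u = 1e-9             # mutation rate per nucleotide per replication (from the post)
N = 1e9              # bacterial population size (from the post)
gens_per_year = 1e4  # implied by "a billion generations, or about 100,000 years"

def waiting_years(k):
    """Expected years until some individual carries a specific k-base change,
    treating the k sites as independent (probability u**k per replication)."""
    expected_hits_per_generation = N * u**k
    generations = 1.0 / expected_hits_per_generation
    return generations / gens_per_year

print(waiting_years(2))  # ~1e5 years, the post's "about 100,000 years"
print(waiting_years(3))  # ~1e14 years, the post's "hundred trillion years"
```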
Look, Carl, you're not discussing specifics; you're criticizing someone because of his background and his religion. I'm no Talmudic scholar, but if the arguments and reasoning below are sound, I don't care if they're Scientology-based. What, with your expertise, such as it is, do you find wrong with Spetner's reasoning? His book runs the numbers.

Contra your claims of Dawkins ignoring the subject, that citation refers to his The Blind Watchmaker, which is, incidentally, an outstanding book on the subject for anyone who'd like to learn more about how evolution works, and how we know that it does. It's accessible, straightforward, and unburdened by ideological commitments (the same cannot be said for Spetner's book, in which he claims that "The NREH, as an explanation of evolution, is in fact derivable from Talmudic sources.").
http://www.uncommondescent.com/intelligent-design/lee-spetner-responds-to-tom-schneider/

W&E were mistaken in thinking the evolutionary process to be an in-parallel one; it is an in-series one. A rare adaptive mutation may occur at one locus in the genome of a gamete of some individual, will become manifest in the genome of a single individual of the next generation, and will be heritable by future generations. If this mutation grants the individual an advantage leading to its having more progeny than its nonmutated contemporaries, the new genome's representation in the population will tend to increase exponentially, and eventually it may take over the population.
Let p be the probability that in a particular generation, (1) an adaptive mutation will occur in some individual in the population, and (2) the mutated genome will eventually take over the population. If both these should happen, then we could say that one evolutionary step has occurred. The mean number of generations (waiting time) for the appearance of such a mutation and its subsequent population takeover is 1/p. (I am ignoring the generations needed for a successful adaptive mutation to take over the population. These generations must be added to the waiting time for a successful adaptive mutation to occur.) After the successful adaptive mutation has taken over the population, the appearance of another adaptive mutation can start another step.
In L steps of this kind, L new alleles will be incorporated into the mean genome of the population. These steps occur in series and the mean waiting time for L such steps is just L times the waiting time for one of them, or L/p. Thus the number of generations needed to modify L alleles is linear in L and not logarithmic as concluded from the flawed analysis of W&E.
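The linear-in-L claim above is easy to see numerically. In this sketch, p and L are hypothetical values chosen so the simulation runs quickly; the text leaves them symbolic at this point.

```python
import random

p = 0.01  # hypothetical per-generation probability of one successful step
L = 50    # hypothetical number of evolutionary steps

# In-series model: steps happen one after another, so mean waiting times add.
waiting_one_step = 1 / p   # ~100 generations on average per step
waiting_L_steps = L / p    # ~5000 generations for L steps, linear in L

# Monte Carlo check that the geometric waiting time really averages 1/p.
random.seed(0)
trials = 5000
total = 0
for _ in range(trials):
    g = 1
    while random.random() >= p:  # draw generations until a success occurs
        g += 1
    total += g
print(total / trials)  # should be close to 1/p = 100
```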
The flaws in the analysis of W&E lie in the faulty assumptions on which their model is based. The “word” that is the target of the guessing game is meant to play the role of the set of genes in the genome and the “letters” are meant to play the role of the genes. A round of guessing represents a generation. Guessing a correct letter represents the occurrence of a potentially adaptive mutation in a particular gene in some individual in the population. There are K letters in their alphabet, so that the probability of guessing the correct letter is 1/K. They wrote that
1 − (1 − 1/K)^r
is the probability that the first letter of the word will be correctly guessed in no more than r rounds of guessing. It is also, of course, the probability that any other specific letter would be guessed. Then they wrote that
[1 − (1 − 1/K)^r]^L
is the probability that all L letters will be guessed in no more than r rounds. The event whose probability is the first of the above two expressions is the occurrence in r rounds of at least one correct guess of a letter. This corresponds to the appearance of an adaptive mutation in some individual in the population. That of the second expression is the occurrence of L of them. From these probability expressions we see that according to W&E each round of guessing yields as many correct letters as are lucky enough to be guessed. The correct guesses in a round remain thereafter unchanged, and guessing proceeds in successive rounds only on the remaining letters.
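The quoted expressions are straightforward to evaluate. A minimal sketch follows, with K and L chosen arbitrarily for illustration, since the text leaves them symbolic.

```python
def p_one_letter(K, r):
    # probability that a specific letter is guessed within r rounds
    return 1.0 - (1.0 - 1.0 / K) ** r

def p_all_letters(K, L, r):
    # W&E's in-parallel expression: all L letters guessed within r rounds
    return p_one_letter(K, r) ** L

# Illustration with arbitrary values: the probability rises quickly with r,
# which is the basis of the "plenty of time" conclusion the text disputes.
K, L = 20, 100
for r in (50, 100, 200):
    print(r, p_all_letters(K, L, r))
```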
Their model does not mimic natural selection at all. In one generation, according to the model, some number of potentially adaptive mutations may occur, each most likely in a different individual. W&E postulate that these mutations remain in the population and are not changed. Contrary to their intention, this event is not yet evolution, because the mutations have occurred only in single individuals and have not become characteristic of the population. Moreover, W&E have ignored the important fact that a single mutation, even if it has a large selection coefficient, has a high probability of disappearing through random effects [Fisher 1958]. They allow further mutations only in those loci that have not mutated into the “superior” form. It is not clear if they intended that mutations be forbidden in those mutated loci only in those individuals that have the mutation or in other individuals as well. They have ignored the fact that evolution does not occur until an adaptive mutation has taken over the population and thereby becomes a characteristic of the population. Their letter-guessing game is more a parody of the evolutionary process than a model of it. They have not achieved their second goal either.
Thus their conclusion that “there’s plenty of time for evolution” is unsubstantiated. The probability calculation to justify evolutionary theory remains unaddressed.
Fisher, R. A. (1958). The Genetical Theory of Natural Selection. Second revised edition, New York: Dover. [First published in 1930, Oxford]
Hoyle, F. and N. C. Wickramasinghe, (1981). Evolution from Space, London: Dent.
Wilf, H. S. & Ewens, W. J. (2010) There’s plenty of time for evolution. Proc Natl Acad Sci USA 107 (52): 22454-22456.
Now, what is your own challenge to each essay? What errors have you found?

I just became aware of Tom Schneider’s “response” to my objection to his criticism of my calculation of probability (go here for Schneider). I don’t know whether he can’t read or whether he has a mental block against admitting to criticism. He thinks that my probability p = 1/300,000 is the probability of an adaptive mutation. I clearly stated that it is the probability that a particular mutation will occur in a population and will survive to take over that population. He did not understand this clear statement and thought that I meant it to be the probability of a particular point mutation occurring in a given genome. (His comparison of this number with 10^-8 for the mutation rate in bacteria is irrelevant and indicative of his misunderstanding.) Evolution requires that a sequence of many of these occur, and for the example I chose, a sequence of 500 of them was necessary. I am going to go through this once more, and I hope he is listening carefully. Once the first adaptive mutation in the sequence has occurred and has taken over the population, a next one must occur, and then a next one, and so on, 500 times. The event consisting of the appearance of an adaptive mutation followed by natural selection of sufficient effect to take over the population is independent of subsequent events of the same character. Therefore the probabilities multiply, and the probability of the entire sequence occurring is 1/300,000 raised to the 500th power. The only way he can criticize this calculation is to distort what I am saying and claim I was calculating something else.
is not the issue; it's truly tertiary. The issue includes the specific failings that Lerner identifies (and his religious faith, or lack of same, doesn't concern me).

So, plasma cosmology is less well-funded than more mainstream theories, because it isn't as well-supported by the evidence as they are? This sounds to me like the scientific enterprise working as it should. Ideas and experiments which produce results and comport with the evidence are used as the foundation for further research, while those that don't fall by the wayside. What exactly is the problem?
And what predictions has Neil DeGrasse Tyson made that have been verified?

The so-called 'Schwarzschild solution' is not Schwarzschild's solution, but a corruption, due to David Hilbert (December 1916), of the Schwarzschild/Droste solution, wherein m is allegedly the mass of the source of a gravitational field and the quantity r is alleged to be able to go down to zero (although no proof of this claim has ever been advanced), so that there are two alleged 'singularities', one at r = 2m and another at r = 0. It is routinely asserted that r = 2m is a 'coordinate' or 'removable' singularity which denotes the so-called 'Schwarzschild radius' (event horizon) and that the 'physical' singularity is at r = 0. The quantity r in the so-called 'Schwarzschild solution' has never been rightly identified by the physicists, who, although proposing many and varied concepts for what r therein denotes, effectively treat it as a radial distance from the claimed source of the gravitational field at the origin of coordinates. The consequence of this is that the intrinsic geometry of the metric manifold has been violated. It is easily proven that the said quantity r is in fact the inverse square root of the Gaussian curvature of a spherically symmetric geodesic surface in the spatial section of the 'Schwarzschild solution' and so does not in itself define any distance whatsoever in that manifold. With the correct identification of the associated Gaussian curvature it is also easily proven that there is only one singularity associated with all Schwarzschild metrics, of which there is an infinite number that are equivalent. Thus, the standard removal of the singularity at r = 2m is, in a very real sense, removal of the wrong singularity, very simply demonstrated herein. This has major implications for the localisation of gravitational energy, i.e. gravitational…
Regards,

Doug: I am. I don't expect a new Dark Age, but neither did the Romans in the early 5th century. Everything and anything is possible, both on the upside and the downside – you don't live long and prosper by ignoring unsavory possibilities.
We've become accustomed, as a civilization, to rapid improvements in science, technology, and our general standard of living for roughly the last 200 years, since the start of the Industrial Revolution. It seems like a long time from one perspective. But it's only about eight generations, or the overlapping lives of two really old people. If you take a longer view, since biologically modern humans evolved perhaps 200,000 years ago, you see that progress was very slow. Maybe 100,000 years went by between the ability to make fire and the invention of the bow. Then maybe another 80,000 to the invention of pottery.
Maybe advances in technology are subject to periods of punctuated equilibrium, as is the evolution of species. Maybe the last 200 years of rapid progress are slowing down. It seems to me there were rapid advances in every area for that time – electricity, aircraft, telephony, atomic energy, and literally a thousand other things resulting from the systemization of science. Other than in computers, though, things seem to have slowed down over the last 50 years. I wonder if we're not just refining past breakthroughs more than making new ones. Living off of past inertia… I really don't know if that's an accurate view; I'm just considering possibilities.
L: So why are you an optimist, then?
Doug: Well, for one thing, as we discussed in our previous conversation on technology, I think it's a very important fact that there are more scientists and engineers alive now than there have been in all of history combined. That's an extraordinarily positive thing. But looking at the trivia many of them are working on, I don't get the impression there are that many Edisons, Teslas, and Einsteins out there. Let me put that in context… there are probably more, simply because it's a standard distribution, and there are more people. But maybe conditions aren't as conducive to their blossoming as was the case 100 years ago, and making the most of their abilities is harder in some ways – although it's easier in others, like the things made possible with the Internet.
In other words, it seems to me most geniuses in the past were entrepreneurs, working in their basements and garages. Today it seems most go to work for big corporations, or especially the government; those aren't environments conducive to game-changing breakthroughs. A lot of the science today seems to require multibillion-dollar investments; it seems to consume capital, as opposed to creating capital. For instance, NASA resembles the post office more and more every day. On the other hand you've got Burt Rutan's and Richard Branson's Virgin Galactic, and Elon Musk's SpaceX. But capital has to be available to fund things like that. And the losses people have incurred in Facebook and a protracted bear market may be a disincentive to put that capital together. Plus, the actions of governments – which are largely approved of by their subjects – all over the world are very destructive of capital, even if we don't get World War III.
This essay is worth a review:

There must be no barriers to freedom of inquiry. There is no place for dogma in science. The scientist is free, and must be free, to ask any question, to doubt any assertion, to seek for any evidence, to correct any errors.
See also:

Electricity is an immensely more powerful force than gravity, and far more complex in the ways it interacts with matter. Yet modern astronomy remains wedded to a belief in gravity as the dominant mover and shaper of the universe, and seeks to explain new observations in terms that conceptually go back hundreds of years. James Hogan describes an emerging alternative theory that recognizes the important role played by electricity on cosmic scales, offering explanations based on principles that are well understood and demonstrable in laboratories, without need of recourse to unobserved, untestable physics or speculative mathematical abstractions.
Humans have a wonderful ability for creating visions of ways to improve themselves, thereby making the world a better place; and then, it seems, for losing track somewhere along the way of turning the visions into reality.
Take the business of science, for instance. After several thousand futile years of fighting wars over whose revealed truth was really true, and attempts to impose truth by decree with the aid of rack and thumbscrew or deduce it via rigorous logic from self-evident premises that nobody could agree on, the idea finally emerged that a better way of finding out about the way things are in the world might be to stop fixating on how they ought to be, actually look at what's out there, and accept what it's telling you, whether you like it or not. It works pretty well with such questions as figuring out why cannon balls and planets move the way they do, what heat is, and other matters that can be decided beyond argument according to whether your motor starts or not, or if your plane gets off the ground – all of which rapidly become engineering. But when it comes to issues that aren't settled so easily – the meaning and origin of life; how the cosmos gets to be the way it is, and where it came from: areas where authority can still command and get away with it – things don't seem to have really changed that much. Powerful establishments enjoying political favor and monopoly privileges in teaching and promotion rigidify into orthodoxies defending their beliefs tenaciously, with dissenting views being dismissed, ridiculed, and marginalized, even when supported by what would appear to be verifiable fact and simpler arguments. In possibly an ultimate of ironies, in areas where hopes for science were at their highest, instead of showing the openness to alternatives and readiness to follow the evidence wherever it pointed that were supposed to characterize the new way of understanding the world, much of what we hear today seems to be taking on more the trappings of intolerant religion protecting dogma and putting down heresy.
More than ninety-nine percent of the observed universe exists in the form of matter known as plasma. In the atoms that make up the planet we live on, equal amounts of positive and negative electric charge are confined together and cancel each other out, resulting in objects like rocks and cabbages that are neutral on balance and hence "feel" only the force of gravity. Plasma, by contrast, consists, fully or in part, of charged particles – negative "electrons" and positive "ions" (an atom missing one or more of its electrons) – that are separated, and hence respond to electric and magnetic forces. The electric force between two charged particles, which can be attractive or repulsive, is thirty-nine orders of magnitude stronger than the gravitational attraction between them. That's a one followed by 39 zeros. Such a number boggles the imagination. It is in the order of a millionth of a millimeter compared to 10,000 times the size of the known universe. Even in a plasma comprising just one charged particle in 10,000 – which would be typical of the interstellar clouds of dust and gas from which stars are formed – electromagnetic forces will dominate gravity by a factor of ten million to one. Yet, conceptually, the prevailing view of the cosmos remains essentially rooted in the work of such names as Kepler, Newton, and Laplace, whose laws describe a mechanical universe made up of neutral bodies moving in a vacuum under the influence of gravity. And today's reigning cosmological model, founded on general relativity, is essentially a theory of geometry manifesting itself as gravity.
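The "thirty-nine orders of magnitude" figure above can be checked directly, because the 1/r² dependence cancels in the ratio of the Coulomb force to the gravitational force. The choice of an electron-proton pair below is my assumption; the text does not say which particles it compares.

```python
k_e = 8.9875517923e9     # Coulomb constant, N*m^2/C^2
G   = 6.67430e-11        # gravitational constant, N*m^2/kg^2
q   = 1.602176634e-19    # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg
m_p = 1.67262192369e-27  # proton mass, kg

# Both forces fall off as 1/r^2, so the ratio is independent of distance.
ratio = (k_e * q**2) / (G * m_e * m_p)
print(f"{ratio:.2e}")  # about 2.3e39: thirty-nine orders of magnitude
```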
Gravity-based models were reasonable two hundred and more years ago, when Newtonian dynamics was shown to predict precisely the motions of the Solar System. The plasma that permeates interplanetary space was unknown, along with its ability to organize spontaneously into isolating sheaths that, under stable and tranquil conditions like those prevailing in our locality at the present time, screen planets from electrical forces. And not a lot was understood about electricity in any case. But more recent advances in observational astronomy have revealed phenomena that do not lend themselves readily to explanation in familiar gravitational terms. Pulsars – rapidly varying stellar objects conventionally interpreted as spinning neutron stars – have now been measured to fluctuate at rates that call into question even the power of postulated neutron matter to hold together. Quasars, if accepted in accordance with the customary reading of red-shift as being the most distant objects known, radiate energy with intensities that defy explanation by any process involving conventional matter. The way galaxies rotate, and their violent ejections of jets of matter, do not conform to expectations based on gravity. To account for these and other anomalies, such speculative devices as "dark matter" – at the last count numbering seven different varieties – "dark energy," matter collapsing into black holes, and similar exotic mechanisms that have never been observed are introduced to make the theory fit the facts.
Seeking to explain new findings in familiar terms is natural and represents a desirable economy of thought. Models that have become standard were not lightly arrived at and should not lightly be cast aside. However, as was seen with the ever-more elaborate systems of epicycles contrived to keep the Ptolemaic system alive for long after a change of thinking was called for, such conservatism can be taken too far. There comes a point where, "We don't need another theory, because the one we've got can be made to fit the data," is saying more about human inventive ingenuity than the accuracy of the theory.
Over the last two hundred years an enormous amount has been learned about electricity. Technology has gone from Faraday motors and hand-cranked Wimshurst machines to supercomputers and satellite communications. In parallel with these advances, electrical theorists have developed an alternative paradigm for interpreting astronomical observations, based on principles that are well understood and can be demonstrated in any electrical or plasma laboratory. It requires none of the esoteric physics or ad-hoc inventions that the mainstream has had to resort to repeatedly when new observations failed to match expectations, or were never anticipated at all, and it is proving to be more powerful predictively. Proponents refer to it as Electric Universe theory. Its basic premise is that what we're seeing when we point telescopes at new stars being born or violently energetic events deforming distant galaxies are not results of gravity being intensified unimaginably and behaving in strange and unheard-of ways, but electricity. Where electrical forces are operating, gravity effectively ceases to exist. A tiny magnet will snap a nail up effortlessly against the gravitational pull of the entire Earth. You don't have to keep your coffee pot below the wall outlet to enable the electrons to fall down through the cord...
Some suggested web sites for further information on the Electric Universe:
In other words, physics is inherently limited. Even if the Electric Universe were to be accepted, it would not be the final answer to the great questions; for some of you, the quest will continue: gnōthi seauton.

Pace the establishment, there is a great deal that human beings don't know, and even what we do know is limited by our human perspective and the limitations of our knowledge. This is discussed very well in the brilliant interpretative essay by Mark Kremer in this book, which you can preview on Amazon: Plato and Xenophon: Apologies.
[The natural scientists] claim to have knowledge of the macroscopic and the microscopic, and consider this knowledge of the whole or comprehensive knowledge. Socrates does not deny that it would be impressive to have this knowledge, but he denies that either they or he has it. No one has this knowledge of the whole, because it is impossibly founded upon a misconceived notion about the whole... [the scientist] cannot comprehend the universe insofar as he is related to it, because the human is outside of his view.
The blindness to what is human is itself a human defect requiring explanation. Natural science is above all a human activity, through which the natural scientist practices a form of self-forgetting. He finds security in his dedication to knowledge, but he has not examined the meaning of this dedication... He practices a form of asceticism whereby he tries to avoid facing his own existence by subordinating himself to science.

The whole is inextricably linked to human beings and how they live, and a failure to explain this relation is a failure to examine the whole. Because the scientist lacks self-knowledge, he lacks knowledge of the whole.
The self-forgetting of the scientist is paradoxically a form of self-sacrifice and therefore, contains a religious instinct. But this instinct is in contradiction with science itself. His life and his science are in contradiction, and his unwillingness to face it is the mark of his ignorance.
Hello, Mr. Geek, Spetner addressed this and I posted his response. See above:

paidgeek said:
http://www.lecb.ncifcrf.gov/~toms/paper/ev/AND-multiplication-error.html
And using mathematics, Mr. Geek, if you do digital work on film, what do you find incorrect with any of the points in Dr. Berkovich's challenges to explain life? His paper can be viewed from the link below.

I just became aware of Tom Schneider’s “response” to my objection to his criticism of my calculation of probability (go here for Schneider: http://www.lecb.ncifcrf.gov/~toms/paper/ev/AND-multiplication-error.html). I don’t know whether he can’t read or whether he has a mental block against admitting to criticism. He thinks that my probability p = 1/300,000 is the probability of an adaptive mutation. I clearly stated that it is the probability that a particular mutation will occur in a population and will survive to take over that population. He did not understand this clear statement and thought that I meant it to be the probability of a particular point mutation occurring in a given genome. (His comparison of this number with 10^-8 for the mutation rate in bacteria is irrelevant and indicative of his misunderstanding.) Evolution requires that a sequence of many of these occur, and for the example I chose, a sequence of 500 of them was necessary. I am going to go through this once more, and I hope he is listening carefully. Once the first adaptive mutation in the sequence has occurred and has taken over the population, a next one must occur, and then a next one, and so on, 500 times.
The event consisting of the appearance of an adaptive mutation followed by natural selection of sufficient effect to take over the population is independent of subsequent events of the same character.
Therefore the probabilities multiply, and the probability of the entire sequence occurring is 1/300,000 raised to the 500th power. The only way he can criticize this calculation is to distort what I am saying and claim I was calculating something else.
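Taken at face value, the product (1/300,000)^500 underflows ordinary floating point, so reproducing the arithmetic as stated (whether the independence assumption behind it is justified is exactly what the critics dispute) is easiest in log space:

```python
import math

p_step = 1.0 / 300000  # the probability per step, as stated in the post
steps = 500            # length of the sequence in the post's example

# p_step**steps would underflow to 0.0 in double precision, so use log10.
log10_total = steps * math.log10(p_step)
print(log10_total)  # about -2738.6, i.e. the product is ~1e-2739
```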
And Spetner has recently written after your old challenge, which he addressed; see my post above.

The information contained in the genome is insufficient for the control of organism development. Thus, the whereabouts of the actual operational directives and workings of the genome remain obscure. In this work, it is suggested that the genome information plays the role of a "barcode".
"One of the profoundest enigmas of nature is the contrast of dead and living matter" (Weyl, 1949). Envisioning living organisms as machines governed by laws of physics and chemistry raises the sacramental question of whether "living matter" possesses some properties which are not inherent to "dead matter". Anyway, why and how does the change in the behavior of dead and living matter occur so abruptly?

The core of the enigma of living matter lies in the origin of information control. The problem is how biological objects acquire guidance through the life cycle as they emerge from and degrade into non-existence. In this paper, the organization of biological information processing is approached purely in terms of “engineering design”. The everlasting debates on delicate points of the origin and true meaning of Life present a separate issue.
[My comment: Again, is your background digital? Are you aware of the problems due to the amount of data required?]
Contemporary science firmly rests upon the conviction that Life is a mere by-product on top of material processes. The life cycle of a biological organism is considered as a sequence of transitions from one molecular configuration to another. This outlook is problematic in many respects. Particularly confusing is the fact that the structural complexity of the genome is insufficient for organism development.

The deficit of genome information is supposed to be compensated via “interaction with the environment”. In our suggestion, the information in the genome does not play its traditional role of data or instructions. Instead, the information contained in the DNA macromolecules presents a pseudo-random number (PRN), like, for example, a barcode. The structure of the DNA is not a primitive carrier of fixed information resources but an identification key that ensures a unique specification of an organism within a broad taxonomy. Using the DNA “barcode” as a key for wireless communications by means of the Code Division Multiple Access (CDMA) technique, biological organisms acquire access to incomparably richer information processing facilities.

How can the resemblance of a child to the father be conveyed through the information-deficient genome? The “barcode” interpretation of DNA simply indicates that the functioning of biological objects cannot be based on material configurations. To understand the phenomenon of Life it is necessary to consider the informational infrastructure underlying the material world.
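For readers who have not met it, the CDMA technique the paper leans on can be shown in toy form: a pseudo-random ±1 "chip" sequence serves as an identification key, and correlating a shared channel against a key recovers only that key's message. Everything below (names, code length, messages) is my own illustrative invention, not anything from the paper.

```python
import random

CHIPS = 128  # chip-sequence length; longer codes mean lower cross-talk

def make_key(seed):
    # a pseudo-random +/-1 code acts as the identification key
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(CHIPS)]

def spread(bits, key):
    # each data bit (+1/-1) is multiplied by the entire chip sequence
    return [b * c for b in bits for c in key]

def despread(signal, key, nbits):
    # correlate each chip-length chunk against the key; the sign recovers the bit
    out = []
    for i in range(nbits):
        chunk = signal[i * CHIPS:(i + 1) * CHIPS]
        corr = sum(s * c for s, c in zip(chunk, key))
        out.append(1 if corr > 0 else -1)
    return out

key_a, key_b = make_key(1), make_key(2)
msg_a, msg_b = [1, -1, 1], [-1, -1, 1]

# both transmissions share one channel by simple superposition
channel = [x + y for x, y in zip(spread(msg_a, key_a), spread(msg_b, key_b))]

print(despread(channel, key_a, 3))  # recovers msg_a
print(despread(channel, key_b, 3))  # recovers msg_b
```

The design point is that random codes are nearly orthogonal: the cross-correlation between two independent keys is small compared with a key's correlation with itself, so each receiver hears only its own message.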
The idea that Life is associated with immaterial information processes in the Universe has been around, in one form or another, throughout the history of human civilization. In the suggested construction
this idea is linked to a cellular automaton model of the physical world. The basic pillars of this
model - hardware componentry and software architecture - are outlined in Appendices A and B.
Appendices A and B address general problems: how the “barcode” functionality affects the view of the physical Universe and what changes it incurs in the computational scheme of biological information processing. The main body of the paper concentrates on issues of the “barcode” functionality of the DNA that are of immediate bio-medical concern.
The groundwork theses
1. The “barcode” functionality of the DNA gives the genome an operational meaning.
2. The amount of information involved in biological information processing is enormous.
[My observation: Hence, if you work in the industry, consider the amount of data needed to capture a single frame of film. How much more data is contained within a single cell!]
3. Biology must comply with “the basic law of requisite variety”, which says that achieving appropriate selection “is absolutely dependent on the processing of at least that quantity of information. Future work must respect this law, or be marked as futile even before it has been started.”
4. Investigation of biological information processing must rely on the methodology of engineering design. There is a limit to the complexity that anything can sustain; the design must be done with “economy and elegance”.
5. Pure technicalities in the implementation of information processes in the physical Universe should not be mixed up with philosophical and metaphysical questions of a higher order.
6. Problems such as how memories are stored in the brain are not likely to be affected by the
discovery of the final theory in physics (Weinberg, 1992). Understanding of biological
information processing will come with the revitalization of the concept of the ether (Wilczek,
1999; Davis, 2001).
The foundations of physics have to be revised in conjunction with the involvement of information:
1). The material structure of the DNA molecules does not contain enough variety to serve
as a repository of control directives for living organisms (Claverie, 2001).
[Again, is your background information and data? Both Spetner and Berkovich, approaching life from similar backgrounds, identify problems.]
2). The existing picture of information pathways in the physical Universe is incomplete.
Information impact of quantum entanglement behind material processes spreads at
least 10^7 times faster than light (Seife, 2000).
3). Modern cosmology does not care about information structures in the Universe.
Instead, the bulk of the Universe (95%) is supposed to be filled with the unstructured stuff of “dark matter” and “dark energy” (Cowen, 2001).
7. A. Einstein stated: “Someday we'll understand the whole thing as one single marvelous
vision that will seem so overwhelmingly simple and beautiful that we will all say to each
other -- Oh, how could we have been so stupid so long? How could it have been otherwise?”
Ironically, Einstein’s concept of general relativity is the primary barrier on the way to an information-dominant Universe.
8. The major instrument in the advancement of knowledge is Experimentum Crucis - a crucial
experiment that demonstrates a clean fact negating a contender theory. Confirming evidence does not assure the logical correctness of a scientific theory; the moment of truth comes through negation. The decisiveness of an Experimentum Crucis in natural science can be compared
to that of a counter-example in mathematics and an alibi in jurisprudence.
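An aside on thesis 3: the “law of requisite variety” is a genuine result from W. Ross Ashby's cybernetics, and it can be stated quantitatively: a regulator can destroy only as much variety as it itself possesses. A minimal sketch in Python (the function names and example counts are mine, purely illustrative):

```python
import math

def min_outcome_variety(disturbances: int, responses: int) -> int:
    """Ashby's law of requisite variety: a regulator commanding R distinct
    responses can at best collapse D distinct disturbances into
    ceil(D / R) outcome classes; "only variety can destroy variety"."""
    return math.ceil(disturbances / responses)

def residual_bits(disturbances: int, responses: int) -> float:
    """In information terms: regulation removes at most log2(R) bits of
    disturbance entropy, so at least log2(D) - log2(R) bits remain."""
    return max(0.0, math.log2(disturbances) - math.log2(responses))

print(min_outcome_variety(1024, 16))  # 64: best achievable outcome variety
print(residual_bits(1024, 16))        # 6.0 bits of variety survive regulation
```

This is the precise sense in which thesis 3 says that appropriate selection depends on processing at least a matching quantity of information.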
Again, I've given those who have an interest in exploring other avenues the means to seek answers, which are necessarily incomplete.

One must understand that at the heart of NDT lies chance and randomness.
Mutations are random events. The occurrence of a beneficial mutation at any given time in any given population is governed by chance.
Even natural selection, which carries the burden of being the directive force of evolution, is subject to the laws of chance. Selection coefficients are average values. What happens in any particular instance is a random event. A mutation, even one that confers adaptive benefit on the organism, is likely to be wiped out by chance events (see Chapter 3 of my book).
There is a good chance that it will disappear before it can take over the population. The question is not if it can happen, but with what probability it will happen.
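Spetner's point here is, in fact, textbook population genetics: Haldane showed in 1927 that a new mutant conferring selective advantage s escapes random loss with probability of only about 2s. A minimal branching-process simulation illustrating the claim (the parameter values and helper names are mine, purely illustrative):

```python
import math
import random

def poisson(lam):
    """Sample a Poisson(lam) offspring count (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def lineage_survives(s, generations=100, safe_size=500):
    """Follow copies of a new mutant whose carriers leave Poisson(1 + s)
    offspring on average; report whether the lineage escapes early loss."""
    copies = 1
    for _ in range(generations):
        if copies == 0:
            return False       # lost to drift despite its advantage
        if copies >= safe_size:
            return True        # a large lineage is effectively safe
        copies = sum(poisson(1 + s) for _ in range(copies))
    return copies > 0

random.seed(1)                 # reproducible run
s = 0.1                        # a strongly beneficial mutation
trials = 2_000
survived = sum(lineage_survives(s) for _ in range(trials))
# Haldane's approximation predicts survival with probability ~ 2s = 0.2,
# i.e. even this mutation is lost roughly 80% of the time.
print(survived / trials)
```

The simulation ignores finite population size and much else; it is only meant to show that frequent chance loss of beneficial mutations is an uncontroversial, quantifiable part of the standard theory rather than an objection external to it.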
NDT is a theory that is supposed to account for the natural development of all life from a simple beginning. I don’t know why we need such a theory, because the development of life from a simple beginning is not an observable.
The theory is gratuitous; it comes to account for something that was never observed.
Actually, evolutionary thinking goes like this.
One observes present life.
One then assumes that it arose in a natural way.
One then concocts a theory (e.g., the NDT) to account for the observation, given the assumption.
I suppose that if the theory were really a good one, and could really explain well how life could have developed in a natural way, it would lend some credence to the assumption that life did indeed develop in a natural way. But it is not a good theory, and it does not account for what it is supposed to. Evolutionists, realizing this, have lately been reduced to arguing that if no one has a better theory that can account for the natural origin of life, then one must accept NDT. As you will see from some of Max’s comments below, he also adopts this approach. I don’t know why NDT merits the pedestal on which evolutionists have put it.
There are terribly bright, well-educated people with amazing backgrounds who see the limitations of current 'science'. All I'd advise is for those who are curious to consider that there are challenges to orthodox positions; that is how true science is supposed to work. The answer does not have to be Metaphysical; the problem of what life is remains unsolved. I've provided some challenges and additional questions. Spetner questions, and does not provide a solution. Berkovich provides a potential answer. I don't think there are good challenges to the questions they have raised, or alternate answers using current theories. Other readers may investigate further via the links provided.

It's always very telling when the so-called scientists resort to wishful thinking and ideological propaganda...
Leakey is letting the atheist evolutionary cat out of the bag here. Unlike the likes of Harris, whose revolutionary Enlightenment 2.0 globalism is never advertised and can only be confirmed by carefully reading through his books, Leakey is quite willing to draw the connection between evolution, atheism, and multiculturalism, all intended to lead toward the long-term utopian fantasy of rule by a scientific and technocratic global oligarchy.
My prediction is quite the opposite. I am increasingly convinced that genetic science will render the Neo-Darwinian Synthesis scientifically unviable in the same manner it previously required the development of the synthesis by rendering untenable classic fossil-based Darwinian evolution by natural selection. One thing that has escaped most professional biologists, who are neither historians of science nor logicians, is that the increasing complexity of the DNA/RNA interplay along with growing understanding of mutations renders the present evolutionary timelines increasingly improbable. Whereas the decoding of the human and other genomes was supposed to provide not only answers, but even conclusive proof of macroevolution, it has instead raised considerably more questions. And while the growing number of proposed evolutionary mechanisms are not necessarily proof that macroevolution has not happened in the past and is not happening in the present, they do show the need to develop epicycles that is always indicative of a theory that is in trouble and on its way to being falsified and ultimately jettisoned.
Could I be incorrect? Of course. That is why I describe myself as an evolutionary skeptic rather than an anti-evolutionist. But once again, we see a conflict between pattern recognition and scientific consensus, and I expect that, as has usually happened before, pattern recognition will win out, because scientific consensus is not always science; it is often logical conclusions drawn from science by scientists. And the history of science shows that scientists are, for the most part, inept logicians, which is why they tend to keep making the same type of mistakes with each new generation of scientists. So I am quite comfortable asserting, contra Leakey, that in 15 years skepticism over evolution will not only not be history, but will be both more popular and more scientifically credible than it is now.
It's sad the topic is contentious; let me for the sake of argument agree with you. Dawkins wrote in his book, The Blind Watchmaker, about an immortal space alien playing bridge:

paidgeek said:
If I want to know how something works, I ask someone who studies that topic as a profession. Asking a mathematician for answers about organic systems makes about as much sense as asking a biologist to explain quantum physics.
Dawkins is therefore writing about mathematics, probability. Now, Spetner, a physicist, is the one with knowledge about mathematics. He responds:

They will expect to be dealt a perfect bridge hand from time to time, and will scarcely trouble to write home about it when it happens.
As to physics, see here:

He's wrong. One can easily calculate the chance of Dawkins' alien experiencing a perfect bridge hand at least once in his lifetime. The chance of getting such a hand in one deal is 4.47 × 10^-28. If the alien plays 100 bridge hands every day of his life for 100 million years, he would play about 3.65 × 10^12 hands. The chance of seeing a perfect hand at least once in his life is...about one in a quadrillion...Would he bother to write home about it?
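Spetner's arithmetic here is checkable. His 4.47 × 10^-28 is the probability of a perfect deal, in which each of the four players receives a complete suit (4! favourable suit assignments out of 52!/(13!)^4 distinguishable deals). A quick sketch reproducing his figures:

```python
import math

# Probability that a single deal gives every player a complete suit:
# 4! favourable suit assignments, out of 52! / (13!)^4 possible deals.
deals = math.factorial(52) // math.factorial(13) ** 4
p_perfect_deal = math.factorial(4) / deals
print(f"{p_perfect_deal:.3g}")   # 4.47e-28, matching Spetner's figure

# 100 hands a day for 100 million years:
hands = 100 * 365 * 10**8        # 3.65e12 hands in total
# With n * p << 1, P(at least one) ≈ n * p; computing 1 - (1 - p)**n
# directly would underflow in double precision.
p_lifetime = hands * p_perfect_deal
print(f"{p_lifetime:.3g}")       # 1.63e-15: on the order of one in a quadrillion
```

So the numbers in the quoted passage are internally consistent for a perfect deal; the objection raised later in the thread concerns the far larger probability of a single player being dealt one perfect hand, and the separate issue of the alien's lifespan.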
Stephen Hawking (correctly for once) declares in his latest book, “Philosophy is dead.” But so is modern physics, and for the same reason, although the corpse refuses to lie down. Kant’s influence has morphed into the oxymoronic “thought experiment.” Science has become surreal and illogical with the sainted Einstein as its exemplar and holy relic. A return to classical natural philosophy is urgently needed to restore sanity.
This paper is short and concise; give it a try:

DaveF said:
That's a curious site. It reads like the author is proposing significant new theories. But I don't see any science or math; nothing beyond armchair philosophizing and pop-sci editorial. That's fine if you're a Brian Greene, writing science for the masses. But this fellow doesn't indicate he has any background from which to produce explanations for the masses.
But maybe I missed it, I didn't look hard.
As to Thornhill, see here: http://www.holoscience.com/wp/the-ieee-plasma-cosmology-and-extreme-ball-lightning/ and here: http://ieeevic.org/events/getdetails.php?id=412

The following article is by Jeremy Dunning-Davies, Senior Lecturer in Physics at the University of Hull and member of the Royal Astronomical Society and Natural Philosophy Alliance.
Notable Presidents of IEEE and its founding organizations include Elihu Thomson (AIEE, 1889–1890), Alexander Graham Bell (AIEE, 1891–1892), Charles Proteus Steinmetz (AIEE, 1901–1902), Lee De Forest (IRE, 1930), Frederick E. Terman (IRE, 1941), William R. Hewlett (IRE, 1954), Ernst Weber (IRE, 1959; IEEE, 1963), and Ivan Getting (IEEE, 1978).
More on Mr. Thornhill:

This is a report on a few aspects of the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Plasma Science (ICOPS 2006), held in Michigan earlier this month. The IEEE is the world’s leading professional association for the advancement of technology, with more than 365,000 members. The labours of these large numbers of professionals have driven technological progress in the twentieth century. Their success has often been equated with scientific progress, which has allowed the stagnation in the hard sciences to be overlooked. It is engineers who have made space exploration possible, and their precision probes and navigation skills have returned data that routinely surprises space scientists. After each surprise the scientists scuttle back to their drawing boards, but they only touch up the old picture. Perhaps it is time for engineers to bring new concepts to the drawing board.
Members of the IEEE Nuclear and Plasma Sciences Society began to show the way to a new understanding of the universe several decades ago. Their practical experience with plasma, the stuff from which almost the entire visible universe is composed, contrasts strongly with the purely theoretical approach of astrophysicists. Astrophysicists need to invent black holes, dark matter, strange matter and dark energy simply to salvage their theoretical models based on big bang assumptions and the puny force of gravity. Their language has lost touch with the newly perceived reality.
A Melbourne University physics graduate and natural philosopher, Mr. Thornhill has spent decades questioning popular ideas about the physical world. In the past 15 years, standing on the shoulders of noted predecessors, he has laid an interdisciplinary foundation for the Electric Universe paradigm.
He has peer-reviewed papers in the IEEE Transactions on Plasma Science and the Open Astronomy Journal and he received a gold medal in 2010 from the Telesio-Galilei Academy of Science for his work on the Electric Universe.
Aaron Silverman said:
Spetner's comment on the Bridge-playing alien is just more nonsense. He arbitrarily assigns the alien a 100-million-year lifespan, when the original proposition clearly stated that the alien is immortal. Not to mention the fact that his math is wrong -- the chance of being dealt a perfect Bridge hand is nowhere near that unlikely.
Speaking of nonsense, is that Holoscience website supposed to be serious, or is it just an overly complicated gag? "I don't understand astrophysics; therefore it's all false!
LA LA LA LA MARY HAD A LITTLE LAMB."
I'd suggest discovering what it means to be a good human being (see Plato references above) and forget about 'black holes', Darwin, et al. Altogether a much harder yet worthier prospect, in the event!

"How can an otherwise sane individual become so enamored of a fantasy, an imposture, that even after it’s exposed in the bright light of day, he still clings to it – indeed, clings to it all the harder? No amount of logic can shatter a faith consciously based on a lie."
~ Lamar Keene, a scam artist who posed as a psychic, describing why it was so easy to fleece people.