Originally posted by Diophantus
Agree with everything you said; at extremely high speeds and negligible masses, Newton's Laws do indeed break down and Einstein's theories are much more applicable. But then again, I was speaking in general, of the practical and typical scenarios we constantly experience, where Newton's Laws are more than sufficient. I don't think we have reached the stage where we can throw around quantum physics jargon so easily to describe and explain processes; maybe one day in the future... Since you bring it up, here's a list of scientific and technological advances which have had a major impact on our lives, society and economies, or are expected to have one:
You've got that round the wrong way. Newton called his theories laws and would have insisted that he had proved their truth. I don't think he used the word theory once in the Principia, although it might be difficult to tell considering he wrote in Latin.
So Newton's Laws started life as laws of nature, undisputed truth if you like. Then people noticed ...[text shortened]... ian surface for instance. But scientists don't think of them as absolute truth now.
Stem Cell R&D
Bio-mechanics\Bionics
Contraception and Infertility solutions
The Human Genome project
The Internet\ E-Business
The Laser\ Fiber Optics
The Microchip\ Nanotechnology
The MRI scanner\ Digital Revolution
Quantum Physics R&D
Alternative Energy R&D
I don't know what rock all these science haters have been hiding under, but I think they should all be a little more appreciative... At the very least, for well-established theories and technologies we can actually provide evidence whenever required to support us. Can that be said of the opposition?
Originally posted by Iere man
I don't think a discussion of the place of proof in science has anything to do with hating science. I am a scientist by trade and accept the limitations of science, but I do not hate it.
Agree with everything you said; at extremely high speed and negligible masses, Newton's Laws do indeed break down and Einstein's theories are much more applicable. But then again, I was speaking in general and of the practical and typical scenarios we constantly experience and where Newton's Law are more than sufficient. I don't think we have reached the st ...[text shortened]... provide evidence whenevr required to support us. Can that be said of the opposition?
I think the word proof is much misunderstood by non-scientists. There seems to be some notion that scientists produce proofs of mathematical rigour by a process similar to that of a court of law (evidence presented, etc.). This mixes up the two sorts of proof and causes a great deal of trouble.
Originally posted by huckleberryhound
Neanderthals became extinct, and finally Huck has provided us with the reason. Who would ever put up with women if they had a vagina on their foot?
"proof has no place in science"
This statement was posted in a reply to me in a forum.
Am I going mad, or is this the most stupid thing I've read in a long time? Someone please tell me this doesn't make any sense.
My response to this was....
"I have to ask....are you a troll?
I've never used the word "proof" other than in quoting you v ...[text shortened]... rse of brown dogs
As no proof is required, these are obviously scientific in nature.
Originally posted by Iere man
I haven't seen any 'science haters' in this thread? Well, maybe freaky, but that's still just one.
Agree with everything you said; at extremely high speed and negligible masses, Newton's Laws do indeed break down and Einstein's theories are much more applicable. But then again, I was speaking in general and of the practical and typical scenarios we constantly experience and where Newton's Law are more than sufficient. I don't think we have reached the st ...[text shortened]... provide evidence whenevr required to support us. Can that be said of the opposition?
And dude, you've got so many gross errors there (again) that I think you should calm down and think before you write. Half of what you wrote was in gross violation of the disciplines you're trying to 'defend'.
Originally posted by wormwood
I hate the playas, not the game.
I haven't seen any 'science haters' in this thread?? well, maybe freaky, but that's still just one.
and dude, you've got so many gross errors there (again) that I think you should calm down and think before you write. half of what you wrote was in gross violation with the disciplines you're trying to 'defend'.
You, however, appear to hate the game of punctuation.
I'm just saying.
Originally posted by wormwood
Are you speaking in general too? I have given quite a few examples, and according to you half of what I wrote was in violation of something I am defending... yet you cannot specify one...
half of what you wrote was in gross violation with the disciplines you're trying to 'defend'.
Originally posted by Palynka
Good for you... you would have failed in any event if you had attempted to do so... broken English and punctuation errors, yeah, but I couldn't care less about that; my point is very clear otherwise...
You write like a rambling lunatic. Arguing with you would be foolish of me.
Pass.
Originally posted by Diophantus
A good example:
You've got that round the wrong way. Newton called his theories laws and would have insisted that he had proved their truth. I don't think he used the word theory once in the Principia, although it might be difficult to tell considering he wrote in Latin.
So Newton's Laws started life as laws of nature, undisputed truth if you like. Then people noticed ...[text shortened]... ian surface for instance. But scientists don't think of them as absolute truth now.
Newtonian summation of velocities (u and v to get s):
s = v + u
After observations by Michelson, Morley, Lorentz and Einstein, you will get this:
s = (v+u) / (1 + (vu / c^2))
Of course, at low speeds (small fractions of the speed of light), the denominator part becomes very close to 1 and you will get the first equation.
Therefore, Newton's "laws" aren't entirely accurate, although they will do for most people's purposes.
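To make that concrete, here is a minimal sketch (my own illustration, not from the original post; the numbers are made up) that evaluates both formulas with the same u, v and s as above:

```python
# Minimal sketch (not from the original post): compare Newtonian and
# relativistic addition of collinear velocities u and v.

C = 299_792_458.0  # speed of light, m/s

def newtonian_sum(u, v):
    """Galilean/Newtonian velocity addition: s = v + u."""
    return v + u

def relativistic_sum(u, v):
    """Einstein's velocity addition: s = (v + u) / (1 + vu/c^2)."""
    return (v + u) / (1 + (v * u) / C**2)

# Everyday speeds: the two answers are indistinguishable in practice.
u, v = 30.0, 25.0  # m/s, roughly highway speeds
print(newtonian_sum(u, v))     # 55.0
print(relativistic_sum(u, v))  # 54.99999999999954 (differs only far past the decimal point)

# Relativistic speeds: Newton's formula exceeds c, Einstein's never does.
u, v = 0.8 * C, 0.8 * C
print(newtonian_sum(u, v) / C)     # 1.6 (faster than light, impossible)
print(relativistic_sum(u, v) / C)  # ~0.976
```

At everyday speeds the correction term vu/c^2 is of the order of 10^-15, which is exactly why the Newtonian formula is good enough for most purposes.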
EDIT: I also need to point out that Isaac Newton, although a genius, was quite arrogant, so it is no surprise that he wouldn't have been happy to consider what he said as anything but exact.
Originally posted by Iere man
Yes, it's clear you have no clue about scientific research, its methods or the logic behind them, yet you beat your chest and call others 'noobs'.
good for you...you would have failed in any event if you attempted to do so...broken english and punctuation errors yeah but i can care less about that, my point is very clear otherwise...
The application of scientific knowledge is a statistical endeavour. To say that a bridge can support X kilos means that X is a lower bound on a confidence interval (sometimes a buffer is added heuristically, but that's an informal way of increasing the confidence level). It does not mean that at exactly X kilos and only at X kilos will the bridge collapse.
So why am I telling you this? Because this means that applications only require good enough approximations, not exact truth. And that's exactly what scientific research does, it finds the best possible approximation and never, ever concludes that no better one can ever be found because what we have is the Truth.
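A minimal sketch of what that looks like in practice (my own illustration with made-up test data, not something from Palynka's post): the rated capacity comes out of a lower confidence bound plus a heuristic safety buffer, never an exact "truth":

```python
# Minimal sketch (hypothetical data): rating a bridge member from load tests.
# "Supports X kilos" is read as a lower confidence bound on the measured
# failure loads, optionally shrunk further by a heuristic safety factor.
import math
import statistics

failure_loads_kg = [10120, 9980, 10250, 10050, 9890, 10180]  # made-up test results

mean = statistics.mean(failure_loads_kg)
std_err = statistics.stdev(failure_loads_kg) / math.sqrt(len(failure_loads_kg))

# One-sided lower bound, roughly 97.5% confidence under a normal approximation.
lower_bound = mean - 1.96 * std_err

safety_factor = 1.5  # the informal "buffer" mentioned above
rated_capacity = lower_bound / safety_factor

print(f"lower bound = {lower_bound:.0f} kg, rated capacity = {rated_capacity:.0f} kg")
```

Collect more or better test data and the bound tightens, which is exactly the sense in which a better approximation can always be found.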
Originally posted by Palynka
You're too good for your own good, you know... What you're saying above is common knowledge, nothing new; every scientist\engineer knows that 'exact truth' is impossible to realize when applying theory. But what benefit does the 'exact truth' bring when the 'good enough and best approximations' meet or exceed the application requirements for all PRACTICAL purposes? This is why we have tolerances and %error ranges specified when designing and engineering systems. The point is that the approximations might as well be regarded as the truth, because the difference between the two is quite negligible... Nevertheless, the exact truth CAN be and usually is provided; exact variable quantities are calculated because of course we have to use math\calculus, which accommodates IDEAL cases...
...Because this means that applications only require good enough approximations, not exact truth. And that's exactly what scientific research does, it finds the best possible approximation and never, ever concludes that no better one can ever be found because what we have is the Truth.
Originally posted by lausey
Maybe there are not as many science haters after all... I'd just like to clarify a few points, however. With the exception of the research taking place at the LHC and the work done by nuclear physicists, Einstein's and other quantum physics theories are still at the infancy stage, because experimentation is very difficult at these microscopic levels and scientists are physically limited at the moment in some areas, as Diophantus has stated. We should soon be making progress with that, however, and it's just a matter of time before technologies are developed from it or it can be easily applied to situations we would routinely encounter. Until then we have Newton's Laws, which work perfectly fine for everything in between the extreme speeds and masses and which, IMO, will stay even as quantum physics theories eventually become mainstream...
Therefore, Newton's "laws" aren't entirely accurate, although it will do for most people's purposes.
Originally posted by Iere man
You're too good for your own good you know...What you saying above there is common knowledge nothing new; every scientist\engineer knows that 'exact truth' is impossible to realize when applying theory, but what benefit does the 'exact truth' accomplish when the 'good enough and best approximations' meet or exceed the application requirements for all PRACTI ...[text shortened]... ted because of course we have to use math\calculus which accomodates for IDEAL cases...
every scientist\engineer knows that 'exact truth' is impossible to realize when applying theory
.Nevertheless the exact truth CAN and usually is provided, exact variable quantities are calculated because of course we have to use math\calculus which accomodates for IDEAL cases...
Seriously, you don't see the contradiction here? How do you validate a theory as proven truth if every application (including the empirical tests used to check the theory) has a margin of error?
approximations might as well be regarded as the truth because the difference between the two is quite negligible...
Not in science, which is what we're talking about. Researchers can always potentially improve on that approximation.
Originally posted by Palynka
So much for the pass, I guess... 😕
every scientist\engineer knows that 'exact truth' is impossible to realize when applying theory
.Nevertheless the exact truth CAN and usually is provided, exact variable quantities are calculated because of course we have to use math\calculus which accomodates for IDEAL cases...
Seriously, you don't see the contradiction h ...[text shortened]... hat we're talking about. Researchers can always potentially improve on that approximation.
Originally posted by Palynka
No, there is no contradiction here. To reiterate, mathematics is NOT a sub-discipline of science... I see I have to spell it out... The difference between mathematical variables and scientific variables is DIMENSIONS. In science we have quantifiable and measurable variables which can be PRODUCED, for example 50 psi of line pressure, or a 10 kg mass of water, or a 9-volt battery, or travelling at a speed of 100 mph, and then we also have constraints or limits. In mathematics there are no constraints and no dimensions; has anyone ever PRODUCED the actual number '2', for instance? NO!!! Of course not. In mathematics (in its pure form) everything is abstract, and IDEAL cases can be calculated, as I said before...
Seriously, you don't see the contradiction here? How do you validate a theory to be proven as the truth if every application (of which the empirical tests used to check the theory) has a margin of error?