Originally posted by menace71: Gamma ray bursts are a different phenomenon from supernovas (related, but different). I've read some items on-line and the consensus seems to be that 150-200 light years is about the limit if a supernova went off close to the Earth, and apparently it also matters where the gamma ray bursts are aimed.
Manny
Look, a supernova, when it goes off, will have a power output roughly equivalent to that of a
large galaxy, or approximately 100 billion times (1E11) brighter than the sun.
Radiant energy goes down as a function of the inverse square of distance.
For the supernova to be as bright as the sun in the sky it would have to be (1E11)^(1/2)
times further away than the sun.
The sun is 93 million miles away so the supernova would have to be about 3E13 miles away.
Or about 5 light years.
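That back-of-envelope figure is easy to check numerically. A minimal sketch, assuming the same 1E11-sun peak luminosity and taking one light year as roughly 5.88E12 miles:

```python
import math

L_RATIO = 1e11           # assumed supernova peak: ~1e11 solar luminosities
AU_MILES = 93e6          # Earth-Sun distance in miles
MILES_PER_LY = 5.88e12   # miles per light year (approximate)

# Apparent brightness falls as 1/d^2, so to match the sun's brightness
# the supernova must sit sqrt(L_RATIO) times further away than the sun.
distance_miles = AU_MILES * math.sqrt(L_RATIO)
distance_ly = distance_miles / MILES_PER_LY

print(f"{distance_miles:.1e} miles")      # ~2.9e13 miles
print(f"{distance_ly:.1f} light years")   # ~5.0 light years
```

Same answer as above: around 3E13 miles, or about 5 light years.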
Now, obviously, having two suns in the sky for 2~3 weeks is going to be a problem, and
the supernova will be putting out a lot more radiation and harmful UV light than our
sun does, so you would want it further away than that.
Minimum safe distance for an earth like planet is most likely in the 25~75 ly range. [EDIT: correction 50~100ly.]
It's [almost] certainly less than 100 ly.
As gamma ray bursts are more energetic and focus the energy into narrow beams, they are
dangerous at much greater ranges, well outside the normal supernova safety zone.
However, they are also much, much rarer, and require you to be in one of the beams to be at risk.
Basically the 'online consensus' is wrong.
EDIT: Also, the Daily Mail couldn't get a science story right if they had a thermonuclear bomb
in the office that would detonate if they got it wrong.
If Betelgeuse went... when it goes, I should say... it will probably outshine the full moon.
It will certainly be brighter than the planets, and visible in daytime.
It will very likely be bright enough to cast shadows.
It will look very pretty.
And we will get to do lots of awesome science watching and studying it.
It will not get anywhere close to being as bright as the sun.
As I showed earlier, to rival the sun (assuming it gets as bright as an entire galaxy) it would need to
be around 5 ly away...
At 640 ly away, which is 128 times further than the 5 ly it would need to be at to rival the sun,
its brightness will be 128^2 times less, or around 16 thousand times dimmer than the sun.
Now the full moon is approximately 400,000 times dimmer than the sun, so using this
very rough and ready math, it might appear up to 25 times brighter than the full moon.
However, there are a multitude of factors I am not considering here that could well mean it's a lot dimmer.
In fact, its expected brightness is about the same as the full moon, approximately -12 on the apparent magnitude scale,
which means I am likely overestimating the supernova's luminosity by a factor of 20~30.
Originally posted by menace71: Cool, and you're much more intelligent than I, but I do enjoy the discussion.
I do realize many things on-line have to be taken with a grain of salt.
Manny

I don't know about more intelligent. Just possibly, on this particular issue, more informed.
Which has little to do with intelligence.