Originally posted by Zahlanzi:
you mean a drug that might cure cancer? if you have cancer, do you care if it has a 15% chance or 1% chance or 90% chance? at what probability of success would you refuse the drug? and if someone else would refuse the drug at a different number?

No, I mean even simply assessing whether it works or not. There are always false positives and false negatives that need to be accounted for.
But your example is again flawed. You can't take all the possible drugs, so how do you choose which ones, if given the option? You choose the ones with the highest probability of success.
It's all pretty simple, really. Even with your feeble attempts to give binary, all-or-nothing examples, you're failing miserably.
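As a toy illustration of that selection rule, here is a minimal sketch in Python; the treatment names and success probabilities are entirely made up, not taken from the thread:

treatments = {
    "drug_a": 0.15,   # hypothetical success probabilities
    "drug_b": 0.01,
    "drug_c": 0.90,
}
# If you can only take one, pick the one most likely to work.
best = max(treatments, key=treatments.get)
print(best, treatments[best])   # drug_c 0.9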
Originally posted by Zahlanzi:
so if you see a girl drowning you calculate your chances of success? where would you consider jumping in after her? would it matter that you calculate 1% or 99%? not jumping means 100% death for her. jumping has some chance. for me it is irrelevant what that chance is because i would jump after her even though i am not such a good swimmer. at least in this ...[text shortened]... in my opinion trumps probability. of course there are exceptions, which you didn't produce yet.

Of course you calculate, even if only implicitly. If she's surrounded by ten sharks, there's always a small probability you'll manage to save her. Would you jump in?
Obviously not. So it depends on the probability of both you dying and you saving her.
Keep trying.
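To make the "it depends on both probabilities" point concrete, here is a rough sketch with invented numbers (none of these figures come from the thread):

# Assumed, illustrative numbers only.
p_save = 0.30        # chance the rescue succeeds if you jump
p_die = 0.10         # chance you die in the attempt
her_life = 1.0       # value placed on her life
your_life = 1.0      # value placed on your own life

# Staying out: she is lost, you are safe.
loss_stay = her_life

# Jumping: she is lost with probability (1 - p_save); you are lost with probability p_die.
loss_jump = (1 - p_save) * her_life + p_die * your_life

print("expected loss if you stay:", loss_stay)   # 1.0
print("expected loss if you jump:", loss_jump)   # 0.8
print("jump" if loss_jump < loss_stay else "stay")

With ten sharks you might put p_save near zero and p_die near one, and the same arithmetic says stay, which is the point being made above.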
Originally posted by Zahlanzi:
people try and should do the right choice, not the choice with the most chances of success.
tell me of a situation where it is necessary to calculate the probability before making a decision. if a doctor tells me i have cancer and i will croak in 1 month and that there is a treatment that will give me 5% of survivability, i will freaking take the treatme ...[text shortened]... uld be foolish to do the less intelligent action just because it has more chances of success.

Here's an example where calculating the probability is a good idea before making a decision.
Imagine there are three closed doors. Behind one of them is a Porsche, whereas the other two doors each have a bag of salt behind them. You are required to choose one door. When you have made your choice, but not yet opened it, someone else (say, a presenter) opens one of the other doors, one that has a bag of salt behind it. He then asks you whether you want to keep your original choice or switch to the last remaining door. He tells you that you get to keep whatever is behind the door you finally choose.
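This game is easy to simulate. A quick sketch in Python, under the assumption (left implicit in the post above) that the presenter always opens an unchosen door with salt behind it:

import random

def play(switch, trials=100_000):
    # Simulate the three-door game; the presenter always opens an unchosen salt door.
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the Porsche
        pick = random.randrange(3)   # contestant's first choice
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # about 0.33
print("switch:", play(switch=True))    # about 0.67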
Originally posted by Wheely:
Here's an example where calculating the probability is a good idea before making a decision.
Imagine there are three closed doors. Behind one of them is a porche whereas the other doors have a bag of salt each. You are required to choose one door. When you have made your choice, but not opened it, someone else (say a presenter) opens one door that has a ...[text shortened]... t remaining door. He tells you you get to keep whatever is behind the door you finally choose.

LOL! Rec'd.
Originally posted by Wheely:
Here's an example where calculating the probability is a good idea before making a decision.
Imagine there are three closed doors. Behind one of them is a porche whereas the other doors have a bag of salt each. You are required to choose one door. When you have made your choice, but not opened it, someone else (say a presenter) opens one door that has a ...[text shortened]... t remaining door. He tells you you get to keep whatever is behind the door you finally choose.

that's provided you believe you have other than a 50-50 chance of success in this example.
Originally posted by Palynka:
Of course you calculate. Even if implicitly. If she's surrounded by ten sharks, there always a small probability you'll manage to save her. Would you jump in?
Obviously not. So it depends on the probability of both you dying and you saving her.
Keep trying.

if the sharks are far enough that i might reach her in time, then yes i would jump. if the sharks will reach her before i do, there isn't a possibility of saving her. keep trying
Originally posted by Wheely:
Given that this is a well known question, we know (ATY has the link) that it isn't, in fact, a random choice the presenter makes.
Even if this wasn't the case, Shav tells us that the presenter opened a door that had the salt behind it. At that point, you have a 2/3 chance of winning if you switch and a 1/3 chance of winning if you don't. This is simply be ...[text shortened]... g door to start with and you know there is only one other door that could contain the porche.

Sorry to go back in time a bit in this thread, but if you look at the post to which I was replying, I think you'll see I was answering a different question than the OP asked.
Originally posted by Zahlanzi:
who knows? why is your first choice influenced by the second? ultimately you choose between two doors and you have 50% chance. that is the only event that matters.

We all know, and can prove it too.
Ultimately you choose between two doors, as you say; however, the two doors you are choosing from have been selected as the result of your first choice (and the presenter's). Therefore it is not a 50/50 split, because the odds have been stacked in favour of the remaining door you haven't chosen by a) your choice and b) the presenter.
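For anyone who wants the proof spelled out, a brute-force enumeration works too. A sketch, again assuming the presenter always opens an unchosen salt door:

from itertools import product

doors = [0, 1, 2]
stay_wins = switch_wins = total = 0
for car, pick in product(doors, doors):
    # Presenter opens a door that is neither the pick nor the Porsche.
    # (When the first pick is the Porsche he has two salt doors to choose
    # from; which one he opens does not change who wins.)
    opened = next(d for d in doors if d != pick and d != car)
    # The switcher takes the one door left unopened and unpicked.
    switched = next(d for d in doors if d != pick and d != opened)
    stay_wins += (pick == car)
    switch_wins += (switched == car)
    total += 1

print("stay wins:  ", stay_wins, "of", total)    # 3 of 9, i.e. 1/3
print("switch wins:", switch_wins, "of", total)  # 6 of 9, i.e. 2/3

Of the nine equally likely combinations of car position and first pick, staying wins in the three where the first pick was already right, and switching wins in the other six.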
Originally posted by Wheely:
Here's an example where calculating the probability is a good idea before making a decision.
Imagine there are three closed doors. Behind one of them is a porche whereas the other doors have a bag of salt each. You are required to choose one door. When you have made your choice, but not opened it, someone else (say a presenter) opens one door that has a ...[text shortened]... t remaining door. He tells you you get to keep whatever is behind the door you finally choose.

Classic!
Originally posted by Zahlanzi:
if the sharks are far enough that i might reach her in time, then yes i would jump. if the sharks will reach her before i do, there isn't a possibility of saving her. keep trying

People have survived for longer than a few minutes swimming in shark-infested waters, so that's clearly false.
Anyway, stick your head in the sand for all I care.
Originally posted by Wheely:
We all know and can prove it too.
Ultimately you choose between two doors, as you say, however the two doors you are choosing from have been selected as the result of your first choice (and the presenter). Therefore it is not a 50/50 split because the odds have been stacked in favour of the remaining door you haven't chosen by a) your choice and b) the presenter

this is not how the problem is formulated. the problem asks if you have more chances of winning if you change your choice. the dude picked the first door with a 1/3 chance of success, but now he makes a new pick from two choices.
if you play Russian roulette and you pull the trigger 4 times, what are your chances on the 5th pull? things would not be 50% only if you had asked, before the 1st pull of the trigger, what your chances of reaching the 5th pull were.
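For what it's worth, the roulette arithmetic itself is easy to write down. A sketch with assumed details (six chambers, one bullet, no re-spin between pulls; none of this is specified in the post):

from fractions import Fraction

chambers = 6   # assumed revolver size
# Asked before the 1st pull: probability of surviving four pulls and reaching the 5th.
p_reach_5th = Fraction(chambers - 4, chambers)   # 2/6 = 1/3
# Asked after surviving four pulls: the bullet sits in one of the two remaining chambers.
p_fire_on_5th = Fraction(1, chambers - 4)        # 1/2

print(p_reach_5th, p_fire_on_5th)   # 1/3 1/2

The door game differs in one respect the earlier posts point to: the presenter never opens your door and never the Porsche door, so his reveal is not a random elimination. It tells you something about the other unchosen door while leaving your original 1/3 untouched, which is why the split ends up 2/3 versus 1/3 rather than 50/50.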