Originally posted by LemonJello:
Lots of number crunching for this one, as there are twenty different histories for a best-of-five match, and the history-dependent mechanism means that you have to track each one, far as I can tell.
Here's another problem about conditional probability to keep the thread going.
Suppose one is scheduled to play a best-of-five match where every game of the match is decisive: either a win or a loss results. The probability that he will win the first game is 1/2. But in subsequent games thereafter, the probability of his winning each game depends on ...[text shortened]... What is the probability that he wins the match, given that he wins the first game of the match?
However, after processing this data, my numbers indicate that his probability of winning the best-of-5, given that he wins the first game, is about 70.68%.
I'll work on an exact number.
EDIT: Exact probability is 229/324.
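Since the confidence rules are shortened out of the quote above, here's a generic enumerator over the match histories; `next_p` is a placeholder for whatever history-dependent rule the puzzle actually uses, so this is a sketch rather than the exact calculation. Plugging the real rule into `next_p` should reproduce the 229/324 figure.

```python
from fractions import Fraction

def match_win_prob(next_p, wins=1, losses=0, history=('W',)):
    """Chance of taking a best-of-5 from the current state, starting
    from the conditioning event: game one has already been won.

    next_p(history) -> Fraction gives the probability of winning the
    next game as a function of the full match history so far.
    """
    if wins == 3:
        return Fraction(1)
    if losses == 3:
        return Fraction(0)
    p = next_p(history)
    return (p * match_win_prob(next_p, wins + 1, losses, history + ('W',)) +
            (1 - p) * match_win_prob(next_p, wins, losses + 1, history + ('L',)))

# Sanity check with a memoryless placeholder rule (every game is 1/2):
# from a 1-0 lead in a best-of-5 the win probability is 11/16.
coin = lambda history: Fraction(1, 2)
print(match_win_prob(coin))  # 11/16
```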
Originally posted by LemonJello:
I can't see any easier approach here than counting up the various combinations. So draw up a tree diagram, work out the probability for each branch, and add up the branches that are relevant.
Suppose one is scheduled to play a best-of-five match where every game of the match is decisive: either a win or a loss results. The probability that he will win the first game is 1/2. But in subsequent games thereafter, the probability of his winning each game depends on the match history and his related confidence level:
If he won the previous game ...[text shortened]...
What is the probability that he wins the match, given that he wins the first game of the match?
To cut a long story short, I make it 229/324 (about 71%).
Originally posted by eldragonfly:
The idea is that the probability of an event having happened can change if you acquire additional information about the event. Here's a rather gruesome example:
They go from 1/3 to 1/2, but I still don't understand the mechanism for this.
A detective is called to a murder scene, where a victim has been shot. The officers have rounded up "n" suspects, each carrying a different calibre gun (no twins), each of which has been fired in the last hour. Without any further information, the detective could only assign a 1/n probability of guilt to each suspect and would not be able to press an investigation.
However, after an investigation by the on-scene coroner, the bullet hole is determined to have been caused by a bullet at least 5.56 mm in diameter. Now the detective can eliminate each suspect with a calibre less than 5.56, say "k" suspects, and thereby increase his probability of guilt for those remaining from 1/n to 1/(n-k), which gives his case enough credence to convince his Sergeant to let him proceed.
In this case, the probability that suspect "x" was guilty increased from 1/n to 1/(n-k) solely on the basis of additional information. In a conditional probability problem, this is often worded as something like "what is the probability that 'x' is guilty, given that the bullet was at least 5.56 mm in diameter?", with the intent of making you re-examine the sample space for newfound impossibilities.
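To put numbers on it (the suspect count and calibres here are made up purely for illustration), the conditioning step is just a restriction of the sample space:

```python
from fractions import Fraction

# Hypothetical scene: n = 10 suspects, each with a distinct calibre.
suspects = [{"id": i, "calibre_mm": c}
            for i, c in enumerate([4.5, 5.0, 5.45, 5.5, 5.56,
                                   6.35, 7.62, 7.65, 9.0, 11.43])]
n = len(suspects)

prior = Fraction(1, n)                      # before the coroner's report
# Condition on the evidence: only calibres of at least 5.56 mm remain.
remaining = [s for s in suspects if s["calibre_mm"] >= 5.56]
posterior = Fraction(1, len(remaining))     # 1/(n-k) for the k eliminated

print(prior, posterior)  # 1/10 1/6
```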
Originally posted by PBE6:
re: the Monty Hall paradox:
The idea is that the probability of an event having happened can change if you acquire additional information about the event. Here's a rather gruesome example:
A detective is called to a murder scene, where a victim has been shot. The officers have rounded up "n" suspects, each carrying a different calibre gun (no twins), each of which has been fired in t ...[text shortened]... th the intent of making you re-examine the sample space for newfound impossibilities.
http://mathforum.org/dr.math/faq/faq.monty.hall.html
http://math.ucsd.edu/~crypto/Monty/montybg.html
Playing the game under these assumptions is equivalent to spinning the roulette wheel to the right, except that if the blackened area of the wheel comes up, it is spun again. Once again, the red area means that in order to win the contestant will need to switch doors, and the blue means that the contestant should not switch. Notice that there is the same amount of red area as blue. In other words, it doesn't matter if the contestant switches in this case.
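For contrast with the re-spin variant described above, a quick simulation of the standard setup, in which the host knows where the prize is and always opens a losing door, shows that switching wins about 2/3 of the time:

```python
import random

def monty_trial(switch, rng):
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # The host, who knows where the prize is, opens a non-prize door
    # that the contestant didn't pick.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(0)
n = 100_000
wins_switch = sum(monty_trial(True, rng) for _ in range(n)) / n
wins_stay = sum(monty_trial(False, rng) for _ in range(n)) / n
print(wins_switch, wins_stay)  # roughly 0.667 and 0.333
```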
Originally posted by eldragonfly:
In my example, the crew knows where the prize is. As long as the person who chooses which door gets opened knows where the prize is, conditional probability provides the correct probabilities. But if you really want to understand conditional probability, you have to do some work on your own.
re: the monty hall paradox :
http://mathforum.org/dr.math/faq/faq.monty.hall.html
http://math.ucsd.edu/~crypto/Monty/montybg.html
Playing the game under these assumptions is equivalent to spinning the roulette wheel to the right except that if the blackened area of the wheel comes up then it is spun again. Once again the red area means that in ...[text shortened]... rea as blue. In other words, it doesn't matter if the contestant switches in this case.
1) Come up with a situation where a random event occurs, and figure out the probabilities of each possible outcome.
2) Modify your situation such that some partial information is revealed after the random event occurs, and re-evaluate the probabilities of each outcome. For best results, the information should help you eliminate some possibilities, but not give away the outcome altogether.
Post your situations when you're done.
Originally posted by PBE6:
Yeah, I know how to do the math, PBE6, trust me. Back to the playoff example: I understand intuitively that, since the Mets apparently won more games in NY than in Los Angeles, it would be at least an even bet that the final won game was played in NY rather than Los Angeles, so doing the number crunching is the *next* step.
In my example, the crew knows where the prize is. As long as the person who chooses which door gets opened knows where the prize is, conditional probability provides the correct probabilities. But if you really want to understand conditional probability, you have to do some work on your own.
1) Come up with a situation where a random event occurs, and figu ...[text shortened]
Here's a question. Suppose in the Mets vs Dodgers example that the record in LA was instead 7-3 in favor of the Mets, rather than only 6-4.
So we have 19 games between the clubs, the Mets winning 7 of 9 in New York and 7 of 10 in LA. The postponed match is still decided by coin toss. Neither you nor your friend knows where it was held (making either place equally likely without further information).
The Mets still win 11-4. Your gambling friend still wagers you even money the game was in New York.
Is this a fair bet, and what are the odds for the game being in New York, given the Mets won?
Originally posted by PBE6:
There's nothing to prove, or else you must be brain-damaged. I simply posted two links that examine the Monty Hall paradox, beyond the usual grab bag of gee-whiz solutions. 😞
Prove it.
re: the monty hall paradox :
http://mathforum.org/dr.math/faq/faq.monty.hall.html
http://math.ucsd.edu/~crypto/Monty/montybg.html
Playing the game under these assumptions is equivalent to spinning the roulette wheel to the right, except that if the blackened area of the wheel comes up, it is spun again. Once again, the red area means that in order to win the contestant will need to switch doors, and the blue means that the contestant should not switch. Notice that there is the same amount of red area as blue. In other words, it doesn't matter if the contestant switches in this case.
Originally posted by geepamoogle:
So exactly the same problem, except swapping 7/10 for 6/10.
Here's a question. Suppose in the Mets vs Dodgers example that the record in LA was instead 7-3 in favor of the Mets, rather than only 6-4.
So we have 19 games between the clubs, Mets winning 7 of 9 in New York, and 7 of 10 in LA. The postponed match is still decided by coin toss. Neither you nor your friend know where it was held (making either ...[text shortened]...
Is this a fair bet, and what are the odds for the game being in New York, given the Mets won?
No, still not fair. The previous formula still applies, and resolves to 70/133 (about 52.6%). For the probability to be 50%, the proportion of wins in each place has to be the same.
Unless you're meant to make the assumption that there's likely to be the same number of games in each of NY and LA, in which case you know it's going to be in NY.
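Spelled out, the Bayes arithmetic for the 7-3 variant (venue prior 1/2 from the coin toss, win rates 7/9 in New York and 7/10 in LA) looks like this:

```python
from fractions import Fraction

p_ny = Fraction(1, 2)        # coin-toss prior for the venue
p_win_ny = Fraction(7, 9)    # Mets' record in New York
p_win_la = Fraction(7, 10)   # Mets' record in LA (the 7-3 variant)

# P(NY | Mets won) by Bayes' theorem
p_ny_given_win = (p_ny * p_win_ny) / (p_ny * p_win_ny + (1 - p_ny) * p_win_la)
print(p_ny_given_win)  # 10/19, i.e. 70/133, about 52.6%
```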
Originally posted by mtthw:
This rewording was aimed at tricking those who, like I originally did, fail to rebalance the possibilities.
So exactly the same problem, except swapping 7/10 for 6/10.
No, still not fair. The previous formula still applies, and resolves to 70/133 (about 52.6%). For the probability to be 50% the proportion of wins in each place has to be the same.
Unless you're meant to make the assumption that there's likely to be the same number of games in each of NY and LA, in which case you know it's going to be in NY.
If you used (Wins in NY) / (Total Wins) like I did, then you would conclude it was a fair and even bet... and you would be wrong.
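A minimal sketch of the two calculations side by side, using the 7-3-in-LA numbers from this subthread, shows how the naive ratio misleads:

```python
from fractions import Fraction

# Naive: fraction of Mets wins that happened in NY. This treats every
# past win as an equally likely candidate for the postponed game.
naive = Fraction(7, 7 + 7)   # 1/2 -- looks like a "fair" bet

# Bayes: weight each venue by the chance the postponed game was played
# there (1/2 from the coin toss), times the win rate at that venue.
bayes = (Fraction(1, 2) * Fraction(7, 9)) / (
        Fraction(1, 2) * Fraction(7, 9) + Fraction(1, 2) * Fraction(7, 10))

print(naive, bayes)  # 1/2 10/19
```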
Originally posted by eldragonfly:
Congratulations, you can cut and paste with the best of them. I'm done trying to help. You're obviously more interested in your own pride than in learning something. Good luck, you're going to need it.
There's nothing to prove, or else you must be brain-damaged. I simply posted two links that examine the Monty Hall paradox, beyond the usual grab bag of gee-whiz solutions. 😞
re: the monty hall paradox :
http://mathforum.org/dr.math/faq/faq.monty.hall.html
http://math.ucsd.edu/~crypto/Monty/montybg.html
Playing the game under these a ...[text shortened]... ea as blue. In other words, it doesn't matter if the contestant switches in this case.