Let me take a crack at this:
1) The NY Mets and the LA Dodgers have just played their final game of the season, which was a make-up game due to a scheduling error. The venue was decided on a coin flip (50/50 chance of the game being held in NY or LA), but you missed the sports report so you don't know where it was held. During the season, the following stats were compiled:
2) When playing in NY, the Mets won 7/9 times and the Dodgers won 2/9 times.
3) When playing in LA, the Mets won 6/10 times and the Dodgers won 4/10 times.
(The statisticians who compiled these stats assure you that the stats are perfectly predictive of the result of this make-up game.)
4) You overhear the result of the game on the radio on your way home from work, and the Mets won 11-4. A friend of yours, a compulsive gambler who doesn't follow baseball and doesn't know where the game was held either, says to you "I'll bet you even money that the game was in NY (i.e. I, your friend, will pay you even money if the game was held in LA, and you will pay me even money if the game was held in NY)."
1) Location is randomly decided by a coin toss.
2) The Mets win 7/9 times in NY; the Dodgers win 2/9 times in NY.
3) The Dodgers win 4/10 times in Los Angeles; the Mets win 6/10 times in Los Angeles.
4) The Mets won 11-4.
(a) Is this a fair bet?
(b) If not, what odds should your friend give you to make it fair?
Originally posted by PBE6:
A new problem! Another simple application of Bayes theorem, although I hope this one won't generate as much discussion:
The NY Mets and the LA Dodgers have just played their final game of the season, which was a make-up game due to a scheduling error. The venue was decided on a coin flip (50/50 chance of the game being held in NY or LA), but you missed the ...[text shortened]... Is this a fair bet?
(b) If not, what odds should your friend give you to make it fair?
7/2= 3.5
6/4=1.5
11/4=2.75
3.5-2.75=0.75
2.75-1.5=1.25
1.25/0.75=1.66
I miscalculated the first time but the answer is 1.66 to 1
and before you say it's wrong (again) run it on any program you like!
again overcomplicating simple issues!
Originally posted by eldragonfly:
You simply turn the card over, it has to be either the silver/silver or the silver/gold.
I'll take a look-see at the sports statistics problem in a bit.
Let me try this again listing all the possibilities BEFORE we find out the extra information, then eliminating the possibilities which the extra information makes impossible.
When the card is first selected, and before we see the color which faces up, we have 4 possibilities.
1) Silver showing with Silver hidden. (Silver/Silver card)
2) Gold showing with Gold hidden (Gold/Gold card)
3) Silver showing with Gold hidden (Silver/Gold with Silver face up)
4) Gold showing with Silver hidden (Silver/Gold with Gold face up)
Now the most important thing to note is this: NOT ALL THE POSSIBILITIES ARE EQUALLY LIKELY. More specifically, the latter two occur with half the frequency of the first two if the card selection is random and any card is equally likely.
If the Silver/Silver card was picked, then Silver WILL be showing. With the Gold/Gold card, Gold WILL be showing. But if the mixed card is picked, then half the time it will come up Gold side up, and half the time Silver side up.
This is crucial to note because it means the four possibilities have uneven chances. It isn't enough to assume all possibilities are equal, because sometimes they aren't, even in fair games.
Let us now proceed. We are then told that it is a Silver side showing, which eliminates possibilities #2 and #4, leaving only #1 and #3.
This is where the different odds of the two scenarios occurring give us an answer which seems counter-intuitive, because we had determined beforehand that Silver with Silver back is twice as likely (1 in 3 picks) to occur as Silver with Gold back (1 in 6 picks).
In general your odds for one event are the same as the long term odds in a repeated experiment.
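To see the 2-to-1 ratio come out of a random process, here is a minimal Python sketch of the setup described above (assuming the standard three-card version: one silver/silver, one gold/gold and one silver/gold card, with the card and the face shown both chosen at random):

```python
import random

# Three cards, each listed as (face_1, face_2); the card and the face shown
# are both chosen uniformly at random, as in the explanation above.
cards = [("silver", "silver"), ("gold", "gold"), ("silver", "gold")]

trials = 100_000
silver_showing = 0
silver_hidden_too = 0

for _ in range(trials):
    card = random.choice(cards)      # pick a card at random
    shown = random.randrange(2)      # pick which face is up at random
    up, down = card[shown], card[1 - shown]
    if up == "silver":               # condition on seeing a silver face
        silver_showing += 1
        if down == "silver":
            silver_hidden_too += 1

# ~0.667, i.e. the silver/silver card is about twice as likely as the mixed card
print(silver_hidden_too / silver_showing)
```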
Originally posted by alexdino:
I miscalculated the first time but the answer is 1.66 to 1
and before you say it's wrong (again) run it on any program you like!
again overcomplicating simple issues!
OK, I simulated it first. 56%. You're wrong 🙂
I've posted my answer, which approximates to 56.5%. If you can let me know which bit you disagree with I'll try and explain further.
Originally posted by mtthw:
Hmm. Strange we're so close, yet different. I must be missing something subtle (or you are!). Maybe you can spot the flaw in this reasoning:
(M = Mets win)
We want to know P(NY | M) = P(NY & M)/P(M)
But also:
P(NY & M) = P(M | NY)*P(NY)
and
P(M) = P(M | NY)*P(NY) + P(M | LA)*P(LA)
We know:
P(M | NY) = 7/9
P(M | LA) = 6/10
P(NY) = P(LA) = 1 ...[text shortened]... NY & M) = 7/9*1/2 and
P(M) = 7/9*1/2 + 6/10*1/2
So P(NY | M) = (7/9)/(7/9 + 6/10) = 70/124.
After much consternation, I believe your solution is correct. I made the mistake of calculating P(B) as 13/19 because the Mets won 13 out of 19 games. In fact, I was convinced your solution was lacking the proper game weighting for P(M | NY) and P(M | LA) because it gives P(M) = 31/45 > 13/19. But, after thinking about this problem:
Suppose you are given 50 black balls and 50 white balls. You are to put the balls into 2 urns in any combination and quantity you wish (provided you use all 100 balls), such that when your friend picks an urn at random and then picks a ball from that urn at random, the chance that he will draw a white ball will be maximized. What is the optimal solution?
I realized that your method is indeed correct. Kudos mtthw! Excellent solution.
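For anyone who wants to check the arithmetic, here is a short Python sketch (not from the original posts) that reproduces mtthw's figures with exact fractions, P(M) = 31/45 and P(NY | M) = 70/124, about 56.5%:

```python
from fractions import Fraction

# Priors from the coin flip and the season stats quoted in the thread.
p_ny = Fraction(1, 2)               # P(NY)
p_la = Fraction(1, 2)               # P(LA)
p_m_given_ny = Fraction(7, 9)       # P(Mets win | NY)
p_m_given_la = Fraction(6, 10)      # P(Mets win | LA)

p_m = p_m_given_ny * p_ny + p_m_given_la * p_la    # total probability: 31/45
p_ny_given_m = (p_m_given_ny * p_ny) / p_m         # Bayes' theorem

print(p_m)                  # 31/45
print(p_ny_given_m)         # 35/62, i.e. 70/124
print(float(p_ny_given_m))  # ~0.5645
```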
Originally posted by mtthw:
OK, I simulated it first. 56%. You're wrong 🙂
I've posted my answer, which approximates to 56.5%. If you can let me know which bit you disagree with I'll try and explain further.
I ran a simulation too. Each number below represents 5000 trials, and the average is the value for all 50,000 trials:
0.556419113
0.557171952
0.541043724
0.563888088
0.543978749
0.576020851
0.567821491
0.565077508
0.582945285
0.568644561
Average = 0.562301132
This is closer to mtthw's answer (56.5%) than my answer (56.8%), which gives me added confidence in mtthw's answer.
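The simulation code itself wasn't posted, but a minimal Monte Carlo sketch along these lines (fair coin for the venue, win probabilities of 7/9 in NY and 6/10 in LA, keeping only the trials where the Mets win) lands around the same 56% figure:

```python
import random

def simulate(trials=50_000):
    ny_given_win = 0
    wins = 0
    for _ in range(trials):
        in_ny = random.random() < 0.5           # venue decided by a coin flip
        p_win = 7 / 9 if in_ny else 6 / 10      # Mets' win probability at that venue
        if random.random() < p_win:             # keep only games the Mets won
            wins += 1
            ny_given_win += in_ny
    return ny_given_win / wins                  # estimate of P(NY | Mets win)

print(simulate())   # ~0.56 on most runs
```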
I applied Bayes' Formula and I get the following.
Mets played the Dodgers in 19 previous games.
Mets won 13 of these games total.
Mets won 7 of these in New York.
Mets won 6 of these in Los Angeles.
So assuming these statistics accurately portray the Mets' chances against the Dodgers (the most reasonable assumption), and given that the Mets won the game (the actual score doesn't matter), the following calculations should hold for the odds.
Played in NY: (7/19)/(13/19) = 7/13
Played in LA: (6/19)/(13/19) = 6/13
The bet goes against you, but not by a whole lot. He should offer $7 for every $6 you bet on LA for it to be a fair, even bet.
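For comparison, here is a quick sketch (not from the posts) showing why this 7/13 figure differs from the 70/124 answer elsewhere in the thread: counting won games effectively weights the venues 9:10 by games played, whereas the make-up game's venue came from a 50/50 coin flip.

```python
from fractions import Fraction

p_m_given_ny = Fraction(7, 9)    # P(Mets win | NY)
p_m_given_la = Fraction(6, 10)   # P(Mets win | LA)

def p_ny_given_win(p_ny):
    """P(game was in NY | Mets won) for a given prior P(NY)."""
    p_la = 1 - p_ny
    p_win = p_m_given_ny * p_ny + p_m_given_la * p_la
    return p_m_given_ny * p_ny / p_win

print(p_ny_given_win(Fraction(9, 19)))   # 7/13   (venues weighted by games played)
print(p_ny_given_win(Fraction(1, 2)))    # 35/62, i.e. 70/124 (venue from the coin flip)
```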
Originally posted by PBE6:
P(A g B) = P(A U B) / P(B)
where P(A g B) = probability of A given B
P(A U B) = probability of A and B happening together
P(B) = probability of B
I agree with your application of Bayes' Theorem.
But in terms of notation, I was under the impression that the symbol U is typically understood to refer to the union (related to the logical disjunction); whereas if you inverted the symbol then it would refer to the intersection (related to the logical conjunction). In other words, I thought that (A U B) is normally understood to convey A or B -- not A and B.
Originally posted by geepamoogle:
I'll take a look-see at the sports statistics problem in a bit.
Let me try this again listing all the possibilities BEFORE we find out the extra information, then eliminating the possibilities which the extra information makes impossible.
When the card is first selected, and before we see the color which faces up, we have 4 possibilities.
1) Silv ...[text shortened]... general your odds for one event are the same as the long term odds in a repeated experiment.
My only qualm is with the selection process; sure, it was forced, but a beautiful problem nonetheless. Let's move on...
Originally posted by mtthw:
Hmm. Strange we're so close, yet different. I must be missing something subtle (or you are!). Maybe you can spot the flaw in this reasoning:
(M = Mets win)
We want to know P(NY | M) = P(NY & M)/P(M)
But also:
P(NY & M) = P(M | NY)*P(NY)
and
P(M) = P(M | NY)*P(NY) + P(M | LA)*P(LA)
We know:
P(M | NY) = 7/9
P(M | LA) = 6/10
P(NY) = P(LA) = 1 ...[text shortened]... NY & M) = 7/9*1/2 and
P(M) = 7/9*1/2 + 6/10*1/2
So P(NY | M) = (7/9)/(7/9 + 6/10) = 70/124.
I got the same answer.
Might as well make this a new problem!
Suppose you are given 50 black balls and 50 white balls. Your task is to distribute the balls between the 2 urns in any combination you wish, provided you use all 100 balls, with the goal of maximizing the probability that a white ball will be chosen when a ball is selected at random from a randomly selected urn.
What is the optimal solution?
Originally posted by LemonJello:
I agree with your application of Bayes' Theorem.
But in terms of notation, I was under the impression that the symbol U is typically understood to refer to the union (related to the logical disjunction); whereas if you inverted the symbol then it would refer to the intersection (related to the logical conjunction). In other words, I thought that (A U B) is normally understood to convey A or B -- not A and B.
You're totally right. I actually did do the calculation as an intersection, but I used the symbol for union by mistake.
Originally posted by PBE6:
Might as well make this a new problem!
Suppose you are given 50 black balls and 50 white balls. Your task is to distribute the balls between the 2 urns in any combination you wish, provided you use all 100 balls, with the goal of maximizing the probability that a white ball will be chosen when a ball is selected at random from a randomly selected urn.
What is the optimal solution?
49 white balls in one urn.
1 white ball in the other urn.
Just a guess.
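A quick check of this guess (a sketch, assuming the 50 black balls share an urn with the 49 white balls, which the guess leaves unstated):

```python
from fractions import Fraction

# Guessed split: urn A holds a single white ball; urn B holds the remaining
# 49 white balls and (by assumption) all 50 black balls.
p_white_urn_a = Fraction(1, 1)      # 1 white out of 1 ball
p_white_urn_b = Fraction(49, 99)    # 49 white out of 99 balls

# The friend picks each urn with probability 1/2, then a ball at random.
p_white = Fraction(1, 2) * p_white_urn_a + Fraction(1, 2) * p_white_urn_b
print(p_white, float(p_white))      # 74/99, about 0.747
```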