
Ben
Ben asked in Science & Mathematics · 1 decade ago

Unfairly fair coin?

Suppose we have a coin that "tries" to be fair. To be more specific, if we flip the coin n times, and have X heads, then the probability of getting a heads in the (n+1)st toss is 1-(X/n). [The first time we flip the coin, it truly is fair, with p=1/2.]

A short example is in order. Each row here represents the ith coin toss, with associated probability and its outcome:

p=1/2: H
p=0: T
p=1/2: H
p=1/3: T
p=1/2: T
p=3/5: T

It certainly would seem that the expected value of X should be n/2. Is this the case? If so (and even if not), then think of this coin as following a sort of altered binomial distribution. How does its variance compare to that of a binomial [=np(1-p)]?

Does the coin that tries to be fair become unfair in the process? Or does it quicken the convergence to fairness?

(I haven't worked this one out, it just occurred to me. If the algebra becomes intractable, computer solutions are allowable.)

Update:

In light of John C's Excel results, and some thinking of my own, I have the results for up to n=47 (and it wouldn't be too hard to expand them). The mean is indeed the "correct" mean n/2 (as ksoileau has proved). The variance is exactly n/12, or 1/3 of an ordinary coin's variance, except for n=1 (where it is 1/4) and n=2 (where it is 0).

Astounding. Perhaps someone will get to an algebraic proof?

Update 2:

To shed some light on my work:

I figured that the probability P(X; n), that is, the probability of having X heads after n tosses, should follow a recurrence relation. Specifically, for n >= 2,

P(X; n) = (1 - (X-1)/(n-1)) * P(X-1; n-1) + (X/(n-1)) * P(X; n-1).

Using this relation it was relatively easy to create an Excel spreadsheet with the probabilities, then to calculate E(X) and E(X^2). The really interesting thing is that John noticed the Eulerian triangle numbers, which follow a rather similar recurrence. Oddly though, I haven't been able to make the connection concrete.

Anyway, if we conclude that P(X; n) can be written as <n-1, X-1>/(n-1)! [note that I'm using angle brackets for the Eulerian numbers, analogous to the choose notation, rather than A(n,k)], then we should be able to calculate E(X^2) and from there get the variance. I have unfortunately not been able to complete this work. It may involve the formula that the Eulerian triangle numbers in row n sum to n!.
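
For reference, here is a rough Python translation of that spreadsheet calculation (an illustrative sketch, not the original Excel file; the variable names are mine). It builds P(X; n) from the recurrence above and checks the n/2 mean and, for n >= 3, the n/12 variance:

***

# Build P(X; n) from the recurrence above and check mean = n/2 and
# (for n >= 3) variance = n/12. Illustrative sketch only.
N = 47
P = {(0, 0): 1.0}  # P[(X, n)] = probability of X heads after n tosses
for n in range(1, N + 1):
    for X in range(0, n + 1):
        if n == 1:
            P[(X, n)] = 0.5  # the first toss is truly fair
        else:
            heads = (1 - (X - 1) / (n - 1)) * P.get((X - 1, n - 1), 0.0)
            tails = (X / (n - 1)) * P.get((X, n - 1), 0.0)
            P[(X, n)] = heads + tails

for n in range(1, N + 1):
    mean = sum(X * P[(X, n)] for X in range(n + 1))
    var = sum(X * X * P[(X, n)] for X in range(n + 1)) - mean ** 2
    assert abs(mean - n / 2) < 1e-9
    if n >= 3:
        assert abs(var - n / 12) < 1e-9

****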

7 answers

Relevance
  • John C
    1 decade ago
    Favorite answer

    With the help of Excel I calculated the first 12 flips.

    My preliminary results indicate that the mean for your "nicely fair" coin is the same as for an ordinary binomial coin. The variance is np(1-p)/3, i.e. 1/3 the variance of an ordinary coin.

    I don't think I'll have time to do it algebraically until next week.

    EDIT 11 Feb, 2008

    This is what I have so far:

    Make a chart of the number of flips completed n, followed by the probabilities P(# of heads = 0), P(H=1), P(H=2), ...

    n: P(H=0), P(H=1), P(H=2), ...
    0: 1
    1: 1/2, 1/2
    2: 0, 1, 0
    3: 0, 1/2, 1/2, 0
    4: 0, 1/6, 2/3, 1/6, 0
    5: 0, 1/24, 11/24, 11/24, 1/24, 0
    6: 0, 1/120, 26/120, 66/120, 26/120, 1/120, 0

    The mean (the sum of H*P(H)) is still n/2 and the variance is n/12.

    Now strip away some of the numbers for a clearer look.

    From n=2 on P(H=0) = 0 and P(H=n)=0. So leave these out.

    Also the common denominator is (n-1)!

    Make a chart of just n and the numerators from n=2.

    n: H=1, 2, 3, 4, 5, 6, 7
    2: 1
    3: 1, 1
    4: 1, 4, 1
    5: 1, 11, 11, 1
    6: 1, 26, 66, 26, 1
    7: 1, 57, 302, 302, 57, 1
    8: 1, 120, 1191, 2416, 1191, 120, 1

    and so on.

    These numerators are Euler's triangle numbers:

    T(n, k)=k*T(n-1, k)+(n-k+1)*T(n-1, k-1), T(1, 1)=1. T(n, k)=Sum (-1)^j*(k-j)^n*C(n+1, j), j=0..k.

    For this chart (row n here corresponds to row n-1 of Euler's triangle):

    T(n, H) = H*T(n-1, H) + (n-H)*T(n-1, H-1), T(2, 1) = 1. Explicitly, T(n, H) = Sum (-1)^j*(H-j)^(n-1)*C(n, j), j=0..H.

    If someone can calculate the mean and variance in general from this, it would be great.

    http://www.research.att.com/~njas/sequences/A00829...

    http://mathworld.wolfram.com/EulerianNumber.html
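
    A quick numerical cross-check (an illustrative Python sketch of the shifted chart above, not a derivation): generate the triangle from the recurrence and confirm each row sums to (n-1)!, the common denominator.

    ***

    # Illustrative sketch: build the shifted Euler triangle from the chart
    # recurrence above and check that row n sums to (n-1)!.
    from math import factorial

    T = {(2, 1): 1}
    for n in range(3, 9):
        for H in range(1, n):
            T[(n, H)] = H * T.get((n - 1, H), 0) + (n - H) * T.get((n - 1, H - 1), 0)

    for n in range(2, 9):
        row = [T[(n, H)] for H in range(1, n)]
        assert sum(row) == factorial(n - 1)
        print(n, row)

    ****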

  • 1 decade ago

    After your initial perturbations, it will quickly converge to a normal Gaussian curve. Remember that the 'normal' standard deviation of the proportion of heads varies as 1/sqrt(n). Once you get to big n's it won't matter.

    For example, at 1000 flips, we would expect an s.d. of 16, i.e. 500 +/- 16. At that point, using your 'fair' coin, if you were one s.d. out, your odds of heads would only be (1 - 484/1000) or .516. In other words, you are only marginally more likely to flip a head vs. a tail. It would barely change your percentage.

    At 100, the s.d. would be 5, and if you were one s.d. out, your probability of heads using the fair coin would be .55.

    At 10, the s.d. would be 1.6, and your probability of heads using the fair coin at one s.d. out would be .66.

    In other words, your coin would be very self-correcting initially, but as n grows it would resemble a normal Gaussian curve.

    _______________________

    Edit: Having pondered this, I think I am wrong. It never converges to the standard Gaussian curve. It is narrower. In the above case of 1000 tries, if the coin flips were off by one s.d. (~16), your function would change the odds to .516, which means that in another 1000 tries, if the odds stayed the same, the mean would be 516, i.e. your expected total would center on 1000 instead of 984. You can show this for all numbers of tries and all deviations.

    My guess is that it would cause the Gaussian curve to become uniformly narrower by some factor such as 2 or sqrt(2), or maybe 3 as suggested by John C, above. I will give it some more thought.

    _________________________

    In the interest of science, I did 250 runs of 1000 flips on an Excel spreadsheet.

    Avg: 500.16
    s.d.: 9.33
    Variance: 87.05

    3 looks like a good number (i.e. 1/3 the variance of a regular coin flip, 1/sqrt(3) times the s.d.).
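
    For anyone who wants to redo this experiment, here is a rough Python equivalent of the spreadsheet runs (illustrative only; the function name is mine and results vary from run to run):

    ***

    # Rough equivalent of 250 runs of 1000 flips of the self-correcting coin.
    import random
    import statistics

    def run(n=1000):
        X, p = 0, 0.5
        for i in range(1, n + 1):
            if random.random() < p:   # heads
                X += 1
            p = 1 - X / i             # adjust for the next toss
        return X

    totals = [run() for _ in range(250)]
    print("avg:", statistics.mean(totals))
    print("sd:", statistics.pstdev(totals))
    print("variance:", statistics.pvariance(totals))

    ****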

    ...........................

    I find it interesting that a random walk appears to be balanced exactly by a 1/r field such that it (appears to) always be narrower by a factor of 1/sqrt(3) than a regular distribution.

    Great question.

  • Dr D
    1 decade ago

    I didn't do any theoretical analysis, but here's a simple program which simulates the behavior of the coin.

    ***

    import random

    n = 10        # no. of trials
    X = 0         # initial no. of heads
    p = 0.5       # initial prob of heads

    for i in range(1, n + 1):
        r = random.random()   # a random no. between 0 and 1
        if r < p:             # i.e. heads thrown
            X += 1
        p = 1 - X / i         # the coin adjusts after each toss

    print(X)

    ****

    To simulate a fair coin, all we need to do is to omit the line

    p = 1 - X/i

    Now for n = 10 here are the no. of heads (out of 10) in 20 runs.

    Fair coin: 1,7,4,5,5, 7,5,3,5,6, 4,4,5,6,4, 8,5,4,6,5

    New coin: 5,4,5,5,5, 5,6,4,4,5, 5,5,4,5,5, 4,4,5,6,5

    Notice that the new coin never deviates further than 5±1, while the original coin deviates a lot more.

    Even without any calculations, the new coin clearly has a smaller standard deviation, and its expectation is clearly n/2.

    Obviously as n gets large, both coins tend to n/2 heads, but from our results for small n, the new coin gets there faster.

    So to answer your question - yes it does quicken the convergence to fairness.

    I believe you can compare the new coin to a person trying to balance on a beam. Whenever he starts to fall, he shifts his body weight to the other side to force a balance. The original coin is like someone randomly picking which side to shift his weight. That person has a much stronger chance of falling off.

  • Anonymous
    5 years ago

    Why are you making such an enormous deal about having the words "In God We Trust" on your money? If you find that offensive then the problem is YOU, not Christians -- almost none of whom had anything to do with that motto being printed on our currency. P.S.: only a couple of noteworthy Founding Fathers were not followers of God and His Son Jesus Christ, so do not try to play it out like the men who wrote our Constitution were members of some other religious sect.

  • Anonymous
    1 decade ago

    Let X_n = # of heads in the first n tosses. Then

    P(heads on the (n+1)st toss | X_n = s) = 1 - s/n,

    so

    P(heads on the (n+1)st toss) = sum(P(X_n = s)*(1 - s/n), s = 0..n).

    Therefore

    E[X_(n+1)] - E[X_n] = E[X_(n+1) - X_n]
    = 1*P(heads on the (n+1)st toss) + 0*P(tails on the (n+1)st toss)
    = sum(P(X_n = s)*(1 - s/n), s = 0..n)
    = sum(P(X_n = s), s = 0..n) - (1/n)*sum(s*P(X_n = s), s = 0..n)
    = 1 - (1/n)*E[X_n],

    hence

    E[X_(n+1)] = 1 + (1 - 1/n)*E[X_n].

    Since E[X_1] = 1/2, induction gives E[X_n] = n/2 for all n >= 1, as suggested: if E[X_n] = n/2, then E[X_(n+1)] = 1 + (1 - 1/n)*(n/2) = 1 + (n-1)/2 = (n+1)/2.
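
    A tiny numerical sanity check of that recursion (illustrative Python, not part of the proof):

    ***

    # Iterate E[X_(n+1)] = 1 + (1 - 1/n)*E[X_n] from E[X_1] = 1/2
    # and confirm it stays on n/2.
    E = 0.5
    for n in range(1, 100):
        assert abs(E - n / 2) < 1e-9
        E = 1 + (1 - 1 / n) * E

    ****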

  • 1 decade ago

    It makes sense.

    By the way, you're a f***ing genius.
