**Question:** Let U_{1}, U_{2}, . . . be a sequence of independent uniform (0, 1) random variables, and let

N = min{ n ≥ 2: U_{n} > U_{n-1} }  and  M = min{ n ≥ 1: U_{1} + · · · + U_{n} > 1 }

That is, N is the index of the first uniform random variable that is larger than its immediate predecessor, and M is the number of uniform random variables we need to sum to exceed 1. Surprisingly, N and M have the same probability distribution, and their common mean is e!
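Before diving into the results, the claim itself is easy to sanity-check by simulation. Below is a minimal Python sketch of my own (independent of the code used later in the post) that estimates E[N] and E[M] directly from their definitions; both should land near e ≈ 2.718:

```python
import random

def sample_N():
    # Index of the first uniform that exceeds its immediate predecessor.
    prev = random.random()
    n = 2
    while True:
        cur = random.random()
        if cur > prev:
            return n
        prev = cur
        n += 1

def sample_M():
    # Number of uniforms needed for the running sum to exceed 1.
    total = random.random()
    n = 1
    while total <= 1:
        total += random.random()
        n += 1
    return n

trials = 200_000
mean_N = sum(sample_N() for _ in range(trials)) / trials
mean_M = sum(sample_M() for _ in range(trials)) / trials
print(mean_N, mean_M)
```

With a couple hundred thousand trials both averages settle within a few hundredths of e.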

**Simulation Solution**: I am not going to solve the problem analytically here. We will start off with the known results and then look at some very interesting things that happen in this scenario. The results from the text are as follows:

- P{ N > n } = ^{1}⁄_{n!}
- P{ M(x) > n } = ^{x^n}⁄_{n!}

where M(x) is a generalized version of M, defined as

M(x) = min{ n ≥ 1: U_{1} + · · · + U_{n} > x }, 0 < x ≤ 1
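The tail formula for M(x) comes from the fact that P{U_{1} + · · · + U_{n} ≤ x} = x^n/n! for 0 < x ≤ 1 (the volume of the corner of a simplex). As a quick empirical check, here is a small sketch of my own (the helper name is mine, not from the post's code):

```python
import math
import random

def tail_prob_M(x, n, trials=200_000):
    # Empirical P{M(x) > n}: the first n uniforms still sum to at most x.
    hits = sum(1 for _ in range(trials)
               if sum(random.random() for _ in range(n)) <= x)
    return hits / trials

x, n = 0.7, 3
empirical = tail_prob_M(x, n)
theory = x**n / math.factorial(n)
print(empirical, theory)
```

The two numbers agree to within simulation noise.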

**Distributions of N and M:** Some important things to note.

- Note from the above that the distributions of N and M(x) coincide when x = 1: setting x = 1 in ^{x^n}⁄_{n!} gives ^{1}⁄_{n!}, so M = M(1) and N have the same distribution.
- For x > 1, it's a whole other story: the tail formula above only holds for *0 < x ≤ 1*.
- Since M counts uniform random variables, and the expected value of a uniform random variable on (0, 1) is 0.5, a natural heuristic is that the expected number of uniforms needed to reach a value x > 1 is about ^{x}⁄_{E[U]} = ^{x}⁄_{0.5} = 2x.
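The 2x heuristic is easy to probe empirically. A rough sketch of my own (for finite x the simulated mean should sit slightly above 2x, since the final uniform overshoots the boundary):

```python
import random

def sample_M(x):
    # Number of uniforms needed for the running sum to exceed x.
    total = random.random()
    n = 1
    while total <= x:
        total += random.random()
        n += 1
    return n

for x in (5, 20, 50):
    trials = 40_000
    mean = sum(sample_M(x) for _ in range(trials)) / trials
    print(x, mean, 2 * x)
```

Each simulated mean tracks 2x closely, with a small positive offset that stays roughly constant as x grows.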

**Case 1:** 0 < x ≤ 1

What is being shown below?

- Observe that there are two entities here:
  - The first is the set of red squiggly lines. There are 100 of these lines in total; each line is a set of points, and each point is the average of 100 samples of M(x) for the corresponding x.
  - The second is the dashed line, which represents the theoretical result E[M(x)] = e^{x}.
- It is very clear that for x in (0, 1] the relationship between x and the expected number of uniform random variables is nonlinear; the squiggly lines show that clearly.
- Also note how the variance slowly builds up: there is almost none at the beginning, but it increases steadily with x.
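The dashed theoretical curve is E[M(x)] = e^x, which follows by summing the tail probabilities: E[M(x)] = Σ_{n≥0} P{M(x) > n} = Σ_{n≥0} x^n/n! = e^x. A small Python check of my own (naming is mine, not the post's code):

```python
import math
import random

def sample_M(x):
    # Number of uniforms needed for the running sum to exceed x.
    total = random.random()
    n = 1
    while total <= x:
        total += random.random()
        n += 1
    return n

trials = 100_000
for x in (0.25, 0.5, 1.0):
    mean = sum(sample_M(x) for _ in range(trials)) / trials
    print(f"x={x}: simulated {mean:.3f} vs e^x = {math.exp(x):.3f}")
```

At x = 1 this reproduces the common mean e of both M and N.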

**Case 2:** x > 1

As we have discussed above, when x exceeds 1, the expected number of uniforms needed grows with x roughly as 2x. The variance appears to follow a *log-like* curve. I wasn't able to get an expression for the variance, but the plot is included here.

Here is how the plot has been generated:

- I wrote a Python version of the accumulator to export a million counts for each x ranging from 1 to 100 in steps of 1.
- Then I imported them back into Mathematica to find the mean and the standard deviation of each set of a million counts.
- The Python code is also shown below.
- Since each point in the plot is calculated from a million simulated counts, the mean should be very close to the expected value.
- Looking at a plot similar to the one above, we see the following: the straight line for x > 1 is well described by f[x] = 2 (x − 1) + E, where E is Euler's number.
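The fitted line can be sanity-checked without the full million-sample pipeline. Here is a quick sketch of my own comparing simulated means against f[x] = 2 (x − 1) + e at a few values of x (the function names are mine, not the post's):

```python
import math
import random

def sample_M(x):
    # Number of uniforms needed for the running sum to exceed x.
    total = random.random()
    n = 1
    while total <= x:
        total += random.random()
        n += 1
    return n

trials = 100_000
for x in (2, 10, 40):
    mean = sum(sample_M(x) for _ in range(trials)) / trials
    fitted = 2 * (x - 1) + math.e
    print(f"x={x}: simulated {mean:.3f}, line {fitted:.3f}")
```

The simulated means stay within a few hundredths of the line over this whole range.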

Expected Value of M : E[M]

Variance of M : Var[M]

**Mathematica Code**

```mathematica
AppendTo[$Path, "Put_your_path_here"];
<< utilities`;

ClearAll[minSum];
(* Count how many uniforms are needed for the running sum to exceed x *)
minSum[x_] := Module[{total = x, count = 1, sum = RandomReal[]},
  While[sum <= total, sum += RandomReal[]; count += 1];
  count]
minSum[x_, sampleSize_] := Table[minSum[x], sampleSize]

(* Case 1: 0 < x <= 1, simulation vs. the theoretical curve E^x *)
Module[{data, theory, meanTrack, mergedPlot, range = Range[0.01, 1, 0.01]},
 data = Table[minSum[#, 100] & /@ range, 100];
 data = MapThread[{#1, #2} &, {range, Mean /@ #}] & /@ data;
 meanTrack = ListLinePlot[data, ImageSize -> 788, InterpolationOrder -> 2,
   PlotStyle -> Table[{Red, Thickness[0], Opacity@0.4}, Length@data],
   Mesh -> All, MeshStyle -> {Black, PointSize[0]}];
 theory = Plot[E^x, {x, 0, 1}, PlotStyle -> {Dashed, Black}];
 mergedPlot = Show[meanTrack, theory, AspectRatio -> 1,
   PlotLabel -> stringJoinStyled[{Style["Simulation", Red], Style[" Theory", Black]}],
   ImageSize -> 788, Frame -> True, FrameLabel -> {"x", "M(x)"}];
 Export[StringReplace[NotebookFileName[], ".nb" -> "_x_less_than_1.png"],
  mergedPlot, ImageResolution -> 1000, ImageSize -> 788]]

(* Case 2: x > 1, summarize the CSV files exported by the Python script *)
Export[StringReplace[NotebookFileName[], ".nb" -> ".txt"],
 Parallelize[N /@ {Mean[#], StandardDeviation[#]} & /@
   (Import[#][[1]] & /@ FileNames["*.csv", NotebookDirectory[]])]]

Module[{sortedData = SortBy[ToExpression /@ # & /@
      (StringSplit[StringDelete[#, " "], ","] & /@
        StringSplit[StringDelete[
          Import[StringReplace[NotebookFileName[], ".nb" -> ".txt"]],
          {"{", "}"}], "\n"]), #[[2]] &],
  muPlot, sigmaPlot},
 sortedData = MapThread[Prepend[#1, #2] &, {sortedData, Range[100]}];
 (* columns after the Prepend: {x, mean, standard deviation} *)
 muPlot = ListLinePlot[sortedData[[All, {1, 2}]], ImageSize -> 788,
   Mesh -> All, MeshStyle -> {Blue, PointSize[0.004]}, AspectRatio -> 1,
   Frame -> True, FrameLabel -> {"x", "E(M)"},
   GridLines -> {sortedData[[All, 1]], sortedData[[All, 2]]}];
 sigmaPlot = ListLinePlot[sortedData[[All, {1, 3}]], ImageSize -> 788,
   Mesh -> All, MeshStyle -> {Blue, PointSize[0.004]}, AspectRatio -> 2,
   Frame -> True, FrameLabel -> {"x", "StdDev(M)"},
   GridLines -> {sortedData[[All, 1]], sortedData[[All, 3]]}];
 MapThread[Export[StringReplace[NotebookFileName[],
      ".nb" -> "_x_greater_than_1_" <> #1 <> ".png"], #2,
    ImageResolution -> 500, ImageSize -> 788] &,
  {{"mean", "standardDeviation"}, {muPlot, sigmaPlot}}]]
```

**Python Code**

```python
from mathematica.random_functions import *
from mathematica.lists import *
from mathematica.monitoring import *
from mathematica.sheldon_ross_problems.sheldon_ross_utilities import *
import csv

unknownPath = open('folderPath.txt').readlines()[0]
chapterName = 'sheldon_ross_chapter_03'
problemName = 'sheldon_ross_example_3.28'

# One CSV per integer x in 1..100, each holding a million counts
_x = Range(1, 100)
for __x in _x:
    __xList = []
    __fileName = ConstructFileName(
        [unknownPath, chapterName, problemName, problemName],
        '_1_through_100_' + str(__x) + '.csv')
    TimeTagMessage('Opening file ' + __fileName)
    __file = open(__fileName, 'w')
    TimeTagMessage('Opening writer object')
    __writer = csv.writer(__file, delimiter=',')
    for i in range(1000000):
        _randomSum = RandomReal()
        _randomSumCount = 1
        while _randomSum <= __x:
            _randomSum += RandomReal()
            _randomSumCount += 1
        __xList.append(_randomSumCount)
    TimeTagMessage('Writing row ' + str(__x) + ' to the file')
    __writer.writerow(__xList)
    TimeTagMessage('Closing the file ' + __fileName)
    __file.close()
```

End of the Post
