Questions tagged [martingale]

In probability theory, a martingale is a model of a fair game: knowledge of past events never helps predict the mean of future winnings, because only the current value matters.

A martingale is a sequence of random variables (i.e., a stochastic process) for which, at a particular time in the realized sequence, the expectation of the next value in the sequence is equal to the present observed value even given knowledge of all prior observed values.
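A minimal Monte Carlo illustration of this defining property, using a fair $\pm 1$ coin-flip walk (the walk is just a standard example here, not part of the tag definition):

```python
import random
from collections import defaultdict

def estimate_conditional_means(n, trials, seed=0):
    """Simulate many fair +/-1 random walks for n+1 steps and estimate
    E[X_{n+1} | X_n = x] for each level x actually observed at time n.
    The martingale property says each estimate should sit near x itself."""
    rng = random.Random(seed)
    sums = defaultdict(float)
    counts = defaultdict(int)
    for _ in range(trials):
        x = 0
        for _ in range(n):
            x += rng.choice((-1, 1))
        x_next = x + rng.choice((-1, 1))   # one more fair step
        sums[x] += x_next
        counts[x] += 1
    return {x: sums[x] / counts[x] for x in sums}
```

Running this with a large number of trials, the estimated conditional mean at each level stays close to that level, with no drift from the past.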

67 questions
24 votes • 4 answers

The magic money tree problem

I thought of this problem in the shower; it was inspired by investment strategies. Let's say there was a magic money tree. Every day, you can offer an amount of money to the money tree and it will either triple it, or destroy it with 50/50…
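The all-in strategy makes the tension in this question explicit: expected wealth grows like $1.5^n$ while the chance of ending with anything at all vanishes. A minimal exact computation (the all-in strategy is one choice among many):

```python
def all_in_outcomes(stake, days):
    """Offer the full bankroll to the tree every day for `days` days.
    Each day it is tripled (prob 1/2) or destroyed (prob 1/2), so after
    `days` days you hold 3**days * stake with prob 2**-days, else 0.
    Returns (expected_wealth, probability_of_ruin)."""
    p_survive = 0.5 ** days
    expected = (3 ** days) * stake * p_survive   # equals 1.5**days * stake
    return expected, 1 - p_survive
```

For 10 days the expectation is about $57.7$ times the stake, yet ruin occurs with probability $1 - 2^{-10} \approx 0.999$.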
9 votes • 2 answers

Reference Request: Book on Unit Root Theory

In trying to do time series analysis, I regularly stumble upon unit root and cointegration tests. The design of most of these tests is based on a null of a unit root (for both linear and non-linear models) and the statistic's distribution is…
9 votes • 1 answer

Semi-martingale vs. martingale. What is the difference?

Can someone please explain (preferably in layman's terms) the difference between semi-martingales and martingales? I have found the following sentence on Wikipedia: In probability theory, a real-valued process X is called a semimartingale if…
abu • 371 • 3 • 11
6 votes • 1 answer

Distribution of $\frac{1}{1+X}$ if $X$ is Lognormal

Suppose $Z \sim \mathcal{N}(0,1)$. Suppose $X$ is a lognormally distributed random variable, defined as $X := X_0 e^{-0.5\sigma^2+\sigma Z}$; in other words, $X$ is log-normal with $\mathbb{E}[X]=X_0$. Suppose we are interested in the variable of…
6 votes • 2 answers

Prove that a simple random walk is a martingale

Note that $a$ has a mean of 0. My approach: $$X_t=X_{t-1}+a_t$$ $$E[X_{t+1}\mid X_1 + \dots+X_{t-1}]$$ $$=E[X_{t-1}+2a\mid X_1 + \dots+X_{t-1}]$$ $$=E[X_{t-1}\mid X_1 + \dots+X_{t-1}]+E[2a\mid X_1 + \dots+X_{t-1}]$$ $$=E[X_{t-1}\mid X_1 +…
GarlicSTAT • 179 • 1 • 9
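A cleaner route than the excerpt's attempt — conditioning on the variables $X_1, \dots, X_t$ themselves, and assuming the increments $a_t$ are i.i.d. with mean $0$ and independent of the past — is the standard one-liner:

```latex
\begin{align*}
E[X_{t+1} \mid X_1, \dots, X_t]
  &= E[X_t + a_{t+1} \mid X_1, \dots, X_t] \\
  &= X_t + E[a_{t+1}]
     && \text{($a_{t+1}$ independent of the past)} \\
  &= X_t,
     && \text{since } E[a_{t+1}] = 0.
\end{align*}
```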
6 votes • 1 answer

Cox PH linearity assumption: reading martingale residual plots

According to a lot of resources about the Cox PH model, continuous numeric variables should be tested for the linearity assumption by plotting the martingale residuals. In R, you can use survminer::ggcoxfunctional() to easily plot these residuals for the…
Dan Chaltiel • 1,089 • 12 • 25
6 votes • 1 answer

Martingale process

Let $\zeta(t)$ be a process with independent increments and $M(t)=E(\exp(\zeta(t))) < \infty$, show that $M(t)^{-1}\exp(\zeta(t))$ is a martingale. So what I need to show is $$E(M(t)^{-1}\exp(\zeta(t))\mid \mathcal{F}_s)= M(s)^{-1}\exp(\zeta(s))$$ What I've…
Parinn • 73 • 5
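A sketch of the standard argument, assuming enough integrability that the identity $M(t) = M(s)\,E\big[e^{\zeta(t)-\zeta(s)}\big]$ (which follows from the independent increments) can be used:

```latex
\begin{align*}
E\left[M(t)^{-1} e^{\zeta(t)} \mid \mathcal{F}_s\right]
  &= M(t)^{-1} e^{\zeta(s)}\, E\left[e^{\zeta(t)-\zeta(s)}\right]
     && \text{(independent increments)} \\
  &= M(t)^{-1} e^{\zeta(s)} \cdot \frac{M(t)}{M(s)}
   = M(s)^{-1} e^{\zeta(s)}.
\end{align*}
```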
6 votes • 1 answer

How to compute expectation of square of Riemann integral of a random variable?

How does one compute $E[(\int_0^T W_s ds)^2]$ where $(W_t)_{t \in [0,T]}$ is standard Brownian motion in $(\Omega, \mathscr F, \mathbb P)$? Apparently proving $$\int_0^T W_s ds = \int_0^T (T-s) dW_s \tag{*}$$ might need to assume that $E[(\int_0^T…
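For reference, the Fubini route gives the value directly (a sketch; interchanging the expectation and the integrals needs justification):

```latex
E\left[\Big(\int_0^T W_s\,ds\Big)^{2}\right]
  = \int_0^T \int_0^T E[W_s W_u]\,ds\,du
  = \int_0^T \int_0^T \min(s,u)\,ds\,du
  = \frac{T^3}{3},
```

which matches the Itô-isometry computation via $(*)$: $\int_0^T (T-s)^2\,ds = T^3/3$.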
6 votes • 3 answers

Power martingales for change detection: M goes to zero?

I'm trying to apply the power martingale framework by [Vovk et al., 2003] to change detection in unlabeled data streams, just like in [Ho and Wechsler, 2007]. The basic idea involves using a power martingale of the form $$M_n^{(\epsilon)} :=…
snikolenko • 61 • 1
6 votes • 4 answers

Is it a valid algorithm to win at the casino roulette?

I would like to try the following algorithm in order to win at roulette: Be an observer until there are 3 same-parity numbers in a row ($0$ has no defined parity in this context). Once 3 numbers with the same parity in a row have been observed:…
0x90 • 687 • 1 • 5 • 17
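Since spins are independent, waiting for 3 same-parity results in a row cannot change any bet's expected value. A minimal exact computation for the bounded double-after-loss variant of such a scheme, assuming a European wheel (win probability 18/37 per even-money bet), shows the expectation is negative for every cap on the number of doublings:

```python
from fractions import Fraction

def doubling_ev(max_doublings):
    """Exact expected profit (in units of the initial stake) of a
    double-after-loss scheme on an even-money European-roulette bet:
    win probability 18/37 per spin, spins independent, stop after the
    first win or after `max_doublings` consecutive losses."""
    q = Fraction(19, 37)             # probability of losing one spin
    p_bust = q ** max_doublings      # lose every spin in the sequence
    bust_loss = 2 ** max_doublings - 1   # sum of all doubled stakes
    return (1 - p_bust) * 1 - p_bust * bust_loss   # = 1 - (38/37)**k
```

The closed form $1 - (38/37)^k$ is strictly negative for every $k \ge 1$: the frequent small wins are exactly outweighed by the rare catastrophic loss, and then some.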
5 votes • 3 answers

Conditional expectation of random variables defined off of each other

First of all, when we say that $X_n \sim \text{Unif}(0,X_{n-1})$, what does that mean, rigorously? Does it mean that for every $\omega \in \Omega$, $X_n(\omega)\sim \text{Unif}(0,X_{n-1}(\omega))$? This doesn't make much sense as $X_n(\omega)$ is a…
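One rigorous construction (a sketch, one choice among several): take i.i.d. $U_n \sim \mathrm{Unif}(0,1)$ and set $X_n = U_n X_{n-1}$, so that conditionally on $X_{n-1}$ the variable $X_n$ is uniform on $(0, X_{n-1})$, $E[X_n \mid X_{n-1}] = X_{n-1}/2$, and $2^n X_n$ is a martingale:

```python
import random

def sample_chain(x0, n, rng):
    """X_k = U_k * X_{k-1} with U_k ~ Unif(0,1), so conditionally on
    X_{k-1} the next value X_k is uniform on (0, X_{k-1})."""
    x = x0
    for _ in range(n):
        x *= rng.random()
    return x

def mean_scaled(x0, n, trials, seed=0):
    """Monte Carlo estimate of E[2**n * X_n]; the martingale property
    of 2**n * X_n says this should stay near x0 for every n."""
    rng = random.Random(seed)
    return sum(2 ** n * sample_chain(x0, n, rng) for _ in range(trials)) / trials
```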
5 votes • 1 answer

ABRACADABRA Problem

As a complement to this answer for those not familiar with martingales. What is the expected number of keystrokes (or "time") it would take a monkey to type the string $\small \text{ABRACADABRA}$? Step-by-step intuitive solution light on math…
Antoni Parellada • 23,430 • 15 • 100 • 197
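The martingale ("fair casino") answer can be computed mechanically: the expected time is the sum of $26^{k}$ over every length-$k$ prefix of the word that is also a suffix, including the word itself. A small sketch:

```python
def expected_typing_time(word, alphabet_size=26):
    """Expected keystrokes for a uniformly random typist to first
    produce `word`, via the martingale / fair-casino argument:
    sum alphabet_size**k over every length-k prefix of `word` that
    is also a suffix (the whole word always counts)."""
    total = 0
    for k in range(1, len(word) + 1):
        if word[:k] == word[-k:]:
            total += alphabet_size ** k
    return total
```

For "ABRACADABRA" the self-overlapping prefixes are "A", "ABRA", and the full word, giving $26^{11} + 26^{4} + 26$; the same function with a 2-letter alphabet reproduces the classic answer of 6 flips for "heads, heads".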
5 votes • 2 answers

Martingales: Why must expected posterior equal prior?

For a posterior distribution to be plausible in the Bayesian sense (Bayes' Plausible), it is said that: $\mathbb{E}(\mu_{t+1} | \mu_t) = \mu_t$ where $\mu_t$ is the posterior distribution at time $t$, and consequently the prior at $t+1$ and…
Fatsho • 331 • 2 • 6
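Writing $\mu_t = E[\theta \mid \mathcal{I}_t]$ for the posterior mean of the state $\theta$ given the information $\mathcal{I}_t$ held at time $t$ (notation assumed here, not taken from the excerpt), the identity is the tower property of conditional expectation, since $\mathcal{I}_t \subseteq \mathcal{I}_{t+1}$:

```latex
E[\mu_{t+1} \mid \mathcal{I}_t]
  = E\big[\,E[\theta \mid \mathcal{I}_{t+1}] \,\big|\, \mathcal{I}_t\big]
  = E[\theta \mid \mathcal{I}_t]
  = \mu_t .
```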
5 votes • 2 answers

Finding $b$ such that $e^{5B_t - bt}$ is a martingale

I have $X_t = e^{5B_t}$, where $B_t$ is Brownian motion at time $t$, and $M_t = X_t \cdot e^{-bt}$. I need to find a value of $b$ such that $M_t$ is a martingale. I am encountering difficulty, however. $$\mathbb{E}[ e^{5B_t}e^{-bt} \mid \mathcal{F}_s]…
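The computation closes with the Gaussian moment generating function $E[e^{\lambda Z}] = e^{\lambda^2 \sigma^2/2}$ for $Z \sim \mathcal{N}(0,\sigma^2)$, applied to the increment $B_t - B_s$ (a sketch of the standard argument):

```latex
\mathbb{E}\left[e^{5B_t - bt} \mid \mathcal{F}_s\right]
  = e^{5B_s - bt}\, \mathbb{E}\left[e^{5(B_t - B_s)}\right]
  = e^{5B_s - bt}\, e^{\frac{25}{2}(t-s)},
```

which equals $e^{5B_s - bs}$ for all $s \le t$ exactly when $b = \tfrac{25}{2}$.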
4 votes • 1 answer

Help to understand martingale example from Billingsley

I am studying Billingsley's section 35 about martingales and I have some difficulties understanding one of the examples. This is the example 35.4. Suppose we have a measurable space $(\Omega,\mathcal F)$ and let $Q$ and $P$ be probability measures…
Chaos • 421 • 3 • 12