Gaussian Process

240 posts

@GaussianProcess

algorithmic trading

Joined July 2020
661 Following · 2.7K Followers
Gaussian Process @GaussianProcess
@danrobinson @danielvf Yep both are almost surely 0.5 in the limit (but also I think many claimed solutions for interpretation B are bad, because they would imply that the expectation is 0.5 at finite steps, which is wrong)
Dan Robinson @danrobinson
@GaussianProcess @danielvf Lol at your interpretation C and touché, but don’t both your interpretations A and B give an answer of 0.5? I guess if I interpret “what eventually happens” to mean N->infinity
Gaussian Process @GaussianProcess
@danrobinson @danielvf Vaguely similar energy to the question 'what is the average bitcoin block interval?' If you sample a random interval, it's 10 minutes. If you sample a random point in time, the average length of the interval you land in is 20 minutes.
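A quick simulation of this inspection-paradox claim (a sketch assuming exponential, i.e. memoryless, block intervals with a 10-minute mean):

```python
import random

random.seed(0)
MEAN = 10.0  # average block interval in minutes

# Draw many exponential block intervals (memoryless block arrivals).
intervals = [random.expovariate(1 / MEAN) for _ in range(200_000)]

# Sampling a random *interval*: plain average, about 10 minutes.
avg_interval = sum(intervals) / len(intervals)

# Sampling a random *point in time*: an interval of length L is hit with
# probability proportional to L, so weight each interval by its length.
total = sum(intervals)
avg_seen = sum(x * x for x in intervals) / total  # length-biased mean, about 20

print(f"random interval:      {avg_interval:.1f} min")
print(f"random point in time: {avg_seen:.1f} min")
```

The length-biased mean is E[X^2]/E[X], which for an exponential is exactly twice the plain mean.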
Gaussian Process @GaussianProcess
The answer depends on your interpretation.

Interpretation A: We sample the fraction after each child is born. Then we have a series where each element is independently boy or girl, so by symmetry the expected fraction of girls is 0.5 at each step.

Interpretation B: We sample the fraction after each family is finished. Suppose N families have finished. Then there are N girls (each family stops at its first girl) and some random number B of boys. The expected fraction of girls is E[N/(N+B)] > N/(N+E[B]) = 0.5 by Jensen's inequality, since x -> N/(N+x) is convex and E[B] = N. So at any finite step, the expected fraction is more than 0.5! By the law of large numbers, N/(N+B) converges almost surely to 0.5 as N tends to infinity.

Interpretation C: We start with a fixed population and ask what happens in the long run. In fact, the population dies out almost surely, so the long-run fraction isn't even defined. Let p be the probability that the male line of a single man eventually dies out, and let b_i be the probability that a man has i boys. Conditioning on his sons, p = b_0 + b_1*p + b_2*p^2 + ... The RHS is convex in p, so this equation has at most two solutions in [0, 1], and p = 1 is always one of them. Here each child is independently a boy with probability 1/2 and families stop at the first girl, so b_i = 2^-(i+1) and the RHS equals 1/(2-p); then p = 1/(2-p) gives (p-1)^2 = 0, so p = 1 is the only solution, and every line dies out.
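Interpretation B is easy to check numerically (a sketch; families are simulated directly, each stopping at its first girl):

```python
import random

random.seed(1)

def girl_fraction(n_families: int) -> float:
    """Fraction of girls once n_families families have each stopped at their first girl."""
    boys = 0
    for _ in range(n_families):
        while random.random() < 0.5:  # each birth: boy w.p. 1/2, stop at first girl
            boys += 1
    return n_families / (n_families + boys)  # every finished family has exactly one girl

def expected_fraction(n_families: int, trials: int = 20_000) -> float:
    return sum(girl_fraction(n_families) for _ in range(trials)) / trials

# At finite N the expectation exceeds 0.5 (for N = 1 it is ln 2, about 0.693),
# while a single large sample is close to 0.5 by the law of large numbers.
print(expected_fraction(1))
print(girl_fraction(100_000))
```

For N = 1 the fraction is 1/(1+B) with B geometric, and E[1/(1+B)] = ln 2, visibly above 0.5.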
erisa @erisaonX
This is the most probability question of all the probability questions I’ve ever come across
Gaussian Process @GaussianProcess
@maxresnick I believe the rule is, within each flashblock: 1. group by EOA, and within each group order by nonce; 2. order the groups by min(priority fee).
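A sketch of that claimed rule in Python. The tuple fields and the descending-fee direction are my assumptions for illustration, not confirmed details of Base's scheduler:

```python
# Hypothetical tx records: (eoa, nonce, priority_fee). Field names are
# illustrative, not Base's actual data model.
txs = [
    ("0xA", 2, 30), ("0xA", 1, 50),
    ("0xB", 7, 40),
    ("0xC", 3, 10), ("0xC", 4, 90),
]

def order_flashblock(txs):
    # 1. Group by EOA; within each group, order by ascending nonce.
    groups = {}
    for eoa, nonce, fee in txs:
        groups.setdefault(eoa, []).append((nonce, fee))
    for g in groups.values():
        g.sort()
    # 2. Order groups by min(priority fee), highest minimum first (assumed).
    ordered = sorted(groups.items(),
                     key=lambda kv: min(f for _, f in kv[1]),
                     reverse=True)
    return [(eoa, n, f) for eoa, g in ordered for n, f in g]

print(order_flashblock(txs))
```

Note that grouping by EOA keeps nonces valid even when a later-nonce tx pays a higher fee.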
Max Resnick @MaxResnick
As I was exploring the Base data, I found something interesting. There should only be 10 flashblocks (200ms each) per 2-second block. But if everything is priority ordered, why are there 13 spikes on this graph? Is this a bug in the Base scheduler?
[graph attached]
Gaussian Process @GaussianProcess
@angeris A similar proof, written more combinatorially: pick a team of size at most pn uniformly from all possible such teams. Let X_i be the indicator that person i is selected. Then P(X_i = 1) <= p, so log(LHS) = H(X_1, ..., X_n) <= H(X_1) + ... + H(X_n) <= nH(p) = log(RHS).
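The inequality being proved here is sum_{k <= pn} C(n, k) <= 2^(nH(p)) with H the binary entropy; a quick numeric check:

```python
from math import comb, log2

def binary_entropy(p: float) -> float:
    return -p * log2(p) - (1 - p) * log2(1 - p)

def lhs(n: int, p: float) -> int:
    # Number of teams of size at most p*n.
    return sum(comb(n, k) for k in range(int(p * n) + 1))

def rhs(n: int, p: float) -> float:
    # 2^{n H(p)}.
    return 2 ** (n * binary_entropy(p))

for n in (20, 100, 400):
    print(n, lhs(n, 0.3), "<=", rhs(n, 0.3))
```

(The entropy step H(X_i) <= H(p) uses p <= 1/2, as in the blog post's setting.)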
guille @angeris
today's blog post actually fits in a single tweet image! a lovely little counting inequality (that has some applications to data availability sampling :)
Gaussian Process @GaussianProcess
@angeris In fact, for small eps you can do slightly better than the random construction via Reed-Solomon codes, see e.g. arxiv.org/pdf/1206.5725. So one angle for intuition here: good error-correcting codes exist => you can find lots of nearly orthogonal vectors.
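The random construction is easy to see empirically: normalized random sign vectors in d dimensions have pairwise inner products concentrated at scale 1/sqrt(d) (a sketch; the 0.35 threshold in the comment is just a loose sanity bound, not a tuned eps):

```python
import random

random.seed(6)
d, M = 400, 200  # dimension, and number of vectors (M can grow ~exponentially in d)

# Random sign vectors with entries +-1; we normalize inner products by d.
vecs = [[random.choice((-1.0, 1.0)) for _ in range(d)] for _ in range(M)]

def inner(u, v):
    return sum(a * b for a, b in zip(u, v)) / d  # normalized inner product

worst = max(abs(inner(vecs[i], vecs[j]))
            for i in range(M) for j in range(i + 1, M))
print(f"worst pairwise |<u, v>|: {worst:.3f}")  # roughly sqrt(2 log M / d), well below 0.35
```

Concentration gives a worst pair of order sqrt(log M / d), so exponentially many vectors fit before any pair gets close to parallel.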
guille @angeris
so, fun fact, turns out you can fit exponentially many (normalized) vectors in a list such that the pairwise inner product of any two distinct vectors is ≤ eps
Gaussian Process @GaussianProcess
@Galois_Capital Yep, and note that your recurrence telescopes. Another way to phrase this is to let I_k = 1 if a loop is formed when k noodles remain, and 0 otherwise. As above, E[I_k] = 1 / (2k-1). Then by linearity of expectation, E[loops] = E[I_100 + ... + I_1] = 1/199 + ... + 1
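A simulation of the 100-noodle version matches the sum 1/199 + ... + 1 (a sketch using a union-find over strands; tying two ends of the same strand closes a loop):

```python
import random

random.seed(2)

def count_loops(n: int) -> int:
    """Tie random pairs of ends of n noodles; return the number of closed loops."""
    ends = list(range(2 * n))        # noodle i owns ends 2i and 2i+1
    random.shuffle(ends)             # a uniformly random perfect matching of ends
    parent = list(range(n))          # union-find over strands

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    loops = 0
    for i in range(0, 2 * n, 2):
        a, b = find(ends[i] // 2), find(ends[i + 1] // 2)
        if a == b:
            loops += 1               # tied a strand to itself: a loop closes
        else:
            parent[a] = b            # joined two strands into one longer strand
    return loops

trials = 10_000
est = sum(count_loops(100) for _ in range(trials)) / trials
exact = sum(1 / (2 * k - 1) for k in range(1, 101))  # 1 + 1/3 + ... + 1/199
print(est, exact)  # both about 3.28
```

Shuffling the 2n ends and pairing them off in order gives exactly the uniform random matching described in the thread.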
Galois Kevin @Galois_Capital
The key is that after arbitrarily grabbing one end, the probability of grabbing the same noodle's other end with the next grab is 1/199. If that happens, the problem reduces to 1 completed loop set aside plus the 99-noodle case of the same problem. On the other hand, there is a 198/199 chance of grabbing a different noodle's end with the second grab, in which case the two noodles merge and the problem again reduces to the 99-noodle case. So E[loops_k] = (1/(2k-1)) * (1 + E[loops_{k-1}]) + ((2k-2)/(2k-1)) * E[loops_{k-1}], with E[loops_1] = 1 as the base case. Then you just unroll this recurrence for k = 100. @GaussianProcess what do you think?
Dmitrii Kovanikov @ChShersh

Y’all complain about LeetCode interviews but omg have you seen Quant interview questions?? Wth is this bro 😭

Dan Robinson @danrobinson
This impressive paper is the biggest contribution to modeling LVR since 2023. It's the first paper to model LVR for uniform block times (which almost all chains now have). This means that switching from PoW to PoS may have reduced LVR by ~17%, just due to deterministic block times!
Alex Nezlobin @0x94305

@MartinTassy @jason_of_cs @ciamac @Tim_Roughgarden @alz_zyd_ @_Dave__White_ Turns out that the constant we were looking for includes the Riemann zeta function evaluated at 1/2. I feel like we are getting close to connecting AMM research to the Riemann hypothesis, which is pretty impressive given that it all started with xy=k! arxiv.org/pdf/2505.05113

Gaussian Process @GaussianProcess
@danrobinson @ksrini_ @ThogardPvP @phildaian As an aside, someone's feelings on short block times should match their feelings on uniform block times, because f(1) + f(1) vs f(2) is the same comparison as f(1) + f(1) vs f(2) + f(0), given f(0) = 0.
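A tiny numeric illustration, using stylized shapes f(t) = t^2, t, and sqrt(t) for MEV as a function of block time (illustrative curves, not calibrated; all satisfy f(0) = 0, which is what makes the two comparisons identical):

```python
from math import sqrt

# Total MEV over 2 seconds: two 1s blocks vs one 2s block, under three
# stylized MEV-vs-block-time curves.
shapes = {
    "convex (t^2)": lambda t: t * t,
    "linear (t)": lambda t: t,
    "concave (sqrt t)": lambda t: sqrt(t),
}

for name, f in shapes.items():
    two_short = f(1) + f(1)
    one_long = f(2) + f(0)  # f(0) = 0, so this is just f(2)
    print(f"{name:17s} two short = {two_short:.3f}, one long = {one_long:.3f}")
```

Convex f favors short blocks, concave f favors long blocks, and linear f is indifferent.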
Gaussian Process @GaussianProcess
@danrobinson @ksrini_ @ThogardPvP @phildaian In the screenshot I'm replying to a thread where the assumed model is that MEV is proportional to sqrt(block time): x.com/0xdoug/status/… In reality I think it could be convex (as in academic LVR models), ~linear (CPMM with no fee), or concave (monopolist arbitrageur).
Doug Colkitt @0xdoug

1/ Toy model showing that MEV extraction from ordinary users will be slightly worse in post-Merge PoS compared to PoW...

Gaussian Process @GaussianProcess
@libevm @MetaMask Note that using logs in this way isn't very safe; it would be quite easy for an attacker to generate false positives, see e.g. medium.com/@naterush1997/eth-goes-bloom-filling-up-ethereums-bloom-filters-68d4ce237009
tensorfish @tensorfish
1/6 Think @MetaMask's new smart transaction feature is saving you from getting sandwiched? Think again, there's a new searcher in town, and they're serving up blind sandwiches, with fat profit margins. 🧵👇
Gaussian Process @GaussianProcess
@MevRefund I think mev blocker only keeps the signature private and reveals the complete unsigned tx. This stops an attacker from including a mev blocker tx in a bundle (e.g. sandwiching it), but nothing stops an attacker from sending their own competing tx, so no leak necessary here.
MevRefund @MevRefund
The dark forest just got a lot darker. A few weeks ago, 0x991 frontran a seemingly private shezmu hack tx for 250 Eth. Today the same bot frontran a private mev blocker tx, also for 250 Eth. Someone's abusing private order flow.
Gaussian Process @GaussianProcess
Yet another solution, starting from the following well-known facts:
(1) If X ~ Gamma(k) and Y ~ Gamma(n), then X/(X+Y) ~ Beta(k, n), and X/(X+Y) is independent of X+Y.
(2) Beta(k, n) has the same distribution as V_(k), the kth smallest of k+n-1 uniform r.v.s (uniform order statistics).
(3) Gamma(k) ~ sum of k iid Exp(1).
(4) e^(-Exp(1)) ~ Unif(0, 1) (probability integral transform).
(3) + (4) imply that e^(-Gamma(m)) ~ U_1...U_m for iid uniform U_i.
(1) + (2) imply that X = (X+Y) * X/(X+Y) ~ Gamma(k+n) * V_(k).
Combining, U_1...U_k ~ (U_1...U_{k+n})^(V_(k)).
Setting k = n = 1, we recover the original problem: U_1 ~ (U_1 U_2)^(V_(1)).
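The k = n = 1 case can be sanity-checked empirically (a sketch comparing the empirical CDF of (U1*U2)^Z against Uniform(0, 1)):

```python
import random

random.seed(3)

N = 200_000
# Samples of (U1 * U2) ** Z with U1, U2, Z iid Uniform(0, 1).
samples = sorted((random.random() * random.random()) ** random.random()
                 for _ in range(N))

# Kolmogorov-Smirnov-style distance between the empirical CDF and the
# Uniform(0, 1) CDF, which is just F(x) = x.
max_gap = max(abs(x - (i + 1) / N) for i, x in enumerate(samples))
print(f"max CDF gap: {max_gap:.4f}")  # O(1/sqrt(N)) if the law really is uniform
```

If the distribution were anything other than uniform, the gap would stay bounded away from zero as N grows.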
Grant Sanderson @3blue1brown
Here's the challenge mode for all you math whizzes. Sample three numbers x, y, z uniformly at random in [0, 1], and compute (xy)^z. What distribution describes this result? Answer: It's uniform! I know how to prove it, but haven't yet found the "aha" style explanation where it feels expected or visualizable. If any of you have one, please send it my way and I'll strongly consider making a video on it. h/t @alonamit for this one.
Grant Sanderson @3blue1brown

On a recent trip to the UK, @standupmaths shared this delightful little probability fact with me and asked if I might animate an explanation for a video of his. It's a fun one! See his full video with shenanigans relating this to dice and more. youtu.be/ga9Qk38FaHM

Gaussian Process @GaussianProcess
A nice generalization that makes what's going on clearer. Imagine radioactive sludge decaying (i.e. a Poisson process). Repeat the following: pick an undecayed particle i, and wait until it decays at time T_i. We retain a uniform r.v. fraction of particles at each step, so the fraction of particles undecayed after n steps is U_1...U_n.

We can also work 'backwards'. If we know that the nth particle decayed at time T_n, then we can generate T_1, ..., T_{n-1} by picking n-1 uniform r.v.s V_i and setting T_i = V_(i) * T_n, where V_(i) is the ith smallest of the V_i.

If the fraction of undecayed particles at time t is k, then the fraction of undecayed particles at time ct is k^c. Then the fraction of undecayed particles at time T_m is (U_1...U_n)^(V_(m)). So we must have that (U_1...U_n)^(V_(m)) ~ U_1...U_m, and in particular for n = 2 and m = 1 we get (U_1 * U_2)^(V_(1)) ~ U_1.

As an aside, V_(m) has distribution Beta(m, n-m), and -log(U_1...U_n) has distribution Gamma(n), so this is equivalent to Beta(m, n-m) * Gamma(n) ~ Gamma(m), a more well-known fact.
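The closing identity Beta(m, n-m) * Gamma(n) ~ Gamma(m) can be checked by simulation (a sketch comparing sample quantiles for m = 2, n = 5):

```python
import random

random.seed(5)
m, n, N = 2, 5, 100_000

# Left: independent Beta(m, n-m) times Gamma(n). Right: Gamma(m) directly.
left = sorted(random.betavariate(m, n - m) * random.gammavariate(n, 1.0)
              for _ in range(N))
right = sorted(random.gammavariate(m, 1.0) for _ in range(N))

# Compare a few quantiles of the two samples; they should line up.
for q in (0.25, 0.5, 0.75, 0.95):
    i = int(q * N)
    print(f"q={q}: {left[i]:.3f} vs {right[i]:.3f}")
```

Independence of the Beta and Gamma factors is exactly the independence of X/(X+Y) and X+Y from fact (1).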
Gaussian Process @GaussianProcess
@MihaiCNica @3blue1brown I addressed this in my last few sentences above. Consider X and Y, i.i.d. uniform random variables independent of U, V, W. XY and UV have the same distribution and are both independent of W, so (UV)^W and (XY)^W have the same distribution, and X, Y, W are i.i.d.
Mihai Nica @MihaiCNica
@GaussianProcess @3blue1brown I originally thought this exact proof, but I don't think it works because of dependence. Even though W is independent of the product, UV, it is not enough because W is dependent on (U,V). So they are not all independent!
Gaussian Process @GaussianProcess
@danrobinson Infinite money glitch: If you think E[EUR per USD tomorrow] = EUR per USD today, then E[USD per EUR tomorrow] > USD per EUR today
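The "glitch" is just Jensen's inequality applied to x -> 1/x; a two-state toy example (the 0.8/1.2 rates are illustrative):

```python
# Suppose EUR/USD is 1.0 today and tomorrow it is 0.8 or 1.2 with equal probability.
rates = [0.8, 1.2]

eur_per_usd = sum(rates) / len(rates)                 # E[EUR per USD] = 1.0
usd_per_eur = sum(1 / r for r in rates) / len(rates)  # E[USD per EUR]

print(eur_per_usd)  # 1.0: EUR-per-USD is a martingale in this toy model
print(usd_per_eur)  # about 1.042 > 1, by Jensen applied to the convex map x -> 1/x
```

Both rates can't be martingales at once: if one is, the reciprocal's expectation strictly exceeds today's value (unless the rate is deterministic).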
Dan Robinson @danrobinson
Suppose 1 EUR = 1 USD today. There are two prediction markets on whether 1 EUR > 1 USD in a month: one denominated in USD, one in EUR. If that probability is 50%, you might naïvely expect the first to trade at 0.5 USD and the second at 0.5 EUR. But then there’d be an arbitrage.
Dan Robinson @danrobinson
The answer is D. Depending on the event, price could be anywhere from $0 to $1. The reason is risk. Risk is often misunderstood. It has nothing to do with inefficiency or irrationality. In fact, there’d be an arbitrage if price always = probability. Here’s my explanation🧵
Dan Robinson @danrobinson

Suppose the probability of some event happening is 75%, and everyone knows that. What is the price of a prediction market share that resolves to $1 if that event happens and $0 if it doesn't? Assume 0% interest rates, trustworthy oracle, and perfectly efficient markets.
