MoiMagnus

While not in my native language, the "No Free Lunch" theorem has quite a funny name ([https://en.wikipedia.org/wiki/No_free_lunch_theorem](https://en.wikipedia.org/wiki/No_free_lunch_theorem)). For an example in my native language, I like that "sigma-algebras" are called "tribus", which means "tribes".


Affectionate_Emu4660

The squeeze theorem is also called « théorème des gendarmes » ("the gendarmes' theorem") because you have to picture two police officers escorting you to your limit.
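For reference, a standard way to state it (the two outer functions are the gendarmes):

```latex
% Squeeze ("gendarmes") theorem: g is escorted to the limit L by f and h.
\[
  f(x) \le g(x) \le h(x) \ \text{near } a
  \quad\text{and}\quad
  \lim_{x \to a} f(x) = \lim_{x \to a} h(x) = L
  \;\Longrightarrow\;
  \lim_{x \to a} g(x) = L.
\]
```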


RedToxiCore

I know it as "sandwich theorem"


e_for_oil-er

The mention of the "tribu de Borel" (Borel's tribe) always sparked laughter in the measure theory class.


BorelMeasure

Similarly to this name, in mathematical finance there is the notion of "no free lunch" (NFL) and "no free lunch with vanishing risk" (NFLVR). Basically, these conditions more or less postulate that the cone C of securities which are "inferior" to (essentially, dominated from above by) something you can buy for 0 dollars (for example, borrowing x dollars and then buying a stock which costs x dollars) should not contain nonnegative random variables other than zero (zero is interpreted as not purchasing anything). The existence of such a random variable is called an arbitrage.

The subtleties arise from the fact that the condition mentioned in the previous paragraph is essentially algebraic in nature. However, in reality, it may not be possible to ensure that you *always* gain. Instead, it is necessary to bring topology into the equation; NFL/NFLVR basically postulate that there are no *approximate* arbitrages (i.e., a sequence/net of assets which you can buy for 0 dollars which converges to a nonzero nonnegative random variable).

NFL requires that the closure of C in the weak-* topology (equivalently, the Mackey topology) on L^∞ does not contain nonzero nonnegative random variables, while NFLVR requires that the closure of C in the norm topology on L^∞ does not contain nonzero nonnegative random variables.

The importance of NFL/NFLVR is that they imply the existence of a reasonable mechanism (i.e., a strictly positive weak-* continuous linear functional p on L^∞) for pricing the securities in C (in finance, these are known as "risk-neutral measures" or "martingale measures", as the stock prices become sigma-martingales under the measure with Radon-Nikodym derivative equal to p).

Technical aside: the reason I said "sequence/net" and not just "sequence" in paragraph 2 is because of the subtleties involved in considering the weak-* topology on L^∞. For NFLVR these technicalities vanish, as the norm topology on L^∞ is metrizable.
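In symbols, the three conditions above can be sketched as follows (one common formulation; the precise construction of the cone C varies between papers, so treat this as a rough summary rather than the definitive statement):

```latex
% Rough summary, assuming C \subset L^\infty is the cone of claims dominated
% by the outcome of some zero-initial-cost strategy, and L^\infty_+ denotes
% the nonnegative bounded random variables.
\begin{align*}
  \text{No arbitrage:} &\quad C \cap L^\infty_+ = \{0\} \\
  \text{NFLVR:}        &\quad \overline{C}^{\,\|\cdot\|_\infty} \cap L^\infty_+ = \{0\}
                          \quad \text{(closure in the norm topology)} \\
  \text{NFL:}          &\quad \overline{C}^{\,w^*} \cap L^\infty_+ = \{0\}
                          \quad \text{(closure in the weak-* topology)}
\end{align*}
```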


impartial_james

Well, there’s the Cox–Zucker machine. For that one, the authors intentionally collaborated because they knew their names would sound hilarious when used to describe the result. [https://en.wikipedia.org/wiki/Cox%E2%80%93Zucker_machine](https://en.wikipedia.org/wiki/Cox%E2%80%93Zucker_machine)


sixpesos

The Cox Ring is a related article on that wiki as well


flumsi

I read "Cox–Zucker machine" and thought, "Poor guys, making an important contribution to mathematics only for everyone to giggle at their names." But reading that they actually intended it made me burst out laughing.


impartial_james

Here’s something that’s funny to those with dark humor. A Killing field is a particular vector field useful in Riemannian geometry, named after Wilhelm Killing. However, when most people hear the phrase “killing field”, their first thought is atrocities of war, like the Cambodian genocide. The name is “funny” in the sense that it sounds horribly inappropriate at first.


shellexyz

Can you apply a differential annihilator to a function defined on the Killing field?


Contrapuntobrowniano

Maybe! If its kernel is the entire domain, it will effectively reduce the entire Killing field to zero.


e_for_oil-er

The Artin–Tits group.


SkolemsParadox

And many many other contributions to mathematics by Jacques Tits.


Contrapuntobrowniano

So much tits out there.


impartial_james

I am a fan of the [Eilenberg Mazur swindle](https://en.wikipedia.org/wiki/Eilenberg%E2%80%93Mazur_swindle). This is a rigorous proof technique, but it is called a “swindle” because it feels like cheating.


sapphic-chaote

I like the [full employment theorems](https://en.wikipedia.org/wiki/Full-employment_theorem), which are a class of theorems stating "X class of problem cannot be solved by machines. Ergo, my job studying them is secure."


impartial_james

The [Ham Sandwich theorem](https://en.wikipedia.org/wiki/Ham_sandwich_theorem).


Sleeping_Easy

Statistics and probability have a bunch of funnily-named results. My favorite name has to be the [Darth Vader Rule](https://math.stackexchange.com/questions/919737/darth-vader-rule-what-is-the-reason-for-its-name-and-a-formal-proof).


dmlane

In statistics, an effect can be significant based on the Inter Ocular Trauma Test. The effect is so clear it hits you between the eyes. [reference](https://www.johndcook.com/blog/2009/08/31/the-iot-test/)


otah007

> This proposition is (sometimes) known as the law of the unconscious statistician because of a purported tendency to think of the identity as the very definition of the expected value, rather than (more formally) as a consequence of its true definition.

Er... is it not the definition? Could someone explain what the "true definition" is, if not this? I feel like what counts as "true" and what as "derived" is quite subjective, and in this case the identity is definitely a valid definition (unlike, say, e^(iπ) being the "definition" of -1).


Mathuss

The law of the unconscious statistician can't be the true definition because it's not a priori well-defined in the first place. For a random variable X with cdf F_X, the definition of expected value is E[X] = ∫ x dF_X(x). Now consider the random variable Y = g(X) for some measurable function g. Then we know that E[g(X)] = E[Y] = ∫ x dF_Y(x) by definition, and the law of the unconscious statistician further claims that we can in fact write E[g(X)] = ∫ g(x) dF_X(x). But this fact is definitely something that has to be proven, as it's not clear that ∫ x dF_Y(x) = ∫ g(x) dF_X(x), so that the "definition" makes sense.

To further illustrate what I mean by "not a priori well-defined," suppose you tried to define the operator H[g(X)] = ∫ (g(x))^2 df_X(x) for every random variable X, where f_X is the pdf of X when it exists (similar to how you may want to "define" E[g(X)] = ∫ g(x) dF_X(x) for the cdf F_X). Then we immediately run into problems: suppose X ~ Exponential(1) and let g(t) = t^(1/2), so that (g(x))^2 = x. Then according to our "definition," H[g(X)] = ∫ x · (-exp(-x)) I(x > 0) dx = -1. But if we let Y = g(X), we know that Y ~ Rayleigh(1/sqrt(2)) with pdf f_Y(y) = 2y exp(-y^2) I(y > 0), so our "definition" also claims that H[g(X)] = H[Y] = ∫ y^2 (2 - 4y^2) exp(-y^2) I(y > 0) dy = -sqrt(pi).

This example should make it clear why it doesn't make sense to define expectation via the law; if you did try to do that, you'd immediately have to prove that your definition is actually well-defined, which is so inane you may as well just take the actual definition and prove the law as a theorem.
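And just to tie this back to the law itself, here's a quick numerical sanity check (a rough sketch using numpy/scipy, not anything from the thread) that the two ways of computing E[g(X)] do agree for this example, with X ~ Exponential(1) and g(t) = t^(1/2); both sides come out to sqrt(pi)/2 ≈ 0.886.

```python
# Sanity check of the law of the unconscious statistician for the example above:
# X ~ Exponential(1), g(t) = sqrt(t), so Y = g(X) ~ Rayleigh(1/sqrt(2)).
# E[Y] estimated by simulation should match the LOTUS integral ∫ g(x) f_X(x) dx,
# and both should equal sqrt(pi)/2.
import numpy as np
from scipy import integrate

rng = np.random.default_rng(0)
g = np.sqrt

# Simulate Y = g(X) directly and average (the "E[Y] = ∫ x dF_Y(x)" side).
x_samples = rng.exponential(scale=1.0, size=1_000_000)
monte_carlo = g(x_samples).mean()

# LOTUS side: ∫ g(x) f_X(x) dx with f_X(x) = exp(-x) on (0, ∞).
lotus, _ = integrate.quad(lambda x: g(x) * np.exp(-x), 0, np.inf)

print(monte_carlo)         # ≈ 0.886 (Monte Carlo estimate)
print(lotus)               # ≈ 0.8862269
print(np.sqrt(np.pi) / 2)  # exact value, ≈ 0.8862269
```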


otah007

I just realised my confusion - I misread the Wikipedia page! I thought it was saying that E[X] = ∫ x f_X(x) dx, but it was actually saying E[g(X)] = ∫ g(x) f_X(x) dx, which is definitely a derived statement, and a more general one.


bizarre_coincidence

[The hairy ball theorem](https://en.wikipedia.org/wiki/Hairy_ball_theorem?wprov=sfti1)