You won't remember all my Erdős problems
April 19, 2024 7:34 PM

A database of 589 math problems posed by Paul Erdős, mostly in combinatorial geometry and number theory, only 159 of which have been solved. Get to work!
posted by escabeche (15 comments total) 14 users marked this as a favorite
 
Prove that:
1 + 1 = 3, for large values of 1.
posted by neuron at 10:03 PM on April 19 [3 favorites]


Let A ⊆ ℝ² be a set of n points with minimum distance equal to 1, chosen to minimise the diameter of A. If n is sufficiently large then must there be three points in A which form an equilateral triangle of size 1?


I got 99 problems and this is one of 'em.
posted by chavenet at 1:03 AM on April 20 [3 favorites]


I'd feel incredibly smart if I could just understand what one or two of these actually mean.
posted by sammyo at 1:41 AM on April 20 [5 favorites]


Like #25:

Let n1<n2<⋯ be an arbitrary sequence of integers, each with an associated residue class ai (mod ni). Let A be the set of integers n such that for every i either n<ni or n≢ai (mod ni). Must the logarithmic density of A exist?


What's a "logarithmic density" mean? I know what both words mean but don't have any context together, like say "mean apple", is the apple mean or is it the average apple...
Hmm, overloading mean here. Like context is everything. Don't even get me started with "residue class", no way am I taking that class.
posted by sammyo at 1:58 AM on April 20


430? Aw man this is gonna take me all weekend.
posted by jy4m at 5:01 AM on April 20 [4 favorites]


What's a "logarithmic density" mean? I know what both words mean but don't have any context together, like say "mean apple", is the apple mean or is it the average apple...

My spouse is a number theorist; I hear terms like this often. They can't really be explained to a lay audience - at least, not without several years of advanced mathematics education to get to the explanation. Which makes that lay audience not so lay anymore.
posted by entropone at 5:34 AM on April 20 [2 favorites]


If you want one that you can understand without much math, try this one.

The idea is: draw some dots, and connect them up with lines ("edges"). This is your "graph". Now pick three of the edges to make a triangle. Repeat as many times as you can without reusing edges. There might be a bunch of ways to do this, so pick the one that makes the most triangles. You will end up with some number of triangles, which Erdos calls "k". Now forget about the triangles that you already made, and start erasing edges from the graph. Erase twice as many edges as you made triangles (2k). You are trying to make it impossible to make any triangles at all. If you can't do this, then the answer to Erdos is "no".
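Here's a rough brute-force sketch in Python of that procedure, workable only for tiny graphs (the helper names and the K4 example are just illustrations, not anything from the problem database; a serious counterexample hunt would need much cleverer code):

from itertools import combinations

def triangles(edges):
    """All triangles in the graph, each given as the set of its 3 edges."""
    edge_set = set(frozenset(e) for e in edges)
    verts = sorted(set(v for e in edges for v in e))
    tris = []
    for a, b, c in combinations(verts, 3):
        sides = [frozenset((a, b)), frozenset((b, c)), frozenset((a, c))]
        if all(s in edge_set for s in sides):
            tris.append(frozenset(sides))
    return tris

def max_edge_disjoint_triangles(edges):
    """The largest k such that k triangles can be picked sharing no edge (brute force)."""
    tris = triangles(edges)
    for r in range(len(tris), 0, -1):
        for combo in combinations(tris, r):
            used = set()
            ok = True
            for t in combo:
                if used & t:        # this triangle reuses an edge
                    ok = False
                    break
                used |= t
            if ok:
                return r
    return 0

def can_destroy_with(edges, budget):
    """Can deleting at most `budget` edges leave a graph with no triangle at all?"""
    edges = [frozenset(e) for e in edges]
    for r in range(budget + 1):
        for removed in combinations(edges, r):
            remaining = [e for e in edges if e not in removed]
            if not triangles(remaining):
                return True
    return False

# Tiny example: K4 (four dots, every pair joined). Its best packing has
# k = 1 edge-disjoint triangle, and deleting 2 edges that share no endpoint
# leaves no triangle at all.
K4 = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
k = max_edge_disjoint_triangles(K4)
print(k, can_destroy_with(K4, 2 * k))   # prints: 1 True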

It's actually a nice problem to fall asleep to, because when you start trying to build counterexamples, your working memory very quickly gets full enough that there's no room for self-loathing.
posted by novalis_dt at 5:48 AM on April 20 [17 favorites]


My favorite fact about Erdős is that he would just more or less randomly show up at a fellow mathematician's house, get lodging until a paper was produced, and repeat. With breaks for conferences and such.
posted by Jacen at 5:52 AM on April 20 [4 favorites]


he was also open about his consistent amphetamine usage
posted by AlbertCalavicci at 7:20 AM on April 20


What's a "logarithmic density" mean?

I'll take a whack at this. First I should explain ordinary density, though. It seems reasonable to say things like "Half the integers are even" or "One-tenth of integers end in the digit 7", but if you think about it for a minute, you'll see that these statements are kind of problematic: they seem to be ascribing values to ∞/∞. How can we say that one infinite set is "half" of another? (Besides, if you've ever studied set theory, maybe you saw a proof that there are just as many even integers as there are integers in general!)

To make these statements make sense, we define the density of a subset of the integers (let's say of the positive integers, to keep things as simple as possible). We do this in two steps. First, we consider what fraction of 1 through N are in the set, where N is some "cutoff". Then we take the limit as N goes to infinity.

Take the even integers, for example. When the cutoff N is itself even, the fraction of 1 through N that are even is exactly 1/2. When N is odd, it's a little less than 1/2: for example, 4999 of the first 9999 integers are even, and 4999/9999 ≈ 0.49995. So, the fraction of even integers among the first N integers goes 0/1, 1/2, 1/3, 2/4, 2/5, 3/6, ..., but gradually the "odd" terms approach 1/2. This allows us to say that the density of the even integers is 1/2, which gives formal meaning to "Half the integers are even".
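If you want to watch that happen numerically, here's a tiny illustrative Python sketch of the cutoff-and-limit recipe (odd cutoffs chosen so the fractions visibly creep up toward 1/2):

# Estimate the density of a set of positive integers by taking the fraction
# of 1..N that belong to it, for growing cutoffs N.
def density_up_to(is_member, N):
    """Fraction of the integers 1..N satisfying is_member."""
    return sum(1 for n in range(1, N + 1) if is_member(n)) / N

is_even = lambda n: n % 2 == 0
for N in (9, 999, 99999):
    print(N, density_up_to(is_even, N))
# Prints fractions a little under 1/2 that creep toward 0.5 as N grows.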

Now, that may seem like a lot of work to justify something intuitively obvious. But there are some sets whose density is a bit more interesting: for example, in the long run, almost all integers contain the digit 7, almost none are prime, and 6/π² ≈ 60.8% of the integers have no square divisors. Math!!
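Here's a quick, purely illustrative empirical check of that last fact in Python (a small cutoff keeps it fast; the real statement is about the limit):

# The fraction of 1..N with no square divisor bigger than 1 hovers near
# 6/pi^2 ~ 0.6079.
import math

def squarefree(n):
    d = 2
    while d * d <= n:
        if n % (d * d) == 0:
            return False
        d += 1
    return True

N = 10_000
frac = sum(1 for n in range(1, N + 1) if squarefree(n)) / N
print(round(frac, 4), round(6 / math.pi**2, 4))   # both come out around 0.608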

There are also some sets whose density is ill-defined, because the fraction of integers from 1 to N that are members fluctuates forever instead of approaching a limit. An example is "integers whose first digit is 1". If our cutoff N happens to be, say, 2000000000000, then over half of the integers up to N begin with 1 (think 1000000000000 to 1999999999999). But if our cutoff is 9999999999999, then only 1/9 of the integers up to N begin with 1. This disparity isn't solved by adding more digits to N.

But it turns out we can tweak the definition of density to force it to return an answer for some sets like this. The idea is that instead of weighting the membership of 1 through N in our set equally, we'll give progressively smaller weights to larger integers, so that fluctuations when N is large don't affect the outcome as much. Logarithmic density is one of several ways of doing this; in the logarithmic density, each integer k gets weight proportional to 1/k. For example, if we were calculating the fraction of {1,2} that are even, we'd give 1 (not even) and 2 (even) relative weights of 1 : 1/2, or 2 : 1; this set is only 1/3 even!
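Here's a small illustrative Python comparison of the two recipes on the "first digit is 1" set from above (modest cutoffs, chosen so the plain fraction visibly swings; the exact numbers aren't the point):

# Plain fraction of 1..N in the set versus the 1/k-weighted ("logarithmic")
# fraction, at cutoffs where the plain fraction jumps around.
def first_digit_is_1(n):
    return str(n)[0] == "1"

def plain_fraction(is_member, N):
    return sum(1 for k in range(1, N + 1) if is_member(k)) / N

def log_weighted_fraction(is_member, N):
    total = sum(1 / k for k in range(1, N + 1))
    hits = sum(1 / k for k in range(1, N + 1) if is_member(k))
    return hits / total

for N in (99_999, 199_999, 999_999):
    p = plain_fraction(first_digit_is_1, N)
    w = log_weighted_fraction(first_digit_is_1, N)
    print(N, round(p, 3), round(w, 3))
# The plain fraction jumps between about 1/9 and 5/9 depending on the cutoff;
# the weighted fraction moves far less and settles (slowly) as N grows.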

That illustration was chosen for simplicity, but it's actually misleading. Remember that density is a limit! In the long run, even using logarithmic density, half the integers are even. In fact, when ordinary density has a defined value, logarithmic density always agrees with it. This makes it somewhat easier to swallow that we're replacing an intuitive definition by a weirder definition; they disagree only when the weird definition gives an answer and the ordinary definition doesn't. This is classic mathematician: take a concept we all know, formalize it so we have logical rigor, then find ways to expand the scope of the original concept.

You might notice that I haven't yet mentioned logarithms at all. The name "logarithmic" comes from a somewhat technical fact. To take a weighted average, we need to divide by the sum of the weights. The sum 1 + 1/2 + 1/3 + ... + 1/N turns out to grow like the logarithm of N when N is large. Specifically, it grows like the natural logarithm, and the correction term is a famous constant. But that's another comment for another time.
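A minimal numerical check of that growth claim, assuming nothing beyond the statement above:

# 1 + 1/2 + ... + 1/N tracks the natural logarithm of N: their difference
# stops moving instead of growing.
import math

def harmonic(N):
    return sum(1 / k for k in range(1, N + 1))

for N in (10, 1000, 100000):
    print(N, round(harmonic(N) - math.log(N), 5))
# The difference settles near 0.577... (the famous constant in question).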
posted by aws17576 at 7:58 AM on April 20 [19 favorites]


It's too late to edit my comment, but I want to improve my overly technical explanation of why logarithmic density is called logarithmic. If you consider the example of the integers whose first digit is 1, you can see that there are longer and longer "clumps" of such integers, which is why the ordinary way of defining density fails. The weighting involved in the logarithmic density compresses these clumps, just like graphing on a logarithmic scale, so that they end up all looking about the same size -- a regular alternation of "on" and "off" clumps that is just as tame as the alternation of even and odd integers. Hopefully that gives a good notion of why logarithmic density is "logarithmic", and why it's used!
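For the numerically inclined, a tiny illustrative sketch of that compression: weigh each "first digit 1" clump by 1/k and the clumps really do come out about the same size.

# The raw length of each leading-digit-1 clump grows tenfold every time, but
# its total 1/k weight barely changes from clump to clump (it heads toward
# about 0.693).
for j in range(1, 7):
    lo, hi = 10**j, 2 * 10**j            # the clump 10^j .. 2*10^j - 1
    weight = sum(1 / k for k in range(lo, hi))
    print(j, hi - lo, round(weight, 4))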
posted by aws17576 at 8:17 AM on April 20 [2 favorites]


I now know how a dog feels when you explain a concept more complex than "car ride!"
posted by Keith Talent at 10:09 AM on April 20 [5 favorites]


Metafilter: Your working memory very quickly gets full enough that there's no room for self-loathing.
posted by hovey at 1:10 PM on April 20 [3 favorites]


If we solve one of these, do we get an Erdős number?

Or, as is more likely, if aws17576 solves one and I post this comment on the same page, do I get an Erdős number?
posted by zompist at 1:36 PM on April 20 [2 favorites]


Neuron: best I can do is prove it for small values of 3.
posted by wobh at 2:34 PM on April 20 [1 favorite]



