0ne 0f the greatest — 0r maybe the greatest — achievements of mankind
October 22, 2024 12:30 AM
At first, zero caused confusion. “Its ability to represent ‘nothing’ and enable complex mathematical operations challenged deeply ingrained theological and philosophical ideas,” Nieder said. Particularly due to the influence of the church, philosophers and theologians associated “nothing” with chaos and disorder and were disinclined to accept it. Many even feared it, considering it “the devil’s number,” Barnett said. from How the Human Brain Contends With the Strangeness of Zero
[Quanta Magazine; ungated]
posted by Homemade Interossiter at 12:31 AM on October 22 [3 favorites]
Nothing to worry about.
posted by Phanx at 1:08 AM on October 22 [5 favorites]
From the article:
Barnett was interested in absence before he was interested in zero. The majority of consciousness and perception science over the last century has focused on what happens in the brain when we detect something in the environment. “But this ignores the whole other side of things,” he said, “which is that you can often have experiences of something not being there.”
Love the effortless mild self-referentiality.
posted by runcifex at 2:04 AM on October 22 [2 favorites]
meh, this conception of zero reached its zenith in the middle ages
posted by lalochezia at 3:06 AM on October 22 [4 favorites]
Let's say you're trying to count how many items there are in a closed box. You might immediately open the box and look for the items to count. You might react to seeing an item by saying "one", and then react to seeing additional items by saying the next number, or, failing to find anything, react to the absence of any items by saying "zero". Alternatively, you might say "zero" before even opening the box, and then always react to finding any additional item by saying the next number.
On the one hand, the latter method is more conceptually uniform. On the other, it seems so unconventional that I thought it might run afoul of some child development principle I was not yet aware of, so I did not dare teach my kids to count that way.
posted by a faded photo of their beloved at 4:56 AM on October 22 [2 favorites]
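The "zero first" method is, incidentally, exactly how counting loops are written in code. A minimal JavaScript sketch (the box is just an array here, purely for illustration):

    // Say "zero" before opening the box, then say the next number
    // for each item found. The empty box needs no special case.
    function countItems(box) {
      let count = 0;                // "zero", before looking inside
      for (const item of box) {
        count += 1;                 // the next number, per item
      }
      return count;
    }

    countItems([]);                 // 0 -- the initial "zero" stands
    countItems(["a", "b", "c"]);    // 3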
Those who like this nothing may also enjoy one of my favourite nothings:
https://www.goodreads.com/book/show/424190.The_Nothing_That_Is
posted by CookTing at 5:04 AM on October 22 [1 favorite]
“But this ignores the whole other side of things,” he said, “which is that you can often have experiences of something not being there.”
But, one of the studies found, zero also holds a special status in the brain.
I want you to notice
When I'm not around
You're so fuckin' special
I wish I was special
posted by AlSweigart at 6:07 AM on October 22
This is one of my favourite computer things.
0 is a number signifying a count of zero
null is the absence of an object/scalar/value, it's even more zero than zero, it's no-thing.
undefined is the absence of a place to put an object, it's not even no-thing, it's no-where
[compiler error] is the absence of a clear reference to a place to put an object, it's no-sense
posted by seanmpuckett at 7:01 AM on October 22 [15 favorites]
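In JavaScript terms, a rough sketch of those rungs (rough because JS's undefined is broader than the taxonomy above, and JS isn't compiled ahead of time, so the last rung surfaces as a runtime ReferenceError rather than a compiler error):

    const count = 0;     // a number: a count of zero
    const thing = null;  // no-thing: a place deliberately holding nothing
    let place;           // no assignment made: the value is undefined

    console.log(count, thing, place);  // 0 null undefined
    console.log(nowhere);              // ReferenceError -- "no-sense"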
null is the absence of an object/scalar/value, it's even more zero than zero, it's no-thing.
I was going to call this article silly, but then you brought up null, and now I totally and unequivocally agree with the article about zero's complexity. Null checking is implemented poorly and inconsistently across so many computer languages, and so many of them complain about variables with no value: it hasn't been defined yet, and there is no consistent simple null check. So weird.
posted by The_Vegetables at 7:11 AM on October 22 [2 favorites]
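And even within one language the "simple null check" isn't simple. A JavaScript sketch of the usual suspects:

    const x = null;

    x === null;       // true  -- strict: matches null only
    x == null;        // true  -- loose: matches null AND undefined
    typeof x;         // "object" -- a notorious historical wart
    !x;               // true  -- but also true for 0, "", NaN, undefined
    x ?? "fallback";  // "fallback" -- nullish coalescing, ES2020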
I like the zeroes in the post title in place of the Os.
posted by elmer benson at 7:39 AM on October 22 [7 favorites]
Null - the billion dollar mistake. Tony Hoare introduced the null reference for ALGOL W in 1965 and we've been dealing with it ever since in (many, not all) computer languages.
Not very often I get to shout out Tony Hoare - he's a comp sci hero of mine (quicksort, formal program correctness and a zillion other things). He won the Turing Award in 1980.
posted by whatevernot at 7:55 AM on October 22 [5 favorites]
Is {0} a subset of the natural numbers N or not?
This is not something I personally feel strongly about, but I have been taught by several mathematicians who believe there is a clear correct answer.
I honestly think 0 has been more influential as a placeholder in the Arabic numeral system (so that we can add, subtract, multiply and divide with ease) than as a number in its own right. This is notwithstanding the fact that 0 is as important as 1 and pi and e.
posted by plonkee at 9:01 AM on October 22 [1 favorite]
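For the record, both conventions for N are in live use, so whether {0} is a subset of N depends entirely on whose N you mean. In LaTeX notation:

    % Two coexisting conventions for the natural numbers:
    \mathbb{N} = \{0, 1, 2, 3, \dots\}   % common in set theory, logic, CS
    \mathbb{N} = \{1, 2, 3, \dots\}      % common in analysis and number theory
    % Disambiguating variants sidestep the fight entirely:
    \mathbb{N}_0 = \{0, 1, 2, \dots\}, \qquad \mathbb{N}^* = \{1, 2, \dots\}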
undefined is the absence of a place to put an object, it's not even no-thing, it's no-where
I'd say undefined is a different kind of no-thing... there's a place and no assignment has been made
whereas null is a place where an assignment has been made, and that assignment is no-thing
but honestly, what were they smoking when they thought null, undefined, and a vague definition of true/false all at the same time were a good idea?
posted by kokaku at 9:06 AM on October 22 [1 favorite]
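For what it's worth, JavaScript agrees about null but muddies undefined, using it both for kokaku's "place with no assignment" and for "no place at all":

    let v;                    // a place exists, but nothing was assigned
    v;                        // undefined

    const obj = { a: null };  // "a" is a place, assigned no-thing
    obj.a;                    // null -- the assignment is no-thing
    obj.b;                    // undefined again, though no "b" place exists
    "b" in obj;               // false -- JS reuses undefined for no-where, too
    obj.a == obj.b;           // true -- loose equality conflates the nothings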
Especially when the fix (e.g. an Option type) is so monadically powerful.
posted by whatevernot at 9:33 AM on October 22
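A back-of-the-envelope Option in JavaScript (hypothetical Some/None helpers; production versions live in libraries): absence becomes a value you have to unwrap, so it can't be dereferenced by accident.

    const None = { kind: "none" };
    const Some = (value) => ({ kind: "some", value });

    // chain is the monadic bind: it runs f only when a value exists,
    // so a whole pipeline needs no explicit null checks.
    const chain = (opt, f) => (opt.kind === "some" ? f(opt.value) : None);

    const half = (n) => (n % 2 === 0 ? Some(n / 2) : None);

    chain(Some(8), half);               // { kind: "some", value: 4 }
    chain(chain(Some(8), half), half);  // { kind: "some", value: 2 }
    chain(Some(3), half);               // None -- failure just flows through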
I have been taught by several mathematicians that believe there is a clear correct answer.
Inasmuch as any particular named set of numbers is named through a process of consensus, in the case of N there is no single agreed-upon definition. Luckily mathematicians are quite familiar with the notion of defining your symbols before you start working with them, so that really shouldn't pose an insurmountable problem. Of course, curmudgeons are going to etc.
posted by axiom at 11:05 AM on October 22
but honestly, what were they smoking when they thought null, undefined, and a vague definition of true/false all at the same time were a good idea?
If you want your language and compiler to have separate definitions of zero, false, undefined and null that do not rely on the context within the language itself (and therefore have different physical representations in memory), you either need to reserve bits that can represent these different values in every value represented in memory, or you need to add a layer of indirection for every value represented in memory.
For example, if you wanted to be able to represent all 4 of those values in every physical value in memory that might be accessed, you would need to reserve 2 bits. For an 8-bit byte, you've now reduced the number of representable values from 256 to 64. I do not know of any language that does this as a general-purpose mechanism (some do for special circumstances).
For the other option, a layer of indirection, you leave the value itself alone, but must now add an additional piece of memory that provides the context for whether an actual value exists, and if not, what non-value meaning it has (null, undefined, etc). This is more-or-less what all modern memory-managed programming languages (Java, C#, Python, etc, not C/C++) do.
The final option is to accept that a representation in memory of the numeric value 0x00 is the number 0 if that memory-address/variable is being contextually referred to as a number in the source code, the boolean false if it's being referred to as a boolean (or used as part of a conditional expression), or is interpreted as null if it's being used as a reference to another value (a pointer to a memory address) somewhere else in memory.
Back in 1965, when computers with more than a few kilobytes of memory were astronomically expensive, doing anything other than the final option was impractical: shrinking the representable range of each value, or bloating each value's memory usage, just to encode information the programmer could supply contextually would have been costly, as would the extra CPU cycles needed to interpret that context.
posted by WaylandSmith at 12:13 PM on October 22 [8 favorites]
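The "reserve bits" option isn't purely hypothetical, incidentally: JS engines such as V8 tag values so that a low bit distinguishes small integers from heap pointers. A toy version of that 2-bits-per-byte arithmetic in JavaScript (an illustrative layout, not any real engine's):

    // Low 2 bits are a type tag; only 6 payload bits (0..63) remain.
    const TAG_NUMBER = 0b00, TAG_FALSE = 0b01, TAG_NULL = 0b10, TAG_UNDEF = 0b11;

    const pack  = (tag, payload) => ((payload & 0x3f) << 2) | tag;
    const tagOf = (cell) => cell & 0b11;
    const valOf = (cell) => cell >>> 2;

    const cell = pack(TAG_NUMBER, 42);
    tagOf(cell) === TAG_NUMBER;  // true -- the context travels with the value
    valOf(cell);                 // 42 -- but 256 representable values became 64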
I'm intrigued to see no mention of infinity in the article, as I think of it as another side of the mathematical coin. Of course, in mathematics, coins may have more than two sides.
posted by meinvt at 12:55 PM on October 22 [1 favorite]
AFAIK, having both null and undefined is unique to JS. The best design rationale I could find is that Number(null) == 0 was carried over from C, so an undefined value that didn't convert to zero was desirable, especially since the first version of the language lacked exceptions. Though practically it's kind of annoying and I have yet to find an instance where you absolutely need both.
The C notion of "undefined behavior" is way weirder.
posted by credulous at 1:06 PM on October 22 [1 favorite]
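A runnable version of that asymmetry:

    Number(null);        // 0   -- the C-ish carry-over
    Number(undefined);   // NaN -- deliberately refuses to become zero
    null + 1;            // 1
    undefined + 1;       // NaN
    null == undefined;   // true  -- loose equality pairs them up
    null === undefined;  // false -- yet they remain distinct values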
Very interesting article! But some of it reads like deadpan straight-man parody…
“It looks like it is concrete because people put it on the number line — but then it doesn’t exist. … That is fascinating, absolutely fascinating.”
That’s comedy gold right there.
posted by slogger at 5:22 PM on October 22
@a faded photo of their beloved
It sounds like you and Edsger Dijkstra would get along: https://www.cs.utexas.edu/~EWD/transcriptions/EWD08xx/EWD831.html
posted by scivola at 6:41 PM on October 22
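EWD831 is the famous "why numbering should start at zero" note; its preferred half-open convention 0 <= i < N is the one JavaScript (like C) bakes in:

    const xs = ["a", "b", "c"];

    // Half-open range [0, N): the upper bound IS the count,
    // and the empty range 0 <= i < 0 needs no special case.
    for (let i = 0; i < xs.length; i++) {
      console.log(i, xs[i]);
    }

    xs.slice(1, 3);  // ["b", "c"] -- length is 3 - 1, and ranges never overlap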
> Is {0} a subset of the natural numbers N or not?
For mathematicians it largely doesn't matter, because "natural numbers" is just a label. As long as clear definitions are available, so the reader knows that they're on the same page with the author of a research work, it shouldn't cause trouble.
> This is not something I personally feel strongly about, but I have been taught by several mathematicians who believe there is a clear correct answer.
> I honestly think 0 has been more influential as a placeholder in the Arabic numeral system (so that we can add, subtract, multiply and divide with ease) than as a number in its own right. This is notwithstanding the fact that 0 is as important as 1 and pi and e.
As for "number in its own right" vs. "placeholder... so we can do [arithmetic] with ease", I think there's very little difference. Numbers are defined by what we can do with them (i.e. how they relate to other numbers). For example, zero is the number x such that for any natural number y (1, 2, 3...), y + x = y. Whether zero "belongs to" the natural numbers depends on how much you value its special relationship concerning all these other natural ("counting") numbers 1, 2, 3... If it's immaterial in a particular context, you may exclude zero from the natural numbers.
(But as soon as expand the context beyond counting -- including any systematic treatment of arithmetic, zero becomes almost immediately relevant. Like, you can show that under certain axioms, zero is unique, there's no other number like zero that satisfies the definition of zero-ness. Which is pretty big and useful if you think of it).
posted by runcifex at 8:04 PM on October 22
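The uniqueness argument runcifex mentions is three lines long. Suppose 0 and 0' both satisfy the defining property y + x = y for every y; then, assuming commutativity of addition:

    % Each identity absorbs the other, so they must coincide:
    \begin{align*}
      0' &= 0' + 0 && \text{($0$ is an additive identity)} \\
         &= 0 + 0' && \text{(commutativity of $+$)} \\
         &= 0      && \text{($0'$ is an additive identity)}
    \end{align*}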
A little nothing goes a long, long way
posted by Dokterrock at 10:43 PM on October 22
I'll be singing this all day...
My hero, Zero, such a funny little hero,
But till you came along,
We counted on our fingers and toes.
Now you're here to stay
And nobody really knows
HOW wonderful you are.
Why we could never reach a star,
Without you, Zero, my hero,
How wonderful you are.
...
Et cetera, et cetera, ad infinitum, ad astra,
forever and ever
With Zero, my hero, how wonderful you are.
posted by I_Love_Bananas at 3:51 AM on October 23 [2 favorites]
I'll expand my previous list with "unknown"
0 is a number signifying a count of zero
null is the absence of an object/scalar/value, it's even more zero than zero, it's no-thing.
unknown is the absence of knowledge about what is in a place, it's no-idea
undefined is the absence of a place to put an object, it's not even no-thing, it's no-where
[compiler error] is the absence of a clear reference to a place to put an object, it's no-sense
posted by seanmpuckett at 5:20 AM on October 23
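The "unknown" rung has had a long career in SQL, where NULL means no-idea and comparisons propagate it as a third truth value. A sketch of that three-valued AND (Kleene logic) in JavaScript, with null standing in for unknown:

    // Kleene conjunction: false dominates; otherwise unknown is contagious.
    const and3 = (a, b) =>
      a === false || b === false ? false
      : a === null || b === null ? null
      : true;

    and3(true, null);   // null  -- true AND no-idea is still no-idea
    and3(false, null);  // false -- false AND anything is false
    and3(true, true);   // true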
> but honestly, what were they smoking when they thought null, undefined, and a vague definition of true/false all at the same time were a good idea?
There is an argument to be made that if you have conceptual problems with that, you are not expert enough with the language (we're talking about C here, right?)
It was a different world and the idea was an elegant solution to some real problems.
There's also an argument to be made that if you're writing application software and it's much more complicated than *ix command-line utilities you should consider using a different language, one with some blade guards.
posted by Aardvark Cheeselog at 8:23 AM on October 23
I'm not really sold on the idea that zero is so counter-intuitive and hard to fathom? 0, at least as it is used in day-to-day life, does not mean "nothing" in general! It means the lack of something specific. If I have two potatoes, the two potatoes are real. If I have one potato, that potato is real. If I have zero potatoes, the fact of my lacking a potato is real. Not mysteriously abstract!
In a calculation untethered from referents, sure, 0 is "abstract." But how is it more abstract than any other digit? If I write 19+10 on a chalkboard, 9 represents 9 whatevers, and 0 represents a lack of the same whatevers. No?
I haven't been convinced that 0 fucks with our heads any more than, like, seeing the digit 1 triggers us to contend with the entire philosophical nature of what it means for a thing to exist.
posted by dusty potato at 6:56 PM on October 23
This thread has been archived and is closed to new comments