Other fun arguments in the same vein: Is atheism a religion? Is not playing golf a sport? For extra fun, try explaining the answers to both in a non-contradictory way.
No to both, though atheism can be a theological philosophy.
I’d argue that atheism is a feature of a belief system and that the system may or may not be a religion. There are religions that don’t feature a belief in any gods. Similarly, your personal belief system may not be a full blown religion, even if you did happen to be theistic.
How are those the same? You need to define “religion” and “sport” rigorously first.
Since you haven’t provided one, I’ll just use the first sentence on the wiki page:
Religion is a range of social-cultural systems, including designated behaviors and practices, morals, beliefs, worldviews, texts, sanctified places, prophecies, ethics, or organizations, that generally relate humanity to supernatural, transcendental, and spiritual elements.
“Atheism,” without being more specific, is simply the absence of a belief in a deity. It does not prescribe any required behaviors, practices, morals, worldviews, texts, sanctity of places or people, ethics, or organizations. The only tenuous angle is “belief,” but atheism doesn’t require a positive belief in no gods, simply the absence of a belief in any deities. Even if you are talking about strong atheism (“I believe there are no deities”), that belief is by definition not relating humanity to any supernatural, transcendental, or spiritual element. It is no more religious a belief than “avocado tastes bad.” If atheism broadly counts as a religion, then your definition of “religion” may as well be “an opinion about anything” and it loses all meaning.
If you want to talk about specific organizations such as The Satanic Temple, then those organizations do prescribe ethics, morals, worldviews, and behaviors, and have “sanctified” places. Even though they are still explicitly non-supernatural, enough other boxes are checked that I would agree TST is a religion.
I have no idea what you’re on about with not golfing being a sport.
To the golf thing:
“Is not playing a sport also a sport?”
The basic premise of the poster’s comment was:
“Is the absence of a thing, a thing in and of itself?”
That was not the premise of the poster’s comment.
0 isn’t nothing, and “a thing” is a much broader category than “natural numbers”. Half an apple is also a thing.
I’d learned somewhere along the line that Natural numbers (that is, the set ℕ) are all the positive integers and zero. Without zero, I was told, these were the Whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.
But is zero a positive number?
I think whole numbers don’t really exist outside of US high schools. Never learnt about them or seen them in a book/paper at least.
Actually, “whole numbers” (at least if translated literally into German) exist outside America! However, they absolutely do (i.e., are defined to) contain 0. In Germany, “whole numbers” are all negative, positive, and neutral (i.e., 0) numbers with only an integer part (i.e., -ℕ ∪ {0} ∪ ℕ; no, that extra 0 isn’t there because ℕ doesn’t contain it, it’s just so the definition works regardless of whether you yourself count 0 as part of ℕ).
Natural numbers are used commonly in mathematics across the world. Sequences are fundamental to the field of analysis, and a sequence is a function whose domain is the natural numbers.
You also need to index sets and those indices are usually natural numbers. Whether you index starting at 0 or 1 is pretty inconsistent, and you end up needing to specify whether or not you include 0 when you talk about the natural numbers.
Edit: I misread and didn’t see you were talking about whole numbers. I’m going to leave the comment anyway because it’s still kind of relevant.
I wouldn’t be surprised. I also went to school in MS and LA so being taught math poorly is the least of my educational issues. At least the Natural numbers (probably) never enslaved anyone and then claimed it was really about heritage and tradition.
Negative Zero stole my heart
In school I was taught that ℕ contained 0 and ℕ* was ℕ without 0.
I was taught ℕ did not contain 0 and that ℕ₀ is ℕ with 0.
ℕ₀* is ℕ with 0 without 0
Aren’t you guys taught about a thing called whole numbers??
zero is positive
-dev
Don’t explain the ieee floating point standard to mathematicians from within punching distance.
spoiler:
0 == -0
but also 1/0 != 1/(-0)
1/0 == 2/0
not(x<y) and not(x>y) and not(x==y) where x is NaN
x != x where x is NaN
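A quick Python sketch of those same quirks. (Python raises an exception on literal float division by zero, so the infinities come from `math.inf` instead of `1/0`; otherwise this is standard IEEE 754 double behavior.)

```python
import math
import struct

pz, nz = 0.0, -0.0

# +0.0 and -0.0 compare equal under IEEE 754...
print(pz == nz)   # True
print(pz > nz)    # False: comparison does not order the two zeros

# ...even though their bit patterns differ (sign bit set on -0.0)
print(struct.pack(">d", pz).hex())  # 0000000000000000
print(struct.pack(">d", nz).hex())  # 8000000000000000

# The two infinities you'd get from dividing by the two zeros differ,
# while any two positive infinities compare equal
print(math.inf == -math.inf)     # False
print(math.inf == 2 * math.inf)  # True

# NaN compares unequal to everything, including itself
nan = float("nan")
print(nan == nan, nan < nan, nan > nan)  # False False False
print(nan != nan)                        # True
```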
I have been taught and everyone around me accepts that Natural numbers start from 1 and Whole numbers start from 0
Oh no, are we calling the positive integers “whole numbers” now? There are proposals to change bad naming in mathematics, but I hope this is not one of them.
On the other hand, changing integer to whole number makes perfect sense.
I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0 and it’s foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate,” it’s just a different convention. There are numerous ISO standards which would be highly unusual in American academia.
FWIW I was taught that the inclusion of 0 is a French tradition.
I’m an American mathematician, and I’ve never experienced a situation where 0 being an element of the Naturals was called out. It’s less ubiquitous than I’d like it to be, but at worst they’re considered equally viable conventions of notation or else undecided.
I’ve always used N to indicate the naturals including 0, and that’s what was taught to me in my foundations class.
Of course they’re considered equally viable conventions, it’s just that one is prevalent among Americans and the other isn’t.
I think you’re using a fringe definition of the word “fringe”.
This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.
It is true.
I have yet to meet a single logician, american or otherwise, who would use the definition without 0.
That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.
I did say mathematician, not logician.
Logicians are mathematicians. Well, most of them are.
But not all mathematicians are logicians.
The US is one of 3 countries on the planet that still stubbornly primarily uses imperial units. “The US doesn’t do it that way” isn’t a great argument for not adopting a standard.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition, and may not even know it exists. Mathematicians in my experience are far less concerned about the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.
From what I understand, you can pay ISO to standardise anything. So it’s only useful for interoperability.
Can I pay them to make my dick length the ISO standard?
I feel they have an image to maintain, but i also feel they would sell out for enough money. So… tell me if you make it.
Yeah, interoperability. Like every software implementation of natural numbers that include 0.
How programmers utilize something doesn’t mean it’s the mathematical standard, idk why ISO would be a reference for this at all
Because ISO is the International Organisation for Standardization
My experience (bachelor’s in math and physics, but I went into physics) is that if you want to be clear about including zero or not you add a subscript or superscript to specify. For non-negative integers you add a subscript zero (ℕ_0). For strictly positive natural numbers you can either do ℕ_1 or ℕ^+.
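In LaTeX, that convention (as described above; the exact symbols vary by author) would be written:

```latex
\mathbb{N}_0  = \{0, 1, 2, 3, \dots\}  % non-negative integers
\mathbb{N}_1  = \{1, 2, 3, \dots\}     % strictly positive integers
\mathbb{N}^{+} = \mathbb{N}_1          % alternative notation for the same set
```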
I hate those guys. I had that one prof at uni and he reinvented every possible symbol and everything was so different. It was a pita to learn from external material.
they’ll probably make up their own symbology just because it’s slightly more convenient for their proof
I feel so thoroughly called out RN. 😂
As a programmer, I’m ashamed to admit that the correct answer is no. If zero were natural, we wouldn’t have needed tens of thousands of years to invent it.
Did we need to invent it, or did it just take that long to discover it? I mean “nothing” has always been around and there’s a lot we didn’t discover till much more recently that already existed.
Does “nothing” “exist” independent of caring what there is nothing of or in what span of time and space there is nothing of the thing?
There’s always been “something” somewhere. Well, at least as far back as we can see.
IMO we invented it, because numbers don’t real. But that’s a deeper philosophical question.
As a programmer, I’d ask you to link your selected definition of natural number along with your request, because I can’t be bothered to guess.
I truly have no idea what you’re saying.
I think you’re considering whether zero is somehow “naturally-occurring”, while others may be considering the concept of a natural number, which is a nonnegative integer.
I think he’s just asking for a properly documented Pull Request in order to process your thoughts.
Definition of natural numbers is the same as non-negative numbers, so of course 0 is a natural number.
But -0 is also 0, so it can’t be a natural number.
In some countries, zero is neither positive nor negative. But in others, it is both positive and negative. So saying the set of natural numbers is the same as the non-negative [integers] doesn’t really help. (Also, obviously not everyone would even agree with that definition regardless of whether zero is negative.)
Wait, I thought everything in math is rigorously and unambiguously defined?
There’s a hole at the bottom of math.
There’s a frog on the log on the hole on the bottom of math. There’s a frog on the log on the hole on the bottom of math. A frog. A frog. There’s a frog on the log on the hole on the bottom of math.
Rigorously, yes. Unambiguously, no. Plenty of words (like continuity) can mean different things in different contexts. The important thing isn’t the word, it’s that the word has a clear definition within the context of a proof. Obviously you want to be able to communicate ideas clearly and so a convention of symbols and terms have been established over time, but conventions can change over time too.
Yes, and like any science it gets revisited and contested periodically.
Platonism Vs Intuitionism would like a word.
Zero is a number. Need I say more?
Counterpoint: if you say you have a number of things, you have at least two things, so maybe 1 is not a number either. (I’m going to run away and hide now)
I’m willing to die on this hill with you because I find it hilarious
“I have a number of things and that number is 1”
I have a number of friends and that number is 0
I have a number of money and number is -3567
Another Roof has a good video on this. At some points One was considered “just” the unit, and a Number was some multiple of units.
N is the set of “counting numbers”.
When you count upwards you start from 1, and go up. However, when you count down you usually end on 0. Surely this means 0 satisfies the definition.
The natural numbers are derived, according to Brouwer, from our intuition of time, by the way. From this notion, 0 is no strange idea, since it marks the moment our intuition first begins.
Countably infinite sets are unique up to bijection; you can count by rational numbers if you want. I don’t think counting is a good intuition.
On the contrary - to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with ℕ. Though, I freely admit that another set could be used if you assumed it more primitive.
On the contrary - to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with ℕ.
Isn’t this what I just said? If I am not mistaken, this is exactly what “unique up-to bijection” means.
Anyway, I mean whether you start from 1 or 0, they can be used to count in exactly the same way.
I’m arguing from the standpoint that we establish the idea of counting using the naturals - it’s countable if it maps to the naturals, thus the link. Apologies for the lack of clarity.
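The point that counting from 0 or from 1 works out the same can be made explicit with a bijection, sketched here in Python (illustrative names, nothing standard):

```python
# n -> n + 1 pairs {0, 1, 2, ...} with {1, 2, 3, ...} one-to-one.
def shift(n: int) -> int:
    return n + 1

def unshift(m: int) -> int:
    return m - 1

# Either convention enumerates the same items; only the labels differ.
items = ["a", "b", "c"]
zero_indexed = {n: item for n, item in enumerate(items)}
one_indexed = {shift(n): item for n, item in zero_indexed.items()}

print(zero_indexed)  # {0: 'a', 1: 'b', 2: 'c'}
print(one_indexed)   # {1: 'a', 2: 'b', 3: 'c'}

# shift/unshift invert each other, so the pairing is a bijection.
print(all(unshift(shift(n)) == n for n in range(100)))  # True
```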
0 is natural.
Source - programming languages.
*Most programming languages
I don’t personally know many programming languages that provide a natural number type in their prelude or standard library.
In fact, I can only think of proof assistants, like Lean, Coq, and Agda. Obviously the designers of these languages know a reasonable amount of mathematics to make the correct choice.
(I wouldn’t expect the same from IEEE or W3C, LOL)
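For what it’s worth, in Lean (one of the proof assistants mentioned) the natural number type is built with zero as its base constructor, so ℕ there unambiguously contains 0:

```lean
-- Lean 4's Nat is (conceptually) the inductive type:
--   inductive Nat where
--     | zero : Nat
--     | succ (n : Nat) : Nat

#eval (0 : Nat)      -- 0 is a perfectly good Nat
#eval Nat.succ 0     -- 1

example : (0 : Nat) + 1 = 1 := rfl
```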
It’s really just a joke about counting from 0 instead of 1.
Oh, array indexing, sure.
So 0 is hard. But you know what? Tell me what non-whole number comes right after or right before 0. That’s right, we don’t even have a name for that number.
±ε
I think the p-adics have that.
How about minus zero?