1=.999999...?

Well, to be frank, I was skeptical about the proof myself. So I ran a little calculator test. But whatever calculation you use (using 0.999999999, because the calculator can't handle 0.999...), the result turns out to be... 0.999... But I'm sure some people will be happier to see it in a video.
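If anyone wants to redo that calculator test more systematically, here's a minimal Python sketch (my own illustration, not from the thread) that computes the finite truncations 0.9, 0.99, 0.999, ... with exact rational arithmetic, so no floating-point rounding muddies the result:

```python
from fractions import Fraction

# Finite truncations 0.9, 0.99, 0.999, ... and their exact distance from 1.
# Fraction keeps the arithmetic exact, unlike a calculator's float display.
for n in range(1, 8):
    truncation = Fraction(10**n - 1, 10**n)  # e.g. 999/1000 for n = 3
    print(f"0.{'9' * n}  is short of 1 by exactly {1 - truncation}")
```

Every finite string of 9s falls short of 1 by exactly 1/10^n; the notation 0.999... refers to the limit of that sequence, which is where the calculator keeps pointing.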
 
It would seem you couldn't do any calculus without infinitesimals.

Yes you can, with limits.
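To spell that out (this is just the standard textbook construction, nothing new): 0.999... is defined as the limit of its partial sums, and that limit is exactly 1.

```latex
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^n}
\;=\; \lim_{N \to \infty} \sum_{n=1}^{N} \frac{9}{10^n}
\;=\; \lim_{N \to \infty} \left( 1 - \frac{1}{10^N} \right) \;=\; 1.
```

No infinitesimal ever appears; the finite partial sums and their limit do all the work.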

Also, the notation 0.000...1 isn't just wrong, it's total nonsense, as some others have pointed out already, and as Brennan would agree too if he weren't playing a tool.

If you want to use infinitesimals, you have to use them right. You can't just pick something and call it an infinitesimal, especially in the reals, where there are none.

Terx was correct when he said that understanding math at a high formal level was proving a hindrance to many in this thread. There is an inability to step back and look at the causes of the assumptions, because damn, those assumptions are so good/useful.
---
If Akka said, "Brennan, you're right, it is somewhat arbitrary that they equal each other, but it's a damn good rule," Brennan would go, "you're right, it is damn good and I agree that it is at least as of now the best position, so I must concur the equality holds."

If that's true, Brennan doesn't just have poor writing skills, but also a lack of comprehension. It was written pages before he even entered the thread that it is the result of the axioms of the real numbers. Although I thought it was general knowledge among people who have studied past high school that everything in maths is a result of the axioms used. Besides, how could it not be, since the axioms define what the real numbers are? It's like the statement "Washington D.C. is the capital of the US". That's only true by virtue of "US" meaning what it means. If it meant France, then the statement would be wrong.

Also his "advocacy for devil" didn't have anything to do with the areas where it would be even remotely possible to say 0.999... != 1. They were more of demonstrating his inability to understand the basic maths, things like "0.999... is process, not a number". It doesn't make you the advocate of devil if you just continuously repeat "no, you're wrong". You have to present some valid arguments too.

Third, if you think that mathematicians don't step back to look at causes and assumptions, you don't seem to know much about maths either. That's what they do all the freaking time. That's basically what maths is about.

Though not perfectly analogous, I find this a lot in my discipline: political economy.

You find your education to be a hindrance all the time? Have you tried studying harder?

No, I know what you meant. This is funny though: you notice that the knowledge of those who have studied maths is a hindrance when you're not one of them. Then you find an analogy where you have studied something and it proves to be a valuable thing, but again someone else has studied something else and that is a hindrance. Why didn't you tell us a story of how you were more inadequate at political economics than a high school kid, since you had studied the subject?

Why don't you see your story of frustration with the economists the other way around? That the mathematicians in this thread are like the political economists and the laymen are like the economists? Perhaps you just want to see yourself as the hero in all the stories.
 
Hygro, again, Atticus has been saying that exact thing this entire time. And Brennan has not responded, "I agree, it's a damn good rule". He's responded "you're wrong, the rule is wrong, 0.999... =/= 1, lalalalala" (paraphrasing a little).

The point of the "war" analogy was a word being used in two different ways. You're dressing this up as if equivocating on words is always important and valuable to a discussion, but it is not. There are plenty of synonyms in the English language that really are entirely trivial, and don't allow you to dress up a completely irrelevant tale as something profound and meaningful. Mentally replace "war on drugs" with one of those synonyms.

Furthermore, you haven't adequately explained why claiming that 1+1=3 is not just as trivial as claiming that 0.999... =/= 1. If 0.999... =/= 1 is not trivial then literally anything I say about maths that happens to be wrong is also meaningful and profound. I could go into a thread on probabilities and say that the odds of flipping two heads in a row on a fair coin are 1/5. 35 pages later, after much argument and namecalling, I then declare: "AHAH! I see where the problem lies! You're working under the standard mathematical axioms, whereby the order of the numbers is 1, 2, 3, 4, and THEN 5! Whereas, I am working under a different (but still perfectly valid!) set of mathematical axioms, whereby the numbers go 1, 2, 3, 5, 4. Of course, we are both right. If you had just said, "yes, you are right, it is somewhat arbitrary that we choose the probability of two heads in a row to be 1/4 and not 1/5, but it's a damn good rule", then I of course would have said, "yes, I agree; neither of us are wrong, but your rule is better"! But instead you just chose to call me an idiot and insist that the probability was 1/4, without any serious discussion about the philosophical nature of numbers themselves. I mean, I'm actually operating on a much higher level than you. After all, what I'm really doing here is questioning the very premises of mathematics -- the axioms that constitute maths itself! Isn't that really clever and cool what I'm doing here? If you think about it, I'm really the smart one here."

And after all, what does "right" really mean? Isn't it simply the opposite of left? But none of us have left, since we are all still present. And all of us are left, since we all still remain! So is it not right that whoever's not left really is right?
 
He's responded "you're wrong, the rule is wrong, 0.999... =/= 1"
This contradicts what you posted earlier. It also contradicts what I have posted repeatedly about the axioms under which 0.999... = 1 being unnecessary, which you, Atticus, and others have agreed is correct.
 
Oh get off it... If you're really honest with yourself, you haven't been saying anything like what Hygro is pretending you've said. You really did think, at the start of this conversation, that 0.999... simply approximated to 1, because you learned about 0.999... from your Physics degree, in which it is introduced during limits and series and so on. You had no idea that it was provable from mathematical axioms to be actually equal to 1. Then you realised, oh, bugger, you really can prove that 0.999... = 1, and changed your story. "AHA! I see where the problem is. You're using standard mathematical axioms, whereas I'm using non-standard axioms. Mine are perfectly valid, if unusual. Let's talk about non-standard axioms and the philosophical nature of numbers so I can save some face!" Seriously, that's all that's happened here...
 
Shouldn't this be pressingly easy? Like "show us how one of the proofs is wrong" levels of easy. Lots of proofs show that 1 = 9/9 = 0.99...; to dispute them is to insist they're making an error.
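For reference, here's one of those proofs written out in full, the usual grade-school algebra (assuming only that 0.999... denotes a single real number, so it can be manipulated):

```latex
\begin{aligned}
x &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x &= 9 \\
x &= 1
\end{aligned}
```

Disputing the conclusion means pointing to a specific line and naming the rule of arithmetic it breaks.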

I've no real problem with the idea of 0.00...1 as a concept. An infinite number of points between two points? Sure, no problem. It's just that you can't use them to do math without a totally different type of numbers. So even if 1 - 0.99... = 0.00...1, um, so? It also equals "banana", if I want it to. But in the understood rules of math, it also equals zero.
 
We can get to 0.999... by 1/3*3 = 0.333...*3 = 0.999... = 1

The error is that the transition from 1/3 to 0.333... introduces a lack of precision, and when you then multiply, that lack of precision is compounded. 0.333... is imprecise. To be precise, 0.333... is 1/3. If you compensate for the lack of precision that was introduced when you do the arithmetic, you find that 0.333...*3 does not equal 0.999...; it does in fact equal 1. Restoring precision is going to refute any "proof" that 0.999... equals one.
 
You find your education to be a hindrance all the time? Have you tried studying harder?

No, I know what you meant. This is funny though: you notice that the knowledge of those who have studied maths is a hindrance when you're not one of them. Then you find an analogy where you have studied something and it proves to be a valuable thing, but again someone else has studied something else and that is a hindrance. Why didn't you tell us a story of how you were more inadequate at political economics than a high school kid, since you had studied the subject?

Why don't you see your story of frustration with the economists the other way around? That the mathematicians in this thread are like the political economists and the laymen are like the economists?
:D Har har. My discipline is economics, studied and augmented through a political economy lens.

There have been plenty of times when people with less education than I said things I dismissed because they were outside what I had learned about the subject at hand, only to find out later that I was wrong and they were not. I was so locked into my categories of interpretation that I didn't consider other ones.

Has that ever happened to you?
 
The error is that the transition from 1/3 to 0.333... introduces a lack of precision, and when you then multiply, that lack of precision is compounded. 0.333... is imprecise. To be precise, 0.333... is 1/3. If you compensate for the lack of precision that was introduced when you do the arithmetic, you find that 0.333...*3 does not equal 0.999...; it does in fact equal 1. Restoring precision is going to refute any "proof" that 0.999... equals one.

Those things are defined as identical, in the mathematical sense. How is equating two identical values imprecise?
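To be concrete about the sense in which they're defined as identical: the repeating decimal is, by definition, a geometric series, and that series sums exactly (a standard derivation, nothing exotic):

```latex
0.333\ldots \;=\; \sum_{n=1}^{\infty} \frac{3}{10^n}
\;=\; \frac{3/10}{1 - 1/10} \;=\; \frac{1}{3}.
```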
 
Those things are defined as identical, in the mathematical sense. How is equating two identical values imprecise?

A repeating decimal is an imprecise representation of a fraction. They aren't "identical" other than by definition, and to maintain precision you substitute the fraction (precise) for the repeating decimal (imprecise) prior to the multiplication.
 
Why don't you see your story of frustration with the economists the other way around? That the mathematicians in this thread are like the political economists and the laymen are like the economists?

Ok, here's one in that vein: you guys are trying to explain national debt and we're all like "but debt means herp a derp a derp" and you're like "yeah, but what it actually is, is this" and we're like "but according to derp a derp a derp, when I owe money to Bill, berp a derp".

But don't leave out the dude who circumvents the entire discussion by saying "it doesn't matter!!!1 the whole money system is slavery!" because he's stupid and doesn't get it and is irrelevant and wrong but so, so right.
 
A repeating decimal is an imprecise representation of a fraction. They aren't "identical" other than by definition, and to maintain precision you substitute the fraction (precise) for the repeating decimal (imprecise) prior to the multiplication.

What evidence do you have that a repeating decimal is imprecise? If it continues to infinity as conceptually alleged, and we define that as 1/3, it is the same number. When applied in practice, it yields identical results as a result of that definition.

I'll ask another way: if .333... is imprecise compared to 1/3, in what way is it imprecise? By how great a number is it off from exactly 1/3? Can you create such a number, or is this a back-to-front way of attempting a .000...1?
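One way to make the "by how great a number is it off" question concrete: compute the error of each finite truncation of .333... exactly. A minimal Python sketch (my own illustration; the names are made up):

```python
from fractions import Fraction

# Exact error between 1/3 and its n-digit truncation 0.3, 0.33, 0.333, ...
third = Fraction(1, 3)
for n in range(1, 8):
    # 0.33...3 with n threes, as an exact rational: 33...3 / 10**n
    truncation = Fraction(int('3' * n), 10**n)
    print(f"n={n}: 1/3 - 0.{'3' * n} = {third - truncation}")  # = 1/(3*10**n)
```

The error after n digits is exactly 1/(3*10^n), which can be made smaller than any positive number you name; no fixed nonzero ".000...1" survives the limit.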
 
What evidence do you have that a repeating decimal is imprecise? If it continues to infinity as conceptually alleged, and we define that as 1/3, it is the same number. When applied in practice, it yields identical results as a result of that definition.

I'll ask another way: if .333... is imprecise compared to 1/3, in what way is it imprecise? By how great a number is it off from exactly 1/3? Can you create such a number, or is this a back-to-front way of attempting a .000...1?

As demonstrated by this very thread, "when applied in practice" it specifically does not yield identical results.

No, it's the correct way of avoiding a 0.000...1. The .333... is imprecise in that the number of digits required to make it precisely equal to 1/3 is infinite, and infinity is by its nature numerically imprecise.

To answer "By how great a number is it off from exactly 1/3?" would require that it have a precise value, and I'm saying that it does not have such a precise value. It is an imprecise representation of the fraction one third.
 
What necessitates infinity being imprecise by nature? That's an important logical step that I'm not picturing. We have a symbol for infinity and notation to represent numbers that repeat forever.

I've not seen anything in this thread that demonstrates a practical difference between 1/3 and .333... that doesn't require leaving the typically-used definition of numbers.

If you're defining .333... as imprecise and others define it as precise, what evidence do we have that someone is correct? Mostly that if you consistently apply the .333... = 1/3 and .999... = 1 concept within the number system most of us use, they function identically. I've not seen evidence against the notion unless it requires moving the definition or creating a new number system.
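On "what evidence do we have": a computer algebra system will happily evaluate the infinite sums the two notations stand for. A quick sketch using SymPy (assuming it's installed; the symbol name is mine):

```python
from sympy import Sum, Symbol, Rational, oo

n = Symbol('n', integer=True, positive=True)

# 0.333... and 0.999... as the infinite series the notation denotes.
print(Sum(3 * Rational(1, 10)**n, (n, 1, oo)).doit())  # -> 1/3
print(Sum(9 * Rational(1, 10)**n, (n, 1, oo)).doit())  # -> 1
```

That's not independent evidence so much as the definition applied consistently, but that's the point: within the usual number system, the two notations pick out exactly those values.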
 
What necessitates infinity being imprecise by nature? That's an important logical step that I'm not picturing. We have a symbol for infinity and notation to represent numbers that repeat forever.

I've not seen anything in this thread that demonstrates a practical difference between 1/3 and .333... that doesn't require leaving the typically-used definition of numbers.

If you're defining .333... as imprecise and others define it as precise, what evidence do we have that someone is correct? Mostly that if you consistently apply the .333... = 1/3 and .999... = 1 concept within the number system most of us use, they function identically. I've not seen evidence against the notion unless it requires moving the definition or creating a new number system.

0.999... = 1 isn't a concept. It's just an inaccurate statement. 0.333... = 1/3 isn't a concept either, it's just an equality based on a defined representation (hey, let's show everything in a decimal format!) that provides no additional meaning.

So let's get down to brass tacks... who are these "others" who are defining it as precise? This whole thread is a parlor trick on the same order as "where did the other dollar in the bellhop's tip come from?", so an exercise in rigorous mathematics it most certainly is not.
 
What necessitates infinity being imprecise by nature? That's an important logical step that I'm not picturing. We have a symbol for infinity and notation to represent numbers that repeat forever.

I've not seen anything in this thread that demonstrates a practical difference between 1/3 and .333... that doesn't require leaving the typically-used definition of numbers.

If you're defining .333... as imprecise and others define it as precise, what evidence do we have that someone is correct? Mostly that if you consistently apply the .333... = 1/3 and .999... = 1 concept within the number system most of us use, they function identically. I've not seen evidence against the notion unless it requires moving the definition or creating a new number system.

Why would there be a practical difference? There is no practical difference between 0.999... and 1, even though they are not equal.

Saying that 0.333... =/= 1/3 because the decimal is "imprecise" is wrong. For any definition of precision you employ, they test the same. This does not make them equal.

J
 
0.999... = 1 isn't a concept. It's just an inaccurate statement. 0.333... = 1/3 isn't a concept either, it's just an equality based on a defined representation

They are both equalities based on a defined representation.

so an exercise in rigorous mathematics it most certainly is not.

It's been an exercise in defining and applying the concept of infinity from the start. The argument boils down to whether one accepts it as defined. The "others" are just shorthand for "everyone willing to accept them as equal by definition, given the intention behind defining .999...". Mentioning "others" wasn't intended to build any argumentative strength, because it's not evidence that would reasonably change our anticipation of reality; it was just to reiterate the disagreement.

There's a problem, because you can't (as far as I know) actually test for .999... directly. If one person says they're equal and the other says they're not, but they can't test them differently anyway, you have a situation with an unfalsifiable claim. As a result, this:

For any definition of precision you employ, they test the same.

Is, in my opinion, the best evidence we'll be getting on the topic.
 