1=.999999...?

You've gone back to limits again. You cannot disprove a notion based upon the rejection of the validity of an infinitely recursive number simply by saying that another infinitely recursive number proves it.

This conversation seems to have gone as far as it can. You don't have a proof that does not require assumptions that are a priori rejected by the assumptions of your interlocutors. At this stage all you can do is roll your eyes and say 'I've never met a living finitist' and 'look, I have a degree in this'. It would be nice if you could simply admit that you are arguing against a perfectly valid but unusual position.

So far nobody has addressed my proof that 0.99... is not an element of (0,1). The statement 0.99... < 1 is a contradiction to that and thus cannot be true. That proof involves no limits at all. So your position is not valid in standard maths, because it leads to contradictions.

I think we should switch to discussing whether the plane can take off from the treadmill.

It can, unless the tires give up.
 
Heh. I've been active over on the flat Earth forum. I'm pretty sure none of them are serious but you'd be surprised how hard their 'official' thesis is to disprove.

Anyway. Sarcasm doesn't really undermine the fact that it appears that 0.999... =/= 1 is actually a perfectly valid position to hold.



@NinjaCow64
Given the combination of sky-high arrogance and lack of comprehension you - from my point of view - showcase with such glee, I feel unable to continue this discussion with you: for one, it appears to be an extremely and unnecessarily laborious task to debate with you, and for another, the only joy I could get out of it at this point would be to mock you.

You know Terx, I feel exactly the same way.

I'm unsubbing from this thread because it's clear that no matter what we say we aren't going to correct your wrong assumptions. And I'm getting way too angry about this.
 
Backwards. It's true if you can get a grasp on non-countable infinities.

We have number sets where .999... =/= 1; however, the reals are not one of them. If you're speaking in terms of real numbers, .999... = 1 is objectively true, and unless you explicitly state that you're working outside of the reals, the assumption is that you're working with the reals.
 
You've gone back to limits again. You cannot disprove a notion based upon the rejection of the validity of an infinitely recursive number simply by saying that another infinitely recursive number proves it.

It has been brought up time and again that the limit isn't "infinitely recursive".

You don't get the limit as a result of going through the elements of a sequence. The limit either is there or is not. It's just a real number that is associated to a sequence (if the sequence has a limit). It isn't the result of some sort of process; it just is the limit.
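To illustrate the point numerically (my own hypothetical sketch, not part of the original post): the partial expansions 0.9, 0.99, 0.999, ... form the sequence a_n = 1 - 10^(-n), and for any epsilon you pick there is an index past which the gap to 1 stays below epsilon. That is the entire content of the limit statement; no infinite process is ever carried out.

```python
# Hypothetical illustration: the partial expansions of 0.999...
# form the sequence a_n = 1 - 10**(-n). For any epsilon > 0 we can
# find an index N past which |a_n - 1| < epsilon -- that is all the
# limit statement says; no "infinite process" is ever carried out.
from fractions import Fraction

def a(n):
    """n-th partial expansion: 0.9...9 with n nines, as an exact fraction."""
    return 1 - Fraction(1, 10**n)

def index_for(epsilon):
    """Smallest n with |a(n) - 1| < epsilon (found by simple search)."""
    n = 1
    while 1 - a(n) >= epsilon:
        n += 1
    return n

print(index_for(Fraction(1, 1000)))     # gap stays below 1/1000 from n = 4 on
print(index_for(Fraction(1, 10**9)))    # gap stays below 10^-9 from n = 10 on
```

Exact rational arithmetic (`fractions.Fraction`) is used so no floating-point rounding muddies the picture.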

My case isn't to tell that there can't be consistent systems where 0.999... is not 1. To show that there are such is a trivial thing to do. My claim is that in standard maths that is not true. And if you present a claim like that without caveats, it is assumed that you're talking about standard maths.

What you are doing is like claiming that there is proof that God exists, because you can define God to mean the Eiffel tower.

That's my degree emphasis. The grad work was in stats.

:eek:

So you are a statistician? Then my guess was right: you've taken maths courses, but the emphasis was on the applications and not the proofs? The proofs were presented to you, but you brushed them off as mathematicians' nitpicking?

The reasoning is simple. It's less than, but not equal to, one, but the difference is not defined. Since 1 is rational, the other cannot be. If you could define two distinct but connected points, these would be candidates. I have said that the difference was undefined. Perhaps 'not measurable' would be better.

This grew out of contemplating dense sets. I realized that limits cannot be unique, or even countable. For any limit, there must be an uncountable set of points that fit the definition. Otherwise continuity fails. Of this uncountable set, we can define only one point. The set is open and sort of bounded, but the definition of bounded does not work. The real numbers are a measure space. How is that supposed to work?

That the limit (when it exists) is unique is something that's usually explained in the first course of undergraduate maths too. Suppose a sequence (a_n) had two limits, say a and b, with a =/= b. Then:
For all e>0 there is an n_ea such that |a_n - a| < e whenever n > n_ea,
and
for all e>0 there is an n_eb such that |a_n - b| < e whenever n > n_eb.
Now choose e = |a-b|/3 and m = 1 + max {n_ea, n_eb}. Then:
|a-b| <= |a - a_m| + |a_m - b| < |a-b|/3 + |a-b|/3 = 2/3 * |a-b|.
This is a contradiction, since obviously 2/3 * |a-b| < |a-b| when |a-b| > 0.
Thus, a sequence can have at most one limit.

As another point, R is not by itself a measure space. It becomes a measure space once you define a measure on it. There are multiple choices of how R can be made a measure space.

Can you also elaborate how R (with a suitable measure) being a measure space has anything to do with the thing at hand?

Also, if you know what a measure space is, then it's odd that you should say that a real number is not measurable (with the Lebesgue measure for example), since that's not how the term is applied.

And lastly, you still haven't explained how some difference "is not defined", since the field axioms for R require that for each x and y in R there is x-y in R.
 
@Atticus
First: I thought you personally were rather helpful and forthcoming :)

But back to the topic. First, regarding your criticism of my argument.
I am not arguing that the maths should be different. I am perfectly willing to accept that it contains all the wisdom of how a sound mathematical system should function.
I am merely arguing that a particular mathematical tool is not in tune with anything we know about actual quantities.
What else is "unrealistic" in math isn't relevant for that discussion, as far as I can see.
Nor is the fact that the rules of academic mathematics are as good as it gets. Neither is in dispute, is it?
What in the end is relevant, I think, is only the claim of the "skeptics" that infinite numbers do not describe an exact quantity, regardless of what the rulebook says. And it appears, as brennan already said, that to that claim not a single math guy so far has managed to give a response which goes beyond reiterating academic rules and/or just assuming to be true what the skeptics argue against.
Moreover, I have grown convinced that a mathematical background is not only unnecessary to discuss this, but even a hindrance. The math guys seem - for the most part anyway - stuck in the paradigms of academic math, unable to realize that this thread is actually about a philosophical question concerned with quantities: hence also a mathematical question, but not a question about academic math.
After all, do you think what Mouthwash really meant in his OP was "I don't understand how the rules of math establish 1 = 0.999..." or didn't he rather mean "It doesn't make sense to me, as such"?
But apparently, in a "know academic math - know all about math" logic, many seem to be completely blind to this possibility of a mathematical question not answered by academic maths. Yet unless an academic mathematical rule helps you make an argument in that philosophical debate, it will not be relevant, since we are not discussing those rules - not foremost, anyway. But that transition so far never happened (or rather, when tried, it failed; there was one attempt a few pages back which brennan already refuted), and the thread mostly got stuck in a circle jerk of academic mathematics.

Some of the math people, or those that fancy themselves such, are in this case dicks, because they are so terribly oblivious to the fact that this is a thread about a mathematical question academic math has little to say about, while those that perhaps only intuitively know that they can muster an opinion on this topic as well as a maths major can are repeatedly ridiculed and mocked.
 
it's clear that no matter what we say we aren't going to correct your wrong assumptions.

That's not actually the case. As has been shown. There is no proof that 'our assumptions' are wrong.

Uppi: 0.999... is not a member of that interval. It's not a discrete number but an idealised construction that looks like a number. That's all there is to it under this interpretation.
 
Uppi: 0.999... is not a member of that interval. It's not a discrete number but an idealised construction that looks like a number. That's all there is to it under this interpretation.

That is meaningless drivel. All numbers are idealized constructions and 0.99... is no different. If you want to define brennan-numbers, please do so. I am going to keep using the reals (which are not discrete, anyway).

This thread has convinced me that there is such a thing as pseudo-maths. Like pseudo-science, it is going through the motions without ever having understood the concept.
 
Why is everyone so committed to ignoring what is being said? It is bizarrely ideological.
 
This thread has pretty much devolved into feels v. reals where the feels think they're reals.
 
@Brennan:
I don't think the case you have presented is what finitists would say.

I also understated the rareness of finitists: the thing is that I've heard even people studying mathematical logic doubt whether there has been a living finitist during the past 100 years. This is not my area of expertise, though. I have also never heard that they formed any kind of coherent system of mathematics. My understanding is that they concentrated more on writing pamphlets and such. This is a thing I may be mistaken on, though.

As a second point, if there is a coherent system where there is no 0.999... (the formulation that it's not equal to 1 would be misleading), that doesn't undermine the fact that there is also a coherent system where there is 0.999... and it's equal to 1. That system also happens to be the standard one, which is assumed to be the one you use if you don't explicitly say otherwise.

Furthermore, I expect that you also accept that young earth creationism is as valid an idea as the theory of evolution, since there's no proof that God didn't plant all the evidence in the earth etc. You should also accept that the people you speak with can change the meanings of words on the fly: "no, that's not what I meant. I was speaking Penglish, which is a language similar to English with the exception that the word 'God' means 'the Eiffel tower'".

@Terxpahseyton:
Part of what you say is ok, but you haven't quite grasped the point I was making.

There's the physical "everyday" world and the world of mathematics. Some of the things are only part of the physical world (like chairs), and some are solely part of the world of mathematics (like integrals). Some are things of both worlds, like 1, 2,...

I don't have anything against the idea that some concepts mean different things in the everyday world and the world of sciences. For example, use of the word "force" shouldn't imo be limited to the meaning it has in physics. But if words like "two" or "ten" had a different meaning in maths than they have in the everyday world, that would be a problem, and the maths people couldn't claim to hold a monopoly on those words.

In everyday life you may go on with words like "half" without having them explicitly defined. If you ask someone what "half" means, he'll give you examples or say "half is half, it just is" or something like that. In maths those kinds of things must be explicitly defined. That's why half is the real number whose product with 2 is 1. Or 1/n is the number whose product with n is 1.

That isn't just maths-speak. It also captures the essence of what people mean with those words in everyday life. It's like everyday speech, but without the ambiguity or the contradictions.

Your point seems to be that there is an everyday meaning of the expression "0.999..." and it conflicts with how it is understood in maths. I don't think that's true. There is no single set answer you will get when you ask people what "0.999..." means. If a layman tells you what it means, he's likely to give an internally incoherent answer, as has been seen in this thread. For example, it can't be defined as "the sum of all its digits", since the sum isn't defined for infinitely many numbers.

The definition used in the maths is pretty much the only sane one you can come up with if you spend some time examining what you suppose "0.999..." means. Here "pretty much" means that I have never heard of an alternative definition, nor can I imagine one. It's not impossible that another sane definition would exist, yes, but neither is young earth creationism.

So, the meaning of 0.999... used in maths isn't different from that of the everyday world. It's just more sophisticated and less flawed.

@All: Instead of arguing about this, I'm going to do something useful or fun. If you're addressing my posts, don't expect me to reply.
 
I also understated the rareness of finitists: the thing is that I've heard even people studying mathematical logic doubt whether there has been a living finitist during the past 100 years. This is not my area of expertise, though. I have also never heard that they formed any kind of coherent system of mathematics. My understanding is that they concentrated more on writing pamphlets and such. This is a thing I may be mistaken on, though.
None of which actually refutes their ideas. Thomas Paine was 'just a pamphleteer'.

that doesn't undermine the fact that there is also a coherent system where there is 0.999... and it's equal to 1.
I don't think anyone is saying that it does. Merely that an alternative system exists.


Furthermore, I expect that you also accept that young earth creationism is as valid an idea as the theory of evolution, since there's no proof that God didn't plant all the evidence in the earth etc.
Well yes actually. Strictly speaking and from a philosophical point of view YEC in some forms is a completely valid and consistent worldview. That doesn't mean I have to believe it is true.
 
Well yes actually. Strictly speaking and from a philosophical point of view YEC in some forms is a completely valid and consistent worldview. That doesn't mean I have to believe it is true.

I have to admire this strategy. The facts (reality) don't back up your math, so the solution is to bring in even more crazy theories like Flat Earth and YEC to make .999... =/= 1 look sane in comparison, and therefore get the war score up high enough to get everyone else to sign a white peace.
 
People, I think I found a way to explain my position in an understandable manner. I think this is really good.
edit: I made a dumb mistake. Now corrected.
Your point seems to be that there is an everyday-meaning of the expression "0.999..." and it conflicts with how it is understood in maths. [...] If a layman tells what it means, he's likely to give an internally incoherent answer, as has been seen in this thread.
Almost true. What I refer to is strongly related to the everyday meaning, and in large part even identical to it, but not entirely. I rather refer to - or at least believe to refer to - a coherent basis of everyday meaning, one which offers a consistently successful account of quantities in the real world and which I can simply apply to 0.999...
That coherent basis is, as I already said and you repeated, that a number represents a quantity of whatever, whereas that quantity is the sum of the quantities expressed by the digits of the number. And that is how quantities in themselves work (and the foundation of academic mathematics) in the real world, if you express them as digits within the decimal system known to us. Is this contentious?
Now, if we follow this reasoning, 1 represents the quantity of something being there in a single instance. 0.999... represents the quantity of something being there in an unknown instance, since the sum of its digits is unknown, since it is not possible to know the sum of infinitely many digits.
All we do know is that the quantity 0.999... stands for is as close to 1 as possible, since there literally is no number which is closer. Moreover, we know that 0.999... is not 1, since for 0.9 to be 1, 0.1 would have to be added, whereas adding 0.09 does not make it one, and there is absolutely no reason to believe that doing this again and again, digit after digit, would change that. And once repetition is established not to achieve a goal, so is infinite repetition.
The end :)

It is not that our instruments lack infinite precision; it is reality itself that lacks infinite precision. At the microscopic level, everything has not a true value but a probability distribution of values.
So you can prove to me that this is not merely an impression due to the limits of measurement instruments? That would be really surprising to me.

But even if it was not real, we can still imagine it to be real, can't we (I just showed how in the first part of this post)? And hence we can still see whether 0.999... = 1 corresponds to how quantities work in the real world.
Numbers thus are purely abstract concepts and we can make up rules as long as they are consistent.
Just because within academic mathematics a particular case, or even several cases, are purely abstract concepts does not make numbers in general abstract concepts. For instance, numbers in their grounded real meaning could be used and then expanded on by abstract concepts, to name only one way in which your conclusion would be wrong.
So far nobody has addressed my proof that 0.99... is not an element of (0,1). The statement 0.99... < 1 is a contradiction to that and thus cannot be true. That proof involves no limits at all. So your position is not valid in standard maths, because it leads to contradictions.
Because the definition didn't feel it necessary to account for numbers which are infinitely close. That's fine. But it misses the question of whether 1 actually equals 0.999...; it just tells us that it is useful to assume so, or at least thought to be useful within academic math.
That said, I thought for a while that the problem was just the concept of infinity, but there appear to be further instances of mathematical definitions diverging, if not in their content then in their effects, from the reality of quantities.
@All: Instead of arguing about this, I'm going to do something useful or fun. If you're addressing my posts, don't expect me to reply.
I understand. I still wanted to reply, though.
 
0.999... represents the quantity of something being there in an unknown instance, since the sum of its digits is unknown, since it is not possible to know the sum of infinitely many digits.

And exactly here is the problem. Real analysis proves that within its framework, it is possible to know the sum of infinitely many terms. A huge part of mathematics depends on this. Take that away, and you end up with a crippled math that is a mere shadow of standard math.

As a scientist, I view math as a tool to describe reality. Without central parts of real analysis I could not describe reality like I can with standard math. So, even if you had a valid philosophical objection, I would consider it to be stupid to cripple math like that.
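As a concrete instance of the kind of infinite sum real analysis handles (my own sketch, not from the post above): the series 9/10 + 9/100 + 9/1000 + ... has the n-th partial sum exactly 1 - 10^(-n), which pins the value of the whole series, and hence of 0.999..., at 1.

```python
# Sketch (my own illustration): the series 9/10 + 9/100 + 9/1000 + ...
# has n-th partial sum exactly 1 - 10**(-n), computed here in exact
# rational arithmetic. Real analysis assigns the series the limit of
# these partial sums, which is exactly 1.
from fractions import Fraction

def partial_sum(n):
    """Sum of the first n terms 9/10**k, k = 1..n, as an exact fraction."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 2, 5, 20):
    assert partial_sum(n) == 1 - Fraction(1, 10**n)  # closed form checks out

print(partial_sum(5))        # 99999/100000, i.e. 0.99999 exactly
print(1 - partial_sum(50))   # remaining gap: exactly 10**-50
```

Because the arithmetic is exact, the shrinking gap 10^(-n) is visible directly rather than being hidden by floating-point rounding.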

So you can prove to me that this is not merely an impression due to the limits of measurement instruments? That would be really surprising to me.

Yes. Measurement backaction is proven as well as science can prove such things. There are physical limits on how precisely things can be measured, no matter how good your instrument is. Whether these bounds are bounds on how precisely a quantity can exist or bounds on how precisely we can know a quantity is up for discussion, but I do not think that matters here.

But even if it was not real, we can still imagine it to be real, can't we (I just showed how in the first part of this post)? And hence we can still see whether 0.999... = 1 corresponds to how quantities work in the real world.

We could imagine that, yes. But then it would just exist in our imagination and would have no connection to the real world. In my opinion, quantities are abstractions of the real world and thus it does not make any sense to look at how they are supposed to work in the real world.

Just because within academic mathematics a particular case, or even several cases, are purely abstract concepts does not make numbers in general abstract concepts. For instance, numbers in their grounded real meaning could be used and then expanded on by abstract concepts, to name only one way in which your conclusion would be wrong.

I do not think there is a grounded real meaning to numbers. What exactly is that supposed to be? I suppose that there is an intuitive meaning of numbers, but that does not mean that meaning is real or makes it any less of an abstract concept.

Because the definition didn't feel it necessary to account for numbers which are infinitely close. That's fine. But it misses the question of whether 1 actually equals 0.999...; it just tells us that it is useful to assume so, or at least thought to be useful within academic math.
That said, I thought for a while that the problem was just the concept of infinity, but there appear to be further instances of mathematical definitions diverging, if not in their content then in their effects, from the reality of quantities.

If you keep talking like this, I would like to see a definition of what you mean by "actual" and "reality of quantities". I have no idea what concept you have in mind with this.
 
Enjoying this discussion tremendously, now :) Touches on a lot of interesting topics.
And exactly here is the problem. Real analysis proves that within its framework, it is possible to know the sum of infinitely many objects. A huge part of mathematics depend on this. Take that away, and you end up with crippled math that is crippled to the point where it is a mere shadow of standard math.
Luckily, for all intents and purposes we so far - and in the foreseeable future, and probably forever - have to consider, infinitely close is just as good as being identical. So I see no reason why we would have to take it away from what you call "real analysis" (and which I would call "useful analysis", if we have to name it in such a judging manner), which also explains with much ease why the assumption that 0.999... = 1 works just as well whether it is true or not.
So the problem you see seems to be without any of the consequences you imply, and hence not like a problem at all.
So, even if you had a valid philosophical objection, I would consider it to be stupid to cripple math like that.
It is an unexpected and wholesome pleasure to see ourselves unconditionally agree on something in this thread. :)
Yes. Measurement backaction is proven as well as science can prove such things.
So I am unclear on what you think the implications of "as well as science can prove such things" are, considering that my objection towards the idea of a finite downward scale of physics is entirely based on the limits of what science can prove in that area.
There are physical limits on how precisely things can be measured, no matter how good your instrument is. Whether these bounds are bounds on how precisely a quantity can exist or bounds on how precisely we can know a quantity is up for discussion, but I do not think that matters here.
And now you seem to agree with my original statement. :confused:
But in the beginning, you told me "Yes", as in yes, I can prove to you that this is not just an impression due to measurement limitations.
edit: But I agree this is irrelevant to the actual question. Which, from my POV, is whether 1 = 0.999... violates the rules actual quantities operate on (rather than the rules of the tautology called academic mathematics).
We could imagine that, yes. But then it would just exist in our imagination and would have no connection to the real world. In my opinion, quantities are abstractions of the real world and thus it does not make any sense to look at how they are supposed to work in the real world.
Well, strictly speaking you are certainly right that quantities are abstractions, but then, there is no word which is not an abstraction, strictly speaking. The sounds our mouths make cannot encapsulate what is there, but merely a concept of what is there - an abstraction.
However, to perceive the same in more or less than one instance or to perceive shares of something appears to me to be a fundamental component of perceiving anything at all, so that I can't agree that "abstraction" covers the nature of quantities satisfactorily. Rather, I'd say that quantities are innate to the natural world, through whatever lens you choose to look at it.
After all, my dog can recognize several pieces all belonging to something it is conditioned to find interesting (and likely beyond that, but that is hard to prove). Be it pieces of an apple it can smell, or merely pieces of a toy, or a number of toys it just sees: it can use a category of things and apply it to different objects. That is quantity, happening on a purely sensual level. It is true that my dog's understanding of quantities is light years away from the - in comparison - sophisticated understanding people use in everyday life, let alone the defined understanding academic math uses. However, that IMO certainly casts quite a doubt on your assertion that quantities are "just" an abstraction. They are an abstraction, as any word is, but one rooted as deeply in sensual experience as any word, I dare say.
I do not think there is a grounded real meaning to numbers. What exactly is that supposed to be? I suppose that there is an intuitive meaning of numbers, but that does not mean that meaning is real or makes it any less of an abstract concept.
As said, you can call any kind of human idea which can be communicated of literally anything an abstraction. So I don't think that the assertion that something is an abstraction is really meaningful in itself for what is discussed, since the word "reality" itself is an abstraction, strictly speaking. Hence, it needs to be qualified. There needs to be a kind of abstraction that is grounded in reality and one that is not.
For instance, I'd argue that the rules of a sport are not grounded in reality. They are probably best understood as a matter of pure will. They are not "dictated" by an a priori existing fact of life.
Do I have a say over the pieces of an apple I see?
Are quantities more abstract than physics?
If you keep talking like this, I would like to see a definition of what you mean by "actual" and "reality of quantities". I have no idea what concept you have in mind with this.
The reality of the instances of something, and what this actually means. Math is supposed to articulate that, as you yourself said. I am saying that while academic math does a fine job of usefully articulating it, it does not always articulate it entirely correctly. In the case we discuss, this incorrectness is virtually non-existent. But only virtually.
 
Well yes actually. Strictly speaking and from a philosophical point of view YEC in some forms is a completely valid and consistent worldview. That doesn't mean I have to believe it is true.
I have to admire this strategy. The facts (reality) don't back up your math, so the solution is to bring in even more crazy theories like Flat Earth and YEC to make .999... =/= 1 look sane in comparison, and therefore get the war score up high enough to get everyone else to sign a white peace.
The same way you know absolutely nothing about math? Yeah.
Lesson No. 1 in philosophy is to not be carried away by implicit assumptions - especially the unconscious kind (though in all fairness I have the impression that this is a task too big for one or the other philosopher of note).
Now, brennan used an extremely cheap cop-out to be able to truthfully say what he did (I have grown to feel a lot of good intentions towards brennan in this thread, and I feel bad saying this). That cop-out was "a philosophical POV [...] in some forms". That is about the most vague and far-reaching statement a philosopher can make without saying "whatever, dude".
I suppose brennan ultimately referred to how consistency is always self-contained, so you can wrap up whatever freaking nonsense you like, as long as you do the work to have it somehow account for established knowledge and to be internally consistent. Ta-da! A point of view sees the light of earth which
in some forms is a completely valid and consistent worldview.
However, I think it is possible that brennan merely trolled you, Sir from Scarlet, with this statement, so as to test your general thoroughness of thinking regardless of academic math. See, if you were as unbelievably and very attractively smart as myself, you would have smelt the bait. But since you appear to be lost in assumptions which dictate your wisdom, you are unable to deal with such feints.

Also, for further study, I recommend my last two posts. I may be wrong and an idiot myself. But it will in any case allow you to dismiss me properly - philosophy style. Rather than to settle with "lol wrong" posts.
 