0.9999999... (infinitely repeating) = 1

>0.9999999... (infinitely repeating) = 1
Who is the user who said that?
This discussion isn't over.

How is 0.999... the same as 1?
They are two different numbers.
0.999... + 0.000...0001 would equal 1.

>pic related if you want it to be I don't care

Not hard to prove with even a small amount of math

Your logic fails at the
>.00000...000001
part.
Because you'll never get to the 1 with infinitely repeating nines. Or, I should say, if you did get to the 1, then your equation would actually end up being
>.999... + .0000...00001 = 1.0000...009999999999

OP is right.

Adding another decimal place will always add less than the amount still needed to turn 0.9 into 1.

And now you know the world is not round.

0.999... = 1?

1/3 = 0.333...
1/3 x 3 = 0.999...
3/3 = 0.999...
1 = 0.999...
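If anyone wants to check that fraction arithmetic without trusting a calculator's display, here's a quick Python sketch using the standard fractions module (my own illustration, nothing from the thread; the variable name is arbitrary):

```python
from fractions import Fraction

# Exact rational arithmetic: no rounding, no truncated decimal display.
one_third = Fraction(1, 3)

print(one_third * 3)        # 1 -- (1/3) * 3 is exactly 1
print(one_third * 3 == 1)   # True

# A float only keeps finitely many digits, which is why a calculator shows
# something like 0.3333333333 and tempts people into saying "not exact".
print(1 / 3)                # 0.3333333333333333 (a rounded approximation)
```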

...

The point is: no, it is not equal to one; it is as close as you can get to one without reaching it.

This one

You've convinced me. OP can eat dicks

I doubt it, 'cause it's infinite and there's nothing that closes the gap between an infinite number of 9s in 0.999999... and the infinite number of 0s in 1.00000000, etc.

try to argue against this one

Wrong

The 1/3 x 3 would be 1/9, you autist.

Well, 0.333... isn't exactly 1/3 either, just like 0.999... isn't 1.

So pretty much the way we convert decimals to fractions and vice versa gives us the result that 0.9 recurring is the same value as 1. That's easy to get; it's just that the whole thing is presented as if a smaller number were the same as a bigger one.

1/3 is exactly 0.333 repeating. Use a calculator, you fucking tool.

You keep telling yourself that :^)

1/3 is 0.333...

0.999... is 1.

I know it's weird, but it's just a feature of how you represent numbers. You can write and express the number "1" in a lot of different ways: 1, I (the Roman numeral), 3/3, etc. 0.999... is odd, yes, but it is the same as 1.

Calculators can't get it more exact than 0.333..., but it's still not exact.

And yes, it's weird that we're talking about 0.999... that goes on infinitely (or the same with 0.333...) but this isn't a particularly weird sort of infinity. There are way weirder ones out there. Again, this all basically boils down to weird ways to express certain numbers.

Hell, even something like pi goes on infinitely, but that's at least interesting insofar as there is no neat way to represent it other than "pi" or the symbol π, and that's just because it's not a rational number.

Well, it's not so much that a user said it and put it up for discussion as it is that Euler proved it like 250 years ago, it's been undisputed by any half-educated person since, and a user probably brought up the fact in some other thread recently.

It's not a question of calculators. 1/3 just is 0.333... The calculator is limited by all sorts of things, but the obvious one here is just the size of the screen. You couldn't (practically and physically speaking) have a screen that could show all the 3's in the decimal form of 1/3. But the limits of expression don't affect the actual truth of the matter.

Fine, then go do the long division. 1 ÷ 3 will give you a quotient digit of 3 and a remainder of 1 at every step, forever. This never changes, and no new pattern will ever show up, which means that 1/3 equals 0.3 repeating.

Common Core education

1/3 × 3 =
1/3 × 3/1 =
3/3

1/3: bring down a 0,
10/3 gives 3, the remainder is 1;

back to 1/3:
10/3 gives 3, the remainder is 1;

back to 1/3:
10/3 gives 3, the remainder is 1;
back to 1/3:
10/3 gives 3, the remainder is 1;
back to 1/3:
10/3 gives 3, the remainder is 1;
back to 1/3:
10/3 gives 3, the remainder is 1;
back to 1/3:
10/3 gives 3, the remainder is 1;
this will keep going, you idiot; this is the pattern.
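If you'd rather watch a machine grind through that same long division, here's a small Python sketch (my own example; the function name and the 20-digit cutoff are arbitrary choices, not anything from the thread):

```python
def long_division_digits(numerator, denominator, n_digits):
    """Schoolbook long division: return the quotient digits and the remainders."""
    quotient_digits = []
    remainders = []
    remainder = numerator % denominator
    for _ in range(n_digits):
        remainder *= 10                       # bring down a zero
        quotient_digits.append(remainder // denominator)
        remainder %= denominator              # what carries into the next step
        remainders.append(remainder)
    return quotient_digits, remainders

digits, remainders = long_division_digits(1, 3, 20)
print(digits)      # [3, 3, 3, ...] -- the quotient digit is 3 at every step
print(remainders)  # [1, 1, 1, ...] -- the remainder is 1 at every step, forever
```

Same remainder in means same digit out, so the 3s never stop: that's all "0.3 repeating" says.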

This is where how you represent the operations matters. Yes, 1/(3x3) = 1/9, but what was obviously meant was (1/3) x 3, or just 3(1/3), which equals 1, or equivalently 0.999..., because, as has been argued above, 0.999... just is 1.
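A two-line sanity check of that grouping point, again with Python's fractions module (my own example, not from the post):

```python
from fractions import Fraction

print(Fraction(1, 3) * 3)   # 1    -- (1/3) * 3, what the earlier post meant
print(Fraction(1, 3 * 3))   # 1/9  -- 1/(3*3), the misreading
```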

You're supposing 0.999... has an end. It doesn't, and because of that, there's no number that separates 0.999... from 1, therefore they're the same. Something like that is what I read, anyway.
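One way to make that "no number separates them" idea concrete (my own restatement in symbols; the counter n is something I'm introducing, it isn't in the post above): if 0.999... really were smaller than 1, the gap 1 - 0.999... would have to be some positive number, yet cutting off after n nines shows the gap sits below every power of ten:

```latex
1 - \underbrace{0.99\ldots9}_{n\ \text{nines}} = 10^{-n},
\qquad\text{so}\qquad
0 \;\le\; 1 - 0.999\ldots \;\le\; 10^{-n} \quad\text{for every } n.
```

The only non-negative real number smaller than every 10^(-n) is 0, so the gap is 0 and the two are the same number.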

That's because there can't be an exact number, you complete fucking dyke. Adding 3 odd numbers together will never make an even number, but for the sake of accuracy the whole fiasco with recurring 3's is just put in there to the point where you might as well call the shit 1 because it's so close. It will never be exactly 1, because of adding 3 odd numbers, but autists won't accept the fact that 1/3 or 2/3 cannot be represented using only recurring 3's when applying the fractions to the number 1.

I wouldn't say there "can't" be exact numbers. I don't know what that means. 1 is surely an exact number, so is 10. I mean, are you considering 1.5 an inexact number? It seems pretty exact to me.

And it seems like there are exact numbers all around us. There's 1 computer in front of me, sitting on 1 desk, next to which are 2 empty bottles of water.

Sure, the way we carve things up in the world (what makes an object ONE object, the object that it is) is sometimes imprecise, and in many cases the boundaries of one particular thing are nebulous (where's the exact point at which a heap of land becomes a mountain?). But this has nothing to do with numbers as such.

>there's no number that separates 0.999... from 1
There is, it's called an infinitesimal.

They're not in the real numbers

I wouldn't say that either. I did happen to say that there wasn't "an", not that there aren't any.
The way you speak, either you're trying extremely hard to seem genuine, or you just need to for some other, real-world reason.

Whoa, a mistake. I meant "an" with focus on the latter content of the post, not all possibilities.

Not sure what you mean by trying extremely hard to seem genuine. Though I totally misunderstood the post of yours I was initially responding to, so I'm sorry about that. I thought you were saying something like "there aren't exact numbers period, so 0.999... is just for the sake of simplicity counted as 1 because, hey, there aren't exact numbers anyway."

Anyway, sorry for the misunderstanding.

I guess I was "trying hard" (if writing a quick response on Sup Forums is trying hard...) to seem genuine because, yeah, I am interested in mathematics, but I know dick about it, at least when it gets to anything technical.

Ehh, don't worry about it, I'm just not used to very coherent, neutral communication on /b/.

What.

For those interested, there is another proof of this.

First, we show that if a < b, then a < (a+b)/2.
Seeking contradiction, suppose instead that a >= (a+b)/2,
hence 2a >= a+b,
and a >= b, which violates our initial condition.

Thus, a < (a+b)/2 if a < b.

Now, seeking contradiction, we assume 0.999... < 1. Then by the above, 0.999... < (0.999... + 1)/2 = 1.999.../2 = 0.999..., which is impossible, so 0.999... >= 1.
Since 0.999... cannot be greater than 1, it must be equal to one.

Those are a different kind of maths, user; specify whether you're calculating in the real numbers or in the surreal numbers.

Whoa, whoa, that's pretty cool. Thanks for this. Still need to mull it over, but I think I see what it's getting at.

How does a contradiction in which numbers are shown to be the same mean that a specific string of letters/numbers somehow increases in value? (Maybe "why does" would be a bit more trusting.)

I didn't write the proof, but the contradiction shows that 0.999... has to be 1.

The idea is that if any number A is less than some other number B, then in all cases A will be less than the sum of A and B divided by 2. That's just a general truth.

But plug in 1 and 0.999..., and we get the result that 0.999... is less than 0.999..., i.e., we get a number that is less than itself. This is a contradiction. Remember that the numbers we started with were 0.999... and 1, so this means that 0.999... has to be greater than or equal to 1, and we know that it's not greater than 1, so it must be equal to 1.

This is really cool, thanks user. A nice proof by contradiction.

If we assume that 0.999... is less than 1, then we end up with 0.999... < 0.999..., which can't happen.

(Hope I got that right; someone correct anything that's unclear, misstated, or just flat-out wrong in my attempt to explain.)

What we are doing is proving a statement by assuming the opposite and showing it cannot be true.

First, we show that for arbitrary a and b where a < b, a must be less than (a+b)/2. Seeking contradiction, suppose a >= (a+b)/2; then 2a >= a + b, so a >= b. Because our initial condition is a < b, a >= b contradicts this, and a must be less than (a+b)/2.

The same type of proof is done for the numbers, letting a = 0.999... and b = 1.
If 0.999... were less than 1, then 0.999... would be less than (0.999... + 1)/2 = 0.999..., i.e. less than itself.

This makes no sense, as it is the same number. Thus our assumption that 0.999... < 1 is incorrect, and thus 0.999... >= 1. Since we know 0.999... cannot be greater than 1 and we've shown that it cannot be less than 1, it must be equal to one.
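In symbols, the lemma and the plug-in from the two posts above look like this (just my compact restatement; the division of 1.999... by 2 is done digit by digit, giving 9 with remainder 1 at every step):

```latex
a < b \;\Rightarrow\; 2a < a + b \;\Rightarrow\; a < \tfrac{a+b}{2}.
\qquad
\text{With } a = 0.999\ldots,\ b = 1:\quad
0.999\ldots \;<\; \tfrac{0.999\ldots + 1}{2} \;=\; \tfrac{1.999\ldots}{2} \;=\; 0.999\ldots
```

A number less than itself is impossible, so the assumption 0.999... < 1 fails, and 0.999... >= 1.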

This is called "proof by contradiction."

Yes that is correct. Sorry, I hadn't seen your post so I clarified above.