Definitions work into this because all of mathematics relies on properly defined terms.

Fanta Grape said:
Your first proof makes sense, but that would simply bring me to the conclusion that 0.0...1 = 0, and therefore 0.9... = 1, despite my alternative proof. Would you care to explain that? (That came out a bit sarcastically, but I'm quite sincere, believe me.) [Edit: Bleh, misread that AND articulated the response incorrectly. Could you explain how definitions work into this?]

ManOwaRrior said:
0.000...1 is indeed not defined in mathematics.
If one tried to define it, one would find that 0.00...1 = 0 = 0.00...2 = 0.00...9.
(I can prove this if needed. For now, just note that 0.00...1 - 0 has to be smaller than any given positive number and can't be negative, so it has to be 0. From there it's just a - b = 0 => a = b.)
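A minimal sketch of that argument in standard notation (my phrasing, assuming 0.00...1 is read as a nonnegative quantity smaller than every positive number):

```latex
% Claim: if 0 <= d and d < eps for every eps > 0, then d = 0.
% Proof sketch: suppose d > 0; choosing eps = d gives d < d, a contradiction.
% With d = 0.00...1 - 0, this forces 0.00...1 = 0.
\[
  0 \le d \ \text{and}\ d < \varepsilon \ \text{for all } \varepsilon > 0
  \;\Longrightarrow\; d = 0 .
\]
```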
Problem A: Time, as we perceive it, is not infinite. That's one point where your problem falls apart. The other point is that 1/infinity is not defined. It is not defined because infinity is not a number. If it were, 1/infinity would indeed have to be zero (same proof as above), but then we'd have 1/inf = 0 => 0 * inf = 1, and that makes no sense.
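Spelled out, the contradiction looks like this (a sketch that deliberately treats infinity as an ordinary number, which is exactly the assumption being refuted):

```latex
% If infinity were a number with 1/infinity = 0, multiplying both
% sides by infinity would give 1 = infinity * 0 = 0, contradicting 1 != 0.
\[
  \frac{1}{\infty} = 0
  \;\Longrightarrow\;
  1 = \infty \cdot \frac{1}{\infty} = \infty \cdot 0 = 0 .
\]
```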
Problem B: This evaporates once you realize that the term 1/inf is not defined.
Second question in B: The smallest possible nonnegative decimal number is 0. If you want the smallest possible positive decimal number, well, it doesn't exist.
Google the concept of an open set to learn why. Easy argument: for every positive number x, no matter how small, there is an even smaller one, x/2 for example, that is still positive.
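That density argument in one line:

```latex
% For any positive x, x/2 is strictly between 0 and x,
% so there can be no smallest positive number.
\[
  x > 0 \;\Longrightarrow\; 0 < \frac{x}{2} < x .
\]
```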
Fanta Grape said:
Also, you really just answered problem A by restating the question. I stated that INF is just something I used to express an "infinitely large number". Obviously 1 cannot equal 0, so where did I go wrong?
Regarding the smallest possible positive number, I know it doesn't exist. My issue was: would all infinitely small numbers be the same? If they were, then it could be stated that 1/INF = 0.0...1.
We can write things like 1 + 1 = 2 because 1, 2, +, and = are all defined objects. There are explicit rules for how to use them.
Your problems arise once you use terms like "an infinitely large/small number".
Those terms, as you use them, are not defined in mathematics and therefore cannot be used in a mathematical context.
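For contrast, 0.999... itself is perfectly well-defined: an infinite decimal is defined as the limit of its partial sums, and here that limit is a geometric series (a standard sketch, not specific to this thread):

```latex
% The partial sums 0.9, 0.99, 0.999, ... form a geometric series
% with first term 9/10 and ratio 1/10, and the limit is exactly 1:
\[
  0.\overline{9}
  \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
  \;=\; \frac{9/10}{1 - 1/10}
  \;=\; 1 .
\]
```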
Your alternative proof that 1 = 0 falls apart when you claim that (1/inf) * inf = 1.
There are no defined rules in mathematics for dividing and multiplying by "infinitely large/small numbers", precisely because you run into problems like this.
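One way to see why no single rule for inf * 0 can work is via limits, which is how standard analysis handles expressions involving infinity (a sketch in my own notation):

```latex
% Both products look like "infinity * 0" in the limit, yet they
% converge to different values, so "infinity * 0" cannot be
% assigned one consistent value:
\[
  \lim_{x \to \infty} x \cdot \frac{1}{x} = 1 ,
  \qquad
  \lim_{x \to \infty} x \cdot \frac{1}{x^{2}} = 0 .
\]
```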
Those numbers and rules exist in the hyperreals, but they exceed my mathematical understanding.
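For completeness, a one-line sketch of what such a rule looks like there (stated from the hyperreal field axioms, beyond what this thread covers):

```latex
% The hyperreals form a field, so a nonzero infinitesimal eps has a
% multiplicative inverse 1/eps (an infinite hyperreal), and
\[
  \varepsilon \ne 0 \;\Longrightarrow\; \varepsilon \cdot \frac{1}{\varepsilon} = 1 .
\]
```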