Dividing a number by zero

I must have gotten you confused with Pete, because you were backing him up and this is what Pete has been saying. That is the whole point: if there is a factorization of zero, then the variable in the other factor becomes meaningless. It will start to act like it is a zero itself. You will get equations where only zero could be a correct solution for that variable.
I'm simply saying that 0 = 0 x a, where a is any number.
This is not controversial, it's very simple arithmetic. It's so simple that it's embarrassing to argue about it.
You wrongly insist that a must be 0, because you are confused over (slightly) higher mathematics, which leads you to get the basic stuff wrong.
As a rule of thumb, if your understanding of a topic leads you to an obviously wrong conclusion, then your understanding is incorrect.
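For what it's worth, here is the standard one-line argument, a sketch that uses nothing beyond the distributive law and cancelling 0 x a from both sides:

$$0 \cdot a = (0 + 0) \cdot a = 0 \cdot a + 0 \cdot a \;\;\Longrightarrow\;\; 0 = 0 \cdot a$$

Nothing in that derivation constrains a; it stays an arbitrary number throughout.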

So all I am saying is that if you factor out a zero and "a" is left as the other factor, you could end up with something like a + a = a.
Please demonstrate this assertion.

And you haven't yet answered my last post.
 
Please demonstrate this assertion.

And you haven't yet answered my last post.
It was already demonstrated in the original post that brought this up: the proof that 1 + 1 = 2. I am saying that if you correctly analyze what happened, then the value of "a" cannot be 1, so that figure has been placed where it shouldn't be. The error was brought about by not obeying the rule about a = b in those expressions; the reason a =/= b is that assuming a = b makes "a" and "b" act like zero. In those expressions a =/= 0 and b =/= 0, and that is why a = b is not valid in them. If you assume that a = b, or that a = 0 and b = 0, then you have broken the mathematical rule that says those values are not allowed in those expressions.

By saying that it is the division, you're actually saying that the limit gives the incorrect value, but it is not the limit that is the problem. The problem is that someone thought it would be funny to say that a = b in an equation even though their algebra book told them not to, because it gives wrong answers. All I am saying is that the mathematical rule is in the correct location; it shouldn't be moved to where you would find the limit. I am just trying to explain why math is correct just the way it is; it doesn't have rules that apply to this simple arithmetic you keep spitting out.

You cannot think about the math that way, and you cannot do the math that way and be correct in analytical algebra. First I got flamed for not being able to do this; then, when I try to explain how to do it, I find that I am wrong. And this is because 0 x a = 0? I already told you a dozen times that you cannot say that in analytical algebra. There is actually a rule that says that. That would be the proof in the example where they said 1 + 1 = 2: if you don't obey that rule, then yeah, you get the wrong answer for it being anything other than zero.
 
It was already demonstrated in the original post that brought this up: the proof that 1 + 1 = 2.
That proof failed because it relied on setting 0/0 = 1

Up until that step, it was true for all values of a.

Do you have a demonstration that doesn't involve dividing by zero?
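For reference, the derivation being discussed is presumably the classic a = b fallacy; since the original post isn't quoted in this thread, the exact form below is an assumption, but the usual version runs:

$$\begin{aligned}
a &= b \\
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b \qquad \text{(dividing both sides by } a-b\text{, which is } 0\text{)} \\
2b &= b \\
2 &= 1
\end{aligned}$$

Every line holds for every value of a until the division by (a - b); that division is exactly the 0/0 = 1 step.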

Pete said:
Layman said:
Try it with a^2 - b^2 = 0, (a+b) = 10

0 = 10 x 0
$$a^2 - b^2 = (a+b)(a-b)$$

We've factorized 10 from 0.
We've factorized (a+b) from (a^2 - b^2)
Then you can take the limit and find out 1 = 10.

Prove it
Still waiting...
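Meanwhile, here is what that limit actually gives, a sketch using the same symbols as above and holding a + b at 10 while a approaches b:

$$\lim_{a \to b} \frac{a^2 - b^2}{a - b} = \lim_{a \to b} (a + b) = 2b = 10$$

The quotient equals a + b for every a =/= b, so the limit is 10. Nothing in it equates 1 with 10 unless you first declare 0/0 to be the number 1.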

And this is because 0 x a = 0? I already told you a dozen times that you cannot say that in analytical algebra. There is actually a rule that says that.
There is no such rule. Open a textbook. Ask a maths instructor. Learn.
 
That proof failed because it relied on setting 0/0 = 1
Then you just don't know how to do limits. This is a common practice when working with limits; it is involved in proofs for different types of mathematical equations. The prime example is the proof for the tangent line to a curve. If limits were wrong, then it wouldn't be a proof at all. There is no hidden value with any meaning from factoring out zero; it isn't even a valid operation. My proof is that a =/= b has held in those expressions for thousands and thousands of years. I don't think you're capable of understanding why those rules were put in place, and I don't care to try anymore. The level of incompetence here is completely dumbfounding.
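For concreteness, the tangent-line proof being referred to is presumably the standard difference quotient; here is a sketch with f(x) = x^2 as an illustrative choice:

$$f'(x) = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h} = \lim_{h \to 0} \frac{2xh + h^2}{h} = \lim_{h \to 0} (2x + h) = 2x$$

The factor of h is cancelled while h is still nonzero, and only then is the limit taken.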
 
The majority of our members are young chronologically, emotionally, or both. Developing techniques for getting through to them is good practice for the real world.
I don't even have a degree in psychology. A discovery of that magnitude would clearly be worthy of a Nobel Prize, and would be a monumental achievement known throughout history for the rest of time. I don't know if I am really up to the task at hand, or if I have what it takes to achieve such a goal.
 