The Structural analysis

Here's a formal proof that an indefinite integral leads to a constant of integration:
----

Definition: $$F(x)$$ is an antiderivative of $$f(x)$$ if $$\frac{dF}{dx} = f(x)$$.

Proof:

Consider two antiderivatives of $$f(x)$$, which we will call $$F(x)$$ and $$G(x)$$.

Now consider the function defined as $$H(x) = F(x) - G(x)$$.

Take the derivative of $$H(x)$$:

$$\frac{dH}{dx} = \frac{dF}{dx} - \frac{dG}{dx} = f(x) - f(x) = 0$$

where we have used the fact that F and G are antiderivatives of f.

Now because $$\frac{dH}{dx} = 0$$, we know that $$H(x)$$ is a constant (this is a standard consequence of the Mean Value Theorem: a function whose derivative is zero on an interval cannot take two different values there). Let's call this constant c and check that this statement is correct.

If $$H(x) = c$$, then sure enough $$\frac{dH}{dx} = 0$$, as required.

But from the definition of $$H(x)$$ above we have $$H(x) = F(x) - G(x) = c$$.

And so we conclude that

$$F(x) = G(x) + c$$

where $$c$$ is an arbitrary constant.

Now, recall that $$F(x)$$ and $$G(x)$$ were defined to be arbitrary antiderivatives of $$f(x)$$. We have now proved that arbitrary antiderivatives of f(x) may differ only by an additive constant $$c$$.

Therefore, a general statement of antidifferentiation is:

$$\int f(x)~dx =G(x) + c$$

where $$G(x)$$ is any antiderivative of $$f(x)$$.

Q.E.D.
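
(If you want to see the conclusion concretely, here is a minimal SymPy sketch. The particular choices f(x) = 2x, F and G are purely for illustration, not part of the proof.)

Code:
import sympy as sp

x = sp.symbols('x')
f = 2*x                      # the function being antidifferentiated
F = x**2 + 5                 # one antiderivative (constant picked arbitrarily)
G = x**2 - 3                 # another antiderivative

assert sp.diff(F, x) == f    # F is an antiderivative of f
assert sp.diff(G, x) == f    # so is G

H = F - G                    # H has zero derivative...
assert sp.diff(H, x) == 0
print(H)                     # ...and is the constant 8 = 5 - (-3)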
 
Now, mishin05:

I have now given you in post #101 a complete proof of the statement

$$\int f(x)~dx = F(x) + c$$

Do you wish to dispute my proof?

If so, you need to point out where I have made a mistake in post #101.

Nothing else will advance your argument.

Good luck!
 
Taking the indefinite integral of both sides of this equation and using the fundamental theorem of algebra, we must recover the initial statement.

Err, you mean Fundamental Theorem of Calculus, no? Not seeing where we'd use the Fundamental Theorem of Algebra here...

$$\int \frac{df}{dx}~dx = \int \frac{dg}{dx}~dx + c$$ (*)

Now, suppose we put $$f(x) =0; g(x) = c$$.

That should be $$g(x) = -c$$, although it makes no difference.

However, quibbles aside:

So, what about your reasoning:

$$\int 0~dx = 0\int 1~dx = 0 (x + c) = 0$$?

The problem is with the assumption that

$$0 \times c = 0$$.

If c is an arbitrary constant, then it could be infinite. And if c is infinite, then $$0 \times c$$ is undefined. It could be 17 or 183 or -24 or $$\pi$$.

No, the constant of integration is a real number (and so, finite). The expression $$x + \infty$$ is not a valid antiderivative of $$f(x)=1$$, for example, exactly because of the inclusion of the infinity. So, the latter part of that reasoning is indeed correct: zero times the antiderivative of a constant function is indeed equal to zero. We'd have serious problems otherwise....

The error is in the very first part. I.e., the use of the linearity property of indefinite integrals, which is commonly stated as $$\int af(x)~dx = a\int f(x)~dx $$. However, that formula is only valid when a is not equal to zero. The general version would be:

$$\int af(x)~dx = a\int f(x)~dx + C$$.

For a non-zero value of a, you get another constant of integration out of the indefinite integral of f(x), which you can just roll into the constant above to get the usual high school textbook formulation. But when a equals zero, we need to include a constant of integration in the linearity step or we end up with confusion like in this thread :]
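
If you want to see that distinction in software, here's a minimal SymPy sketch (note that SymPy always returns one particular antiderivative and silently drops the +C, so this only illustrates the collapse at a = 0):

Code:
import sympy as sp

x = sp.symbols('x')

# a != 0: the two sides agree up to the implicit constant of integration.
print(sp.integrate(3, x))     # 3*x
print(3*sp.integrate(1, x))   # 3*x

# a = 0: pulling the zero outside turns 0*(x + C) into the single
# function 0, annihilating the constant of integration.
print(sp.integrate(0, x))     # 0
print(0*sp.integrate(1, x))   # 0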
 
quadraphonics:

Thank you for the corrections and clarifications.

I knew I was on shaky ground with that post. I posted it in between doing real work, when I wasn't really concentrating.

I think post #101 achieves what I wanted to show mishin05, so I'm very happy to ditch my previous effort at this point.
 
The more I read the posts of those people who write awful things about me, the more I am convinced that they are worthless people, but with a lot of ambition, because they have read other people's thoughts in textbooks and now think those thoughts are their own. Then suddenly they read a person whose thoughts don't coincide with theirs. What to do? The clever person will say: "Come, let us look; give me the proofs!" The idiot will say: "I don't want to hear it. Shut him up," because apart from someone else's thoughts he has nothing.
I don't mean you personally, but very, very many.
You seem to be labouring under the misconception that people here are simply regurgitating rote learning, internalized without any underlying understanding. That's not the case. A number of people here are actual, proper scientists, and many more have seen the proofs underlying basic calculus. Unless you can point to an actual error in, say, the proof of fundamental theorem of calculus (a proof that you certainly see in any first year course on analysis, and which I remember seeing a version of in high school), you're not going to convince any of us.

Furthermore, there's a formal definition of what the derivative is, and if you directly apply that definition, the result is that the derivative of a constant is zero. Can you see why this is the case? It's a very basic limit...
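
For the record, that limit spelled out: for a constant function $$f(x) = c$$, the definition gives

$$f'(x) = \lim_{h \to 0}\frac{f(x+h) - f(x)}{h} = \lim_{h \to 0}\frac{c - c}{h} = \lim_{h \to 0} 0 = 0,$$

since the difference quotient is identically zero for every $$h \neq 0$$.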
Now to the question at hand: how do you take a derivative without losing the constant?

I'll show you:

$$f(x)=x^2+C, f'(x)=?$$

$$t^2=x^2+C, t(x)=\sqrt{x^2+C}$$;

$$f'(x)=\frac{d(x^2+C)}{d\sqrt{x^2+C}}=2\sqrt{x^2+C}$$;

$$x^2+C=\int\limits_{0}^{\sqrt{x^2+C}}2t~dt$$.
That's just pure nonsense. To even approach the beginnings of a formal theory, you need to start with a formal definition of a "mishin05 derivative". Also, what does the $$\frac{d(x^2+C)}{d\sqrt{x^2+C}}$$ notation mean, formally? It's certainly not the $$\frac{d}{dx}$$ differentiation operator, I gather.

Currently, you're just positing false equations using a mixture of well-known notations.
 
Here's a formal proof that an indefinite integral leads to a constant of integration:
----

Definition: $$F(x)$$ is an antiderivative of $$f(x)$$ if $$\frac{dF}{dx} = f(x)$$.

Proof:

Consider two antiderivatives of $$f(x)$$, which we will call $$F(x)$$ and $$G(x)$$.

Now consider the function defined as $$H(x) = F(x) - G(x)$$.

Take the derivative of $$H(x)$$:

$$\frac{dH}{dx} = \frac{dF}{dx} - \frac{dG}{dx} = f(x) - f(x) = 0$$

where we have used the fact that F and G are antiderivatives of f.

Now because $$\frac{dH}{dx} = 0$$, we know that $$H(x)$$ is a constant (this is a standard consequence of the Mean Value Theorem: a function whose derivative is zero on an interval cannot take two different values there). Let's call this constant c and check that this statement is correct.

If $$H(x) = c$$, then sure enough $$\frac{dH}{dx} = 0$$, as required.

But from the definition of $$H(x)$$ above we have $$H(x) = F(x) - G(x) = c$$.

And so we conclude that

$$F(x) = G(x) + c$$

where $$c$$ is an arbitrary constant.

Now, recall that $$F(x)$$ and $$G(x)$$ were defined to be arbitrary antiderivatives of $$f(x)$$. We have now proved that arbitrary antiderivatives of f(x) may differ only by an additive constant $$c$$.
At this point the logical transition isn't clear!

$$\frac{dF}{dx} = \frac{dG}{dx} + \frac{dH}{dx}$$;

$$ f(x) = f(x) + 0$$;

$$ f(x) = f(x) $$.

Now start again from here, in more detail!

Therefore, a general statement of antidifferentiation is:

$$\int f(x)~dx =G(x) + c$$

where $$G(x)$$ is any antiderivative of $$f(x)$$.

Q.E.D.
 
You seem to be labouring under the misconception that people here are simply regurgitating rote learning, internalized without any underlying understanding. That's not the case. A number of people here are actual, proper scientists, and many more have seen the proofs underlying basic calculus. Unless you can point to an actual error in, say, the proof of fundamental theorem of calculus (a proof that you certainly see in any first year course on analysis, and which I remember seeing a version of in high school), you're not going to convince any of us.

Furthermore, there's a formal definition of what the derivative is, and if you directly apply that definition, the result is that the derivative of a constant is zero. Can you see why this is the case? It's a very basic limit...

$$m=\frac{\text{change in }y}{\text{change in }t}=\frac{\Delta y}{\Delta t}=\frac{f(t+h)-f(t)}{h}$$;

$$f(t)=\lim_{\Delta t\rightarrow 0}\frac{F(t+h)-F(t)}{h}$$;

$$G(x)+C=G(t), t=H(x)$$;

$$\frac{dG(t)}{dt}=\frac{d(G(x))}{dH(x)}$$

Does that differ from the derivative at all?

That's just pure nonsense. To even approach the beginnings of a formal theory, you need to start with a formal definition of a "mishin05 derivative". Also, what does the $$\frac{d(x^2+C)}{d\sqrt{x^2+C}}$$ notation mean, formally? It's certainly not the $$\frac{d}{dx}$$ differentiation operator, I gather.

Currently, you're just positing false equations using a mixture of well-known notations.

At you in a head cream of wheat!

Formally, the operator looks like this:

$$\frac{d}{d(\text{argument of function})}$$


It makes no difference to the operator which letter you substitute. It is your right to choose the argument with respect to which you differentiate the function.
 
$$m=\frac{\text{change in }y}{\text{change in }t}=\frac{\Delta y}{\Delta t}=\frac{f(t+h)-f(t)}{h}$$;

$$f(t)=\lim_{\Delta t\rightarrow 0}\frac{F(t+h)-F(t)}{h}$$;

$$G(x)+C=G(t), t=H(x)$$;

$$\frac{dG(t)}{dt}=\frac{d(G(x))}{dH(x)}$$

Does that differ from the derivative at all?
Your variables are all over the place and completely undefined, as are your function symbols, you don't define which part is your notation and which part is common notation, and even a generous reading makes most of it either wrong or so ill-defined as to make it an error in itself. Fail, in other words.

Yes, I'd say that differs from the standard definition.
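
For contrast, here is the standard definition applied mechanically, in a minimal SymPy sketch (the choice $$f(x) = \sin x$$ is arbitrary, purely for illustration):

Code:
import sympy as sp

x, h = sp.symbols('x h')

# Difference quotient from the standard definition of the derivative:
difference_quotient = (sp.sin(x + h) - sp.sin(x)) / h
print(sp.limit(difference_quotient, h, 0))   # cos(x)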
At you in a head cream of wheat!
Right, that's going in as my user title. I've no idea what the original insult is, but this mechanical translation is wonderful!
Formally, the operator looks like this:

$$\frac{d}{d(\text{argument of function})}$$


It makes no difference to the operator which letter you substitute. It is your right to choose the argument with respect to which you differentiate the function.
But $$\sqrt{x^2+C}$$ isn't a variable.
 
But $$\sqrt{x^2+C}$$ isn't a variable.
Prove! All of you speak very much and think that have the right to give estimations. The one who is responsible for the words has the right to give an estimation. Let's make experiment. I assert that you the ordinary talker, which besides also it is semiliterate. To deny this my opinion prove that $$\sqrt{x^2+C}$$ isn't a variable.
 
Suppose I have the volume of a sphere, $$V =\frac{4}{3}\pi r^3$$. I want to find how the sphere is structured:

1. With respect to the equator: $$\frac{dV}{d(2\pi r)}$$.

2. With respect to the area of the base of a hemisphere: $$\frac{dV}{d(\pi r^2)}$$.

3. It doesn't interest me how it is structured with respect to the radius: $$\frac{dV}{dr}$$.

Who can force me to use item 3? You???!!!
 
Prove! All of you speak very much and think that have the right to give estimations. The one who is responsible for the words has the right to give an estimation. Let's make experiment. I assert that you the ordinary talker, which besides also it is semiliterate. To deny this my opinion prove that $$\sqrt{x^2+C}$$ isn't a variable.
I assume that translate.google.com is challenging my reading comprehension with this gibberish (which is quite funny given that, you know, mishin05 apparently cannot form a coherent sentence of English on his own.)

Variable. Expression. Which one is $$\sqrt{x^2+C}$$, now?
 
You're not seriously arguing that, are you?

Because, you know, if you cannot speak the language (pun so intended), then there's really no point in continuing. Your "I'm invincible!" act isn't that entertaining.
 
People come to a mathematics forum and can't express their thoughts with formulas.
The fact that you don't understand high school mathematics isn't our fault; some of us understand mathematics just fine, thanks. I'm a mathematician by profession. I spent 8 years in university doing it. Others in this thread (and your other one) have similar experience in other mathematically inclined scientific disciplines. And I've also linked you to lecture notes by professors of mathematics, which you either didn't read or didn't understand.

Previously you said:

2."...If $$\frac{df}{dx} = \frac{dg}{dx}$$ then it means that they different only by a constant..."

You haven't convinced me! I see that $$f=g$$!
I gave you an example of an f and g such that $$\frac{df}{dx} = \frac{dg}{dx}$$ but $$f \neq g$$. You failed to even grasp your mistake there.

Your entire premise is based on the fact you think mathematics says $$\int 0 dx = C$$. It doesn't, unless C=0, since $$\int 0 dx = 0$$ and no one here or in the mathematics community says otherwise. It says that $$\int \frac{df}{dy} dy = f + C$$, where C is determined by the limits but since they aren't explicitly stated you can't say what C is. If we make them explicit, say $$\int_{a}^{x} \frac{df}{dy} dy$$ then you get $$\int_{a}^{x} \frac{df}{dy} dy = f(x) - f(a)$$ so $$C = -f(a)$$.

The C doesn't come from integrating $$\int 0 dx$$, it comes from the fact an integral of a non-zero function over a specified interval produces two terms, one for each end of the interval.
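
A quick SymPy check of that last point (the concrete choice $$f(y) = y^2$$ is arbitrary, just for illustration):

Code:
import sympy as sp

a, x, y = sp.symbols('a x y')
f = y**2                                        # arbitrary concrete f

# Integrating df/dy over the explicit interval [a, x]:
print(sp.integrate(sp.diff(f, y), (y, a, x)))   # x**2 - a**2 = f(x) - f(a)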

I wonder, in what countries do such people live?
Trust me, you don't want to go down the route of "Where did you learn maths then?".
 
Surprising! People come to a mathematics forum and can't express their thoughts with formulas. They make a laughing-stock of themselves, and laugh at themselves! Surprising people! I wonder, in what countries do such people live?

I'm the one that offered an impartial computer program to put the math to the test.

I'm not the one who moved this thread to pseudo science.
 
I'm a mathematician by profession. I spent 8 years in university doing it. $$\int_{a}^{x} \frac{df}{dy} dy = f(x) - f(a)$$ so $$C = -f(a)$$.

Now we will check!

The equation of the curve $$y=x^3$$ is given.

Find the equation of the tangent at the point $$x=a$$!

$$y=f(a)+f'(a)(x-a).$$

According to mathematician AlphaNumeric, $$f(a)=C$$.

Then $$f'(a)=0$$. Hence $$y=a^3$$ is the equation of the tangent!
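
For reference, the textbook computation with $$f'(a)$$ actually evaluated rather than set to zero, in a minimal SymPy sketch:

Code:
import sympy as sp

a, x = sp.symbols('a x')
f = x**3

fa  = f.subs(x, a)                  # f(a)  = a**3
fpa = sp.diff(f, x).subs(x, a)      # f'(a) = 3*a**2, which is not 0

tangent = fa + fpa*(x - a)          # y = f(a) + f'(a)*(x - a)
print(sp.expand(tangent))           # 3*a**2*x - 2*a**3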
 
I knew I was on shaky ground with that post. I posted it in between doing real work, when I wasn't really concentrating.

The whole constants of integration thing is a pedagogical mess, and that's a big part of the reason that nobody ever bothers with indefinite integrals after freshman calculus. I had to stare at your earlier post for like 20 minutes to figure out where the error was - time I should have spent working :p

That said, as long as we're going to the trouble to get such things right, I suggest that the solution is not so much to add a "+C" to the linearity rule, as to never write an indefinite integral anywhere without a "+C". This ensures that you never lose track.

I.e., the linearity rule should be written as

$$\int af(x)~dx + C_1 = a\int f(x)~dx+C_2$$

to begin with, avoiding any ambiguities. Also note that the additivity rule will produce the same defect if not phrased carefully. I.e., the lazy formulation

$$\int \left[f(x)+g(x)\right]~dx = \int f(x)~dx+ \int g(x)~dx$$

also "proves" that $$\int 0~dx = 0$$ when $$ f(x) = -g(x)$$. Not so

$$\int \left[f(x)+g(x)\right]~dx + C_1 = \int f(x)~dx+ \int g(x)~dx + C_2$$.

... although, even there I succumbed to the temptation to combine constants of integration on the RHS :[

At any rate, I've edited the Wikipedia page on Antiderivatives accordingly....
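
For anyone who wants to play with the explicit-constant bookkeeping in software, here's a minimal SymPy sketch (the helper antiderivative here is ad hoc, purely illustrative):

Code:
import sympy as sp

x, C1, C2 = sp.symbols('x C1 C2')

def antiderivative(expr, C):
    # Indefinite integral written with its constant made explicit.
    return sp.integrate(expr, x) + C

# The careful linearity statement at the problem point a = 0:
lhs = antiderivative(0, C1)     # C1: any constant is an antiderivative of 0
rhs = 0*antiderivative(1, C2)   # 0*(x + C2) = 0: the constant is annihilated
print(lhs, rhs)                 # prints: C1 0 -- equal only if C1 = 0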
 