Biochemistry and Information Theory

Perhaps I should have put quotes around "strategy".
You're aware that there is evidence of co-operative and non-co-operative (i.e. cheating) behaviour in bacteria, such as in ones that cause disease? And that they can gain an advantage by cheating, by being filial, and so on, just as we do?
You should have: put quotes around strategy. In textbooks and papers where strategy and purpose are referred to, the context is usually abundantly clear that these terms are not being used in a teleological sense. I have not formed that impression in reading your posts. As previously noted, this is either because you think evolution is teleological, or because you express yourself badly.
Why not clear the matter up now, using simple, direct English?
 
I'm using it in an evolutionary sense: in the sense that Life has to improve the way it fits, or responds. It's not a passive system; it's active.
Why that should mean it's teleological, or looks like it is, isn't the purview of Darwin's theory, though.
It doesn't explain anthropology, except in an adaptive sense. Why aren't we still using stone tools, or living in the open?
 
Biochemistry's biggest concern, as far as resources go, is just that: manipulation of livestock and feeds, as well as other food products. Maximization.

Whereas information handling tends to be analization of data yields. No final answer there. I think the oldest pool of information in use to date comes from medical science. Not those 10-year studies or whatever.
But, constant updates.

Not really inter-related. Other than allergies.
 
What we know about proteins hasn't even begun to explain much of it yet.
But explain why so many are common to all the lifeforms we've looked inside of: ATP synthase, for example, the one that keeps the battery charged.

Are you talking about excretion and secretion, or just generalising with "analization"?
Bacteria don't have an anus, or a gut. Food is information; so is energy.

Allergies in humans (a single representative species) are related to proteins interacting with other kinds of chemicals, and other proteins (which are, of course, chemicals).
 
"The evolution of ATP synthase is thought to be an example of modular evolution, where two subunits with their own functions have become associated and gained new functionality.
The F1 particle shows significant similarity to hexameric DNA helicases and the FO particle shows some similarity to H+ powered flagellar motor complexes.

The α3β3 hexamer of the F1 particle shows significant structural similarity to hexameric DNA helicases; both form a ring with 3-fold rotational symmetry with a central pore. Both also have roles dependent on the relative rotation of a macromolecule within the pore; the DNA helicases use the helical shape of DNA to drive their motion along the DNA molecule and to detect supercoiling, whilst the α3β3 hexamer uses the conformational changes due to rotation of the γ subunit to drive an enzymatic reaction.

The H+ motor of the FO particle shows great functional similarity to the H+ motors seen in flagellar motors. Both feature a ring of many small alpha helical proteins which rotate relative to nearby stationary proteins using a H+ potential gradient as an energy source.
This is, however, a fairly tenuous link - the overall structure of flagellar motors is far more complex than the FO particle and the ring of rotating proteins is far larger, with around 30 compared to the 10, 11 or 14 known in the FO complex.

The modular evolution theory for the origin of ATP synthase suggests that two subunits with independent function, a DNA helicase with ATPase activity and a H+ motor, were able to bind, with the rotation of the motor driving the ATPase activity of the helicase in reverse.
This would then evolve to become more efficient, and eventually develop into the complex ATP synthases seen today. Alternatively, the DNA helicase/H+ motor complex may have had H+ pump activity, with the ATPase activity of the helicase driving the H+ motor in reverse.
This could later evolve to carry out the reverse reaction and act as an ATP synthase."
-wikipedia.org

A chance meeting of protein molecules...?
 
The teleological argument, which implies planning and intentional activity, goes nowhere.

Teleonomy has replaced this idea. Life evolves because it is adaptive; it's that simple.
Evolution is directed by the blind changes made to genes, and by the selection of those organisms that adapt any new functionality. Those best adapted to the current conditions survive, implying that "favourable" changes survive in a teleonomic sense: because life "makes" them survive.

Otherwise the apparent purposefulness of living things is a big illusion, organisms are just complex chemical "packages", and have no effect on their environment, or each other. The only changes are due to some "process", whatever it gets called (evolution, god, cosmic rays, some external agency).

Since organisms are observed changing their environment (competitively or cooperatively), and each other (co-evolving), and since organisms aren't just complex packages (they do things, in an autonomous way, they respond to changes, they "die"), then agency still needs to be explained.
 
You are creating problems that don't exist from an insistence on misusing words, or rather taking commonplace meanings for words and applying these as though they had scientific rigour.
Consequently I really don't see any point in continuing to debate this with you.
 
Hipparchia said:
...problems that don't exist from an insistence on misusing words,
What words? Which ones am I misusing?
I insist that you are the one doing this, actually. Words like "teleology", for instance. "Evolution" is another misused word, not just by you, either.

Are you seriously claiming that the teleological "conundrum" has been settled? Teleonomy just takes it to a somewhat different playing-field.
 
Ah well. Looks like I'll be talking to myself, for the most part, since the only commentators have decided to all beggar off (I could have said "bugger off", but I didn't).

So where was I?

Oh right, Information Theory: the idea of channels and connections; something that gets delivered, like a newspaper, or a message of some kind. In the abstract world of IT, information is a string of arbitrary bits; you might chop the string up into chunks and call them words, or symbols, or tokens. Again, this is arbitrary.

In good old binary, only two symbols (two states) are needed, and binary comes in useful chunks of powers of 2.
You can have arbitrarily complex alphabets, which are just representative "mappings" of any set of possible symbols, say 2^16. One subset might represent certain instructions, another subset might represent values, fractional and integral, and another might be pixels in a compressed or uncompressed ("raw") image. Each depends, obviously, on how it is interpreted, or on what is done with or to it (raw video data usually won't map to a process image run on a CPU, for instance). The key is recognition: that every pattern is different.
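
A minimal sketch of that point, in Python (the byte values here are arbitrary, chosen only for the example): the same bit string read under three different mappings gives three different "meanings".

```python
# The same 4 bytes interpreted three ways; the bits never change,
# only the mapping applied by the interpreter does.
import struct

raw = bytes([0x42, 0x48, 0x65, 0x78])     # one arbitrary bit string

as_text   = raw.decode("ascii")           # 'BHex'       -- as a character alphabet
as_uint32 = struct.unpack(">I", raw)[0]   # 1112040824   -- as one big-endian integer
as_pixels = list(raw)                     # [66, 72, 101, 120] -- as grey-level pixels

print(as_text, as_uint32, as_pixels)
```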

If you, an observer, start receiving a stream of bits down some connection, and you don't know what the protocol is, all you can do is store it, and wait for more.

You might compare what's arriving with what you have received, in a kind of continuous search for some pattern (a repeated pattern, say). If it's just a continuous stream of random bits, you're sunk: without some kind of repetition to "recognise", there's only meaningless information.
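
A toy sketch of that "store and wait for repetition" idea in Python (illustrative only, not any real protocol detector): buffer the stream, then look for the longest chunk that occurs twice.

```python
# Search a buffered bit stream for the longest repeated substring;
# a purely random-looking stream yields nothing to recognise.

def longest_repeat(bits: str) -> str:
    """Return the longest substring that occurs at least twice."""
    for size in range(len(bits) // 2, 0, -1):
        seen = set()
        for i in range(len(bits) - size + 1):
            chunk = bits[i:i + size]
            if chunk in seen:
                return chunk
            seen.add(chunk)
    return ""  # nothing repeats: meaningless, so far

print(longest_repeat("0110100101101001"))  # '01101001' repeats -> a pattern
print(longest_repeat("01"))                # too short to repeat -> ''
```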
 
OK, still hanging on to your brain cells?

So information, as such, is a meaningless pattern unless some order, or structure, is also implied.

Shannon treats the probability (expectation) of the arrival of any agreed-upon message as its "information entropy". At this point he is talking about ordered information (not what that order "means"), and how ordering it (by segmenting it into manageable chunks, or words) gives it a probabilistic character.
Then, given any such ordering, any resulting alphabet of words or symbols has a probability (per symbol) of arriving, based on how many times each symbol has arrived previously, and on the channel's capacity. Capacity is related to "noise", or background chaos.
Every channel has a capacity limit, which Shannon's analysis clearly shows.
It also shows that the "entropy", or probabilistic nature, of information is symmetrical with thermodynamic entropy, and its tendency to disperse, or radiate away from a common centre.
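
A minimal sketch of the per-symbol calculation, in Python, estimating each symbol's probability from how often it has arrived (H = -sum of p·log2 p):

```python
# Shannon entropy of a message, in bits per symbol, estimated from
# observed symbol frequencies.
from collections import Counter
from math import log2

def entropy_bits(message: str) -> float:
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(entropy_bits("aaaa"))      # 0.0 -- perfectly predictable, no surprise
print(entropy_bits("abab"))      # 1.0 -- two equiprobable symbols, 1 bit each
print(entropy_bits("abcdefgh"))  # 3.0 -- eight equiprobable symbols, 3 bits each
```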

Information has mass and energy, in other words. Heat is what "moves" it around, or alternatively heat is "thermodynamic information".
 
Clearly, any analysis of messages, which can be represented as binary strings of arbitrary bits (the physical representation doesn't matter), shows that information is conditional.

Information entropy is a measure of the relative frequency, or expectation, of any message (token or symbol), and how much change or difference there is between any messages (so how many, or how few, bits can represent the alphabet).

Information is an abstraction; everything in principle has an "informational representation", which is not concerned with how changes occur (to any pattern), but with what the changes are (the differences between strings of arbitrary bits). If there are only two "messages", then a single bit, with its two values (1 and 0), is sufficient as a representation. What those two messages actually mean (the interpretation) is distinct from the fundamental difference between 1 and 0.
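
To put a number on "how many, or how few, bits": a fixed-length code for an alphabet of N distinct messages needs ceil(log2 N) bits per message. A quick Python sketch (the function name is just illustrative):

```python
from math import ceil, log2

def fixed_code_width(n_messages: int) -> int:
    """Bits needed to give each of n distinct messages its own pattern."""
    return ceil(log2(n_messages)) if n_messages > 1 else 1

print(fixed_code_width(2))      # 1  -- two messages: one bit, '0' or '1'
print(fixed_code_width(26))     # 5  -- a letters-only alphabet
print(fixed_code_width(2**16))  # 16 -- the 2^16 alphabet mentioned earlier
```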

Note that a '0' bit can be "nothing"; you could represent '1's as single photons, say, of any frequency.
This would be like communicating as it was first done, along a "telegraph", which encoded connections and disconnections ("dot" and "dash" connections) in a timed way. Then, of course, "nothing" becomes a '0', so it really means a "connection gap", or a disconnection of the channel itself.

This is equivalent to using a modern-day Ethernet link by physically connecting and disconnecting it periodically; obviously, the receiver would need to know about this communication protocol beforehand for it to be effective.
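
A toy version of that timed protocol in Python (illustrative only): sample the line once per agreed time slot; a pulse in the slot is a '1', an empty slot (a "connection gap") is a '0'.

```python
# Toy on/off keying: the "channel" is a list of per-slot samples, where
# True means a pulse arrived in that slot and False means a gap.
# Sender and receiver must already agree on the slot timing.

def encode(bits: str) -> list[bool]:
    return [b == "1" for b in bits]

def decode(slots: list[bool]) -> str:
    return "".join("1" if pulse else "0" for pulse in slots)

line = encode("10110")
print(line)          # [True, False, True, True, False]
print(decode(line))  # '10110' -- the gaps carry information, given the protocol
```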

Information entropy, or conditional entropy, is what the differences are between respective patterns. Thermodynamic entropy is what makes them different.
Information "has" mass and energy, or it acquires mass and energy from thermodynamic "information", which is heat.
Heat, of course, is photons, and the equivalent vibrational modes of molecular/atomic lattices (matter).

There is no way to transfer information, without energy.
A single photon, however, cannot convey information unless there is an agreed protocol (that a single photon represents, say, a '1'), in which case more photons should arrive, and there will be timed gaps ('0's).
Otherwise a single photon is the equivalent of a meaningless stream of random (chaotic) bits.
 
As to the argument of prokaryotes' superiority, or advantage, over the mammalian stable gene construct, through their potential for diversifying their genomes:
please consider the internal diversification mechanisms leading to variability in the immune cells of complex organisms, and also the development of knowledge-processing systems (brains), especially "Gregorian entities", which present some new strategies replacing the R approach utilized in simple prokaryotes.

If information as a structure gains its meaning and value through a processing system, then any assigned value of order or entropy that could be associated with the implied information should also be partly (more or less) embodied in the processing system, wouldn't you agree?
 
ak.R said:
If information as a structure gains its meaning and value through a processing system, then any assigned value of order or entropy that could be associated with the implied information should also be partly (more or less) embodied in the processing system, wouldn't you agree?
Yes, and it's complicated, because gene expression, which is the processing of the informational structure of DNA, is regulated via expression of that same structure; the resulting products of the fabrication process are varied through these regulatory mechanisms.
It's kind of recursive, or regressive, because of hysteresis. There are a lot of interconnected cycles that consume and produce things.

Associated structures, observed structures that indirectly regulate the structure itself, and also unregulated structures (the relative "amounts" of control) all come into it.
The code contains its own "functional expression", as regulating agents that are fabricated, too.

But DNA isn't an organism; persistence requires sufficient functional structure (a cell) to maintain itself.
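
As a toy illustration of that recursive regulation (not any real gene; every number here is invented purely for the sketch), consider a protein that represses its own production:

```python
# A made-up negative-autoregulation loop, stepped with simple Euler
# integration: the protein P represses its own production rate.

alpha, K, gamma, dt = 10.0, 1.0, 0.5, 0.01   # arbitrary toy parameters

P = 0.0
for step in range(2000):
    production = alpha / (1.0 + P / K)   # more protein -> less expression
    P += (production - gamma * P) * dt   # synthesis minus degradation

print(round(P, 2))  # settles near the steady state where production = decay
```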
 
I think you have to consider the epigenetic information factor with regard to DNA functioning, and this again is consistent with entropy being a shared value. Information as a substrate of a processor cannot be evaluated entropically independently.
 
ak.R said:
I think you have to consider the epigenetic information factor with regard to DNA functioning, and this again is consistent with entropy being a shared value.
Are you talking about free energy, or about information, which is not energy (it's what energy "does": the resulting patterns)? Entropy is an integral, or the sum of the energy content of something, proportional to relative energy.

Information is dimensionless; information content is the expectation of seeing a particular pattern or set of patterns, and can be measured in bits, or dimensionless numbers. It's "bit string change" per pattern.

Thermodynamic entropy has dimensions of energy per kelvin. It's "heat change" per (average) kinetic motion.
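
Side by side, in standard notation (with Landauer's bound added as the usual bridge between the two pictures):

```latex
% Information entropy: dimensionless, measured in bits per symbol
H = -\sum_i p_i \log_2 p_i \qquad [\text{bits}]

% Thermodynamic entropy: dimensioned, joules per kelvin
S = k_B \ln \Omega \qquad [\mathrm{J\,K^{-1}}]

% Landauer's bound: erasing one bit dissipates at least this much energy
E \geq k_B T \ln 2
```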

ak.R said:
Information as a substrate of a processor cannot be evaluated entropically independently.
I'm not sure if you mean "information and entropy appear to be equivalent" by this. Or are you implying that information requires a "channel", or connection, in order to be "delivered"?
 
The brain actively severs inactive axonal connections (synapses), and strengthens the positive and negative ones.
Neurons are like processors that communicate via serial networks.

A branching network (1-to-many) type of graph, or a serial "drop line". A single neuron sends to many (maybe tens of thousands of) downstream neurons, and receives from many other upstream neurons, some of which suppress activity and some of which stimulate it.

An axon is a simplex (one-way) kind of connection.
Each neuron communicates in real time with many input connections (many-to-1) and a single output to many other neurons (1-to-many).

Neurons send spikes, or pulses, along their axons, in groups, or trains. The protocol is encoded as pulse-train signals. Neurons co-operate in groups that form transiently, or cluster. Waves of activity ensue in the visual cortex when it processes the input from the optic nerves. A similar kind of thing happens in the auditory cortex. The brain appears to function in terms of groups of co-operating cells, and a kind of "singing" of different tunes together.

There is a lot of evidence that patterns are set up in the brain, in terms of neuronal metabolism, when visual or auditory information is processed.
Neurons are cells that have learned to communicate more "effectively" and immediately; the synapse is still a chemical kind of connection, though, or a transducer element, at the end of a dendrite, or axonal extension.
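
A minimal leaky integrate-and-fire sketch in Python (made-up weights and toy units) of that many-inputs-to-one-spiking-output arrangement:

```python
# Toy leaky integrate-and-fire neuron: many weighted inputs, some
# excitatory and some inhibitory, charge a membrane potential; crossing
# a threshold emits a spike on the single output (the axon), which
# would fan out 1-to-many downstream.

weights   = [0.6, 0.4, -0.5]   # two excitatory inputs, one inhibitory
threshold = 1.0
leak      = 0.9                # potential decays toward rest each step

inputs = [                     # per-step spikes arriving on each upstream axon
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
]

potential = 0.0
for t, spikes in enumerate(inputs):
    potential = leak * potential + sum(w * s for w, s in zip(weights, spikes))
    if potential >= threshold:
        print(f"t={t}: spike!")   # fires at t=0 and t=2 with these toy numbers
        potential = 0.0           # reset after firing
```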
 
Are you seriously claiming that the teleological "conundrum" has been settled?
This is typical of your misinformed, misaligned, misdirected posts.
1) You are unilaterally declaring that teleology is a conundrum. Why would you assume that?
2) You are implying that I feel this conundrum has been settled. Why would you do that?
3) That implication overlooks the fact that I have only pointed out that you appear to be taking a teleological view of the universe. I am not saying this is either right or wrong: I am trying, as I have tried in this and other threads, to get you to declare unequivocally whether or not you have a teleological view of the universe. Would you like to do so now?
 
Hip said:
...you appear to be taking a teleological view of the universe.
As far as you're concerned, you mean..?
I am not saying this is either right or wrong: I am trying, as I have tried in this and other threads, to get you to declare unequivocally whether or not you have a teleological view of the universe. Would you like to do so now?
I'm not going to declare anything about the so-called teleological argument, other than what I've said about it already: it's a philosophical argument with no answer, because it isn't a question. It isn't an argument either.
 
Remember your phrase: "Information entropy is a measure of the relative frequency, or expectation, of any message (token or symbol), and how much change or difference there is between any messages (so how many, or how few, bits can represent the alphabet)".
I take entropy here as a measure of order, regardless of the applied physical dimension. In that capacity, it could easily assign a value to a multitude of substances, especially coded ones.
My point is that information in relation to order cannot be weighted entropically in the absence of the entropy load (order) present in the processing system; therefore, to compare the mere entropy of coded substances is a futile arrangement.
 
I would say "yes" to the original poster, if information either guides or provides a system with an order. Evolution, like any biological system, is made of billions upon billions of atoms that all spin together, sharing energy and information, building the entire system up into ordered molecules.
 