
Showing posts from August, 2003

Human DNA

Is the probability of obtaining a DNA sequence which codes for a human (i.e., a living being capable of interbreeding with humans) by any combination of random processes and deterministic functions (like natural selection) less than 10^-150? Let's assume we already have the humans' alleged ancestor race, call it apes. Is it possible for random mutations to change an ape DNA into a human DNA? Let S = {A, C, G, T} be the possible values for a nucleotide, and S* the set of sequences s[1] ... s[n] where each s[i] is in S. For a, b in S*, we define dist*(a, b) as the number of point mutations needed to change a into b (or vice versa):

    dist*(a[1] ... a[n], b[1] ... b[m]) = sum(i = 1 to n, dist(a[i], b[i])) + m - n,  with m >= n >= 1
    dist(a, b) = 0 if a = b, 1 otherwise,  with a, b in S

In other words, if everything works out perfectly, it takes at least dist*(A, B) point mutations to convert one into the other, where A is a member of the ape DNA set and B is a member of the human DNA set. How cl
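
The dist* metric is simple enough to check in code. Here is a minimal Python sketch of it, under the assumption that plain strings over {A, C, G, T} stand in for the sequences; the example sequences are made up purely for illustration.

    def dist(a, b):
        # dist(a, b) = 0 if a = b, 1 otherwise (a single point mutation)
        return 0 if a == b else 1

    def dist_star(x, y):
        # Number of point mutations needed to turn x into y, per the definition
        # above: substitutions over the common length, plus the length difference.
        if len(x) > len(y):
            x, y = y, x  # ensure m >= n >= 1
        substitutions = sum(dist(a, b) for a, b in zip(x, y))
        return substitutions + (len(y) - len(x))

    # Example with made-up sequences: one substitution, no length difference.
    print(dist_star("ACGAAC", "ACGTAC"))  # -> 1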

Clarification

For me, the idea of a string having a probability is absurd - I can't parse it at all. Let me clarify this with an example: what is the probability of a chair? Both a chair and a string are objects. OK, the string is an informational entity, not a physical object. Then what is the probability of an equation? Neither of these questions makes any sense. Now, if we want the question to make sense, we must start to expand it. What is the probability that a 500-bit string will occur? Still not good enough - out of thin air? In my daily emails? So let's try again: what is the probability of a 500-bit string occurring in the following experiment: "write down 500 bits"? Well, 1 if you do it, 0 if you don't. In NO case is it going to be any other value. How do you fill in the blanks so that "what is the probability of a 500-bit string ..." gives you any other result? Please email me if you find a way.

Information

Let's define I(E) = -log2 P(E), where E is an event. This is the amount of information contained in (imparted by) that event, or in other words the amount of uncertainty removed by that event, and it is measured in bits. (Why uncertainty? Let {E*} be the set of possible relevant events, of which E is a member. Before E, any of the elements of {E*} could have occurred; the fact that we obtained E decreased that uncertainty. P(E) is, of course, 1 / the number of elements in {E*}, assuming the events are equally likely.) Using the value we determined earlier, the "cutoff value" of 10^-150, the information contained in an event E with that probability is -log2 10^-150, which is roughly 500 bits. Therefore, another way of specifying the "point of no return" is this: anything with an informational content larger than 500 bits could not have occurred without the intervention of an intelligence. What are possible sources for this information? Well, as far as I know (please let me know if you find another one), only 3 exist
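
The arithmetic behind the 500-bit figure is easy to verify; below is a small Python sketch of I(E) = -log2 P(E) applied to the 10^-150 cutoff (it comes out to about 498.3 bits, which is rounded to 500 here).

    import math

    def information_bits(p):
        # I(E) = -log2 P(E): information, in bits, imparted by an event of probability p
        return -math.log2(p)

    print(information_bits(1e-150))  # ~498.3 bits, the "cutoff" information content
    print(information_bits(0.5))     # 1.0 bit: a fair coin flip removes one bit of uncertainty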

Statistics

I came across this gem on the TrueOrigin list: "I asked her to pick a digit (0 through 9). Then do it again, and again, 53 times. You will have picked a number with 53 digits. The a priori probability that you would have picked that number is 1 chance in 10^53, but you did the impossible." Is it true that the chance of doing what is described above is indeed very small (10^-53)? Let's see. In my country, we have a lottery called "6 out of 49". You have a 7x7 grid with the numbers from 1 to 49, out of which you are supposed to pick the winning six. Let's say you bought a ticket and picked 6 numbers at random. What is the chance of winning the lottery? Well, the total number of combinations is C(49, 6), which is 49! / (6! x (49 - 6)!), where "n!" (read: n factorial) means "1 x 2 x 3 x ... x n". Calculating it gives us 13,983,816 possibilities (if I made a mistake, please let me know). So, the probability of picking the right combination is approximat
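
The lottery figures above are also easy to check; here is a short Python sketch of C(49, 6) and the corresponding chance that one randomly filled ticket wins.

    from math import comb  # comb(n, k) = n! / (k! x (n - k)!), available in Python 3.8+

    combinations = comb(49, 6)
    print(combinations)      # 13983816 possible tickets
    print(1 / combinations)  # ~7.15e-08: the chance a single random ticket wins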