Ω is my own personal favorite transcendental number. Ω isn't really a specific number, but rather a family of related numbers with bizarre properties. It's the one real transcendental number I know of that comes from the theory of computation, that is important, and that expresses meaningful, fundamental mathematical properties. It's also deeply non-computable, meaning that not only is it non-computable, but even computing meta-information about it is non-computable. And yet, it's *almost* computable. It's just all-around awfully cool.

So. What is Ω?

It's sometimes called the *halting probability*. The idea of it is that it encodes the *probability* that a given infinitely long random bit string contains a prefix that represents a halting program.

It's a strange notion, and I'm going to take a few paragraphs to try to elaborate on what that means, before I go into detail about how the number is generated, and what sorts of bizarre properties it has.

Remember that in the theory of computation, one of the most fundamental results is the non-computability of *the halting problem*. The halting problem is the question "Given a program P and input I, if I run P on I, will it ever stop?" You cannot write a program that reads P and I and gives you the answer to the halting problem. It's impossible. And what's more, the statement that the halting problem is not computable is actually *equivalent* to the fundamental statement of Gödel's incompleteness theorem.

Ω is something related to the halting problem, but stranger. The fundamental question of Ω is: if you hand me a string of 0s and 1s, and I'm allowed to look at it one bit at a time, what's the probability that *eventually* the part that I've seen will be a program that eventually stops?

When you look at this definition, your reaction *should* be "Huh? Who cares?"

The catch is that this number - this probability - is easy to define; but it's not computable; it's completely *uncompressible*; and it's *normal*.

Let's take a moment and look at those properties:

1. Non-computable: no program can compute Ω. In fact, beyond a certain value N (which is non-computable!), you cannot compute the value of *any bit* of Ω.

2. Uncompressible: there is no way to represent Ω in a non-infinite way; in fact, there is no way to represent *any substring* of Ω in fewer bits than there are in that substring.

3. Normal: the digits of Ω are completely random and unpatterned; any digit in Ω is equally likely to be a zero or a one; any selected *pair* of digits is equally likely to be any of the 4 possibilities 00, 01, 10, 11; and so on.

So, now we know a little bit about why Ω is so strange, but we still haven't really defined it precisely. What is Ω? How does it get these bizarre properties?

Well, as I said at the beginning, Ω isn't a single number; it's a family of numbers. The value of *an* Ω is based on two things: an effective (that is, Turing-equivalent) computing device; and a prefix-free encoding of programs for that computing device as strings of bits.

(The prefix-free bit is important, and it's also probably not familiar to most people, so let's take a moment to define it. A prefix-free encoding is an encoding where for any given string which is valid in the encoding, *no prefix* of that string is a valid string in the encoding. If you're familiar with data compression, Huffman codes are a common example of a prefix-free encoding.)
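To make the prefix-free condition concrete, here's a minimal sketch in Python (the helper `is_prefix_free` is my own name for it, not anything standard) that checks whether a set of binary code strings is prefix-free:

```python
def is_prefix_free(codes):
    """Return True if no code in the set is a proper prefix of another."""
    return not any(a != b and b.startswith(a) for a in codes for b in codes)

# A Huffman-style code is prefix-free:
print(is_prefix_free({"0", "10", "110", "111"}))   # True
# Plain binary numerals are not: "1" is a prefix of "10" and "11":
print(is_prefix_free({"1", "10", "11"}))           # False
```

The point of the condition is that when you read bits one at a time, the moment you've seen a valid program you *know* it's complete; no valid program can be the start of a longer one.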

So let's assume we have a computing device, which we'll call φ. We'll write the result of running φ on a program encoded as the binary string p as φ(p). And let's assume that we've set up φ so that it only accepts programs in a prefix-free encoding, which we'll call ε; the set of programs coded in ε, we'll call Ε; and we'll write φ(p)↓ to mean that φ(p) halts. Then we can define Ω as:

Ω_{φ,ε} = **Σ**_{p ∈ Ε, φ(p)↓} 2^{-len(p)}

So: for each program p in our prefix-free encoding, if it halts, we *add* 2^{-len(p)} to Ω.
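To see how the sum behaves, here's a toy sketch. A real Ω requires a universal machine, where all you can do is dovetail simulations and watch a lower bound creep upward; the `toy_halts` "machine" below is an invented, trivially decidable stand-in (programs are the unary strings "1"*n + "0", and I've simply decreed that one halts iff n is even), so its Ω can be summed directly:

```python
def toy_halts(n):
    # Invented toy machine: the program "1"*n + "0" (length n+1 bits)
    # "halts" exactly when n is even. Not a real universal machine!
    return n % 2 == 0

def omega_lower_bound(max_n):
    # Sum 2^-len(p) over the halting programs found so far.
    return sum(2.0 ** -(n + 1) for n in range(max_n) if toy_halts(n))

print(omega_lower_bound(40))  # converges toward 2/3 for this toy machine
```

For this toy machine the series is geometric and sums to exactly 2/3. For a real universal machine the partial sums also converge to Ω from below, but you can never know how close you've gotten.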

Ω is an incredibly odd number. As I said before, it's random, uncompressible, and has a fully normal distribution of digits.

But where it gets fascinatingly bizarre is when you start considering its computability properties.

Ω is *definable*. We can (and have) provided a specific, precise definition of it. We've even described a *procedure* by which you can conceptually generate it. But despite that, it's *deeply* uncomputable. There are procedures where we can compute a finite prefix of it. But there's a limit: there's a point beyond which we cannot compute *any* digits of it. *And* there is no way to compute that point. So, for example, there's a very interesting paper where the authors [computed the value of Ω for a Minsky machine][omega]; they were able to compute 84 bits of it; but the last 20 are *unreliable*, because it's uncertain whether or not they're actually beyond the limit, and so they may be wrong. But we can't tell!

What does Ω mean? It's actually something quite meaningful. It's a number that encodes some of the very deepest information about *what* it's possible to compute. It gives a way to measure the probability of computability. In a very real sense, it represents the overall nature and structure of computability in terms of a discrete probability. It's an amazingly dense container of information - as an infinitely long and thoroughly non-compressible number, it contains an unmeasurable quantity of information. And we can't get near most of it!

[omega]: http://www.expmath.org/expmath/volumes/11/11.3/Calude361_370.pdf


Mark,

Is there a way to determine if a given program halts by looking at Ω?

In other words, assuming I have Ω for my machine and my encoding, and I have a program, can I use Ω to find if my program will halt?

Miguel:

No, you can't figure out if a specific program halts by looking at Ω. *If* you could create an encoding in which there was only one valid program of any given length, then the individual bits of Ω would each tell you whether a specific program would halt or not; but in the prefix-free encoding that's used for defining Ω there can be more than one program of a given length. (So, for example, if the program that you're interested in has length p, the 2^{-p} bit of Ω might be zero because there were *two* halting programs of size p; the summing would carry, making the 2^{-(p-1)} bit 1 and the 2^{-p} bit zero.)
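A tiny numeric illustration of that carry (the scenario is invented; `binary_digits` is my own helper, using Python's `fractions` for exactness). Suppose the only halting programs are two programs of length 3:

```python
from fractions import Fraction

def binary_digits(x, n):
    """First n binary digits after the point of a rational x in [0, 1)."""
    out = []
    for _ in range(n):
        x *= 2
        d = int(x)
        out.append(d)
        x -= d
    return out

# Two halting programs of length 3 each contribute 2^-3:
omega_toy = 2 * Fraction(1, 2 ** 3)          # = 1/4
print(binary_digits(omega_toy, 3))           # [0, 1, 0]
# Bit 3 is zero even though length-3 programs halt: the two 2^-3
# contributions carried up into bit 2.
```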

Thanks Mark. I thought it'd be cool if Ω could be used as some sort of "oracle" for deciding halting problems (or proving theorems).

One more interesting property: unsplittable.

Solovay introduced a notion of relative randomness that is similar to Kolmogorov complexity, but is nicer because addition of the numbers corresponds to the join. For every other computable real r, there is a decomposition a + b = r such that

both a and b are properly less random. Ω has no such decomposition. See http://www.math.uchicago.edu/~drh/

I've got to ask. Someone mentioned on an earlier post that one can't prove randomness. As I understand it, you can't take a single random number and show whether it is random or pseudorandom - whether specifically or in general, I'm not sure. Is that true, and is it connected to Omega? (Perhaps since we wouldn't recognise an Omega if we saw it, if I understand the above correctly.)

Random reals are one of those names that can be confusing for the layman because they are conflating two different terms. Technically, individual numbers are not random, but sequences are. For example, we may say that a die is random because rolling it produces a random sequence of numbers, but no single value is technically "random".

However, every individual real number can be considered a sequence. Just turn it into an infinite decimal expansion and consider each digit a number:

1/9 = 0.111111111.... is 1, 1, 1, 1, 1, ...

π = 3.14159... is 3, 1, 4, 1, 5, 9, ...

If this sequence is random this is what we mean when we say "random reals".
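For a rational number, that digit-sequence view is easy to compute; a small sketch (`decimal_digits` is my own helper, assuming x is in [0, 1)):

```python
from fractions import Fraction

def decimal_digits(x, n):
    """First n decimal digits after the point of a rational x in [0, 1)."""
    out = []
    for _ in range(n):
        x *= 10
        d = int(x)
        out.append(d)
        x -= d
    return out

print(decimal_digits(Fraction(1, 9), 6))   # [1, 1, 1, 1, 1, 1]
```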

I think that there's one too many 0's at the end of "any selected pair of digits is equally likely to be any of the 4 possibilities 00, 01, 10, 100".

I think my brain just exploded.

I think there's two too many 0's and one two few 1's. :P

I also think this is a wonderfully fascinating post, and wish I was smart enough to have written it, and not just notice typos in it. :P

p.s. if I had intended the substitution of "two" for "too" in the above post, it would have almost been funny, in an extremely lame sort of way, but actually, it was just a typo.

"Technically, individual numbers are not random, but sequences are."

Sorry, I didn't think that through properly. Yes, I sloppily assumed that a randomly generated and selected real ("single random") has a randomised sequence of digits. Why shouldn't it have - does each pseudorandom algorithm produce uncountably many sequences?

"does each pseudorandom algorithm produce uncountably many sequences"

Or rather, do some algorithms, et cetera.

You certainly can use omega to determine if a program halts or not. Just run the set of programs less than a certain number of bits, and wait until the number of programs that halt matches the beginning of omega; then you know that the remaining programs do not halt.

You mean 00, 01, 10, 11 (not 100) ... and "bizarre" :)

maybe it's the omega-part of "i'm the alpha & the omega" the finite and the infinite :D

we should create a new religion based on this revelation.

It seems to me as if knowing Ω does in fact work as an oracle to solve the halting problem.

Quick sketch: given Ω, you know exactly how many machines of size n or below will halt: call this w. To find if a given program of size n halts, run all of the programs of size n or less in parallel until exactly w of them have halted -- then the remaining programs are the ones that will not halt.

Or am I missing something?

Oops, Benjamin Higgins just said that a few lines above. Never mind!

So if Omega tells us what it's possible to compute, and we can't compute Omega, does that mean that it's impossible for us to know what it's possible for us to know?

I wonder how Daniel Tammet would describe this number?

Tom:

Nope, doesn't quite work.

The problem is that programs *larger* than the size threshold you pick can have a "bubble-up" effect - remember that Omega is a sum. So, for example, if you have a program of size 23, and there are 17 1s in the first 23 digits of Omega, that doesn't mean that there are 17 halting programs of length 23 or less. There could be 1000 programs of length 28 that halt - and so their effect was pushed upwards. So there could be *fewer* than 17 halting programs of size 23 or less - and you'll spend forever waiting for 17 programs to halt.

Steve:

Omega doesn't tell us what it's possible to compute; it expresses an overall probability of finding a halting program via a particular kind of random search. In terms of "what we know", the nature of Omega does tell us more about what we can't know than the halting problem, but it's the same kind of knowledge that we get from the un-solvability of the halting problem.

Well, the developers of MacOS 8.x must have inadvertently embedded omega into the code somehow, because I remember my Quadra used to halt every couple of hours.

"I think my brain just exploded.

Posted by: Joshua | August 7, 2006 02:58 PM "

That's the only post that I've understood so far.

Mark:

Benjamin's and Tom's arguments are correct. Actually, Chaitin uses a similar argument. You can see this at http://cs.umaine.edu/~chaitin/sciamer3.html - look for "Why Is Omega Incompressible?".

Chaitin says:

"My strategy for demonstrating that Ω is incompressible is to show that having the first N bits of Ω would tell me how to solve the Turing halting problem for programs up to length N bits".

That is an amazing fact: Omega contains the solution of every halting problem (restricted to a specific UTM).

So, for instance, if you want to know whether the Riemann hypothesis is true, you can write a program which halts if it finds a counter-example. If you have Omega, you can know the answer in finite time.
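Chaitin's argument can be sketched concretely, again with an invented, decidable toy machine standing in for the dovetailed simulation a real universal machine would need (for the real thing, `toy_halts` would be replaced by running all programs in parallel, and Ω's bits would have to be handed to you by magic):

```python
from fractions import Fraction

def toy_halts(n):
    # Invented toy machine: program "1"*n + "0" (length n+1) halts iff n is even.
    return n % 2 == 0

# For this machine, Omega = sum over even n of 2^-(n+1) = 2/3 exactly.
OMEGA = Fraction(2, 3)

def halts_via_omega(n, n_bits=40):
    """Decide halting of the length-(n+1) toy program from the first
    n_bits of Omega (only valid when n + 1 <= n_bits)."""
    # Truncate Omega to n_bits binary places:
    omega_trunc = Fraction(int(OMEGA * 2 ** n_bits), 2 ** n_bits)
    total, halted, m = Fraction(0), set(), 0
    # Enumerate halting programs until the accumulated measure reaches the
    # truncation; any program of <= n_bits bits still missing can never halt,
    # since its contribution (at least 2^-n_bits) would push the sum past Omega.
    while total < omega_trunc:
        if toy_halts(m):
            halted.add(m)
            total += Fraction(1, 2 ** (m + 1))
        m += 1
    return n in halted

print(halts_via_omega(4))   # True  (even n: the toy program halts)
print(halts_via_omega(5))   # False
```

The loop terminates precisely because the truncation is a lower bound on Ω that's within 2^-n_bits of it; that's the sense in which the first N bits of Ω "solve" halting for programs up to N bits.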

...extremely freaky songi bout tranzak

...wonderfully informative about something close to me that I knew nothing about...

...[Ω] embodies an enormous amount of wisdom in a very small space . . . inasmuch as its first few thousand digits, which could be written on a small piece of paper, contain the answers to more mathematical questions than could be written down in the entire universe....Bennett (Bennett & Gardner '79)...

...conceptually it is very powerful...

...thanx...

This research will lead to the end of the world as we know it, with Charlton Heston shooting at nocturnal, homicidal maniacs. I've seen this future, in a premonition.

No, wait...it was a movie.

.......and so will be Premonition, starring Sandra Bullock, in 2007.

This is completely incomprehensible gibberish. It makes no sense at all. Utter rubbish.

"...it encodes the probability that a given infinitely long random bit string contains a prefix that represents a halting program."

BUT...

"Uncompressible: there is no way to represent Ω in a non-infinite way; in fact, there is no way to represent any substring of Ω in fewer bits than there are in that substring."

Doesn't the first statement, as a definition or "representation" of omega, refute the second?

Isn't "the smallest integer which is a sum of three cubes" a perfectly good "representation" of 36?

Ralph:

You've nailed one of the strange things about Ω which I deliberately didn't mention; I wanted to see how long it took before someone commented on it. :-)

Ω is one of those numbers that trips Gödel self-reference. You *can't* define it in a way that doesn't inherently involve self-referentiality within the formal system where you're expressing the definition.

On the other hand, the second statement is true, assuming the right definition of "representation", which is something far too complex to get into in a comment! But basically, it's talking about a "representation" that allows you to *use* the information that is naturally encoded in Ω.

"Yes, I sloppily assumed that a randomly generated and selected real ("single random") has a randomised sequence of digits. Why shouldn't it have - does each pseudorandom algorithm produce uncountably many sequences?"

Dear me, what a fool I can be.

IIRC now, this is one of the elementary results in my probability course book (which as usual I can't find) - the events of the random real numbers are completely orthogonal to the events of the digit sequence.

Which I allow for above but doesn't take care of - any sequence, even ordered, will happen.

So to answer my own question, a single isolated event can never be proved to be random by looking at the digit sequence, one must collect more events and model them (measure the statistics) to see that they may be.

Same for the events of the "random real" digits in the sequence, as Walker discusses.

Steve:

"does that mean that it's impossible for us to know what it's possible for us to know?"

I thought Gödel incompleteness pretty much assured us that we will never know the end of it or what will happen.

If so, it is a great argument to work in science, much as I guess the halting problem is an argument to work with CS/programming, and the NFL theorems seems to be an argument to work on signal processing/detection problems.

Could you reformulate omega as the sum of BB(n)/2^n for all positive integers n (BB=Busy Beaver)? I don't see why it only works with prefix-free encodings.
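One reason the prefix-free condition matters is convergence: by Kraft's inequality, any prefix-free code satisfies Σ 2^-len(p) ≤ 1, which is what lets Ω be read as a probability. Over *all* bit strings the sum diverges, since every length contributes exactly 1. (And for the suggested Σ BB(n)/2^n: since BB grows faster than every computable function - in particular faster than 2^n - that series doesn't converge at all.) A quick sanity check, with an example code set of my own choosing:

```python
# Over all bitstrings, each length n has 2^n strings contributing 2^-n each,
# so every length adds exactly 1 and the sum grows without bound:
unrestricted = sum(2 ** n * 2.0 ** -n for n in range(1, 21))
print(unrestricted)   # 20.0

# Over a prefix-free set, Kraft's inequality bounds the total by 1:
codes = {"0", "10", "110", "111"}
kraft = sum(2.0 ** -len(c) for c in codes)
print(kraft)          # 1.0
```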


Sorry the beginning got chopped by the web code. Here it is;

Such a narrow view of what all of this means. Consider this. The theory basically states that Ω is a singularity that has a size greater than one, right (ok, I'll explain)? So, you want to represent it, you either need a singular symbol such as Ω or an infinite number of sets of symbols that represent the 'structure' that when viewed as a whole is in itself a singularity. A singular symbol doesn't tell us a lot about what is inside, so we want to find a way to look at its pieces. The favorite method is to try to represent it decimally. In that context, only the first nine digits truly do not repeat themselves when we take each as a singular entity. So, to represent all of the 10 unique 'holders' I actually have to have 9+8+7+6+5+4+3+2+1+0(1) (yes you have to count zero as one because it represents a state in the decimal system) buckets to represent them. My one point there is to get you thinking about what each of those digits represents. Remember, a decimal 6 is merely a mathematical representation of 3+3 or 1+1+1+1+1+1. So to demonstrate a 6 I can use 6 (1)'s. That fact in itself breaks the rule of not being able to represent Ω using a summed subset such as A + B = Ω. Or does it (sorry about my grammar)? Perhaps it simply means that Ω cannot be represented using the decimal system since the instance of a six would be in violation of the rule of being able to represent Ω as a summed set of subsets (a 6 is already a summed set of subsets as I demonstrated above). The same argument can be used against the ability to represent it in binary. It all boils down to how you are trying to measure it. By what standard can you measure a singularity? Trying to represent it in any other format would be a form of measuring a singularity and that violates the rule of computability. Therefore it cannot be represented via any other form of symbology. A singularity is unique by its nature.
Therefore trying to measure it by any standard is futile because there is no standard or any combination of standards by which to measure something truly unique. That also makes the argument of compressibility as true since a singularity cannot be compressed as compression requires measurement and yada yada, back to the start. See where it goes?

I also said this and didn't finish the sentence;

Perhaps it simply means that Ω cannot be represented using the decimal system since the instance of a six would be in violation of the rule of being able to represent Ω as a summed set of subsets (a 6 is already a summed set of subsets as I demonstrated above).

It should continue as;

or mean that if a 6 is used in the representation, then an occurrence of a 3 could not be used in its representation, but that would digress into a whole different argument.

So that's what 42 stood for!

Well that's blown the last few brain cells that I had...

"that Ω is a singularity"

How is Omega a singularity?

What is a singularity to you?

"In mathematics, a singularity is in general a point at which a given mathematical object is not defined, or a point of an exceptional set where it fails to be well-behaved in some particular way, such as differentiability." ( http://www.answers.com/topic/mathematical-singularity )

And as Mark said:

"this number - this probability - is a number which is easy to define"

yeah, omega is cool! Its complexity is infinite, and its bits, being mathematical facts, are true, but with no reason.

It sounds to me like you can use Ω as a halting oracle. All you have to do is translate any given program into an equivalent program with a unique length. For any binary string as a program, one-hot encoding that string will provide a unique length for every possible program, at which point you can just check the appropriate bit of Ω.

>Nope - that's just apophenia, like Gary Osborn and his 23.5 degree angles.<

Mark,

You are just plain wrong to keep spouting this nonsense about me and my work which is about bringing something to light that others have not noticed for hundreds of years.

It's now beyond coincidence and apophenia.

Take a look at the article on this page.

http://garyosborn.moonfruit.com/revelations2

Click on the image entitled 'The Baptist Revelation' underneath and read through the presentation pages.

There are over twenty paintings - from between the 15th and 18th centuries with the index finger of John the Baptist pointing at 23.5 degrees.

Gary Osborn

Gary:

So are you going to answer the question of just how you decide what line to use to measure the angle of those fingers? Or are you going to just go off again ranting about how it's obvious?

Looking at the pictures on your page, it looks to me like you're taking anything that's anywhere close to your angle, and insisting that it's right. In the first three images, *none* of them look to me like the angle is what you claim it is. How do you decide what line to use for measuring the angle of a finger?

I responded on the wrong thread, but that's really not important when you continue to casually drop in derogatory remarks about myself and my work - saying it's all "apophenia".

You write:

>In the first three images, *none* of them LOOK TO ME like the angle is what you claim it is<.

(My emphasis)

Get real! I take in what you are saying Mark, but I don't accept it.

"Look to me"

Exactly!

What and only the "first three images"?

Mark. Your attempt at rebuttal is pathetically desperate.

You are being overly critical and ungenerous, but that's what we've come to expect from an 'anal-retentive'.

You know you are wrong and your attitude to these latest examples really show that you do not have the courage to admit in public that you were wrong all along about what I have discovered in regard to numerous references to this angle in many different sources throughout history - and which everyone appears to have overlooked until now.

Let's be clear here. Despite your feelings about it (whatever is eating you up about it) and your unreasonable denial of what is clear to everyone else, it is a reality that the index fingers in 20+ paintings of John the Baptist and St Jerome are pointing at the angle of 23.5 degrees - and in all eight orientations of this angle - end of story.

20+ is beyond coincidence. That's almost all the paintings that feature John the Baptist from the period I mentioned.

Your question as to "what line of the finger" I am measuring is also a 'desperate measure' (pardon the pun) when these index fingers - and sometimes the first digit of this finger - are clearly pointing at this angle. That's why I include a line at this angle to show that the finger is pointing at this angle!!! Get it? What could be clearer than that?

Even if some of the fingers are not precisely at this angle, (and please don't bore us further by misinterpreting my words here - i.e., that I am uncertain about this) nevertheless, they are in the "right ball park" for this angle, and that's something that cannot be denied and explained away as a coincidence. Surely there is still a pattern here worthy of investigation.

In any case, there is a logical reason behind this. You are not looking at the "bigger picture" here Mark, because some dedicated research into 'why' these fingers are pointing at the angle of 23.5 degrees, will support what I have found here. But you will not dedicate yourself to this research, because you are not really interested; wouldn't have a clue where to begin; do not possess even the basic knowledge in regard to the philosophical 'raison d'etre' behind these references and are only interested in trying to debunk my work here.

Again, you have no knowledge as to what it was I discovered, which prompted me to look for these references in the first place which I then discovered, and not because it was merely a case of finding what I am looking for in that "we will often see what we believe", but because it was a logical step in my research. I found these references simply because 'they are there' and this is all quite predictable when you know the beliefs of the people who featured these references in their works.

If you are interested, I have discovered that these particular paintings were by a fraternity of artists; members of the 'Angelic Society' - the "Brouillards" - later associated with the Polaires, the Martinists and the Synarchists. And one of the beliefs of these people is related to the present obliquity of the earth's axis; that it is an unnatural condition and reflects the imbalance of human consciousness and vice versa. These people also believed that the earth was upright at one time, and will be again.

In no way do the beliefs of these people reflect my own beliefs, but I do find it all very interesting, and so should other people - especially those who are interested in secret societies, as well as the beliefs of these people and perhaps the information and knowledge that has been preserved from a former time in man's long history.

When it comes to in-depth knowledge about esoteric principles and beliefs, and the reason why these references exist, then you are out of your depth Mark. Now get over it, be a gentleman and allow me to get on with my research into this 'real' phenomenon, as I allow you to continue "educating" people as to the difference between good and bad math.

Please, no more of your "apophenia" nonsense. It's really not warranted, and anyone with some good old common sense can see that.

Now tell me that I simply chose not to answer your question, which indicates that I cannot answer it because you must be right - which is your usual response tactic.

G.O.

Gary:

The reason I keep pointing out that you refuse to answer questions is because you refuse to answer perfectly reasonable, legitimate questions. If the fundamental result of your research is the discovery of certain *extremely* precise angles in paintings through the ages, then asking how you determine the precise angles of features in the paintings is a perfectly legitimate question.

The fact that you refuse to answer it is damning. You'll write paragraphs and pages to attack me for daring to question you. But you won't answer simple questions about your research methodology.

I'm a professional researcher. I'm *constantly* grilled about my research methodology. Often in very unpleasant, hostile, or confrontational ways. I've never refused to answer a question about my research methods. Even obnoxious ones, even mind-bogglingly ignorant or stupid ones. It comes with the territory.

Anyone who does real research for a living will say the same. We *expect* questions; we *welcome* questions; and we *answer* questions. If you don't want to answer the questions, you can't act surprised when people don't take you seriously.

Oh, and Gary? One more thing:

If you're incapable of actually figuring out which post on a blog to respond to, you might want to take some time out and learn to read. It's not hard to tell the difference between a post on the golden ratio and the ways in which it's been misused, and a post on the universal halting probability.

You write:

----------------------------------------------------

>>If the fundamental result of your research is the discovery of certain *extremely* precise angles in paintings through the ages, then asking how you determine the precise angles of features in the paintings is a perfectly legitimate question. The fact that you refuse to answer it is damning.<<

----------------------------------------------------

Really? I agree with your comments about being "grilled" etc., and that it's to be expected, and I'm OK with that. However, if we go back to your initial page on my work, which was a 'blatant attack', you were the one who was being obnoxious, even mind-bogglingly ignorant and stupid.

Take another read of your first "critique". Just look at how pathetically juvenile, groundless and unfair it was; it just expressed your own uninformed views and narrow beliefs. That was where you were coming from and you really "went to town" on it because you didn't expect a response from me.

In other words, it appeared that you were against anyone discovering anything new; that new discoveries should not be tolerated, and anyone discovering something new should be ridiculed because they are "deluding themselves" and you know better - even though 'discovery' is a natural occurrence in our world; it's what spurs us on, and it's also natural that people are going to follow up on their discoveries with research, because people are naturally curious. That was all I was doing, and I did find something - in fact, like many other people, I have had many new insights and as a result have discovered many things other than these recently found references.

Now, because I was discovering things and interpreting them in ways that went against your own beliefs or world view, you "knee jerked" and you felt it right to attack me on your blog.

However, what you didn't take into account was, that my own interpretations of the data are based on what other people believe(d) - i.e., why people felt it necessary to encode this information in these various sources.

It's now been accepted that there is a secret fraternity of creative people; a tradition that has been carried on through the centuries and that these people - i.e., artists, writer-poets, dramatists etc., have been encoding information and were/are interested in common themes, creating puns, rebuses, riddles, sometimes for the purposes of satire, social criticism, but also concealing ancient wisdom and knowledge.

Also, I have not 'refused' anything. I have answered this question before in one of our earlier correspondences.

I first measure the angles of these conspicuous features in the paintings using computer art programs. If the results are interesting, I then move onto acquiring large prints of the paintings (which can be expensive) and using a simple protractor: I've used various ways and methods and I'm always arriving at the same conclusion.

The pages I'm presenting on my website are only the tip of the 'proverbial iceberg'. I am working with other people on this now who, while being critical of new claims, are not vehemently opposed to them. These people agree with me that using the right instruments to determine the angle is all that is necessary really, and that it should be simple and plain for anyone to see . . . and anyone who is unreasonable in looking beyond that to debunk it is just nit-picking.

I'm also presently working with an academic with a Ph.D. in 'probability theory' to work out the odds of the 23.5 coming up with the frequency I have discovered in these various sources, and the results are interesting. I'll have it submitted for peer review soon.

G.O.

PS as for your stoop-low sarcastic comments in your last post: thanks for pointing that out to me; however, I'm not 'incapable', just getting straight to the point. I would rather cut my toenails than read through your boring blogs, which I would never have done if your first virulent and UNFAIR attack on my work hadn't been brought to my attention. I'm only now interested in sifting out your derogatory comments about me and my research, defending myself, and putting you right, mate, as well I should.

You'd rather cut your toenails than read my blog; but any time I so much as mention your name in a post, you're right there in the comments writing page-long diatribes. Quite interesting.

And I note that *still* you avoid answering my question. From the time I first talked about your gibberish, I've been asking: how do you measure the angles? Answering "with a protractor" isn't a real answer, and you know it. The issue is that you're looking at features of a painting that *aren't* simple lines. They're fingers, or arms, or legs, or angles between features. How do you determine *what angle* you're measuring? Are you determining midlines? If so, how? Are you measuring to find a valid midline? Or are you just eyeballing it and picking a line that happens to measure 23.5 degrees, because that's what you expect to see?

>>And I note that *still* you avoid answering my question.<<

I have answered them in different ways. Check all my posts.

These sources - not just these Baptist paintings, as this is just one 'variation on a theme' - also include distinct lines and linear objects at the angle of 23.5 degrees EXACTLY, which makes it most likely that the fingers and limbs in other paintings (some by the same artists) are, as I say, also POINTING at this angle.

Also, if a protractor was used to determine the angle in the first place, then obviously a protractor can be used to read the angle.

As for the fingers and limbs: my favourite method now is to use an extended rectangle at the angle of 23.5 degrees - adjusting the width to the sides of the finger or limb at the fattest or fleshiest area - and if the whole of the finger sits squarely and centrally inside the length of this rectangle with equal space either side - e.g., especially where the finger thins and tapers off at the tip - then I can only conclude that the finger or limb is pointing at this angle. So yes, you could say I'm measuring the midline - which is the most logical thing to do.
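For what it's worth, a midline comparison like the one described here can be made reproducible. The following is only a hypothetical sketch - the coordinates are invented, and averaging the two edge angles is a crude stand-in for fitting a true midline:

```python
import math

# Hypothetical sketch only: the edge coordinates below are invented, and
# averaging the two edge angles is a crude stand-in for finding a midline.
def edge_angle(p, q):
    """Angle of the line from p to q, in degrees from the vertical axis."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    return math.degrees(math.atan2(dx, dy))

def midline_angle(edge_a, edge_b):
    """Estimate the midline angle of an elongated shape (e.g. a painted
    finger) from its two long edges, each given as a pair of (x, y) points."""
    return (edge_angle(*edge_a) + edge_angle(*edge_b)) / 2.0

# Two parallel edges, both tilted 23.5 degrees from vertical:
t = math.tan(math.radians(23.5))
a = ((0.0, 0.0), (t, 1.0))
b = ((0.2, 0.0), (0.2 + t, 1.0))
print(round(midline_angle(a, b), 1))  # 23.5
```

The point of writing it down this way is that the edge points, and hence the measured angle, can be checked by anyone, rather than depending on how a template rectangle was eyeballed into place.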

With this template, one could say that I'm making the findings fit, but really I do this to make sure that the match is accurate, because as I make my way from one painting to the next, the results are so regular that I already know that the fingers in these paintings are pointing at this angle. In fact, each time I see them, I get a little annoyed now, even saying to myself "not another one?", because I know I have to go through the same time-consuming task of referencing and filing it.

Thing is, if this pattern were not intentional, then using the template rectangle at 23.5 degrees, the majority of these fingers would be found to be pointing at other angles in the 360-degree circle, with perhaps one or two at 23.5 degrees. However, we find that the index fingers in 20+ paintings - which is virtually every painting and drawing I have found from this period (15th - 18th cent.) - are pointing at this same angle.

Cross-reference these particular findings with the conspicuous lines, edges and linear objects in many other paintings and sources, all with related themes, and which are at the same angle exactly and we have a real phenomenon here, and one that others are free to examine for themselves.

Does that answer your question?

You write:

>>You'd rather cut your toenails than read my blog; but any time I so much as mention your name in a post, you're right there in the comments writing page-long diatribes. Quite interesting.<<

---------------------------------------------------

Well, I could imagine people thinking me a little sad giving this some attention, and I must admit there are more important things to do and think about - but unlike you I'm not online every day or on the computer 24/7. You are not the only person I correspond with, Mark, and while I'm here at the computer answering my posts, this provides a little amusement; and anyway, others might be interested in looking into this themselves - if they haven't already been put off by believing your de-bunkum.

Rather cut my toenails? Well I'm only kidding.

G.O.

I'm not sure if I understand the practical purpose of Ω. Given a sequence of code - a function, for example, that calls itself and has a conditioned return statement - would Ω be the statistical probability that that function eventually returns, and thus finishes the nested execution?

No, it's more like this:

1) You have a Turing-equivalent computing device that takes a bit string as input and does a computation. The encoding of the programs is such that no valid program has a prefix which is also a valid program.

2) Choose a program by flipping a fair coin over and over. The probability that this method of choosing a program will eventually hit a halting program is Ω.

Of course, the precise value of Ω isn't fixed until you fix a computing device and program encoding.
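To make the coin-flipping definition concrete, here's a minimal sketch in Python using a made-up prefix-free "machine" whose halting behaviour is simply stipulated. A real universal machine can't be tabulated like this - which is exactly why the real Ω is non-computable - but for a toy machine you can compare the exact sum against a coin-flip simulation:

```python
import random

# Toy prefix-free "machine": a hand-picked set of valid programs (no program
# is a prefix of another, and together they cover every infinite bit string)
# with stipulated halting behaviour. Purely illustrative.
HALTING = {"0", "100", "1010"}          # programs that halt
NON_HALTING = {"11", "1011"}            # valid programs that run forever

# Exact halting probability for this toy machine:
#   Omega = sum over halting p of 2^(-|p|) = 1/2 + 1/8 + 1/16
omega_exact = sum(2.0 ** -len(p) for p in HALTING)

def sample(rng, max_flips=64):
    """Flip a fair coin until the bits seen so far form a valid program.
    Return True if that program halts, False otherwise."""
    bits = ""
    for _ in range(max_flips):
        bits += rng.choice("01")
        if bits in HALTING:
            return True
        if bits in NON_HALTING:
            return False
    return False

rng = random.Random(42)
trials = 200_000
omega_estimate = sum(sample(rng) for _ in range(trials)) / trials

print(omega_exact)     # 0.6875
print(omega_estimate)  # close to 0.6875
```

The Monte Carlo estimate converges to the exact value for this toy machine; for a real universal machine there is no `HALTING` table to consult, and the best you can do is approximate Ω from below as programs are observed to halt.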

Some notes on Omega:

If you have the first n bits of Omega for a given universal machine, you can solve the halting problem for all programs up to length n. Run all programs in parallel (dovetailing) and add up the contribution 2^-|p| of each program p as it halts; as soon as the first n bits of that running sum match the n bits you know, you're done, because any further halting program of length less than or equal to n would change one of the bits that you already know is correct.
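A toy illustration of that procedure, again with a stipulated machine (the `HALTING`/`NON_HALTING` tables are invented for the example) so that Omega's bits are actually available; for a real machine the "dovetailing" step would be genuine parallel simulation rather than a lookup:

```python
from fractions import Fraction

# Toy machine whose behaviour we happen to know (a real machine can't be
# inspected like this -- that's the whole point of the construction).
HALTING = {"0", "100", "1010"}
NON_HALTING = {"11", "1011"}
OMEGA = sum(Fraction(1, 2 ** len(p)) for p in HALTING)  # 11/16 = 0.1011...

def first_bits(x, n):
    """First n binary digits of a fraction in [0, 1)."""
    bits = []
    for _ in range(n):
        x *= 2
        bits.append(int(x >= 1))
        x -= int(x)
    return bits

def halts_by_omega(program, n_bits):
    """Decide halting for a program of length <= n_bits, given the first
    n_bits of Omega: accumulate 2^-|p| for each halter until the lower
    bound's first n_bits agree with Omega's."""
    target = first_bits(OMEGA, n_bits)
    lower = Fraction(0)
    halted = set()
    # "Dovetailing" is simulated by revealing the known halters one by one.
    for p in sorted(HALTING, key=len):
        halted.add(p)
        lower += Fraction(1, 2 ** len(p))
        if first_bits(lower, n_bits) == target:
            # No further program of length <= n_bits can halt without
            # disturbing a bit we already know is correct.
            break
    return program in halted

print(halts_by_omega("100", 4))   # True  -- "100" halts
print(halts_by_omega("1011", 4))  # False -- "1011" never halts
```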

Omega is really a zeta function evaluated at 1, which is why you need prefix-free programs. If you define

Omega(s) = Σ_{p halts} 2^(-s|p|)

then it converges for any Turing machine and s>1. This zeta function is related to the geometric zeta function of the HALT fractal, the set of real numbers in the unit interval whose binary expansions are concatenations of halting programs.

A more natural definition of the zeta function is

zeta(s) = Σ_{n halts} n^(-s)

where n is the index in a lex-order enumeration of programs. With this definition, there's a class of programs, including prefix-free ones, that give random sequences when evaluated at 1; and for a total Turing machine - one that halts on all inputs - you get Riemann's zeta function.

Chaitin showed that given any finite axiomatic system, you can only prove finitely many bits of the sequence; by adding one more bit of axiom to your FAS, you can prove at most one more bit of the sequence.

There's a natural extension of this definition to partial randomness: by adding one more bit of axiom, you can derive s more bits of the sequence. Such a sequence is called 1/s-random. A neat theorem by Tadaki is that Omega(s) is precisely 1/s-random.

I showed in my master's thesis that random sequences satisfy an uncertainty principle. Given a random sequence x of bits, you essentially take two to the power of both sides in the inequality

H(x_1 x_2 x_3 ... x_n) >= n - c

Sort programs in quasi-lex order; then the index of the program times the uncertainty in your estimation of Omega given the first n bits is greater than a constant.
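My reading of that exponentiation step, as a rough sketch rather than the thesis's actual argument: a program's index in quasi-lex order is roughly 2^|p|, and knowing the first n bits of Omega pins its value down to within 2^-n, so

2^H(x_1 ... x_n) >= 2^(n - c)

rearranges (multiplying both sides by 2^-n) into

(program index) × (uncertainty in Omega) >= 2^-c,

i.e. the product is bounded below by a constant, which is the uncertainty-principle form stated above.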

By the way, the guy you linked to with the 84 bits of Omega was my thesis advisor, Cris Calude.

"00, 01, 10, 100"

this should be "00, 01, 10, 11", right? Or am I missing something?

This is an exceptionally interesting post on Ω! I have learned much. My son has Asperger's and loves math (except at school; he's bored). I think there are a couple of thoughts I can throw his way that will really intrigue him. Thank you.

Note: Gary and Mark...Grow up. Give the other person a call and "quit bellyachin!" You guys may be smart in a lot of ways, but boy are you ignorant when it comes to taking care of conflict in a mature fashion.

Step 1: call the other person...This will reveal who the kid is and who the man is. Quit posting this trash in a great blog.

Step 2: realize that you might only agree to disagree

Step 3: realize that the other person might actually be wholly or partially right about the issue(s)...and you might actually be wholly or partially wrong about the issue(s).

Step 4: Be man enough to walk away and realize that you might only agree to disagree

Step 5: Don't be a grade schooler and keep whining.

Step 6: Have lunch/dinner or coffee together and be friends. Life is too short to do otherwise.

Hope it all works out.

Please elaborate!

"the statement that the halting problem is not computable is actually equivalent to the fundamental statement of GÃ¶del's incompleteness theorem"

How exactly is undecidability equivalent to incompleteness?

Both undecidability and incompleteness can be formalized as statements in ZF set theory and proven in ZF. This makes them equivalent in the trivial sense that any two theorems of ZF are equivalent. But that's not what you meant by saying they're equivalent. So what precisely do you mean? I'm not trying to nit-pick or criticize; I really want to know, as precisely and formally as possible, in what deep way the two are related.

Sorry for going on a tangent to the main topic.