I'm back in Istanbul--reunited with seven cats who were overjoyed to see me--and I've got to return my thoughts to all things Eastern, because there's a lot going on here, as you may have heard. But before that, a few more reflections from the Great Expectations conference.

First: Ricochet members, you did me proud. I wish you all could have seen the faces of the participants in the conference when they read your comments. I was kvelling. See? Absolutely the best, most intelligent conversation to be had anywhere on the Internet. And quite a few people, by the end, were willing to come out and be interviewed--they were persuaded by reading what you had to say and by seeing your questions. Unfortunately, we just didn't have enough time. Interviewing the participants required separating them from everyone else, and they were all so excited about talking to each other that I just couldn't peel them away. 

We've asked Mike Denton and Paul Nelson to come join Ricochet as guest contributors this summer, however. Paul cheerfully accepted, and since Mike promised to answer all your questions, I think he has to accept--he was flying off to visit his daughter in London, so I didn't get a chance to press him on the point. But I will, not to worry. Everyone wanted to talk about your comments and questions. So you'll hear a lot more from them, and, I suspect, from other people who were there--including, of course, my Pop. 

Second: Here's another video. This is Robert Marks reflecting on the conference from an engineer's perspective (he's an engineer, obviously). He discusses the idea of information, addressing some of the questions raised by our members. 

And finally, a conversation with Rabbi Moshe Averick, the Maverick Rabbi. He's the author of Nonsense of a High Order: The Confused and Illusory World of the Atheist. He's completely charming: I wish you'd all had the chance to talk to him. As you might expect, given that he's in the rabbi business, he felt like discussing the religious perspective on these ideas:

[Embedded video: the conversation with Rabbi Moshe Averick]

That's all for now. Turkey's waiting for me, and so are a lot of editors who remain peevishly attached to the concept of "Western deadlines." 

Thank you, Ricochet, for having added to the joy of this conference--I know everyone there was delighted to hear from you, even the people who weren't quite ready to reveal themselves.  Sooner or later they will, I suspect. They know now that the water at Ricochet is warm.

Comments:

Paul Snively

Robert Marks had me right up until he claimed we need a new information theory to account for the information we see. I don't believe that follows from any current scientific evidence of any kind, or from anything he said in his interview, although I would like to read his paper... :-)

His comments sound as if he understands quite well that an axiom system consists of a lot of "compressed" information, and performing inferences based on those axioms "decompresses" the information that's already there. Ignore the references to probability; probability can be, should be, and is described entirely in terms of logic. The best reference on information theory as learning I'm aware of is The Minimum Description Length Principle. Finally, the logical basis for connecting the two is New Axioms for Rigorous Bayesian Probability, which I linked to earlier.
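[Ed. note: a toy illustration — mine, not Paul's — of the "compressed information" idea. Off-the-shelf compression is only a crude stand-in for the description lengths that MDL and Kolmogorov complexity talk about (true Kolmogorov complexity is uncomputable), but it makes the point: a string generated by a simple rule has a short description, while a patternless one does not.]

```python
# Toy sketch: compression as a rough proxy for "description length".
import zlib
import random

regular = b"ab" * 500                                      # 1000 bytes, one simple rule
noisy = bytes(random.randrange(256) for _ in range(1000))  # 1000 bytes, no rule at all

print(len(zlib.compress(regular)))  # small: "repeat 'ab' 500 times" says it all
print(len(zlib.compress(noisy)))    # near 1000: no shorter description was found
```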

My claim is that this is all you need to have explanatory and predictive power over any classical natural phenomenon. It's true that we need to develop quantum information theory, but not yet. :-)

I'd be interested in Robert Marks's comments, if he sees this.

Thanks, Claire, for all the great reporting!

Claire Berlinski, Ed.

Paul Snively: I'd be interested in Robert Marks's comments, if he sees this.

Thanks, Claire, for all the great reporting! · Jun 17 at 7:06am


You're so welcome, and thank you for all the great responses. I'll make sure everyone sees all the comments, and I expect everyone will want to reply. But I know they're all getting off long flights, so I imagine they'll want to rest a bit, first. 

I'm sorry now that I didn't ask the question I thought of asking when he said that, which was, "Why exactly do we need a new information theory?" That was exactly what came to my mind, but I got distracted by another thought. 

KC Mulville

A couple of months ago, I picked up a book on Kindle entitled "The Information: A History, a Theory, a Flood" by James Gleick. It gives something of a survey of Information Theory, and spends considerable time on Shannon and John von Neumann. I thought the book was very entertaining and, pardon the pun, informative.

Philosophy has a parallel debate on "meaning" that plays into some of the same topics. Most students are acquainted with the rule that the predicate can't be contained within the subject, otherwise the statement is a tautology. If you say "Bob Smith is Bob Smith," your predicate (the thing you're attributing to the subject) is already contained in the subject. So, for a statement to be meaningful, the predicate has to attribute something that isn't already assumed - for example, "Bob Smith is the murderer."

If you think about it, that's a pre-condition for what Marks described with Shannon information. If you tell me that Bob Smith is Bob Smith, the probability is 100%. If you tell me that he's a murderer, there's nothing in the subject that necessarily implies the predicate. The probability drops close to zero.
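[Ed. note: KC's example maps directly onto Shannon's measure of "surprisal": an event with probability p carries -log2(p) bits of information, so a tautology carries none. A quick sketch, with made-up probabilities:]

```python
from math import log2

def surprisal(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    return -log2(p)

print(surprisal(1.0))    # "Bob Smith is Bob Smith": 0.0 bits -- a tautology tells us nothing
print(surprisal(0.001))  # "Bob Smith is the murderer": ~9.97 bits -- improbable, hence informative
```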

KC Mulville

I'd say that the chief reason we need a better Information Theory is that they're likely facing the same obstacles that philosophy faces with meaning. If you want a decent overview of that debate, check out the Stanford Encyclopedia entry.

Maybe I'm pushing the "meaning" angle so much because that's the language with which I'm familiar. But if you read the later Wittgenstein, especially the idea of games and ostensive definition ... and then read about Information Theory ... you'd find yourself hitting the same ideas from different angles.

Paul Snively
KC Mulville: I'd say that the chief reason we need a better Information Theory is that they're likely facing the same obstacles that philosophy faces with meaning.

As much as it pains me to insist on this point—because I have no intention of shutting down discussion—no, that's not a reason we need a new Information Theory. We don't need a new Information Theory, and Information Theory doesn't suffer from the philosophical problem of meaning.

If you want a decent overview of that debate, check out the Stanford Encyclopedia entry.

I see a lot of verbiage that doesn't help me understand why Bayesian/Laplacian Probability + Shannon/Kolmogorov/Solomonoff/Chaitin Information Theory is inadequate. That might just mean I don't understand the verbiage, of course.

"We need a new Information Theory" presumes that there are areas in which the extant theories necessarily fail to offer explanatory or predictive power, vs. errors in understanding or application of the extant theories. That's the argument I'd like to see made as rigorously as possible.

Paul Snively

To follow up re: philosophy, I always dislike taking on my brothers and sisters in the philosophical community because I do think that work is important, but unfortunately, the great bulk of it isn't rigorous. Analytic philosophy tries to be rigorous, but only a subset of analytic philosophy really is. I can't put it any better than this:

"If we are to model our theory of probabilistic reasoning on a computer, we cannot be satisfied with giving the kind of vague untestable 'picture,' devoid of detail, that is characteristic of so much contemporary philosophy. Such a picture is not a theory. It is at best a proposal for where to start in constructing a theory. This problem is automatically avoided by insisting that a theory be capable of generating a computer program that models part of human reasoning. You cannot get a computer to do anything by waving your hands at it." — John L. Pollock, "Nomic Probability and the Foundations of Induction"

Brian Watt

I found the tail end of the brief interview with Rabbi Averick to be quite condescending, to wit:

"If you actually believe that a human being is nothing but a glorified cockroach...or a hunk of meat that eventually gets put into a hole in the ground..."

I haven't encountered one, not one, secularist, agnostic or atheist who believes this. In fact, I would say that most non-believers I know are incredibly well-read and well-versed in history, art, politics, philosophy and science and have an extremely high regard for the value of human life and humanity in general and are quite respectful and appreciative of many of the good works that religions are involved in. 

Gallup recently reported that 90% of Americans believe in God. Since Roe v. Wade 52 million abortions have been conducted in America. Is it the atheists who are aborting children in these horrific numbers? Or are believers aborting children? Shouldn't the good rabbi and his like-minded clerics be focusing their attention on their own...what should we call them...flocks?


Abdiel

Brian Watt: I found the tail end of the brief interview with Rabbi Averick to be quite condescending, to wit:

Gallup recently reported that 90% of Americans believe in God. Since Roe v. Wade 52 million abortions have been conducted in America. Is it the atheists who are aborting children in these horrific numbers? Or are believers aborting children? Shouldn't the good rabbi and his like-minded clerics be focusing their attention on their own...what should we call them...flocks? · Jun 17 at 9:15am

This really needs repeating. That 2% of the population who identify themselves as atheists can't possibly be responsible for all 52 million abortions, or our 50% divorce rate. Somewhere down the line Christians need to acknowledge that this is a problem of moral relativism being coupled with faith.

KC Mulville
Paul Snively: Information Theory doesn't suffer from the philosophical problem of meaning.

I didn't say it did. That's why I phrased it as "the same obstacles."

Meaning is association. To say that X means Y is to say that when you encounter X, you should think of Y. It's a translation; how to get from X to Y? Translation depends on the observer. The translation is essentially linguistic, and it's affected by the social constructs inherent in language.

The ambiguities embedded within those social constructs are equivalent to the "noise" of information theory.

Right now, most of the information theory out there is oriented to the practical problem of eliminating noise, so that a transmission can arrive at reception in a recognizable state.

But what do you do when you don't have access to the original transmission, and all you have is the message? As Quine and Chomsky argue, you have no immediate way of knowing if, or how much, the message is distorted. You only know what the message means to you, but you can't trust that this is the same as what the sender "meant."
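[Ed. note: a minimal sketch of the engineering problem KC describes just above — redundancy added at the sender so the receiver can undo channel noise. A three-fold repetition code is the simplest example; note that it recovers the bits, not what they mean, which is exactly the distinction KC is drawing.]

```python
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]            # send each bit three times

def channel(bits, flip_prob=0.05):
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]  # majority vote

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(channel(encode(message)))
print(message, received)   # usually identical, despite the noisy channel
```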

KC Mulville

I've seen Quine and Chomsky offer challenging arguments to the notion of meaning in philosophy, and the "problem of noise" in information theory sounds familiar. That's my only point.

The corollary problem in information theory is this: when you encounter something in the world without knowing how it got there, how can you predict what it means?

If you encounter a "regularity" in microbiological structures, how do you interpret what it means? Your immediate reaction is to assume that the regularity is meaningful, but that's what the debate about Intelligent Design is all about. Is it meaningful?

Think of the "watch" argument for the existence of God. If you found a watch, with all of its complexity, there's a low probability that such a watch could have formed without design. You take the watch's complexity as evidence of design because of its improbability.

Information theory deals with the same probability calculations. The problem, however, is that probability is as much dependent on the observer as meaning is. When you say that something is more probable, it's because you already have expectations about what counts (or doesn't) toward probability.
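[Ed. note: a hedged sketch of the observer-dependence KC raises, using Bayes' rule. The same evidence ("this object is highly complex") produces very different posteriors for "it was designed" depending on the prior you bring to it. Every number below is invented purely for illustration.]

```python
def posterior_design(prior_design,
                     p_complex_given_design=0.99,
                     p_complex_given_chance=1e-6):
    """Bayes' rule: P(design | observed complexity)."""
    num = p_complex_given_design * prior_design
    den = num + p_complex_given_chance * (1 - prior_design)
    return num / den

for prior in (0.5, 1e-6, 1e-9):
    print(prior, round(posterior_design(prior), 6))
# A coin-flip prior lands near certainty, a one-in-a-million prior lands near 0.5,
# and a one-in-a-billion prior stays small: the "evidence" never speaks for itself.
```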

KC Mulville
Paul Snively: This problem is automatically avoided by insisting that a theory be capable of generating a computer program that models part of human reasoning.

Why must a philosophical discussion be capable of generating a computer program? Sounds cool, but why?


Roy Lofquist

Normally I don't state my qualifications but I think it might be germane here. I started in the computer business in 1960. I read von Neumann before he published his first book in the Journal of the Association for Computing Machinery. I grew up with Information Theory from its early days.

Robert Marks is quite correct when he says we need some new theories. The problem is that we still don't fully understand what constitutes information. We've been chomping around the edges for 30 or 40 years and each time we take a bite new questions arise.

For example, there is still vigorous debate about which algorithms produce a truly random number. Whether you can determine the stopping point of a Turing machine is unsettled. Roger Penrose developed a proof that is still controversial. Many of these problems are essentially involved with extending Gödel's Theorem beyond the realm of integers.

I hope this sheds some light.

Brian Watt

Roy Lofquist: Normally I don't state my qualifications but I think it might be germane here. I started in the computer business in 1960. ... I hope this sheds some light. · Jun 17 at 10:02am

As long as you can figure out how I can win the MegaMillions lottery my life will be complete. :-)

Paul Snively
KC Mulville: Why must a philosophical discussion be capable of generating a computer program? Sounds cool, but why?

Bearing in mind that I'm presuming to speak for the late Dr. Pollock: because it keeps philosophy grounded in a physical process, which is key to it having meaning. :-)

KC Mulville

Paul Snively

KC Mulville: Why must a philosophical discussion be capable of generating a computer program? Sounds cool, but why?

Bearing in mind that I'm presuming to speak for the late Dr. Pollock: because it keeps philosophy grounded in a physical process, which is key to it having meaning. :-) · Jun 17 at 10:21am

With respect, why is being grounded in a physical process the key to meaning? (I'm not trying to be combative, so please don't misinterpret me.) But it strikes me that philosophy is deliberately abstract, for good reason. Being grounded in a physical process isn't an intellectual discipline as much as it is an intellectual preference. 

Paul Snively
Roy Lofquist: For example, there is still vigorous debate about which algorithms produce a truly random number.

"Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin." — John von Neumann

No algorithm can produce a random number, of course.
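[Ed. note: a quick illustration of von Neumann's quip. A pseudo-random generator is pure arithmetic, so the same seed yields the same sequence every time. This is a textbook linear congruential generator, shown only to make the point; nobody in the interview endorsed it.]

```python
def lcg(seed, a=1103515245, c=12345, m=2 ** 31):
    """Linear congruential generator: deterministic arithmetic, nothing random."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m

gen1, gen2 = lcg(seed=42), lcg(seed=42)
print([round(next(gen1), 4) for _ in range(5)])
print([round(next(gen2), 4) for _ in range(5)])   # identical sequences: same seed, same "randomness"
```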

 Whether you can determine the stopping point of a Turing machine is unsettled.

The Halting Problem is known not to be solvable.
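[Ed. note: Turing's diagonal argument, sketched in Python for the curious. Suppose someone hands us a correct halts(program, argument) decider; the function below then has no consistent behavior on itself, so no such decider can exist. The stub is a placeholder only — the whole point is that it can never be filled in.]

```python
def halts(program, argument) -> bool:
    """Hypothetical decider: returns True iff program(argument) halts."""
    raise NotImplementedError("no correct implementation can exist")

def diagonal(program):
    if halts(program, program):   # if 'program' would halt on itself...
        while True:               # ...then loop forever;
            pass
    return "halted"               # ...otherwise, halt immediately.

# halts(diagonal, diagonal) can be neither True nor False without contradiction.
```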

Roger Penrose developed a proof that is still controversial.

A proof of what? Penrose, in "The Emperor's New Mind" and "Shadows of the Mind," posits a need for a physics that is distinct from the experimentally supported quantum mechanics, on the basis of nothing more than his desire for the mind not to be a quantum mechanical machine. I see no evidence for his assertions, let alone proof of anything.

Many of these problems are essentially involved with extending Gödel's Theorem beyond the realm of integers.

This doesn't mean anything. Gödel's theorems already apply to anything at least as powerful as Peano arithmetic, and practically all logics of interest are at least that powerful.

Paul Snively
KC Mulville: With respect, why is being grounded in a physical process the key to meaning? (I'm not trying to be combative, so please don't misinterpret me.) But it strikes me that philosophy is deliberately abstract, for good reason. Being grounded in a physical process isn't an intellectual discipline as much as it is an intellectual preference.

No, it's an excellent question. The briefest way I can put it—also not to be combative—is that it keeps philosophy from wasting time discussing admittedly more sophisticated variants of "If God is omnipotent, can He make a stone so heavy that even He can't move it?" or "How many angels can dance on the head of a pin?" That is, philosophy only has a problem of meaning to the extent that it insists upon not limiting itself to contemplating meaningful propositions.

KC Mulville
Paul Snively: it keeps philosophy from wasting time discussing admittedly more sophisticated variants of ...

Well, if you're making the point that philosophy is stuffed with irrelevance ... frankly, I'll agree to that in a heartbeat. And I spent tuition money to be trained to read this stuff! 

But let's get back to information theory. I'm curious about your response to Roy Lofquist. Given those comments, how would you define information? 


Roy Lofquist

Gentlemen,

Interesting discussion.  As I stated in the first paragraph, I am making assertions based upon 40 years of living with this stuff. Tough to explain 40 years of thinking in 200 words.

As to Penrose, I just picked up "The Emperor's New Mind" from my desk. Publication date 1989. It's getting a little dog-eared now.

Re the question of proof of a particular philosophy, you have to consider what in computation theory is termed NP-complete or, in Penrose's term, non-computable. Before Turing there were a number of problems that were recognized to be too difficult to solve with pencil and paper and were consigned to the attic of interesting but unresolvable. The advent of digital computers prompted the opening of dusty boxes. We found that some were easy, some were tough but solvable given sufficient computing power, while a surprising number were impossible to solve with any conceivable computer in any amount of time. A considerable amount of effort is being expended to find a proof of whether they are or are not NP-complete.
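[Ed. note: a toy example of the combinatorial blow-up Roy is gesturing at. Brute-force subset-sum — a classic NP-complete problem — examines up to 2^n subsets, which becomes hopeless long before n gets large; strictly speaking that is "intractable" rather than Penrose's "non-computable," which is a different category.]

```python
from itertools import combinations

def subset_sum_exists(numbers, target):
    """Try every subset; O(2^n) work in the worst case."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return True
    return False

print(subset_sum_exists([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(2 ** 60)   # subsets to check for a mere 60 numbers: about 1.15e18
```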

Word limit.

Thanks guys,

Roy

Paul Snively
KC Mulville: Well, if you're making the point that philosophy is stuffed with irrelevance ... frankly, I'll agree to that in a heartbeat. And I spent tuition money to be trained to read this stuff!

LOL OK, fair enough. I guess, coming from a math/logic background, I sometimes let myself lose sight of... well, the human side of the humanities, if you will.

But let's get back to information theory. I'm curious about your response to Roy Lofquist. Given those comments, how would you define information?

I don't. I'm satisfied with Probability Theory: The Logic of Science, The Minimum Description Length Principle, and New Axioms for Rigorous Bayesian Probability. Together, I believe they completely define how to reason under uncertainty and integrate new information.


