December 12, 2005
You Do Not Want To Test My 'Powers of Philosophy'
Many thanks to Richard Zach at LogBlog for giving me a reason to get back in the game here at Metametaland.
So you think that philosophers are some type of girly men, and that you can just kick sand in their faces with impunity?
Think again, little-minded man.
For now, the true Ubermen will reveal themselves. I give you:
My favorites are:
Powers: remarkably original, ahead of his time
Notes: The reader may wonder, "Why is Kant blue with red tiger stripes?" Well, why twelve categories? I don't know. It was decided that a toy representing as important a philosopher as Kant ought to be unique in some way, that there ought to be some kind of toyly manifestation of his philosophical greatness. Blue with red tiger stripes was deemed sufficiently bad ass.
Notes: As these Leibniz figures are painstakingly manufactured to be exactly alike (even in regard to spatio-temporal location), collectors will be disappointed to discover that there is only one in existence, in accord with Leibniz' Law. Also try Leibniz' delicious cookies!
March 30, 2005
Can motion-induced blindness be used to argue for the logical possibility of zombies?
This is really what philosophers do. Any wonder why I got my undergraduate degree in the first science? That's right, I'm extremely concerned with the metaphysical existence of cryptozoological entities.
Blogosophy has also started a "Top Five Philosophers" Meme. Here are my lists.
Top Five Philosophers I'd bring with me on a desert island (chosen for my interest and their readability)
Philip K. Dick
Robert Anton Wilson
Top Five Philosophers I wish I had read more of:
December 01, 2004
Did somebody say Mutual Aid?
Not that I've been haranguing all my friends with the current culture's fundamental misunderstanding of the process of natural selection or anything.
Clearing up misunderstanding #1:
In this big wide world, competition does not seem to be as prevalent between members of the same base unit of selection (gene, species, community) as it is between those members and capital-N Nature.
Clearing up misunderstanding #2:
The evidence seems to favor the gene (as defined by Dawkins and others) as the base unit of selection.
June 04, 2004
Or, MIT Courses I want to take towards becoming a Professor of Information Theory & Theoretical Physics.
I was cataloging MIT Class 6.050J, Information and Entropy, when I found this FAQ for prospective students. I had to stop and create a new entry in my moleskine (more on the moleskine to come) for courses at MIT that I've cataloged and want to attend.
I've been starting to talk to folks about Information Theory, a subject I think I'd like to be an expert on at some point. I know that I'm going back to get a PhD when I'm done with Information Architecture and Digital Libraries. Discovering Information Theory gave me a natural course back to academic study. I'd basically be doing the more experimental and out-there cool stuff in Information Science that I don't have the time or money to do today. And as 6.050J shows, I'd be very close to my one true scholarly passion, metaphysics. The Second Law of Thermodynamics is the most glorious and painful of all of nature's wonders. It is the enemy of all librarians: the rule that says no matter how much we fight against disorder with our catalogs and rules and taxonomies, the universe inexorably trends toward disorder. We're fighting a battle we can never win. We know it and fight anyway; remember that the next time you have a chance to interact with a librarian. It gives me great pleasure just to think about entropy.
Anyway, Information Theory is an attempt to define information using formal mathematical language. In essence it defines information as a measure of the unpredictability of any communication. At my website, I've got a page where I've collected some wisdoms I hold as true. There you'll find this mathematical expression of information: What I Know
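For the curious, here's a minimal sketch of that idea in Python. This is the standard Shannon entropy formula, H = -Σ p·log₂(p); I'm assuming it's the same expression collected on that page, since it's the canonical way to measure unpredictability in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon's information measure in bits: H = -sum(p * log2(p)).

    The more unpredictable the source, the higher the entropy,
    and the more information each message carries.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each flip
# carries less information (~0.469 bits).
print(shannon_entropy([0.9, 0.1]))
```

Notice that predictability and information trade off directly: a coin that always lands heads has zero entropy, because its "messages" tell you nothing you didn't already know.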
Anyway, on to the FAQ. My favorite part is its clear expression of what the information age is and why we're in it: namely, that the transmission of information currently sits in a happy zone where it is mostly unconstrained by physical limitations and so is, for the most part, free of entropic concerns. Without these conditions, there would be no proliferation of hypertextual exchange of information on the ridiculous scale we've seen over the last ten years.
1. Are these questions really frequently asked?
Of course not. Most have never been asked in exactly this form. However, questions and answers are a good way to explain things.
2. What is this course all about?
Information and entropy. Information is a measure of the amount of knowledge; it is not the knowledge itself. Thus information is like energy, in the sense that knowing the amount of energy does not tell you anything about its nature, location, form, etc. The same is true with information. Entropy is one kind of information. It is the information you do not have about a situation.
3. The information I don't have? Is this different from the information you don't have?
Certainly. Information is subjective. You know things that I don't. This is, after all, why communication is useful.
4. But is entropy also a subjective quantity?
Strictly speaking, yes. However, in physical situations the difference of entropy as perceived by two observers may be negligibly small.
5. If information is the measure of a quantity, what are its units?
Information is measured in bits. More bits means more information.
6. Is entropy also measured in bits?
Yes. However, in physical situations there is lots of information that is not known. (Think about how many bits of information are needed to specify the position of all atoms in an object.) It is impractical to work with such large numbers, so another set of units is used: joules per kelvin (J/K).
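The conversion between the two unit systems is worth seeing concretely. A quick sketch, assuming the standard relationship that one bit of missing information corresponds to k_B·ln(2) joules per kelvin, where k_B is Boltzmann's constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def bits_to_joules_per_kelvin(bits):
    """One bit of missing information = k_B * ln(2) J/K (~9.57e-24 J/K)."""
    return bits * K_B * math.log(2)

def joules_per_kelvin_to_bits(entropy_j_per_k):
    """Inverse conversion: thermodynamic entropy back to bits."""
    return entropy_j_per_k / (K_B * math.log(2))

# A single bit, expressed in thermodynamic units, is absurdly tiny:
print(bits_to_joules_per_kelvin(1))  # ~9.57e-24 J/K
```

The tininess of that number is exactly why physicists prefer J/K: entropies of everyday objects are a comfortable few joules per kelvin rather than astronomical bit counts.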
7. Wait a minute. Entropy is physical and information is mathematical. How can they be conceptually the same?
Historically, information storage and transmission required a physical artifact of some kind. Think of newspapers, books, and medieval manuscripts. So traditionally information has also been physical: most of the cost of information processing was due to the physical carrier of the information. It is only recently that the cost of storing, moving, or processing information has become so low that we think of information apart from its physical form, as if it were not even subject to physical laws. But this is only a temporary situation. Eventually information technology will have to face the limits imposed by quantum mechanics and thermodynamics, and it will again be necessary to understand the fundamental physics of information. This time it will be entropy that is the relevant physical concept.
8. Why is information so important?
This is the beginning of the information age. Just as the industrial age was opened up by our ability to manage energy, so the information age is upon us because we are learning how to manage information effectively.
9. Why is entropy so important?
Entropy is one of the most mysterious of all the concepts dealt with by scientists. The Second Law of Thermodynamics states that the entropy of a given situation cannot decrease unless there is a greater increase elsewhere. Thus entropy has the unusual property that it is not conserved, as energy is, but monotonically increases over time. The Second Law of Thermodynamics is often regarded as one of science's most glorious laws. And also one of the most difficult to understand.
10. If entropy is so difficult, can a freshman really understand it?
Certainly. It's all in how the topics are approached. It is true that the concepts involved here are not normally taught to freshmen. This is a shame, because they have the background necessary to appreciate them if approached from the point of view of information. Traditionally the Second Law of Thermodynamics is taught as part of a course on thermodynamics, and a background in physics or chemistry is needed. Also, the examples used come from thermodynamics. In this course, the Second Law is treated as an example of information processing in natural and man-made systems; the examples come from many domains.
11. I am thinking about taking this course. What do I need to know to start?
You need to understand energy, and how it can exist in one place or another, how it can be transported from here to there, and can be stored for later use. You need to know how to deal with a conserved quantity like energy mathematically, and to appreciate that if energy increases in one region, it must decrease elsewhere. More specifically, the prerequisite for this course is the first semester freshman physics subject 8.01 (or 8.012, 8.01L, or 8.01X).
12. Is entropy useful outside of thermodynamics?
All physical systems obey the laws of thermodynamics. The challenge is to express these laws in simple but general forms so that their significance in, for example, a biological system gives insight. Besides, laws similar to the Second Law of Thermodynamics are found in abstract systems governed more by mathematics than physics; two examples discussed in the course are computers and communications systems. In these contexts the "information" part of "Information and Entropy" is important.
13. Why aren't information and entropy normally thought of together?
Most scientists recognize that information can be exchanged for entropy and vice versa, but they don't consider that fact important. The reason is that in typical physical situations the number of bits of entropy is far larger than the number of bits of information that even the largest information processing systems can deal with. In other words, the scales involved are vastly different. This is because there are a large number of atoms in physical systems.
14. But then why treat them together now?
For two reasons. First, the underlying principles are the same so you only have to learn them once. And second, modern technology is continuously increasing the amount of information that computers and communication systems can deal with. Another way of saying this is to observe that modern microelectronic systems control more bits with fewer atoms. As the number of atoms per bit comes down, the difference in scale between information in computers and communications systems and entropy in the corresponding physical systems shrinks. Eventually it will be possible to make devices that cannot be understood without considering the interplay between the information stored in the device and the entropy of its physical configuration.
15. If I take this course, can I get a summer job?
Certainly, but probably not because of anything you learn here.
February 24, 2004
War, good god, what is it good for?
Just an interesting bit of information; take it with a grain of salt (especially as it's unsourced). It does not constitute an endorsement of either candidacy on anyone's part (especially mine).