Nomad and Relativism

_Nomad_ by Ayaan Hirsi Ali is a good book. However, this statement in her Letter to My Unborn Daughter is silly:
I will bring you up to have faith in yourself, in science and your own reason and the force of life. And I will never seek to impose my own beliefs or unbelief on you.
The things in the first sentence are Ayaan's beliefs. She says both that she will cause her daughter to have them, and that she won't.

Ayaan fully intends to design her parenting so that her daughter grows up holding certain beliefs rather than others. She's not going to be passive about it. Somehow she's trying to limit what "beliefs" encompasses, even though she's well aware that some cultures have wildly different beliefs about science, faith, life and reason (explaining that is a theme of her books).

A theme of _Nomad_ is that modern Western culture is objectively better than primitive, tribal cultures, and that relativism is a mistake. Ayaan is wise to choose to promote her belief in science to her daughter. Her statement "I will never seek to impose my own beliefs" is a thoughtless platitude of just the type her book rightly opposes.

Elliot Temple | Permalink | Message (1)

mac has global autocomplete!!! and more about cocoa hotkeys

omg, opt-esc, global hotkey for cocoa text fields to try to complete the current word, try it!

there's also tons of other stuff like ctrl-k and ctrl-y (like cut/paste, but with a separate buffer, and cuts to end of line instead of using a selection). or ctrl-a and ctrl-e (move cursor to start/end of paragraph). and you can also make your own hotkeys including ones to do multiple things, like i made one to duplicate the current line by piecing together several commands. they also allow hotkey sequences as triggers.
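
if you want to make your own, they go in ~/Library/KeyBindings/DefaultKeyBinding.dict (the format is documented at the jrus links below). here's a sketch of a duplicate-line binding -- not my exact one, just an illustration; "^~d" means ctrl-opt-d:

{
    /* ctrl-opt-d: duplicate the current line. deleteToEndOfParagraph: cuts to
       the kill buffer (the separate ctrl-k/ctrl-y buffer), and yank: pastes
       from it, so the regular clipboard isn't touched. */
    "^~d" = (moveToBeginningOfParagraph:,
             deleteToEndOfParagraph:,
             yank:,
             insertNewline:,
             yank:);
}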

i also just found out about the esc based shortcuts in terminal (press esc, let go, then hit key). esc-d, esc-delete, esc-f, esc-b :-D (those are forward-delete-word, backward-delete-word, forward word, and backward word)

apple does a great job with details like this, when they try. i hope OS X gets more love soon (though i accept that iphone os is more important to their business atm)

Details

http://www.hcs.harvard.edu/~jrus/Site/system-bindings.html
http://www.erasetotheleft.com/post/mac-os-x-key-bindings/
http://www.hcs.harvard.edu/~jrus/Site/cocoa-text.html
http://www.freesmug.org/tutorial/tips/terminalkey

# in terminal
# list the emacs-style readline bindings bash understands:
bind -p
# open the cocoa text system's default bindings file:
open /System/Library/Frameworks/AppKit.framework/Resources/StandardKeyBinding.dict

Elliot Temple | Permalink | Messages (0)

Whirlwind Tour of Justificationism

From an email thread about free will:

Once upon a time (624 BC) Thales was born. Thus began philosophy.

Thales invented criticism. Instead of telling his followers what to believe, he made suggestions, and asked that they think for themselves and form their own ideas.

A little later, Xenophanes invented fallibilism: the idea that we can seek the truth and improve our knowledge without ever finding the final truth. He also identified and criticized parochialism.

In the tradition of Thales and Xenophanes came Socrates, the man who was wise for admitting his vast ignorance (among other things).

But only two generations after Socrates, philosophy was changed dramatically by Aristotle. Aristotle invented justificationism which has been the dominant school of philosophy since, and which opposes the critical, fallibilist philosophies which preceded him (and which were revived by Popper and Deutsch).

Aristotle's way of thinking had some major strands such as:

1) He wanted episteme -- objectively true knowledge.
2) He wanted to guarantee that he really had episteme -- justified, true knowledge. He rejected doxa (conjecture).
3) He thought he had episteme -- he was "the man who knows".
4) He thought he had justification.
5) In relation to this, he invented induction as a method of justifying knowledge.

Thus Aristotle rejected the fallibilist, uncertain ethos of striving to improve that had preceded him, and replaced it with an authoritarian approach that seeks guarantees and tries to secure existing knowledge against doubt.

Induction, like all other attempts, was unable to justify knowledge. Nothing can guarantee that some idea is episteme, so all attempts to do so failed.

Much later, Bacon attached induction to science and empiricism. Some people, like Hume, noticed it didn't work. But they didn't know what to do without it, because they were still focused on the same problem situation Aristotle had laid out: that we should justify our knowledge and find guarantees. Without induction they still had to figure out how to do that, and salvaging induction seemed easier than starting over. Hence the persistent interest in reviving induction.

What Popper did is go back to the old pre-Aristotle philosophical tradition which favors criticism and fallibilism, and which has no need for justification. Popper accepted that doxa (conjectures) have value, as Xenophanes had, and he explained how we can improve our knowledge without justification. He also refuted a bunch of justificationist ideas.

Then David Deutsch wrote "A Conversation About Justification" in _The Fabric of Reality_.

So how does that relate to free will? The basic argument against free will goes like this: "There is no way to justify free will, or guarantee it exists, therefore it's nonsense." The primary argument against free will is nothing but a demand for justification in the Aristotelian style.

As an example, one might say free will is nothing but a conjecture without any empirical evidence. To translate, that means free will is merely doxa and hasn't got any empirical justification. This is essentially true, but not actually a problem.

Arguments against free will take many guises, but justificationist thinking is the basic theme giving them appeal.

Elliot Temple | Permalink | Messages (13)


Programming and Epistemology

Organizing code is an instance of organizing knowledge. Concepts like being clear and putting things in sections apply to both programming and philosophy.

DRY and YAGNI aren't just programming principles. They also apply to thinking and knowledge in general. It's better to recognize a general case than to think of several cases in separate, repetitive ways. And it's better to come up with solutions for actual problems, and not to create a bunch of over-engineered theory that may never have a purpose.
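
To make the first point concrete, here's a toy sketch in Python (the shape functions are invented for illustration):

# Repetitive: the same knowledge restated once per case.
def triangle_perimeter(side):
    return 3 * side

def square_perimeter(side):
    return 4 * side

def pentagon_perimeter(side):
    return 5 * side

# DRY: recognize the general case and state the knowledge once.
def regular_polygon_perimeter(num_sides, side):
    return num_sides * side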

The programming methodology of starting with the minimum thing that will work, and then making lots of little improvements until it's awesome -- based in part on actual feedback and experience with the early versions -- is also a good general method of thinking, connected to gradualist, evolutionary epistemology. It's also how, say, political change should be done: don't design a utopia and then try to implement it (like the French Revolution); instead look for smaller steps, so it's possible to change course midway once you learn more, so you get some immediate benefit, and so you reduce risk.

Programmers sometimes write articles about how evil rewrites are, and how they lead to vaporware. Nothing is ever perfect, but existing products have a lot of useful work put into them, so don't start over (you'll inevitably run into new, unforeseen problems) but instead try to improve what you have. Similarly, philosophically, there are three broad schools of thought:

1) the conservative approach, where you try to prevent any changes.

2) the liberal approach, where you try to improve what you have.

3) the radical approach, where you say existing ideas/knowledge/traditions are broken and worthless, and should be tossed out and recreated from scratch.

The liberal, non-revolutionary approach is the right one not just for code rewrites but also in philosophy in general (and in politics).


Consider two black boxes which take input and give output according to some unknown code inside. You try them out, and both boxes give identical output for all possible inputs. You wonder: are the boxes identical? Are they the same, for all practical intents and purposes? Must they even be similar?

Programmers, although they don't usually think about it this way, already know the answer. Code can be messy, disorganized, and unreadable, or not. Code can have helpful comments, or not. One can spend a day refactoring or deleting code, and make sure all the tests pass, so it does exactly the same thing as before, but now it's better. Some code can be reused in other projects, and some isn't set up for that. Some code has tests, and some doesn't. One box could be written in C, and the other in Lisp.
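
For instance, here's a minimal sketch in Python (a toy example of my own): two "boxes" that give identical output for every input they're given, but with very different insides:

# Box A: correct, but the insides are opaque.
def f(n):
    t = 0
    i = 0
    while i <= n:
        t = t + i
        i = i + 1
    return t

# Box B: identical input/output behavior, better structure.
def sum_up_to(n):
    """Return the sum of the integers 0 through n."""
    return n * (n + 1) // 2

# Identical output for every tested input:
assert all(f(n) == sum_up_to(n) for n in range(1000))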

None of these things matter if you only treat code as a black box and just want to use it. But if you ever have to change the code, like adding new features, doing maintenance or doing bug fixes, then all these differences which don't affect the code's output are important.

I call what the code actually does its "denotation" and the other aspects its "structure", and I call this field structural epistemology. Programming is the best example of where it comes up, but it also has more philosophical relevance. One interesting question is if/how/why evolution creates good structure in genetic code (I think it does, but I'm not so clear on what selection pressure caused it). Another example is that factories have knowledge structure issues: you can have two factories both making toys, with the same daily output, but one is designed so it's easier to convert it to a car factory later.

Elliot Temple | Permalink | Messages (10)

Mises on Force and Persuasion

Liberalism in the Classical Tradition by Ludwig von Mises, p 51:
Repression by brute force is always a confession of the inability to make use of the better weapons of the intellect.
This is similar to Godwin:
If he who employs coercion against me could mould me to his purposes by argument, no doubt he would. He pretends to punish me because his argument is strong; but he really punishes me because his argument is weak.

Elliot Temple | Permalink | Messages (0)

Milton Friedman was a Statist

Now you know.

http://www.hoover.org/multimedia/uk/3411401.html

Edit:

In the interview, he expresses disagreement with Ayn Rand and her view that the State is bad because it uses force against its citizens. He does not provide any argument that she's mistaken, or that his view is better.

Milton also, for example, advocated a negative income tax. That means that if you contribute a sufficiently small amount to the economy, the State takes money by force from other citizens and gives it to you.

The purpose of this post is simply to inform people about how a libertarian icon is a blatant Statist. (And, by the way, he's not the only one.)

Elliot Temple | Permalink | Messages (3)

Beyond Criticism?

The Retreat To Commitment, by William Warren Bartley III, p 123:
There may, of course, be other nonlogical considerations which lead one to grant that it would be pointless to hold some particular view as being open to criticism. It would, for instance, be a bit silly for me to maintain that I held some statements that I might make—e.g., "I am over two years old"—open to criticism and revision.

Yet the fact that some statements are in some sense like this "beyond criticism" is irrelevant to our problems of relativism, fideism, and scepticism.
The claim that some statements are beyond criticism is anti-fallibilist and anti-Popperian.

It is not at all silly to maintain that the example statement is open to criticism. It's essential. Not doing so would be deeply irrational. We can make mistakes, and denying that has consequences, e.g. we'll wonder: how do we know which things we can't be mistaken about? And that question begs for an authoritarian, as well as false, answer.

You may be thinking, "Yes, Elliot, but you are over two years old, and we both know it, and you can't think of a single way that might be false." But I can.

For example, my understanding of time could contain a mistake. Is that a ridiculous possibility? It is not. Most people today have large mistakes in their understanding of time (and of space)! Einstein and other physicists discovered that time and space are connected, and the connection is weird and defies common sense. For example, the common sense concept of two things happening simultaneously at different places is a mistake: what appears simultaneous actually depends on where you watch from. If some common sense notions of time can be mistaken, why laugh off the possibility that our way of keeping track of how much time has passed contains a mistake?

Another issue is when you start counting. At conception? Most people would say at birth. But why birth? Maybe we should start counting from the time Bartley became a person. That may have been before or after birth. According to many people, brain development doesn't finish until age 20 or so. In that case, a 21-year-old might only have been a full person for one year.

Of course there are plenty of other ways the statement could be mistaken. We must keep an open mind to them, so that when someone has a new, counter-intuitive idea we don't just laugh at him but listen. Sure, the guy might be a crank, but if we ignore all such ideas, we'll ignore the good ones too.

Elliot Temple | Permalink | Messages (94)