Showing posts with label idiots. Show all posts

Wednesday, 30 December 2009

Say it with me: dumb ideas are dumb

There is a prevalent and dangerous meme rife in society today, and though some people may find the following offensive, judgemental or unfashionable, I believe it needs to be said. Your forbearance is therefore appreciated while I do so. ;-)

First, some axioms. These should be unarguable:

  • Everyone is entitled to their own opinion.
  • Not everyone's opinion is equally valid, useful or meritorious in every single situation.
  • Nobody is entitled to their own facts.
  • You have freedom of speech, thought and association. You do not have freedom from criticism, freedom from offence or freedom from correction.

The problem arose when the first axiom (a healthy recognition that other people hold different opinions) mutated into something else entirely: the beliefs that everyone's opinion is equally valid, and that contradicting someone in error is impolite, arrogant or somehow an infringement of their freedoms.

One look in some Lit Crit classrooms will show you what happens when you aren't allowed to contradict or dispute someone else's opinions, and one look in a politicised fundamentalist church will show you what happens when you believe you're allowed your own facts, instead of just your own opinions.

And while people might enjoy studying Lit Crit or subscribe to fundamentalist religions, if they've got any sense they'll notice that people acting in either of these two roles have rarely done anything tangible to better the overall lot of their fellow man... unlike all those rude, elitist, judgemental, snobby scientists, engineers, geeks and other educated types (who instinctively recognise that ideas vary in quality and efficacy, and have therefore been quietly and industriously changing the world for the better for the last few hundred years).

The Western world (ably led, as ever, by America) is learning the hard way what happens when you confuse recognition of the existence of everyone's opinions with equality of worth of everyone's opinions. Moreover, while we mouth thought-terminating clichés like "everyone deserves an equal say", we routinely disregard them in practice. Who seriously consults their toddler in the back seat on how to get home when lost in the car? Who leaves their neurosurgeon's office and seeks a second opinion from their local garage mechanic?

It's OK to judge and disregard things which demonstrably have no merit. We all commonly agree that "all people" deserve some minimum baseline of freedoms, protection, treatment and standard of living. And yet we still deny some of those benefits to people whom we have judged and found undeserving of them, or actively dangerous (imprisoned criminals, for example).

We try to pretend that all ideas are equal, but it's not true - some ideas are brilliant, explanatory and useful, but some are stupid, dangerous or self-destructive. And refusing to judge them and pretending those ideas are harmless, valid or beneficial has much the same effect on society in the long term as refusing to judge dangerous people would have on society - internal chaos and developmental stagnation.

We don't have to ban stupid ideas or opinions, just as we don't have to kill criminals. Instead we isolate criminals in jails so they can't damage society any further.

We can do the same with ideas, simply by agreeing they're dumb.

Refusing to publicly label a dumb idea "dumb" for fear of offending someone is - long term - as bad for our culture and society as refusing to lock away criminals "because their families might be upset".

Although it's unpopular to point out, sometimes people and ideas need to be judged for the good of society, even if it does end up upsetting or offending some people.

For the last decade or two - beginning around the advent of political correctness, though I suspect that was a symptom rather than a cause - we've done the intellectual equivalent of systematically dismantling the judicial system, with all its courts and prisons. Now - just as we'd be overrun with criminals if we dismantled all the prisons - we're overrun with stupid ideas, unqualified but strongly-expressed opinions and people who act as if they can choose their own facts.

The only way you can help redress this situation is by not being afraid to offend people - if someone says something stupid, call them on it. Politely but firmly correct people when they make erroneous claims. Question badly-thought-out ideas, and don't let people get away with hand-waving or reasoning built on obvious flaws or known logical fallacies. Yes they'll get annoyed, and yes they'll choose to take offence, but we don't free criminals because they or their families are "offended" at their having to stay in prison. They are there - largely - because they deserved and invited it, and because the world is better with them there. Likewise, dumb ideas deserve and invite correction, and the world would be a better place for everybody if more people judged and criticised them whenever they came across them.

Sometimes uncomfortable things do need to happen to people, and certainly if they invite them. There's no advancement without the possibility of failure, and removing the opportunity for failure removes the opportunity to develop. If no-one ever tells you you're wrong, how will you ever learn?

But most important of all, while judging people is unfashionable, can be dangerous and should largely be left to trained professionals, don't ever be afraid to judge ideas.

Wednesday, 29 July 2009

Your opinion is worthless

This is a slightly self-indulgent post, relating to website and forum discussions, rather than a generally-applicable epiphanette. Nevertheless, I think it's an important point, and one which far too few people understand...

I find when browsing internet discussion forums that when someone with a controversial or non-mainstream opinion posts and gets voted down, I frequently run across comments similar to the following:

I find I get downmodded a lot because I'm a person willing to speak my mind. That makes a lot of the insecure people here (of which there are many!) uncomfortable, and to try and counter that they downmod my posts.

Straight to it: although sometimes the commenter has a point (people get very attached to their ideas, and can react irrationally when they're threatened), general attitudes like this always make me uncomfortable, because they smack of self-delusion and comfort-beliefs.

Everyone has some element of this in their thinking, but it's rarely justified. As an experiment, consider the following:

Aside from your own clearly-biased personal opinion of your posts, what evidence do you have that your thoughts or beliefs are generally:

  1. Insightful
  2. Interesting
  3. Well-expressed, or
  4. Correct?

Secondly, how many people - even really stupid, boring people - do you think get up in the morning, look in the mirror and think "shit man, I'm a really windy, boring, unoriginal fucker", and then spend a lot of time expressing their opinions to others?

Most people think what they have to say is insightful, interesting, adequately-expressed and correct, or they wouldn't bother posting it.

Now, this idea is correct in that some people vote down anything which contradicts the prevailing wisdom, but people also vote down things which are wrong, stupid, ridiculous or badly-expressed.

Conversely, I know from repeated personal experience that in many communities a well-written, well-argued, non-whingey post which counters the prevailing wisdom frequently still gets a high score, sometimes because of its contrary position.

I know when I post that all I have to go on is my own opinion of my posts, which (as we've established) is almost laughably unreliable. Instead, the votes my posts receive serve as a useful barometer of how well my idea of a well-written, well-argued post matches the general opinion.

It's terribly flattering to think of oneself as a persecuted martyr, but it also usually requires a lot of egotism and a willing blindness to statistics.

To quote the great Carl Sagan:

They laughed at Galileo... but they also laughed at Bozo the clown.

Given a poster's personal opinion is biased to the point it's worthless, and given there are many more clowns in the world than misunderstood geniuses, on what basis do people claim to be downmodded for the content of their opinions, rather than for their worth, or the reliability of the arguments they use to support them?

Claiming you're being downvoted simply because your opinions run counter to the prevailing wisdom, rather than because you're self-important or wrong, requires you not only to assume you're vastly more intelligent or educated than the average person, but also that most people voting you down are doing so because of a deficiency in their psychology, rather than in yours.

When all the objective evidence you have is that a lot of other people disagree with you, it's terribly tempting to believe you're a misunderstood intellectual martyr like Galileo.

The trouble with this, of course, is that while paradigm-shifting geniuses like Galileo only come along a few times a generation, we're knee-deep in idiots, and the tide is rising.

There are literally thousands of times more idiots than geniuses, so claiming you must be a genius on the basis that you were voted down means not only that you're overwhelmingly likely to be a self-important idiot, but also that you're bad at maths.

Act appropriately.

Thursday, 18 June 2009

The myth of idiot-proofing

Idiot-proofing is a myth. Attempting to simplify an over-complex task is good, but be careful how you do it - beyond a certain point you aren't idiot-proofing, just idiot-enabling.

Three classes of tool

Tools (like programming languages, IDEs, applications and even physical tools) can be grouped into three loose categories: the idiot-prohibiting, the idiot-discouraging and the merely idiot-enabling.

Idiot-prohibiting tools are those which are effectively impossible to do anything useful with unless you've at least taken some steps towards learning the subject - tools like assembly language, or C, or Emacs. Jumping straight into assembly without any idea what you're doing and typing code to see what happens will never, ever generate you a useful program.

Perhaps "prohibiting" is too strong a word - rather than prohibiting idiots, these tools may only discourage naive users. Idiot-discouraging tools are those with which it's possible to get some results, but which leave you in no doubt as to your level of ability - tools like perl, or the W3C XHTML validator. Sure, you might be able to open a blank .html or .pl file and write a few lines of likely-looking pseudocode, but if you want your HTML to validate (or your Perl code to actually do anything useful and expected), you're soon going to be confronted with a screen-full of errors, and you're going to have to stop, study and get to grips with the details of the system. You're going to have to spend time learning how to use the tool, and along the way you'll practise the skills you need to get good at what you're doing.

The myth of idiot-proofing

Garbage in, garbage out is an axiom of computing. It's generally impossible to design a system such that it's completely impossible for someone sufficiently incompetent to screw it up - as the old saw goes: "make something idiot-proof and they'll invent a better idiot".

A natural consequence of this is that making anything completely idiot-proof is effectively impossible - any tool will always sit somewhere on the "prohibiting -> discouraging -> enabling" scale.

Furthermore, even if your tool merely makes it harder for "idiots" to screw things up, at the same time that very feature will attract more idiots to use it.

Shaper's Law of Idiot-Proofing:

Lowering the bar to prevent people falling over it increases the average ineptitude of the people trying to cross it.

Or, more simply:

Making a task easier means (on average) the people performing it will be less competent to undertake it.

Obviously I don't have hard statistics to back this up (though there's an interesting project there for a Psychology graduate...), but it often seems the proportion of people failing at the task stays roughly constant - all you're really doing is increasing the absolute number of people who just scrape over the bar... and who then go on to produce nasty, inefficient, buggy - or just plain broken - solutions.

Fixing the wrong problem

The trouble with trying to make some things "idiot-proof" is that you're solving the wrong problem.

For example, when people are first learning programming they typically spend a lot of their time concentrating on the syntax of the language - when you need to use parentheses, remembering when and when not to end lines with semi-colons, etc. These are all problems for the non-programmer, but they aren't the important ones.

The major difficulty in most programming tasks is understanding the problem in enough detail to solve it[1]. Simplifying the tools doesn't help you understand the problem any better - all it does is allow people who would ordinarily never have attempted a problem to try it.

Someone who's already a skilled developer won't benefit much from the simplicity, but many unskilled or ignorant people will be tempted by the extra simplicity to try and do things that are (realistically) completely beyond their abilities. By simplifying the wrong thing you permit more people to "succeed", but you don't increase the quality of the average solution (if anything, you drastically decrease it).

By analogy, spell-checkers improved the look of finished prose, but they didn't make anyone a better author. All they did was make it easier to look professional, and harder to tell the difference between someone who knew what they were talking about and a kook.

Postel's Law

The main difficulty of programming is the fact that - by default - most people simply don't think in enough detail to successfully solve a programming problem.

Humans are intelligent, flexible and empathic, and we share many assumptions with each other. We operate on a version of Postel's Law:

"Be conservative in what you emit, liberal in what you accept"

The trouble begins because precision requires effort, and Postel's Law means we're likely to be understood even if we don't try hard to express ourselves precisely. Most people communicate primarily with other humans, so to save time and effort we express ourselves in broad generalities - we don't need to specify every idea to ten decimal places, because the listener shares many of our assumptions and understands the "obvious" caveats. Moreover, such excessive precision is often boring to the listener - when I say "open the window" I don't have to specify "with your hand, using the handle, without damaging it and in such a way that we can close it again later" because the detail is assumed.
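Applied to software rather than conversation, the principle looks something like the following sketch (a made-up Python example, not something from the original post): a function that is liberal in the flag spellings it accepts, but conservative in what it emits - always a canonical boolean.

```python
def parse_flag(value):
    """Liberal in what we accept: many human spellings of yes/no.

    Conservative in what we emit: always a plain True or False.
    """
    truthy = {"1", "true", "yes", "y", "on"}
    falsy = {"0", "false", "no", "n", "off"}
    v = str(value).strip().lower()  # tolerate stray whitespace and case
    if v in truthy:
        return True
    if v in falsy:
        return False
    # Liberality has limits: genuinely unrecognisable input is an error.
    raise ValueError(f"unrecognised flag value: {value!r}")

print(parse_flag("YES "))  # -> True
print(parse_flag("off"))   # -> False
```

Humans apply this kind of tolerant parsing to each other constantly and unconsciously; the point of the paragraphs that follow is that computers extend no such courtesy.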

Sapir-Whorf - not a character on Star Trek

The problem is that because we communicate in generalities, we also tend to think in generalities - precision requires effort and unnecessary effort is unpleasant, so we habitually think in the vaguest way we can get away with. When I ask you to close the window I'm not imagining you sliding your chair back, getting up, navigating your way between the bean-bag and the coffee table, walking over to the window, reaching out with your hand and closing the window - my mental process goes more like "window open -> ask you -> window closed".

To be clear: there's nothing inherently wrong with thinking or communicating in generalities - it simplifies and speeds our thought processes. Problems only arise when you try to communicate with something which doesn't have that shared library of context and assumptions - something like a computer. Suddenly, when that safety-net of shared experience is removed - and our communication is parsed exclusively on its own merits - we find that a lifetime of dealing in vague generalities has diminished our ability to deal with specifics.

For example, consider probably the simplest programming-style problem you'll ever encounter:

You want to make a fence twenty metres long. You have ten wooden boards, each two metres long, and you're going to the hardware shop - how many fence-posts do you need?

No programmer worth his salt should get this wrong, but most normal people will have to stop and think carefully before answering[2]. In fact this type of error is so common to human thinking that it even has its own name - the fencepost (or off-by-one) error.
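The arithmetic can be written out in a few lines of Python (a hypothetical helper, assuming a straight fence with a post at each end and boards laid end to end):

```python
def fence_posts(fence_length_m, board_length_m):
    """Posts needed for a straight fence with a post at each end."""
    sections = fence_length_m // board_length_m  # number of boards: 20 / 2 = 10
    return sections + 1  # the "+ 1" is exactly what a fencepost error forgets

print(fence_posts(20, 2))  # -> 11, not the intuitive 10
```

Ten boards means ten gaps to bridge, but eleven posts to bridge them between - the boundary case our generality-loving brains silently round away.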

Solve the right problem

Any task involves overcoming obstacles.

Obstacles which are incidental to the task (having to learn HTML in order to publish your writing, say) are safe to ameliorate. These obstacles are mere by-products of immature or inadequate technology, unrelated to the actual task. You can remove them without affecting the nature of the task at hand.

Obstacles which are an intrinsic part of the task are risky or even counter-productive to remove. Removing these obstacles doesn't make the task any less difficult, but it removes the experience of difficulty.

Counter-intuitively, these obstacles can actually be a good thing - when you don't know enough to judge directly, the difficulties you experience in solving a problem serve as a useful first-order approximation of your ability at the task[3].

Despite years of trying, making programming languages look more like English hasn't helped people become better programmers[4].

This is because learning the syntax of a programming language isn't an important part of learning to program. Issues like task-decomposition, system architecture and correct control flow are massively more important (and difficult) than remembering "if x then y" is expressed as "if(x) { y; }".

Making the syntax more familiar makes it easier to remember and reduces compile-time errors - making the task seem easier to a naive user - but it does nothing to tackle the real difficulties of programming - inadequately-understood problems or imprecisely-specified solutions.

The trouble is that the "worth" of a program is in the power and flexibility of its design, the functionality it offers and (inversely) the number of bugs it has, not in how many compiler errors it generates the first time it's compiled.

However, to a naive beginner beauty and solidity of design are effectively invisible, whereas compiler-errors are obvious and easy to count. To a beginner a program that's poorly designed but compiles first time will seem "better" than a beautifully-designed program with a couple of trivial syntax errors.

Thus to the beginner a language with a familiar syntax appears to make the entire task easier - not because it is easier, but because they can't accurately assess the true difficulty of the task. Moreover, by simplifying the syntax we've also taken away the one indicator of difficulty they will understand.

If a user's experiencing frustration because their fire-alarm keeps going off, the solution is for the user to learn to put out fires, not for the manufacturer to make quieter fire alarms.

This false appearance of simplicity begets over-confidence, directly working against the scepticism about the current solution which is an essential part of the improvement process[5].

Giving a short person a stepladder is a good thing. Giving a toddler powertools isn't.


Footnotes

[1] This is why talented programmers will so often use exploratory programming (or, more formally, RAD) in preference to designing the entire system first on paper - because although you might have some idea how to tackle a problem, you often don't really understand it fully until you've already tried to solve it. This is also why many developers prefer to work in dynamic scripting languages like Perl or Python rather than more static languages like C or Java - scripting languages are inherently more flexible, allowing you to change how a piece of code works more easily. This means your code can evolve and mutate as you begin to understand more about the problem, instead of limiting your options and locking you into what you now know is the wrong (or at least sub-optimal) approach.

[2] Obviously, the answer's not "ten".

[3] When I'm learning a new language I know I'm not very good at it, because I have to keep stopping to look up language syntax or the meanings of various idioms. As I improve I know I'm improving, because I spend less time wrestling with the syntax and more time wrestling with the design and task-decomposition. Eventually I don't even notice the syntax any more - all I see is blonde, brunette, redhead... ;-)

[4] Regardless of your feelings about the languages themselves, it's a truism that for years many of the most skilled hackers have preferred to code in languages like Lisp or Perl (or these days, Python or Ruby), which look little or nothing like English. Conversely, it's a rare developer who would disagree that some of the worst code they've seen was written in VB, BASIC or PHP. Hmm.

[5] From Being Popular by Paul Graham, part 10 - Redesign:

To write good software you must simultaneously keep two opposing ideas in your head. You need the young hacker's naive faith in his abilities, and at the same time the veteran's scepticism... The trick is to realize that there's no real contradiction here... You have to be optimistic about the possibility of solving the problem, but sceptical about the value of whatever solution you've got so far. People who do good work often think that whatever they're working on is no good. Others see what they've done and are full of wonder, but the creator is full of worry. This pattern is no coincidence: it is the worry that made the work good.

Friday, 1 May 2009

The incompetent leading the credulous - your mildly disconcerting thought for the day

It's well-known to psychologists, public speakers, politicians and con-men[1] that in general the more confident an individual appears, the more persuasive they are to other people. This effect holds regardless of the veracity or provability of their assertions. In other words, confidently and assertively talking horseshit will make you more persuasive than simply talking horseshit on its own, regardless of the fact it's horseshit.

In other news, the Dunning-Kruger effect demonstrates that - in general - the more incompetent or ignorant someone is of a subject, the more they will over-estimate their own expertise or understanding of it. Equally, the more experienced and competent a person becomes in a subject, the more they will tend to underestimate their true level of knowledge or expertise, downplaying their understanding and qualifying their statements. In effect, when people try to assess their own ability in a subject, their confidence in their expertise varies inversely with their actual expertise.

The net effect of this is that - again, in general - ignorant or incompetent people are subconsciously predisposed to be more confident in their opinions, and all people are subconsciously predisposed to find confident people persuasive.

In a nutshell, all things being equal, people are instinctively predisposed to find ignorant or incompetent people disproportionately persuasive and trustworthy compared to more competent, more experienced experts.

Distressingly it appears that in the kingdom of the blind the one-eyed man is not king. Instead, in the kingdom of the blind the true king is the one blind guy who's sufficiently incompetent or delusional that he honestly believes he can still see.

This has been your mildly disconcerting thought for the day.


Footnotes

[1] The author acknowledges that there may be some overlap in these categories.