
Wednesday, 26 May 2010

An attempt at a simple, two-rule morality

I've been thinking about morality recently. Plenty of people claim to offer moral systems, but as a modern, relatively enlightened individual most of them seem to include relatively arbitrary injunctions, and as a geek most of them seem both over-complicated and over-specified, and yet still riddled with unhandled edge-cases.

Take the Ten Commandments, for example:

  1. I am the Lord your God
  2. You shall have no other gods before me/You shall not make for yourself an idol
  3. You shall not make wrongful use of the name of your God
  4. Remember the Sabbath and keep it holy
  5. Honor your father and mother
  6. You shall not murder
  7. You shall not commit adultery
  8. You shall not steal
  9. You shall not bear false witness against your neighbor
  10. You shall not covet your neighbour's wife/You shall not covet anything that belongs to your neighbour

As a modern weak atheist there seem to be some obvious errors or redundancies there:

  • The first two injunctions presuppose belief in a supernatural entity, so as someone who finds no rational reason to believe in one, they seem suspect or redundant. First, they could be better summarised as "Do not believe in any gods other than me". Second, unless God can himself demonstrate his moral authority (instead of, as most religions do, simply assuming it), they seem more concerned with promoting and propagating one religion than with laying down universal moral rules to live by.
  • The latter half of the Second Commandment seems to contradict the First Commandment and the first half of the Second. As a non-Christian, I would define an idol as an entity which is worshipped blindly and absolutely. This definitely includes the Christian God. Alternatively, one can take the assumed definition in context as "anything other than the Christian God"; but then (as above) it amounts to an empty re-iteration of the first commandment-and-a-half, which themselves rely on the undemonstrated assumption that the Christian God is an absolute moral authority.
  • The third again seems unnecessary - why should a system of morality define it as immoral to take the name of its creator in vain? A system of morality should stand up on its own to reasonable argument, and defining veneration of its creator as a moral requirement frankly sounds far too much like begging the question.
  • The fourth is simply redundant - why should a moral system concern itself with keeping a day of the week specifically marked out? Admittedly there may be some social benefits to setting aside a whole day of the week for adherents to remember and reflect upon their moral choices, but I don't see why such an injunction is morally good, rather than simply a good idea.
  • The fifth is again a good idea, but too over-simplified and prone to edge-cases. Sure, honouring your parents is good for social stability, but what if your father is a deadbeat dad and your mother a shiftless crackhead? This commandment smacks entirely too much of the kind of unconditional, assumed authority that typifies the Ten Commandments, and is far too incomplete to serve as a good rule. Moreover, why should parents get special treatment? Why not simply honour anyone who is wiser, more intelligent or more experienced than you?
  • The sixth through ninth are pretty good, prohibiting murder, adultery, theft and lying. However, you have to be careful with definitions - for example, distinguishing between "murder" and "killing" (which may include self-defence or defence of a third party). Moreover, I can't help wondering if these are overly specific, leaving out whole classes of immoral behaviour not explicitly prohibited. Take "dropping litter in public", for example - most of us would agree that it's a moral issue, albeit a comparatively minor one, and yet it's not covered by these four injunctions.
  • Leaving aside the implication that a wife is a possession to be owned, the tenth is again pretty good - I've always understood this as an injunction to try not to feel jealousy (because it's frequently a sterile, unproductive emotion), but rather to concentrate on bettering your own life and resist the temptation to waste it wishing you had someone else's.

Clearly, then, there's a lot of fat that could be cut, and a lot of edge-cases to handle.

Instead, I present my best stab at a moral system. It's only two injunctions:

  1. Do unto others as you would have them do unto you, at the highest level of abstraction possible.
  2. Always seek to minimise harm in the long run.

There are a couple of subtle but key points here.

"Do unto others as you would have them do unto you" is a pretty good moral system on its own, but the addition of "at the highest level of abstraction possible" removes edge-cases and makes it a lot more specific and defensible.

For example, it would now prohibit a masochist from excusing undesired violence against others on the basis that he likes to receive violence himself. Instead, he is now constrained to consider their wishes when deciding whether it's acceptable to hurt them, rather than simply the shallow fact of his actions.

The second injunction is a somewhat Utilitarian attempt to minimise the total amount of harm in the world (where we define harm in the usual way, as "physical or mental damage").

This prohibits short-termism in decision-making (which often merely saves up problems or harm for later, possibly even increasing the total amount of harm).

It also allows for harm to be caused where necessary, but only where such harm is in the service of preventing greater harm - this would permit otherwise difficult moral choices, such as the hypothetical "killing a single child to prevent a nuclear weapon going off in a major city".

More trivially, it also permits things like "contradicting someone you believe is incorrect", but when considered in conjunction with the first, only if you're happy being contradicted or corrected by others in turn. It also effectively prohibits you from debating others' positions unless you're equally willing to give their arguments due consideration.

So that's it. The first injunction prohibits most non-victimless crimes, because we would all rather not be the victim of them, and the second permits harm to come to others, but only if we can reasonably assert that it will prevent greater harm elsewhere, or in the future.

With a little reasoning, as far as I can tell, every action or injunction we can reasonably justify as "moral" seems to be derivable from these two principles.

Saturday, 15 May 2010

Geeks can be hard to work with

Geeks and especially programmers often have a strong belief in "doing things right". People have remarked on this tendency, and given it a variety of negative characterisations: obsessive-compulsive, irrational, making a moral issue out of a pragmatic question.

As a geek and a programmer, I put my hands up to this stereotype - it doesn't affect us all, but enough of us (myself included) have some degree of it that I don't think it's inaccurate. However, I believe that far from being a drawback, it's arguably a vital component of a gifted developer.

I think this urge toward technical correctness comes from three main sources:

Obsessive-compulsiveness

Good programmers spend a lot of their time thinking in details - they have to, to be able to write reliable code. Programmers who gloss over or fudge details write buggy code with unhandled and unknown edge-cases.

Importantly, this code usually "sort-of" works most of the time (or at least, the obviously broken bits quickly get patched), and then occasionally fails spectacularly and catastrophically... at which point it's also usually blamed on the programmer who wrote it, rather than the manager who specified an over-complex requirement, or who provided an unreasonably small amount of resources (time, money, manpower) to achieve it.

Being a programmer is 50% artistic and 50% autistic, so it's hardly surprising that programmers can be a bit Aspergers-y about their code, especially when they're likely to carry the can for any catastrophic failures caused by it.

Artistic merit

Any programmer who isn't a journeyman or hack does the job because they like to create things, and as you get better, mere creation stops being satisfying - instead you want to create things of beauty.

It's been noted before that (many or most good) programmers are makers - we need to feel that what we're creating reflects our abilities - something we'd be happy to put our name to - and we like delivering good, reliable, flexible, well-designed systems.

Banging out code or design you know is buggy, unreliable, inflexible or has unhandled edge-cases is simply not rewarding in the slightest. It's like hiring an artist to paint a wall blue, or hiring a sprinter to walk slowly to the shops for you.

To be fair, this is a very wishy-washy, non-business-oriented motivation, but I'd go so far as to say pretty much all the best programmers suffer from it - it seems to go with the territory, and not just in programmers, either: chefs are notoriously histrionic, artists are notoriously high-strung about their work, and musicians are notoriously unstable or prone to mental illness. I don't think you can have excellence without pride in the work, and I don't think you can have pride in work without disillusionment and frustration when you're forced to churn out work you know is crap.

Pragmatism

I've been privately or professionally involved in software development for around 15 years. I've worked in a variety of companies, in a variety of languages and systems, and with a variety of different management styles.

Across all companies, management structures, languages, technologies and systems, there are only two things I'm utterly certain of:

  1. Given the choice between a longer/more expensive design and a simpler, less flexible one, management will almost always say "design it to the current requirements, because we'll never need <hypothetical scenario X>, and it would be a waste of time to build a system which takes it into account".
  2. From the date of hearing this, within six months to a year they will discover a pressing, immediate, unavoidable and business-critical need for scenario X... and if they don't react well to being told that this will now take longer to accommodate, they react even worse to being gently informed that this is the result of the decision they took several months previously - at which point you (or your predecessor) warned them that this would be the consequence if they ever encountered scenario X.
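The trade-off above can be sketched in a few lines of Python. This is a hypothetical illustration (the function names and report scenario are invented, not from any real project): the "rigid" version is built to the current requirement only, while the flexible version costs one extra parameter up front and absorbs the inevitable scenario X without a rewrite.

```python
def export_report_rigid(rows):
    """Built strictly to today's requirement: CSV output, full stop."""
    return "\n".join(",".join(str(v) for v in row) for row in rows)


# Six months later: "we urgently need tab-separated output for the new client."
# The rigid version must be rewritten; the flexible one just takes an argument.

def export_report(rows, separator=","):
    """Same behaviour by default, but the separator is no longer baked in."""
    return "\n".join(separator.join(str(v) for v in row) for row in rows)


rows = [("alice", 3), ("bob", 5)]
print(export_report(rows))        # same CSV output as the rigid version
print(export_report(rows, "\t"))  # scenario X handled with no code change
```

The point isn't that every function needs every parameter - it's that the marginal cost of an extension point at design time is usually tiny compared with the cost of retrofitting one under a business-critical deadline.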

Personally I try very hard to avoid taking dogmatic positions on anything, but in the years I've spent programming (especially professionally, when you're most likely to have to compromise on the design or implementation to hit deadlines or budgets), I - literally, with no exaggeration - don't believe I've ever heard "we'll never need that" and then not had to implement it (whatever it was) within a few months of that date.

This only has to happen a few times (especially when you're responsible for cleaning up the mess) before you become firmly convinced that creating a system that's any less flexible than you can possibly manage is ultimately tantamount to just fucking yourself square in the ass.

There are reasons...

So yes - there are a number of reasons why programmers are obsessed with Doing The Right Thing, and why we tend to react with aesthetic revulsion to the idea of fudging designs, or hard-coding things for convenience.

Some of them are unfortunate but understood and accommodated in other disciplines - try commissioning an artist to produce art for your offices and then banning him from using anything except potato-prints, or telling an architect (for deadline/budget reasons) to design you a building they know damn well could fall down at any moment.

Others are actually vital aspects of being a programmer that you can't easily switch on and off - athletes have to be fit, programmers have to be anally-retentive and precise - and that you really wouldn't want to do without in your dev team.

Lastly, if you're a non-technical co-worker or manager, programmers often have a lot more experience working on software projects than you do, and have learned hard-won lessons you aren't even aware are available to learn... particularly lessons where they have to pick up the pieces from their own (or other people's) mistakes.

So yeah, geeks can be hard to work with. But then the guy who knows where the landmines are buried can be awfully prescriptive about where you put your feet as you navigate through a minefield, too. And unless you want your feet blown off, it's often worth listening to them.

Thursday, 29 April 2010

The feeling you're about to get smarter

As you might have noticed if you read this blog, I'm quite an aggressive rationalist - I'm big on introspection, and strive to be as rational, consistent and justified in my beliefs as possible. If someone demonstrates to me conclusively that I'm wrong I'll generally (at least: I'll try to) reverse my position on a dime.

Because I try not to invest identity in my opinions it's usually not too difficult to change a belief or position when new information or reasoning comes along. However, even I'll admit that despite my efforts in this area, It's Not Fun Being Wrong.

In particular, everyone hates that point in a discussion that most rational people experience occasionally, where you discover your argument has a gaping hole in it. You know the one - you get that sick, empty, vertiginous, see-sawing feeling where it feels like you've inched yourself out over a long drop because you trusted the plank you were standing on, only to notice now that it was apparently made of cardboard the whole time.

However, I've realised recently that, while that feeling is exciting and scary, it should be savoured and sought after, because it's the feeling that you're about to get smarter.

It may feel unpleasant, but that doesn't mean it should be unpleasant - that's much more down to what we associate with the feeling than the feeling itself. For example, a muscular ache is rarely considered pleasant... and yet after a good workout we can even enjoy it. This is because - while the feeling is the same - when we've worked out hard we often feel virtuous, and good about ourselves. Although our muscles ache, every time we notice it it's a reminder that we did something we think of as good, and that we're slightly fitter or healthier now (or can just eat an extra cream cake without feeling guilty) as a result.

Equally, it's usually unpleasant to have our insides jangled about, or to feel like we're falling, or to be out of control... and yet many people love roller-coasters. It's unpleasant to be frightened... but some people will watch horror movies for fun.

In every case, the important difference is that although the sensation might be unpleasant on its own, we recognise in each case that we're getting something greater out of it, that makes the uncomfortable sensation worth-while - health and fitness, or novelty, or entertainment and feeling more secure the rest of the time. Indeed, when we associate it strongly enough, you can find yourself searching out these unpleasant sensations, and relishing the discomfort because of what it signifies.

"Being wrong" is one such unpleasant sensation, but as pointed out above, it's actually the feeling that you've just become smarter. This is unambiguously a good thing... and yet we generally don't realise or acknowledge to ourselves that that's what's happening, so people often simply fixate on (or even actively, instinctively try to avoid) the unpleasant sensation.

What this means, then, is that as humans we very deeply, subconsciously, instinctively try to stop ourselves from becoming smarter, and we don't even realise we're doing it. Whatever we consciously tell ourselves, subconsciously we would rather feel good about ourselves and be wrong than be correct or rational in our opinions.

As you're reading this blog, I'll assume you're the kind of person who would rather be right (even if it's uncomfortable) than deluded but confident. This, then, is clearly a problem.

What to do about it?

The good news is that - because it's a subconscious association - it's malleable. You can change and modify (even, as we've seen, completely reverse) these associations with a little effort.

Try the following: next time you realise (or someone proves) you're wrong about something, stop and consciously acknowledge it to yourself. Try to hold and really feel that sensation of being wrong. Try to consciously acknowledge and analyse the emotions you're feeling - are you embarrassed? Ashamed? Annoyed? At the other person or yourself? Do you suddenly feel less sure of your place in the world, or your opinions on certain subjects? Can you feel that bruise on your ego?

Be brutally honest with yourself - if it helps, if you don't feel any of the above (or something similar), you're probably not human.

Now that you're fully engaged and aware of how you feel, try to modify it. Acknowledge that you're feeling bad, but remind yourself it's only because your ego is wounded. Realise that the only thing making you feel bad is egotism, but that even that is both instinctive (i.e. uncontrollable and not your fault) and a normal part of being human.

Remind yourself that you want to be smart and right and rational about things, and remind yourself that what you're feeling is the feeling of getting smarter, that that's unambiguously good. Try to explicitly relate the uncomfortable sensation to the positive feelings you have about being smart, or correct, or rational in your beliefs. Nice, isn't it? So, like exercise, or taking care of paperwork, that feeling you initially shied away from or avoided is actually a good thing, even if it's briefly uncomfortable in the short term.

Once you're smarter or more right about a subject, you're generally smarter or more right about it for the rest of your life. Isn't that worth a brief, temporary, silly little sting?

Now you've got the hang of it, go out and try to find things you're wrong about. Read up on subjects that interest you. Challenge your beliefs and attitudes by seeking out dissenting opinions and viewpoints, and see if you can prove your existing opinions wrong. Treat it like an intellectual game of conkers - every time you're proven wrong you get a little bit smarter, and every time you win a debate you can reward yourself by being a little more confident in that opinion or line of reasoning.

Test your ideas by subjecting them to challenges, discard the ones which fail and adopt the ones which succeed. And remember - the whole time you're doing it, you're becoming smarter, more educated and more rational.

Wednesday, 30 December 2009

Say it with me: dumb ideas are dumb

There is a prevalent and dangerous meme rife in society today, and though some people may find the following offensive, judgemental or unfashionable, I believe it needs to be said. Your forbearance is therefore appreciated while I do so. ;-)

First, some axioms. These should be unarguable:

  • Everyone is entitled to their own opinion.
  • Not everyone's opinion is as valid or useful, or has as much merit, as everyone else's in every single situation.
  • Nobody is entitled to their own facts.
  • You have freedom of speech, thought and association. You do not have freedom from criticism, freedom from offence or freedom from correction.

The problem arose when the first axiom (a healthy recognition that other people have different opinions) mutated into the beliefs that everyone's opinion is equally valid, and that contradicting someone in error is impolite, arrogant or somehow an infringement of their freedoms.

One look in some Lit Crit classrooms will show you what happens when you aren't allowed to contradict or dispute someone else's opinions, and one look in a politicised fundamentalist church will show you what happens when you believe you're allowed your own facts, instead of just your own opinions.

And while people might enjoy studying Lit Crit or subscribe to fundamentalist religions, if they've got any sense they'll notice that people acting in either of these two roles have rarely done anything tangible to better the overall lot of their fellow man... unlike all those rude, elitist, judgemental, snobby scientists, engineers, geeks and other educated types (who instinctively recognise that ideas vary in quality and efficacy, and have therefore been quietly and industriously changing the world for the better for the last few hundred years).

The Western world (ably led, as ever, by America) is learning the hard way what happens when you confuse recognising the existence of everyone's opinions with granting equal worth to everyone's opinions. Moreover, while we mouth thought-terminating clichés like "everyone deserves an equal say", we routinely disregard them in practice. Who seriously consults their toddler in the back seat on how to get home when lost in the car? Who leaves their neurosurgeon's office and seeks a second opinion from their local garage mechanic?

It's ok to judge and disregard things which demonstrably have no merit. We commonly all agree that "all people" deserve some sort of minimum baseline freedoms, protection, treatment and standard of living. And yet we still deny some of those benefits to those people who we have judged and found undeserving of them or actively dangerous (imprisoned criminals, for example).

We try to pretend that all ideas are equal, but it's not true - some ideas are brilliant, explanatory and useful, but some are stupid, dangerous or self-destructive. And refusing to judge them and pretending those ideas are harmless, valid or beneficial has much the same effect on society in the long term as refusing to judge dangerous people would have on society - internal chaos and developmental stagnation.

We don't have to ban stupid ideas or opinions, just as we don't have to kill criminals. Instead, we isolate criminals in jails so they can't damage society any further.

We can do the same with ideas, simply by agreeing they're dumb.

Refusing to publicly label a dumb idea "dumb" for fear of offending someone is - long term - as bad for our culture and society as refusing to lock away criminals "because their families might be upset".

Although it's unpopular to point out, sometimes people and ideas need to be judged for the good of society, even if it does end up upsetting or offending some people.

For the last decade or two - beginning around the advent of political correctness, though I suspect that was a symptom rather than a cause - we've done the intellectual equivalent of systematically dismantling the judicial system and all the courts and prisons in society. Now - in the same way if we dismantled all the prisons we'd be overrun with criminals - we're overrun with stupid ideas, unqualified but strongly-expressed opinions and people who act as if they can choose their own facts.

The only way you can help redress this situation is by not being afraid to offend people - if someone says something stupid, call them on it. Politely but firmly correct people when they make erroneous claims. Question badly-thought-out ideas, and don't let people get away with hand-waving or reasoning based on obvious flaws or known logical fallacies. Yes they'll get annoyed, and yes they'll choose to take offence, but we don't free criminals because they or their families are "offended" at their having to stay in prison. They are there - largely - because they deserved and invited it, and because the world is better with them there. Likewise, dumb ideas deserve and invite correction, and the world would be a better place for everybody if more people judged and criticised them when they came across them.

Sometimes uncomfortable things do need to happen to people, and certainly if they invite them. There's no advancement without the possibility of failure, and removing the opportunity for failure removes the opportunity to develop. If no-one ever tells you you're wrong, how will you ever learn?

But most important of all, while judging people is unfashionable, can be dangerous and should largely be left to trained professionals, don't ever be afraid to judge ideas.

Wednesday, 29 July 2009

Your opinion is worthless

This is a slightly self-indulgent post, relating to website and forum discussions, rather than a generally-applicable epiphanette. Nevertheless, I think it's an important point, and one which far too few people understand...

I find when browsing internet discussion forums that, when someone with a controversial or non-mainstream opinion posts and gets voted down, I frequently run across comments similar to the following:

I find I get downmodded a lot because I'm a person willing to speak my mind. That makes a lot of the insecure people here (of which there are many!) uncomfortable, and to try and counter that they downmod my posts.

Straight to it: although sometimes the commenter has a point (people get very attached to their ideas, and can react irrationally when they're threatened), general attitudes like this always make me uncomfortable, because they smack of self-delusion and comfort-beliefs.

Everyone has some element of this in their thinking, but it's rarely justified. As an experiment, consider the following:

Aside from your own clearly-biased personal opinion of your posts, what evidence do you have that your thoughts or beliefs are generally:

  1. Insightful
  2. Interesting
  3. Well-expressed, or
  4. Correct?

Secondly, how many people - even really stupid, boring people - do you think get up in the morning, look in the mirror and think "shit man, I'm a really windy, boring, unoriginal fucker", and then spend a lot of time expressing their opinions to others?

Most people think what they have to say is insightful, interesting, adequately-expressed and correct, or they wouldn't bother posting it.

Now, this idea is correct in that some people vote down anything which contradicts the prevailing wisdom, but people also vote down things which are wrong, stupid, ridiculous or badly-expressed.

Conversely, I know from repeated personal experience that in many communities a well-written, well-argued, non-whingey post which counters the prevailing wisdom frequently still gets a high score, sometimes because of its contrary position.

I know when I post all I have to go on is my own opinion of my posts, which (as we've established) is almost laughably unreliable. Instead, the votes my posts get serve as a useful barometer of how my opinion of what counts as a well-written, well-argued post compares with the general opinion.

It's terribly flattering to think of oneself as a persecuted martyr, but it also usually requires a lot of egotism and a willing blindness to statistics.

To quote the great Carl Sagan:

They laughed at Galileo... but they also laughed at Bozo the clown.

Given a poster's personal opinion is biased to the point it's worthless, and given there are many more clowns in the world than misunderstood geniuses, on what basis do people claim to be downmodded for the content of their opinions, rather than for their worth, or the reliability of the arguments they use to support them?

Claiming you're being downvoted simply because your opinions run counter to the prevailing wisdom, rather than simply because you're self-important or wrong requires you to not only assume you're vastly more intelligent or educated than the average person, but also that most people voting you down are doing so because of a deficiency in their psychology, rather than your own.

When all the objective evidence you have is that a lot of other people disagree with you, it's terribly tempting to believe you're a misunderstood intellectual martyr like Galileo.

The trouble with this, of course, is that while paradigm-shifting geniuses like Galileo only come along a few times a generation, we're knee-deep in idiots, and the tide is rising.

There are literally thousands of times more idiots than geniuses, so claiming you must be a genius on the basis you were voted down doesn't mean you're a genius - it means not only are you overwhelmingly likely to be a self-important idiot, but you're also bad at maths.
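The base-rate point above can be made concrete with a toy Bayesian calculation. All the numbers here are invented purely for illustration (one genius per ten thousand people, and assumed downvote rates): even if geniuses are far more likely to be downvoted than idiots, the sheer base rate of idiots means a downvote is still overwhelming evidence that you're not a genius.

```python
# Invented, illustrative priors: geniuses are rare, and even if they're
# downvoted more often than idiots, the posterior barely moves.
p_genius = 1 / 10_000
p_idiot = 1 - p_genius

p_down_given_genius = 0.9   # assume geniuses are almost always downvoted
p_down_given_idiot = 0.5    # assume idiots are downvoted only half the time

# Bayes' theorem: P(genius | downvoted)
p_down = p_down_given_genius * p_genius + p_down_given_idiot * p_idiot
p_genius_given_down = p_down_given_genius * p_genius / p_down

print(f"{p_genius_given_down:.5f}")  # → 0.00018
```

Even under assumptions deliberately skewed in the downvoted poster's favour, the probability they're a misunderstood genius is still less than one in five thousand.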

Act appropriately.

Friday, 1 May 2009

The incompetent leading the credulous - your mildly disconcerting thought for the day

It's well-known to psychologists, public speakers, politicians and con-men[1] that in general the more confident an individual appears, the more persuasive they are to other people. This effect holds regardless of the veracity or provability of their assertions. In other words, confidently and assertively talking horseshit will make you more persuasive than simply talking horseshit on its own, regardless of the fact it's horseshit.

In other news, the Dunning-Kruger effect demonstrates that - in general - the more incompetent or ignorant someone is of a subject, the more they will over-estimate their own expertise or understanding of it. Equally, the more experienced and competent a person becomes in a subject, the more they will begin to underestimate their true level of knowledge or expertise, downplaying their understanding and qualifying their statements. In effect, when assessing your own ability in a subject, confidence in your self-assessment tends to vary inversely with your actual expertise.

The net effect of this is that - again, in general - ignorant or incompetent people are subconsciously predisposed to be more confident in their opinions, and all people are subconsciously predisposed to find confident people persuasive.

In a nutshell, all things being equal, people are instinctively predisposed to find ignorant or incompetent people disproportionately persuasive and trustworthy compared to more competent, more experienced experts.

Distressingly it appears that in the kingdom of the blind the one-eyed man is not king. Instead, in the kingdom of the blind the true king is the one blind guy who's sufficiently incompetent or delusional that he honestly believes he can still see.

This has been your mildly disconcerting thought for the day.


Footnotes

[1] The author acknowledges that there may be some overlap in these categories.

Thursday, 1 January 2009

Engines of reason

Initially we acted as individuals - what we understood of reality was what we experienced and determined for ourselves. There was no understanding or appreciation of the world outside our direct experience.

Later we developed language, and what we understood of reality was formed from our own perceptions and conclusions, influenced by the perceptions and conclusions of our family and social group (family, clan, village, etc).

Next we developed the printing press, and mass media. These allowed centralised governments and organisations to accumulate and weigh information and experiences and broadcast them to the populace. We still held our own counsel on personal or local matters, but since we rarely (if ever) knew anyone who had experienced such events outside our local region, we largely received all our knowledge and understanding of the outside world from centralised authority - governments, news media organisations, etc.

Finally, with the advent of the web we're enabling anyone to publish their personal experiences, in a way that anyone else in the world can then receive. No longer are we simply denied access to information, nor must we receive distilled, refined, potentially biased information from one or a few sources. Now we're capable, in theory, of hearing every point of view from every participant in an event, untainted by anything but their personal, arbitrary biases.

We are still receiving information on a global scale, but for the first time it's potentially all primary evidence, untainted and unfiltered by a single agenda or point of view.

The trouble with this is that brains, personalities, cultures and institutions long-accustomed to received wisdom now have to compare, contrast, weigh and discern the trustworthiness of multiple conflicting points of view for themselves. For the first time since our pre-linguistic ancestors you - and only you - are primarily responsible for determining truth from falsehood, and for the first time in history you have to do so on a global scale, involving events of which you have no direct experience.

To be clear: this is hard. Many people instinctively reject the terrifying uncertainty and extra effort, instead abdicating their personal responsibility and fleeing to any source of comforting certainty they can find. This explains why even in these supposedly scientific and rational times people still subscribe to superstitions or religions, or simplistic, fundamentalist philosophies, or blindly consume and believe opinionated but provably-biased sources like political leaders, charismatic thinkers or biased news organisations.

So it's a double-edged sword - for the first time in history we have access to primary evidence about events in the world, rather than receiving conclusions from a central source along with only whatever secondary or tertiary evidence supports them. However, in gaining that access we've also discovered that the channels we've relied upon until now are biased, agenda-laden and incomplete.

Obviously in an ideal world everyone could be relied upon to train their bullshit-filters and research and determine the truth for themselves. However, given the newness of the current situation we can't rely upon this any time soon. Likewise, given both the sheer volume of information and humanity's propensity for laziness and satisficing, we'll likely never be able to rely on the majority of people doing this for every single issue they hold an opinion on.

So what's a species to do? We've turned on the firehose of knowledge, and it's shown that the traditional channels of received wisdom are unreliable, but many people find it impractically hard to drink from it.

There are three choices here:

  • We could allow the majority of people to reject their responsibilities and abdicate their reasoning processes to others of unknown reliability... though this is the kind of thing that leads to fundamentalism, anti-intellectualism and cultural and scientific stagnation.
  • Alternatively, we could encourage people to distrust authority and try to decide for themselves... though even if we succeed, if the effort of self-determination is too great we risk merely leaving people floundering in a morass of equally-likely-looking alternatives (I believe this is a primary cause of baseless, unproven but trendy philosophies like excessive cultural relativism - if you're lost in a sea of indistinguishable alternatives, it's flattering and tempting to believe there is no difference in their correctness).
  • Lastly, we can make an effort to formalise and simplify the process of determining reliability and truth - striving to create democratic, transparent mechanisms where objective truth is determined and rewarded, and falsehood or erroneous information is discarded... lowering the bar to individual decision-making, but avoiding unilateral assumption of authority by a single (or small group of) agendas.

Stupid as it may seem, I believe this is the ultimate destination towards which sites like reddit or Wikipedia are slowly converging - people post evidence, assertions or facts, those facts are discussed, weighed and put in context, and (so the theory goes) accuracy and factual truth are ascertained by exposing the process to a large enough consensus.
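At its crudest, the mechanism described above is just vote-weighted ranking. The following sketch is a hypothetical illustration only - no site's actual algorithm, and the names (`Assertion`, `rank`) are invented for the example - but it captures the basic idea of surfacing the claims a community judges most accurate:

```python
from dataclasses import dataclass

@dataclass
class Assertion:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    def score(self) -> float:
        # Naive consensus score: the fraction of voters who judged
        # the assertion accurate. Real sites use more robust
        # estimators to avoid small-sample distortion.
        total = self.upvotes + self.downvotes
        return self.upvotes / total if total else 0.0

def rank(assertions):
    # Surface the assertions with the strongest consensus first.
    return sorted(assertions, key=lambda a: a.score(), reverse=True)

claims = [
    Assertion("Eyewitness account A", upvotes=40, downvotes=10),
    Assertion("Official statement B", upvotes=5, downvotes=20),
]
print(rank(claims)[0].text)  # the claim with the strongest consensus
```

Of course, this naive version inherits exactly the failure modes discussed below: it measures popularity, not truth, unless the voting community is large, diverse and honest.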

It doesn't always work - many of these early attempts suffer from a poor mechanism, or attract a community who vote based on their prejudices rather than rational argument, or end up balkanised in one or more isolated areas of parochial groupthink.

However, the first heavier-than-air aircraft didn't work too well either, and here we are a century later flying around the planet faster than the sun. As a species we're still only a few years into what may be a decades- or centuries-long process - one which could change the very foundations of (and mechanism by which we determine) what we understand as factual reality.

People love to rag on social news sites, discussion forums and sites like Wikipedia for what amounts to failing to have already achieved perfection. I prefer to salute them for what they are - hesitant, often blind, stumbling baby-steps towards solving a problem many people don't yet even realise exists.