Tolstoy: the forgotten philosopher
By Jaza, 2020-10-11
https://greenash.net.au/thoughts/2020/10/tolstoy-the-forgotten-philosopher/

I recently finished reading the classic novel War and Peace. The 19th-century epic is considered the masterpiece of Leo Tolstoy, and I must say it took me by surprise. In particular, I wasn't expecting its second epilogue, which is a distinct work of its own (and one that arguably doesn't belong in a novel): a philosophical essay discussing the question of "free will vs necessity". I know that the second epilogue isn't to everyone's taste, but personally I feel that it's a real gem.

I was also surprised to learn, after doing a modest bit of research, that Tolstoy is seldom mentioned amongst any of the prominent figures in philosophy or metaphysics of the past several centuries. The only articles that even deign to label Tolstoy a philosopher are ones that are actually more concerned with Tolstoy as a cult-inspirer, as a pacifist, and as an anarchist.

So, while history has been just and generous in venerating Tolstoy as a novelist, I feel that his contribution to the field of philosophy has gone unacknowledged. This is no doubt in part because Tolstoy didn't consider himself a philosopher, and because he didn't pen any purely philosophical works (published separately from novels and other works), and because he himself criticised the value of such works. Nevertheless, I feel warranted in asking: is Tolstoy a forgotten philosopher?

Tolstoy statue in British Columbia
Image source: Waymarking

Free will in War and Peace

The concept of free will that Tolstoy articulates in War and Peace (particularly in the second epilogue), in a nutshell, is that there are two forces that influence every decision at every moment of a person's life. The first, free will, is what resides within a person's mind (and/or soul), and is what drives him/her to act per his/her wishes. The second, necessity, is everything that resides external to a person's mind / soul (that is, a person's body is also for the most part considered external), and is what strips him/her of choices, and compels him/her to act in conformance with the surrounding environment.

Whatever presentation of the activity of many men or of an individual we may consider, we always regard it as the result partly of man's free will and partly of the law of inevitability.

War and Peace, second epilogue, chapter IX

A simple example that would appear to demonstrate acting completely according to free will: say you're in an ice cream parlour (with some friends), and you're tossing up between getting chocolate or hazelnut. There's no obvious reason why you would need to eat one flavour vs another. You're partial to both. They're both equally filling, equally refreshing, and equally (un)healthy. You'll be able to enjoy an ice cream with your friends regardless. You're free to choose!

You say: I am not free. But I have lifted my hand and let it fall. Everyone understands that this illogical reply is an irrefutable demonstration of freedom.

War and Peace, second epilogue, chapter VIII

And another simple example that would appear to demonstrate being completely overwhelmed by necessity: say there's a gigantic asteroid on a collision course for Earth. It's already entered the atmosphere. You're looking out your window and can see it approaching. It's only seconds until it hits. There's no obvious choice you can make. You and all of humanity are going to die very soon. There's nothing you can do!

A sinking man who clutches at another and drowns him; or a hungry mother exhausted by feeding her baby, who steals some food; or a man trained to discipline who on duty at the word of command kills a defenseless man – seem less guilty, that is, less free and more subject to the law of necessity, to one who knows the circumstances in which these people were placed …

War and Peace, second epilogue, chapter IX

Decisions decisions
Image source: Wikimedia Commons

However, the main point that Tolstoy makes regarding these two forces, is that neither of them does – and indeed, neither of them can – ever exist in absolute form, in the universe as we know it. That is to say, a person is never (and can never be) free to decide anything 100% per his/her wishes; and likewise, a person is never (and can never be) shackled such that he/she is 100% compelled to act under the coercion of external agents. It's a spectrum! And every decision, at every moment of a person's life (and yes, every moment of a person's life involves a decision), lies somewhere on that spectrum. Some decisions are made more freely, others are more constrained. But all decisions result from a mix of the two forces.

In neither case – however we may change our point of view, however plain we may make to ourselves the connection between the man and the external world, however inaccessible it may be to us, however long or short the period of time, however intelligible or incomprehensible the causes of the action may be – can we ever conceive either complete freedom or complete necessity.

War and Peace, second epilogue, chapter X

So, going back to the first example: there are always some external considerations. Perhaps there's a little bit more chocolate than hazelnut in the tubs, so you'll feel just that little bit guilty if you choose the hazelnut, that you'll be responsible for the parlour running out of it, and for somebody else missing out later. Perhaps there's a deal that if you get exactly the same ice cream five times, you get a sixth one free, and you've already ordered chocolate four times before, so you feel compelled to order it again this time. Or perhaps you don't really want an ice cream at all today, but you feel that peer pressure compels you to get one. You're not completely free after all!

If we consider a man alone, apart from his relation to everything around him, each action of his seems to us free. But if we see his relation to anything around him, if we see his connection with anything whatever – with a man who speaks to him, a book he reads, the work on which he is engaged, even with the air he breathes or the light that falls on the things about him – we see that each of these circumstances has an influence on him and controls at least some side of his activity. And the more we perceive of these influences the more our conception of his freedom diminishes and the more our conception of the necessity that weighs on him increases.

War and Peace, second epilogue, chapter IX

And, going back to the second example: you always have some control over your own destiny. You have but a few seconds to live. Do you cower in fear, flat on the floor? Do you cling to your loved one at your side? Do you grab a steak knife and hurl it defiantly out the window at the approaching asteroid? Or do you stand there, frozen to the spot, staring awestruck at the vehicle of your impending doom? It may seem pointless, weighing up these alternatives, when you and your whole world are about to be pulverised; but aren't your last moments in life, especially if they're desperate last moments, the ones by which you'll be remembered? And how do you know for certain that there will be nobody left to remember you (and does that matter anyway)? You're not completely bereft of choices after all!

… even if, admitting the remaining minimum of freedom to equal zero, we assumed in some given case – as for instance in that of a dying man, an unborn babe, or an idiot – complete absence of freedom, by so doing we should destroy the very conception of man in the case we are examining, for as soon as there is no freedom there is also no man. And so the conception of the action of a man subject solely to the law of inevitability without any element of freedom is just as impossible as the conception of a man's completely free action.

War and Peace, second epilogue, chapter X

Background story

Tolstoy's philosophical propositions in War and Peace were heavily influenced by the ideas of one of his contemporaries, the German philosopher Arthur Schopenhauer. In later years, Tolstoy candidly expressed his admiration for Schopenhauer, and he even went so far as to assert that, philosophically speaking, War and Peace was a repetition of Schopenhauer's seminal work The World as Will and Representation.

Schopenhauer's key idea, was that the whole universe (at least, as far as any one person is concerned) consists of two things: the will, which doesn't exist in physical form, but which is the essence of a person, and which contains all of one's drives and desires; and the representation, which is a person's mental model of all that he/she has sensed and interacted with in the physical realm. However, rather than describing the will as the engine of one's freedom, Schopenhauer argues that one is enslaved by the desires imbued in his/her will, and that one is liberated from the will (albeit only temporarily) by aesthetic experience.

Schopenhauer: big on grey tufts, small on optimism
Image source: 9gag

Schopenhauer's theories were, in turn, directly influenced by those of Immanuel Kant, who came a generation before him, and who is generally considered the greatest philosopher of the modern era. Kant's ideas (and his works) were many (and I have already written about Kant's ideas recently), but the one of chief concern here – as expounded primarily in his Critique of Pure Reason – was that there are two realms in the universe: the phenomenal, that is, the physical, the universe as we experience and understand it; and the noumenal, that is, a theoretical non-material realm where everything exists as a "thing-in-itself", and about which we know nothing, except for what we are able to deduce via practical reason. Kant argued that the phenomenal realm is governed by absolute causality (that is, by necessity), but that in the noumenal realm there exists absolute free will; and that the fact that a person exists in both realms simultaneously, is what gives meaning to one's decisions, and what makes them able to be measured and judged in terms of ethics.

We can trace the study of free will further through history, from Kant, back to Hume, to Locke, to Descartes, to Augustine, and ultimately back to Plato. In the writings of all these fine folks, over the millennia, there can be found common concepts such as a material vs an ideal realm, a chain of causation, and a free inner essence. The analysis has become ever more refined with each passing generation of metaphysics scholars, but ultimately, it has deviated very little from its roots in ancient times.

It's unique

There are certainly parallels between Tolstoy's War and Peace, and Schopenhauer's The World as Will and Representation (and, in turn, with other preceding works), but I for one disagree that the former is a mere regurgitation of the latter. Tolstoy is selling himself short. His theory of free will vs necessity is distinct from that of Schopenhauer (and from that of Kant, for that matter). And the way he explains his theory – in terms of a "spectrum of free-ness" – is original as far as I'm aware, and is laudable, if for no other reason, simply because of how clear and easy-to-grok it is.

It should be noted, too, that Tolstoy's philosophical views continued to evolve significantly, later in his life, years after writing War and Peace. At the dawn of the 1900s (by which time he was an old man), Tolstoy was best known for having established his own "rational" version of Christianity, which rejected all the rituals and sacraments of the Orthodox Church, and which gained a cult-like following. He also adopted the lifestyle choices – extremely radical at the time – of becoming vegetarian, of renouncing violence, and of living and dressing like a peasant.

Battle of Austerlitz
Image source: Flickr

War and Peace is many things. It's an account of the Napoleonic Wars, its bloody battles, its geopolitik, and its tremendous human cost. It's a nostalgic illustration of the old Russian aristocracy – a world long gone – replete with lavish soirees, mountains of servants, and family alliances forged by marriage. And it's a tenderly woven tapestry of the lives of the main protagonists – their yearnings, their liveliest joys, and their deepest sorrows – over the course of two decades. It rightly deserves the praise that it routinely receives, for all those elements that make it a classic novel. But it also deserves recognition for the philosophical argument that Tolstoy peppers throughout the text, and which he dedicates the final pages of the book to making more fully fledged.

How can we make AI that reasons?
By Jaza, 2019-03-23
https://greenash.net.au/thoughts/2019/03/how-can-we-make-ai-that-reasons/

The past decade or so has been touted as a high point for achievements in Artificial Intelligence (AI). For the first time, computers have demonstrated formidable ability in such areas as image recognition, speech recognition, gaming, and (most recently) autonomous driving / piloting. Researchers and companies that are heavily invested in these technologies, at least, are in no small way lauding these successes, and are giving us the pitch that the current state-of-the-art is nothing less than groundbreaking.

However, as anyone exposed to the industry knows, the current state-of-the-art is still plagued by fundamental shortcomings. In a nutshell, the current generation of AI is characterised by big data (i.e. a huge amount of sample data is needed in order to yield only moderately useful results), big hardware (i.e. a giant amount of clustered compute resources is needed, again in order to yield only moderately useful results), and flawed algorithms (i.e. algorithms that, at the end of the day, are based on statistical analysis and not much else – this includes the latest Convolutional Neural Networks). As such, the areas of success (impressive though they may be) are still dwarfed by the relative failures, in areas such as natural language conversation, criminal justice assessment, and art analysis / art production.

In my opinion, if we are to have any chance of reaching a higher plane of AI – one that demonstrates more human-like intelligence – then we must lessen our focus on statistics, mathematics, and neurobiology. Instead, we must turn our attention to philosophy, an area that has traditionally been neglected by AI research. Only philosophy (specifically, metaphysics and epistemology) contains the teachings that we so desperately need, regarding what "reasoning" means, what is the abstract machinery that makes reasoning possible, and what are the absolute limits of reasoning and knowledge.

What is reason?

There are many competing theories of reason, but the one that I will be primarily relying on, for the rest of this article, is that which was expounded by 18th century philosopher Immanuel Kant, in his Critique of Pure Reason and other texts. Not everyone agrees with Kant, however his is generally considered the go-to doctrine, if for no other reason (no pun intended), simply because nobody else's theories even come close to exploring the matter in such depth and with such thoroughness.

Immanuel Kant's head (lots of philosophy inside)
Image source: Wikimedia Commons

One of the key tenets of Kant's work, is that there are two distinct types of propositions: an analytic proposition, which can be universally evaluated purely by considering the meaning of the words in the statement; and a synthetic proposition, which cannot be universally evaluated, because its truth-value depends on the state of the domain in question. Further, Kant distinguishes between an a priori proposition, which can be evaluated without any sensory experience; and an a posteriori proposition, which requires sensory experience in order to be evaluated.

So, analytic a priori statements are basically tautologies: e.g. "All triangles have three sides" – assuming the definition of a triangle (a 2D shape with three sides), and assuming the definition of a three-sided 2D shape (a triangle), this must always be true, and no knowledge of anything in the universe (except for those exact rote definitions) is required.

Conversely, synthetic a posteriori statements are basically unprovable real-world observations: e.g. "Neil Armstrong landed on the Moon in 1969" – maybe that "small step for man" TV footage is real, or maybe the conspiracy theorists are right and it was all a hoax; and anyway, even if your name was Buzz Aldrin, and you had seen Neil standing there right next to you on the Moon, how could you ever fully trust your own fallible eyes and your own fallible memory? It's impossible for there to be any logical proof for such a statement, it's only possible to evaluate it based on sensory experience.

Analytic a posteriori statements, according to Kant, are impossible to form.

Which leaves what Kant is most famous for, his discussion of synthetic a priori statements. An example of such a statement is: "A straight line between two points is the shortest". This is not a tautology – the terms "straight line between two points" and "shortest" do not define each other. Yet the statement can be universally evaluated as true, purely by logical consideration, and without any sensory experience. How is this so?

Kant asserts that there are certain concepts that are "hard-wired" into the human mind. In particular, the concepts of space, time, and causality. These concepts (or "forms of sensibility", to use Kant's terminology) form our "lens" of the universe. Hence, we are able to evaluate statements that have a universal truth, i.e. statements that don't depend on any sensory input, but that do nevertheless depend on these "intrinsic" concepts. In the case of the above example, it depends on the concept of space (two distinct points can exist in a three-dimensional space, and the shortest distance between them must be a straight line).

Another example is: "Every event has a cause". This is also universally true; at least, it is according to the intrinsic concepts of time (one event happens earlier in time, and another event happens later in time), and causality (events at one point in space and time, affect events at a different point in space and time). Maybe it would be possible for other reasoning entities (i.e. not humans) to evaluate these statements differently, assuming that such entities were imbued with different "intrinsic" concepts. But it is impossible for a reasoning human to evaluate those statements any other way.

The actual machinery of reasoning, as Kant explains, consists of twelve "categories" of understanding, each of which has a corresponding "judgement". These categories / judgements are essentially logic operations (although, strictly speaking, they predate the invention of modern predicate logic, and are based on Aristotle's syllogism), and they are as follows:

Group    | Category                | Judgement    | Example
---------|-------------------------|--------------|--------------------------------
Quantity | Unity                   | Universal    | All trees have leaves
Quantity | Plurality               | Particular   | Some dogs are shaggy
Quantity | Totality                | Singular     | This ball is bouncy
Quality  | Reality                 | Affirmative  | Chairs are comfy
Quality  | Negation                | Negative     | No spoons are shiny
Quality  | Limitation              | Infinite     | Oranges are not blue
Relation | Inherence / Subsistence | Categorical  | Happy people smile
Relation | Causality / Dependence  | Hypothetical | If it's February, then it's hot
Relation | Community               | Disjunctive  | Potatoes are baked or fried
Modality | Existence               | Assertoric   | Sharks enjoy eating humans
Modality | Possibility             | Problematic  | Beer might be frothy
Modality | Necessity               | Apodictic    | 6 times 7 equals 42

The cognitive mind is able to evaluate all of the above possible propositions, according to Kant, with the help of the intrinsic concepts (note that these intrinsic concepts are not considered to be "innate knowledge", as defined by the rationalist movement), and also with the help of the twelve categories of understanding.
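For the modern reader, most of these judgement forms map quite naturally onto predicate logic and plain arithmetic. As a rough illustration (the toy domain data below is entirely my own invention), a few of them can be rendered like this:

```python
# A few of Kant's judgement forms rendered as predicate-logic checks
# over a toy domain. All of the domain data here is invented.

trees  = [{"kind": "oak", "has_leaves": True}, {"kind": "pine", "has_leaves": True}]
dogs   = [{"name": "Rex", "shaggy": True}, {"name": "Fifi", "shaggy": False}]
spoons = [{"shiny": False}, {"shiny": False}]

universal  = all(t["has_leaves"] for t in trees)   # Quantity: "All trees have leaves"
particular = any(d["shaggy"] for d in dogs)        # Quantity: "Some dogs are shaggy"
negative   = not any(s["shiny"] for s in spoons)   # Quality: "No spoons are shiny"
apodictic  = (6 * 7 == 42)                         # Modality: "6 times 7 equals 42"

print(universal, particular, negative, apodictic)  # -> True True True True
```

Of course, rendering the judgement forms as logic operations is the easy part; as discussed below, it's everything surrounding the logic that remains elusive.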

Reason, therefore, is the ability to evaluate arbitrary propositions, using such cognitive faculties as logic and intuition, and based on understanding and sensibility, which are bridged by way of "forms of sensibility".

AI with intrinsic knowledge

If we consider existing AI with respect to the above definition of reason, it's clear that the capability is already developed maturely in some areas. In particular, existing AI – especially Knowledge Representation (KR) systems – has no problem whatsoever with formally evaluating predicate logic propositions. Existing AI – especially AI based on supervised learning methods – also excels at receiving and (crudely) processing large amounts of sensory input.

So, at one extreme end of the spectrum, there are pure ontological knowledge-base systems such as Cyc, where virtually all of the input into the system consists of hand-crafted factual propositions, and where almost none of the input is noisy real-world raw data. Such systems currently require a massive quantity of carefully curated facts to be on hand, in order to make inferences of fairly modest real-world usefulness.

Then, at the other extreme, there are pure supervised learning systems such as Google's NASNet, where virtually all of the input into the system consists of noisy real-world raw data, and where almost none of the input is human-formulated factual propositions. Such systems currently require a massive quantity of raw data to be on hand, in order to perform classification and regression tasks whose accuracy varies wildly depending on the target data set.

What's clearly missing, is something to bridge these two extremes. And, if transcendental idealism is to be our guide, then that something is "forms of sensibility". The key element of reason that humans have, and that machines currently lack, is a "lens" of the universe, with fundamental concepts of the nature of the universe – particularly of space, time, and causality – embodied in that lens.

Space and time
Image source: Forbes

What fundamental facts about the universe would a machine require, then, in order to have "forms of sensibility" comparable to that of a human? Well, if we were to take this to the extreme, then a machine would need to be imbued with all the laws of mathematics and physics that exist in our universe. However, let's assume that going to this extreme is neither necessary nor possible, for various reasons, including: we humans are probably only imbued with a subset of those laws (the ones that apply most directly to our everyday existence); it's probably impossible to discover the full set of those laws; and, we will assume that, if a reasoning entity is imbued only with an appropriate subset of those laws, then it's possible to deduce the remainder of the laws (and it's therefore also possible to deduce all other facts relating to observable phenomena in the universe).

I would, therefore, like to humbly suggest, in plain English, what some of these fundamental facts, suitable for comprising the "forms of sensibility" of a reasoning machine, might be:

  • There are four dimensions: three space dimensions, and one time dimension
  • An object exists if it occupies one or more points in space and time
  • An object exists at zero or one points in space, given a particular point in time
  • An object exists at zero or more points in time, given a particular point in space
  • An event occurs at one point in space and time
  • An event is caused by one or more different events at a previous point in time
  • Movement is an event that involves an object changing its position in space and time
  • An object can observe its relative position in, and its movement through, space and time, using the space concepts of left, right, ahead, behind, up, and down, and using the time concepts of forward and backward
  • An object can move in any direction in space, but can only move forward in time

I'm not suggesting that the above list is a sufficient set of intrinsic concepts for a reasoning machine, nor that each of the above facts is correctly chosen or correctly worded for such a list. But this list is a good start, in my opinion. If an "intelligent" machine were to be appropriately imbued with those facts, then that should be a sufficient foundation for it to evaluate matters of space, time, and causality.
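To make the flavour of such a list a bit more concrete, here is a minimal sketch (the class, the method names, and the representation are all hypothetical choices of mine) of how two of those facts, namely that an object occupies at most one point in space at a given time, and that an object can only move forward in time, might be baked into a machine's model of the world:

```python
from dataclasses import dataclass, field

@dataclass
class World:
    # object id -> {time: (x, y, z)}; by construction, an object occupies
    # at most one point in space at any given point in time
    positions: dict = field(default_factory=dict)

    def place(self, obj, t, point):
        history = self.positions.setdefault(obj, {})
        # intrinsic fact: an object can only move forward in time
        if history and t <= max(history):
            raise ValueError("an object can only move forward in time")
        history[t] = point

    def moved(self, obj, t1, t2):
        # movement: the same object at different points in space and time
        h = self.positions.get(obj, {})
        return t1 in h and t2 in h and h[t1] != h[t2]

w = World()
w.place("ball", t=0, point=(0, 0, 0))
w.place("ball", t=1, point=(1, 0, 0))
print(w.moved("ball", 0, 1))  # -> True
```

The point of the sketch is that these facts are enforced by the structure of the model itself, rather than being learned from data: the machine cannot even represent a world that violates them.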

There are numerous other intrinsic aspects of human understanding that it would also, arguably, be essential for a reasoning machine to possess. Foremost of these is the concept of self: does AI need a hard-wired idea of "I"? Other such concepts include matter / substance, inertia, life / death, will, freedom, purpose, and desire. However, it's a matter of debate, rather than a given, whether each of these concepts is fundamental to the foundation of human-like reasoning, or whether each of them is learned and acquired as part of intellectual experience.

Reasoning AI

A machine as discussed so far is a good start, but it's still not enough to actually yield what would be considered human-like intelligence. Cyc, for example, is an existing real-world system that basically already has all these characteristics – it can evaluate logical propositions of arbitrary complexity, based on a corpus (a much larger one than my humble list above) of intrinsic facts, and based on some sensory input – yet no real intelligence has emerged from it.

One of the most important missing ingredients is the ability to hypothesise. That is, based on the raw sensory input of real-world phenomena, the ability to observe a pattern, and to formulate a completely new, original proposition expressing that pattern as a rule. On top of that, it includes the ability to test such a proposition against new data, and, when the rule breaks, to modify the proposition such that the rule can accommodate that new data. That, in short, is what is known as inductive reasoning.

A child formulates rules in this way. For example, a child observes that when she drops a drinking glass, the glass shatters the moment that it hits the floor. She drops a glass in this way several times, just for fun (plenty of fun for the parents too, naturally), and observes the same result each time. At some point, she formulates a hypothesis along the lines of "drinking glasses break when dropped on the floor". She wasn't born knowing this, nor did anyone teach it to her; she simply "worked it out" based on sensory experience.

Some time later, she drops a glass onto the floor in a different room of the house, still from shoulder-height, but it does not break. So she modifies the hypothesis to be "drinking glasses break when dropped on the kitchen floor" (but not the living room floor). But then she drops a glass in the bathroom, and in that case it does break. So she modifies the hypothesis again to be "drinking glasses break when dropped on the kitchen or the bathroom floor".

But she's not happy with this latest hypothesis, because it's starting to get complex, and the human mind strives for simple rules. So she stops to think about what makes the kitchen and bathroom floors different from the living room floor, and realises that the former are hard (tiled), whereas the latter is soft (carpet). So she refines the hypothesis to be "drinking glasses break when dropped on a hard floor". And thus, based on trial-and-error, and based on additional sensory experience, the facts that comprise her understanding of the world have evolved.
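The story above can be caricatured as a tiny rule-refinement loop. The sketch below (the data, the attribute names, and the rule wording are all invented, and it collapses the story's intermediate kitchen-and-bathroom steps into a single specialisation) starts with the most general hypothesis, and specialises it when an observation contradicts it:

```python
# The glass-dropping story as a toy rule-refinement loop. All data and
# rule wording are invented; the intermediate refinement steps from the
# story are collapsed into a single specialisation.

observations = [
    {"floor": "kitchen",     "hard": True,  "broke": True},
    {"floor": "kitchen",     "hard": True,  "broke": True},
    {"floor": "living room", "hard": False, "broke": False},
    {"floor": "bathroom",    "hard": True,  "broke": True},
]

def refine(observations):
    # initial hypothesis: drinking glasses always break when dropped
    hypothesis = lambda obs: True
    description = "glasses break when dropped"
    for obs in observations:
        if hypothesis(obs) != obs["broke"]:
            # the rule broke: condition it on the attribute that
            # distinguishes the counter-example from the earlier cases
            hypothesis = lambda obs: obs["hard"]
            description = "glasses break when dropped on a hard floor"
    return description

print(refine(observations))  # -> glasses break when dropped on a hard floor
```

The hard part, which this sketch conveniently hand-waves away, is choosing which attribute to specialise on: the child had to notice, unprompted, that "hard vs soft floor" was the relevant distinction.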

Broken glass on the floor
Image source: CoreSight

Some would argue that current state-of-the-art AI is already able to formulate rules, by way of feature learning (e.g. in image recognition). However, a "feature" in a neural network is just a number, either one directly taken from the raw data, or one derived based on some sort of graph function. So when a neural network determines the "features" that correspond to a duck, those features are just numbers that represent the average outline of a duck, the average colour of a duck, and so on. A neural network doesn't formulate any actual facts about a duck (e.g. "ducks are yellow"), which can subsequently be tested and refined (e.g. "bath toy ducks are yellow"). It just knows that if the image it's processing has a yellowish oval object occupying the main area, there's a 63% probability that it's a duck.

Another faculty that the human mind possesses, and that AI currently lacks, is intuition. That is, the ability to reach a conclusion based directly on sensory input, without resorting to logic as such. The exact definition of intuition, and how it differs from instinct, is not clear (in particular, both are sometimes defined as a "gut feeling"). It's also unclear whether or not some form of intuition is an essential ingredient of human-like intelligence.

It's possible that intuition is nothing more than a set of rules, that get applied either before proper logical reasoning has a chance to kick in (i.e. "first resort"), or after proper logical reasoning has been exhausted (i.e. "last resort"). For example, perhaps after a long yet inconclusive analysis of competing facts, regarding whether your Uncle Jim is telling the truth or not when he claims to have been to Mars (e.g. "Nobody has ever been to Mars", "Uncle Jim showed me his medal from NASA", "Mum says Uncle Jim is a flaming crackpot", "Uncle Jim showed me a really red rock"), your intuition settles the matter with the rule: "You should trust your own family". But, on the other hand, it's also possible that intuition is a more elementary mechanism, and that it can't be expressed in the form of logical rules at all: instead, it could simply be a direct mapping of "situations" to responses.
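The "last resort" reading of intuition can be sketched as nothing more than control flow: run the logical analysis first, and fall back to a hard-wired rule only when that analysis is inconclusive. All of the evidence data and the naive scoring below is invented for illustration:

```python
# "Last resort" intuition as control flow: logical analysis first, and a
# hard-wired rule only when the analysis is inconclusive. The evidence
# and the scoring scheme are both invented.

evidence = {
    "Nobody has ever been to Mars": False,
    "Uncle Jim showed me his medal from NASA": True,
    "Mum says Uncle Jim is a flaming crackpot": False,
    "Uncle Jim showed me a really red rock": True,
}

def logical_analysis(evidence):
    # naive scoring: +1 for each supporting fact, -1 for each against;
    # a zero score means the analysis is inconclusive
    score = sum(1 if supports else -1 for supports in evidence.values())
    return None if score == 0 else score > 0

def intuition(_evidence):
    return True  # the hard-wired rule: "you should trust your own family"

verdict = logical_analysis(evidence)
if verdict is None:  # fall back to intuition only as a last resort
    verdict = intuition(evidence)
print(verdict)  # -> True
```

If intuition really is a direct mapping of situations to responses, as suggested above, then no such tidy decomposition would be possible.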

Is reason enough?

In order to test whether a hypothetical machine, as discussed so far, is "good enough" to be considered intelligent, I'd like to turn to one of the domains that current-generation AI is already pursuing: criminal justice assessment. One particular area of this domain, in which the use of AI has grown significantly, is determining whether an incarcerated person should be approved for parole or not. Unsurprisingly, AI's having input into such a decision has so far, in real life, not been considered altogether successful.

The current AI process for this is based almost entirely on statistical analysis. That is, the main input consists of simple numeric parameters, such as: number of incidents reported during imprisonment; level of severity of the crime originally committed; and level of recurrence of criminal activity. The input also includes numerous profiling parameters regarding the inmate, such as: racial / ethnic group; gender; and age. The algorithm, regardless of any bells and whistles it may claim, is invariably simply answering the question: for other cases with similar input parameters, were they deemed eligible for parole? And if so, did their conduct after release demonstrate that they were "reformed"? And based on that, is this person eligible for parole?

Current-generation AI, in other words, is incapable of considering a single such case based on its own merits, nor of making any meaningful decision regarding that case. All it can do, is compare the current case to its training data set of other cases, and determine how similar the current case is to those others.
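That statistical approach amounts to little more than a similarity lookup. The sketch below (a 1-nearest-neighbour toy, with entirely invented case data) makes the point: the "decision" is simply whatever happened in the most similar past case:

```python
import math

# A purely statistical parole "decision" as a similarity lookup against
# past cases (1-nearest-neighbour). All case data here is invented.

past_cases = [
    # (incidents, severity, recurrence) -> was parole deemed successful?
    ((1, 3, 0), True),
    ((6, 8, 2), False),
    ((0, 2, 1), True),
    ((5, 9, 3), False),
]

def predict(case):
    # copy the outcome of the closest past case; nothing about the
    # current case is considered on its own merits
    nearest = min(past_cases, key=lambda kv: math.dist(kv[0], case))
    return nearest[1]

print(predict((1, 2, 0)))  # -> True
```

Real systems use far more elaborate models than this, of course, but the structure of the reasoning (or rather, the absence of it) is the same: similar inputs in, similar outputs out.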

A human deciding parole eligibility, on the other hand, does consider the case in question based on its own merits. Sure, a human also considers the numeric parameters and the profiling parameters that a machine can so easily evaluate. But a human also considers each individual event in the inmate's history as a stand-alone fact, and each such fact can affect the final decision differently. For example, perhaps the inmate seriously assaulted other inmates twice while imprisoned. But perhaps he also read 150 novels, and finished a university degree by correspondence. These are not just statistics, they're facts that must be considered, and each fact must refine the hypothesis whose final form is either "this person is eligible for parole", or "this person is not eligible for parole".

A human is also influenced by morals and ethics, when considering the character of another human being. So, although the question being asked is officially: "is this person eligible for parole?", the question being considered in the judge's head may very well actually be: "is this person good or bad?". Should a machine have a concept of ethics, and/or of good vs bad, and should it apply such ethics when considering the character of an individual human? Most academics seem to think so.

According to Kant, ethics is based on a foundation of reason. But that doesn't mean that a reasoning machine is automatically an ethical machine, either. Does AI need to understand ethics, in order to possess what we would consider human-like intelligence?

Although decisions such as parole eligibility are supposed to be objective and rational, a human is also influenced by emotions, when considering the character of another human being. Maybe, despite the evidence suggesting that the inmate is not reformed, the judge is stirred by a feeling of compassion and pity, and this feeling results in parole being granted. Or maybe, despite the evidence being overwhelmingly positive, the judge feels fear and loathing towards the inmate, mainly because of his tough physical appearance, and this feeling results in parole being denied.

Should human-like AI possess the ability to be "stirred" by such emotions? And would it actually be desirable for AI to be affected by such emotions, when evaluating the character of an individual human? Some such emotions might be considered positive, while others might be considered negative (particularly from an ethical point of view).

I think the ultimate test in this domain – perhaps the "Turing test for criminal justice assessment" – would be if AI were able to understand, and to properly evaluate, this great parole speech, which is one of my personal favourite movie quotes:

There's not a day goes by I don't feel regret. Not because I'm in here, or because you think I should. I look back on the way I was then: a young, stupid kid who committed that terrible crime. I want to talk to him. I want to try and talk some sense to him, tell him the way things are. But I can't. That kid's long gone and this old man is all that's left. I got to live with that. Rehabilitated? It's just a bulls**t word. So you can go and stamp your form, Sonny, and stop wasting my time. Because to tell you the truth, I don't give a s**t.

"Red" (Morgan Freeman)

The Shawshank Redemption (1994)

Red's parole hearing
Image source: YouTube

In the movie, Red's parole was granted. Could we ever build an AI that could also grant parole in that case, and for the same reasons? On top of needing the ability to reason with real facts, and to be affected by ethics and by emotion, properly evaluating such a speech requires the ability to understand humour – black humour, no less – along with apathy and cynicism. No small task.

Conclusion

Sorry if you were expecting me to work wonders in this article, and to actually teach the world how to build artificial intelligence that reasons. I don't have the magic answer to that million dollar question. However, I hope I have achieved my aim here, which was to describe what's needed in order for it to even be possible for such AI to come to fruition.

It should be clear, based on what I've discussed here, that most current-generation AI is based on a completely inadequate foundation for even remotely human-like intelligence. Chucking big data at a statistic-crunching algorithm on a fat cluster might be yielding cool and even useful results, but it will never yield intelligent results. As centuries of philosophical debate can teach us – if only we'd stop and listen – human intelligence rests on specific building blocks. These include, at the very least, an intrinsic understanding of time, space, and causality; and the ability to hypothesise based on experience. If we are to ever build a truly intelligent artificial agent, then we're going to have to figure out how to imbue it with these things.

]]>
DNA: the most chaotic, most illegible, most mature, most brilliant codebase ever 2018-04-21T00:00:00Z 2018-04-21T00:00:00Z Jaza https://greenash.net.au/thoughts/2018/04/dna-the-most-chaotic-most-illegible-most-mature-most-brilliant-codebase-ever/ As a computer programmer – i.e. as someone whose day job is to write relatively dumb, straightforward code, that controls relatively dumb, straightforward machines – DNA is a fascinating thing. Other coders agree. It has been called the code of life, and rightly so: the DNA that makes up a given organism's genome, is the set of instructions responsible for virtually everything about how that organism grows, survives, behaves, reproduces, and ultimately dies in this universe.

Most intriguing and most tantalising of all, is the fact that we humans still have virtually no idea how to interpret DNA in any meaningful way. It's only since 1953 that we've understood what DNA even is; and it's only since 2001 that we've been able to extract and to gaze upon instances of the complete human genome.

Watson and Crick showing off their DNA model in 1953.
Image source: A complete PPT on DNA (Slideshare).

As others have pointed out, the reason why we haven't had much luck in reading DNA, is because (in computer science parlance) it's not high-level source code, it's machine code (or, to be more precise, it's bytecode). So, DNA, which is sequences of base-4 digits, grouped into (most commonly) 3-digit "words" (known as "codons"), is no more easily decipherable than binary, which is sequences of base-2 digits, grouped into (for example) 8-digit "words" (known as "bytes"). And as anyone who has ever read or written binary (in binary, octal, or hex form, however you want to skin that cat) can attest, it's hard!
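If you want to play with that analogy yourself, here's a quick Python sketch of reading DNA as base-4 "machine code". (The digit assignment A=0, C=1, G=2, T=3 is just one arbitrary convention I've picked for the illustration; there's nothing official about it.)

```python
# Treat a DNA strand as base-4 digits grouped into 3-digit "words" (codons),
# the way binary is grouped into 8-digit "words" (bytes).
# 4^3 = 64 possible codons, versus 2^8 = 256 possible bytes.
DIGITS = {"A": 0, "C": 1, "G": 2, "T": 3}  # arbitrary digit assignment

def codons(strand):
    """Split a strand into 3-letter codons, dropping any trailing leftovers."""
    return [strand[i:i + 3] for i in range(0, len(strand) - 2, 3)]

def codon_value(codon):
    """Read a codon as a 3-digit base-4 number: 'ATG' -> 0*16 + 3*4 + 2 = 14."""
    value = 0
    for base in codon:
        value = value * 4 + DIGITS[base]
    return value
```

So `codons("ATGGCATAA")` gives `['ATG', 'GCA', 'TAA']`, and each codon is just a number between 0 and 63 – which is exactly as (un)readable as a hex dump.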

In this musing, I'm going to compare genetic code and computer code. I am in no way qualified to write about this topic (particularly about the biology side), but it's fun, and I'm reckless, and this is my blog so for better or for worse nobody can stop me.

Authorship and motive

The first key difference that I'd like to point out between the two, is regarding who wrote each one, and why. For computer code, this is quite straightforward: a given computer program was written by one of your contemporary human peers (hopefully one who is still alive, as you can then ask him or her about anything that's hard to grok in the code), for some specific and obvious purpose – for example, to add two numbers together, or to move a chomping yellow pac-man around inside a maze, or to add somersaulting cats to an image.

For DNA, we don't know who, if anyone, wrote the first ever snippet of code – maybe it was G-d, maybe it was aliens from the Delta Quadrant, or maybe it was the random result of various chemicals bashing into each other within the primordial soup. And as for who wrote (and who continues to this day to write) all DNA after that, that too may well be The Almighty or The Borg, but the current theory of choice is that a given snippet of DNA basically keeps on re-writing itself, and that this auto-re-writing happens (as far as we can tell) in a pseudo-random fashion.

This guy didn't write binary or DNA, I'm pretty sure.
Image source: Art UK.

Nor do we know why DNA came about in the first place. From a philosophical / logical point of view, not having an answer to the "who" question, kind of makes it impossible to address the "why", by definition. If it came into existence randomly, then it would logically follow that it wasn't created for any specific purpose, either. And as for why DNA re-writes itself in the way that it does: it would seem that DNA's, and therefore life's, main purpose, as far as the DNA itself is concerned, is simply to continue existing / surviving, as evidenced by the fact that DNA's self-modification results, on average, over the long-term, in it becoming ever more optimally adapted to its surrounding environment.

Management processes

For building and maintaining computer software, regardless of "methodology" (e.g. waterfall, scrum, extreme programming), the vast majority of the time there are a number of common non-dev processes in place. Apart from every geek's favourite bit, a.k.a. "coding", there is (to name a few): requirements gathering; spec writing; code review; testing / QA; version control; release management; staged deployment; and documentation. The whole point of these processes, is to ensure: that a given snippet of code achieves a clear business or technical outcome; that it works as intended (both in isolation, and when integrated into the larger system); that the change it introduces is clearly tracked and is well-communicated; and that the codebase stays maintainable.

For DNA, there is little or no parallel to most of the above processes. As far as we know, when DNA code is modified, there are no requirements defined, there is no spec, there is no review of the change, there is no staging environment, and there is no documentation. DNA seems to follow my former boss's preferred methodology: JFDI. New code is written, nobody knows what it's for, nobody knows how to use it. Oh well. Straight into production it goes.

However, there is one process that DNA demonstrates in abundance: QA. Through a variety of mechanisms, the most important of which is repair enzymes, a given piece of DNA code is constantly checked for integrity errors, and these errors are generally repaired. Mutations (i.e. code changes) can occur during replication due to imperfect copying, or at any other time due to environmental factors. Depending on the genome (i.e. the species) in question, and depending on the gene in question, the level of enforcement of DNA integrity can vary, from "very relaxed" to "very strict". For example, bacteria experience far more mutation between generations than humans do. This is because some genomes consider themselves to still be in "beta", and are quite open to potentially dangerous experimentation, while other genomes consider themselves "mature", and so prefer less change and greater stability. Thus a balance is achieved between preservation of genes, and evolution.
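As a heavily simplified illustration of that QA process: repair enzymes can often fix a damaged base by reading the complementary strand as a template. In Python, with `?` standing in for a damaged base (and glossing over an enormous amount of real biochemistry):

```python
# Vastly simplified DNA "QA": patch unreadable bases in one strand by
# consulting the complementary strand, roughly what template-directed
# repair enzymes do.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def repair(strand, template):
    """Replace each damaged base ('?') with the complement of the template's base."""
    return "".join(
        COMPLEMENT[t] if s == "?" else s
        for s, t in zip(strand, template)
    )
```

So `repair("AT?G", "TACC")` restores the strand to `"ATGG"`: the double helix effectively ships with its own backup copy of every "file".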

The coding process

For computer software, the actual process of coding is relatively structured and rational. The programmer refers to the spec – which could be anything from a one-sentence verbal instruction bandied over the water-cooler, to a 50-page PDF (preferably it's something in between those two extremes) – before putting hands to keyboard, and also regularly while coding.

The programmer visualises the rough overall code change involved (or the rough overall components of a new codebase), and starts writing. He or she will generally switch between top-down (focusing on the framework and on "glue code") and bottom-up (focusing on individual functions) several times. The code will generally be refined, in response to feedback during code review, to fixing defects in the change, and to the programmer's constant critiquing of his or her own work. Finally, the code will be "done" – although inevitably it will need to be modified in future, in response to new requirements, at which point it's time to rinse and repeat all of the above.

For DNA, on the other hand, the process of coding appears (unless we're missing something?) to be akin to letting a dog randomly roll around on the keyboard while the editor window is open, then cleaning up the worst of the damage, then seeing if anything interesting was produced. Not the most scientific of methods, you might say? But hey, that's science! And it would seem that, amazingly, if you do that on a massively distributed enough scale, over a long enough period of time, you get intelligent life.
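That dog-on-the-keyboard methodology is easy enough to simulate. Here's a toy Python version: mutate at random, throw away anything that makes matters worse (the "cleaning up the worst of the damage" step), and keep whatever survives. The target string and all the numbers are, of course, entirely made up:

```python
import random

def evolve(target, generations=2000, seed=42):
    """Random mutation plus 'keep it if it's no worse' selection."""
    rng = random.Random(seed)
    alphabet = "ACGT"
    genome = [rng.choice(alphabet) for _ in target]
    score = sum(g == t for g, t in zip(genome, target))
    for _ in range(generations):
        mutant = genome[:]                          # the dog rolls on the keyboard
        mutant[rng.randrange(len(mutant))] = rng.choice(alphabet)
        mutant_score = sum(g == t for g, t in zip(mutant, target))
        if mutant_score >= score:                   # clean up the worst of the damage
            genome, score = mutant, mutant_score
    return "".join(genome)
```

Run it for long enough, and the initial gibberish reliably converges on the target – which is the whole, rather unsettling, point.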

DNA modification in progress.
Image source: DogsToday.

When you think about it, that approach isn't really dissimilar to the current state-of-the-art in machine learning. Getting anything approaching significant or accurate results with machine learning models, has only been possible quite recently, thanks to the availability of massive data sets, and of massive hardware platforms – and even when you let a ML algorithm loose in that environment for a decent period of time, it produces results that contain a lot of noise. So maybe we are indeed onto something with our current approach to ML, although I don't think we're quite onto the generation of truly intelligent software just yet.

Grokking it

Most computer code that has been written by humans for the past 40 years or so, has been high-level source code (i.e. "C and up"). It's written primarily to express business logic, rather than to tell the Von Neumann machine (a.k.a. the computer hardware) exactly what to do. It's up to the compiler / interpreter, to translate that "call function abc" / "divide variable pqr by 50" / "write the string I feel like a Tooheys to file xyz" code, into "load value of register 123" / "put that value in register 456" / "send value to bus 789" code, which in turn actually gets represented in memory as 0s and 1s.
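To illustrate the size of that gap, here's a toy "machine" in Python, with a completely invented two-instruction set (no real CPU works quite like this); the point is just how many dumb register-shuffling steps hide behind one line of high-level intent:

```python
# High level:    pqr = pqr / 50
# Machine level: the register-shuffling steps a toy machine performs.
# The instruction set here is entirely invented for illustration.
def run(program, registers):
    for op, *args in program:
        if op == "LOAD":    # LOAD dst, value: put a constant in a register
            registers[args[0]] = args[1]
        elif op == "DIV":   # DIV dst, src: dst = dst // src
            registers[args[0]] //= registers[args[1]]
    return registers

PROGRAM = [
    ("LOAD", "r1", 50),
    ("DIV", "r0", "r1"),
]
```

Calling `run(PROGRAM, {"r0": 200})` leaves `r0` holding 4 – and that's before we even get down to representing those instructions as 0s and 1s.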

This is great for us humans, because – assuming we can get our hands on the high-level source code – we can quite easily grok the purpose of a given piece of code, without having to examine the gory details of what the computer physically does, step-by-tiny-tedious-step, in order to achieve that noble purpose.

DNA, as I said earlier, is not high-level source code, it's machine code / bytecode (more likely the latter, in which case the actual machine code of living organisms is the proteins, and other things, that DNA / RNA gets "compiled" to). And it now seems pretty clear that there is no higher source code – DNA, which consists of long sequences of Gs, As, Cs, and Ts, is the actual source. The code did not start in a form where a given gene is expressed logically / procedurally – a form from which it could be translated down to base pairs. The start and the end state of the code is as base pairs.

A code that was cracked - can the same be done for DNA?
Image source: The University Network.

It also seems that DNA is harder to understand than machine / assembly code for a computer, because an organic cell is a much more complex piece of hardware than a Von Neumann-based computer (which itself is a specific type of Turing machine). That's why humans were perfectly capable of programming computers using only machine / assembly code to begin with, and why some specialised programmers continue primarily coding at that level to this day. For a computer, the machine itself only consists of a few simple components, and the instruction set is relatively small and unambiguous. For an organic cell, the physical machinery is far more complex (and whether a DNA-containing cell is a Turing machine is itself currently an open research question), and the instruction set is riddled with ambiguous, context-specific meanings.

Since all we have is the DNA bytecode, all current efforts to "decode DNA" focus on comparing long strings of raw base pairs with each other, across different genes / chromosomes / genomes. This is akin to trying to understand what software does by lining up long strings of compiled hex digits for different binaries side-by-side, and spotting sequences that are kind-of similar. So, no offence intended, but the current state-of-the-art in "DNA decoding" strikes me as incredibly primitive, cumbersome, and futile. It's a miracle that we've made any progress at all with this approach, and it's only thanks to some highly intelligent people employing their best mathematical pattern analysis techniques, that we have indeed gotten anywhere.
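For the sceptical, here's roughly what that side-by-side comparison amounts to, sketched in Python: a longest-common-run search over two raw sequences, be they base pairs or hex digits. (Real sequence-alignment tools are vastly more sophisticated than this, but the spirit is similar.)

```python
# Spot the longest identical run shared by two raw sequences: a crude
# stand-in for "line them up and eyeball the similar bits".
def longest_common_run(a, b):
    best = ""
    prev = [0] * (len(b) + 1)          # lengths of matching runs ending at i-1, j
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1        # extend the matching run
                if cur[j] > len(best):
                    best = a[i - cur[j]:i]
        prev = cur
    return best
```

For example, `longest_common_run("GATTACA", "TTACCA")` finds `"TTAC"` – and that shared run is the entire insight this style of analysis can offer, with no notion whatsoever of what the sequence does.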

Where to from here?

Personally, I feel that we're only really going to "crack" the DNA puzzle, if we're able to reverse-engineer raw DNA sequences into some sort of higher-level code. And, considering that reverse-engineering raw binary into a higher-level programming language (such as C) is a very difficult endeavour, and that doing the same for DNA is bound to be even harder, I think we have our work cut out for us.

My interest in the DNA puzzle was first piqued, when I heard a talk at PyCon AU 2016: Big data biology for pythonistas: getting in on the genomics revolution, presented by Darya Vanichkina. In this presentation, DNA was presented as a riddle that more programmers can and should try to help solve. Since then, I've thought about the riddle now and then, and I have occasionally read some of the plethora of available online material about DNA and genome sequencing.

DNA is an amazing thing: for approximately 4 billion years, it has been spreading itself across our planet, modifying itself in bizarre and colourful ways, and ultimately evolving (according to the laws of natural selection) to become the codebase that defines the behaviour of primitive lifeforms such as humans (and even intelligent lifeforms such as dolphins!).

Dolphins! (Brainier than you are).
Image source: Days of the Year.

So, let's be realistic here: it took DNA that long to reach its current form; we'll be doing well if we can start to understand it properly within the next 1,000 years, if we can manage it at all before the humble blip on Earth's timeline that is human civilisation fades altogether.

]]>
The Jobless Games 2017-03-19T00:00:00Z 2017-03-19T00:00:00Z Jaza https://greenash.net.au/thoughts/2017/03/the-jobless-games/ There is growing concern worldwide about the rise of automation, and about the looming mass unemployment that will logically result from it. In particular, the phenomenon of driverless cars – which will otherwise be one of the coolest and the most beneficial technologies of our time – is virtually guaranteed to relegate to the dustbin of history the "paid human driver", a vocation currently pursued by over 10 million people in the US alone.

Them robots are gonna take our jobs!
Image source: Day of the Robot.

Most discussion of late seems to treat this encroaching joblessness entirely as an economic issue. Families without incomes, spiralling wealth inequality, broken taxation mechanisms. And, consequently, the solutions being proposed are mainly economic ones. For example, a Universal Basic Income to help everyone make ends meet. However, in my opinion, those economic issues are actually relatively easy to address, and as a matter of sheer necessity we will sort them out sooner or later, via a UBI or via whatever else fits the bill.

The more pertinent issue is actually a social and a psychological one. Namely: how will people keep themselves occupied in such a world? How will people nourish their ambitions, feel that they have a purpose in life, and feel that they make a valuable contribution to society? How will we prevent the malaise of despair, depression, and crime from engulfing those who lack gainful enterprise? To borrow the colourful analogy that others have penned: assuming that there's food on the table either way, how do we head towards a Star Trek rather than a Mad Max future?

Keep busy

The truth is, since the Industrial Revolution, an ever-expanding number of people haven't really needed to work anyway. What I mean by that is: if you consider which jobs actually provide society with the essentials, such as food, water, shelter, and clothing, you'll quickly realise that fewer people than ever are employed in such jobs. My own occupation, web developer, is certainly not essential to the ongoing survival of society as a whole. Plenty of other occupations, particularly in the services industry, are similarly remote from humanity's basic needs.

So why do these jobs exist? First and foremost, demand. We live in a world of free markets and capitalism. So, if enough people decide that they want web apps, and those people have the money to make it happen, then that's all that's required for "web developer" to become and to remain a viable occupation. Second, opportunity. It needs to be possible to do that thing known as "developing web apps" in the first place. In many cases, the opportunity exists because of new technology; in my case, the Internet. And third, ambition. People need to have a passion for what they do. This means that, ideally, people get to choose an occupation of their own free will, rather than being forced into a certain occupation by their family or by the government. If a person has a natural talent for his or her job, and if a person has a desire to do the job well, then that benefits the profession as a whole, and, in turn, all of society.

Those are the practical mechanisms through which people end up spending much of their waking life at work. However, there's another dimension to all this, too. It is very much in the interest of everyone that makes up "the status quo" – i.e. politicians, the police, the military, heads of big business, and to some extent all other "well-to-do citizens" – that most of society is caught up in the cycle of work. That's because keeping people busy at work is the most effective way of maintaining basic law and order, and of enforcing control over the masses. We have seen throughout history that large-scale unemployment leads to crime, to delinquency and, ultimately, to anarchy. Traditionally, unemployment directly results in poverty, which in turn directly results in hunger. But even if the unemployed get their daily bread – even if the crisis doesn't reach let them eat cake proportions – they are still at risk of falling to the underbelly of society, if for no other reason than sheer boredom.

So, assuming that a significantly higher number of working-age men and women will have significantly fewer job prospects in the immediate future, what are we to do with them? How will they keep themselves occupied?

The Games

I propose that, as an alternative to traditional employment, these people engage in large-scale, long-term, government-sponsored, semi-recreational activities. These must be activities that: (a) provide some financial reward to participants; (b) promote physical health and social well-being; and (c) make a tangible positive contribution to society. As a massive tongue-in-cheek, I call this proposal "The Jobless Games".

My prime candidate for such an activity would be a long-distance walk. The journey could take weeks, months, even years. Participants could number in the hundreds, in the thousands, even in the millions. As part of the walk, participants could do something useful, too; for example, transport non-urgent goods or mail, thus delivering things that are actually needed by others, and thus competing with traditional freight services. Walking has obvious physical benefits, and it's one of the most social things you can do while moving and being active. Such a journey could also be done by bicycle, on horseback, or in a variety of other modes.

How about we all just go for a stroll?
Image source: The New Paper.

Other recreational programs could cover the more adventurous activities, such as climbing, rafting, and sailing. However, these would be less suitable, because: they're far less inclusive of people of all ages and abilities; they require a specific climate and geography; they're expensive in terms of equipment and expertise; they're harder to tie in with some tangible positive end result; they're impractical in very large groups; and they damage the environment if conducted on too large a scale.

What I'm proposing is not competitive sport. These would not be races. I don't see what having winners and losers in such events would achieve. What I am proposing is that people be paid to participate in these events, out of the pocket of whoever has the money, i.e. governments and big business. The conditions would be simple: keep up with the group, and behave yourself, and you keep getting paid.

I see such activities co-existing alongside whatever traditional employment is still available in future; and despite all the doom and gloom predictions, the truth is that there always has been real work out there, and there always will be. My proposal is that, same as always, traditional employment pays best, and thus traditional employment will continue to be the most attractive option for how to spend one's days. Following that, "The Games" pay enough to get by on, but probably not enough to enjoy all life's luxuries. And, lastly, as is already the case in most first-world countries today, for the unemployed there should exist a social security payment, and it should pay enough to cover life's essentials, but no more than that. We already pay people "sit down money"; how about a somewhat more generous payment of "stand up money"?

Along with these recreational activities that I've described, I think it would also be a good idea to pay people for a lot of the work that is currently done by volunteers without financial reward. In a future with fewer jobs, anyone who decides to peel potatoes in a soup kitchen, or to host bingo games in a nursing home, or to take disabled people out for a picnic, should be able to support him- or herself and to live in a dignified manner. However, as with traditional employment, there are also only so many "volunteer" positions that need filling, and even with that sector significantly expanded, there would still be many people left twiddling their thumbs. Which is why I think we need some other solution, that will easily and effectively get large numbers of people on their feet. And what better way to get them on their feet, than to say: take a walk!

Large-scale, long-distance walks could also solve some other problems that we face at present. For example, getting a whole lot of people out of our biggest and most crowded cities, and "going on tour" to some of our smallest and most neglected towns, would provide a welcome economic boost to rural areas, considering all the support services that such activities would require; while at the same time, it would ease the crowding in the cities, and it might even alleviate the problem of housing affordability, which is acute in Australia and elsewhere. Long-distance walks in many parts of the world – particularly in Europe – could also provide great opportunities for an interchange of language and culture.

In summary

There you have it, my humble suggestion to help fill the void in people's lives in the future. There are plenty of other things that we could start paying people to do, that are more intellectual and that make a more tangible contribution to society: e.g. create art, be spiritual, and perform in music and drama shows. However, these things are too controversial for the government to support on such a large scale, and their benefit is a matter of opinion. I really think that, if something like this is to have a chance of succeeding, it needs to be dead simple and completely uncontroversial. And what could be simpler than walking?

Whatever solutions we come up with, I really think that we need to start examining the issue of 21st-century job redundancy from this social angle. The economic angle is a valid one too, but it has already been analysed quite thoroughly, and it will sort itself out with a bit of ingenuity. What we need to start asking now is: for those young, fit, ambitious people of the future that lack job prospects, what activity can they do that is simple, social, healthy, inclusive, low-impact, low-cost, and universal? I'd love to hear any further suggestions you may have.

]]>
Protect the children, but don't blindfold them 2014-03-18T00:00:00Z 2014-03-18T00:00:00Z Jaza https://greenash.net.au/thoughts/2014/03/protect-the-children-but-dont-blindfold-them/ Being a member of mainstream society isn't for everyone. Some want out.

Societal vices have always been bountiful. Back in the ol' days, it was just the usual suspects. War. Violence. Greed. Corruption. Injustice. Propaganda. Lewdness. Alcoholism. To name a few. In today's world, still more scourges have joined in the mix. Consumerism. Drug abuse. Environmental damage. Monolithic bureaucracy. And plenty more.

There always have been some folks who elect to isolate themselves from the masses, to renounce their mainstream-ness, to protect themselves from all that nastiness. And there always will be. Nothing wrong with doing so.

However, there's a difference between protecting oneself from "the evils of society", and blinding oneself to their very existence. Sometimes this difference is a fine line. Particularly in the case of families, where parents choose to shield from the Big Bad World not only themselves, but also their children. Protection is noble and commendable. Blindfolding, in my opinion, is cowardly and futile.

How's the serenity?
Image source: greenskullz1031 on Photobucket.

Seclusion

There are plenty of examples from bygone times, of historical abstainers from mainstream society. Monks and nuns, who have for millennia sought serenity, spirituality, abstinence, and isolation from the material. Hermits of many varieties: witches, grumpy old men / women, and solitary island-dwellers.

Religion has long been an important motive for seclusion. Many have settled on a reclusive existence as their solution to avoiding widespread evils and being closer to G-d. Other than adult individuals who choose a monastic life, there are also whole communities, composed of families with children, who live in seclusion from the wider world. The Amish in rural USA are probably the most famous example, and also one of the longest-running such communities. Many ultra-orthodox Jewish communities, particularly within present-day Israel, could also be considered as secluded.

Amish people in a coach.
Image source: Wikipedia: Amish.

More recently, the "commune living" hippie phenomenon has seen tremendous growth worldwide. The hippie ideology is, of course, generally an anti-religious one, with its acceptance of open relationships, drug use, lack of hierarchy, and often a lack of any formal G-d. However, the secluded lifestyle of hippie communes is actually quite similar to that of secluded religious groups. It's usually characterised by living amidst, and in tune with, nature; rejecting modern technology; and maintaining a physical distance from regular urban areas. The left-leaning members of these communities tend to strongly shun consumerism, and to promote serenity and spirituality, much like their G-d fearing comrades.

In a bubble

Like the members of these communities, I too am repulsed by many of the "evils" within the society in which we live. Indeed, the idea of joining such a community is attractive to me. It would be a pleasure and a relief to shut myself out from the blight that threatens me, and from everyone that's "infected" by it. Life would be simpler, more peaceful, more wholesome.

I empathise with those who have chosen this path in life. Just as it's tempting to succumb to all the world's vices, so too is it tempting to flee from them. However, such people are also living in a bubble. An artificial world, from which the real world has been banished.

What bothers me is not so much the independent adult people who have elected for such an existence. Despite all the faults of the modern world, most of us do at least enjoy far-reaching liberty. So, it's a free land, and adults are free to live as they will, and to blind themselves to what they will.

What does bother me, is that children are born and raised in such an existence. The adult knows what it is that he or she is shut off from, and has experienced it before, and has decided to discontinue experiencing it. The child, on the other hand, has never been exposed to reality; he or she knows only the confines of the bubble. The child is blind, but to what, it knows not.

Child in a bubble.
Child in a bubble.
Image source: CultureLab: Breaking out of the internet filter bubble.

This is a cowardly act on the part of the parents. It's cowardly because a child only develops the ability to combat and to reject the world's vices, such as consumerism or substance abuse, by being exposed to them, by possibly experimenting with them, and by making his or her own decisions. Parents that are serious about protecting their children do expose them to the Big Bad World, they do take risks; but they also do the hard yards in preparing their children for it: they ensure that their children are raised with education, discipline, and love.

Blindfolding children to the reality of wider society is also futile — because, sooner or later, whether still as children or later as adults, the Big Bad World exposes itself to all, whether you like it or not. No Amish countryside, no hippie commune, no far-flung island, is so far or so disconnected from civilisation that its inhabitants can be prevented from ever having contact with it. And when the day of exposure comes, those that have lived in their little bubble find themselves totally unprepared for the very "evils" that they've supposedly been protected from for all their lives.

Keep it balanced

In my opinion, the best way to protect children from the world's vices, is to expose them in moderation to the world's nasty underbelly, while maintaining a stable family unit, setting a strong example of rejecting the bad, and ensuring a solid education. That is, to do what the majority of the world's parents do. That's right: it's a formula that works reasonably well for billions of people, and that has been developed over thousands of years, so there must be some wisdom to it.

Obviously, children need to be protected from dangers that could completely overwhelm them. Bringing up a child in a favela environment is not ideal, and sometimes has horrific consequences; just watch City of G-d if you don't believe me. But then again, blindfolding is the opposite extreme; and one extreme can be as bad as the other. Getting the balance somewhere in between is the key.

]]>
Money: the ongoing evolution 2013-04-10T00:00:00Z 2013-04-10T00:00:00Z Jaza https://greenash.net.au/thoughts/2013/04/money-the-ongoing-evolution/ In this article, I'm going to solve all the monetary problems of the modern world.

Oh, you think that's funny? I'm being serious.

Alright, then. I'm going to try and solve them. Money is a concept, a product and a system that's been undergoing constant refinement since the dawn of civilisation; and, as the world's current financial woes are testament to, it's clear that we still haven't gotten it quite right. That's because getting financial systems right is hard. If it were easy, we'd have done it already.

I'm going to start with some background, discussing the basics such as: what is money, and where does it come from? What is credit? What's the history of money, and of credit? How do central banks operate? How do modern currencies attain value? And then I'm going to move on to the fun stuff: what can we do to improve the system? What's the next step in the ongoing evolution of money and finance?

Disclaimer: I am not an economist or a banker; I have no formal education in economics or finance; and I have no work experience in these fields. I'm just a regular bloke, who's been thinking about these big issues, and reading up on a lot of material, and who would like to share his understandings and his conclusions with the world.

Ancient history

Money has been around for a while. When I talk about money, I'm talking cash. The stuff that leaves a smell on your fingers. The stuff that jingles in your pockets. Cold hard cash.

The earliest known example of money dates back to the 7th century BC, when the Lydians minted coins using a natural gold-based alloy called electrum. They were a crude affair – with each coin being of a slightly different shape – but they evolved to become reasonably consistent in their weight in precious metal; and many of them also bore official seals or insignias.

Ancient silver Greek coins.
Ancient silver Greek coins.
Source: Ancient coins.

From Lydia, the phenomenon of minted precious-metal coinage spread: first to her immediate neighbours – the Greek and Persian empires – and then to the rest of the civilised world. By the time the Romans rose to significance, around the 3rd century BC, coinage had become the norm as a medium of exchange; and the Romans established this further with their standard-issue coins, most notably the Denarius, which were easily verifiable and reliable in their precious metal content.

Ten?! Are you trying to insult me?! Me, with a poor dying grandmother?! Ten?!
Ten?! Are you trying to insult me?! Me, with a poor dying grandmother?! Ten?!
Source: London Evening Standard. Quote: Life of Brian haggling scene.

Money, therefore, is nothing new. This should come as no surprise to you.

What may surprise you, however, is that credit existed before the arrival of money. How can that be? I hear you say. Isn't credit – the business of lending, and of recording and repaying a debt – a newer and more advanced concept than money? No! Quite the reverse. In fact, credit is the most fundamental concept of all in the realm of commerce; and historical evidence shows that it was actually established and refined, well before cold hard cash hit the scene. I'll elaborate further when I get on to definitions (next section). For now, just bear with me.

One of the earliest known historical examples of credit – in the form of what essentially amount to "IOU" documents – is from Ancient Babylonia:

… in ancient Babylonia … common commercial documents … are what are called "contract tablets" or "shuhati tablets" … These tablets, the oldest of which were in use from 2000 to 3000 years B. C. are of baked or sun-dried clay … The greater number are simple records of transactions in terms of "she," which is understood by archaeologists to be grain of some sort.

From the frequency with which these tablets have been met with, from the durability of the material of which they are made, from the care with which they were preserved in temples which are known to have served as banks, and more especially from the nature of the inscriptions, it may be judged that they correspond to the medieval tally and to the modern bill of exchange; that is to say, that they are simple acknowledgments of indebtedness given to the seller by the buyer in payment of a purchase, and that they were the common instrument of commerce.

But perhaps a still more convincing proof of their nature is to be found in the fact that some of the tablets are entirely enclosed in tight-fitting clay envelopes or "cases," as they are called, which have to be broken off before the tablet itself can be inspected … The particular significance of these "case tablets" lies in the fact that they were obviously not intended as mere records to remain in the possession of the debtor, but that they were signed and sealed documents, and were issued to the creditor, and no doubt passed from hand to hand like tallies and bills of exchange. When the debt was paid, we are told that it was customary to break the tablet.

We know, of course, hardly anything about the commerce of those far-off days, but what we do know is, that great commerce was carried on and that the transfer of credit from hand to hand and from place to place was as well known to the Babylonians as it is to us. We have the accounts of great merchant or banking firms taking part in state finance and state tax collection, just as the great Genoese and Florentine bankers did in the middle ages, and as our banks do to-day.

Source: What is Money?
Original source: The Banking Law Journal, May 1913, By A. Mitchell Innes.

As the source above mentions (and as it describes in further detail elsewhere), another historical example of credit – as opposed to money – is from medieval Europe, where the split tally stick was commonplace. In particular, in medieval England, the tally stick became a key financial instrument used for taxation and for managing the Crown accounts:

A tally stick is "a long wooden stick used as a receipt." When money was paid in, a stick was inscribed and marked with combinations of notches representing the sum of money paid, the size of the cut corresponding to the size of the sum. The stick was then split in two, the larger piece (the stock) going to the payer, and the smaller piece being kept by the payee. When the books were audited the official would have been able to produce the stick which exactly matched the tip, and the stick was then surrendered to the Exchequer.

Tallies provide the earliest form of bookkeeping. They were used in England by the Royal Exchequer from about the twelfth century onward. Since the notches for the sums were cut right through both pieces and since no stick splits in an even manner, the method was virtually foolproof against forgery. They were used by the sheriff to collect taxes and to remit them to the king. They were also used by private individuals and institutions, to register debts, record fines, collect rents, enter payments for services rendered, and so forth. By the thirteenth century, the financial market for tallies was sufficiently sophisticated that they could be bought, sold, or discounted.

Source: Tally sticks.

Thirteenth century English tally sticks.
Thirteenth century English tally sticks.
Source: The National Archives.

It should be noted that unlike the contract tablets of Babylonia (and the similar relics of other civilisations of that era), the medieval tally stick existed alongside an established metal-coin-based money system. The ancient tablets recorded payments made, or debts owed, in raw goods (e.g. "on this Tuesday, Bishbosh the Great received eight goats from Hammalduck", or "as of this Thursday, Kimtar owes five kwetzelgrams of silver and nine bushels of wheat to Washtawoo"). These societies may have, in reality, recorded most transactions in terms of precious metals (indeed, it's believed that the silver shekel emerged as the standard unit in ancient Mesopotamia); but these units had non-standard shapes and were unsigned, whereas classical coinage was uniform in shape, and possessed insignias.

In medieval England, the common currency was sterling silver, which consisted primarily of silver penny coins (but there were also silver shilling coins, and gold pound coins). The medieval tally sticks recorded payments made, or debts owed, in monetary value (e.g. "on this Monday, Lord Snottyham received one shilling and eight pence from James Yoohooson", or "as of this Wednesday, Lance Alot owes sixpence to Sir Robin").

Definitions

Enough history for now. Let's stop for a minute, and get some basic definitions clear.

First and foremost, the most basic question of all, but one that surprisingly few people have ever actually stopped to think about: what is money?

There are numerous answers:

Money is a medium of exchange.

Source: The Privateer - What is money?

Money itself … is useless until the moment we use it to purchase or invest in something. Although money feels as if it has an objective value, its worth is almost completely subjective.

Source: Forbes - Money! What is it good for?

As with other things, necessity is, indeed, the mother of invention. People needed a formula of stating the standard value of trade goods.

Thus, money was born.

Source: The Daily Bluster - Where did money come from, anyway?

The seller and the depositor alike receive a credit, the one on the official bank and the other direct on the government treasury. The effect is precisely the same in both cases. The coin, the paper certificates, the bank-notes and the credit on the books of the bank, are all identical in their nature, whatever the difference of form or of intrinsic value. A priceless gem or a worthless bit of paper may equally be a token of debt, so long as the receiver knows what it stands for and the giver acknowledges his obligation to take it back in payment of a debt due.

Money, then, is credit and nothing but credit. A's money is B's debt to him, and when B pays his debt, A's money disappears. This is the whole theory of money.

Source: What is Money?
Original source: The Banking Law Journal, May 1913, By A. Mitchell Innes.

For some, money is a substance in which one may bathe.
For some, money is a substance in which one may bathe.
Image source: DuckTales…Woo-ooo!

I think the first definition is the easiest to understand. Money is a medium of exchange: it has no value in and of itself; but it allows us to more easily exchange between ourselves, things that do have value.

I think the last definition, however, is the most honest. Money is credit: or, to be more correct, money is a type of credit; a credit that is expressed in a uniform, easily quantifiable / divisible / exchangeable unit of measure (as opposed to a credit that's expressed in goats, or in bushels of wheat).

(Note: the idea of money as credit, and of credit as debt, comes from the credit theory of money, which was primarily formulated by Innes (quoted above). This is just one theory of money. It's not the definitive theory of money. However, I tend to agree with the theory's tenets, and various parts of the rest of this article are founded on the theory. Also, it should not be confused with The Theory of Money and Credit, a book from the Austrian School of economics, which asserts that the only true money is commodity money, and which is thus pretty well the opposite extreme from the credit theory of money.)

Which brings us to the next definition: what is credit?

In the article giving the definition of "money as credit", it's also mentioned that "credit" and "debt" are effectively the same thing; just that the two words represent the two sides of a single relationship / transaction. So, then, perhaps it would make more sense to define what is debt:

Middle English dette: from Old French, based on Latin debitum 'something owed', past participle of debere 'owe'.

A debt is something that one owes; it is one's obligation to give something of value, in return for something that one received.

Conversely, a credit is the fact of one being owed something; it is a promise that one has from another person / entity, that one will be given something of value in the future.

So, then, if we put the two definitions together, we can conclude that: money is nothing more than a promise, from the person / entity who issued the money, that they will give something of value in the future, to the current holder of the money.

Perhaps the simplest to understand example of this, in the modern world, is the gift card typically offered by retailers. A gift card has no value itself: it's nothing more than a promise by the retailer, that they will give the holder of the card a shirt, or a DVD, or a kettle. When the card holder comes into the shop six months later, and says: "I'd like to buy that shirt with this gift card", what he/she really means is: "I have here a written promise from you folks, that you will give me a shirt; I am now collecting what was promised". Once the shirt has been received, the gift card is suddenly worthless, as the documented promise has been fulfilled; this is why, when the retailer reclaims the gift card, they usually just dispose of it.

However, there is one important thing to note: the only value of the gift card, is that it's a promise of being exchangeable for something else; and as long as that promise remains true, the gift card has value. In the case of a gift card, the promise ceases to be true the moment that you receive the shirt; the card itself returns to its original issuer (the retailer), and the story ends there.

Money works the same way, only with one important difference: it's a promise from the government, of being exchangeable for something else; and when you exchange that money with a retailer, in return for a shirt, the promise remains true; so the money still has value. As long as the money continues to be exchanged between regular citizens, the money is not returned to its original issuer, and so the story continues.

So, as with a gift card: the moment that money is returned to its original issuer (the government), that money is suddenly worthless, as the documented promise has been fulfilled. What do we usually return money to the government for? Taxes. What did the government originally promise us, by issuing money to us? That it would take care of us (it doesn't buy us flowers or send us Christmas cards very often; it demonstrates its caring for us mainly with other things, such as education and healthcare). What happens when we pay taxes? The government takes care of us for another year (it's supposed to, anyway). Therefore, the promise ceases to be true; and, believe it or not, the moment that the government reclaims the money in taxes, that money ceases to exist.
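To make that lifecycle concrete, here's a little sketch in Python (my own toy model, not anything official): money comes into existence when the issuer spends it into circulation; it keeps its value while it changes hands between regular citizens; and it's destroyed the moment it returns to its issuer, whether that's a gift card swapped for a shirt, or taxes paid to the government.

```python
# A toy model of money as a promise: created on issue,
# unaffected by circulation, extinguished on return to the issuer.

class Issuer:
    """A government (or a retailer issuing gift cards)."""

    def __init__(self):
        self.outstanding = 0  # promises currently in circulation

    def issue(self, amount):
        # New money comes into existence as a promise.
        self.outstanding += amount
        return amount

    def redeem(self, amount):
        # Money returned to its issuer (taxes, or a gift card
        # handed back for a shirt) ceases to exist.
        self.outstanding -= amount


government = Issuer()
wallet = government.issue(100)  # government spends 100 into existence
wallet -= 30                    # 30 paid to a shopkeeper: note that
                                # outstanding is unchanged -- trade between
                                # citizens doesn't destroy money
government.redeem(40)           # 40 returned as taxes: destroyed
print(government.outstanding)   # 60 promises remain in circulation
```

The key property the sketch captures is that only the round trip back to the issuer reduces the money supply; everyday commerce just shuffles the promises around.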

The main thing that a government promises, when it issues money, is that it will take care of its citizens; but that's not the only promise of money. Prior to quite recent times, money was based on gold: people used to give their gold to the government, and in return they received money; so, money was a promise that the government would give you back your gold, if you ever wanted to swap again.

In the modern economic system, the governments of the world no longer promise to give you gold (although most governments still have quite a lot of gold, in secret buildings with a lot of fancy locks and many armed guards). Instead, by issuing money these days, a government just promises that its money is worth as much as its economy is worth; this is why governments and citizens the world over are awfully concerned about having a "strong economy". However, what exactly defines "the economy" is rather complicated, and it only gets trickier with every passing year.

So, a very useful side effect of money – as opposed to gift cards – is that as long as the promise of money remains true (i.e. as long as the government keeps taking care of its people, and as long as the economy remains strong), regular people can use whatever money they have left-over (i.e. whatever money doesn't return to the government, at which point it ceases to exist), as a useful medium of exchange in regular day-to-day commerce. But remember: when you exchange your money for a kettle at the shop, this is what happens: at the end of the day, you have a kettle (something of value); and the shop has a promise from the government that it is entitled to something (presumably, something of value).

Recent history

Back to our history class. This time, more recent history. The modern monetary system could be said to have begun in 1694, when the Bank of England was founded. The impetus for establishing it should be familiar to all 21st-century readers: the government of England was deeply in debt; and the Bank was founded in order to acquire a loan of £1.2 million for the Crown. Over the subsequent centuries, it evolved to become the world's first central bank. Also, of great note, this marked the first time in history that a bank (rather than the king) was given the authority to mint new money.

The grand tradition of English banking: The Dawes, Tomes, Mousely, Grubbs, Fidelity Fiduciary Bank.
The grand tradition of English banking: The Dawes, Tomes, Mousely, Grubbs, Fidelity Fiduciary Bank.
Image source: Scene by scene Mary Poppins.

During the 18th and 19th centuries, and also well into the 20th century, the modern monetary system was based on the gold standard. Under this system, countries tied the value of their currency to gold, by guaranteeing to buy and sell gold at a fixed price. As a consequence, the value of a country's currency depended directly on the amount of gold reserves in its possession. Also, consequently, money at that time represented a promise, by the money's issuer, to give an exact quantity of gold to its current holder. This could be seen as a hangover from ancient and medieval times, when money was literally worth the weight of gold (or, more commonly, silver) of which the coins were composed (as discussed above).

During that same time period, the foundation currency – and by far the dominant currency – of the world monetary system was the British Pound. Britain was the world's strongest economy, the world's largest empire (and hence the world's largest trading bloc), and the world's most industrialised nation; and so all other currencies were valued relative to the Pound. The Pound became the reserve currency of choice for nations worldwide, and most international transactions were denominated in it.

In the aftermath of World War II, the Allies emerged victorious; but the Pound Sterling met its defeat at long last, at the hands of a new world currency: the US Dollar. Because the War had taken place in Europe (and Asia), the financial cost to the European Allied powers was crippling; North America, on the other hand, hadn't witnessed a single enemy soldier set foot on its soil, and so it was that, with the introduction of the Bretton Woods system in 1944, the Greenback rapidly and ruthlessly conquered the world.

The Dollars are Coming!
The Dollars are Coming!
Image source: The Guardian: Reel history.

Under the Bretton Woods system, the gold standard remained in place: the only real difference, was that gold was now spelled with a capital S with a line through it ($), instead of being spelled with a capital L with a line through it (£). The US Dollar replaced the Pound as the dominant world reserve currency and international transaction currency.

The gold standard finally came to an end when, in 1971, President Nixon ended the direct convertibility of US Dollars to gold. Since then, the USD has continued to reign supreme over all other currencies (although it's been increasingly facing competition). However, under the current system, there is no longer an "other currencies -> USD -> gold" pecking order. Theoretically, all currencies are now created equal; and gold is now just one more commodity on the world market, rather than "the shiny stuff that gives money value".

Since the end of Bretton Woods, the world's major currencies exist in a floating exchange rate regime. This means that the only way to measure a given currency's value, is by determining what quantity of another given currency it's worth. Instead of being tied to the value of a real-life object (such as gold), the value of a currency just "floats" up and down, depending on the fluctuations in that country's economy, and depending on the fluctuations in people's relative perceptions of its value.

What we have now

The modern monetary system is a complex beast, but at its heart it consists of three players.

The mythical Hydra, a multi-headed monster. Grandaddy of the modern monetary system, perhaps?
The mythical Hydra, a multi-headed monster. Grandaddy of the modern monetary system, perhaps?
Image source: HydraVM.

First, there are the governments of the world. In most countries, there's a department that "represents" the government as a whole, within the monetary system: this is usually called the "Treasury"; it may also be called the Ministry of Finance, among other names. Contrary to what you might think, Treasury does not bring new money into existence (even though Treasury usually governs a country's mint, and thus Treasury is the manufacturer of new physical money).

As discussed in definitions (above), in a "pure" system, money comes into existence when the government issues it (as a promise), and money ceases to exist when the government takes it back (in return for fulfilling a promise). However, in the modern system, the job of bringing new money into existence has been delegated; therefore, money does not cease to exist, the moment that it returns to the government (i.e. the "un-creation" of money has also been delegated).

This delegation allows the government itself to function like any other individual or entity within the system. That is, the government has an "account balance", it receives monetary income (via taxation), it spends money (via its budget program), and it can either be "in the green" or "in the red" (with a strong tendency towards the latter). Thus, the government itself doesn't have to worry too much about the really complicated parts of the modern monetary system; and instead, it can just get on with the job of running the country. The government can also borrow money, to supplement what it receives from taxation; and it can lend money, in addition to its regular spending.

Second, there are these things called "central banks" (also known as "reserve banks", among other names). In a nutshell: the central bank is the entity to which all that stuff I just mentioned gets delegated. The central bank brings new money into existence – officially on behalf of the government; but since the government is usually highly restricted from interfering with the central bank's operation, this is a half-truth at best. It creates new money in a variety of ways. One way – which in practice is usually responsible for only a small fraction of overall money creation, but which I believe is worth focusing on nonetheless – is by buying government (i.e. Treasury) bonds.

Just what is a bond? (Seems we're not yet done with definitions, after all.) A bond is a type of debt (or a type of credit, depending on your perspective). A lends money to B, and in return, B gives A bonds. The bonds are a promise that the debt will be repaid, according to various terms (time period, interest payable, etc). So, bonds themselves have no value: they're just a promise that the holder of the bonds will receive something of value, at some point in the future. In the case of government bonds, the bonds are a promise that the government will provide something of value to their current holder.

But, hang on… isn't that also what money is? A promise that the government will provide something of value to the current holder of the money? So, let me get this straight: the Treasury writes a document (bonds) saying "The government (on behalf of the Treasury) promises to give the holder of this document something of value", and gives it to the central bank; and in return, the central bank writes a document (money) also saying "The government (on behalf of the central bank) promises to give the holder of this document something of value", and gives it to the Treasury; and at the end of the day, the government has more money? Or, in other words (no less tangled): the government lends itself money, and money is also itself a government loan? Ummm… WTF?!

Glad I'm not the only one that sees a slight problem here.
Glad I'm not the only one that sees a slight problem here.
Image source: Lol Zone.

Third, there are the commercial banks. The main role of these (private) companies is to safeguard the deposits of, and provide loans to, the general public. The main (original) source of commercial banks' money, is from the deposits of their customers. However, thanks to the practice of fractional reserve banking that's prevalent in the modern monetary system, commercial banks are also responsible for about 95% of the money creation that occurs today; almost all of this private-bank-created money is interest (and principal) from loans. So, yes: money is created out of thin air; and, yes, the majority of money is not created by the government (either on behalf of Treasury or the central bank), but by commercial banks. No surprise, then, that about 97% of the world's money exists only electronically in commercial bank accounts (with physical cash making up the other 3%).
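The mechanics behind that kind of figure can be sketched with a toy calculation (the reserve ratio and dollar amounts below are illustrative only, not real-world values): each bank keeps a fraction of its deposits in reserve and lends out the rest; the loaned money gets spent, re-deposited at some bank, partially lent out again, and so on.

```python
# A toy sketch of fractional reserve money creation: a single base
# deposit is lent and re-deposited repeatedly, with each bank holding
# back a fixed reserve fraction.

def total_deposits(initial_deposit, reserve_ratio, rounds=100):
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # lent out, then re-deposited
    return total

# With a 10% reserve ratio, $100 of base money ends up supporting
# close to $1,000 of deposits -- a "money multiplier" of
# 1 / reserve_ratio.
print(round(total_deposits(100, 0.10)))
```

So almost all of the eventual money supply in this sketch exists only as entries in bank ledgers, which is consistent with the picture of most money existing electronically rather than as cash.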

This presents another interesting conundrum: all money supposedly comes from the government, and is supposedly a promise from the government that they will provide something of value; but in today's reality, most of our money wasn't created by the government, it was created by commercial banks! So, then: if I have $100 in my bank account, does that money represent a promise from the government, or a promise from the commercial banks? And if it's a promise from the commercial banks… what are they promising? Beats me. As far as I know, commercial banks don't promise to take care of society; they don't promise to exchange money for gold; I suppose the only possibility is that, much as the government promises that money is worth as much as the nation's economy is worth, commercial banks promise that money is worth as much as they are worth.

And what are commercial banks worth? A lot of money (and not much else), I suppose… which starts taking us round in circles.

I should also mention here, that the central banks' favourite and most oft-used tool in controlling the creation of money, is not the buying or selling of bonds; it's something else that we hear about all the time in the news: the raising or lowering of official interest rates. Now that I've discussed how 95% of money creation occurs via the creation of loans and interest within commercial banks, it should be clear why interest rates are given such importance by government and by the media. The central bank only sets the "official" interest rate, which is merely a guide for commercial banks to follow; but in practice, commercial banks adjust their actual interest rates to closely match the official one. So, in case you had any lingering doubts: the central banks and the commercial banks are, of course, all "in on it" together.

Oh yeah, I almost forgot… and then there are regular people. Just trying to eke out a living, doing whatever's necessary to bring home the dough, and in general trying to enjoy life, despite the best efforts of the multi-headed beast mentioned above. But they're not so important; in fact, they hardly count at all.

In summary: today's system is very big and complex, but for the most part it works. Somehow. Sort of.

Broken promises

In case you haven't worked it out yet: money is debt; debt is credit; and credit is promises.

Bankers, like politicians, are big on promises. In fact, bankers are full of promises (by definition, since they're full of money). And, also like politicians, bankers are good at breaking promises.

Or, to phrase it more accurately: bankers are good at convincing you to make promises (i.e. to take out a loan); and they're good at promising you that you'll have no problem in not breaking your promises (i.e. in paying back the loan); and they're good at promising you that making and not breaking your promises will be really worthwhile for you (i.e. you'll get a return on your loan); and (their favourite part) they're exceedingly good at holding you to your promises, and at taking you to the dry cleaners in the event that you are truly unable to fulfil your promises.

Since money is debt, and since money makes the world go round, the fact that the world is full of debt really shouldn't make anyone raise an eyebrow. What this really means, is that the world is full of promises. This isn't necessarily a bad thing, assuming that the promises being made are fair. In general, however, they are grossly unfair.

Let's take a typical business loan as an example. Let's say that Norbert wants to open a biscuit shop. He doesn't have enough money to get started, so he asks the bank for a loan. The bank lends Norbert a sum of money, with a total repayment over 10 years of double the value of the sum being lent (as is the norm). Norbert uses the money to buy a cash register, biscuit tins, and biscuits, and to rent a suitable shop venue.

There are two possibilities for Norbert. First, he generates sufficient business selling biscuits to pay off the loan (which includes rewarding the bank with interest payments that are worth as much as it cost him to start the business), and he goes on selling biscuits happily ever after. Second, he fails to bring in enough revenue from the biscuit enterprise to pay off the loan, in which case the bank seizes all of his business-related assets, and he's left with nothing. If he's lucky, Norbert can go back to his old job as a biscuit-shop sales assistant.

What did Norbert input, in order to get the business started? All his time and energy, for a sustained period. What was the real cost of this input? Very high: Norbert's time and energy are tangible assets, which he could have invested elsewhere had he chosen (e.g. in building a giant Lego elephant). And what is the risk to Norbert? Very high: if business goes bad (and the biscuit market can get volatile at times), he loses everything.

What did the bank input, in order to get the business started? Money. What was the real cost of this input? Nothing: the bank pulled the money out of thin air in order to lend it to Norbert; apart from some administrative procedures, the bank effectively spent nothing. And what is the risk to the bank? None: if business goes well, they get back double the money that they lent Norbert (which was fabricated the moment that the loan was approved anyway); if business goes bad, they seize all Norbert's business-related assets (biscuit tins and biscuits are tangible assets), and as for the money… well, they just fabricated it in the first place anyway, didn't they?
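To put some rough numbers on Norbert's deal: the "repay double over 10 years" figure implies a particular interest rate. Here's a minimal sketch of that arithmetic (the doubling figure is this article's illustration, and the assumption of equal monthly repayments on a standard amortising loan is an added simplification, not a statement about real loan terms):

```python
# What annual interest rate makes total repayments on a 10-year loan
# come to double the principal, as in Norbert's example?
# Illustrative only; assumes equal monthly repayments on a standard
# amortising loan.

def monthly_payment(principal, annual_rate, years):
    """Standard annuity payment for an amortising loan."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

def implied_rate(total_multiple=2.0, years=10, lo=0.0001, hi=1.0):
    """Bisect for the annual rate at which total repayments
    reach total_multiple times the principal."""
    principal = 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        total = monthly_payment(principal, mid, years) * years * 12
        if total < total_multiple * principal:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

rate = implied_rate()
print(f"Implied annual interest rate: {rate:.1%}")
```

Running this gives an implied rate of roughly 16% per annum, which is why "paying double" over a decade is such a handsome reward for the lender.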

Broke(n) nations

One theme that I haven't touched on specifically so far, is the foreign currency exchange system. However, I've already explained that money is worth as much as a nation's economy is worth; so, logically, the stronger a nation's economy is, the more that nation's money is worth. This is the essence of foreign currency exchange mechanics. Here's a formula that I just invented, but that I believe is reasonably accurate, for determining the exchange rate r between two given currencies a and b:

My unofficial exchange rate formula.
That's: r_a:b = (s_a ÷ q_a) : (q_b ÷ s_b)

Where s_x is the strength of the given economy, and q_x is the quantity of the given currency in existence.

So, for example, say we want to determine the exchange rate of US Dollars to Molvanîan Strubls. Let's assume that the US economy is worth "1,000,000" (which is good), and that there are 1,000,000 US Dollars (a) in existence; and let's assume that the Molvanîan economy is worth "100" (which is not so good), and that there are 1,000,000,000 Molvanîan Strubls (b) in existence. Substituting values into the formula, we get:

r_a:b = (1,000,000 ÷ 1,000,000 USD) : (1,000,000,000 Strubls ÷ 100)

r_a:b = 1 USD : 10,000,000 Strubls
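For the curious, the worked example above can be transcribed directly into code. This sketch implements the article's invented formula as stated, normalising the ratio to one unit of currency a (the figures, like the Molvanîan Strubl itself, are fictional):

```python
# Transcription of the article's unofficial exchange rate formula,
# r_a:b = (s_a / q_a) : (q_b / s_b), where s is the strength of an
# economy and q the quantity of its currency in existence.

def exchange_rate(s_a, q_a, s_b, q_b):
    """Return how many units of currency b trade for one unit of
    currency a, per the formula above."""
    left = s_a / q_a    # value per unit of currency a
    right = q_b / s_b   # units of currency b per unit of value
    return right / left  # normalise so the left side is 1 unit of a

# US: economy "worth" 1,000,000; 1,000,000 USD in existence.
# Molvanîa: economy "worth" 100; 1,000,000,000 Strubls in existence.
rate = exchange_rate(1_000_000, 1_000_000, 100, 1_000_000_000)
print(f"1 USD = {rate:,.0f} Strubls")  # 1 USD = 10,000,000 Strubls
```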

This, in my opinion, should be sufficient demonstration of why the currencies of strong economies have value, and why people the world over like getting their hands dirty with them; and why the currencies of weak economies lack value, and why their only practical use is for cleaning certain dirty orifices of one's body.

Or, for a real-world example of a currency worth less than its weight in toilet paper, see Zimbabwe.
Image source: praag.org.

Now, getting back to the topic of lending money. Above, I discussed how banks lend money to individuals. As it turns out, banks also lend money to foreign countries. Whether the lender is a commercial bank, a central bank, or an international bank (such as the IMF) doesn't matter in this context; nor does it matter whether the borrower is a foreign individual, a foreign company, or a foreign government. The point is: there are folks whose local currency isn't accepted worldwide (if it's even accepted locally), and who need to purchase goods and services from the world market; and so, these folks ask for a loan from banks elsewhere, who are able to lend them money in a strong currency.

The example scenario that I described above (Norbert), applies equally here. Only this time, Norbert is a group of people from a developing country (let's call them The Morbert Group), and the bank is a corporation from a developed country. As in Norbert's case, The Morbert Group input a lot of time and effort to start a new business; and the bank input money that it pulled out of thin air. And, as in Norbert's case, The Morbert Group has a high risk of losing everything, and at the very least is required to pay an exorbitant amount of interest on its loan; whereas the bank has virtually no risk of losing anything, as it's a case of "the house always wins".

So, the injustice of grossly unfair and oft-broken promises between banks and society doesn't just occur within a single national economy, it occurs on a worldwide scale within today's globalised economy. Yes, the bank is the house; and yes (aside from a few hiccups), the house just keeps winning and winning. This is how, in the modern monetary system, a nation's rich people keep getting richer while its poor people keep getting poorer; and it's how the world's rich countries keep getting richer, while the poor countries keep getting poorer.

Serious problems

Don't ask "how did it come to this?" I've just spent a large number of words explaining how it's come to this (see everything above). Face the facts: it has come to this. The modern monetary system has some very serious problems. Here's my summary of what I think those problems are:

  • Currency inequality promotes poverty. In my opinion, this is the worst and the most disgraceful of all our problems. A currency is only worth as much as its nation's economy is worth. This is wrong. It means that the people who are issued that currency, participate in the global economy with a giant handicap. It's not their fault that they were born in a country with a weaker economy. Instead, a currency should be worth as much as its nation's people are worth. And all people in all countries are "worth" the same amount (or, at least, they should be).
  • Governments manipulate currency for their own purposes. All (widely-accepted) currency in the world today is issued by governments, and is controlled by the world's central banks. While many argue that the tools used to manipulate the value of currency – such as adjusting interest rates, and trading in bonds – are "essential" for "stabilising" the economy, it's clear that very often, governments and/or banks abuse these tools in the pursuit of more questionable goals. Governments and central banks (particularly those in "strong" countries, such as the US) shouldn't have the level of control that they do over the global financial system.
  • Almost all new money is created by commercial banks. The creation of new money should not be entrusted to a handful of privileged companies around the world. These companies continue to simply amass ever-greater quantities of money, further promoting poverty and injustice. Money creation should be more fairly distributed between all individuals and between all nations.
  • It's no longer clear what gives money its value. In the olden days, money was "backed" by gold. Under the current system, money is supposedly backed by the value of the issuing country's economy. However, the majority of new money today is created by commercial banks, so it's unclear if that's really true or not. Perhaps a new definition of what "backs" money is needed?

Now, at long last – after much discussion of promises made and promises broken – it's time to fulfil the promise that I made at the start of this article. Time to solve all the monetary problems of the modern world!

Possible solutions

One alternative to the modern monetary system, and its fiat money roots (i.e. money "backed by nothing"), is a return to the gold standard. This is actually one of the more popular alternatives, with many arguing that it worked for thousands of years, and that it's only for the past 40-odd years (i.e. since the Nixon Shock in 1971) that we've been experimenting with the current (broken) system.

This is a very conservative argument. The advocates of "bringing back the gold standard" are heavily criticised by the wider community of economists, for failing to address the issues that caused the gold standard to be dropped in the first place. In particular, the critics point out that the modern world economy has been growing much faster than the world's supply of gold has been growing, and that there literally isn't enough physical gold available, for it to serve as the foundation of the modern monetary system.

Personally, I take the critics' side: the gold standard worked up until modern times; but gold is a finite resource, and it has certain physical characteristics that limit its practical use (e.g. it's quite heavy, it's not easily divisible into sufficiently small parts, etc). Gold will always be a valuable commodity – and, as the current economic crisis shows, people will always turn to gold when they lose confidence in even the most stable of regular currencies – but its days as the foundation of currency were terminated for a reason, and so I don't think it's altogether bad that we relegate the gold standard to the annals of history.

How about getting rid of money altogether? For virtually as long as money has existed, it's been often labelled "the root of all evil". The most obvious solution to the world's money problems, therefore, is one that's commonly proposed: "let's just eliminate money." This has been the cry of hippies, of communists, of utopianists, of futurists, and of many others.

Imagine no possessions... I wonder if you can.
Image source: Etsy.

Unfortunately, the most prominent example so far in world history of (effectively) eliminating money – 20th century communism – was also an economic disaster. In the Soviet Union, although there was money, the price of all basic goods and services was fixed, and everything was centrally distributed; so money was, in effect, little more than a rationing token. Hence the famous Russian joke: "We pretend to work, and they pretend to pay us".

Utopian science fiction is also rife with examples of a future without money. The best-known and best-developed example is Star Trek (an example with which I'm also personally well-acquainted). In the Star Trek universe, where virtually all of humanity's basic needs (i.e. food, clothing, shelter, education, medicine) are provided for in limitless supply by modern technology, "the economics of the future are somewhat different". As Captain Picard says in First Contact: "The acquisition of wealth is no longer the driving force in our lives. We work to better ourselves and the rest of humanity." This is a great idea in principle; but Star Trek also fails to address the practical issues of such a system, any better than contemporary communist theory does.

Star Trek IV: "They're still using money. We need to get some."
Image source: Moar Powah.

While I'm strongly of the opinion that our current monetary system needs reform, I don't think that abolishing the use of money is: (a) practical (assuming that we want trade and market systems to continue existing in some form); or (b) going to actually address the issues of inequality, corruption, and systemic instability that we'd all like to see improved. Abolishing money altogether is not practical, because we do require some medium of exchange in order for the civilised world (which has always been built on trade) to function; and it's not going to address the core issues, because money is not the root of all evil, money is just a tool which can be used for either good or bad purposes (the same as a hammer can be used to build a house or to knock someone on the head – the hammer itself is "neutral"). The problem is not money; the problem is greed.

For a very different sci-fi take on the future of money, check out the movie In Time (2011). In this dystopian work, there is a new worldwide currency: time. Every human being is born with a "biological watch", that shows on his/her forearm how much time he/she has left to live. People can earn time, trade with time, steal time, donate time, and store time (in time banks). If you "time out" (i.e. run out of time), you die instantly.

In Time: you're only worth as many seconds as you have left to live.
Image source: MyMovie Critic.

The monetary system presented by In Time is interesting, because it's actually very stable (i.e. the value of "time" is very clear, and time as a currency is quite resilient to inflation / deflation, speculation, etc), and it's a currency that's "backed" by a real commodity (i.e. time left alive; commodities don't get much more vital). However, the system also has gross potential for inequality and corruption – and indeed, in the movie, it's clearly demonstrated that everyone could live indefinitely if the banks just kept issuing infinite quantities of time; but instead, time is meagrely rationed out by the rich and powerful elite (who can create more time out of thin air whenever they want, much as today's elite do with money), in order to enforce a status quo upon the impoverished masses.

One of the most concerted efforts that has been made in recent times, to disrupt (and potentially revolutionise) the contemporary monetary system, is the much-publicised Bitcoin project. Bitcoin is a virtual currency, which isn't issued or backed by any national government (or by any official organisation at all, for that matter), but which is engineered to mimic many of the key characteristics of gold. In particular, there's a finite supply of Bitcoins; and new Bitcoins can only be created by "mining" them.

Bitcoin makes no secret of the fact that it aims to become a new global currency, and to bring about the demise of traditional government-issued currency. As I've already stated here, I'm in favour of replacing the current world currencies; and I applaud Bitcoin's pioneering endeavours to do this. Bitcoin sports the key property that I think any contender for the "brave new world of money" would need: it's not generated by central banks, nor by any other traditional contemporary authority. However, there are a number of serious flaws in the Bitcoin model, which (in my opinion) mean that Bitcoin cannot and (more importantly) should not ever achieve this goal.

Most importantly, Bitcoin fails to adequately address the issue of "money creation should be fairly distributed between all". In the Bitcoin model, money creation is in the hands of those who succeed in "mining" new Bitcoins; and "mining" Bitcoins consists of solving computationally expensive cryptographic calculations, using the most powerful computer hardware possible. So, much as Bitcoin shares many of gold's advantages, it also shares many of its flaws. Much as gold mining unfairly favours those who discover the gold-hills first, and thereafter favours those with the biggest drills and the most grunt; so too does Bitcoin unfairly favour those who knew about Bitcoin from the start, and thereafter favour those with the beefiest and best-engineered hardware.

Mining: a dirty business that rewards the boys with the biggest toys.
Image source: adelaidenow.

Bitcoin also fails to address the issue of "what gives money its value". In fact, "what gives Bitcoin its value" is even less clear than "what gives contemporary fiat money its value". What "backs" Bitcoin? Not gold. Not any banks. Not any governments or economies. Supposedly, Bitcoin "is" the virtual equivalent of gold; but then again (as others have stated), I'll believe that the day I'm shown how to convert digital Bitcoins into physical metal chunks that are measured in Troy ounces. It's also not clear if Bitcoin is a commodity or a currency (or both, or neither); and if it's a commodity, it's not completely clear how it would succeed as the foundation of the world monetary system, where gold failed.

Plus, assuming that Bitcoin is the virtual equivalent of gold, the fact that it's virtual (i.e. technology-dependent for its very existence) is itself a massive disadvantage, compared to a physical commodity. What happens if the Internet goes down? What happens if there's a power failure? What happens if the world runs out of computer hardware? Bye-bye Bitcoin. What happens to gold (or physical fiat money) in any of these cases? Nothing.

Additionally, there's also significant doubt and uncertainty over the credibility of Bitcoin, meaning that it fails to address the issue of "manipulation of currency [by its issuers] for their own purposes". In particular, many have accused Bitcoin of being a giant scam in the form of a Ponzi scheme, which will ultimately crash and burn, but not before the system's founders and earliest adopters "jump ship" and take a fortune with them. The fact that Bitcoin's inventor goes by the fake name "Satoshi Nakamoto", and has disappeared from the Bitcoin community (and kept his true identity a complete mystery) ever since, hardly enhances Bitcoin's reputation.

This article is not about Bitcoin; I'm just presenting Bitcoin here, as one of the recently-proposed solutions to the problems of the world monetary system. I've heavily criticised Bitcoin here, to the point that I've claimed it's not suitable as the foundation of a new world monetary system. However, let me emphasise that I also really admire the positive characteristics of Bitcoin, which are numerous; and I hope that one day, a newer incarnation is born that borrows these positive characteristics of Bitcoin, while also addressing Bitcoin's flaws (and we owe our thanks to Bitcoin's creator(s), for leaving us an open-source system that's unencumbered by copyright, patents, etc). Indeed, I'd say that just as non-virtual money has undergone numerous evolutions throughout history (not necessarily with each new evolution being "better" than its predecessors); so too will virtual currency undergo numerous evolutions (hopefully with each new evolution being "better"). Bitcoin is only the beginning.

My humble proposal

The solution that I'd like to propose, is a hybrid of various properties of what's been explored already. However, the fundamental tenet of my solution, is something that I haven't discussed at all so far, and it is as follows:

Every human being in the world automatically receives an "allowance", all the time, all their life. This "allowance" could be thought of as a "global minimum wage"; although everyone receives it regardless of, and separate to, their income from work and investments. The allowance could be received once a second, or once a day, or once a month – doesn't really matter; I guess that's more a practical question of the trade-off in: "the more frequent the allowance, the more overhead involved; the less frequent the allowance, the less accurate the system is."

Ideally, the introduction of this allowance would be accompanied by the introduction of a new currency; and this allowance would be the only permitted manner in which new units of the currency are brought into existence. That is, new units of the currency cannot be generated ad lib by central banks or by any other organisation (and it would be literally impossible to circumvent this, a la Bitcoin, thus making the currency a commodity rather than a fiat entity). However, a new currency is not the essential idea – the global allowance per person is the core – and it could be done with one or more existing currencies, although this would obviously have disadvantages.
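To make that tenet concrete, here's a toy sketch of how such a currency's ledger might behave (all names and amounts are hypothetical, and it glosses over every hard practical question raised below): the uniform per-person allowance is the sole mint operation, and transfers merely move existing units around.

```python
# Toy model of the proposal: the ONLY way new units of the currency
# come into existence is a uniform per-person allowance, issued each
# period. Names and amounts are hypothetical.

class AllowanceLedger:
    ALLOWANCE_PER_PERIOD = 10  # hypothetical units minted per person, per period

    def __init__(self):
        self.balances = {}

    def register(self, person):
        self.balances.setdefault(person, 0)

    def issue_allowance(self):
        """The sole money-creation mechanism: every registered person
        is credited the same allowance; no bank or state can mint."""
        for person in self.balances:
            self.balances[person] += self.ALLOWANCE_PER_PERIOD

    def transfer(self, sender, receiver, amount):
        """Transfers move existing units; they never create new ones."""
        if self.balances[sender] < amount:
            raise ValueError("insufficient funds")
        self.balances[sender] -= amount
        self.balances[receiver] += amount

    def money_supply(self):
        return sum(self.balances.values())

ledger = AllowanceLedger()
for name in ("Norbert", "Molvanîan citizen"):
    ledger.register(name)
ledger.issue_allowance()                          # supply grows only with people × periods
ledger.transfer("Norbert", "Molvanîan citizen", 5)  # trade redistributes, never creates
print(ledger.money_supply())                      # 20: 2 people × 1 period × 10 units
```

Note the design choice this illustrates: the total money supply is a pure function of population and elapsed time, which is exactly the property that no fiat or Bitcoin-style system has.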

The new currency for distributing the allowance would also ideally exist primarily in digital form. It would be great if, unlike Bitcoin and its contemporaries, the currency could also exist in a physical commodity form, with an easy way of transforming the currency between digital and physical form, and vice versa. This would require technology that doesn't currently exist – or, at the least, some very clever engineering with the use of current technology – and is more "wishful thinking" at this stage. Additionally, the currency could also exist as an "account balance" genetically / biologically stored within each person, much like in the movie In Time; except that you don't die if you run out of money (you just ain't got no money). However, all of this is non-essential bells and whistles, supplementing my core proposal.

There are a number of other implementation details, that I don't think particularly need to all be addressed at the conceptual level, but that would be significant at the practical level. For example: should the currency be completely "tamper-proof", or should there be some new international body that could modify various parameters (e.g. change the amount of the allowance)? And should the allowance be exactly the same for everyone, or should it vary according to age, physical location, etc? Personally, I'd opt for a completely "tamper-proof" currency, and for a completely standard allowance; but other opinions may differ.

Taxation would operate in much the same way as it does now (i.e. a government's primary source of revenue, would be taxing the income of its citizens); however, the wealth difference between countries would reduce significantly, because at a minimum, every country would receive revenue purely based on its population.

A global allowance (issued in the form of a global currency), doesn't necessarily mean a global government (although the two would certainly function much better together). It also doesn't necessarily mean the end of national currencies; although national currencies would probably, in the long run, struggle to compete for value / relevance with a successful global currency, and would die out.

If there's a global currency, and a global allowance for everyone on the planet, but still individual national governments (some of which would be much poorer and less developed than others), then taxation would still be at the discretion of each nation. Quite possibly, all nations would end up taxing 100% of the allowance that their citizens receive (and corrupt third-world nations would definitely do this); in which case it would not actually be an allowance for individuals, but just a way of enforcing more economic equality between countries based on population.

However, this doesn't necessarily make the whole scheme pointless. If a developing country receives the same revenue from its population's global allowance, as a developed country does (due to similar population sizes), then: the developing country would be able to compete more fairly in world trade; it would be able to attract more investment; and it wouldn't have to ask for loans and to be indebted to wealthier countries.

So, with such a system, the generated currency wouldn't be backed by anything (no more than Bitcoin is backed by anything) – but it wouldn't be fiat, either; it would be a commodity. In effect, people would be the underlying commodity. This is a radical new approach to money. It would also have potential for corruption (e.g. it could lead to countries kidnapping / enslaving each others' populations, in order to steal the commodity value of another country). However, appropriate practical safeguards in the measuring of a country's population, and in the actual distribution of new units of the currency, should be able to prevent this.

It's not absolutely necessary that a new global currency be created, in order to implement the "money for all countries based on population" idea: all countries could just be authorised to mint / receive a quantity of an existing major currency (e.g. USD or EUR) proportional to their populations. However, this would be more prone to corruption, and would lack the other advantages of a new global currency (i.e. not backed by any one country, not produced by any central bank).

It has been suggested that a population-based currency is doomed to failure:

So here is were [sic] you need to specify. What happens at round two?

If you do nothing things would go back to how they are now. The rich countries would have the biggest supply of this universal currency (and the most buying power) and the poor countries would loose [sic] their buying power after this one explosive buying spree.

And things would go back to exactly as they are now- no disaster-but no improvemen [sic] to anything.

But if youre [sic] proposing to maintain this condition of equal per capita money supply for every nation then you have to rectify the tendency of the universal currency to flow back to the high productivity countries. Buck the trend of the market somehow.

Dont [sic] know how you would do it. And if you were able to do it I would think that it would cause severe havoc to the world economy.It would amount to very severe global income redistribution.

The poor countries could use this lopsided monetary system to their long term advantage by abstaining from buying consumer goods and going full throttle into buying capital goods from industrialized world to industrialize their own countries. In the long term that would be good for them and for the rich countries as well.

So it could be viewed as a radical form of foreign aid. But its [sic] a little too radical.

Source: Wrong Planet.

Alright: so once you get to "round two" in this system, the developed countries once again have more money than the developing countries. However, that isn't any worse than what we've got at the moment.

And, alright: so this system would effectively amount to little more than a radical form of foreign aid. But why "too radical"? In my opinion, the way the world currently manages foreign aid is not working (as evidenced by the fact that the gap between the world's richest and poorest nations is increasing, not decreasing). The world needs a radical form of foreign aid. So, in my opinion: if the net result of a global currency whose creation and distribution is tied to national populations, is a radical form of foreign aid; then surely that would be a good system!

Conclusion

So, there you have it. A monster of an article examining the entire history of money, exploring the many problems with the current world monetary system, and proposing a humble solution, which isn't necessarily a good solution (in fact, it isn't even necessarily better than, or as good as, the other possible solutions that are presented here), but which at least takes a shot at tackling this age-old dilemma. Money: can't live with it; can't live without it.

Money has already been a work-in-progress for about 5,000 years; and I'm glad to see that at this very moment, efforts are being actively made to continue refining that work-in-progress. I think that, regardless of what theory of money one subscribes to (e.g. money as credit, money as a commodity, etc), one could describe money as being the "grease" in the global trade machine, and actual goods and services as being the "cogs and gears" in the machine. That is: money isn't the machine itself, and the machine itself is more important than money; but then again, the machine doesn't function without money; and the better the money works, the better the machine works.

So, considering that trade is the foundation of our civilised existence… let's keep refining money. There's still plenty of room for improvement.

]]>
How compatible are the world's major religions? 2012-10-17T00:00:00Z 2012-10-17T00:00:00Z Jaza https://greenash.net.au/thoughts/2012/10/how-compatible-are-the-worlds-major-religions/ There are a tonne of resources around that compare the world's major religions, highlighting the differences between each. There are some good comparisons of Eastern vs Western religions, and also numerous comparisons of Christianity vs non-Christianity.

However, I wasn't able to find any articles that specifically investigate the compatibility between the world's major religions. The areas where different religions are "on the same page", and are able to understand each other and (in the better cases) to respect each other; vs the areas where they're on a different wavelength, and where a poor capacity for dialogue is a potential cause for conflict.

I have, therefore, taken the liberty of penning such an analysis myself. What follows is a very humble list of aspects in which the world's major religions are compatible, vs aspects in which they are incompatible.

Compatible:

  • Divinity (usually although not universally manifested by the concept of a G-d or G-ds; this is generally a religion's core belief)
  • Sanctity (various events, objects, places, and people are considered sacred by the religion)
  • Community (the religion is practiced by more than one person; the religion's members assemble in order to perform significant tasks together; the religion has the fundamental properties of a community – i.e. a start date, a founder or founders, a name / label, a size as measured by membership, etc)
  • Personal communication with the divine and/or personal expression of spirituality (almost universally manifested in the acts of prayer and/or meditation)
  • Stories (mythology, stories of the religion's origins / founding, parables, etc)
  • Membership and initiation (i.e. a definition of "who is a member" of the religion, and defined methods of obtaining membership – e.g. by birth, by initiation ritual, by force)
  • Death rites (handling of dead bodies – e.g. burial, cremation; mourning rituals; belief in / position regarding one's fate following death)
  • Material expression, often (although not always) involving symbolism (e.g. characteristic clothing, music, architecture, and artwork)
  • Ethical guidance (in the form of books, oral wisdom, fundamental precepts, laws, codes of conduct, etc – although it should also be noted that religion and ethics are two different concepts)
  • Social guidance (marriage and family; celebration of festivities and special occasions; political views; behaviour towards various societal groups e.g. children, elders, community leaders, disadvantaged persons, members of other religions)
  • Right and wrong, in terms of actions and/or thoughts (i.e. definition of "good deeds", and of "sins"; although the many connotations of sin – e.g. punishment, divine judgment, consequences in the afterlife – are not universal)
  • Common purpose (although it's impossible to definitively state what religion's purpose is – e.g. religion provides hope; "religion's purpose is to provide a sense of purpose"; religion provides access to the spiritual and the divine; religion exists to facilitate love and compassion – also plenty of sceptical opinions, e.g. religion is the "opium of the masses"; religion is superstition and dogma for fools)
  • Explanation of the unknown (religion provides answers where reason and science cannot – e.g. creation, afterlife)

Incompatible:

  • The nature of divinity (one G-d vs many G-ds; G-d-like personification of divinity vs more abstract concept of a divine force / divine plane of existence; infinite vs constrained extent of divine power)
  • Acknowledgement of other religions (not all religions even acknowledge the existence of others; of those that do, many refuse to acknowledge their validity; and of those that acknowledge validity, most consider other religions as "inferior")
  • Tolerance of other religions (while some religions encourage harmony with the rest of the world, other religions promote various degrees of intolerance – e.g. holy war, forced conversion, socio-economic discrimination)
  • Community structure (religious communities range from strict bureaucratic hierarchies, to unstructured liberal movements, with every possible shade of grey in between)
  • What has a "soul" (all objects in the universe, from rocks upward, have a soul; vs only living organisms have a soul; vs only humans have a soul; vs there is no such thing as a soul)
  • Afterlife (re-incarnation vs eternal afterlife vs soul dies with body; consequences, if any, of behaviour in life on what happens after death)
  • Acceptable social norms (monogamous vs polygamous marriage; fidelity vs open relationships; punishment vs leniency towards children; types of prohibited relationships)
  • Form of rules (strict laws with strict punishments; vs only general guidelines / principles)
  • Ethical stances (on a broad range of issues, e.g. abortion, drug use, homosexuality, tattoos / piercings, blood transfusions, organ donation)
  • Leader figure(s) (Christ vs Moses vs Mohammed vs Buddha vs saints vs pagan deities vs Confucius)
  • Holy texts (Qur'an vs Bible vs Torah vs Bhagavad Gita vs Tripitaka)
  • Ritual manifestations (differences in festivals; feasting vs fasting vs dietary laws; song, dance, clothing, architecture)

Why can't we be friends?

This quick article is my take on the age-old question: if all religions are supposedly based on universal peace and love, then why have they caused more war and bloodshed than any other force in history?

My logic behind comparing religions specifically in terms of "compatibility", rather than simply in terms of "similarities and differences", is that a compatibility analysis should yield conclusions that are directly relevant to the question that we're all asking (i.e. Why can't we be friends?). Logically, if religions were all 100% compatible with each other, then they'd never have caused any conflict in all of human history. So where, then, are all those pesky incompatibilities, that have caused peace-avowing religions to time and again be at each others' throats?

The answer, I believe, is the same one that explains why Java and FORTRAN don't get along well (excuse the geek reference). They both let you write computer programs – but on very different hardware, and in very different coding styles. Or why Chopin fans and Rage Against the Machine fans aren't best friends. They both like to listen to music, but at very different decibels, and with very different amounts of tattoos and piercings applied. Or why a Gemini and a Cancer weren't meant for each other (if you happen to believe in astrology, which I don't). They're both looking for companionship in this big and lonely world, but they laugh and cry in different ways, and the fact is they'll just never agree on whether sushi should be eaten with a fork or with chopsticks.

Religions are just one more parallel. They all aim to bring purpose and hope to one's life; but they don't always quite get there, because along the way they somehow manage to get bogged down discussing on which day of the week only raspberry yoghurt should be eaten, or whether the gates of heaven are opened by a lifetime of charitable deeds or by just ringing the buzzer.

Religion is just one more example of a field where the various competing groups all essentially agree on, and work towards, the same basic purpose; but where numerous incompatibilities arise due to differences in their implementation details.

Perhaps religions could do with a few IEEE standards? Although, then again, perhaps if the world can't even agree on a globally compatible standard for something as simple as what type of electrical plug to use, I doubt there's any hope for religion.

]]>
Argentina: ¿qué onda? 2012-04-21T00:00:00Z 2012-04-21T00:00:00Z Jaza https://greenash.net.au/thoughts/2012/04/argentina-que-onda/ A few days ago, Argentina decided to nationalise YPF, which is the largest oil company operating in the country. It's doing this by expropriating almost all of the YPF shares currently owned by Spanish firm Repsol. The move has resulted in Spain — and with it, the entire European Union — condemning Argentina, and threatening to retaliate with trade sanctions.

This is the latest in a long series of decisions that Argentina has made throughout its modern history, all of which have displayed: hot-headed nationalist sentiment; an arrogant and apathetic attitude towards other nations; and utter disregard for diplomatic and economic consequences. As with previous decisions, it's also likely that this one will ultimately cause Argentina more harm than good.

I think it's time to ask: Argentina, why do you keep shooting yourself in the foot? Argentina, are you too stubborn, are you too proud, or are you just plain stupid? Argentina, ¿qué onda?

I've spent quite a lot of time in Argentina. My first visit was five years ago, as a backpacker. Last year I returned, and I decided to stay for almost six months, volunteering in a soup kitchen and studying Spanish (in Mendoza). So, I believe I've come to know the country and its people reasonably well — about as well as a foreigner can hope to know it, in a relatively short time.

I also really like Argentina. I wouldn't have chosen to spend so much time there, if I disliked the place. Argentines have generally been very warm and welcoming towards me. Argentines love to enjoy life, as is evident in their love of good food, fine beverages, and amazing (and late) nightlife. They are a relaxed people, who value their leisure time, never work too hard, and always have one minute more for a casual chat.

During my first visit to Argentina, this was essentially my complete view of the nation. However, having now spent significantly more time in the country, I realise that this was a rather rose-coloured view, and that it's far from the full story. Argentina is also a country facing many, many problems.

Questionable choices

What pains me most about Argentina, is that it seems to have everything going for it, and yet it's so much less than it could be. Argentina is a land of enormous potential, most of it squandered. A land of opportunities that are time and time again passed up. It's a nation that seems to be addicted to making choices that are "questionable", to put it nicely.

The most famous of these questionable decisions in Argentina's history came in 1982, when the nation declared war on Great Britain over the Falkland Islands (Islas Malvinas). By most logical counts, this was a bad decision, for a number of reasons.

Diplomatically, the Falklands war was bad: most of the world condemned Argentina for attacking the sovereign territory of another nation pre-emptively — few countries were sympathetic to Argentina's cause. Economically it was bad: it took a heavy toll on Argentina's national budget, and it also resulted in the UK (and the wider European community) imposing various trade sanctions on Argentina. And militarily it was bad: by all accounts, it should have been clear to the Argentine generals that the British military was far superior to their own, and that a war for Argentina was almost completely unwinnable.

Argentina shocked the world again when, in late 2001, it decided to default on its enormous debt to the IMF. Around the same time, the government also froze all bank accounts in the country, and severely limited the bank withdrawals that private citizens could make. Shortly thereafter, Argentina also abandoned its 10-year-long policy of pegging its currency to the US Dollar, as a result of the economic crisis that this policy had ended in.

While this decision was one of the more understandable in Argentina's history — the country's economy was in a desperate state, and few other options were available — it was still highly questionable. Defaulting on virtually the entire national debt had disastrous consequences for Argentina's international economic relations. In the short term, foreign investment vanished from Argentina, a blow from which it took many years to recover (and a recovery that remains a struggle to this day). Of all the choices open to it, Argentina elected the one that would shatter the rest of the world's confidence in its economic stability, more than any other.

And now we see history repeating itself, with Argentina once again damaging its own economic credibility, by effectively stealing a company worth billions of dollars from Spain (which, to make matters even worse, is one of Argentina's larger trading partners). We should hardly be surprised.

Culture, not just politics

It would be a bit less irrational, if we could at least confine these choices to having been "imposed" on the nation by its politicians. However, this would be far from accurate. On the contrary, all of these choices (with the possible exception of the loan defaulting) were overwhelmingly supported by the general public of Argentina. To this day, you don't have to travel far in Argentina, before you see a poster or a graffitied wall proclaiming: "Las Malvinas son Argentinas" ("The Falklands are Argentine"). And, similarly, this week's announcement to nationalise YPF was accompanied by patriotic protestors displaying banners of: "YPF de los Argentinos".

So, how can this be explained culturally? How can a nation that on the surface appears to be peaceful, fun-loving, and Western in attitude, also consistently support decisions of an aggressive character that isolate the country on the international stage?

As I said, I've spent some time in Argentina. And, as I've learned, the cultural values of Argentina appear at first to be almost identical to those of Western Europe, and of other Western nations such as Canada, Australia, and New Zealand. Indeed, most foreigners comment, when first visiting, that Argentina is by far the most European place in Latin America. However, after getting to know Argentines better, one comes to realise that there are actually some significant differences in cultural values, lying just under the surface.

Firstly, Argentina has a superiority complex. Many Argentines honestly believe that their country is one of the mightiest, the smartest, and the richest in the world. Obviously, both the nation's history, and independent statistics (by bodies such as the UN), make it clear that Argentina is none of these things. That, however, seems to be of little significance to the average Argentine. Indeed, a common joke among Argentines is that: "Argentina should be attached to Europe, but by some mistake it floated over and joined South America". Also particularly baffling, is that many Argentines seem to be capable of maintaining their superiority, while at the same time serving up a refreshingly honest criticism of their nation's many problems. This superiority complex can be explained in large part by my second point.

Secondly, Argentina has a (disturbingly high) penchant for producing and for swallowing its own propaganda. For a country that supposedly aspires to be a liberal Western democracy, Argentina is rife with misinformation about its own history, about its own geography, and about the rest of the world. In my opinion, the proliferation of exaggerated or outright false teachings in Argentina borders on a Soviet-Russian level of propaganda. Some studies indicate that a prolonged and systematic agenda of propaganda in education is to blame for Argentina's current misinformed state. I'm no expert on Argentina's educational system, and anyway I'd prefer not to pin the blame on any one factor, such as schooling. But for me, there can be no doubt: Argentines are more accustomed to digesting their own version of the truth, than they are to listening to any facts from the outside.

Finally, Argentina has an incredibly strong level of patriotism in its national psyche. Patriotism may not at first seem any different in Argentina, to patriotism in other countries. It's strong in much of the world, and particularly in much of the rest of Latin America. But there's something about the way Argentines identify with their nation — I can't pinpoint it exactly, but perhaps the way they cling to national icons such as football and mate, or the level of support they give to their leaders — there's something that's different about Argentine patriotism. In my opinion, it's this sentiment that fuels irrational thinking and irrational decisions in Argentina more than anything else. It's this patriotism that somehow disfigures the logic of Argentines into thinking: "whatever we decide as a nation is right, and the rest of the world can get stuffed".

Final thoughts

It really does pain me to say negative things about Argentina, because it's a country that I've come to know and to love dearly, and I have almost nothing but happy memories from all my time spent there. This is also the first time I've written an article critical of Argentina; and perhaps I've become a little bit Argentine myself, because I feel just a tad bit "unpatriotic" in publishing what I've written.

However, I felt that I needed to explore why this country, that I feel such affection for, continually chooses to isolate itself, to damage itself, and to stigmatise itself. I'm living in Chile this year, just next door; and I must admit, I feel melancholy at being away from the buena onda, but also relief at keeping some distance from the enormous instability that is Argentina.

]]>
Travel is a luxury 2011-03-25T00:00:00Z 2011-03-25T00:00:00Z Jaza https://greenash.net.au/thoughts/2011/03/travel-is-a-luxury/ International travel has become so commonplace nowadays that some people do it just for a long weekend. Others go for two-year backpacking marathons. And with good reason, too. Travelling has never been easier, it's never been cheaper, and it's never before been so accessible. I, for one, do not hesitate to take advantage of all this.

One other thing, though. It's also never been easier to inadvertently take it all for granted. To forget that just one generation ago, there were no budget intercontinental flights, no phrasebooks, no package tours, no visa-free agreements. And, of course, snail mail and telegrams were a far cry from our beloved modern Internet.

But that's not all. The global travel that many of us enjoy today, is only possible thanks to a dizzying combination of fortunate circumstances. And this tower (no less) of circumstances is far from stable. On the contrary: it's rocking to and fro like a pirate ship on crack. I know it's hard for us to comprehend, let alone be constantly aware of, but it wasn't like this all that long ago, and it simply cannot last like this much longer. We are currently living in a window of opportunity like none ever before. So, carpe diem — seize the day!

Have you ever before thought about all the things that make our modern globetrotting lives possible? (Of course, when I say "us", I'm actually referring to middle- or upper-class citizens of Western countries, a highly privileged minority of the world at large). And have you considered that if just one of these things were to swing suddenly in the wrong direction, our opportunities would be slashed overnight? Scary thought, but undeniably true. Let's examine things in more detail.

International relations

In general, these are at an all-time global high. Most countries in the world currently hold official diplomatic relations with each other. There are currently visa-free arrangements (or very accessible tourist visas) between most Western countries, and also between Western countries and many developing countries (although seldom vice versa, a glaring inequality). It's currently possible for a Western citizen to temporarily visit virtually every country in the world; although for various developing countries, some bureaucracy wading may be involved.

International relations is the easiest thing for us to take for granted, and it's also the thing that could most easily and most rapidly change. Let's assume that tomorrow, half of Asia and half of Africa decided to deny all entry to all Australians, Americans, and Europeans. It could happen! It's the sovereign right of any nation, to decide who may or may not enter their soil. And if half the governments of the world decide — on the spur of the moment — to bar entry to all foreigners, there's absolutely nothing that you or I can do about it.

Armed conflict

This is (of course) always a problem in various parts of the world. Parts of Africa, Asia, and Latin America are currently unsafe due to armed conflict, mainly from guerrillas and paramilitary groups (although traditional war between nations still exists today as well). Armed conflict is relatively contained within pockets of the globe right now.

But that could very easily change. World War III could erupt tomorrow. Military activity could commence in parts of the world that have been boring and peaceful for decades, if not centuries. Also, in particular, most hostility in the world today is currently directed towards other local groups; that hostility could instead be directed at foreigners, including tourists.

War between nations is also the most likely cause for a breakdown in international relations worldwide (it's not actually very likely that they'd break down for no reason — although a global spate of insane dictators is not out of the question). This form of conflict is currently very confined. But if history is any guide, then that is an extremely uncommon situation that cannot and will not last.

Infectious diseases

This is also a problem that has never gone away. However, it's currently relatively safe for tourists to travel to almost everywhere in the world, assuming that proper precautions are taken. Most infectious diseases can be defended against with vaccines. AIDS and other STDs can be controlled with safe and hygienic sexual activity. Water-borne sicknesses such as giardia, and mosquito-borne sicknesses such as malaria, can be defended against with access to bottled water and repellents.

Things could get much worse. We've already seen, with recent scares such as Swine Flu, how easily large parts of the world can become off-limits due to air-borne diseases for which there is no effective defence. In the end, it turned out that Swine Flu was indeed little more than a scare (or an epidemic well-handled; perhaps more a matter of opinion than of fact). If an infectious disease were contagious enough and aggressive enough, we could see entire continents being indefinitely declared quarantine zones. That could put a dent in some people's travel plans!

Environmental contamination

There are already large areas of the world that are effectively best avoided, due to some form of serious environmental contamination. But today's picture is merely the tip of the iceberg. If none of the other factors get worse, then I guarantee that this is one factor that will. It's happening as we speak.

Air pollution is already extreme in many of the world's major cities and industrial areas, particularly in Asia. However, serious though it is, large populations are managing to survive in areas where it's very high. Water contamination is a different story. If an entire country, or even an entire region, has absolutely no potable water, then living and travelling in those areas becomes quite hard.

Of course, the most serious form of environmental contamination possible, is a nuclear disaster. Unfortunately, the potential for nuclear catastrophe is still positively massive. Nuclear disarmament has been a slow and limited process. And weapons aside, nuclear reactors are still abundant in much of the world. A Chernobyl-like event on a scale 100 times bigger — that could sure as hell put travel plans to entire continents on hold indefinitely.

Flights

The offering of long-distance international flights today is simply mind-boggling. The extensive number of routes / destinations, the frequency, and of course the prices; all are at an unprecedented level of awesomeness. It's something you barely think about: if you want to get from London to Singapore next week, just book a flight. You'll be there in 14 hours or so.

Sorry to burst the bubble, folks; but this is one more thing that simply cannot and will not last. We already saw, with last year's Iceland volcano eruption, just how easily the international aviation network can collapse, even if only temporarily. Sept 11 pretty well halted global flights as well. A more serious environmental or security problem could halt flights for much, much longer.

And if nothing else grounds the planes first, then sooner or later, we're going to run out of oil. In particular, jet fuel is the highest-quality, most refined of all petroleum, and it's likely to be the first that we deplete within the next century. At the moment, we have no real alternative fuel — hopefully, a renewable form of jet propulsion will find itself tested and on the market before we run out.

Money

Compared to all the hypothetical doomsday scenarios discussed above, this may seem like a trivial non-issue. But in fact, money is the most fundamental of all enablers of our modern globetrotting lifestyle, and it's the enabler that's most likely to disappear first. The fact is that many of us have an awful lot of disposable cash (especially compared with the majority of the world's population), and that cash goes an awfully long way in many parts of the world. This is not something we should be taking for granted.

The global financial crisis has already demonstrated the fragility of our seemingly secure wealth. However, despite the crisis, most Westerners still have enough cash for a fair bit of long-distance travel. Some are even travelling more than ever, because of the crisis — having lost their jobs, and having saved up cash over a long period of time, many have found it the perfect opportunity to head off on a walkabout.

Then there is the strange and mysterious matter of the international currency exchange system. I don't claim to be an expert on the topic, by any means. Like most simple plebs, I know that my modest earnings (by Western standards) tower above the earnings of those in developing countries; and I know that when I travel to developing countries, my Western cash converts into no less than a veritable treasure trove. And I realise that this is pretty cool. However, it's also a giant inequality and injustice. And like all glaring inequalities throughout history, it's one that will ultimately fall. The wealth gap between various parts of the world will inevitably change, and it will change drastically. This will of course be an overwhelmingly good thing; but it will also harm your travel budget.

Final words

Sorry that this has turned out to be something of a doomsday rant. I'm not trying to evoke the end of the world, with all these negative hypotheticals. I'm simply trying to point out that if any one of a number of currently positive factors in the world were to turn sour, then 21st century travel as we know it could end. And it's not all that likely that any one of these factors, by itself, will head downhill in the immediate future. But the combination of all those likelihoods does add up rather quickly.

I'd like to end this discussion on a 100% positive note. Right now, none of the doom-n-gloom scenarios I've mentioned has come to fruition. Right now, for many of us, la vita è bella! (Although for many many others, life is le shiiiite). Make the most of it. See the world in all its glory. Go nuts. Global travel has been one of the most difficult endeavours of all, for much of human history; today, it's at our fingertips. As Peter Pan says: "Second star to the right, and straight on till morning."

]]>
Boycott GPS 2010-08-25T00:00:00Z 2010-08-25T00:00:00Z Jaza https://greenash.net.au/thoughts/2010/08/boycott-gps/ Last month, I went on a road trip with a group of friends, up the east coast of Queensland. We hired two campervans, and we spent just over a week cruising our way from Brisbane to Cairns. Two of my friends insisted on bringing their GPS devices with from home. Personally, I don't use a GPS; but these friends of mine use them all the time, and they assured me that having them on the trip would make our lives much easier, and hence that the trip would be much more enjoyable.

I had my doubts. Australia — for those of you that don't know — is a simple country with simple roads. The coast of Queensland is no exception. There's one highway, and it's called Route 1, and it goes up the coast in a straight line, from Brisbane to Cairns, for about 1,600 km. If you see a pub, it means you've driven through a town. If you see two pubs, a petrol station, a real estate agent and a post office (not necessarily all in different buildings), that's a big town. If you see houses as well, you must be in a capital city. It's pretty hard to get lost. Why would we need a GPS?

To cut a long story short, the GPSes were a major annoyance throughout the trip, and they were of no real help for the vast majority of our travelling. Several times, they instructed us to take routes that were a blatant deviation from the main route that prominent road signs had marked, and that were clearly not the quickest route anyhow. They discouraged going off the beaten track and exploring local areas, because they have no "shut up I'm going walkabout now" mode. And, what got to me more than anything, my travel buddies were clearly unable to navigate along even the simplest stretch of road without them, and it made me sad to see my friends crippled by these devices that they've come to so depend upon.

In the developed world, with its developed mapping providers and its developed satellite coverage, GPS is becoming ever more popular amongst automobile drivers. This is happening to the extent that I often wonder if the whole world is now running on autopilot. "In two hundred metres, take the second exit at the roundabout, then take the third left."

Call me a luddite and a dinosaur if you must, all ye GPS faithful… but I refuse to use a GPS. I really can't stand the things. They're annoying to listen to. I can usually find a route just fine without them. And using them makes you navigationally illiterate. Join me in boycotting GPS!

GPS is eroding navigational skills

This is my main gripe with GPS devices. People who use them seem to become utterly dependent on them, sticking with them like crack junkies stick to the walls of talcum powder factories. If a GPS addict is at any time forced to drive without his/her beloved electronic companion, he/she is utterly lost. Using a GPS all the time makes you forget how to navigate. It means that you don't explore or immerse yourself in the landscape around you. It's like walking through a maze blindfolded.

I must point out, though, that GPS devices don't have to make us this stupid. However, this is the way the current generation of devices are designed. Current GPSes encourage stimulus-driven rather than spatially-driven navigation. Unless you spend quite a lot of time changing the default settings, 99% of consumer-car GPSes will only show you the immediate stretch of road in front of you in their map display, and the audio will only instruct you as to the next immediate action you are to take.

Worse still, the action-based instructions that GPSes currently provide are completely devoid of the contextual richness that we'd utilise, were we humans still giving verbal directions to each other. If you were driving to my house, I'd tell you: "turn right when you see the McDonald's, then turn left just before the church, at the bottom of the hill". The GPS, on the other hand, would only tell you: "in 300 metres, turn right, then take the second left". And, because you've completely tuned in to the hypnotic words of the GPS, and tuned out to the world around you, it's unlikely you'd even notice that there's a Maccas, or a church, or a hill, near my house.

Even the US military is having trouble with its troops suffering from reduced navigational ability, as a direct result of their dependence on field GPS devices. Similarly, the Inuit of far northern North America are rapidly losing the traditional Arctic navigation skills that they've been passing down through the generations for centuries, due to the recent introduction of GPS aids amongst hunters and travellers in their communities. So, if soldiers who are highly trained in pathfinding, and polar hunters who have pathfinding in their blood — if these people's sense of direction is eroding, what hope is there for us mere mortals?

I started thinking about all this when I read an article posing the question: Could GPS create a world without signs? I found this to be a chilling prediction to reflect upon, particularly for a GPS-phobe like myself. The eradication of traditional street signs would really be the last straw. It would mean that the GPS-averse minority would ultimately be forced to convert — presumably by law, since if we assume that governments allowed most street signs to discontinue, we can also assume that they'd make GPS devices compulsory for safety reasons (not to mention privacy concerns, anyone?).

Explorer at heart

I must admit, I'm a much more keen navigator and explorer than your average Joe. I've always adored maps — when I was a kid, I used to spend hours poring over the street directory, or engrossing myself in an atlas that was (at the time) taller than me. Nowadays, I can easily burn off an entire evening panning and zooming around Google Earth.

I love to work out routes myself. I also love to explore the way as I go. Being a keen urban cyclist, this is an essential skill — cycling is also one of the best methods for learning your way around any local area. It also helped me immensely in my world trip several years ago, particularly when hiking in remote mountain regions, but also in every new city I arrived at. I'm more comfortable if I know the compass bearings in any given place I find myself, and I attempt to derive compass bearings using the position of the sun whenever I can.

So, OK, I'm a bit weird, got a bit of a map and navigation fetish. I also admit, I took the Getting Lost orientation test, and scored perfectly in almost every area (except face recognition, which is not my strong point).

I'm one of those people who thinks it would be pretty cool to have lived hundreds of years ago, when intrepid sailors ventured (with only the crudest of navigational aids) to far-flung oceans, whose edges were marked on maps as being guarded by fierce dragons; and when fearless buccaneers ventured across uncharted continents, hoping that the natives would point them on to the next village, rather than skewer them alive and then char-grill their livers for afternoon tea. No wonder, then, that I find it fun being without GPS, whether I'm driving around suburban Sydney, or ascending a mountain in Bolivia.

Then again, I'm also one of those crazy luddites that think the world would be better without mobile phones. But that's a rant for another time.

]]>
Media through the ages 2009-12-30T00:00:00Z 2009-12-30T00:00:00Z Jaza https://greenash.net.au/thoughts/2009/12/media-through-the-ages/ The 20th century was witness to the birth of what is arguably the most popular device in the history of mankind: the television. TV is a communications technology that has revolutionised the delivery of information, entertainment and artistic expression to the masses. More recently, we have all witnessed (and participated in) the birth of the Internet, a technology whose potential makes TV pale into insignificance in comparison (although, it seems, TV isn't leaving us anytime soon). These are fast-paced and momentous times we live in. I thought now would be a good opportunity to take a journey back through the ages, and to explore the forms of (and devices for) media and communication throughout human history.

Our journey begins in prehistoric times, (arguably) before man even existed in the exact modern anatomical form that all humans exhibit today. It is believed that modern homo sapiens emerged as a distinct genetic species approximately 200,000 years ago, and it is therefore no coincidence that my search for the oldest known evidence of meaningful human communication also brought me to examine this time period. Evidence suggests that at around this time, humans began to transmit and record information in rock carvings. These are also considered the oldest form of human artistic expression on the planet.

From that time onwards, it's been an ever-accelerating roller-coaster ride of progress, from prehistoric forms of media such as cave painting and sculpture, through to key discoveries such as writing and paper in the ancient world, and reaching an explosion of information generation and distribution in the Renaissance, with the invention of the printing press in 1450 AD. Finally, the modern era of the past two centuries has accelerated the pace to dizzying levels, beginning with the invention of the photograph and the invention of the telegraph in the early 19th century, and culminating (thus far) with mobile phones and the Internet at the end of the 20th century.

List of communication milestones

I've done some research in this area, and I've compiled a list of what I believe are the most significant forms of communication or devices for communication throughout human history. You can see my list in the table below. I've also applied some categorisation to each item in the list, and I'll discuss that categorisation shortly.

Prehistoric (200,000 BC - 4,000 BC)

| Name | Year | Directionality | Preservation |
| --- | --- | --- | --- |
| rock carving | c. 200,000 BC | down | permanent |
| song, music and dance | between 100,000 BC and 30,000 BC | down or up or lateral | transient |
| language and oration | between 100,000 BC and 30,000 BC | down or up or lateral | transient |
| body art | between 100,000 BC and 30,000 BC | down or up or lateral | transient |
| jewellery | between 100,000 BC and 30,000 BC | down or up or lateral | permanent |
| mythology | between 100,000 BC and 30,000 BC | down | transient |
| cave painting and visual symbols | between 100,000 BC and 30,000 BC | down | permanent |
| sculpture | between 100,000 BC and 30,000 BC | down | permanent |
| pottery | c. 14,000 BC | down | permanent |
| megalithic architecture | c. 4000 BC | down | permanent |

Ancient (3000 BC - 100 AD)

| Name | Year | Directionality | Preservation |
| --- | --- | --- | --- |
| writing | c. 3000 BC | down | permanent |
| metallurgical art and bronze sculpture | c. 3000 BC | down | permanent |
| alphabet | c. 2000 BC | down | permanent |
| drama | c. 500 BC | down or up or lateral | transient |
| paper | c. 100 AD | down | permanent |

Renaissance (1450 AD - 1620)

| Name | Year | Directionality | Preservation |
| --- | --- | --- | --- |
| printing press | 1450 AD | down | permanent |
| printed books | c. 1500 | down | permanent |
| newspapers and magazines | c. 1620 | down | permanent |

Modern (1839 - present)

| Name | Year | Directionality | Preservation |
| --- | --- | --- | --- |
| photograph | 1839 | down or up or lateral | permanent |
| telegraph | 1844 | lateral | permanent |
| telephone | 1876 | lateral | transient |
| phonograph (gramophone) | 1877 | down | permanent |
| movie camera | 1891 | down or up or lateral | permanent |
| film | 1894 | down | permanent |
| radio | 1906 | down | permanent |
| television | 1936 | down | permanent |
| videotape | 1958 | down or up or lateral | permanent |
| cassette tape | 1964 | down or up or lateral | permanent |
| personal computer | 1973 | down or up or lateral | permanent |
| compact disc | 1983 | down | permanent |
| mobile phone | 1991 | lateral | transient |
| internet | 1992 | down or up or lateral | permanent |

Note: pre-modern dates are approximations only, and are based on the approximations of authoritative sources. For modern dates, I have tried to give the date that the device first became available to (and first started to be used by) the general public, rather than the date the device was invented.

Directionality and preservation

My categorisation system in the list above is loosely based on coolscorpio's types of communication. However, I have used the word "directionality" to refer to his "downward, upward and lateral communication"; and I have used the word "preservation" and the terms "transient and permanent" to refer to his "oral and written communication", as I needed terms more generic than "oral and written" for my data set.
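For the programmatically inclined, here's a minimal sketch (in Python; all the class and function names are purely illustrative, my own invention) of how the two categories could be encoded as data, using a few rows from the tables above, and of how one might then query for the "direction-neutral" technologies:

```python
from dataclasses import dataclass
from enum import Enum

class Directionality(Enum):
    DOWN = "down"
    UP = "up"
    LATERAL = "lateral"

class Preservation(Enum):
    TRANSIENT = "transient"
    PERMANENT = "permanent"

@dataclass(frozen=True)
class Milestone:
    name: str
    era: str
    directionality: frozenset  # one or more Directionality values
    preservation: Preservation

# A handful of rows from the tables above, encoded with the two categories
MILESTONES = [
    Milestone("writing", "Ancient",
              frozenset({Directionality.DOWN}), Preservation.PERMANENT),
    Milestone("telephone", "Modern",
              frozenset({Directionality.LATERAL}), Preservation.TRANSIENT),
    Milestone("internet", "Modern",
              frozenset({Directionality.DOWN, Directionality.UP,
                         Directionality.LATERAL}),
              Preservation.PERMANENT),
]

def direction_neutral(milestones):
    """Milestones usable in all three directions: down, up and lateral."""
    return [m.name for m in milestones if len(m.directionality) == 3]

print(direction_neutral(MILESTONES))  # → ['internet']
```

Encoding directionality as a set (rather than a single value) is what lets "down or up or lateral" entries coexist with pure-lateral ones like the telephone.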

Preservation of information is something that we've been thinking about as a species for an awfully long time. We've been able to record information in a permanent, durable form, more-or-less for as long as the human race has existed. Indeed, if early humans hadn't found a way to permanently preserve information, then we'd have very little evidence of their being able to conduct advanced communication at all.

Since the invention of writing, permanent preservation of information has become increasingly widespread*. However, oral language has always been our richest and most potent form of communication, and it wasn't until modern times that we finally discovered ways of capturing it; even to this very day, our favourite modern oral communication technology — the telephone — remains essentially transient, preserving no record of what passes through it.

Directionality of communication has three forms: from a small group of people (often at the "top") down to a larger group (at the "bottom"); from a large group up to a small one; and between any small groups in society laterally. Human history has been an endless struggle between authority and the masses, and that struggle is reflected in the history of human communication: those at the top have always pushed for the dominance of "down" technologies, while those at the bottom have always resisted, and have instead advocated for more neutral technologies. From looking at the list above, we can see that the dominant communications technologies of the time have had no small effect on the strength of freedom vs authority of the time.

Prehistoric human society was quite balanced in this regard. There were a number of powerful forms of media that only those at the top (e.g. chiefs, warlords) had practical access to. These were typically the more permanent forms of media, such as the paintings on the cave walls. However, oral communication was really the most important medium of the time, and it was equally accessible to all members of society. Additionally, societies were generally grouped into relatively small tribes and clans, leaving less room for layers of authority between the top and bottom ranks.

The ancient world — the dawn of human "civilisation" — changed all this. This era brought about three key communications media that were particularly well-suited to a "down" directionality, and hence to empowering authority above the common populace: megalithic architecture (technically pre-ancient, but only just); metallurgy; and writing. Megalithic architecture allowed kings and Pharaohs to send a message to the world, a message that would endure the sands of time; but it was hardly a medium accessible to all, as it required armies of labourers, teams of designers and engineers, and vast quantities of natural and mineral resources. Similarly, metallurgy's barrier to access was the skilled labour and the mineral resources required to produce it. Writing, today considered the great enabler of access to information and of global equality, was in the ancient world anything but that, because all but the supreme elite were illiterate, and the governments of the day wanted nothing more than to maintain that status quo.

Gutenberg's invention of the printing press in 1450 AD is generally considered to be the most important milestone in the history of human communication. Most view it purely from a positive perspective: it helped spread literacy to the masses; and it allowed for the spread of knowledge as never before. However, the printing press was clearly a "down" technology in terms of directionality, and this should not be overlooked. To this very day, access to mass printing and distribution services is a privilege available only to those at the very top of society, and it is a privilege that has been consistently used as a means of population control and propaganda. Don't get me wrong, I agree with the general consensus that the positive effects of the printing press far outweigh its downside, and I must also stress that the printing press was an essential step in the right direction towards technologies with more neutral directionality. But essentially, the printing press — the key device that led to the dawn of the Renaissance — only served to further entrench the iron fist of authority that saw its birth in the ancient world.

Modern media technology has been very much a mixed bag. On the plus side, there have been some truly direction-neutral communication tools that are now accessible to all, with photography, video-recording, and sound-recording technologies being the most prominent examples. There is even one device that is possibly the only pure lateral-only communication tool in the history of the world, and it's also become one of the most successful and widespread tools in history: the telephone. On the flip side, however, the modern world's two most successful devices are also the most sinister, most potent "down" directionality devices that humanity has ever seen: TV and radio.

The television (along with film and the cinema, which is also a "down" form of media) is the defining symbol of the 20th century, and it's still going strong into the 21st. Unfortunately, the television is also the ultimate device allowing one-way communication from those at the top of society, to those at the bottom. By its very definition, television is "broadcast" from the networks to the masses; and it's quite literally impossible for it to allow those at the receiving end to have their voices heard. What the Pyramids set in stone before the ancient masses, and what the Gutenberg bibles stamped in ink before the medieval hordes, the television has now burned into the minds of at least three modern generations.

The Internet, as you should all know by now, is changing everything. However, the Internet is also still in its infancy, and the Internet's fate in determining the directionality of communication into the next century is still unclear. At the moment, things look very positive. The Internet is the most accessible and the most powerful direction-neutral technology the world has ever seen. Blogging (what I'm doing right now!) is perhaps the first pure "up" directionality technology in the history of mankind, and if so, then I feel privileged to be able to use it.

The Internet allows a random citizen to broadcast a message to the world, for all eternity, in about 0.001% of the time that it took a king of the ancient world to deliver a message to all the subjects of his kingdom. I think That's Cool™. But the question is: when every little person on the planet is broadcasting information to the whole world, who does everyone actually listen to? Sure, there are literally millions of personal blogs out there, much like this one; and anyone can look at any of them, with just the click of a button, now or 50 years from now (50 years… at least, that's the plan). But even in an information ecosystem such as this, it hasn't taken long for the vast majority of people to shut out all sources of information, save for a select few. And before we know it — and without even a drop of blood being shed in protest — we're back to 1450 AD all over again.

It's a big 'Net out there, people. Explore it.

* Note: I've listed television and radio as being "permanent" preservation technologies, because even though the act of broadcasting is transient, the vast majority of television and radio transmissions throughout modern times have been recorded and formally archived.

]]>
What are fossil fuels? 2009-10-25T00:00:00Z 2009-10-25T00:00:00Z Jaza https://greenash.net.au/thoughts/2009/10/what-are-fossil-fuels/ Let me begin with a little bit of high school revision. Fossil fuels are composed primarily of carbon and hydrogen. There are basically three types of fossil fuels on Earth: coal, oil, and natural gas. It's common knowledge that fossil fuels are the remains of prehistoric plants and animals. That's why they're called "fossil fuels" (although they're not literally made from prehistoric bones, or at least not in any significant amount). Over a period of millions of years, these organic remains decomposed, and they got buried deep beneath rock and sea beds. A combination of heat and pressure caused the organic material to chemically alter into the fuel resources that we're familiar with today. The fuels became trapped between layers of rock in the Earth's geological structure, thus preserving them and protecting them from the elements up to the present day.

Hang on. Let's stop right there. Fossil fuels are dead plants and animals. And we burn them in order to produce the energy that powers most of our modern world (86% of it, to be precise). In other words, modern human civilisation depends (almost exclusively) upon the incineration of the final remains of some of the earliest life on Earth. In case there weren't enough practical reasons for us to stop burning fossil fuels, surely that's one hell of a philosophical reason. Wouldn't you say so?

The term "fossil fuels" seems to be bandied about more and more casually all the time. World energy is built upon fossil fuels, and this is of course a massive problem for a number of practical reasons. We should all be familiar by now with these reasons: they're a non-renewable source of energy; they're a major source of greenhouse gas emissions (and hence a major contributor to global warming); and they generate toxic air and water pollution. However, we seldom seem to stop and simply think about the term "fossil fuels" itself, and what it means.

The various fossil fuels trace their origins back to anywhere between 60 million and 350 million years ago. Coal is generally considered to be the older of the fuels, with most of it having formed 300 to 350 million years ago during the Carboniferous period (the period itself is named after coal). 300 million years ago, life on Earth was very different to how it looks today. Much of the world was a giant swamp. Continents were still unstable, and were entirely unrecognisable compared with their present shapes and positions. And life itself was much more primitive: the majority of lifeforms were simple microscopic organisms such as phytoplankton; plants were dominated by ferns and algae (flowers weren't yet invented); terrestrial animals were limited to small reptiles and amphibians (the long-lost ancestors of those we know today); and only fish had reached a relatively more advanced state of evolution. Birds and mammals wouldn't be invented for quite a few million more years.

Coal is believed to be composed of the remains of all of these lifeforms, to some extent. The largest component of most coal is either plant matter, or the remains of microscopic organisms; however, the primitive animals of the Carboniferous period are no doubt also present in smaller quantities.

Oil and natural gas — which typically formed, and are found, together — are believed on the whole to have formed much later, generally around 60 million years ago. Like coal, oil and natural gas are composed primarily of plant matter and of microscopic organisms (though their source material was more marine in origin than coal's). It's a popular belief that oil contains the decomposed remains of the dinosaurs; and while this is probably true to some extent, the reality is that dinosaurs and other complex animals of the time are probably only present in very small quantities in oil.

So, all of the three fossil fuels contain the remains of dead plants and animals, but:

  • the remains contain far more plant (and microscopic organism) matter than they do animal matter;
  • the remains have been chemically altered, to the extent that cell structures and DNA structures are barely present in fossil fuels; and
  • most of the lifeforms are from a time so long ago that they'd be virtually unrecognisable to us today anyway.

And does that matter? Does that in any way justify the fact that we're incinerating the remains of ancient life on Earth? Does that change the fact that (according to the theory of evolution) we're incinerating our ancestors to produce electricity and to propel our cars?

I don't think so. Life is life. And primitive / chemically altered or not, fossil fuels were not only life, they were the precursor to our own life, and they were some of the first true lifeforms ever to dwell on this planet. I don't know about you, but I believe that the remains of such lifeforms deserve some respect. I don't think that a coal-fired power station is an appropriate final destination for such remains. Carboniferous life is more than simply historic, it is prehistoric. And prehistoric life is something that we should handle with dignity and care. It's not a resource. It's a sacred relic of a time older than we can fathom. Exploiting and recklessly destroying such a relic is surely a bad omen for our species.

]]>
On the causes of the First World War 2008-06-24T00:00:00Z 2008-06-24T00:00:00Z Jaza https://greenash.net.au/thoughts/2008/06/on-the-causes-of-the-first-world-war/ WWI was one of the most costly and the most gruesome of wars that mankind has ever seen. It was also one of the most pointless. I've just finished reading The First Casualty, a ripper of a novel by author and playwright Ben Elton. The novel is set in 1917, and much of the story takes place at the infamous Battle of Passchendaele, which is considered to have been the worst of all the many hellish battles in the war. I would like to quote one particular passage from the book, which I believe is the best summary of the causes of WWI that I've ever read:

'The question I always asks is, why did anyone give a fuck about this bleeding Archduke Ferdinand what's-his-face in the first place?' one fellow said. 'I mean, come on, nobody had even heard of the cunt till he got popped off. Now the entire fucking world is fighting 'cos of it.'

'You dozy arse', another man admonished, 'that was just a bleeding spark, that was. It was a spark. Europe was a tinder box, wasn't it? Everyone knows that.'

'Well, I don't see as how he was even worth a spark, mate,' the first man replied. 'Like I say, who'd even heard of the cunt?'

A corporal weighed in to settle the matter.

'Listen, it's yer Balkans, innit? Always yer Balkans. Balkans, Balkans, Balkans. You see, yer Austro-Hungarians—'

'Who are another bunch we never gave a fuck about till all this kicked off,' the first man interjected.

'Shut up an' you might learn something,' the corporal insisted. 'You've got your Austro-Hungarians supposed to be in charge in Sarajevo but most of the Bosnians is Serbs, right, or at least enough of 'em is to cause a t'do.'

'What's Sarajevo got to do with Bosnia then?'

'Sarajevo's in Bosnia, you monkey! It's the capital.'

'Oh. So?'

'Well, your Austrians 'ave got Bosnia, right, but your Bosnians are backed by your Serbs, right? So when a Bosnian Serb shoots—'

'A Bosnian or a Serb?'

'A Bosnian and a bleeding Serb, you arse. When this Bosnian Serb loony shoots Ferdinand who's heir to the Austro-Hungarian throne, the Austrians think, right, here's a chance to put Serbia back in its bleeding box for good, so they give 'em an ultimatum. They says, "You topped our Archduke so from now on you can bleeding knuckle under or else you're for it." Which would have been fine except the Serbs were backed by the Russians, see, and the Russians says to the Austrians, you has a go at Serbia, you has a go at us, right? But the Austrians is backed by the Germans who says to the Russians, you has a go at Austria, you has a go at us, right? Except the Russians is backed by the French who says to the Germans, you has a go at Russia, you has a go at us, right? And altogether they says kick off! Let's be having you! And the ruck begins.'

'What about us then?' the first man enquired. The rest of the group seemed to feel that this was the crux of it.

'Entente bleeding cordiale, mate,' the corporal replied. 'We was backing the French except it wasn't like an alliance — it was just, well, it was a bleedin' entente, wasn't it.'

'An' what's an entente when it's at home?'

'It means we wasn't obliged to fight.'

'Never! You mean we didn't have to?'

'Nope.'

'Why the fuck did we then?'

'Fuckin' Belgium.'

'Belgium?'

'That's right, fuckin' Belgium.'

'Who gives a fuck about Belgium?'

'Well, you'd have thought no one, wouldn't you? But we did. 'Cos the German plan to get at the French was to go through Belgium, but we was guaranteeing 'em, see. So we says to the Germans, you has a go at Belgium, you has a go at us. We'd guaranteed her, see. It was a matter of honour. So in we came.'

Kingsley could not resist interjecting.

'Of course it wasn't really about honour,' he said.

'Do what?' queried the corporal.

'Well, we'd only guaranteed Belgium because we didn't want either Germany or France dominating the entire Channel coast. In the last century we thought that letting them both know that if they invaded Belgium they'd have us to deal with would deter them.'

'But it didn't.'

'Sadly not.'

'So what about the Italians, an' the Japs, an' the Turks, an' the Yanks, eh? How did they end up in it?' asked the original inquisitor.

'Fuck knows,' said the corporal. 'I lost track after the Belgians.'

Ben Elton (2005), 'The First Casualty', Ch. 36: 'A communal interlude', Bantam Press, pp. 206-208.

And if I'm not mistaken, that pretty well sums it up. I remember studying WWI, back in high school. Like so many other students of history, I couldn't help but notice the irony of it — the sheer and absurd stupidity of an entire continent (supposedly the most advanced in all the world, at the time), falling like a pack of dominoes and descending into an all-out bloodbath. It would have been funny, were it not for the fact that half the young men of early 20th-century Europe paid for it with their lives.

It would have been great if they'd simply told me to read this, instead of having me study and memorise the ridiculous chain of events that led up to the Great War. 'Russia declares war on Austria', 'Germany declares war on Russia', 'France declares war on Germany', etc. I've also finally unravelled the mystery of how the hell it was that us Aussies managed to get roped into the war, and of how the hell our most sacred event of national heritage involved several thousand of our Grandads getting mowed down by machine-guns on a beach in Turkey. I guess we fall into the category of: 'Fuck knows... I lost track after the Belgians.'

]]>
Consciously directed healing 2007-11-02T00:00:00Z 2007-11-02T00:00:00Z Jaza https://greenash.net.au/thoughts/2007/11/consciously-directed-healing/ The human body is a self-sustaining and self-repairing entity. When you cut your hand, when you blister your foot, or when you burn your tongue, you know — and you take it for granted — that somehow, miraculously, your body will heal itself. All it needs is time.

This miracle is possible, because our bodies are equipped with resources more vast and more incredible than most people ever realise, let alone think about. Doctors know these resources inside-out — they're called cells. We have billions upon billions of cells, forming the building-blocks of ourselves: each of them is an independent living thing; and yet each is also purpose-built for serving the whole in a specific way, and is 100% at the disposal of the needs of the whole. We have cells that make us breathe. Cells that make us digest. Cells that make us grow. And, most important of all, cells that tell all the other cells what to do — those are known as brain cells.

In the case of common muscle injuries, it's the tissue cells (i.e. the growing cells — they make us grow by reproducing themselves) and the brain cells, among others, that are largely responsible for repairs. When an injury occurs, the brain cells receive reports of the location and the extent of the problem. They then direct the tissue cells around the affected area to grow — i.e. to reproduce themselves — into the injury, thus slowly bringing new and undamaged tissue to the trouble spot, and bit-by-bit restoring it to its original and intended state. Of course, it's a lot more complicated than that: I'm not a doctor, so I'm not going to pretend I understand it properly. But as far as I'm aware, that's the basics of it.

However, there are many injuries that are simply too severe for the body to repair by itself in this way. In these cases, help may be needed in the form of lotions, medicines, or even surgery. Now, what I want to know is: why is this so? With all its vast resources, what is it that the human body finds so difficult and so time-consuming in healing a few simple cuts and bruises? Surely — with a little bit of help, and a lot more conscious concentration — we should be capable of repairing so much more, all by ourselves.

More brain power

There is a widely-known theory that we humans only use 10% of our brains. Now, this theory has many skeptics, and those skeptics pose extremely valid arguments against it. For example, we may only use 10-20% of our brains at any one time, but we certainly use the majority of our brains at some point in our lives. Also, brain research is still (despite years of money and effort) an incredibly young field, and scientists really have no idea, at this point in time, how much of our brains we use. However, it still seems fairly likely that we do indeed only use a fraction of our brain's capacity at any given moment — even in times of great pain and injury — and that were we able to use more of that capacity, and to use it more effectively, we would benefit in manifold ways.

I personally am inclined to agree with the myth-toting whackos, at least to some extent: I too believe that the human brain is a massively under-utilised organ of the body; and that modern medicine has yet to uncover the secrets that will allow us to harness that extra brain power, in ways that we can barely imagine. I'm certainly not saying that I agree with the proponents of the Quantum-Touch theory, who claim to be able to "heal others by directing their brain's energy" — that's a bit far-fetched for my liking. Nor am I in any way agreeing with ideas such as psychokinesis, which claims that the mere power of the brain is capable of anything, from levitating distant objects to affecting the thoughts and senses of others. No: I'm not agreeing with anything that dodgy or supernatural-like.

I am, however, saying that the human brain is a very powerful organ, and that if we could utilise it more, then our body would be able to do a lot more things (including the self-healing that it's already been capable of since time immemorial) a lot more effectively.

More concentration

As well as utilising more of our brains, there is also (even more vexingly) the issue of directing all that extra capacity to a particular purpose. Now, in my opinion, this is logically bound to be the trickier bit, from a scientific standpoint. For all practical purposes, we're already able to put our brains into an "extreme mode", where we utilise a lot more capacity all at once. What do you think conventional steroids do? Or the myriad of narcotic "party drugs", such as Speed and Ecstasy, that are so widely sought-after worldwide? Upping the voltage isn't that hard: we've already figured it out. But where does it go? We have no idea how to direct all that extra capacity, except into such useless (albeit fun) pursuits as screaming, running, or dancing like crazy. What a waste.

I don't know what the answer to this one is: whether it be a matter of some future concentration-enhancing medicine; of simply having a super-disciplined mind; or of some combination of this and other solutions. Since nobody to date has conclusively proven and demonstrated that they can direct their brain's extra capacity to somewhere useful, without medical help, I doubt that anything truly amazing is physically possible, with concentration alone. But whatever the solution is, it's only a matter of time before it is discovered; and its discovery is bound to have groundbreaking implications for medicine and for numerous other fields.

Future glimpse

Basically, what I'm talking about in this article is a future wonder-invention, that will essentially allow us to utilise our brain's extra capacity, and to direct that extra capacity to somewhere useful, for the purpose of carrying out conventional self-healing in a much faster and more effective way than is currently possible. This is not about doing anything that's simply impossible, according to the laws of medicine or physics — such as curing yourself of cancer, or vaporising your enemies with a stare — it's about taking something that we do now, and enhancing it. I'm not a scientist or a doctor, I'm just someone who has too much time on his hands, and who occasionally thinks about how cool it would be for the world to have things like this. Nevertheless, I really do believe that consciously directed healing is possible, and that it's only a matter of time before we work out how to do it.

]]>
Economics and doomed jobs 2007-07-07T00:00:00Z 2007-07-07T00:00:00Z Jaza https://greenash.net.au/thoughts/2007/07/economics-and-doomed-jobs/ There are a great many people in this world — particularly in third-world countries — that spend their entire lives performing jobs that are dangerous, labour-intensive, unhealthy, and altogether better-suited for machines. I've often heard the argument that "it's better that they do what they do, than that they have no job at all". Examples of such jobs include labouring in textile sweatshops, packaging raw meat in processing plants, taking care of dangerous and disease-prone animals, and working in mines with atrocious conditions and poisonous fumes.

After visiting the hellish mines of Potosí in Bolivia, I disagree with the "better than no job at all" argument more strongly than ever. I'm now 100% convinced that it's better for jobs as atrocious as this to disappear from the face of the Earth; and that it's better for those affected to become unemployed and to face economic hardship in the short-term, while eventually finding newer and better jobs; than to continue in their doomed and unpleasant occupations forever.

As far as I've been able to tell so far, most people in this world seem to believe that it's better for really unpleasant jobs to exist, than for all the people performing them to be unemployed. In fact, more than anyone else, the majority of people performing these jobs believe this (apparently) logical rhetoric. Most people believe it, because it is very simple, and on the surface it does make sense. Yes, a lot of people have jobs that are barely fit for humans to perform. Yes, it affects their health and the health of their children. Yes, they get paid peanuts for it, and they're being exploited. But then again, they have little or no education, and there are almost no other jobs available in their area. And it's all they know. Isn't it better that they're at least able to put food on the table, and to feed their family, than that they're able to do nothing at all?

But the thing is, if there's one thing that the past 200 years of industrial progress have shown us, it's that replacing manual-labour human-performed jobs with machines does not destroy employment opportunities. Quite the opposite, in fact. Sure, it makes a lot of people unemployed and angry in the short-term. But in the long-term, it creates a lot of newer and much better jobs. Jobs that foster education, good working conditions, and higher levels of skill and innovation. Jobs that are not ridiculously dangerous and unpleasant. Jobs that are fit for intelligent, capable 21st-century human beings to get involved in.

Take the textile industry, for example. Originally, all clothing was made by hand, by artisans such as weavers, seamstresses, and knitters. These days, much of the time those jobs are performed by machines, and the human occupations that used to exist for them are largely obsolete. However, there are now new jobs. There are people who perform maintenance work on the weaving machines and the sewing machines. There are people who design new ways of making the weaving machines and the sewing machines work better. There are people who consult with the public, on what they'd like to see done better in the textile industry, and on what they'd be willing to pay more for if it were done to a higher calibre. There are an endless number of newer and better jobs that have sprung up as a result of the old jobs changing from human labour to automated mechanisation.

And the other economic issue that the Potosí experience has brought to my attention, is that of small ventures vs. big companies. Now, this is another one where a lot of people are not going to be on my side. The classic argument is that it's a real shame that the single-man or the family-run business is being swallowed up, by the beast that is the multi-national corporation. In the old days, people say, a man was able to earn an honest living; and he was able to really create an enterprise of his own, and to reap the rewards of working for himself. These days, everyone has sold their soul to big corporations; and the corporation owns everything, while exploiting the people at the bottom, and only giving each of them the measliest pittance of a salary that it can get away with.

For some industries, I agree that this really is a shame. For many of the industries in which humans can perform, and have performed, better and more skilfully in small-group enterprises for thousands of years — such as medicine, sport, literature, and most especially music — the big corporation has definitely done more harm than good. But for the industries that are built on mechanisation, and that require large amounts of investment and organisation — such as transportation, modern agriculture, and mining — I've now seen it being done with the big corporation (in Western countries, such as Australia), and without the big corporation (in third-world countries, such as Bolivia). And it's clear that the big corporation is actually needed, if the operation is to be carried out with any degree of planning, safety, or efficiency.

The sad fact is that big ventures require big organisations behind them. Mining is a big venture: it involves huge amounts of raw material, machinery, personnel, transportation, land, and money. It is, by its very nature, an unsustainable and an environmentally destructive venture, and as such it is "bad": but it is also necessary, in order for the products and the goods of our modern world to be produced; and as such, I'm sorry to say that it's not going away any time soon. And so, bearing that in mind, I'd rather see mining done "right" — by big corporations, that know what they're doing — than done "wrong", by 400 co-operatives that squabble and compete, and that get very little done, while also doing nothing to improve the lives or the jobs of their people.

This is why I now look at "doomed jobs" and "doomed little-man ventures" in many industries, and instead of feeling sorry for their inevitable demise and hardship, I instead believe that their doom is ultimately for the best. In due course, all those unemployed labourers will, inevitably, move up in the world, and will be able to contribute to humanity in bigger and more meaningful ways, while having a more pleasant and a more fulfilling life. And, in due course, the corporate-run ventures will actually be more organised and more beneficial for everyone, than a gaggle of individually-run ventures could ever possibly be.

Of course, the forced change of occupation will be rejected by some; it will be unattainable for others; and it will come at a high cost for all. And naturally, the corporations will cut corners and will exploit the people at the bottom, unless (and, in many cases, even when) subject to the most rigorous of government regulations and union pressures. But ultimately, for modern and industrialised fields of work, it's the only way.

Because of all this, I look forward to the day when the mountain of Cerro Rico in Bolivia comes crashing down, and when the miners of Potosí (most of whom hopefully will not get killed by the mountain collapsing) are left wondering what the hell to do with their lives. The day that this happens, will be the day that those people stop spending their lives doing work that's barely fit for cattle, and start finding other jobs, that are more appropriate for adult human beings with a brain and a decent amount of common sense.

]]>
Pleasure vs pro 2006-11-15T00:00:00Z 2006-11-15T00:00:00Z Jaza https://greenash.net.au/thoughts/2006/11/pleasure-vs-pro/ It's become popular in recent times for people to quit their boring day jobs, and instead to work full-time on something that they really love doing. We've all heard people say: "Now I'm spending every day doing what I enjoy most, and I couldn't be happier."

This has been happening in my industry (the IT industry) perhaps more than in any other. Deflated workers have flocked away — fleeing such diverse occupations as database admin, systems admin, support programmer, and project manager — driven by the promise of freedom from corporate tyranny, and hoping to be unshackled from the manacles of boring and unchallenging work. The pattern can be seen manifesting itself in other industries, too, from education to music, and from finance to journalism. More than ever, people are getting sick of doing work that they hate, and of being employed by people who wouldn't shed a tear if their pet panda kicked the bucket and joined the bleedin' choir invisible.

And why are people doing this? The biggest reason is simply because — now more than ever — they can. With IT in particular, it's never been easier to start your own business from scratch, to develop and market hip new applications in very small teams (or even alone), and to expand your skill set far beyond its humble former self. On top of that, people are being told (via the mass media) not only that they can do it, but that they should. It's all the rage. Doing-what-thou-lovest is the new blue. Considering the way in which a career has been reduced to little more than yet another consumer product (in recent times), this attitude should come as no surprise. After all, a job where you do exactly what you want sounds much better than a job where you do exactly what you're told.

Call me a cynic, but I am very dubious of the truth of this approach. In my experience, as soon as you turn a pleasurable pastime into a profession, you've suddenly added a whole new bucket of not-so-enjoyable tasks and responsibilities into the mix; and in the process, you've sacrificed at least some of the pleasure. You've gone from the humble foothills to the pinnacle of the mountaintop — so to speak — in the hope of enjoying the superior view and the fresh air; only to discover that the mountain frequently spews ash and liquid hot magma from its zenith, thus rather spoiling the whole venture.

When I say that these things are in my experience, I am (of course) referring to my experience in the world of web design and development. I've been doing web design in one form or another for about 8 years now. That's almost as long as I've been online (which is for almost 9 years). I'm proud to say that ever since I first joined the web as one of its netizens (wherever did that term go, anyway? Or did it never really make it in the first place? *shrugs*), at age 12, I've wanted to make my own mark on it. Back then, in 1998, equipped with such formidable tools as Microsoft Word™ and its Save as HTML feature, and inhabiting a jungle where such tags as <blink> and <marquee> were considered "Web Standards", it was all fun and games. In retrospect, I guess I really was making my mark on the web, in the most crude sense of the term. But hey: who wasn't, back then?

From these humble beginnings, my quirky little hobby of producing web sites has grown into a full-fledged business. Well, OK: not exactly full-fledged (I still only do it on the side, in between study and other commitments); but it's certainly something that's profitable, at any rate. Web design (now known as web development, according to the marketing department of my [one-man] company) is no longer a hobby for me. It's a business. I have clients. And deadlines. And accounts. Oh, and a bit of web development in between all that, too. Just a bit.

Don't get me wrong. I'm not trying to complain about what I do. I chose to be a web developer, and I love being a web developer. I'm not saying that it's all a myth, and that you can't work full-time on something that you're passionate about, and retain your passion for it. But I am saying that it's a challenge. I am saying that doing an activity as a professional is very, very different from doing it as an amateur enthusiast. This may seem like an obvious statement, but in Wild Wild West industries such as web development, it's one that not everyone has put much thought into.

Going pro has many distinct advantages: you push yourself harder; you gain much more knowledge of (and experience in) your domain; you become a part of the wider professional community; and of course, you have some bread on the table at the end of the day. But it also has its drawbacks: you have to work all the time, not just when you're in the mood for it; you're not always doing exactly what you want to do, or not always doing things exactly the way you want them done; and worst of all, you have to take care of all the other "usual" things that come with running a small business of any kind. The trick is to make sure that the advantages always outweigh the drawbacks. That's all that any of us can hope for, because drawbacks are a reality in every sphere of life. They don't go away: they just get overshadowed by good things.

Looking back on my choice of career — in light of this whole pleasure-vs-pro argument — I'm more confident than ever that I've made the right move by going into the IT profession. Back when I was in my final year of high school, I was tossing up between a career in IT, and a career in Journalism (or in something else related to writing). Now that IT is my day job, my writing hobby is safe and sound as a pristine, undefiled little pastime. And in my opinion, IT (by its very nature) is much more suitable as a profession than as a pastime, and writing (similarly) is much more suitable as a pastime than as a profession. That's how I see it, anyway.

For all of you who are planning to (or who already have) quit your boring day jobs, in order to follow your dreams, I say: good luck to you, and may you find your dreams, rather than just finding another boring day job! If you're ever feeling down while following your dreams, just think about what you're doing, and you'll realise that you've actually got nothing to feel down about. Nothing at all.

]]>
Must have, must do 2006-09-16T00:00:00Z 2006-09-16T00:00:00Z Jaza https://greenash.net.au/thoughts/2006/09/must-have-must-do/ I have a beautiful little step-sister, who is almost three years old. She has the face of an angel, and she could charm a slab of granite if she had to; but boy, can she put up a storm when she doesn't get her way. Like all three-year-olds, she is very concerned with having things. There are a great many things that she wants to have. Sweets, videos, clothes, toys, rides on the swing, trips to the supermarket, and plastic dummies are some of the higher-priority of these things.

Lately, her choice of vernacular expression has taken an interesting route. And no, it's not the scenic route, either: it's the I-ain't-stuffin'-around freeway express route. She no longer wants things; she needs them. Mummy, I neeed some biscuits, and I neeed to go on the swing. I guess it's the logical choice of words to use, from her point of view: she's worked out that a need is stronger and more urgent than a want; so clearly, using the word 'need' is a more effective way of getting what you 'want'. Of course, she doesn't yet understand the concept of reserving the use of strong language for when it really is 'needed' (no pun intended). In fact, even some adults don't understand this concept.

We humans are naturally selfish creatures. This selfishness is most evident in children, who are completely uninhibited in expressing their every desire. But really, most adults retain this selfishness for their entire lives; the difference is only that they learn to conceal it, to mask it, and to express it more subtly. In many cases, the only things that really change are the desires themselves: the colourful, innocent little desires of childhood are replaced by bigger and less virtuous ones, such as ego, money, and sex.

But actually, there's more to it than this. Perhaps it's just me, but I think that the very nature of our desires changes over time. As children, we are obsessed with owning or possessing things: our entire lives revolve around having toys, having food, having entertainment. But as we get older, we seem to become less concerned with having things, and more concerned with doing things. We're still very much acting selfishly, but our goals and aspirations change dramatically. And in a way, that's what makes all the difference.

I've noticed this change in myself, and in many of the close friends that I've grown up with. When I was a wee lad, for example, I was extremely fond of Lego™. Barely a waking moment went by in which I wasn't salivating over the next Lego model on my wish-list, or plotting cunning ways by which I could obtain more of the stuff. Many other toys and gizmos filled my heart with longing throughout my childhood: TV shows, magic cards, and console / computer games, to name a few. For many years, these were the things that made life worth living. Without them, the world was an empty void.

But as I've grown older and hoarier (although not that hoary), these possessions have begun to seem much less important. Where has it gone, all that desire to accumulate things? It seems to have been extinguished. In its place is a new desire, far more potent than its predecessor ever was: a desire to do things. To see the world. To share my knowledge. To build useful tools. To help out.

I see the same changes in many of my peers. All of the things that they once considered to be of prime importance - the rock-star posters, the model aeroplane collections, the signed baseball caps - it seems that all of a sudden, nobody has time for them anymore. Everybody is too busy doing things: earning a university degree; gaining work experience; volunteering in their spare time. Even socialising seems to have subtly changed: from having friends, to being friends (a small but fundamental change in perception).

Now, while I am arguing that we humans have a greater desire to do positive acts as we enter adulthood, I am not arguing that this desire stems from any noble or benevolent motive. On the contrary, the motive generally remains the same as ever: self-benefit. There are many personal rewards to be found from doing things that make a difference: ego boost, political power, popularity, and money are some of the more common ones. Nevertheless, motives aside, the fact that we have this desire, and that we act on it, is surely a good thing, in and of itself.

This shift in our underlying desires strikes me as a fascinating change, and also as one of the key transitions between childhood and adulthood. Of course, I could be wrong; it could be just me - perhaps everyone else was born wanting to make a difference by doing, and it's just me that was the spoilt, selfish little kid who always wanted more toys to play with. If that's the case, then I can live with that. But until I'm proven wrong, I think I'll stick with my little theory.

]]>
Stop shining that light in my face 2006-08-19T00:00:00Z 2006-08-19T00:00:00Z Jaza https://greenash.net.au/thoughts/2006/08/stop-shining-that-light-in-my-face/ Evangelism. For centuries, many of the world's largest and most influential religions have practised it. The word itself is generally associated with Christianity, and rightly so. But a number of other religions, such as Islam, also actively encourage it.

The idea behind evangelism is that one particular religion is the one true way to find G-d and to live a good life. It is therefore a duty, and an act of kindness, for the followers of that religion to "spread the word", and to help all of humanity to "see the light".

The catalyst behind my writing this article was that I happened to run into a Christian evangelist today, whilst out walking on the streets. I've never actually stopped and talked to one of these people before: my standard procedure is to ignore them when out and about, and to slam the door in their faces when they come a-knocking. This, quite understandably, is also how most other people react. But today I stopped and talked.

To cut a long story short, I walked away from the conversation almost an hour later, more certain than ever that evangelism is a bad idea.

Now, don't get me wrong: I'm all for the spreading of knowledge, and I think that connecting with and learning about religions and cultures outside of your own is a very worthwhile endeavour. I have personally devoted a fair amount of effort to this form of learning, and I don't regret one minute of it.

But imposing your ideas onto others is a whole different ball game. Teaching is one thing, and dictating is quite another. Unfortunately, evangelism is not about the sharing of knowledge or opinions. Sharing would involve telling people: "these are my beliefs, what are yours?" Instead, evangelism involves telling people: "these are my beliefs, and if you know what's good for you, they'll be yours too".

I happen to be a member of the Jewish faith, although I ain't the most religious Jew on the block, and I don't agree with everything that my religion has to say. I believe that Jesus was a great bloke, who obviously performed a great many charitable deeds in his life, and who was revered and respected by many of his contemporaries. As far as I'm concerned, someone who blesses fishermen and promotes world peace is a nice guy.

Nice guy, sure; but not the son of G-d. Nice guy, sure; but not responsible for atoning, for all eternity, for the sins of every man, woman, and child that believes in his divinity. Nice guy. Jewish too, by the way (not Roman). But that's it.

Today, my over-zealous acquaintance in the shopping mall told me his beliefs, which happened to be slightly different to my own. I had no problem with listening to them. According to my acquaintance, Jesus is the son of G-d, he was resurrected from the dead, and he atoned for all the sins of his followers through his death. I am aware that this is the belief held by millions of Christians around the world, and I respect that belief, and I have no desire to impose any other conflicting belief upon any Christian person. I just happen to have a different belief, that's all.

However, after that, things started getting a bit ugly. Next, I was informed that I am in grave danger. It is imperative that I accept a belief in Jesus and in Christianity, because only then will I be forgiven for all of my sins. Should I fail to accept this belief, I am doomed to eternity in hell.

Thanks for the warning, buddy - I appreciate you looking out for me, and I'm grateful that you've been kind enough to help me avoid eternal damnation 'n' all. But actually, I happen to believe that everyone goes to heaven (with a sprinkling of hellish punishment on the way, of course, depending on how much you've sinned), and that I already have a means of getting the all-clear from the Big Man regarding forgiveness, through my own religion.

The response? I'm wrong. I'm doomed. I haven't seen the light. Such a pity - it seemed, at first, that there was hope for me. If only I wasn't so damn stubborn.

Actually, I did see the light. How could I miss it, when it was being shone right in my face? For the sake of everyone's retinas, I say to all evangelists: stop shining that accursed light in our faces! Instead, why don't you practice what you preach, and respect the rights of others to serve G-d and to be charitable in their own way?

I don't respond well to advertisements that proclaim too-good-to-be-true offers. Hence my reaction to the whole "believe-in-my-way-and-all-your-sins-are-forgiven" thing. I also don't respond well to threats. Hence my reaction to the whole "believe-in-my-way-or-spend-eternity-in-hell" thing. It amazes and deeply disturbs me that this crude and archaic form of coercion has been so successful throughout the history of organised religion. But then again, those "$0 mobile phone" deals have been quite successful as well. I guess some people really are a bit simple.

I applaud the millions of Christian people (some of whom are my personal friends or acquaintances) who openly criticise and shun the evangelism of their brethren. It's a relief to know that the majority of people agree with my opinion that evangelism is the wrong way to go.

What this world needs is a bit more respect for others. We need to respect the rights of other people to live out a good life, according to whatever religion or doctrine they choose. We need to accept that if people want to conform to our ways, then they'll come of their own volition, and not through coercion. And we need to accept that imposing one's beliefs upon others is an arrogant, disrespectful, and hostile act that is not appreciated. World peace is a long way off. The practice of evangelism is a sound way to keep it like that. A better alternative is to agree to disagree, and to get on with doing things that really do make the world a better place.

]]>
On death and free minds 2006-06-12T00:00:00Z 2006-06-12T00:00:00Z Jaza https://greenash.net.au/thoughts/2006/06/on-death-and-free-minds/ For many years, a certain scene from a certain movie has troubled me deeply. In The Matrix (1999), there is a scene where the character 'Neo' is killed. Stone dead, no heartbeat for over thirty seconds, multiple bullet holes through the chest. And then his girlfriend, 'Trinity', kisses him; and just like in the fairy tales, he magically comes back to life.

I have had far more than the recommended dosage of The Matrix in my time. In watching the film, my friends and I have developed a tradition of sorts, in that we have always derided this particular scene as being 'fake' and 'medically impossible'. We endlessly applaud the dazzling special effects, the oh-so-cool martial arts, the ultra-quotable screenplay, and the relishably noirish techniques, all of which the film overall is brimming with; nevertheless, we cannot help but feel disappointed by this one spot, where it seems that romantic melodrama has thwarted plausibility.

But like Neo, I believe that I may finally have the answer.

Ironically (and, since The Matrix itself has so much irony of a similar nature, doubly ironically), the answer has been staring me in the face all along. Indeed, like Morpheus, I too "came to realise the obviousness of the truth".

In almost any other movie, there can be no acceptable excuse for bringing someone back from the dead. It's a moviemaking sin. It's the lazy way to set the film's plot back on course, and the cheap way to engender drama and climax. This is because films are expected to follow the same rules that apply in the real world. When someone dies in real life, they stay dead. There is no way to reverse what is, by its very definition, fatal. Even in the realm of sci-fi, where technology is capable of anything, there is an unwritten rule that even the best gadgets cannot undo death - at least not without some serious scientific justification. Surely The Matrix is not exempt from this rule? Surely the film's producers cannot conjure up the impossible on-screen, and then expect something as unscientific as a kiss to qualify as "serious scientific justification"?

But what we can so easily forget, when making all these judgements, is the central theme and message of The Matrix. This message is constantly re-iterated throughout the film:

  • There Are No Rules
  • There is no spoon
  • Free your mind
  • What is real?
  • The mind makes it real

In a nutshell, this quote sums it up best:

What you must learn is that these rules are no different than the rules of a computer system. Some of them can be bent. Others can be broken.

By being 'The One', Neo's job is to understand and fully absorb the concept that everything in life is a rule, and that every rule can be broken. What is a rule? A rule is something that determines cause and effect. What happens if you are able to 'break' a rule? The cause occurs, but the effect does not. Simple as that. The only tricky bit is: how do you break a rule? You choose to deny that the rule applies to you, and therefore you are not governed by that rule. That last bit is the only bit that we haven't (yet?) proven in real life.

So, what are some basic rules that virtually all of the main characters in The Matrix learn to break, and that the audience is generally able to accept as being 'broken'? Gravity. Speed. Agility. Stamina. To name a few.

Matrix agents shooting

When you think about it like this, it becomes clear that death is just another one of these rules; and that if (according to 'Matrix logic') you are able to choose to deny death, then being killed (i.e. 'the cause') should not necessarily result in your dying (i.e. 'the effect'). In fact, when you follow this logic, then it could be argued that had Neo died permanently, he would have been permanently affected by the 'rule of death', and hence the movie would have been inconsistent. Instead, the movie portrays death as 'the ultimate rule to overcome'; and when Neo does succeed in thwarting it, he suddenly finds himself utterly unshackled from the confines that the Matrix attempts to place upon him.

The mind has an instinctive conviction that when the body is fatally injured, the whole bang shoot (i.e. mind / body / soul) ceases to function. By freeing your mind of this conviction, you are freeing yourself from the consequences that the conviction entails. That's the theory being touted, anyway. Personally, I find it to be a very self-centred and arrogant theory, but an enthralling one nonetheless.

This logic has helped me to 'accept' Neo's death and re-animation somewhat. But I still have a healthy level of cynicism, and I hope you do too. Despite all the work I've just done justifying it, the re-animation was a second-rate effort at tying up the story, and it was particularly poorly executed with the Hollywood kiss 'n' all. This sin is only barely explainable, and is certainly not admirable. But at least it is (in my eyes, at least) excusable, which is more than I can say for any other movies that do it.

]]>
A patch of flowers, a patch of code 2006-01-20T00:00:00Z 2006-01-20T00:00:00Z Jaza https://greenash.net.au/thoughts/2006/01/a-patch-of-flowers-a-patch-of-code/ The word patch has several different meanings. The first is that with which a gardener would be familiar. Every gardener has a little patch of this planet that he or she loves and tends to. In the gardening sense, a patch is a rectangular plot of land filled with dirt and flora; but, as gardeners know, it is also so much more than that. It is a living thing that needs care and attention; and in return, it brings great beauty and a feeling of fulfilment. A patch is also a permanent space on the land - a job that is never finished, some would say - and for as long as it is tended, it will flower and blossom.

Another meaning of patch is that usually applied to clothing and fabrics. To a tailor or a seamstress, a patch is a small piece of material, sewn over a hole or a rent in a garment. It is an imperfect and often temporary fix for a permanently damaged spot. A patch in this sense, therefore, connotes ugliness and scarring. A seamstress is hardly going to put as much time and attention into a patch as she would into a new garment: why bother, when it is only a 'quick fix', and will never look perfect anyway? There is no pride to be gained in making unseemly little patches. Designing and crafting something new, where all of the parts blend together into a beautiful whole, is considered a much more noble endeavour.

The latter and less glamorous meaning of this word seems to be the one that applies more widely. When a doctor tends your wounds, he or she may tell you that you're being patched up. Should you ever happen to meet a pirate, you may notice that he wears an eye patch (as well as saying "arrr, me hearties, where be ye wallets and mobile phones?"). And if you're a software programmer, and you find a bug in your program, then you'll know all about patching them pesky bugs, to try and make them go away.

But is it really necessary for software patches to live up to their unsightly reputation? Do patches always have to be quick 'n' dirty fixes, done in order to fix a problem that nobody believes will ever heal fully? Must they always be temporary fixes that will last only a few weeks, or perhaps a few months, before they have to be re-plastered?

As with most rhetorical questions that are asked by the narrator of a piece of writing, and that are posed in a deliberately sarcastic tone, the answer to all of the above is, of course, 'no'. ;-)

I used to believe that software patches were inherently dirty things. This should come as no surprise, seeing that the only time I ever used to encounter patches, was when I needed to run Windows™ Update, which invariably involves downloading a swarm of unruly patches, 99% of which I am quite certain were quick and extremely dirty fixes. I'm sure that there are many others like me, who received their education on the nature of patches from Microsoft; but such people are misinformed, because this is analogous to being educated on the nature of equality by growing up in Soviet Russia.

Now, I'm involved in the Drupal project as a developer. Drupal is an open-source project - like many others - where changes are made by first being submitted to the community, in the form of a patch. I've seen a number of big patches go through the process of review and refinement, before finally being included in Drupal; and more recently, I've written some patches myself, and taken them through this process. Just yesterday, I got a fairly big patch of mine committed to Drupal, after spending many weeks getting feedback on it and improving it.
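For anyone who has never actually seen one: a patch, in the software sense, is just a small text file that records a change, line by line. As a minimal sketch (with a made-up file name and contents — not an actual Drupal patch), here's Python's standard difflib producing one:

```python
import difflib

# A "garment" with a hole in it (a one-character typo),
# and the mended version alongside it.
original = ["Hello, wrold!\n"]
fixed = ["Hello, world!\n"]

# A unified diff records just the change; this is what a
# software patch file literally looks like.
patch = "".join(difflib.unified_diff(
    original, fixed,
    fromfile="greeting.txt", tofile="greeting.txt",
))
print(patch)
# --- greeting.txt
# +++ greeting.txt
# @@ -1 +1 @@
# -Hello, wrold!
# +Hello, world!
```

The '-' line is the hole, and the '+' line is the mend; a tool (or a fellow developer) reads a file of essentially this form, applies it to the original, and the garment is whole again. It's files like this that get posted, reviewed, refined, and finally committed.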

Developing with Drupal has taught me one very important thing about patches. When done right, they are neither quick nor dirty fixes. Patching the right way involves thinking about the bigger picture, and about making the patch fit in seamlessly with everything that already exists. It involves lengthy reviews and numerous modifications, often by a whole team of people, in order to make it as close to perfect as it can practically be.

But most of all, true patching involves dedication, perseverance, and love. A patch done right is a patch that you can be proud of. It doesn't give the garment an ugly square mark that people would wish to hide; it makes the garment look more beautiful than ever it was before. Patching is not only just as noble an endeavour as is designing and building software from scratch; it is even more so, because it involves building on the amazing and well-crafted work of others; and when done right, it allows the work of many people to amazingly (miraculously?) work together in harmony.

And as for patches being temporary and transient - that too is a myth. Any developer knows that once they make a contribution to a piece of software, they effectively own a little piece of its code. They become responsible for maintaining that piece, for improving it, and for giving it all the TLC that it needs. If they stick around and nurture their spot, then in time it will blossom, and its revitalising scent will spread and sweeten all the other spots around it.

Let's stop thinking about patches in their negative sense, and start thinking about them in their positive sense. That's the first step, and the hardest of all. After that, the next step - making and maintaining patches in the positive sense - will be child's play.

]]>
What a mean word 2005-08-12T00:00:00Z 2005-08-12T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/08/what-a-mean-word/ Some words are perfectly suited to their alternative definitions. The word 'mean' is one of these. 'Mean' refers to the average of a set of numbers. For example, you can calculate the mean of your school marks, or the mean of your bank savings, or the mean of many other things. A mean is a cruel, unforgiving, and brutally honest number: in short, it really is a mean number.

What brought this to mind was my recent University results. My marks have been pretty good so far, during my time at uni: overall I've scored pretty highly. But there have been a few times, here and there, where I've slipped a bit below my standard. For those of you that don't know, the big thing that everyone's worried about at uni is their GPA (Grade Point Average), which is - as its name suggests - the mean of all your marks in all your subjects to date.

My GPA is pretty good (I ain't complaining), but it's a number that reflects my occasional slip-ups as clearly as it does my usual on-par performance. Basically, it's a mean number. It's a number that remembers every little bad thing you've done, so that no matter how hard you try to leave your mistakes behind you, they keep coming back to haunt you. It's a merciless number, based purely on facts and logic and cold, hard mathematics, with no room for leniency or compassion.
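The arithmetic behind this mercilessness is trivially simple. Here's a minimal sketch (using made-up, hypothetical marks, not my real ones) of how a single slip-up drags the whole average down, and keeps it down:

```python
def mean(marks):
    """Return the arithmetic mean: sum of the values divided by their count."""
    return sum(marks) / len(marks)

usual_marks = [85, 88, 90, 87]      # hypothetical on-par semesters
with_slip_up = usual_marks + [60]   # the same record, plus one bad result

print(mean(usual_marks))   # 87.5
print(mean(with_slip_up))  # 82.0
```

One below-standard mark out of five knocks five and a half points off the average, and no amount of later on-par performance ever fully erases it: the mean remembers everything.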

A mean makes me think of what (some people believe) happens when you die: your whole life is shown before you, the good and the bad; and all the little things are added up together, in order to calculate some final value. This value is the mean of your life's worth: all your deeds, good and bad, are aggregated together, for The Powers That Be to use in some almighty judgement. Of course, many people believe that this particular mean is subject to a scaling process, which generally turns out to be advantageous to the end number (i.e. the Lord is merciful, he forgives all sins, etc).

Mean is one of many words in the English language that are known as polysemes (polysemy is not to be confused with polygamy, which is an entirely different phenomenon!). A polyseme is a type of homonym (words that are spelt the same and/or sound the same, but have different meanings). But unlike other homonyms, a polyseme is one where the similarity in sound and/or spelling is not just coincidental - it exists because the words have related meanings.

For example, the word 'import' is a homonym, because its two meanings ('import goods from abroad', and 'of great import') are unrelated. Although, for 'import', as for many other homonyms, it is possible to draw a loose connection between the meanings (e.g. perhaps 'of great import' came about because imported goods were historically usually more valuable / 'important' than local goods, or vice versa).

The word 'shot', on the other hand, is clearly a polyseme. As this amusing quote on the polyseme 'shot' demonstrates, 'a shot of whisky', 'a shot from a gun', 'a tennis shot', etc, are all related semantically. I couldn't find a list of polysemes on the web, but this list of heteronyms / homonyms (one of many) has many words that are potential candidates for being polysemes. For example, 'felt' (the fabric, and the past tense of 'feel') could easily be related: perhaps the first person who 'felt' that material decided that it had a nice feeling, and so decided to name the material after that first impression.

I couldn't find 'mean' listed anywhere as a polyseme. In fact, for some strange reason, I didn't even see it under the various lists of homonyms on the net - and it clearly is a homonym. But personally, I think it's both. Very few homonyms are clearly polysemes - for most the issue is debatable, and is purely a matter of speculation, impossible to prove (without the aid of a time machine, as with so many other things!) - but that's my $0.02 on the issue, anyway.

The movie Robin Hood: Men in Tights gives an interesting hypothesis on how and why one particular word in the English language is a polyseme. At the end of the movie, the evil villain King John is cast down by his brother, King Richard. In order to make a mockery of the evil king, Richard proclaims: "henceforth all toilets in the land shall be known as Johns". Unfounded humour, or a plausible explanation? Who knows?

There are surely many more homonyms in the English language that, like 'mean' and 'felt' and 'shot', are also polysemes. If any of you have some words that you'd like to nominate as polysemes, feel free. The more bizarre the explanation, the better.

]]>
The river without a river-bed 2005-08-12T00:00:00Z 2005-08-12T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/08/the-river-without-a-river-bed/ A few days ago, I was fortunate enough to be able to get away from my urban life for the weekend. I escaped the concrete and the cars, I abandoned my WiFi and my WorRies, and I dwelt for two days in a secluded retreat, far from civilisation, amidst a tranquil rainforest-like reserve.

I sat by a river one morning, and watched it for some time. It was a beautiful river: from where I sat, looking upstream, the water was perfectly still and tranquil. Almost like a frozen lake, like the ones you see on the backs of postcards that people send when vacationing in Canada. The tranquil waters were confined by a wide ledge, over which they cascaded in a sleek, thin waterfall. Past the waterfall, the river flowed messily and noisily through a maze of rocks and boulders - sometimes into little rock-pools, sometimes through crevasses and cracks - always onwards and downstream.

It was on one of these boulders, immediately downstream of the waterfall, that I sat and cogitated. I thought about how rivers usually flow from the mountains to the sea (as this one seemed to be doing); about how river valleys are usually V-shaped and steep-sided (as opposed to glacier-formed valleys, which are usually U-shaped and gentle-sided); about how the water flows down the river-bed, constantly, continuously, along the exact same path, for hundreds of millions of years.

How transient, then, is man, who cannot even maintain a constant course for a few thousand years. A man could sit by a river, and watch it for all the days of his life; and in dedicating his entire life (of 80 or so years) thusly, he would have shared in less than a second of the river's life. All the 10,000 or so years of civilised man's time upon this Earth, would equate to about 10 minutes in the passing of the river. Barely long enough to qualify a mention, in nature's reckoning. Blink and you've missed us.

A river is very much a metaphor for the cycle of all things in nature. A river flows from its source, down its long-established river-bed, until it reaches its destination; the water then journeys until it once again reaches its source, and so the cycle continues. The animal kingdom, like a river, is based upon cycles: animals are born, they live out their lives; and when they pass on, their offspring live on to continue the great cycle that is life.

As with a river, the animal kingdom flows steadily down a long-established course; the cycle is the same from one generation to the next. But, also like a river, the animal kingdom may alter its course slightly, from time to time, as external factors force it to adapt to new conditions. If a boulder lands in the middle of a river, the course will change so that the water flows around the boulder; similarly, if food diminishes in the middle of an animal group's grazing area, the group will migrate to a nearby area where food is still plentiful. But the river will never change its course radically, and never of its own accord. The same can be said of the animal kingdom.

But what of mankind?

Mankind was once a member of the animal kingdom. In those times, our lives changed only very gradually (if at all) from one generation to the next. We were but drops in the flow of the great river, endlessly coursing downstream, endlessly being reborn upstream, part of the symmetrical and seemingly perpetual cycle of nature. Like the rest of our animal brethren, we adapted to new conditions when necessary, but we took no steps to instigate change of our own accord. We were subject to the law of inertia: that is, the law that nothing changes unless forced to do so by an external force.

And so it was, that Australopithecus evolved into Homo erectus, which in turn evolved into Homo sapiens; that is, the species that is the mankind of today. But when we reached the end of that line of human evolution, something began that had never happened before. The river, which had altered its course only according to the law of inertia thus far, reached a point where it began to diverge, but without the encouragement of any apparent external force. Mankind began to cascade off in a new direction, when it had no urgent need to do so. It was almost as if the river was flowing uphill.

We have followed this divergence from the natural cyclic flow, unto the present day, and seem if anything to be following it ever more vigorously as the years march on. From the invention of the wheel, to the age of farming and tilling the Earth, to the age of iron and steel, to the industrial revolution, to the space age, and now to the information age: with an ambition fuelled by naught but the sweetness of our own success, we are ploughing ever on, into the unknown.

Mankind is now a river that plots its own course, oblivious to the rocks and the dirt that try feebly to gird it. The river flows as erratically as it does fiercely: the only certainty, it seems, is that it will always flow somewhere new with its next turn. And so there is no cycle: for the waves upon which one generation ride, the next generation never sees; and the new waves are so different in composition from the old, that the two bear little or no resemblance to each other.

A river sans a river-bed,
We craft our muddy track,
Whither will the current lead?
None know, except not back.

We dash the rocks with vigour,
We drown the shrubs and trees,
Destroying all that's in our way,
Not least our memories.

The river-banks once led us,
Along a certain way,
Who leads us now in darkness?
Whatever fool that may.

What is our destination?
What beacon do we seek?
A tower of enlightenment,
Or a desert, dead and bleak.

Is it a good thing or a bad thing, this unstoppable torrent upon which for 10,000 years we have ridden, and which it seems is only just beginning? Is it progress? Is it natural? I would argue with (what I suspect is) the majority: I say that it is a good thing, even if it goes against the example set to us by every single other entity, living or non-living, that we see around us. I advocate progress. I advocate moving forward, and living in times of innovation - even if such times are also inevitably times of uncertainty.

The only thing I don't advocate, is the cost of our progress on the natural environment. If we're going to break free of the law of inertia, and move ever onwards, we have a responsibility to do so without damaging the other forms of life around us. As the threats of global warming and of rising pollution levels are telling us, we are still a part of the natural ecosystem, no matter how hard we try to diverge and isolate ourselves from it. We depend on our planet: we need it to be living, breathing, and healthy; we need to get serious about conservation, for our own sake, if not for nature's.

Nature is a beautiful thing; but it is also filled with a profound and subtle wisdom. I sat upon a rock and watched the river flow by, and thought about life for a while. Those that come after me should be able to do the same. We've laid waste to enough rivers, streams, and (even) seas in our time. Let's be neighbours with those that still remain.

]]>
The glossy future 2005-07-04T00:00:00Z 2005-07-04T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/07/the-glossy-future/ According to science fiction, humanity should by now be exploring the depths of outer space in sleek, warp-powered vehicles. Instead of humble suburban houses, we should all be living in gargantuan towers, that soar up into the sky at impossible angles, and that burrow down deep into the Earth. Instead of growing old and sick, we should be staying young and healthy for an unnaturally long time, thanks to the wonders of modern medicine.

This fantastical Utopia of the 21st century, however, could hardly be further from our own reality. What's interesting to note, though, is that there are a whole bunch of amazing things in this world today that would have been considered science fiction not that long ago; but these things are quite different to what the prophecies of science and of popular fiction have foretold.

The general pattern, as far as I can tell, is that it's the fun, the colourful, and the sexy ideas that inevitably wind up being predicted; and that it's the useful, the mundane, and the boring ideas that actually come into existence, and that (to a greater or lesser extent) become part of the global furnishings in which we live.

Take the Internet, for example. As a basic idea, the Internet is not very sexy (how ironic!): it's just a whole bunch of computers, all around the world, that are able to communicate with each other. How about computers? Even less sexy: a pile of circuits and resistors, that can perform complex calculations real fast. The mobile phone (a.k.a. cell phone)? Big deal: a phone that you can carry in your pocket. These are all things that we take for granted every day. They're also all things that have radically changed the way we live. Only, not in the sexy way that we imagined.

Where are the laser guns? Where are the cool spaceships? Where are the zero-gravity shoes, the holographic worlds, and the robotic people? When can you beam me up? Where's the sexy stuff? Sure, my iPod™ looks cool, but chopping my foot off and swallowing a pill to make it grow back would be so much cooler.

Flavours of prediction

Futuristic predictions come in two main flavours: the sleek, awe-inspiringly glossy flavour; and the crude, depressingly 'realistic' flavour. The science fiction movie genre has gone through both these phases. In the earlier years (the 60s and 70s), the genre seemed to favour the former flavour (the futuristic Utopia); and in the latter years (the 80s and 90s), it seemed to favour the latter flavour (usually in the form of a post-disaster world).

Take, for example, this shot from the classic Stanley Kubrick film, 2001: A Space Odyssey:

Galactic waltz in 2001: A Space Odyssey
Galactic waltz in 2001: A Space Odyssey

This is a beautiful machine, full of graceful curves and subtle gradients - very much compatible with the elegant classical music to which it 'waltzes'. This kind of spaceship is the perfect example of a 'glossy' future - that is, one that has been made to look perfect and alluring. Here's another example of a future that's been glossed up:

Sunset on Coruscant in Star Wars
Sunset on Coruscant in Star Wars

This shot is from Star Wars: Episode II - Attack of the Clones. I'm actually cheating a bit here, because Star Wars is technically set 'a long time ago, in a galaxy far, far away'; and also because Episode II was made recently, not in the 60s or 70s. Also, Star Wars has traditionally had less glossy-looking scenery than this. But nevertheless, this shot gets the point across pretty well: the future is a gleaming city of spires and curvaceous arches, with thousands of little shuttles zooming through the sky.

And now for something completely different, here's a shot from Ridley Scott's film Blade Runner:

Ziggurat in Blade Runner
Ziggurat in Blade Runner

This is a totally different prediction of the future: not attractive, but grotesque; not full of life, but devoid of it; not designed, but mass-manufactured. This is not just a different image, but a whole different philosophy on how to foretell the future. Similar things can be said about this shot from The Matrix Reloaded:

Zion dock in The Matrix Reloaded
Zion dock in The Matrix Reloaded

While this one isn't quite so obviously blocky and noirish as the Blade Runner shot, it still looks more 'realistic' than 'futuristic', if you know what I mean. It's got that industrial, scrap-metal-dump-site kind of look about it.

Conclusions

So, which of these flavours of prediction is accurate? Will the future continue to disappoint, as it has up until now, or will it live up to its glossy predictions?

As I said earlier, it's the cool ideas that get predicted, and the practical ideas that actually happen. Bearing that in mind, I think that many of the cool ideas from science fiction will become a part of our real lives in the not-too-distant future (in fact, many of them are already), but not in the sexy way that they've been portrayed by the mass media.

Many of the ideas that futurists have come up with will never (foreseeably) actually happen, because they're simply too impractical to implement in the real world, or because implementing them is way beyond our current capacity in the realms of science. And, as we've seen in the past few decades, many useful but mundane innovations will continue to spring up, without ever having been predicted by even the most talented of modern prophets. Maybe such ideas are just too simple. Maybe they're too boring. Maybe they're just easier to sell in desktop form, or in pocket form, than in panoramic wide-screen digital surround-sound form.

I don't guarantee that the future will be any glossier than the present. The future might not be as jazzed up as Hollywood has made it.

But one thing's for sure: the future will be cool.

]]>
Busker bigotry 2005-05-14T00:00:00Z 2005-05-14T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/05/busker-bigotry/ I have a confession to make. I am guilty of the crime of busker bigotry. I justify my sin by convincing myself that it's not my fault. It's one of life's greatest dilemmas: you can't drop a coin to every busker that you pass in the street. It's simply not feasible. You'd probably go broke; and even if you could afford to do it, there are a plethora of other reasons against doing it (e.g. constant weight of coins, stopping and getting wet if it's raining, risk of getting mugged, etc).

I was walking through the Devonshire St tunnel, in Sydney's Central Railway Station, about a week ago. 'The tunnel' is one of those places that seems to inexorably draw buskers to it, much like members of the female race are inexorably drawn towards shoe shops (whilst members of the male race are inexorably oblivious to the existence of shoe shops - I can honestly say that I can't remember ever walking past such a place, and consciously noticing or registering the fact that it's there).

Normally, as I walk through the tunnel - passing on average about five buskers - I don't stop to even consider dropping a coin for any of them. It doesn't really matter how desperate or how pitiful they look, nor how much effort they're putting in to their performance. I walk through there all the time. I'm in a hurry. I don't have time to stop and pull out my wallet, five times per traverse, several traverses per week. And anyway, I didn't ask to be entertained whilst commuting, so why should I have to pay?

But last week, I was walking through the tunnel as usual, when I saw some buskers that I just couldn't not stop and listen to. They were a string quartet. Four musicians, obviously talented and accomplished professionals, playing a striking piece of classical music. Their violin and cello cases were open on the ground, and they weren't exactly empty. Evidently, a lot of people had already felt compelled to stop and show their appreciation. I too felt this need. I pulled out my wallet and gave them a few coins.

This is a rare act for me to engage in. What was it that triggered me to support these particular buskers, when I had indifferently ignored so many before them, and would continue to ignore so many after them? What mode of measurement had I used to judge their worthiness, and why had I used this mode?

The answer is simple. I stopped because I thought: These guys are good. Really good. I like the music they're playing. I'm being entertained by them. The least I can do, in return for this, is to pay them. Basically, I felt that they were doing something for me - I was getting something out of their music - and so I felt obliged to pay them for their kind service.

And then it occurred to me what my mode of measurement is: my judgement of a busker is based solely on whether or not I notice and enjoy their music enough to warrant my stopping and giving them money. That is, if I consciously decide that I like what they're playing, then I give money; otherwise, I don't.

Nothing else enters into the equation. Fancy instruments, exotic melodies, and remarkable voices contribute to the decision. Pitiful appearance, desperate pleas, and laudable (if fruitless) effort do not. Talk about consumer culture. Talk about heartless. But hey, don't tell me you've never done the same thing. We all have. Like I said, you can't stop for every busker. You have to draw the line somewhere, and somehow.

]]>
Desire and suffering 2005-03-23T00:00:00Z 2005-03-23T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/03/desire-and-suffering/ I remember learning once about the Eastern philosophy relating to the nature of suffering. It was during a religious studies course that I took during my senior years of high school. Part of the course involved a study of Buddhism, and the Buddhist / Hindu ideas about what causes suffering. Those of you that are somewhat familiar with this, you should already have an idea of what I'm talking about. Those of you that are very familiar with it, you probably know far more than I do (I'm no expert on the matter, I have only a very basic knowledge), so please forgive me for any errors or gross simplifications that I make.

In essence (as I was taught, anyway), Buddhists believe that all suffering is caused by desire. It's really quite a logical concept:

  • we desire what we do not have;
  • we suffer because we desire things and we do not have them;
  • therefore, if we free ourselves from desire (i.e. if we do not desire anything), then we become free of suffering (i.e. we achieve the ultimate level of happiness in life).

I haven't read the book The Art of Happiness, by His Holiness the Dalai Lama (I'm waiting for the movie to come out), but I'm guessing that this is the basic message that it gives. Although I could be wrong - if you really want to know, you really should just buy the book!

The concept is so simple, and when you think about it, it's kind of cool how it just makes sense™. Put it in the perspective of modern Western culture, which is (in stark contrast to this philosophy) totally centred around the consumer and the individual's wants. In Western society, our whole way of thinking is geared towards fulfilling our desires, so that we can then be happy (because we have what we want). But as everyone knows, the whole Western individual-consumer-selfish-driven philosophy is totally flawed in practice, because:

  • as soon as we fulfil one desire, it simply leads to more desires (the old OK, I've bought a Mercedes, now I want a Porsche example comes to mind here);
  • there are heaps of desires that everyone has, that will never be fulfilled (hence you will never truly be happy).

Then there is the great big fat lie of the consumer era: things that we desire aren't really desires, because most of them are actually things that we need, not things that we want. Justifying that we need something has become second nature. I need the new 40GB iPod, because otherwise I'll go crazy sitting on the train for 2 hours each way, each day, on the way to work. I need a top-of-the-range new computer, because my old one is too slow and is stopping me from getting my work done productively. I need a designer jacket, because it's nearly winter and my old ones are ready for the bin, and I'll be cold without it. I need the biggest thing on the menu at this restaurant, because I'm really hungry and I haven't eaten since lunch. We know, deep down, that we don't need any of these things, just as we know that having them won't make us "happy". But we kid ourselves anyway.

And this is where the whole Buddhism thing starts to make sense. Hang on, you think. If I just stop desiring that new iPod, and that fast PC, and that designer jacket, and that massive steak, then I won't have the problem of being unhappy that I haven't got them. I don't really need them anyway, and if I can just stop wanting them, I'll be better off anyway. I can spend my money on more important things. And so you see how, as I said, it really does just make sense™.

But all this got me thinking, what about other things? Sure, it's great to stop desiring material objects, but what of the more abstract desires? There are other things that we desire, and that we suffer from because of our desire, but that don't lead to simply more greed. Love is the obvious example. We all desire love, but once you've found one person that you love, then (assuming you've found someone you really love) you stop desiring more people to love (well... that's the theory, anyway - but let's not get into that :-)).

Another is knowledge. Now, this is a more complicated one. There are really no constants when it comes to knowledge. Sometimes you desire knowledge, and sometimes you don't. Sometimes fulfilling your desire (for knowledge) leads to more desire, but other times, it actually stops you desiring any more. Sometimes fulfilling a desire for knowledge makes you happy, and sometimes it makes you realise that it wasn't such a good idea to desire it in the first place.

Take for example history. You have a desire to learn more about World War II. You have a thirst for knowledge about this particular subject. But when you actually fulfil that desire, by acquiring a degree of knowledge, it leads to a desire not to know any more. Learning about the holocaust is horrible - you wish you'd never learnt it in the first place. You wish you could unlearn it.

Take another example, this time from the realm of science. In this example, assume that you have no desire whatsoever to learn about astronomy. You think it's the most boring topic on Earth (or beyond :-)). But then someone tells you about how they've just sent a space probe to Titan (one of Saturn's moons), and how it's uncovering new facts that could lead to the discovery of life beyond Earth. Suddenly, you want to learn more about this, and your initial lack of desire turns into an eventual desire for more knowledge.

Clearly, knowledge and the desire for it cannot be explained with the same logic that we were using earlier. It doesn't follow the rules. With knowledge, desire can lead to no desire, and vice versa. Fulfilment can lead to sadness, or to happiness. So the question that I'm pondering here is basically: is it bad to desire knowledge? Is this one type of desire that it's good to have? Is there any constant effect of attaining knowledge, or does it depend entirely on what the knowledge is and how you process it?

My answer would be that yes, it's always good to desire knowledge. Even if you cannot say with certainty whether the result of attaining knowledge is "good" or "bad" (if such things can even be measured), it's still good to always desire to know more, just for the sake of being a more informed and more knowledgeable human being. Of course, I can't even tell you what exactly knowledge is, and how you can tell knowledge apart from - well, from information that's total rubbish - that would be a whole new topic. But my assertion is that whatever the hell knowledge is, it's good to have, it's good to desire, and it's good to accumulate over the long years of your life.

Buddhist philosophy probably has its own answers to these questions. I don't know what they are, so because of my lack of knowledge (otherwise known as ignorance!), I'm trying to think of some answers of my own. And I guess that in itself is yet another topic: is it better to be told the answers to every big question about life and philosophy, or is it sometimes better to come up with your own? Anyway, this is one little complex question that I'm suggesting an answer to. But by no means is it the right answer. There is no right answer. There is just you and your opinion. Think about it, will you?

]]>
The fantasy genre: an attempt to explain its success 2005-02-21T00:00:00Z 2005-02-21T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/02/the-fantasy-genre-an-attempt-to-explain-its-success/ It has now been more than 50 years since J.R.R. Tolkien published the first volume of his epic trilogy - The Lord of the Rings - marking the birth of the modern fantasy genre. Tolkien's masterpiece was voted "best book of the 20th century" a few years ago, and it has been read and cherished by millions of devotees worldwide. Ever since then, fantasy books have been springing up left, right, and centre; and judging by the way they keep on selling (and selling), it seems that the fans just can't get enough of them. David Eddings' Belgariad (and others); Robert Jordan's Wheel of Time; Terry Goodkind's Sword of Truth; and Ursula LeGuin's Earthsea: these are but a handful of the vast treasure trove of fantasy works, to be found on the shelves of bookshops the world over.

But just what is it that makes these books so damn popular? As I lay awake last night, having just finished reading Harry Potter and the Goblet of Fire (I'll talk about J.K. Rowling's work in a minute), I pondered this question, and I would now like to share with you all the conclusions that I came up with.

We all know that fantasy books all have a lot in common. A lot. In fact, most of them are so similar, that once you've read a few, you could read the rest of them with your eyes shut, and still be able to take a pretty good guess at what the plot is. They all follow the classic formula, which goes something along these lines:

  1. Young boy (or girl, or creature... or hobbit) grows up in rural setting, living peaceful childhood, knowing nothing about magic, dark lords, wizards, etc.
  2. Hero is forced to leave beloved home (may or may not have found true love by now), because dark power is growing stronger, and hero must go on dangerous journey to escape.
  3. Hero soon realises that his parents/ancestors (whom he never met - up until now he thought those nice country bumpkins who raised him were his closest family) were powerful wizards, or kings/queens, or something else really famous.
  4. Hero discovers that he has amazing magical powers, and that he was born with a destiny to overthrow some terrifying evil power. He is taught how to use his cool abilities.
  5. Hero overcomes all odds, battles huge monsters, forges empires, unites many nations, fulfils several gazillion prophecies, defeats dark lord (who always has some weird, hard-to-pronounce name - not that that matters, because everyone is scared to pronounce it anyway), marries beautiful princess/sorceress love of his life (who is also really brave, fulfilled many prophecies, and helped him do battle), and everyone lives happily ever after.

Fantasy books are also always set in a pre-industrial world, where by some amazing miracle, every nation on the planet has managed to stay in the middle ages for several thousand years, without a single person inventing things such as gunpowder, petrol engines, or electronics (although they have all the things that would naturally lead up to such inventions, such as steel, sulfur, etc). Instead of all these modern inventions, fantasy worlds are filled with magic. In most books, magic is a natural phenomenon that some people are born with, and that most are not. There are various magical creatures (e.g. elves, vampires, dragons), all sorts of magical artefacts, and usually various branches (or specialised areas) within the field of magic.

The common argument for why fantasy books are so popular, is because people say that they're "perfect worlds". Yet this is clearly not so: if they're perfect, then why do they all have dark lords that want to enslave the entire human race; terrifying and revolting evil creatures; warring nations that don't get on with each other; and hunger, disease, poverty, and all the other bad things that afflict us the same way in the real world? Fantasy worlds may have magic, but magic doesn't make the world perfect: as with anything, magic is just a tool, and it can be used (and is used) for good or for evil.

So fantasy worlds aren't perfect. If they were, then the whole good vs evil idea, which is central to every fantasy book, would be impossible to use as the basis for the book's plot.

Now, let's go back to that idea of a pre-industrial world. Ever since the industrial revolution of the 1800s, many people upon this Earth have grown worried that we are destroying the planet, that we are making the world artificial, and that all the beautiful natural creations that make up the planet are wasting away. Fantasy worlds, which are all set before the growth of industry, are lacking this ugly taint. Fantasy worlds are always natural.

And that, in my opinion, is why people can't get enough of fantasy books. Everything about fantasy worlds is natural. The environment is natural: most of the world is untouched wilderness, and human settlements (e.g. farms and cities) do not have a noticeable impact upon that environment. It's a bit like those strategy-based computer games, where no matter what you do or build upon the terrain map, you have no effect upon the map itself: it remains the same. The people are natural: they show qualities like bravery, honour, dignity, and trust; qualities that many consider to be disappearing from the human race today. Even magic, which is not part of any natural world that we know, is a natural part of the fantasy landscape: people are born with it, people use it instinctively, and it can be used to accomplish things that are still not possible with modern technology, but in a natural and clean way.

Harry Potter is unique among fantasy books, because unlike most fantasy worlds, which are totally contrived and are alien to our own, the world of Harry Potter is part of our own modern world. This lets us contrast the fantasy environment with our own very clearly: and the result, almost invariably, is that we find the fantasy world much more attractive than our own. It may lack all the modern wonders that we have artificially created - and that we consider to be our greatest achievements - but it seems to easily outdo these things with its own, natural wonders: magic, raw human interaction, and of course, a pristine natural environment.

It's quite ironic, really, that we so often applaud ourselves for being at the height of our intellectual, social, and technological greatness (and we're getting higher every day, or so we're told), but that when it comes down to it, we consider a world without all these great things to be much more appealing. Traversing a continent on horseback seems far more... chivalric... than zooming over it in a 747. The clash of a thousand swords seems much more... glorious... than the bang of a million guns. And the simple act of lighting a candle, and reading great and spectacular works of literature in leather-bound volumes in the dark of night, seems much more... fulfilling... than turning on a light bulb, and booting up your computer, so that you can go online and read people's latest not-at-all-great-or-spectacular blog entries, in one of many sprawling big cities that never sleep.

]]>
Why junk collecting is good 2005-01-21T00:00:00Z 2005-01-21T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/01/why-junk-collecting-is-good/ Everyone collects something that's a little weird. Sport fanatics collect footy jerseys and signed cricket bats. Internet junkies collect MP3s and ripped DVDs. Nature lovers collect pets and plants. People with too much money collect Ferraris and yachts and tropical islands. People who can't think of anything else to collect go for stamps.

Personally, I collect tickets. I absolutely love collecting every ticket that I (or my friends or family or acquaintances) can get my hands on. It started off as just train tickets, but over the years my collection has grown to include movie tickets, bus tickets, plane tickets, ski tickets, theme park tickets, concert tickets, and many more.

When I see my friends after not having seen them for a while, I am greeted not with a handshake or a hug, but with a formidable pile of tickets that they've saved up for me. When my wallet gets too full with them, I empty them out and add them to the ever-burgeoning box in my room. I even received several thousand train tickets for my birthday last year. Basically, I am a ticket whore. No matter how many of the ruddy things I get, my thirst for more can never be quenched.

The obvious question that has confronted me many times: why? Why oh why, of all things, do I collect tickets? I mean, it's not like I do anything with them - they just sit in my room in a big box, and collect dust! What's the point? Do I plan to ever use these tickets to contribute to the global good of humanity? Am I hoping that in 5000 years' time, they'll be vintage collector's items, and my great-x104-grandchildren will be able to sell them and become millionaires? Am I waiting until I have enough that if I chucked them in a big bucket of water and swirled them around, I'd have enough recycled paper to last me for the rest of my life? WHY?

The answer to this is the same as the answer to why most people collect weird junk: why not? No, at the moment I don't have any momentous plans for my beloved tickets. I just continue to mindlessly collect them, as I have done for so long already. But does there have to be a reason, a point, a great guiding light for everything we do in life? If you ask me, it's enough to just say: "I've done it for this long, a bit more couldn't hurt". Of course, this philosophy applies strictly to innocent things such as ticket-collecting: please do not take this to imply in any way that I condone serial killing / acts of terrorism / etc etc, under the same umbrella. But for many other weird junk-collecting hobbies (e.g. sand grain collecting, rusty ex-electronic-component collecting, leaf collecting - and no, I don't collect any of these! ... yet :-)), the same why not principle can and should be applied.

So yes, ticket collecting is pointless, and despite that - no, because of that - I will continue to engage in it for as long as I so fancy. No need to spurt out any philosophical mumbo-jumbo about how I'm making a comment on nihilism or chaos theory or the senselessness of life or any of that, because I'm not. I just like collecting tickets! Plans for the future? I will definitely count them one day... should be a riveting project. I may take up the suggestion I was given once, about stapling them all together to make the world's longest continuous string of tickets. Yeah, Guinness Book of Records would be good. But then again, I might do nothing with them. After all, there's just so many of them!

]]>
Always read the book first 2005-01-16T00:00:00Z 2005-01-16T00:00:00Z Jaza https://greenash.net.au/thoughts/2005/01/always-read-the-book-first/ I've been saying it for years now, but I'll say it again: seeing the movie before reading the book is always a bad idea. For one thing, nine times out of ten, the movie is nowhere near as good as the book. And in the rare cases where it is better, it still leaves out huge chunks of the book, while totally spoiling other parts! Also, the book is invariably written (long) before the movie is made (at least, it is for every book-movie combo that I can think of - maybe there are some exceptions). So if the author wrote the book before the producer made the movie, it makes sense that you should read the book before seeing the movie (I know, feeble logic alert - but hey, who asked you anyway?).

The reason for my sudden urge to express this opinion is a particular series of books that I'm reading now. I've (stupidly) been putting off reading Harry Potter for many years, but have finally gotten round to it. Unfortunately, I saw two of the movies - 'Harry Potter and the Philosopher's Stone', and 'Harry Potter and the Prisoner of Azkaban' - before starting the books. Although I was reluctant to see them, I managed to put off reading the books on my own, but I couldn't get out of seeing the movies with my friends.

Luckily, when I started reading 'Harry Potter and the Philosopher's Stone' (that's 'sorcerer' for all you Americans out there), I couldn't remember much of the movie, so the book wasn't too spoiled. However, having a predefined visual image of the characters was a definite drawback (unable to let the imagination flow), as was knowledge of some of the Potter-lingo (e.g. 'muggles'), and of the nature of magic. 'Harry Potter and the Chamber of Secrets' (which I still haven't seen the movie for) was a much better read, as I knew nothing of the storyline, and had no predefined image of all the new characters.

I'm up to the third one now ('Prisoner of Azkaban'), and having seen the movie not that long ago, I can remember most of it pretty clearly. To be quite honest, having the movie in my head is ruining the book. I'm not digging any of the suspense, because I already know what's going to happen! There are no grand visual concoctions growing in my head, because I've already got some shoved in there! It's a downright pain, and I wish I'd never seen the movie. I'm definitely not seeing any more 'Harry Potter' movies until I've finished the books.

This is in contrast to my experience with 'Lord of the Rings'. I honestly believe this to be the best book of all time, but perhaps if I'd seen the movie(s) first, rather than reading all the books twice before seeing any of the movies, my opinion might differ. The movies of LOTR are absolute masterpieces, no doubt about that. But seeing them after having read the books makes them infinitely more worthwhile. When you see the landscapes on the big screen, you also see the kings and queens and battles and cities of long-gone history, that aren't part of the movie, and that non-readers have no idea about. When you hear the songs (usually only in part), you know the full verses, and you know the meaning behind them. And when things are done wrongly in the movie, they stick out to you like a sore thumb, while to the rest of the audience they are accepted as the original gospel truth. Tragic, nothing less.

So my advice, to myself and to all of you, is to always read the book first, because it's always better than the movie, and while watching the movie (first) spoils the book, doing the reverse has the opposite effect!

]]>