There are many aspects of code that you can care about. Formatting. Modularity. Meaningful naming. Performance. Security. Test coverage. And many more. Even if you care about just one of these, then: (a) I salute you, for you are a good dev; and (b) that means that you're passionate about code, which in turn means that you'll care about more aspects of code as you grow and mature, which in turn means that you'll develop more of them there skills, as a natural side effect. The fact that you care, however, is the foundation of it all.
If you care about code, then code isn't just a means to an end: it's an end unto itself. If you truly don't care about code at all, but only what it accomplishes, then not only are you not a good dev, you're not really a dev at all. Which is OK, not everyone has to be a dev. If what you actually care about is that the "Unfranked Income YTD" value is accurate, then you're probably a (good) accountant. If it's that the sidebar is teal, then you're probably a (good) graphic designer. If it's that national parks are distinguishable from state forests at most zoom levels, then you're probably a (good) cartographer. However, if you copy-pasted and cobbled together snippets of code to reach your goal, without properly reading or understanding or caring about the content, then I'm sorry, but you're not a (good) dev.
Of course, a good dev needs at least some "hard" skills too. But, as anyone who has ever interviewed or worked with a dev knows, those skills – listed so prominently on CVs and in JDs – are pretty worthless if they're not backed by quality work. Great, 10 years of C++ experience! And you've always given all variables one-character names? Great, you know Postgres! But you never add an index until lots of users complain that a page is slow? Great, a Python ninja! What's that, you just write one test per piece of functionality, and it's a Selenium test? Call me harsh, but those sound to me like devs who just don't care.
"Soft" skills are even easier to rattle off on CVs and in JDs, and are worth even less if accompanied by the wrong attitude. Conversely, if a dev has the right attitude, then these skills flourish pretty much automatically. If you care about the code you write, then you'll care about documentation in wiki pages, blog posts, and elsewhere. You'll care about taking the initiative in efforts such as refactoring. You'll care about collaborating with your teammates more. You'll care enough to communicate with your teammates more. "Caring" is the biggest and the most important soft skill of them all!
Formal education in programming (from a university or elsewhere) certainly helps with developing your skills, and it can also start you on your journey of caring about code. But you can find it in yourself to care, and you can learn all the tools of the trade, without any formal education. Many successful and famous programmers are proof of that. Conversely, it's possible to have a top-notch formal education up your sleeve, and to still not actually care about code.
It's frustrating when I encounter code that the author clearly didn't care about, at least not in the same ways that I care. For example, say I run into a thousand-line function. Argh, why didn't they break it up?! It might bother me first and foremost because I'm the poor sod who has to modify that code, 5 years later; that is, now it's my problem. But it would also sadden me, because I (2021 me, at least!) would have cared enough to break it up (or at least I'd like to think so), whereas that dev at that point in time didn't care enough to make the effort. (Maybe that dev was me 5 years ago, in which case I'd be doubly disappointed, although wryly happy that present-day me has a higher care factor).
Some aspects of code are easy to start caring about. For example, meaningful naming. You can start doing it right now, no skills required, except common sense. You can, and should, make this New Year's resolution: "I will not name any variable, function, class, file, or anything else x, I will instead name it num_bananas_in_tummy"! Then follow through on that, and the world will be a better place. Amen.
Others are more challenging. For example, test coverage. You need to first learn how to write and run tests in one or more programming languages. That has gotten much easier over the past few decades, depending on the language, but it's still a learning curve. You also need to learn the patterns of writing good tests (which can be a whole specialised career in itself). Plus, you need to understand why tests (particularly unit tests), and test coverage, are important at all. Only then can you start caring. I personally didn't start writing or caring about tests until relatively recently, so I empathise with those of you who haven't yet got there. I hope to see you soon on the other side.
I suspect that this theory of mine applies in much the same way, to virtually all other professions in the world. Particularly professions that involve craftsmanship, but other professions too. Good pharmacists actually care about chemical compounds. Good chefs actually care about fresh produce. Good tailors actually care about fabrics. Good builders actually care about bricks. It's not enough to just care about the customers. It's not enough to just care about the end product. And it's certainly not enough to just care about the money. In order to truly excel at your craft, you've got to actually care about the raw material.
I'm not writing this as an attack on anyone that I know, or that I've worked with, or whose code I've seen. In fact, I've been fortunate in that almost all fellow devs with whom I have crossed paths, are folks who have demonstrated that they care, and who are therefore, in my humble opinion, good devs. And I'm not trying to make myself out to be the patron saint of caring about code, either. Sorry if I sound patronising in this article. I'm not perfect any more than anyone else is. Plenty of people care more than I do. And different people care about different things. And we're all on a journey: I cared about fewer aspects of code 10 years ago than I do now; and I hope to care about more aspects of code than I do today, 10 years in the future.
I was also surprised to learn, after doing a modest bit of research, that Tolstoy is seldom mentioned among the prominent figures in philosophy or metaphysics over the past several centuries. The only articles that even deign to label Tolstoy as a philosopher, are ones that are actually more concerned with Tolstoy as a cult-inspirer, as a pacifist, and as an anarchist.
So, while history has been just and generous in venerating Tolstoy as a novelist, I feel that his contribution to the field of philosophy has gone unacknowledged. This is no doubt in part because Tolstoy didn't consider himself a philosopher, and because he didn't pen any purely philosophical works (published separately from novels and other works), and because he himself criticised the value of such works. Nevertheless, I feel warranted in asking: is Tolstoy a forgotten philosopher?
The concept of free will that Tolstoy articulates in War and Peace (particularly in the second epilogue), in a nutshell, is that there are two forces that influence every decision at every moment of a person's life. The first, free will, is what resides within a person's mind (and/or soul), and is what drives him/her to act per his/her wishes. The second, necessity, is everything that resides external to a person's mind / soul (that is, a person's body is also for the most part considered external), and is what strips him/her of choices, and compels him/her to act in conformance with the surrounding environment.
Whatever presentation of the activity of many men or of an individual we may consider, we always regard it as the result partly of man's free will and partly of the law of inevitability.
War and Peace, second epilogue, chapter IX
A simple example that would appear to demonstrate acting completely according to free will: say you're in an ice cream parlour (with some friends), and you're tossing up between getting chocolate or hazelnut. There's no obvious reason why you would need to eat one flavour vs another. You're partial to both. They're both equally filling, equally refreshing, and equally (un)healthy. You'll be able to enjoy an ice cream with your friends regardless. You're free to choose!
You say: I am not and am not free. But I have lifted my hand and let it fall. Everyone understands that this illogical reply is an irrefutable demonstration of freedom.
War and Peace, second epilogue, chapter VIII
And another simple example that would appear to demonstrate being completely overwhelmed by necessity: say there's a gigantic asteroid on a collision course for Earth. It's already entered the atmosphere. You're looking out your window and can see it approaching. It's only seconds until it hits. There's no obvious choice you can make. You and all of humanity are going to die very soon. There's nothing you can do!
A sinking man who clutches at another and drowns him; or a hungry mother exhausted by feeding her baby, who steals some food; or a man trained to discipline who on duty at the word of command kills a defenseless man – seem less guilty, that is, less free and more subject to the law of necessity, to one who knows the circumstances in which these people were placed …
War and Peace, second epilogue, chapter IX
However, the main point that Tolstoy makes regarding these two forces, is that neither of them does – and indeed, neither of them can – ever exist in absolute form, in the universe as we know it. That is to say, a person is never (and can never be) free to decide anything 100% per his/her wishes; and likewise, a person is never (and can never be) shackled such that he/she is 100% compelled to act under the coercion of external agents. It's a spectrum! And every decision, at every moment of a person's life (and yes, every moment of a person's life involves a decision), lies somewhere on that spectrum. Some decisions are made more freely, others are more constrained. But all decisions result from a mix of the two forces.
In neither case – however we may change our point of view, however plain we may make to ourselves the connection between the man and the external world, however inaccessible it may be to us, however long or short the period of time, however intelligible or incomprehensible the causes of the action may be – can we ever conceive either complete freedom or complete necessity.
War and Peace, second epilogue, chapter X
So, going back to the first example: there are always some external considerations. Perhaps there's a little bit more chocolate than hazelnut in the tubs, so you'll feel just that little bit guilty if you choose the hazelnut, that you'll be responsible for the parlour running out of it, and for somebody else missing out later. Perhaps there's a deal that if you get exactly the same ice cream five times, you get a sixth one free, and you've already ordered chocolate four times before, so you feel compelled to order it again this time. Or perhaps you don't really want an ice cream at all today, but you feel that peer pressure compels you to get one. You're not completely free after all!
If we consider a man alone, apart from his relation to everything around him, each action of his seems to us free. But if we see his relation to anything around him, if we see his connection with anything whatever – with a man who speaks to him, a book he reads, the work on which he is engaged, even with the air he breathes or the light that falls on the things about him – we see that each of these circumstances has an influence on him and controls at least some side of his activity. And the more we perceive of these influences the more our conception of his freedom diminishes and the more our conception of the necessity that weighs on him increases.
War and Peace, second epilogue, chapter IX
And, going back to the second example: you always have some control over your own destiny. You have but a few seconds to live. Do you cower in fear, flat on the floor? Do you cling to your loved one at your side? Do you grab a steak knife and hurl it defiantly out the window at the approaching asteroid? Or do you stand there, frozen to the spot, staring awestruck at the vehicle of your impending doom? It may seem pointless, weighing up these alternatives, when you and your whole world are about to be pulverised; but aren't your last moments in life, especially if they're desperate last moments, the ones by which you'll be remembered? And how do you know for certain that there will be nobody left to remember you (and does that matter anyway)? You're not completely bereft of choices after all!
… even if, admitting the remaining minimum of freedom to equal zero, we assumed in some given case – as for instance in that of a dying man, an unborn babe, or an idiot – complete absence of freedom, by so doing we should destroy the very conception of man in the case we are examining, for as soon as there is no freedom there is also no man. And so the conception of the action of a man subject solely to the law of inevitability without any element of freedom is just as impossible as the conception of a man's completely free action.
War and Peace, second epilogue, chapter X
Tolstoy's philosophical propositions in War and Peace were heavily influenced by the ideas of one of his contemporaries, the German philosopher Arthur Schopenhauer. In later years, Tolstoy candidly expressed his admiration for Schopenhauer, and he even went so far as to assert that, philosophically speaking, War and Peace was a repetition of Schopenhauer's seminal work The World as Will and Representation.
Schopenhauer's key idea, was that the whole universe (at least, as far as any one person is concerned) consists of two things: the will, which doesn't exist in physical form, but which is the essence of a person, and which contains all of one's drives and desires; and the representation, which is a person's mental model of all that he/she has sensed and interacted with in the physical realm. However, rather than describing the will as the engine of one's freedom, Schopenhauer argues that one is enslaved by the desires imbued in his/her will, and that one is liberated from the will (albeit only temporarily) by aesthetic experience.
Schopenhauer's theories were, in turn, directly influenced by those of Immanuel Kant, who came a generation before him, and who is generally considered the greatest philosopher of the modern era. Kant's ideas (and his works) were many (and I have already written about Kant's ideas recently), but the one of chief concern here – as expounded primarily in his Critique of Pure Reason – was that there are two realms in the universe: the phenomenal, that is, the physical, the universe as we experience and understand it; and the noumenal, that is, a theoretical non-material realm where everything exists as a "thing-in-itself", and about which we know nothing, except for what we are able to deduce via practical reason. Kant argued that the phenomenal realm is governed by absolute causality (that is, by necessity), but that in the noumenal realm there exists absolute free will; and that the fact that a person exists in both realms simultaneously, is what gives meaning to one's decisions, and what makes them able to be measured and judged in terms of ethics.
We can trace the study of free will further through history, from Kant, back to Hume, to Locke, to Descartes, to Augustine, and ultimately back to Plato. In the writings of all these fine folks, over the millennia, there can be found common concepts such as a material vs an ideal realm, a chain of causation, and a free inner essence. The analysis has become ever more refined with each passing generation of metaphysics scholars, but ultimately, it has deviated very little from its roots in ancient times.
There are certainly parallels between Tolstoy's War and Peace, and Schopenhauer's The World as Will and Representation (and, in turn, with other preceding works), but I for one disagree that the former is a mere regurgitation of the latter. Tolstoy is selling himself short. His theory of free will vs necessity is distinct from that of Schopenhauer (and from that of Kant, for that matter). And the way he explains his theory – in terms of a "spectrum of free-ness" – is original as far as I'm aware, and is laudable, if for no other reason, simply because of how clear and easy-to-grok it is.
It should be noted, too, that Tolstoy's philosophical views continued to evolve significantly, later in his life, years after writing War and Peace. At the dawn of the 1900s (by which time he was an old man), Tolstoy was best known for having established his own "rational" version of Christianity, which rejected all the rituals and sacraments of the Orthodox Church, and which gained a cult-like following. He also adopted the lifestyle choices – extremely radical at the time – of becoming vegetarian, of renouncing violence, and of living and dressing like a peasant.
War and Peace is many things. It's an account of the Napoleonic Wars, their bloody battles, their geopolitik, and their tremendous human cost. It's a nostalgic illustration of the old Russian aristocracy – a world long gone – replete with lavish soirees, mountains of servants, and family alliances forged by marriage. And it's a tenderly woven tapestry of the lives of the main protagonists – their yearnings, their liveliest joys, and their deepest sorrows – over the course of two decades. It rightly deserves the praise that it routinely receives, for all those elements that make it a classic novel. But it also deserves recognition for the philosophical argument that Tolstoy peppers throughout the text, and which he dedicates the final pages of the book to making more fully fledged.
However, as anyone exposed to the industry knows, the current state-of-the-art is still plagued by fundamental shortcomings. In a nutshell, the current generation of AI is characterised by big data (i.e. a huge amount of sample data is needed in order to yield only moderately useful results), big hardware (i.e. a giant amount of clustered compute resources is needed, again in order to yield only moderately useful results), and flawed algorithms (i.e. algorithms that, at the end of the day, are based on statistical analysis and not much else – this includes the latest Convolutional Neural Networks). As such, the areas of success (impressive though they may be) are still dwarfed by the relative failures, in areas such as natural language conversation, criminal justice assessment, and art analysis / art production.
In my opinion, if we are to have any chance of reaching a higher plane of AI – one that demonstrates more human-like intelligence – then we must lessen our focus on statistics, mathematics, and neurobiology. Instead, we must turn our attention to philosophy, an area that has traditionally been neglected by AI research. Only philosophy (specifically, metaphysics and epistemology) contains the teachings that we so desperately need, regarding what "reasoning" means, what is the abstract machinery that makes reasoning possible, and what are the absolute limits of reasoning and knowledge.
There are many competing theories of reason, but the one that I will be primarily relying on, for the rest of this article, is that which was expounded by 18th century philosopher Immanuel Kant, in his Critique of Pure Reason and other texts. Not everyone agrees with Kant, however his is generally considered the go-to doctrine, if for no other reason (no pun intended), simply because nobody else's theories even come close to exploring the matter in such depth and with such thoroughness.
One of the key tenets of Kant's work, is that there are two distinct types of propositions: an analytic proposition, which can be universally evaluated purely by considering the meaning of the words in the statement; and a synthetic proposition, which cannot be universally evaluated, because its truth-value depends on the state of the domain in question. Further, Kant distinguishes between an a priori proposition, which can be evaluated without any sensory experience; and an a posteriori proposition, which requires sensory experience in order to be evaluated.
So, analytic a priori statements are basically tautologies: e.g. "All triangles have three sides" – assuming the definition of a triangle (a 2D shape with three sides), and assuming the definition of a three-sided 2D shape (a triangle), this must always be true, and no knowledge of anything in the universe (except for those exact rote definitions) is required.
Conversely, synthetic a posteriori statements are basically unprovable real-world observations: e.g. "Neil Armstrong landed on the Moon in 1969" – maybe that "small step for man" TV footage is real, or maybe the conspiracy theorists are right and it was all a hoax; and anyway, even if your name was Buzz Aldrin, and you had seen Neil standing there right next to you on the Moon, how could you ever fully trust your own fallible eyes and your own fallible memory? It's impossible for there to be any logical proof for such a statement, it's only possible to evaluate it based on sensory experience.
Analytic a posteriori statements, according to Kant, are impossible to form.
Which leaves what Kant is most famous for, his discussion of synthetic a priori statements. An example of such a statement is: "A straight line between two points is the shortest". This is not a tautology – the terms "straight line between two points" and "shortest" do not define each other. Yet the statement can be universally evaluated as true, purely by logical consideration, and without any sensory experience. How is this so?
Kant asserts that there are certain concepts that are "hard-wired" into the human mind. In particular, the concepts of space, time, and causality. These concepts (or "forms of sensibility", to use Kant's terminology) form our "lens" of the universe. Hence, we are able to evaluate statements that have a universal truth, i.e. statements that don't depend on any sensory input, but that do nevertheless depend on these "intrinsic" concepts. In the case of the above example, it depends on the concept of space (two distinct points can exist in a three-dimensional space, and the shortest distance between them must be a straight line).
Another example is: "Every event has a cause". This is also universally true; at least, it is according to the intrinsic concepts of time (one event happens earlier in time, and another event happens later in time), and causality (events at one point in space and time, affect events at a different point in space and time). Maybe it would be possible for other reasoning entities (i.e. not humans) to evaluate these statements differently, assuming that such entities were imbued with different "intrinsic" concepts. But it is impossible for a reasoning human to evaluate those statements any other way.
The actual machinery of reasoning, as Kant explains, consists of twelve "categories" of understanding, each of which has a corresponding "judgement". These categories / judgements are essentially logic operations (although, strictly speaking, they predate the invention of modern predicate logic, and are based on Aristotle's syllogism), and they are as follows:
| Group | Category | Judgement | Example |
|---|---|---|---|
| Quantity | Unity | Universal | All trees have leaves |
| Quantity | Plurality | Particular | Some dogs are shaggy |
| Quantity | Totality | Singular | This ball is bouncy |
| Quality | Reality | Affirmative | Chairs are comfy |
| Quality | Negation | Negative | No spoons are shiny |
| Quality | Limitation | Infinite | Oranges are not blue |
| Relation | Inherence / Subsistence | Categorical | Happy people smile |
| Relation | Causality / Dependence | Hypothetical | If it's February, then it's hot |
| Relation | Community | Disjunctive | Potatoes are baked or fried |
| Modality | Existence | Assertoric | Sharks enjoy eating humans |
| Modality | Possibility | Problematic | Beer might be frothy |
| Modality | Necessity | Apodictic | 6 times 7 equals 42 |
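To make the "logic operations" connection concrete, here's how a few of the example judgements from the table might be written in modern predicate-logic notation (my own illustrative rendering; Kant himself worked with Aristotelian forms rather than this notation):

```
Universal      "All trees have leaves"             ∀x (Tree(x) → HasLeaves(x))
Particular     "Some dogs are shaggy"              ∃x (Dog(x) ∧ Shaggy(x))
Negative       "No spoons are shiny"               ¬∃x (Spoon(x) ∧ Shiny(x))
Hypothetical   "If it's February, then it's hot"   February → Hot
Disjunctive    "Potatoes are baked or fried"       ∀x (Potato(x) → Baked(x) ∨ Fried(x))
```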
The cognitive mind is able to evaluate all of the above possible propositions, according to Kant, with the help of the intrinsic concepts (note that these intrinsic concepts are not considered to be "innate knowledge", as defined by the rationalist movement), and also with the help of the twelve categories of understanding.
Reason, therefore, is the ability to evaluate arbitrary propositions, using such cognitive faculties as logic and intuition, and based on understanding and sensibility, which are bridged by way of "forms of sensibility".
If we consider existing AI with respect to the above definition of reason, it's clear that the capability is already developed maturely in some areas. In particular, existing AI – especially Knowledge Representation (KR) systems – has no problem whatsoever with formally evaluating predicate logic propositions. Existing AI – especially AI based on supervised learning methods – also excels at receiving and (crudely) processing large amounts of sensory input.
So, at one extreme end of the spectrum, there are pure ontological knowledge-base systems such as Cyc, where virtually all of the input into the system consists of hand-crafted factual propositions, and where almost none of the input is noisy real-world raw data. Such systems currently require a massive quantity of carefully curated facts to be on hand, in order to make inferences of fairly modest real-world usefulness.
Then, at the other extreme, there are pure supervised learning systems such as Google's NASNet, where virtually all of the input into the system consists of noisy real-world raw data, and where almost none of the input is human-formulated factual propositions. Such systems currently require a massive quantity of raw data to be on hand, in order to perform classification and regression tasks whose accuracy varies wildly depending on the target data set.
What's clearly missing, is something to bridge these two extremes. And, if transcendental idealism is to be our guide, then that something is "forms of sensibility". The key element of reason that humans have, and that machines currently lack, is a "lens" of the universe, with fundamental concepts of the nature of the universe – particularly of space, time, and causality – embodied in that lens.
What fundamental facts about the universe would a machine require, then, in order to have "forms of sensibility" comparable to that of a human? Well, if we were to take this to the extreme, then a machine would need to be imbued with all the laws of mathematics and physics that exist in our universe. However, let's assume that going to this extreme is neither necessary nor possible, for various reasons, including: we humans are probably only imbued with a subset of those laws (the ones that apply most directly to our everyday existence); it's probably impossible to discover the full set of those laws; and, we will assume that, if a reasoning entity is imbued only with an appropriate subset of those laws, then it's possible to deduce the remainder of the laws (and it's therefore also possible to deduce all other facts relating to observable phenomena in the universe).
I would, therefore, like to humbly suggest, in plain English, what some of these fundamental facts, suitable for comprising the "forms of sensibility" of a reasoning machine, might be:
I'm not suggesting that the above list is really a sufficient number of intrinsic concepts for a reasoning machine, nor that all of the above facts are the correct choice nor correctly worded for such a list. But this list is a good start, in my opinion. If an "intelligent" machine were to be appropriately imbued with those facts, then that should be a sufficient foundation for it to evaluate matters of space, time, and causality.
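To make this a little more tangible, here's a minimal sketch of what hard-wiring a couple of such facts into a reasoning system might look like. Everything in it (the axiom wording, the data structure, the events) is invented for illustration; it's not a proposal for a real ontology or API:

```python
# Purely illustrative sketch: a toy reasoner with a couple of "intrinsic"
# axioms about time and causality hard-wired in before it sees any data.

def happens_before(knowledge, a, b):
    """True if the knowledge base records event a as happening before event b."""
    return (a, b) in knowledge["before"]

def acceptable_causal_claim(knowledge, cause, effect):
    """Intrinsic axiom 1: a cause must precede its effect.
    Intrinsic axiom 2: time is ordered, so two events can't each precede the other."""
    return (happens_before(knowledge, cause, effect)
            and not happens_before(knowledge, effect, cause))

# The raw observations play the role of a posteriori input; the axioms above are
# the a priori "lens" through which those observations get interpreted.
observations = {"before": {("lightning", "thunder"), ("rain", "wet_grass")}}
print(acceptable_causal_claim(observations, "lightning", "thunder"))  # True
print(acceptable_causal_claim(observations, "thunder", "lightning"))  # False
```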
There are numerous other intrinsic aspects of human understanding that it would also, arguably, be essential for a reasoning machine to possess. Foremost of these is the concept of self: does AI need a hard-wired idea of "I"? Other such concepts include matter / substance, inertia, life / death, will, freedom, purpose, and desire. However, it's a matter of debate, rather than a given, whether each of these concepts is fundamental to the foundation of human-like reasoning, or whether each of them is learned and acquired as part of intellectual experience.
A machine as discussed so far is a good start, but it's still not enough to actually yield what would be considered human-like intelligence. Cyc, for example, is an existing real-world system that basically already has all these characteristics – it can evaluate logical propositions of arbitrary complexity, based on a corpus (a much larger one than my humble list above) of intrinsic facts, and based on some sensory input – yet no real intelligence has emerged from it.
One of the most important missing ingredients, is the ability to hypothesise. That is, based on the raw sensory input of real-world phenomena, the ability to observe a pattern, and to formulate a completely new, original proposition expressing that pattern as a rule. On top of that, it includes the ability to test such a proposition against new data, and, when the rule breaks, to modify the proposition such that the rule can accommodate that new data. That, in short, is what is known as inductive reasoning: generalising a rule from specific observations, and then refining it as new observations arrive.
A child formulates rules in this way. For example, a child observes that when she drops a drinking glass, the glass shatters the moment that it hits the floor. She drops a glass in this way several times, just for fun (plenty of fun for the parents too, naturally), and observes the same result each time. At some point, she formulates a hypothesis along the lines of "drinking glasses break when dropped on the floor". She wasn't born knowing this, nor did anyone teach it to her; she simply "worked it out" based on sensory experience.
Some time later, she drops a glass onto the floor in a different room of the house, still from shoulder-height, but it does not break. So she modifies the hypothesis to be "drinking glasses break when dropped on the kitchen floor" (but not the living room floor). But then she drops a glass in the bathroom, and in that case it does break. So she modifies the hypothesis again to be "drinking glasses break when dropped on the kitchen or the bathroom floor".
But she's not happy with this latest hypothesis, because it's starting to get complex, and the human mind strives for simple rules. So she stops to think about what makes the kitchen and bathroom floors different from the living room floor, and realises that the former are hard (tiled), whereas the latter is soft (carpet). So she refines the hypothesis to be "drinking glasses break when dropped on a hard floor". And thus, based on trial-and-error, and based on additional sensory experience, the facts that comprise her understanding of the world have evolved.
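Here's a toy sketch of that observe-hypothesise-refine loop, with the glass-dropping scenario as data. The representation (a flat list of observations, a preference for the rule that needs the fewest distinct values) is my own simplification, purely to make the mechanics concrete:

```python
# Toy sketch of the child's hypothesis-refinement loop described above.
observations = [
    {"room": "kitchen",     "floor": "tiled",  "broke": True},
    {"room": "kitchen",     "floor": "tiled",  "broke": True},
    {"room": "living room", "floor": "carpet", "broke": False},
    {"room": "bathroom",    "floor": "tiled",  "broke": True},
]

def induce_rule(observations, candidate_features=("room", "floor")):
    """Among features whose values cleanly separate 'broke' from 'did not break',
    prefer the one needing the fewest distinct values: the simplest rule wins
    ('tiled floor' beats 'kitchen or bathroom')."""
    best = None
    for feature in candidate_features:
        breaking = {o[feature] for o in observations if o["broke"]}
        safe = {o[feature] for o in observations if not o["broke"]}
        if breaking & safe:
            continue  # this feature alone can't explain the data
        if best is None or len(breaking) < len(best[1]):
            best = (feature, breaking)
    if best is None:
        return "no simple rule yet; need more observations"
    feature, values = best
    return f"drinking glasses break when dropped on a {'/'.join(sorted(values))} {feature}"

print(induce_rule(observations))
# -> "drinking glasses break when dropped on a tiled floor"
```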
Some would argue that current state-of-the-art AI is already able to formulate rules, by way of feature learning (e.g. in image recognition). However, a "feature" in a neural network is just a number, either one directly taken from the raw data, or one derived based on some sort of graph function. So when a neural network determines the "features" that correspond to a duck, those features are just numbers that represent the average outline of a duck, the average colour of a duck, and so on. A neural network doesn't formulate any actual facts about a duck (e.g. "ducks are yellow"), which can subsequently be tested and refined (e.g. "bath toy ducks are yellow"). It just knows that if the image it's processing has a yellowish oval object occupying the main area, there's a 63% probability that it's a duck.
Another faculty that the human mind possesses, and that AI currently lacks, is intuition. That is, the ability to reach a conclusion based directly on sensory input, without resorting to logic as such. The exact definition of intuition, and how it differs from instinct, is not clear (in particular, both are sometimes defined as a "gut feeling"). It's also unclear whether or not some form of intuition is an essential ingredient of human-like intelligence.
It's possible that intuition is nothing more than a set of rules, that get applied either before proper logical reasoning has a chance to kick in (i.e. "first resort"), or after proper logical reasoning has been exhausted (i.e. "last resort"). For example, perhaps after a long yet inconclusive analysis of competing facts, regarding whether your Uncle Jim is telling the truth or not when he claims to have been to Mars (e.g. "Nobody has ever been to Mars", "Uncle Jim showed me his medal from NASA", "Mum says Uncle Jim is a flaming crackpot", "Uncle Jim showed me a really red rock"), your intuition settles the matter with the rule: "You should trust your own family". But, on the other hand, it's also possible that intuition is a more elementary mechanism, and that it can't be expressed in the form of logical rules at all: instead, it could simply be a direct mapping of "situations" to responses.
In order to test whether a hypothetical machine, as discussed so far, is "good enough" to be considered intelligent, I'd like to turn to one of the domains that current-generation AI is already pursuing: criminal justice assessment. One particular area of this domain, in which the use of AI has grown significantly, is determining whether an incarcerated person should be approved for parole or not. Unsurprisingly, AI's having input into such a decision has so far, in real life, not been considered altogether successful.
The current AI process for this is based almost entirely on statistical analysis. That is, the main input consists of simple numeric parameters, such as: number of incidents reported during imprisonment; level of severity of the crime originally committed; and level of recurrence of criminal activity. The input also includes numerous profiling parameters regarding the inmate, such as: racial / ethnic group; gender; and age. The algorithm, regardless of any bells and whistles it may claim, is invariably simply answering the question: for other cases with similar input parameters, were they deemed eligible for parole? And if so, did their conduct after release demonstrate that they were "reformed"? And based on that, is this person eligible for parole?
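To make the contrast concrete, here is roughly what that statistical approach boils down to: a similarity lookup over past cases. This is a deliberately crude, invented sketch, not the workings of any real risk-assessment product:

```python
# Deliberately simplified sketch of similarity-based risk assessment:
# score a new case purely by how past cases with similar numbers turned out.
# The features and data are invented for illustration.
import math

past_cases = [
    # (incidents_in_prison, crime_severity, prior_offences, reoffended_after_release)
    (0, 2, 1, False),
    (3, 4, 2, True),
    (1, 3, 0, False),
    (5, 5, 4, True),
]

def predict_reoffence(new_case, k=3):
    """Nearest-neighbour vote: no facts about the individual case are considered,
    only its numeric distance to previous cases."""
    nearest = sorted(past_cases, key=lambda past: math.dist(new_case, past[:3]))[:k]
    votes = sum(1 for c in nearest if c[3])
    return votes / k  # crude "probability" of reoffending

print(predict_reoffence((1, 3, 1)))  # e.g. 0.33 -- and that's the whole "assessment"
```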
Current-generation AI, in other words, is incapable of considering a single such case based on its own merits, nor of making any meaningful decision regarding that case. All it can do, is compare the current case to its training data set of other cases, and determine how similar the current case is to those others.
A human deciding parole eligibility, on the other hand, does consider the case in question based on its own merits. Sure, a human also considers the numeric parameters and the profiling parameters that a machine can so easily evaluate. But a human also considers each individual event in the inmate's history as a stand-alone fact, and each such fact can affect the final decision differently. For example, perhaps the inmate seriously assaulted other inmates twice while imprisoned. But perhaps he also read 150 novels, and finished a university degree by correspondence. These are not just statistics, they're facts that must be considered, and each fact must refine the hypothesis whose final form is either "this person is eligible for parole", or "this person is not eligible for parole".
A human is also influenced by morals and ethics, when considering the character of another human being. So, although the question being asked is officially: "is this person eligible for parole?", the question being considered in the judge's head may very well actually be: "is this person good or bad?". Should a machine have a concept of ethics, and/or of good vs bad, and should it apply such ethics when considering the character of an individual human? Most academics seem to think so.
According to Kant, ethics is based on a foundation of reason. But that doesn't mean that a reasoning machine is automatically an ethical machine, either. Does AI need to understand ethics, in order to possess what we would consider human-like intelligence?
Although decisions such as parole eligibility are supposed to be objective and rational, a human is also influenced by emotions, when considering the character of another human being. Maybe, despite the evidence suggesting that the inmate is not reformed, the judge is stirred by a feeling of compassion and pity, and this feeling results in parole being granted. Or maybe, despite the evidence being overwhelmingly positive, the judge feels fear and loathing towards the inmate, mainly because of his tough physical appearance, and this feeling results in parole being denied.
Should human-like AI possess the ability to be "stirred" by such emotions? And would it actually be desirable for AI to be affected by such emotions, when evaluating the character of an individual human? Some such emotions might be considered positive, while others might be considered negative (particularly from an ethical point of view).
I think the ultimate test in this domain – perhaps the "Turing test for criminal justice assessment" – would be if AI were able to understand, and to properly evaluate, this great parole speech, which is one of my personal favourite movie quotes:
There's not a day goes by I don't feel regret. Not because I'm in here, or because you think I should. I look back on the way I was then: a young, stupid kid who committed that terrible crime. I want to talk to him. I want to try and talk some sense to him, tell him the way things are. But I can't. That kid's long gone and this old man is all that's left. I got to live with that. Rehabilitated? It's just a bulls**t word. So you can go and stamp your form, Sonny, and stop wasting my time. Because to tell you the truth, I don't give a s**t.
"Red" (Morgan Freeman)
In the movie, Red's parole was granted. Could we ever build an AI that could also grant parole in that case, and for the same reasons? On top of needing the ability to reason with real facts, and to be affected by ethics and by emotion, properly evaluating such a speech requires the ability to understand humour – black humour, no less – along with apathy and cynicism. No small task.
Sorry if you were expecting me to work wonders in this article, and to actually teach the world how to build artificial intelligence that reasons. I don't have the magic answer to that million dollar question. However, I hope I have achieved my aim here, which was to describe what's needed in order for it to even be possible for such AI to come to fruition.
It should be clear, based on what I've discussed here, that most current-generation AI is based on a completely inadequate foundation for even remotely human-like intelligence. Chucking big data at a statistic-crunching algorithm on a fat cluster might be yielding cool and even useful results, but it will never yield intelligent results. As centuries of philosophical debate can teach us – if only we'd stop and listen – human intelligence rests on specific building blocks. These include, at the very least, an intrinsic understanding of time, space, and causality; and the ability to hypothesise based on experience. If we are to ever build a truly intelligent artificial agent, then we're going to have to figure out how to imbue it with these things.
Most discussion of late seems to treat this encroaching joblessness entirely as an economic issue. Families without incomes, spiralling wealth inequality, broken taxation mechanisms. And, consequently, the solutions being proposed are mainly economic ones. For example, a Universal Basic Income to help everyone make ends meet. However, in my opinion, those economic issues are actually relatively easy to address, and as a matter of sheer necessity we will sort them out sooner or later, via a UBI or via whatever else fits the bill.
The more pertinent issue is actually a social and a psychological one. Namely: how will people keep themselves occupied in such a world? How will people nourish their ambitions, feel that they have a purpose in life, and feel that they make a valuable contribution to society? How will we prevent the malaise of despair, depression, and crime from engulfing those who lack gainful enterprise? To borrow the colourful analogy that others have penned: assuming that there's food on the table either way, how do we head towards a Star Trek rather than a Mad Max future?
The truth is, since the Industrial Revolution, an ever-expanding number of people haven't really needed to work anyway. What I mean by that is: if you think about which jobs actually provide society with the essentials, such as food, water, shelter, and clothing, you'll quickly realise that fewer people than ever are employed in such jobs. My own occupation, web developer, is certainly not essential to the ongoing survival of society as a whole. Plenty of other occupations, particularly in the services industry, are similarly remote from humanity's basic needs.
So why do these jobs exist? First and foremost, demand. We live in a world of free markets and capitalism. So, if enough people decide that they want web apps, and those people have the money to make it happen, then that's all that's required for "web developer" to become and to remain a viable occupation. Second, opportunity. It needs to be possible to do that thing known as "developing web apps" in the first place. In many cases, the opportunity exists because of new technology; in my case, the Internet. And third, ambition. People need to have a passion for what they do. This means that, ideally, people get to choose an occupation of their own free will, rather than being forced into a certain occupation by their family or by the government. If a person has a natural talent for his or her job, and if a person has a desire to do the job well, then that benefits the profession as a whole, and, in turn, all of society.
Those are the practical mechanisms through which people end up spending much of their waking life at work. However, there's another dimension to all this, too. It is very much in the interest of everyone that makes up "the status quo" – i.e. politicians, the police, the military, heads of big business, and to some extent all other "well-to-do citizens" – that most of society is caught up in the cycle of work. That's because keeping people busy at work is the most effective way of maintaining basic law and order, and of enforcing control over the masses. We have seen throughout history that large-scale unemployment leads to crime, to delinquency and, ultimately, to anarchy. Traditionally, unemployment directly results in poverty, which in turn directly results in hunger. But even if the unemployed get their daily bread – even if the crisis doesn't reach "let them eat cake" proportions – they are still at risk of falling to the underbelly of society, if for no other reason, simply due to boredom.
So, assuming that a significantly higher number of working-age men and women will have significantly fewer job prospects in the immediate future, what are we to do with them? How will they keep themselves occupied?
I propose that, as an alternative to traditional employment, these people engage in large-scale, long-term, government-sponsored, semi-recreational activities. These must be activities that: (a) provide some financial reward to participants; (b) promote physical health and social well-being; and (c) make a tangible positive contribution to society. As a massive tongue-in-cheek gesture, I call this proposal "The Jobless Games".
My prime candidate for such an activity would be a long-distance walk. The journey could take weeks, months, even years. Participants could number in the hundreds, in the thousands, even in the millions. As part of the walk, participants could do something useful, too; for example, transport non-urgent goods or mail, thus delivering things that are actually needed by others, and thus competing with traditional freight services. Walking has obvious physical benefits, and it's one of the most social things you can do while moving and being active. Such a journey could also be done by bicycle, on horseback, or in a variety of other modes.
Other recreational programs could cover the more adventurous activities, such as climbing, rafting, and sailing. However, these would be less suitable, because: they're far less inclusive of people of all ages and abilities; they require a specific climate and geography; they're expensive in terms of equipment and expertise; they're harder to tie in with some tangible positive end result; they're impractical in very large groups; and they damage the environment if conducted on too large a scale.
What I'm proposing is not competitive sport. These would not be races. I don't see what having winners and losers in such events would achieve. What I am proposing is that people be paid to participate in these events, out of the pocket of whoever has the money, i.e. governments and big business. The conditions would be simple: keep up with the group, and behave yourself, and you keep getting paid.
I see such activities co-existing alongside whatever traditional employment is still available in the future; and despite all the doom and gloom predictions, the truth is that there always has been real work out there, and there always will be. My proposal is that, same as always, traditional employment pays best, and thus traditional employment will continue to be the most attractive option for how to spend one's days. Following that, "The Games" pay enough to get by on, but probably not enough to enjoy all life's luxuries. And, lastly, as is already the case in most first-world countries today, for the unemployed there should exist a social security payment, and it should pay enough to cover life's essentials, but no more than that. We already pay people "sit down money"; how about a somewhat more generous payment of "stand up money"?
Along with these recreational activities that I've described, I think it would also be a good idea to pay people for a lot of the work that is currently done by volunteers without financial reward. In a future with fewer jobs, anyone who decides to peel potatoes in a soup kitchen, or to host bingo games in a nursing home, or to take disabled people out for a picnic, should be able to support him- or herself and to live in a dignified manner. However, as with traditional employment, there are also only so many "volunteer" positions that need filling, and even with that sector significantly expanded, there would still be many people left twiddling their thumbs. Which is why I think we need some other solution, that will easily and effectively get large numbers of people on their feet. And what better way to get them on their feet, than to say: take a walk!
Large-scale, long-distance walks could also solve some other problems that we face at present. For example, getting a whole lot of people out of our biggest and most crowded cities, and "going on tour" to some of our smallest and most neglected towns, would provide a welcome economic boost to rural areas, considering all the support services that such activities would require; while at the same time, it would ease the crowding in the cities, and it might even alleviate the problem of housing affordability, which is acute in Australia and elsewhere. Long-distance walks in many parts of the world – particularly in Europe – could also provide great opportunities for an interchange of language and culture.
There you have it, my humble suggestion to help fill the void in people's lives in the future. There are plenty of other things that we could start paying people to do, that are more intellectual and that make a more tangible contribution to society: e.g. create art, be spiritual, and perform in music and drama shows. However, these things are too controversial for the government to support on such a large scale, and their benefit is a matter of opinion. I really think that, if something like this is to have a chance of succeeding, it needs to be dead simple and completely uncontroversial. And what could be simpler than walking?
Whatever solutions we come up with, I really think that we need to start examining the issue of 21st-century job redundancy from this social angle. The economic angle is a valid one too, but it has already been analysed quite thoroughly, and it will sort itself out with a bit of ingenuity. What we need to start asking now is: for those young, fit, ambitious people of the future that lack job prospects, what activity can they do that is simple, social, healthy, inclusive, low-impact, low-cost, and universal? I'd love to hear any further suggestions you may have.
I'd never before stopped to think about whether or not there was a limit to how much you can put in a cookie. Usually, cookies only store very small string values, such as a session ID, a tracking code, or a browsing preference (e.g. "tile" or "list" for search results). So, usually, there's no need to consider its size limits.
However, while working on a new side project of mine that heavily uses session storage, I discovered this limit the hard (to debug) way. Anyway, now I've got one more adage to add to my developer's phrasebook: if you're trying to store more than 4KiB in a cookie, you're doing it wrong.
Actually, according to the web site Browser Cookie Limits, the safe "lowest common denominator" maximum size to stay below is 4093 bytes. Also check out the Stack Overflow discussion, What is the maximum size of a web browser's cookie's key?, for more commentary regarding the limit.
In my case – working with Flask, which depends on Werkzeug – trying to store an oversized cookie doesn't throw any errors, it simply fails silently. I've submitted a patch to Werkzeug, to make oversized cookies raise an exception, so hopefully it will be more obvious in future when this problem occurs.
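In the meantime, it's easy enough to fail loudly yourself. Here's a minimal sketch, assuming Flask; the 4093-byte figure is the cross-browser limit mentioned above, and the helper function is my own, not part of Flask's API:

```python
from flask import Flask, make_response

app = Flask(__name__)

# Conservative cross-browser limit for a single cookie (see Browser Cookie Limits).
MAX_COOKIE_BYTES = 4093

def set_cookie_checked(response, key, value, **kwargs):
    """Set a cookie, but fail loudly instead of silently if it's too big.
    (Rough check only: attribute overhead such as Path/Expires isn't counted.)"""
    size = len(key) + len(value)
    if size > MAX_COOKIE_BYTES:
        raise ValueError(
            f"Cookie '{key}' is ~{size} bytes; browsers may silently drop "
            f"anything over {MAX_COOKIE_BYTES} bytes")
    response.set_cookie(key, value, **kwargs)
    return response

@app.route("/")
def index():
    resp = make_response("hello")
    return set_cookie_checked(resp, "prefs", "tile")
```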
It appears that this is not an isolated issue; many web frameworks and libraries fail silently when storing oversized cookies. It's the case with Django, where the decision was made to not fix it, for technical reasons. Same story with CodeIgniter. Seems that Ruby on Rails is well-behaved and raises exceptions. Basically, your mileage may vary: don't count on your framework of choice alerting you, if you're being a cookie monster.
Also, as several others have pointed out, trying to store too much data in cookies is a bad idea anyway, because that data travels with every HTTP request and response, so it should be as small as possible. As I learned, if you find that you're dealing with non-trivial amounts of session data, then ditch client-side storage for the app in question, and switch to server-side session data storage (preferably using something like Memcached or Redis).
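For instance, in a Flask app, one possible way to make that switch is the Flask-Session extension backed by Redis. The snippet below is only a sketch under that assumption; adjust the backend and config to whatever you actually use:

```python
import redis
from flask import Flask, session
from flask_session import Session  # third-party extension: pip install Flask-Session

app = Flask(__name__)
app.config["SECRET_KEY"] = "change-me"
app.config["SESSION_TYPE"] = "redis"  # store session data server-side
app.config["SESSION_REDIS"] = redis.from_url("redis://localhost:6379")
Session(app)

@app.route("/remember")
def remember():
    # The cookie sent to the browser is now just a small session ID;
    # the (potentially large) data itself lives in Redis.
    session["big_blob"] = "x" * 50_000
    return "stored server-side"
```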
If your design is sufficiently custom that you're writing theme-level Views template files, then chances are that you'll be in danger of creating duplicate templates. I've committed this sin on numerous sites over the past few years. On many occasions, my Views templates were 100% identical, and after making a change in one template, I literally copy-pasted and renamed the file, to update the other templates.
Until, finally, I decided that enough is enough – time to get DRY!
Being less repetitive with your Views templates is actually dead simple. Let's say you have three identical files – views-view-fields--search_this_site.tpl.php, views-view-fields--featured_articles.tpl.php, and views-view-fields--articles_archive.tpl.php. Here's how you clean up your act:
Keep just one of the three files (the code below falls back to views-view-fields--search_this_site.tpl.php, so keep that one and delete the other two), then add the following to your theme's template.php file:

```php
<?php
function mytheme_preprocess_views_view_fields(&$vars) {
  // Point all of these views at the one shared template file.
  if (in_array($vars['view']->name, array(
      'search_this_site',
      'featured_articles',
      'articles_archive'))) {
    $vars['theme_hook_suggestions'][] =
      'views_view_fields__search_this_site';
  }
}
```
I've found that views-view-fields.tpl.php-based files are the biggest culprits for duplication; but you might have some other Views templates in need of cleaning up, too, such as your views-view.tpl.php-based files, in which case also add:
```php
<?php
function mytheme_preprocess_views_view(&$vars) {
  if (in_array($vars['view']->name, array(
      'search_this_site',
      'featured_articles',
      'articles_archive'))) {
    $vars['theme_hook_suggestions'][] =
      'views_view__search_this_site';
  }
}
```
And, if your views include a search / filtering form, perhaps also:
```php
<?php
function mytheme_preprocess_views_exposed_form(&$vars) {
  if (in_array($vars['view']->name, array(
      'search_this_site',
      'featured_articles',
      'articles_archive'))) {
    $vars['theme_hook_suggestions'][] =
      'views_exposed_form__search_this_site';
  }
}
```
That's it – just a quick tip from me for today. You can find out more about this technique on the Custom Theme Hook Suggestions documentation page, although I couldn't find an example for Views there, nor anywhere else online for that matter; hence this article. Hopefully this results in a few kilobytes saved, and (more importantly) a lot of unnecessary copy-pasting of template files saved, for fellow Drupal devs and themers.
Societal vices have always been bountiful. Back in the ol' days, it was just the usual suspects. War. Violence. Greed. Corruption. Injustice. Propaganda. Lewdness. Alcoholism. To name a few. In today's world, still more scourges have joined in the mix. Consumerism. Drug abuse. Environmental damage. Monolithic bureaucracy. And plenty more.
There always have been some folks who elect to isolate themselves from the masses, to renounce their mainstream-ness, to protect themselves from all that nastiness. And there always will be. Nothing wrong with doing so.
However, there's a difference between protecting oneself from "the evils of society", and blinding oneself to their very existence. Sometimes this difference is a fine line. Particularly in the case of families, where parents choose to shield from the Big Bad World not only themselves, but also their children. Protection is noble and commendable. Blindfolding, in my opinion, is cowardly and futile.
There are plenty of examples from bygone times, of historical abstainers from mainstream society. Monks and nuns, who have for millennia sought serenity, spirituality, abstinence, and isolation from the material. Hermits of many varieties: witches, grumpy old men / women, and solitary island-dwellers.
Religion has long been an important motive for seclusion. Many have settled on a reclusive existence as their solution to avoiding widespread evils and being closer to G-d. Other than adult individuals who choose a monastic life, there are also whole communities, composed of families with children, who live in seclusion from the wider world. The Amish in rural USA are probably the most famous example, and also one of the longest-running such communities. Many ultra-orthodox Jewish communities, particularly within present-day Israel, could also be considered as secluded.
More recently, the "commune living" hippie phenomenon has seen tremendous growth worldwide. The hippie ideology is, of course, generally an anti-religious one, with its acceptance of open relationships, drug use, lack of hierarchy, and often a lack of any formal G-d. However, the secluded lifestyle of hippie communes is actually quite similar to that of secluded religious groups. It's usually characterised by living amidst, and in tune with, nature; rejecting modern technology; and maintaining a physical distance from regular urban areas. The left-leaning members of these communities tend to strongly shun consumerism, and to promote serenity and spirituality, much like their G-d fearing comrades.
Like the members of these communities, I too am repulsed by many of the "evils" within the society in which we live. Indeed, the idea of joining such a community is attractive to me. It would be a pleasure and a relief to shut myself out from the blight that threatens me, and from everyone that's "infected" by it. Life would be simpler, more peaceful, more wholesome.
I empathise with those who have chosen this path in life. Just as it's tempting to succumb to all the world's vices, so too is it tempting to flee from them. However, such people are also living in a bubble. An artificial world, from which the real world has been banished.
What bothers me is not so much the independent adult people who have elected for such an existence. Despite all the faults of the modern world, most of us do at least enjoy far-reaching liberty. So, it's a free land, and adults are free to live as they will, and to blind themselves to what they will.
What does bother me, is that children are born and raised in such an existence. The adult knows what it is that he or she is shut off from, and has experienced it before, and has decided to discontinue experiencing it. The child, on the other hand, has never been exposed to reality, he or she knows only the confines of the bubble. The child is blind, but to what, it knows not.
This is a cowardly act on the part of the parents. It's cowardly because a child only develops the ability to combat and to reject the world's vices, such as consumerism or substance abuse, by being exposed to them, by possibly experimenting with them, and by making his or her own decisions. Parents that are serious about protecting their children do expose them to the Big Bad World, they do take risks; but they also do the hard yards in preparing their children for it: they ensure that their children are raised with education, discipline, and love.
Blindfolding children to the reality of wider society is also futile — because, sooner or later, whether still as children or later as adults, the Big Bad World exposes itself to all, whether you like it or not. No Amish countryside, no hippie commune, no far-flung island, is so far or so disconnected from civilisation that its inhabitants can be prevented from ever having contact with it. And when the day of exposure comes, those that have lived in their little bubble find themselves totally unprepared for the very "evils" that they've supposedly been protected from for all their lives.
In my opinion, the best way to protect children from the world's vices, is to expose them in moderation to the world's nasty underbelly, while maintaining a stable family unit, setting a strong example of rejecting the bad, and ensuring a solid education. That is, to do what the majority of the world's parents do. That's right: it's a formula that works reasonably well for billions of people, and that has been developed over thousands of years, so there must be some wisdom to it.
Obviously, children need to be protected from dangers that could completely overwhelm them. Bringing up a child in a favela environment is not ideal, and sometimes has horrific consequences, just watch City of G-d if you don't believe me. But then again, blindfolding is the opposite extreme; and one extreme can be as bad as the other. Getting the balance somewhere in between is the key.
There are plenty of articles round and about the interwebz, aimed more at the practical side of coming to Chile: i.e. tips regarding how to get around; lists of rough prices of goods / services; and crash courses in Chilean Spanish. There are also a number of commentaries on the cultural / social differences between Chile and elsewhere – on the national psyche, and on the political / economic situation.
My aim is to keep this article from falling neatly into either of those categories. That is, I'll be covering some eccentricities of Chile that aren't practical tips as such, although knowing about them may come in handy some day; and I'll be covering some anecdotes that certainly reflect on cultural themes, but that don't pretend to paint the Chilean landscape inside-out, either.
Que disfrutiiy, po ("enjoy!").
Here in Chile, all that is money-related is monthly. You pay everything monthly (your rent, all your bills, all membership fees e.g. gym, school / university fees, health / home / car insurance, etc); and you get paid monthly (if you work here, which I don't). I know that Chile isn't the only country with this modus operandi: I believe it's the European system; and as far as I know, it's the system in various other Latin American countries too.
In Australia – and as far as I know, in most English-speaking countries – there are no set-in-stone rules about the frequency with which you pay things, or with which you get paid. Bills / fees can be weekly, monthly, quarterly, annual… whatever (although rent is generally charged and is talked about as a weekly cost). Your pay cheque can be weekly, fortnightly, monthly, quarterly… equally whatever (although we talk about "how much you earn" annually, even though hardly anyone is paid annually). I guess the "all monthly" system is more consistent, and I guess it makes it easier to calculate and compare costs. However, having grown up with the "whatever" system, "all monthly" seems strange and somewhat amusing to me.
In Chile, although payment due dates can be anytime throughout the month, almost everyone receives their salary at fin de mes (the end of the month). I believe the (rough) rule is: the dosh arrives on the actual last day of the month if it's a regular weekday; or the last regular weekday of the month, if the actual last day is a weekend or public holiday (which is quite often, since Chile has a lot of public holidays – twice as many as Australia!).
This system, combined with the last-minute / impulsive form of living here, has an effect that's amusing, frustrating, and (when you think about it) depressingly predictable. As I like to say (in jest, to the locals): in Chile, it's Christmas time every end-of-month! The shops are packed, the restaurants are overflowing, and the traffic is insane, on the last day and the subsequent few days of each month. For the rest of the month, all is quiet. Especially the week before fin de mes, which is really Struggle Street for Chileans. So extreme is this fin de mes culture, that it's even busy at the petrol stations at this time, because many wait for their pay cheque before going to fill up the tank.
This really surprised me during my first few months in Chile. I used to ask: ¿Qué pasa? ¿Hay algo importante hoy? ("What's going on? Is something important happening today?"). To which locals would respond: ¡Es fin de mes! ¡Hoy te pagan! ("It's end-of-month! You get paid today!"). These days, I'm more-or-less getting the hang of the cycle; although I don't think I'll ever really get my head around it. I'm pretty sure that, even if we did all get paid on the same day in Australia (which we don't), we wouldn't all rush straight to the shops in a mad stampede, desperate to spend the lot. But hey, that's how life is around here.
Continuing with the socio-economic theme, and also continuing with the "all-monthly" theme: another Chile-ism that will never cease to amuse and amaze me, is the omnipresent cuotas ("monthly instalments"). Chile has seen a spectacular rise in the use of credit cards, over the last few decades. However, the way these credit cards work is somewhat unique, compared with the usual credit system in Australia and elsewhere.
Any time you make a credit card purchase in Chile, the cashier / shop assistant will, without fail, ask you: ¿cuántas cuotas? ("how many instalments?"). If you're using a foreign credit card, like myself, then you must always answer: sin cuotas ("no instalments"). This is because, even if you wanted to pay for your purchase in chunks over the next 3-24 months (and trust me, you don't), you can't, because this system of "choosing at point of purchase to pay in instalments" only works with local Chilean cards.
Chile's current president, the multi-millionaire Sebastian Piñera, played an important part in bringing the credit card to Chile, during his involvement with the banking industry before entering politics. He's also generally regarded as the inventor of the cuotas system. The ability to choose your monthly instalments at point of sale is now supported by all credit cards, all payment machines, all banks, and all credit-accepting retailers nationwide. The system has even spread to some of Chile's neighbours, including Argentina.
Unfortunately, although it seems like something useful for the consumer, the truth is exactly the opposite: the cuotas system and its offspring, the cuotas national psyche, have resulted in the vast majority of Chileans (particularly the less wealthy among them) being permanently and inescapably mired in debt. What's more, although some of the cuotas offered are interest-free (with the most typical being a no-interest 3-instalment plan), some plans and some cards (most notoriously the "department store bank" cards) charge exorbitantly high interest, and are riddled with unfair and arcane terms and conditions.
Chile's a funny place, because it's so "not Latin America" in certain aspects (e.g. much better infrastructure than most of its neighbours), and yet it's so "spot-on Latin America" in other aspects. The última hora ("last-minute") way of living definitely falls within the latter category.
In Chile, people do not make plans in advance. At least, not for anything social- or family-related. Ask someone in Chile: "what are you doing next weekend?" And their answer will probably be: "I don't know, the weekend hasn't arrived yet… we'll see!" If your friends or family want to get together with you in Chile, don't expect a phone call the week before. Expect a phone call about an hour before.
I'm not just talking about casual meet-ups, either. In Chile, expect to be invited to large birthday parties a few hours before. Expect to know what you're doing for Christmas / New Year a few hours before. And even expect to know if you're going on a trip or not, a few hours before (and if it's a multi-day trip, expect to find a place to stay when you arrive, because Chileans aren't big on making reservations).
This is in stark contrast to Australia, where most people have a calendar to organise their personal life (something extremely uncommon in Chile), and where most people's evenings and weekends are booked out at least a week or two in advance. Ask someone in Sydney what their schedule is for the next week. The answer will probably be: "well, I've got yoga tomorrow evening, I'm catching up with Steve for lunch on Wednesday, big party with some old friends on Friday night, beach picnic on Saturday afternoon, and a fancy dress party in the city on Saturday night." Plus, ask them what they're doing in two months' time, and they'll probably already have booked: "6 nights staying in a bungalow near Batemans Bay".
The última hora system is both refreshing and frustrating, for a planned-ahead foreigner like myself. It makes you realise just how regimented, inflexible, and lacking in spontaneity life can be in your home country. But, then again, it also makes you tear your hair out, when people make zero effort to co-ordinate different events and to avoid clashes. Plus, it makes for many an awkward silence when the folks back home ask the question that everybody asks back home, but that nobody asks around here: "so, what are you doing next weekend?" Depends which way the wind blows.
In Chile (and elsewhere nearby, e.g. Argentina), you do not eat or drink while standing. In most bars in Chile, everyone is sitting down. In fact, in general there is little or no "bar" area, in bars around here; it's all tables and chairs. If there are no tables or chairs left, people will go to a different bar, or wait for seats to become vacant before eating / drinking. Same applies in the home, in the park, in the garden, or elsewhere: nobody eats or drinks standing up. Not even beer. Not even nuts. Not even potato chips.
In Australia (and in most other English-speaking countries, as far as I know), most people eat and drink while standing, in a range of different contexts. If you're in a crowded bar or pub, eating / drinking / talking while standing is considered normal. Likewise for a big house party. Same deal if you're in the park and you don't want to sit on the grass. I know it's only a little thing; but it's one of those little things that you only realise is different in other cultures, after you've lived somewhere else.
It's also fairly common to see someone eating their take-away or other food while walking, in Australia. Perhaps some hot chips while ambling along the beach. Perhaps a sandwich for lunch while running (late) to a meeting. Or perhaps some lollies on the way to the bus stop. All stuff you wouldn't blink twice at back in Oz. In Chile, that is simply not done. Doesn't matter if you're in a hurry. It couldn't possibly be such a hurry, that you can't sit down to eat in a civilised fashion. The Chilean system is probably better for your digestion! And they have a point: perhaps the solution isn't to save time by eating and walking, but simply to be in less of a hurry?
One of the most striking visual differences between the Santiago and Sydney streetscapes, in my opinion, is that walled-up and shuttered-up buildings are far more prevalent in the former than in the latter. Santiago is not a dangerous city, by Latin-American or even by most Western standards; however, it often feels much less secure than it should, particularly at night, because often all you can see around you is chains, padlocks, and sturdy grilles. Chileans tend to shut up shop Fort Knox-style.
Walk down Santiago's Ahumada shopping strip in the evening, and none of the shopfronts can be seen. No glass, no lit-up signs, no posters. Just grey steel shutters. Walk down Sydney's Pitt St in the evening, and – even though all the shops close earlier than in Santiago – it doesn't feel like a prison, it just feels like a shopping area after-hours.
In Chile, virtually all houses and apartment buildings are walled and gated. Also, particularly ugly in my opinion, schools in Chile are surrounded by high thick walls. For both houses and schools, it doesn't matter if they're upper- or lower-class, nor what part of town they're in: that's just the way they build them around here. In Australia, on the other hand, you can see most houses and gardens from the street as you go past (and walled-in houses are criticised as being owned by "paranoid people"); same with schools, which tend to be open and abundant spaces, seldom delimiting their boundary with anything more than a low mesh fence.
As I said, Santiago isn't a particularly dangerous city, although it's true that robbery is far more common here than in Sydney. The real difference, in my opinion, is that Chileans simply don't feel safe unless they're walled in and shuttered up. Plus, it's something of a vicious cycle: if everyone else in the city has a wall around their house, and you don't, then chances are that your house will be targeted, not because it's actually easier to break into than the house next door (which has a wall that can be easily jumped over anyway), but simply because it looks more exposed. Anyway, I will continue to argue to Chileans that their country (and the world in general) would be better with fewer walls and fewer barriers; and, no doubt, they will continue to stare back at me in bewilderment.
So, there you have it: a few of my random observations about life in Santiago, Chile. I hope you've found them educational and entertaining. Overall, I've enjoyed my time in this city; and while I'm sometimes critical of and poke fun at Santiago's (and Chile's) peculiarities, I'm also pretty sure I'll miss them when I'm gone. If you have any conclusions of your own regarding life in this big city, feel free to share them below.
Two weeks ago, the Gillard government succeeded in passing legislation for a new carbon tax through the lower house of the Australian federal parliament. Shortly after, opposition leader Tony Abbott made a "pledge in blood", promising that: "We will repeal the tax, we can repeal the tax, we must repeal the tax".
The passing of the carbon tax bill represents a concerted effort spanning at least ten years, made possible by the hard work and the sacrifice of numerous Australians (at all levels, including at the very top). Australia is the highest per-capita greenhouse gas emitter in the developed world. We urgently need climate change legislation, and this bill represents a huge step towards that goal.
I don't usually publish direct political commentary here. Nor do I usually name and shame. But I feel compelled to make an exception in this case. For me, Tony Abbott's response to the carbon tax can only possibly be addressed in one way. He leaves us with no option. If this man has sworn to repeal the good work that has flourished of late, then the solution is simple. Tony Abbott must never lead this country. The consequences of his ascension to power would be, in a nutshell, diabolical.
So, join me in making a blood pledge to never vote for Tony Abbott.
Fortunately, as commentators have pointed out, it would actually be extremely difficult — if not downright impossible — for an Abbott-led government to repeal the tax in practice (please G-d may such a government never come to pass). Also fortunate is the fact that support for the anti-carbon-tax movement is much less than Abbott makes it out to be, via his dramatic media shenanigans.
Of course, there is also a plethora of other reasons not to vote for Tony. His hard-line Christian stance on issues such as abortion, gay marriage, and euthanasia. His xenophobia towards what he perceives as "the enemies of our Christian democratic society", i.e. Muslims and other minority groups. His policies regarding Aboriginal rights. His pathetic opportunism in jumping on the scare-campaign bandwagon to "stop the boats". His unashamed labelling of himself as "Howard 2.0". His budgie smugglers (if somehow — perhaps due to a mental disability — nothing else about the possibility of Abbott being PM scares the crap out of you, at least consider this!).
In last year's Federal election, I was truly terrified at the real and imminent possibility that Abbott could actually win (and I wasn't alone). I was aghast at how incredibly close he came to claiming the top job, although ultimately very relieved in seeing him fail (a very Bush-esque affair, in my opinion, was Australia's post-election kerfuffle of 2010 — which I find fitting, since I find Abbott almost as nauseating as the legendarily dimwitted G-W himself).
I remain in a state of trepidation, as long as Abbott continues to have a chance of leading this country. Because, laugh as we may at Gillard's 2010 election slogan of "Moving Forward Together", I can assure you that Abbott's policy goes more along the lines of "Moving Backward Stagnantly".
Image courtesy of The Wire.
I've always hated Facebook. I originally joined not out of choice, but out of necessity, there being no other way to contact numerous friends of mine who had decided to boycott all alternative methods of online communication. Every day since joining, I've remained a reluctant member at best, and an open FB hater to say the least. The recent decisions of several friends of mine to delete their FB account outright, brings a warm fuzzy smile to my face. I haven't deleted my own FB account — I wish I could; but unfortunately, doing so would make numerous friends of mine uncontactable to me, and numerous social goings-on unknowable to me, today as much as ever.
There are, however, numerous features of FB that I have refused to utilise from day one, and that I highly recommend that all the world boycott. In a nutshell: any feature that involves FB being the primary store of your important personal data, is a feature that you should reject outright. Facebook is an evil company, and don't you forget it. They are not to be trusted with the sensitive and valuable data that — in this digital age of ours — all but defines who you are.
I do not upload any photos to FB. No exceptions. End of story. I uploaded a handful of profile pictures back in the early days, but it's been many years since I did even that.
People who don't know me so well, will routinely ask me, in a perplexed voice: "where are all your Facebook photos?" As if not putting photos on Facebook is akin to not diving onto the road to save an old lady from getting hit by a five-car road train.
My dear friends, there are alternatives! My photos all live on Flickr. My Flickr account has an annual fee, but there are a gazillion advantages to Flickr over FB. It looks better. It doesn't notify all my friends every time I upload a photo. For a geek like me, it has a nice API (FB's API being anything but nice).
But most importantly, I can trust Flickr with my photos. For many of us, our photos are the most valuable digital assets we possess, both sentimentally, and in informational, identity, and monetary terms. If you choose to upload your photos to FB, you are choosing to trust FB with those photos, and you are relinquishing control of them over to FB. I know people who have the only copy of many of their prized personal photos on FB. This is an incredibly bad idea!
FB's Terms of Service are, to say the least, horrendous. They reserve the right to sell, to publish, to data mine, to delete, and to prevent deletion of, anything that you post on FB. Flickr, on the other hand, guarantees in its Terms of Service that it will do none of these things; on the contrary, it even goes so far as to allow you to clearly choose the license of every photo you upload to the site (e.g. Creative Commons). Is FB really a company that you're prepared to trust with such vital data?
If you're following my rule above, of not uploading photos to FB, then not tagging your own photos follows automatically. Don't tag your friends' photos either!
FB sports the extremely popular feature of allowing users to draw a box around their friends' faces in a photo, and to tag those boxes as corresponding to their friends' FB accounts. For a geek like myself, it's been obvious since the moment I first encountered this feature, that it is Pure Evil™. I have never tagged a single face in a FB photo (although unfortunately I've been tagged in many photos by other people). Boycott this tool!
Why is FB photo tagging Pure Evil™, you ask? Isn't it just a cool idea, that means that when you hover over peoples' faces in a photo, you are conveniently shown their names? No — it has other conveniences, not for you but for the FB corporation, for other businesses, and for governments; and those conveniences are rather more sinister.
Facial recognition technology has been advancing at a frighteningly rapid pace, over the past several years. Up until now, the accuracy of such technology has been insufficient for commercial or government use; but we're starting to see that change. We're seeing the emergence of tools that combine the latest algorithms with information on the Web. And, as far as face-to-name information online goes, FB — thanks to the photo-tagging efforts of its users — can already serve as the world's largest facial recognition database.
This technology, combined with other data mining tools and applications, makes tagged FB photos one of the biggest potential enemies of privacy, and of anti-Big-Brother efforts, in the world today. FB's tagged photo database is a wet dream for the NSA and its cohorts. Do you want to voluntarily contribute to the wealth of everything they know about everyone? Personally, I think they know more than enough about us already.
This is a simple question of where your online correspondence is archived, and of how much you care about that. Your personal messages are an important digital asset of yours. Are they easily searchable? Are you able to export them and back them up? Do you maintain effective ownership of them? Do you have any guarantee that you'll be able to access them in ten years' time?
If a significant amount of your correspondence is in FB messages, then the answer to all the above questions is "no". If, on the other hand, you still use old-fashioned e-mail to send private messages whenever possible, then you're in a much better situation. Even if you use web-based e-mail such as Gmail (which I use), you're still far more in control of your mailbox content than you are with FB.
For me, this is also just a question of keeping all my personal messages in one place, and that place is my e-mail archives. Obviously, I will never have everything sent to my FB message inbox. So, it's better that I keep it all centralised where it's always been — in my good "ol' fashioned" e-mail client.
Don't use FB Pages as your web site. Apart from being unprofessional, and barely a step above (*shudder*) MySpace (which is pushing up the daisies, thank G-d), this is once again a question of trust and of content ownership. If you care about the content on your web site, you should care about who's caring for your web site, too. Ideally, you're caring for it yourself, or you're paying someone reliable to do so for you. At least go one step up, and use Google Sites — because Google isn't as evil as FB.
Don't use FB Notes as your blog. Same deal, really. If you were writing an old-fashioned paper diary, would you keep it on top of your highest bookshelf at home, or would you chain it to your third cousin's dog's poo-covered a$$? Well, guess what — FB is dirtier and dodgier than a dog's poo-covered a$$. So, build your own blog! Or at least use Blogger or Wordpress.com, or something. But not FB!
Don't put too many details in your FB profile fields. This is more the usual stuff that a million other bloggers have already discussed, about maintaining your FB privacy. So I'll just be quick. Anything that you're not comfortable with FB knowing about, doesn't belong in your FB profile. Where you live, where you work, where you studied. Totally optional information. Relationship status — I recommend never setting it. Apart from the giant annoyance of 10 gazillion people being notified of when you get together or break up with your partner, does a giant evil corporation really need to know your relationship / marital status, either?
Don't friend anyone you don't know in real life. Again, many others have discussed this already. You need to understand the consequences of accepting someone as your friend on FB. It means that they have access to a lot of sensitive and private information about you (although hopefully, if you follow all my advice, not all that much private information). It's also a pretty lame ego boost to add friends whom you don't know in real life.
Don't use any FB apps. I don't care what they do, I don't care how cool they are. I don't want them, I don't need them. No marketplace, thanks! No stupid quizzes, thanks! And please, for the love of G-d, I swear I will donate my left testicle to feed starving pandas in Tibet before I ever play Farmville. No thankyou sir.
Don't like things on FB. I hate the "Like" button. It's a useless waste-of-time gimmick. It also has some (small) potential to provide useful data mining opportunities to the giant evil FB corporation. I admit, I have on occasion liked things. But that goes against my general rule of hating FB and everything on it.
So, if you boycott all these things, what's left on FB, you ask? Actually, in my opinion, with all these things removed, what you're left with is the pure essentials of FB, and when viewed by themselves they're really not too bad.
The core of FB is, of course: having a list of friends; sharing messages and external content with groups of your friends (on each other's walls); and being notified of all your friends' activity through your stream. There's also Events, which is in my opinion the single most useful feature of FB — they really have done a good job of creating and refining an app for organising events and tracking invite RSVPs; and for informal social functions (at least), there actually isn't any decent competition to FB's events engine available at present. Plus, the integration of the friends list and the event invite system works very nicely.
What's left, at the core of FB, doesn't involve trusting FB with data that may be valuable to you for the rest of your life. Links and YouTube videos that you share with your friends, have a useful lifetime of about a few days at best. Events, while potentially sensitive in that they reveal your social activity to Big Brother, do at least also have limited usefulness (as data assets) past the date of the event.
Everything else is valuable data, and it belongs either in your own tender loving hands, or in the hands of a provider significantly more responsible and trustworthy than FB.
I'm not a drug user myself. However, one of my best friends recently died from a drug overdose. On account of that, I feel compelled to pen a short article, describing what I believe are some good reasons to choose to not take drugs.
It's no secret that narcotic substances cause physical and mental damage to those who use them. Recreational users are often quick to deny the risks, but ultimately there's no hiding from the truth.
The list of physical problems that can directly arise from drug use is colossal: heart attack; stroke; liver failure; diabetes; asthma; eye deterioration; and sexual impotence, to name a few common ones. In the case of injected drugs, there is, of course, also the major risk of contracting HIV / AIDS from infected needles.
Physical damage is, however, generally nothing compared to the long-term mental damage caused by narcotics: anxiety; hallucination; schizophrenia; and profound depression, to name but a few. Perhaps the worst mental damage of all, though, is the chemical addiction that results from the use of most narcotics.
Narcotic substances can, and often do, radically change someone's personality. They result in the user transforming into a different person, and this seldom means transforming for the better. The worst harm they do, is that they rob you of who you once were, with little hope of return. Couple this with the problem of addiction, and the only way forward for many drug users is downhill.
This is in one respect the most trivial reason to not take drugs, and in another respect a very serious concern. Anyway, the fact is that at their Western "street prices", drugs are no cheap hobby. For some people (unfortunately not for everyone), the fact that drugs are clearly a ripoff and an utter waste of money, is enough to act as a deterrent.
At the trivial end of things, the fact that recreational drug use is expensive isn't by itself a concern. All hobbies cost something. You could easily pay more for a golf club membership, or for an upmarket retail therapy spree, or for a tropical paradise vacationing habit. If your income can support your leisure, then hey, you might as well enjoy life.
However, as mentioned above, drugs are addictive. As such, a mounting drug addiction inevitably leads to an exponential increase in the cost of maintaining the habit. No matter how much money you have, eventually a drug addiction will consume all of it, and it will leave you so desperate for more, that no avenue to cash will be beneath you.
Drug users that begin "recreationally", all too often end up stealing, cheating and lying, just to scrounge up enough cash for the next fix. As such, it's not the price itself of drugs, but rather the depths to which addicts are prepared to plunge in order to pay for them, that is the real problem.
Whether you have much respect for the law or not, the fact is that narcotics are illegal in every corner of the world, and by consuming them you are breaking the law. If you don't respect the law, you should at least bear in mind that possession of narcotics carries serious criminal penalties in most countries, ranging from a small monetary fine, to the extreme sentence of capital punishment.
There's also a lot more to the illegality of drugs than the final act of the consumer, in purchasing and using them. By consuming drugs, you are supporting and contributing to the largest form of organised crime in the world. Drug users are in effect giving their endorsement to the entire illegal chain that constitutes the global narcotic enterprise, from cultivation and processing, to trafficking, to dealing on the street.
The mafia groups responsible for the global drug business, also routinely commit other crimes, most notably homicide, kidnapping, torture, extortion, embezzlement, and bribery. These other crimes exist in synergy with the drug enterprise: one criminal activity supports the others, and vice versa. Whether or not you believe drug use should be a crime, the fact is that drug use indirectly results in many other activities, of whose criminal nature there can be no doubt.
Related to (although separate from) the illegality issue above, there is also a bigger picture regarding the harm that's caused by drug use. Many hobbies have some negative impact on the wider world. Hunting whales endangers a species; buying Nike shoes promotes child labour; flying to Hawaii produces carbon emissions. However, the wider harm caused by drug use, compared to other more benign hobbies, is very great indeed.
Most narcotics are grown and produced in third world countries. The farmers and labourers who produce them, at the "bottom of the chain", do so often under threat of death, often for little monetary gain, and often at risk of pursuit by authorities. Meanwhile, the drug barons at the "top of the chain" routinely bribe authorities, extort those below them, and reap enormous profit from all involved.
In many drug-producing countries, wilderness areas such as rainforests are destroyed in order to cultivate more illegal crops. Thus, narcotics are responsible for environmental problems such as deforestation, and often with subsequent side-effects of deforestation such as soil erosion, salinity, and species extinction.
The drug industry routinely attracts poor and uneducated people who are desperate for work opportunities. However, it ultimately provides these people with little monetary gain and no economic security. Additionally, youths in impoverished areas are enticed to take up a criminal life as traffickers, dealers, and middlemen, leaving them and their families with a poor reputation and with many serious risks.
For anyone who cares about their friends and family — and everyone should, they're the most important thing in all our lives — the negative impact of drug use on loved ones, is possibly the worst of all the ways in which drugs cause harm.
Friends and family have to bear the pain of seeing a drug user suffer and degenerate from the effects of prolonged use. It is also they who end up caring for the drug user, which is a task possibly even more difficult than the care of a seriously ill friend or relative usually is.
Worst of all, drugs can lead people to steal from, lie to, verbally abuse, and even physically attack friends and family. Then there is the ultimate woe: the pain of drug abuse claiming the life of a close friend or relative. The harm to a drug user ends at the final hour; but for friends and family, the suffering and grief continue for many long years.
As I said, I'm not a drug user myself. I've only taken illicit drugs once in my life (hallucinogens), several years ago. I admit, it was a very fun experience. However, in retrospect, taking the drugs was also clearly a stupid decision. At the time, I was not thinking about any of the things that I've discussed in this article.
I regret very few things in my life, but I regret the choice that I made that day. I feel fortunate that the drugs left me with no addiction and with no long-term harm, and I have no intention whatsoever of taking drugs again in my life.
Yes, drugs are fun. But the consequences of taking them are not. As discussed here, the consequences are dead serious and they are quite real. I remain a libertarian, as far as drugs go — if you want to consume them, I see no reason to stop you. But seriously, think twice before deciding to take that route.
This article is dedicated to the memory of Josh Gerber, one of my best friends for many years, and the tragic victim of a drug overdose in May 2011. Chef of the highest calibre, connoisseur of fine music, always able to make me laugh, a critical thinker and a person of refined wit. May he be remembered for how he lived, not for how he died.
One other thing, though. It's also never been easier to inadvertently take it all for granted. To forget that just one generation ago, there were no budget intercontinental flights, no phrasebooks, no package tours, no visa-free agreements. And, of course, snail mail and telegrams were a far cry from our beloved modern Internet.
But that's not all. The global travel that many of us enjoy today, is only possible thanks to a dizzying combination of fortunate circumstances. And this tower (no less) of circumstances is far from stable. On the contrary: it's rocking to and fro like a pirate ship on crack. I know it's hard for us to comprehend, let alone be constantly aware of, but it wasn't like this all that long ago, and it simply cannot last like this much longer. We are currently living in a window of opportunity like none ever before. So, carpe diem — seize the day!
Have you ever before thought about all the things that make our modern globetrotting lives possible? (Of course, when I say "us", I'm actually referring to middle- or upper-class citizens of Western countries, a highly privileged minority of the world at large). And have you considered that if just one of these things were to swing suddenly in the wrong direction, our opportunities would be slashed overnight? Scary thought, but undeniably true. Let's examine things in more detail.
In general, international relations are at an all-time global high. Most countries in the world currently hold official diplomatic relations with each other. There are currently visa-free arrangements (or very accessible tourist visas) between most Western countries, and also between Western countries and many developing countries (although seldom vice versa, a glaring inequality). It's currently possible for a Western citizen to temporarily visit virtually every country in the world; although for various developing countries, some bureaucracy wading may be involved.
International relations is the easiest thing for us to take for granted, and it's also the thing that could most easily and most rapidly change. Let's assume that tomorrow, half of Asia and half of Africa decided to deny all entry to all Australians, Americans, and Europeans. It could happen! It's the sovereign right of any nation, to decide who may or may not enter their soil. And if half the governments of the world decide — on the spur of the moment — to bar entry to all foreigners, there's absolutely nothing that you or I can do about it.
Armed conflict is (of course) always a problem in various parts of the world. Parts of Africa, Asia, and Latin America are currently unsafe due to fighting, mainly involving guerillas and paramilitary groups (although traditional war between nations still exists today as well). For now, though, armed conflict is relatively contained within pockets of the globe.
But that could very easily change. World War III could erupt tomorrow. Military activity could commence in parts of the world that have been boring and peaceful for decades, if not centuries. Also, in particular, most hostility in the world today is currently directed towards other local groups; that hostility could instead be directed at foreigners, including tourists.
War between nations is also the most likely cause for a breakdown in international relations worldwide (it's not actually very likely that they'd break down for no reason — although a global spate of insane dictators is not out of the question). This form of conflict is currently very confined. But if history is any guide, then that is an extremely uncommon situation that cannot and will not last.
Disease is also a problem that has never gone away. However, it's currently relatively safe for tourists to travel to almost everywhere in the world, assuming that proper precautions are taken. Most infectious diseases can be defended against with vaccines. AIDS and other STDs can be controlled with safe and hygienic sexual activity. Water-borne sicknesses such as giardia, and mosquito-borne sicknesses such as malaria, can be defended against with access to bottled water and repellents.
Things could get much worse. We've already seen, with recent scares such as Swine Flu, how easily large parts of the world can become off-limits due to air-borne diseases for which there is no effective defence. In the end, it turned out that Swine Flu was indeed little more than a scare (or an epidemic well-handled; perhaps more a matter of opinion than of fact). If an infectious disease were contagious enough and aggressive enough, we could see entire continents being indefinitely declared quarantine zones. That could put a dent in some people's travel plans!
There are already large areas of the world that are effectively best avoided, due to some form of serious environmental contamination. But today's picture is merely the tip of the iceberg. If none of the other factors get worse, then I guarantee that this is one factor that will. It's happening as we speak.
Air pollution is already extreme in many of the world's major cities and industrial areas, particularly in Asia. However, serious though it is, large populations are managing to survive in areas where it's very high. Water contamination is a different story. If an entire country, or even an entire region, has absolutely no potable water, then living and travelling in those areas becomes quite hard.
Of course, the most serious form of environmental contamination possible, is a nuclear disaster. Unfortunately, the potential for nuclear catastrophe is still positively massive. Nuclear disarmament has been a slow and limited process. And weapons aside, nuclear reactors are still abundant in much of the world. A Chernobyl-like event on a scale 100 times bigger — that could sure as hell put travel plans to entire continents on hold indefinitely.
The offering of long-distance international flights today is simply mind-boggling. The extensive number of routes / destinations, the frequency, and of course the prices; all are at an unprecedented level of awesomeness. It's something you barely think about: if you want to get from London to Singapore next week, just book a flight. You'll be there in 14 hours or so.
Sorry to burst the bubble, folks; but this is one more thing that simply cannot and will not last. We already saw, with last year's Iceland volcano eruption, just how easily the international aviation network can collapse, even if only temporarily. Sept 11 pretty well halted global flights as well. A more serious environmental or security problem could halt flights for much, much longer.
And if nothing else grounds the planes first, then sooner or later, we're going to run out of oil. In particular, jet fuel is the highest-quality, most refined of all petroleum, and it's likely to be the first that we deplete within the next century. At the moment, we have no real alternative fuel — hopefully, a renewable form of jet propulsion will find itself tested and on the market before we run out.
Compared to all the hypothetical doomsday scenarios discussed above, this may seem like a trivial non-issue. But in fact, money is the most fundamental of all enablers of our modern globetrotting lifestyle, and it's the enabler that's most likely to disappear first. The fact is that many of us have an awful lot of disposable cash (especially compared with the majority of the world's population), and that cash goes an awfully long way in many parts of the world. This is not something we should be taking for granted.
The global financial crisis has already demonstrated the fragility of our seemingly secure wealth. However, despite the crisis, most Westerners still have enough cash for a fair bit of long-distance travel. Some are even travelling more than ever, because of the crisis — having lost their jobs, and having saved up cash over a long period of time, many have found it the perfect opportunity to head off on a walkabout.
Then there is the strange and mysterious matter of the international currency exchange system. I don't claim to be an expert on the topic, by any means. Like most simple plebs, I know that my modest earnings (by Western standards) tower above the earnings of those in developing countries; and I know that when I travel to developing countries, my Western cash converts into no less than a veritable treasure trove. And I realise that this is pretty cool. However, it's also a giant inequality and injustice. And like all glaring inequalities throughout history, it's one that will ultimately fall. The wealth gap between various parts of the world will inevitably change, and it will change drastically. This will of course be an overwhelmingly good thing; but it will also harm your travel budget.
Sorry that this has turned out to be something of a doomsday rant. I'm not trying to evoke the end of the world, with all these negative hypotheticals. I'm simply trying to point out that if any one of a number of currently positive factors in the world were to turn sour, then 21st century travel as we know it could end. And it's not all that likely that any one of these factors, by itself, will head downhill in the immediate future. But the combination of all those likelihoods does add up rather quickly.
I'd like to end this discussion on a 100% positive note. Right now, none of the doom-n-gloom scenarios I've mentioned has come to fruition. Right now, for many of us, la vita e bella! (Although for many many others, life is le shiiiite). Make the most of it. See the world in all its glory. Go nuts. Global travel has been one of the most difficult endeavours of all, for much of human history; today, it's at our fingertips. As Peter Pan says: "Second star to the right, and straight on 'till morning."
I had my doubts. Australia — for those of you that don't know — is a simple country with simple roads. The coast of Queensland is no exception. There's one highway, and it's called Route 1, and it goes up the coast in a straight line, from Brisbane to Cairns, for about 1,600 km. If you see a pub, it means you've driven through a town. If you see two pubs, a petrol station, a real estate agent and a post office (not necessarily all in different buildings), that's a big town. If you see houses as well, you must be in a capital city. It's pretty hard to get lost. Why would we need a GPS?
To cut a long story short, the GPSes were a major annoyance throughout the trip, and they were of no real help for the vast majority of our travelling. Several times, they instructed us to take routes that were a blatant deviation from the main route that prominent road signs had marked, and that were clearly not the quickest route anyhow. They discouraged going off the beaten track and exploring local areas, because they have no "shut up I'm going walkabout now" mode. And, what got to me more than anything, my travel buddies were clearly unable to navigate along even the simplest stretch of road without them, and it made me sad to see my friends crippled by these devices that they've come to so depend upon.
In the developed world, with its developed mapping providers and its developed satellite coverage, GPS is becoming ever more popular amongst automobile drivers. This is happening to the extent that I often wonder if the whole world is now running on autopilot. "In two hundred metres, take the second exit at the roundabout, then take the third left."
Call me a luddite and a dinosaur if you must, all ye GPS faithful… but I refuse to use a GPS. I really can't stand the things. They're annoying to listen to. I can usually find a route just fine without them. And using them makes you navigationally illiterate. Join me in boycotting GPS!
This is my main gripe with GPS devices. People who use them seem to become utterly dependent on them, sticking with them like crack junkies stick to the walls of talcum powder factories. If a GPS addict is at any time forced to drive without his/her beloved electronic companion, he/she is utterly lost. Using a GPS all the time makes you forget how to navigate. It means that you don't explore or immerse yourself in the landscape around you. It's like walking through a maze blindfolded.
I must point out, though, that GPS devices don't have to make us this stupid. However, this is the way the current generation of devices are designed. Current GPSes encourage stimulus-driven rather than spatially-driven navigation. Unless you spend quite a lot of time changing the default settings, 99% of consumer-car GPSes will only show you the immediate stretch of road in front of you in their map display, and the audio will only instruct you as to the next immediate action you are to take.
Worse still, the action-based instructions that GPSes currently provide are completely devoid of the contextual richness that we'd utilise, were we humans still giving verbal directions to each other. If you were driving to my house, I'd tell you: "turn right when you see the McDonald's, then turn left just before the church, at the bottom of the hill". The GPS, on the other hand, would only tell you: "in 300 metres, turn right, then take the second left". And, because you've completely tuned in to the hypnotic words of the GPS, and tuned out to the world around you, it's unlikely you'd even notice that there's a Maccas, or a church, or a hill, near my house.
Even the US military is having trouble with its troops suffering from reduced navigational ability, as a direct result of their dependence on field GPS devices. Similarly, far North American Inuits are rapidly losing the traditional arctic navigation skills that they've been passing down through the generations for centuries, due to the recent introduction of GPS aids amongst hunters and travellers in their tribes. So, if soldiers who are highly trained in pathfinding, and polar hunters who have pathfinding in their blood — if these people's sense of direction is eroding, what hope is there for us mere mortals?
I got started thinking about this, when I read an article about this possibility: Could GPS create a world without signs? I found this to be a chilling prediction to reflect upon, particularly for a GPS-phobe like myself. The eradication of traditional street signs would really be the last straw. It would mean that the GPS-averse minority would ultimately be forced to convert — presumably by law, since if we assume that governments allowed most street signs to discontinue, we can also assume that they'd make GPS devices compulsory for safety reasons (not to mention privacy concerns, anyone?).
I must admit, I'm a much more keen navigator and explorer than your average Joe. I've always adored maps — when I was a kid, I used to spend hours poring over the street directory, or engrossing myself in an atlas that was (at the time) taller than me. Nowadays, I can easily burn off an entire evening panning and zooming around Google Earth.
I love to work out routes myself. I also love to explore the way as I go. Being a keen urban cyclist, this is an essential skill — cycling is also one of the best methods for learning your way around any local area. It also helped me immensely in my world trip several years ago, particularly when hiking in remote mountain regions, but also in every new city I arrived at. I'm more comfortable if I know the compass bearings in any given place I find myself, and I attempt to derive compass bearings using the position of the sun whenever I can.
So, OK, I'm a bit weird, got a bit of a map and navigation fetish. I also admit, I took the Getting Lost orientation test, and scored perfectly in almost every area (except face recognition, which is not my strong point).
I'm one of those people who thinks it would be pretty cool to have lived hundreds of years ago, when intrepid sailors ventured (with only the crudest of navigational aids) to far-flung oceans, whose edges were marked on maps as being guarded by fierce dragons; and when fearless buccaneers ventured across uncharted continents, hoping that the natives would point them on to the next village, rather than skewer them alive and then char-grill their livers for afternoon tea. No wonder, then, that I find it fun being without GPS, whether I'm driving around suburban Sydney, or ascending a mountain in Bolivia.
Then again, I'm also one of those crazy luddites that think the world would be better without mobile phones. But that's a rant for another time.
Unfortunately, for those of us on Mac OS X 10.5 (Leopard), installing uploadprogress ain't all smooth sailing. The problem is that the extension must be compiled from source in order to be installed; and on Leopard machines, which all run on a 64-bit processor, it must be compiled as a 64-bit binary. However, the gods of Mac (in their infinite wisdom) decided to include with Leopard (after Xcode is installed) a C compiler that still behaves in the old-school way, and that by default does its compilation in 32-bit mode. This is a right pain in the a$$, and if you're unfamiliar with the consequences of it, you'll likely see a message like this coming up in your Apache error log when you try to install uploadprogress and restart your server:
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/local/php5/lib/php/extensions/no-debug-non-zts-20060613/uploadprogress.so' - (null) in Unknown on line 0
Hmmm… (null) in Unknown on line 0. WTF is that supposed to mean? (You ask). Well, it means that the extension was compiled for the wrong environment; and when Leopard tries to execute it, a low-level error called a segmentation fault occurs. In short, it means that your binary is $#%&ed.
But fear not, Leopard PHP developers! Here are some instructions for how to install uploadprogress by compiling it as a 64-bit binary:
1. Make sure that the PHP binaries in /usr/bin are symlinks to the proper versions in /usr/local/php5/bin.
2. cd to the directory containing the extracted tarball that you downloaded, e.g. cd /download/uploadprogress-1.0.0
3. sudo phpize
4. MACOSX_DEPLOYMENT_TARGET=10.5 CFLAGS="-arch x86_64 -g -Os -pipe -no-cpp-precomp" CCFLAGS="-arch x86_64 -g -Os -pipe" CXXFLAGS="-arch x86_64 -g -Os -pipe" LDFLAGS="-arch x86_64 -bind_at_load" ./configure (you may need to sudo su before running it, and type exit after running it)
5. sudo make
6. sudo make install
7. Add extension=uploadprogress.so to your php.ini file (for Entropy users, this can be found at /usr/local/php5/lib/php.ini)
8. sudo apachectl restart

If all is well, then a phpinfo() check should output an uploadprogress section, with a listing for the config variables uploadprogress.file.contents_template, uploadprogress.file.filename_template, and uploadprogress.get_contents. Your Drupal status report should be happy, too. And, of course, FileField will totally rock.
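One extra check, which isn't part of the steps above but can save you a restart cycle, is to ask the stock file command what architecture the freshly compiled extension was actually built for (the path below is simply the one from the Apache error message earlier; adjust it if your extensions directory differs):

file /usr/local/php5/lib/php/extensions/no-debug-non-zts-20060613/uploadprogress.so
# should report a Mach-O 64-bit bundle for architecture x86_64;
# if it only mentions i386, the 64-bit configure flags didn't take effect

php -i | grep -i uploadprogress
# assuming the CLI reads the same php.ini, this prints the uploadprogress
# config variables once the extension is loading correctly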
The project is a Drupal multisite setup, and like most multisite setups, it uses a bunch of symlinks in order for multiple subdomains to share a single codebase. For each subdomain, I create a symlink that points to the directory in which it resides; in effect, each symlink points to itself. When Apache comes along, it treats a symlink as the "directory" for a subdomain, and it follows it. By the time Drupal is invoked, we're in the root of the Drupal codebase shared by all the subdomains. Everything works great. All our favourite friends throw a party. Champagne bottles pop.
The bash command to create the symlinks is pretty simple — for each symlink, it looks something like this:
ln -s . subdomain
Unfortunately, a symlink like this does not play well with certain IDEs that try to walk your filesystem. When they hit such a symlink, they get stuck infinitely recursing (or at least, they keep recursing for a long time before they give up). The solution? Simple: delete such symlinks from your development environment. If this is what's been dragging your system down, then removing them will instantly cure all your woes. For each symlink, deleting it is as simple as:
rm subdomain
(Don't worry, deleting a symlink doesn't also delete the thing that it's pointing at).
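If you're not sure whether your codebase contains any of these self-referencing symlinks, here's a minimal sketch (using nothing fancier than find and readlink, run from your project root) that lists every symlink together with its target, so the ones pointing at "." stand out:

find . -type l | while read -r link; do
  # print each symlink alongside whatever it resolves to
  printf '%s -> %s\n' "$link" "$(readlink "$link")"
done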
It seems obvious, now that I've worked it out; but this annoying "slow-down" of Eclipse and TextMate had me stumped for quite a while until today. I've only recently switched to Mac, and I've only made the switch because I'm working at Digital Eskimo, which is an all-out Mac shop. I'm a Windows user most of the time (God help me), and Eclipse on Windows never gave me this problem. I use the new Vista symbolic links functionality, which actually works great for me (and which is possibly the only good reason to upgrade from XP to Vista). Eclipse on Windows apparently doesn't try to follow Vista symlinks. This is probably why it took me so long to figure it out (that, and Murphy's Law) — I already had the symlinks when I started the project on Windows, and Eclipse wasn't hanging on me then.
I originally thought that the cause of the problem was Git. Live local is the first project that I've managed with Git, and I know that Git has a lot of metadata, as well as compressed binary files for all the non-checked-out branches and tags of a repository. These seemed likely candidates for making Eclipse and TextMate crash, especially since neither of these tools has built-in support for Git. But I tried importing the project without any Git metadata, and it was still hanging forever. I also thought maybe it was some of the compressed JavaScript in the project that was to blame (e.g. jQuery, TinyMCE). Same story: removing the compressed JS files and importing the directory was still ridiculously slow.
IDEs should really be smart enough to detect self-referencing or cyclic symlinks, and to stop themselves from recursing infinitely over them. There is actually a bug filed for TextMate already, so maybe this will be fixed in future versions of TextMate. Couldn't find a similar bug report for Eclipse. Anyway, for now, you'll just have to be careful when using symlinks in your (Drupal or other) development environment. If you have symlinks, and if your IDE is crashing, then try taking out the symlinks, and see if all becomes merry again. Also, I'd love to hear if other IDEs handle this better (e.g. Komodo, PHPEdit), or if they crash just as dismally when faced with symlinks that point to themselves.
I recently finished reading an excellent book: Collapse, by Jared Diamond. Diamond explains in this book, through numerous examples, the ways in which we are consuming above our needs, and he points out the enormous gap in the average consumption of 1st- vs 3rd-world citizens. He also explains the myriad ways in which we can reduce our personal consumption, and yet still maintain a healthy modern existence. In the conclusion of this book, Diamond presents a powerful yet perfectly sensible answer to the problem of global over-consumption: it's we who do the consuming; and as such, the power to make a difference is in our hands.
It's a simple matter of chain reactions. First, we (local consumers) choose to purchase fewer products. Next, the retail and wholesale businesses in our area experience a drastic decrease in their sales — resulting in an (unfortunate but inevitable) loss of income for them — and as such, they radically downsize their offerings, and are forced to stock a smaller quantity of items. After that, manufacturers and packagers begin to receive smaller bulk orders from their wholesale customers, and they in turn slow down their production, and pump out a lower quantity of goods. Finally, the decreased demand hits the primary producers (e.g. miners, farmers, fishermen and loggers) at the bottom of the chain, and they consequently mine less metal, clear fewer grazing fields, catch fewer fish and chop down fewer trees.
A simple example of this would be if, say, the entire population of a major metropolitan city decided to boycott all (new, not second-hand) furniture purchases, building and renovation efforts for an entire year. If this happened, then the first people to be affected would be furniture businesses, building companies and home improvement stores, who would quickly be forced to stock far fewer items (and to employ far fewer people in building far fewer houses). Next, timber wholesalers and distributors would follow suit by stocking less bulk-quantity timber. Finally, logging companies would inevitably be forced to chop down fewer trees, as they would simply have insufficient customers to buy all the resultant timber. And when you think about it, almost every major city in the world could do this, as most cities literally do have enough existing houses, commercial buildings and furnishings that their entire population could choose not to buy or build anything new for an entire year, and yet everyone would still have ample supplies.
What so many people fail to realise is this: it's we that are the source of the problem; and as such, it's we that are also the only chance of there being a solution. There's only ever one way to tackle a big problem, and that's by digging down until you find the root cause, and attacking the problem directly at that root. We can launch protests and demonstrations against governments and big businesses — but they're not the root of the problem, they just represent us. We can write letters, draw cartoons, publish blog posts, and capture footage for the media — but they're not the root of the problem, they just inform us. And we can blame environmental stuff-ups such as oil spills, fossil fuel burning and toxic waste dumping — but they're not the problem either, they're the side-effects of the problem. The problem is our own greed: we lavish ourselves with goods, and the world suffers as a consequence.
One of the biggest obstacles with the principle of reducing consumption, is the automatic rhetoric that so many people — laymen and politicians alike — will blurt out at the mere suggestion of it: "the economy would collapse, and unemployment would go through the roof." While this is true to some extent, it is at the end of the day not correct. Yes, the economy would suffer in the short-term: but in the long-term, the prices of less-consumed goods would rise to reflect the new demand, and businesses would find a new level at which to competitively operate. Yes, unemployment would spike for a time: but before long, new jobs would open up, and the service economy would expand to accommodate a higher number of workers. Ultimately, our current inflated rate of consumption is unsustainable for the economy, as it's only a matter of time before various resources begin to run out. Reducing consumption before that happens is going to make the inevitable day a lot less unpleasant, and it will impact our lifestyle a lot less if we've already made an effort to adjust.
At the more personal level, the main obstacle is education: informing people of how much they're consuming unnecessarily; explaining the ways in which consumption can be reduced; and increasing awareness of the impacts of over-consumption, on the environment and on the rest of global society. Few developed-world people realise that they consume more than 10 times as much as their third-world neighbours, in almost every major area — from food, to clothing, to electronics, to stationery, to toys, to cigarettes. Fewer still are aware of exactly what kind of a difference they can make, by purchasing such things as new vs recycled paper, or old-growth vs plantation timber products. And what's more, few have a clear idea of what (small but important) steps they can take, in order to slowly but surely reduce their consumption: things such as hand-me-down clothing, home-grown vegetable gardening, and a paperless office being just a few. As I blogged about previously, new approaches to reuse can also play an integral part in the process.
Finally, we get to the most basic and yet the most tricky obstacle of them all. As I wrote at the start of this blog entry, consuming less is the simplest thing we can do to help this planet, and yet also the most complicated. And the reason for this can be summed up in one smelly, ugly little word: greed. The fact is, we humans are naturally greedy. And when we live in a world where supermarkets and mega-stores offer aisle after aisle of tantalising purchases, and where our wallets are able to cater to all but the dearest of them, the desire to buy and to consume can be nothing less than irresistible. And to solve this one, it's simply a matter of remembering that you need a better reason to buy something, than simply because "it's there" and "I can afford it". Hang on. Do you need it? What's wrong with what you've already got? And how long will the new purchase last? (Boycotting cheap consumer goods with a "4-year life span" is also a good idea — it's not just a longer-term investment, it's also a consumption cut.)
I don't have the answer to the problem of the greediness that's an inherent part of human nature in all of us. But I do believe that with a bit more education, and a bit more incentive, we'll all be able to exercise more self-discipline in our spending-driven lives. And if we can manage that, then we will be step one in a chain reaction that will radically reshape the global economy, and that will bring that previously-thought unstoppable beast to a grinding halt.
The pitfalls are so many, that if we actually stopped to think about them, we'd all realise that we have no choice but to go and live in a cave for the rest of our lives. TV? Runs on power, comes from coal (in many places), contributes heat to the planet. Air-conditioning? Both of the above as well, multiplied by about a hundred. Car? Runs on petrol, comes from oilfields that were once natural habitats, combusts and produces CO2 that warms up the planet. Retail food? Comes from farms, contributing to deforestation and erosion, built on lands where native flora once grew and where native fauna once lived, carried on trucks and ships that burn fuel, packaged in plastic wrappers and tin cans that get thrown away and sent to landfill.
The act of writing this thought? Using electricity and an Internet connection, carried on power lines and data cables that run clean through forests, that squash creatures at the bottom of the ocean, powering a computer made of plastics and toxic heavy metals that will one day be thrown into a hole in the ground. I could go on all day.
Our daily lives are a crazy black comedy of blindness: each of us is like a blind butcher who carves up his customers, thinking that they're his animal meats; or like a blind man in his house, who thinks he's outside enjoying a breeze, when he's actually feeling the blizzard blowing in through his bedroom window. It would be funny, if it wasn't so pitifully tragic, and so deadly serious. We've forgotten who we are, and where we are, and what we're part of. We've become blind to the fact that we're living beings, and that we exist on a living planet, and that we're a part of the living system on this planet.
Finally, however, more and more people are taking off the blindfold, and realising that they do actually exist in this world, and that closing the window isn't the answer to stopping that breeze from getting warmer.
When people take off the blindfold, they immediately see that every little thing that we do in this world has consequences. In this age of globalisation, these consequences can be much more far-reaching than we might imagine. And at a time when our natural environment is in greater peril than ever before, they can also be serious enough to affect the future of the world for generations to come.
In a recent address to the National Press Club of Australia, famed environmentalist Dr David Suzuki suggests that it's time we all started to "think big". The most effective way to start getting serious about sustainability, and to stop worldwide environmental catastrophe, is for all of us to understand that our actions can have an impact on environmental issues the world over, and that it is our responsibility to make a positive rather than a negative impact.
The world's leading scientists are taking a stronger stance than ever on the need for the general public to get serious about sustainability. Last week, the Intergovernmental Panel on Climate Change released a report on global warming, the most comprehensive and authoritative one written to date. The report has erased any doubts that people may have had about (a) the fact that global warming exists, and (b) the fact that humanity is to blame for it, saying that it is "very likely" — 90%+ probability — that human action has been the overwhelming cause of rising temperatures.
Governments of the world are finally starting to listen to the experts as well. The release of the IPCC report prompted the Bush Administration to officially accept, for the first time ever, that humanity is causing global warming. The Howard government, here in Australia, is making slightly more effort [than its usual nothing] to do something about climate change, by introducing new clean energy initiatives, such as solar power subsidies, "clean coal" development, and a (highly controversial) push for nuclear power — although all of that is probably due to the upcoming federal election. And the European Union is going to introduce stricter emission limits for all cars sold in Europe, known as the 'Euro 5 emissions standards'.
And, of course, the new documentary / movie by Al Gore, An Inconvenient Truth, has done wonders for taking the blindfold off millions of Joe Citizens the world over, and for helping them to see just what's going on around them. After I saw this film, it made me shudder to think just what a different world we'd be living in today, if Al Gore had been elected instead of George Dubbya back in 2000. How many wars could have been prevented, how many thousands of people could have lived, how many billions of tonnes of fuel could have been spared, and how many degrees Celsius could the world's average temperature have been reduced, if someone like Al Gore had been in charge for the past 7 years?
It's great to see the global environmental movement gaining a new level of respect that it's never before attained. And it's great to see that more people than ever are realising the truth about environmental activism: it's not about chaining yourself to trees, growing your hair down to your bum, and smoking weed all day (although there's nothing wrong with doing any of those things :P); it's about saving the planet. Let's hope we're not too late.
This has been happening in my industry (the IT industry) perhaps more than in any other. Deflated workers have flocked away — fleeing such diverse occupations as database admin, systems admin, support programmer, and project manager — driven by the promise of freedom from corporate tyranny, and hoping to be unshackled from the manacles of boring and unchallenging work. The pattern can be seen manifesting itself in other industries, too, from education to music, and from finance to journalism. More than ever, people are getting sick of doing work that they hate, and of being employed by people who wouldn't shed a tear if their pet panda kicked the bucket and joined the bleedin' choir invisible.
And why are people doing this? The biggest reason is simply because — now more than ever — they can. With IT in particular, it's never been easier to start your own business from scratch, to develop and market hip-hop new applications in very small teams (or even alone), and to expand your skill set far beyond its humble former self. On top of that, people are being told (via the mass media) not only that they can do it, but that they should. It's all the rage. Doing-what-thou-lovest is the new blue. Considering the way in which a career has been reduced to little more than yet another consumer product (in recent times), this attitude should come as no surprise. After all, a job where you do exactly what you want sounds much better than a job where you do exactly what you're told.
Call me a cynic, but I am very dubious of the truth of this approach. In my experience, as soon as you turn a pleasurable pastime into a profession, you've suddenly added a whole new bucket of not-so-enjoyable tasks and responsibilities into the mix; and in the process, you've sacrificed at least some of the pleasure. You've gone from the humble foothills to the pinnacle of the mountaintop — so to speak — in the hope of enjoying the superior view and the fresh air; only to discover that the mountain frequently spews ash and liquid hot magma from its zenith, thus rather spoiling the whole venture.
When I say that these things are in my experience, I am (of course) referring to my experience in the world of web design and development. I've been doing web design in one form or another for about 8 years now. That's almost as long as I've been online (which is for almost 9 years). I'm proud to say that ever since I first joined the web as one of its netizens (wherever did that term go, anyway? Or did it never really make it in the first place? *shrugs*), at age 12, I've wanted to make my own mark on it. Back then, in 1998, equipped with such formidable tools as Microsoft Word™ and its Save as HTML feature, and inhabiting a jungle where such tags as <blink> and <marquee> were considered "Web Standards", it was all fun and games. In retrospect, I guess I really was making my mark on the web, in the most crude sense of the term. But hey: who wasn't, back then?
From these humble beginnings, my quirky little hobby of producing web sites has grown into a full-fledged business. Well, OK: not exactly full-fledged (I still only do it on the side, in between study and other commitments); but it's certainly something that's profitable, at any rate. Web design (now known as web development, according to the marketing department of my [one-man] company) is no longer a hobby for me. It's a business. I have clients. And deadlines. And accounts. Oh, and a bit of web development in between all that, too. Just a bit.
Don't get me wrong. I'm not trying to complain about what I do. I chose to be a web developer, and I love being a web developer. I'm not saying that it's all a myth, and that you can't work full-time on something that you're passionate about, and retain your passion for it. But I am saying that it's a challenge. I am saying that doing an activity as a professional is very, very different from doing it as an amateur enthusiast. This may seem like an obvious statement, but in Wild Wild West industries such as web development, it's one that not everyone has put much thought into.
Going pro has many distinct advantages: you push yourself harder; you gain much more knowledge of (and experience in) your domain; you become a part of the wider professional community; and of course, you have some bread on the table at the end of the day. But it also has its drawbacks: you have to work all the time, not just when you're in the mood for it; you're not always doing exactly what you want to do, or not always doing things exactly the way you want them done; and worst of all, you have to take care of all the other "usual" things that come with running a small business of any kind. The trick is to make sure that the advantages always outweigh the drawbacks. That's all that any of us can hope for, because drawbacks are a reality in every sphere of life. They don't go away: they just get overshadowed by good things.
Looking back on my choice of career — in light of this whole pleasure-vs-pro argument — I'm more confident than ever that I've made the right move by going into the IT profession. Back when I was in my final year of high school, I was tossing up between a career in IT, and a career in Journalism (or in something else related to writing). Now that IT is my day job, my writing hobby is safe and sound as a pristine, undefiled little pastime. And in my opinion, IT (by its very nature) is much more suitable as a profession than as a pastime, and writing (similarly) is much more suitable as a pastime than as a profession. That's how I see it, anyway.
For all of you who are planning to (or who already have) quit your boring day jobs, in order to follow your dreams, I say: good luck to you, and may you find your dreams, rather than just finding another boring day job! If you're ever feeling down while following your dreams, just think about what you're doing, and you'll realise that you've actually got nothing to feel down about. Nothing at all.
The idea behind evangelism is that one particular religion is the one true way to find G-d and to live a good life. It is therefore a duty, and an act of kindness, for the followers of that religion to "spread the word", and to help all of humanity to "see the light".
The catalyst behind my writing this article was that I happened to run into a Christian evangelist today, whilst out walking on the streets. I've never actually stopped and talked to one of these people before: my standard procedure is to ignore them when out and about, and to slam the door in their faces when they come a-knocking. This, quite understandably, is also how most other people react. But today I stopped and talked.
To cut a long story short, I walked away from the conversation almost an hour later, more certain than ever that evangelism is a bad idea.
Now, don't get me wrong: I'm all for the spreading of knowledge, and I think that connecting with and learning about religions and cultures outside of your own is a very worthwhile endeavour. I have personally devoted a fair amount of effort to this form of learning, and I don't regret one minute of it.
But imposing your ideas onto others is a whole different ball game. Teaching is one thing, and dictating is quite another. Unfortunately, evangelism is not about the sharing of knowledge or opinions. Sharing would involve telling people: "these are my beliefs, what are yours?" Instead, evangelism involves telling people: "these are my beliefs, and if you know what's good for you, they'll be yours too".
I happen to be a member of the Jewish faith, although I ain't the most religious Jew on the block, and I don't agree with everything that my religion has to say. I believe that Jesus was a great bloke, who obviously performed a great many charitable deeds in his life, and who was revered and respected by many of his contemporaries. As far as I'm concerned, someone who blesses fishermen and promotes world peace is a nice guy.
Nice guy, sure; but not son of G-d. Nice guy, sure; but not responsible for atoning, for all eternity, for the sins of every man, woman, and child that believes in his divinity. Nice guy. Jewish too, by the way (not Roman). But that's it.
Today, my over-zealous acquaintance in the shopping mall told me his beliefs, which happened to be slightly different to my own. I had no problem with listening to them. According to my acquaintance, Jesus is the son of G-d, he was resurrected from the dead, and he atoned for all the sins of his followers through his death. I am aware that this is the belief held by millions of Christians around the world, and I respect that belief, and I have no desire to impose any other conflicting belief upon any Christian person. I just happen to have a different belief, that's all.
However, after that, things started getting a bit ugly. Next, I was informed that I am in grave danger. It is imperative that I accept a belief in Jesus and in Christianity, because only then will I be forgiven for all of my sins. Should I fail to accept this belief, I am doomed to eternity in hell.
Thanks for the warning, buddy - I appreciate you looking out for me, and I'm grateful that you've been kind enough to help me avoid eternal damnation 'n' all. But actually, I happen to believe that everyone goes to heaven (with a sprinkling of hellish punishment on the way, of course, depending on how much you've sinned), and that I already have a means of getting the all-clear from the Big Man regarding forgiveness, through my own religion.
The response? I'm wrong. I'm doomed. I haven't seen the light. Such a pity - it seemed, at first, that there was hope for me. If only I wasn't so damn stubborn.
Actually, I did see the light. How could I miss it, when it was being shone right in my face? For the sake of everyone's retinas, I say to all evangelists: stop shining that accursed light in our faces! Instead, why don't you practice what you preach, and respect the rights of others to serve G-d and to be charitable in their own way?
I don't respond well to advertisements that proclaim too-good-to-be-true offers. Hence my reaction to the whole "believe-in-my-way-and-all-your-sins-are-forgiven" thing. I also don't respond well to threats. Hence my reaction to the whole "believe-in-my-way-or-spend-eternity-in-hell" thing. It amazes and deeply disturbs me that this crude and archaic form of coercion has been so successful throughout the history of organised religion. But then again, those "$0 mobile phone" deals have been quite successful as well. I guess some people really are a bit simple.
I applaud the millions of Christian people (some of whom are my personal friends or acquaintances) who openly criticise and shun the evangelism of their brethren. It's a relief to know that the majority of people agree with my opinion that evangelism is the wrong way to go.
What this world needs is a bit more respect for others. We need to respect the rights of other people to live out a good life, according to whatever religion or doctrine they choose. We need to accept that if people want to conform to our ways, then they'll come of their own volition, and not through coercion. And we need to accept that imposing one's beliefs upon others is an arrogant, disrespectful, and hostile act that is not appreciated. World peace is a long way off. The practice of evangelism is a sound way to keep it like that. A better alternative is to agree to disagree, and to get on with doing things that really do make the world a better place.
The reason for my sudden urge to express this opinion is a particular series of books that I'm reading now. I've (stupidly) been putting off reading Harry Potter for many years, but I've finally gotten round to it. Unfortunately, I saw two of the movies - 'Harry Potter and the Philosopher's Stone', and 'Harry Potter and the Prisoner of Azkaban' - before starting the books. Although I was reluctant to see them, I was able to put off reading the books on my own, but I wasn't able to get out of seeing the movies with my friends.
Luckily, when I started reading 'Harry Potter and the Philosopher's Stone' (that's 'sorcerer' for all you Americans out there), I couldn't remember much of the movie, so the book wasn't too spoiled. However, having a predefined visual image of the characters was a definite drawback (it stopped my imagination from flowing freely), as was knowledge of some of the Potter-lingo (e.g. 'muggles'), and of the nature of magic. 'Harry Potter and the Chamber of Secrets' (which I still haven't seen the movie for) was a much better read, as I knew nothing of the storyline, and had no predefined image of all the new characters.
I'm up to the third one now ('Prisoner of Azkaban'), and having seen the movie not that long ago, I can remember most of it pretty clearly. To be quite honest, having the movie in my head is ruining the book. I'm not digging any of the suspense, because I already know what's going to happen! There are no grand visual concoctions growing in my head, because I've already got some shoved in there! It's a downright pain, and I wish I'd never seen the movie. I'm definitely not seeing any more 'Harry Potter' movies until I've finished the books.
This is in contrast to my experience with 'Lord of the Rings'. I honestly believe this to be the best book of all time, but perhaps if I'd seen the movie(s) first, rather than reading all the books twice before seeing any of the movies, my opinion might differ. The movies of LOTR are absolute masterpieces, no doubt about that. But seeing them after having read the books makes them infinitely more worthwhile. When you see the landscapes on the big screen, you also see the kings and queens and battles and cities of long-gone history, that aren't part of the movie, and that non-readers have no idea about. When you hear the songs (usually only in part), you know the full verses, and you know the meaning behind them. And when things are done wrongly in the movie, they stick out to you like a sore thumb, while to the rest of the audience they are accepted as the original gospel truth. Tragic, nothing less.
So my advice, to myself and to all of you, is to always read the book first, because it's always better than the movie, and while watching the movie (first) spoils the book, doing the reverse has the opposite effect!