Friday, December 18, 2015

Implementing Egg in PHP

I recently implemented an interpreter for the toy language Egg as described in Marijn Haverbeke's Eloquent JavaScript.

It was an interesting exercise for two reasons.

Firstly, I implemented it in PHP rather than JavaScript.

I did this not because I'm more comfortable in PHP than I am in JS (I'm at least equally comfortable), but because I wanted to make it difficult for me to get lazy and copy and paste parts of the JS implementation.

This turned out to be a great move because it forced me to flex PHP muscles that I haven't ever really used. For instance, I've used anonymous functions before, but the fact that PHP closures don't automatically capture variables from their parent scope meant I needed to come up with a couple of creative solutions for something that just comes along "for free" in JS.
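To make the difference concrete, here's a minimal sketch (the function names are mine, not from my interpreter): the JavaScript closure sees `offset` automatically, while the commented PHP equivalent has to pull the variable in with an explicit `use` clause.

```javascript
// In JavaScript a closure captures its enclosing scope automatically:
function makeAdder(offset) {
  return (n) => n + offset; // `offset` comes along "for free"
}

const add10 = makeAdder(10);
console.log(add10(5)); // 15

// The rough PHP equivalent must import the variable explicitly:
//   $add10 = function ($n) use ($offset) { return $n + $offset; };
```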

Secondly, I implemented the entire thing using TDD.

This was, by far, the most difficult part of the exercise. Haverbeke's code is clean and succinct and after reading it I had a clear idea of what I wanted to implement. However, I forced myself to stick to the Red-Green-Refactor cycle and built the system in the smallest (reasonable) increments in functionality I could think of.

This was frustrating in parts, as this was the first time I'd used TDD properly to drive the design of something bigger than a code kata or coderetreat exercise. I've used unit testing pretty extensively in some large projects, but never explicitly to drive the design, so this was new for me.
Having seen Haverbeke's implementation I knew exactly where I wanted to go, but forcing myself to drive the design and implementation using TDD (and being as methodical as I was able to be) meant that I couldn't skip over a bunch of steps and simply write a PHP version of Haverbeke's code. I had to go the long way around, and I'm glad I did as it meant I learnt a lot about both the testing framework I was using (PHPSpec) as well as testing in general.
The upshot of sticking to TDD all the way through, though, was that once I'd finished the project and actually looked at the resulting code, I was pleasantly surprised to see just how clean it was. Far cleaner, I'd wager, than if I'd bombed straight into writing an implementation. I'm also pretty sure that approaching it the way I did made troubleshooting a breeze. I can imagine that if I hadn't written my tests upfront (and in the way I did) I'd have spent a bunch of time tracking down logic errors.

I'm not going to be putting the code up anywhere - the project wasn't anything more than a weekend hack, put together on a Saturday and Sunday night after the rest of the family had gone to bed. I would recommend that anyone who is interested in programming languages writes an interpreter at some stage, though.
Writing an interpreter does two things.
Firstly, it demystifies programming languages. It shows us how an interpreter is a program just like any other.
Secondly, even though an interpreter (or compiler, or assembler, or whatever you're writing) is just a program, you'll be blown away by the power of the ideas embodied in that program.

As a final note, it's been a good few years since I experienced that unique thrill that comes from having a program you've written do something cool. It was most definitely one of those moments when my interpreter could take this

do(define(pow, fun(base, exp,
     if(==(exp, 0),
        1,
        *(base, pow(base, -(exp, 1)))))),
   print(pow(2, 10)))

and do this

1024

Saturday, December 5, 2015

Guidance, binary, and responsibility.

When I was in Std. 6 (Grade 8) I had the worst "learning" experience of my life.
It was in what was called "guidance", ostensibly a class meant to furnish us with the skills we needed for life outside of school.

On that particular day we were discussing careers. This was around 1993 and computers/IT/whatever was still seen as being a particularly smart career move (although this was still before the feeding frenzy of the dotcom boom, so I don't think it was as hot a choice as it was later taken to be).
A fair number of us loved computer games, and some of us -- myself included -- thought it would be cool to grow up and make our own. That is, to be programmers.
At some stage someone (it could have even been me) mentioned becoming a programmer as a possibility.

The teacher then asked, "Who of you knows what binary code is?" Some of us put up our hands.

Then he asked "do any of you know how it works?" and I kept my hand up.

He then offered me a piece of chalk and asked if I wouldn't come to the board and write up something simple in binary -- maybe the number 5; I can't remember exactly which problem he posed. I do remember, though, that I worked it out when I got home.

I went up to the board, and froze.

Now, there are a couple of ways a teacher could respond to this. I've taught classes before, and have been lucky enough to have seen brilliant teachers in action (my philosophy supervisor in his second year class on Epistemology was, perhaps, the greatest example of how to teach I've ever seen -- managing to turn even questions that seem entirely hopeless into opportunities for learning).

The correct response to a frozen student is to engage them, draw out what they actually know. Ask leading questions, encourage them to come to the right conclusions.

What my teacher did was use it as an opportunity to humiliate me and the rest of my class.

After a few moments of standing before the board frozen he sent me back to my desk and said something along the lines of "not everyone is cut out for being a programmer, those of you in this class interested in computers can certainly find work in computers, fixing them, replacing hard-drives and such, but not necessarily as programmers".

You see, my class was made up of guys who had been kept back the previous year, who weren't doing particularly well academically, or who had (like me) transferred from other schools. Leftovers, basically. And the attitude towards us was, more or less, get them out of the school as quickly as possible (many of my class did, in fact, leave our high school before graduating, opting for a skills based college education).

He had an opportunity to encourage us, and he used it as an opportunity to make us feel like shit about ourselves and our prospects.

I was 13. Think about that. What kind of grown man goes out of their way to try to crush the dreams of a 13 year old?

If, rather than using this as an opportunity to belittle me, he had asked me what problem I was having with the binary conversion, the class may have turned out quite differently.

I had, in fact, by that time taught myself to program by reading the QBasic help manuals that came with MS-DOS 5 and the source of gorillas.bas and nibbles.bas.

As for binary (and remember, we had no internet in these days) I'd actually worked it out from reading about it in Wyndham's novel, Chocky. I'd never actually seen a lot of binary written out and couldn't remember if it was supposed to be written left to right or right to left. *edit: to be clear, the passage explaining binary in the novel is very clear, but I went out of my way to understand it -- I must have been around 8 at the time -- explaining it to anyone who would listen. It was a revelation to me*

Now, I don't think that teaching myself to program or learning binary from an SF novel (both before I was 11) is something that is particularly exceptional - I think that these are quite common experiences shared by many people, people who often become programmers.

 My experience in high school coloured my relationship to formal learning for a long time, and this kind of experience was repeated a number of times during my high school career.

I wish I could ask this teacher what he was thinking. Was he just having a bad day and so felt like lashing out at the "stupid class"? Did he realize that this episode may have actually discouraged learning in kids that could have, with even a little effort, a little guidance, flourished?

Wednesday, December 2, 2015

34 in review.

I've been inspired by Michael Fogus' year-in-review posts, and thought that instead of doing it at the end of the calendar year, I'd do it on or around my birthday to try and get a sense of what I've managed to accomplish (if anything).

Favorite technical books discovered

The two technical books that have most impacted me this year have both been related to high-quality MOOCs.

The first is Nisan and Schocken's "The Elements of Computing Systems" (and the associated MOOC on Coursera). This book, and the associated course, was one of the greatest learning experiences I've ever had in computing. Essentially they take you through "building" (virtually) a complete computing system from logic gates through to an operating system. Part 1 takes you through about half of the book, through to building an assembler. I'll be taking Part 2 next year when it's released.
The MOOC is a really powerful supplement to the book. I can't recommend it highly enough.

The second is the monumental "Concepts, Techniques, and Models of Computer Programming" by Van Roy and Haridi. This was linked to the EdX courses by Van Roy, where he covers about 50% of the text (and, as such, I've only read about half of the book). Still, what I have read has really changed my perspective on programming languages. As much as I dislike actually coding in Oz and the Mozart environment (the entire book is built around the language), I appreciate what it does. I've come away from the two-part series of courses with a much, much deeper appreciation of the ideas regarding (and my relative ignorance of) programming languages.

In terms of philosophy, my favourite read of the year was probably Peter Godfrey-Smith's "Philosophy of Biology".

Favorite fiction

Philip Roth's The Human Stain was, hands down, the best piece of fiction I read this year with Ishiguro's The Buried Giant a distant second. 

I read far too little fiction in the last year. I'll remedy that this year.


This year has been difficult in terms of my philosophical work. My actual work (see below) has been pretty busy, and so finding time for my thesis has been hard. However, I managed to hit a couple of milestones. I wrote and defended my thesis proposal. I wrote up and presented a paper on representation and choice. Finally, the chapter I've been working on is sitting at around 20 000 words at the moment. Including the numerous rewrites I've done, I've written at least twice that (somewhere around 50-60K words) - so there is progress.

Next year I'm pivoting to the evolutionary perspective on my work, this will in turn help me focus the sprawl that the first chapter currently is. 

I have read a lot of papers this year and although they're in the service of philosophy, very few of them were actually philosophy papers. We'll track this category a little closer next year.


This year I went out on my own again and co-founded a small consultancy. We're presently focused on professional PHP development and will be for the medium term. 
This has obviously taken up a lot of time. 

Still, I've managed to do a couple of cool things. I went to my first Code Retreat, an experience which was invaluable. I'll most definitely be attending again. 

I also built an InsectBot, which was a bunch of fun, as well as hacked around on my Arduino.

Further, I wrote and deployed my first Laravel app this year. I don't think I've had a better experience learning or working with a framework. Taylor Otwell is a champ.


I've had two publications this year. 

First, my story "A Darker Utility" appeared in New Contrast.
Then another of my stories, "Islands", appeared at The Kalahari Review.

Both are what you might consider "literary" fiction, and I'm quite proud of both pieces despite their flaws.

My story "After the Reception" also won first prize in the South African Writers' Circle SF competition, which was a nice surprise.

Plans for 35

I have a couple of things I want to achieve, in no particular order.

Firstly, I want to finish chapter 2 of my thesis. This'll get me past the halfway mark to my PhD. Totally doable.

Secondly, I'd really like to publish a philosophy paper. I'm working on a co-authored paper with my supervisor that is pretty well developed. So hopefully that'll happen soon.

Thirdly, I want to learn a new programming language. This year I learned Oz (barely) and relearned PHP. I'm thinking that I may want to give Clojure and Ruby a go. We'll see as the year progresses.

Fourthly, I want to write at least 60K words on my current WIP.

Finally, I really, really want to do more work contributing to FOSS. 

Saturday, April 18, 2015

Natural examples of Subsumption Architecture-like control systems (part 1?)

In the following blog post I want to do two things.

Firstly, I want to give a brief overview of the so-called Subsumption Architecture -- a kind of robot control system developed in the 80s, primarily by the roboticist Rodney Brooks. I give this short introduction so that I won't have to do it again in this blog. It's going to come up again, and I'd like to point to this post rather than reiterate it (I could also point you to his papers, which are extremely readable).

Secondly, I want to use these posts as repositories for biological examples of Subsumption Architecture-like control systems.

Subsumption Architecture in brief


The SA is a bottom-up approach to designing robotic control systems that proceeds by the accretion of complete behaviour-producing layers. Each layer is complete in the sense that it is a total “activity producing subsystem” that takes an agent from perception through to action. During the design and implementation of these layers their adequacy is tested in the real world (the actual environment that the robot will inhabit). There are two distinguishing features of the Subsumption Architecture.

First, unlike traditional models of intelligent action, where all information from sensors filters through to a central processor (after being converted into suitably neutral representations of the world) and is used to plan a series of actions, each module that makes up a behaviour-producing layer is potentially connected directly to both sensors and actuators. The SA is also explicitly anti-representational, in the sense that there are no centrally stored representations (think data structures) shared between the behaviour-producing layers. This aspect of the SA is the topic for a follow-up post.

Secondly, the behaviour-producing layers are arranged into a hierarchy (i.e. the “Subsumption hierarchy”). The lowest levels implement the most basic (but complete) behaviours, such as basic movement and collision avoidance, and each subsequent layer that is added will co-opt, subsume, or, often, simply disable the behavioural competencies provided by the lower levels. The robot's entire behavioural repertoire is not “encoded” in any one place but rather emerges out of the interaction of several different activity-producing layers. The communication between layers is one-way, from the higher layers down to the lower, and consists primarily of simple signaling (on, off, possibly a short bus that is able to represent a handful of numbers/states).

The examples

Kent Berridge, in his masterful review paper "Motivation Concepts in Behavioral Neuroscience" (Berridge, 2004), describes Dethier's "Hungry Fly".
A fly has two eating reflexes. When a fly lands on a food source, an excitatory reflex engages eating behaviour. When its stomach is full, a second reflex is engaged that inhibits eating behaviour.
Dethier was actually able to disable this second, inhibitory, reflex by severing a sensory nerve attached to the fly's gut. When severed, the excitatory reflex was never overridden by the inhibitory reflex (i.e. the inhibitory reflex was never engaged) and so the fly would eat until its stomach burst.

The excitatory and inhibitory reflexes involved in the fly's eating behaviour have clear similarities to the kind of hierarchical overrides that drive action selection in the Subsumption Architecture. Lower level behavioural modules are engaged until overridden by a higher layer -- by being, either, co-opted or simply disabled.
Think, for example, of a simple robot whose simplest behaviour is to drive forward. Simply drive forward. This isn't a particularly useful behavioural profile because, unless this robot lives on an infinite and smooth plane (or sphere, or whatever, as long as there are no obstacles), this robot is going to run into some obstacle and get stuck, forever.
Now we add a second behavioural layer to this robot - a small sensor along its front that can sense when the robot butts up against an obstacle. This behavioural layer then disables the first layer so that the robot doesn't keep blindly racing forward, and then -- say -- engages only some of the robot's wheels (or legs etc.) so that it turns in a random direction until the sensor no longer detects an obstacle. The second layer is then disengaged and the first layer is free to make the robot run off straight until another obstacle is detected.
If some techno-Dethier came and severed the wires running from our robot's sensor, it would run into an obstacle and be stuck forever (not quite as grim a fate as our "Hungry fly's" exploding stomach)1.
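To sketch how that kind of layered override might look in code (a toy illustration of my own, not Brooks's actual formalism), here's the two-layer robot as a prioritized list of behaviour functions, where a higher layer either proposes an action or defers to the layer below:

```javascript
// Layers ordered highest-priority first. Each layer inspects the
// sensors and either returns an action or null (defer to lower layer).
const layers = [
  // Layer 1: obstacle avoidance -- subsumes layer 0 when the bumper fires.
  (sensors) => (sensors.bumper ? "turn-right" : null),
  // Layer 0: the basic behaviour -- just drive forward.
  () => "forward",
];

function act(sensors) {
  for (const layer of layers) {
    const action = layer(sensors);
    if (action !== null) return action; // higher layer wins
  }
  return "idle";
}

console.log(act({ bumper: false })); // "forward"
console.log(act({ bumper: true }));  // "turn-right"
```

Severing the "sensory nerve" here amounts to the bumper always reading false: layer 1 never fires, and the robot drives forward into its obstacle forever.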

The housefly's flying behaviour is also reminiscent of the Subsumption Architecture (Clark, 1997), in that there is no central control mechanism that "chooses" flying behaviour out of a range of possible behaviours. Essentially, there are sensors in the fly's feet that are connected directly to the wings. When the fly's feet are no longer in contact with some surface, the wings begin to flap.

Another interesting example is predator evasion in noctuid moths (McFarland and Bösser, 1993). Noctuids are preyed upon by bats, and the moths' auditory system is exquisitely attuned to their predators' echolocation system, being able to sense distance, direction, and whether the bat is approaching. If the bat approaches within a few meters of the moth, nerve cells from the auditory system send its wing muscles into spasm, causing the moth to fly erratically and drop towards the ground, hopefully evading the bat. Here we again have a fairly complex behaviour being triggered directly by environmental stimuli -- at no point was this behaviour selected by a central control mechanism, and there was no internal representation of the external world at all. Further, we see the typical hierarchical overriding that's the defining characteristic of Subsumption Architecture action selection.

These examples can be easily multiplied (see, for example, David Spurrett's discussion of the Sea Slug's eating habits) and the fact that we're able to point to natural control systems that exhibit similarities to the Subsumption Architecture is evidence that Brooks et al were on the right track, at least for certain classes of behaviour.
The real question is how far are we able to go with the Subsumption Architecture? Can we get from "insect level" intelligence all the way to human level intelligence simply by scaling the Subsumption Architecture? I don't think that there's yet a convincing argument against the possibility, but I think it's unlikely (although I'll leave that question for a future post, and a paper, and a thesis).


1. It's important to note here that what I'm suggesting isn't that the competing reflexes are in fact implemented by behavioural modules overriding each other, just that the way in which the two behaviours are engaged -- landing on food, fullness of gut -- and interact -- the inhibitory reflex "switching off" the excitatory reflex -- is reminiscent of the Subsumption Architecture. One could easily imagine this behaviour being implemented in an SA.


Berridge, Kent C. "Motivation concepts in behavioral neuroscience." Physiology & behavior 81.2 (2004): 179-209.

Brooks, Rodney A. "Intelligence without representation." Artificial intelligence 47.1 (1991): 139-159.

Clark, Andy. Being there: Putting brain, body, and world together again. MIT Press, 1997.

McFarland, David, and Tom Bösser. Intelligent behavior in animals and robots. MIT Press, 1993.

Thursday, March 26, 2015

Privilege. Framing. Rhodes must fall.

Here are two ways of telling the same story.

Version 1.

When I left school, my parents couldn't afford to send me to university - and, anyway, I didn't get matric exemption (of course, my parents couldn't afford to send me to the best schools, so this isn't really so surprising). 

I really valued education, though. I realized that being educated was something important. So I registered for my undergraduate degree in Philosophy through UNISA.
Every day, before and after work, I would study. I would get into work at least an hour early, read, and then get going. Eight or nine hours later, I'd leave for home where I'd spend the rest of the evening studying. I'd average about 3.5 hours university work in the week, and about 10 over the weekend. I had no social life -- I made the choice to give it up.
I graduated in about 4 years with my BA, distinctions in both majors. 
I repeated this with my honours, again in philosophy, where I graduated in 2 years, again cum laude.
I then started a company, so I began working far harder with much longer hours. Even so, I was able to finish my Master's in cognitive science in 2 years, again cum laude.

I did this all on my own steam, on my own dime, without anyone's help, working full time and only ever taking time off to actually write exams (I've never taken off whole weeks of "study leave").

Version 2.

I messed around in high school. Although we didn't have a lot of money, I never really wanted for anything. If I needed books, clothes, musical instruments, whatever, my dad would make a plan.
I never really worked that hard in high school because I spent most of my time either dreaming about girls, playing basketball, or playing in bands.

At the end of matric, when a not insignificant number of my friends went off to university (where they received tuition discounts because of their marks), I couldn't go; I didn't get exemption.
I knew, though, that if I'd worked hard there was a good chance I could have received a bursary -- hell, a good few of my friends got them. Still, I wasn't much worried - I knew a lot about computers; I mean, I'd been messing around with them since I was about 11 years old, when my dad bought me an old XT with a monochrome monitor. It wasn't the best -- as I mentioned, we weren't the richest people we knew -- but it was enough to get me started, so that by the time I was in grade 12, I knew far more about computers than most people my age (and most people in general).

Off the back of my knowledge of computers, I managed to find jobs pretty easily. When I'd go into interviews, nobody questioned my intelligence. The fact that I'm English probably helped here, though, because I've seen native English speakers say the dumbest shit and get away with it, and people speaking the most profound truths written off because of their accent. Weird, no?

Now that I had a job, I could afford to buy stuff - I still lived at home, and I was only expected to contribute a token amount of cash for rent, so I could buy clothes, eat takeout, party. The usual. 
Around this time I bought a whole load of books with my spare cash. One of them was Hofstadter's "Gödel, Escher, Bach: An Eternal Golden Braid". I read it and had a kind of intellectual revelation. I realized that there was all of this fascinating stuff that I knew nothing about, and that I had probably missed out on because I chose to mess around and not go to university. I could have, you know. My teachers told me. Dumbass me.

What was cool, though, was that now that I was working, I could afford to spend some of my money (obviously not all of it - how would I afford Steers???) on studying. UNISA was relatively cheap back then, and so I signed up -- did an access programme -- and was a registered BA student.

My mom had given me her old car, a Fiat Uno Fire, so I could drive through to work really early to study. No need to waste any time on buses or taxis. The roads are completely empty if you get up early. 

Sure, I'd work 9 hours a day, but I worked in an air-conditioned office, so while it was a mission getting home and studying, I wasn't too wiped. I could even spend time hanging out with my (then) girlfriend (now wife) watching television at her mom's house. 

I got my degree, eventually, and I loved the stuff I studied so much that I registered for two more degrees. It's not my job, but it's something of a passion.  

Your point

Well, if you can't see the point, I don't think I can help you. But I think it's important to remember how easy it is to frame your own life story in a way that obscures the role of privilege. Both of these are brief, but true, accounts of a large part of my 20s. And yet framing it as I do in the first version makes it seem as though I triumphed in the face of adversity, where the truth is that I've not really experienced any adversity other than my own stupid distractions and bad decisions when I was in high school. And even after that act of self-sabotage it was relatively easy for me to get myself back on course. Sure, I couldn't really be a full time student, but the kind of work I do leaves me with a fair amount of free time, so it wasn't really a problem. You can read a lot of books if your day job doesn't involve backbreaking manual labor and you're willing to give up a couple hours sleep.

This is a story of privilege. My privilege. As a white male my privilege is multifaceted, and quite difficult to grasp all at once. Of course, it's not my fault that I've been privileged -- but that's not the point. The point is that I've benefited in ways that people unlike me have not, and would not have unless they were like me. When you're privileged it's easy to discount this fact.

I encourage everyone to consider how they frame their own life's narrative, how they present it to themselves and others. What are the details you leave out? Where has it been easier for you than it would have been for someone with an accent, with different hair, with a different sex/gender, skin of a different colour, of a different sexual orientation, someone who cannot walk, someone who cannot hear, or speak at all?

Human beings are not the rational animal. They're the post-rationalizing animal. We're not that great at reasoning, but we're bloody brilliant at making up stories after the fact. This super power means that it's really, really easy for us to plaster over our privilege. We need to resist this urge.

Monday, February 23, 2015

"A darker utility" and some thoughts about fiction.

Jeremy Bentham's Corpse.
A Darker Utility indeed.
I'm happy to report that my short story "A darker utility" will be published in the next edition of New Contrast, South Africa's oldest literary journal still in print.

This is an interesting pub for a couple of reasons.

The first is that this will be the first "literary" piece that I've had published.

The other cool thing about this is that it's the first story I've written that's actually inspired (broadly) by philosophy. I've been very careful to try and keep my study of philosophy out of my writing. I think that too much philosophical content misses the whole point of imaginative literature. If there is philosophy it should be implicit, it should be dramatized; it shouldn't be the centerpiece of the work. "Sophie's World" might be a bestseller, but god damn, as a work of fiction I'm happy to say that it's a major failure. "Zen and the Art of Motorcycle Maintenance" has a far better integration of philosophical ideas, because it feels like someone thinking. It's thought, dramatized. That's why it's so much more successful than the follow-up, "Lila", which often reads more as an exposition of an existing philosophy than as thought coming into existence. In this regard, Descartes' "Meditations" is probably a more successful dramatization of thought in process than is Pirsig's second book.

In this case, though, my story was inspired by some work in moral psychology on psychopaths (see, for instance, Nichols' "How Psychopaths Threaten Moral Rationalism: Is It Irrational to Be Amoral?") - so the title suggests that the story is about the bad parts of utilitarian thinking (even though, strictly speaking, my main character isn't really a utilitarian - she started out as one, and at the end I just wanted to keep the title).

I have a lot that I'd like to say about the relationship between literature and philosophy, and -- in particular -- about philosophy in literature. Perhaps in another post, though.

Friday, February 20, 2015

Slightly Buggy CSS Lava Lamp

So here's a slightly buggy demo I wrote up after I read Chris Coyier's short tut on shape blobbing. I thought it was a cool effect and that it could probably be used to achieve something that looked a little like a Lava Lamp.

Essentially, I generate a bunch of square divs made to look like circles (by setting border-radius) and make them randomly move either up or down at varying speeds.
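If it helps, here's roughly the movement logic with the DOM bits left out (the names are mine, and this is a sketch rather than the pen's actual code):

```javascript
// Each blob is a square div with border-radius: 50%, drifting up or
// down at its own speed. y is in % of the container's height.
function makeBlob(y, speed) {
  return { y, speed };
}

// Advance one animation tick, wrapping past the top or bottom so the
// blob re-enters from the other side.
function step(blob, height = 100) {
  let y = blob.y + blob.speed;
  if (y > height) y -= height;
  if (y < 0) y += height;
  return { ...blob, y };
}

// In the browser you'd run this per frame with requestAnimationFrame,
// applying something like:
//   el.style.transform = `translateY(${blob.y}%)`;
// and put the blur/contrast filter from Coyier's gooey-blob technique
// on the container so overlapping circles merge like wax.

console.log(step(makeBlob(99, 2)).y); // 1
```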

I should probably spend a little time cleaning up the code -- I suppose that's what you get when you code while watching Project Runway :P

See the Pen wBmbee by Blaize Kaye (@bomoko) on CodePen.

Sunday, February 15, 2015

Is "Learn to code by coding" a useful answer for someone asking how to code?

Earlier today I asked a question of some of my friends on Facebook, the gist of which was "Which practices do you find useful in your pursuit of becoming a better programmer?" I was asking about code katas in particular, but the question was quite general.

I think the way I phrased the question must have -- to people who don't know that I am a professional programmer -- suggested that I was asking how I could learn to program, rather than inquiring of my peers how they hone their craft.

Someone, who I assume interpreted my question as if I were a beginner, answered
"The best way to learn to code is to code. Deep end"
In one sense, this is very good advice. Those of us who are able to code generally learned how to do it through hundreds of hours of effort. This kind of advice resonates with those of us who have crossed that particular chasm -- we nod gravely and acknowledge that, indeed, there is no royal road to code.

But then I got thinking about how I would have taken this answer if I weren't someone with the experience to recognize just what is being expressed in this nugget of truth. What would I have thought if someone had answered a genuine request for advice from a beginner with this succinct expression of what all of us programmers know to be true?

I think I would have been baffled. I think I would have been no better off than if I had received no answer at all, because to people without the shared experience of learning how to program, who haven't already spent the hundreds of hours required to grasp how to solve problems with computational solutions, this kind of answer carries very little information.

I'm not denying the truth of the statement, I'm denying its usefulness if it stands without qualification.

What a beginner needs more than Nietzschean brevity is a clearer guide to getting started. I've taught several programming classes, and have mentored several juniors, and one thing is certain - learning to program is often an overwhelming prospect for people who haven't yet learned how. Beginners need examples, they need explanations through books and (good) lectures, they need people who can guide them in the practice of thinking like a programmer.
What is true, though, is that at every step of their learning one would want to include a very hefty portion of programming practice. A series of increasingly complex exercises and projects, for example.

Furthermore, even for experienced programmers, saying that the best way to learn to code is simply to code is misleading. One can spend several years solving the same kinds of programming tasks and not improve in any substantive way. So simply coding is not enough to become better once you've already got a basic competency in the discipline. What is required is some kind of deliberate practice along several dimensions, as well as a steady dose of theory. And, to his credit, this is exactly the kind of thing the person quoted above suggested in a later comment once we had cleared up what I was asking.

To answer the question in the title: yes, "Learn to code by coding" is indeed a useful piece of advice, but it should never simply be left at that. For the beginner, it should come with a qualification: "Read X, Y, and Z, but never forget that actually coding is the most important thing". For the more experienced: "You should learn to code by coding in this way".

Tuesday, January 27, 2015

Meditation for Deep Space

So you’ve decided to establish a meditation practice?


Many veteran deep-space pilots have found that meditating for as few as ten minutes at the beginning of every wake-cycle can bestow numerous physical, mental, and spiritual benefits.

Whether you aim to experience ultimate enlightenment, reduce stress levels, lower your blood pressure, or just to carve out a little “me time” in your crazy schedule, this short guide will help you take your first steps to attaining your goals.

1 - When beginning your practice it will be important to find a quiet place where you can sit without any distractions. This is not always possible if you have been assigned a co-pilot, as is typically the case with multi-year deep-space missions. Persevere! Your bunk, the wash-area, or an unused corner of the storage bay can serve nicely.

2 - While the traditional position for meditation -- the so-called “full lotus” -- has one sitting with legs folded so that the tops of the feet rest on the thighs of the opposite legs, low-to-no-gravity makes maintaining this kind of position difficult. Many deep-space meditators have found success with simply pulling their legs in towards their chest and letting themselves float. This is sometimes called “unopened lotus” position.

3 - The mouth should be kept closed, with the tip of the tongue pressing lightly against the back of the front teeth, while the eyes should be kept half open. However, if you find your co-pilot is unable to stop wandering into your field of vision, thereby breaking your concentration and generally agitating the mind, you may close your eyes completely.

4 - It is recommended that those new to meditation begin with a technique called “breath counting”. One breathes naturally and counts each breath at the apex of the inhalation, counting from one to ten and then cycling back to one.

5 - When counting breaths, if you find yourself distracted by thoughts or sounds -- perhaps your co-pilot interrupts you with an unimportant question, or continually sucks at his teeth, or scrapes his god-damned chair across the bridge’s floor again and again and it cuts through you like nails across a chalk-board and you’re already at the end of your patience otherwise you wouldn't need to meditate in the first place -- just let go of the thought, or distraction, and begin counting from one again.

6 - (Optional) At the beginning of their meditation practice, some people find themselves needing to take vigorous action in order to establish the requisite peace and quiet. On closing the airlock many of these meditators report hearing a deep ringing through the ship’s hull. This is natural and should pass quickly. In order not to disturb one’s equanimity it is best not to think of this sound as, for instance, the thrashings of a man or woman banging frantically on the ship’s exterior, but rather as the universe ringing out in celebration of one achieving some measure of peace.

7 - Count your breath for at least ten minutes, or longer if you feel you can manage it. At the end of your session, take three deep, sharp breaths. Your mind and body should now both feel relaxed and ready for a full wake-cycle.