posted July 20 2009
the man who knew too much
Alan Turing is the most important and influential scientist you’ve never heard of. You’ve perhaps heard the term “Turing test” come up in discussions of artificial intelligence, but this represents a minuscule slice of the man’s work, and an often misremembered one at that. At the start of his career, Turing answered what was thought by many to be an unanswerable question in pure mathematics. To solve the problem, Turing took an imaginative leap that, incidentally, invented the modern computer. During World War II, Turing was the mastermind behind Britain’s incredible efforts to crack the German Enigma code, and in this capacity probably shortened the war by several years. It was only in his post-war career that he became interested in artificial intelligence (probably the natural evolution of his interest in symbolic logic, which lies at the heart of the answer to the aforementioned formerly-unanswerable question).
What most people don’t know about Turing, even many of the academics who speak of him as the Founding Father of This or That in the chiseled-in-stone intonation reserved for Darwin, Newton, Einstein, and Obi-Wan Kenobi, is that he was gay. Once the British government learned of his homosexuality, at the time illegal in Britain, they put him on a regimen of psychoanalysis and estrogen injections. Unsurprisingly, Turing killed himself shortly thereafter by biting into a cyanide-laced apple. If the reader is interested, Wikipedia, as always, has more.
I learned most of these things by reading David Leavitt’s The Man Who Knew Too Much: Alan Turing and the Invention of the Computer. I had wanted to learn more about Turing as a man instead of a collection of ideas, but the tone of the book is far more academic than I had anticipated, with Leavitt walking us through the often highly technical highlights of Turing’s brilliant work. This book is part of a “Great Discoveries” series, and I suppose it’s my own fault for expecting the wrong things out of it. In my defense, Leavitt’s prologue chapter makes it seem like the emphasis is going to be on the personal. His thesis, if a biography can have such a thing, is that Turing’s work in artificial intelligence was deeply connected to his homosexuality.
Unfortunately, Leavitt’s case for this connection is thoroughly unconvincing. He relies mostly on pop-psychological speculation and comically strained analogies. For example, “As a homosexual, he [Turing] was used to leading a double life; now the closeted engineer in him began to clamor for attention…”
This is particularly hilarious in light of the previous chapter, where Leavitt makes it clear that far from leading a “double life,” Turing was surprisingly honest and nonchalant about his sexuality, seeing it as a non-issue (a daring and even dangerous stance for a man of his time). Leavitt claims that Turing’s homosexuality made him an outsider who was “disinclined to overidentify with larger collectives,” thus giving him the sort of unconventional perspective that allowed him to solve a major mathematical quandary and invent the computer as a byproduct of his thought process. Later in life, as Turing is defending the potential of thinking machines, Leavitt makes a tortured jump and posits that Turing was secretly boosting not for the rights of machines, but for gays.
Leavitt is simply wrong about the influence of Turing’s homosexuality on his academic pursuits. The connection isn’t there, no matter how badly Leavitt would like this to be the case. I’d like to offer an alternative I find much more interesting.
Were Turing growing up today, he would definitely be diagnosed with an autism spectrum disorder. Let’s be clear. I loathe autism’s status as the de facto pediatric diagnosis of the 2000s. Autism diagnoses increased tenfold between 1994 and 2006, and if you’re going to blame it on anything but hypochondria and over-labeling, the burden of proof is officially on you. Still, Alan Turing was a textbook case. As a mathematician, he lived almost entirely in his own head, had a variety of well-documented obsessive-compulsive behaviors, was famously literal-minded (if you used a metaphor around him, he’d take it at face value), and, most importantly, seemed to have little to no concept of social interaction. It’s not that Turing was shy or weird. He fundamentally did not understand how to interact with people in a social context. Like other high-functioning autistics such as Temple Grandin, social subtleties, the little nods and winks that we take for granted every day, went clear over his head. His total lack of social skill made him lonely and greatly hampered what could have been an even more stupendous intellectual career.
This, I believe, is the essence of Turing’s interest in artificial intelligence. The famous Turing test is straightforward. Put a human behind Door Number One. Put a computer behind Door Number Two. An observer in a third room types questions, and the human and computer type their answers back. How do we know that our computer is capable of true artificial intelligence? If it can fool the observer into mis-identifying where the computer is. Turing, in essence, defined a thinking machine as one that was able to engage itself in very accurate social cognition. This makes perfect sense in light of the way Turing lived his life. His mental world was one of pure logic. Out in the real world, he was unable to handle the complexities of social interaction. How perfect, then, that he would try to create a machine that could operate perfectly in the social sphere using pure logic as its foundation. In my opinion, the line from Turing’s mind to Turing’s machine is all too clear.
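The three-room setup described above can be caricatured in a few lines of code. This is only a toy sketch of the protocol, written under my own assumptions; the function and parameter names are invented for illustration and come from no real framework (and certainly not from Turing’s paper).

```python
import random

def run_imitation_game(human_reply, machine_reply, questions, judge):
    """Toy sketch of the Turing test protocol.

    human_reply and machine_reply are functions mapping a question to an
    answer; judge sees an anonymized transcript from doors "A" and "B"
    and must return the door it believes hides the machine. All names
    here are illustrative assumptions, not a standard API.
    """
    # Randomly assign the human and the machine to the two doors.
    doors = {"A": human_reply, "B": machine_reply}
    if random.random() < 0.5:
        doors = {"A": machine_reply, "B": human_reply}

    # The observer types questions; both respondents type answers back.
    transcript = {door: [respond(q) for q in questions]
                  for door, respond in doors.items()}

    guess = judge(transcript)  # judge returns "A" or "B"
    actually_machine = "A" if doors["A"] is machine_reply else "B"

    # The machine "passes" if the observer picks the wrong door.
    return guess != actually_machine
```

The point the sketch makes concrete is that the test is purely behavioral: the judge never sees anything but typed answers, so whatever produces convincing social behavior counts as thinking.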
Turing’s hopes for thinking machines were overambitious, and scientists now tend to focus on building computational models and machines that simulate smaller slices of intelligence, rather than trying to create some kind of domain-general machine that can think just like a human (as the latter has proven to be incredibly difficult). In the realm of social cognition, progress has been cute but somewhat misleading. Attempts to simulate social cognition are only successful when you define “social” very, very narrowly, and it’s a fairly safe bet that Turing would be dismayed with the state of the art.
Turing’s real legacy may be a more philosophical one. In the 1950s he routinely defended the notion of thinking machines against all sorts of religious, artistic, and emotional attacks. His arguments were notable for their elegance and foresight. When confronted with a criticism such as, “A machine will never be able to compose a sonnet or paint a beautiful picture!” Turing might answer, “Well, neither can I, but surely you’d agree that I am still capable of thought?” He even dared to imagine that intelligent machines might prefer to converse among themselves, as so much of human convention would be irrelevant to them. Perhaps most importantly, he emphasized that in order for a machine to be considered intelligent, it cannot be infallible. This idea, unprecedented at the time, prefigures all modern work in computational neuroscience. If I’m building a learning algorithm, I don’t want a machine that simply gets better and better as I feed it new input. I want a machine that has not only human-like successes, but also has human-like failures.
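On the fallibility point above: one standard way cognitive modelers build human-like error into a responder is to replace a hard argmax with a softmax (Luce) choice rule, so the model usually gives the best answer but occasionally makes graded slips, erring most between closely matched options. The sketch below is my own generic illustration of that idea, not anything from Turing or Leavitt.

```python
import math
import random

def luce_choice(scores, temperature=1.0, rng=random):
    """Sample a response index via the softmax (Luce) choice rule.

    An argmax responder is infallible: it always returns the top-scoring
    option. A softmax responder mostly picks the best option but slips
    with a probability that grows as competitors' scores get closer --
    the graded, human-like failure pattern described in the text.
    """
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Inverse-CDF sampling over the choice probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(scores) - 1
```

With scores of [3.0, 1.0], for instance, the rule picks the better option roughly 88% of the time; raising the temperature makes the responder sloppier, lowering it makes it more machine-perfect.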
Turing’s homosexuality did not influence his work. Being gay was a non-issue for him (although unfortunately, it was very much an issue for the British government). Turing’s real motivation comes from his desire to take his greatest strength—logic—and use it to unlock the secrets of social interaction, his greatest weakness. Although Turing’s goal remains elusive, he is one of the most influential thinkers in the history of artificial intelligence research. His philosophy lives on not just in science, but in Spock, Data, and the work of anyone else who ever wondered what it would be like if computers could think.
posted May 21 2009
mega man 9

I have beaten Mega Man 9. Now what am I going to do with the rest of my life?
Saying that Mega Man 9 is difficult is like saying that the surface of the sun is hot. The game is intentionally retro, boiling Mega Man back down to its most basic components. In fact, Mega Man can only do three things in the game. They are:
- Jump
- Shoot
- Die
The game is as simple as it is merciless, and the combination instantly fascinated me. I came late to the Mega Man 9 party. In truth, what really sparked my interest was this excellent essay by Bruce Morrison, in which he explains why a game so unbelievably difficult is ultimately a joy to play, if you have the right mindset. He also touches on the idea of game-based versus internal rewards. In short, players have come to expect that games will reward them for playing well, and game studios are only too happy to oblige. The Achievements systems on the Xbox and PlayStation make the notion explicit, and to a certain extent, ridiculous. Players are rewarded and patted on the back for making even the most rudimentary progress. Picked up the wrench in BioShock? Have a badge. Successfully navigated the perils of the tutorial level? Here, you’ve achieved something.
Mega Man 9 does not partake of this orgy of auto-congratulation. Mega Man’s only reward—the only permanent reminder that you’ve made real progress—comes when you successfully complete an entire level and obtain a Robot Master’s power. Yes, Mega Man 9 does have Achievements, but they clearly take the term very seriously. Defeat each Robot Master in thirty seconds. Now do it in ten. Beat the entire game in under two hours. Never miss a shot. Never get hit. Would you like me to part the Red Sea while I’m at it?
Mega Man 9 feels exactly like a Buddhist enlightenment (bear with me). All is suffering. You die over, and over, and over. Amidst all the pain, you begin to accrue a deep, almost imperceptible inner knowledge. Gradually the illusions of the world around you fall away. That which once seemed impossible becomes nothing to you. You transcend. Plug Man’s infamous vanishing platforms become less an obstacle than an opportunity to demonstrate Jedi-like powers of precognition (fast-forward to about 1:00).
Enough about personal challenge, inner reward, and Buddhism. Mega Man 9 is a great game. The game’s decidedly lo-fi approach masks a surprisingly rich experience. Your options may be limited to jump, shoot, and die, but there’s a great deal of variety embedded in how you do those things. Consider the play style of one Ms. PinkKittyRose, which is very different from my own. Look at her on Wily Fortress Part 1. Unlike me, she barely touches her Robot Master powers. She’s far better than me with the default Mega Buster, and the way she plays is very different because of it. When she used Rush Coil to bypass the magma blasters (around 3:20), my jaw hit the floor. I must have died thirty times getting the jumps exactly right. Like a true Buddhist, she saw that the answer to this problem was no problem.
If you want to see true transcendence, you need look no further than the “speed runners,” brave souls whose sole purpose in life is to complete the game in the shortest possible amount of time. In their quest to beat the clock, these people (and I use the term loosely) have explored the game at the furthest boundaries of creativity and discovered subversive, unexpected treasures. To watch one of these people play the game is to witness, in one particularly specific form, the very length and breadth of human thought. For your information, Mega Man 9 can be played from start to finish in about twenty-three minutes and sixteen seconds. Got some free time? Here’s how you do it. Mesmerizing.
One particular review of the game, written by Sumantra Lahiri, struck me as odd. He writes:
While Mega Man 9 has all of the elements to make a classic 8-Bit game, it some how just misses that 8-Bit perfection of Mega Man 2. For whatever reason, that intrinsic quality of a classic 8-Bit game seems to constantly elude it. In many ways, Mega Man 9 is definitely one of the better Mega Man games in its long storied run, but the truth is this; Mega Man 9 is good, not great. Though to say that Mega Man 9 did not attempt to capture that feeling of 8-Bit perfection would be false.
What does that mean? I think Lahiri’s problem is that he’s been confounded by what I call the Twelve Year Old Effect. The most awesome year of your life is the one in which you, personally, were twelve. The television shows will never be funnier, the ice cream will never be sweeter, and the video games will never be more “8-bit perfect” than they were then. It doesn’t strike me as particularly fair to dock Mega Man 9 points simply because it isn’t Mega Man 2.
Mega Man 9, like all great games, builds a complex experience out of a simple premise. Jump, shoot, die. Repeat until enlightenment is achieved.
posted March 23 2009
make with the bubbles
It seems that I have a thing for random assortments of bubbles. Two wallpapers for your online boxes, made with Processing. Blue and teal versions can be downloaded by clicking the previews above. The beauty of this system is that I really spend most of my time tweaking a range of parameters. Then it’s all just a matter of pressing the “Go” button until I get one that I like.
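The original sketches were written in Processing and their source isn’t shown here, but the workflow described (set a range of parameters, then press “Go” until a pleasing arrangement appears) might look roughly like the Python sketch below, which emits an SVG instead. Every function name, parameter, and default value is my own illustrative assumption.

```python
import random

def make_bubbles(width=1600, height=1000, count=120, seed=None,
                 min_r=10, max_r=90, color="#1b6f8a"):
    """Render a wallpaper of randomly placed, translucent bubbles as SVG.

    The tunable parameters (bubble count, radius range, color) stand in
    for the "range of parameters" mentioned in the post; the seed is the
    "Go" button -- each new seed rolls a fresh arrangement.
    """
    rng = random.Random(seed)
    circles = []
    for _ in range(count):
        x = rng.uniform(0, width)
        y = rng.uniform(0, height)
        r = rng.uniform(min_r, max_r)
        opacity = rng.uniform(0.15, 0.6)  # translucent, overlapping bubbles
        circles.append(
            f'<circle cx="{x:.1f}" cy="{y:.1f}" r="{r:.1f}" '
            f'fill="{color}" fill-opacity="{opacity:.2f}"/>'
        )
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">'
            + "".join(circles) + "</svg>")

# "Pressing Go until I get one I like" is just trying new seeds:
wallpaper = make_bubbles(seed=42)
```

Because the seed fully determines the output, a wallpaper you like can always be regenerated at a different resolution by reusing its seed.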
posted December 3 2008
let's run
Part One
Please take a moment to donate something to Child’s Play, my charity of choice. Buy yourself a shirt, pick out some toys for a hospital near you, or just throw a few bucks in that general direction. You’ll be performing an act of unambiguous good. How often do you get to do that?
Part Two
I am fourteen years old. I’ve been fourteen for less than a month, and I’ve spent a decent bit of the past three weeks being vaguely worried about what’s happening now. What’s happening now is that I’m in the pre-op ward of Overlook Hospital with an IV in my arm, waiting to be rolled into surgery. The surgeon assures me that this is a minor procedure, more like a booster shot than anything else. It will be quick, and I’ll be able to go home later today. He tells me that recovery will be rapid and I should be able to walk back into my junior high within a couple of weeks. This is the same man who performed my other leg surgeries eight years ago. As doctors go, he is a genius and a saint, and I trust him implicitly.
“Are you sure I can’t interest you in a general anesthetic? We have several interesting flavors,” he says.
“No, I really think I’ll be okay on just the local.” The local anesthetic will numb my legs but leave me awake during surgery, accomplishing three things. One, it’s manly. Two, I’m genuinely curious about how an operating room and my favorite surgeon work. Three, waking up from my first major surgery was the single most excruciating moment of my entire life. A person’s life is filled with a million little pains, but believe me, nothing has ever come close to matching that first moment of consciousness, in which I was blind from shock and on fire from the waist down. More than anything, I want to avoid replicating that memory today.
Eventually my number comes up and I’m brought into the operating room. The surgeon asks me if I’d like to listen to anything in particular on the radio. I give him my stock answer, “Anything but country.”
And then suddenly I’m in the recovery ward. No hallucinatory dream sequence, no fade to black, just a sharp cut from that room to this one. I’m surprised and angry to have missed an event that I was so intent on observing, and the anesthetic, whatever it is, has my brain all out of whack, and I’m crying a little, trying to explain that I’m not really upset about anything, and not even that uncomfortable.
Then comes the at-home recovery, in which I relearn how to walk. Again. I gradually progress from complete immobility to slowly dragging myself around on my grandmother’s walker. This period is a highlight reel of vulnerability and embarrassing moments, but suffice to say that things eventually hit a plateau. It has been four weeks, significantly longer than what the surgeon had promised. We schedule a check-up with him.
“Take a few steps for me, Jon.”
I’m in a large room that the surgeon uses for check-ups and evaluations. I take a few tentative steps.
“Alright, now pretend that there are really big rocks here, and take some big steps.”
I do so.
A slight smile catches on his mouth. “Alright,” he says, grabbing my hand, “now let’s run.”
Before I can really think about it, he’s pulling me across his office, and to my surprise, I’m keeping up.
I walked into school two weeks later.
Part Three
I learned after the fact that the expected recovery time for my surgery is not two weeks. The recovery time is better than what would have been possible even ten years prior, but still significantly longer than what my surgeon claimed. He low-balled the recovery time because he knew that if he’d said, “The standard recovery time is eight weeks,” then I’d definitely be recuperating for eight weeks. On the other hand, if I expect to be back to normal in two weeks, my recovery just might go faster.
Health is a state of mind. True, no amount of positive thinking will cure hepatitis, but a person’s state of mind, one’s attitude toward his illness or injury, makes a real difference in the quality and final outcome of healing. By buying video games for sick kids, you’re not just purchasing fun little distractions. You’re brightening the atmosphere of an unpleasant place, and giving young minds something new, fun, and exciting to think about. In a very real sense, you are helping them heal. So, again, Child’s Play.
posted September 17 2008
upstaged
I’m in a doctoral program. It’s a highly specialized road to travel, and there’s no single thing that brought me here. But I can point out one specific person who is most responsible for sending me down this path. Every year for the past five years, I have returned to his class to give a lecture on the basics of perception. It’s a fun, slapdash sort of talk where I get to cherry pick all the most interesting research from cognitive science and make a case before my audience. Kind of like Malcolm Gladwell.
The first time I put the talk together I wondered how I’d fill an hour and twenty minutes. Now I have the opposite problem. Should I maybe split this into two separate lectures? Unfortunately I’ll never have a chance to find out, as the professor in question will likely be retiring at the end of the year.
About halfway through this year’s lecture I digressed to tell an old joke. It begins, “What’s the difference between plagiarism and research?”
“Lipstick,” interjected the professor.
I’m really going to miss him.
(And if you really want to know the difference between plagiarism and research, it’s this: in plagiarism you steal from one person, whereas in research you steal from a lot of people.)


