IBM purges sections of supercomputer Watson's memory after it begins swearing.

Excuse me while I laugh until I injure myself.

In a way, this is tremendously impressive. The guy who covered it for io9 summed it up rather well when he commented that if he'd been in the room when Watson started to drop the word "bullshit", his "hair would have stood on end". Profanity is one of the weirdest parts of language, and it is particularly weird in English, where the word 'fuck' can be folded, spindled and mutilated into almost any grammatical function you care to try. One of my best party tricks is pinpointing who isn't a native English speaker (and what their first language is), and an easy way to get your fingernails into cracking a problem like that is to watch for who swears weirdly, and in what way they're going slightly caterwampus. You can even use it to distinguish English dialects in writing, where an accent can't be heard -- although the test tends to fail on geeks and other pointy-headed types who grew up watching a lot of British TV, and who will actually use the phrase 'fucking hell' instead of, or in addition to, the more characteristically American 'goddamn it'.

The general geek reaction to hearing a computer swear correctly is to light up like a Christmas tree and run around yelping TURING TEST! TURING TEST! While this is indeed kind of exciting, there is much made of the Turing test that probably shouldn't be -- the impression I get is that most people have had the test explained to them as a brief abstract via some incredibly nerdy game of Telephone. They treat it as the gold standard of artificial intelligence work. This is probably a mistake, because if you dig up and read the original paper (published 1950 in Mind, which is a philosophy journal; you can skip the section on "universality of digital computers" if you accept the premise that computers can do anything you jolly well program them to do, and read the paper safely math-free), it becomes blindingly obvious that Turing is trying not to laugh himself into a coma while typing it out.

Turing's papers were often like that, because Turing was often like that. When he and his colleagues were designing the first stored-program computers just after the war, what passed for storage in those days was a setup called 'delay line memory'. Basically, you recorded information by passing it into a long tube of liquid as a series of pressure pulses, and it was effectively stored for however long it took the pulses to travel down the length of the tube, at which point you could pick them up at the output and use them or feed them back to the beginning if you wanted to keep them longer. There was some discussion about what liquid would be best to use in these, and Turing's suggestion -- which was both hilarious and perfectly workable as an answer -- was to use gin, on the grounds that it was cheap, they could get it anywhere, and nobody could possibly guess from a purchase order for booze what the hell kind of top-secret things they were doing in there. The brass were less amused and went with mercury, but I still think this is one of the funniest correct engineering solutions I've run into.
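If you want a feel for how that works, here's a toy sketch in Python -- purely my own illustration, nothing period-accurate about it -- that models the tube as a fixed-length queue of bits, where the only way to keep your data is to keep catching it at the output and flinging it back in at the input:

```python
from collections import deque

class DelayLine:
    """Toy recirculating delay line: a bit takes `length` ticks to travel
    the 'tube', emerges at the output, and is either replaced or fed back in."""

    def __init__(self, length):
        # The tube starts out full of silence (zeros).
        self.tube = deque([0] * length, maxlen=length)

    def tick(self, write_bit=None):
        """One pulse-time: the oldest bit reaches the end of the tube.
        Pass write_bit to record new data; pass nothing to recirculate,
        which is the only way the stored pattern survives."""
        out = self.tube[0]
        self.tube.append(out if write_bit is None else write_bit)
        return out

line = DelayLine(8)
for bit in [1, 0, 1, 1, 0, 0, 1, 0]:       # write a byte into the tube
    line.tick(bit)
print([line.tick() for _ in range(8)])      # read it back: [1, 0, 1, 1, 0, 0, 1, 0]
```

The data only ever exists in flight, which is why the choice of liquid mattered at all: the tube is the memory, and mercury -- or gin -- is just what the bits have to swim through.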

Turing was a Cambridge alumnus and somewhat whimsical, and the military ran afoul of this more than once. He was frequently allowed to get away with it, inasmuch as he was Alan Turing and they needed him. He decided, at one point, that he needed to learn how to shoot. (I suppose he was bored -- he'd already figured out most of knitting on his own, although another Bletchley mathematician, a lady named Joan Clarke, had to show him how to finish off the fingers on gloves.) He figured the easiest way to do this would be to sign up for the Home Guard, except that once he'd finished qualifying with a firearm, he simply stopped turning up for duty. The CO hied himself to the bosses at Bletchley to complain about this, whereupon the Bletchley people sighed and pointed out that on their enlistment forms, there was a small blank next to a statement that said something like, "I have read this whole thing and I agree to abide by it forevermore." Everyone else in history had done what normal people understand the form is really asking for, and written YES on this blank; Turing, being not only a logician, but a logician who had no intention of actually being indentured to the British Army, had quite sensibly written NO. The Bletchley people advised the Home Guard officer that it was really in his best interest to just let the matter drop.

Although he is best known for being a mathematician and the ur-compsci geek, the Turing Test paper for Mind was neither the first nor the last thing he wrote for a non-math audience. Turing always had a peculiar fascination with wildflowers, especially daisies -- the frontispiece to one of the volumes of his collected papers is a sketch his mother made, of young Alan wandering away from the field hockey game he's supposed to be participating in, his stick dragging forgotten behind him, to scrutinize a daisy growing at the other edge of the green -- and late in his life took to applying his favorite calculus to working out how nature gets the petals to grow in such a neatly cyclical fashion around the center of the bloom.
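Turing's own attack on the problem ran through his morphogenesis work, which is well past the scope of a blog aside; but if you want to see how little machinery it takes to get florets packing neatly around a center, here's a toy Python sketch of Vogel's golden-angle model -- a much later and much simpler rule of thumb, not Turing's math:

```python
import math

# Vogel's model of a flower head: floret n sits at angle n * (golden angle)
# and radius proportional to sqrt(n), which fills the bloom in the familiar
# interlocking spirals. Illustration only -- not Turing's own approach.
GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))   # roughly 137.5 degrees

def floret_positions(count, scale=1.0):
    """Return (x, y) coordinates for the first `count` florets."""
    points = []
    for n in range(count):
        r = scale * math.sqrt(n)
        theta = n * GOLDEN_ANGLE
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

for x, y in floret_positions(6):
    print(f"({x:+.2f}, {y:+.2f})")
```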

The point is, Turing was not particularly dry and stuffy and neither was his thinking about AI. The Turing test is not actually that difficult to pass -- exactly how easy depends on how low your standards for humanity have fallen. I've personally witnessed a very rudimentary AI pass the test with flying colors, albeit the thing did so by pissing absolutely everyone off.

I was in this class, see. The class where I met Moggie, in fact. It was an anthropology class run by a trio of professors whose specialty was essentially investigating whether they could get a load of students to play a tabletop sim (seriously, they called it "SolSys Simulation") without any rulebooks, guidance, or even a particularly good chain of command. Here's a paper written about it by one of the participating instructors from another school. It was fascinating and frustrating and also incidentally what indirectly started me modeling, plus we drove out to a conference at NASA Ames to talk about it with two cars, seven people, eleven bottles of hard liquor and nobody brought any food, and also Moggie got completely wasted there for the first time in her life and ended up sitting through part of the lecture the next day on the floor in front of her chair with her head cradled in her hands, BUT none of that actually has to do with the story I'm telling you now, which is about the non-classroom portion of the course.

The electronic component was handled via MUD -- which took up server space stolen from DragonMUD, as Jopsy was also a graduate of the SolSys Sim -- and we were required to spend quite a bit of time on there in order to complete the class requirements. A number of alumni from previous years were also allowed to hang around the place, so we ran into weird people all the time. One day, a user named 'dogcow', after the simple-minded mascot for the old Mac printer drivers, started showing up. He was aggravating. The rest of us knew enough to not interrupt other people when they were trying to build their chunks of the space station, and we would have private conversations about IRL stuff all the time while doing it, but dogcow would not drop character. And as his character was an idiot, we all wanted to hunt him down and punch him for it. It took about two or three weeks and me taking a stab at some ASCII-only Canadian French to find out that one of the other alumni was actually dogcow's owner -- it was a bot pretending to be a human pretending to be a moron. Very convincingly.

The point is, Watson learning how to swear a blue streak might mean it passes the Turing test, but that's a lot less important than the fact that it creeped people the hell out. That means its behavior passed into the uncanny valley, a much more significant thing for human-computer interfaces. A spot in the valley means that it's striking us humans as like one of us but not one of us. We have not accidentally created a thing in our own image, so much as we have created a thing which is starting to create an independent image of itself. One of these days, someone's going to do that to a computer with a much less dignified official position in the world, and they're not going to purge the potty-mouth backtalk from its memory banks.
