In the middle of the last century, Brooklyn-born Lila Gleitman was a professor’s wife entitled to enroll in tuition-free courses at Penn. She had just wrapped up her undergraduate career at Antioch College, where she studied English literature—a domain Gleitman says she had considered “the pinnacle of human achievement.”
“I probably considered myself to be at the pinnacle of human promise,” Gleitman says with a laugh.
Her studies at Antioch piqued her interest in classics, which she chose to pursue at the graduate level at Penn. But upon taking a course with classicist and historical linguist Henry Hoenigswald, Gleitman rapidly discovered she wasn’t a classicist—she was a linguist in disguise.
“I can remember the moment,” Gleitman says. “We were reading something, and [Hoenigswald] stopped and said to the class, ‘Alcibiades was the fanciest person in Athens, and everyone else was just wearing his shoes.’ And at the time, with my Brooklyn background, I said, ‘Yeah, yeah, so let’s parse the next sentence.’”
Gleitman realized she was impatient with anything other than language analysis. Hoenigswald referred her to the Linguistics Department, where she went on to carve the beginning of her life’s work parsing sentence by sentence.
Gleitman obtained both her master’s degree and Ph.D. in linguistics at Penn, and next became an assistant professor of linguistics at Swarthmore College. She soon returned to Penn as the William T. Carter Professor of Education and then as a professor of linguistics. In 1990, she was named the Steven L. and Marsha P. Roth Term Professor of Psychology, a position she held until 1994.
Today at 83, Gleitman remains a professor emerita of psychology and linguistics. Her contributions to both of those fields—and her unique role in merging the two—remain unparalleled and have been recognized by organizations that include the National Academy of Sciences, the Association for Psychological Science, the Linguistic Society of America, and the Society of Experimental Psychologists.
Gleitman sat down with the Current to discuss her early days at Penn, her role in pioneering the intersection of psychology and linguistics, her work with one of the world’s most famous computers, how babies aren’t as “ignorant” as one might suspect, and why she’s taking a stand against Judge Judy’s “linguistic authoritarianism.”
Q: Coming from what you’ve called a ‘non-academic background,’ what was your understanding of the study of linguistics as you made the leap between disciplines?
A: I didn’t have a clue what it would be about. And actually, as predicted, I didn’t get into historical linguistics. It’s very hard to become a student of all of the Greek dialects and Latin dialects, unless you started studying those things when you were 4 or 5 years old. So instead I got interested in the theory of grammar. There was a great man at Penn at that time, world-famous linguist Zellig Harris. Rapidly I began to work with Harris, who was duly interested in parsing the next sentence. A strange thing was happening—a revolution was taking place, which I fell into the middle of by sheer good luck. That was at the same time Penn was very involved in developing the first feasible digital computers. But when I got here it was about ‘59 or ‘60, and this movement had started in the mid ‘50s. The founders then went to the RAND Corporation, but the RAND Corporation gave UNIVAC 1 and UNIVAC 2 [commercial computers] to Penn. Harris was a great visionary. He was a mathematician and a linguist, and he foresaw—though dimly—everything that you see today. He foresaw a computational theory of language as a way of actualizing this whole field. And this field [of computational linguistics] exploded, and Penn was at the very pioneering center of it.
Q: At the time, did you realize what you were getting yourself into?
A: Here I was, an escaped English major, and I signed up for this linguistics graduate school, and the professor says to me, ‘Go down to the computer room, and we’ll be doing computing of language.’ To which I probably said, ‘Duh,’ except the word hadn’t been invented yet. [Laughs.] He said, ‘Don’t worry—there’s this young electrical engineer who will teach you to use the computer.’ His name was Aravind Joshi. He was on the engineering side, and I was on the language side as we began, but he too became a computational linguist, and in a sense, we worked together forever. But the whole field did that—it became a synthesis of understanding and describing language in a new way. And almost immediately, people had this vision that the mind was like a digital computer and that language was the paradigm case. All the success at first started coming out of understanding language in this new way. Of course, along with Joshi, there was another great student that was there at this time named Noam Chomsky. He was also, in part, Harris’ student, but in part, everyone’s student. [Laughs.] Their positions diverged over time, but basically, this fundamental notion of a new understanding of language and computation, and the structure of mind all began to emerge at this time, and Penn was this great center. And here I was, wandering into it at just the right time.
Q: What did it take for you to wrap your head around those new concepts your peers were exploring?
A: There was another strand in my life, of course, and that was I was married to a psychologist who did learning theory. There’s a difference in point of view between these fields [psychology and linguistics] in how they approach the study of language. You know English, and you can think of that, and you at least vaguely have an idea of what it means when someone asks you what your native language is, and linguistics seeks to understand that internal knowledge. But then, is it the same thing to know English and to speak English? Well, there’s sort of a difference, because if you notice, there is a process, an activity rather than just a knowledge state. So sometimes when you speak English, you even make crazy mistakes. So instead of ‘language acquisition’ you say, ‘I’ll take an anguage lacquisition course.’ It’s not because you don’t know English. It’s because the actual process of going from the brain to the mouth—there’s machinery there, and it has a process of its own. That’s true of speech, of comprehension, and of learning, and psychologists tend to focus on those kinds of questions. Linguists, at least at the time, asked, ‘What is English? What is the structure?’ To put things into ridiculously oversimplified terms, linguists asked, ‘How come you say, the house is red, instead of, house the is red? How would you describe all of the good sentences of English? What kind of mathematics would describe that knowledge?’ Those questions would never be put in an ‘anguage lacquisition’ course—because that’s not what you know, that’s how you do.
Q: When did you make that transition as a professional psychologist?
A: I was in the car with my 2-year-old sitting next to me—this was before seat belts—and we were going around the corner, and I said, ‘Hold on tight.’ And the 2-year-old says, ‘Isn’t it tightly?’ She didn’t even speak right yet, but she was ‘parsing the next sentence.’ It was observing a child learning a language. This is a real miracle—I mean, it’s a miracle that happens once a minute—but it’s extraordinary because you can see the system, at least in part. The same thing would be happening if I commented how clever you are—you can distribute all the blood to all the cells in your body. That’s something you can do, but it’s really something inside of you that does it; you don’t know any such thing consciously. Knowledge of language is more like that than you might suspect. Starting around 2 or 3 years old, children can begin to consider the contents of their own knowledge.
Q: What factors influence the way children acquire a specific language?
A: What is interesting, to me, in another way than, say, biology, is that language looks different as you move from one community to another. French looks different from English, and you might be able to understand English but not French. Language doesn’t look like biology in which all humans are the same. It’s learned. But in other important ways language is a biologically driven process; you have to be a certain kind of machine to learn it. All the cats and the dogs in the house are hearing English too—but nothing happens. They have a language, given entirely by nature—you don’t have to teach a puppy the difference between a growl and a whimper. So it can’t be that language is completely innate—but it isn’t completely learned either. You can think about language like this: We come to Earth endowed … with some kind of internal programs relevant to something like language. But we have to recover the details from exposure because language is disguised as English, disguised as French, and so forth. So the kid is really partly discovering what he knows from a complex code in which language is hidden.
Q: Is that what you mean when you say, ‘Language is innate, but English isn’t’?
A: People ask me, ‘Is language innate?’ Well, that’s really silly. If it were innate in the sense of the growls and the barks, there wouldn’t be this learning function in which there is a suspicious correlation between being born in France and learning French. If you wanted to investigate language acquisition, you first, or so I thought, have to take these pieces apart and ask, ‘What can you say is given to the organism, and what is he really learning from the outside?’
Q: Is that what you were attempting to do in your studies of language acquisition among blind and deaf children?
A: It’s obvious that language is the relationship between sounds and meanings. What you know about English is, there’s a certain sound—say, ‘dog’—that has a certain meaning. But if you learn French, there isn’t a relation between ‘dog’ and actual dogs, but rather between ‘chien’ and dogs. That is enough to convince us that any particular language is some infinite set of relations between sounds and meanings—not just the words. It’s something about how the words combine into some larger units. How would you look at a case where a child didn’t have access to the same meanings? One thing you could do is look at blind children. Early studies we did involved these deprived populations, and the shocking thing was how little this huge difference in observational opportunity matters for learning. It takes very little imagination to realize how much you’re not getting as a blind learner. They come to understand even sighted terms such as ‘look’ and ‘see’ in ways that are slightly different, but closely related to sighted people’s understanding. If you say, ‘Look up,’ to a blind kid, he puts his hands in the air to explore what’s there. But if you blindfolded a sighted child and said, ‘Look up,’ he’d turn his face, hence his eyes, upward. Both can look and explore the world, but you do it with a different sense organ if you’re blind than if you’re sighted; so they understand the meanings of these words slightly differently. Trying to find out the gaps in the blind child’s knowledge as a consequence of being cut off from the environment, you find out that it hardly matters. You find out very interesting things by studying the actual learners, and from the psychological side—this actual talking and understanding.
Q: What are some examples of those interesting things you learned from studying actual children as opposed to theory?
A: If you listen to little children speaking, they seem to leave out some of the words. So they say, ‘Daddy shoe.’ Or, ‘Go zoo.’ So you might say, ‘Well that’s pretty primitive.’ But you might also say, ‘How do they know which words were leave-able out-able? How did they know what to learn about the system first?’ Because if I asked you to look at an Arabic text, and said, ‘Keep only the important things,’ you wouldn’t know what to do. How do the babies know what is important? And since babies know it, you now have to make this strange claim that babies have this mathematical system somehow encoded in their brains and in their genes—that’s not the way you usually think about babies. They can’t even add 2 and 2—on the surface they seem quite ignorant. What have they got hold of in the system? It’s quite interesting.
Q: It’s been rumored that you have some involvement in adding a certain four-letter expletive to the dictionary.
A: When I first came to Penn, because I knew about psychology, I was sent to work for a psychiatrist at Eastern Pennsylvania Psychiatric Institute. [The psychiatrist] was very proud to have what he called a linguist working for him—I was just a first-year graduate student, but I had the label. He had a contract to write the psychological definitions for Webster’s Dictionary. Of course he set me to work on these. Among the words that came up was ‘f—k.’ Lo and behold, or so I assume, the word went into the Third Edition of Webster’s. So you could say I am the girl who put ‘f—k’ in the dictionary.
Q: You’ve been at Penn in some capacity—save for a few years of teaching at Swarthmore—since 1959-1960. From your perspective, how has Penn changed over the years?
A: Woodland Avenue, as it was, there was a rickety trolley car that ran down it, and I lived up at 36th Street in an apartment that was over Smokey Joe’s [before it was located on 40th Street]. It cost $26 a month, but I shared it, after I left my first husband, with another woman, so we paid $13 in rent a month. It was a little noisy, and that wooden trolley kept everyone awake at night. Penn was a very different place. It always had many great intellectual departments and schools, but it wasn’t the kind of place it is now. The neighborhood was rundown and loud; rickety wooden trolley cars ran through what’s now the library building. [Penn Presidents] Judy Rodin and Amy Gutmann have to be given tremendous credit for all of the changes in the physical and institutional place we now call ‘Penn.’ Some of these changes were bound to happen as universities became more prominent institutions in the post-war culture, but some of the transformation of Penn was the great work of these leaders who have come in over the last few decades.
Q: In that same vein, have advances in technology changed language at all?
A: They’ve been saying that for the last 3,000 years: ‘Oh, the language is going to hell! It used to be such a good language. It used to be like Latin.’ Well, no, it didn’t. Not only didn’t it used to be like Latin, but Latin wasn’t like Latin either—at least in the way the general public thinks about it. An acute case we always bring up in Linguistics 1 is if you go from North to South in this country, you hear these big differences—the Boston accent, the South Carolina accent—but if you go from East to West, you don’t see such big differences. Guess why—the railroad, and then radio, of course. So separation of populations is going to allow change to take place. But people who live in the same neighborhood have to understand each other, and so the language stays the same. Languages change over time, to be sure, but as far as we know, the changes are all in the details and the overall structure and contents remain about the same.
Q: There’s been some chatter about the Millennial generation and how technology allows them to be lazier in the way they communicate.
A: You hear, ‘They don’t know how to talk, they say like all the time, and like doesn’t mean anything, so everything they say is the same.’ But does ‘like’ mean something? Of course it does. It’s like a dubiety marker. You probably say things like, ‘He’s, like, a very decent man.’ But if I said to you, ‘2 and 2 is like 4,’ that sounds funny, because there’s no room for doubt. That already tells you that ‘like’ doesn’t occur in just any place it wants to. There are semantic and pragmatic principles of some precision that underlie its use. But there’s always a push for language to stay the same because grandparents have to be able to understand their grandchildren, and vice versa. So while language changes—because it can—there are forces that keep saying, ‘No, you should stay like you stayed before, lest my system not be your system.’
Q: What attitudes do linguists take toward those types of changes in language?
A: If you’re a linguist, you don’t usually take a moral attitude toward these things. Like the word ‘literally.’ People object because the younger generation seems to be using it to mean its opposite—figuratively, as in, ‘My brains literally fell on the floor.’ Linguists say, ‘Oh, we have this new usage, the word is now an intensifier—it makes your statement stronger.’ And another word or phrase will crop up, as needed, to mean what ‘literally’ used to mean. If you look back across the words—cute, meat, vulgar, public—as civilization changes and the context changes, the words change their meanings. And when they first do so, everybody gets mad. The older generation gets mad because there’s a conflict with their own prior knowledge. I always listen to Judge Judy because she comes from the same neighborhood I do. I like to hear her speech. But she is such a ‘linguistic authoritarian,’ an extreme in holding to some imagined standard language. And her total rage is at the word ‘tooken,’ [in place of ‘took’], where she practically froths at the mouth. She says, ‘There’s no such word as tooken in the English language.’ There was a great linguist whose name was Max Weinreich, who said famously, ‘A language is a dialect with an army and a navy.’ So the power dialect—the one the rich and the famous use—they say that’s the real language. [Laughs.] So I’ve been thinking of starting to say, ‘tooken.’
Originally published on November 14, 2013