
Workshops

Cognitive Models of Speech Processing: Psycholinguistic and Computational Perspectives

Gerry Altmann

The 1988 Workshop on Cognitive Models of Speech Processing was held at Park Hotel Fiorelle, Sperlonga, Italy, on 16–20 May 1988. Twenty-five participants gathered in this small coastal village, where the Emperor Tiberius once kept a summer house, to discuss psycholinguistic and computational issues in speech and natural language processing.

The main aim of the workshop was to draw together current research trends within the fields of human speech perception and natural language processing. Cognitive psychologists have attempted to model what goes on at the many different levels at which speech perception can be described; they have also been concerned with the interaction, if any, between these levels. The mechanisms that have been proposed have varied in the degree to which they are amenable to detailed computational modeling. Recent developments involve the availability of new and more powerful computational frameworks within which to model cognitive processes (for example, parallel distributed processing [PDP] and the active chart).

To attempt some form of integration between the different types of behavioral data and the different computational approaches to modeling the data, scientists from both the psycholinguistic and computational domains were brought together for five days to discuss their work and share their perspectives. The talks, which covered a wide variety of topics, are summarized in the following paragraphs.

Richard Shillcock of the Centre for Speech Technology Research (CSTR), Edinburgh, Scotland, presented data on the recognition of words that contain other spurious words within them, as in trombone, which contains bone. Evidence from human studies suggested that the spurious word is activated, even though in principle it would be possible to prevent this activation by only accessing the lexicon at the offset of some previously found word (trom is not a word, so access would not be reinitiated at its offset). This finding was further discussed when Uli Frauenfelder of the Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands, presented a computational simulation of the lexical access-segmentation process using the interactive-activation model TRACE. The TRACE simulation was particularly useful because it gave some insight into the way that the TRACE architecture and the precise contents of its lexicon might help suppress spurious words within words.

Still on the subject of when—and how—words are accessed, Anne Cutler of the Medical Research Council Applied Psychology Unit (APU), Cambridge, England, presented data to support the theory that new words are hypothesized to start immediately prior to strong syllables (that is, syllables which contain unreduced, full vowels; in the case of Shillcock's data, all the spurious words started with a strong syllable). Although the evidence is compelling for the English language, this theory (developed with Dennis Norris, also of APU) generated some controversy. Jacques Mehler of EHESS, Paris, and Juan Segui of CNRS, Paris, argued that Cutler and Norris's metrical segmentation strategy must be language specific because in a language such as French, all the syllables are full and unreduced, and applying this strategy to French would lead to an explosion of segmentation errors. They argued that the syllable, irrespective of its strength, is the natural unit of lexical access.

Relative to the discussion of the role of strong syllables in lexical segmentation, Gerry Altmann of CSTR reviewed some of the evidence based on computational studies of large computerized lexicons (20,000+ words). This evidence suggested that a stressed syllable conveys more information about the identity of the word in which it occurs than an unstressed syllable. By applying techniques borrowed from information theory, it is possible to show that this fact is not the result of some fortuitous assignment of lexical stress to the most informative syllable. Rather, it is because more categories of stressed vowel exist than of unstressed vowel (as calculated from a frequency-weighted lexicon), and more words can be eliminated from the search space when the set of categories with which to discriminate between the competing words is larger than when it is smaller.

Stressing the computational aspect, Ellen Gurman Bard of CSTR reconsidered the theoretical implications of the original Cutler and Norris findings. She argued that an experiment by Cutler and Norris, which had been interpreted as supporting their segmentation strategy, could be reinterpreted within an interactive-activation model so that the main effects were not the result of lexical segmentation but of inhibitory effects between lexical competitors. As in the case of the Frauenfelder-Shillcock discussions, the consideration of a computational framework within which to view the data on lexical segmentation enriched the possible explanations of the empirical data.

William Marslen-Wilson of APU also considered the effects of competition among lexical hypotheses during lexical access. He showed that the recognition of a word can be affected by whether frequent words exist which resemble the word in question. This result was taken as support for Marslen-Wilson's Cohort model of speech perception.
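The segmentation issues discussed above lend themselves to a small worked example. The sketch below is only a minimal illustration: the mini-lexicon and the strong/weak syllable markup are invented for the example and are not taken from any of the reported studies. It shows how spurious embedded words arise (Shillcock's trombone/bone case) and which of them a Cutler–Norris-style strong-syllable strategy would hypothesize.

# Toy illustration: embedded words and the strong-syllable segmentation strategy.
# The lexicon and syllable strengths below are invented for this example.

LEXICON = {"trombone", "bone", "beacon", "con"}

# Each carrier word is a list of (syllable, strength) pairs;
# "S" marks a strong (full-vowel) syllable, "W" a weak (reduced) one.
CARRIERS = {
    "trombone": [("trom", "S"), ("bone", "S")],
    "beacon":   [("bea", "S"), ("con", "W")],
}

def embedded_words(syllables, lexicon):
    """Return every lexicon word spelled by a contiguous run of syllables,
    along with whether that run begins on a strong syllable."""
    found = []
    for i in range(len(syllables)):
        for j in range(i + 1, len(syllables) + 1):
            candidate = "".join(s for s, _ in syllables[i:j])
            if candidate in lexicon:
                found.append((candidate, syllables[i][1] == "S"))
    return found

for word, sylls in CARRIERS.items():
    for candidate, starts_strong in embedded_words(sylls, LEXICON):
        if candidate == word:
            continue  # ignore the carrier word itself
        verdict = "hypothesized" if starts_strong else "not hypothesized"
        print(f"{candidate!r} inside {word!r} is {verdict} under the strong-syllable strategy")

Because the spurious words in Shillcock's data all began with strong syllables, a strategy of this kind does not by itself suppress the trombone/bone activation, which is why the TRACE architecture and lexicon contents discussed by Frauenfelder become relevant.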

Paul Luce of the State University of New York at Buffalo (SUNY Buffalo) presented work in which he calculated the similarity neighborhoods of each word in a large computerized dictionary. These neighborhoods simply consisted of the phonetically similar words with which the target word might be confused, defined as those words which differed from the target in just one phoneme (beach and peach are, therefore, neighbors). Luce showed that the time to recognize a word was dependent on both the density of the neighborhood for this word and on the frequencies of the members of the neighborhood (compare Marslen-Wilson). Considerable discussion took place on just how such effects might be manifested within the interactive-activation framework.

Arty Samuel of Yale University, New Haven, Connecticut, and Cindy Connine of SUNY Binghamton reviewed the data on the effects of lexical and sentential context on lexical access. At issue was whether these effects resembled the effects of lexical context on phonemic identification: Do they interact in the same way, with higher-level information being used top-down to assist perception? Using different methodologies, they both concluded that whereas lexical information (information about words in the mental lexicon) can actually affect the percept of a sound making up a word, sentential context (syntactic likelihood, semantic plausibility, and so on) cannot influence the percept. Instead, these sentential effects occur at a later stage in processing and do not interact with the lower-level stages. These findings argue for a different kind of relationship between the phonemic and lexical levels—an interactive one—than between the lexical and sentential levels—a noninteractive one. The implications for models of speech perception—and the architectures required to model perception—were discussed at length.

Within the computational domain, Jeff Elman of the University of California, San Diego, and Dennis Norris of APU presented computational simulations based on PDP models that were sensitive to temporal variation. Both used recurrent networks based on the work of Michael Jordan. Norris demonstrated that a simplified version of a network designed to model word production could adequately model word recognition. A number of simulations were presented that demonstrated the ability of this network to recognize words from among their competitors (including cases similar to the trombone-bone case investigated by Shillcock and discussed by Frauenfelder).

Elman moved to the sentence-processing domain and demonstrated that a variant of the Jordan network could predict the next word in a simple three-word sentence (for example, man break glass, cat eat sandwich). He then showed that the internal representations constructed by the network to represent each lexical item grouped the items according to certain natural properties (for example, sandwich and cake would be grouped together but separately from the grouping containing glass and plate; however, both groupings would be placed together by virtue of their syntactic properties).
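Elman's next-word demonstration can be approximated with a very small recurrent network. The sketch below is only a rough analogue under stated assumptions: it uses hidden-state feedback (rather than the Jordan-style output feedback used in the actual simulations), an invented toy corpus, and one-step truncated backpropagation, purely to make the architecture of next-word prediction concrete.

import numpy as np

# Invented toy corpus of three-word "sentences" (in the spirit of man break glass).
sentences = [["man", "break", "glass"], ["cat", "eat", "sandwich"],
             ["man", "eat", "sandwich"], ["cat", "break", "plate"]]
vocab = sorted({w for s in sentences for w in s})
idx = {w: i for i, w in enumerate(vocab)}
V, H = len(vocab), 8

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.1, (H, V))   # input -> hidden
Whh = rng.normal(0, 0.1, (H, H))   # previous hidden -> hidden (the recurrent loop)
Why = rng.normal(0, 0.1, (V, H))   # hidden -> next-word scores

def one_hot(w):
    v = np.zeros(V)
    v[idx[w]] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Train each step to predict the following word; gradients are truncated to one step.
lr = 0.1
for epoch in range(500):
    for sent in sentences:
        h = np.zeros(H)
        for t in range(len(sent) - 1):
            x = one_hot(sent[t])
            h_new = np.tanh(Wxh @ x + Whh @ h)
            p = softmax(Why @ h_new)
            dz = p - one_hot(sent[t + 1])          # cross-entropy gradient at the output
            dWhy = np.outer(dz, h_new)
            dh = (Why.T @ dz) * (1 - h_new ** 2)   # backprop through tanh, no earlier steps
            dWxh = np.outer(dh, x)
            dWhh = np.outer(dh, h)
            Why -= lr * dWhy
            Wxh -= lr * dWxh
            Whh -= lr * dWhh
            h = h_new

# After training, the network should favor plausible continuations of "man".
h = np.tanh(Wxh @ one_hot("man") + Whh @ np.zeros(H))
print(sorted(zip(softmax(Why @ h), vocab), reverse=True)[:3])

The hidden vectors the trained network assigns to each word can then be compared or clustered, which is the kind of analysis that revealed the semantic and syntactic groupings Elman described.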

After dealing with lexical access and PDP architectures, the main focus of the workshop shifted toward the syntactic level of analysis. The emphasis here was on the notion of syntactic constraint and the immediacy with which syntactic information could be used to constrain the interpretation of sentential input. Mike Tanenhaus (Rochester, New York) discussed some recent experiments that examined the time course with which verb control and verb argument structure are accessed and used in sentence processing. The results demonstrated that word recognition makes information available about the argument structure of the verb and that this information is used immediately during parsing. Lyn Frazier of the University of Massachusetts, Amherst, explored the relationship between different proposed postlexical modules that contribute to language comprehension. She discussed four modules, two that are primarily concerned with structural properties of the input (the Binding module, which operates according to linguistic principles such as c-command, and the constituent structure module, which operates according to sisterhood relations in the syntactic structures) and two that are concerned with more pragmatic aspects of the input (referential semantics and thematic-predication assignments).

In the computational domain but still within the realm of syntactic processing, two presentations were concerned with the power of syntactic information to constrain the vast number of spurious word candidates that might be hypothesized by an automatic speech-recognition system. Domenico Parisi of CNR, Rome, described some experiments using the Olivetti isolated word recognizer. He showed that performance was significantly improved if the hypotheses output for each word took into account the sentential relations which could be expected to hold between the different words in the utterance.

Henry Thompson of CSTR and Gerry Altmann described the manner in which syntactic information could be made to constrain the lexical hypotheses output by an automatic continuous speech recognizer. Using the active chart framework, they described a number of different implementations by which syntactic information significantly reduced the number of word strings that would be entertained by the processor. The implementations were equivalent to the extent that they led to the same degree of constraint. They differed, however, in the degree to which they conformed to Fodor's modularity hypothesis concerning the autonomy of the processing of different information types. Thompson and Altmann concluded that modularity can be a useful theoretical construct but that a point comes when one has to abandon modularity for the sake of computational efficiency.

The final topic considered in the workshop was intonation and the relationship between intonational structure and the syntactic processor. Mark Steedman of the University of Pennsylvania, Philadelphia, pointed out that recent theories of prosody and intonation within the metrical framework postulate a level of intonational structure independent of syntactic surface structure. He argued that such an autonomous intonational structure is unnecessary and that an identical metrical prosody can be driven directly from syntactic surface structure. This technique requires a radical revision of the concept of surface structure and is achieved by replacing the standard syntactic component with a syntax based on a combinatory generalization of categorial grammar. This theory of grammar is independently motivated by the purely syntactic phenomena of coordinate structure and unbounded dependency. At the level of surface structure, it also appears to provide exactly the structural units that are required by recent metrical accounts of intonation.

Whereas Steedman described a grammar for intonational structure, Mitch Marcus of the University of Pennsylvania, Philadelphia, described a deterministic parser that could use obligatory intonational boundaries to help it assign the right structure to a sentence. Unlike Steedman, Marcus assumed a separate component capable of identifying the intonational boundaries. His parser would then use this additional input to construct partial descriptions of otherwise standard tree structures.

It is impossible to catalog perhaps the most important aspect of the workshop, namely, the discussions that arose, whether during the sessions or away from the formal workshop structure. The formal discussions were led by Jan Charles-Luce of SUNY Buffalo; Ellen Gurman Bard; Juan Segui; David Pisoni of Indiana University; Lolly Tyler of Cambridge University, Cambridge, England; Janet Fodor of City University, New York; and Bonnie Webber and Aravind Joshi of the University of Pennsylvania, Philadelphia. Much of the discussion reflected the increased emphasis within psycholinguistic and computational modeling on (1) the flow of information between the different modules that compose the (sub)systems under study and (2) the modularity or otherwise of the models which have been advanced to explain data on human speech and sentence processing.

The workshop would not have been possible without the generous financial support of British Telecom International. Additional financial assistance was provided by the American Association for Artificial Intelligence. The proceedings of the workshop are being published by The MIT Press/Bradford Books, edited by Gerry Altmann. This workshop was the first of a series of biennial international meetings based in Europe.

Gerry Altmann is a lecturer in the Laboratory of Experimental Psychology at the University of Sussex, teaching AI and cognitive science. He received his B.Sc. in experimental psychology from the University of Sussex in 1981 and a Ph.D. from the Centre for Cognitive Science at the University of Edinburgh. His Ph.D. work on the resolution of syntactic ambiguity was followed by four years of postdoctoral work at the Centre for Speech Technology Research, University of Edinburgh, on the effects of context on word recognition in continuous speech and computational models of such effects.
