H. R. TURTLE AND W. B. CROFT


Book Review

ERIC DAVALO and PATRICK NAIM
Neural Networks
Macmillan Education Ltd, Houndmills, Basingstoke, Hampshire RG21 2XS
£13.99. 0-333-549961

This is another book in the Macmillan Computer Science Series, which sets out to provide the reader with a basic understanding of the subject - in this case neural networks - and with one or two minor reservations it achieves this.

The book is structured logically to provide the reader with the background, followed by some basic principles, and finishes with more detail on some common models and applications.

Upon first reading, I had to ask myself 'who is the target reader?'. There are many books addressed at the neural network professional with which this book does not compete. There are, however, many more readers to whom such a book would be irrelevant and incomprehensible. I suspect the target reader is a professional in the computer or related fields who needs or wants to know more about neural networks because he or she has a problem to solve. It is also likely that this book could form the basis of an undergraduate course in neural networks.

The first chapter examines the biological foundations of neural networks by describing in simple terms the way the brain and its cells are believed to function. This is followed by an interesting description of the peculiarities of the visual systems of frogs and mammals.

Chapter 2 explains the historical basis and basic principles of neural models, introducing the Perceptron and the Hebbian learning rule. This is followed by a discussion of multi-layer neural networks, and in particular the back-propagation model. The authors consider the use of this particular model and describe its positive attributes and limitations. This is adequately illustrated with descriptions of applications of back-propagation, including the classical 'XOR' problem.

Two further models are described in some detail: the Hopfield model and the Kohonen model. The theory of the Hopfield model is examined, followed by examples of its use, for example in the travelling salesman problem. A further derivative of the Hopfield model is introduced, namely the Boltzmann machine, which utilises an analogy with annealing in metals, called simulated annealing, to help the network find a 'better' solution. The Kohonen model, although not as well known as back-propagation and Hopfield, is considered in some detail, along with examples of its use.

Probably the most interesting chapters from the practitioner's point of view are the ones on applications of neural networks and on neural computers. They look at the reasons for using neural networks and their limitations, and the characteristics of problems which would be suitable for solving with neural networks. This is followed by descriptions of specific application examples, with one examined in more detail. The chapter on neural computers covers the area only superficially, but does provide an insight into how such computers work.

The two appendices, which describe the back-propagation model and the Kohonen model, were a disappointment; they provide only the barest minimum of the mathematical basis for the models. Considering that the book appears to be aimed at the practitioner - the engineer or the programmer who wants to use and explore neural networks - I was disappointed to find no description of the algorithms for these models which could be used as the basis of exploration. It is left to the practitioner to derive these or to look elsewhere.

Despite these slight reservations, I would have no hesitation in recommending this book as the starting point for the computer (or other) professional who feels the need to know something about neural networks or who wants to start using and exploring them.

M. A. TUREGA
Manchester

Erratum

We have been advised that there was a typesetting error in the letter by P. A. Kalinchenko, 34 (6), 502 (1991). Within the subroutine PPOLYN, in the third column, line 15, the statement should read

IF (ISTART + I) 9, 2, 9

and not

IF (ISTART + 1) 9, 2, 9.

We apologise to the author, and readers, for the error.
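The reviewer's complaint above is that the book stops short of giving usable algorithms for its models. As an illustration of the kind of sketch the reviewer found missing, the following is a minimal back-propagation network trained on the 'XOR' problem the review mentions. This is an editor's illustrative sketch, not code from the book: the network shape (one hidden layer of four sigmoid units), learning rate and epoch count are all assumptions chosen only to make the example self-contained.

```python
# Illustrative sketch: back-propagation on the XOR problem.
# Not taken from the book under review; layer sizes, learning rate
# and epoch count are arbitrary choices for this example.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(hidden=4, epochs=20000, lr=1.0, seed=1):
    random.seed(seed)
    # XOR truth table: two inputs, one target output per pattern.
    data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
            ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]
    # Weights: input->hidden and hidden->output, each with a bias weight.
    w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]
    w2 = [random.uniform(-1, 1) for _ in range(hidden + 1)]
    for _ in range(epochs):
        for x, t in data:
            xi = x + [1.0]  # append constant bias input
            h = [sigmoid(sum(w * v for w, v in zip(ws, xi))) for ws in w1]
            hi = h + [1.0]  # append constant bias unit
            out = sigmoid(sum(w * v for w, v in zip(w2, hi)))
            # Backward pass: delta rule for sigmoid units, squared error.
            d_out = (out - t) * out * (1 - out)
            d_h = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            # Gradient-descent weight updates.
            for j in range(hidden + 1):
                w2[j] -= lr * d_out * hi[j]
            for j in range(hidden):
                for k in range(3):
                    w1[j][k] -= lr * d_h[j] * xi[k]
    # Return a predictor closed over the trained weights.
    def predict(x):
        xi = x + [1.0]
        hi = [sigmoid(sum(w * v for w, v in zip(ws, xi))) for ws in w1] + [1.0]
        return sigmoid(sum(w * v for w, v in zip(w2, hi)))
    return predict

predict = train_xor()
```

After training, `predict([0.0, 1.0])` should be close to 1 and `predict([1.0, 1.0])` close to 0; a reader wanting the Kohonen or Boltzmann algorithms would, as the reviewer says, still have to look elsewhere.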

290 THE COMPUTER JOURNAL, VOL. 35, NO. 3, 1992