Language : English
Published : 2018-02-09
Pages : 192
Common Sense, the Turing Test, and the Quest for Real AI
What artificial intelligence can tell us about the mind and intelligent behavior.

What can artificial intelligence teach us about the mind? If AI’s underlying concept is that thinking is a computational process, then how can computation illuminate thinking? It’s a timely question. AI is all the rage, and the buzziest AI buzz surrounds adaptive machine learning: computer systems that learn intelligent behavior from massive amounts of data. This is what powers a driverless car, for example.

In this book, Hector Levesque shifts the conversation to “good old-fashioned artificial intelligence,” which is based not on heaps of data but on understanding commonsense intelligence. This kind of artificial intelligence is equipped to handle situations that depart from previous patterns, as we do in real life when, for example, we encounter a washed-out bridge or the barista informs us there’s no more soy milk.

Levesque considers the role of language in learning. He argues that a computer program that passes the famous Turing Test could be a mindless zombie, and he proposes another way to test for intelligence: the Winograd Schema Test, developed by Levesque and his colleagues. “If our goal is to understand intelligent behavior, we had better understand the difference between making it and faking it,” he observes. He identifies a possible mechanism behind common sense and the capacity to call on background knowledge: the ability to represent objects of thought symbolically. As AI migrates more and more into everyday life, we should worry if systems without common sense are making decisions where common sense is needed.
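A Winograd schema is a pronoun-disambiguation question built so that changing a single word flips the correct answer, defeating purely statistical tricks. The sketch below renders the well-known trophy/suitcase schema from the Winograd Schema Challenge as a Python data structure; the dict layout and the `pose` helper are illustrative, not anything from the book.

```python
# One Winograd schema: a sentence with an ambiguous pronoun, two candidate
# referents, and a "special" word whose alternate flips the correct answer.
# The sentence is the well-known trophy/suitcase example; the structure
# below is purely illustrative.
schema = {
    "sentence": "The trophy doesn't fit in the brown suitcase because it is too {}.",
    "pronoun": "it",
    "candidates": ["the trophy", "the suitcase"],
    "answers": {"big": "the trophy", "small": "the suitcase"},
}

def pose(schema, word):
    """Render one half of the schema pair as a question/answer item."""
    return {
        "question": schema["sentence"].format(word)
                    + f' What does "{schema["pronoun"]}" refer to?',
        "answer": schema["answers"][word],
    }

for word in ("big", "small"):
    item = pose(schema, word)
    print(item["question"], "->", item["answer"])
```

Swapping "big" for "small" changes the referent from the trophy to the suitcase, which is exactly the commonsense leap the test probes.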
Some books on algorithms are rigorous but incomplete; others cover masses of material but lack rigor. Introduction to Algorithms uniquely combines rigor and comprehensiveness. The book covers a broad range of algorithms in depth, yet makes their design and analysis accessible to all levels of readers. Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.

The first edition became a widely used text in universities worldwide as well as the standard reference for professionals. The second edition featured new chapters on the role of algorithms, probabilistic analysis and randomized algorithms, and linear programming. The third edition has been revised and updated throughout. It includes two completely new chapters, on van Emde Boas trees and multithreaded algorithms, substantial additions to the chapter on recurrences (now called “Divide-and-Conquer”), and an appendix on matrices. It features improved treatment of dynamic programming and greedy algorithms and a new notion of edge-based flow in the material on flow networks. Many new exercises and problems have been added for this edition.

As of the third edition, this textbook is published exclusively by the MIT Press. The hardcover edition does not include a dust jacket.
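The pseudocode conventions the blurb describes translate almost line for line into real code. As a sketch, here is insertion sort, the book's first worked algorithm, rendered in Python (0-indexed, where the book's pseudocode is 1-indexed):

```python
def insertion_sort(a):
    """Sort list a in place in ascending order.

    Mirrors the INSERTION-SORT pseudocode style: grow a sorted prefix,
    inserting each new element into its correct position.
    """
    for j in range(1, len(a)):          # book: for j = 2 to A.length
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:    # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                  # drop key into the gap
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # -> [1, 2, 3, 4, 5, 6]
```

The comments pair each Python line with its pseudocode counterpart, which is the kind of reading the book's conventions are designed to support.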
About the Author
Thomas H. Cormen is Professor of Computer Science and former Director of the Institute for Writing and Rhetoric at Dartmouth College. He is the coauthor (with Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein) of the leading textbook on computer algorithms, Introduction to Algorithms (third edition, MIT Press, 2009). Charles E. Leiserson is Professor of Computer Science and Engineering at the Massachusetts Institute of Technology. Ronald L. Rivest is Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology. Clifford Stein is Professor of Industrial Engineering and Operations Research at Columbia University.
The world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. With this new edition, Ralph Kimball and his colleagues have refined the original set of Lifecycle methods and techniques based on their consulting and training experience. They walk you through the detailed steps of designing, developing, and deploying a data warehousing/business intelligence system. With substantial new and updated content, this second edition again sets the standard in data warehousing for the next decade.
About the Author
The authors’ professional careers have followed remarkably similar paths. Each author has focused on data warehousing and business intelligence (DW/BI) consulting and education for more than fifteen years. Most worked together at Metaphor Computer Systems, a pioneering decision support vendor, in the 1980s. All the authors are members of the Kimball Group and teach for Kimball University. They contribute regularly to Intelligent Enterprise magazine and other industry publications; most have previously written books in the Toolkit series.

Ralph Kimball founded the Kimball Group. Since the mid-1980s, he has been the DW/BI industry’s thought leader on the dimensional approach and has trained more than 10,000 IT professionals. Ralph has a Ph.D. in Electrical Engineering from Stanford University.

Margy Ross is President of the Kimball Group. She has focused exclusively on DW/BI since 1982, with an emphasis on business requirements analysis and dimensional modeling. Margy graduated with a BS in Industrial Engineering from Northwestern University.

Warren Thornthwaite began his DW/BI career in 1980. After managing Metaphor’s consulting organization, he worked for Stanford University and WebTV. Warren holds a BA in Communications Studies from the University of Michigan and an MBA from the University of Pennsylvania’s Wharton School.

Joy Mundy has focused on DW/BI systems since 1992, with stints at Stanford, WebTV, and Microsoft’s SQL Server product development organization. Joy graduated from Tufts University with a BA in Economics, and from Stanford University with an MS in Engineering Economic Systems.

Bob Becker has helped clients across a variety of industries with their DW/BI challenges and solutions since 1989, including extensive work with health care organizations. Bob has a BSB in Marketing from the University of Minnesota’s School of Business.
Evolutionary computation, the use of evolutionary systems as computational processes for solving complex problems, is a tool used by computer scientists and engineers who want to harness the power of evolution to build useful new artifacts, by biologists interested in developing and testing better models of natural evolutionary systems, and by artificial life scientists for designing and implementing new artificial evolutionary worlds. In this clear and comprehensive introduction to the field, Kenneth De Jong presents an integrated view of the state of the art in evolutionary computation. Although other books have described such particular areas of the field as genetic algorithms, genetic programming, evolution strategies, and evolutionary programming, Evolutionary Computation is noteworthy for considering these systems as specific instances of a more general class of evolutionary algorithms. This useful overview of a fragmented field is suitable for classroom use or as a reference for computer scientists and engineers.
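The unifying view described above treats genetic algorithms, evolution strategies, and the rest as parameterizations of one generic loop: evaluate a population, select parents, vary them, and form the next generation. Below is a minimal sketch of that generic loop in Python; the OneMax fitness function, the tournament selection, and all parameter settings are illustrative choices, not taken from the book.

```python
import random

def evolve(fitness, length=10, pop_size=20, generations=60,
           mutation_rate=0.1, seed=0):
    """Generic evolutionary loop: evaluate, select, vary, replace.

    Individuals are bit lists; selection is a 2-way tournament;
    variation is per-bit mutation. All settings are illustrative.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            # Pick two individuals at random; keep the fitter one.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        # Each child is a mutated copy of a tournament winner.
        pop = [
            [bit ^ (rng.random() < mutation_rate) for bit in tournament()]
            for _ in range(pop_size)
        ]
    return max(pop, key=fitness)

# OneMax: maximize the number of 1 bits -- a standard toy problem.
best = evolve(fitness=sum)
print(best, sum(best))
```

Swapping the representation, the selection rule, or the variation operators recovers the specific families (genetic algorithms, evolution strategies, evolutionary programming) that the book presents as instances of this one class.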
Based on the Association for Computing Machinery (ACM) model curriculum guidelines, Foundations of Computer Science gives students a bird’s-eye view of Computer Science. This easy-to-read and easy-to-navigate text covers all the fundamentals of computer science required for first-year undergraduates embarking on a computing degree.