Providing a balanced approach to databases, the market-leading DATABASE SYSTEMS: DESIGN, IMPLEMENTATION, AND MANAGEMENT, 13E emphasizes the knowledge and skills necessary for success and makes databases accessible without overwhelming students. Readers gain a solid foundation in database design and implementation as diagrams, illustrations, and tables clarify in-depth coverage of database design. Students learn how successful database implementation involves designing databases to fit within a larger strategic data environment. Revised coverage of SQL introduces more examples and simpler explanations that focus on the areas most important for a database career, making supplementary SQL materials unnecessary. Additional coverage of Big Data Analytics and NoSQL, including related Hadoop technologies, offers a strong hands-on approach. MindTap now includes a digital auto-graded MySQL environment.
A modern approach to functional programming. Objects First with Java: A Practical Introduction is an introduction to object-oriented programming for beginners. The main focus of the book is general object-oriented and programming concepts from a software engineering perspective. The first chapters are written for students with no programming experience, with later chapters being more suitable for advanced or professional programmers. The Java programming language and BlueJ, the Java development environment, are the two tools used throughout the book. BlueJ’s clear visualisation of classes and objects means that students can immediately appreciate the differences between them and gain a much better understanding of the nature of an object than they would from simply reading source code. Unlike traditional textbooks, the chapters are ordered not by language features but by software development concepts. The Sixth Edition goes beyond just adding the new language constructs of Java 8. The book’s exploration of these constructs demonstrates a renaissance of functional ideas in modern programming. While functional programming isn’t new in principle, it has seen a boost in popularity based on the computer hardware currently available and the changing nature of the projects programmers wish to tackle. Functional language constructs make it possible to handle concurrency efficiently and to use multiple cores with little effort on the part of the programmer; they are also more elegant and readable, and offer great potential for exploiting parallel hardware. Functional programming has become an essential part of the field, and Objects First with Java gives students a basic understanding of an area they’ll need to master in order to succeed in the future.
This text is intended for a 1-semester CS1 course sequence. The Brief Version contains the first 18 chapters of the Comprehensive Version. The first 13 chapters are appropriate for preparing for the AP Computer Science exam. For courses in Java Programming. A fundamentals-first introduction to basic programming concepts and techniques. Designed to support an introductory programming course, Introduction to Java Programming and Data Structures teaches concepts of problem-solving and object-oriented programming using a fundamentals-first approach. Beginner programmers learn critical problem-solving techniques, then move on to grasp the key concepts of object-oriented programming, GUI programming, advanced GUI, and Web programming using JavaFX. This course approaches Java GUI programming using JavaFX, which has replaced Swing as the new GUI tool for developing cross-platform rich Internet applications and is simpler to learn and use. The 11th edition has been completely revised to enhance clarity and presentation, and includes new and expanded content, examples, and exercises. MyLab™ Programming is not included. Students: if MyLab is a recommended/mandatory component of the course, please ask your instructor for the correct ISBN and course ID. MyLab should only be purchased when required by an instructor. Instructors, contact your Pearson rep for more information. MyLab is an online homework, tutorial, and assessment product designed to personalize learning and improve results. With a wide range of interactive, engaging, and assignable activities, students are encouraged to actively learn and retain tough course concepts.
“Big data,” as it has become known in business and information technology circles, has the potential to improve our knowledge about human behavior, and to help us gain insight into the ways in which we organize ourselves, our cultures, and our external and internal lives. Libraries stand at the center of the information world, both facilitating and contributing to this flood as well as helping to shape and channel it to specific purposes. But all technologies come with a price. Where a tool can serve a purpose, it can also change the user’s behavior to fit the purposes of the tool. Big Data Shocks: An Introduction to Big Data for Librarians and Information Professionals examines the roots of big data, the current climate, and the rising stars in this world. The book explores the issues raised by big data and discusses theoretical as well as practical approaches to managing information whose scope exists beyond the human scale. What’s at stake ultimately is the privacy of the people who support and use our libraries, and the temptation for us to examine their behaviors. Such tension lies deep in the heart of our great library institutions. This book addresses these issues and many of the questions that arise from them, including:
- What is our role as librarians within this new era of big data?
- What are the impacts of new powerful technologies that track and analyze our behavior?
- Do data aggregators know more about us and our patrons than we do?
- How can librarians ethically balance the need to demonstrate learning and knowledge creation against patron privacy?
- Do we become less private merely because we use a tool, or is it because the tool has changed us?
- What’s in store for us when the internet of things combines with data mining techniques?
All of these questions and more are explored in this book.
The third edition of Preserving Digital Materials provides a survey of the digital preservation landscape. This book is structured around four questions: 1. Why do we preserve digital materials? 2. What digital materials do we preserve? 3. How do we preserve digital materials? 4. How do we manage digital preservation? This is a concise handbook and reference for a wide range of stakeholders who need to understand how preservation works in the digital world. It notes the increasing importance of the role of new stakeholders and the general public in digital preservation. It can be used both as a textbook for teaching digital preservation and as a guide for the many stakeholders who engage in digital preservation. Its synthesis of current information, research, and perspectives about digital preservation from a wide range of sources across many areas of practice makes it of interest to all who are concerned with digital preservation. It will be of use to preservation administrators and managers, who want a professional reference text; information professionals, who wish to reflect on the issues that digital preservation raises in their professional practice; and students in the field of digital preservation.
How do infants learn a language? Why and how do languages evolve? How do we understand a sentence? This book explores these questions using recent computational models that shed new light on issues related to language and cognition. The chapters in this collection propose original analyses of specific problems and develop computational models that have been tested and evaluated on real data. Featuring contributions from a diverse group of experts, this interdisciplinary book bridges the gap between natural language processing and the cognitive sciences. It is divided into three sections, focusing respectively on models of neural and cognitive processing, data-driven methods, and social issues in language evolution. This book will be useful to any researcher or advanced student interested in the analysis of the links between the brain and the language faculty.
The power of big data in cybersecurity — Big data analytics for network forensics — Dynamic analytics-driven assessment of vulnerabilities and exploitation — Big data analytics for mobile app security — Machine unlearning: repairing learning models in adversarial environments — Cybersecurity training — Security, privacy and trust in cloud computing: challenges and solutions — Cybersecurity in internet of things (IoT) — Data visualization for cyber security — Analyzing deviant socio-technical behaviors using social network analysis and cyber forensics-based methodologies — Security tools — Data and research initiatives for cybersecurity analysis
A new edition of a book, written in a humorous question-and-answer style, that shows how to implement and use an elegant little programming language for logic programming. The goal of this book is to show the beauty and elegance of relational programming, which captures the essence of logic programming. The book shows how to implement a relational programming language in Scheme, or in any other functional language, and demonstrates the remarkable flexibility of the resulting relational programs. As in the first edition, the pedagogical method is a series of questions and answers, which proceed with the characteristic humor that marked The Little Schemer and The Seasoned Schemer. Familiarity with a functional language or with the first five chapters of The Little Schemer is assumed. For this second edition, the authors have greatly simplified the programming language used in the book, as well as the implementation of the language. In addition to revising the text extensively, and simplifying and revising the “Laws” and “Commandments,” they have added explicit “Translation” rules to ease the translation of Scheme functions into relations.
What artificial intelligence can tell us about the mind and intelligent behavior. What can artificial intelligence teach us about the mind? If AI’s underlying concept is that thinking is a computational process, then how can computation illuminate thinking? It’s a timely question. AI is all the rage, and the buzziest AI buzz surrounds adaptive machine learning: computer systems that learn intelligent behavior from massive amounts of data. This is what powers a driverless car, for example. In this book, Hector Levesque shifts the conversation to “good old-fashioned artificial intelligence,” which is based not on heaps of data but on understanding commonsense intelligence. This kind of artificial intelligence is equipped to handle situations that depart from previous patterns — as we do in real life, when, for example, we encounter a washed-out bridge or when the barista informs us there’s no more soy milk. Levesque considers the role of language in learning. He argues that a computer program that passes the famous Turing Test could be a mindless zombie, and he proposes another way to test for intelligence — the Winograd Schema Test, developed by Levesque and his colleagues. “If our goal is to understand intelligent behavior, we had better understand the difference between making it and faking it,” he observes. He identifies a possible mechanism behind common sense and the capacity to call on background knowledge: the ability to represent objects of thought symbolically. As AI migrates more and more into everyday life, we should worry if systems without common sense are making decisions where common sense is needed.
Learn about an information-theoretic approach to managing interference in future-generation wireless networks. Focusing on cooperative schemes motivated by Coordinated Multi-Point (CoMP) technology, the book develops a robust theoretical framework for interference management that uses recent advancements in backhaul design, and practical precoding schemes based on local cooperation, to deliver the increased speed and reliability promised by interference alignment. Gain insight into how simple, zero-forcing precoding schemes are optimal in locally connected interference networks, and discover how significant rate gains can be obtained by making cell association decisions and allocating backhaul resources based on centralized (cloud) processing and knowledge of network topology. Providing a link between information-theoretic analyses and interference management schemes that are easy to implement, this is an invaluable resource for researchers, graduate students, and practicing engineers in wireless communications.
Following the successful PCS Auction conducted by the US Federal Communications Commission in 1994, auctions have replaced traditional ways of allocating valuable radio spectrum, a key resource for any mobile telecommunications operator. Spectrum auctions have raised billions of dollars worldwide and have become a role model for market-based approaches in the public and private sectors. The design of spectrum auctions is a central application of game theory and auction theory due to its importance in industry and the theoretical challenges it presents. Several auction formats have been developed with different properties addressing fundamental questions about efficiently selling multiple objects to a group of buyers. This comprehensive handbook features classic papers and new contributions by international experts on all aspects of spectrum auction design, including pros and cons of different auctions and lessons learned from theory, experiments, and the field, providing a valuable resource for regulators, telecommunications professionals, consultants, and researchers.