Regular price £72.00 GBP, Sale price £61.69 GBP
Free UK Shipping

Freshly Printed - allow 10 days lead time

Machine Learning and Data Mining

Igor Kononenko (Author), Matjaz Kukar (Author)

ISBN 9781904275213, Elsevier Science

Paperback / softback, published 30 April 2007

480 pages
23.3 x 15.6 x 3 cm, 0.71 kg

"Readers are treated to a comprehensive look at the principles. …a fine overview of machine learning methods. …Recommended." --Choice Magazine

Data mining is often referred to by real-time users and software solutions providers as knowledge discovery in databases (KDD). Good data mining practice for business intelligence (the art of turning raw data into meaningful information) is demonstrated by the many new techniques and developments in the conversion of fresh scientific discoveries into widely accessible software solutions. This book has been written as an introduction to the main issues associated with the basics of machine learning and the algorithms used in data mining. It is suitable for advanced undergraduates and postgraduate students, and their tutors, across a wide range of computer science and technology topics, as well as for researchers looking to adapt various algorithms to particular data mining tasks. It is also a valuable addition to the libraries and bookshelves of the many companies that are using the principles of data mining (or KDD) to deliver solid business and industry solutions.

  • Foreword
  • Preface
    • Acknowledgements
  • Chapter 1: Introduction
    • 1.1 THE NAME OF THE GAME
    • 1.2 OVERVIEW OF MACHINE LEARNING METHODS
    • 1.3 HISTORY OF MACHINE LEARNING
    • 1.4 SOME EARLY SUCCESSES
    • 1.5 APPLICATIONS OF MACHINE LEARNING
    • 1.6 DATA MINING TOOLS AND STANDARDS
    • 1.7 SUMMARY AND FURTHER READING
  • Chapter 2: Learning and Intelligence
    • 2.1 WHAT IS LEARNING
    • 2.2 NATURAL LEARNING
    • 2.3 LEARNING, INTELLIGENCE, CONSCIOUSNESS
    • 2.4 WHY MACHINE LEARNING
    • 2.5 SUMMARY AND FURTHER READING
  • Chapter 3: Machine Learning Basics
    • 3.1 BASIC PRINCIPLES
    • 3.2 MEASURES FOR PERFORMANCE EVALUATION
    • 3.3 ESTIMATING PERFORMANCE
    • 3.4 *COMPARING PERFORMANCE OF MACHINE LEARNING ALGORITHMS
    • 3.5 COMBINING SEVERAL MACHINE LEARNING ALGORITHMS
    • 3.6 SUMMARY AND FURTHER READING
  • Chapter 4: Knowledge Representation
    • 4.1 PROPOSITIONAL CALCULUS
    • 4.2 *FIRST ORDER PREDICATE CALCULUS
    • 4.3 DISCRIMINANT AND REGRESSION FUNCTIONS
    • 4.4 PROBABILITY DISTRIBUTIONS
    • 4.5 SUMMARY AND FURTHER READING
  • Chapter 5: Learning as Search
    • 5.1 EXHAUSTIVE SEARCH
    • 5.2 BOUNDED EXHAUSTIVE SEARCH (BRANCH AND BOUND)
    • 5.3 BEST-FIRST SEARCH
    • 5.4 GREEDY SEARCH
    • 5.5 BEAM SEARCH
    • 5.6 LOCAL OPTIMIZATION
    • 5.7 GRADIENT SEARCH
    • 5.8 SIMULATED ANNEALING
    • 5.9 GENETIC ALGORITHMS
    • 5.10 SUMMARY AND FURTHER READING
  • Chapter 6: Measures for Evaluating the Quality of Attributes
    • 6.1 MEASURES FOR CLASSIFICATION AND RELATIONAL PROBLEMS
    • 6.2 MEASURES FOR REGRESSION
    • 6.3 **FORMAL DERIVATIONS AND PROOFS
    • 6.4 SUMMARY AND FURTHER READING
  • Chapter 7: Data Preprocessing
    • 7.1 REPRESENTATION OF COMPLEX STRUCTURES
    • 7.2 DISCRETIZATION OF CONTINUOUS ATTRIBUTES
    • 7.3 ATTRIBUTE BINARIZATION
    • 7.4 TRANSFORMING DISCRETE ATTRIBUTES INTO CONTINUOUS
    • 7.5 DEALING WITH MISSING VALUES
    • 7.6 VISUALIZATION
    • 7.7 DIMENSIONALITY REDUCTION
    • 7.8 **FORMAL DERIVATIONS AND PROOFS
    • 7.9 SUMMARY AND FURTHER READING
  • Chapter 8: *Constructive Induction
    • 8.1 DEPENDENCE OF ATTRIBUTES
    • 8.2 CONSTRUCTIVE INDUCTION WITH PRE-DEFINED OPERATORS
    • 8.3 CONSTRUCTIVE INDUCTION WITHOUT PRE-DEFINED OPERATORS
    • 8.4 SUMMARY AND FURTHER READING
  • Chapter 9: Symbolic Learning
    • 9.1 LEARNING OF DECISION TREES
    • 9.2 LEARNING OF DECISION RULES
    • 9.3 LEARNING OF ASSOCIATION RULES
    • 9.4 LEARNING OF REGRESSION TREES
    • 9.5 *INDUCTIVE LOGIC PROGRAMMING
    • 9.6 NAIVE AND SEMI-NAIVE BAYESIAN CLASSIFIER
    • 9.7 BAYESIAN BELIEF NETWORKS
    • 9.8 SUMMARY AND FURTHER READING
  • Chapter 10: Statistical Learning
    • 10.1 NEAREST NEIGHBORS
    • 10.2 DISCRIMINANT ANALYSIS
    • 10.3 LINEAR REGRESSION
    • 10.4 LOGISTIC REGRESSION
    • 10.5 *SUPPORT VECTOR MACHINES
    • 10.6 SUMMARY AND FURTHER READING
  • Chapter 11: Artificial Neural Networks
    • 11.1 INTRODUCTION
    • 11.2 TYPES OF ARTIFICIAL NEURAL NETWORKS
    • 11.3 *HOPFIELD’S NEURAL NETWORK
    • 11.4 *BAYESIAN NEURAL NETWORK
    • 11.5 PERCEPTRON
    • 11.6 RADIAL BASIS FUNCTION NETWORKS
    • 11.7 **FORMAL DERIVATIONS AND PROOFS
    • 11.8 SUMMARY AND FURTHER READING
  • Chapter 12: Cluster Analysis
    • 12.1 INTRODUCTION
    • 12.2 MEASURES OF DISSIMILARITY
    • 12.3 HIERARCHICAL CLUSTERING
    • 12.4 PARTITIONAL CLUSTERING
    • 12.5 MODEL-BASED CLUSTERING
    • 12.6 OTHER CLUSTERING METHODS
    • 12.7 SUMMARY AND FURTHER READING
  • Chapter 13: **Learning Theory
    • 13.1 COMPUTABILITY THEORY AND RECURSIVE FUNCTIONS
    • 13.2 FORMAL LEARNING THEORY
    • 13.3 PROPERTIES OF LEARNING FUNCTIONS
    • 13.4 PROPERTIES OF INPUT DATA
    • 13.5 CONVERGENCE CRITERIA
    • 13.6 IMPLICATIONS FOR MACHINE LEARNING
    • 13.7 SUMMARY AND FURTHER READING
  • Chapter 14: **Computational Learning Theory
    • 14.1 INTRODUCTION
    • 14.2 GENERAL FRAMEWORK FOR CONCEPT LEARNING
    • 14.3 PAC LEARNING MODEL
    • 14.4 VAPNIK-CHERVONENKIS DIMENSION
    • 14.5 LEARNING IN THE PRESENCE OF NOISE
    • 14.6 EXACT AND MISTAKE BOUNDED LEARNING MODELS
    • 14.7 INHERENT UNPREDICTABILITY AND PAC-REDUCTIONS
    • 14.8 WEAK AND STRONG LEARNING
    • 14.9 SUMMARY AND FURTHER READING
  • Appendix A: *Definitions of some lesser known terms
    • A.1 COMPUTATIONAL COMPLEXITY CLASSES
    • A.2 ASYMPTOTIC NOTATION
    • A.3 SOME BOUNDS FOR PROBABILISTIC ANALYSIS
    • A.4 COVARIANCE MATRIX
  • References
  • Index

Subject Areas: Machine learning [UYQM], Databases [UN], Library, archive & information management [GLC]
