Learning systems
The book is devoted to methods of constructing learning computer programs, that is, programs able to improve their performance on the basis of past experience. Programs of this kind acquire knowledge automatically and use it to carry out the tasks set before them.
The author introduces the reader to learning systems from an algorithmic point of view. He explains what inductive learning is about and presents its two main approaches: the induction of decision trees and the induction of decision rules. He discusses probabilistic learning methods and conceptual clustering, deals with the problem of transforming attribute sets, presents selected algorithms for learning function approximation, and describes the discovery of dependencies in data, automaton learning, and reinforcement learning.
The value of the book is enhanced by numerous exercises, divided into traditional, laboratory, and design exercises.
At http://www.ise.pw.edu.pl/~cichosz/SU the reader will find a collection of Common Lisp programs implementing some of the algorithms discussed in the book, along with links to other materials on learning systems available on the Web.
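Of the two main inductive approaches named above, the induction of decision trees is the easiest to illustrate briefly. The following minimal Python sketch (an illustration added here, not code from the book or from its Common Lisp companion programs) shows the test-selection step of top-down decision tree induction covered in chapter 3: it computes the information gain of each candidate attribute on a small, made-up training set; a tree builder would split on the attribute with the largest gain and recurse. A second sketch, illustrating the Q-learning update rule from chapter 13, follows the table of contents.

# Illustrative sketch only: the training set, attribute names and categories are made up.
from collections import Counter
from math import log2

EXAMPLES = [  # each example is (attribute values, category)
    ({"outlook": "sunny", "windy": "no"}, "play"),
    ({"outlook": "sunny", "windy": "yes"}, "stay"),
    ({"outlook": "rainy", "windy": "yes"}, "stay"),
    ({"outlook": "rainy", "windy": "no"}, "play"),
    ({"outlook": "overcast", "windy": "no"}, "play"),
    ({"outlook": "overcast", "windy": "yes"}, "play"),
]

def entropy(examples):
    # Entropy of the category distribution in a set of examples.
    counts = Counter(category for _, category in examples)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

def information_gain(examples, attribute):
    # Expected reduction of entropy after splitting the examples on one attribute.
    total = len(examples)
    remainder = 0.0
    for value in {attrs[attribute] for attrs, _ in examples}:
        subset = [e for e in examples if e[0][attribute] == value]
        remainder += len(subset) / total * entropy(subset)
    return entropy(examples) - remainder

if __name__ == "__main__":
    for attribute in ("outlook", "windy"):
        print(attribute, round(information_gain(EXAMPLES, attribute), 3))

Running the sketch prints the gain of each candidate attribute; the higher-scoring one would become the test at the root of the tree.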
Table of Contents
Preface
Acknowledgments
List of important symbols
1 Learning in an algorithmic approach
1.1. The importance of learning
1.1.1 Definition of learning
1.1.2. Learning programs
1.1.3 Examples of machine learning
1.1.4. Motivation to learn
1.2 Types of learning
1.2.1. Taxonomy of machine learning
1.2.2. Main branches of machine learning
1.3 Learning about learning
1.3.1 Three trends
1.3.2 Related domains
1.4. Learning in artificial intelligence
1.4.1. Weak and strong artificial intelligence
1.4.2 Main branches of artificial intelligence
1.4.3 Learning knowledge for inference
1.4.4 Learning of heuristics
1.5 Learning as reasoning
1.5.1 Types of inference
1.5.2. Transmutations of knowledge
1.5.3 Learning as a search
1.6 Applications of learning systems
1.7 A philosophical perspective
1.8 Summary
1.9 Historical and bibliographic notes
1.10 Exercises
2 Inductive learning
2.1 Inductive inference
2.2. Main types of inductive learning
2.2.1. Learning concepts
2.2.2 Creating concepts
2.2.3 Learning function approximation
2.2.4 Learning modes
2.2.5 Inductive bias
2.2.6 Other types of inductive learning
2.3. Algorithmic theory of induction
2.3.1 Estimating hypothesis errors
2.3.2 The PAC model
2.3.3 PAC-learnability
2.3.4 Requirements on the number of examples
2.3.5 The Vapnik-Chervonenkis dimension
2.3.6 Ockham's razor
2.3.7 The mistake-bound model
2.4 Learning the version space
2.4.1 Partial order of hypotheses
2.4.2. Representation of hypotheses by complexes
2.4.3 Representation of the version space
2.4.4 The candidate elimination algorithm
2.4.5 Using the version space
2.4.6 Selection of training examples
2.4.7 The bias of the CAE (candidate elimination) algorithm
2.5 Practical issues
2.6 Implementation issues
2.6.1. Representation of attribute information
2.6.2 Representation of examples
2.7 Summary
2.8 Historical and bibliographic notes
2.9 Exercises
3 Induction of decision trees
3.1 Decision trees as hypotheses
3.1.1 Tree structure
3.1.2 Notation for decision trees
3.1.3 Advantages and limitations of decision trees
3.2 Top-down construction of the tree
3.2.1 Stopping and labeling criterion
3.2.2 Types of tests
3.2.3 Test selection criterion
3.2.4 Candidate tests
3.3 Pruning the tree
3.3.1 Pruning scheme
3.3.2 Pruning criteria
3.3.3 Probabilistic leaves
3.4 Computational complexity
3.5 Practical issues
3.5.1 Large sets of examples
3.5.2. Attributes with many values
3.5.3 Missing attribute values
3.5.4 Incremental construction of a tree
3.6 Implementation issues
3.6.1 Tree representation
3.6.2 Recursion
3.6.3 Passing arguments
3.6.4 Evaluation of inequality tests
3.7 Summary
3.8 Historical and bibliographic notes
3.9 Exercises
4 Rule induction
4.1 Rule sets as hypotheses
4.1.1 The logical basis of rule representation
4.1.2 Representation of conditions
4.1.3 Rule sets
4.1.4 Complexity of hypotheses
4.2 Sequential covering
4.3 The AQ algorithm
4.3.1 Search strategy
4.3.2 Star specialization
4.3.3 Assessment of complexes
4.3.4 Selection of seeds
4.3.5 Choosing a category
4.4. The CN2 algorithm
4.4.1 Search strategy
4.4.2 Evaluation of complexes
4.4.3 Selecting a category
4.5 Pruning rule sets
4.5.1 Pruning scheme
4.5.2 Pruning criteria
4.6 Computational complexity
4.7 Practical issues
4.7.1 Incorrect training data
4.7.2. Effective specialization
4.7.3 Ordinal and continuous attributes
4.7.4 Missing attribute values
4.7.5 Incremental rule induction
4.8 Implementation issues
4.8.1. Representation of complexes
4.8.2. Representation of sets of complexes
4.9 Summary
4.10 Historical and bibliographic notes
4.11 Exercises
5 Probabilistic methods
5.1. Probability in artificial intelligence
5.2 Bayes' theorem
5.3 Bayesian classification
5.3.1 Choosing a hypothesis
5.3.2 Probability of data
5.3.3 The optimal Bayesian classifier
5.3.4 Bayes classification in practice
5.3.5 Naive Bayesian classifier
5.4 Bayesian networks
5.4.1 Terminology
5.4.2 Network structure and semantics
5.4.3 Inference in Bayesian networks
5.4.4. Induction of Bayesian networks
5.5 The minimum code length principle
5.5.1 Probability and the length of the code
5.5.2 Induction as data compression
5.5.3 Coding
5.5.4 Data coding
5.5.5 Applying the minimum code length principle
5.6 Practical issues
5.6.1 A priori probabilities
5.6.2. Continuous attributes
5.7 Implementation issues
5.8 Summary
5.9 Historical and bibliographic notes
5.10 Exercises
6 Conceptual clustering
6.1 Clustering as knowledge
6.1.1 Clustering by similarity
6.1.2 From clustering to concepts
6.2 Clustering with covers
6.2.1 Complexes as a representation of clustering
6.2.2 Determining complexes describing categories
6.2.3 Assessing the quality of clustering
6.2.4 Number of categories
6.2.5 Classifying examples
6.3 Probabilistic clustering
6.3.1 Conceptual clustering as a search
6.3.2 The clustering function
6.3.3 Clustering representation
6.3.4 Operators
6.3.5 Control strategies
6.3.6 Classifying examples
6.4 Clustering with continuous attributes
6.4.1 The CLUSTER/2 algorithm and continuous attributes
6.4.2 The COBWEB algorithm and continuous attributes
6.4.3 Clustering with continuous attributes only
6.5 Clustering as data compression
6.5.1 Hypothesis coding
6.5.2 Data coding
6.6 Practical issues
6.6.1 Clustering in supervised learning
6.6.2 Large data sets
6.7 Implementation issues
6.8 Summary
6.9 Historical and bibliographic notes
6.10 Exercises
7 Transforming attributes
7.1 Attribute space and hypothesis space
7.1.1 Information content of attributes
7.1.2 The role of attributes in clustering
7.1.3 Attributes and representation
7.1.4 Types of transformations
7.2 Discretization of continuous attributes
7.2.1 Discretization and aggregation of ordinal attributes
7.2.2 Benefits of discretization
7.2.3 Types of discretization
7.2.4 Intervals and threshold values
7.2.5 Primitive discretization methods
7.2.6 Top-down discretization
7.2.7 Bottom-up discretization
7.2.8 Discretization as data compression
7.2.9 Unsupervised discretization
7.2.10 Aggregation of ordinal attributes
7.3 Constructive induction
7.3.1 Types of constructive induction
7.3.2 Constructive induction based on data
7.3.3 Constructive induction based on hypotheses
7.4 Practical issues
7.4.1 The purposefulness of discretization
7.4.2 Selecting the discretization method
7.4.3 Number of discretization intervals
7.4.4 Attributes of unknown contradiction
7.4.5 Applying constructive induction
7.5 Implementation issues
7.5.1 Implementation of top-down discretization
7.5.2 Implementation of bottom-up discretization
7.6 Summary
7.7 Historical and bibliographic notes
7.8 Exercises
8 Learning function approximation
8.1. Approach
8.1.1. Features of examples
8.1.2 Approximated mapping
8.1.3. Assessment of hypotheses
8.1.4 Modes and speed of learning
8.1.5. Training information
8.2 Types of approximators
8.3 Parametric methods
8.3.1. Calculation of the function value
8.3.2. Updating the function value
8.3.3. Linear approximator
8.3.4 Non-linear approximators
8.3.5 Exponentiated-gradient methods
8.3.6 Regression
8.3.7 Discrete attributes
8.4 Extended representation
8.4.1 Extending the description of examples
8.4.2 Randomized representation
8.4.3 Distributed approximate coding
8.5 Memory-based methods
8.5.1 Memory and its use
8.5.2 Nearest neighbor
8.5.3 Locally weighted averages
8.5.4 Local regression
8.6 Symbolic methods
8.6.1 Modeling hypotheses
8.6.2 Modeling trees
8.7 Practical issues
8.7.1 Selection of step size
8.7.2 CMAC and an unbounded number of features
8.7.3 Mixed attribute spaces
8.7.4 Limited memory in memory-based methods
8.8 Implementation issues
8.8.1 Weights in distributed approximate tables
8.8.2 Remembering examples
8.9 Summary
8.10 Historical and bibliographic notes
8.11 Exercises
9 Inductive logic programming
9.1 Knowledge representation in the predicate calculus
9.2. Learning concepts in the predicate calculus
9.3 Sequential coverage for predicates
9.3.1 Basic scheme
9.3.2 Generating conjunctions
9.3.3 Local training sets
9.3.4 Stopping criterion
9.3.5 Selection of literals
9.4 Inverting deduction
9.4.1 Inference by resolution
9.4.2 Inverse resolution
9.4.3 Learning by inverse resolution
9.5 Practical issues
9.5.1 Avoiding overfitting
9.5.2 Incorrect training data
9.5.3 Propositional concepts
9.6 Implementation issues
9.6.1. Representation of clauses
9.6.2 Literal representation
9.7 Summary
9.8 Historical and bibliographic notes
9.9 Exercises
10 Making discoveries
10.1 Knowledge and data
10.1.1. Relationships in data as knowledge
10.1.2 Dependencies in relational databases
10.1.3 Learning as discovering knowledge
10.2 The use of learning algorithms
10.2.1 Large data sets
10.2.2 Numerous attributes
10.2.3. Numerous categories
10.2.4 Uneven distribution of categories
10.2.5 Incremental update
10.2.6. Incomplete data
10.2.7 Incorrect data
10.2.8 Metering
10.3 Discovering associations
10.3.1 Association rules
10.3.2 Contingency tables
10.3.3. Patterns in contingency tables
10.3.4 From patterns to association rules
10.3.5 Searching for patterns
10.3.6 Association rules for many attributes
10.4 Discovering equations
10.4.1 Terminology and notation for equations
10.4.2 Equations with two variables
10.4.3 Equations with many variables
10.5 Practical issues
10.6 Implementation issues
10.6.1 Data structures for large training sets
10.7 Summary
10.8 Historical and bibliographic notes
10.9 Exercises
11 Explanation-based learning
11.1 Innate knowledge in learning
11.1.1 The use of innate knowledge in induction
11.1.2 Learning as a compilation of innate knowledge
11.1.3 Bias in explanation-based learning
11.2 Explanation-based generalization
11.2.1. Formulating the task
11.2.2 The EBG algorithm
11.3. Improving problem-solving
11.3.1 Problem solving in artificial intelligence
11.3.2 Problem solving and planning
11.3.3. Learning to solve problems
11.4. Combining explanation and induction
11.4.1 Integration of induction and deduction
11.4.2 Deductive specialization
11.5 Practical issues
11.5.1 Imperfect theory
11.5.2 Incorrect examples
11.5.3. Negative examples
11.6 Implementation issues
11.7 Summary
11.8 Historical and bibliographic notes
11.9 Exercises
12 Automaton learning
12.1 The automaton as a language and environment model
12.1.1 Finite automata
12.1.2 Linguistic perspective
12.1.3 Identification perspective
12.1.4 The automaton learning task
12.2. Learning based on queries
12.2.1 Queries and answers
12.2.2 The L* algorithm
12.2.3 Weakening training information
12.3. Learning from experiments
12.3.1. Sequencing sequences algorithms
12.3.2 The equivalence testing algorithm
12.4 Practical issues
12.4.1. Non-determinism of the target machine
12.4.2 Improving equivalence testing
12.5 Implementation issues
12.5.1. Representation of the observation table
12.5.2. Representation of test sets
12.6 Summary
12.7 Historical and bibliographic notes
12.8 Exercises
13 Reinforcement learning
13.1 The reinforcement learning task
13.1.1 Learning with evaluative feedback
13.1.2 Scenario
13.1.3 Environment
13.1.4 The learner's task
13.1.5 Episodic tasks
13.1.6 Learning modes
13.1.7 The specifics of reinforcement learning
13.2 Markov decision processes
13.2.1 Markov property
13.2.2 Strategies and value functions
13.2.3 Optimizing the strategy
13.2.4 Markov problems and reinforcement learning
13.3 Dynamic programming
13.3.1 Bellman equations
13.3.2 Evaluating the strategy
13.3.3 Determining the optimal strategy
13.3.4 Dynamic programming and learning
13.4. Learning the value function
13.4.1 The TD algorithm
13.4.2 Convergence
13.5 Learning strategy
13.5.1 AHC algorithm
13.5.2 The Q-learning algorithm
13.5.3 The Sarsa algorithm
13.6 Action selection
13.6.1 Probabilistic strategies
13.6.2 Counter-based strategies
13.7 Function representation
13.8 Accelerating learning: TD(λ > 0)
13.8.1 Eligibility traces
13.8.2 TD(λ) returns
13.8.3 Learning strategy
13.8.4 The role and selection of λ
13.9 Learning to solve problems
13.9.1. Problem as an environment
13.9.2 Solution as a strategy
13.9.3 Value function as a heuristic
13.10 Practical issues
13.10.1 Hidden state
13.10.2 Active perception
13.10.3. Learning and planning
13.10.4 Hierarchical learning
13.10.5 Innate knowledge
13.11 Implementation issues
13.11.1 Efficient eligibility traces
13.11.2 Eliminating eligibility traces
13.12 Summary
13.13 Historical and bibliographic notes
13.14 Exercises
14 Conclusion
14.1 From algorithms to systems
14.1.1 Stages of constructing the system
14.2 List of omissions
14.2.1 Neural networks
14.2.2 Evolutionary algorithms
14.2.3. Specific issues
14.3. Directions of research
14.3.1. Theoretical challenges
14.3.2 Conceptual challenges
14.3.3 Practical challenges
14.4 The final word
14.5 Summary
14.6 Historical and bibliographic notes
14.7 Exercises
A Solutions and hints for the exercises
A.1 Chapter 1
A.2 Chapter 2
A.3 Chapter 3
A.4 Chapter 4
A.5 Chapter 5
A.6 Chapter 6
A.7 Chapter 7
A.8 Chapter 8
A.9 Chapter 9
A.10. Chapter 10
A.11 Chapter 11
A.12 Chapter 12
A.13 Chapter 13
A.14 Chapter 14
B Basics of probability theory
B.1 Events
B.2 Probability axioms
B.3 Conditional probability
B.4 Random variables
B.4.1 Discrete random variables
B.4.2 Continuous random variables
B.5 Important probability distributions
C Basics of predicate logic
C.1. Syntax of the predicate language
C.2. Semantics of the predicate language
C.3 Inference in the predicate calculus
C.4. Predicate calculus in learning
D The book's Web page
Bibliography
Index
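As promised after the book description, here is a second minimal Python sketch, this time for the Q-learning update rule named in section 13.5.2, combined with an epsilon-greedy choice of actions (one simple example of a probabilistic action-selection strategy, cf. section 13.6). It is an added illustration under made-up assumptions (a two-state toy environment, arbitrary parameter values), not code from the book.

# Illustrative sketch only: the environment, rewards and parameter values are made up.
import random
from collections import defaultdict

ALPHA = 0.1    # learning rate (step size)
GAMMA = 0.9    # discount factor
EPSILON = 0.2  # probability of taking a random, exploratory action

ACTIONS = ["left", "right"]
Q = defaultdict(float)  # Q[(state, action)] -> estimated action value

def choose_action(state):
    # Epsilon-greedy choice: usually greedy, sometimes random to keep exploring.
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def q_update(state, action, reward, next_state):
    # One Q-learning step: move Q(s, a) towards reward + GAMMA * max over a' of Q(s', a').
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

if __name__ == "__main__":
    # Toy problem: taking "right" in state 0 yields a reward of 1, "left" yields nothing.
    for _ in range(1000):
        state = 0
        action = choose_action(state)
        reward, next_state = (1.0, 1) if action == "right" else (0.0, 0)
        q_update(state, action, reward, next_state)
    print({action: round(Q[(0, action)], 2) for action in ACTIONS})

In this toy setting the estimated value of "right" in state 0 approaches the reward of 1, while "left" settles near GAMMA times that value, which is the kind of behaviour the temporal-difference updates covered in chapter 13 are designed to produce.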