Title:
Machine learning and metaheuristic computation
Author:
Cuevas, Erik, author.
ISBN:
9781394229680

9781394229673
Physical Description:
1 online resource (431 pages).
Contents:
About the Authors -- Preface -- Acknowledgments -- Introduction -- 1 Fundamentals of Machine Learning -- 1.1 Introduction -- 1.2 Different Types of Machine Learning Approaches -- 1.3 Supervised Learning -- 1.4 Unsupervised Learning -- 1.5 Reinforcement Learning -- 1.6 Which Algorithm to Apply? -- 1.7 Recommendation to Build a Machine Learning Model -- References -- 2 Introduction to Metaheuristics Methods -- 2.1 Introduction -- 2.2 Classic Optimization Methods -- 2.3 Descending Gradient Method -- 2.4 Metaheuristic Methods -- 2.5 Exploitation and Exploration -- 2.6 Acceptance and Probabilistic Selection -- 2.7 Random Search -- 2.8 Simulated Annealing -- References -- 3 Fundamental Machine Learning Methods -- 3.1 Introduction -- 3.2 Regression -- 3.2.1 Explanatory Purpose -- 3.2.2 Predictive Purpose -- 3.3 Classification -- 3.3.1 Relationship Between Regression and Classification -- 3.3.2 Differences Between Regression and Classification -- 3.4 Decision Trees -- 3.4.1 Procedure of Classification -- 3.4.2 Determination of the Splitting Point -- 3.4.2.1 Gini Index -- 3.4.2.2 Entropy -- 3.4.3 Example of Classification -- 3.5 Bayesian Classification -- 3.5.1 Conditional Probability -- 3.5.2 Classification of Fraudulent Financial Reports -- 3.5.3 Practical Constraints by Using the Exact Bayes Method -- 3.5.4 Naive Bayes Method -- 3.5.5 Computational Experiment -- 3.6 k-Nearest Neighbors (k-NN) -- 3.6.1 k-NN for Classification -- 3.6.2 k-NN for Regression -- 3.7 Clustering -- 3.7.1 Similarity Indexes -- 3.7.2 Methods for Clustering -- 3.8 Hierarchical Clustering -- 3.8.1 Implementation in MATLAB -- 3.9 K-Means Algorithm -- 3.9.1 Implementation of K-Means Method in MATLAB -- 3.10 Expectation-Maximization Method -- 3.10.1 Gaussian Mixture Models -- 3.10.2 Maximum Likelihood Estimation -- 3.10.3 EM in One Dimension -- 3.10.3.1 Initialization -- 3.10.3.2 Expectation -- 3.10.3.3 Maximization -- 3.10.4 Numerical Example -- 3.10.5 EM in Several Dimensions -- References -- 4 Main 
Metaheuristic Techniques -- 4.1 Introduction -- 4.1.1 Use of Metaphors -- 4.1.2 Problems of the Use of Metaphors -- 4.1.3 Metaheuristic Algorithms -- 4.2 Genetic Algorithms -- 4.2.1 Canonical Genetic Algorithm -- 4.2.2 Selection Process -- 4.2.3 Binary Crossover Process -- 4.2.4 Binary Mutation Process -- 4.2.5 Implementation of the Binary GA -- 4.2.6 Genetic Algorithm Utilizing Real-Valued Parameters -- 4.2.7 Crossover Operator for Real-Valued Parameters -- 4.2.8 Mutation Operator for Real-Valued Parameters -- 4.2.9 Computational Implementation of the GA with Real Parameters -- 4.3 Particle Swarm Optimization (PSO) -- 4.3.1 Strategy for Searching in Particle Swarm Optimization -- 4.3.2 Analysis of the PSO Algorithm -- 4.3.3 Inertia Weighting -- 4.3.4 Particle Swarm Optimization Algorithm Using MATLAB -- 4.4 Differential Evolution (DE) Algorithm -- 4.4.1 The Search Strategy of DE -- 4.4.2 The Mutation Operation in DE -- 4.4.2.1 Mutation Rand/1 -- 4.4.2.2 Mutation Best/1 -- 4.4.2.3 Mutation Rand/2 -- 4.4.2.4 Mutation Best/2 -- 4.4.2.5 Mutation Current-to-Best/1 -- 4.4.3 The Crossover Operation in DE -- 4.4.4 The Selection Operation in DE -- 4.4.5 Implementation of DE in MATLAB -- References -- 5 Metaheuristic Techniques for Fine-Tuning Parameter of Complex Systems -- 5.1 Introduction -- 5.2 Differential Evolution (DE) -- 5.2.1 Mutation -- 5.2.1.1 Mutation Best/1 -- 5.2.1.2 Mutation Rand/2 -- 5.2.1.3 Mutation Best/2 -- 5.2.1.4 Mutation Current-to-Best/1 -- 5.2.2 Crossover -- 5.2.3 Selection -- 5.3 Adaptive Network-Based Fuzzy Inference System (ANFIS) -- 5.4 Differential Evolution for Fine-Tuning ANFIS Parameters Setting -- References -- 6 Techniques of Machine Learning for Producing Metaheuristic Operators -- 6.1 Introduction -- 6.2 Hierarchical Clustering -- 6.2.1 Agglomerative Hierarchical Clustering Algorithm -- 6.3 Chaotic Sequences -- 6.4 Cluster-Chaotic-Optimization (CCO) -- 6.4.1 Initialization -- 6.4.2 Clustering -- 6.4.3 Intra-Cluster Procedure --
6.4.3.1 Local Attraction Movement -- 6.4.3.2 Local Perturbation Strategy -- 6.4.3.3 Extra-Cluster Procedure -- 6.4.3.4 Global Attraction Movement -- 6.4.3.5 Global Perturbation Strategy -- 6.5 Computational Procedure -- 6.6 Implementation of the CCO Algorithm in MATLAB -- 6.7 Spring Design Optimization Problem Using the CCO Algorithm in MATLAB -- References -- 7 Techniques of Machine Learning for Modifying the Search Strategy -- 7.1 Introduction -- 7.2 Self-Organization Map (SOM) -- 7.2.1 Network Architecture -- 7.2.2 Competitive Learning Model -- 7.2.2.1 Competition Procedure -- 7.2.2.2 Cooperation Procedure -- 7.2.2.3 Synaptic Adaptation Procedure -- 7.2.3 Self-Organization Map (SOM) Algorithm -- 7.2.4 Application of Self-Organization Map (SOM) -- 7.3 Evolutionary-SOM (EA-SOM) -- 7.3.1 Initialization -- 7.3.2 Training -- 7.3.3 Knowledge Extraction -- 7.3.4 Solution Production -- 7.3.5 New Training Set Construction -- 7.4 Computational Procedure -- 7.5 Implementation of the EA-SOM Algorithm in MATLAB -- 7.6 Gear Design Optimization Problem Using the EA-SOM Algorithm in MATLAB -- References -- 8 Techniques of Machine Learning Mixed with Metaheuristic Methods -- 8.1 Introduction -- 8.2 Flower Pollination Algorithm (FPA) -- 8.2.1 Global Rule and Lévy Flight -- 8.2.2 Local Rule -- 8.2.3 Elitist Selection Procedure -- 8.3 Feedforward Neural Networks (FNNs) -- 8.3.1 Perceptron -- 8.3.2 Feedforward Neural Networks (FNNs) -- 8.4 Training an FNN Using FPA -- References -- 9 Metaheuristic Methods for Classification -- 9.1 Introduction -- 9.2 Crow Search Algorithm (CSA) -- 9.3 CSA for Nearest-Neighbor Method (k-NN) -- 9.4 CSA for Logistic Regression -- 9.5 CSA for Fisher Linear Discriminant -- 9.6 CSA for Naïve Bayes Classification -- 9.7 CSA for Support Vector Machine -- References -- 10 Metaheuristic Methods for Clustering -- 10.1 Introduction -- 10.2 Cuckoo Search Method (CSM) -- 10.3 Search Strategy for CSM -- 10.3.1 Initialization -- 10.3.2 Lévy Flight -- 10.3.3 
Solution Replacement -- 10.3.4 Elitist Selection -- 10.4 Computational Procedure -- 10.4.1 Metaheuristic Operators for CSM -- 10.5 Implementation of the CSM in MATLAB -- 10.6 Cuckoo Search Method for K-Means -- 10.6.1 Implementation of KM Algorithm in MATLAB -- 10.6.2 Cuckoo Search Method for K-Means -- 10.6.2.1 Implementation of CSM to KM Clustering in MATLAB -- References -- 11 Metaheuristic Methods for Dimensional Reduction -- 11.1 Introduction -- 11.2 Ant Colony Optimization (ACO) -- 11.2.1 Pheromone Representation -- 11.2.2 Ant-Based Solution Construction -- 11.2.3 Pheromone Update -- 11.3 Dimensionality Reduction -- 11.4 ACO for Feature Selection -- References -- 12 Metaheuristic Methods for Regression -- 12.1 Introduction -- 12.2 Genetic Algorithm (GA) -- 12.2.1 Computational Structure -- 12.2.2 Initialization -- 12.2.3 Selection Method -- 12.2.3.1 Roulette Wheel Selection -- 12.2.3.2 Stochastic Remainder Selection -- 12.2.3.3 Rank-Based Selection -- 12.2.3.4 Tournament Selection -- 12.2.4 Crossover -- 12.2.5 Mutation -- 12.3 Neural Network Regression with Artificial Genetic -- 12.4 Linear Regression Employing an Artificial Genetic -- References -- Index.
Notes:
John Wiley and Sons
Hold:
Copies:

On Shelf:*
Material Type: E-Book
Accession Number: 599530-1001
Call Number: Q325.5 .C84 2025 EB
Status/Due Date: On Order
