Hands-On Neural Network Programming with C#

Book description

Create and unleash the power of neural networks by implementing them in C# and .NET code

Key Features

  • Get a strong foundation in neural networks, with access to various machine learning and deep learning libraries
  • Real-world case studies illustrating various neural network techniques and architectures used by practitioners
  • Cutting-edge coverage of deep networks, optimization algorithms, convolutional networks, autoencoders, and more

Book Description

Neural networks have made a surprising comeback in the last few years, bringing tremendous innovation to the world of artificial intelligence.

The goal of this book is to provide C# programmers with practical guidance on solving complex computational challenges using neural networks and C# libraries such as CNTK and TensorFlowSharp. It takes you on a step-by-step practical journey, covering everything from the mathematical and theoretical aspects of neural networks to integrating your own deep neural networks into your applications with C# and the .NET Framework.

This book begins with a quick refresher on neural networks. You will learn how to build a neural network from scratch using packages such as Encog, AForge.NET, and Accord.NET, and you will explore concepts and techniques such as deep networks, perceptrons, optimization algorithms, convolutional networks, and autoencoders. You will also learn ways to add intelligent features to your .NET apps, such as facial and motion detection, object detection and labeling, language understanding, knowledge, and intelligent search.

Throughout this book, you will be working on interesting demonstrations that will make it easier to implement complex neural networks in your enterprise applications.

What you will learn

  • Understand perceptrons and how to implement them in C# (a minimal sketch follows this list)
  • Learn how to train and visualize a neural network using cognitive services
  • Perform image recognition for detecting and labeling objects using C# and TensorFlowSharp
  • Detect specific image characteristics, such as a face, using Accord.NET
  • Demonstrate particle swarm optimization using a simple XOR problem and Encog
  • Train convolutional neural networks using ConvNetSharp
  • Find optimal parameters for your neural network functions using numeric and heuristic optimization techniques
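
As a taste of the first item above, here is a minimal, self-contained sketch of a single perceptron with a step activation, trained with the classic perceptron learning rule on the logical AND function. The class design, learning rate, and training loop are illustrative assumptions for this listing, not code from the book, which builds its networks with libraries such as Encog and Accord.NET.

  using System;

  // Minimal single perceptron: weighted sum, step activation,
  // and the classic perceptron weight-update rule.
  class Perceptron
  {
      private readonly double[] weights;
      private double bias;
      private readonly double learningRate;

      public Perceptron(int inputCount, double learningRate = 0.1)
      {
          weights = new double[inputCount];
          this.learningRate = learningRate;
      }

      // Step activation: output 1 if the weighted sum reaches the threshold, else 0.
      public int Predict(double[] x)
      {
          double sum = bias;
          for (int i = 0; i < weights.Length; i++)
              sum += weights[i] * x[i];
          return sum >= 0 ? 1 : 0;
      }

      // Nudge weights and bias in proportion to the prediction error.
      public void Train(double[] x, int target)
      {
          int error = target - Predict(x);
          for (int i = 0; i < weights.Length; i++)
              weights[i] += learningRate * error * x[i];
          bias += learningRate * error;
      }
  }

  class Program
  {
      static void Main()
      {
          // Learn logical AND from its four input/output pairs.
          double[][] inputs =
          {
              new double[] { 0, 0 }, new double[] { 0, 1 },
              new double[] { 1, 0 }, new double[] { 1, 1 }
          };
          int[] targets = { 0, 0, 0, 1 };

          var perceptron = new Perceptron(2);
          for (int epoch = 0; epoch < 20; epoch++)
              for (int i = 0; i < inputs.Length; i++)
                  perceptron.Train(inputs[i], targets[i]);

          foreach (var x in inputs)
              Console.WriteLine($"{x[0]} AND {x[1]} -> {perceptron.Predict(x)}");
      }
  }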

Who this book is for

This book is for machine learning engineers, data scientists, deep learning aspirants, and data analysts who are looking to move into advanced machine learning and deep learning with C#. Prior knowledge of machine learning and working experience with C# programming are required to get the most out of this book.

Table of contents

  1. Title Page
  2. Copyright and Credits
    1. Hands-On Neural Network Programming with C#
  3. Dedication
  4. Packt Upsell
    1. Why subscribe?
    2. Packt.com
  5. Contributors
    1. About the author
    2. About the reviewers
    3. Packt is searching for authors like you
  6. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
      1. Download the example code files
      2. Download the color images
    4. Code in Action
      1. Conventions used
    5. Get in touch
      1. Reviews
  7. A Quick Refresher
    1. Technical requirements
    2. Neural network overview
      1. Neural network training
      2. A visual guide to neural networks
    3. The role of neural networks in today's enterprises
    4. Types of learning
      1. Supervised learning
      2. Unsupervised learning
      3. Reinforcement learning
    5. Understanding perceptrons
      1. Is this useful?
    6. Understanding activation functions
      1. Visual activation function plotting
      2. Function plotting
    7. Understanding back propagation
      1. Forward and back propagation differences
    8. Summary
    9. References
  8. Building Our First Neural Network Together
    1. Technical requirements
    2. Our neural network
    3. Neural network training
      1. Synapses
      2. Neurons
      3. Forward propagation
      4. Sigmoid function
      5. Backward propagation
      6. Calculating errors
      7. Calculating a gradient
      8. Updating weights
      9. Calculating values
    4. Neural network functions
      1. Creating a new network
      2. Importing an existing network
      3. Importing datasets
      4. Testing the network
      5. Exporting the network
      6. Training the network
      7. Testing the network
      8. Computing forward propagation
      9. Exporting the network
      10. Exporting a dataset
    5. The neural network
      1. Neuron connection
    6. Examples
      1. Training to a minimum
      2. Training to a maximum
    7. Summary
  9. Decision Trees and Random Forests
    1. Technical requirements
    2. Decision trees
      1. Decision tree advantages
      2. Decision tree disadvantages
      3. When should we use a decision tree?
    3. Random forests
      1. Random forest advantages
      2. Random forest disadvantages
      3. When should we use a random forest?
    4. SharpLearning
      1. Terminology
      2. Loading and saving models
    5. Example code and applications
      1. Saving a model
      2. Mean squared error regression metric
      3. F1 score
      4. Optimizations
      5. Sample application 1
        1. The code
      6. Sample application 2 – wine quality
        1. The code
    6. Summary
    7. References
  10. Face and Motion Detection
    1. Technical requirements
    2. Facial detection
    3. Motion detection
      1. Code
    4. Summary
  11. Training CNNs Using ConvNetSharp
    1. Technical requirements
    2. Getting acquainted 
    3. Filters
    4. Creating a network
      1. Example 1 – a simple example
      2. Example 2 – another simple example
      3. Example 3 – our final simple example
      4. Using the Fluent API
    5. GPU
    6. Fluent training with the MNIST database
    7. Training the network
      1. Testing the data
      2. Predicting data
      3. Computational graphs
    8. Summary
    9. References
  12. Training Autoencoders Using RNNSharp
    1. Technical requirements
    2. What is an autoencoder?
    3. Different types of autoencoder
      1. Standard autoencoder
      2. Variational autoencoders
      3. De-noising autoencoders
      4. Sparse autoencoders
    4. Creating your own autoencoder
    5. Summary
    6. References
  13. Replacing Back Propagation with PSO
    1. Technical requirements
    2. Basic theory
      1. Swarm intelligence
      2. Particle Swarm Optimization
        1. Types of Particle Swarm Optimizations
        2. Original Particle Swarm Optimization strategy
        3. Particle Swarm Optimization search strategy
          1. Particle Swarm Optimization search strategy pseudo-code
        4. Parameter effects on optimization
    3. Replacing back propagation with Particle Swarm Optimization
    4. Summary
  14. Function Optimizations: How and Why
    1. Technical requirements
    2. Getting started
    3. Function minimization and maximization
      1. What is a particle?
      2. Swarm initialization
      3. Chart initialization
      4. State initialization
      5. Controlling randomness
      6. Updating the swarm position
      7. Updating the swarm speed
      8. Main program initialization
      9. Running Particle Swarm Optimization
      10. Our user interface
        1. Run button
        2. Rewind button
        3. Back button
        4. Play button
        5. Pause button
        6. Forward button
    4. Hyperparameters and tuning
      1. Function
      2. Strategy
      3. Dim size
      4. Upper bound
      5. Lower bound
      6. Upper bound speed
      7. Lower bound speed
      8. Decimal places
      9. Swarm size
      10. Max iterations
      11. Inertia
      12. Social weight
      13. Cognitive weight
      14. Inertia weight
    5. Understanding visualizations
      1. Understanding two-dimensional visualizations
      2. Understanding three-dimensional visualizations
    6. Plotting results
      1. Playing back results
      2. Updating the information tree
    7. Adding new optimization functions
      1. The purpose of functions
      2. Adding new functions
      3. Let's add a new function
    8. Summary
  15. Finding Optimal Parameters
    1. Technical requirements
    2. Optimization
      1. What is a fitness function?
        1. Maximization
        2. Gradient-based optimization
        3. Heuristic optimization
      2. Constraints
        1. Boundaries
        2. Penalty functions
        3. General constraints
        4. Constrained optimization phases
        5. Constrained optimization difficulties
        6. Implementation
      3. Meta-optimization
        1. Fitness normalization
        2. Fitness weights for multiple problems
        3. Advice
        4. Constraints and meta-optimization
        5. Meta-meta-optimization
    3. Optimization methods
      1. Choosing an optimizer
      2. Gradient descent (GD)
        1. How it works
        2. Drawbacks
      3. Pattern Search (PS)
        1. How it works
      4. Local Unimodal Sampling (LUS)
        1. How it works
      5. Differential Evolution (DE)
        1. How it works
      6. Particle Swarm Optimization (PSO)
        1. How it works
      7. Many Optimizing Liaisons (MOL)
      8. Mesh (MESH)
    4. Parallelism
      1. Parallelizing the optimization problem 
      2. Parallel optimization methods
        1. Necessary parameter tuning
      3. And finally, the code
      4. Performing meta-optimization
      5. Computing fitness
      6. Testing custom problems
      7. Base problem
      8. Creating a custom problem
      9. Our Custom Problem
    5. Summary
    6. References
  16. Object Detection with TensorFlowSharp
    1. Technical requirements
    2. Working with Tensors
      1. TensorFlowSharp
    3. Developing your own TensorFlow application
    4. Detecting images
      1. Minimum score for object highlighting
    5. Summary
    6. References
  17. Time Series Prediction and LSTM Using CNTK
    1. Technical requirements
    2. Long short-term memory
      1. LSTM variants
      2. Applications of LSTM
    3. CNTK terminology
    4. Our example
      1. Coding our application
        1. Loading data and graphs
        2. Loading training data
        3. Populating the graphs
        4. Splitting data
      2. Running the application
      3. Training the network
      4. Creating a model
      5. Getting the next data batch
      6. Creating a batch of data
    5. How well do LSTMs perform?
    6. Summary
    7. References
  18. GRUs Compared to LSTMs, RNNs, and Feedforward Networks
    1. Technical requirements
    2. QuickNN
    3. Understanding GRUs
    4. Differences between LSTM and GRU
      1. Using a GRU versus an LSTM
    5. Coding different networks
      1. Coding an LSTM
      2. Coding a GRU
    6. Comparing LSTM, GRU, Feedforward, and RNN operations
    7. Network differences
    8. Summary
  19. Activation Function Timings
  20. Function Optimization Reference
    1. The Currin Exponential function
      1. Description
      2. Input domain
      3. Modifications and alternative forms
    2. The Webster function
      1. Description
      2. Input distributions
    3. The Oakley & O'Hagan function
      1. Description
      2. Input domain
    4. The Gramacy function
      1. Description
      2. Input domain
    5. Franke's function
      1. Description
      2. Input domain
    6. The Lim function
      1. Description
      2. Input domain
    7. The Ackley function
      1. Description
      2. Input domain
      3. Global minimum
    8. The Bukin function N6
      1. Description
      2. Input domain
      3. Global minimum
    9. The Cross-In-Tray function
      1. Description
      2. Input domain
      3. Global minima
    10. The Drop-Wave function
      1. Description
      2. Input domain
      3. Global minimum
    11. The Eggholder function
      1. Description
      2. Input domain
      3. Global minimum
    12. The Holder Table function
      1. Description
      2. Input domain
      3. Global minimum
    13. The Levy function
      1. Description
      2. Input domain
      3. Global minimum
    14. The Levy function N13
      1. Description
      2. Input domain
      3. Global minimum
    15. The Rastrigin function
      1. Description
      2. Input domain
      3. Global minimum
    16. The Schaffer function N.2
      1. Description
      2. Input domain
      3. Global minimum
    17. The Schaffer function N.4
      1. Description
      2. Input domain
    18. The Shubert function
      1. Description
      2. Input domain
      3. Global minimum
    19. The Rotated Hyper-Ellipsoid function
      1. Description
      2. Input domain
      3. Global minimum
    20. The Sum Squares function
      1. Description
      2. Input domain
      3. Global minimum
    21. The Booth function
      1. Description
      2. Input domain
      3. Global minimum
    22. The McCormick function
      1. Description
      2. Input domain
      3. Global minimum
    23. The Power Sum function
      1. Description
      2. Input domain
    24. The Three-Hump Camel function
      1. Description
      2. Input domain
      3. Global minimum
    25. The Easom function
      1. Description
      2. Input domain
      3. Global minimum
    26. The Michalewicz function
      1. Description
      2. Input domain
      3. Global minima
    27. The Beale function
      1. Description
      2. Input domain
      3. Global minimum
    28. The Goldstein-Price function
      1. Description
      2. Input domain
      3. Global minimum
    29. The Perm function
      1. Description
      2. Input domain
      3. Global minimum
    30. The Griewank function
      1. Description
      2. Input domain
      3. Global minimum
    31. The Bohachevsky function
      1. Description
      2. Input domain
      3. Global minimum
    32. The Sphere function
      1. Description
      2. Input domain
      3. Global minimum
    33. The Rosenbrock function
      1. Description
      2. Input domain
      3. Global minimum
    34. The Styblinski-Tang function
      1. Description
      2. Input domain
      3. Global minimum
    35. Summary
    36. Keep reading
  21. Other Books You May Enjoy
    1. Leave a review - let other readers know what you think

Product information

  • Title: Hands-On Neural Network Programming with C#
  • Author(s): Matt R. Cole
  • Release date: September 2018
  • Publisher(s): Packt Publishing
  • ISBN: 9781789612011