What is TensorFlow (TF)?
What is Keras?
What are the most important changes in TensorFlow 2.0?
Introduction to neural networks
Perceptron
A first example of TensorFlow 2.0 code
Multi-layer perceptron – our first example of a network
Problems in training the perceptron and their solutions
Activation function – sigmoid
Activation function – tanh
Activation function – ReLU
Two additional activation functions – ELU and LeakyReLU
Activation functions
In short – what are neural networks after all?
A real example – recognizing handwritten digits
One-hot encoding (OHE)
Defining a simple neural network in TensorFlow 2.0
Running a simple TensorFlow 2.0 net and establishing a baseline
Improving the simple net in TensorFlow 2.0 with hidden layers
Further improving the simple net in TensorFlow with Dropout
Testing different optimizers in TensorFlow 2.0
Increasing the number of epochs
Controlling the optimizer learning rate
Increasing the number of internal hidden neurons
Increasing the size of batch computation
Summarizing experiments run for recognizing handwritten digits
Regularization
Adopting regularization to avoid overfitting
Understanding BatchNormalization
Playing with Google Colab – CPUs, GPUs, and TPUs
Sentiment analysis
Hyperparameter tuning and AutoML
Predicting output
A practical overview of backpropagation
What have we learned so far?
Towards a deep learning approach
References