
Deep Learning Quick Reference

by Mike Bernico
March 2018
Intermediate to advanced
272 pages
7h 53m
English
Packt Publishing
Content preview from Deep Learning Quick Reference

Should network architecture be considered a hyperparameter?

In building even the simplest network, we have to make all sorts of choices about network architecture. Should we use 1 hidden layer or 1,000? How many neurons should each layer contain? Should they all use the relu activation function or tanh? Should we use dropout on every hidden layer, or just the first? There are many choices we have to make in designing a network architecture.
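Below is a minimal Keras sketch (not taken from the book) that marks where each of these choices appears; the input dimension and binary output are assumed purely for illustration.

    # A minimal sketch of where architecture choices show up in a Keras model:
    # layer count, neurons per layer, activation, and dropout placement.
    # The input shape (20) and binary output are assumptions for illustration.
    from tensorflow.keras import layers, models

    model = models.Sequential([
        layers.Input(shape=(20,)),              # assumed input dimension
        layers.Dense(64, activation="relu"),    # how many neurons? relu or tanh?
        layers.Dropout(0.5),                    # dropout here, on every layer, or only the first?
        layers.Dense(64, activation="relu"),    # one hidden layer or many?
        layers.Dense(1, activation="sigmoid"),  # output for an assumed binary task
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()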

In the most typical case, we search exhaustively for optimal values for each hyperparameter. It's not so easy to exhaustively search for network architectures though. In practice, we probably don't have the time or computational power to do so. We rarely see researchers searching for the optimal architecture ...
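As a sketch of what treating architecture as a hyperparameter could look like in practice (illustrative, not the author's code; the build_model function, the search-space values, and the binary task are assumptions), the code below parameterizes the model-building step by those architecture choices and samples configurations at random rather than exhaustively.

    # Sketch: random search over architecture "hyperparameters".
    # Names and search-space values are illustrative assumptions.
    import random
    from tensorflow.keras import layers, models

    def build_model(n_layers, n_neurons, activation, dropout_rate, input_dim=20):
        """Assemble a feed-forward network from sampled architecture choices."""
        model = models.Sequential()
        model.add(layers.Input(shape=(input_dim,)))
        for _ in range(n_layers):
            model.add(layers.Dense(n_neurons, activation=activation))
            if dropout_rate > 0:
                model.add(layers.Dropout(dropout_rate))
        model.add(layers.Dense(1, activation="sigmoid"))
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        return model

    # Architecture choices expressed as a hyperparameter search space
    search_space = {
        "n_layers": [1, 2, 3, 4],
        "n_neurons": [32, 64, 128, 256],
        "activation": ["relu", "tanh"],
        "dropout_rate": [0.0, 0.25, 0.5],
    }

    for trial in range(10):
        params = {key: random.choice(values) for key, values in search_space.items()}
        model = build_model(**params)
        # model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=10)
        print(trial, params)

Random sampling is shown here because an exhaustive grid over architecture choices grows combinatorially, which is exactly the practical obstacle the passage describes.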



Publisher Resources

ISBN: 9781788837996