Implementing Splunk: Big Data Reporting and Development for Operational Intelligence by Vincent Bumgarner

Sizing indexers

A number of factors affect how many Splunk indexers you will need, but starting with a "model" system under typical usage, the short answer is 100 gigabytes of raw logs per day per indexer. In the vast majority of cases, disk is the performance bottleneck, except on systems with very slow processors.
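As a rough sketch of this rule of thumb (the function name and the 350 GB example volume are illustrative, not from the text), the indexer count is simply the daily raw volume divided by 100 GB, rounded up:

```python
import math

def indexers_needed(daily_raw_gb, per_indexer_gb=100):
    """Estimate indexer count from the ~100 GB/day-per-indexer rule of thumb."""
    return max(1, math.ceil(daily_raw_gb / per_indexer_gb))

# e.g. 350 GB of raw logs per day -> 4 indexers
print(indexers_needed(350))  # -> 4
```

Treat this as a starting point only; search load, retention, and hardware all shift the real number.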

Note

The measurements mentioned next assume that you will spread events across your indexers evenly, using the autoLB feature of the Splunk forwarder. We will talk more about this under Indexer load balancing.
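As a minimal sketch, auto load balancing is enabled in the forwarder's outputs.conf; the group name and indexer hostnames below are placeholders:

```ini
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
# The forwarder distributes events across this list of indexers
server = indexer1.example.com:9997, indexer2.example.com:9997
autoLB = true
```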

The model system looks like this:

  • 8 gigabytes of RAM

If more memory is available, the operating system will use whatever Splunk does not consume for its disk cache.

  • Eight fast physical processors

    On a busy indexer, ...
