Dealing with a huge count of open files


In this section, we will learn how to resolve exceptions thrown when too many files are open. Before proceeding, it is recommended that you refer to the earlier section, Reducing the file count in the index.

  1. For the purpose of demonstration, let us assume that Solr (running on a Unix environment) throws an exception whose header looks as follows:

    java.io.FileNotFoundException: /usr/share/solr/data/index/_8.tii
    

    This shows that there are too many open files (a quick way to verify the count is sketched after this list).

  2. We will increase the open file limit from 1000 (the value previously set in my case; yours may differ) to 3000. To do so, we will use the ulimit command-line utility as follows (see the note after this list on making the new limit persistent):

    ulimit -n 3000
    
    
  3. Stopping at this stage would merely be a workaround. The primary cause of this exception is the huge number of segment files that constitute the index, so the immediate step after running the ulimit utility should be to optimize the index (a sketch of triggering an optimize follows this list)...
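To confirm the diagnosis before raising any limits, it helps to count the file descriptors the Solr JVM currently holds open. The following is a minimal sketch for a typical Linux setup; the process pattern solr and the example PID are assumptions about your environment:

    # Find the PID of the Solr JVM (adjust the pattern to match
    # how your servlet container is started)
    pgrep -f solr

    # Count the file descriptors held open by that process
    # (replace 12345 with the PID reported above)
    ls /proc/12345/fd | wc -l

    # Alternatively, if lsof is installed
    lsof -p 12345 | wc -l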
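Also note that ulimit -n 3000 only raises the limit for the current shell session and the processes started from it. To make the higher limit survive re-logins and reboots on most Linux distributions, add entries to /etc/security/limits.conf; the solr user name below is an assumption, so substitute the account your Solr instance runs under:

    # /etc/security/limits.conf (assumed user name: solr)
    solr    soft    nofile    3000
    solr    hard    nofile    3000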
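Finally, the optimize itself can be triggered through Solr's update handler, which merges the many small segment files and thereby brings the open file count back down. A minimal sketch using curl, assuming Solr listens on the default port 8983 and the core is named collection1:

    # Ask Solr to merge the index segments (optimize)
    curl 'http://localhost:8983/solr/collection1/update?optimize=true'

    # Equivalently, POST an explicit optimize command
    curl 'http://localhost:8983/solr/collection1/update' \
         -H 'Content-Type: text/xml' \
         --data-binary '<optimize/>'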
