Step 1. Every day, the Solr index files are backed up on both the prodapp01 and prodapp02 servers.
Step 2. Every Sunday at 02:05, a full Solr index aggregate (reindex) runs, writing to /opt/solr/updated_full_index
Step 3. Once the Solr index aggregate completes, delete the old index files in /opt/solr/data/index, then copy from /opt/solr/updated_full_index to /opt/solr/data/index
The copy above is automated by a cron job at 02:15
The Solr index aggregate takes hardly 5 minutes (00:05) to complete, so it finishes well before the 02:15 copy
Step 4. We still have to find a way to make the Solr index aggregate write to a different path = /opt/solr/updated_full_index
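The 02:15 swap in Steps 3–4 could be scripted roughly as follows. This is a minimal sketch, not the actual production cron job: the function name, script path, and the empty-directory guard are assumptions; only the two index paths come from the steps above.

```shell
#!/bin/sh
# Hypothetical helper for the 02:15 cron job: replace the live index
# with the freshly aggregated one, refusing to act on an empty source.

swap_index() {
    new=$1    # e.g. /opt/solr/updated_full_index
    live=$2   # e.g. /opt/solr/data/index

    # Refuse to swap if the aggregate produced nothing.
    if [ ! -d "$new" ] || [ -z "$(ls -A "$new")" ]; then
        echo "new index at $new is missing or empty, aborting" >&2
        return 1
    fi

    rm -rf "$live"                 # Step 3: delete the old index files
    mkdir -p "$live"
    cp -a "$new"/. "$live"/        # copy the new index into place
}

# Possible crontab entry (Sunday 02:15), assuming the script above is
# saved as /opt/solr/bin/swap_index.sh and calls swap_index itself:
#   15 2 * * 0 /opt/solr/bin/swap_index.sh
```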
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Step 5. You can verify whether the existing index files are corrupted by executing the following commands:
cd /opt/tomcat6/webapps/solr/WEB-INF/lib/
java -cp lucene-core-2.9.3.jar -ea:org.apache.lucene... org.apache.lucene.index.CheckIndex /home/expdev01/solr/data/index/
Opening index @ /home/expdev01/solr/data/index/
Segments file=segments_1vc3 numSegments=3 version=FORMAT_DIAGNOSTICS [Lucene 2.9]
1 of 3: name=_22su docCount=2304
compound=false
hasProx=true
numFiles=9
size (MB)=0.723
diagnostics = {optimize=false, mergeFactor=10, os.version=2.6.32-642.el6.x86_64, os=Linux, mergeDocStores=true, lucene.version=2.9.3 951790 - 2010-06-06 01:30:55, source=merge, os.arch=amd64, java.version=1.6.0_39, java.vendor=Sun Microsystems Inc.}
has deletions [delFileName=_22su_1.del]
test: open reader.........OK [1 deleted docs]
test: fields..............OK [45 fields]
test: field norms.........OK [45 fields]
test: terms, freq, prox...OK [20748 terms; 108150 terms/docs pairs; 257927 tokens]
test: stored fields.......OK [2303 total field count; avg 1 fields per doc]
test: term vectors........OK [0 total vector count; avg 0 term/freq vector fields per doc]
2 of 3: name=_22sv docCount=1
compound=false
hasProx=true
numFiles=9
size (MB)=0.001
diagnostics = {os.version=2.6.32-642.el6.x86_64, os=Linux, lucene.version=2.9.3 951790 - 2010-06-06 01:30:55, source=flush, os.arch=amd64, java.version=1.6.0_39, java.vendor=Sun Microsystems Inc.}
has deletions [delFileName=_22sv_1.del]
test: open reader.........OK [1 deleted docs]
test: fields..............OK [7 fields]
test: field norms.........OK [7 fields]
test: terms, freq, prox...OK [35 terms; 35 terms/docs pairs; 0 tokens]
test: stored fields.......OK [0 total field count; avg � fields per doc]
test: term vectors........OK [0 total vector count; avg � term/freq vector fields per doc]
3 of 3: name=_22sw docCount=1
compound=false
hasProx=true
numFiles=8
size (MB)=0
diagnostics = {os.version=2.6.32-642.el6.x86_64, os=Linux, lucene.version=2.9.3 951790 - 2010-06-06 01:30:55, source=flush, os.arch=amd64, java.version=1.6.0_39, java.vendor=Sun Microsystems Inc.}
no deletions
test: open reader.........OK
test: fields..............OK [7 fields]
test: field norms.........OK [7 fields]
test: terms, freq, prox...OK [35 terms; 35 terms/docs pairs; 35 tokens]
test: stored fields.......OK [1 total field count; avg 1 fields per doc]
test: term vectors........OK [0 total vector count; avg 0 term/freq vector fields per doc]
No problems were detected with this index. # this index file is NOT corrupted
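The Step 5 check can be automated by capturing the CheckIndex report and grepping for its success line. A small sketch, assuming the paths from Step 5; `check_output` and the report location are made-up names, not an existing tool:

```shell
#!/bin/sh
# Sketch: decide pass/fail from a saved CheckIndex report, so the check
# can run from cron or a monitoring script.

check_output() {
    # Returns 0 if the CheckIndex report says the index is clean.
    grep -q "No problems were detected" "$1"
}

# Usage (paths as in Step 5):
#   cd /opt/tomcat6/webapps/solr/WEB-INF/lib/
#   java -cp lucene-core-2.9.3.jar -ea:org.apache.lucene... \
#        org.apache.lucene.index.CheckIndex /home/expdev01/solr/data/index/ \
#        > /tmp/checkindex.out
#   check_output /tmp/checkindex.out || echo "index may be corrupted"
```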
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Step 6. On dev01.learnexa.com
The Solr index backup script is available at the path below:
/home/expdev01/bin/index_backup.sh
A cron job runs it every day at 23:59
/home/expdev01/solr/data/ # path where the Solr index backups are stored
index_13-Jan-2017-04:06:35.zip
index_13-Jan-2017-04:05:40.zip
index_13-Jan-2017-04:04:25.zip
index_13-Jan-2017-04:04:04.zip
index_13-Jan-2017-04:04:03.zip
index_13-Jan-2017-04:04:01.zip
index_13-Jan-2017-04:03:59.zip
index_13-Jan-2017-04:03:58.zip
Backups are rotated automatically: any backup more than 8 days old is deleted by this shell script.
The dev server has only a single node, so the distributed backup copy is not enabled there.
With the same script we can enable the distributed backup copy on Prodapp01 and Prodapp02.
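The rotation described above might look like the sketch below. The real /home/expdev01/bin/index_backup.sh is not shown in these notes, so the function name and zip command are assumptions; only the backup directory and the index_*.zip naming come from the listing above.

```shell
#!/bin/sh
# Sketch of the backup rotation: archive the index under a timestamped
# name, then delete backups older than 8 days.

rotate_backups() {
    # Delete index_*.zip backups older than 8 days in the given directory.
    find "$1" -maxdepth 1 -name 'index_*.zip' -mtime +8 -delete
}

# The backup step itself might look like (requires the zip utility):
#   cd /home/expdev01/solr
#   zip -qr "data/index_$(date +%d-%b-%Y-%H:%M:%S).zip" data/index
#   rotate_backups /home/expdev01/solr/data
```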
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Things to do later
We will have to set up Solr replication as described in the following link: https://wiki.apache.org/solr/SolrReplication