How to Clone Wikipedia and Index it with Solr
1. "Hardware. I found out the hard way that 32-bit Ubuntu machines with 613 MB of RAM (Amazon's EC2 “micro” instances) were not big enough: they produced timeout errors that disappeared when I upgraded to single-core instances with 1.7 GB of RAM. You will also need at least 200 GB of disk space; 300 GB is a safe figure."
2. "Software. You will need MediaWiki 1.17 or later, several extensions (listed in this helpful page by Metachronistic), either mwimport or MWDumper (http://www.mediawiki.org/wiki/Manual:MWDumper), MySQL, and Apache Solr 3.4. Install the necessary MediaWiki extensions now; it will make it easier later on to verify that your database import was successful."
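As a rough sketch of the import step, MWDumper can stream the compressed XML dump straight into MySQL as SQL inserts. The database name, user, and file name below are placeholders, and the schema version passed to `--format` may need adjusting to match your MediaWiki version:

```shell
# Import the XML dump into MediaWiki's MySQL database with MWDumper.
# Assumes a MediaWiki schema already exists in a database named "wikidb"
# and the dump file from the data step is in the current directory.
java -jar mwdumper.jar --format=sql:1.5 enwiki-latest-pages-articles.xml.bz2 \
  | mysql -u wikiuser -p wikidb
```

Because MWDumper reads the .bz2 file directly and streams the output, the 33 GB uncompressed XML never needs to sit on disk. Afterwards, browsing a few well-known articles in your MediaWiki install is a quick sanity check that the import worked, which is where having the extensions installed up front pays off.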
3. "Data. Get the latest Wikipedia dump from http://en.wikipedia.org/wiki/Wikipedia:Database_download#English-language_Wikipedia. You probably want the pages-articles file, which is roughly 8 GB compressed and 33 GB uncompressed."
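Fetching the dump itself is a one-liner; the "latest" path below points at the most recent complete English Wikipedia dump on the official Wikimedia mirror:

```shell
# Download the English Wikipedia pages-articles dump.
# -c resumes an interrupted transfer, useful for a file this large.
wget -c https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2

# Optional: decompress, keeping the original (-k).
# This needs ~33 GB of free space; MWDumper can also read the .bz2 directly.
bunzip2 -k enwiki-latest-pages-articles.xml.bz2
```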
I think this is a great real-world tutorial that could help any developer get familiar with Solr, the open-source search platform, or simply sharpen their existing skills. A good read.
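To get a feel for the indexing side before tackling the full tutorial, here is a minimal sketch of pushing a document into a local Solr 3.x instance through its XML update handler. The host, port, and field names are assumptions based on Solr's bundled example schema, not anything specific to the Wikipedia setup:

```shell
# Add one document to a local Solr 3.x instance via the XML update handler,
# then commit so it becomes searchable. The "id" and "title" fields come
# from Solr's example schema and are placeholders here.
curl 'http://localhost:8983/solr/update' -H 'Content-Type: text/xml' \
  --data-binary '<add><doc><field name="id">1</field><field name="title">Test page</field></doc></add>'
curl 'http://localhost:8983/solr/update' -H 'Content-Type: text/xml' \
  --data-binary '<commit/>'

# Query the document back out.
curl 'http://localhost:8983/solr/select?q=title:Test'
```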