public class HadoopIndexing extends Object
Term-partitioning is the default scenario. In this scenario, the maximum number of reducers allowed is 32. To select document-partitioning, specify the -p flag to main().
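A minimal sketch of launching the indexer programmatically, based on the main() signature and the -p flag described on this page. The package name org.terrier.applications is assumed (it is not shown here), and Terrier and Hadoop are assumed to be already configured:

```java
import org.terrier.applications.HadoopIndexing; // assumed package

public class RunHadoopIndexing {
    public static void main(String[] args) throws Exception {
        // Default invocation: term-partitioned MapReduce indexing.
        HadoopIndexing.main(new String[]{});

        // To select document-partitioning instead, pass the -p flag:
        // HadoopIndexing.main(new String[]{"-p"});
    }
}
```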
Fields:
| Modifier and Type | Field and Description |
|---|---|
| protected static org.slf4j.Logger | logger - logger for this class |
Constructors:
| Constructor and Description |
|---|
| HadoopIndexing() |
Methods:
| Modifier and Type | Method and Description |
|---|---|
| static void | deleteTaskFiles(String path, org.apache.hadoop.mapred.JobID job) - Performs cleanup of an index path, removing temporary files |
| static void | main(String[] args) - Starts the MapReduce indexing. |
| protected static void | mergeLexiconInvertedFiles(String index_path, int numberOfReducers) - For term-partitioned indexing, merges the lexicons from each reducer |
Method details:

public static void main(String[] args) throws Exception

Starts the MapReduce indexing.
Parameters: args -
Throws: Exception

protected static void mergeLexiconInvertedFiles(String index_path, int numberOfReducers) throws IOException

For term-partitioned indexing, merges the lexicons from each reducer.
Parameters: index_path - path of index; numberOfReducers - number of inverted files expected
Throws: IOException

public static void deleteTaskFiles(String path, org.apache.hadoop.mapred.JobID job)

Performs cleanup of an index path, removing temporary files.
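As an illustration of deleteTaskFiles, the sketch below removes temporary task files left behind by a failed or killed indexing job. The index path, job identifier, and package name are placeholders or assumptions, not values from this page:

```java
import org.apache.hadoop.mapred.JobID;
import org.terrier.applications.HadoopIndexing; // assumed package

public class CleanupTaskFiles {
    public static void main(String[] args) {
        String indexPath = "/path/to/index";                // placeholder index path
        JobID job = JobID.forName("job_201501010000_0001"); // placeholder job ID
        // Remove temporary files that the job's tasks wrote under the index path.
        HadoopIndexing.deleteTaskFiles(indexPath, job);
    }
}
```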
Terrier Information Retrieval Platform 4.1. Copyright © 2004-2015, University of Glasgow