Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0
    • Fix Version/s: 3.5
    • Component/s: .structures
    • Labels:
      None

      Description

      Hadoop 0.18 is quite old now. We should aim to upgrade to 0.20 for the next release. Unfortunately, this isn't as simple as upgrading the jar file. Moreover, there is the choice of MapReduce APIs to consider.

        Attachments

        1. 20.1-patch.gz
          38 kB
        2. 20.1-patch-v2.gz
          2 kB
        3. 20.1-TR-115-v3.patch
          10 kB
        4. 20.1-TR-115-v4.patch
          12 kB
        5. hadoop-0.20.1+169.68-core.jar
          2.66 MB
        6. HadoopPlugin.java
          18 kB

          Issue Links

            Activity

            craigm Craig Macdonald created issue -
            craigm Craig Macdonald added a comment -

            This patch has been contributed by Ian Soboroff (NIST), and is for Hadoop 0.20.1. The particular Hadoop core jar in use is attached to the issue.

            craigm Craig Macdonald made changes -
            Field Original Value New Value
            Attachment 20.1-patch.gz [ 10201 ]
            Attachment hadoop-0.20.1+169.68-core.jar [ 10202 ]
            craigm Craig Macdonald added a comment -

            New version of the same patch, keeping only the changes to the source tree.

            craigm Craig Macdonald made changes -
            Attachment 20.1-patch-v2.gz [ 10203 ]
            craigm Craig Macdonald added a comment -

            Hadoop 0.20 depends on Java 6. Discuss.

            craigm Craig Macdonald made changes -
            Link This issue is related to TR-104 [ TR-104 ]
            craigm Craig Macdonald added a comment -

Some more debugging on this patch. Found that the JobTracker in 0.20 rejects the empty string ("") as a location for a split.
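A sketch of the kind of fix this implies (the helper name and placement are hypothetical, not from the patch): filter empty-string entries out of a split's reported locations before the 0.20 JobTracker sees them.

```java
import java.util.ArrayList;
import java.util.List;

public class SplitLocations {
    // Drop null and empty-string hosts from a split's location list,
    // since the 0.20 JobTracker rejects "" as a host name.
    public static String[] cleanLocations(String[] locations) {
        List<String> kept = new ArrayList<String>();
        for (String loc : locations) {
            if (loc != null && loc.length() > 0) {
                kept.add(loc);
            }
        }
        return kept.toArray(new String[kept.size()]);
    }
}
```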

            craigm Craig Macdonald made changes -
            Attachment 20.1-TR-115-v3.patch [ 10210 ]
            isoboroff Ian Soboroff added a comment -

            Note to build this patch you also need to replace lib/hadoop18.2-joined.jar with a hadoop-0.20-core.jar. I am using the one from Cloudera CDH2.

            craigm Craig Macdonald added a comment -

            hadoop18.2-joined.jar is a merge of many Hadoop-related jar files. I copied/symlinked in many of the other jar files from $HADOOP_HOME and $HADOOP_HOME/lib/

            isoboroff Ian Soboroff added a comment -

            This patch gives the following NPE:

            $ bin/trec_terrier.sh -i -H
            Setting TERRIER_HOME to /home/soboroff/terrier-3.0
            10/04/23 10:12:39 WARN io.HadoopPlugin: Exception occurred while creating JobFactory
            java.lang.NullPointerException
            at org.terrier.utility.io.HadoopPlugin.getJobFactory(HadoopPlugin.java:284)
            at org.terrier.utility.io.HadoopPlugin.getJobFactory(HadoopPlugin.java:274)
            at org.terrier.applications.HadoopIndexing.main(HadoopIndexing.java:121)
            at org.terrier.applications.TrecTerrier.run(TrecTerrier.java:373)
            at org.terrier.applications.TrecTerrier.applyOptions(TrecTerrier.java:573)
            at org.terrier.applications.TrecTerrier.main(TrecTerrier.java:237)
            java.lang.Exception: Could not get JobFactory from HadoopPlugin
            java.lang.Exception: Could not get JobFactory from HadoopPlugin
            at org.terrier.applications.HadoopIndexing.main(HadoopIndexing.java:123)
            at org.terrier.applications.TrecTerrier.run(TrecTerrier.java:373)
            at org.terrier.applications.TrecTerrier.applyOptions(TrecTerrier.java:573)
            at org.terrier.applications.TrecTerrier.main(TrecTerrier.java:237)

            getJobFactory earlier calls getGlobalConfiguration, but I'm not sure that setGlobalConfiguration has ever been called. The only caller appears to be in utility.io.HadoopUtility, but application.HadoopIndexing goes straight to HadoopPlugin. Not sure I've understood the whole code flow.
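One defensive pattern for the flow described above is to lazily initialise the global configuration instead of returning null when `setGlobalConfiguration` was never called. This is only an illustrative sketch, with `java.util.Properties` standing in for Hadoop's `Configuration` class; the real `HadoopPlugin` fields and types may differ.

```java
public class GlobalConfigHolder {
    private static java.util.Properties globalConf = null;

    public static synchronized void setGlobalConfiguration(java.util.Properties conf) {
        globalConf = conf;
    }

    public static synchronized java.util.Properties getGlobalConfiguration() {
        if (globalConf == null) {
            // Fall back to a default configuration rather than returning
            // null, which is what produced the NullPointerException above.
            globalConf = new java.util.Properties();
        }
        return globalConf;
    }
}
```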

craigm Craig Macdonald added a comment - edited

            Forgot to put this file (HadoopPlugin) in last patch. Updated patch to follow.

            craigm Craig Macdonald made changes -
            Attachment HadoopPlugin.java [ 10211 ]
            craigm Craig Macdonald added a comment -

            v4 patch. This includes the missing changes to HadoopPlugin.

            craigm Craig Macdonald made changes -
            Attachment 20.1-TR-115-v4.patch [ 10212 ]
            isoboroff Ian Soboroff added a comment -

            The job now runs, but tasks die with failed spills:

            java.io.IOException: Spill failed
            at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:822)
            at org.apache.hadoop.mapred.MapTask$OldOutputCollector.collect(MapTask.java:466)
            at org.terrier.structures.indexing.singlepass.hadoop.HadoopRunWriter.writeTerm(HadoopRunWriter.java:84)
            at org.terrier.structures.indexing.singlepass.MemoryPostings.writeToWriter(MemoryPostings.java:151)
            at org.terrier.structures.indexing.singlepass.MemoryPostings.finish(MemoryPostings.java:112)
            at org.terrier.indexing.hadoop.Hadoop_BasicSinglePassIndexer.forceFlush(Hadoop_BasicSinglePassIndexer.java:308)
            at org.terrier.indexing.hadoop.Hadoop_BasicSinglePassIndexer.closeMap(Hadoop_BasicSinglePassIndexer.java:419)
            at org.terrier.indexing.hadoop.Hadoop_BasicSinglePassIndexer.close(Hadoop_BasicSinglePassIndexer.java:236)
            at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
            at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
            at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
            at org.apache.hadoop.mapred.Child.main(Child.java:170)
            Caused by: java.lang.NullPointerException
            at org.apache.hadoop.mapred.IFile$Writer.&lt;init&gt;(IFile.java:102)
            at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1198)
            at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$1800(MapTask.java:648)
            at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1135)

            craigm Craig Macdonald added a comment -

            I saw that yesterday. Do your map tasks have a warning about being unable to find native compression libraries?

            I presumed this was a configuration error with my Hadoop. Perhaps it's a problem with CDH2 in general?

            Workaround: disable map output compression; see src/core/org/terrier/applications/HadoopIndexing.java, line 174.
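The workaround above, as it might look in the job setup code (a configuration fragment using the old mapred-era API; the class and method name around it are hypothetical, and the exact placement in HadoopIndexing.java is per the comment):

```java
import org.apache.hadoop.mapred.JobConf;

public class DisableMapCompression {
    public static void apply(JobConf conf) {
        // Turn map output compression off so the spill path never tries
        // to load a native compression codec.
        conf.setCompressMapOutput(false);
        // Equivalent property form:
        // conf.setBoolean("mapred.compress.map.output", false);
    }
}
```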

            isoboroff Ian Soboroff added a comment -

            Disabling map output compression fixes it. I have the native libraries installed and bin/hadoop finds them, but perhaps the Terrier script doesn't? Should I set JAVA_LIBRARY_LIBS in terrier_env.sh?

            craigm Craig Macdonald added a comment -

            I'm not convinced it's a Terrier issue. I ran the Hadoop grep program on a gzipped file and saw the same warning message.

            Hadoop should set the library path when the map task child is forked; in this case it doesn't appear to be doing so.
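A quick way to check what the forked child JVM actually sees (this assumes, as the discussion does, that the native compression libraries are located via java.library.path):

```java
public class LibPathCheck {
    public static void main(String[] args) {
        // Print the library search path of the current JVM; run this as a
        // trivial map task (or standalone) to compare against bin/hadoop.
        System.out.println("java.library.path = "
            + System.getProperty("java.library.path"));
    }
}
```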

            isoboroff Ian Soboroff added a comment -

            Sigh. 2.5 hours to index the wikipedia portion, btw.

            INFO - map 100% reduce 100%
            INFO - Job complete: job_201004231118_0004
            INFO - Counters: 23
            INFO - Job Counters
            INFO - Launched reduce tasks=26
            INFO - Rack-local map tasks=20
            INFO - Launched map tasks=49
            INFO - Data-local map tasks=29
            INFO - FileSystemCounters
            INFO - FILE_BYTES_READ=15790406810
            INFO - HDFS_BYTES_READ=50375670689
            INFO - FILE_BYTES_WRITTEN=23558026906
            INFO - HDFS_BYTES_WRITTEN=4826037969
            INFO - org.terrier.indexing.hadoop.Hadoop_BasicSinglePassIndexer$Counters
            INFO - INDEXED_POINTERS=2656803981
            INFO - INDEXED_TOKENS=5896971230
            INFO - INDEXED_DOCUMENTS=5957529
            INFO - INDEXER_FLUSHES=116
            INFO - Map-Reduce Framework
            INFO - Reduce input groups=6200909
            INFO - Combine output records=0
            INFO - Map input records=5957529
            INFO - Reduce shuffle bytes=7534296876
            INFO - Reduce output records=0
            INFO - Spilled Records=213486017
            INFO - Map output bytes=7646203740
            INFO - Map input bytes=-41692599706
            INFO - Combine input records=0
            INFO - Map output records=70620646
            INFO - Reduce input records=70620646
            WARN - No reduce 0 output : no output index [/home/soboroff/terrier-3.0/var/index,data-0]
            WARN - No reduce 1 output : no output index [/home/soboroff/terrier-3.0/var/index,data-1]
            WARN - No reduce 2 output : no output index [/home/soboroff/terrier-3.0/var/index,data-2]
            WARN - No reduce 3 output : no output index [/home/soboroff/terrier-3.0/var/index,data-3]
            WARN - No reduce 4 output : no output index [/home/soboroff/terrier-3.0/var/index,data-4]
            WARN - No reduce 5 output : no output index [/home/soboroff/terrier-3.0/var/index,data-5]
            WARN - No reduce 6 output : no output index [/home/soboroff/terrier-3.0/var/index,data-6]
            WARN - No reduce 7 output : no output index [/home/soboroff/terrier-3.0/var/index,data-7]
            WARN - No reduce 8 output : no output index [/home/soboroff/terrier-3.0/var/index,data-8]
            WARN - No reduce 9 output : no output index [/home/soboroff/terrier-3.0/var/index,data-9]
            WARN - No reduce 10 output : no output index [/home/soboroff/terrier-3.0/var/index,data-10]
            WARN - No reduce 11 output : no output index [/home/soboroff/terrier-3.0/var/index,data-11]
            WARN - No reduce 12 output : no output index [/home/soboroff/terrier-3.0/var/index,data-12]
            WARN - No reduce 13 output : no output index [/home/soboroff/terrier-3.0/var/index,data-13]
            WARN - No reduce 14 output : no output index [/home/soboroff/terrier-3.0/var/index,data-14]
            WARN - No reduce 15 output : no output index [/home/soboroff/terrier-3.0/var/index,data-15]
            WARN - No reduce 16 output : no output index [/home/soboroff/terrier-3.0/var/index,data-16]
            WARN - No reduce 17 output : no output index [/home/soboroff/terrier-3.0/var/index,data-17]
            WARN - No reduce 18 output : no output index [/home/soboroff/terrier-3.0/var/index,data-18]
            WARN - No reduce 19 output : no output index [/home/soboroff/terrier-3.0/var/index,data-19]
            WARN - No reduce 20 output : no output index [/home/soboroff/terrier-3.0/var/index,data-20]
            WARN - No reduce 21 output : no output index [/home/soboroff/terrier-3.0/var/index,data-21]
            WARN - No reduce 22 output : no output index [/home/soboroff/terrier-3.0/var/index,data-22]
            WARN - No reduce 23 output : no output index [/home/soboroff/terrier-3.0/var/index,data-23]
            WARN - No reduce 24 output : no output index [/home/soboroff/terrier-3.0/var/index,data-24]
            WARN - No reduce 25 output : no output index [/home/soboroff/terrier-3.0/var/index,data-25]
            java.lang.NullPointerException
            java.lang.NullPointerException
            at org.terrier.applications.HadoopIndexing.mergeLexiconInvertedFiles(HadoopIndexing.java:276)
            at org.terrier.applications.HadoopIndexing.main(HadoopIndexing.java:231)
            at org.terrier.applications.TrecTerrier.run(TrecTerrier.java:373)
            at org.terrier.applications.TrecTerrier.applyOptions(TrecTerrier.java:573)
            at org.terrier.applications.TrecTerrier.main(TrecTerrier.java:237)

            isoboroff Ian Soboroff added a comment -

            Ah hah, found where stuff went: into that path in HDFS, and apparently the reducer is looking in the normal filesystem. How can I restart the merge phase of the process?

            craigm Craig Macdonald added a comment -

            You can reindex after setting the destination index path using the hdfs:// protocol:

            terrier.index.path=hdfs://node1:9000/path/to/index

            Cheers,

            Craig

            craigm Craig Macdonald added a comment -

            Current v4 patch has the following problem:

            Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.JobID.compareTo(Lorg/apache/hadoop/mapred/ID;)I
                    at org.terrier.applications.HadoopIndexing.deleteTaskFiles(HadoopIndexing.java:369)
                    at org.terrier.applications.HadoopIndexing.main(HadoopIndexing.java:227)
                    at org.terrier.applications.TrecTerrier.run(TrecTerrier.java:373)
                    at org.terrier.applications.TrecTerrier.applyOptions(TrecTerrier.java:573)
                    at org.terrier.applications.TrecTerrier.main(TrecTerrier.java:237)
            
            craigm Craig Macdonald added a comment -

            Tagging for 3.1.

            craigm Craig Macdonald made changes -
            Fix Version/s 3.1 [ 10040 ]
            craigm Craig Macdonald added a comment -

            Current trunk is operating nicely on 0.20.

            craigm Craig Macdonald made changes -
            Status Open [ 1 ] Resolved [ 5 ]
            Resolution Fixed [ 1 ]
            noiano Marco Didonna added a comment -

            Where can I get "current trunk" ?

            craigm Craig Macdonald added a comment -

            This month!

            noiano Marco Didonna added a comment -

            ehm...it is taking a little longer


              People

              • Assignee:
                craigm Craig Macdonald
                Reporter:
                craigm Craig Macdonald
              • Watchers:
                3

                Dates

                • Created:
                  Updated:
                  Resolved: