Apache Nutch 1.12 with Apache Solr 6.2.1 gives an error

I am using Apache Nutch 1.12 and Apache Solr 6.2.1 to crawl data on the internet and index it, and the combination gives an error: java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down

Based on what I learned from the Nutch tutorial, I did the following: https://wiki.apache.org/nutch/NutchTutorial

When I run the following command, I get an error:

bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 2

Above, TSolr is just the name of the Solr core, as you may have guessed.

Below is the error log from hadoop.log:

2016-10-28 16:21:20,982 INFO  indexer.IndexerMapReduce - IndexerMapReduce: crawldb: TestCrawl/crawldb
2016-10-28 16:21:20,982 INFO  indexer.IndexerMapReduce - IndexerMapReduce: linkdb: TestCrawl/linkdb
2016-10-28 16:21:20,982 INFO  indexer.IndexerMapReduce - IndexerMapReduces: adding segment: TestCrawl/segments/20161028161642
2016-10-28 16:21:46,353 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1281422650/.staging/job_local1281422650_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:21:46,355 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1281422650/.staging/job_local1281422650_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:21:46,415 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1281422650_0001/job_local1281422650_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:21:46,416 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1281422650_0001/job_local1281422650_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:21:46,565 INFO  anchor.AnchorIndexingFilter - Anchor deduplication is: off
2016-10-28 16:21:52,308 INFO  indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: content dest: content
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: title dest: title
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: host dest: host
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: segment dest: segment
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: boost dest: boost
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: digest dest: digest
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: tstamp dest: tstamp
2016-10-28 16:21:52,424 INFO  solr.SolrIndexWriter - Indexing 42/42 documents
2016-10-28 16:21:52,424 INFO  solr.SolrIndexWriter - Deleting 0 documents
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: content dest: content
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: title dest: title
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: host dest: host
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: segment dest: segment
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: boost dest: boost
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: digest dest: digest
2016-10-28 16:21:53,469 INFO  solr.SolrMappingReader - source: tstamp dest: tstamp
2016-10-28 16:21:53,472 INFO  indexer.IndexingJob - Indexer: number of documents indexed, deleted, or skipped:
2016-10-28 16:21:53,476 INFO  indexer.IndexingJob - Indexer:     42  indexed (add/update)
2016-10-28 16:21:53,477 INFO  indexer.IndexingJob - Indexer: finished at 2016-10-28 16:21:53, elapsed: 00:00:32
2016-10-28 16:21:54,199 INFO  indexer.CleaningJob - CleaningJob: starting at 2016-10-28 16:21:54
2016-10-28 16:21:54,344 WARN  util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-10-28 16:22:19,739 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1653313730/.staging/job_local1653313730_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:22:19,741 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1653313730/.staging/job_local1653313730_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:22:19,797 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1653313730_0001/job_local1653313730_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:22:19,799 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1653313730_0001/job_local1653313730_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:22:19,807 WARN  output.FileOutputCommitter - Output Path is null in setupJob()
2016-10-28 16:22:25,113 INFO  indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: content dest: content
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: title dest: title
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: host dest: host
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: segment dest: segment
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: boost dest: boost
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: digest dest: digest
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: tstamp dest: tstamp
2016-10-28 16:22:25,191 INFO  solr.SolrIndexWriter - SolrIndexer: deleting 6/6 documents
2016-10-28 16:22:25,300 WARN  output.FileOutputCommitter - Output Path is null in cleanupJob()
2016-10-28 16:22:25,301 WARN  mapred.LocalJobRunner - job_local1653313730_0001
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.IllegalStateException: Connection pool shut down
    at org.apache.http.util.Asserts.check(Asserts.java:34)
    at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
    at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
    at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
    at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:480)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:241)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:230)
    at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:150)
    at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483)
    at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464)
    at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:190)
    at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:178)
    at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:115)
    at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:120)
    at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
2016-10-28 16:22:25,841 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
    at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:172)
    at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:195)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:206)

As you can see in the bin/crawl command above, I had Nutch crawl for 2 rounds. The problem is that the error above only occurs in the second round (one level deeper from the seed site). So, indexing succeeds in the first round, but after the second fetch and parse in the second round, it spits out the error and stops.

To try something different from my first run above, I did the following on the second run:

So, my Solr cannot successfully index what Nutch crawled in the second round, i.e. one level deeper from the seed site.

Could the error be caused by the size of the parsed content from the seed site? The seed site is a newspaper company's website, so I am sure the second round (one level deeper) contains a huge amount of parsed data to index. If the problem is the parsed content size, how can I configure my Solr to fix it?

If the error is caused by something else, could someone help me identify what it is and how to fix it?

This error happens because the connection to Solr has already been closed when a commit is attempted (https://github.com/apache/nutch/blob/master/src/java/org/apache/nutch/indexer/CleaningJob.java#L120). This was identified in the NUTCH-2269 ticket on Jira and there is a PR on the way (https://github.com/apache/nutch/pull/156).
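
To make the failure mode concrete, below is a minimal sketch (my own illustration, not Nutch code) of how SolrJ's HttpSolrClient behaves once it has been closed. The core URL is just a placeholder taken from the question:

import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class PoolShutDownDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder core URL, matching the one used in the question.
        HttpSolrClient client = new HttpSolrClient("http://localhost:8983/solr/TSolr");

        // Closing the client shuts down its underlying HTTP connection pool.
        client.close();

        // Any request issued afterwards fails with
        // java.lang.IllegalStateException: Connection pool shut down
        client.commit();
    }
}

This is the same ordering the CleaningJob runs into: the writer is closed first, and the final commit then hits the already-shut-down connection pool.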

For those who have run into the same thing I did, I thought I would post how I solved the problem.

First, Apache Nutch 1.12 does not appear to support Apache Solr 6.x. If you look at the Apache Nutch 1.12 release notes, support for Apache Solr 5.x was only recently added to Nutch 1.12, and support for Solr 6.x is not included. So, I decided to use Solr 5.5.3 instead of Solr 6.2.1 and installed Apache Solr 5.5.3 to use with Apache Nutch 1.12.

As Jorge Luis pointed out, Apache Nutch 1.12 has a bug that causes this error when it is used with Apache Solr. The bug will be fixed and released at some point as Nutch 1.13, but I don't know when that will be, so I decided to fix it myself.

The error is thrown because the close method in Nutch's CleaningJob.java is called first and the commit method afterwards, at which point the following exception is raised: java.lang.IllegalStateException: Connection pool shut down.

The fix is actually quite simple. To see the solution, go here: https://github.com/apache/nutch/pull/156/commits/327e256bb72f0385563021995a9d0e96bb83c4f8

As you can see in the link above, you simply need to relocate the writers.close(); statement.
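
In essence, the change ensures the commit happens while the Solr connection is still open, and the writers are closed only afterwards. Here is a simplified sketch of the corrected ordering in DeleterReducer.close() (not the verbatim patch; the linked commit is authoritative):

// Simplified sketch of the corrected DeleterReducer.close() in CleaningJob.java.
@Override
public void close() throws IOException {
  // Commit any pending deletes while the Solr connection is still open...
  writers.commit();
  // ...and only then close the index writers, which shuts down the HTTP client pool.
  writers.close();
}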

By the way, to apply the fix you need the Nutch src package, not the binary package, because you cannot edit the CleaningJob.java file in the Nutch binary distribution. After making the fix, run ant and you are all set.

After the fix, the error no longer occurs!

Hope this helps anyone facing the same problem I had.