How to solve the 'Lock obtain timed out' error when simply using Solr on Linux?
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must do so under the same CC BY-SA license, cite the original URL, and attribute it to the original authors (not me): StackOverflow.
Original URL: http://stackoverflow.com/questions/20517348/
How to solve the 'Lock obtain timed out' when using Solr plainly?
Asked by Emre Sevinç
I have two cores for our Solr system (Solr version 3.6.1). When I invoke the following command line on our dedicated Solr server to add and then index a file:
java -Durl=http://solrprod:8080/solr/original/update -jar /home/solr/solr3/biomina/solr/post.jar /home/solr/tmp/2008/c2m-dump-01.noDEID_clean.xml
I get an exception in the /usr/share/tomcat7/logs/solr.2013-12-11.log file (after about 6 minutes of waiting):
SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/solr/solr3/biomina/solr/original/data/index/write.lock
(You can see the detailed output of it at the end of this message).
I tried to modify the time-out for locks (by setting writeLockTimeout to 300000), but this did not solve the problem. I'm not using any custom script, just the post.jar that comes with Solr 3.6.1, to add and index.
Any ideas about what needs to be changed to get rid of this error and successfully add the XML file above to Solr and index it?
Contents of /home/solr/solr3/biomina/solr/solr.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<!--
All (relative) paths are relative to the installation path
persistent: Save changes made via the API to this file
sharedLib: path to a lib directory that will be shared across all cores
-->
<solr persistent="true">
<!--
adminPath: RequestHandler path to manage cores.
If 'null' (or absent), cores will not be manageable via request handler
-->
<cores adminPath="/admin/cores">
<core name="original" instanceDir="original" />
<core name="deidentified" instanceDir="deidentified" />
</cores>
</solr>
Relevant part of solrconfig.xml (for the core named original):
<indexConfig>
<!-- maxFieldLength specifies max number of *tokens* indexed per
field. Default: 10000 -->
<!-- <maxFieldLength>10000</maxFieldLength> -->
<!-- Maximum time to wait for a write lock (ms) for an IndexWriter.
Default: 1000 -->
<writeLockTimeout>300000</writeLockTimeout>
Relevant part of solrconfig.xml (for the core named deidentified):
<indexConfig>
<!-- maxFieldLength specifies max number of *tokens* indexed per
field. Default: 10000 -->
<!-- <maxFieldLength>10000</maxFieldLength> -->
<!-- Maximum time to wait for a write lock (ms) for an IndexWriter.
Default: 1000 -->
<writeLockTimeout>300000</writeLockTimeout>
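For reference, these lock-related settings sit together under <indexConfig> in Solr 3.6. A minimal sketch of such a section follows; the timeout value and the explicit lockType here are illustrative, not a recommendation:

```xml
<indexConfig>
  <!-- Maximum time (ms) an IndexWriter waits for the write lock;
       5 minutes here, as in the question -->
  <writeLockTimeout>300000</writeLockTimeout>
  <!-- 'native' (NativeFSLockFactory) is the default on local filesystems.
       The OS releases the lock when the JVM dies, but the write.lock
       file itself may remain on disk -->
  <lockType>native</lockType>
</indexConfig>
```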
Detailed Output of Exception
Dec 11, 2013 11:27:25 AM org.apache.solr.core.SolrCore execute
INFO: [original] webapp=/solr path=/update params={} status=500 QTime=300070
Dec 11, 2013 11:32:25 AM org.apache.solr.common.SolrException log
SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/solr/solr3/biomina/solr/original/data/index/write.lock
at org.apache.lucene.store.Lock.obtain(Lock.java:84)
at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1098)
at org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:84)
at org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:101)
at org.apache.solr.update.DirectUpdateHandler2.openWriter(DirectUpdateHandler2.java:171)
at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:219)
at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:61)
at org.apache.solr.update.processor.LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:115)
at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:157)
at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:79)
at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:58)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:953)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1023)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
at java.lang.Thread.run(Thread.java:804)
Dec 11, 2013 11:32:25 AM org.apache.solr.core.SolrCore execute
INFO: [original] webapp=/solr path=/update params={} status=500 QTime=556916
System details:
uname -a
Linux solrprod 3.0.93-0.8-default #1 SMP Tue Aug 27 08:44:18 UTC 2013 (70ed288) x86_64 x86_64 x86_64 GNU/Linux
java -version
java version "1.7.0"
Java(TM) SE Runtime Environment (build pxa6470sr6-20131015_01(SR6))
IBM J9 VM (build 2.6, JRE 1.7.0 Linux amd64-64 Compressed References 20131013_170512 (JIT enabled, AOT enabled)
J9VM - R26_Java726_SR6_20131013_1510_B170512
JIT - r11.b05_20131003_47443
GC - R26_Java726_SR6_20131013_1510_B170512_CMPRSS
J9CL - 20131013_170512)
JCL - 20131011_01 based on Oracle 7u45-b18
Accepted answer by Emre Sevinç
The following modifications solved the issue:
1. Applied the changes described at https://stackoverflow.com/a/3035916/236007.
2. Switched to the Oracle Java runtime (it was the IBM Java runtime).
3. Put ulimit -v unlimited in /etc/init.d/tomcat7.
4. Modified the /usr/share/tomcat7/bin/setenv.sh file as follows (giving it about 4 GB of memory):

   export JAVA_OPTS="$JAVA_OPTS -Xmx4000m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/mnt/data/tomcat_dump"
Answered by Gesias
I got Lock obtain timed out issues with the write.lock file. It occurred after a pretty hard reset: after the restart I had two Solr processes running, since the first one had been ungracefully killed.
Running ps aux | grep solr, killing the stale process, and letting the other one start up solved the issue.
Answered by jan
I had this error from general Lucene library usage, and the problem was filesystem errors: the reproducible error disappeared after running fsck with repair. I'm adding this answer to this question because it was the first one I found.