 
Understanding "CancellationException: Task was cancelled" error while doing a Google Datastore query
I am using Google App Engine version 1.9.48. On some of my Datastore queries I randomly get a "CancellationException: Task was cancelled" error, and I'm not sure exactly what causes it. From other Stack Overflow posts I vaguely gather that it is related to timeouts, but I'm not entirely sure of the cause. I am not using any TaskQueues, in case that helps.
Here is the stack trace:
java.util.concurrent.CancellationException: Task was cancelled.
at com.google.common.util.concurrent.AbstractFuture.cancellationExceptionWithCause(AbstractFuture.java:1126)
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:504)
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:407)
at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:86)
....
at com.sun.proxy.$Proxy14.size(Unknown Source)
at main.java.com.continentalist.app.model.Model.getEntitySentimentCounts(Model.java:285)
at main.java.com.continentalist.app.model.Model.access0(Model.java:37)
at main.java.com.continentalist.app.model.Model.vrun(Model.java:251)
at com.googlecode.objectify.VoidWork.run(VoidWork.java:14)
at com.googlecode.objectify.VoidWork.run(VoidWork.java:11)
at com.googlecode.objectify.ObjectifyService.run(ObjectifyService.java:81)
...
Here is my App Engine code that throws the error. I've added line comments where it is thrown (usually at one of the list().size() calls):
private EntityAnalysis getEntitySentimentCounts(ComboCall comboCall) {
    Query<ObjectifyArticle> queryArticles = ofy().load().type(ObjectifyArticle.class);
    queryArticles = queryArticles.filter("domain", comboCall.getDomain());
    Set<Entity> entitySet = comboCall.getEntitySet();
    SentimentCount[] allSentimentCounts = new SentimentCount[entitySet.size()];
    int index = 0;
    for (Entity eachEntity : entitySet) {
        SentimentCount sentimentCount = new SentimentCount();
        String eachEntityName = eachEntity.getText();
        Query<ObjectifyArticle> newQuery = queryArticles;
        newQuery = newQuery.filter("entityName", eachEntityName);
        sentimentCount.setEntityName(eachEntityName);
        Query<ObjectifyArticle> positiveFilter = newQuery;
        positiveFilter = positiveFilter.filter("entityType", POSITIVE);
        int positive = positiveFilter.list().size(); // ERROR EITHER HERE
        sentimentCount.setPositiveCount(positive + "");
        Query<ObjectifyArticle> negativeFilter = newQuery;
        negativeFilter = negativeFilter.filter("entityType", NEGATIVE);
        int negative = negativeFilter.list().size(); // OR HERE
        sentimentCount.setNegativeCount("" + negative);
        Query<ObjectifyArticle> neutralFilter = newQuery;
        neutralFilter = neutralFilter.filter("entityType", NEUTRAL);
        int neutral = neutralFilter.list().size(); // OR HERE
        sentimentCount.setNeutralCount("" + neutral);
        allSentimentCounts[index] = sentimentCount;
        index++;
    }
    EntityAnalysis entityAnalysis = new EntityAnalysis();
    entityAnalysis.setDomain(comboCall.getDomain());
    entityAnalysis.setSentimentCount(allSentimentCounts);
    return entityAnalysis;
}
There is no need to call .list().size(); just call count() instead.
If you are only counting, use a keys-only query - it is free and much faster.
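A minimal sketch of that suggestion, reusing the entity type and constants from the question (ObjectifyArticle, POSITIVE, comboCall); this is illustrative, not a tested drop-in replacement:

```java
// Counting without loading entity payloads.
// Assumes the ObjectifyArticle entity, the POSITIVE constant, and the
// static ofy() import from the question.
Query<ObjectifyArticle> base = ofy().load().type(ObjectifyArticle.class)
        .filter("domain", comboCall.getDomain())
        .filter("entityName", eachEntityName);

// count() executes the query and counts results without materializing
// the full entities, replacing the expensive list().size() pattern.
int positive = base.filter("entityType", POSITIVE).count();

// Explicit keys-only form: fetch only the keys and count those.
int positiveViaKeys = base.filter("entityType", POSITIVE).keys().list().size();
```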
When you expect to process a large number of entities, don't forget to set chunkAll() on the query. It is much faster than the default setting.
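A sketch of the chunkAll() suggestion on the same query shape as the question (ObjectifyArticle and comboCall are assumed from the question's code):

```java
// chunkAll() tells Objectify to fetch results in the largest possible
// batches rather than the small default chunk size, which cuts the
// number of round trips to the datastore on large result sets.
List<ObjectifyArticle> articles = ofy().load()
        .type(ObjectifyArticle.class)
        .filter("domain", comboCall.getDomain())
        .chunkAll()
        .list();
```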
If you still run into these exceptions, you need to use cursors in your queries.
Common causes of this kind of error:
- An API call such as datastore.list times out.
  Workaround: read the data in pages using cursors.
- The parent method hits a request timeout limit, for example a cron job hitting the 10-minute limit on standard App Engine.
  Solution: increase the timeout, use multithreading, or use caching to speed up execution.
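The cursor-based paging mentioned above can be sketched roughly like this with Objectify; the batch size of 1000 and the loop structure are illustrative assumptions, and the entity type is the one from the question:

```java
// Paging through a large result set with a cursor so no single
// datastore call runs long enough to be cancelled.
Query<ObjectifyArticle> query = ofy().load().type(ObjectifyArticle.class)
        .filter("domain", comboCall.getDomain())
        .limit(1000); // process in bounded batches (size is an assumption)

int count = 0;
String cursorStr = null;
boolean more = true;
while (more) {
    if (cursorStr != null) {
        // Resume the query where the previous batch stopped.
        query = query.startAt(Cursor.fromWebSafeString(cursorStr));
    }
    QueryResultIterator<Key<ObjectifyArticle>> it = query.keys().iterator();
    more = false;
    while (it.hasNext()) {
        it.next();
        count++;
        more = true; // this batch was non-empty, so try another one
    }
    if (more) {
        // Remember where this batch ended for the next iteration.
        cursorStr = it.getCursor().toWebSafeString();
    }
}
```

Each iteration issues a short, bounded query, so the web-safe cursor string can even be handed off to a task queue or a follow-up request instead of looping in one request.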
 