Random java.util.ConcurrentModificationException: null

I have a method that runs every 5 minutes and deletes files from a cache:

private HashMap<String, CachedFile> cache = new HashMap<>();

@Scheduled(fixedDelay = 300000)
public void deleteFileCache() {

    //remove files last accessed > 12h
    cache.entrySet().stream().filter(entry -> LocalDateTime.now().minus(12, ChronoUnit.HOURS)
            .isAfter(entry.getValue().getLastAccessed())).forEach(entry -> {
        File file = new File(tempFolder, entry.getKey());
        boolean deleted = file.delete();
    });

    //remove entries from HashMap
    cache.entrySet().removeIf(entry -> LocalDateTime.now().minus(12, ChronoUnit.HOURS)
            .isAfter(entry.getValue().getLastAccessed()));

    //if little space left remove oldest files
    long freeSpace = tempFolder.getFreeSpace();
    while (freeSpace < 6000000000L) {
        Optional<String> fileToDelete = cache.entrySet().stream()
                .min(Comparator.comparing(stringCachedFileEntry -> stringCachedFileEntry.getValue().getLastAccessed()))
                .map(Map.Entry::getKey);

        fileToDelete.ifPresent(filename -> {
            new File(tempFolder, filename).delete();
            cache.remove(filename);
        });
        freeSpace = tempFolder.getFreeSpace();
    }
}

This method fails with a ConcurrentModificationException about 2-3 times per day. I don't understand why, since the method can only run once at a time, and I am not iterating over the HashMap while removing from it.

It fails on the line with `.min(Comparator.comparing(stringCachedFileEntry ->...`:
java.util.ConcurrentModificationException: null
        at java.util.HashMap$EntrySpliterator.forEachRemaining(HashMap.java:1704) ~[na:1.8.0_212]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[na:1.8.0_212]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[na:1.8.0_212]
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_212]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_212]
        at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:479) ~[na:1.8.0_212]
        at java.util.stream.ReferencePipeline.min(ReferencePipeline.java:520) ~[na:1.8.0_212]
        at ch.my.app.server.FileCacheService.deleteFileCache(FileCacheService.java:113) ~[classes!/:1.0-SNAPSHOT]

A ConcurrentModificationException is not only thrown when you remove elements from the collection you are iterating over. Insertions cause it too. So the likely cause is that your Spring Boot controller writes to the map without synchronizing with the deleteFileCache method.

The obvious solution is to use a thread-safe map instead of HashMap, for example the ConcurrentHashMap mentioned in some of the comments.
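A minimal sketch of that change, assuming the map is shared with a controller thread. For brevity the question's CachedFile value is replaced here by a bare last-accessed timestamp, and the actual file deletion is omitted:

```java
import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified sketch: CachedFile is replaced by a plain timestamp value.
public class FileCache {
    // ConcurrentHashMap's iterators and spliterators are weakly consistent:
    // they never throw ConcurrentModificationException, even while another
    // thread (e.g. a controller) inserts or removes entries concurrently.
    private final Map<String, LocalDateTime> cache = new ConcurrentHashMap<>();

    public void put(String filename, LocalDateTime lastAccessed) {
        cache.put(filename, lastAccessed);
    }

    // The cleanup step from the question: drop entries last accessed > 12h ago.
    public void deleteFileCache() {
        LocalDateTime cutoff = LocalDateTime.now().minus(12, ChronoUnit.HOURS);
        cache.entrySet().removeIf(e -> cutoff.isAfter(e.getValue()));
    }

    public int size() {
        return cache.size();
    }
}
```

Note that removeIf on a ConcurrentHashMap's entry set is safe to run while other threads mutate the map; individual operations are atomic, though the cleanup as a whole is still not one atomic step.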

The documentation of Spliterator, for example, says:

… After binding a Spliterator should, on a best-effort basis, throw ConcurrentModificationException if structural interference is detected. …

From your stack trace it appears that the min method uses a Spliterator.

Structural changes include both insertions and removals (they probably do not include replacing the value mapped to an existing key, which only changes a value).
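The insertion case is easy to reproduce even single-threaded: inserting a new key into a plain HashMap while a stream pipeline iterates over its entry set is exactly the "structural interference" the spliterator detects. This snippet is an illustration, not code from the question:

```java
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;

public class CmeDemo {
    // Returns true if inserting into the map during stream iteration
    // triggers a ConcurrentModificationException.
    public static boolean insertDuringStreamThrows() {
        Map<String, Integer> map = new HashMap<>();
        map.put("a", 1);
        map.put("b", 2);
        try {
            // The stream binds a spliterator over the entry set; putting a
            // NEW key while it runs is a structural modification, which the
            // spliterator reports on a best-effort basis.
            map.entrySet().stream().forEach(e -> map.put("c", 3));
            return false;
        } catch (ConcurrentModificationException expected) {
            return true;
        }
    }
}
```

With two threads (a scheduled cleanup and a controller) the same interference happens only occasionally, which matches the 2-3 failures per day you are seeing.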

Link: Documentation of Spliterator