How to find the most frequent letters and the number of occurrences, using Stream API?
I have a list containing single letters. I need to count all the duplicates and find the most frequent ones. The list is randomly generated, so it may contain several most frequent letters.
Is it possible to create only one map inside one Stream, or to put the second map inside the Stream? I want a single Stream chain, using the method groupingBy().
public static void mostFrequentlyDuplicateLetters(List<String> letterList) {
    Map<String, Long> collect = letterList
            .stream()
            .collect(Collectors.groupingBy(String::valueOf, Collectors.counting()))
            .entrySet()
            .stream()
            .filter(/* How to find the most frequent letters and put them on a map? */)
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
}
First of all, the return type you need is actually Map.Entry, not the whole map, since you only want the entry with the highest number of occurrences.
You can try it like this:
public Map.Entry<String, Long> mostFrequentlyDuplicateLetters(List<String> letterList) {
    return letterList
            .stream()
            // count the occurrences of each letter
            .collect(Collectors.groupingBy(String::valueOf, Collectors.counting()))
            .entrySet()
            .stream()
            // pick the entry with the highest count
            .max(Map.Entry.comparingByValue())
            .get(); // throws NoSuchElementException if the list is empty
}
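For illustration, here is a minimal, self-contained sketch of how that method could be invoked. The wrapping class name MostFrequentEntryDemo and the use of orElseThrow() (a Java 10+ equivalent of get()) are my assumptions, not part of the original answer. Note that when several letters tie for the maximum, only one of them is returned, and which one is unspecified:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MostFrequentEntryDemo {

    public static Map.Entry<String, Long> mostFrequentlyDuplicateLetters(List<String> letterList) {
        return letterList.stream()
                // count the occurrences of each letter
                .collect(Collectors.groupingBy(String::valueOf, Collectors.counting()))
                .entrySet().stream()
                // pick the entry with the highest count
                .max(Map.Entry.comparingByValue())
                .orElseThrow(); // empty input has no "most frequent" entry
    }

    public static void main(String[] args) {
        // B and C both occur 3 times; only one of the two entries is returned
        System.out.println(mostFrequentlyDuplicateLetters(
                List.of("A", "B", "B", "C", "C", "C", "B", "D")));
    }
}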
"it may contain several most frequent letters"
The solution below determines all the letters having the maximum frequency in the given list.
public static void main(String[] args) {
    Map<String, Long> frequencies = getFrequencyMap(List.of("A", "B", "B", "C", "C", "C", "B", "D"));
    long max = getMaxFrequency(frequencies);
    System.out.println("max = " + max);
    System.out.println(mostFrequentlyDuplicateLetters(frequencies, max));
}

// step 1: count the occurrences of each letter
public static Map<String, Long> getFrequencyMap(List<String> letters) {
    return letters.stream()
            .collect(Collectors.groupingBy(UnaryOperator.identity(), Collectors.counting()));
}

// step 2: find the highest count (0 for an empty map)
public static long getMaxFrequency(Map<String, Long> frequencies) {
    return frequencies.values().stream()
            .mapToLong(Long::longValue)
            .max()
            .orElse(0);
}

// step 3: collect every letter whose count equals the maximum
public static List<String> mostFrequentlyDuplicateLetters(Map<String, Long> frequencies,
                                                          long frequency) {
    return frequencies.entrySet().stream()
            .filter(entry -> entry.getValue() == frequency)
            .map(Map.Entry::getKey)
            .collect(Collectors.toList());
}
Output (the letters B and C both have a frequency of 3):
max = 3
[B, C]
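As a side note on getMaxFrequency: the same maximum can be obtained without a stream via Collections.max. The sketch below is my alternative, not part of the original answer; Collections.max throws NoSuchElementException on an empty collection, so the empty-map default of 0 has to be handled explicitly:

import java.util.Collections;
import java.util.Map;

public class MaxFrequencyAlternative {

    // Behaves like getMaxFrequency above, including the 0 default for an empty map
    public static long getMaxFrequency(Map<String, Long> frequencies) {
        return frequencies.isEmpty() ? 0 : Collections.max(frequencies.values());
    }

    public static void main(String[] args) {
        System.out.println(getMaxFrequency(Map.of("B", 3L, "C", 3L, "A", 1L))); // 3
    }
}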
"Is it possible to create only one map inside one Stream"
If you want your code to be both efficient and clean, the answer is NO. By trying to fuse multiple concerns into one method, you would violate the first of the SOLID principles - the single responsibility principle.
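That said, if a single collect() chain is really wanted, Collectors.collectingAndThen can technically fold both steps into one expression. The following is only a sketch under that assumption (the class name SingleChainSketch is mine): the second pass over the frequency map does not disappear, it is merely hidden inside the finisher, which is exactly the readability cost mentioned above:

import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class SingleChainSketch {

    public static List<String> mostFrequentLetters(List<String> letters) {
        return letters.stream()
                .collect(Collectors.collectingAndThen(
                        // first pass: letter -> number of occurrences
                        Collectors.groupingBy(Function.identity(), Collectors.counting()),
                        // finisher: second pass over the map, keeping only the maxima
                        freq -> {
                            long max = freq.values().stream()
                                    .mapToLong(Long::longValue)
                                    .max()
                                    .orElse(0);
                            return freq.entrySet().stream()
                                    .filter(e -> e.getValue() == max)
                                    .map(Map.Entry::getKey)
                                    .collect(Collectors.toList());
                        }));
    }

    public static void main(String[] args) {
        System.out.println(mostFrequentLetters(
                List.of("A", "B", "B", "C", "C", "C", "B", "D"))); // [B, C]
    }
}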