I have a Map with sales numbers divided by year:
Map<Integer, BigDecimal> sales_by_year = new TreeMap<>();
sales_by_year.put(2012, BigDecimal.valueOf(19283));
sales_by_year.put(2013, BigDecimal.valueOf(24832));
sales_by_year.put(2014, BigDecimal.valueOf(19562));
sales_by_year.put(2015, BigDecimal.valueOf(21879));
sales_by_year.put(2016, BigDecimal.valueOf(23587));
sales_by_year.put(2017, BigDecimal.valueOf(28756));
and a list of which years I want to add up these sales:
Set<Integer> years = new HashSet<>(Arrays.asList(2012, 2013, 2014));
and I want to write a lambda to combine the sales for these years into one BigDecimal. I wrote this:
BigDecimal sales_for_timeframe = sales_by_year.entrySet().stream()
.filter(a -> years.contains(a.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue))
.values().stream()
.reduce(BigDecimal.ZERO, BigDecimal::add);
System.out.println(sales_for_timeframe);
and it works. But is this the most efficient way? Is there a way to reduce the number of steps? Will doing so actually increase efficiency, or is this already the best solution?
You could shorten it slightly:
BigDecimal sum = years.stream()
.map(y -> sales_by_year.getOrDefault(y, BigDecimal.ZERO))
.reduce(BigDecimal.ZERO, BigDecimal::add);
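This skips building the intermediate map entirely: you iterate over the (small) set of years, look each one up directly, and getOrDefault guards against a year that has no entry in sales_by_year. With this little data either version is effectively instant, so the gain is mainly readability; the shortened one just avoids streaming and filtering entries you never wanted.

For reference, here is a minimal, self-contained sketch of the shortened version using the same data as in your question. The class name SalesSum is only for illustration, and Set.of requires Java 9+; on Java 8 keep your HashSet construction instead.

import java.math.BigDecimal;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;

public class SalesSum {
    public static void main(String[] args) {
        // Sales per year, keyed by year (same data as in the question)
        Map<Integer, BigDecimal> sales_by_year = new TreeMap<>();
        sales_by_year.put(2012, BigDecimal.valueOf(19283));
        sales_by_year.put(2013, BigDecimal.valueOf(24832));
        sales_by_year.put(2014, BigDecimal.valueOf(19562));
        sales_by_year.put(2015, BigDecimal.valueOf(21879));
        sales_by_year.put(2016, BigDecimal.valueOf(23587));
        sales_by_year.put(2017, BigDecimal.valueOf(28756));

        // Years to include in the total (Set.of is Java 9+)
        Set<Integer> years = Set.of(2012, 2013, 2014);

        // Look each year up directly; years missing from the map contribute ZERO
        BigDecimal sum = years.stream()
                .map(y -> sales_by_year.getOrDefault(y, BigDecimal.ZERO))
                .reduce(BigDecimal.ZERO, BigDecimal::add);

        System.out.println(sum); // 63677
    }
}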