Can I split an iterator with Item=(key, val) into separate key iter and val iter without collecting or cloning?

I have a function that looks like this:

    fn set_many(&mut self, key_vals: impl IntoIterator<Item = (Index, T)>) {
        let (keys, vals): (Vec<_>, Vec<_>) = key_vals.into_iter().unzip();

        let mut viter = vals.into_iter();
        let mut iter_mut = self.many_iter_mut(keys);

        while let Some(mut setter) = iter_mut.next() {
            setter.set(viter.next().unwrap());
        }
    }

    fn main() {
        let mut v = SimpleVec(vec![100, 200, 300, 400, 500]);
        v.set_many([(1, 20), (2, 30), (4, 50)]);
        println!("modified vec: {:?}", v);
    }

Output:

    modified vec: SimpleVec([100, 20, 30, 400, 50])

This works, but calling unzip() collects the keys and vals into two separate Vecs. For large collections, this extra allocation is undesirable.

It would work to change the function signature and accept separate key and val iter params. Unfortunately, this function is part of a public API and is already in use, so I don't wish to cause unnecessary breakage downstream.

So I would like to know: is there any way to rewrite this function without collecting?

It would also be nice to avoid the unwrap().

Here is a playground


  • No, this isn't really possible. The reason it's possible to zip iterators incrementally but not unzip them is actually pretty easy to understand: unzipping requires virtually unlimited buffering.

    If you had an iterator I that you could unzip into two iterators, A and B, these two iterators would have to share a reference to I, since advancing I is the only way to get the next key and value. However, if you advanced A by 100 items and not B, you'd still have to keep hold of the 100 values that had been pulled out of I but not yet read through B. In general, iterators are not expected to buffer their contents more than necessary, since this can add substantial memory usage.

    On the other hand, it's possible to zip A and B into I by simply advancing A and B one item each for each call to I's next. This is much more efficient, so it's implemented.