I have a slice of 30,000 strings. How do I split the processing of this slice across, say, 10 goroutines, each taking 3,000 strings from the slice, extracting some data from them, and pushing the results into a new slice?
So, in the end, I will have 10 slices with 3000 processed results in each. What's the pattern to handle this problem?
I have had a look at this article, but I'm not sure which of these patterns applies to my case.
Read the elements from the slice onto a channel, fan out to a pool of worker goroutines to distribute the load, process the strings in those goroutines, and then fan the results back in to a single collecting goroutine so you can avoid mutexes.
You may want to cap the maximum number of concurrent goroutines.
Keep in mind that slices are not safe for concurrent writes, which is why collecting the results in a single goroutine matters.
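Here is a minimal sketch of that fan-out/fan-in shape, assuming 10 workers and a placeholder process function standing in for whatever extraction you actually do per string:

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// process stands in for whatever data extraction you do on each string.
func process(s string) string {
	return strings.ToUpper(s)
}

func main() {
	// Fake input data for the example.
	input := make([]string, 30000)
	for i := range input {
		input[i] = fmt.Sprintf("item-%d", i)
	}

	const workers = 10

	jobs := make(chan string)
	results := make(chan string)

	// Fan out: a fixed pool of worker goroutines reads from jobs.
	var wg sync.WaitGroup
	wg.Add(workers)
	for w := 0; w < workers; w++ {
		go func() {
			defer wg.Done()
			for s := range jobs {
				results <- process(s)
			}
		}()
	}

	// Feed the slice onto the jobs channel, then close it so workers stop.
	go func() {
		for _, s := range input {
			jobs <- s
		}
		close(jobs)
	}()

	// Close results once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Fan in: a single goroutine (main, here) collects everything, so the
	// output slice is only ever appended to from one place — no mutex needed.
	out := make([]string, 0, len(input))
	for r := range results {
		out = append(out, r)
	}

	fmt.Println("processed", len(out), "strings")
}
```

If you specifically want 10 separate result slices (one per worker) rather than one combined slice, give each goroutine its own chunk of the input (by index range) and its own pre-allocated output slice, then combine them after wg.Wait(); since no two goroutines share a slice, that is also safe without locks.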
Useful info:

https://blog.golang.org/pipelines
https://talks.golang.org/2012/concurrency.slide#1
https://blog.golang.org/advanced-go-concurrency-patterns
https://talks.golang.org/2013/advconc.slide#1