
Have a set of Tasks with only X running at a time


SemaphoreSlim maxThread = new SemaphoreSlim(10);

for (int i = 0; i < 115; i++)
{
    maxThread.Wait();
    Task.Factory.StartNew(() =>
        {
            // Your work here
        },
        TaskCreationOptions.LongRunning)
    .ContinueWith((task) => maxThread.Release());
}
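Note that `Wait()` blocks the loop's thread while waiting for a slot. A fully asynchronous sketch of the same throttling idea uses `WaitAsync` with `try`/`finally` so the slot is released even if the work throws (`WorkAsync` here is a placeholder for your own work, not part of the original):

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Throttled
{
    // Placeholder for whatever per-item work you actually do.
    static Task WorkAsync(int i) => Task.Delay(10);

    static async Task RunAsync()
    {
        SemaphoreSlim gate = new SemaphoreSlim(10); // at most 10 in flight

        var tasks = Enumerable.Range(0, 115).Select(async i =>
        {
            await gate.WaitAsync();
            try
            {
                await WorkAsync(i);
            }
            finally
            {
                gate.Release(); // always free the slot, even on failure
            }
        }).ToList();

        await Task.WhenAll(tasks);
    }
}
```

The `try`/`finally` matters: with the `ContinueWith` version above, an unhandled exception in the work still releases the semaphore, but the async/await form makes that guarantee explicit.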


TPL Dataflow is great for doing things like this. You can create a 100% async version of Parallel.Invoke pretty easily:

async Task ProcessTenAtOnce<T>(IEnumerable<T> items, Func<T, Task> func)
{
    ExecutionDataflowBlockOptions edfbo = new ExecutionDataflowBlockOptions
    {
        MaxDegreeOfParallelism = 10
    };
    ActionBlock<T> ab = new ActionBlock<T>(func, edfbo);
    foreach (T item in items)
    {
        await ab.SendAsync(item);
    }
    ab.Complete();
    await ab.Completion;
}
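For example, to fetch a list of URLs with at most ten concurrent requests (a sketch; `urls` and `httpClient` are assumed to exist in your code, they are not part of the method above):

```csharp
// Assumed to be defined elsewhere:
//   IEnumerable<string> urls;
//   HttpClient httpClient;
await ProcessTenAtOnce(urls, async url =>
{
    string body = await httpClient.GetStringAsync(url);
    Console.WriteLine("{0}: {1} chars", url, body.Length);
});
```

`SendAsync` applies backpressure: once the block's input queue is full, the `foreach` in `ProcessTenAtOnce` awaits until a slot frees up, so you never buffer the whole input eagerly.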


You have several options. You can use Parallel.Invoke for starters:

public void DoWork(IEnumerable<Action> actions)
{
    Parallel.Invoke(
        new ParallelOptions() { MaxDegreeOfParallelism = 10 },
        actions.ToArray());
}

Here is an alternate option that works harder to have exactly 10 tasks running (although the number of thread-pool threads processing those tasks may differ), and that returns a Task indicating when it finishes rather than blocking until done.

public Task DoWork(IList<Action> actions)
{
    List<Task> tasks = new List<Task>();
    int numWorkers = 10;
    int batchSize = (int)Math.Ceiling(actions.Count / (double)numWorkers);
    foreach (var batch in actions.Batch(batchSize))
    {
        tasks.Add(Task.Factory.StartNew(() =>
        {
            foreach (var action in batch)
            {
                action();
            }
        }));
    }
    return Task.WhenAll(tasks);
}

If you don't have MoreLinq for the Batch function, here's my simpler implementation:

public static IEnumerable<IEnumerable<T>> Batch<T>(this IEnumerable<T> source, int batchSize)
{
    List<T> buffer = new List<T>(batchSize);
    foreach (T item in source)
    {
        buffer.Add(item);
        if (buffer.Count >= batchSize)
        {
            yield return buffer;
            buffer = new List<T>();
        }
    }
    if (buffer.Count > 0) // don't yield a trailing empty batch
    {
        yield return buffer;
    }
}
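As a quick sanity check (assuming the Batch extension above is in scope), 95 items batched by 10 produce ten batches, with the last holding the remaining five:

```csharp
// Assumes the Batch<T> extension method above is in scope.
var batches = Enumerable.Range(1, 95).Batch(10).Select(b => b.ToList()).ToList();
Console.WriteLine(batches.Count);        // 10
Console.WriteLine(batches.Last().Count); // 5
```

Note the `Select(b => b.ToList())`: the iterator reuses nothing, but snapshotting each batch before further deferred enumeration is a good habit with `yield`-based sequences.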