Python: can reduce be translated into list comprehensions like map, lambda and filter?


It is no secret that reduce is not among the favored functions of the Pythonistas.

Generically, reduce is a left fold on a list.

It is conceptually easy to write a fold in Python that will fold left or right over an iterable:

def fold(func, iterable, initial=None, reverse=False):
    x = initial
    if reverse:
        # Walk the iterable back-to-front to fold from the right.
        iterable = reversed(iterable)
    for e in iterable:
        # If there is no accumulator yet (no initial value given), seed it with the first element.
        x = func(x, e) if x is not None else e
    return x
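For example, a quick sketch of how this might be used (operator.add and operator.concat are chosen purely for illustration):

import operator

fold(operator.add, [1, 2, 3, 4])                      # 10
fold(operator.add, [1, 2, 3, 4], initial=100)         # 110
fold(operator.concat, ["a", "b", "c"], reverse=True)  # 'cba'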

Without some atrocious hack, this cannot be replicated in a comprehension, because there is no accumulator in a comprehension.
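To make that concrete, here is a minimal sketch (the values are only illustrative): a comprehension produces one output per input and has nowhere to carry state between iterations, while reduce threads an accumulator through the whole loop.

from functools import reduce

data = [3, 1, 4, 1, 5]

# Comprehension: each element is transformed independently; one output per input.
doubled = [x * 2 for x in data]                                   # [6, 2, 8, 2, 10]

# reduce: a single accumulator is carried from step to step; one output in total.
running_max = reduce(lambda acc, x: acc if acc > x else x, data)  # 5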

Just use reduce -- or write one that makes more sense to you.


Since a list comprehension definitionally generates another list, you can't use it to generate a single value. They aren't for that. (Well... there is a nasty trick that uses a leaked implementation detail in old versions of Python that can do it. I'm not even going to copy the example code here. Don't do this.)

If you're worried about the stylistic aspects of reduce() and its ilk, don't be. Name your reductions and you'll be fine. So while:

all_union = reduce(lambda a, b: a.union(b), L[1:], L[0])

isn't great, this:

from functools import reduce

def full_union(input):
    """Compute the union of a list of sets."""
    return reduce(set.union, input[1:], input[0])

result = full_union(L)

is pretty clear.
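For instance, assuming L is a non-empty list of sets (the values here are made up):

L = [{1, 2}, {2, 3}, {3, 4}]
full_union(L)   # {1, 2, 3, 4}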

If you're worried about speed, check out the toolz and cytoolz packages, which are 'fast' and 'insanely fast,' respectively. On large datasets, they'll often let you avoid processing your data more than once or loading the whole dataset into memory at once, in contrast to list comprehensions.
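As a rough sketch of the streaming style they encourage (toolz.reduceby is a real function; the records and lambdas here are made up, and cytoolz exposes the same API as a drop-in replacement):

from toolz import reduceby   # pip install toolz

# Hypothetical records; in practice this could be a generator over a huge file.
transactions = [
    {"user": "a", "amount": 10},
    {"user": "b", "amount": 5},
    {"user": "a", "amount": 7},
]

# One pass over the data, one accumulator per group -- no intermediate lists.
totals = reduceby(
    lambda t: t["user"],               # group key
    lambda acc, t: acc + t["amount"],  # reducing step
    transactions,
    0,                                 # initial accumulator for each group
)
# totals == {'a': 17, 'b': 5}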


Not really. List comprehensions are more similar to map, and possibly filter.
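A quick illustration of that correspondence (the values are only for demonstration):

from functools import reduce

nums = [1, 2, 3, 4, 5]

# map and filter translate directly into comprehensions:
doubled = list(map(lambda x: x * 2, nums))        # [2, 4, 6, 8, 10]
doubled_lc = [x * 2 for x in nums]                # same result

evens = list(filter(lambda x: x % 2 == 0, nums))  # [2, 4]
evens_lc = [x for x in nums if x % 2 == 0]        # same result

# reduce, by contrast, has no one-to-one comprehension equivalent:
total = reduce(lambda a, b: a + b, nums)          # 15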