Paralipsis
Reading between the lines, Python has a complex and lengthy relationship with functional programming. A couple of years ago Guido van Rossum wrote:
About 12 years ago, Python acquired lambda, reduce(), filter() and map(), courtesy of (I believe) a Lisp hacker who missed them and submitted working patches. But, despite of the PR value, I think these features should be cut from Python 3000.
The argument against map and filter is that list comprehensions serve the same need; and similarly, local functions render lambda inessential. Python prefers there to be a single obvious way to do things. As it happens, lambda will persist, and since Python lambdas are limited to single expressions, they’re quite hard to abuse. Map and filter are out, sort of. In a subtle but inspired move they’re being over-written by their lazier counterparts from the itertools module, imap and ifilter. So in Python 3000 list comprehensions are the way to create lists from lists (and iterables, of course); and map and filter are two of the most important iterator adaptors/stream processors. All things considered, Pythonic support for functional programming is definitely on the up.
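To make the comparison concrete, here’s a quick sketch (mine, not Guido’s) of the same job done three ways, eagerly and lazily:

from itertools import imap, ifilter

words = ['map', 'filter', 'lambda', 'reduce']
lengths_lc   = [len(w) for w in words]               # list comprehension: builds the list directly
lengths_map  = map(len, words)                       # map: builds the same list, eagerly
lengths_lazy = imap(len, words)                      # imap: an iterator, values produced on demand
short_lazy   = ifilter(lambda w: len(w) < 6, words)  # ifilter: likewise lazy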
Reduce really is to depart from the language core, though I guess homesick Lisp hackers will be able to find it kicking its heels somewhere in functools. The primary argument against reduce seems to be that it’s been abused by naughty programmers to create unreadable and inefficient code.
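Concretely, the relocation amounts to no more than this (and functools.reduce is already there in Python 2.6, ahead of the move):

from functools import reduce

total = reduce(lambda acc, x: acc + x, [1, 2, 3, 4], 0)   # 10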
If I discover my children using their toys inappropriately — for hitting each other, destroying furniture, blocking plumbing, etc. — then the toys are confiscated (after fair warning, and temporarily; I’m a pretty soft dictator). Once removed, these toys become highly desirable; once returned, less so.
I don’t use reduce much (as the article goes on to point out, efficient container operations such as sum, string.join, max, min, and more recent additions to the language like any and all eliminate any common need for it), or at least I didn’t. Now though, despite – maybe because of – its imminent demise, reduce refuses to be ignored. Here are some of its greatest hits, all of which I’ve found useful recently.
def unite(sets):
    # Union of an iterable of sets; an empty iterable gives the empty set.
    return reduce(set.union, sets, set())

def intersect(sets):
    # Intersection of a collection of sets; empty input gives the empty set.
    return reduce(set.intersection, sets) if sets else set()

def bits_to_integer(bits):
    # Fold a sequence of bits, most significant first, into an integer.
    return reduce(lambda acc, bit: acc << 1 | bit, bits, 0)

def concatenate(items, initial):
    from operator import concat
    return reduce(concat, items, initial)

def product(items):
    from operator import mul
    return reduce(mul, items, 1)
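For a rough idea of how these behave (my examples, not part of the original listing):

sets = [set('ab'), set('bc'), set('bd')]
unite(sets) == set('abcd')        # True
intersect(sets) == set('b')       # True
bits_to_integer([1, 0, 1, 1])     # 11
concatenate([[1], [2, 3]], [])    # [1, 2, 3]
product([1, 2, 3, 4])             # 24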
Maybe we’ll see a few of these absorbed into the core language (I’m kind of surprised that set.intersection and set.union aren’t already flexible enough to accept more general inputs).
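For the record, from Python 2.6 both methods accept multiple arguments, so with a non-empty list of sets you can get most of the way there by star-unpacking, though that’s still not quite the more general interface I have in mind:

set.union(*sets)          # union of them all, provided sets[0] is a set
set.intersection(*sets)   # likewise; blows up if sets is empty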
In case it looks as if I’m making a case for keeping reduce as a built-in, here’s a chunk of hideously inefficient and dys-functional code which demonstrates why I deserve to have it taken away from me.
from operator import add, mul, div
from itertools import *
from functools import partial

accu = lambda terms: reduce(add, terms, 0)
prod = lambda terms: reduce(mul, terms, 1)
flip = partial(div, 1.0)   # reciprocal: flip(x) == 1.0 / x

def fact(n):
    # n! as the product of 2..n (the empty product is 1, so fact(0) == fact(1) == 1)
    return prod(islice(count(), 2, n + 1))

def sum_n(terms, n):
    # Sum of the first n elements drawn from the terms iterator.
    return accu(islice(terms, n))

def e():
    # Successive approximations to e, the partial sums of 1/0! + 1/1! + 1/2! + ...,
    # with every term recomputed from scratch at every step.
    def terms():
        return imap(flip, imap(fact, count()))
    return imap(lambda n: sum_n(terms(), n), count())
For what it’s worth, (the current version of) 2to3 leaves this code unchanged.
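Driving it (a throwaway check of mine, under Python 2 since it leans on imap and operator.div) shows the familiar approximations creeping towards e:

from itertools import islice
print list(islice(e(), 6))   # [0, 1.0, 2.0, 2.5, 2.666..., 2.708...]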