I'm attempting to write a function that calculates the number of unique permutations of a string. For example, 'aaa' would return 1 and 'abc' would return 6.
I'm writing the method like this (pseudocode):
len(string)! / (A!*B!*C!*...)
where A, B, C, ... are the numbers of occurrences of each unique character. For example, the string 'aaa' would be 3! / 3! = 1, while 'abc' would be 3! / (1! * 1! * 1!) = 6.
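For small strings, the formula can be sanity-checked by brute force; a minimal sketch using itertools.permutations (not part of my function):

from itertools import permutations

# Generate every ordering and deduplicate; only feasible for short strings.
print(len(set(permutations('aaa'))))  # 1
print(len(set(permutations('abc'))))  # 6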
My code so far is like this:
def perm(n):
    '''
    Returns the number of UNIQUE permutations of the string n.
    '''
    from math import factorial
    lst = []
    n = str(n)
    for l in set(n):
        lst.append(n.count(l))
    return factorial(len(n)) / reduce(lambda x, y: factorial(x) * factorial(y), lst)
Everything works fine except when I pass a string that has only one unique character, e.g. 'aaa' - I get the wrong answer:
>>> perm('abc')
6
>>> perm('aaa')
2
>>> perm('aaaa')
6
Now, I can tell the problem is in running the lambda with factorials on a list of length 1, but I don't know why. Most other lambda functions work on a list of length 1 even if they're expecting two elements:
>>> reduce(lambda x,y: x * y, [3])
3
>>> reduce(lambda x,y: x + y, [3])
3
This one doesn't:
>>> reduce(lambda x,y: ord(x) + ord(y), ['a'])
'a'
>>> reduce(lambda x,y: ord(x) + ord(y), ['a','b'])
195
Is there something I should be doing differently? I know I can rewrite the function in many ways that circumvent this (e.g. not using lambda), but I'm looking for why this specifically doesn't work.
Python's reduce function can't guess what the default (initial) value should be: called on a one-element list with no initializer, it simply returns that element without ever calling your lambda, so factorial is never applied to the lone count. That's why perm('aaa') gives 3! / 3 = 2 and perm('aaaa') gives 4! / 4 = 6. reduce accepts an optional third argument for exactly this reason; supply a sensible initial value and your reduce should work beautifully.
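To see this concretely, here is a minimal demonstration (assuming Python 3, where reduce must be imported from functools; in Python 2, as in the question, it's a builtin):

from functools import reduce
from math import factorial

# Without an initializer, a one-element sequence is returned as-is;
# the lambda is never called, so factorial is never applied:
print(reduce(lambda x, y: factorial(x) * factorial(y), [3]))     # 3

# With an initializer of 1, the lambda runs once: factorial(1) * factorial(3)
print(reduce(lambda x, y: factorial(x) * factorial(y), [3], 1))  # 6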
Also, from the comments: applying factorial to both arguments factorializes the running product once the list has more than two entries, so you should apply factorial only to the second argument in your lambda:

reduce(lambda x, y: x * factorial(y), lst, 1)
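Putting both fixes together, a corrected version might look like this (a sketch assuming Python 3, where reduce lives in functools and // gives integer division):

from functools import reduce
from math import factorial

def perm(s):
    '''Returns the number of UNIQUE permutations of the string s.'''
    s = str(s)
    counts = [s.count(c) for c in set(s)]
    # Start from 1 and apply factorial only to each count,
    # never to the running product.
    denominator = reduce(lambda acc, c: acc * factorial(c), counts, 1)
    return factorial(len(s)) // denominator  # always divides evenly

print(perm('aaa'))   # 1
print(perm('abc'))   # 6
print(perm('aabb'))  # 4! / (2! * 2!) = 6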