Recursively implementing 'minimum number of coins' in Python
Posted by user5198 on Stack Overflow, 2012-06-11.
This problem is the same as the one asked here.
Given a list of coins with values (c1, c2, c3, ..., cj, ...) and a total sum i, find the minimum number of coins whose sum is i (we can use as many coins of each type as we want), or report that it is not possible to select coins that sum to i.
I"m just introduced to dynamic programming yesterday and I tried to make a code for it.
# Optimal substructure: C[i] = 1 + min_j(C[i - cj])
cdict = {}

def C(i, coins):
    if i <= 0:
        return 0
    if i in cdict:
        return cdict[i]
    else:
        answer = 1 + min([C(i - cj, coins) for cj in coins])
        cdict[i] = answer
        return answer
Here, C[i] is the optimal solution for an amount of money i, and the available coins are {c1, c2, ..., cj, ...}. For the program, I've increased the recursion limit to avoid the maximum recursion depth exceeded error. But this program gives the right answer only sometimes, and when a solution is not possible, it doesn't indicate that.
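For example, here is a concrete case (the coin values are my own test input, not from the original problem) where the code silently returns a wrong answer:

# 5 cannot be made from 2s and 4s, yet C reports an answer.
print(C(5, [2, 4]))   # prints 2 instead of signalling "not possible"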
What is wrong with my code, and how can I correct it?
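Here is an untested sketch of the kind of fix I have in mind (the float('inf') convention for marking "impossible" amounts is my own choice, not from the original problem): make i == 0 the only success base case, return infinity for negative i so overshoots never win the min(), and check for infinity at the end.

# Untested sketch: i == 0 is the only success base case; negative i
# is an overshoot, so it yields float('inf') and never wins min().
cache = {}

def min_coins(i, coins):
    if i == 0:
        return 0                    # exact change reached: no more coins needed
    if i < 0:
        return float('inf')         # overshoot: this branch is impossible
    if i not in cache:
        cache[i] = 1 + min(min_coins(i - cj, coins) for cj in coins)
    return cache[i]

answer = min_coins(11, [1, 5, 6])
print(answer if answer != float('inf') else "not possible")   # -> 2 (5 + 6)

Like my original cdict, the cache here is a shared global, so it would have to be cleared between runs with different coin lists, and a very large i would still need the raised recursion limit.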
© Stack Overflow or respective owner