You have a certain amount of money, say P dollars (where P > 0), and you put it into a bank account where it grows at an annual interest rate r, compounded annually. For example, if r = 0.03 then after one year the account will have 1.03 * P dollars in it. You are saving up for a vacation that costs V dollars (you may assume V > P). How many years will it be until you can take that vacation, assuming you add no more money to the account?

Write a function called annualInterestYears which takes P, r, V as parameters and returns the number of years the money needs to sit until you can take that vacation.

annualInterestYears(1000, 0.03, 1200) → 7
annualInterestYears(1, 0.01, 1000) → 695
annualInterestYears(10, 0.99, 50) → 3
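A minimal sketch of one possible solution, assuming a Java-style signature (the language and exact parameter types are assumptions, not given by the problem page): multiply the balance by (1 + r) once per year and count the years until it reaches V.

// Sketch only: assumes double parameters and an int return value.
public int annualInterestYears(double P, double r, double V) {
  int years = 0;
  double balance = P;
  // Compound the balance by (1 + r) each year until it covers the vacation cost.
  while (balance < V) {
    balance = balance * (1 + r);
    years++;
  }
  return years;
}

Tracing the third example: with P = 10 and r = 0.99 the balance goes 10 → 19.9 → 39.601 → 78.8, reaching 50 after 3 years, matching the expected output of 3.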