Given a 5-digit binary number, convert the number to a decimal value. The binary number is passed as a string of 0's and 1's, beginning with the most significant digit.

binaryToDecimal('11111') → 31
binaryToDecimal('00000') → 0
binaryToDecimal('00101') → 5
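One way to approach this (a sketch, not the site's reference solution): walk the string left to right, doubling a running total and adding each bit, so each earlier digit ends up weighted by the correct power of two.

```python
def binaryToDecimal(s):
    # Accumulate left to right: each step shifts the running
    # value up one binary place, then adds the current bit.
    value = 0
    for ch in s:
        value = value * 2 + int(ch)
    return value

print(binaryToDecimal('11111'))  # → 31
print(binaryToDecimal('00000'))  # → 0
print(binaryToDecimal('00101'))  # → 5
```

This loop works for any length of binary string, not just 5 digits, since the doubling step implicitly applies the right power of two to every digit seen so far.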
Difficulty: 125 Post-solution available
Copyright Nick Parlante 2017