risk
In the board game Risk, armies attack each other and resolve the battle by rolling dice. The attacker, under varying conditions, rolls 1, 2, or 3 dice. The defender rolls 1 or 2, and never more than the attacker. The highest roll of each player is then compared to determine who defeats the opponent; ties go to the defender. So if the attacker's high roll is 5, the defender wins with a 5 or 6, and the attacker wins if the defender's high roll is 4 or less. If the defender rolls two dice, the second-highest die of each player is compared next. For example, if the attacker rolls [5, 3] and the defender rolls [3, 4], the attacker wins one point because the highest, 5, defeats 4; but the defender wins one because the second highest for each player is 3, and ties go to the defender. So the net gain for the attacker is +1 plus -1, or 0.

Your task is to define the function risk(numA, numD) which returns the average net gain for the attacker (averaged over all possible dice rolls) when the attacker rolls numA dice and the defender rolls numD. For example, risk(1, 1) = -0.166... because if each player rolls 1 die, there are 6*6 = 36 possible combinations; of those, the attacker is higher in 15, lower in 15, and there are 6 ties. Ties go to the defender, so the average net gain is (+1*15 + -1*(15+6))/36 = -0.166...

risk(3, 2) → 0.15817901234567902
risk(3, 1) → 0.3194444444444444
risk(2, 2) → -0.44135802469135804
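One way to approach the problem is brute force: enumerate every equally likely combination of attacker and defender rolls, score each battle by the comparison rules above, and average. The sketch below takes that route (it is one possible solution, not the only one; the helper structure is a choice, not part of the spec):

```python
import itertools

def risk(numA, numD):
    """Average net gain for the attacker when rolling numA dice vs numD."""
    total = 0
    count = 0
    # Every ordered roll of numA dice x every ordered roll of numD dice
    # is equally likely, so a plain average over the product is correct.
    for a in itertools.product(range(1, 7), repeat=numA):
        for d in itertools.product(range(1, 7), repeat=numD):
            a_high = sorted(a, reverse=True)
            d_high = sorted(d, reverse=True)
            # Pair up the dice from highest down; zip stops at the
            # shorter list, so only min(numA, numD) comparisons happen.
            gain = 0
            for ai, di in zip(a_high, d_high):
                gain += 1 if ai > di else -1  # ties go to the defender
            total += gain
            count += 1
    return total / count
```

With one die each this reproduces the worked example: 15 attacker wins and 21 defender wins (including the 6 ties) out of 36, giving (15 - 21)/36 = -0.166... The enumeration is at most 6^3 * 6^2 = 7776 cases, so speed is not a concern here.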
Difficulty: 555. Post-solution available.
Copyright Nick Parlante 2017 - privacy