If #100# cats kill #100# mice in #100# days, then how long does it take #4# cats to kill #4# mice?

2 Answers
Aug 16, 2015

#100# days

Explanation:

Let #M# be the total number of mice killed, #c# the number of cats, #i# the number of days, and #e# the average number of mice one cat kills in one day.

We would expect the number of mice killed to be proportional to both the number of cats and the number of days, so we can write:

#M = i xx c xx e#

Substituting #M = 100#, #i = 100#, and #c = 100# into this equation, we find:

#100 = 100 xx 100 xx e#

Divide both sides by #100 xx 100# to get:

#e = 1/100#

Then substitute #M = 4#, #c = 4#, and #e = 1/100# to get:

#4 = i xx 4 xx 1/100 = i xx 1/25#

Multiply both sides by #25# to get: #i = 4 xx 25 = 100#
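As a quick sanity check, here is a short Python sketch of the same calculation (the names `rate` and `days` are just illustrative):

```python
# Rate model: mice_killed = days * cats * rate,
# where rate is the average number of mice one cat kills per day.

# Solve for the rate from the first scenario: 100 cats kill 100 mice in 100 days.
rate = 100 / (100 * 100)   # e = 1/100 mouse per cat per day

# Solve for the number of days in the second scenario: 4 cats, 4 mice.
days = 4 / (4 * rate)

print(rate)   # 0.01
print(days)   # 100.0
```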

Aug 19, 2015

#100# days (alternative method)

Explanation:

If, over the space of #100# days, #100# cats kill #100# mice, then on average each cat kills one mouse in that period.

So on average, in a #100#-day period, #1# cat will kill #1# mouse and #4# cats will kill #4# mice. In other words, it takes #4# cats #100# days to kill #4# mice.
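The same averaging argument can be checked in a couple of lines of Python (again with purely illustrative names):

```python
# Over the 100-day period, each cat kills on average:
mice_per_cat = 100 / 100          # 1 mouse per cat per 100 days

# So 4 cats kill 4 * 1 = 4 mice in that same 100-day period.
mice_killed_by_4_cats = 4 * mice_per_cat
print(mice_killed_by_4_cats)      # 4.0 -> 4 mice in 100 days
```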