This came up while playing Minecraft with a couple of mods. In the game there was a machine that offered to trade your stuff for a fixed chance at some cool loot. The same basic thing occurs in Perfect World, where you can trade some of your gathered resources for a chance to improve some equipment. And again in Tynon, where you can trade stars for a chance to upgrade a player character (although the mechanics are a bit more convoluted in Tynon).
If there is a 10% chance to succeed each time we burn a resource, then each try fails 90% of the time. Burning two resources gives a 90% * 90% = 81% chance that both fail. Three tries gives 90% * 90% * 90% = 73% that all three fail, and so on. So we can express the chance of not getting any success as P(TotalFail) = P(eachTryFail)^numTries.
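To see how that compounding plays out, here's a minimal Python sketch; the 10% success chance and the try counts are just the numbers from this example:

```python
# Chance that every try fails, given a 10% success chance per try.
p_fail_each = 0.90  # 1 - 0.10 success chance

for num_tries in (1, 2, 3, 5, 10, 20, 30):
    p_total_fail = p_fail_each ** num_tries
    print(f"{num_tries:2d} tries: P(TotalFail) = {p_total_fail:.3f}")
```

By 30 tries the chance of getting nothing at all has already dropped to about 4%.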
We know P(eachTryFail) is 0.90, so we've currently got P(TotalFail) = 0.9^numTries.
But I want to know how many times I should burn a resource in order to be reasonably certain of getting at least one prize. To do so, I have to define what Reasonably Certain means for this problem. It's a willingness to be wrong, a risk factor, and for this problem, where the penalty is just a little of my time, I'll go with the standard 95% certainty. If the situation were about public safety, or a production line, we'd pick a much higher value. But for this exercise, being wrong 1 time in 20 doesn't hurt too much.
With this decision in place, we now have a value for P(TotalFail): we've decided it will be 1/20, or 0.05.
P(TotalFail) = P(eachTryFail)^numTries
0.05 = 0.9^numTries
log(0.05) = numTries * log(0.9)
log(0.05) / log(0.9) = numTries
numTries ≈ 28.4, so rounding up, 29 tries brings the chance of total failure just under 5%.
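Here's a quick sanity check of that arithmetic in Python, a minimal sketch using the 0.90 and 0.05 values chosen above:

```python
import math

p_fail_each = 0.90   # 90% chance each burned resource fails
p_total_fail = 0.05  # willing to walk away empty-handed 1 time in 20

# Solve p_fail_each ** num_tries <= p_total_fail for num_tries.
exact = math.log(p_total_fail) / math.log(p_fail_each)
num_tries = math.ceil(exact)

print(f"exact value: {exact:.2f}")        # ~28.43
print(f"resources to burn: {num_tries}")  # 29
print(f"P(TotalFail) at {num_tries} tries: {p_fail_each ** num_tries:.4f}")  # ~0.047
```

Rounding up with ceil is what keeps the all-fail chance on the right side of the 5% threshold; 28 tries leaves it at about 5.2%.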