# When debuffs are removed, stat numbers are wrong (AC in this case)

I made an informative video here about the numbers associated with debuffs when they are “cleared” or removed.

What happens is that even though debuffs apply multiplicatively, when one is removed the associated stat (AC in this case) is restored by the flat amount that was lost initially. If an enemy has 100 AC and you apply a debuff that removes 75% (for 2 turns), they would have 25 left. If you then apply another debuff like -50% AC (for 2 turns), they would have 13 AC (rounded up). But should the 75% debuff wear off, the new AC should be (math: 13 divided by 0.25) 50 to 52. Instead it restores the full 75 points, making the new AC 87 or 88. What happened to the 50% AC debuff that is still being applied?
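The arithmetic above can be sketched in a few lines. This is a minimal model of the reported behavior, not the game's actual code: it assumes debuffs multiply the current value when applied but restore a flat amount when they expire.

```python
import math

def apply_debuff(current, pct):
    """Apply a -pct debuff multiplicatively; return (new value, flat points lost)."""
    lost = current * pct
    return math.ceil(current - lost), lost

ac, lost_75 = apply_debuff(100, 0.75)  # 100 -> 25, lost 75.0
ac, lost_50 = apply_debuff(ac, 0.50)   # 25 -> 13 (rounded up), lost 12.5

# Observed: expiry of the -75% debuff restores the flat 75 points
observed = ac + lost_75                # 13 + 75 = 88

# Expected: recompute from the base with only the -50% debuff active
expected = 100 * (1 - 0.50)            # 50
```

The gap between `observed` (88) and `expected` (50) is exactly the discrepancy described in the post.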

Anyway, I reported this bug (I hope this is a bug), because it makes applying smaller debuffs before bigger debuffs better than the other way around.

It sounds like it is working as it should to me.

If, and only IF, the math checks out: after the 75% debuff is removed, why is Slayer’s Prey (50%), the only debuff that is currently being applied, only applying a paltry 12%?

That said, I understand if this is how the game is intended to work with this kind of math, but my issue is that other active debuffs end up applying very small amounts after a big debuff has been removed.

It seems what’s happening is that your second debuff is locking in its effect at 50% of the remaining 25%, or 12.5%. It just never rechecks, and this is probably by design. I’m surprised the abilities stack in the first place instead of ignoring the smaller debuff entirely.
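That snapshot reading matches the numbers above. A minimal sketch, assuming each debuff stores a flat point value at cast time and gives exactly that back on expiry:

```python
base = 100
debuff_a = base * 0.75                # 75 points, snapshotted when cast
debuff_b = (base - debuff_a) * 0.50   # 12.5 points: 50% of the remaining 25

current = base - debuff_a - debuff_b  # 12.5
after_a_expires = current + debuff_a  # 87.5: only A's snapshotted points return

# Debuff B never rechecks, so it is now effectively -12.5% of base, not -50%
effective_b = debuff_b / base         # 0.125
```

That 12.5% is presumably the “paltry 12%” reported earlier in the thread.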

Exactly. The debuff/buff takes effect when cast and then maintains the value determined at casting until it expires. I’m not an expert on DND mechanics, but I suspect that comes out of the rulebook for the tabletop game.

It might be right according to DnD, but I agree that it seems wrong. If I have two 50% debuffs, taking a stat from 100 to 25, and one debuff disappears, I should be at 50 for that stat regardless of which debuff is removed.

Well, that’s a bit unfortunate. I “suppose” this was done for balancing purposes…? But I dunno; it’s a bit inconvenient to have to keep track not only of which buffs are still being applied but also which order to apply them in so I get the best out of them.

I disagree with this. If the first debuff takes off 50 AC out of 100 and the second takes 25 AC out of 50, leaving you with just 25 AC, then when the first debuff (the one that took off 50) wears off, you should get that 50 back, making you 75 again. If that particular spell took away 50, you should get 50 back, not just 25. Why should losing that first debuff make the second one more powerful than it was when initially cast?


Because that’s how multiplicative math works. If someone already has a debuff applied and another one is applied, it is applied multiplicatively. If it were additive, THEN and only then would it be overpowered. My reasoning here is that if there are two of the same debuffs happening and one wears off, the remaining ones should still apply their full amount; in other words, the game should take a new snapshot of the stats and deduct the appropriate amount.

The first debuff is -75% and the second debuff is -50%. After the 75% is removed you still have 50% to deduct, but of course that’s not how it works in the DnD world. And to address your last sentence: the second debuff was never more powerful; that is how the formula “normally” works in most cases.

Let me call you out on this one as well. Let’s say you can buff a character with +50% atk, and you apply another +50% atk buff, one turn after the other. Starting from 100, your atk goes from 100 to 150 to 225. Once the first buff runs out, your atk goes from 225 to 175…
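Under the same snapshot assumption as the debuff case, this buff example works out like so (a sketch, not the game's actual code):

```python
base = 100
buff_a = base * 0.50              # +50 points, snapshotted at cast
buff_b = (base + buff_a) * 0.50   # +75 points: 50% of the current 150

atk = base + buff_a + buff_b      # 225
after_a_expires = atk - buff_a    # 175: flat removal of A's 50 points

# A multiplicative recompute would instead leave only buff B's +50%:
recomputed = base * 1.50          # 150
```

So the flat-removal rule leaves you at 175 rather than the 150 that a pure multiplicative model would give.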

Why should losing that first (de)buff make the second one more powerful than it was when initially cast?

But of course someone out there will cite the “that’s how it works in the manual/dnd world” clause.

That is exactly the same thing I said, only with addition instead of subtraction. The first buff added 50 and the second added 75. When the first buff wears off, you lose the 50 but keep the 75: you lose what you gained from the first spell.

In my example, you lost 50 AC and then you lost 25. When the first debuff expires, you gain back the 50 which you lost from the first spell.

They are two spells which are two separate events.

The only effect they have on each other is that the second has a different starting point because of the first. Spell A takes away 50 points; Spell B takes away 25 points. If Spell A took away 50 points and goes away, why should you only get half of that back? Now, if the spell were designed to restart at the beginning of every turn and take away 50% of the AC available on that turn, then that would make sense. The fact that Spell B could have cost more AC at cast time, but didn’t because it had a lower starting point, should not mean that Spell B becomes more powerful once the other spell disappears.

That is how it’s supposed to work. It’s exactly how it’s supposed to work. It always starts from the base. The second debuff is not ‘half of the current value’; it’s an additional 50% applied after any other debuffs. If the first debuff goes away, it’s still 50% of the base value.

At any time you care about, you take the base and then apply all buffs and debuffs in order to obtain the current value. (Actually, the order doesn’t matter, but you do it one at a time.)
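That recompute-from-base rule can be sketched with a hypothetical helper (not the game's code): the current value is always the base times the product of every active modifier, so dropping an expired modifier and recomputing gives the answer this poster describes.

```python
from functools import reduce

def current_value(base, multipliers):
    """multipliers: one per active (de)buff, e.g. 0.25 for -75%, 1.5 for +50%."""
    return reduce(lambda value, m: value * m, multipliers, base)

ac = current_value(100, [0.25, 0.50])  # both debuffs active -> 12.5
# The -75% debuff expires: drop its multiplier and recompute
ac = current_value(100, [0.50])        # -> 50.0
```

Because multiplication commutes, the order of the list never matters, which matches the parenthetical above.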

If the game does it differently, then that’s Ludia’s choice. But I am pretty sure real DnD does it in this mathematically accurate way.

Sure… makes A LOT OF SENSE.