The way iPickMinChance is used is confusing
I'm talking about this function, specifically this part:
    if (t < pcSneak / iPickMinChance)
    {
        return (roll > int(pcSneak / iPickMinChance));
    }
    else
    {
        t = std::min(float(iPickMaxChance), t);
        return (roll > int(t));
    }
}
What I would expect is for iPickMinChance to be used like iPickMaxChance - as a clamp on the probability. Something like:
    t = std::max(float(iPickMinChance), t);
    t = std::min(float(iPickMaxChance), t);
    return (roll > int(t));
Instead we get this other branch of logic, which produces the exact inverse of what I'd expect: increasing iPickMinChance actually reduces your chances of successfully pickpocketing. If iPickMinChance is set to 100, which I'd expect to mean guaranteed success, then pcSneak / iPickMinChance is much smaller than with the default of 5, so the roll exceeds that threshold more often and the attempt fails more often. I know pickpocketing odds have been relitigated several times, and relitigating them is not my goal here, but I can find no explanation for this counterintuitive behaviour.
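To make the inversion concrete, here is a minimal standalone sketch of just this branch (not the engine's actual code), assuming roll is an integer in 0..99 and that, as read above, returning true means the attempt fails; pcSneak = 30 and t = 0 are arbitrary illustration values:

#include <algorithm>
#include <iostream>

// Standalone reproduction of the quoted branch, not the actual engine code.
// Assumes roll is in [0, 99] and that returning true means the attempt fails.
bool failsPickpocket(float t, float pcSneak, float iPickMinChance, float iPickMaxChance, int roll)
{
    if (t < pcSneak / iPickMinChance)
        return (roll > int(pcSneak / iPickMinChance));
    t = std::min(float(iPickMaxChance), t);
    return (roll > int(t));
}

int main()
{
    const float pcSneak = 30.f;        // arbitrary example skill value
    const float t = 0.f;               // forces the first branch for illustration
    const float iPickMaxChance = 75.f; // default GMST value

    for (float minChance : { 5.f, 100.f })
    {
        int failures = 0;
        for (int roll = 0; roll < 100; ++roll)
            failures += failsPickpocket(t, pcSneak, minChance, iPickMaxChance, roll);
        std::cout << "iPickMinChance = " << minChance
                  << ": fails on " << failures << " of 100 rolls\n";
    }
    // iPickMinChance = 5   -> threshold int(30/5)   = 6, fails on 93 of 100 rolls.
    // iPickMinChance = 100 -> threshold int(30/100) = 0, fails on 99 of 100 rolls.
}

Under these assumptions, raising iPickMinChance from 5 to 100 pushes failures from 93 to 99 out of 100 rolls, which is exactly the backwards behaviour described above.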