It depends on the brand quality and on how far apart the cells sit in their voltage range. You take on more risk pairing one cell that is near the end of its capacity with a fresh one than pairing, say, a 3.10 V cell with a 3.20 V cell.
I don't know if you will get the kind of certainty you are looking for, since these are questions a professional battery engineer would need to verify and resolve. From what I have read about using primary cells together in series (assuming the same brand), a difference of roughly 0.05 V is a sensible ballpark guideline.
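If it helps, that guideline is easy to express as a quick check. This is just a sketch of the rule of thumb above, not an engineering spec, and the function name and tolerance default are my own invention:

```python
# Hypothetical helper: flag a set of primary cells whose resting
# voltages are spread wider than a chosen tolerance (0.05 V here,
# matching the rough guideline above).
def cells_matched(voltages, tolerance=0.05):
    """Return True if all cell voltages fall within `tolerance` volts of each other."""
    return max(voltages) - min(voltages) <= tolerance

print(cells_matched([3.15, 3.18]))  # within 0.05 V, so True
print(cells_matched([3.10, 3.20]))  # 0.10 V spread, outside the guideline, so False
```

By that measure, the 3.10/3.20 V pairing I describe below falls outside the guideline, which is part of why I don't recommend it.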
To be honest, I have used a Surefire 123A at 3.10 V alongside another at 3.20 V, but I'm not recommending it... and I'm not a battery engineer.