I don't know what your definition of PWM is.
Basically, there are two ways to turn one voltage into another: one is called "linear", the other "switching". A linear circuit accomplishes this by turning the unwanted energy into heat. For example, if you want a 3V output from a 5V input at 1A, a linear circuit will draw 5W from the input, dissipate 2W as heat, and deliver 3V at 1A on the output. Of course this is quite an inefficient approach.
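To make the arithmetic explicit, the heat a linear regulator must dump is just the voltage drop times the load current, which for the numbers above works out to:

$$P_{\text{diss}} = (V_{\text{in}} - V_{\text{out}}) \cdot I = (5\,\text{V} - 3\,\text{V}) \cdot 1\,\text{A} = 2\,\text{W}, \qquad \eta = \frac{P_{\text{out}}}{P_{\text{in}}} = \frac{3\,\text{W}}{5\,\text{W}} = 60\%$$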
The other way is switching. To make it easy to understand, imagine that a switching circuit contains a "pool" that can store energy (in practice, an inductor and/or a capacitor). If the output voltage is lower than the input, the circuit first takes a packet of energy from the input, puts it into the pool, and then transfers it to the output. That's the general idea. PWM is only one control mode of a switching circuit; there are others, such as PFM, but they are not important to this topic.
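In the idealized case (a textbook buck/step-down converter, not any specific torch driver), the average output voltage is set purely by the duty cycle of the switch, so no power has to be burned as heat:

$$V_{\text{out}} \approx D \cdot V_{\text{in}}, \qquad D = \frac{t_{\text{on}}}{t_{\text{on}} + t_{\text{off}}}$$

For instance, $D = 0.6$ gives roughly 3V from a 5V input, the same conversion as the linear example but without the 2W loss.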
So every torch with MCU-controlled brightness adjustment does use a switching circuit (PWM or otherwise). If a torch uses a low-frequency PWM driver that drives the LED directly, the LED will visibly flicker. If it uses a high-frequency PWM driver that drives the LED directly, humans will not perceive any flicker, because our eyes are not sensitive to high-frequency switching. Alternatively, a low-frequency PWM driver can have a capacitor added between the LED's positive and negative pins to smooth out the flicker.
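As a concrete illustration of the high-frequency approach, here is a minimal sketch of setting up hardware PWM on an AVR MCU. The chip (ATmega328P), the 8 MHz clock, and the pin choice are my assumptions for the example, not details of any particular torch:

```c
#include <avr/io.h>

/*
 * Minimal sketch, assuming an ATmega328P at 8 MHz with the LED
 * (via a suitable driver transistor) on pin PD6 = Timer0's OC0A output.
 *
 * Fast PWM with no prescaler: frequency = 8 MHz / 256 = ~31.25 kHz,
 * far above the range where the human eye can perceive flicker.
 */
int main(void)
{
    DDRD |= (1 << PD6);                 /* set OC0A pin as output */

    /* Fast PWM mode, non-inverting output on OC0A */
    TCCR0A = (1 << COM0A1) | (1 << WGM01) | (1 << WGM00);
    TCCR0B = (1 << CS00);               /* clock/1 -> ~31.25 kHz PWM */

    OCR0A = 153;                        /* duty = 153/255 = ~60% brightness */

    for (;;) {
        /* brightness holds at the duty set above; write OCR0A to dim */
    }
}
```

At ~31 kHz no flicker is perceivable even without a smoothing capacitor; drop the PWM frequency into the tens of Hz and the same code would produce exactly the visible flicker described above.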
So... I think you may be talking about flicker, right?