Is there a formula to determine the maximum flash rate countable by a video camera? I'm thinking that any flash rate greater than the number of frames per second is not practical. I get hung up on the fact that the shutter is open for only a fraction of the time required to produce a frame: 30 fps is roughly 33.33 ms per frame, and a shutter speed of, say, 1/125 s is about 8 ms, or roughly 25% of the frame time. Does the shutter speed matter? I'm thinking that unless the two are synchronized, the shutter could open at any point in the lamp's flash cycle, ultimately making counting very difficult.
The application is just a general one: with today's higher-speed cameras (60 fps or 120 fps), can one reliably determine the flash rate of a lamp? Think of alarm panels, breathing monitors, heart-rate monitors, or the case of trying to determine a duty cycle by visual means.
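To make the numbers above concrete, here is the arithmetic from the question as a tiny sketch (the 30 fps and 1/125 s values are just the example figures given, not properties of any particular camera):

```python
# Frame time vs. exposure time for the example numbers in the question.
fps = 30
frame_time_ms = 1000 / fps           # ~33.33 ms per frame at 30 fps
shutter_s = 1 / 125                  # 1/125 s shutter = 8 ms exposure
exposure_fraction = (shutter_s * 1000) / frame_time_ms

print(f"{frame_time_ms:.2f} ms frame, {shutter_s * 1000:.0f} ms exposure "
      f"({exposure_fraction:.0%} of the frame time)")
```

So the sensor is actually gathering light for only about a quarter of each frame interval; the rest of the time it is blind, which is why the phase relationship between shutter and flash matters.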
What you describe is a sampling problem, and you can refer to the Nyquist–Shannon sampling theorem. Given a certain acquisition rate (number of FPS), you can be sure of your count (in every case, regardless of synchronization) if
"# of FPS" > 2 × flashing-light frequency (in Hz)
Of course this is a general theoretical rule, and in practice things can work quite differently (I am answering only regarding the number of FPS in the general case).
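As a rough illustration of the rule (a minimal sketch assuming an idealized lamp with a 50% duty cycle and an instantaneous shutter, i.e. ignoring exposure time entirely), you can simulate sampling the lamp at a given frame rate and count the on/off transitions you actually observe. Above twice the flash frequency the count is right; below it, aliasing makes the lamp appear to flash more slowly than it really does:

```python
def observed_flash_count(flash_hz, fps, duration_s=1):
    """Count the ON transitions a camera at `fps` would see from a lamp
    flashing at `flash_hz` with a 50% duty cycle, over `duration_s` seconds.
    Each frame is treated as an instantaneous point sample."""
    n = fps * duration_s
    # Lamp is ON during the first half of each flash period.
    states = [((k * flash_hz) / fps) % 1.0 < 0.5 for k in range(n + 1)]
    # A "flash" is seen whenever consecutive frames go OFF -> ON.
    return sum(1 for a, b in zip(states, states[1:]) if not a and b)

print(observed_flash_count(5, 30))  # well above Nyquist: counts 5 flashes/s
print(observed_flash_count(5, 8))   # below Nyquist: aliased to a lower count
```

The exact aliased count depends on the phase between shutter and lamp, which is the synchronization worry raised in the question; the simulation just fixes one arbitrary phase to show that under-sampling gives a wrong answer.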