The sight picture may be wrong, the scale of the target image relative to the gun may be wrong, the game will not take "forward allowance" on a moving target into account, you lose all binocular vision and therefore depth perception on a 2D screen, and the controller will probably be ergonomically incorrect in relation to your head position. And if the game relies on a visible laser beam for aiming, how does that translate into real shooting, unless you happen to be in a room with a smoke machine?
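To make "forward allowance" concrete, here is a minimal sketch of the idea using the simplest constant-velocity approximation; the speeds and ranges are illustrative assumptions, not figures from any particular game or gun.

```python
# A minimal sketch of "forward allowance" (lead) on a crossing target.
# All numbers below are illustrative assumptions.

def lead_distance(target_speed_mps: float,
                  target_range_m: float,
                  shot_speed_mps: float) -> float:
    """Distance to aim ahead of a crossing target, using the simple
    approximation: lead = target speed x time of flight."""
    time_of_flight = target_range_m / shot_speed_mps  # seconds the shot is in the air
    return target_speed_mps * time_of_flight          # metres ahead of the target

# Example: a clay crossing at ~20 m/s, 30 m away, with shot travelling ~400 m/s,
# needs roughly 1.5 m of forward allowance.
print(round(lead_distance(20.0, 30.0, 400.0), 2))  # -> 1.5
```

A flat 2D game with no notion of range or shot flight time simply has nothing to plug into that calculation, which is why the skill never transfers.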
Good luck with all that.
A game will teach you correct technical skills only if it is specifically designed to accurately simulate a correct sight picture at the muzzle, the gun controller has the same dimensions as the gun avatar in the scene, your head position is accounted for in relation to the gun, and everything in the scene is to scale and ergonomically correct. Which, not coincidentally, is exactly what we do.
We put the gun controller onto a real gun or a weighted replica, and we render that same gun in the game, so that what you see in the game matches what you feel on your shoulder or in your hand. What you see in the sight picture is the same as what you see in real shooting, with correct eye dominance, binocular vision and correct head position in relation to the gun. The sight picture, trajectories, ballistics and so on are empirically derived and true to scale. You play the game, then you go to the range.
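For illustration only, here is a sketch of the kind of point-mass trajectory model a simulation can use, with gravity and a simple velocity-squared drag term; the drag constant and muzzle velocity are assumed placeholders, not the empirically derived ballistics described above.

```python
# Illustrative point-mass trajectory: gravity plus simple v^2 air drag,
# stepped with Euler integration. K and the muzzle velocity are assumed
# placeholder values, not measured data.

G = 9.81    # gravity, m/s^2
K = 0.0005  # assumed drag constant, 1/m

def drop_at_range(muzzle_speed: float, target_range: float, dt: float = 0.001) -> float:
    """Vertical drop (m) of a level-fired projectile at the given range."""
    x, y = 0.0, 0.0
    vx, vy = muzzle_speed, 0.0
    while x < target_range:
        v = (vx * vx + vy * vy) ** 0.5
        vx -= K * v * vx * dt          # drag decelerates along the velocity vector
        vy -= (G + K * v * vy) * dt    # gravity plus the vertical drag component
        x += vx * dt
        y += vy * dt
    return -y

print(round(drop_at_range(800.0, 100.0), 3))  # drop at 100 m for an ~800 m/s round
```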
This can only be done effectively in 3D, i.e. in virtual reality, where both the user's head position in relation to the gun controller and the gun controller's own position are tracked and accurately represented stereoscopically in the VR scene. You know, like in CLAZER.
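As a rough sketch of why both tracked poses matter, the snippet below places the gun from the controller's tracked position and computes its offset from each eye of the tracked head; the small horizontal difference between the two views is the stereo disparity that restores depth perception. The interpupillary distance and the example poses are assumed values, and a real engine would use full 6-DoF poses (position plus orientation) rather than bare positions.

```python
# Minimal sketch: the gun is placed in the world from the tracked controller
# pose, and each eye views it from the tracked head pose. Positions are plain
# 3-vectors here; a real engine would use full 6-DoF poses. Values are assumed.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def __add__(self, o): return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
    def __sub__(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)

IPD = 0.064  # assumed interpupillary distance, ~64 mm average

def per_eye_gun_offsets(head_pos: Vec3, gun_pos: Vec3):
    """Gun position relative to each eye; the two views differ slightly,
    and that disparity is what gives depth perception in the headset."""
    left_eye = head_pos + Vec3(-IPD / 2, 0.0, 0.0)
    right_eye = head_pos + Vec3(+IPD / 2, 0.0, 0.0)
    return gun_pos - left_eye, gun_pos - right_eye

# Tracked head at eye height 1.60 m; gun controller ~40 cm forward, ~15 cm below eye line.
left, right = per_eye_gun_offsets(Vec3(0.0, 1.60, 0.0), Vec3(0.0, 1.45, -0.40))
print(left, right)  # the small horizontal difference is the stereo disparity
```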