The rotation I apply to objects from an Xbox stick seems to change depending on the FPS (or at least the performance) the game is running at. I found this by testing the game on a system that could barely handle it, then comparing the controller sensitivity in a mini game where the performance was double or even triple that.
So I started thinking about it, and maybe it's because everything is done Every Frame: if the system is running more frames, it reads and applies the input more times per second, so the same stick movement adds up to more rotation.
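To put rough numbers on that idea, here's a tiny illustration (the 0.1-degrees-per-frame figure is made up, not from my project):

```csharp
using UnityEngine;

// Hypothetical illustration only: the same per-frame step gives a different
// per-second rotation depending on how many frames actually run.
public class FrameRateExample : MonoBehaviour
{
    void Start()
    {
        float stepPerFrame = 0.1f;                              // degrees added every frame (made-up value)
        Debug.Log(stepPerFrame * 30f + " deg/sec at 30 FPS");   // 3 degrees per second on the slow machine
        Debug.Log(stepPerFrame * 90f + " deg/sec at 90 FPS");   // 9 degrees per second on the fast one
    }
}
```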
So how do I balance this? I could create a master sensitivity (multiplier) float that players could adjust to suit them during the game... but that would F'ing suck. To be practical I'd also need to add the ability to pause the game in all of its mini games etc., along with a menu to adjust the sensitivity from. Way too much work. And since each mini game and the main game run at different performance levels, it would need constant adjusting anyway.
I've attached an image showing roughly how I'm doing some first person aiming in a mini game. It only shows part of it, but the rest is more of the same.
I get the Input Axis (using InControl to map it), multiply it by either -1 or 1 (it was on 1.2 during the last build), add the left and right values into one float and the up and down values into another, so I have two combined floats, then feed those into a Set Rotation as the X and Y angles, done Every Frame.
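For anyone reading without the image, here's roughly the same logic written as plain C# (a sketch only: the names are made up, and I'm using Unity's built-in Input.GetAxis as a stand-in for the InControl mapping). It shows where the frame-rate dependence comes from:

```csharp
using UnityEngine;

// Rough C# equivalent of the setup described above (names are made up).
// Because the per-frame step has no Time.deltaTime in it, the rotation
// speed depends on how many times Update() runs per second.
public class StickAimFrameDependent : MonoBehaviour
{
    public float sensitivity = 1f;   // the -1/1 (or 1.2) multiplier from the post
    float yaw;                       // combined left/right float
    float pitch;                     // combined up/down float

    void Update()
    {
        // Stand-in for the InControl-mapped stick axes.
        float stickX = Input.GetAxis("Horizontal");
        float stickY = Input.GetAxis("Vertical");

        // Accumulate a fixed amount per FRAME -- this is the frame-rate-dependent part.
        yaw   += stickX * sensitivity;
        pitch += stickY * -sensitivity;

        // Equivalent of the Set Rotation action run Every Frame.
        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```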
Looking at it now as I'm writing, I see the settings: Per Second and Fixed Update. Could one of those be the answer?
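For comparison, this is what the frame-rate-independent version of the same sketch would look like if the fix is to scale by Time.deltaTime (which is what I understand the Per Second option does for you), so sensitivity becomes degrees per second instead of degrees per frame:

```csharp
using UnityEngine;

// Same made-up sketch as above, but scaled by Time.deltaTime so the
// rotation per second stays the same regardless of FPS.
public class StickAimFrameIndependent : MonoBehaviour
{
    public float sensitivity = 90f;  // degrees per second at full stick deflection (made-up value)
    float yaw;
    float pitch;

    void Update()
    {
        float stickX = Input.GetAxis("Horizontal");
        float stickY = Input.GetAxis("Vertical");

        // Time.deltaTime is the length of this frame in seconds, so more
        // frames just means smaller steps that add up to the same total.
        yaw   += stickX * sensitivity * Time.deltaTime;
        pitch += stickY * -sensitivity * Time.deltaTime;

        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```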