My game is based heavily around detecting whether a player has made a good landing. I'm defining a good landing as one where the player doesn't slow down too much when hitting the ground. However, in my game you can sometimes make a poor landing and still keep your speed up, so I'm also considering comparing a "before landing" angle with a "post landing" angle. The latter, however, doesn't seem to be very accurate.
Here's how I'm doing it:
Get the player's velocity Vector3 just before landing, then compare it with the post-landing velocity sampled 0.01 seconds later. I pass both to Vector3.Angle and call it a good landing if the result is under 35 degrees, or a bad landing if it's over. I.e., if you face-plant into the ground, your velocity goes from pointing straight down to pointing back up, roughly a 90-degree change: a bad landing.
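Roughly, the check looks like this (sketched in Python rather than my actual Unity code; `angle_between` mirrors what I understand Vector3.Angle to compute via the dot product):

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors (what Vector3.Angle returns)."""
    dot = sum(a * b for a, b in zip(v1, v2))
    mag1 = math.sqrt(sum(a * a for a in v1))
    mag2 = math.sqrt(sum(a * a for a in v2))
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_theta = max(-1.0, min(1.0, dot / (mag1 * mag2)))
    return math.degrees(math.acos(cos_theta))

GOOD_LANDING_MAX_ANGLE = 35.0  # my current threshold

def is_good_landing(pre_velocity, post_velocity):
    """Good landing = velocity direction changed by less than the threshold."""
    return angle_between(pre_velocity, post_velocity) < GOOD_LANDING_MAX_ANGLE
```

So a landing that goes from falling forward-and-down `(0, -5, 10)` to sliding forward `(0, 0, 10)` is about 27 degrees and counts as good, while a bounce that reverses the vertical component fails the check.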
Like I said, this doesn't seem to be very accurate: landings that I think are fine produce larger angles than I expected. I've tried changing the wait time between sampling the pre- and post-landing velocities, but it doesn't seem to help much.
Any other thoughts on how I might do this? I can't have players feeling ripped off when they think they've landed well but my game says they haven't!
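For context, the simpler speed-based check from my original definition ("doesn't slow down too much") would look something like this; the 0.6 retention fraction is just a placeholder I'm experimenting with, not a tuned value:

```python
import math

def speed(v):
    """Magnitude of a 3D velocity vector."""
    return math.sqrt(sum(c * c for c in v))

# Placeholder tuning value: fraction of pre-landing speed the player
# must keep for the landing to count as good.
RETAINED_FRACTION = 0.6

def kept_speed_up(pre_velocity, post_velocity):
    """Good landing per the 'doesn't slow down too much' definition."""
    return speed(post_velocity) >= RETAINED_FRACTION * speed(pre_velocity)
```

The problem is exactly the one I described: a face-plant that happens to keep speed up still passes this check, which is why I wanted the angle comparison on top of it.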
I've also tried using a raycast, but I was getting collisions where I didn't expect them. I debugged and found that setting the ray length to 1 made it far longer than it should be. Watching in the Scene view, I saw a giant debug ray, supposedly 1 unit long, next to an object with a radius of 0.5, yet the ray seemed far longer than that. I've set the ray length to 0.15 now, which looks like the equivalent of 1.5 or so. Any idea why that might be?
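One thing I still need to rule out (this is my assumption, not something I've confirmed): Physics.Raycast treats its direction as a pure direction and uses maxDistance for length, while Debug.DrawRay draws exactly the raw vector you hand it, so if I'm drawing `direction * length` with an unnormalized direction, the drawn ray would be inflated by the direction's magnitude. A tiny sketch of that scaling:

```python
import math

def drawn_length(direction, length):
    """Length of the line you see if you draw `direction * length` directly
    (Debug.DrawRay-style) without normalizing `direction` first."""
    magnitude = math.sqrt(sum(c * c for c in direction))
    return magnitude * length

# A unit direction draws at the expected length:
#   drawn_length((0, -1, 0), 1.0) -> 1.0
# An unnormalized direction inflates it:
#   drawn_length((0, -1.5, 0), 1.0) -> 1.5
```

If that's the cause, the actual cast and the debug visualization would disagree about length, which would match what I'm seeing in the Scene view.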