Beginner to Unity here with a very noob question!
I’m writing some C# code for my Unity game to move an object to the left. An online tutorial told me to multiply my position change by Time.deltaTime so that computers running at different frame rates still move the object at the same rate, based on time rather than on frame rate. Here is the code:
void Update()
{
    transform.position = transform.position + (Vector3.left * moveSpeed) * Time.deltaTime;
}
What I am confused about is how multiplying by Time.deltaTime guarantees that we are no longer affected by framerate. Since we are not dividing by, or otherwise cancelling out, any framerate term, wouldn’t multiplying by Time.deltaTime just compound the issue?
That’s what the online tutorial instructed me to do, but I would like to understand why it works.
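Here is a little sanity-check sketch I wrote outside Unity to try to convince myself (the moveSpeed of 5 and the 60/30 FPS framerates are made-up values, and the 1/60 and 1/30 frame times stand in for what Time.deltaTime would report). Both loops simulate one second of Update() calls:

// Hypothetical console sanity check (not Unity code): simulate one second
// of frames at 60 FPS and at 30 FPS, adding up the distance moved per frame.
float moveSpeed = 5f; // made-up value: 5 units per second

float distanceAt60 = 0f;
for (int frame = 0; frame < 60; frame++)
    distanceAt60 += moveSpeed * (1f / 60f);   // deltaTime would be ~1/60 s

float distanceAt30 = 0f;
for (int frame = 0; frame < 30; frame++)
    distanceAt30 += moveSpeed * (1f / 30f);   // deltaTime would be ~1/30 s

System.Console.WriteLine(distanceAt60);   // prints roughly 5
System.Console.WriteLine(distanceAt30);   // prints roughly 5

Both totals come out to roughly 5 units after one simulated second, so the numbers do seem to agree regardless of framerate, but I still don’t see conceptually why multiplying by the frame time (rather than dividing by something) is what cancels the framerate out.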