I’m looking for help with the following problem. I have a line segment defined by two points, P1 and P2, and a ray that starts at P3 and intersects segment P1P2 at P4. I want to calculate a point P5 on that ray that is always at a given perpendicular distance l from P1P2, on the same side as P3.
This is my current approach: when the ray from P3 intersects P1P2 (I only use that test as a conditional, so P4 itself never gets used), I compute P1′P2′, where P1′ and P2′ are P1 and P2 translated by l along the perpendicular vector that points toward P3, and then I intersect the ray with P1′P2′ to get P5 (sketched below).
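To make that concrete, here’s roughly what I’m doing now in C#. The names are simplified: `FindP5`, `RaySegmentIntersection`, and the `rayDir` parameter are just placeholders for however I actually store things, and `System.Numerics.Vector2` stands in for my own vector type.

```csharp
using System;
using System.Numerics;

static class PerpendicularOffset
{
    // Simplified sketch of my current approach: shift P1P2 toward P3 by
    // distance l, then intersect the ray from P3 a second time against
    // the shifted segment. Returns null if the ray misses it.
    public static Vector2? FindP5(Vector2 p1, Vector2 p2, Vector2 p3, Vector2 rayDir, float l)
    {
        Vector2 edge = p2 - p1;

        // One of the two perpendiculars to the segment: (-y, x).
        Vector2 normal = Vector2.Normalize(new Vector2(-edge.Y, edge.X));

        // Orientation check so the normal points toward P3.
        if (Vector2.Dot(normal, p3 - p1) < 0f)
            normal = -normal;

        // Translate the segment by l along that normal.
        Vector2 p1Shifted = p1 + normal * l;
        Vector2 p2Shifted = p2 + normal * l;

        // Second ray/segment intersection, this time against P1'P2'.
        return RaySegmentIntersection(p3, rayDir, p1Shifted, p2Shifted);
    }

    // Standard ray/segment intersection via 2D cross products.
    static Vector2? RaySegmentIntersection(Vector2 origin, Vector2 dir, Vector2 a, Vector2 b)
    {
        Vector2 ab = b - a;
        float denom = Cross(dir, ab);
        if (Math.Abs(denom) < 1e-6f)
            return null; // parallel, no usable intersection

        float t = Cross(a - origin, ab) / denom;   // distance along the ray
        float u = Cross(a - origin, dir) / denom;  // position along the segment, 0..1

        if (t < 0f || u < 0f || u > 1f)
            return null; // behind the ray origin or outside the segment

        return origin + dir * t;
    }

    static float Cross(Vector2 v, Vector2 w) => v.X * w.Y - v.Y * w.X;
}
```

For example, with P1 = (0, 0), P2 = (10, 0), P3 = (5, 5), a ray pointing straight down, and l = 2, this returns P5 = (5, 2).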
This seems like a wonky way to do it, though. First I have to find the orientation of P1, P2, and P3 and use that to pick the correct perpendicular vector, and then I have to run my line intersection all over again against P1′P2′ to find P5. All of that just so P5 is always a specific distance l away from the segment P1P2.
It seems like there should be a more straightforward way to solve this with less work for the computer, but I don’t have a strong enough math background to figure it out. Since I’m working with vectors, it’s also hard for me to reason about this outside the tools I already have (line intersection was also high school math).
The use case is collision detection. I’ve drawn P1P2 as horizontal, but it won’t necessarily always be. I’d be very surprised if anyone tells me my current solution is the right one, and I’m hoping someone can point me toward a better way.
PS: my knowledge of math notation is also not strong. If anyone asks to see my work, I’ll have to show it in C#.