Freddie The Potato
Guest
Hello!
While writing a script to resolve 3D collisions, I encountered a problem and I can't quite figure it out!
I have a rectangular prism with varying width, height and depth. I also have a point located within this prism (A) and a normalized directional vector (d).
If a ray is cast from A and travels along the directional vector d, how do I calculate the point (B) where this ray intersects the outer bounds of the prism?
Any tips would be greatly appreciated
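For context, here is a minimal sketch of the kind of computation I'm after, using the standard slab approach and assuming the prism is axis-aligned (the names `box_min` and `box_max` for the two opposite corners are just placeholders I made up):

```python
import math

def ray_exit_point(a, d, box_min, box_max):
    """Return the point B where a ray starting at point `a` (inside an
    axis-aligned box) and travelling along normalized direction `d`
    exits the box.

    For each axis, compute the parameter t at which the ray reaches the
    face it is moving toward; the smallest such t is the exit distance.
    """
    t_exit = math.inf
    for i in range(3):
        if d[i] > 0:
            t = (box_max[i] - a[i]) / d[i]   # heading toward the max face
        elif d[i] < 0:
            t = (box_min[i] - a[i]) / d[i]   # heading toward the min face
        else:
            continue  # parallel to this pair of faces; they never limit t
        t_exit = min(t_exit, t)
    # B = A + t * d
    return tuple(a[i] + t_exit * d[i] for i in range(3))
```

For example, with a box from (-1, -1, -1) to (1, 1, 1), a ray from the origin along (1, 0, 0) should exit at (1, 0, 0).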