### Staring into the [render] distance

I've had my Oculus Quest for a few months now, and I've been thinking about how to build the best possible virtual sunset experience.

One of the ideas I've been exploring is prerendering. In a sunset experience I don't expect the user to move far outside their drawn play area, so there is a maximum distance beyond which they can no longer discern parallax. Using the play area I can draw out, a 2 m × 2 m square, we can calculate how far away in virtual space an object needs to be before its motion from parallax becomes sub-pixel.

```
A      B
|\     |
| \    |
|  \   |
|   \  |
|    \ |
|     \|
C      D
```

Using the above diagram, we can work out the distance at which an object's apparent motion falls below one pixel.

The Oculus Quest has a pixel density of about 20 pixels per degree, so each pixel covers 0.05 degrees of the field of view. From that we need to work out how far away an object has to be before the change in viewing angle caused by the user's movement is less than one pixel. That is, if the user moves from C to D, how long does DB have to be so that the angle ADB < 0.05 degrees?

Since ABD is a right triangle, tan(ADB) = AB/DB, so ADB = arctan(AB/DB).

arctan(AB/DB) < 0.05°

AB/DB < tan(0.05°)

Let AB = 2 m (the size of the play space):

DB = 2/tan(0.05°) ≈ 2292 m

That means that for a 2 m play space, anything further than about 2.3 km away will be indistinguishable from a fixed background.
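As a quick sanity check, the derivation above can be sketched in a few lines of Python (the function name and its parameters are mine, not from the original post):

```python
import math

def parallax_limit(baseline_m: float, min_angle_deg: float) -> float:
    """Distance beyond which moving across the full baseline shifts an
    object's apparent position by less than min_angle_deg."""
    return baseline_m / math.tan(math.radians(min_angle_deg))

# Full 2 m play space, one-pixel threshold (0.05 degrees on the Quest)
print(round(parallax_limit(2.0, 0.05)))  # → 2292
```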

That is quite a long way away, and apart from the sky and the horizon there isn't much that would meet this criterion. We can, however, weaken it.

If someone only wants to watch the sunset sitting in a chair, rather than walk around the full 2 m, it would be reasonable to confine them to within half a metre. It is also arguable that we don't need to limit ourselves to individual pixels: if the parallax effect is less than 10 pixels, we could ignore it.

This leads us to a much closer distance.

0.5/tan(0.5°) ≈ 57 m

(10 pixels at 0.05 degrees per pixel gives the 0.5-degree threshold.)

That could reduce the amount of visual rendering that needs to be done in real-time substantially.
