Parallax Test 4
April 27, 2018
It is necessary to fix that “z-fighting noise”. As a solution, a depth texture used as a z-buffer should solve the z-fighting:
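A sketch of the attempted depth-test pass follows. The resource names, depth encoding, and thread-group size are assumptions, not the exact code used; the idea is that each destination pixel keeps only the candidate nearest to the camera via an atomic max on a texture:

```hlsl
// Attempted z-buffer pass: keep only the nearest candidate per destination pixel.
// This is the texture-based variant that, as noted below, had no visible effect.
RWTexture2D<uint> ZBuffer;      // depth encoded as uint, per destination pixel
RWTexture2D<float4> Result;     // output albedo
Texture2D<float4> Albedo;
Texture2D<float> Depth;

float ParallaxAmount;
float RelativePosition;

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    float h = Depth[id.xy];
    // Destination pixel after parallax displacement along x
    int2 dst = int2(id.x + (int)(h * ParallaxAmount * RelativePosition), id.y);

    uint d = (uint)(h * 65535.0);
    // Intended: only the closest candidate (largest encoded depth) survives.
    InterlockedMax(ZBuffer[dst], d);
    if (ZBuffer[dst] == d)
        Result[dst] = Albedo[id.xy];
}
```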
When pressing play with this shader, nothing had changed. After several tests, the only remaining explanation is that the InterlockedMax function does not work with textures.
Changing the RWTexture2D<float4> to a RWStructuredBuffer<int>, the results improved a lot, but some annoying noise was still there. After tweaking the depth map and albedo import options, the noise became more regular and less frequent.
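A sketch of the structured-buffer variant that improved the results (the buffer names, depth encoding, and flattening scheme are assumptions; the real code may differ):

```hlsl
// Z-buffer kept in a RWStructuredBuffer<int>, where InterlockedMax does work.
RWStructuredBuffer<int> ZBuffer;   // one int per destination pixel, depth-encoded
RWTexture2D<float4> Result;
Texture2D<float4> Albedo;
Texture2D<float> Depth;

int Width;                         // texture width, to flatten 2D coords into the buffer
float ParallaxAmount;
float RelativePosition;

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    float h = Depth[id.xy];
    int2 dst = int2(id.x + (int)(h * ParallaxAmount * RelativePosition), id.y);
    int idx = dst.y * Width + dst.x;

    int d = (int)(h * 65535.0);
    int previous;
    InterlockedMax(ZBuffer[idx], d, previous);
    // Write only if this candidate is the nearest seen so far. Candidates with
    // equal encoded depth can still race, which would explain the remaining
    // occasional noise.
    if (d >= previous)
        Result[dst] = Albedo[id.xy];
}
```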
Doing research
April 25, 2018
Looking at other works for inspiration, some of them were made using a high-poly sphere and displacing its vertices to simulate parallax. That has many restrictions, as well as a performance problem when “high-poly” means too many vertices to handle.
Vertex displacement project from Joan: https://github.com/IraltaVR/6DoF
@soylentgraham from Twitter gave some tips and provided a link to his “PopDepthMap360” project using Kinect.
After some time searching for solutions, we concluded that using textures as a point cloud would be the best way to work, and we decided to try compute shaders.
Parallax Test 3
April 20, 2018
The last test makes 40 texture fetches to select the best texel to occupy the current fragment. This also means that pixels can be moved by at most 40 pixels; if more movement is required, more fetches have to be done.
The main reason for fetching so many texels is that a fragment shader cannot write to a position of the texture other than its own. To remove this restriction, the usual shader is replaced by a compute shader.
Here is the first compute shader developed:
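The post originally embedded the shader here; a minimal reconstruction of the idea follows (resource names and thread-group size are assumptions). Each thread takes one source pixel, computes its displaced destination from the depth, and writes the albedo there:

```hlsl
RWTexture2D<float4> Result;
Texture2D<float4> Albedo;
Texture2D<float> Depth;

float ParallaxAmount;
float RelativePosition;

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    float h = Depth[id.xy];
    // Displace the pixel along x proportionally to its depth.
    int2 dst = int2(id.x + (int)(h * ParallaxAmount * RelativePosition), id.y);
    // Several threads may compute the same destination; whichever writes last
    // wins, which produces the z-fighting-like artifacts described below.
    Result[dst] = Albedo[id.xy];
}
```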
The first problem that appeared at this point is about concurrency. When a thread computes the new UV from the height, many pixels can end up computing the same UV. When that happens, something similar to z-fighting is rendered.
It can be seen in the video below (it is more noticeable near the windows).
Parallax Test 2
April 17, 2018
To solve the errors of parallax test 1, the current pixel should look at its neighborhood and calculate which pixel is the best one to occupy its position.
That idea requires lots of texture fetches. When the observer is displaced, every pixel moves in the same direction with a different displacement amount. Taking that into account, the number of fetches can be optimized by looking only at the pixels along the displacement line.
With this algorithm the main problem is that many pixels won’t have a good fetch to fill them. For the moment, these pixels will be pink.
The new fragment code is a bit more complicated:
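The original post showed the shader here; below is a sketch of the idea under assumed names (`_MainTex`, `_DepthTex`, and the two parameters from test 1 are guesses at the real properties). It walks up to 40 texels along the displacement line, looking for the texel that, displaced by its own depth, lands on the current fragment; unfilled pixels fall back to pink:

```hlsl
#define MAX_STEPS 40

sampler2D _MainTex;        // albedo
sampler2D _DepthTex;       // depth map
float _ParallaxAmount;
float _RelativePosition;
float4 _MainTex_TexelSize; // x = 1/width (filled in by Unity)

struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

fixed4 frag(v2f i) : SV_Target
{
    float dir = sign(_RelativePosition);
    for (int s = 0; s < MAX_STEPS; s++)
    {
        float offset = s * _MainTex_TexelSize.x * dir;
        float2 uv = i.uv - float2(offset, 0);
        float h = tex2D(_DepthTex, uv).r;
        float displaced = h * _ParallaxAmount * _RelativePosition;
        // Does this candidate, moved by its own parallax, land (within one
        // texel) on the current fragment?
        if (abs(displaced - offset) < _MainTex_TexelSize.x)
            return tex2D(_MainTex, uv);
    }
    // No candidate found: paint the hole pink, as described above.
    return fixed4(1, 0, 1, 1);
}
```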
This approach is considerably better than the last one; here are the results.
Parallax Test 1
April 12, 2018
To make parallax in a flat image, pixels need to “move” over the rendering surface. The first idea to make this possible is to use a shader.
Two additional parameters are provided:
- Parallax Amount: how much a pixel should move, taking into account its depth.
- Relative Position: simulated displacement of the camera on the x axis (at view space).
These parameters are chosen by hand to get faster results and spot the problems; in the future they will be calculated in real time.
The vertex shader is not interesting for what concerns us. The fragment shader is this:
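A sketch of that first fragment shader (property names are assumptions): it shifts the UV by the depth sampled at the current fragment and fetches the albedo there.

```hlsl
sampler2D _MainTex;   // albedo
sampler2D _DepthTex;  // depth map
float _ParallaxAmount;
float _RelativePosition;

struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

fixed4 frag(v2f i) : SV_Target
{
    // Use the depth of the *current* pixel to decide how far to shift the UV.
    float h = tex2D(_DepthTex, i.uv).r;
    float2 uv = i.uv + float2(h * _ParallaxAmount * _RelativePosition, 0);
    return tex2D(_MainTex, uv);
}
```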
Obviously this is a bad approach: the pixel is chosen taking into account the height of the current pixel, not the height of the pixel being fetched. This is the visual result:
Objective of the project
April 9, 2018
The main objective is to reach 6 DoF in VR for 360 images, using only a few images to get it working. Parallax will be the main technique.
What is Parallax
If an observer takes a picture and then moves the camera (without rotating it) to the right and takes another picture, some objects, or parts of them, that appear in the first picture will be occluded by other objects in the second. Comparing both pictures and taking the first as reference, the nearby elements appear more displaced in the second picture than the distant ones. That is because the human eye and cameras see through a perspective projection.
That phenomenon is called parallax.
Parallax example from imgur
Source images to work with
The source material will be two images: the albedo and the depth map of the scene.
In order to get clean tests, the images will be generated by a renderer (giving a perfect depth image). At a later stage, when the parallax is working, they will be replaced with real images to discover the problems our method has with them.
The engine
To get results faster, a commercial engine will be used. The chosen one is Unity.