4 Replies Latest reply on Dec 6, 2018 10:25 AM by bryanrtboy

    DepthStream To Texture


      I need an RGB or RGBA texture of the depth map generated by the D415 camera in Unity. Visually, I can see what I want in the Unity editor, but those images are generated by shaders and are not the raw data stream. I have read various answers and posts about getting data from the depth camera, which seems to be in Z16 format. The program I'm working on uses Texture2D images to drive a particle stream, and when I directly convert the texture from the examples to a Texture2D, the image is a dark red on black.


      I know this relates to the way depth is stored in the Z16 format, but I have been unable to find a straightforward way of getting a greyscale image from the stream as a texture. Any advice on how to approach this?
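      For anyone hitting the same wall: the core of the problem is just a range remap. Z16 stores one unsigned 16-bit distance per pixel (millimeters on RealSense cameras), and a grayscale texture wants 0..255. Here is a minimal sketch of that mapping in Python/NumPy for illustration; the tiny frame, the `near_mm`/`far_mm` range, and the variable names are all made up, not from the SDK.

```python
import numpy as np

# Hypothetical Z16 frame: a uint16 array of distances in millimeters,
# one value per pixel. This 2x3 frame is fabricated for illustration.
depth_z16 = np.array([[500, 1500, 3000],
                      [0,   4000, 6000]], dtype=np.uint16)

# Map a useful working range (here 0.5 m to 4 m) linearly onto 0..255
# so the result can be uploaded as an 8-bit grayscale (or RGB) texture.
near_mm, far_mm = 500, 4000
clipped = np.clip(depth_z16, near_mm, far_mm).astype(np.float32)
gray = ((clipped - near_mm) / (far_mm - near_mm) * 255).astype(np.uint8)

print(gray)
```

The same per-pixel remap could be done in a Unity compute shader or in C# before handing the texture to the particle system.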


      My ultimate goal is to have particles emitted in the shape of whatever is X distance from the camera. The BackgroundSegmentation sample is exactly what I ultimately would like to do. But that sample works by clipping the texture inside the shader at display time, and I have not been able to dissect the shader in order to clip the texture before sending it to my particle system as a texture input.

        • 1. Re: DepthStream To Texture

          The discussion linked to below talks about exporting a PNG image of the stream in the Unity SDK 2.0 wrapper by using a Unity instruction called EncodeToPNG.


          Re: Why can't I find the PXCMImage.ImageData.ToBitmap function in Unity platform (The Intel RealSense SDK 2.0)?


          Edit: I see that you want the raw stream and not the shader-made image, so this method may not be suitable for your goal.


          The non-Unity SDK examples use OpenGL graphics to render streams such as point clouds in an OpenGL window.  So I wonder if also using Unity's implementation of OpenGL might be the most suitable way to adapt non-Unity sample code into a Unity program that works in a similar way, giving a true stream image.


          Unity - Manual: OpenGL Core

          • 2. Re: DepthStream To Texture

            I don't need to save the texture; I am able to send the depth stream directly to my particle system using the RsStreamTextureRenderer texture binding (really nice how that works!). I wrote a simple function to do that:


               public void SendToFXThing(Texture tex)
               {
                   thisVFX.SetTexture("CamTexture", (Texture2D)tex);
               }
            The problem is that the texture being sent is dark red/black and not the right range of values for my use. Here is a screenshot of the Depth example: the top has a material on the RawImage, and the bottom is the RawImage with no material. The RawImage with no material is what is being sent.


            [Screenshot: Screen Shot 2018-12-06 at 8.17.12 AM.png — the Depth example, RawImage with a material (top) and without (bottom)]


            I would like to send the colored image. I know one workaround would be to set up a new Camera with a RenderTexture in Unity, point it at the colored image, and then send the RenderTexture. But that is several layers of conversion, and I was hoping to find a simpler way to extract the depth data into an image format.  I'm coming from a VFX/3D background where I am used to using depth channels that look like this and are in RGB format:




            Conceptually, I know that the shader is extracting the depth from the Z16 stream, probably as a 16-bit integer, and using that to clip/change the colors. I was hoping that the dropdown options in the RsStreamTextureRenderer would simply allow me to choose RGBA8 and I would have a depth image like the one above, but it does not seem to work that way. Is Z16 the only option for a depth stream?
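            A quick illustration of why the unconverted stream looks dark red rather than like a VFX depth pass: if the 16-bit depth value is fed straight into a texture's red channel, typical indoor distances occupy only a tiny fraction of the 0..65535 range. The numbers below are made up for the sketch.

```python
import numpy as np

# Hypothetical pixel: an object 1.5 m from the camera, stored as
# millimeters in a Z16 sample.
depth_mm = np.uint16(1500)

# Interpreted directly as a normalized 16-bit red channel, the value
# lands very near zero, i.e. nearly black with a faint red tint.
red = float(depth_mm) / 65535.0
print(round(red, 3))
```

That is consistent with the dark red/black RawImage: the data is there, it just hasn't been remapped into a visible range.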


            Edit: I just saw your edit and I will take a look at doing this through Unity's OpenGL. I struggle with OpenGL code, so was hoping that there was a function already in place to produce a greyscale depth image like above!

            • 3. Re: DepthStream To Texture

              Z16 is the only depth option, yes.


              The dark particle shade reminds me of Unity particles that do not have light illumination shining on them (I am primarily a Unity developer myself and work with particles).  Would it be possible to place a light source in front of the surface that the particle display is on? Or set the particles to have an Emission quality so that they glow even when the scene is not lit?

              • 4. Re: DepthStream To Texture

                The dark image is the raw texture stream. If someone else is trying to do this, here's a screenshot of what I'm trying to do in Unity. Basically, have particles emitted by things that are close to the camera. It's working: you can see the RawImage stream above, and below is a VFX particle system emitting based on the values in the stream (upside down).


                [Screenshot: Screen Shot 2018-12-06 at 10.07.48 AM.png — RawImage stream (top) and VFX particle system emitting from it (bottom)]


                The texture is piped into the VFX system, and then I read out the red channel to initialize (emit) particles. It works in realtime: as I move, particles are emitted only where the image is red. This is using the newest VFX system in 2018.3 beta.   I tried with 2017 and the older Shuriken system, and performance is much better in the 2018 beta. Also, so far I've found the D415 performs much more stably in the beta, so that's cool!


                I guess what I'm not happy with is the low contrast/narrow range of the raw image. I would rather have a greyscale depth image to sample from, allowing me to fine-tune at what distance particles should or should not be emitted.
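                For the "emit only within X distance" part, one option is to skip the grayscale image entirely and build a binary emission mask from the depth values. A rough NumPy sketch of the idea (the frame, cutoff, and names are invented for illustration):

```python
import numpy as np

# Hypothetical Z16 frame in millimeters; zero means "no depth data"
# on RealSense cameras, so it must be excluded explicitly.
depth_z16 = np.array([[800, 1200, 2500],
                      [0,   900,  3000]], dtype=np.uint16)

# White (255) wherever the scene is closer than the cutoff, black
# elsewhere. Feeding this as the red channel would make particles
# spawn only on nearby objects.
cutoff_mm = 1000
mask = ((depth_z16 > 0) & (depth_z16 < cutoff_mm)).astype(np.uint8) * 255

print(mask)
```

The cutoff then becomes a single tunable number rather than a contrast problem in the sampled image.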


                Looking through the non-Unity samples, the OpenCV rs-grabcuts sample has a function that is similar to what I would like to do:


                // The following operation takes a grayscale image,
                // performs a threshold on it, closes small holes and erodes the white area
                auto create_mask_from_depth = [&](Mat& depth, int thresh, ThresholdTypes type)
                {
                    threshold(depth, depth, thresh, 255, type);
                    dilate(depth, depth, erode_less);
                    erode(depth, depth, erode_more);
                };


                But, I'm not able to figure out how to implement this just yet.
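                For readers who find the C++/OpenCV version hard to follow, here is a rough re-creation of the same threshold-then-dilate-then-erode steps in plain NumPy. The 3x3 structuring element, helper names, and test frame are all illustrative, not from the sample.

```python
import numpy as np

def dilate3(m):
    # 3x3 binary dilation: a pixel becomes 1 if any neighbor is 1.
    p = np.pad(m, 1)
    out = np.zeros_like(m)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + m.shape[0], dx:dx + m.shape[1]]
    return out

def erode3(m):
    # 3x3 binary erosion: a pixel stays 1 only if all neighbors are 1.
    p = np.pad(m, 1, constant_values=1)
    out = np.ones_like(m)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= p[dy:dy + m.shape[0], dx:dx + m.shape[1]]
    return out

def create_mask_from_depth(depth, thresh):
    mask = (depth < thresh).astype(np.uint8)  # threshold: 1 where close
    mask = dilate3(mask)                      # grow to close small holes
    mask = erode3(mask)                       # shrink the white area back
    return mask * 255

# Illustrative frame: one close pixel in an otherwise distant scene.
depth = np.full((5, 5), 3000, dtype=np.uint16)
depth[2, 2] = 500
print(create_mask_from_depth(depth, 1000))
```

Dilate followed by erode is a morphological closing, which is what the sample's comment means by "closes small holes".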