Horde3D
http://horde3d.org/forums/

using alpha channel in shader
http://horde3d.org/forums/viewtopic.php?f=2&t=652
Page 2 of 2

Author:  marciano [ 22.03.2011, 00:42 ]
Post subject:  Re: using alpha channel in shader

Here is a quick and dirty sample patch that shows how you can get a basic form of soft particles working in the HDR pipeline.
The scale constant could easily be exposed as a uniform so that it can be set from a material. The flag _F01_SoftParticles needs to be set in the particle system material to activate the soft particles.

Code:
Index: Horde3D/Binaries/Content/pipelines/hdr.pipeline.xml
===================================================================
--- Horde3D/Binaries/Content/pipelines/hdr.pipeline.xml   (revision 311)
+++ Horde3D/Binaries/Content/pipelines/hdr.pipeline.xml   (working copy)
@@ -19,6 +19,7 @@
       </Stage>
       
       <Stage id="Translucent">
+         <BindBuffer sampler="depthBuf" sourceRT="HDRBUF" bufIndex="32" />
          <DrawGeometry context="TRANSLUCENT" class="Translucent" order="BACK_TO_FRONT" />
       </Stage>
       
Index: Horde3D/Binaries/Content/shaders/particle.shader
===================================================================
--- Horde3D/Binaries/Content/shaders/particle.shader   (revision 311)
+++ Horde3D/Binaries/Content/shaders/particle.shader   (working copy)
@@ -7,6 +7,11 @@
    MaxAnisotropy = 1;
 };
 
+sampler2D depthBuf = sampler_state
+{
+   Address = Clamp;
+};
+
 // Contexts
 /*context SHADOWMAP
 {
@@ -68,11 +73,21 @@
 varying vec4 color;
 varying vec2 texCoords;
 
+#ifdef _F01_SoftParticles
+   varying vec4 vpos;
+#endif
+
+
 void main(void)
 {
    color = getParticleColor();
    texCoords = vec2( texCoords0.s, -texCoords0.t );
-   gl_Position = projMat * calcParticleViewPos( vertPos );
+   
+#ifndef _F01_SoftParticles
+   vec4 vpos;
+#endif
+   vpos = projMat * calcParticleViewPos( vertPos );
+   gl_Position = vpos;
 }
 
 
@@ -83,9 +98,21 @@
 varying vec4 color;
 varying vec2 texCoords;
 
+#ifdef _F01_SoftParticles
+   uniform sampler2D depthBuf;
+   varying vec4 vpos;
+#endif
+
 void main( void )
 {
    vec4 albedo = texture2D( albedoMap, texCoords );
+   gl_FragColor = albedo * color;
    
-   gl_FragColor = albedo * color;
+   // Soft particles
+#ifdef _F01_SoftParticles   
+   const float scale = 100;
+   vec3 fragCoord = (vpos.xyz / vpos.w) * 0.5 + 0.5;
+   float sceneDepth = texture2D( depthBuf, fragCoord.xy ).x;
+   gl_FragColor.a *= clamp((sceneDepth - fragCoord.z) * scale, 0, 1);
+#endif
 }
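
As a rough illustration of exposing that constant, the fragment part could use a uniform instead of the hard-coded value (untested sketch; softFadeScale is just an example name, and the exact declaration/binding depends on how custom material uniforms are set up in the shader):
Code:
#ifdef _F01_SoftParticles
   uniform sampler2D depthBuf;
   uniform float softFadeScale;   // e.g. 100.0, set from the material instead of hard-coded
   varying vec4 vpos;
#endif

// ... in main():
#ifdef _F01_SoftParticles
   vec3 fragCoord = (vpos.xyz / vpos.w) * 0.5 + 0.5;
   float sceneDepth = texture2D( depthBuf, fragCoord.xy ).x;
   gl_FragColor.a *= clamp( (sceneDepth - fragCoord.z) * softFadeScale, 0.0, 1.0 );
#endif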

Author:  anchor [ 22.03.2011, 18:13 ]
Post subject:  Re: using alpha channel in shader

Thank You very much!

anchor

Author:  anchor [ 31.03.2011, 11:37 ]
Post subject:  Re: using alpha channel in shader

Hello Marciano!

Shame on me, I can't modify the forward pipeline to set up this soft particle stuff.
Can you explain this to me? :?

Author:  AlexL [ 02.04.2011, 09:13 ]
Post subject:  Re: using alpha channel in shader

I haven't tested Marciano's patch for the HDR pipeline, but couldn't it be a problem that the HDRBuf render-target is bound for write while reading from its Depth buffer attachment (bound as texture) at the same time?

@anchor: Modify the forward pipeline and use something like this (I haven't tested it, but we use a similar pipeline):
Code:
<!-- Forward Shading Pipeline -->
<Pipeline>
   <Setup>
      <RenderTarget id="RT0" depthBuf="true" numColBufs="1" format="RGBA8" width="0" height="0" scale="1.0" maxSamples="0" />
   </Setup>

   <CommandQueue>
      <Stage id="Geometry" link="pipelines/globalSettings.material.xml">
         <SwitchTarget target="RT0" />
         <ClearTarget depthBuf="true" colBuf0="true" />
         
         <DrawGeometry context="AMBIENT" class="~Translucent" />
         <DoForwardLightLoop class="~Translucent" />
         
         <!-- here is marciano's change to bind RT0's depth attachment as the depthBuf sampler; not sure if this is a problem -->
         <BindBuffer sampler="depthBuf" sourceRT="RT0" bufIndex="32" />
         <DrawGeometry context="TRANSLUCENT" class="Translucent" order="BACK_TO_FRONT" />
         <UnbindBuffers />
      </Stage>

      <!-- do post-processing: here we only blit the RT0 to the backbuffer -->
      <Stage id="PostProcessing">
         <SwitchTarget target="" />   <!-- bind backbuffer as target -->
         <BindBuffer sampler="SceneBuffer" sourceRT="RT0" bufIndex="0" />
         <DrawQuad material="pipelines/postProcessing.material.xml" context="TRANSFER" />
         <UnbindBuffers />
      </Stage>
      
      <Stage id="Overlays">
         <DrawOverlays context="OVERLAY" />
      </Stage>
   </CommandQueue>
</Pipeline>

For your postProcessing.material.xml, look at postHDR.material.xml (and the corresponding postHDR.shader with the FINALPASS context).
Simply add a shader postProcessing.shader that has a context "TRANSFER" and a sampler "SceneBuffer".
The shader then copies the SceneBuffer to the bound target (which in the above pipeline is the backbuffer).
You can also do some more fancy post-processing here (and also bind the depth buffer of RT0 again if you need access to it for post-processing effects).
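
A minimal TRANSFER fragment shader for that copy could look roughly like this (untested sketch; it assumes a texCoords varying set up by the fullscreen-quad vertex shader, as in postHDR.shader):
Code:
uniform sampler2D SceneBuffer;
varying vec2 texCoords;

void main( void )
{
   // just copy the scene color to the currently bound target (here the backbuffer)
   gl_FragColor = texture2D( SceneBuffer, texCoords );
}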

Author:  anchor [ 02.04.2011, 20:02 ]
Post subject:  Re: using alpha channel in shader

Thank You for the help!

It is now working :lol:

Before:
[screenshot]

After:
[screenshot]

Author:  marciano [ 03.04.2011, 21:30 ]
Post subject:  Re: using alpha channel in shader

AlexL wrote:
I haven't tested Marciano's patch for the HDR pipeline, but couldn't it be a problem that the HDRBuf render-target is bound for write while reading from its Depth buffer attachment (bound as texture) at the same time?

Good point Alex, I'm not sure what the spec says officially about that. At least depth writes are disabled while the particles are rendered so I would imagine that it should work (it definitely does with my AMD card).

Author:  AlexL [ 04.04.2011, 21:13 ]
Post subject:  Re: using alpha channel in shader

marciano wrote:
AlexL wrote:
I haven't tested Marciano's patch for the HDR pipeline, but couldn't it be a problem that the HDRBuf render-target is bound for write while reading from its Depth buffer attachment (bound as texture) at the same time?

Good point Alex, I'm not sure what the spec says officially about that. At least depth writes are disabled while the particles are rendered so I would imagine that it should work (it definitely does with my AMD card).

If I understood the specs correctly (and some forum posts over at opengl.org), you should not rely on it working correctly in this case.
See section
Code:
4.4.3  Rendering When an Image of a Bound Texture Object is Also Attached to the Framebuffer
Specs at: http://www.opengl.org/registry/specs/ARB/framebuffer_object.txt
The case where the texture is bound as a sampler while depth writes are disabled (but its depth attachment is still bound and used for depth testing during rasterization) is not explicitly covered there.
I'm not sure what a nice solution would be: either create a depth-only render target and blit to it from the bound HDRBuf depth buffer, use a float color buffer as storage for the depth, or keep two fully-fledged render targets and blit between them.
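
The float-color-buffer variant could be sketched like this in the opaque geometry shaders (untested; assumes the render target gets an additional float color attachment that the pipeline later binds as depthBuf, and finalColor stands in for whatever the pass normally writes):
Code:
// fragment shader of the opaque geometry passes, writing to two attachments (MRT):
gl_FragData[0] = vec4( finalColor, 1.0 );   // regular scene color
gl_FragData[1] = vec4( gl_FragCoord.z );    // depth copy into the float color buffer

// the particle shader then samples this color buffer instead of the real depth
// attachment, which stays bound purely for depth testing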

Author:  DarkAngel [ 05.04.2011, 08:05 ]
Post subject:  Re: using alpha channel in shader

In DX-land, this behaviour (using the currently bound depth buffer as a sampler with depth-writes disabled) was only introduced in DX11. In DX9/10, you've got to make a copy of the depth buffer.

If it works in GL on earlier cards, I'd wager that the driver is performing this copy for us behind the scenes.

Author:  AlexL [ 05.04.2011, 09:53 ]
Post subject:  Re: using alpha channel in shader

Thanks DarkAngel for the DX-Land insight.
Here is a rather recent discussion on the topic at opengl.org:
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=290777
My conclusion: according to the specs the behavior is undefined; in practice it seems to work (at least on NVIDIA/AMD).

Author:  marciano [ 09.04.2011, 21:27 ]
Post subject:  Re: using alpha channel in shader

One way to overcome this problem would be to revise the pipeline design a bit. The render targets that can be defined in the pipeline are very similar to FBOs now, in that they are a collection of output buffers. It would be more powerful if the user could define individual buffers/render targets and bind several of them with a SetRenderTargets command, much like in D3D. In GL this could be emulated by caching FBOs for the different combinations of render targets.

Author:  zoombapup [ 05.03.2012, 16:32 ]
Post subject:  Re: using alpha channel in shader

Hey all. I've been implementing soft particles and had a look at the shader marciano posted here.

But I'm having issues with the results of the depth tests, in that my particles seem to be fading incorrectly.

Basically I think there's an issue with the calculation I'm using for the initial position (which is then passed as a varying to the main fragment shader).

In the code marciano posted, he passed a vertex position to calcParticleViewPos, yet that method doesn't take a vertex parameter.

So instead I used gl_Position after it has been calculated with gl_ProjectionMatrix * calcParticleViewPos();
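
i.e. roughly this in the vertex shader (with vpos being the varying from marciano's patch):
Code:
vpos = gl_ProjectionMatrix * calcParticleViewPos();
gl_Position = vpos;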

I'm thinking that this is where my problem lies. So does anyone have any idea what the calcParticleViewPos(vertex) method was actually doing? Or any way I can get the correct vpos value into the varying so that the test works?

The rest of my code is as marciano posted, and it looks alright from what I've read about doing soft particles (a smoother transition could be useful, but I'll do that after this is fixed).

Any help appreciated.

Author:  Tindros [ 25.03.2012, 13:23 ]
Post subject:  Re: using alpha channel in shader

Hi Zoombapup,

From your description, I'd say there are only so many problem areas that could be causing the incorrect behaviour; I don't pretend to be particularly knowledgeable on the subject, but here are my thoughts:
Code:
gl_Position = gl_ProjectionMatrix * calcParticleViewPos();

This seems to be what you've come up with, which matches existing shaders for building translucency - hopefully you're not seeing bizarre mismatches of colours, just incorrect fading. If that is the case, then I'd surmise your code is fine at this point and getting the correct data; it's the next step of working out the alpha that'll need tweaking.

From marciano's code:
Code:
const float scale = 100;
vec3 fragCoord = (vpos.xyz / vpos.w) * 0.5 + 0.5;
float sceneDepth = texture2D( depthBuf, fragCoord.xy ).x;
gl_FragColor.a *= clamp((sceneDepth - fragCoord.z) * scale, 0, 1);

I apologise if my understanding is wholly incorrect, but isn't a scale of 100 out of place here? With the clamped sceneDepth and fragCoord 'depth' values both in [0..1], I'd expect their difference to already represent the relevant alpha value.
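
If the multiply is kept, one way to make the constant easier to reason about would be to express it as the inverse of a fade range in depth-buffer units, something like (untested, equivalent to multiplying by 100):
Code:
const float fadeRange = 0.01;   // depth difference over which the particle fades in (1.0 / 100.0)
gl_FragColor.a *= clamp( (sceneDepth - fragCoord.z) / fadeRange, 0.0, 1.0 );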

Hope this helps.

Author:  zoombapup [ 25.03.2012, 14:28 ]
Post subject:  Re: using alpha channel in shader

Thanks Tindros.

Yes, the scale value there is obviously pretty wrong. I think the original idea was to divide by 100 so that you'd get a number between 0 and 1 for depths that differ by less than 100. In any case, I just had an idea: draw the existing scene depth value onto the particle. That way at least I know I'm getting a correct depth value, which is useful for comparisons and should let me visualize what is wrong.
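
For example, temporarily outputting the sampled depth as grayscale (debug sketch only):
Code:
// debug: show the scene depth sampled at the particle fragment
float sceneDepth = texture2D( depthBuf, fragCoord.xy ).x;
gl_FragColor = vec4( vec3( sceneDepth ), 1.0 );
// for comparison, the particle's own depth: gl_FragColor = vec4( vec3( fragCoord.z ), 1.0 );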

Thanks for the feedback.
