Horde3D

Next-Generation Graphics Engine

All times are UTC + 1 hour




Post new topic Reply to topic  [ 7 posts ] 
PostPosted: 13.09.2012, 09:59 

Joined: 06.09.2012, 14:18
Posts: 5
Hi, I have a few questions that have been bugging me and that I couldn't find a good answer to, no matter where I looked. I hope you can help :)

How can I configure a render target and render to it? I didn't really understand that part of the pipeline configuration, as I'm new to 3D programming. A simple example would do, if possible.

What do lighting materials do? I tried using the lighting material included with the samples and edited its shader file so that it discards whatever the fragment shader did, but I see no visible difference D:

And finally, I was wondering if it's possible to apply full-screen fragment shaders. My guess is to render the whole scene to a separate render target, then apply the shader to that render target when drawing it (using overlays?), but as stated earlier, I don't know how to configure render targets. If there's a better way, I'd like to know :)

Thanks!


PostPosted: 13.09.2012, 11:18 

Joined: 08.06.2010, 14:17
Posts: 63
For fullscreen shaders, have a look at hdr.pipeline.xml; it's the simplest example of how they work.

Light materials are used for deferred lighting; try editing them while using deferred.pipeline.xml.
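A light material is just a regular material whose shader provides the context that the light node's lightingContext attribute names. A rough sketch of how the pieces connect (the file names and the LIGHTING context name here are made up for illustration, not taken from the samples):

```xml
<!-- materials/light.material.xml (hypothetical): the referenced shader
     must define the context that the light's lightingContext names. -->
<Material>
    <Shader source="shaders/deferredLight.shader" />
</Material>
```

In the scene, the light then references that material, e.g. <Light material="materials/light.material.xml" lightingContext="LIGHTING" ... />.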


PostPosted: 13.09.2012, 11:35 

Joined: 23.07.2009, 21:03
Posts: 51
Location: Germany
Just in case you haven't found it, the pipeline documentation can be found in the manual :wink:

It answers your lighting materials question:
Manual wrote:
For deferred shading you also need to specify a material for the light source. The shader defined in that material is used to draw the screen-space quads where the light's lightingContext attribute specifies the shader context which is used.


If you don't use the deferred sample pipeline, you don't need to worry about them. If you want to use it, Wikipedia should be your first stop for reading about deferred shading.

Your other questions can be answered by having a look at the sample HDR pipeline.

Code:
<Setup>
   <RenderTarget id="HDRBUF" depthBuf="true" numColBufs="1" format="RGBA16F" scale="1.0" maxSamples="16" />
   <RenderTarget id="BLURBUF1" depthBuf="false" numColBufs="1" format="RGBA8" scale="0.25" />
   <RenderTarget id="BLURBUF2" depthBuf="false" numColBufs="1" format="RGBA8" scale="0.25" />
</Setup>


We set up three render targets. The render target attributes used here are documented in the manual.

First stage in our command queue:
Code:
<Stage id="Ambient" link="pipelines/globalSettings.material.xml">
   <SwitchTarget target="HDRBUF" />
   <ClearTarget depthBuf="true" colBuf0="true" />

   <DrawGeometry context="AMBIENT" class="~Translucent" />
</Stage>

Basically this says: render my geometry (except the translucent stuff) to the render target "HDRBUF".

The next two stages should be clear.
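If I remember right, they run the forward light loop and then draw the translucent geometry, roughly like this (attribute details may differ in your copy of hdr.pipeline.xml):

```xml
<Stage id="Lighting">
   <DoForwardLightLoop class="~Translucent" />
</Stage>

<Stage id="Translucent">
   <DrawGeometry context="TRANSLUCENT" class="Translucent" />
</Stage>
```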

Then the render target gets switched:
Code:
<!-- HDR post processing -->
<Stage id="BrightPass">
   <SwitchTarget target="BLURBUF1" />
   <BindBuffer sampler="buf0" sourceRT="HDRBUF" bufIndex="0" />
   <DrawQuad material="pipelines/postHDR.material.xml" context="BRIGHTPASS" />
   <UnbindBuffers />
</Stage>

Basically this says: take the render target "HDRBUF" we rendered to before, do something with it (what to do is specified in the material/shader of the <DrawQuad> command), and render the result to BLURBUF1.

The "Bloom" stage works in a similar way and just adds a <SetUniform> command (which sets a uniform :wink: ).

In the last stage, the render target gets switched to an empty string (<SwitchTarget target="" />), which means the output buffer.
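Such a final stage looks roughly like this (the stage id and the FINALPASS context name are illustrative, and the sample may bind the buffers differently):

```xml
<Stage id="Combination">
   <SwitchTarget target="" />
   <BindBuffer sampler="buf0" sourceRT="HDRBUF" bufIndex="0" />
   <BindBuffer sampler="buf1" sourceRT="BLURBUF2" bufIndex="0" />
   <DrawQuad material="pipelines/postHDR.material.xml" context="FINALPASS" />
   <UnbindBuffers />
</Stage>
```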

That's it.

IMHO the best way to understand pipelines is to play around with the existing ones using the HordeEditor. You can switch off stages, edit render target configurations, and see immediate results. And you can have a look at each target on its own.


PostPosted: 13.09.2012, 20:05 

Joined: 06.09.2012, 14:18
Posts: 5
Thank you guys, I think I have a better grasp of how things work now; it seems I only needed a lead after all :P

But still, about render targets: let's say I'm making a car racing game and need to implement a rear-view mirror. My idea is to associate a camera with a render target, but h3dAddCameraNode only lets us set a pipeline resource. Does that mean my render target should be in a different pipeline, or is there a different way to handle this?

In the same context, in both of the previous cases, how can I access the render target for use in a material shader?

I know I'm asking a lot of questions, some of which might be stupid, but please bear with me :roll: .
And thanks again!


PostPosted: 13.09.2012, 20:53 

Joined: 08.06.2010, 14:17
Posts: 63
I've never tried to make any kind of picture-in-picture myself, but I guess the easiest way would be to have a single pipeline with multiple stages: one of them renders stuff on screen, the other is the same except that it renders into a separate render target. Then in the game you disable the on-screen stage (h3dSetPipelineStageActivation), render your rear-view camera, enable the on-screen stage again, and render the "main" view with said RT used as a texture on your mirror object (or you can overlay it with a fullscreen shader).
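Roughly, the frame loop could look like this (the stage names "MainPass" and "MirrorPass" and the variables are made up for illustration; this is a sketch of the idea, not tested code):

```cpp
// pipeRes:   the pipeline resource used by both cameras
// mirrorCam: camera looking backwards; mainCam: the normal view camera

// 1) Render the rear view into the separate render target
h3dSetPipelineStageActivation( pipeRes, "MainPass", false );
h3dSetPipelineStageActivation( pipeRes, "MirrorPass", true );
h3dRender( mirrorCam );

// 2) Render the main view on screen, with the RT bound as a texture
//    on the mirror object's material
h3dSetPipelineStageActivation( pipeRes, "MainPass", true );
h3dSetPipelineStageActivation( pipeRes, "MirrorPass", false );
h3dRender( mainCam );
h3dFinalizeFrame();
```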


PostPosted: 13.09.2012, 21:20 

Joined: 23.07.2009, 21:03
Posts: 51
Location: Germany
The easiest way to get your mirror example running would be to use a second camera:

Create a texture (resolution and format are just examples):
Code:
H3DRes mirrorTexture = h3dCreateTexture( textureName, 320, 240, H3DFormats::TEX_BGRA8, H3DResFlags::TexRenderable );

Notice the TexRenderable flag.

Next, assign the texture to the material of e.g. a simple plane mesh, which will act as the mirror.
Code:
int samplerIndex = h3dFindResElem(PlaneMaterialResourceId, H3DMatRes::SamplerElem, H3DMatRes::SampNameStr, "albedoMap");
h3dSetResParamI(PlaneMaterialResourceId, H3DMatRes::SamplerElem, samplerIndex, H3DMatRes::SampTexResI, mirrorTexture);


Now you create your second camera and attach it to the car (and change its position and rotation so it properly renders the "back view").
Code:
H3DNode mirrorCam = h3dAddCameraNode( carNode, "myMirrorCam", pipelineResource );


Now you must set the camera to render to the mirror texture, not to the screen:
Code:
h3dSetNodeParamI( mirrorCam, H3DCamera::OutTexResI, mirrorTexture );


And don't forget to render your second camera in the game loop.
That should be it. I think .... :D
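In the loop that just means rendering the mirror camera before the main one (mainCam here is a stand-in for your primary camera node):

```cpp
// Each frame: mirrorCam writes into mirrorTexture (because of OutTexResI),
// then the main camera draws the scene; the mirror plane's material
// samples mirrorTexture.
h3dRender( mirrorCam );
h3dRender( mainCam );
h3dFinalizeFrame();
```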


PostPosted: 13.09.2012, 23:14 

Joined: 06.09.2012, 14:18
Posts: 5
Roland wrote:
Now you must set the camera to render to the mirror texture, not to the screen:
Code:
h3dSetNodeParamI( mirrorCam, H3DCamera::OutTexResI, mirrorTexture );


Yay! That was the part I was missing. Not that I understood what "H3DCamera::OutTexResI" did exactly anyway :P

Well, thanks guys!





Powered by phpBB® Forum Software © phpBB Group