AcidFaucet wrote:
So my question is if there's an obvious means to render out the object/world space colors to a texture matching the assigned UV coordinates? Or can I by some means lock the shader from updating and render out the flat image?
This isn't an easy task... I think the way that 3D modelling programs do this is to render each polygon of the model one at a time: for each poly you construct an ortho camera pointing directly at it, render it, and then copy the output into the correct place in your final bitmap...
A much easier (but flawed) way would be to simply render the object as usual with a shader that outputs the object/world-space positions and the texture coordinates to different MRTs (e.g. buf0.rgb = world-space position, buf1.rg = texture coords). If you render the object from multiple angles with such a shader, you can then use the resulting bitmaps as an associative array and reassemble the final bitmap on the CPU. The only problem with this approach is that any occluded pixels will be missing from the reconstruction...
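For example, the MRT pass could be a small shader pair along these lines (just a rough old-style GLSL sketch; the worldMat uniform name is my assumption, not necessarily Horde3D's actual binding):
Code:
// --- vertex shader (sketch) ---
uniform mat4 worldMat;            // assumed name for the object's world matrix
varying vec3 worldPos;
varying vec2 texCoord;

void main()
{
    worldPos    = (worldMat * gl_Vertex).xyz;     // object -> world space
    texCoord    = gl_MultiTexCoord0.st;           // model's UV coordinates
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// --- fragment shader (sketch) ---
varying vec3 worldPos;
varying vec2 texCoord;

void main()
{
    gl_FragData[0] = vec4(worldPos, 1.0);         // buf0.rgb = world-space position
    gl_FragData[1] = vec4(texCoord, 0.0, 1.0);    // buf1.rg  = texture coords
}
You would render the object with this from a few different camera angles so that as many texels as possible are covered, then feed each pair of buffers into the reassembly loop below.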
e.g. to reassemble the rendered data into the final texture:
Code:
vec3 bitmap[xsize][ysize];                // final texture, indexed by UV
for each pixel (x, y) in buf0/buf1:
    vec3 position = buf0[x][y].rgb;       // world-space position
    vec2 uvcoord  = buf1[x][y].rg;        // texture coordinate in [0,1]
    // scale the UV into pixel indices of the output texture
    bitmap[int(uvcoord.x * (xsize-1))][int(uvcoord.y * (ysize-1))] = position;
[EDIT]
AcidFaucet wrote:
Additionally, is such functionality a welcome addition to Horde? I would like to contribute my work if it is of interest, as procedural content can help reduce the resource consumption associated with high-detail graphics as well as the amount of time involved in content creation.
Yes, I'm very interested in this! Procedural textures are great, especially with the amount of high-resolution content needed for next-gen artwork!