Horde3D

Next-Generation Graphics Engine




PostPosted: 04.09.2008, 21:19 

Joined: 22.11.2007, 17:05
Posts: 707
Location: Boston, MA
I just had to replace my laptop, and I decided to move to a MacBook (white), which contains the much-maligned Intel GMA X3100. For some reason, despite being DX10-capable, the card doesn't fully expose OpenGL 2.0, so a few workarounds are necessary to get Horde running.

First off, Horde checks for OpenGL 2.0, so we have to disable that check, or in this case downgrade it to a warning:
Code:
===================================================================
--- Horde3D/Source/Horde3DEngine/egRenderer.cpp   (revision 52)
+++ Horde3D/Source/Horde3DEngine/egRenderer.cpp   (working copy)
@@ -118,8 +118,8 @@
    // Check that OpenGL 2.0 is available
    if( glExt::majorVersion < 2 || glExt::minorVersion < 0 )
    {
-      Modules::log().writeError( "OpenGL 2.0 not supported" );
-      failed = true;
+      Modules::log().writeWarning( "OpenGL 2.0 not supported" );
+      //failed = true;
    }
    
    // Check extensions

That simple change is enough to get the terrain sample running, and the Chicago sample mostly running. Unfortunately, it isn't a change I can put in SVN, because it may let through other cards/platforms which shouldn't pass the check.

As I said, the terrain sample runs fine with that change (at 130+ fps), and the Chicago sample renders everything except the ground plane (which comes out as solid blue), at a measly 2 fps.

The Knight sample won't run at all, because the X3100 has no support for floating-point render buffers, and thus the HDR pipeline cannot be initialised.

I am going to be on this laptop for at least a year, and any games I build with Horde will be targeted at similar hardware (if Spore and StarCraft 2 can support it...), so I would like to get Horde to the point where it can fall back seamlessly from HDR and high-quality shaders to a base profile for generic cards (probably vertex shaders only, no HDR).
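
To make that concrete, here is a rough sketch of the kind of startup check I have in mind: probe for floating-point texture support and pick a pipeline accordingly. The helper and the pipeline paths are purely illustrative (they just mimic the sample content layout), and a GL context must already be current:
Code:
#include <GL/gl.h>     // <OpenGL/gl.h> on the Mac
#include <cstring>

// Illustrative helper, not part of Horde3D: substring check against the
// extension string (good enough for a sketch, not bulletproof).
static bool hasExtension( const char *name )
{
   const char *exts = (const char *)glGetString( GL_EXTENSIONS );
   return exts != 0 && strstr( exts, name ) != 0;
}

// Pick the HDR pipeline only when float textures are actually available,
// otherwise fall back to a plain forward pipeline. The returned path would
// be fed to whatever pipeline resource loading the application already does.
static const char *choosePipeline()
{
   if( hasExtension( "GL_ARB_texture_float" ) )
      return "pipelines/hdr.pipeline.xml";
   return "pipelines/forward.pipeline.xml";
}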

_________________
Tristam MacDonald - [swiftcoding]


PostPosted: 05.09.2008, 01:29 

Joined: 08.11.2006, 03:10
Posts: 384
Location: Australia
Damn Intel! :(

If you're still interested in doing HDR, I think Valve published a technique that works on regular 8-bit channels. According to Wikipedia, though, the X3100 only supports GL 1.5 (which is absurd, seeing as it supports DX10 and Shader Model 4.0!), so I don't know what that means for GLSL support at all...
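
The rough idea behind those 8-bit tricks is to store a per-pixel scale factor alongside the colour so the extra range survives a normal RGBA8 target - RGBM encoding, for example (just one such scheme, not necessarily what Valve do, and in practice it would live in the shaders rather than in C++):
Code:
#include <algorithm>

// Pack an HDR colour into four [0,1] channels: RGB divided by a shared
// multiplier, and the multiplier itself (scaled by a max range of 6.0)
// stored in the fourth channel.
struct RGBM { float r, g, b, m; };

RGBM encodeRGBM( float r, float g, float b )
{
   float maxC = std::max( std::max( r, g ), std::max( b, 1e-6f ) );
   float m = std::min( maxC / 6.0f, 1.0f );

   RGBM out;
   out.m = m;
   out.r = r / (m * 6.0f);
   out.g = g / (m * 6.0f);
   out.b = b / (m * 6.0f);
   return out;
}

void decodeRGBM( const RGBM &p, float &r, float &g, float &b )
{
   r = p.r * p.m * 6.0f;
   g = p.g * p.m * 6.0f;
   b = p.b * p.m * 6.0f;
}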


PostPosted: 05.09.2008, 02:28 

Joined: 22.11.2007, 17:05
Posts: 707
Location: Boston, MA
DarkAngel wrote:
According to Wikipedia, though, the X3100 only supports GL 1.5 (which is absurd, seeing as it supports DX10 and Shader Model 4.0!), so I don't know what that means for GLSL support at all...
I have GLSL 1.2, and every extension required for Horde except for framebuffer_multisample and texture_float.
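
For the curious, something along these lines will print what a given card reports (needs a current GL context; the #define is only there because the stock Windows headers stop at GL 1.1):
Code:
#include <GL/gl.h>     // <OpenGL/gl.h> on the Mac
#include <cstdio>
#include <cstring>

#ifndef GL_SHADING_LANGUAGE_VERSION
#define GL_SHADING_LANGUAGE_VERSION 0x8B8C
#endif

// Print the GL/GLSL versions plus the two extensions the X3100 is missing.
void dumpCaps()
{
   const char *glsl = (const char *)glGetString( GL_SHADING_LANGUAGE_VERSION );
   const char *exts = (const char *)glGetString( GL_EXTENSIONS );
   if( exts == 0 ) exts = "";

   printf( "GL version:   %s\n", (const char *)glGetString( GL_VERSION ) );
   printf( "GLSL version: %s\n", glsl ? glsl : "none" );
   printf( "texture_float:           %s\n",
           strstr( exts, "GL_ARB_texture_float" ) ? "yes" : "no" );
   printf( "framebuffer_multisample: %s\n",
           strstr( exts, "GL_EXT_framebuffer_multisample" ) ? "yes" : "no" );
}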

_________________
Tristam MacDonald - [swiftcoding]


PostPosted: 06.09.2008, 17:04 
Engine Developer

Joined: 10.09.2006, 15:52
Posts: 1217
Getting an Intel GPU as a graphics programmer (esp. GL) is an adventurous step ;)

The good thing is that we now have a fair chance to make Horde run on a broader range of hardware. Concerning the low FPS, I think that's nothing unusual for an integrated graphics card. In my notebook I have an NVidia 8400G and I'm also not too happy with the framerate (although it's much higher than yours). On an 8800 GTX I get great performance running the samples. At some point there is no way to optimize further, since you are simply limited by the physical power of the hardware.


PostPosted: 06.09.2008, 19:38 

Joined: 22.11.2007, 17:05
Posts: 707
Location: Boston, MA
marciano wrote:
Getting an Intel GPU as a graphics programmer (esp. GL) is an adventurous step ;)
Don't I know it! However, one of my goals in this switch is to start focusing on developing gameplay rather than graphics.

Quote:
Concerning the low FPS, I think that's nothing unusual for an integrated graphics card. In my notebook I have an NVidia 8400G and I'm also not too happy with the framerate (although it's much higher than yours). On an 8800 GTX I get great performance running the samples. At some point there is no way to optimize further, since you are simply limited by the physical power of the hardware.
Yeah, I wouldn't expect anyone to use parallax shaders on an integrated GPU. This is going to require some thought - perhaps a way to detect integrated GPUs, so one can choose a simpler pipeline and shaders.
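
Something like this, maybe - just sniffing the vendor/renderer strings. It's crude, and the strings checked are only a heuristic, but it would be enough to default integrated chips to a lighter pipeline and shader set:
Code:
#include <GL/gl.h>     // <OpenGL/gl.h> on the Mac
#include <cstring>

// Crude heuristic: treat Intel parts (and anything reporting itself as a
// GMA) as integrated, and default those to a simpler pipeline/shader set.
bool isIntegratedGPU()
{
   const char *vendor   = (const char *)glGetString( GL_VENDOR );
   const char *renderer = (const char *)glGetString( GL_RENDERER );
   if( vendor == 0 || renderer == 0 )
      return true;   // no context or no info - play it safe

   return strstr( vendor, "Intel" ) != 0 || strstr( renderer, "GMA" ) != 0;
}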

_________________
Tristam MacDonald - [swiftcoding]


PostPosted: 24.08.2010, 21:25 

Joined: 08.12.2009, 21:13
Posts: 17
I'm having some similar X3100 problems, but with the current version of Horde the above workaround no longer applies. The check in egRendererBase.cpp (line 60)

Code:
   // Check that OpenGL 2.0 is available
   if( glExt::majorVersion < 2 || glExt::minorVersion < 0 )
   {
      Modules::log().writeError( "OpenGL 2.0 not available" );
      failed = true;
   }


shows that 2.0 is available, but the capability check in egRenderer.cpp (line 104)

Code:
   // Check capabilities
   if( !_caps[RenderCaps::Tex_Float] )
      Modules::log().writeWarning( "Renderer: No floating point texture support available" );
   if( !_caps[RenderCaps::Tex_NPOT] )
      Modules::log().writeWarning( "Renderer: No non-Power-of-two texture support available" );
   if( !_caps[RenderCaps::RT_Multisampling] )
      Modules::log().writeWarning( "Renderer: No multisampling for render targets available" );


shows that:
No floating point texture support available
No multisampling for render targets available

Running Chicago.exe, the following code in egRendererBase.h (line 65) works for a few calls

Code:
T &getRef( uint32 handle )
   {
      ASSERT( handle > 0 && handle <= _objects.size() );
      
      return _objects[handle - 1];
   }


but then the assert fails because the handle is 0:

    Horde3D.dll!RBObjects<RBVertexLayout>::getRef(unsigned int handle=1) Line 67 C++
    Horde3D.dll!RendererBase::createVertexLayout(unsigned int elemCount=8) Line 941 + 0x16 bytes C++
    Horde3D.dll!Renderer::init() Line 112 + 0xa bytes C++
    Horde3D.dll!h3dInit() Line 88 + 0x20 bytes C++
    Sample_Chicago.exe!Application::init() Line 50 + 0x8 bytes C++
    Sample_Chicago.exe!main(int argc=1, char * * argv=0x00e520a8) Line 185 + 0xb bytes C++
    Sample_Chicago.exe!__tmainCRTStartup() Line 266 + 0x19 bytes C
    Sample_Chicago.exe!mainCRTStartup() Line 182 C
    kernel32.dll!76854911()
    [Frames below may be incorrect and/or missing, no symbols loaded for kernel32.dll]
    ntdll.dll!77c5e4b6()
    ntdll.dll!77c5e489()
    Horde3DUtils.dll!_isindst_nolock(tm * tb=0x00000000) Line 596 + 0x44 bytes C

Is there a new workaround for this, or has Horde3D grown to the point where this graphics card is no longer viable?


PostPosted: 26.08.2010, 19:53 
Engine Developer

Joined: 10.09.2006, 15:52
Posts: 1217
aolney wrote:
but then the assert fails because the handle is 0

Are you sure that the handle is 0? The call stack indicates that it is 1, and I don't see a reason why adding the layout object should fail...
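
If in doubt, a temporary trace right before the assert would settle it (a debugging aid only - the printf doesn't belong in the engine, and it needs <cstdio> pulled in):
Code:
T &getRef( uint32 handle )
{
   // Temporary: report what actually arrives before the assert fires
   printf( "getRef: handle=%u, objects=%u\n",
           handle, (uint32)_objects.size() );

   ASSERT( handle > 0 && handle <= _objects.size() );
   
   return _objects[handle - 1];
}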

