Horde3D http://horde3d.org/forums/
Horde on Mac Intel GMA X3100 http://horde3d.org/forums/viewtopic.php?f=8&t=492 |
Page 1 of 1 |
Author: | swiftcoder [ 04.09.2008, 21:19 ] |
Post subject: | Horde on Mac Intel GMA X3100 |
I just had to replace my laptop, and I decided to move to a MacBook (white), which contains the much-maligned Intel GMA X3100. For some reason, despite being DX10-capable, the card doesn't fully expose OpenGL 2.0, so a few workarounds are necessary to get Horde running.

First off, Horde checks for OpenGL 2.0, so we have to disable that check, or in this case downgrade it to a warning:

Code:
===================================================================
--- Horde3D/Source/Horde3DEngine/egRenderer.cpp	(revision 52)
+++ Horde3D/Source/Horde3DEngine/egRenderer.cpp	(working copy)
@@ -118,8 +118,8 @@
 	// Check that OpenGL 2.0 is available
 	if( glExt::majorVersion < 2 || glExt::minorVersion < 0 )
 	{
-		Modules::log().writeError( "OpenGL 2.0 not supported" );
-		failed = true;
+		Modules::log().writeWarning( "OpenGL 2.0 not supported" );
+		//failed = true;
 	}
 
 	// Check extensions

That simple change is enough to get the Terrain sample running and the Chicago sample mostly running. Unfortunately, it isn't a change I can put in SVN, because it may let through other cards/platforms that shouldn't pass the check. As I said, the Terrain sample runs fine with that change (at 130+ fps), and the Chicago sample renders everything except the ground plane (which comes out blue), at a measly 2 fps. The Knight sample won't run at all, because the X3100 has no support for floating-point render buffers, and thus the HDR pipeline cannot be initialised.

I am going to be on this laptop for at least a year, and any games I build with Horde will be targeted at similar hardware (if Spore and StarCraft 2 can support it...), so I would like to get Horde to the point where it can fall back seamlessly from HDR and high-quality shaders to a base profile for generic cards (probably vertex shaders only, no HDR).
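For context, the fallback described in the last paragraph can be approximated at the application level by checking the driver before choosing a pipeline resource. This is only a minimal sketch: it assumes the h3d* C API that appears later in this thread (h3dAddResource, h3dSetNodeParamI), pipeline file names as used by the standard samples, and a hypothetical hasFloatTextures() helper that simply scans the GL extension string.

Code:
#include <cstring>
#include <GL/gl.h>       // <OpenGL/gl.h> on OS X; requires a current GL context
#include "Horde3D.h"

// Hypothetical helper: the capability the X3100 lacks for the HDR pipeline
// is floating-point textures/render targets, so look for the extension.
static bool hasFloatTextures()
{
	const char *exts = (const char *) glGetString( GL_EXTENSIONS );
	return exts != 0 && std::strstr( exts, "GL_ARB_texture_float" ) != 0;
}

// Prefer the HDR pipeline, but fall back to the plain forward pipeline on
// hardware without floating-point render targets (e.g. the GMA X3100).
H3DRes choosePipeline()
{
	const char *file = hasFloatTextures() ? "pipelines/hdr.pipeline.xml"
	                                      : "pipelines/forward.pipeline.xml";
	return h3dAddResource( H3DResTypes::Pipeline, file, 0 );
}

// Usage, after h3dInit() and camera creation:
//   h3dSetNodeParamI( cam, H3DCamera::PipeResI, choosePipeline() );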
Author: | DarkAngel [ 05.09.2008, 01:29 ] |
Post subject: | Re: Horde on Mac Intel GMA X3100 |
Damn Intel! If you're still interested in doing HDR, I think Valve published their technique, which works on regular 8-bit channels. That said, according to Wikipedia the X3100 only supports GL 1.5 (which is absurd seeing as it supports DX10 and Shader Model 4.0!), so I don't know what that means for GLSL support at all...
Author: | swiftcoder [ 05.09.2008, 02:28 ] |
Post subject: | Re: Horde on Mac Intel GMA X3100 |
DarkAngel wrote:
according to Wikipedia the X3100 only supports GL 1.5 (which is absurd seeing as it supports DX10 and Shader Model 4.0!), so I don't know what that means for GLSL support at all...

I have GLSL 1.2, and every extension required for Horde except for framebuffer_multisample and texture_float.
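For reference, the version and extension strings being discussed here can be read straight from the driver with plain OpenGL calls. A minimal sketch with no Horde-specific code; it needs a current GL context, and the header name varies by platform:

Code:
#include <cstdio>
#include <cstring>
#include <GL/gl.h>   // <OpenGL/gl.h> on OS X; GL_SHADING_LANGUAGE_VERSION may need <GL/glext.h>

// Print what the driver actually reports; Horde's version and extension
// checks are based on these same strings.
void dumpGLInfo()
{
	std::printf( "GL_VERSION: %s\n", (const char *) glGetString( GL_VERSION ) );
	std::printf( "GLSL:       %s\n", (const char *) glGetString( GL_SHADING_LANGUAGE_VERSION ) );

	const char *exts = (const char *) glGetString( GL_EXTENSIONS );
	std::printf( "framebuffer_multisample: %s\n",
	             std::strstr( exts, "EXT_framebuffer_multisample" ) ? "yes" : "no" );
	std::printf( "texture_float:           %s\n",
	             std::strstr( exts, "ARB_texture_float" ) ? "yes" : "no" );
}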
Author: | marciano [ 06.09.2008, 17:04 ] |
Post subject: | Re: Horde on Mac Intel GMA X3100 |
Getting an Intel GPU as a graphics programmer (esp. GL) is an adventurous step. The good thing is that we have a fair chance now to make Horde run on a broader range of hardware.

Concerning the low FPS, I think that's nothing too special for an integrated graphics card. In my notebook I have an NVidia 8400G and I'm also not too happy with the framerate (although it's much higher than yours). On an 8800 GTX I get great performance running the samples. At some point there is no way to further optimize stuff, since you are just limited by the physical power of the hardware.
Author: | swiftcoder [ 06.09.2008, 19:38 ] |
Post subject: | Re: Horde on Mac Intel GMA X3100 |
marciano wrote:
Getting an Intel GPU as a graphics programmer (esp. GL) is an adventurous step

Don't I know it! However, one of my goals in this switch is to start focusing on developing gameplay rather than graphics.

Quote:
Concerning the low FPS, I think that's nothing too special for an integrated graphics card. In my notebook I have an NVidia 8400G and I'm also not too happy with the framerate (although it's much higher than yours). On an 8800 GTX I get great performance running the samples. At some point there is no way to further optimize stuff, since you are just limited by the physical power of the hardware.

Yeah, I wouldn't expect anyone to use parallax shaders with an integrated GPU. This is going to require some thought - perhaps a way to detect integrated GPUs, so one can choose a simpler pipeline and shaders.
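One crude way to do the detection suggested above is to look at the strings the driver reports. A sketch under that assumption; the substrings matched below are just examples, not an exhaustive list of integrated parts:

Code:
#include <cstring>
#include <GL/gl.h>   // <OpenGL/gl.h> on OS X; requires a current GL context

// Heuristic: integrated parts usually identify themselves in the
// GL_VENDOR/GL_RENDERER strings ("Intel", "GMA", ...).
bool looksLikeIntegratedGPU()
{
	const char *vendor   = (const char *) glGetString( GL_VENDOR );
	const char *renderer = (const char *) glGetString( GL_RENDERER );
	if( vendor == 0 || renderer == 0 ) return false;   // no GL context yet

	return std::strstr( vendor, "Intel" ) != 0 ||
	       std::strstr( renderer, "GMA" ) != 0;
}

// When this returns true, the application could load a forward pipeline and
// a shader set without parallax mapping instead of the HDR setup.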
Author: | aolney [ 24.08.2010, 21:25 ] |
Post subject: | Re: Horde on Mac Intel GMA X3100 |
I'm having some similar X3100 problems, but with the current version of Horde the above workaround no longer works. The check in egRendererBase.cpp (line 60)

Code:
// Check that OpenGL 2.0 is available
if( glExt::majorVersion < 2 || glExt::minorVersion < 0 )
{
	Modules::log().writeError( "OpenGL 2.0 not available" );
	failed = true;
}

shows that 2.0 is available, but the capability checks in egRenderer.cpp (line 104)

Code:
// Check capabilities
if( !_caps[RenderCaps::Tex_Float] )
	Modules::log().writeWarning( "Renderer: No floating point texture support available" );
if( !_caps[RenderCaps::Tex_NPOT] )
	Modules::log().writeWarning( "Renderer: No non-Power-of-two texture support available" );
if( !_caps[RenderCaps::RT_Multisampling] )
	Modules::log().writeWarning( "Renderer: No multisampling for render targets available" );

report:

No floating point texture support available
No multisampling for render targets available

Running Chicago.exe, getRef() in egRendererBase.h (line 65)

Code:
T &getRef( uint32 handle )
{
	ASSERT( handle > 0 && handle <= _objects.size() );
	return _objects[handle - 1];
}

works for a few calls, but then the assert fails because the handle is 0:
Horde3D.dll!RendererBase::createVertexLayout(unsigned int elemCount=8) Line 941 + 0x16 bytes C++
Horde3D.dll!Renderer::init() Line 112 + 0xa bytes C++
Horde3D.dll!h3dInit() Line 88 + 0x20 bytes C++
Sample_Chicago.exe!Application::init() Line 50 + 0x8 bytes C++
Sample_Chicago.exe!main(int argc=1, char * * argv=0x00e520a8) Line 185 + 0xb bytes C++
Sample_Chicago.exe!__tmainCRTStartup() Line 266 + 0x19 bytes C
Sample_Chicago.exe!mainCRTStartup() Line 182 C
kernel32.dll!76854911() [Frames below may be incorrect and/or missing, no symbols loaded for kernel32.dll]
ntdll.dll!77c5e4b6()
ntdll.dll!77c5e489()
Horde3DUtils.dll!_isindst_nolock(tm * tb=0x00000000) Line 596 + 0x44 bytes C

Is there a new workaround for this, or has Horde3D grown to the point where this graphics card is no longer viable?
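As an aside, since the crash happens inside h3dInit(), it is worth making sure the sample writes the engine log out when initialisation fails, so the capability warnings quoted above are captured. This won't stop the assert in a debug build, but it helps with diagnosis. A sketch of the usual pattern, assuming the h3d C API from the call stack and the h3dutDumpMessages() helper from Horde3DUtils (treat both names as assumptions if your version differs):

Code:
#include "Horde3D.h"
#include "Horde3DUtils.h"

bool Application::init()
{
	// Initialise the engine; on failure, dump the accumulated log messages
	// (including the capability warnings) to the HTML log and give up cleanly.
	if( !h3dInit() )
	{
		h3dutDumpMessages();
		return false;
	}

	// ... resource loading, camera setup, etc. ...
	return true;
}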
Author: | marciano [ 26.08.2010, 19:53 ] |
Post subject: | Re: Horde on Mac Intel GMA X3100 |
aolney wrote:
but then the assert fails because the handle is 0

Are you sure that the handle is 0? The callstack indicates that it is 1, and I don't see a reason why adding the layout object should fail...
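One low-tech way to settle the 0-versus-1 question is to log the incoming handle right before the ASSERT fires. A throwaway sketch against the getRef() quoted above, reusing the Modules::log().writeWarning() call that already appears in this thread; it needs <cstdio> for sprintf, and the message text is made up for illustration:

Code:
T &getRef( uint32 handle )
{
	// Temporary instrumentation: record the handle and the current object
	// count before the assert has a chance to abort.
	char msg[128];
	sprintf( msg, "getRef: handle=%u, objects=%u",
	         handle, (uint32) _objects.size() );
	Modules::log().writeWarning( msg );

	ASSERT( handle > 0 && handle <= _objects.size() );
	return _objects[handle - 1];
}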