Horde3D
http://horde3d.org/forums/

Shaders' love-hate relationship with hardware...
http://horde3d.org/forums/viewtopic.php?f=2&t=959

Author:  Orm [ 26.09.2009, 23:51 ]
Post subject:  Shaders' love-hate relationship with hardware...

The bane of every programmer... varying hardware...

I was wondering if there was any workaround for the fact that Intel Integrated Graphics Chipsets (primarily in laptops) don't support GLSL as they should. The demos work just fine on my desktop (Nvidia 9800 GTX) but on my laptop (Intel 4 Series Express), they crash.

In addition...
Does Cg have this same problem? If not, I was wondering if there was any planned support for Nvidia Cg or if anyone could help me out in integrating the language into the engine.

Author:  swiftcoder [ 27.09.2009, 00:54 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

If you develop on the Intel chipset, your game should work fine everywhere else. Beyond that, no guarantees - Intel integrated GPUs are just not up to most tasks.

One general guideline is to develop for Intel GMA if you are producing a casual game, and develop for real graphics cards if you can afford to target only gamers :)

Author:  Orm [ 27.09.2009, 03:02 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

My target is the widest range of hardware I can manage. The thing that sucks is that most of the time I'm developing on my laptop between classes.

Author:  marciano [ 27.09.2009, 21:48 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Orm wrote:
Does Cg have this same problem? If not, I was wondering if there was any planned support for Nvidia Cg or if anyone could help me out in integrating the language into the engine.

Cg is not directly supported by OpenGL. Cg shaders are compiled to either GLSL or to the old assembler-based ARB fragment and vertex programs. The latter seem to be supported by the integrated Intel cards, so Cg could help here. We don't plan to integrate Cg anytime soon but I guess it should not be too difficult to do.
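
For reference, this is roughly what loading a shader through the Cg runtime looks like if you let it pick the best available profile (which would be the ARB assembly programs on those Intel chips). Just a minimal, untested sketch - the file name "shader.cg" and entry point "main" are placeholders:

Code:
#include <Cg/cg.h>
#include <Cg/cgGL.h>

void initCg()
{
    CGcontext ctx = cgCreateContext();

    // Best fragment profile the driver offers; on GLSL-less Intel
    // parts this should come back as the ARB assembly profile
    // (CG_PROFILE_ARBFP1) rather than a GLSL profile.
    CGprofile fragProfile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(fragProfile);

    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                             fragProfile, "main", 0);
    cgGLLoadProgram(prog);

    // Then per frame:
    cgGLEnableProfile(fragProfile);
    cgGLBindProgram(prog);
    // ... draw geometry ...
    cgGLDisableProfile(fragProfile);
}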

Author:  Orm [ 29.09.2009, 01:34 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Well, even if you don't, it seems like Intel's new cards have GLSL support, so I dunno... there has got to be some sort of workaround though. It's kinda ridiculous that Intel would release graphics chips without even basic support for game-related graphics, ESPECIALLY if they're making a move into the graphics market with Larrabee.

Author:  swiftcoder [ 29.09.2009, 03:01 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Orm wrote:
Well, even if you don't, it seems like Intel's new cards have GLSL support, so I dunno... there has got to be some sort of workaround though. It's kinda ridiculous that Intel would release graphics chips without even basic support for game-related graphics, ESPECIALLY if they're making a move into the graphics market with Larrabee.
I have an Intel GMA X3100, which is just about able to run Horde (i.e. forward rendering only, no HDR, no shadows). I am afraid the GMAs are not gaming-grade GPUs, and if you plan to target them, you need a capable non-shader/minimal-shader pipeline (and even then, they are no speed demons).

Author:  Orm [ 29.09.2009, 19:14 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

That's what variable game quality settings are for, my friend. The demos won't work on my laptop with the integrated graphics, so that is the primary issue I am trying to work out. Even then, the G4 chips aren't slouches. You really aren't giving them enough credit: I was able to run AirRivals, a game with the most inefficient and sloppily built game engine I have ever seen, with few problems.

Author:  swiftcoder [ 29.09.2009, 20:59 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Orm wrote:
That's what variable game quality settings are for, my friend.
Sure, but in this case the minimum engine spec is higher than your hardware ;)

By all means, cut the shaders down far enough to run on your hardware (and probably reduce the polycount of the models at the same time), but I found that to run decently I basically had to disable everything.

Author:  Orm [ 29.09.2009, 21:29 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Alright, then I will have to read up on some of the Horde3D settings. Right now I am just developing a scene graph to tie the rendering engine and physics engine together, so no serious game development yet. Even then, games like World of Warcraft and Team Fortress 2 (not kidding) were designed to run on a fairly wide range of hardware, so I am simply looking for guidance in this area.

Author:  swiftcoder [ 29.09.2009, 21:44 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Orm wrote:
Even then, games like World of Warcraft and Team Fortress 2 (not kidding) were designed to run on a fairly wide range of hardware, so I am simply looking for guidance in this area.
Both WoW and the Source engine fall back to fixed-function rendering on low-end hardware, and scale shader-based effects very heavily on shader-capable hardware.

Horde doesn't support fixed-function fallbacks, so that immediately cuts off a large range of WoW-supported hardware. Horde also doesn't have any built-in capacity to scale shader-based effects, although you can emulate this using defines in the shaders, and multiple pipelines selected at runtime.
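
To illustrate the defines approach: you keep one shader source, prepend a quality define on the application side before compiling, and pick the variant at startup. A rough sketch using plain GL calls (not Horde's material system - the names here are made up, and it assumes a current context with GLEW initialized):

Code:
#include <GL/glew.h>
#include <string>

// One fragment shader source, two quality levels via a define.
// The HIGH_QUALITY branch is where the expensive effects would live.
static const char *fragSrc =
    "#ifdef HIGH_QUALITY\n"
    "  // full lighting, shadows, etc.\n"
    "#endif\n"
    "void main() { gl_FragColor = vec4(1.0); }\n";

GLuint compileFragment(bool highQuality)
{
    // Prepend the define so a single source file serves both levels.
    std::string src = highQuality
        ? std::string("#define HIGH_QUALITY\n") + fragSrc
        : std::string(fragSrc);
    const char *text = src.c_str();

    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &text, 0);
    glCompileShader(shader); // check GL_COMPILE_STATUS in real code
    return shader;
}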

Author:  Orm [ 29.09.2009, 21:55 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

So in other words, this is crap best left up to teams of dedicated professionals doing years of research and development. For a student project I should only focus on implementing something that actually works... is that what you're saying?

Author:  swiftcoder [ 30.09.2009, 02:13 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Orm wrote:
So in other words, this is crap best left up to teams of dedicated professionals doing years of research and development. For a student project I should only focus on implementing something that actually works... is that what you're saying?
Yeah, pretty much. Support across both ATI and NVidia is bad enough, and when you add Intel to the mix... well, all hell breaks loose.

I would advocate that you get your game running on some reasonable hardware, and then see how far you can "downport" it in the time you have available - otherwise you run the risk of spending all your time chasing obscure Intel bugs.

Author:  Orm [ 30.09.2009, 07:17 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Alright, then I'll focus on mainstream crap first. Nowadays you'd think there would be a "graphics standard".

Author:  DarkAngel [ 30.09.2009, 07:29 ]
Post subject:  Re: Shaders' love-hate relationship with hardware...

Orm wrote:
Nowadays you'd think there would be a "graphics standard".
The great thing about standards is that there's so many to choose from! </sarcasm> :D

As far as I know, most of Intel's chips only properly support the GL1.5 standard, whereas Horde expects full GL2.0 support.

Newer cards from NVidia/ATI have GL3.0 support :wink: ...we're just waiting for Intel to catch up.
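
You can at least detect this up front instead of crashing: parse the version string before touching any GL2 entry points. Something like this (untested sketch, needs a current GL context):

Code:
#include <GL/gl.h>
#include <cstdio>

// Returns true if the current context reports OpenGL 2.0 or later.
// Call after context creation; on a GMA this typically yields "1.5...".
bool hasOpenGL2()
{
    const char *ver = reinterpret_cast<const char *>(glGetString(GL_VERSION));
    if (!ver) return false;
    int major = 0, minor = 0;
    if (std::sscanf(ver, "%d.%d", &major, &minor) != 2) return false;
    return major >= 2;
}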
