OpenGL performance difference on Linux and Windows XP
I've noticed that an OpenGL app I've been working on shows a significant performance difference when run on Linux vs. Windows XP.
Granted, there are a lot of textures and shadow buffers, but I would estimate that the app runs about 10x slower on Windows XP.
Any ideas?
Any suggestions for porting the code to DirectX? Can that be done easily or would a re-write be needed?
They're running on different hardware. I don't have the specs of the Linux box, but my XP box is an Intel Core 2 Duo with an Nvidia Quadro FX 1500. The Linux box's video card was some sort of Nvidia GeForce (it was a university computer).
Some initialization code:
FlyWindow::FlyWindow() :
    GlowWindow("fly",
               300, 100,  // GlowWindow::autoPosition, GlowWindow::autoPosition,
               700, 500,
               Glow::rgbBuffer | Glow::doubleBuffer | Glow::depthBuffer | Glow::multisampleBuffer,
               Glow::keyboardEvents | Glow::mouseEvents | Glow::dragEvents |
               /* Glow::menuEvents | */
               Glow::motionEvents | Glow::visibilityEvents | Glow::focusEvents
               /* set ::glutEntryFunc */),
    W(700), H(500), flock(10),
    lastSeconds(myclock.getSecondsSinceStart())
{
    myfps = FPScounter();
    GLdraw<float>::initGL(W, H);

    // Add a bouncing checkerboard
    MovingCB = Point3d<double>(50, 2, 50);

    Glow::RegisterIdle(this);

    bDebug = false;
    m_bLookAtCentroid = true;
    m_bLookAtGoal = false;
}

Thanks
Best answer
As DrJokepu mentioned in the comments, it's possible XP is falling back to software rendering, which would point to a driver installation issue. You can verify this by querying GL_VENDOR and GL_RENDERER:
printf( "%s\n", (const char*)glGetString( GL_VENDOR ) ); printf( "%s\n", (const char*)glGetString( GL_RENDERER ) );The vendor should be NVidia and not Microsoft and the renderer should be at least OpenGL 2.0.