How can I take advantage of my discrete graphics card on a headless server?

开发者 https://www.devze.com 2023-04-05 15:43 (Source: Internet)
I'm working on a remote visualization project for BioMesh3D, harnessing a beefed-up server to do processing and rendering. It's easy to get it working if we set up a user to auto-login to the graphical desktop, because then we have a running X server. We'd like to run it completely headless and without having to do an auto-login.

It seems as if our only choice is to rely on something like Xvfb if we want to run the server completely headless. But as we are doing some fairly complex rendering, we'd like to take advantage of the discrete graphics card.

Is it possible to get Xvfb to use the video card? If not, is there another method we can use?


If you want to use hardware OpenGL rendering on Linux, then your options are:

  • Try the bleeding-edge Mesa with Gallium drivers and EGL. IIRC, last I checked this only supported GLES, so you may need to tweak your rendering code.
  • Run an X server to access the OpenGL hardware via the traditional driver.
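Before committing to the EGL route, you can probe whether the installed drivers expose a headless ("platform device") EGL path at all. A minimal check, assuming the `eglinfo` utility (shipped in the mesa-utils / egl-utils packages on most distributions) is available:

```shell
# Summarize the EGL devices/platforms the driver exposes, no X server needed
# (-B is eglinfo's "brief" mode in recent mesa-demos; older builds may lack it)
eglinfo -B 2>/dev/null | head -n 20

# List the DRM render nodes - headless GPU access goes through these
ls -l /dev/dri/renderD*
```

If a device platform shows up and a render node exists, headless EGL contexts should be possible without any display server.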

The NVIDIA binary blob supports running a headless X server (even multiple ones). Check out the http://vizstack.sourceforge.net/ project for details on that.
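As a sketch of that headless-X setup with the NVIDIA driver (the BusID below is an example and must match your card; check `lspci`):

```shell
# Generate an xorg.conf that will start with no monitor attached
# (both flags are standard nvidia-xconfig options)
nvidia-xconfig --allow-empty-initial-configuration \
    --busid=PCI:1:0:0 --use-display-device=None

# Start a headless X server on display :1 (typically as root)
X :1 &

# Point OpenGL clients at it and confirm hardware rendering
DISPLAY=:1 glxinfo | grep "OpenGL renderer"
```

If `glxinfo` reports the NVIDIA GPU rather than a software rasterizer, your applications can render on that display headlessly.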

As for VirtualGL, it will let you achieve transparent remote rendering without modifying your app (I don't understand your "it's a bit too intrusive" or "rewrite the application" comments, could you expand?), but it still requires the X server running. (Update: VirtualGL 3 supports an EGL backend, and doesn't require the X server.)
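A usage sketch for VirtualGL (the `./BioMesh3D` launcher name is a placeholder for your actual application):

```shell
# One-time server setup (interactive, run as root):
#   /opt/VirtualGL/bin/vglserver_config

# Classic GLX back end: redirect rendering to the GPU's X server on :0
vglrun -d :0 ./BioMesh3D

# VirtualGL 3+ EGL back end: render via a DRM device, no X server on the GPU
vglrun -d /dev/dri/card0 ./BioMesh3D
```

In both cases the application itself is unmodified; VirtualGL interposes on its GLX/EGL calls and reads back the rendered frames for transport to the client.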


Did you take a look at http://www.virtualgl.org ?

