Java GPU Programming

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/3384970/


Java GPU programming

Tags: java, performance, gpu

Asked by Anand Sunderraman

Is it possible to do GPU programming in Java? I mean without using native libraries.

And how much of a performance improvement can one expect when switching over to GPUs?

Edit:

I am not looking at game programming; I want to do hard-core number crunching.

Accepted answer by Gunslinger47

Yes. Java3D, LWJGL, and JOGL support GLSL (the OpenGL Shading Language).
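
For a sense of how GLSL plugs into Java, here is a minimal sketch assuming JOGL 2 and an already-created OpenGL context (older JOGL releases used the javax.media.opengl package instead of com.jogamp.opengl; the class name below is illustrative):

```java
import com.jogamp.opengl.GL2ES2;

public final class ShaderCompiler {

    // Compiles one GLSL shader stage (e.g. GL2ES2.GL_FRAGMENT_SHADER).
    // The shader source is plain GLSL text; the driver compiles it for
    // whatever GPU is present.
    public static int compile(GL2ES2 gl, int type, String source) {
        int shader = gl.glCreateShader(type);
        gl.glShaderSource(shader, 1,
                new String[] { source }, new int[] { source.length() }, 0);
        gl.glCompileShader(shader);

        // Check the compile status so failures don't pass silently.
        int[] status = new int[1];
        gl.glGetShaderiv(shader, GL2ES2.GL_COMPILE_STATUS, status, 0);
        if (status[0] == 0) {
            throw new IllegalStateException("GLSL compilation failed");
        }
        return shader;
    }
}
```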

Edit:
You can use OpenCL if you want to do platform-neutral, general-purpose computation on GPUs. The framework lets you write code that treats all processing units identically, despite wildly varying feature sets and execution environments. This is very low-level programming compared with Java, though.
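
To make "very low level" concrete: with OpenCL, the kernel itself is written in OpenCL C and shipped to the driver as a string; a Java binding such as JOCL or LWJGL's OpenCL module handles compilation and launch (the binding wraps the native API, so only the host side is Java). A minimal sketch, with illustrative class and constant names:

```java
// The GPU-side code is a string of OpenCL C, not Java. Each work-item
// (roughly, one GPU thread) computes one element of the output array.
public final class VecAddKernel {
    public static final String SOURCE =
        "__kernel void vecAdd(__global const float* a,\n" +
        "                     __global const float* b,\n" +
        "                     __global float* c) {\n" +
        "    int i = get_global_id(0);\n" +
        "    c[i] = a[i] + b[i];\n" +
        "}\n";
}
```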

It seems your ideal would be a JVM written with OpenCL support. Searching online, I found a little bit of interest in the idea but no evidence of any major backing.

how much of a performance improvement can one expect

That depends on the system you're running on and what sort of data you're processing (matrix and vector math is extremely efficient on GPUs). You'd likely get some major gains on a system like mine, with two powerful GPUs and a modest single-core CPU. On a computer with a modest GPU and a quad-core CPU, however, the performance gains might have a hard time overcoming the overhead.
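
For a concrete example of the "matrix and vector math" that suits GPUs, here is a plain-Java sketch of a SAXPY loop (the method name and sizes are illustrative). Every iteration is independent, so a GPU can run one element per thread; the overhead mentioned above is chiefly the cost of copying the arrays to device memory and back:

```java
// Embarrassingly parallel: no iteration depends on any other, so a GPU can
// assign one element per thread. The speedup only materializes once the
// arrays are large enough to amortize the host-to-device copies.
static void saxpy(float alpha, float[] x, float[] y) {
    for (int i = 0; i < x.length; i++) {
        y[i] = alpha * x[i] + y[i];
    }
}
```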

Answered by user619318

If you are still considering hard-core number crunching in Java on the GPU without using native libraries, you might be interested in this blog article (http://ateji.blogspot.com/2011/02/java-on-gpu-with-ateji-px.html). We (I am part of the Ateji team) have seen speedups of up to 60x so far on Java applications that can be massively parallelized.

Answered by pcpratts

Rootbeer1 has just been released on github: https://github.com/pcpratts/rootbeer1

With Rootbeer you can program using almost any Java except the following:

  1. native methods and fields
  2. reflection
  3. dynamic method invocation
  4. garbage collection
  5. sleeping inside a monitor

This means you can use arbitrary object graphs with composite types.
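
For a sense of the programming model, here is a hedged sketch based on the project's published examples (the runtime package has been edu.syr.pcpratts.rootbeer.runtime in early releases and org.trifort.rootbeer.runtime later, and the launch method has appeared as both runAll and run; adjust to the jar you build):

```java
import java.util.ArrayList;
import java.util.List;

// Package name varies by Rootbeer release; see the note above.
import org.trifort.rootbeer.runtime.Kernel;
import org.trifort.rootbeer.runtime.Rootbeer;

// One Kernel instance becomes one GPU thread; gpuMethod() is the code
// Rootbeer cross-compiles to CUDA at build time.
public class ArrayIncKernel implements Kernel {
    private final int[] values;
    private final int index;

    public ArrayIncKernel(int[] values, int index) {
        this.values = values;
        this.index = index;
    }

    @Override
    public void gpuMethod() {
        values[index]++;
    }

    public static void main(String[] args) {
        int[] data = new int[4096];
        List<Kernel> jobs = new ArrayList<Kernel>();
        for (int i = 0; i < data.length; i++) {
            jobs.add(new ArrayIncKernel(data, i));
        }
        // Serializes the object graph, launches on the GPU, copies back.
        new Rootbeer().run(jobs);
    }
}
```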

Answered by Andy

Definitely check out Rootbeer1, but before you start you need to make sure you have a CUDA-capable graphics card and have completed all of the NVIDIA setup, etc.

Download link: google "CUDA download"

Getting started guide: http://developer.download.nvidia.com/compute/DevZone/docs/html/C/doc/CUDA_C_Getting_Started_Windows.pdf