Do calculus in Java

Note: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must likewise comply with CC BY-SA and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/3420265/

Date: 2020-10-30 01:45:12  Source: igfitidea

Do calculus in Java

Tags: java, calculus

Asked by Realn0whereman

I am trying to implement a (small) neural network in Java, and I'm using backpropagation as the learning algorithm. This requires finding general derivatives. How do I find general derivatives in Java?

Answered by Alain O'Dea

Try Helmut Dersch's Jasymca 2 http://webuser.hs-furtwangen.de/~dersch/jasymca2/. It's a Java API providing GNU Octave/Matlab-like capabilities. It includes symbolic math.

Jasymca has been worked on recently: the documentation dates from March 2009, and it requires Java 1.5+.

CAVEAT: Jasymca is GPL-licensed, so consult a lawyer before using it in a commercial product.

Answered by duffymo

Depends on whether you have continuous or discrete data. I'm guessing that you have discrete data, since we're talking about neural nets.

Finite differences are one way to approximate derivatives. Another approach might be to do a fit of some kind and differentiate the fitting function, assuming that it's a well-known function with an easy-to-calculate derivative (e.g., polynomials).

How many independent variables does your data have? Functions of one variable are easy; two or more are harder, because you need partial derivatives.
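
The multi-variable case can be sketched with central differences along each axis. A minimal sketch, assuming an example function f(x, y) = x^2 * y and an illustrative step size h (both are assumptions, not part of the original answer):

```java
// Approximating partial derivatives of f(x, y) with central differences.
public class PartialDerivatives {
    // Example function: f(x, y) = x^2 * y (illustrative choice)
    static double f(double x, double y) {
        return x * x * y;
    }

    // df/dx at (x, y): vary x, hold y fixed
    static double dfdx(double x, double y, double h) {
        return (f(x + h, y) - f(x - h, y)) / (2 * h);
    }

    // df/dy at (x, y): vary y, hold x fixed
    static double dfdy(double x, double y, double h) {
        return (f(x, y + h) - f(x, y - h)) / (2 * h);
    }

    public static void main(String[] args) {
        double h = 1e-5;
        // Analytic values at (2, 3): df/dx = 2xy = 12, df/dy = x^2 = 4
        System.out.println(dfdx(2, 3, h)); // ≈ 12.0
        System.out.println(dfdy(2, 3, h)); // ≈ 4.0
    }
}
```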

Answered by Sindri Tór

You could try hard-coding it as a central-difference approximation:

double derivative = (f(x+h) - f(x-h)) / (2*h);
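
A sketch of how that one-liner might be packaged so any function can be passed in; the method name and the use of `java.util.function.DoubleUnaryOperator` are illustrative choices, not part of the original answer:

```java
import java.util.function.DoubleUnaryOperator;

public class Derivative {
    // Central difference: error shrinks as O(h^2) for smooth f
    static double derivative(DoubleUnaryOperator f, double x, double h) {
        return (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2 * h);
    }

    public static void main(String[] args) {
        // d/dx sin(x) at x = 0 is cos(0) = 1
        System.out.println(derivative(Math::sin, 0.0, 1e-5)); // ≈ 1.0
    }
}
```

Note that h should be small but not too small: below roughly 1e-8, floating-point cancellation in the numerator starts to dominate the truncation error.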

Answered by FullStack

If you can make HTTP requests to the world wide web, you can create a SaturnAPI integration script.

Disclosure: I worked on SaturnAPI

Answered by Stebe23

If it comes to Java, look at the DMelt math program. It's free. The manual shows how to compute derivatives.

Answered by Razor Storm

I'm pretty certain Java does not have a built-in library for calculus functionality. However, implementing differentiation yourself could range anywhere from trivial to quite challenging.

If you already have the ability to store and analyze functions, then getting derivatives is as simple as programming the (fairly small) set of differentiation rules.
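
For instance, if a function is stored as an array of polynomial coefficients (c[0] + c[1]*x + c[2]*x^2 + ...), the power rule becomes a one-pass loop. A minimal sketch; the representation and names are illustrative assumptions:

```java
import java.util.Arrays;

public class PolyDeriv {
    // d/dx of c[i]*x^i is i*c[i]*x^(i-1), so the derivative's
    // coefficient array is one entry shorter.
    static double[] differentiate(double[] c) {
        if (c.length <= 1) return new double[] {0.0}; // constant -> zero
        double[] d = new double[c.length - 1];
        for (int i = 1; i < c.length; i++) {
            d[i - 1] = i * c[i];
        }
        return d;
    }

    public static void main(String[] args) {
        // 3 + 2x + 5x^2  ->  2 + 10x
        double[] d = differentiate(new double[] {3, 2, 5});
        System.out.println(Arrays.toString(d)); // [2.0, 10.0]
    }
}
```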

However, if you are differentiating based on data sets (not abstract functions), then you can use various numerical approximation techniques, such as finite differences. (Simpson's rule, by contrast, approximates integrals rather than derivatives.)

Answered by JeffHeaton

Okay, if you are doing neural networks, you most likely will NOT need the general derivative of some arbitrary function, which is what a general calculus library would give you. Backprop requires the derivative of your activation function. Usually that activation function is the sigmoid or the hyperbolic tangent. You can look up the derivative of either on Wikipedia and simply provide that function to your neural network training; you do not need to actually solve for the derivative each time.
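
For example, both derivatives have well-known closed forms: s'(x) = s(x)(1 - s(x)) for the sigmoid and 1 - tanh(x)^2 for tanh. A minimal sketch (class and method names are illustrative):

```java
public class Activations {
    // Logistic sigmoid: s(x) = 1 / (1 + e^-x)
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // s'(x) = s(x) * (1 - s(x)); in practice this is usually evaluated
    // from the activation value already stored during the forward pass.
    static double sigmoidDerivative(double x) {
        double s = sigmoid(x);
        return s * (1.0 - s);
    }

    // tanh'(x) = 1 - tanh(x)^2
    static double tanhDerivative(double x) {
        double t = Math.tanh(x);
        return 1.0 - t * t;
    }

    public static void main(String[] args) {
        System.out.println(sigmoidDerivative(0.0)); // 0.25
        System.out.println(tanhDerivative(0.0));    // 1.0
    }
}
```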

There are other common activation functions, but only a handful are actually used in practice. Just look up the derivative of the one you want and use it. Most neural network frameworks build the usual activation functions and their derivatives into some sort of base class. Here are some of the most common ones:

https://web.archive.org/web/20101105231126/http://www.heatonresearch.com/online/programming-neural-networks-encog-java/chapter-3/page2.html
