Linear Regression in Javascript
Disclaimer: this page reproduces a popular StackOverflow question and its answers under the CC BY-SA 4.0 license. If you reuse it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/6195335/
Asked by Chris W.
I want to do Least Squares Fitting in Javascript in a web browser.
Currently users enter data point information using HTML text inputs and then I grab that data with jQuery and graph it with Flot.
After the user has entered their data points, I would like to present them with a "line of best fit". I imagine I would calculate the linear, polynomial, exponential and logarithmic equations and then choose the one with the highest R^2 value.
I can't seem to find any libraries that will help me do this, though. I stumbled upon jStat, but it is completely missing documentation (as far as I can find) and, after digging through the source code, it doesn't seem to have any linear regression functionality built in. I'm basing this purely on function names, however.
Does anyone know any Javascript libraries that offer simple regression analysis?
The hope would be that I could use the library like so...
If I had some set of scatter points in an array, var points = [[3,4],[15,45],...[23,78]], I would be able to hand that to some function like lin_reg(points) and it would return something like [7.12,3] if the linear equation was y = 7.12x + 3.
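For reference, the ordinary least-squares fit the question describes can be sketched in a few lines of plain JavaScript. The function name lin_reg and the [slope, intercept] return shape simply follow the hypothetical API above; this is an illustration, not a library:

```javascript
// Least-squares fit over an array of [x, y] points.
// Returns [slope, intercept] for y = slope * x + intercept.
function lin_reg(points) {
    var n = points.length;
    var sx = 0, sy = 0, sxy = 0, sxx = 0;
    for (var i = 0; i < n; i++) {
        sx += points[i][0];
        sy += points[i][1];
        sxy += points[i][0] * points[i][1];
        sxx += points[i][0] * points[i][0];
    }
    var slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    var intercept = (sy - slope * sx) / n;
    return [slope, intercept];
}

// Points lying exactly on y = 7.12x + 3:
lin_reg([[0, 3], [1, 10.12], [2, 17.24]]); // → approximately [7.12, 3]
```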
Accepted answer by Milimetric
What kind of linear regression? For something simple like least squares, I'd just program it myself:
http://mathworld.wolfram.com/LeastSquaresFitting.html
The math is not too hard to follow there. Give it a shot for an hour or so and let me know if it's too hard; if so, I can try it.
EDIT:
Found someone that did it:
http://dracoblue.net/dev/linear-least-squares-in-javascript/159/
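For readers who don't want to follow the links, the closed-form solution being referred to is the standard textbook least-squares result for fitting y = ax + b to n points (not specific to either linked implementation):

```latex
a = \frac{n\sum_i x_i y_i - \sum_i x_i \sum_i y_i}{n\sum_i x_i^2 - \left(\sum_i x_i\right)^2},
\qquad
b = \frac{\sum_i y_i - a \sum_i x_i}{n}
```

Every code snippet on this page is some variation of these two formulas.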
Answered by o_c
The simplest solution I found for the question at hand can be found in the following post: http://trentrichardson.com/2010/04/06/compute-linear-regressions-in-javascript/
Note that, in addition to the linear equation, it also returns the R^2 score, which can be useful.
** EDIT **
Here is the actual code snippet:
function linearRegression(y, x) {
    var lr = {};
    var n = y.length;
    var sum_x = 0;
    var sum_y = 0;
    var sum_xy = 0;
    var sum_xx = 0;
    var sum_yy = 0;

    for (var i = 0; i < y.length; i++) {
        sum_x += x[i];
        sum_y += y[i];
        sum_xy += (x[i] * y[i]);
        sum_xx += (x[i] * x[i]);
        sum_yy += (y[i] * y[i]);
    }

    lr['slope'] = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x * sum_x);
    lr['intercept'] = (sum_y - lr.slope * sum_x) / n;
    lr['r2'] = Math.pow((n * sum_xy - sum_x * sum_y) / Math.sqrt((n * sum_xx - sum_x * sum_x) * (n * sum_yy - sum_y * sum_y)), 2);

    return lr;
}
To use this you just need to pass it two arrays, known_y's and known_x's, so this is what you might pass:
var known_y = [1, 2, 3, 4];
var known_x = [5.2, 5.7, 5.0, 4.2];
var lr = linearRegression(known_y, known_x);
// now you have:
// lr.slope
// lr.intercept
// lr.r2
Answered by JZL003
I found this great JavaScript library.
It's very simple, and seems to work perfectly.
I also can't recommend Math.JS enough.
Answered by Richard Finney
Check out https://web.archive.org/web/20150523035452/https://cgwb.nci.nih.gov/cgwbreg.html (a JavaScript regression calculator): pure JavaScript, no CGI calls to a server. The data and processing remain on your computer. It provides complete R-style results, R code to check the work, and a visualization of the results.
See the source code for the embedded JavaScript implementations of OLS and statistics associated with the results.
The code is my effort to port the GSL library functions to JavaScript.
The code is released under the GPL because it is basically a line-for-line port of GPL-licensed GNU Scientific Library (GSL) code.
EDIT: Paul Lutus also provides some GPL code for regression at: http://arachnoid.com/polysolve/index.html
Answered by didinko
Simple linear regression with measures of variation (total sum of squares = regression sum of squares + error sum of squares), the standard error of estimate SEE (residual standard error), the coefficient of determination R2, and the correlation coefficient R.
const regress = (x, y) => {
    const n = y.length;
    let sx = 0;
    let sy = 0;
    let sxy = 0;
    let sxx = 0;
    let syy = 0;

    for (let i = 0; i < n; i++) {
        sx += x[i];
        sy += y[i];
        sxy += x[i] * y[i];
        sxx += x[i] * x[i];
        syy += y[i] * y[i];
    }

    const mx = sx / n;
    const my = sy / n;
    const yy = n * syy - sy * sy;
    const xx = n * sxx - sx * sx;
    const xy = n * sxy - sx * sy;
    const slope = xy / xx;
    const intercept = my - slope * mx;
    const r = xy / Math.sqrt(xx * yy);
    const r2 = Math.pow(r, 2);

    let sst = 0;
    for (let i = 0; i < n; i++) {
        sst += Math.pow(y[i] - my, 2);
    }
    const sse = sst - r2 * sst;
    const see = Math.sqrt(sse / (n - 2));
    const ssr = sst - sse;

    return { slope, intercept, r, r2, sse, ssr, sst, sy, sx, see };
}
regress([1, 2, 3, 4, 5], [1, 2, 3, 4, 3]);
// → slope ≈ 0.6, intercept ≈ 0.8, r2 ≈ 0.692, see ≈ 0.730
Answered by Nic Mabon
Here is a snippet that will take an array of triplets (x, y, r), where r is the weight of the (x, y) data point, and return [a, b] such that Y = a*X + b approximates the data.
// return (a, b) that minimize
// sum_i r_i * (a*x_i + b - y_i)^2
function linear_regression(xyr) {
    var i, x, y, r,
        sumx = 0, sumy = 0, sumx2 = 0, sumy2 = 0, sumxy = 0, sumr = 0,
        a, b;

    for (i = 0; i < xyr.length; i++) {
        // this is our data pair
        x = xyr[i][0];
        y = xyr[i][1];

        // this is the weight for that pair
        // set it to 1 (and simplify the code accordingly, i.e. sumr becomes xyr.length)
        // if weighting is not needed
        r = xyr[i][2];

        // consider checking for NaN in the x, y and r variables here
        // (add a continue statement in that case)

        sumr += r;
        sumx += r * x;
        sumx2 += r * (x * x);
        sumy += r * y;
        sumy2 += r * (y * y);
        sumxy += r * (x * y);
    }

    // note: the denominator is the (weighted) variance of the random variable X
    // the only case when it is 0 is the degenerate case X == constant
    b = (sumy * sumx2 - sumx * sumxy) / (sumr * sumx2 - sumx * sumx);
    a = (sumr * sumxy - sumx * sumy) / (sumr * sumx2 - sumx * sumx);

    return [a, b];
}
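As a quick sanity check (the function is repeated in compact form here so the snippet runs on its own): with all weights set to 1, the weighted fit reduces to ordinary least squares, so points lying exactly on y = 2x + 1 should recover [2, 1].

```javascript
// Compact restatement of the weighted least-squares fit above.
function linear_regression(xyr) {
    var sumx = 0, sumy = 0, sumx2 = 0, sumxy = 0, sumr = 0;
    for (var i = 0; i < xyr.length; i++) {
        var x = xyr[i][0], y = xyr[i][1], r = xyr[i][2];
        sumr += r;
        sumx += r * x;
        sumx2 += r * x * x;
        sumy += r * y;
        sumxy += r * x * y;
    }
    var a = (sumr * sumxy - sumx * sumy) / (sumr * sumx2 - sumx * sumx);
    var b = (sumy * sumx2 - sumx * sumxy) / (sumr * sumx2 - sumx * sumx);
    return [a, b];
}

// Unit weights: plain least squares on points from y = 2x + 1.
linear_regression([[0, 1, 1], [1, 3, 1], [2, 5, 1]]); // → [2, 1]
```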
Answered by Timmmm
Somewhat based on Nic Mabon's answer.
function linearRegression(x, y)
{
    var xs = 0;  // sum(x)
    var ys = 0;  // sum(y)
    var xxs = 0; // sum(x*x)
    var xys = 0; // sum(x*y)
    var yys = 0; // sum(y*y)

    var n = 0;
    for (; n < x.length && n < y.length; n++)
    {
        xs += x[n];
        ys += y[n];
        xxs += x[n] * x[n];
        xys += x[n] * y[n];
        yys += y[n] * y[n];
    }

    var div = n * xxs - xs * xs;
    var gain = (n * xys - xs * ys) / div;
    var offset = (ys * xxs - xs * xys) / div;
    var correlation = Math.abs((xys * n - xs * ys) / Math.sqrt((xxs * n - xs * xs) * (yys * n - ys * ys)));

    return { gain: gain, offset: offset, correlation: correlation };
}
Then y' = x * gain + offset.
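To tie any of these fits back to the original question's Flot setup: a straight line only needs its two endpoints, so a slope/intercept pair can be turned into a plottable series of [x, y] points. This helper is a hypothetical illustration, not part of any answer above:

```javascript
// Build a two-point series for y = slope * x + intercept over [xMin, xMax],
// in the [[x, y], [x, y]] format that plotting libraries such as Flot accept.
function fitLineSeries(slope, intercept, xMin, xMax) {
    return [
        [xMin, slope * xMin + intercept],
        [xMax, slope * xMax + intercept]
    ];
}

fitLineSeries(2, 1, 0, 10); // → [[0, 1], [10, 21]]
```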