javascript - What's the time complexity of array.splice() in Google Chrome?

Disclaimer: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/5175925/


What's the time complexity of array.splice() in Google Chrome?

Tags: javascript, google-chrome, big-o, time-complexity, v8

Asked by Ivan

If I remove one element from an array using splice() like so:


arr.splice(i, 1);

Will this be O(n) in the worst case, because it shifts all the elements after i? Or is it constant time, with some linked-list magic underneath?

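For context, splice(i, 1) removes the element at index i and every element after it moves down by one index; a quick illustration (the array contents here are just for the example):

const arr = ['a', 'b', 'c', 'd']
arr.splice(1, 1)             // remove one element at index 1 ('b')
console.log(arr)             // ['a', 'c', 'd'], everything after index 1 shifted down one slot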

Accepted answer by Andrew Marshall

Worst case should be O(n) (copying all n-1 elements to the new array).


A linked list would be O(1) for a single deletion.

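A minimal sketch of that linked-list point (the node shape and the removeAfter helper are hypothetical, not anything V8 actually uses): deleting a node only re-links one pointer, regardless of list length.

// Hypothetical singly linked list node shape: { value, next }.
function removeAfter(node) {
  // Re-link one pointer; no other elements move, so this is O(1).
  if (node !== null && node.next !== null) {
    node.next = node.next.next
  }
}

const list = { value: 1, next: { value: 2, next: { value: 3, next: null } } }
removeAfter(list)            // list is now 1 -> 3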

For those interested, I've made this lazily-crafted benchmark (please don't run it on Windows XP/Vista). As you can see from it, though, it looks fairly constant (i.e. O(1)), so who knows what they're doing behind the scenes to make this crazy-fast. Note that regardless, the actual splice is VERY fast.


Rerunning an extended benchmark directly in the V8 shell suggests O(n). Note, though, that you need huge array sizes before the runtime is likely to affect your code. This should be expected: if you look at the V8 code, it uses memmove to create the new array.

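A rough way to see that trend yourself (a sketch, not the answer's original benchmark; it assumes an environment where performance.now() is available, such as a browser or recent Node, and the sizes are arbitrary):

for (const size of [1e5, 1e6, 1e7]) {
  const arr = new Array(size).fill(0)
  const start = performance.now()
  arr.splice(0, 1)                     // worst case: remove the first element
  const elapsed = performance.now() - start
  console.log(`size ${size}: ${elapsed.toFixed(3)} ms`)
}

A single call is dominated by timer noise at the small end, so the growth only becomes visible for the large sizes.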

Answered by Artur Grigio

The Test

I took the advice in the comments and wrote a simple test that times splicing on a data set of 3,000 arrays, each containing 3,000 items. The test would simply splice the


  • first item in the first array
  • second item in the second array
  • third item in the third array
  • ...
  • 3,000th item in the 3,000th array

I pre-built the array to keep things simple.


The Findings:

The weirdest thing is that the number of splices that take longer than 1 ms grows linearly as you increase the size of the dataset.


I went as far as testing it for a dataset of 300,000 on my machine (but the SO snippet tends to crash after 3,000).


I also noticed that the number of splice()s that took longer than 1 ms for a given dataset (30,000 in my case) was random. So I ran the test 1,000 times and plotted the resulting counts, and they looked like a normal distribution, leading me to believe that the randomness was just caused by scheduler interrupts.


This goes against my hypothesis and @Ivan's guess that splice()ing from the beginning of an array would have O(n) time complexity.


Below is my test:

let data = []
const results = []
const dataSet = 3000

// Splice the i-th item out of the i-th array.
function spliceIt(i) {
  data[i].splice(i, 1)
}

// Time each individual splice (Date.now() has millisecond resolution).
function test() {
  for (let i = 0; i < dataSet; i++) {
    let start = Date.now()
    spliceIt(i)
    let end = Date.now()
    results.push(end - start)
  }
}

// Pre-build dataSet arrays, each filled with dataSet zeroes.
function setup() {
  data = (new Array(dataSet)).fill().map(() => new Array(dataSet).fill(0))
}

setup()
test()
// console.log("data before test", data)
// console.log("data after test", data)
// console.log("all results: ", results)
console.log("results that took more than 1ms: ", results.filter(r => r >= 1))