Javascript ES6 集合的计算/时间复杂度
Disclaimer: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same CC BY-SA terms, retain the original URL and author information, and attribute it to the original authors (not me): Stack Overflow
Original URL: http://stackoverflow.com/questions/31091772/
Asked by Brian M. Hunt
What time complexity (in big-O notation) is provided by the ES6 specification for the Keyed Collections (Set, Map, WeakSet, and WeakMap)?
My expectation, and I expect that of most developers, is that the specifications and implementations would use widely accepted performant algorithms, in which case Set.prototype.has, add and delete would all be O(1) in the average case. The same for the Map and Weak– equivalents.
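As a quick sanity check on that expectation (a minimal sketch of my own, not part of the original question; the sizes and the Node.js timing harness are illustrative choices), per-operation time for Set.prototype.has should stay roughly flat as the collection grows if access really is O(1) on average:

const nsPerHas = size => {
  const set = new Set();
  // Build a Set with `size` integer elements.
  for (let i = 0; i < size; i++) set.add(i);
  const start = process.hrtime.bigint(); // Node.js high-resolution timer
  for (let i = 0; i < size; i++) set.has(i);
  return Number(process.hrtime.bigint() - start) / size; // ns per has()
};

for (const size of [1e4, 1e5, 1e6]) {
  console.log(`${size}: ~${nsPerHas(size).toFixed(0)} ns per has()`);
}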
It is not entirely apparent to me whether the time complexity of the implementations was mandated, e.g. in the ECMAScript 2015 Language Specification - 6th Edition — 23.2 Set Objects.
Unless I misunderstand it (and it's certainly very possible I do), it looks like the ECMA spec mandates that the implementations (e.g. Set.prototype.has) are to use a linear-time (O(n)) algorithm. It would strike me as exceedingly surprising that more performant algorithms would not be mandated or even permitted by the spec, and I would be very interested in an explanation for why this is the case.
Accepted answer by Bergi
Right from that very paragraph you linked to:
Set objects must be implemented using [mechanisms] that, on average, provide access times that are sublinear on the number of elements in the collection.
You will find the same sentence for Maps, WeakMaps and WeakSets.
It looks like the ECMA spec mandates that the implementations (e.g. Set.prototype.has) are to use a linear time (O(n)) algorithm.
No:
The data structures used in this Set objects specification is only intended to describe the required observable semantics of Set objects. It is not intended to be a viable implementation model.
The observable semantics are mostly related to the predictable iteration order (which can still be implemented efficiently and fast). It is indeed expected by the specification that an implementation uses a hash table or something similar with constant access, though trees (with logarithmic access complexity) are allowed as well.
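For instance (a minimal sketch of my own, not from the original answer), the predictable iteration order is easy to observe:

// Sets and Maps iterate in insertion order, whatever the key values are;
// this is the observable behavior the spec's list-based description pins down.
const set = new Set([30, 10, 20]);
console.log([...set]); // [30, 10, 20] (insertion order, not sorted)

const map = new Map([['b', 2], ['a', 1]]);
console.log([...map.keys()]); // ['b', 'a']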
Answered by domdambrogia
For anyone who is curious, I did a very quick benchmark:
const benchmarkMap = size => {
  console.time('benchmarkMap');
  var map = new Map();
  // Write `size` integer keys, then read each one back.
  for (var i = 0; i < size; i++) map.set(i, i);
  for (var i = 0; i < size; i++) var x = map.get(i);
  console.timeEnd('benchmarkMap');
}

const benchmarkObj = size => {
  console.time('benchmarkObj');
  var obj = {};
  // Same workload against a plain object, for comparison.
  for (var i = 0; i < size; i++) obj[i] = i;
  for (var i = 0; i < size; i++) var x = obj[i];
  console.timeEnd('benchmarkObj');
}

var size = 1000000;
benchmarkMap(size);
benchmarkObj(size);
I ran this a few times, which yielded the following results:
(2017 MacBook Pro, 2.5 GHz i7 w/ 16G RAM)
benchmarkMap: 189.120ms
benchmarkObj: 44.214ms
benchmarkMap: 200.817ms
benchmarkObj: 38.963ms
benchmarkMap: 187.968ms
benchmarkObj: 41.633ms
benchmarkMap: 186.533ms
benchmarkObj: 35.850ms
benchmarkMap: 187.339ms
benchmarkObj: 44.515ms
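Reading those numbers back-of-the-envelope (my own arithmetic, not part of the original answer): each run performs 1,000,000 writes plus 1,000,000 reads, so roughly 190 ms for the Map works out to about 95 ns per operation, and roughly 40 ms for the object to about 20 ns per operation. Both are consistent with average-case constant-time access at this size; the gap is plausibly a constant-factor difference (engines tend to special-case small integer keys on plain objects), not a difference in complexity.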