C#: Can I get more than 1000 records from a DirectorySearcher?

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/90652/

Can I get more than 1000 records from a DirectorySearcher?

Asked by naspinski

I just noticed that the return list for results is limited to 1000. I have more than 1000 groups in my domain (HUGE domain). How can I get more than 1000 records? Can I start at a later record? Can I cut it up into multiple searches?

Here is my query:

DirectoryEntry dirEnt = new DirectoryEntry("LDAP://dhuba1kwtn004");
string[] loadProps = new string[] { "cn", "samaccountname", "name", "distinguishedname" };
DirectorySearcher srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps);
var results = srch.FindAll();

I have tried setting srch.SizeLimit = 2000; but that doesn't seem to work. Any ideas?

Accepted answer by Joe

You need to set DirectorySearcher.PageSize to a non-zero value to get all results.

BTW, you should also dispose the DirectorySearcher when you're finished with it:

using(var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 1000; // a non-zero PageSize enables paged retrieval of all results
    var results = srch.FindAll(); // the returned SearchResultCollection should also be disposed (see below)
}

The API documentation isn't very clear, but essentially:

  • when you do a paged search, the SizeLimit is ignored, and all matching results are returned as you iterate through the results returned by FindAll. Results are retrieved from the server a page at a time. I chose the value of 1000 above, but you can use a smaller value if preferred. The tradeoff is that a small PageSize returns each page of results faster, but requires more frequent calls to the server when iterating over a large number of results (see the sketch after this list).

  • by default the search isn't paged (PageSize = 0). In this case, up to SizeLimit results are returned.

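For example, here is a minimal sketch of a paged search (not part of the original answer; it reuses the dirEnt and loadProps from the question, and the count variable and PageSize of 500 are only illustrative). It shows both points: paging returns every matching group regardless of SizeLimit, and a smaller page size simply means more round trips to the server:

using(var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 500; // smaller pages return the first results sooner, at the cost of more server calls
    using(SearchResultCollection results = srch.FindAll())
    {
        int count = 0;
        foreach (SearchResult result in results)
        {
            count++; // further pages are fetched from the server as the iteration proceeds
        }
        Console.WriteLine(count); // can exceed 1000, because SizeLimit is ignored for paged searches
    }
}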

As Biri pointed out, it's important to dispose the SearchResultCollection returned by FindAll, otherwise you may have a memory leak as described in the Remarks section of the MSDN documentation for DirectorySearcher.FindAll.

One way to help avoid this in .NET 2.0 or later is to write a wrapper method that automatically disposes the SearchResultCollection. This might look something like the following (or could be an extension method in .NET 3.5):

public IEnumerable<SearchResult> SafeFindAll(DirectorySearcher searcher)
{
    using(SearchResultCollection results = searcher.FindAll())
    {
        foreach (SearchResult result in results)
        {
            yield return result;        
        } 
    } // SearchResultCollection will be disposed here
}

You could then use this as follows:

using(var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 1000;
    var results = SafeFindAll(srch);
}
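
One caveat worth noting: because SafeFindAll is an iterator (it uses yield return), the underlying FindAll call is deferred until the results are actually enumerated, so enumerate them while the searcher is still in scope. A minimal sketch of that usage (the Console.WriteLine is only illustrative):

using(var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 1000;
    foreach (SearchResult result in SafeFindAll(srch))
    {
        Console.WriteLine(result.Path); // process each result inside the using block, before the searcher is disposed
    }
}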