Xcode: How does one compare one image to another to see if they are similar by a certain percentage, on the iPhone?

Disclaimer: this page is a translation of a popular StackOverflow question and answer thread, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/6488732/


How does one compare one image to another to see if they are similar by a certain percentage, on the iPhone?

iphone, xcode, image, uiimage, compare

Asked by SolidSnake4444

I basically want to take two images taken from the camera on the iPhone or iPad 2 and compare them to each other to see if they are pretty much the same. Obviously due to light etc the image will never be EXACTLY the same so I would like to check for around 90% compatibility.


All the other questions like this that I saw on here were either not for iOS or were for locating objects in images. I just want to see if two images are similar.


Thank you.


Accepted answer by aroth

As a quick, simple algorithm, I'd suggest iterating through about 1% of the pixels in each image and either comparing them directly against each other or keeping a running average and then comparing the two average color values at the end.


You can look at this answer for an idea of how to determine the color of a pixel at a given position in an image. You may want to optimize it somewhat to better suit your use-case (repeatedly querying the same image), but it should provide a good starting point.

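One way to do that optimization (this is a sketch added here, not the code from the linked answer; the function names CreateRGBABuffer and GetRGBAt are illustrative) is to decode each image into an RGBA byte buffer once and then read pixels from that buffer as often as needed:

#import <UIKit/UIKit.h>

// Decode the image into a plain RGBA byte buffer (4 bytes per pixel).
// The caller is responsible for calling free() on the returned buffer.
static unsigned char *CreateRGBABuffer(UIImage *image, size_t *outWidth, size_t *outHeight)
{
    CGImageRef cgImage = image.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    unsigned char *buffer = calloc(width * height * 4, sizeof(unsigned char));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(buffer, width, height, 8, width * 4,
                                                 colorSpace, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);

    if (outWidth)  *outWidth  = width;
    if (outHeight) *outHeight = height;
    return buffer;
}

// Read the R, G and B components (0-255) of the pixel at (x, y).
static void GetRGBAt(const unsigned char *buffer, size_t width, int x, int y, int rgb[3])
{
    const unsigned char *p = buffer + ((size_t)y * width + (size_t)x) * 4;
    rgb[0] = p[0]; // red
    rgb[1] = p[1]; // green
    rgb[2] = p[2]; // blue
}

In the snippet below, the hypothetical getRGBForX:andY: method plays exactly this role.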

Then you can use an algorithm roughly like:


float numDifferences = 0.0f;
float totalCompares = width * height / 100.0f;
for (int yCoord = 0; yCoord < height; yCoord += 10) {
    for (int xCoord = 0; xCoord < width; xCoord += 10) {
        // getRGBForX:andY: stands in for the pixel-colour helper from the answer linked above
        int *img1RGB = [image1 getRGBForX:xCoord andY:yCoord];
        int *img2RGB = [image2 getRGBForX:xCoord andY:yCoord];
        if (abs(img1RGB[0] - img2RGB[0]) > 25 || abs(img1RGB[1] - img2RGB[1]) > 25 || abs(img1RGB[2] - img2RGB[2]) > 25) {
            //one or more pixel components differs by 10% or more
            numDifferences++;
        }
    }
}

if (numDifferences / totalCompares <= 0.1f) {
    //images are at least 90% identical 90% of the time
}
else {
    //images are less than 90% identical 90% of the time
}
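
The "running average" alternative mentioned at the top of this answer could look roughly like the sketch below. It samples the same 1% of pixels, but instead of counting differing samples it accumulates an average colour per image and compares the two averages at the end (it still assumes the getRGBForX:andY: helper):

float sum1[3] = {0.0f, 0.0f, 0.0f};
float sum2[3] = {0.0f, 0.0f, 0.0f};
float samples = 0.0f;

for (int yCoord = 0; yCoord < height; yCoord += 10) {
    for (int xCoord = 0; xCoord < width; xCoord += 10) {
        int *img1RGB = [image1 getRGBForX:xCoord andY:yCoord];
        int *img2RGB = [image2 getRGBForX:xCoord andY:yCoord];
        for (int c = 0; c < 3; c++) {
            sum1[c] += img1RGB[c];
            sum2[c] += img2RGB[c];
        }
        samples++;
    }
}

// Compare the two average colours; ~25 out of 255 is roughly a 10% tolerance per channel.
bool averagesMatch = fabsf(sum1[0] - sum2[0]) / samples <= 25.0f &&
                     fabsf(sum1[1] - sum2[1]) / samples <= 25.0f &&
                     fabsf(sum1[2] - sum2[2]) / samples <= 25.0f;

Note that this variant is cheaper but much weaker: it only detects global colour shifts, so two structurally different images with a similar overall colour will still pass. The per-sample comparison above is usually the better choice.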

Answer by cprcrack

Based on aroth's idea, this is my full implementation. It checks if some random pixels are the same. For what I needed it works flawlessly.


- (bool)isTheImage:(UIImage *)image1 apparentlyEqualToImage:(UIImage *)image2 accordingToRandomPixelsPer1:(float)pixelsPer1
{
    if (!CGSizeEqualToSize(image1.size, image2.size))
    {
        return false;
    }

    int pixelsWidth = CGImageGetWidth(image1.CGImage);
    int pixelsHeight = CGImageGetHeight(image1.CGImage);

    int pixelsToCompare = pixelsWidth * pixelsHeight * pixelsPer1;

    // create the colour space once and release it, instead of leaking one per context
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    uint32_t pixel1;
    CGContextRef context1 = CGBitmapContextCreate(&pixel1, 1, 1, 8, 4, colorSpace, kCGImageAlphaNoneSkipFirst);
    uint32_t pixel2;
    CGContextRef context2 = CGBitmapContextCreate(&pixel2, 1, 1, 8, 4, colorSpace, kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);

    bool isEqual = true;

    for (int i = 0; i < pixelsToCompare; i++)
    {
        int pixelX = arc4random() % pixelsWidth;
        int pixelY = arc4random() % pixelsHeight;

        CGContextDrawImage(context1, CGRectMake(-pixelX, -pixelY, pixelsWidth, pixelsHeight), image1.CGImage);
        CGContextDrawImage(context2, CGRectMake(-pixelX, -pixelY, pixelsWidth, pixelsHeight), image2.CGImage);

        if (pixel1 != pixel2)
        {
            isEqual = false;
            break;
        }
    }
    CGContextRelease(context1);
    CGContextRelease(context2);

    return isEqual;
}

Usage:


[self isTheImage:image1 apparentlyEqualToImage:image2
    accordingToRandomPixelsPer1:0.001]; // Use a value between 0.0001 and 0.005

According to my performance tests, 0.005 (0.5% of the pixels) is the maximum value you should use. If you need more precision, just compare the whole images using this. 0.001 seems to be a safe and well-performing value. For large images (like between 0.5 and 2 megapixels, i.e. millions of pixels), I'm using 0.0001 (0.01%) and it works great and incredibly fast; it never makes a mistake.

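If you want to pick the sampling ratio automatically from the values suggested above, a hypothetical convenience wrapper (the method name and the 500,000-pixel threshold are purely illustrative) could look like this:

- (bool)isImage:(UIImage *)image1 roughlyEqualToImage:(UIImage *)image2
{
    size_t pixelCount = CGImageGetWidth(image1.CGImage) * CGImageGetHeight(image1.CGImage);
    // 0.0001 for large images, 0.001 as the general-purpose default suggested above
    float pixelsPer1 = (pixelCount > 500000) ? 0.0001f : 0.001f;
    return [self isTheImage:image1 apparentlyEqualToImage:image2 accordingToRandomPixelsPer1:pixelsPer1];
}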

But of course the mistake ratio will depend on the type of images you are using. I'm using UIWebView screenshots and 0.0001 performs well, but you can probably use a much smaller value if you are comparing real photographs (in fact, even comparing just one random pixel may be enough). If you are dealing with very similar computer-designed images, you definitely need more precision.


Note: I'm always comparing ARGB images without taking into account the alpha channel. Maybe you'll need to adapt it if that's not exactly your case.

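If the alpha channel does matter in your case, one possible adaptation (a sketch added here, not part of the original answer) is to render into premultiplied-ARGB contexts instead, so that the alpha byte takes part in the 32-bit pixel comparison:

// Replace the two CGBitmapContextCreate calls above with contexts that keep alpha.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context1 = CGBitmapContextCreate(&pixel1, 1, 1, 8, 4, colorSpace,
                                              kCGImageAlphaPremultipliedFirst);
CGContextRef context2 = CGBitmapContextCreate(&pixel2, 1, 1, 8, 4, colorSpace,
                                              kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpace);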