Java OpenCV FeatureDetector
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/14940111/
OpenCV FeatureDetector
Asked by user2083842
I am trying to write code that applies SURF object detection, so I took one of the OpenCV samples (sample 3) and started updating the onCameraViewStarted() and onCameraFrame() methods, but I keep getting a runtime error when I try it on my Galaxy S3 phone, and I couldn't find anything to help with my problem. Here is my code and what I updated:
public class Sample3Native extends Activity implements CvCameraViewListener {
    private static final String TAG = "OCVSample::Activity";

    private Mat mRgba;
    private Mat mGrayMat;
    private CameraBridgeViewBase mOpenCvCameraView;
    Mat descriptors;
    List<Mat> descriptorsList;
    FeatureDetector featureDetector;
    MatOfKeyPoint keyPoints;
    DescriptorExtractor descriptorExtractor;
    DescriptorMatcher descriptorMatcher;

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");
                    // Load native library after(!) OpenCV initialization
                    System.loadLibrary("native_sample");
                    mOpenCvCameraView.enableView();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    public void onCameraViewStarted(int width, int height) {
        mRgba = new Mat(height, width, CvType.CV_8UC4);
        mGrayMat = new Mat(height, width, CvType.CV_8UC1);
        featureDetector = FeatureDetector.create(4);          // SURF = 4
        descriptorExtractor = DescriptorExtractor.create(2);  // SURF = 2
        descriptorMatcher = DescriptorMatcher.create(6);      // BRUTEFORCE_SL2 = 6
    }

    public Mat onCameraFrame(Mat inputFrame) {
        inputFrame.copyTo(mRgba);
        //detect_1(0, mRgba.getNativeObjAddr(), keyPoints.getNativeObjAddr());
        // Now mRgba contains the current frame (start manipulation part)
        // detecting keypoints
        featureDetector.detect(mRgba, keyPoints);
        // draw keypoints
        // Features2d.drawKeypoints(mRgba, keyPoints, mRgba);
        // finding descriptors
        descriptorExtractor.compute(mRgba, keyPoints, descriptors);
        // Matcher between 2 images or set of images
        // Note: training set and query set are handled here! (in matcher)
        //descriptorsList = descriptorMatcher.getTrainDescriptors();
        //descriptorsList.add(descriptors);
        //descriptorMatcher.add(descriptorsList);
        //Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);
        //FindFeatures(mGrayMat.getNativeObjAddr(), mRgba.getNativeObjAddr());
        return mRgba;
    }
}
Note: I have tried commenting out everything except featureDetector.detect(mRgba, keyPoints) in the onCameraFrame() method, and it still gave a runtime error on my phone.
Answered by cid
If I'm not mistaken, the OpenCV SURF feature detector only works with grayscale images. So try adding this after your call to copyTo in the onCameraFrame() method:
Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);
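For reference, a minimal sketch (not a verbatim fix) of how onCameraFrame() could look with that conversion applied, assuming keyPoints and descriptors have been allocated beforehand (e.g. with new MatOfKeyPoint() and new Mat() in onCameraViewStarted(); the question's code never initializes them):

public Mat onCameraFrame(Mat inputFrame) {
    inputFrame.copyTo(mRgba);
    // Convert the RGBA camera frame to grayscale before running SURF.
    Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);
    // Detect keypoints and compute descriptors on the grayscale Mat.
    featureDetector.detect(mGrayMat, keyPoints);
    descriptorExtractor.compute(mGrayMat, keyPoints, descriptors);
    return mRgba;
}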
Answered by Robert Wang
Are you sure that you are using SIFT in the correct way? As far as I know, SIFT and SURF are not included in the distribution package of OpenCV for Android. To use them, you need to compile the nonfree module and use it in your project. So what you need to do is create an NDK project and compile the nonfree module as a standalone library, then use this library to compile your program. After that you should be able to build your application. You can refer to this tutorial.
Once you have the JNI library, you can easily wrap it in a Java JNI interface. Then you should be able to use that Java interface in your Android application.
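As an illustration only (the library and method names below are hypothetical, not part of OpenCV's shipped Android API), the Java side of such a wrapper might look roughly like this, following the same pattern as the FindFeatures() native call in the question's sample:

public class NonfreeWrapper {
    static {
        // Hypothetical name of the .so built from the nonfree module in the NDK project.
        System.loadLibrary("nonfree_wrapper");
    }

    // Detects SURF keypoints in the Mat at matAddr and stores them in the
    // MatOfKeyPoint at keyPointsAddr; the body would live in C++ and use the nonfree module.
    public static native void detectSurf(long matAddr, long keyPointsAddr);
}

It would then be called with native object addresses, e.g. NonfreeWrapper.detectSurf(mGrayMat.getNativeObjAddr(), keyPoints.getNativeObjAddr());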
Answered by Robert Wang
To comment on cid and HMK's answer (sorry, I don't have 50 reputation for "add comment", so I have to create a new answer).
The OpenCV library can accept a color image as input. The following is my SIFT detection and description extraction code, and it works pretty well. This means you don't need to convert the image into grayscale format, even though the SIFT algorithm itself only works on grayscale images; I believe the OpenCV detector does some preprocessing. (Since the SURF and SIFT detectors work in a similar way, I assume SURF does not require grayscale input either.)
Mat image;
image = imread(argv[1], CV_LOAD_IMAGE_COLOR);
if (!image.data)
{
    cout << "Could not open or find the image" << std::endl;
    return -1;
}

vector<KeyPoint> keypoints;
Mat descriptors;

// Create a SIFT keypoint detector.
SiftFeatureDetector detector;
detector.detect(image, keypoints);
cout << "Detected " << (int) keypoints.size() << " keypoints" << endl;

// Compute feature description.
detector.compute(image, keypoints, descriptors);
cout << "Computed feature." << endl;
Answered by HMK
SURF and SIFT only support grayscale, so you have to convert the image to grayscale first with the code below: cvtColor( mRgba, mRgba, CV_BGR2GRAY );
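In the question's Java/Android code that call would go through the Imgproc class, and the camera frame in the sample is RGBA rather than BGR (as its own commented-out cvtColor line shows), so a closer sketch, assuming the mRgba and mGrayMat fields from the question, would be:

// Camera frames from the sample arrive as RGBA, so convert RGBA -> grayscale.
Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);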