iOS - Calculating distance, azimuth, elevation and relative position (Augmented Reality)

Note: this content comes from StackOverflow and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/14070931/

iphone objective-c ios xcode augmented-reality

Asked by MCKapur

I am starting to build an augmented reality app where you can place an image on your augmented reality camera view and it stays anchored to that position on the Earth, so that someone else can come by later and see it in their own camera view. For this I know I need to calculate some sort of distance factor along with azimuth and elevation.

So, I have already figured out how to send the object's graphics up to a server and retrieve them, but how can I place the object back in its original position relative to the Earth? I know I need to calculate its:

  • Altitude
  • Coordinates
  • Azimuth
  • Elevation
  • Distance

But how would I calculate these values, account for them, and piece them together? I hope you understand what I mean.

To refine your understanding, let me give you a short demo of the app:

A man is in his house and decides to place an image of a painting on one of his walls. He opens the app, which defaults to the augmented reality screen, presses the plus button, and adds an image from his photo library. Behind the scenes, the app saves the image's location and positional data to a server. Later, someone else with the app comes by; the app queries the server for images saved nearby, downloads the image, and places it on the wall so the second man can see it with his phone as he moves it around.

What approach should I take to achieve this? Any outline, links, resources, tutorials, thoughts, or experience would be appreciated. Thanks! This was a hard question to put into words; I hope you can understand it. If not, please tell me and I will reword it.

Rohan

Answered by Ricardo RendonCepeda

I'm working on two AR iOS apps which do the following: convert azimuth (compass, horizontal angle) and elevation (gyroscope, vertical angle) to a position in 3D space (e.g. spherical to cartesian).

The frameworks you need are:

  • CoreLocation
  • CoreMotion

Getting the geolocation (coordinates) is pretty straightforward for latitude, longitude, and altitude. You can easily find this information in several online sources, but this is the main callback you need from the CLLocationManagerDelegate after you call startUpdatingLocation:

- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
    // The most recent fix is the last object in the locations array
    CLLocation *location = [locations lastObject];
    latitude  = (float) location.coordinate.latitude;
    longitude = (float) location.coordinate.longitude;
    altitude  = (float) location.altitude;   // meters above sea level
}

Getting the azimuth angle is also pretty straightforward, using the same delegate as for the location updates, after calling startUpdatingHeading:

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    // Heading relative to magnetic north, in degrees (0 to 360)
    azimuth = (float) newHeading.magneticHeading;
}
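
Both callbacks assume a configured manager. Here is a minimal setup sketch (assuming self adopts CLLocationManagerDelegate and keeps the manager in a strong locationManager property; the requestWhenInUseAuthorization call only exists on iOS 8 and later):

#import <CoreLocation/CoreLocation.h>

- (void)startTracking
{
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    [self.locationManager requestWhenInUseAuthorization]; // iOS 8+ only
    [self.locationManager startUpdatingLocation];
    [self.locationManager startUpdatingHeading];
}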

Elevation is extracted from the gyroscope, which doesn't have a delegate but is also easy to set up. The call looks something like this (note: this works for my app running in landscape mode; check yours):

elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);
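
That line only returns live data if device motion updates are already running. A minimal sketch of the setup, assuming a CMMotionManager stored in a motionManager property matching the call above:

#import <CoreMotion/CoreMotion.h>

self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
[self.motionManager startDeviceMotionUpdates];

// Then poll the latest attitude sample, e.g. once per rendered frame:
elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);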

Finally, you can convert your orientation coordinates into a 3D point like so:

- (GLKVector3)sphericalToCartesian:(float)radius azimuth:(float)theta elevation:(float)phi
{
    // Convert Coordinates: Spherical to Cartesian
    // Spherical: Radial Distance (r), Azimuth (θ), Elevation (φ)
    // Cartesian: x, y, z

    float x = radius * sinf(phi) * sinf(theta);
    float y = radius * cosf(phi);
    float z = radius * sinf(phi) * cosf(theta);
    return GLKVector3Make(x, y, z);
}
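
One caveat when wiring these pieces together: sinf/cosf expect radians, but CLHeading reports magneticHeading in degrees, whereas attitude.roll is already in radians. A usage sketch (the 10-meter radius is just an illustrative value; GLKMathDegreesToRadians and GLKVector3 come from GLKit):

#import <GLKit/GLKit.h>

float theta = GLKMathDegreesToRadians(azimuth); // heading arrives in degrees
float phi   = elevation;                        // attitude is already in radians
GLKVector3 point = [self sphericalToCartesian:10.0f azimuth:theta elevation:phi];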

For this last part, be very wary of angle and axis naming conventions, as they vary wildly from source to source. In my system, θ is the angle on the horizontal plane, φ is the angle on the vertical plane, x is left-right, y is down-up, and z is back-front.

As for distance, I'm not sure you really need to use it, but if you do, then just substitute it for "radius".
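
If you do want a real-world radius, CoreLocation can compute the distance in meters between the viewer and the stored image position. A sketch, where imageLat and imageLon are hypothetical values fetched from your server:

CLLocation *viewer = [[CLLocation alloc] initWithLatitude:latitude longitude:longitude];
CLLocation *image  = [[CLLocation alloc] initWithLatitude:imageLat longitude:imageLon];
float radius = (float) [viewer distanceFromLocation:image]; // meters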

Hope that helps

Answered by Sébastien REMY

Swift 3

Gyroscope code update:

import CoreMotion
...
        // Assumes a CMMotionManager stored in a property so it isn't deallocated
        motionManager.deviceMotionUpdateInterval = 0.1
        motionManager.startDeviceMotionUpdates(to: OperationQueue.current!) { deviceMotion, error in
            guard let dm = deviceMotion else { return }
            // Attitude as Euler angles, in radians
            let roll = dm.attitude.roll
            let pitch = dm.attitude.pitch
            let yaw = dm.attitude.yaw

            print("r: \(roll), p: \(pitch), y: \(yaw)")
        }