The app's Info.plist must contain an NSMicrophoneUsageDescription key with a string value explaining to the user how the app uses this data

Notice: this page is a translation mirror of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same CC BY-SA license and attribute it to the original authors (not me) on StackOverflow.

Original question: http://stackoverflow.com/questions/39589998/
Asked by Anton Tropashko
Got a build rejection:

"The app's Info.plist must contain an NSMicrophoneUsageDescription key with a string value explaining to the user how the app uses this data."
The app does not use microphone. Or so I think.
How do I track down where mic is used?
UPD 23.11.2016: given that the lazy answer is being upvoted, I've filed a new feature request with Apple to close this security hole.
UPD 05.04.2017: it is still bothersome that once you proxy microphone access into some third-party framework via a half-baked NSMicrophoneUsageDescription, you have zero control over where and when the mic can be used if the user agrees to allow access. Folks, please do due diligence and craft a precise NSMicrophoneUsageDescription that reflects the fact that the microphone is used by code completely outside your control when the usage is obscured by a binary-only third-party framework. Thanks.
Accepted answer by iYoung
Just add the NSMicrophoneUsageDescription key and, as its value, a justification for why your app uses the microphone. This is a requirement introduced in iOS 10.
Answered by Paul Lehn
For the lazy:

If you want to quickly add usage descriptions for most media access (on-device photos, camera, video recording, location):

Right-click your Info.plist file -> Open As -> Source Code,

then paste the following between the existing values:
<key>NSMicrophoneUsageDescription</key>
<string>Need microphone access for uploading videos</string>
<key>NSCameraUsageDescription</key>
<string>Need camera access for uploading images</string>
<key>NSLocationUsageDescription</key>
<string>Need location access for updating nearby friends</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>This app will use your location to show cool stuffs near you.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Need photo library access for uploading images</string>
These descriptions, of course, are up to you; I tried to make them as generic as possible.

Hope this saves someone's time!
Answered by Anton Tropashko
And the culprit was (drums): the Instabug framework. They tell you right there on their marketing pages that they allow users to take audio notes during feedback composition. So I've added an NSMicrophoneUsageDescription to the app's plist explaining that.
Note that there is a lot of Apple API that Instabug uses:

Undefined symbols for architecture arm64: (I've removed some that seem legitimate given what the framework claims to do, and left what I see no claims for in the marketing material)
"_AVMakeRectWithAspectRatioInsideRect", referenced from: +[IBGIAMImageAttachmentView sizeForContent:forWidth:] in InstabugHost_lto.o
"_OBJC_CLASS_$_CTTelephonyNetworkInfo", referenced from: objc-class-ref in InstabugHost_lto.o
"_AVNumberOfChannelsKey", referenced from: -[IBGVoiceNoteManager startRecording] in InstabugHost_lto.o
"_CTRadioAccessTechnologyHSDPA", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyGPRS", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyWCDMA", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyEdge", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMA1x", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMAEVDORevA", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMAEVDORevB", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyLTE", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_OBJC_CLASS_$_AVURLAsset", referenced from: _OBJC_CLASS_$_IBGAsset in InstabugHost_lto.o
"_OBJC_METACLASS_$_AVURLAsset", referenced from: _OBJC_METACLASS_$_IBGAsset in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMAEVDORev0", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyHSUPA", referenced from: +[IBGInspector getCarrier] in InstabugHost_lto.o
ld: symbol(s) not found for architecture arm64
So in this post-Snowden world I have to wonder why it needs CoreTelephony, for example.
So what I'm getting at is: if you don't have the source of a third-party framework, you have to disclose to the user that your app itself is NOT using the microphone or camera, so that the user has the option of denying access to that device.
You don't want to be in the news someday due to some security flaw exploited via YOUR app.
Unresolved: a carefully crafted microphone usage description still doesn't completely solve the security issue in case your app DOES use the microphone and a third-party framework needs (or thinks it needs) it too.
Here's where a credits disclosure could come in handy, giving users an idea of which third-party code you are relying on. Give credit where it's due :^)
If you are lazy such as myself and never read through the iOS security whitepaper, here's a short talk: https://developer.apple.com/videos/play/wwdc2016/705/
In case you have no desire to watch the video in its entirety: around the 19:00 mark the speaker tells you explicitly that you must not be lazy with those descriptions.
Answered by Sharath Kumar
iOS apps require the user to grant permission before accessing the microphone; trying to access it without permission will crash the app.

To request permission, we just need to add the NSMicrophoneUsageDescription key to the Info.plist file and provide a value for it. The value can be any string stating why the application needs to access the microphone.
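With the plist key in place, the runtime prompt is typically driven through AVFoundation. A minimal sketch (the completion-handler flow shown here is one common pattern, not the only one; it cannot be run outside an iOS app, so treat it as illustrative):

```swift
import AVFoundation

// Sketch: request microphone access at the moment the feature is used.
// iOS shows the system prompt (with your NSMicrophoneUsageDescription
// text) only on the first call; later calls report the stored decision.
AVAudioSession.sharedInstance().requestRecordPermission { granted in
    DispatchQueue.main.async {
        if granted {
            // Safe to start recording here.
        } else {
            // Degrade gracefully; point the user to Settings if needed.
        }
    }
}
```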
Answered by MEnnabah
Instabug uses NSMicrophoneUsageDescription to allow your users to record a voice note about a bug or to leave feedback for you.
Answered by bfx
Just having AVAudioSession.sharedInstance().requestRecordPermission() somewhere in your code base is enough to trigger this error with iTunes Connect. It's not even necessary to actively call that code!