
What's the best way of live streaming the iPhone camera to a media server?



Problem description

According to this question, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it should be possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.

                  So the questions are:

1) How do I get compressed frames and audio from the iPhone's camera?

2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?

                  Any help will be really appreciated.

                  Thanks.

Solution

                  You most likely already know....

1) How do I get compressed frames and audio from the iPhone's camera?

You cannot do this. The AVFoundation API prevents it from every angle. I even tried named pipes, and some other sneaky unix foo. No such luck. You have no choice but to write it to a file. In your linked post a user suggests setting up a callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format. It is the Movie Writers and AVAssetWriter that do the encoding.
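A minimal sketch of the point above (Swift; class name and pixel-format choice are illustrative, not from the original post): the capture delegate only ever receives uncompressed CVPixelBuffers in whatever pixel format you request, never H.264.

```swift
import AVFoundation

// Hypothetical delegate: AVCaptureVideoDataOutput hands us *uncompressed*
// pixel buffers -- there is no way to request H.264 here. Encoding is done
// elsewhere, by AVAssetWriter (or a movie file output).
final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Raw frame data, e.g. 32-bit BGRA as requested below.
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        _ = (width, height) // hand the buffer off to a writer/encoder here
    }
}

// You can choose the (still uncompressed) pixel format the delegate receives:
let output = AVCaptureVideoDataOutput()
output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                        kCVPixelFormatType_32BGRA]
```

This only runs on an iOS/macOS device with camera access; it is a sketch of the API shape, not a complete capture session.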

2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?

Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.

                  I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.
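The recommended route can be sketched as follows (Swift; the output path and 1280x720 settings are illustrative assumptions): AVAssetWriter performs the hardware H.264 encode while writing to a file, and the capture delegate feeds it sample buffers.

```swift
import AVFoundation

// Illustrative AVAssetWriter setup: this is where H.264 encoding actually
// happens, since the capture pipeline itself only yields uncompressed frames.
let fileURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("capture.mp4")
let writer = try AVAssetWriter(outputURL: fileURL, fileType: .mp4)

let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,  // hardware H.264 encode
    AVVideoWidthKey: 1280,                   // assumed resolution
    AVVideoHeightKey: 720,
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
input.expectsMediaDataInRealTime = true      // required for live capture
writer.add(input)

writer.startWriting()
writer.startSession(atSourceTime: .zero)
// In the capture delegate, for each frame:
//     if input.isReadyForMoreMediaData { input.append(sampleBuffer) }
// and when finished:
//     input.markAsFinished(); writer.finishWriting { /* upload/stream */ }
```

For live streaming, a common workaround from this era was to write short file segments this way and ship them to the server as they complete, since the encoded bytes are only available from the file.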



