WO2019019907A1 - Imaging method, terminal, and storage medium - Google Patents

Imaging method, terminal, and storage medium Download PDF

Info

Publication number
WO2019019907A1
WO2019019907A1 PCT/CN2018/095034 CN2018095034W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
terminal
motion tracking
application
rgb
Prior art date
2017-07-25
Application number
PCT/CN2018/095034
Other languages
English (en)
French (fr)
Inventor
闫晓梅
张凡
王金光
赵金锴
石林峰
肖龙安
黄竹邻
吴永辉
王强
王浩广
李伟
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2017-07-25
Filing date
2018-07-09
Publication date
2019-01-31
Application filed by 中兴通讯股份有限公司
Publication of WO2019019907A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Definitions

  • The present disclosure relates to the field of augmented reality display technologies, and in particular to an imaging method, a terminal, and a storage medium.
  • Google Tango is an augmented reality (AR) development platform that adds a fisheye camera and a depth detection module to an ordinary red-green-blue (RGB) camera.
  • The depth detection module consists of an infrared (IR) emitter and an IR sensor: the former emits infrared light and the latter detects it. Based on the infrared time of flight, each pixel stores distance information, forming a point cloud that is combined with the RGB camera's texture for 3D imaging.
  • The depth detection module can range surrounding objects with centimeter-level accuracy and can detect objects at distances from 0.4 m to 4 m.
  • Because depth detection uses IR, it has lighting requirements: it cannot be used outdoors, and even indoors there are lighting constraints.
  • The fisheye camera is used for motion tracking: as the mobile device moves, it must perceive its own motion, and together with the IMU (inertial measurement unit) it senses the device's motion trajectory.
  • However, in the Google Tango solution some of the matching hardware components can only be used by terminal AR applications; their degree of reuse is low, which makes the cost relatively high.
  • In addition, the depth detection module uses infrared to detect the surrounding environment, is constrained by lighting, and can only be used indoors, where it still has lighting requirements, so its applications are limited.
  • The embodiments of the present disclosure provide an imaging method, a terminal, and a storage medium to solve the problem in the prior art that the low reusability of the hardware components adapted to the AR function increases the overall cost of the terminal.
  • According to one aspect of the present disclosure, an imaging method is provided for use in a terminal having an AR function, the terminal including a motion tracking camera adapted to the AR function and an RGB camera. The method includes: in a state where the terminal AR application is turned off, starting the motion tracking camera and the RGB camera when a photographing instruction is received; and synthesizing the images captured by the motion tracking camera and the RGB camera, where the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold.
  • According to another aspect, a terminal is provided that includes a memory, a processor, a motion tracking camera adapted to the AR function, a red-green-blue RGB camera, and a computer program stored on the memory and runnable on the processor; when the processor runs the computer program, it performs the imaging method according to the present disclosure.
  • According to another aspect, a computer-readable storage medium is provided, having stored thereon a computer program that, when executed by a processor, performs the imaging method according to the present disclosure.
  • FIG. 1 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • FIG. 2 is a mode transition diagram according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • FIG. 4 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • FIG. 5 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • FIG. 6 is a structural block diagram of a terminal according to an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • The imaging method according to this embodiment can be applied to a terminal having an AR function, and the terminal can include a motion tracking camera adapted to the AR function, an RGB camera, a depth detection module, and an inertial measurement unit.
  • As shown in FIG. 1, the imaging method of this embodiment may include the following steps S101 to S102.
  • In step S101, in a state where the terminal AR application is turned off, the motion tracking camera and the RGB camera are started when a photographing instruction is received.
  • In step S102, the images captured by the motion tracking camera and the RGB camera are synthesized.
  • The field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold. According to an embodiment of the present disclosure, the overlap may be greater than 70%; to improve the synthesis result, it may reach 90%.
  • When the terminal AR application is turned off, the motion tracking camera adapted to the terminal AR application is reused, so that the motion tracking camera and the RGB camera form a dual camera for the terminal's photographing; the terminal's photographing and video capabilities are improved without increasing the terminal's cost.
  • The motion tracking camera may include, but is not limited to, a fisheye camera.
  • The angle of view of a fisheye lens is large, generally reaching 220° or 230°, which makes it possible to capture a wide scene at close range.
  • When shooting close to the subject, a fisheye camera produces a very strong perspective effect, emphasizing the near-large, far-small contrast of the subject.
  • A fisheye lens also has a considerably deep depth of field, which helps express the depth-of-field effect of the subject.
  • To ensure the quality of image synthesis, the resolution of the motion tracking camera may be 8 megapixels or higher.
  • The RGB camera may include an optical zoom camera.
  • The optical zoom camera can include, but is not limited to, a periscope camera.
  • A periscope camera refracts light through a prism and adjusts the focal length by floating the lens module inside the device; optical zoom is achieved without the camera protruding, so distant subjects can be captured clearly.
  • When the RGB camera includes an optical zoom camera, the RGB camera can operate in optical zoom mode in a state where the terminal AR application is turned off, improving the photographing result and producing high-quality images.
  • For terminal devices whose terminal AR application has modest demands on rendering quality, the optical zoom camera may include a telephoto camera.
  • In a state where the terminal AR application is turned on, the RGB camera and the motion tracking camera are used for depth-detection 3D texture imaging and for motion tracking, respectively.
  • The RGB camera sends the captured environment view to the terminal AR application, which combines it with the data detected by the depth detection module to realize depth-detection 3D texture imaging.
  • The motion tracking camera sends the captured environment view to the terminal AR application, which combines it with the data detected by the inertial measurement unit to realize motion tracking of the terminal device.
  • When the RGB camera includes an optical zoom camera, the RGB camera can operate in fixed-focus mode in a state where the terminal AR application is turned on, reducing the power consumption of the terminal.
  • The fixed focal length can be pre-configured.
  • Alternatively, the RGB camera can operate in zoom mode in a state where the terminal AR application is turned on.
  • When the RGB camera operates in zoom mode in an AR scene, optical zoom enhances its ability to capture distant views, improving the 3D texture imaging result.
  • The working mode of the RGB camera in a state where the terminal AR application is turned on can be set flexibly according to requirements.
  • The depth detection module may include a time-of-flight (TOF) radar module. Because radar depth detection, unlike infrared, is not affected by light and weather, applications such as real-time 3D navigation composition are no longer constrained by light and weather, making the AR device suitable for outdoor use.
  • The imaging method according to an embodiment of the present disclosure can be applied to a smart mobile terminal of the Google Tango AR development platform, reusing the motion tracking camera (for example, a fisheye camera) and using a periscope camera as the RGB camera, so that the terminal can both run AR applications and produce high-quality photos and videos.
  • The field-of-view overlap of the fisheye camera and the RGB camera can exceed 90%.
  • FIG. 2 is a mode transition diagram in accordance with an embodiment of the present disclosure.
  • As shown in FIG. 2, at power-on the default mode of the periscope camera is either the optical zoom mode or the Tango AR mode.
  • In Tango AR mode, the Tango AR app can be turned off, converting to the optical zoom mode; the periscope camera changes from a fixed focal length to the normal zoom function.
  • In optical zoom mode, turning on the Tango AR app converts to the Tango AR mode; the periscope camera is fixed at a suitable focal length.
  • FIG. 3 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • As shown in FIG. 3, the imaging method according to an embodiment of the present disclosure may include the following steps S301 to S304.
  • In step S301, at power-on, the default mode of the periscope camera may be the optical zoom mode.
  • In step S302, the terminal determines whether the Tango AR application has started. If it has not, the terminal is determined to be in the optical zoom mode and step S303 is performed; if it has, the terminal is determined to be in the Tango AR mode and step S304 is performed.
  • If the terminal is in the optical zoom mode, the periscope camera operates in zoom mode; if the terminal is in the Tango AR mode, the periscope camera operates in fixed-focus mode. In addition, for some specific AR application scenarios, such as distant scenes, the periscope camera can also work in zoom mode; the user can then adjust the camera's focal length to increase image recognition accuracy and improve the performance of the terminal AR application.
  • In step S303, on receiving the user's photographing instruction, the terminal starts the periscope camera and the fisheye camera and synthesizes the images captured by the two cameras to obtain a high-quality photograph.
  • The user can adjust the focal length of the periscope camera according to the needs of the shot, so that the fisheye camera and the periscope camera frame the near view and the distant view respectively, and a high-quality photo is synthesized.
  • In step S304, the terminal's periscope camera captures the environment view and sends it to the terminal AR application to assist with 3D navigation and composition, while the terminal's fisheye camera captures the environment view and sends it to the terminal AR application to assist with motion tracking.
  • FIG. 4 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • The imaging method according to this embodiment can be applied to a terminal having an AR function, and the terminal can include a motion tracking camera adapted to the AR function, an RGB camera, a depth detection module, and an inertial measurement unit. The description below focuses on the differences from the embodiment described with reference to FIG. 1.
  • As shown in FIG. 4, the imaging method of this embodiment may include the following steps S401 to S402.
  • In step S401, in a state where the terminal AR application is turned off, the motion tracking camera and the RGB camera are started when a photographing instruction is received, and the images captured by the two cameras are synthesized.
  • In step S402, in a state where the terminal AR application is turned on, the images captured by the motion tracking camera and the RGB camera are synthesized, and the synthesized image is transmitted to the terminal AR application to assist it in 3D texture imaging. There is no strict ordering between steps S401 and S402.
  • When the RGB camera is an optical zoom camera, it operates in fixed-focus mode or zoom mode in a state where the terminal AR application is turned on.
  • The working mode of the RGB camera in a state where the terminal AR application is turned on can be set flexibly according to requirements.
  • Using the composite image from the RGB camera and the motion tracking camera to assist the terminal AR application with 3D texture imaging can improve the 3D texture imaging result.
  • FIG. 5 is a flowchart of an imaging method according to an embodiment of the present disclosure.
  • As shown in FIG. 5, the imaging method of this embodiment may include the following steps S501 to S503.
  • In step S501, the terminal device determines whether the Tango AR application has started. If it has not, the terminal is determined to be in the optical zoom mode and step S502 is performed; if it has, the terminal is determined to be in the Tango AR mode and step S503 is performed.
  • If the terminal is in the optical zoom mode, the periscope camera operates in zoom mode; if the terminal is in the Tango AR mode, the periscope camera operates in fixed-focus mode or zoom mode.
  • In step S502, on receiving the user's photographing instruction, the terminal starts the periscope camera and the fisheye camera and synthesizes the images captured by the two cameras to obtain a high-quality photograph.
  • In step S503, the terminal's periscope camera captures the environment view and sends it to the terminal AR application to assist with 3D navigation and composition, while the terminal's fisheye camera captures the environment view and sends it to the terminal AR application to assist with motion tracking.
  • In this embodiment, the composite picture from the periscope camera and the fisheye camera can effectively improve the 3D texture result in the terminal AR application.
  • FIG. 6 is a structural block diagram of a terminal according to an embodiment of the present disclosure.
  • As shown in FIG. 6, the terminal may include a memory 610, a processor 620, a motion tracking camera 630 adapted to the AR function, an RGB camera 640, a depth detection module 650, and an inertial measurement unit 660.
  • A computer program executable on the processor 620 is stored in the memory 610; when the processor 620 executes the computer program, the imaging method according to the various embodiments of the present disclosure may be performed.
  • The field-of-view overlap of the motion tracking camera 630 and the RGB camera 640 is greater than a set threshold. According to an embodiment of the present disclosure, the overlap may be greater than 70%; to improve the synthesis result, it may reach 90%.
  • The motion tracking camera 630 may include, but is not limited to, a fisheye camera.
  • The resolution of the motion tracking camera 630 may be 8 megapixels or higher.
  • The RGB camera 640 may include an optical zoom camera.
  • The optical zoom camera can include, but is not limited to, a periscope camera.
  • When the RGB camera 640 includes an optical zoom camera, the RGB camera 640 can operate in optical zoom mode in a state where the terminal AR application is turned off, improving the photographing result and producing high-quality images.
  • For terminal devices whose terminal AR application has modest demands on rendering quality, the optical zoom camera may include a telephoto camera.
  • In a state where the terminal AR application is turned on, the RGB camera 640 and the motion tracking camera 630 are used for depth-detection 3D texture imaging and for motion tracking, respectively.
  • The RGB camera 640 sends the captured environment view to the terminal AR application, which combines it with the data detected by the depth detection module 650 to realize depth-detection 3D texture imaging.
  • The motion tracking camera 630 sends the captured environment view to the terminal AR application, which combines it with the data detected by the inertial measurement unit 660 to realize motion tracking of the terminal device.
  • When the RGB camera 640 includes an optical zoom camera, the RGB camera 640 can operate in fixed-focus mode in a state where the terminal AR application is turned on, reducing the power consumption of the terminal.
  • The fixed focal length can be pre-configured.
  • Alternatively, the RGB camera 640 can also operate in zoom mode in a state where the terminal AR application is turned on.
  • When the RGB camera 640 operates in zoom mode in an AR scene, optical zoom enhances its ability to capture distant views, improving the 3D texture imaging result.
  • The working mode of the RGB camera 640 in a state where the terminal AR application is turned on can be set flexibly according to requirements.
  • In a state where the terminal AR application is turned on, the images captured by the motion tracking camera 630 and the RGB camera 640 are synthesized, and the synthesized image is transmitted to the terminal AR application to assist it in 3D texture imaging.
  • Using the composite image from the RGB camera 640 and the motion tracking camera 630 to assist the terminal AR application with 3D texture imaging can improve the 3D texture imaging result.
  • The depth detection module 650 may include a TOF radar module, so that applications such as real-time 3D navigation composition are no longer affected by light and weather, making the AR device suitable for outdoor use.
  • According to the terminal of the embodiments of the present disclosure, when the terminal AR application is turned off, the motion tracking camera adapted to the terminal AR application is reused, so that the motion tracking camera and the RGB camera form a dual camera for the terminal's photographing; the terminal's photographing and video capabilities are improved without increasing the terminal's cost.
  • The TOF radar depth detection module can extend the range of use of the AR function from indoors to outdoors, increasing the value of the AR platform.
  • Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a computer program that, when executed by a processor, performs the imaging method according to the present disclosure.
  • Those skilled in the art will appreciate that embodiments of the present disclosure can be provided as a method, an apparatus, a system, or a computer program product. Accordingly, the present disclosure may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present disclosure may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
  • The computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure discloses an imaging method, a terminal, and a storage medium. The terminal includes a motion tracking camera adapted to the AR function and a red-green-blue RGB camera. The method includes: in a state where the terminal AR application is turned off, starting the motion tracking camera and the RGB camera when a photographing instruction is received; and synthesizing the images captured by the motion tracking camera and the RGB camera, where the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold.

Description

Imaging method, terminal, and storage medium
Technical Field
The present disclosure relates to the field of augmented reality display technologies, and in particular to an imaging method, a terminal, and a storage medium.
Background
Google Tango is an augmented reality (AR) development platform that adds a fisheye camera and a depth detection module to an ordinary red-green-blue (RGB) camera. The depth detection module consists of an infrared (IR) emitter and an IR sensor: the former emits infrared light and the latter detects it. Based on the infrared time of flight, each pixel stores distance information, forming a point cloud that is then combined with the RGB camera's texture for 3D imaging. The depth detection module can range surrounding objects with centimeter-level accuracy and can detect objects at distances from 0.4 m to 4 m. Because depth detection uses IR, it has lighting requirements: it cannot be used outdoors, and even indoors there are lighting constraints. The fisheye camera is used for motion tracking: as the mobile device moves, it must perceive its own motion, and together with the IMU (inertial measurement unit) it senses the device's motion trajectory.
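To make the depth pipeline concrete, here is a minimal sketch (not taken from the patent) of how per-pixel infrared time-of-flight measurements could be turned into a depth map and an RGB-textured point cloud. The pinhole intrinsics fx, fy, cx, cy, the array layouts, and the function name are assumptions made for illustration; only the time-of-flight relation and the 0.4 m to 4 m working range come from the text above.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_to_point_cloud(tof_s, rgb, fx, fy, cx, cy):
    """tof_s: HxW round-trip IR flight times in seconds; rgb: HxWx3 texture."""
    depth = tof_s * C / 2.0                    # round trip -> one-way distance
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx * depth                  # back-project pixels through a
    y = (v - cy) / fy * depth                  # simple pinhole camera model
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3)                # RGB texture attached per point
    # keep only returns inside the module's stated 0.4 m to 4 m working range
    in_range = (points[:, 2] >= 0.4) & (points[:, 2] <= 4.0)
    return points[in_range], colors[in_range]
```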
However, in some cases the drawback of the Google Tango solution is that some of the matching hardware components can only be used by terminal AR applications; their degree of reuse is low, which makes the cost relatively high. Meanwhile, the depth detection module uses infrared to detect the surrounding environment, is constrained by lighting, and can only be used indoors, where it still has lighting requirements, so its application is limited.
Summary
Embodiments of the present disclosure provide an imaging method, a terminal, and a storage medium to solve the problem in the prior art that the low reusability of the hardware components adapted to the AR function increases the overall cost of the terminal.
According to one aspect of the present disclosure, an imaging method is provided for use in a terminal having an AR function, the terminal including a motion tracking camera adapted to the AR function and an RGB camera. The method includes: in a state where the terminal AR application is turned off, starting the motion tracking camera and the RGB camera when a photographing instruction is received; and synthesizing the images captured by the motion tracking camera and the RGB camera, where the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold.
According to another aspect of the present disclosure, a terminal is provided, including: a memory, a processor, a motion tracking camera adapted to the AR function, a red-green-blue RGB camera, and a computer program stored on the memory and runnable on the processor; when the processor runs the computer program, the processor performs the imaging method according to the present disclosure.
According to another aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored; when the program is executed by a processor, the processor performs the imaging method according to the present disclosure.
Brief Description of the Drawings
Various other features and characteristics will become clear to those of ordinary skill in the art by reading the following detailed description of the preferred embodiments. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered limiting of the present disclosure. Throughout the drawings, the same reference symbols denote the same components. In the drawings:
FIG. 1 is a flowchart of an imaging method according to an embodiment of the present disclosure;
FIG. 2 is a mode transition diagram according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of an imaging method according to an embodiment of the present disclosure;
FIG. 4 is a flowchart of an imaging method according to an embodiment of the present disclosure;
FIG. 5 is a flowchart of an imaging method according to an embodiment of the present disclosure; and
FIG. 6 is a structural block diagram of a terminal according to an embodiment of the present disclosure.
Detailed Description
The above is only an overview of the technical solutions of the embodiments of the present disclosure. In order that the technical means of the embodiments may be understood more clearly and implemented in accordance with the contents of the specification, and in order to make the above and other objects, features, and advantages of the embodiments more apparent, specific implementations of the embodiments of the present disclosure are set forth below.
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the present disclosure, it should be understood that the present disclosure may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope conveyed completely to those skilled in the art.
FIG. 1 is a flowchart of an imaging method according to an embodiment of the present disclosure.
The imaging method according to this embodiment can be applied to a terminal having an AR function, and the terminal may include a motion tracking camera adapted to the AR function, an RGB camera, a depth detection module, and an inertial measurement unit.
As shown in FIG. 1, the imaging method of this embodiment may include the following steps S101 to S102.
In step S101, in a state where the terminal AR application is turned off, the motion tracking camera and the RGB camera are started when a photographing instruction is received.
In step S102, the images captured by the motion tracking camera and the RGB camera are synthesized.
According to an embodiment of the present disclosure, the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold. According to an embodiment of the present disclosure, the field-of-view overlap of the motion tracking camera and the RGB camera may be greater than 70%; to improve the synthesis result, the overlap may reach 90%.
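As a rough illustration of this requirement, the sketch below models each camera's horizontal field of view as a flat angular interval and checks the overlap against the 70% threshold. The interval model, the 80° RGB field of view, and measuring overlap relative to the narrower camera are simplifying assumptions for the example, not definitions from the patent.

```python
def fov_overlap_ratio(fov_a_deg, fov_b_deg, yaw_offset_deg=0.0):
    """Overlap of two horizontal FOV intervals, relative to the narrower one."""
    a_lo, a_hi = -fov_a_deg / 2.0, fov_a_deg / 2.0
    b_lo = yaw_offset_deg - fov_b_deg / 2.0
    b_hi = yaw_offset_deg + fov_b_deg / 2.0
    shared = max(0.0, min(a_hi, b_hi) - max(a_lo, b_lo))
    return shared / min(fov_a_deg, fov_b_deg)

# e.g. a 220-degree fisheye against an assumed 80-degree RGB camera,
# mounted with negligible yaw offset between their optical axes:
if fov_overlap_ratio(220.0, 80.0) > 0.70:   # the set threshold of 70%
    print("cameras qualify for composite photographing")
```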
According to the imaging method of the embodiments of the present disclosure, when the terminal AR application is turned off, the motion tracking camera adapted to the terminal AR application is reused, so that the motion tracking camera and the RGB camera form a dual camera for the terminal's photographing; the terminal's photographing and video capabilities are improved without increasing the terminal's cost.
According to an embodiment of the present disclosure, the motion tracking camera may include (but is not limited to) a fisheye camera. A fisheye lens has a large angle of view, generally reaching 220° or 230°, which makes it possible to capture a wide scene at close range. When shooting close to the subject, a fisheye camera produces a very strong perspective effect, emphasizing the near-large, far-small contrast of the subject. A fisheye lens also has a considerably deep depth of field, which helps express the depth-of-field effect of the subject. According to an embodiment of the present disclosure, to ensure the quality of the images synthesized from the motion tracking camera and the RGB camera, the resolution of the motion tracking camera may be 8 megapixels or higher.
According to an embodiment of the present disclosure, the RGB camera may include an optical zoom camera. The optical zoom camera may include (but is not limited to) a periscope camera. A periscope camera refracts light through a prism and adjusts the focal length by floating the lens module inside the device; optical zoom is achieved without the camera protruding, so that distant subjects can be captured clearly.
According to an embodiment of the present disclosure, when the RGB camera includes an optical zoom camera, the RGB camera can operate in optical zoom mode in a state where the terminal AR application is turned off, improving the photographing result and producing high-quality images.
According to an embodiment of the present disclosure, for terminal devices whose terminal AR application has modest demands on rendering quality, the optical zoom camera may include a telephoto camera.
According to an embodiment of the present disclosure, in a state where the terminal AR application is turned on, the RGB camera and the motion tracking camera are used for depth-detection 3D texture imaging and for motion tracking, respectively. The RGB camera sends the captured environment view to the terminal AR application, which combines it with the data detected by the depth detection module to realize depth-detection 3D texture imaging. The motion tracking camera sends the captured environment view to the terminal AR application, which combines it with the data detected by the inertial measurement unit to realize motion tracking of the terminal device.
According to an embodiment of the present disclosure, when the RGB camera includes an optical zoom camera, the RGB camera can operate in fixed-focus mode in a state where the terminal AR application is turned on, reducing the terminal's power consumption. The fixed focal length can be pre-configured.
Alternatively, the RGB camera can also operate in zoom mode in a state where the terminal AR application is turned on. When the RGB camera operates in zoom mode in an AR scene, optical zoom enhances its ability to capture distant views, improving the 3D texture imaging result.
The working mode of the RGB camera in a state where the terminal AR application is turned on can be set flexibly according to requirements.
According to an embodiment of the present disclosure, the depth detection module may include a time-of-flight (TOF) radar module. Because radar depth detection, unlike infrared, is not affected by light and weather, applications such as real-time 3D navigation composition are no longer affected by light and weather, making the AR device suitable for all kinds of outdoor occasions.
The imaging method of the embodiments of the present disclosure is described in more detail below through an example.
The imaging method according to an embodiment of the present disclosure can be applied to a smart mobile terminal of the Google Tango AR development platform, reusing the motion tracking camera (for example, a fisheye camera) and using a periscope camera as the RGB camera, so that the terminal can both run AR applications and shoot photos and videos with high-quality results. The field-of-view overlap of the fisheye camera and the RGB camera may be 90% or more.
FIG. 2 is a mode transition diagram according to an embodiment of the present disclosure.
As shown in FIG. 2, according to an embodiment of the present disclosure, at power-on the default mode of the periscope camera is either the optical zoom mode or the Tango AR mode. In Tango AR mode, the Tango AR app can be turned off, converting to the optical zoom mode: the periscope camera changes from a fixed focal length to the normal zoom function. In optical zoom mode, turning on the Tango AR app converts from the optical zoom mode to the Tango AR mode: the periscope camera is fixed at a suitable focal length.
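The transitions in FIG. 2 amount to a two-state machine, as in the hedged sketch below; the class shape, method names, and the preset 35 mm focal length are illustrative assumptions rather than the patent's implementation.

```python
class PeriscopeCamera:
    AR_FIXED_FOCAL_MM = 35.0           # assumed pre-configured focal length

    def __init__(self):
        self.mode = "optical_zoom"     # assumed default mode at power-on
        self.focal_mm = None           # None means the user may zoom freely

    def on_tango_ar_opened(self):
        # optical zoom mode -> Tango AR mode: lock a suitable focal length
        self.mode = "tango_ar"
        self.focal_mm = self.AR_FIXED_FOCAL_MM

    def on_tango_ar_closed(self):
        # Tango AR mode -> optical zoom mode: restore the normal zoom function
        self.mode = "optical_zoom"
        self.focal_mm = None
```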
FIG. 3 is a flowchart of an imaging method according to an embodiment of the present disclosure.
As shown in FIG. 3, the imaging method according to an embodiment of the present disclosure may include the following steps S301 to S304.
In step S301, at power-on, the default mode of the periscope camera may be the optical zoom mode.
In step S302, the terminal determines whether the Tango AR application has started. If the Tango AR application has not started, the terminal is determined to be in the optical zoom mode, and step S303 is performed; if the Tango AR application has started, the terminal is determined to be in the Tango AR mode, and step S304 is performed.
If the terminal is determined to be in the optical zoom mode, the periscope camera operates in zoom mode. If the terminal is determined to be in the Tango AR mode, the periscope camera operates in fixed-focus mode. In addition, for some specific AR application scenarios, such as distant scenes, the periscope camera can also work in zoom mode; the user can then adjust the periscope camera's focal length to increase image recognition accuracy, thereby improving the performance of the terminal AR application.
In step S303, on receiving the user's photographing instruction, the terminal starts the periscope camera and the fisheye camera and synthesizes the images captured by the two cameras to obtain a high-quality photograph.
The user can adjust the focal length of the periscope camera according to the needs of the shot, so that the fisheye camera and the periscope camera frame the near view and the distant view respectively, and a high-quality photo is synthesized.
In step S304, the terminal's periscope camera captures the environment view and sends it to the terminal AR application to assist the terminal AR application with 3D navigation and composition, and the terminal's fisheye camera captures the environment view and sends it to the terminal AR application to assist the terminal AR application with motion tracking.
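The S301 to S304 flow reduces to a single dispatch on whether the Tango AR application is running. The following sketch only mirrors that control flow; the stub classes, method names, and string stand-ins for frames are invented for the example.

```python
class StubCamera:
    def __init__(self, name):
        self.name = name

    def capture(self):
        return f"<frame from {self.name}>"      # stand-in for a real image

class StubArApp:
    def feed_3d_composition(self, frame):       # 3D navigation / composition path
        print("3D composition <-", frame)

    def feed_motion_tracking(self, frame):      # IMU-assisted tracking path
        print("motion tracking <-", frame)

def run_flow(tango_ar_running, fisheye, periscope, ar_app, photo_requested):
    if not tango_ar_running:                    # S302 -> optical zoom mode
        if photo_requested:                     # S303: dual-camera composite
            near = fisheye.capture()            # wide close-range framing
            far = periscope.capture()           # zoomed distant framing
            return ("composite", near, far)     # synthesis stands in here
    else:                                       # S302 -> Tango AR mode (S304)
        ar_app.feed_3d_composition(periscope.capture())
        ar_app.feed_motion_tracking(fisheye.capture())

# optical-zoom path: a photo instruction triggers the dual-camera composite
print(run_flow(False, StubCamera("fisheye"), StubCamera("periscope"),
               StubArApp(), photo_requested=True))
```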
FIG. 4 is a flowchart of an imaging method according to an embodiment of the present disclosure.
The imaging method according to this embodiment can be applied to a terminal having an AR function, and the terminal may include a motion tracking camera adapted to the AR function, an RGB camera, a depth detection module, and an inertial measurement unit. The description below focuses on the differences from the embodiment described with reference to FIG. 1.
As shown in FIG. 4, the imaging method of this embodiment may include the following steps S401 to S402.
In step S401, in a state where the terminal AR application is turned off, the motion tracking camera and the RGB camera are started when a photographing instruction is received, and the images captured by the motion tracking camera and the RGB camera are synthesized.
In step S402, in a state where the terminal AR application is turned on, the images captured by the motion tracking camera and the RGB camera are synthesized, and the synthesized image is transmitted to the terminal AR application to assist the terminal AR application in 3D texture imaging.
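As a toy illustration of the synthesis step, the sketch below blends two frames that are assumed to already be registered to a common view; a real pipeline would first rectify the fisheye image and align it with the periscope image, and the blend weight is an arbitrary assumption.

```python
import numpy as np

def synthesize(wide_rgb, tele_rgb, tele_weight=0.6):
    """Blend an aligned wide (fisheye) frame with a telephoto (periscope)
    frame, favouring the telephoto detail; both are float RGB in [0, 1]."""
    assert wide_rgb.shape == tele_rgb.shape, "frames must be pre-registered"
    blended = (1.0 - tele_weight) * wide_rgb + tele_weight * tele_rgb
    return np.clip(blended, 0.0, 1.0)
```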
According to an embodiment of the present disclosure, when the RGB camera is an optical zoom camera, the RGB camera operates in fixed-focus mode or zoom mode in a state where the terminal AR application is turned on. The working mode of the RGB camera in a state where the terminal AR application is turned on can be set flexibly according to requirements.
According to an embodiment of the present disclosure, using the composite image from the RGB camera and the motion tracking camera to assist the terminal AR application with 3D texture imaging can improve the 3D texture imaging result.
It should be noted that there is no strict ordering between steps S401 and S402.
FIG. 5 is a flowchart of an imaging method according to an embodiment of the present disclosure.
As shown in FIG. 5, the imaging method of this embodiment may include the following steps S501 to S503.
In step S501, the terminal device determines whether the Tango AR application has started. If the Tango AR application has not started, the terminal is determined to be in the optical zoom mode, and step S502 is performed; if the Tango AR application has started, the terminal is determined to be in the Tango AR mode, and step S503 is performed.
If the terminal is determined to be in the optical zoom mode, the periscope camera operates in zoom mode. If the terminal is determined to be in the Tango AR mode, the periscope camera operates in fixed-focus mode or zoom mode.
In step S502, on receiving the user's photographing instruction, the terminal starts the periscope camera and the fisheye camera and synthesizes the images captured by the two cameras to obtain a high-quality photograph.
In step S503, the terminal's periscope camera captures the environment view and sends it to the terminal AR application to assist the terminal AR application with 3D navigation and composition, and the terminal's fisheye camera captures the environment view and sends it to the terminal AR application to assist the terminal AR application with motion tracking.
In this embodiment, the composite picture from the periscope camera and the fisheye camera can effectively improve the 3D texture result in the terminal AR application.
The user can open or close the Tango AR application. If the user performs a close operation while the Tango AR application is open, the Tango AR related resources are released, just as in the ordinary Tango AR release process; if the user performs an open operation while the Tango AR application is closed, the terminal device requests the Tango AR related resources, just as in the ordinary Tango AR resource request process.
FIG. 6 is a structural block diagram of a terminal according to an embodiment of the present disclosure.
As shown in FIG. 6, the terminal according to an embodiment of the present disclosure may include: a memory 610, a processor 620, a motion tracking camera 630 adapted to the AR function, an RGB camera 640, a depth detection module 650, and an inertial measurement unit 660. A computer program runnable on the processor 620 is stored in the memory 610; when the processor 620 executes the computer program, the imaging method according to the various embodiments of the present disclosure may be performed.
According to an embodiment of the present disclosure, the field-of-view overlap of the motion tracking camera 630 and the RGB camera 640 is greater than a set threshold. According to an embodiment of the present disclosure, the field-of-view overlap of the motion tracking camera 630 and the RGB camera 640 may be greater than 70%; to improve the synthesis result, the overlap may reach 90%.
According to an embodiment of the present disclosure, the motion tracking camera 630 may include (but is not limited to) a fisheye camera.
According to an embodiment of the present disclosure, the resolution of the motion tracking camera 630 may be 8 megapixels or higher.
According to an embodiment of the present disclosure, the RGB camera 640 may include an optical zoom camera. The optical zoom camera may include (but is not limited to) a periscope camera.
According to an embodiment of the present disclosure, when the RGB camera 640 includes an optical zoom camera, the RGB camera 640 can operate in optical zoom mode in a state where the terminal AR application is turned off, improving the photographing result and producing high-quality images.
According to an embodiment of the present disclosure, for terminal devices whose terminal AR application has modest demands on rendering quality, the optical zoom camera may include a telephoto camera.
According to an embodiment of the present disclosure, in a state where the terminal AR application is turned on, the RGB camera 640 and the motion tracking camera 630 are used for depth-detection 3D texture imaging and for motion tracking, respectively. The RGB camera 640 sends the captured environment view to the terminal AR application, which combines it with the data detected by the depth detection module 650 to realize depth-detection 3D texture imaging. The motion tracking camera 630 sends the captured environment view to the terminal AR application, which combines it with the data detected by the inertial measurement unit 660 to realize motion tracking of the terminal device.
According to an embodiment of the present disclosure, when the RGB camera 640 includes an optical zoom camera, the RGB camera 640 can operate in fixed-focus mode in a state where the terminal AR application is turned on, reducing the terminal's power consumption. The fixed focal length can be pre-configured.
Alternatively, in a state where the terminal AR application is turned on, the RGB camera 640 can also operate in zoom mode. When the RGB camera 640 operates in zoom mode in an AR scene, optical zoom enhances its ability to capture distant views, improving the 3D texture imaging result.
The working mode of the RGB camera 640 in a state where the terminal AR application is turned on can be set flexibly according to requirements.
According to an embodiment of the present disclosure, in a state where the terminal AR application is turned on, the images captured by the motion tracking camera 630 and the RGB camera 640 are synthesized, and the synthesized image is transmitted to the terminal AR application to assist the terminal AR application in 3D texture imaging. Using the composite image from the RGB camera 640 and the motion tracking camera 630 to assist the terminal AR application with 3D texture imaging can improve the 3D texture imaging result.
According to an embodiment of the present disclosure, the depth detection module 650 may include a TOF radar module, so that applications such as real-time 3D navigation composition are no longer affected by light and weather, making the AR device suitable for all kinds of outdoor occasions.
According to the terminal of the embodiments of the present disclosure, when the terminal AR application is turned off, the motion tracking camera adapted to the terminal AR application is reused, so that the motion tracking camera and the RGB camera form a dual camera for the terminal's photographing; the terminal's photographing and video capabilities are improved without increasing the terminal's cost. In addition, using an RGB camera with optical zoom can effectively improve the ability to image the real environment. Furthermore, using a TOF radar depth detection module can extend the range of use of the AR function from indoors to outdoors, increasing the value of the AR platform.
Embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the processor performs the imaging method according to the present disclosure.
Those skilled in the art will appreciate that embodiments of the present disclosure can be provided as a method, an apparatus, a system, or a computer program product. Accordingly, the present disclosure may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present disclosure may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
The present disclosure is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to embodiments of the present disclosure. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
The above are only embodiments of the present disclosure and are not intended to limit the scope of protection of the present disclosure.

Claims (14)

  1. An imaging method, applied in a terminal having an augmented reality AR function, the terminal including a motion tracking camera adapted to the AR function and a red-green-blue RGB camera, the method comprising:
    in a state where the terminal AR application is turned off, starting the motion tracking camera and the RGB camera when a photographing instruction is received; and
    synthesizing the images captured by the motion tracking camera and the RGB camera,
    wherein the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold.
  2. The imaging method according to claim 1, further comprising:
    in a state where the terminal AR application is turned on, synthesizing the images captured by the motion tracking camera and the RGB camera, and transmitting the synthesized image to the terminal AR application to assist the terminal AR application in 3D texture imaging.
  3. A terminal having an augmented reality AR function, comprising: a memory, a processor, a motion tracking camera adapted to the AR function, and a red-green-blue RGB camera, wherein a computer program runnable on the processor is stored in the memory, and when the processor runs the computer program, the processor performs the imaging method according to any one of claims 1 to 2.
  4. The terminal according to claim 3, wherein the RGB camera comprises an optical zoom camera.
  5. The terminal according to claim 4, wherein the optical zoom camera comprises a periscope camera.
  6. The terminal according to claim 4, wherein, in a state where the terminal AR application is turned off, the RGB camera operates in optical zoom mode, and in a state where the terminal AR application is turned on, the RGB camera operates in fixed-focus mode or optical zoom mode.
  7. The terminal according to claim 3, wherein the RGB camera comprises a digital zoom camera.
  8. The terminal according to claim 7, wherein the digital zoom camera comprises a telephoto camera.
  9. The terminal according to claim 3, wherein the field-of-view overlap of the motion tracking camera and the RGB camera is greater than 70%.
  10. The terminal according to claim 3, wherein the motion tracking camera comprises a fisheye camera.
  11. The terminal according to claim 3, wherein the resolution of the motion tracking camera is 8 megapixels or higher.
  12. The terminal according to claim 3, wherein the terminal further comprises a depth detection module adapted to the AR function.
  13. The terminal according to claim 12, wherein the depth detection module comprises a time-of-flight radar module.
  14. A computer-readable storage medium having a computer program stored thereon, wherein, when the program is executed by a processor, the processor performs the imaging method according to any one of claims 1 to 2.
PCT/CN2018/095034 2017-07-25 2018-07-09 Imaging method, terminal, and storage medium WO2019019907A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710614083.7 2017-07-25
CN201710614083.7A CN109302561A (zh) 2017-07-25 2017-07-25 Imaging method, terminal, and storage medium

Publications (1)

Publication Number Publication Date
WO2019019907A1 (zh) 2019-01-31

Family

ID=65039976

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/095034 WO2019019907A1 (zh) 2017-07-25 2018-07-09 摄像方法、终端及存储介质

Country Status (2)

Country Link
CN (1) CN109302561A (zh)
WO (1) WO2019019907A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919128B (zh) * 2019-03-20 2021-04-13 联想(北京)有限公司 Control instruction acquisition method and apparatus, and electronic device
CN111932901B (zh) * 2019-05-13 2022-08-09 斑马智行网络(香港)有限公司 Road vehicle tracking and detection device, method, and storage medium
CN112230217A (zh) * 2020-09-10 2021-01-15 成都多普勒科技有限公司 Miniature integrated optoelectronic radar sensor for smart vehicles
CN113012199B (zh) * 2021-03-23 2024-01-12 北京灵汐科技有限公司 System and method for tracking a moving target

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105681766A (zh) * 2016-03-21 2016-06-15 贵州大学 Three-dimensional panoramic camera augmented reality system
CN106210547A (zh) * 2016-09-05 2016-12-07 广东欧珀移动通信有限公司 Panoramic shooting method, apparatus, and system
US20170148223A1 (en) * 2014-10-31 2017-05-25 Fyusion, Inc. Real-time mobile device capture and generation of ar/vr content
CN106791298A (zh) * 2016-12-01 2017-05-31 广东虹勤通讯技术有限公司 Terminal with dual cameras and photographing method
CN106941588A (zh) * 2017-03-13 2017-07-11 联想(北京)有限公司 Data processing method and electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2497118B (en) * 2011-12-01 2013-12-18 Sony Corp Image processing system and method
CN202818502U (zh) * 2012-09-24 2013-03-20 天津市亚安科技股份有限公司 Multi-directional monitoring-area early-warning and positioning monitoring device
US20140240469A1 (en) * 2013-02-28 2014-08-28 Motorola Mobility Llc Electronic Device with Multiview Image Capture and Depth Sensing
JP6165680B2 (ja) * 2014-06-27 2017-07-19 富士フイルム株式会社 Imaging device
US10040394B2 (en) * 2015-06-17 2018-08-07 Geo Semiconductor Inc. Vehicle vision system


Also Published As

Publication number Publication date
CN109302561A (zh) 2019-02-01

Similar Documents

Publication Publication Date Title
US11276149B2 (en) Double non-local means denoising
CN109309796B (zh) 使用多个相机获取图像的电子装置和用其处理图像的方法
CN107690649B (zh) 数字拍摄装置及其操作方法
JP6961797B2 (ja) Method and apparatus for blurring preview photographs, and storage medium
WO2019019907A1 (zh) Imaging method, terminal, and storage medium
WO2017016030A1 (zh) Image processing method and terminal
WO2017113818A1 (zh) Unmanned aerial vehicle and panoramic image stitching method, apparatus, and system therefor
CN110035228B (zh) Camera anti-shake system and method, electronic device, and computer-readable storage medium
US20170150126A1 (en) Photographing device and operating method of the same
US11301051B2 (en) Using natural movements of a hand-held device to manipulate digital content
US20230018557A1 (en) Photographing method and terminal
JP2017505004A (ja) Image generation method and dual-lens device
US11810269B2 (en) Chrominance denoising
EP3316568B1 (en) Digital photographing device and operation method therefor
JP6011569B2 (ja) Imaging device, subject tracking method, and program
CN104660909A (zh) Image acquisition method, image acquisition apparatus, and terminal
KR20160036985A (ko) Image generating apparatus and method for generating a 3D panoramic image
WO2018121401A1 (zh) Panoramic video image stitching method and panoramic camera
JP2020188448A (ja) Imaging device and imaging method
CN116437198B (zh) Image processing method and electronic device
US10645282B2 (en) Electronic apparatus for providing panorama image and control method thereof
CN108476290B (zh) Electronic device for providing panoramic image and control method thereof
US10148874B1 (en) Method and system for generating panoramic photographs and videos
WO2021258249A1 (zh) Image acquisition method, electronic device, and movable device
US11636708B2 (en) Face detection in spherical images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18837548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29.05.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18837548

Country of ref document: EP

Kind code of ref document: A1