WO2018120353A1 - VR shooting method, system and mobile terminal - Google Patents

VR shooting method, system and mobile terminal

Info

Publication number
WO2018120353A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
wide
image information
angle
spatial image
Prior art date
Application number
PCT/CN2017/073104
Other languages
English (en)
French (fr)
Inventor
叶炯耀
颜惠琴
沈晓鶄
王小琴
Original Assignee
上海喆视智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海喆视智能科技有限公司
Publication of WO2018120353A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265 Mixing

Definitions

  • the present invention relates to the field of VR shooting technology, and in particular to a VR shooting method, system, and mobile terminal.
  • With the in-depth development of Virtual Reality (VR) technology, VR production techniques have matured, but devices dedicated to VR shooting remain few in number and expensive, so VR has not yet become common in everyday life.
  • At present, VR shooting devices are separate from VR generating devices and VR viewing devices: after recording a spatial image, the VR shooting device transmits it over a network to the VR generating device, which generates the VR image, and the VR image is then watched through a VR viewing device. The effect of the captured VR image therefore cannot be observed in real time, which harms the user's shooting and viewing experience.
  • To overcome the above shortcomings of the prior art, an object of the embodiments of the present invention is to provide a cost-effective VR shooting method, system, and mobile terminal that allow the captured VR image to be observed in real time.
  • a preferred embodiment of the present invention provides a VR shooting method.
  • The VR shooting method is applied to a mobile terminal that includes a first camera and a second camera facing in opposite directions, and wide-angle lenses respectively fixed over the first camera and the second camera by a fixing structure, where the first camera and the second camera each include at least one single camera.
  • The method includes: controlling the first camera and the second camera to respectively acquire, through the wide-angle lenses, wide-angle spatial image information of a shooting area; obtaining VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera; and displaying the VR image information of the shooting area on the mobile terminal.
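The capture → synthesize → display sequence just described can be sketched as a minimal pipeline. Every function and data layout below is an illustrative assumption, not the patent's implementation:

```python
# Minimal sketch of the claimed VR shooting pipeline (steps S210-S230).
# capture_wide_angle, stitch_to_vr and display_vr are hypothetical
# stand-ins for the camera control, synthesis and display functionality.

def capture_wide_angle(camera_id: str) -> dict:
    """Step S210: acquire wide-angle spatial image info from one camera."""
    return {"camera": camera_id, "pixels": f"<wide-angle frame from {camera_id}>"}

def stitch_to_vr(front: dict, back: dict) -> dict:
    """Step S220: combine the two wide-angle images into VR image info."""
    return {"vr": (front["pixels"], back["pixels"])}

def display_vr(vr_image: dict) -> str:
    """Step S230: display the VR image info on the mobile terminal."""
    return f"displaying {len(vr_image['vr'])} stitched views"

front = capture_wide_angle("first_camera")
back = capture_wide_angle("second_camera")
vr = stitch_to_vr(front, back)
print(display_vr(vr))
```

Because all three steps run on the same terminal, the result of step S220 is available for display immediately, which is the real-time observation the patent claims over the separated shoot/generate/view pipeline of the prior art.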
  • A preferred embodiment of the present invention provides a VR shooting system applied to a mobile terminal that includes a first camera and a second camera facing in opposite directions, and wide-angle lenses respectively fixed over the first camera and the second camera by a fixing structure, where the first camera and the second camera each include at least one single camera.
  • The system includes: a control module configured to control the first camera and the second camera to respectively acquire, through the wide-angle lenses, wide-angle spatial image information of a shooting area; a synthesizing module configured to obtain VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera; and a display module configured to display the VR image information of the shooting area on the mobile terminal.
  • a preferred embodiment of the present invention also provides a mobile terminal.
  • The mobile terminal includes: a memory, a processor, a first camera and a second camera facing in opposite directions, wide-angle lenses respectively fixed over the first camera and the second camera by a fixing structure, and a VR shooting system.
  • The first camera and the second camera each include at least one single camera.
  • The wide-angle lenses are detachably fixed on the mobile terminal by the fixing structure.
  • The VR shooting system is installed or stored in the memory, and the processor controls the execution of each functional module of the VR shooting system.
  • Compared with the prior art, the VR shooting method controls the first camera and the second camera to acquire wide-angle spatial image information of a shooting area through wide-angle lenses detachably fixed on the mobile terminal, obtains VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera, and displays the VR image information of the shooting area on the mobile terminal, thereby providing users with a cost-effective, mobile-terminal-based VR shooting system that allows real-time observation of the captured VR image and ensures the user experience.
  • FIG. 1 is a block diagram of a mobile terminal according to a preferred embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a VR shooting method applied to the mobile terminal shown in FIG. 1 according to a preferred embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of the sub-steps included in step S230 of FIG. 2.
  • FIG. 4 is another schematic flowchart of a VR shooting method applied to the mobile terminal shown in FIG. 1 according to a preferred embodiment of the present invention.
  • FIG. 5 is a block diagram of the VR photographing system shown in FIG. 1 according to a preferred embodiment of the present invention.
  • FIG. 6 is another block diagram of the VR photographing system shown in FIG. 1 according to a preferred embodiment of the present invention.
  • Reference numerals: 10-mobile terminal; 100-VR shooting system; 11-memory; 12-storage controller; 13-processor; 14-peripheral interface; 15-first camera; 16-second camera; 110-control module; 120-synthesizing module; 130-display module; 131-determining sub-module; 132-display sub-module; 140-sending module.
  • FIG. 1 is a block diagram of a mobile terminal 10 according to a preferred embodiment of the present invention.
  • the mobile terminal 10 may include a VR photographing system 100, a memory 11, a memory controller 12, a processor 13, a peripheral interface 14, a first camera 15, and a second camera 16.
  • the mobile terminal 10 may be, but not limited to, a smart phone, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), Mobile Internet Device (MID), etc.
  • the mobile terminal 10 is a smart phone.
  • The memory 11, the storage controller 12, the processor 13, the peripheral interface 14, the first camera 15 and the second camera 16 are electrically connected to one another, directly or indirectly, to implement data transmission or interaction.
  • For example, these components can be electrically connected to one another via one or more communication buses or signal lines.
  • the VR photographing system 100 includes at least one software function module that can be stored in the memory 11 or in an operating system (OS) of the mobile terminal 10 in the form of software or firmware.
  • the operating system of the mobile terminal 10 may be, but not limited to, an Android system, an iOS (iPhone operating system) system, or the like.
  • the processor 13 is configured to execute an executable module stored in the memory 11, such as a software function module, a computer program, and the like included in the VR photographing system 100.
  • The memory 11 can be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), and the like.
  • The memory 11 is configured to store a program, and the processor 13 executes the program after receiving an execution instruction; the program includes a computer-readable medium carrying non-volatile program code executable by a processor, and the computer-readable medium stores a computer program for performing the above functions defined in the method of the present invention.
  • The memory 11 can store all the image information obtained by the mobile terminal 10 through VR shooting, including pictures, videos, and the like.
  • the processor 13 can be an integrated circuit chip with signal processing capabilities.
  • The processor 13 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • It may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present invention.
  • The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
  • The peripheral interface 14 couples various input/output devices to the processor 13 and the memory 11.
  • The peripheral interface 14, the processor 13 and the storage controller 12 can be implemented in a single chip; in other instances, they can be implemented by separate chips.
  • The first camera 15 and the second camera 16 are cameras of the mobile terminal 10 that face in opposite directions and are configured to acquire image information within a shooting area; the first camera 15 and the second camera 16 each include at least one single camera, and the numbers of cameras they include may differ.
  • For example, the first camera 15 may include one single camera while the second camera 16 includes two cameras.
  • The first camera 15 and the second camera 16 may be, but are not limited to, digital cameras, analog cameras, and the like.
  • The mobile terminal 10 may further include wide-angle lenses fixed over the first camera 15 and the second camera 16, and a fixing structure for fixing the wide-angle lenses.
  • The wide-angle lenses may be detachably fixed on the mobile terminal 10 by the fixing structure, and the mobile terminal 10 may control the first camera 15 and the second camera 16 to acquire, through the wide-angle lenses, wide-angle spatial image information of the shooting area; the wide-angle spatial image information includes wide-angle spatial pictures or wide-angle spatial videos.
  • The wide-angle lens includes a fisheye lens with a wide-angle range of more than 180 degrees, a standard lens, and the like; the wide-angle lens is preferably a fisheye lens with a wide-angle range of more than 180 degrees, for example a 210-degree, 230-degree, or 235-degree fisheye lens.
  • The fixing structure may be, but is not limited to, a phone card slot, a phone case, and the like.
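For a fisheye lens whose field of view exceeds 180 degrees, the relation between field angle and image-plane radius can be illustrated with the common equidistant projection model r = f·θ. The patent does not specify a projection model or focal length, so both are assumptions here:

```python
import math

def equidistant_radius(theta_deg: float, focal_length_mm: float) -> float:
    """Equidistant fisheye model: image radius r = f * theta (theta in radians).

    A 210-degree lens images field angles up to 105 degrees off-axis,
    so rays arriving from 'behind' the sensor plane (theta > 90 degrees)
    still land inside the image circle -- which is what lets two opposite
    cameras cover the full sphere with overlap.
    """
    return focal_length_mm * math.radians(theta_deg)

# Half field-of-view of the 210-degree fisheye lens mentioned above,
# with a hypothetical 1.5 mm focal length:
half_fov = 210 / 2          # 105 degrees off the optical axis
r_edge = equidistant_radius(half_fov, focal_length_mm=1.5)
print(round(r_edge, 3))     # image-circle radius in mm
```

The key point the model illustrates: with more than 180 degrees per lens, the two opposite-facing images share a ring of overlapping content, which is what makes the stitching in step S220 possible.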
  • FIG. 1 is only a schematic structural diagram of the mobile terminal 10; the mobile terminal 10 may include more or fewer components than shown in FIG. 1, or have a configuration different from that shown in FIG. 1.
  • the components shown in Figure 1 can be implemented in hardware, software, or a combination thereof.
  • FIG. 2 is a schematic flowchart of a VR shooting method applied to the mobile terminal 10 shown in FIG. 1 according to a preferred embodiment of the present invention.
  • The mobile terminal 10 includes a first camera 15 and a second camera 16 facing in opposite directions, and wide-angle lenses fixed over the first camera 15 and the second camera 16, respectively, where the first camera 15 and the second camera 16 each include at least one single camera.
  • The specific flow of the method is described below.
  • the VR shooting method may include the following steps:
  • Step S210: control the first camera 15 and the second camera 16 to respectively acquire, through the wide-angle lenses, wide-angle spatial image information of the shooting area.
  • The wide-angle spatial image information is spatial image information with a relatively wide field of view obtained by the cameras in the shooting area; it includes wide-angle spatial pictures, wide-angle spatial videos, and the like, and can be stored in the memory 11.
  • The wide-angle lens is preferably a fisheye lens with a wide angle of more than 180 degrees, such as a 210-degree, 230-degree, or 235-degree fisheye lens.
  • The step in which the mobile terminal 10 controls the first camera 15 and the second camera 16 to respectively acquire the wide-angle spatial image information of the shooting area through the wide-angle lenses includes:
  • controlling the first camera 15 and the second camera 16 to acquire wide-angle spatial image information of the shooting area with the same shooting time within a preset shooting period.
  • To avoid large discrepancies between the wide-angle spatial image information of the shooting area collected by the first camera 15 and by the second camera 16, the two cameras must be controlled to have the same shooting time within the preset shooting period, so as to guarantee the accuracy of the collected wide-angle spatial image information.
  • The step in which the mobile terminal 10 controls the first camera 15 and the second camera 16 to acquire the wide-angle spatial image information with the same shooting time within the preset shooting period includes:
  • detecting whether the mobile terminal 10 supports simultaneous acquisition of wide-angle spatial image information by the first camera 15 and the second camera 16;
  • when the mobile terminal 10 supports simultaneous acquisition, controlling the first camera 15 and the second camera 16 to acquire wide-angle spatial image information of the shooting area simultaneously;
  • when the mobile terminal 10 does not support simultaneous acquisition, controlling the first camera 15 and the second camera 16 to acquire wide-angle spatial image information of the shooting area alternately at equal time intervals.
  • Because of differing configurations, some mobile terminals 10 can support simultaneous acquisition of wide-angle spatial image information by the first camera 15 and the second camera 16 within the preset shooting period while others cannot. Therefore, it is necessary to detect whether the mobile terminal 10 supports simultaneous acquisition by the first camera 15 and the second camera 16.
  • When the detection result indicates that the mobile terminal 10 does not support simultaneous acquisition, the mobile terminal 10 may control the first camera 15 and the second camera 16 to switch shooting at equal time intervals to acquire wide-angle spatial image information of the shooting area.
  • When the wide-angle spatial image information is a picture, the times at which the first camera 15 and the second camera 16 take pictures must be kept at equal intervals; when it is a video, the first camera 15 and the second camera 16 have the same shooting time, which equals the switching interval set by the mobile terminal 10.
  • The switching interval may be one frame, two frames, or multiple frames.
  • When the detection result indicates that the mobile terminal 10 supports simultaneous acquisition, the mobile terminal 10 may control the first camera 15 and the second camera 16 to acquire the wide-angle spatial image information of the shooting area simultaneously.
  • Of course, in that case the mobile terminal 10 can also control the first camera 15 and the second camera 16 to switch shooting at equal time intervals to acquire wide-angle spatial image information of the shooting area.
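The branch between simultaneous capture and equal-interval alternating capture described above can be sketched as a frame-scheduling function. The `supports_simultaneous` flag and the frame-based `switch_interval` parameter are illustrative assumptions, not the patent's API:

```python
def acquire_frames(supports_simultaneous: bool, total_frames: int,
                   switch_interval: int = 1):
    """Decide which camera(s) capture each frame index, per step S210.

    If the terminal supports it, both cameras capture every frame at the
    same time; otherwise they alternate every `switch_interval` frames
    (e.g. 1 or 2 frames) so each camera receives the same total shooting
    time within the preset shooting period.
    """
    schedule = []
    for frame in range(total_frames):
        if supports_simultaneous:
            schedule.append(("first", "second"))
        else:
            # Alternate in blocks of `switch_interval` frames.
            block = frame // switch_interval
            schedule.append(("first",) if block % 2 == 0 else ("second",))
    return schedule

print(acquire_frames(True, 2))                      # both cameras, every frame
print(acquire_frames(False, 4, switch_interval=2))  # alternate every 2 frames
```

Note that with an even number of blocks the alternating schedule gives each camera exactly half of the frames, which is how "the same shooting time" is preserved on terminals without simultaneous-capture support.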
  • Step S220: obtain VR image information of the shooting area from the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16.
  • The step of obtaining the VR image information of the shooting area from the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 includes:
  • determining whether the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 overlap, and, when they do, stitching and synthesizing the collected wide-angle spatial image information to obtain the VR image information of the shooting area.
  • The wide-angle lens fixed over the first camera 15 and the wide-angle lens fixed over the second camera 16 may differ.
  • For example, if the wide-angle lens over the first camera 15 is a 210-degree fisheye lens, the wide-angle lens over the second camera 16 may be a 210-degree, a 230-degree, or a 235-degree fisheye lens; the wide-angle spatial image information collected by the two cameras may then differ considerably, which is why the overlap must be checked before synthesis.
  • When overlapping portions exist, or when the overlapping portions satisfy the conditions for synthesizing VR image information, the collected wide-angle spatial image information can be stitched and synthesized, and the VR image information of the shooting area is obtained and stored in the memory 11.
  • Image stitching can combine the overlapping wide-angle spatial image information into seamless, high-resolution image information that completely presents the specific information of the shooting area, that is, the VR image information of the shooting area; the VR image information corresponds to the wide-angle spatial image information and includes VR pictures or VR videos.
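The overlap-then-merge idea behind step S220 can be illustrated on 1-D "images" represented as numpy arrays. Real spherical stitching of fisheye images involves projection, alignment and blending and is far more involved; this toy sketch only shows the principle of finding an overlap and refusing to synthesize when none exists:

```python
import numpy as np

def find_overlap(a: np.ndarray, b: np.ndarray) -> int:
    """Return the largest k such that the last k samples of `a`
    equal the first k samples of `b` (0 if no overlap exists)."""
    for k in range(min(len(a), len(b)), 0, -1):
        if np.array_equal(a[-k:], b[:k]):
            return k
    return 0

def stitch(a: np.ndarray, b: np.ndarray):
    """Merge two overlapping strips into one seamless strip, or return
    None when no overlap exists (no VR image can be synthesized)."""
    k = find_overlap(a, b)
    if k == 0:
        return None
    return np.concatenate([a, b[k:]])

front = np.array([1, 2, 3, 4, 5])
back = np.array([4, 5, 6, 7])
print(stitch(front, back))   # overlap of [4, 5] is detected and deduplicated
```

The None branch corresponds to the case discussed above where lenses of different angular coverage fail to overlap over more than 180 degrees, so no VR image information can be formed.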
  • Step S230: display the VR image information of the shooting area on the mobile terminal 10.
  • The mobile terminal 10 may acquire the VR image information of the shooting area from the memory 11 and display it.
  • The manner in which the mobile terminal 10 displays the VR image information may be, but is not limited to, a 3D rotation display mode, a VR mode viewable through VR glasses, a two-dimensional unfolded mode, a hexahedral mode, and the like.
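Selecting among the display modes listed above amounts to a simple dispatch. The mode keys are illustrative identifiers for the modes named in the text, and the rendering lambdas are hypothetical placeholders for the actual renderers:

```python
# Hypothetical dispatch over the display modes named in the text.
DISPLAY_MODES = {
    "3d_rotation": lambda vr: f"3D rotating view of {vr}",
    "vr_glasses": lambda vr: f"side-by-side stereo view of {vr}",
    "2d_unfolded": lambda vr: f"equirectangular unfolding of {vr}",
    "hexahedral": lambda vr: f"six cube faces of {vr}",
}

def display(vr_image: str, mode: str) -> str:
    """Render the VR image information in the requested display mode."""
    if mode not in DISPLAY_MODES:
        raise ValueError(f"unknown display mode: {mode}")
    return DISPLAY_MODES[mode](vr_image)

print(display("vr_frame_001", "hexahedral"))
```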
  • The mobile terminal 10 may further include an NFC card reader for reading the information of an NFC (Near Field Communication) card.
  • An NFC card may be disposed on the fixing structure, and the NFC card reader is disposed in the mobile terminal 10.
  • Referring to FIG. 3, which is a schematic flowchart of the sub-steps included in step S230 of FIG. 2:
  • the step S230 of displaying the VR image information of the shooting area on the mobile terminal 10 may include the following sub-steps.
  • Sub-step S231: the mobile terminal 10 reads the information of the NFC card on the fixing structure through the NFC card reader and determines whether the information of the NFC card is legal.
  • Sub-step S232: when the information of the NFC card is legal, the VR image information of the shooting area is displayed on the mobile terminal 10.
  • The legality of the information of the NFC card marks the legality of the fixing structure.
  • The mobile terminal 10 displays the VR image information of the shooting area only after the legality of the fixing structure is confirmed.
  • When no NFC card embodying the legality of the fixing structure is disposed on the fixing structure, the mobile terminal 10 can directly display the VR image information of the shooting area.
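The NFC-gated display of sub-steps S231/S232 reduces to the following decision. The tag-ID format and the allow-list are invented for illustration; a real implementation would use the platform's NFC API to read the card:

```python
# Hypothetical IDs of genuine fixing structures (not from the patent).
LEGAL_NFC_IDS = {"04:A2:2B:11"}

def decide_display(nfc_tag_id):
    """Sub-steps S231/S232: display the VR image information only when the
    fixing structure's NFC card is legal; display directly when the fixing
    structure carries no NFC card (`nfc_tag_id` is None).
    """
    if nfc_tag_id is None:
        return "display"     # no card on the fixing structure: show directly
    if nfc_tag_id in LEGAL_NFC_IDS:
        return "display"     # legal card: fixing structure confirmed genuine
    return "reject"          # illegal card: withhold the VR image

print(decide_display("04:A2:2B:11"))
print(decide_display(None))
print(decide_display("de:ad:be:ef"))
```

This matches the text's behavior: the legality check only gates display when a card is actually present on the fixing structure.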
  • During shooting, the mobile terminal 10 can acquire the wide-angle spatial image information of the shooting area while held by hand, mounted on a floor-standing phone holder, or fixed on a selfie stick.
  • FIG. 4 is another schematic flowchart of a VR shooting method applied to the mobile terminal 10 shown in FIG. 1 according to a preferred embodiment of the present invention.
  • the VR shooting method may further include:
  • Step S240: send the VR image information of the shooting area to another mobile terminal 10.
  • The mobile terminal 10 may send the generated VR image information to other mobile terminals 10 for sharing through a network, which may be a wired network or a wireless network.
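Step S240's sharing over the network could be as simple as an HTTP upload. The peer endpoint URL and payload layout below are assumptions for illustration; the sketch only builds the request and does not send it:

```python
import json
import urllib.request

def build_share_request(vr_image_info: dict, peer_url: str):
    """Step S240: package the VR image information for sending to another
    mobile terminal over the network (the request is built, not sent here)."""
    payload = json.dumps(vr_image_info).encode("utf-8")
    return urllib.request.Request(
        peer_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_share_request(
    {"type": "vr_picture", "shooting_area": "demo"},
    "http://peer-terminal.local/share",   # hypothetical receiving terminal
)
print(req.get_method(), req.full_url)
```

Actually transmitting the request (e.g. with `urllib.request.urlopen(req)`) would cover both the wired and wireless cases mentioned in the text, since the transport is transparent at this layer.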
  • FIG. 5 is a block diagram of the VR photographing system 100 shown in FIG. 1 according to a preferred embodiment of the present invention.
  • The VR shooting system 100 is applied to the mobile terminal 10, which includes a first camera 15 and a second camera 16 facing in opposite directions, and wide-angle lenses respectively fixed over the first camera 15 and the second camera 16.
  • the VR shooting system 100 can include a control module 110, a synthesizing module 120, and a display module 130.
  • the control module 110 is configured to control the first camera 15 and the second camera 16 to acquire wide-angle spatial image information of a shooting area through the wide-angle lens, respectively.
  • The manner in which the control module 110 controls the first camera 15 and the second camera 16 to acquire the wide-angle spatial image information of the shooting area through the wide-angle lenses includes:
  • controlling the first camera 15 and the second camera 16 to acquire wide-angle spatial image information of the shooting area with the same shooting time within a preset shooting period.
  • The manner in which the control module 110 controls the first camera 15 and the second camera 16 to acquire the wide-angle spatial image information with the same shooting time within the preset shooting period includes:
  • when the mobile terminal 10 supports simultaneous acquisition of the wide-angle spatial image information by the first camera 15 and the second camera 16, controlling the two cameras to acquire wide-angle spatial image information of the shooting area simultaneously;
  • otherwise, controlling the first camera 15 and the second camera 16 to acquire wide-angle spatial image information of the shooting area alternately at equal time intervals.
  • For details of the process performed by the control module 110, reference may be made to the detailed description of step S210 shown in FIG. 2.
  • the synthesizing module 120 is configured to obtain VR image information of the photographing area according to the wide-angle space image information collected by the first camera 15 and the wide-angle space image information collected by the second camera 16 .
  • The manner in which the synthesizing module 120 obtains the VR image information of the shooting area from the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 includes:
  • stitching and synthesizing the collected wide-angle spatial image information to obtain the VR image information of the shooting area.
  • For details of the process performed by the synthesizing module 120, reference may be made to the detailed description of step S220 shown in FIG. 2.
  • the display module 130 is configured to display VR image information of the shooting area in the mobile terminal 10.
  • the mobile terminal 10 may further include an NFC card reader for reading information of the NFC card.
  • An NFC card may be disposed on the fixed structure, and the NFC card reader is disposed in the mobile terminal 10.
  • FIG. 6 is another block diagram of the VR photographing system 100 shown in FIG. 1 according to a preferred embodiment of the present invention.
  • the display module 130 may include a determination sub-module 131 and a display sub-module 132.
  • the determining sub-module 131 is configured to read information of the NFC card on the fixed structure by using the NFC card reader, and determine whether the information of the NFC card is legal.
  • the display sub-module 132 is configured to display VR image information of the shooting area in the mobile terminal 10 when the information of the NFC card is legal.
  • When the NFC card is disposed on the fixing structure, the display sub-module 132 displays the VR image information of the shooting area only after the determining sub-module 131 confirms that the information of the NFC card is legal, that is, that the fixing structure is legal.
  • When no NFC card is disposed on the fixing structure, the determining sub-module 131 does not work, and the display sub-module 132 directly displays the VR image information of the shooting area.
  • the VR photographing system 100 may further include a transmitting module 140 .
  • the sending module 140 is configured to send VR image information of the shooting area to other mobile terminals 10.
  • In summary, the VR shooting method controls the first camera and the second camera to respectively acquire wide-angle spatial image information of a shooting area through wide-angle lenses detachably fixed on the mobile terminal, determines whether the wide-angle spatial image information collected by the first camera and that collected by the second camera overlap, stitches and synthesizes the collected wide-angle spatial image information to obtain the VR image information of the shooting area, and displays the VR image information of the shooting area on the mobile terminal, thereby providing users with a cost-effective, mobile-terminal-based VR shooting system that allows real-time observation of the captured VR image and ensures the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Provided are a VR shooting method and system and a mobile terminal. The method includes: controlling a first camera and a second camera to respectively acquire, through wide-angle lenses, wide-angle spatial image information of a shooting area; obtaining VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera; and displaying the VR image information of the shooting area on the mobile terminal. The VR shooting method, system and mobile terminal provide users with a cost-effective, mobile-terminal-based VR shooting system that allows real-time observation of the captured VR image and ensures the user experience.

Description

VR shooting method and system, and mobile terminal
This application claims priority to Chinese patent application No. CN201611249548.5, entitled "VR shooting method, system and mobile terminal", filed with the Chinese Patent Office on December 29, 2016, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of VR shooting technology, and in particular to a VR shooting method and system and a mobile terminal.
Background Art
With the in-depth development of Virtual Reality (VR) technology, VR production techniques have matured, but devices dedicated to VR shooting remain few in number and expensive to build, so VR has not yet become common in everyday life.
At present, VR shooting devices are separate from VR generating devices and VR viewing devices. After recording a spatial image, the VR shooting device transmits it over a network to the VR generating device, which generates the VR image; the VR image is then watched through the VR viewing device. The effect of the captured VR image therefore cannot be observed in real time, which harms the user's shooting and viewing experience.
Summary of the Invention
To overcome the above shortcomings of the prior art, an object of the embodiments of the present invention is to provide a cost-effective VR shooting method, system and mobile terminal that allow the captured VR image to be observed in real time.
As to the VR shooting method, a preferred embodiment of the present invention provides a VR shooting method applied to a mobile terminal that includes a first camera and a second camera facing in opposite directions, and wide-angle lenses respectively fixed over the first camera and the second camera by a fixing structure, where the first camera and the second camera each include at least one single camera. The method includes: controlling the first camera and the second camera to respectively acquire, through the wide-angle lenses, wide-angle spatial image information of a shooting area; obtaining VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera; and displaying the VR image information of the shooting area on the mobile terminal.
As to the VR shooting system, a preferred embodiment of the present invention provides a VR shooting system applied to a mobile terminal that includes a first camera and a second camera facing in opposite directions, and wide-angle lenses respectively fixed over the first camera and the second camera by a fixing structure, where the first camera and the second camera each include at least one single camera. The system includes: a control module configured to control the first camera and the second camera to respectively acquire, through the wide-angle lenses, wide-angle spatial image information of a shooting area; a synthesizing module configured to obtain VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera; and a display module configured to display the VR image information of the shooting area on the mobile terminal.
As to the mobile terminal, a preferred embodiment of the present invention further provides a mobile terminal that includes: a memory, a processor, a first camera and a second camera facing in opposite directions, wide-angle lenses respectively fixed over the first camera and the second camera by a fixing structure, and a VR shooting system. The first camera and the second camera each include at least one single camera, the wide-angle lenses are detachably fixed on the mobile terminal by the fixing structure, and the VR shooting system is installed or stored in the memory, with the processor controlling the execution of each functional module of the VR shooting system.
Compared with the prior art, the VR shooting method, system and mobile terminal provided by the embodiments of the present invention have the following beneficial effects: the VR shooting method controls the first camera and the second camera to acquire wide-angle spatial image information of a shooting area through wide-angle lenses detachably fixed on the mobile terminal, obtains VR image information of the shooting area from the wide-angle spatial image information collected by the first camera and that collected by the second camera, and displays the VR image information of the shooting area on the mobile terminal, thereby providing users with a cost-effective, mobile-terminal-based VR shooting system that allows real-time observation of the captured VR image and ensures the user experience.
To make the above objects, features and advantages of the present invention clearer and easier to understand, preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly introduced below. It should be understood that the following drawings show only certain embodiments of the present invention and should therefore not be regarded as limiting its scope; a person of ordinary skill in the art can derive other related drawings from them without creative effort.
FIG. 1 is a block diagram of a mobile terminal according to a preferred embodiment of the present invention.
FIG. 2 is a schematic flowchart of a VR shooting method applied to the mobile terminal shown in FIG. 1 according to a preferred embodiment of the present invention.
FIG. 3 is a schematic flowchart of the sub-steps included in step S230 of FIG. 2.
FIG. 4 is another schematic flowchart of a VR shooting method applied to the mobile terminal shown in FIG. 1 according to a preferred embodiment of the present invention.
FIG. 5 is a block diagram of the VR shooting system shown in FIG. 1 according to a preferred embodiment of the present invention.
FIG. 6 is another block diagram of the VR shooting system shown in FIG. 1 according to a preferred embodiment of the present invention.
Reference numerals: 10-mobile terminal; 100-VR shooting system; 11-memory; 12-storage controller; 13-processor; 14-peripheral interface; 15-first camera; 16-second camera; 110-control module; 120-synthesizing module; 130-display module; 131-determining sub-module; 132-display sub-module; 140-sending module.
Detailed Description of the Embodiments
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the drawings here, may be arranged and designed in a variety of different configurations.
Accordingly, the following detailed description of the embodiments of the present invention provided in the drawings is not intended to limit the claimed scope of the invention, but merely represents selected embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings. In the description of the present invention, the terms "first", "second", "third" and the like are used only to distinguish the descriptions and should not be understood as indicating or implying relative importance.
Referring to FIG. 1, which is a block diagram of a mobile terminal 10 according to a preferred embodiment of the present invention, the mobile terminal 10 may include a VR shooting system 100, a memory 11, a storage controller 12, a processor 13, a peripheral interface 14, a first camera 15 and a second camera 16. In this embodiment of the invention, the mobile terminal 10 may be, but is not limited to, a smart phone, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), a mobile Internet device (MID), or the like. Preferably, the mobile terminal 10 is a smart phone.
In this embodiment, the memory 11, the storage controller 12, the processor 13, the peripheral interface 14, the first camera 15 and the second camera 16 are electrically connected to one another, directly or indirectly, to implement data transmission or interaction; for example, these components may be electrically connected via one or more communication buses or signal lines. The VR shooting system 100 includes at least one software functional module that can be stored in the memory 11 in the form of software or firmware, or embedded in the operating system (OS) of the mobile terminal 10. In this embodiment, the operating system of the mobile terminal 10 may be, but is not limited to, the Android system, the iOS (iPhone operating system) system, or the like. The processor 13 is configured to execute executable modules stored in the memory 11, such as the software functional modules and computer programs included in the VR shooting system 100.
The memory 11 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or the like. The memory 11 is configured to store a program, and the processor 13 executes the program after receiving an execution instruction; the program includes a computer-readable medium carrying non-volatile program code executable by a processor, and the computer-readable medium stores a computer program for performing the above functions defined in the method of the present invention. Access to the memory 11 by the processor 13 and other possible components is performed under the control of the storage controller 12. In this embodiment, the memory 11 may store all the image information, including pictures and videos, obtained by the mobile terminal 10 through VR shooting.
The processor 13 may be an integrated circuit chip with signal processing capability. The processor 13 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP) and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, which can implement or execute the methods, steps and logical block diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The peripheral interface 14 couples various input/output devices to the processor 13 and the memory 11. In some embodiments, the peripheral interface 14, the processor 13 and the storage controller 12 may be implemented in a single chip; in other instances, they may be implemented by separate chips.
The first camera 15 and the second camera 16 are cameras of the mobile terminal 10 that face in opposite directions and are used to obtain image information within a shooting area. The first camera 15 and the second camera 16 each include at least one single camera, and the numbers of cameras they include may differ; for example, the first camera 15 may include one single camera while the second camera 16 includes two cameras. In this embodiment, the first camera 15 and the second camera 16 may be, but are not limited to, digital cameras, analog cameras, or the like.
In this embodiment of the invention, the mobile terminal 10 may further include wide-angle lenses respectively fixed over the first camera 15 and the second camera 16, and a fixing structure for fixing the wide-angle lenses. In this embodiment, the wide-angle lenses are detachably fixed on the mobile terminal 10 by the fixing structure, and the mobile terminal 10 can control the first camera 15 and the second camera 16 to respectively acquire, through the wide-angle lenses, wide-angle spatial image information of the shooting area, the wide-angle spatial image information including wide-angle spatial pictures or wide-angle spatial videos. In this embodiment, the wide-angle lens includes a fisheye lens with a wide-angle range of more than 180 degrees, a standard lens and the like, and is preferably a fisheye lens with a wide-angle range of more than 180 degrees, for example a 210-degree, 230-degree or 235-degree fisheye lens. The fixing structure may be, but is not limited to, a phone card slot, a phone case, or the like.
It can be understood that the structure shown in FIG. 1 is only a schematic structural diagram of the mobile terminal 10; the mobile terminal 10 may include more or fewer components than shown in FIG. 1, or have a configuration different from that shown in FIG. 1. The components shown in FIG. 1 may be implemented in hardware, software, or a combination thereof.
请参照图2,是本发明较佳实施例提供的应用于图1所示的移动终端10的VR拍摄方法的一种流程示意图。所述移动终端10包括朝向相反的第一摄像头15和第二摄像头16,及分别固定于所述第一摄像头15和所述第二摄像头16上的广角镜头,其中,所述第一摄像头15和第二摄像头16分别包括至少一个单摄像头。以下对所述方法的具体流程进行描述。
在本发明实施例中,所述VR拍摄方法可以包括以下步骤:
步骤S210,控制第一摄像头15和第二摄像头16分别通过广角镜头采集拍摄区域的广角空间图像信息。
在本实施例中,所述广角空间图像信息为摄像头在拍摄区域获取到的视野较为广阔的空间图像信息,所述广角空间图像信息包括广角空间图片、广角空间视频等,所述广角空间图像信息可存储在所述存储器11中。在本发明实施例中,所述广角镜头优选为广角度数超过180度的鱼眼镜头,比如210度鱼眼镜头、230度鱼眼镜头及235度鱼眼镜头等。
In this embodiment, the step in which the mobile terminal 10 controls the first camera 15 and the second camera 16 to each collect wide-angle spatial image information of the shooting area through its wide-angle lens includes:
controlling the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area with the same shooting time within a preset shooting period.
In this embodiment, to avoid a large error between the wide-angle spatial image information of the shooting area collected by the first camera 15 and that collected by the second camera 16, the two cameras need to be controlled to have the same shooting time within the preset shooting period, so as to ensure the accuracy of the collected wide-angle spatial image information.
In this embodiment, the step in which the mobile terminal 10 controls the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area with the same shooting time within the preset shooting period includes:
detecting whether the mobile terminal 10 supports simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16;
when the mobile terminal 10 supports simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16, controlling the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area simultaneously;
when the mobile terminal 10 does not support simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16, controlling the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area alternately at equal time intervals.
In this embodiment, owing to differing configurations, some mobile terminals 10 can support simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16 within the preset shooting period, while others cannot. It is therefore necessary to detect whether the mobile terminal 10 supports such simultaneous collection.
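The support check and the resulting choice between the two capture strategies can be sketched as follows. This is a hypothetical illustration rather than part of the disclosure, and `supports_concurrent_cameras` stands in for whatever capability query the terminal's camera API actually provides:

```python
from enum import Enum

class CaptureMode(Enum):
    SIMULTANEOUS = "simultaneous"  # both cameras capture at the same time
    ALTERNATING = "alternating"    # cameras switch at equal time intervals

def select_capture_mode(supports_concurrent_cameras: bool) -> CaptureMode:
    """Choose the capture strategy described above: simultaneous capture
    when the terminal can drive both cameras at once, otherwise
    alternating capture at equal time intervals."""
    if supports_concurrent_cameras:
        return CaptureMode.SIMULTANEOUS
    return CaptureMode.ALTERNATING
```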
When the detection result indicates that the mobile terminal 10 does not support simultaneous collection by the first camera 15 and the second camera 16, the mobile terminal 10 may control the two cameras to switch shooting at equal time intervals to collect the wide-angle spatial image information of the shooting area. When the wide-angle spatial image information takes the form of pictures, the shooting times of the first camera 15 and the second camera 16 must be kept within equal intervals; when it takes the form of video, the first camera 15 and the second camera 16 have the same recording time, which is the switch interval set by the mobile terminal 10. In this embodiment, the switch interval may be one frame, two frames, or more.
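For the video case, the equal-interval switching can be sketched as a frame-assignment schedule; the function is a hypothetical illustration of the behavior described above:

```python
def alternating_schedule(total_frames: int, interval_frames: int) -> list:
    """Assign each frame index to camera 1 or camera 2, switching every
    `interval_frames` frames so both cameras accumulate the same
    recording time over the shooting period."""
    if interval_frames < 1:
        raise ValueError("the switch interval must be at least one frame")
    return [1 if (frame // interval_frames) % 2 == 0 else 2
            for frame in range(total_frames)]
```

With a one-frame interval, `alternating_schedule(6, 1)` yields `[1, 2, 1, 2, 1, 2]`; with a two-frame interval the cameras swap after every pair of frames, and over any whole number of switch cycles the two cameras receive the same number of frames.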
When the detection result indicates that the mobile terminal 10 supports simultaneous collection by the first camera 15 and the second camera 16, the mobile terminal 10 may control the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area simultaneously.
Of course, even when the detection result indicates that the mobile terminal 10 supports simultaneous collection, the mobile terminal 10 may still control the first camera 15 and the second camera 16 to switch shooting at equal time intervals to collect the wide-angle spatial image information of the shooting area.
Step S220: obtain VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera 15 and the wide-angle spatial image information collected by the second camera 16.
In this embodiment, the step of obtaining the VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 includes:
determining whether the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 overlap; when they overlap, stitching and compositing the collected wide-angle spatial image information to obtain the VR image information of the shooting area.
In this embodiment, the wide-angle lens fixed on the first camera 15 and the wide-angle lens fixed on the second camera 16 may differ. For example, if the lens fixed on the first camera 15 is a 210-degree fisheye lens, the lens fixed on the second camera 16 may be a 210-degree, 230-degree, or 235-degree fisheye lens. The wide-angle spatial image information collected by the two cameras may then differ considerably, so that the two sets of image information fail to overlap over a space exceeding 180 degrees, or the overlap that does form cannot support converting the collected image information into VR image information. It is therefore necessary to determine whether the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 overlap.
In this embodiment, the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16, both stored in the memory 11, are compared to find their mutually overlapping portions. When overlapping portions exist, or when the overlapping portions satisfy the VR image compositing condition, the collected wide-angle spatial image information can be stitched and composited to obtain the VR image information of the shooting area, which is then stored in the memory 11. The stitching joins the overlapping wide-angle spatial image information into a complete, seamless, high-resolution image presenting the specific information of the shooting area, i.e., the VR image information of the shooting area. The VR image information corresponds to the wide-angle spatial image information and includes VR pictures or VR videos.
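The geometric precondition for such an overlap, given only the two lenses' fields of view, can be sketched as follows. This is a simplified back-to-back model offered for illustration; the determination described above compares actual image content, and the 5-degree minimum is an assumed threshold, not part of the disclosure:

```python
def fisheye_overlap_degrees(fov1_deg: float, fov2_deg: float) -> float:
    """Width, in degrees, of the ring of scene content shared by two
    back-to-back fisheye lenses. Each lens reaches fov/2 degrees past
    its own optical axis, and the seam lies 90 degrees from either axis,
    so the shared ring is (fov1/2 - 90) + (fov2/2 - 90) degrees wide;
    zero or a negative value means the sphere is not fully covered."""
    return fov1_deg / 2.0 + fov2_deg / 2.0 - 180.0

def can_stitch(fov1_deg: float, fov2_deg: float,
               min_overlap_deg: float = 5.0) -> bool:
    """True when the shared ring is wide enough for seam blending."""
    return fisheye_overlap_degrees(fov1_deg, fov2_deg) >= min_overlap_deg
```

Two 210-degree lenses share a 30-degree ring, whereas a pair of exactly 180-degree lenses leaves no overlap at all, which is consistent with the preference above for fisheye lenses whose field of view exceeds 180 degrees.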
Step S230: display the VR image information of the shooting area on the mobile terminal 10.
In this embodiment, the mobile terminal 10 may retrieve the VR image information of the shooting area from the memory 11 and display it. The display mode of the mobile terminal 10 may be, but is not limited to, a 3D rotation mode, a VR mode viewable through VR glasses, a two-dimensional unfolded mode, a hexahedral (cube-map) mode, and the like.
In this embodiment of the present invention, the mobile terminal 10 may further include an NFC (Near Field Communication) card reader for reading the information of an NFC card. An NFC card may be provided on the fixing structure, and the NFC card reader is provided in the mobile terminal 10.
In one implementation of this embodiment, an NFC card is provided on the fixing structure. Referring to FIG. 3, which is a schematic flowchart of the sub-steps of step S230 in FIG. 2, the step S230 of displaying the VR image information of the shooting area on the mobile terminal 10 may include:
Sub-step S231: the mobile terminal 10 reads, through the NFC card reader, the information of the NFC card on the fixing structure, and determines whether the information of the NFC card is legitimate.
Sub-step S232: when the information of the NFC card is legitimate, display the VR image information of the shooting area on the mobile terminal 10.
In this embodiment, the legitimacy of the NFC card's information indicates the legitimacy of the fixing structure. When an NFC card is present on the fixing structure, the mobile terminal 10 displays the VR image information of the shooting area only after the legitimacy of the fixing structure has been confirmed.
In another implementation of this embodiment, when no NFC card reflecting the legitimacy of the fixing structure is provided on it, the mobile terminal 10 may directly display the VR image information of the shooting area.
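The two display implementations (with and without an NFC card on the fixing structure) can be sketched as a single gating function; the names are hypothetical and the return value stands in for the actual rendering step:

```python
def display_vr_image(vr_image, fixture_has_nfc_card: bool,
                     nfc_info_is_legitimate=None):
    """Gate display on NFC validation only when the lens fixing structure
    carries an NFC card; with no card present, display directly."""
    if fixture_has_nfc_card:
        # Only a fixture whose NFC card information checks out may display.
        if nfc_info_is_legitimate is None or not nfc_info_is_legitimate():
            return None  # illegitimate fixture: refuse to display
    return vr_image  # stand-in for rendering on the terminal's screen
```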
In this embodiment, the mobile terminal 10 may collect wide-angle spatial image information of the shooting area while held by hand, mounted on a floor-standing phone stand, or attached to a selfie stick.
Referring to FIG. 4, which is another schematic flowchart of the VR shooting method applied to the mobile terminal 10 shown in FIG. 1 according to a preferred embodiment of the present invention. The VR shooting method may further include:
Step S240: send the VR image information of the shooting area to another mobile terminal 10.
In this embodiment, the mobile terminal 10 may send the generated VR image information over a network, wired or wireless, to other mobile terminals 10 for sharing.
Referring to FIG. 5, which is a block diagram of the VR shooting system 100 shown in FIG. 1 according to a preferred embodiment of the present invention. The VR shooting system 100 is applied to the mobile terminal 10, which includes the first camera 15 and the second camera 16 facing in opposite directions, and wide-angle lenses respectively fixed on the first camera 15 and the second camera 16, where the first camera 15 and the second camera 16 each include at least one single camera. The VR shooting system 100 may include a control module 110, a compositing module 120, and a display module 130.
The control module 110 is configured to control the first camera 15 and the second camera 16 to each collect wide-angle spatial image information of the shooting area through its wide-angle lens.
In this embodiment, the manner in which the control module 110 controls the first camera 15 and the second camera 16 to each collect wide-angle spatial image information of the shooting area through its wide-angle lens includes:
controlling the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area with the same shooting time within a preset shooting period.
Specifically, the manner in which the control module 110 controls the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area with the same shooting time within the preset shooting period includes:
detecting whether the mobile terminal 10 supports simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16;
when the mobile terminal 10 supports simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16, controlling the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area simultaneously;
when the mobile terminal 10 does not support simultaneous collection of wide-angle spatial image information by the first camera 15 and the second camera 16, controlling the first camera 15 and the second camera 16 to collect wide-angle spatial image information of the shooting area alternately at equal time intervals.
In this embodiment, for the process performed by the control module 110, reference may be made to the detailed description of step S210 shown in FIG. 2.
The compositing module 120 is configured to obtain the VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera 15 and the wide-angle spatial image information collected by the second camera 16.
In this embodiment, the manner in which the compositing module 120 obtains the VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 includes:
determining whether the wide-angle spatial image information collected by the first camera 15 and that collected by the second camera 16 overlap; when they overlap, stitching and compositing the collected wide-angle spatial image information to obtain the VR image information of the shooting area.
In this embodiment, for the process performed by the compositing module 120, reference may be made to the detailed description of step S220 shown in FIG. 2.
The display module 130 is configured to display the VR image information of the shooting area on the mobile terminal 10.
In this embodiment, the mobile terminal 10 may further include an NFC card reader for reading the information of an NFC card. An NFC card may be provided on the fixing structure, and the NFC card reader is provided in the mobile terminal 10.
Referring to FIG. 6, which is another block diagram of the VR shooting system 100 shown in FIG. 1 according to a preferred embodiment of the present invention. In this embodiment, the display module 130 may include a determination sub-module 131 and a display sub-module 132.
The determination sub-module 131 is configured to read, through the NFC card reader, the information of the NFC card on the fixing structure, and to determine whether the information of the NFC card is legitimate.
The display sub-module 132 is configured to display the VR image information of the shooting area on the mobile terminal 10 when the information of the NFC card is legitimate.
In one implementation of this embodiment, when an NFC card is provided on the fixing structure, the display sub-module 132 displays the VR image information of the shooting area only after the determination sub-module 131 has confirmed that the NFC card's information is legitimate, i.e., that the fixing structure is legitimate.
In another implementation of this embodiment, when no NFC card reflecting the legitimacy of the fixing structure is provided on it, the determination sub-module 131 does not operate, and the display sub-module 132 directly displays the VR image information of the shooting area.
Referring again to FIG. 6, the VR shooting system 100 may further include a sending module 140.
The sending module 140 is configured to send the VR image information of the shooting area to other mobile terminals 10.
In summary, the embodiments of the present invention provide a VR shooting method, system, and mobile terminal. The VR shooting method controls the first camera and the second camera to each collect wide-angle spatial image information of the shooting area through a wide-angle lens detachably fixed on the mobile terminal, determines whether the wide-angle spatial image information collected by the two cameras overlaps, stitches and composites the collected wide-angle spatial image information to obtain VR image information of the shooting area, and displays that VR image information on the mobile terminal. This provides users with a cost-effective, mobile-terminal-based VR shooting system, enables real-time viewing of the captured VR imagery, and ensures the user experience.
The above are merely preferred embodiments of the present invention and are not intended to limit it; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (20)

  1. A VR shooting method, applied to a mobile terminal, the mobile terminal comprising a first camera and a second camera facing in opposite directions, and wide-angle lenses respectively fixed on the first camera and the second camera through a fixing structure, wherein the first camera and the second camera each comprise at least one single camera, the method comprising:
    controlling the first camera and the second camera to each collect wide-angle spatial image information of a shooting area through its wide-angle lens;
    obtaining VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera and the wide-angle spatial image information collected by the second camera; and
    displaying the VR image information of the shooting area on the mobile terminal.
  2. The VR shooting method according to claim 1, further comprising:
    sending the VR image information of the shooting area to another mobile terminal.
  3. The VR shooting method according to claim 1, wherein the step of controlling the first camera and the second camera to each collect wide-angle spatial image information of the shooting area through its wide-angle lens comprises:
    controlling the first camera and the second camera to collect wide-angle spatial image information of the shooting area with the same shooting time within a preset shooting period.
  4. The VR shooting method according to claim 3, wherein the step of controlling the first camera and the second camera to collect wide-angle spatial image information of the shooting area with the same shooting time within the preset shooting period comprises:
    detecting whether the mobile terminal supports simultaneous collection of wide-angle spatial image information by the first camera and the second camera;
    when the mobile terminal supports simultaneous collection of wide-angle spatial image information by the first camera and the second camera, controlling the first camera and the second camera to collect wide-angle spatial image information of the shooting area simultaneously; and
    when the mobile terminal does not support simultaneous collection of wide-angle spatial image information by the first camera and the second camera, controlling the first camera and the second camera to collect wide-angle spatial image information of the shooting area alternately at equal time intervals.
  5. The VR shooting method according to claim 1, wherein the step of obtaining the VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera and the wide-angle spatial image information collected by the second camera comprises:
    determining whether the wide-angle spatial image information collected by the first camera and the wide-angle spatial image information collected by the second camera overlap, and when they overlap, stitching and compositing the collected wide-angle spatial image information to obtain the VR image information of the shooting area.
  6. The VR shooting method according to any one of claims 1 to 5, wherein the wide-angle lens comprises a fisheye lens with a field of view exceeding 180 degrees.
  7. The VR shooting method according to claim 4, wherein, when the wide-angle spatial image information takes the form of pictures, the shooting times of the first camera and the second camera are kept within equal intervals; and when the wide-angle spatial image information takes the form of video, the first camera and the second camera have the same recording time, the recording time being the switch interval set by the mobile terminal.
  8. The VR shooting method according to claim 7, wherein the switch interval is one frame or a plurality of frames.
  9. The VR shooting method according to claim 7, wherein the wide-angle lens fixed on the first camera is different from the wide-angle lens fixed on the second camera.
  10. A VR shooting system, applied to a mobile terminal, the mobile terminal comprising a first camera and a second camera facing in opposite directions, and wide-angle lenses respectively fixed on the first camera and the second camera through a fixing structure, wherein the first camera and the second camera each comprise at least one single camera, the system comprising:
    a control module, configured to control the first camera and the second camera to each collect wide-angle spatial image information of a shooting area through its wide-angle lens;
    a compositing module, configured to obtain VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera and the wide-angle spatial image information collected by the second camera; and
    a display module, configured to display the VR image information of the shooting area on the mobile terminal.
  11. The VR shooting system according to claim 10, further comprising:
    a sending module, configured to send the VR image information of the shooting area to another mobile terminal.
  12. The VR shooting system according to claim 10, wherein the control module is configured to:
    control the first camera and the second camera to collect wide-angle spatial image information of the shooting area with the same shooting time within a preset shooting period.
  13. The VR shooting system according to claim 12, wherein the control module is further configured to:
    detect whether the mobile terminal supports simultaneous collection of wide-angle spatial image information by the first camera and the second camera;
    when the mobile terminal supports simultaneous collection of wide-angle spatial image information by the first camera and the second camera, control the first camera and the second camera to collect wide-angle spatial image information of the shooting area simultaneously; and
    when the mobile terminal does not support simultaneous collection of wide-angle spatial image information by the first camera and the second camera, control the first camera and the second camera to collect wide-angle spatial image information of the shooting area alternately at equal time intervals.
  14. The VR shooting system according to claim 10, wherein the compositing module is configured to:
    determine whether the wide-angle spatial image information collected by the first camera and the wide-angle spatial image information collected by the second camera overlap, and when they overlap, stitch and composite the collected wide-angle spatial image information to obtain the VR image information of the shooting area.
  15. The VR shooting system according to any one of claims 10 to 14, wherein the wide-angle lens comprises a fisheye lens with a field of view exceeding 180 degrees.
  16. The VR shooting system according to claim 13, wherein, when the wide-angle spatial image information takes the form of pictures, the shooting times of the first camera and the second camera are kept within equal intervals; and when the wide-angle spatial image information takes the form of video, the first camera and the second camera have the same recording time, the recording time being the switch interval set by the mobile terminal.
  17. The VR shooting system according to claim 16, wherein the switch interval is one frame or a plurality of frames.
  18. The VR shooting system according to claim 16, wherein the wide-angle lens fixed on the first camera is different from the wide-angle lens fixed on the second camera.
  19. A mobile terminal, comprising:
    a memory;
    a processor;
    a first camera and a second camera facing in opposite directions, wherein the first camera and the second camera each comprise at least one single camera;
    wide-angle lenses respectively fixed on the first camera and the second camera through a fixing structure; and
    a VR shooting system, installed in the memory and comprising one or more software functional modules executed by the processor, the system comprising:
    a control module, configured to control the first camera and the second camera to each collect wide-angle spatial image information of a shooting area through its wide-angle lens;
    a compositing module, configured to obtain VR image information of the shooting area according to the wide-angle spatial image information collected by the first camera and the wide-angle spatial image information collected by the second camera; and
    a display module, configured to display the VR image information of the shooting area on the mobile terminal.
  20. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method according to any one of claims 1 to 9.
PCT/CN2017/073104 2016-12-29 2017-02-08 VR shooting method, system and mobile terminal WO2018120353A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611249548.5A CN106791674A (zh) 2016-12-29 2016-12-29 VR shooting method, system and mobile terminal
CN201611249548.5 2016-12-29

Publications (1)

Publication Number Publication Date
WO2018120353A1 true WO2018120353A1 (zh) 2018-07-05

Family

ID=58928259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/073104 WO2018120353A1 (zh) 2016-12-29 2017-02-08 VR shooting method, system and mobile terminal

Country Status (2)

Country Link
CN (1) CN106791674A (zh)
WO (1) WO2018120353A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107172350A (zh) * 2017-06-15 2017-09-15 闻泰通讯股份有限公司 Panoramic image generation method, apparatus and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103916582A (zh) * 2013-01-07 2014-07-09 华为技术有限公司 Image processing method and apparatus
US20150138314A1 (en) * 2013-11-20 2015-05-21 Google Inc. Generating Panoramic Images
WO2016195792A1 (en) * 2015-06-02 2016-12-08 Qualcomm Incorporated Systems and methods for producing a combined view from fisheye cameras
US9521321B1 (en) * 2015-02-11 2016-12-13 360 Lab Llc. Enabling manually triggered multiple field of view image capture within a surround image mode for multi-lens mobile devices
CN205847395U (zh) * 2016-07-14 2016-12-28 幸福在线(北京)网络技术有限公司 Terminal device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210547A (zh) * 2016-09-05 2016-12-07 广东欧珀移动通信有限公司 Panoramic shooting method, apparatus and system

Also Published As

Publication number Publication date
CN106791674A (zh) 2017-05-31


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17887290; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17887290; Country of ref document: EP; Kind code of ref document: A1)