WO2020125797A1 - Terminal, shooting method and storage medium - Google Patents

Terminal, shooting method and storage medium

Info

Publication number
WO2020125797A1
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
cameras
angle
target
terminal
Prior art date
Application number
PCT/CN2019/127417
Other languages
English (en)
French (fr)
Inventor
汶晨光
Original Assignee
中兴通讯股份有限公司
Priority date
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Priority to EP19899106.9A (EP3902236A4)
Publication of WO2020125797A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/0206 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M 1/0241 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings using relative motion of the body parts to change the operational status of the telephone set, e.g. switching on/off, answering incoming call
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/52 Details of telephonic subscriber devices including functional features of a camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074 Stereoscopic image analysis
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2213/00 Details of stereoscopic systems
    • H04N 2213/001 Constructional or mechanical details
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • The present disclosure relates to, but is not limited to, the field of communications, and in particular to a terminal, a shooting method, and a storage medium.
  • Current dual-camera smartphone shooting methods enrich the shooting functions of the mobile phone and improve the quality of the captured picture.
  • However, the dual-camera shooting method still has defects. Because the positions and angles of the two cameras on the terminal are fixed in the related art, and the shooting focus is usually not at the center of the viewfinder (near the central axis of the camera), light from the focus usually enters the camera from the side. As the angle by which the focus deviates from the camera's central axis increases, the positioning error also increases, which easily leads to picture distortion. Therefore, the current dual-camera shooting mode of mobile phones limits shooting flexibility and, to a certain extent, reduces the quality of the captured picture.
  • Embodiments of the present disclosure provide a terminal, a shooting method, and a storage medium, so as to at least solve the problem of inflexible shooting of the terminal in the related art.
  • According to an embodiment of the present disclosure, a terminal is provided, including: a memory storing a computer program and a processor, the processor being configured to run the computer program to execute: determining shooting information, where the shooting information includes the position of a shooting target or a shooting angle of view; adjusting the axis directions of at least two cameras included in the terminal according to the position of the shooting target so that the at least two cameras focus on the shooting target, or adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view; and controlling the adjusted at least two cameras to collect images.
  • According to another embodiment of the present disclosure, a shooting method is provided, including: determining shooting information, where the shooting information includes the position of a shooting target or a shooting angle of view; adjusting the axis directions of at least two cameras included in the terminal according to the position of the shooting target so that the at least two cameras focus on the shooting target, or adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view; and controlling the adjusted at least two cameras to collect images.
  • a storage medium in which a computer program is stored, wherein the computer program is configured to execute the steps in any one of the above method embodiments at runtime.
  • FIG. 1 is a structural block diagram of a terminal according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart of a shooting method according to an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of the axis direction of a camera according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of the position of a camera according to an embodiment of the present disclosure.
  • FIG. 5a is a schematic front view of a flexible terminal according to an optional embodiment of the present disclosure;
  • FIG. 5b is a schematic rear view of a flexible terminal according to an optional embodiment of the present disclosure;
  • FIG. 5c is a schematic side view of a flexible terminal according to an optional embodiment of the present disclosure;
  • FIG. 5d is a schematic diagram of side bending of a flexible terminal according to an optional embodiment of the present disclosure;
  • FIG. 5e is a schematic diagram of a shooting state of a flexible terminal according to an optional embodiment of the present disclosure;
  • FIG. 6a is a schematic rear view of a folding terminal according to an optional embodiment of the present disclosure;
  • FIG. 6b is a schematic diagram of a shooting state of a folding terminal according to an optional embodiment of the present disclosure;
  • FIG. 7 is a flowchart of a shooting method of a mobile terminal according to an optional embodiment of the present disclosure;
  • FIG. 8 is a flowchart of a mobile terminal shooting a stereo-effect photo according to an optional embodiment of the present disclosure;
  • FIG. 9 is a flowchart of a mobile terminal shooting a three-dimensional dynamic picture according to an optional embodiment of the present disclosure;
  • FIG. 10 is a flowchart of a mobile terminal automatically shooting a panoramic photo according to an optional embodiment of the present disclosure;
  • FIG. 11 is a schematic diagram of a state in which a mobile terminal automatically shoots a panoramic photo according to an optional embodiment of the present disclosure;
  • FIG. 12a is a schematic diagram of an interface of a terminal displaying the pictures of two cameras in split screen according to an optional embodiment of the present disclosure;
  • FIG. 12b is a schematic diagram of an interface of a terminal displaying a single camera picture in full screen according to an optional embodiment of the present disclosure;
  • FIG. 12c is a schematic diagram of an interface of a terminal in which the "Synthesize" button is tapped to synthesize a stereo-effect picture according to an optional embodiment of the present disclosure;
  • FIG. 12d is a schematic diagram of an interface of a terminal in which the screen is tapped to hide the toolbar according to an optional embodiment of the present disclosure;
  • FIG. 12e is a schematic diagram of the split-screen display interface of a terminal when shooting a stereoscopic motion picture according to an optional embodiment of the present disclosure;
  • FIG. 12f is a schematic diagram of the single-camera picture interface of a terminal when shooting a stereoscopic motion picture according to an optional embodiment of the present disclosure;
  • FIG. 12g is a schematic diagram of the synthesized-picture viewing interface of a terminal for a stereoscopic motion picture according to an optional embodiment of the present disclosure;
  • FIG. 12h is a schematic diagram of an interface of a terminal automatically shooting a panoramic photo according to an optional embodiment of the present disclosure.
  • FIG. 1 is a structural block diagram of a terminal according to an embodiment of the present disclosure.
  • Embodiment 1 of the present disclosure provides a terminal, which may include: a camera 1 for collecting images, where the terminal includes at least two cameras 1; a memory 2 storing a computer program; and a processor 3, the processor 3 being configured to run the computer program to execute:
  • determining shooting information, where the shooting information includes the position of a shooting target or a shooting angle of view; adjusting the axis directions of the at least two cameras included in the terminal according to the position of the shooting target so that the at least two cameras focus on the shooting target, or adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view; and controlling the adjusted at least two cameras to collect images.
  • the axis direction of the camera on the terminal can be controlled by the processor, so the problem of inflexibility in taking pictures of the terminal camera in the related art can be solved, and the effect of flexible photography can be achieved.
  • FIG. 3 is a schematic diagram of the axis direction of the camera according to an embodiment of the present disclosure.
  • As shown in FIG. 3, the axis direction of a camera may refer to the orientation of the camera, that is, the direction along the straight line on which the camera's axis lies, pointing toward the subject.
  • the axis direction of the camera is focused on the shooting target, which may refer to the axis direction of the camera intersecting the shooting target.
  • the shooting target is an object, and focusing the axis direction of the camera on the shooting target may mean that the axis direction of the camera intersects the shooting object.
  • Existing smartphones use a dual-camera structure whose spatial positions are fixed and whose orientations cannot be adjusted, so the main subject deviates from the axis direction of the camera, which easily causes picture distortion. In the solution provided in this embodiment, focusing the axis directions of the cameras on the shooting target keeps the subject near the main axis of the camera at all times, which improves shooting flexibility, avoids the picture distortion caused by the subject deviating from the center of the camera's main axis, and improves the quality of the picture taken by the mobile phone.
  • In an embodiment, the processor is configured to run the computer program to determine the shooting information as follows: when the shooting information includes the position of the shooting target, after the shooting target is determined, an image of the shooting target is collected and the position of the shooting target is determined according to the scene depth information of the image.
  • In an embodiment, the processor is configured to run the computer program to adjust the axis directions of the at least two cameras according to the position of the shooting target so that the cameras focus on the shooting target, including: determining, according to the position of the shooting target, the target axis directions of the at least two cameras when they focus on the shooting target; determining an adjustment angle according to the target axis directions and the current axis directions of the at least two cameras; and adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angle.
  • It should be noted that, for example, after entering the shooting mode the user can tap the screen to determine the shooting target (the shooting focus); by detecting the distance between the lens and the shooting target, the camera can calculate, with a built-in algorithm, the angle that the cameras should form, that is, the target axis directions of the at least two cameras when focused on the shooting target can be determined. A geometric sketch of this calculation is given below.
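  • As a purely illustrative aside (not part of the disclosure), the sketch below shows one way such a target angle could be computed from the focus-point position and the camera positions; the function name, the straight-ahead initial axis, and the coordinate convention are assumptions made for the example.

```python
import math

def camera_tilt_angles(focus_xyz, cam_positions):
    """Return, for each camera, the angle (in degrees) between its assumed
    straight-ahead axis (+z, perpendicular to the terminal body) and the line
    from that camera to the focus point. Illustrative geometry only."""
    fx, fy, fz = focus_xyz
    angles = []
    for cx, cy, cz in cam_positions:
        dx, dy, dz = fx - cx, fy - cy, fz - cz
        norm = math.sqrt(dx * dx + dy * dy + dz * dz)
        angles.append(math.degrees(math.acos(dz / norm)))
    return angles

# focus point 500 mm in front of the phone, cameras 120 mm apart along the body
print(camera_tilt_angles((0.0, 0.0, 500.0), [(0.0, 60.0, 0.0), (0.0, -60.0, 0.0)]))
```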
  • When the shooting information includes the shooting angle of view, determining the shooting information includes: determining the shooting angle of view according to a shooting instruction, where the shooting angle of view is indicated in the shooting instruction.
  • In an embodiment, the processor is configured to run the computer program to adjust the included angle between the axis directions of the at least two cameras according to the shooting angle of view, including: determining a target included angle of the axis directions of the at least two cameras according to the shooting angle of view; determining an adjustment angle according to the target included angle and the current included angle of the axis directions of the at least two cameras; and adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angle, where the included angle of the target axis directions is the target included angle.
  • In an embodiment, the processor is configured to run the computer program to adjust the current axis directions of the cameras to the target axis directions according to the adjustment angle, where the included angle of the target axis directions is the target included angle, including: within a predetermined time, adjusting the included angle between the axis directions of the at least two cameras to change continuously within the angle range between the current included angle and the target included angle.
  • It should be noted that, for example, if the user selects on the terminal display an included angle of 50° between the axis directions of two cameras in the camera group, the terminal may, according to the instruction generated by the user's selection, indicate that the included angle of the axis directions of the at least two cameras needs to be set to 50°, and the processor adjusts the included angle of the two cameras to 50° according to that instruction. As another example, if the user selects the panoramic mode, the processor may adjust the included angle of the axis directions of the at least two cameras to a preset angle according to the instruction indicating the panoramic mode; for example, if the maximum included angle of the panoramic mode is preset to 270° and the included angle of the two cameras on the current terminal is 60°, the processor controls the axis directions of the two cameras to be adjusted to the state where the included angle is 270°.
  • It should be noted that adjusting the included angle to change continuously, within a predetermined time, over the angle range between the current included angle and the target included angle does not require the angle to change within a single uninterrupted range. For example, the angle may change within the range of 0° to 60°; or, in the panoramic mode, the included angle of the camera axis directions may change continuously from 60° to 0° within the predetermined time and then continuously from 0° to 270°, that is, it may also change continuously over a discontinuous angle range.
  • In addition, such continuous adjustment does not require the angle to change uniformly with time within the predetermined time. For example, the angle may increase uniformly by 5° every second; or it may increase uniformly by 5° in the first second and by 10° in the next second; or the angle may increase by 5°, pause while the cameras capture an image, and then continue to increase. A timed sweep of this kind is sketched below.
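  • The following sketch illustrates, under assumed names, how an included angle could be stepped continuously from the current value to a target value within a predetermined time; `set_angle` stands in for whatever controller the terminal's bending structure actually exposes.

```python
import time

def sweep_included_angle(set_angle, current_deg, target_deg, duration_s, step_deg=5.0):
    """Change the included angle from current_deg to target_deg in uniform steps,
    sleeping between steps so the whole sweep takes roughly duration_s seconds.
    set_angle is a stand-in for the terminal's bending-structure controller."""
    steps = max(1, int(abs(target_deg - current_deg) / step_deg))
    pause = duration_s / steps
    for i in range(1, steps + 1):
        set_angle(current_deg + (target_deg - current_deg) * i / steps)
        time.sleep(pause)  # an image could be captured at each pause

# example: sweep from the current 60 degrees to the panoramic 270 degrees in 4 s
sweep_included_angle(lambda a: print(f"included angle -> {a:.1f} deg"), 60.0, 270.0, 4.0)
```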
  • In an embodiment, the processor is further configured to run the computer program to execute: after a synthesis instruction is received, synthesizing the images collected by the at least two cameras to obtain a synthesized image.
  • In an embodiment, the processor is further configured to run the computer program to synthesize the images collected by the at least two cameras into a synthesized image by: determining the depth values of the pixels in the images collected by the at least two cameras; and synthesizing the images collected by the at least two cameras according to an image processing mode, based on those depth values, to obtain the synthesized image.
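  • As one hedged example of a depth-guided image processing mode, the sketch below blends a sharp frame with a blurred copy according to a per-pixel depth map (simulated background blur); the array shapes, the tolerance value, and the use of OpenCV/NumPy are assumptions for the example, not the disclosed algorithm.

```python
import cv2
import numpy as np

def depth_based_bokeh(image, depth, focus_depth, tolerance=0.15):
    """Keep pixels whose depth is close to focus_depth sharp and blur the rest.
    image: HxWx3 uint8 frame; depth: HxW float map in the same units as focus_depth."""
    blurred = cv2.GaussianBlur(image, (31, 31), 0)
    # weight is 1.0 at the focus plane and falls off to 0.0 far away from it
    weight = np.clip(1.0 - np.abs(depth - focus_depth) / (tolerance * focus_depth), 0.0, 1.0)
    weight = weight[..., None]  # broadcast over the three color channels
    return (weight * image + (1.0 - weight) * blurred).astype(np.uint8)
```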
  • the processor is further configured to run a computer program to execute: outputting the image collected by the camera group and/or the composite image to the display screen of the terminal.
  • In an embodiment, the camera group is provided on a first side of the terminal, where the first side may be a side of the terminal, a side of a component of the terminal, or a combined side formed by the sides of components of the terminal.
  • It should be noted that the above side of the terminal may be an upper, lower, left, right, front, or rear side of the terminal or of a component of the terminal, and the first side may be any side of the terminal or a combined side formed by the sides of components of the terminal.
  • the first side may include one of the following:
  • the terminal may include a terminal body, wherein the first side is a side on the terminal body; or
  • the terminal may include a first body and a second body connected to each other, wherein the first side is a combined side formed by the sides of the terminal on the same side of the first body and the second body in a state of straight deployment; or
  • the terminal may include a first body and a second body connected to each other, wherein the first side is a side on the first body or a side on the second body.
  • It should be noted that the first side may be a side of a terminal that includes only one body; alternatively, for a terminal with two components (such as a folding phone or a flip phone) that is deployed flat, the side surfaces located on the same side of the two components may form a combined plane, that is, the first side may be the plane resulting from this combination.
  • the cameras in the camera group are distributed on the edge of the first side.
  • It should be noted that the baseline (the distance between the two lenses) of current dual-camera phones is very short, generally about 10 mm, while the average baseline of the human eyes is 64 mm; by comparison, the baseline of current dual-camera phones is too short. This causes large calculation errors during positioning and image processing, so only the depth of field of relatively close objects (a shallow depth of field) can be calculated, which greatly limits the shooting scenarios.
  • In an embodiment of the present application, the cameras may be distributed along the edges of the first side, where being distributed along the edges of the first side may mean that, at least in the working state (for example, when taking pictures), the cameras are spread apart (not placed next to each other) and located near the edges of the first side. This effectively increases the baseline length between the cameras, improves the accuracy of image depth calculation, and enlarges the effective range of depth-of-field processing of the captured picture, bringing more convenience and practicality to users and effectively improving the user experience.
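  • For context, under the textbook rectified-stereo model the depth of a point is Z = f·B/d (focal length times baseline divided by disparity), so the depth error caused by a fixed disparity error shrinks as the baseline grows; the numbers in the sketch below are illustrative assumptions only.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Textbook rectified-stereo relation Z = f * B / d."""
    return focal_px * baseline_mm / disparity_px

f = 1500.0                      # focal length in pixels (assumed)
for baseline in (10.0, 120.0):  # phone-like baseline vs. edge-mounted cameras
    d_true = f * baseline / 1000.0                       # disparity of a point 1 m away
    z_err = depth_from_disparity(f, baseline, d_true - 1.0) - 1000.0
    print(f"baseline {baseline:.0f} mm: depth error for a 1-pixel disparity error ~ {z_err:.0f} mm")
```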
  • For example, FIG. 4 is a schematic diagram of camera positions according to an embodiment of the present disclosure, where part A may represent the front or the back of the terminal and part B may represent any of the top, bottom, left, or right sides of the terminal. When the camera group in this embodiment contains two cameras, the two cameras may be placed at positions 1-2 on the front or back of the terminal, or at positions 4-5, 3-6, 7-8, or 9-10, or at positions 1-8, 2-7, 4-9, or 5-10, or at positions 8-10, 7-9, 2-4, or 1-5. When the two cameras are installed on any of the top, bottom, left, or right sides of the terminal, they may be placed at positions 11-12.
  • When there are more cameras on the terminal, for example four cameras, the camera positions may be distributed as a square, for example at four of the positions 1-10 that form a square as shown in FIG. 4: 7-8-9-10, 1-2-4-5, 7-2-9-4, or 1-8-10-5; or two cameras may be placed in each of the horizontal and vertical directions, for example 1-5-3-6, 2-4-3-6, 7-9-3-6, or 8-10-3-6.
  • the first side is the side where the display screen on the terminal is located or the side of the terminal facing away from the display screen.
  • It should be noted that placing the cameras on the same side as the display screen makes it convenient for the photographer to take selfies without blocking the view of the shooting picture, while placing the cameras on the side facing away from the display screen makes it convenient for the user to view the captured picture.
  • the terminal includes an adjustment component for adjusting the axis direction of the camera on the terminal.
  • the adjustment component may include a bending structure, wherein the bending structure is connected to the camera, and the bending structure is used to directly adjust the axis direction of the camera; or, the bending structure is connected to the bearing structure carrying the camera, bending The structure is used to adjust the axis direction of the camera by adjusting the bending state of the bearing structure.
  • the bending structure can be directly connected to the components of the camera itself to directly change the axis direction of the camera.
  • the bending structure may be a rotatable structure including a shaft, a hinge, or a coupling, etc., which may directly cause the camera orientation to change.
  • the terminal itself can maintain the original state without bending.
  • In one embodiment of the present disclosure, the bending structure may be connected to the bearing structure that carries the camera; for example, the bearing structure carrying the camera may be some structure on the first side of the terminal, or the first side itself may carry the camera, in which case adjusting the bending state of the first side brings about a change in the axis direction of the camera.
  • For example, the bending structure may be a hinge or a connecting shaft in the middle of a folding terminal: when the hinge opens or closes, or the connecting shaft rotates, the two halves of the folding terminal are driven to open and close, and the axis directions of the cameras disposed on the halves change accordingly.
  • As another example, the bending structure may be a structure that supports the bearing structure, such as a bendable plate-shaped member, for example a bendable plate-shaped structure formed by connecting a plurality of small plate-shaped members to one another; the bending state of the whole plate-shaped structure can be adjusted by controlling the inclination angles of the small plate-shaped members. When the bending structure itself bends, it drives the bearing structure to bend.
  • the bearing structure carrying the camera itself may be a bending structure, and when the bending structure itself is bent, it is equivalent to bending the bearing structure.
  • In an embodiment, when the bearing structure is in a bent state, the cameras are distributed on both sides of the bending axis of the bearing structure.
  • the structure carrying the camera may be the first side on the terminal.
  • It should be noted that when the bearing structure is bent, that is, changes from a flat state to a curved state, the bending line may be understood as the boundary line that divides the bent bearing structure into multiple bending regions; as the bearing structure bends, the direction of a straight line perpendicular to the surface of each of these bending regions also changes. For example, if both the left and right sides of a flat bearing structure are bent upward at the same time to form a U-shaped cross section, the line at the bottom of the U-shaped structure can be understood as the bending line, and the curved parts on the left and right sides can be understood as the two bending regions separated by the bending line. As another example, if a flat bearing structure is bent into a curved surface whose cross section is a semicircular arc, every straight line on the curved surface perpendicular to that arc (that is, every straight line on the surface parallel to the axis of the semi-cylindrical structure) can be understood as a bending line; in the regions on both sides of such a bending line, the direction of a straight line perpendicular to the region's surface changes during the bending of the bearing structure.
  • FIG. 2 is a flowchart of a shooting method according to an embodiment of the present disclosure. As shown in FIG. 2, an embodiment of the present disclosure also provides a shooting method, which is applied to any of the above-mentioned terminals. The method includes:
  • Step S202 Determine shooting information, where the shooting information includes: the position of the shooting target or the shooting angle of view;
  • Step S204 Adjust the axis direction of at least two cameras included in the terminal according to the position of the shooting target so that the at least two cameras focus on the shooting target; or, adjust the included angle of the axis direction of the at least two cameras according to the shooting angle of view;
  • Step S206 Control at least two adjusted cameras to collect images.
  • the axis direction of the cameras in the camera group on the terminal can be controlled by the processor, the problem of inflexibility in taking pictures of the terminal camera in the related art can be solved, and the effect of taking pictures flexibly can be achieved.
  • Determining the shooting information includes: when the shooting information includes the position of the shooting target, after the shooting target is determined, collecting an image of the shooting target and determining the position of the shooting target according to the scene depth information of the image; and when the shooting information includes the shooting angle of view, determining the shooting angle of view according to a shooting instruction, where the shooting angle of view is indicated in the shooting instruction.
  • In an embodiment, adjusting the axis directions of the at least two cameras according to the position of the shooting target so that the cameras focus on the shooting target includes: determining, according to the position of the shooting target, the target axis directions of the at least two cameras when they focus on the shooting target; determining an adjustment angle according to the target axis directions and the current axis directions of the at least two cameras; and adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angle.
  • In an embodiment, adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view includes: determining a target included angle of the axis directions of the at least two cameras according to the shooting angle of view; determining an adjustment angle according to the target included angle and the current included angle of the axis directions; and adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angle, where the included angle of the target axis directions is the target included angle.
  • In an embodiment, adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angle, where the included angle of the target axis directions is the target included angle, includes: within a predetermined time, adjusting the included angle between the axis directions of the at least two cameras to change continuously within the angle range between the current included angle and the target included angle.
  • In an embodiment, after a synthesis instruction is received, the images collected by the at least two cameras are synthesized to obtain a synthesized image.
  • In an embodiment, synthesizing the images collected by the at least two cameras to obtain a synthesized image includes: determining the depth values of the pixels in the images collected by the at least two cameras; and synthesizing the images collected by the at least two cameras according to an image processing mode, based on those depth values, to obtain the synthesized image.
  • In an embodiment, the image collected by the camera group and/or the synthesized image are output to the display screen of the terminal.
  • An embodiment of the present disclosure also provides a storage medium in which a computer program is stored, wherein the computer program is configured to execute any of the steps in the above method embodiments during runtime.
  • the above storage medium may be set to store a computer program for performing the following steps:
  • Step S1 Determine the shooting information, where the shooting information includes: the position of the shooting target or the shooting angle of view;
  • Step S2 Adjust the axis direction of the at least two cameras included in the terminal according to the position of the shooting target so that the at least two cameras focus on the shooting target; or, adjust the included angle of the axis direction of the at least two cameras according to the shooting angle of view;
  • Step S3 Control at least two adjusted cameras to collect images.
  • The above storage medium may include, but is not limited to, a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and various other media that can store a computer program.
  • the mobile terminal may include one or more processors (the processor may include but is not limited to a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory for storing data,
  • the above mobile terminal may further include a transmission device for communication functions and an input and output device.
  • the structure of the terminal described above is only an illustration, and does not limit the structure of the mobile terminal.
  • the mobile terminal may also include more or fewer components, or have different configurations.
  • The memory may be used to store computer programs, for example software programs and modules of application software, such as the computer program corresponding to the shooting method in the embodiments of the present disclosure; the processor performs various functional applications and data processing, that is, implements the method described above, by running the computer program stored in the memory.
  • the memory may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory may further include memories remotely provided with respect to the processor, and these remote memories may be connected to the mobile terminal through a network. Examples of the aforementioned network include, but are not limited to, the Internet, intranet, local area network, mobile communication network, and combinations thereof.
  • Through the description of the above embodiments, those skilled in the art can clearly understand that the methods according to the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present disclosure, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to enable a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to execute the methods of the various embodiments of the present disclosure.
  • The embodiments of the present disclosure achieve a dual-camera stereo shooting effect by installing one camera at each of the two ends on the same side of a flexible-screen/folding-screen smartphone. When taking a picture, the bending/folding angle of the phone is automatically adjusted according to the focus point of the shooting picture so that the two cameras are aimed at the focus point at the same time; the visual images and depth images of the two cameras are collected simultaneously and image processing is performed, realizing 3D stereo photo shooting and three-dimensional dynamic picture shooting that simulate the human-eye mode.
  • In one embodiment of the present disclosure, the bending of the flexible-screen phone drives the cameras at the two ends of the phone to rotate, simulating the eyeball movement of the human eye when observing things and thus simulating human stereoscopic vision; increasing the distance between the two cameras enlarges the shooting range of stereo-effect photos, and a method for shooting three-dimensional dynamic photos is also proposed.
  • On the one hand, in the dual-camera structure of existing smartphones the camera positions are fixed, the relative position of the two cameras does not change with the position of the focus point, and picture distortion easily occurs when the main subject deviates from the center of the picture. The dual cameras of the phone in the present disclosure can automatically adjust the curvature of the phone according to changes in the position of the subject so as to adjust the camera positions, keeping the subject always near the main axis of the cameras; this improves shooting flexibility, avoids the picture distortion caused by the subject deviating from the center of the camera's main axis, and improves the quality of pictures taken by the phone.
  • On the other hand, to address the problem that the two cameras of existing dual-camera smartphones are spaced too closely, which causes large calculation errors and allows only the depth of field of relatively close objects to be calculated, the two cameras of the flexible-screen/folding-screen phone in the present disclosure use a larger spacing that simulates the viewing geometry of the human eyes.
  • In addition, the embodiments of the present disclosure can take advantage of the large parallax of the two cameras of the flexible-screen/folding-screen phone to shoot three-dimensional dynamic pictures at close range, generating a three-dimensional scene photo that can be rotated and viewed within a certain angle, which improves the realism of photos and the fun of taking pictures.
  • FIG. 5a to FIG. 6b provide schematic diagrams of the structure and shooting states of the flexible terminal and the folding terminal. As shown in FIG. 5a to FIG. 6b, the screen of the flexible-screen/folding-screen smartphone is bendable, and the two cameras are installed at different positions on the back of the phone.
  • In the flexible-screen phone, as shown in FIG. 5b, the first camera is installed in the upper half and the second camera in the lower half, and the two cameras lie on the same vertical line. Because the flexible-screen phone or flexible terminal is bendable, the relative position of the two cameras changes as the phone screen bends; the bending process is illustrated in FIG. 5c and FIG. 5d. In the folding-screen phone, the two cameras are installed at the top positions of the left and right screens, and the camera positions change with the folding motion of the screen, as shown in FIG. 5e.
  • The way the screen of a flexible-screen phone bends is not limited to both sides bending synchronously or to the same degree; the degree of bending of each part of the flexible screen varies with the preset mode and with changes in the shooting scene conditions, and is adjusted automatically by instructions.
  • FIG. 6a and FIG. 6b are schematic structural diagrams of a folding phone, in which two cameras are provided at the upper-left and upper-right corners of the back of the folding phone. The folding manner of the folding-screen phone also adopts different folding mechanisms according to differences in the shooting scene conditions. For example, taking the back of the folding terminal shown in FIG. 6a as an example, the B screen may be folded using the A screen as a reference, the A screen may be folded using the B screen as a reference, the A and B screens may be folded synchronously, or the A and B screens may be folded and rotated asynchronously.
  • The embodiments of the present disclosure mainly relate to a shooting method for a flexible-screen/folding-screen terminal and to a flexible-screen/folding-screen mobile terminal, in which one camera is installed at each of two different positions on the back of the flexible-screen/folding-screen phone. When shooting, the focus point is set and the two cameras frame the scene separately and calculate the scene depth in real time; the bending degree of each part of the flexible screen/folding screen is automatically adjusted in real time according to the distance between the focus point and the phone cameras, so that the two cameras are aimed at the focus point of the picture; the first camera and the second camera shoot at the same time to obtain the captured pictures and the depth values of the shooting scene; and the pictures are processed according to a preset image processing mode. In this way, stereo-effect photo shooting and three-dimensional dynamic photo shooting simulating the human-eye mode are realized.
  • FIG. 7 is a flowchart of a shooting method of a mobile terminal according to an optional embodiment of the present disclosure. Because the first camera and the second camera of the flexible-screen/folding-screen phone simulate the human stereoscopic vision mode, the distance between the cameras is large; the large parallax supports scene depth calculation over a large range and supports stereo shooting effects at longer distances. At the same time, by controlling the screen to adjust the camera shooting angles, the main optical axis of each camera can be aimed at the shooting focus, so that the imaging area of the subject stays near the camera's main axis, which avoids the picture distortion caused by the subject deviating from the center of the picture and improves picture quality.
  • The specific shooting process is shown in FIG. 7: start the flexible-screen/folding-screen dual-camera shooting mode; detect the current bending/folding angle of the screen and the positional relationship between the two cameras, and display the preview pictures taken by the two cameras; the camera focuses automatically or the user sets the focus point manually; detect the distance between the focus point and the flexible-screen/folding-screen phone, and calculate, from the positional relationship between the focus point and camera one and camera two, the angle parameters to which the screen should be bent/folded for shooting; obtain the current bending/folding angle of the phone screen through hardware sensors; judge whether the current screen angle is consistent with the shooting angle parameters; if they are consistent, take the picture directly; if not, calculate compensation parameters, issue an instruction, automatically bend/fold the screen to the specified angle, and then take the picture.
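  • A minimal sketch of that adjust-then-shoot loop is given below; all device callables (`target_angle_fn`, `read_angle`, `set_angle`, `take_photo`) are hypothetical stand-ins for the terminal's own sensors and actuators, not a disclosed API.

```python
def capture_with_bend_control(target_angle_fn, read_angle, set_angle, take_photo,
                              tolerance_deg=1.0):
    """Adjust-then-shoot loop: compute the required bend angle from the focus point,
    compare it with the sensor reading, compensate if needed, then capture."""
    required = target_angle_fn()   # angle parameter computed from the focus point
    current = read_angle()         # current bending/folding angle from the hardware sensor
    if abs(required - current) > tolerance_deg:
        set_angle(required)        # bend/fold the screen to the specified angle
    return take_photo()            # capture with both cameras once aligned
```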
  • the shooting method has a variety of application scenarios, among which three typical application scenarios are: taking a photo with a stereo effect at a long distance, taking a three-dimensional dynamic photo at a close distance, and automatically taking a panoramic photo mode.
  • FIG. 8 is a flowchart of taking a stereoscopic effect photo of a mobile terminal according to an optional embodiment of the present disclosure.
  • The camera focuses automatically or the user sets the focus point manually; the first camera and the second camera frame the scene separately, obtain a first preview image and a second preview image, and calculate the scene depth in the picture in real time; according to the position of the focus point and the positional relationship between the first camera and the second camera, the angle to which the flexible screen/folding screen of the phone needs to be bent/folded is calculated with a preset algorithm; a bending instruction is issued to adjust the bending degree of each part of the flexible screen so that the screen bends/folds into the indicated shape and both cameras are aimed at the focus point at the same time; the first camera and the second camera refocus and take photos to obtain a first image and a second image; a depth map of the shooting scene is calculated from the first image and the second image to obtain the depth value of each pixel of the scene; according to the depth values and the captured pictures, an image processing mode is selected and the pictures are processed with a preset image processing algorithm; and the resulting picture is output.
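  • As a hedged illustration of the depth-map step, the sketch below computes a rough per-pixel depth from a rectified stereo pair with OpenCV's semi-global block matcher; the calibration values and matcher parameters are assumptions, and the device described here would of course use its own calibrated pipeline.

```python
import cv2
import numpy as np

def scene_depth_map(left_bgr, right_bgr, focal_px, baseline_mm):
    """Rough per-pixel depth from a rectified stereo pair via semi-global matching."""
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # to pixel units
    disparity[disparity <= 0] = np.nan          # mark invalid matches
    return focal_px * baseline_mm / disparity   # Z = f * B / d for every pixel
```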
  • FIG. 9 is a flowchart of shooting a three-dimensional dynamic picture by a mobile terminal according to an optional embodiment of the present disclosure.
  • The three-dimensional dynamic picture shooting mode is selected and autofocus is performed (manual assisted focusing is also possible); the first camera and the second camera frame the scene separately and obtain real-time preview images; the scene depth in the picture is calculated in real time from the preview images, and the bending parameters of the flexible screen/folding screen are adjusted automatically according to the position of the focus point so that the two cameras are aimed at the focus point; whether the two cameras are horizontally level is detected and corresponding guidance is given to the user; the first camera and the second camera shoot at the same time to obtain a first image and a second image; a depth map of the shooting scene is calculated, three-dimensional reconstruction of the scene is performed, the color image pixels are mapped onto the reconstructed scene, and transition processing is performed on the images at intermediate viewing angles between the two cameras; and picture synthesis is performed to output a three-dimensional scene photo that can be rotated and observed within a certain angle.
  • The implementation of the automatic panoramic photo mode is shown in FIG. 10, which is a flowchart of a mobile terminal automatically shooting a panoramic photo according to an optional embodiment of the present disclosure; the state of the terminal during shooting is shown in FIG. 11, which is a schematic diagram of that state.
  • The automatic panoramic picture mode is selected; the first camera and the second camera frame the scene separately and obtain real-time preview images; the current bending/folding angle of the phone screen and the body posture are obtained through hardware sensors; whether the current phone screen is flat (not bent/folded) is judged, and if the screen is not flat an instruction is sent to the screen angle controller to restore the screen to the flat state, otherwise the next step is entered; whether the body is currently placed horizontally is judged, and if not, the user is prompted through the interface to place the phone horizontally so that the two cameras are at the same horizontal level, otherwise the next step is entered; according to the preset mode, the two ends of the screen are bent inward at the same time to drive the cameras to rotate, and the two cameras collect a number of pictures in real time until the maximum preset shooting angle is reached (or the user manually stops shooting); and the collected images are combined into a panoramic photo according to a preset algorithm and the panoramic photo is output.
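  • A compact sketch of that capture-and-stitch loop follows; the device callables are hypothetical, and OpenCV's Stitcher is used here only as one possible stand-in for the "preset algorithm" that combines the frames.

```python
import cv2

def auto_panorama(bend_inward, capture_pair, max_sweep_deg=270, step_deg=10):
    """Bend both screen ends inward step by step, collecting a frame from each
    camera at every step, then stitch all frames into one panoramic photo."""
    frames = []
    for angle in range(0, max_sweep_deg + 1, step_deg):
        bend_inward(angle)            # rotate both cameras by bending the screen
        left, right = capture_pair()  # one frame from each camera
        frames.extend([left, right])
    status, pano = cv2.Stitcher_create().stitch(frames)
    return pano if status == 0 else None  # 0 is cv2.Stitcher_OK
```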
  • the "A+B” button function is to display two camera shots simultaneously on a split screen, and the user interface is shown in Figure 12a; the "A/B” button function is to display a single camera screen in full screen, and the user interface is shown in Figure 12b Shown; "Synthesis” button function is to synthesize the pictures taken by the two cameras according to the selected shooting mode.
  • the user interface is shown in Figure 12c.
  • the interface toolbar can be hidden to improve the visual experience, as shown in FIG. 12d; when shooting a stereoscopic effect photo, the user interface is shown in FIGS. 12e to 12g, where FIG.
  • FIG. 12e is the shooting of the terminal of an optional embodiment of the present disclosure Stereoscopic animation-Schematic diagram of split-screen display interface
  • Fig. 12f Stereoscopic animation of the terminal of the optional embodiment of the present disclosure-Schematic diagram of interface of single camera screen
  • Fig. 12g is a stereoscopic animation of the terminal of the optional embodiment of the present disclosure Figure-Schematic diagram of the interface for viewing the synthesized screen; after shooting the stereoscopic motion picture, you can slide the left and right sideways on the screen to view the different angle shooting scenes of the stereoscopic photo within a certain angle range.
  • FIG. 12h is a schematic diagram of an interface for automatically shooting a panoramic photo of a terminal according to an optional embodiment of the present disclosure.
  • Compared with the related art, in one embodiment of the present disclosure the cameras rotate with the screen by means of the flexible screen, simulating the stereoscopic vision principle of the human eyes and the way the human eyeball moves with the observed object. In one embodiment, the distance between the two cameras is increased to simulate the binocular parallax of the human eyes, which improves the measurement accuracy of the depth of the shooting scene, strengthens the stereo effect of stereo pictures, and increases the distance at which stereo pictures can be taken. In one embodiment, the dual cameras are used to shoot panoramic photos automatically, which avoids the unstable picture quality caused by manually turning the phone; the two cameras shoot toward the left and right sides simultaneously, improving shooting efficiency.
  • Obviously, those skilled in the art should understand that the above modules or steps of the present disclosure can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network composed of multiple computing devices. In an embodiment, they can be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described may be executed in an order different from the one described here, or they may be made into individual integrated circuit modules, or multiple modules or steps among them may be made into a single integrated circuit module. In this way, the present disclosure is not limited to any specific combination of hardware and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a terminal, a shooting method, and a storage medium. The terminal includes a memory storing a computer program and a processor, the processor being configured to run the computer program to execute: determining shooting information, where the shooting information includes the position of a shooting target or a shooting angle of view; adjusting the axis directions of at least two cameras included in the terminal according to the position of the shooting target so that the at least two cameras focus on the shooting target, or adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view; and controlling the adjusted at least two cameras to collect images. (Abstract drawing: FIG. 1)

Description

一种终端、拍摄方法及存储介质 技术领域
本公开涉及但不限于通信领域,具体而言,涉及一种终端、拍摄方法及存储介质。
背景技术
目前的双摄像头智能手机拍摄方法丰富了手机拍摄的功能,提高了拍摄画面的质量。但双摄像头拍摄方法依然存在一些缺陷:由于相关技术中终端上的两个摄像头位置和角度是固定不变的,而拍摄焦点通常并非是在取景画面正中心(在相机中轴线附近),因此拍摄焦点中心的光线通常是通过侧面进入摄像头,随着焦点离相机中轴线偏差角度的增大,定位误差也会增大,容易导致画面变形。因此,目前的手机双摄拍照模式影响了拍摄的灵活度,也在一定程度上降低了拍摄画面的质量。
发明内容
本公开实施例提供了一种终端、拍摄方法及存储介质,以至少解决相关技术中终端拍摄不灵活的问题。
根据本公开的一个实施例,提供了一种终端,包括:存储有计算机程序的存储器以及处理器,所述处理器被配置为运行所述计算机程序以执行:确定拍摄信息,其中,所述拍摄信息包括:拍摄目标的位置或拍摄视角;根据所述拍摄目标的位置调整所述终端所包括的至少两个摄像头的轴线方向以使所述至少两个摄像头聚焦于所述拍摄目标;或,根据所述拍摄视角调整所述至少两个摄像头的轴线方向的夹角;控制被调整后的所述至少两个摄像头采集图像。
根据本公开的另一个实施例,提供了一种拍摄方法,包括:确定拍摄信息,其中,所述拍摄信息包括:拍摄目标的位置或拍摄视角; 根据所述拍摄目标的位置调整所述终端所包括的至少两个摄像头的轴线方向以使所述至少两个摄像头聚焦于所述拍摄目标;或,根据所述拍摄视角调整所述至少两个摄像头的轴线方向的夹角;控制被调整后的所述至少两个摄像头采集图像。
根据本公开的又一个实施例,还提供了一种存储介质,所述存储介质中存储有计算机程序,其中,所述计算机程序被设置为运行时执行上述任一项方法实施例中的步骤。
附图说明
此处所说明的附图用来提供对本公开的进一步理解,构成本申请的一部分,本公开的示意性实施例及其说明用于解释本公开,并不构成对本公开的不当限定。在附图中:
图1是根据本公开实施例的终端的结构框图;
图2是根据本公开实施例的拍摄方法的流程图;
图3是根据本公开实施例的摄像头的轴线方向示意图;
图4是根据本公开实施例的摄像头的位置示意图;
图5a是根据本公开可选实施例的柔性终端的正面示意图;
图5b是根据本公开可选实施例的柔性终端的背面示意图;
图5c是根据本公开可选实施例的柔性终端的侧面示意图;
图5d是根据本公开可选实施例的柔性终端的侧面弯折示意图;
图5e是根据本公开可选实施例的柔性终端的拍摄状态示意图;
图6a是根据本公开可选实施例的折叠终端的背面示意图;
图6b是根据本公开可选实施例的折叠终端的拍摄状态示意图;
图7是根据本公开可选实施例的移动终端拍摄方法的流程图;
图8是根据本公开可选实施例的移动终端拍摄立体效果照片的流程图;
图9是根据本公开可选实施例的移动终端拍摄三维动态图片的流程图;
图10是根据本公开可选实施例的移动终端自动拍摄全景照片的流程图;
图11是根据本公开可选实施例的移动终端自动拍摄全景照片的状态示意图;
图12a是根据本公开可选实施例的终端的分屏显示两摄像头画面的界面示意图;
图12b是根据本公开可选实施例的终端的全屏显示单个摄像头画面的界面示意图;
图12c是根据本公开可选实施例的终端的点击“合成”按钮合成立体效果画面的界面示意图;
图12d是根据本公开可选实施例的终端的点击画面隐藏工具栏的界面示意图;
图12e是根据本公开可选实施例的终端的拍摄立体动图-分屏显示的界面示意图;
图12f是根据本公开可选实施例的终端的拍摄立体动图-单摄像头画面的界面示意图;
图12g是根据本公开可选实施例的终端的拍摄立体动图-合成画面查看界面的界面示意图;
图12h是根据本公开可选实施例的终端的自动拍摄全景照片的界面示意图。
具体实施方式
下文中将参考附图并结合实施例来详细说明本公开。需要说明的是,在不冲突的情况下,本申请中的实施例及实施例中的特征可以相互组合。
需要说明的是,本公开的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。
实施例1
图1是本公开实施例的一种终端的结构框图。如图1所示,本公开实施例一提供了一种终端,可以包括:摄像头1,用于采集图像,其中,终端上包括至少两个摄像头1;存储有计算机程序的存储器2以及处理器3,处理器3被配置为运行计算机程序以执行:
确定拍摄信息,其中,拍摄信息包括:拍摄目标的位置或拍摄视角;
根据拍摄目标的位置调整终端所包括的至少两个摄像头的轴线方向以使至少两个摄像头聚焦于拍摄目标;或,根据拍摄视角调整至少两个摄像头的轴线方向的夹角;
控制被调整后的至少两个摄像头采集图像。
通过本公开的实施例,可以通过处理器控制终端上的摄像头的轴线方向,因此可以解决相关技术中终端摄像头拍照不灵活的问题,达到灵活拍照的效果。
需要说明的是,图3是根据本公开实施例的摄像头的轴线方向示意图,如图3所示,摄像头的轴线方向可以指摄像头的朝向,或者在摄像头的轴线所在的直线上,朝向被拍摄对象的方向。
需要说明的是,摄像头的轴线方向聚焦于拍摄目标,可以指摄像头的轴线方向相交于拍摄目标上。例如拍摄目标是一个物体,摄像头的轴线方向聚焦于拍摄目标可以指摄像头的轴线方向相交于该拍摄物体上。
现有智能手机使用空间位置固定且无法调整朝向的双摄像头结构,导致主要拍摄对象偏离摄像头轴线方向,容易造成画面变形;而本实施方式中提供的方案中摄像头的轴线方向聚焦于拍摄目标可以使拍摄对象始终处于摄像头主轴线附近范围,提高了摄像头的灵活性,避免了由于拍摄主体偏离摄像头主轴中心而导致的画面变形,提高了手机拍摄画面的质量。
在一实施方式中,处理器被配置为运行计算机程序以执行确定拍 摄信息,包括:
当拍摄信息中包括拍摄目标的位置时,确定拍摄信息包括:在确定拍摄目标之后,采集拍摄目标的图像并根据图像的场景深度信息确定拍摄目标的位置。
在一实施方式中,处理器被配置为运行计算机程序以执行根据拍摄目标的位置调整至少两个摄像头的轴线方向以使摄像头聚焦于拍摄目标,包括:
根据拍摄目标的位置确定至少两个摄像头在聚焦于拍摄目标时的目标轴线方向;
根据目标轴线方向和至少两个摄像头的当前轴线方向确定至少两个摄像头的轴线方向的调整角度;
按照调整角度将至少两个摄像头的当前轴线方向调整至目标轴线方向。
需要说明的是,例如,进入拍摄模式后,用户可以在屏幕上点击确定了拍摄目标(拍摄焦点),摄像头通过检测镜头与拍摄目标的距离,根据内置算法可以计算出摄像头应该形成的角度,即可以确定至少两个摄像头在聚焦于拍摄目标时的目标轴线方向。
当拍摄信息中包括拍摄视角时,确定拍摄信息包括:
根据拍摄指令确定拍摄视角,其中,拍摄指令中指示了拍摄视角。
在一实施方式中,处理器被配置为运行计算机程序以执行根据拍摄视角调整至少两个摄像头的轴线方向的夹角,包括:
根据拍摄视角确定至少两个摄像头的轴线方向的目标夹角;
根据目标夹角和至少两个摄像头的轴线方向的当前夹角确定至少两个摄像头的轴线方向的调整角度;
按照调整角度将至少两个摄像头的当前轴线方向调整至目标轴线方向,其中,目标轴线方向的夹角为目标夹角。
在一实施方式中,处理器被配置为运行计算机程序以执行按照调 整角度将摄像头的当前轴线方向调整至目标轴线方向,其中,目标轴线方向的夹角为目标夹角,包括:
在预定时间内,调节至少两个摄像头的轴线方向的夹角在当前夹角和目标夹角之间的角度范围内连续变化。
需要说明的是,例如,用户在终端显示屏上选择了摄像头组中的两个摄像头的轴线方向夹角为50°,则终端可以根据用户的选择生成的指令指示终端所包括的至少两个摄像头的轴线方向夹角需要设置成50°,处理器根据该指令将两个摄像头的轴线方向夹角调整至50°。又如,用户选择了全景模式,则处理器可以根据指示了全景模式的指令按照预先设置的角度调整至少两个摄像头的轴线方向夹角。例如,预先设置全景模式的最大夹角是270°,当前终端上的两个摄像头的夹角是60°,则处理器控制两个摄像头的轴线方向调整至夹角为270°的状态。
需要说明的是,在预定时间内,调节至少两个摄像头的轴线方向的夹角在当前夹角和目标夹角之间的角度范围内连续变化并不限定夹角一定要在一个连续不间断的角度范围内连续变化,例如,可以是在0°到60°角度范围内变化,也可以是在全景模式中,在预定的时间内,摄像头的轴线方向的夹角由60°连续变至0°,再由0°连续变至270°,即也可以是在间断的角度范围内连续变化。
另外,调节至少两个摄像头的轴线方向的夹角在当前夹角和目标夹角之间的角度范围内连续变化也并不限定角度的变化在预定时间内一定是按照时间均匀变化的,例如可以是每1s的时间夹角均匀增加5°,也可以是前1s的时间夹角均匀增加5°,紧接着的下1s时间内夹角均匀增加了10°,也可以是夹角增加5°之后,暂停,待摄像头拍摄图像后,继续增大摄像头轴线方向的夹角。
在一实施方式中,处理器还被配置为运行计算机程序以执行:
在接收到合成指令后,合成通过至少两个摄像头采集到的图像得到合成图像。
在一实施方式中,处理器还被配置为运行计算机程序以执行合成通过至少两个摄像头采集到的图像得到合成图像,包括:
确定至少两个摄像头所采集的图像中的像素的深度值;
基于像素的深度值根据图像处理模式合成通过至少两个摄像头采集到的图像得到合成图像。
在一实施方式中,处理器还被配置为运行计算机程序以执行:向终端的显示屏输出通过摄像头组采集到的图像和/或合成图像。
在一实施方式中,摄像头组设置在终端的第一侧面上,其中,第一侧面可以为终端的一个侧面、终端的组成部分的一个侧面或者终端的组成部分的侧面所构成的组合侧面。
需要说明的是,上述的终端的一个侧面可以是终端或终端的组成部分的上侧面、下侧面、左侧面、右侧面、前侧面、后侧面等,第一侧面可以是终端上的任一侧面或终端的组成部分的任一侧面所构成的组合侧面。
在一实施方式中,第一侧面可以包括以下之一:
终端可以包括一个终端本体,其中,第一侧面为终端本体上的一个侧面;或者
终端可以包括互相连接的第一本体和第二本体,其中,第一侧面为终端在平直展开的状态下,位于第一本体和第二本体同一侧的侧面所形成的组合侧面;或者
终端可以包括互相连接的第一本体和第二本体,其中,第一侧面为第一本体上的一个侧面或第二本体上的一个侧面。
需要说明的是,上述的第一侧面可以是只包括了一个本体的终端上的一个侧面,也可以是包括了具有两个组成部分的终端(例如折叠手机、翻盖手机等终端)在平直展开的状态下,位于两个组成部分上的同一侧的侧面所构成一个组合的平面,即,第一侧面可以是这种组合而来的平面。
在一实施方式中,摄像头组中的摄像头分散分布在第一侧面的边缘。
需要说明的是,目前的手机双摄像头的两个镜头之间的基线(两个镜头的间距)很短,一般为10mm左右,而人类双眼的基线均值是64mm,相比之下目前的双摄手机基线太短,因此,在定位与图像处理过程中会引起较大的计算误差,只能计算较近物体的景深(浅景深),在拍摄场景上具有较大的局限性。在本申请的一个实施方式中,摄像头可以分散分布在第一侧面的边缘,其中,分散分布在第一侧面的边缘可以指摄像头之间至少在工作状态下(例如拍照时)是分散分布的(并非是挨靠在一起),并且分布在靠近第一侧面边缘的位置处,这样可以有效增加摄像头之间的基线长度,提高了图像深度的计算精度,增大可拍摄画面景深处理的有效范围,为用户带来更高的便捷性与实用性,有效提升用户体验。
例如图4所示,图4是根据本公开实施例的摄像头的位置示意图,其中,A部分可以表示终端的正面或者背面,图4中的B部分可以表示终端的上、下、左、右任一个侧面,在本实施例的摄像头组含有两个摄像头的情况下,两个摄像头可以是设置在终端的正面或背面上的1-2位置处,也可以是4-5、3-6、7-8、9-10位置处,也可以是1-8、2-7、4-9、5-10位置处,也可以是8-10、7-9、2-4、1-5位置处,当该两个摄像头设置在终端的上、下、左、右任一个侧面上时,可以设置在11-12位置处。
当终端上有多个摄像头,例如4个摄像头的时候,摄像头的位置可以按照正方形分布,例如,可以设置在1-10位置中可以构成正方形的四个位置上,如图4所示,可以是7-8-9-10、1-2-4-5、7-2-9-4、1-8-10-5,也可以是在水平方向和垂直方向上各设置两个摄像头,例如可以是1-5-3-6、2-4-3-6、7-9-3-6、8-10-3-6。
在一实施方式中,第一侧面为终端上的显示屏所在的侧面或终端上背离显示屏一侧的侧面。
需要说明的是,摄像头位于显示屏一侧可以方便拍摄者进行自拍,并且不影响拍摄者查看拍摄画面;摄像头设置在背离显示屏的一侧可以方便使用者查看被拍摄的画面。
在一实施方式中,终端上包括调整部件,该调整部件用于调整终端上的摄像头的轴线方向。
在一实施方式中,调整部件可以包括弯折结构,其中,弯折结构与摄像头连接,弯折结构用于直接调整摄像头的轴线方向;或者,弯折结构与承载摄像头的承载结构连接,弯折结构用于通过调整承载结构的弯折状态调整摄像头的轴线方向。
需要说明的是,弯折结构可以直接与摄像头本身的部件连接,用以直接带动摄像头轴线方向变化。例如,弯折结构可以是包括轴、铰链或联轴器等的可转动的结构,可以直接引起摄像头朝向变化,在这种情形下,终端本身可以保持原有的状态,不需要弯折。
在本公开的一个实施方式中,弯折结构可以与承载摄像头的承载结构连接,例如,承载摄像头的承载结构是终端的第一侧面上的一些结构,或者说是终端的第一侧面承载了摄像头,则,通过调整第一侧面的弯折状态就可以带来摄像头的轴线方向的变化。例如,弯折结构可以是折叠终端中间的合页或者铰链或者连接轴。当合页开合或者铰链、连接轴转动的时候,带动折叠终端的两个分体的开合,那么设置在分体上的摄像头的轴线方向就会发生变化。
又如,弯折结构可以是承托承载结构的一个结构,例如可弯折的板状构件,例如由多个小的板状构件相互连接所形成的可弯折的板状结构,可以通过控制小的板状构件的倾斜角度调整可弯折的板状结构整体的弯折状态。当弯折结构本身弯折时,可以带动承载结构的弯折。
再如,承载了摄像头的承载结构本身就可以是一个弯折结构,当弯折结构本身弯折时,相当于承载结构弯折。
在一实施方式中,在承载结构处于弯折状态时,摄像头分布在承载结构上的弯折轴线的两侧。
需要说明的是,承载摄像头的结构可以是终端上的第一侧面。当承载结构发生弯折时,即从平面状态变为曲面状态时,弯折线可以是在承载结构弯折后,将承载结构划分为多个弯折区域的边界线,其中,在承载结构发生弯折变化时,垂直于这些弯折区域表面的直线的方向也会发生变化。例如,某个平面(承载结构)左右两侧同时向上弯曲,形成剖面类似于与U型的结构,则该U型结构底部的线就可以理解为是弯折线,左右两侧弯曲的部分可以理解为被弯折线区分开的两个弯折区域;又如,某个平面(承载结构)弯折后形成了剖面为半圆弧线的曲面,则该曲面上所有垂直于该半圆弧线的直线(即该曲面上平行于该半圆筒状结构的轴线的直线)都可以理解为弯折线,该弯折线两侧的区域内,垂直于区域表面的直线的方向在承载结构弯折过程中都发生了变化。
需要说明的是,当摄像头组中的摄像头分布在承载结构上的弯折轴线的两侧时,由于垂直于弯折线两侧的弯折区域表面的直线的方向会在承载结构弯折时发生变化,这样就赋予了设置在承载结构上的摄像头组中的摄像头更多的朝向可能,例如摄像头的轴线方向可以在承载结构弯折后相交于拍摄目标处,使得终端可针对拍摄对象同时进行多角度拍摄,进一步使得终端能够适应多种拍摄场景,提升了拍摄的灵活性。
图2是根据本公开实施例的拍摄方法的流程图,如图2所示,本公开的实施例还提供了一种拍摄方法,应用于上述任一项的终端中,方法包括:
步骤S202,确定拍摄信息,其中,拍摄信息包括:拍摄目标的位置或拍摄视角;
步骤S204,根据拍摄目标的位置调整终端所包括的至少两个摄像头的轴线方向以使至少两个摄像头聚焦于拍摄目标;或,根据拍摄视角调整至少两个摄像头的轴线方向的夹角;
步骤S206,控制被调整后的至少两个摄像头采集图像。
通过本公开,由于可以通过处理器控制终端上的摄像头组中的摄像头的轴线方向,因此可以解决相关技术中终端摄像头拍照不灵活的问题,达到灵活拍照的效果。
确定拍摄信息,包括:
当拍摄信息中包括拍摄目标的位置时,确定拍摄信息包括:
在确定拍摄目标之后,采集拍摄目标的图像并根据图像的场景深度信息确定拍摄目标的位置;
当拍摄信息中包括拍摄视角时,确定拍摄信息包括:
根据拍摄指令确定拍摄视角,其中,拍摄指令中指示了拍摄视角。
在一实施方式中,根据拍摄目标的位置调整至少两个摄像头的轴线方向以使摄像头聚焦于拍摄目标,包括:
根据拍摄目标的位置确定至少两个摄像头在聚焦于拍摄目标时的目标轴线方向;
根据目标轴线方向和至少两个摄像头的当前轴线方向确定至少两个摄像头的轴线方向的调整角度;
按照调整角度将至少两个摄像头的当前轴线方向调整至目标轴线方向。
在一实施方式中,根据拍摄视角调整至少两个摄像头的轴线方向的夹角,包括:
根据拍摄视角确定至少两个摄像头的轴线方向的目标夹角;
根据目标夹角和至少两个摄像头的轴线方向的当前夹角确定至少两个摄像头的轴线方向的调整角度;
按照调整角度将至少两个摄像头的当前轴线方向调整至目标轴线方向,其中,目标轴线方向的夹角为目标夹角。
在一实施方式中,按照调整角度将至少两个摄像头的当前轴线方向调整至目标轴线方向,其中,目标轴线方向的夹角为目标夹角,包括:
在预定时间内,调节至少两个摄像头的轴线方向的夹角在当前夹角和目标夹角之间的角度范围内连续变化。
在一实施方式中,在接收到合成指令后,合成通过至少两个摄像头采集到的图像得到合成图像。
在一实施方式中,合成通过至少两个摄像头采集到的图像得到合成图像,包括:
确定至少两个摄像头所采集的图像中的像素的深度值;
基于像素的深度值根据图像处理模式合成通过至少两个摄像头采集到的图像得到合成图像。
在一实施方式中,向终端的显示屏输出通过摄像头组采集到的图像和/或合成图像。
在一实施方式中,本实施例中的具体示例可以参考上述实施例及可选实施方式中所描述的示例,本实施例在此不再赘述。
本公开的实施例还提供了一种存储介质,该存储介质中存储有计算机程序,其中,该计算机程序被设置为运行时执行上述任一项方法实施例中的步骤。
在本实施例中,上述存储介质可以被设置为存储用于执行以下步骤的计算机程序:
步骤S1,确定拍摄信息,其中,拍摄信息包括:拍摄目标的位置或拍摄视角;
步骤S2,根据拍摄目标的位置调整终端所包括的至少两个摄像头的轴线方向以使至少两个摄像头聚焦于拍摄目标;或,根据拍摄视角调整至少两个摄像头的轴线方向的夹角;
步骤S3,控制被调整后的至少两个摄像头采集图像。
在一实施方式中,本实施例中的具体示例可以参考上述实施例及可选实施方式中所描述的示例,本实施例在此不再赘述。
可选地,在本实施例中,上述存储介质可以包括但不限于:U盘、只读存储器(Read-Only Memory,简称为ROM)、随机存取存储器 (Random Access Memory,简称为RAM)、移动硬盘、磁碟或者光盘等各种可以存储计算机程序的介质。
The method embodiments provided in the embodiments of the present application may be executed in a mobile terminal or a similar computing device. Taking execution on a mobile terminal as an example, the mobile terminal may include one or more processors (a processor may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory for storing data. In an implementation, the mobile terminal may further include a transmission device for communication functions and input/output devices. Those of ordinary skill in the art will understand that the terminal structure described above is merely illustrative and does not limit the structure of the mobile terminal; for example, the mobile terminal may include more or fewer components, or have a different configuration.
The memory may be used to store computer programs, for example, software programs and modules of application software, such as the computer program corresponding to the shooting method in the embodiments of the present disclosure. By running the computer program stored in the memory, the processor performs various functional applications and data processing, that is, implements the method described above. The memory may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory may further include memory located remotely from the processor, and such remote memory may be connected to the mobile terminal through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
From the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present disclosure, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods of the various embodiments of the present disclosure.
Specific implementation 1
For a better understanding of the embodiments of the present disclosure, the embodiments are further described below with reference to scenarios.
In the embodiments of the present disclosure, a dual-camera stereoscopic shooting effect is achieved by installing one camera at each of the two ends on the same side of a flexible-screen/foldable-screen smartphone. When taking a picture, the bending/folding angle of the phone is automatically adjusted according to the focus point of the shooting picture so that the two cameras are simultaneously aimed at the focus point; the visual images and depth images of the two cameras are captured at the same time and processed, thereby achieving 3D stereoscopic photo shooting in a mode that simulates human eyes, as well as the shooting of three-dimensional dynamic pictures.
In one embodiment of the present disclosure, the bending of the flexible-screen phone drives the cameras at the two ends of the phone to rotate, simulating the eyeball movement of human eyes when observing objects and thus simulating human stereoscopic vision; increasing the distance between the two cameras enlarges the shooting range of stereoscopic-effect photos. A method for shooting three-dimensional dynamic photos is also proposed.
On the one hand, in the fixed dual-camera structure of existing smartphones, the relative positions of the two cameras do not change with the position of the focus point, so picture distortion easily occurs when the main subject deviates from the center of the picture. In contrast, the dual cameras of the phone in the present disclosure can automatically adjust the bending of the phone, and hence the camera positions, according to changes in the subject's position, so that the subject always stays near the cameras' principal axes. This improves camera flexibility, avoids the picture distortion caused by the subject deviating from the center of the cameras' principal axes, and improves the quality of the pictures taken by the phone.
On the other hand, to address the problem that the two cameras of existing dual-camera smartphones are so close together that the computational error is large and only the depth of relatively near objects can be computed, the two cameras of the flexible-screen/foldable-screen phone in the present disclosure are spaced at a larger distance that simulates the viewing geometry of human eyes.
In addition, the embodiments of the present disclosure can use the large parallax between the two cameras of the flexible-screen/foldable-screen phone to shoot three-dimensional dynamic pictures at close range, generating three-dimensional scene photos that can be rotated and viewed within a certain angle, which improves the realism of the photos and makes photography more engaging.
FIGS. 5a to 6b illustrate the structures and shooting states of the flexible terminal and the foldable terminal. As shown in FIGS. 5a to 6b, the screen of the flexible-screen/foldable-screen smartphone is bendable, and the two cameras are installed at different positions on the back of the phone. In the flexible-screen phone, as shown in FIG. 5b, the first camera is installed in the upper half and the second camera in the lower half, with the two cameras on the same vertical line. Because the flexible-screen phone or flexible terminal is bendable, the relative positions of the two cameras change as the phone screen bends; the bending process is illustrated in FIGS. 5c and 5d. In the foldable-screen phone, the two cameras are installed at the top positions on the two sides of the left and right screens, and the camera positions change with the folding movement of the screen, as shown in FIG. 5e. The way the flexible screen bends is not limited to both sides bending synchronously or to the same degree; the degree of bending of each part of the flexible screen varies with the preset mode and the shooting-scene conditions and is adjusted automatically by instructions.
FIGS. 6a and 6b are schematic structural diagrams of the foldable phone, in which the two cameras are arranged at the upper-left and upper-right corners of the back of the foldable phone. The foldable-screen phone also adopts different folding mechanisms depending on the shooting-scene conditions. For example, taking the back of the foldable terminal shown in FIG. 6a as an example, the folding manners include folding screen B with screen A as the reference, folding screen A with screen B as the reference, folding screens A and B synchronously, and folding or rotating screens A and B asynchronously.
The embodiments of the present disclosure mainly relate to a shooting method for a flexible-screen/foldable-screen terminal and to a flexible-screen/foldable-screen mobile terminal, in which one camera is installed at each of two different positions on the back of the flexible-screen/foldable-screen phone. When shooting, a focus point is set; the two cameras frame their views separately and compute the scene depth in the picture in real time; the degree of bending of each part of the flexible/foldable screen is automatically adjusted in real time according to the distance between the focus point and the phone cameras so that the two cameras are aimed at the focus point; the first camera and the second camera shoot simultaneously, obtaining the shooting pictures and the depth values of the shooting scene; and the pictures are processed according to a preset image processing mode. This achieves the shooting of stereoscopic-effect photos in a mode that simulates human eyes as well as the shooting of three-dimensional dynamic photos.
A schematic diagram of the shooting principle is shown in FIG. 7, which is a flowchart of a shooting method of a mobile terminal according to an optional embodiment of the present disclosure. Because the first camera and the second camera of the flexible-screen/foldable-screen phone simulate the stereoscopic vision of human eyes and are spaced relatively far apart, the larger parallax supports scene depth computation over a larger range and supports stereoscopic-effect processing at longer distances. At the same time, by controlling the screen to adjust the camera shooting angles, the cameras' principal optical axes can be aimed at the shooting focus point, so that the imaging region of the subject lies near the cameras' principal axes, which avoids the picture distortion caused by the subject deviating from the picture center and improves picture quality.
When the subject is relatively far from the cameras (within the maximum stereoscopic shooting range), stereoscopic-effect photos can be taken, with processing such as background blurring and multi-focus effects. When shooting at close range, three-dimensional dynamic photos are supported: because the difference between the two cameras' shooting angles is large, three-dimensional dynamic image acquisition is possible; the depth map of the shooting scene is computed, the scene within the shooting range is reconstructed in three dimensions, and a three-dimensional dynamic picture that can be rotated and viewed within a certain angle is generated.
The specific shooting process is shown in FIG. 7: start the flexible-screen/foldable-screen dual-camera shooting mode; detect the current bending/folding angle of the screen and the positional relationship of the two cameras, and display the preview pictures captured by the two cameras; let the camera focus automatically, or let the user set the focus point manually; detect the distance between the focus point and the flexible-screen/foldable-screen phone, and compute, from the positional relationship between the focus point and camera one and camera two, the angle parameters to which the screen should be bent/folded for shooting; obtain the current bending/folding angle of the phone screen through hardware sensors; determine whether the current screen angle is consistent with the shooting angle parameters; if they are consistent, shoot the picture directly; if not, compute compensation parameters and issue an instruction so that the screen automatically bends/folds to the specified angle, and then shoot the picture.
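The angle check and compensation in this flow can be sketched as the control loop below; the sensor reading and the bending command are hypothetical callables supplied by the device, and the tolerance and retry limit are assumed values:

    def bend_to_shooting_angle(read_bend_angle, command_bend, shooting_angle_deg,
                               tolerance_deg=0.5, max_iterations=10):
        """Compare the screen's current bend/fold angle with the computed shooting
        angle; if they differ, issue a compensating command until they match
        (or a retry limit is reached), then report whether shooting can start."""
        for _ in range(max_iterations):
            current = read_bend_angle()                  # hardware sensor reading
            compensation = shooting_angle_deg - current  # compensation parameter
            if abs(compensation) <= tolerance_deg:
                return True       # angles consistent: the picture can be taken
            command_bend(compensation)                   # bend/fold toward the target
        return False              # could not reach the requested angle in time

    # Toy usage with a simulated screen that moves 80% of the way on each command.
    state = {"angle": 0.0}
    ready = bend_to_shooting_angle(
        read_bend_angle=lambda: state["angle"],
        command_bend=lambda delta: state.__setitem__("angle", state["angle"] + 0.8 * delta),
        shooting_angle_deg=30.0,
    )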
This shooting method has many application scenarios, of which three typical ones are: shooting photos with a stereoscopic effect at a relatively long distance, shooting three-dimensional dynamic photos at close range, and the automatic panorama shooting mode.
The flow for shooting a photo with a stereoscopic effect is shown in FIG. 8, which is a flowchart of shooting a stereoscopic-effect photo with a mobile terminal according to an optional embodiment of the present disclosure. The camera focuses automatically, or the user sets the focus point manually; the first camera and the second camera frame their views separately, obtaining the real-time preview images (a first preview image and a second preview image) and computing the scene depth in the picture in real time; according to the positional relationship between the focus point and the first and second cameras, the angle to which the flexible/foldable screen of the phone needs to be bent/folded is computed with a preset algorithm; a bending instruction is issued to adjust the degree of bending of each part of the flexible screen so that the screen bends/folds into the shape indicated by the instruction and the two cameras are simultaneously aimed at the focus point; the first camera and the second camera refocus and take the photos, obtaining a first image and a second image; the depth map of the shooting scene is computed from the first image and the second image, obtaining the depth value of every pixel of the shooting scene; an image processing mode is selected according to the depth values of the shooting scene and the shooting picture, and the picture is processed with a preset image processing algorithm; the picture is output.
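One common way to obtain the per-pixel depth values mentioned in this flow is semi-global block matching on a rectified image pair; the sketch below uses OpenCV's StereoSGBM for that purpose (the matcher settings, focal length, and baseline are illustrative assumptions, and FIG. 8 does not prescribe this particular algorithm):

    import cv2
    import numpy as np

    def depth_map_from_pair(left_gray, right_gray, focal_px=1500.0, baseline_mm=64.0):
        """Compute a depth map (mm) from a rectified grayscale stereo pair."""
        matcher = cv2.StereoSGBM_create(minDisparity=0,
                                        numDisparities=128,  # must be divisible by 16
                                        blockSize=5)
        # SGBM returns fixed-point disparities scaled by 16.
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        disparity[disparity <= 0] = np.nan        # invalid or occluded pixels
        return focal_px * baseline_mm / disparity  # Z = f * B / d

    # Usage (illustrative): left/right frames already rectified and converted to grayscale.
    # depth = depth_map_from_pair(cv2.cvtColor(img_l, cv2.COLOR_BGR2GRAY),
    #                             cv2.cvtColor(img_r, cv2.COLOR_BGR2GRAY))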
The flow for shooting a three-dimensional dynamic photo is shown in FIG. 9, which is a flowchart of shooting a three-dimensional dynamic picture with a mobile terminal according to an optional embodiment of the present disclosure. Select the three-dimensional dynamic-picture shooting mode and perform automatic focusing, with manual focus assistance available; the first camera and the second camera frame their views separately to obtain real-time preview images; the scene depth in the picture is computed in real time from the preview images, and the bending parameters of the flexible/foldable screen are automatically adjusted according to the position of the focus point so that the two cameras are aimed at the focus point; whether the two cameras are horizontally level is detected, and corresponding guidance prompts are given to the user; the first camera and the second camera shoot simultaneously, obtaining a first image and a second image; the depth map of the shooting scene is computed, the scene is reconstructed in three dimensions, the color image pixels are mapped onto the reconstructed scene, and transition processing is performed for images at viewpoints between the two cameras; the pictures are composited, and a three-dimensional scene photo that can be rotated and observed within a certain angle is output.
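The transition processing for viewpoints between the two cameras can be illustrated, under strong simplifications, by forward-warping one rectified image by a fraction of its per-pixel disparity; the sketch below assumes a rectified pair and a precomputed disparity map, and leaves disocclusion holes unfilled:

    import numpy as np

    def synthesize_intermediate_view(left_img, disparity, t):
        """Warp the left image toward the right view by fraction t of the disparity
        (t=0 reproduces the left view, t=1 approximates the right view)."""
        h, w = disparity.shape
        out = np.zeros_like(left_img)
        xs = np.broadcast_to(np.arange(w), (h, w))
        new_xs = np.round(xs - t * disparity).astype(int)
        valid = (new_xs >= 0) & (new_xs < w)
        # Larger disparity means a closer surface: write far pixels first so that
        # near pixels overwrite them (a crude z-buffer).
        order = np.argsort(disparity, axis=None)
        yi, xi = np.unravel_index(order, disparity.shape)
        keep = valid[yi, xi]
        out[yi[keep], new_xs[yi[keep], xi[keep]]] = left_img[yi[keep], xi[keep]]
        return out

    # Example: frames for a small left-to-right sweep that can be browsed like a
    # rotatable "dynamic" photo (left and disp assumed to be available).
    # frames = [synthesize_intermediate_view(left, disp, t) for t in np.linspace(0, 1, 9)]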
The flow of the automatic panorama shooting mode is shown in FIG. 10, which is a flowchart of automatically shooting a panoramic photo with a mobile terminal according to an optional embodiment of the present disclosure; the state of the terminal during shooting is shown in FIG. 11, which is a schematic diagram of the state of the mobile terminal while automatically shooting a panoramic photo according to an optional embodiment of the present disclosure. Select the automatic panorama shooting mode; the first camera and the second camera frame their views separately to obtain real-time preview images; obtain the current bending/folding angle of the phone screen and the body state through hardware sensors; determine whether the phone screen is currently flat (not bent/folded); if the screen is not flat, send an instruction to the screen angle controller to restore the screen to a flat state, otherwise proceed to the next step; determine whether the body is currently placed horizontally; if not, prompt the user through the interface to place the phone horizontally so that the two cameras are level, otherwise proceed to the next step; according to the preset mode, both ends of the screen bend inward simultaneously, driving the cameras to rotate, and the two cameras capture a number of pictures in real time until the maximum preset shooting angle is reached (or the user manually stops shooting); the captured images are stitched into a single panoramic photo according to a preset algorithm, and the panoramic photo is output.
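As an illustration of the last two steps (capturing frames while the screen bends and then merging them), the sketch below collects frames over a sweep of assumed bend angles and stitches them with OpenCV's high-level Stitcher; the frame source and angle limits are hypothetical, and the preset algorithm mentioned above need not be this stitcher:

    import cv2

    def auto_panorama(grab_frame, set_bend_angle, max_angle_deg=60.0, step_deg=5.0):
        """Sweep the screen bend angle, grab a frame at each step, then stitch.

        grab_frame():        returns one BGR frame from a camera (hypothetical callable)
        set_bend_angle(deg): commands the screen/camera bend angle (hypothetical callable)
        """
        frames = []
        angle = 0.0
        while angle <= max_angle_deg:            # until the maximum preset angle
            set_bend_angle(angle)
            frames.append(grab_frame())
            angle += step_deg
        stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
        status, panorama = stitcher.stitch(frames)
        if status != cv2.Stitcher_OK:
            raise RuntimeError(f"stitching failed with status {status}")
        return panorama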
In the user interface, the "A+B" button splits the screen to display the pictures from the two cameras simultaneously, as shown in FIG. 12a; the "A/B" button displays the picture of a single camera in full screen, as shown in FIG. 12b; the "Composite" button composites the pictures shot by the two cameras according to the selected shooting mode, as shown in FIG. 12c. During shooting, the interface toolbar can be hidden to improve the visual experience, as shown in FIG. 12d. When shooting a stereoscopic-effect photo, the user interface is as shown in FIGS. 12e to 12g, where FIG. 12e is a schematic diagram of the split-screen display interface for shooting a stereoscopic dynamic picture on the terminal according to an optional embodiment of the present disclosure, FIG. 12f is a schematic diagram of the single-camera picture interface for shooting a stereoscopic dynamic picture on the terminal according to an optional embodiment of the present disclosure, and FIG. 12g is a schematic diagram of the composite-picture viewing interface for shooting a stereoscopic dynamic picture on the terminal according to an optional embodiment of the present disclosure. After the stereoscopic dynamic picture has been shot, the shooting scene of the stereoscopic photo can be viewed from different angles within a certain angular range by swiping left and right on the screen.
When shooting a panoramic photo, the user interface operation is as shown in FIG. 12h, which is a schematic diagram of the interface for automatically shooting a panoramic photo on the terminal according to an optional embodiment of the present disclosure.
Through the embodiments of the present disclosure, the cameras rotate with the screen by means of the flexible screen, which simulates the principle of human stereoscopic vision and the way the human eyeball moves with the object being observed; compared with existing cameras, this reduces distortion of the shooting subject and improves picture quality. Increasing the distance between the two cameras to simulate the binocular parallax of human eyes improves the measurement accuracy of the shooting-scene depth, strengthens the stereoscopic effect of stereoscopic pictures, and also increases the distance over which stereoscopic pictures can be taken. The large shooting parallax of the dual cameras can be used at close range to shoot and composite stereoscopic dynamic scene photos, enabling the three-dimensional scene to be viewed within a certain angle. Automatic panorama shooting with the dual cameras avoids the unstable picture quality caused by rotating the phone by hand, and having the two cameras shoot toward the left and right sides simultaneously improves shooting efficiency.
Obviously, those skilled in the art should understand that the modules or steps of the present disclosure described above can be implemented with a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network formed by multiple computing devices. In an implementation, they can be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be performed in an order different from that described here, or they can be made into individual integrated circuit modules, or multiple modules or steps among them can be made into a single integrated circuit module. Thus, the present disclosure is not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the present disclosure and are not intended to limit the present disclosure; for those skilled in the art, the present disclosure may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the principles of the present disclosure shall be included in the protection scope of the present disclosure.

Claims (20)

  1. A terminal, comprising: a memory storing a computer program and a processor, wherein the processor is configured to run the computer program to perform:
    determining shooting information, wherein the shooting information comprises: a position of a shooting target or a shooting angle of view;
    adjusting, according to the position of the shooting target, axis directions of at least two cameras comprised in the terminal so that the at least two cameras focus on the shooting target; or adjusting, according to the shooting angle of view, an included angle between the axis directions of the at least two cameras;
    controlling the adjusted at least two cameras to capture images.
  2. The terminal according to claim 1, wherein the processor being configured to run the computer program to perform determining the shooting information comprises:
    in a case where the shooting information comprises the position of the shooting target, after the shooting target is determined, capturing an image of the shooting target and determining the position of the shooting target according to scene depth information of the image;
    in a case where the shooting information comprises the shooting angle of view, determining the shooting angle of view according to a shooting instruction, wherein the shooting instruction indicates the shooting angle of view.
  3. The terminal according to claim 1, wherein the processor being configured to run the computer program to perform adjusting the axis directions of the at least two cameras according to the position of the shooting target so that the cameras focus on the shooting target comprises:
    determining, according to the position of the shooting target, target axis directions of the at least two cameras when focused on the shooting target;
    determining adjustment angles of the axis directions of the at least two cameras according to the target axis directions and current axis directions of the at least two cameras;
    adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angles.
  4. The terminal according to claim 1, wherein the processor being configured to run the computer program to perform adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view comprises:
    determining a target included angle between the axis directions of the at least two cameras according to the shooting angle of view;
    determining adjustment angles of the axis directions of the at least two cameras according to the target included angle and a current included angle between the axis directions of the at least two cameras;
    adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angles, wherein the included angle between the target axis directions is the target included angle.
  5. The terminal according to claim 4, wherein the processor being configured to run the computer program to perform adjusting the current axis directions of the cameras to the target axis directions according to the adjustment angles, wherein the included angle between the target axis directions is the target included angle, comprises:
    within a predetermined time, adjusting the included angle between the axis directions of the at least two cameras so that it varies continuously within an angle range between the current included angle and the target included angle.
  6. The terminal according to claim 1, wherein the processor is further configured to run the computer program to perform:
    after a compositing instruction is received, determining depth values of pixels in the images captured by the at least two cameras;
    compositing, based on the depth values of the pixels and according to an image processing mode, the images captured by the at least two cameras to obtain a composite image.
  7. The terminal according to claim 1, further comprising:
    an adjusting component configured to adjust the axis directions of the at least two cameras.
  8. The terminal according to claim 7, wherein the adjusting component comprises a bending structure, wherein the bending structure is connected to the camera and is configured to directly adjust the axis direction of the camera; or the bending structure is connected to a bearing structure carrying the camera and is configured to adjust the axis direction of the camera by adjusting a bending state of the bearing structure.
  9. The terminal according to claim 8, wherein, when the bearing structure is in a bent state, the at least two cameras are distributed on two sides of a bending axis on the bearing structure.
  10. The terminal according to claim 1, wherein the at least two cameras are arranged on a first side surface of the terminal, wherein the first side surface is a side surface of the terminal, a side surface of a component part of the terminal, or a combined side surface formed by side surfaces of component parts of the terminal.
  11. The terminal according to claim 10, wherein
    the terminal comprises one terminal body, wherein the first side surface is a side surface of the terminal body; or
    the terminal comprises a first body and a second body connected to each other, wherein the first side surface is a combined side surface formed by side surfaces located on a same side of the first body and the second body when the terminal is in a flat, unfolded state; or
    the terminal comprises a first body and a second body connected to each other, wherein the first side surface is a side surface of the first body or a side surface of the second body.
  12. The terminal according to claim 10 or 11, wherein the cameras of the camera group are dispersedly distributed at edges of the first side surface.
  13. The terminal according to claim 10 or 11, wherein the first side surface is the side of the terminal on which a display screen is located, or the side of the terminal facing away from the display screen.
  14. A shooting method, applied to the terminal according to any one of claims 1 to 13, wherein the method comprises:
    determining shooting information, wherein the shooting information comprises: a position of a shooting target or a shooting angle of view;
    adjusting, according to the position of the shooting target, axis directions of at least two cameras comprised in the terminal so that the at least two cameras focus on the shooting target; or adjusting, according to the shooting angle of view, an included angle between the axis directions of the at least two cameras;
    controlling the adjusted at least two cameras to capture images.
  15. The shooting method according to claim 14, wherein the step of determining the shooting information comprises:
    in a case where the shooting information comprises the position of the shooting target, after the shooting target is determined, capturing an image of the shooting target and determining the position of the shooting target according to scene depth information of the image;
    in a case where the shooting information comprises the shooting angle of view, determining the shooting angle of view according to a shooting instruction, wherein the shooting instruction indicates the shooting angle of view.
  16. The shooting method according to claim 14, wherein the step of adjusting the axis directions of the at least two cameras according to the position of the shooting target so that the cameras focus on the shooting target comprises:
    determining, according to the position of the shooting target, target axis directions of the at least two cameras when focused on the shooting target;
    determining adjustment angles of the axis directions of the at least two cameras according to the target axis directions and current axis directions of the at least two cameras;
    adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angles.
  17. The shooting method according to claim 14, wherein the step of adjusting the included angle between the axis directions of the at least two cameras according to the shooting angle of view comprises:
    determining a target included angle between the axis directions of the at least two cameras according to the shooting angle of view;
    determining adjustment angles of the axis directions of the at least two cameras according to the target included angle and a current included angle between the axis directions of the at least two cameras;
    adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angles, wherein the included angle between the target axis directions is the target included angle.
  18. The shooting method according to claim 17, wherein the step of adjusting the current axis directions of the at least two cameras to the target axis directions according to the adjustment angles, wherein the included angle between the target axis directions is the target included angle, comprises:
    within a predetermined time, adjusting the included angle between the axis directions of the at least two cameras so that it varies continuously within an angle range between the current included angle and the target included angle.
  19. The shooting method according to claim 14, further comprising:
    after a compositing instruction is received, determining depth values of pixels in the images captured by the at least two cameras;
    compositing, based on the depth values of the pixels and according to an image processing mode, the images captured by the at least two cameras to obtain a composite image.
  20. A storage medium storing a computer program, wherein the computer program is configured to perform, when run, the method according to any one of claims 14 to 19.
PCT/CN2019/127417 2018-12-21 2019-12-23 一种终端、拍摄方法及存储介质 WO2020125797A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP19899106.9A EP3902236A4 (en) 2018-12-21 2019-12-23 TERMINAL, PHOTOGRAPHY PROCESS, AND STORAGE MEDIUM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811573988.5 2018-12-21
CN201811573988.5A CN111355878A (zh) 2018-12-21 2018-12-21 一种终端、拍摄方法及存储介质

Publications (1)

Publication Number Publication Date
WO2020125797A1 true WO2020125797A1 (zh) 2020-06-25

Family

ID=71100221

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/127417 WO2020125797A1 (zh) 2018-12-21 2019-12-23 一种终端、拍摄方法及存储介质

Country Status (3)

Country Link
EP (1) EP3902236A4 (zh)
CN (1) CN111355878A (zh)
WO (1) WO2020125797A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111970455A (zh) * 2020-09-14 2020-11-20 Oppo广东移动通信有限公司 信息提示方法、装置、电子设备以及存储介质
CN112040134A (zh) * 2020-09-15 2020-12-04 努比亚技术有限公司 微云台拍摄控制方法、设备及计算机可读存储介质
CN112887616A (zh) * 2021-01-27 2021-06-01 维沃移动通信有限公司 拍摄方法和电子设备

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022109855A1 (en) * 2020-11-25 2022-06-02 Qualcomm Incorporated Foldable electronic device for multi-view image capture
CN112601023A (zh) * 2020-12-15 2021-04-02 展讯通信(天津)有限公司 终端设备、视角可变的成像方法、电子设备及存储介质
CN112791417B (zh) * 2020-12-31 2023-04-11 上海米哈游天命科技有限公司 游戏画面的拍摄方法、装置、设备及存储介质
CN113099113B (zh) * 2021-03-31 2022-12-27 北京小米移动软件有限公司 电子终端、拍照方法及装置、存储介质
CN113194173A (zh) * 2021-04-29 2021-07-30 维沃移动通信(杭州)有限公司 深度数据的确定方法、装置和电子设备
CN115223028B (zh) * 2022-06-02 2024-03-29 支付宝(杭州)信息技术有限公司 场景重建及模型训练方法、装置、设备、介质及程序产品
WO2024072597A1 (en) * 2022-09-28 2024-04-04 X Development Llc Adjustable aquaculture camera mounting system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130044240A1 (en) * 2011-08-17 2013-02-21 Nokia Corporation Apparatus and method for generating image data
CN103581509A (zh) * 2012-07-24 2014-02-12 华晶科技股份有限公司 镜头结构及其摄像装置
CN105262951A (zh) * 2015-10-22 2016-01-20 努比亚技术有限公司 具有双目摄像头的移动终端及其拍照方法
CN106791298A (zh) * 2016-12-01 2017-05-31 广东虹勤通讯技术有限公司 一种具有双摄像头的终端及拍照方法

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003051872A (ja) * 2001-08-07 2003-02-21 Minolta Co Ltd 携帯型通信装置
CN100458559C (zh) * 2003-06-23 2009-02-04 宋柏君 立体数码相机及成像显示方法
FI115947B (fi) * 2004-02-25 2005-08-15 Nokia Corp Elektroninen laite ja menetelmä elektronisessa laitteessa kuvainformaation muodostamiseksi sekä ohjelmatuote menetelmän toteuttamiseksi
CN201178472Y (zh) * 2008-04-28 2009-01-07 陆静麟 可在普通影视播放设备上直接播放的立体影视的摄制设备
KR101915064B1 (ko) * 2012-08-23 2018-11-05 삼성전자주식회사 플렉서블 장치 및 그 동작 방법
CN104469111A (zh) * 2014-12-02 2015-03-25 柳州市瑞蚨电子科技有限公司 摄像头装置
CN104601892A (zh) * 2015-01-30 2015-05-06 深圳酷派技术有限公司 一种终端、图像拍摄方法及装置
GB2535706A (en) * 2015-02-24 2016-08-31 Nokia Technologies Oy Device with an adaptive camera array
CN104730802B (zh) * 2015-03-27 2017-10-17 酷派软件技术(深圳)有限公司 光轴夹角的校准、对焦方法和系统和双摄像头设备
CN205545606U (zh) * 2016-04-19 2016-08-31 乐视控股(北京)有限公司 双摄像头移动终端
CN107122717B (zh) * 2017-03-31 2021-02-09 华北科技学院 一种人脸识别防作弊信息控制装置
CN107820014B (zh) * 2017-11-27 2020-07-21 努比亚技术有限公司 一种拍摄方法、移动终端及计算机存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130044240A1 (en) * 2011-08-17 2013-02-21 Nokia Corporation Apparatus and method for generating image data
CN103581509A (zh) * 2012-07-24 2014-02-12 华晶科技股份有限公司 镜头结构及其摄像装置
CN105262951A (zh) * 2015-10-22 2016-01-20 努比亚技术有限公司 具有双目摄像头的移动终端及其拍照方法
CN106791298A (zh) * 2016-12-01 2017-05-31 广东虹勤通讯技术有限公司 一种具有双摄像头的终端及拍照方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3902236A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111970455A (zh) * 2020-09-14 2020-11-20 Oppo广东移动通信有限公司 信息提示方法、装置、电子设备以及存储介质
CN112040134A (zh) * 2020-09-15 2020-12-04 努比亚技术有限公司 微云台拍摄控制方法、设备及计算机可读存储介质
CN112040134B (zh) * 2020-09-15 2022-07-01 河北千和电子商务有限公司 微云台拍摄控制方法、设备及计算机可读存储介质
CN112887616A (zh) * 2021-01-27 2021-06-01 维沃移动通信有限公司 拍摄方法和电子设备

Also Published As

Publication number Publication date
EP3902236A1 (en) 2021-10-27
CN111355878A (zh) 2020-06-30
EP3902236A4 (en) 2022-01-26

Similar Documents

Publication Publication Date Title
WO2020125797A1 (zh) 一种终端、拍摄方法及存储介质
US10609282B2 (en) Wide-area image acquiring method and apparatus
WO2021012856A1 (zh) 一种全景图像的拍摄方法
JP5969992B2 (ja) 携帯型装置でのステレオスコピック(3d)のパノラマ生成
CN103945210B (zh) 一种实现浅景深效果的多摄像头拍摄方法
US20160295108A1 (en) System and method for panoramic imaging
CN109313346A (zh) 双目视图和单目视图之间的转换
CN104995905B (zh) 图像处理设备、拍摄控制方法和程序
CN108432230B (zh) 一种成像设备和一种用于显示场景的图像的方法
CN105530431A (zh) 一种反射式全景成像系统及方法
JP5127787B2 (ja) 複眼撮影装置及びその制御方法
CN106292162A (zh) 立体照相装置和相关控制方法
US9807372B2 (en) Focused image generation single depth information from multiple images from multiple sensors
WO2013155804A1 (zh) 一种照片拍摄方法及电子设备
JP3907008B2 (ja) 写真のための被写界の深度を増大するための方法及び手段
TW201351959A (zh) 立體全景影像合成方法及其相關之立體攝影機
CN105306921A (zh) 一种基于移动终端的三维照片拍摄方法及移动终端
JPWO2011108283A1 (ja) 立体撮像装置および立体撮像方法
CN108391116B (zh) 基于3d成像技术的全身扫描装置及扫描方法
JP2016504828A (ja) 単一のカメラを用いて3d画像を取り込む方法およびシステム
Lin et al. A low-cost portable polycamera for stereoscopic 360 imaging
JP2007264592A (ja) 3次元イメージ自動生成装置及び方法
CN108205236B (zh) 全景摄像机及其镜头
JP5822700B2 (ja) 画像撮影方法および画像撮影装置、プログラム
JP2020191624A (ja) 電子機器およびその制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19899106

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019899106

Country of ref document: EP

Effective date: 20210721