CN107948505B - Panoramic shooting method and mobile terminal - Google Patents

Panoramic shooting method and mobile terminal

Info

Publication number
CN107948505B
CN107948505B (application CN201711123919.XA)
Authority
CN
China
Prior art keywords
image data
data
panoramic
image
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711123919.XA
Other languages
Chinese (zh)
Other versions
CN107948505A (en)
Inventor
杨丑刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201711123919.XA
Publication of CN107948505A
Application granted
Publication of CN107948505B

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/60: Control of cameras or camera modules
              • H04N 23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
            • H04N 23/50: Constructional details
              • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
            • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
            • H04N 23/80: Camera processing pipelines; Components thereof
    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 3/00: Geometric image transformations in the plane of the image
            • G06T 3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
              • G06T 3/4038: Image mosaicing, e.g. composing plane images from plane sub-images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

An embodiment of the invention provides a panoramic shooting method and a mobile terminal. The method is applied to a mobile terminal that includes a first shooting component and a second shooting component, and comprises the following steps: controlling the first shooting component to acquire first image data; controlling the second shooting component to acquire second image data containing preset feature information; and generating panoramic data based on the first image data and the second image data. By controlling different shooting components to separately collect the first image data and the second image data containing the preset feature information, and by stitching the two, the embodiment can generate panoramic data with high image quality that does not blur when enlarged, where the panoramic data includes panoramic image data and panoramic video data.

Description

Panoramic shooting method and mobile terminal
Technical Field
The invention relates to the technical field of image processing, in particular to a panoramic shooting method and a mobile terminal.
Background
With the continuous development of mobile terminals, users' expectations of them keep rising, especially for the shooting function: the shooting process should be convenient and the resulting photos clear. At the same time, demand for panoramic images and panoramic videos is also growing.
A panorama captures image data of an entire scene with a professional camera, or uses pictures rendered by modeling software; the images are stitched by software and played back with a dedicated player. In other words, flat photos or computer-modeled pictures are turned into a 360-degree full view for virtual-reality browsing, so that a two-dimensional plane picture simulates a real three-dimensional space presented to the observer.
Panoramic pictures or panoramic videos are mostly shot with fisheye lenses: one or two fisheye lenses collect image data of the areas in front of and behind the camera, the collected image data is corrected and then stitched, and a complete panoramic picture or panoramic video is generated. However, limited by the resolution of the lenses, the local image quality of the resulting panoramic picture or panoramic video is poor.
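Purely as illustrative background, and not part of the disclosed method, the following minimal Python/OpenCV sketch shows the correct-then-stitch preparation step described above; the camera matrix K and the fisheye distortion coefficients D are assumed to come from a prior calibration, which the text does not specify.

    import cv2
    import numpy as np

    def correct_fisheye(image, K, D):
        """Undistort one fisheye capture before it is stitched into a panorama.

        K (3x3 camera matrix) and D (4x1 fisheye distortion coefficients) are
        assumed to be known from calibration; they are not given in the patent.
        """
        h, w = image.shape[:2]
        # Estimate a new camera matrix that keeps a reasonable field of view.
        new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
            K, D, (w, h), np.eye(3), balance=0.5)
        return cv2.fisheye.undistortImage(image, K, D, Knew=new_K)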
Disclosure of Invention
The embodiments of the invention provide a panoramic shooting method and a mobile terminal, aiming to solve the problem of poor image quality in panoramic images or panoramic videos generated by existing methods.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a panoramic shooting method, which is applied to a mobile terminal, where the mobile terminal includes a first shooting component and a second shooting component, and the method includes:
controlling the first shooting assembly to acquire first image data;
controlling the second shooting assembly to acquire second image data containing preset characteristic information;
generating panoramic data based on the first image data and the second image data;
the first shooting assembly comprises at least one fisheye camera; the second shooting assembly comprises at least one non-wide-angle camera; the panoramic data includes panoramic image data and panoramic video data.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, where the mobile terminal includes a first shooting component and a second shooting component, and the mobile terminal includes:
the first acquisition module is used for controlling the first shooting assembly to acquire first image data;
the second acquisition module is used for controlling the second shooting assembly to acquire second image data containing preset characteristic information;
a generating module configured to generate panoramic data based on the first image data and the second image data;
the first shooting assembly comprises at least one fisheye camera; the second shooting assembly comprises at least one non-wide-angle camera; the panoramic data includes panoramic image data and panoramic video data.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the panoramic shooting method described above.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the panorama shooting method described above.
In the embodiments of the invention, the first shooting component is controlled to acquire first image data, and the second shooting component is controlled to acquire second image data containing preset characteristic information; a first image processing component is controlled to perform first image processing on the first image data to generate first intermediate image data, and a second image processing component is controlled to perform second image processing on the second image data to generate second intermediate image data. According to the preset characteristic information, the data to be replaced in the first intermediate image data is determined as second target image data, and the data corresponding to the preset characteristic information in the second intermediate image data is determined as first target image data. When the pixel values of the second target image data and the first target image data differ, the first intermediate image data is interpolated before stitching, so that panoramic image data with high image quality can be generated.
Drawings
FIG. 1 is a flow chart of the steps of a first embodiment of a panorama photographing method of the present invention;
FIG. 2 is a flowchart illustrating the steps of a second embodiment of a panorama photographing method according to the present invention;
FIG. 3 is a flowchart illustrating sub-steps of step 202 according to a second embodiment of the present invention;
fig. 4 is a block diagram of a first embodiment of a mobile terminal according to the present invention;
fig. 5 is a schematic diagram of a hardware structure of a mobile terminal according to various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart illustrating steps of a first embodiment of a panorama shooting method according to the present invention is shown, where the method may be applied to a mobile terminal, where the mobile terminal may include a first shooting component and a second shooting component, and the method may specifically include the following steps:
step 101, controlling the first shooting assembly to collect first image data;
First image data, which may include a panoramic image, can be acquired by controlling the first photographing component in the mobile terminal. The first photographing component may include at least one fisheye camera.
The first photographing component may include an optical sensor, including but not limited to a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, and the like.
Step 102, controlling the second shooting assembly to collect second image data containing preset characteristic information;
When the second shooting assembly is controlled to collect the second image data, the area matching the preset characteristic information can be designated as a key area and automatically focused, so that second image data with high image quality, i.e. second image data containing the preset characteristic information, is collected. The preset characteristic information may be information predefined by the user or the mobile terminal (e.g. a person, an animal, a plant, or a car). The second shooting assembly may include at least one non-wide-angle camera, such as a telephoto lens. A telephoto lens has a focal length longer than that of a standard lens, where a standard lens has a focal length equal or close to the diagonal length of the exposed negative or image sensor.
The second photographing component may include an optical sensor, including but not limited to a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, and the like.
In a preferred embodiment, the user can modify the preset characteristic information according to his or her own needs.
Step 103, generating panoramic data based on the first image data and the second image data.
On the basis of the first image data, the panoramic data is generated by combining it with the sharp second image data acquired according to the preset characteristic information, so that the panoramic data has high sharpness and high resolution and does not blur when enlarged.
In the embodiment of the invention, the first shooting component is controlled to acquire first image data, the second shooting component is controlled to acquire second image data containing preset characteristic information, and panoramic data is generated based on the first image data and the second image data. Since the first shooting component uses a fisheye camera and the second shooting component uses a non-wide-angle camera, the second image data has higher quality than the first image data (for example, it is sharper and has higher resolution), so panoramic data with high image quality, including panoramic image data and panoramic video data, can be generated.
Referring to fig. 2, a flowchart illustrating steps of a second embodiment of a panorama shooting method according to the present invention is shown, where the method may be applied to a mobile terminal, where the mobile terminal may include a first shooting component and a second shooting component, and the method may specifically include the following steps:
step 201, controlling the first shooting assembly to collect first image data;
First image data, which may include a panoramic image, can be acquired by controlling the first photographing component in the mobile terminal. The first photographing component may include at least one fisheye camera.
The first photographing component may include an optical sensor, including but not limited to a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, and the like.
In a preferred embodiment, step 201 may include: controlling the first shooting assembly to collect first video data; at least one frame of image is cut out from the first video data as the first image data.
The first shooting component can collect image data or video data; when it collects video data, at least one frame can be extracted from the video data as the first image data. The video data may be a panoramic video, and at least one frame can be extracted from the panoramic video as the first image data.
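Purely as an illustrative, non-limiting sketch (Python and OpenCV are assumed; the patent does not prescribe an implementation), one way to cut a frame out of the collected video data as still image data; the file path and frame index are examples only.

    import cv2

    def extract_frame(video_path, frame_index=0):
        """Grab one frame from collected video data to use as still image data."""
        capture = cv2.VideoCapture(video_path)
        capture.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
        ok, frame = capture.read()
        capture.release()
        if not ok:
            raise IOError("could not read frame %d from %s" % (frame_index, video_path))
        return frame

    # e.g. first_image_data = extract_frame("first_video.mp4")  # illustrative path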
Step 202, controlling the second shooting assembly to collect second image data containing preset characteristic information;
When the second shooting assembly is controlled to collect the second image data, the area matching the preset characteristic information can be designated as a key area and automatically focused, so that second image data with high image quality, i.e. second image data containing the preset characteristic information, is collected. The preset characteristic information may be information predefined by the user or the mobile terminal (e.g. a person, an animal, a plant, or a car). The second shooting assembly may include at least one non-wide-angle camera, such as a telephoto lens. A telephoto lens has a focal length longer than that of a standard lens, where a standard lens has a focal length equal or close to the diagonal length of the exposed negative or image sensor.
The second photographing component may include an optical sensor, including but not limited to a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, and the like.
In a preferred embodiment, the user can modify the preset characteristic information according to his or her own needs.
Fig. 3 is a flowchart illustrating sub-steps of step 202 according to a second embodiment of the present invention.
In a preferred embodiment, step 202 may comprise:
Substep 2021, performing feature detection on a preview image acquired by the second shooting assembly;
Before the second shooting component determines the second image data, a preview image from the second shooting component needs to be acquired; the preview image need not be saved locally.
In this embodiment, feature detection may be performed on the acquired preview image with an existing algorithm, according to the preset feature information, to determine whether the preset feature information exists in the preview image.
For example, if the preset feature information is face information, a face recognition algorithm may be adopted to detect the preview image and determine whether the preview image has the face information.
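As a hedged illustration of this example only (not the claimed method itself), the sketch below uses OpenCV's stock Haar cascade as a stand-in for the unspecified existing face recognition algorithm; the cascade file and detection parameters are assumptions, and an actual terminal would likely use its own detector.

    import cv2

    # Stock Haar cascade shipped with OpenCV, standing in for the unspecified
    # "existing algorithm" that checks the preview image for face information.
    _face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def preview_has_face(preview_bgr):
        """Return True if at least one face is detected in the preview frame."""
        gray = cv2.cvtColor(preview_bgr, cv2.COLOR_BGR2GRAY)
        faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return len(faces) > 0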
Substep 2022, if it is detected that the preview image contains the preset feature information, controlling the second shooting assembly to acquire second image data.
If the preview image is determined to contain the preset feature information, the area matching the preset feature information can be focused, and the focused preview image is collected as the second image data. Because the area matching the preset feature information is in focus, the acquired second image data does not blur when enlarged.
In another preferred embodiment, step 202 may further include: controlling the second shooting assembly to collect second video data; and intercepting at least one frame of image from the second video data as the second image data.
The second shooting component can collect image data or video data; when it collects video data, at least one frame can be extracted from the video data as the second image data. The video data may be a non-panoramic video, and at least one frame can be extracted from this non-panoramic video as the second image data.
Step 203, generating panoramic data based on the first image data and the second image data.
On the basis of the first image data, the panoramic data is generated by combining it with the high-quality second image data acquired according to the preset characteristic information, so that the panoramic data has high image quality and does not blur when enlarged; the panoramic data includes panoramic image data and panoramic video data.
For example, suppose the resolution of the first shooting component is 2 MP (megapixels) with field angle a1, and the resolution of the second shooting component is 20 MP with field angle a2. In general, the field angle of the camera used to acquire the second image data is smaller than that of the camera used to acquire the first image data, i.e. a2/a1 < 1. For example, if a2/a1 is 1/32, the resolution of the finally formed panoramic data is 64 MP.
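One way to read this example (an interpretation of the figures above, not something stated explicitly) is that the first image data is interpolated by the field-angle ratio a1/a2 so that its pixel density approaches that of the replaced region, giving 2 MP × 32 = 64 MP.

    # Illustrative arithmetic only; treating a1/a2 as the interpolation factor
    # for the first image data is an assumption about how 64 MP is obtained.
    first_resolution_mp = 2        # first shooting component (fisheye)
    second_resolution_mp = 20      # second shooting component (non-wide-angle)
    a2_over_a1 = 1 / 32            # field-angle ratio from the example

    panorama_resolution_mp = first_resolution_mp / a2_over_a1
    print(panorama_resolution_mp)  # 64.0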
In this method embodiment, the first shooting assembly and the second shooting assembly only need to be able to acquire the first image data and the second image data respectively; their positions and the number of optical sensors each of them includes are not limited, and each may include multiple optical sensors located at different positions.
In a preferred embodiment, the mobile terminal further includes a first image processing component and a second image processing component, and step 203 may include:
controlling the first image processing component to perform first image processing on the first image data to generate first intermediate image data; controlling the second image processing component to perform second image processing on the second image data to generate second intermediate image data; and splicing the first intermediate image data and the second intermediate image data to generate the panoramic data.
The first image processing component may be a first ISP (Image Signal Processor); the first ISP can be controlled to perform the first image processing on the first image data to generate the first intermediate image data, where the first image processing may include, but is not limited to, white balance, noise reduction, automatic exposure, distortion correction, and the like.
The second image processing component may be a second ISP, and may control the second ISP to perform second image processing on the second image data to generate second intermediate image data, wherein the second image processing includes, but is not limited to, white balance, noise reduction, automatic exposure, and the like.
Because an independent ISP is used to perform the second image processing on the second image data, the generated second intermediate image data has a better imaging effect. Since the second intermediate image data contains the preset characteristic information, the data corresponding to the preset characteristic information in the stitched panoramic data also has a good imaging effect, including but not limited to high sharpness, high color fidelity, and low noise.
In another preferred embodiment, the same image processing component may be controlled, that is, the same ISP may be used to complete both the first image processing and the second image processing; this method embodiment does not limit the type, number, or model of the ISPs.
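The image processing above runs on dedicated ISP hardware. Purely as a software sketch of two of the named operations (white balance and noise reduction), assuming OpenCV and a gray-world balance that the patent does not prescribe, the processing could look like the following.

    import cv2
    import numpy as np

    def simple_image_processing(image_bgr):
        """Software stand-in for two ISP steps named above: white balance and
        noise reduction. Assumes a 3-channel BGR image; parameters are illustrative."""
        # Gray-world white balance: scale each channel toward the common mean.
        img = image_bgr.astype(np.float32)
        channel_means = img.reshape(-1, 3).mean(axis=0)
        img *= channel_means.mean() / channel_means
        balanced = np.clip(img, 0, 255).astype(np.uint8)
        # Noise reduction.
        return cv2.fastNlMeansDenoisingColored(balanced, None, 5, 5, 7, 21)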
The area in the first intermediate image data that matches the second intermediate image data is replaced with the second intermediate image data, completing the stitching and generating the panoramic data.
In a preferred embodiment, the step of generating the panoramic data by stitching the first intermediate image data and the second intermediate image data may include:
determining first target image data corresponding to the preset characteristic information in the second intermediate image data; determining second target image data corresponding to the preset characteristic information in the first intermediate image data; and replacing the second target image data with the first target image data to generate the panoramic data.
The first intermediate image data and the second intermediate image data can be identified with an existing algorithm to determine whether the preset characteristic information exists in each of them. If the preset characteristic information exists in the first intermediate image data, the data corresponding to the preset characteristic information in the first intermediate image data is determined as the second target image data; if the preset characteristic information exists in the second intermediate image data, the data corresponding to the preset characteristic information in the second intermediate image data is determined as the first target image data.
The second target image data is then replaced with the first target image data, so that the first intermediate image data and the second intermediate image data are combined to generate the panoramic data.
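A minimal sketch of this replacement-based stitching, assuming the bounding box of the second target image data is already known from the feature detection step; the names and the resize fallback are illustrative and not taken from the patent.

    import cv2

    def stitch_by_replacement(first_intermediate, first_target_patch, second_target_box):
        """Overwrite the second target image data (the feature region in the first
        intermediate image, at second_target_box = (x, y, w, h)) with the first
        target image data cut from the second intermediate image."""
        x, y, w, h = second_target_box
        panorama = first_intermediate.copy()
        # Resize only as a fallback if the two regions still differ in size;
        # the patent instead interpolates the whole first intermediate image first.
        patch = cv2.resize(first_target_patch, (w, h), interpolation=cv2.INTER_AREA)
        panorama[y:y + h, x:x + w] = patch
        return panorama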
In a preferred embodiment, the first target image data comprises first pixel values and the second target image data comprises second pixel values;
before replacing the second target image data with the first target image data and generating the panoramic data, the method further includes:
judging whether the first pixel value is equal to the second pixel value;
if the first pixel value is not equal to the second pixel value, performing interpolation processing on the first intermediate image data according to the ratio of the first pixel value to the second pixel value;
the first pixel value is the total number of all pixel points in the first target image data, and the second pixel value is the total number of all pixel points in the second target image data.
The first image data and the second image data are acquired by different shooting assemblies, and the total number of pixels contained in image data acquired by different shooting assemblies may differ, so the pixel values of the second target image data and the first target image data may be different.
If the first pixel value differs from the second pixel value, the first intermediate image data needs to be interpolated; otherwise the panoramic data generated after stitching will be distorted.
For example, if the first pixel value is 2 million and the second pixel value is 1 million, the first intermediate image data needs to be interpolated according to the ratio of the first pixel value to the second pixel value, i.e. 2:1. Specifically, an X-Y rectangular coordinate system can be established in the first intermediate image data, and 0.414 times as many pixels are inserted along the X direction and the Y direction (a factor of about 1.414, i.e. the square root of 2, per axis), so that after interpolation the pixel value of the second target image data equals that of the first target image data; the stitching is then performed. Methods of interpolating pixels include, but are not limited to, inserting the average of two adjacent pixels between them.
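A hedged sketch of this interpolation, using OpenCV's bilinear resize (which averages adjacent pixels, much like the method described); the per-axis factor is the square root of the pixel-count ratio, about 1.414 for the 2:1 example.

    import math
    import cv2

    def interpolate_first_intermediate(first_intermediate, first_pixel_value, second_pixel_value):
        """Scale up the first intermediate image so the region to be replaced
        reaches the pixel count of the first target image data.

        For the 2 million : 1 million example, the per-axis factor is
        sqrt(2) ~ 1.414, i.e. about 0.414 extra pixels per axis."""
        axis_scale = math.sqrt(first_pixel_value / second_pixel_value)
        height, width = first_intermediate.shape[:2]
        new_size = (round(width * axis_scale), round(height * axis_scale))
        return cv2.resize(first_intermediate, new_size, interpolation=cv2.INTER_LINEAR)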
In the embodiments of the invention, the first shooting component is controlled to acquire first image data, and the second shooting component is controlled to acquire second image data containing preset characteristic information; the first image processing component is controlled to perform first image processing on the first image data to generate first intermediate image data, and the second image processing component is controlled to perform second image processing on the second image data to generate second intermediate image data. According to the preset characteristic information, the data to be replaced in the first intermediate image data is determined as second target image data, and the data corresponding to the preset characteristic information in the second intermediate image data is determined as first target image data. When the pixel values of the second target image data and the first target image data differ, the first intermediate image data is interpolated before stitching, so that panoramic image data with high image quality can be generated. Preferably, at least one frame of image can be extracted from the first video data collected by the first shooting component as the first image data, and at least one frame of image can be extracted from the second video data collected by the second shooting component as the second image data, so that video data is processed and panoramic video data is further generated.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 4, a block diagram of a first embodiment of a mobile terminal according to the present invention is shown, where the mobile terminal includes a first shooting component and a second shooting component, and may specifically include the following modules:
a first collecting module 301, configured to control the first shooting assembly to collect first image data;
a second collecting module 302, configured to control the second shooting assembly to collect second image data including preset feature information;
a generating module 303, configured to generate panoramic data based on the first image data and the second image data;
the first shooting assembly comprises at least one fisheye camera; the second shooting assembly comprises at least one non-wide-angle camera; the panoramic data includes panoramic image data and panoramic video data.
In a preferred embodiment of the present invention, the first acquisition module 301 may include:
the first video acquisition unit is used for controlling the first shooting assembly to acquire first video data;
the first cut-off unit is used for cutting off at least one frame of image from the first video data to serve as the first image data.
In a preferred embodiment of the present invention, the second acquiring module 302 may include:
the detection unit is used for performing feature detection on a preview image acquired by the second shooting assembly;
and the second image acquisition unit is used for controlling the second shooting assembly to acquire second image data if the preview image is detected to have the preset characteristic information.
In another preferred embodiment of the present invention, the second acquisition module 302 may further include:
the second video acquisition unit is used for controlling the second shooting assembly to acquire second video data;
and the second screenshot unit is used for intercepting at least one frame of image from the second video data as the second image data.
In a preferred embodiment of the present invention, the mobile terminal further includes a first image processing component and a second image processing component, and the generating module 303 may include:
a first generation unit configured to control the first image processing component to perform first image processing on the first image data, and generate first intermediate image data;
a second generating unit configured to control the second image processing component to perform second image processing on the second image data to generate second intermediate image data;
and a third generating unit, configured to splice the first intermediate image data and the second intermediate image data, and generate the panoramic data.
In a preferred embodiment of the present invention, the third generating unit may include:
the first target subunit is used for determining first target image data corresponding to the preset feature information in the second intermediate image data;
the second target subunit is used for determining second target image data corresponding to the preset characteristic information in the first intermediate image data;
a replacing subunit, configured to replace the second target image data with the first target image data, and generate the panoramic data.
In a preferred embodiment of the present invention, the first target image data includes a first pixel value, and the second target image data includes a second pixel value; the third generating unit may further include:
a pixel comparison subunit, configured to determine whether the first pixel value is equal to the second pixel value;
an interpolation subunit, configured to perform interpolation processing on the first intermediate image data according to a ratio of the first pixel value to the second pixel value if the first pixel value is not equal to the second pixel value;
the first pixel value is the total number of all pixel points in the first target image data, and the second pixel value is the total number of all pixel points in the second target image data.
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 3, and is not described herein again to avoid repetition.
In the embodiments of the invention, the first shooting component is controlled to acquire first image data, and the second shooting component is controlled to acquire second image data containing preset characteristic information; the first image processing component is controlled to perform first image processing on the first image data to generate first intermediate image data, and the second image processing component is controlled to perform second image processing on the second image data to generate second intermediate image data. According to the preset characteristic information, the data to be replaced in the first intermediate image data is determined as second target image data, and the data corresponding to the preset characteristic information in the second intermediate image data is determined as first target image data. When the pixel values of the second target image data and the first target image data differ, the first intermediate image data is interpolated before stitching, so that panoramic image data with high image quality can be generated. Preferably, at least one frame of image can be extracted from the first video data collected by the first shooting component as the first image data, and at least one frame of image can be extracted from the second video data collected by the second shooting component as the second image data, so that video data is processed and panoramic video data is further generated.
Fig. 5 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 400 includes, but is not limited to: radio frequency unit 401, network module 402, audio output unit 403, input unit 404, sensor 405, display unit 406, user input unit 407, interface unit 408, memory 409, processor 410, and power supply 411. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 5 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 401 is configured to control the first shooting assembly to acquire first image data; controlling the second shooting assembly to acquire second image data containing preset characteristic information;
a processor 410 for generating panoramic data based on the first image data and the second image data;
the first shooting assembly comprises at least one fisheye camera; the second shooting assembly comprises at least one non-wide-angle camera; the panoramic data includes panoramic image data and panoramic video data.
In the embodiment of the invention, the first shooting component is controlled to acquire first image data, the second shooting component is controlled to acquire second image data containing preset characteristic information, and panoramic data is generated based on the first image data and the second image data. Since the first shooting component uses a fisheye camera and the second shooting component uses a non-wide-angle camera, the second image data has higher quality than the first image data (for example, it is sharper and has higher resolution), so panoramic data with high image quality, including panoramic image data and panoramic video data, can be generated.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 401 may be used for receiving and sending signals during a messaging or call process; specifically, it receives downlink data from a base station and forwards it to the processor 410 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 401 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 402, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output as sound. Also, the audio output unit 403 may also provide audio output related to a specific function performed by the mobile terminal 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a Graphics Processing Unit (GPU) 4041 and a microphone 4042; the graphics processor 4041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 406. The image frames processed by the graphics processor 4041 may be stored in the memory 409 (or another storage medium) or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 401 and then output.
The mobile terminal 400 also includes at least one sensor 405, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 4061 and/or the backlight when the mobile terminal 400 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 406 is used to display information input by the user or information provided to the user. The Display unit 406 may include a Display panel 4061, and the Display panel 4061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 407 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. Touch panel 4071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 4071 using a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 410, receives a command from the processor 410, and executes the command. In addition, the touch panel 4071 can be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 4071, the user input unit 407 may include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 4071 can be overlaid on the display panel 4061. When the touch panel 4071 detects a touch operation on or near it, the touch operation is transmitted to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in fig. 5 the touch panel 4071 and the display panel 4061 are two separate components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 408 is an interface through which an external device is connected to the mobile terminal 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 400 or may be used to transmit data between the mobile terminal 400 and external devices.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 409 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 410 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby integrally monitoring the mobile terminal. Processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
The mobile terminal 400 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 400 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 410, a memory 409, and a computer program that is stored in the memory 409 and can be run on the processor 410, and when being executed by the processor 410, the computer program implements each process of the above-mentioned panorama shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the panoramic shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A panoramic shooting method is applied to a mobile terminal, the mobile terminal comprises a first shooting component and a second shooting component, and the panoramic shooting method is characterized by comprising the following steps:
controlling the first shooting assembly to acquire first image data;
controlling the second shooting assembly to acquire second image data containing preset characteristic information;
generating panoramic data based on the first image data and the second image data;
the first shooting assembly comprises at least one fisheye camera; the second shooting assembly comprises at least one non-wide-angle camera; the panoramic data comprises panoramic image data and panoramic video data;
wherein, the step of controlling the second shooting component to collect second image data containing preset characteristic information comprises:
performing feature detection on a preview image acquired by the second shooting assembly;
if the preview image is detected to have the preset feature information, controlling the second shooting assembly to acquire second image data;
the mobile terminal also comprises a first image processing component and a second image processing component;
the step of generating panoramic data based on the first image data and the second image data includes:
controlling the first image processing component to perform first image processing on the first image data to generate first intermediate image data;
controlling the second image processing component to perform second image processing on the second image data to generate second intermediate image data;
splicing the first intermediate image data and the second intermediate image data to generate the panoramic data;
the step of generating the panoramic data by splicing the first intermediate image data and the second intermediate image data includes:
determining first target image data corresponding to the preset characteristic information in the second intermediate image data;
determining second target image data corresponding to the preset characteristic information in the first intermediate image data;
replacing the second target image data with the first target image data to generate the panoramic data;
the first target image data comprises first pixel values and the second target image data comprises second pixel values;
before replacing the second target image data with the first target image data and generating the panoramic data, the method further includes:
judging whether the first pixel value is equal to the second pixel value;
if the first pixel value is not equal to the second pixel value, performing interpolation processing on the first intermediate image data according to the ratio of the first pixel value to the second pixel value;
the first pixel value is the total number of all pixel points in the first target image data, and the second pixel value is the total number of all pixel points in the second target image data;
if the preview image is detected to have the preset feature information, controlling the second shooting assembly to acquire second image data comprises:
and focusing an area matched with the preset characteristic information if the preset characteristic information exists in the preview image, and acquiring the focused preview image as the second image data.
2. The method of claim 1, wherein the step of controlling the first shooting assembly to acquire first image data further comprises:
controlling the first shooting assembly to collect first video data;
and cutting out at least one frame of image from the first video data as the first image data.
3. The method of claim 1, wherein the step of controlling the second shooting assembly to acquire second image data containing preset characteristic information further comprises:
controlling the second shooting assembly to collect second video data;
and intercepting at least one frame of image from the second video data as the second image data.
4. A mobile terminal comprising a first shooting assembly and a second shooting assembly, the mobile terminal comprising:
the first acquisition module is used for controlling the first shooting assembly to acquire first image data;
the second acquisition module is used for controlling the second shooting assembly to acquire second image data containing preset characteristic information;
a generating module configured to generate panoramic data based on the first image data and the second image data;
the first shooting assembly comprises at least one fisheye camera; the second shooting assembly comprises at least one non-wide-angle camera; the panoramic data comprises panoramic image data and panoramic video data;
wherein the second acquisition module comprises:
the detection unit is used for performing feature detection on a preview image acquired by the second shooting assembly;
the second image acquisition unit is used for controlling the second shooting assembly to acquire second image data if the preview image is detected to have the preset feature information;
the mobile terminal also comprises a first image processing component and a second image processing component;
the generation module comprises:
a first generation unit configured to control the first image processing component to perform first image processing on the first image data, and generate first intermediate image data;
a second generating unit configured to control the second image processing component to perform second image processing on the second image data to generate second intermediate image data;
a third generating unit, configured to splice the first intermediate image data and the second intermediate image data to generate the panoramic data;
the third generation unit includes:
the first target subunit is used for determining first target image data corresponding to the preset characteristic information in the second intermediate image data;
the second target subunit is used for determining second target image data corresponding to the preset characteristic information in the first intermediate image data;
a replacement subunit configured to replace the second target image data with the first target image data, and generate the panoramic data;
the first target image data comprises first pixel values and the second target image data comprises second pixel values;
the third generating unit further includes:
a pixel comparison subunit, configured to determine whether the first pixel value is equal to the second pixel value;
an interpolation subunit, configured to perform interpolation processing on the first intermediate image data according to a ratio of the first pixel value to the second pixel value if the first pixel value is not equal to the second pixel value;
the first pixel value is the total number of all pixel points in the first target image data, and the second pixel value is the total number of all pixel points in the second target image data;
the second image acquisition unit is specifically configured to, if it is determined that the preset feature information exists in the preview image, focus an area matched with the preset feature information, and acquire the focused preview image as the second image data.
5. The mobile terminal of claim 4, wherein the first acquisition module further comprises:
the first video acquisition unit is used for controlling the first shooting assembly to acquire first video data;
the first cut-off unit is used for cutting off at least one frame of image from the first video data to serve as the first image data.
6. The mobile terminal of claim 4, wherein the second acquisition module further comprises:
the second video acquisition unit is used for controlling the second shooting assembly to acquire second video data;
and the second screenshot unit is used for intercepting at least one frame of image from the second video data as the second image data.
7. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the panorama shooting method of any one of claims 1 to 3.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the panorama photographing method according to any one of claims 1 to 3.
CN201711123919.XA 2017-11-14 2017-11-14 Panoramic shooting method and mobile terminal Active CN107948505B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711123919.XA CN107948505B (en) 2017-11-14 2017-11-14 Panoramic shooting method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711123919.XA CN107948505B (en) 2017-11-14 2017-11-14 Panoramic shooting method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107948505A CN107948505A (en) 2018-04-20
CN107948505B true CN107948505B (en) 2020-06-23

Family

ID=61932065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711123919.XA Active CN107948505B (en) 2017-11-14 2017-11-14 Panoramic shooting method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107948505B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005334B (en) * 2018-06-15 2020-07-03 清华-伯克利深圳学院筹备办公室 Imaging method, device, terminal and storage medium
CN108965719B (en) * 2018-08-08 2021-05-28 常山赛翁思智能科技有限公司 Image processing method and device
CN108965742B (en) 2018-08-14 2021-01-22 京东方科技集团股份有限公司 Special-shaped screen display method and device, electronic equipment and computer readable storage medium
CN110875998A (en) * 2018-08-30 2020-03-10 宏碁股份有限公司 Panoramic photographic device and image mapping combination method thereof
TWI678660B (en) * 2018-10-18 2019-12-01 宏碁股份有限公司 Electronic system and image processing method
CN109618093A (en) * 2018-12-14 2019-04-12 深圳市云宙多媒体技术有限公司 A kind of panoramic video live broadcasting method and system
CN112601007B (en) * 2020-11-11 2022-06-28 联想(北京)有限公司 Image acquisition method and device for characteristic region

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106385541A (en) * 2016-09-30 2017-02-08 虹软(杭州)科技有限公司 Method for realizing zooming through wide-angle photographing component and long-focus photographing component
CN106454121A (en) * 2016-11-11 2017-02-22 努比亚技术有限公司 Double-camera shooting method and device
CN107155064A (en) * 2017-06-23 2017-09-12 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN107249096A (en) * 2016-06-14 2017-10-13 杭州海康威视数字技术股份有限公司 Panoramic camera and its image pickup method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150271400A1 (en) * 2014-03-19 2015-09-24 Htc Corporation Handheld electronic device, panoramic image forming method and non-transitory machine readable medium thereof
CN105827934B (en) * 2015-07-01 2019-08-20 维沃移动通信有限公司 The image processing method and electronic equipment of a kind of electronic equipment
CN105959546A (en) * 2016-05-25 2016-09-21 努比亚技术有限公司 Panorama shooting device and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107249096A (en) * 2016-06-14 2017-10-13 杭州海康威视数字技术股份有限公司 Panoramic camera and its image pickup method
CN106385541A (en) * 2016-09-30 2017-02-08 虹软(杭州)科技有限公司 Method for realizing zooming through wide-angle photographing component and long-focus photographing component
CN106454121A (en) * 2016-11-11 2017-02-22 努比亚技术有限公司 Double-camera shooting method and device
CN107155064A (en) * 2017-06-23 2017-09-12 维沃移动通信有限公司 A kind of image pickup method and mobile terminal

Also Published As

Publication number Publication date
CN107948505A (en) 2018-04-20

Similar Documents

Publication Publication Date Title
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
CN107948505B (en) Panoramic shooting method and mobile terminal
CN107592466B (en) Photographing method and mobile terminal
CN111541845B (en) Image processing method and device and electronic equipment
CN111083380B (en) Video processing method, electronic equipment and storage medium
CN106937039B (en) Imaging method based on double cameras, mobile terminal and storage medium
WO2021051995A1 (en) Photographing method and terminal
CN109688322B (en) Method and device for generating high dynamic range image and mobile terminal
CN110784651B (en) Anti-shake method and electronic equipment
CN107566730B (en) A kind of panoramic picture image pickup method and mobile terminal
CN108989678B (en) Image processing method and mobile terminal
CN111064895B (en) Virtual shooting method and electronic equipment
CN109905603B (en) Shooting processing method and mobile terminal
CN109474786B (en) Preview image generation method and terminal
CN110266957B (en) Image shooting method and mobile terminal
CN111246106B (en) Image processing method, electronic device, and computer-readable storage medium
CN107623818B (en) Image exposure method and mobile terminal
CN110213484B (en) Photographing method, terminal equipment and computer readable storage medium
CN111885307B (en) Depth-of-field shooting method and device and computer readable storage medium
CN111145192A (en) Image processing method and electronic device
CN111601032A (en) Shooting method and device and electronic equipment
CN111083371A (en) Shooting method and electronic equipment
CN111246102A (en) Shooting method, shooting device, electronic equipment and storage medium
CN108616687B (en) Photographing method and device and mobile terminal
CN112188082A (en) High dynamic range image shooting method, shooting device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant