CN112188096A - Photographing method and device, terminal and storage medium


Info

Publication number
CN112188096A
Authority
CN
China
Prior art keywords
image
camera
terminal
blurring
region
Prior art date
Legal status
Pending
Application number
CN202011031187.3A
Other languages
Chinese (zh)
Inventor
彭聪 (Peng Cong)
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202011031187.3A
Publication of CN112188096A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/951: Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/08: Systems determining position data of a target for measuring distance only
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums


Abstract

The disclosure relates to a photographing method and apparatus, a terminal, and a storage medium. The method is applied to a terminal comprising a first camera and a second camera, the viewing-angle range of the second camera being larger than that of the first camera, and includes the following steps: determining the distance between the terminal and a target object; if the distance is smaller than a preset distance threshold, photographing the target object through the second camera to obtain a second image while the first camera photographs the target object to obtain a first image comprising a blurred region; and adjusting the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved. This method improves photographing quality.

Description

Photographing method and device, terminal and storage medium
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a photographing method and apparatus, a terminal, and a storage medium.
Background
With the popularization of terminal devices, photographing has become one of their most commonly used functions. In the digital era, film has been replaced by photosensitive elements, namely CCD and CMOS image sensors.
In the related art, when photographing an object at close range, such as a document, a camera with a large photosensitive element is generally used. This is because a large photosensitive element (a "large sensor") has good light-gathering characteristics, so the image it forms is sharper.
However, when photographing with a camera having a large photosensitive element, the edges of the image may become blurred.
Disclosure of Invention
The disclosure provides a photographing method and device, a terminal and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a photographing method applied to a terminal comprising a first camera and a second camera, the viewing-angle range of the second camera being larger than that of the first camera, the method comprising:
determining the distance between the terminal and a target object;
if the distance is smaller than a preset distance threshold, photographing the target object through the second camera to obtain a second image while photographing the target object through the first camera to obtain a first image comprising a blurred region;
and adjusting the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved.
Optionally, the adjusting the blurred region using the image content of the second image that corresponds to the blurred region, to obtain the target image in which the sharpness of the blurred region is improved, includes:
determining, according to the viewing-angle ranges of the first camera and the second camera, the pixel coordinate interval in the second image that corresponds to the blurred region in the first image;
and fusing the pixels in that pixel coordinate interval of the second image with the pixels of the blurred region in the first image to obtain the target image.
Optionally, the fusing the pixels in the corresponding pixel coordinate interval of the second image with the pixels of the blurred region in the first image to obtain the target image includes:
performing interpolation compression on the pixels in the corresponding pixel coordinate interval of the second image to obtain an interpolation result of the same size as the blurred portion of the first image;
and replacing the blurred portion of the first image with the interpolation result to obtain the target image.
Optionally, the determining the distance between the terminal and the target object includes:
determining the distance between the terminal and the target object through a time-of-flight (TOF) module.
Optionally, the first camera is a camera whose photosensitive element is larger than a size threshold, and the second camera is a wide-angle camera.
According to a second aspect of the embodiments of the present disclosure, there is provided a photographing apparatus applied to a terminal comprising a first camera and a second camera, the viewing-angle range of the second camera being larger than that of the first camera, the apparatus comprising:
a determining module configured to determine the distance between the terminal and a target object;
a photographing module configured to, if the distance is smaller than a preset distance threshold, photograph the target object through the second camera to obtain a second image while the first camera photographs the target object to obtain a first image comprising a blurred region;
and an adjusting module configured to adjust the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved.
Optionally, the adjusting module is specifically configured to determine, according to the viewing-angle ranges of the first camera and the second camera, the pixel coordinate interval in the second image that corresponds to the blurred region in the first image, and to fuse the pixels in that interval of the second image with the pixels of the blurred region in the first image to obtain the target image.
Optionally, the adjusting module is specifically configured to perform interpolation compression on the pixels in the corresponding pixel coordinate interval of the second image to obtain an interpolation result of the same size as the blurred portion of the first image, and to replace the blurred portion of the first image with the interpolation result to obtain the target image.
Optionally, the determining module is specifically configured to determine the distance between the terminal and the target object through a time-of-flight (TOF) module.
Optionally, the first camera is a camera whose photosensitive element is larger than a size threshold, and the second camera is a wide-angle camera.
According to a third aspect of the embodiments of the present disclosure, there is provided a terminal, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the photographing method as described in the first aspect above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a storage medium, wherein instructions in the storage medium, when executed by a processor of the terminal, enable the terminal to perform the photographing method described in the first aspect above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
In the embodiments of the present disclosure, when the terminal takes a photograph and determines that the distance between itself and the target object is smaller than the preset distance threshold, it photographs the target object with both the first camera and the second camera, whose viewing-angle range is larger than that of the first camera, to obtain a first image and a second image. Because the second camera's viewing-angle range is larger, the region that is blurred in the first image may be sharper in the second image. Exploiting this imaging advantage of the second camera, the disclosure adjusts the blurred region of the first image using the second image, yielding a target image with improved sharpness in that region, thereby improving the imaging quality of photographs and the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a photographing method according to an embodiment of the present disclosure.
Fig. 2 is an exemplary diagram of a lens in a mobile phone.
Fig. 3 is an exemplary diagram of a blurring region.
Fig. 4 is a diagram illustrating a photographing apparatus according to an exemplary embodiment.
Fig. 5 is a block diagram of a terminal shown in an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart of a photographing method according to an embodiment of the present disclosure. As shown in Fig. 1, the method is applied to a terminal comprising a first camera and a second camera, the viewing-angle range of the second camera being larger than that of the first camera, and includes:
s11, determining the distance between the terminal and the target object;
s12, if the distance is smaller than a preset distance threshold, when the first camera takes a picture of the target object to obtain a first image comprising a blurring area, the second camera takes a picture of the target object to obtain a second image;
and S13, adjusting the blurring area by using the image content of the second image corresponding to the blurring area, and obtaining the target image with the blurring area with improved definition.
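Taken together, steps S11 to S13 form a simple capture pipeline. The following Python sketch is illustrative only: `measure_distance`, `capture`, and `deblur_edges` are hypothetical names standing in for the TOF ranging, dual-camera capture, and blur-repair steps detailed below, and the threshold value is an assumed placeholder, not a value from the disclosure.

```python
# Illustrative sketch of steps S11-S13 (all names and values hypothetical).

DISTANCE_THRESHOLD_M = 0.15  # preset distance threshold (assumed placeholder)

def photograph(terminal):
    # S11: determine the distance between the terminal and the target object,
    # e.g. via the TOF module described later.
    distance = terminal.measure_distance()

    if distance < DISTANCE_THRESHOLD_M:
        # S12: at close range, capture with both cameras.
        first_image = terminal.first_camera.capture()    # large sensor; edges may blur
        second_image = terminal.second_camera.capture()  # wide angle; edges stay sharp
        # S13: repair the blurred region of the first image using the
        # corresponding content of the second image.
        return deblur_edges(first_image, second_image)

    # Beyond the threshold no edge blurring is expected, so the
    # first camera is used alone.
    return terminal.first_camera.capture()
```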
In the embodiments of the present disclosure, terminal devices include mobile devices, such as mobile phones, tablet computers, and wearable devices, and stationary devices, including but not limited to personal computers (PCs).
Two cameras are provided in the terminal device, referred to herein as the first camera and the second camera, the viewing-angle range of the second camera being larger than that of the first camera. When photographing at close range, the photosensitive element senses less light at the edges of the viewing-angle range than in its central area, so blurring may occur in the edge region of the image. When lenses with different viewing-angle ranges photograph from the same distance, the larger viewing-angle range of the second camera means that the part that is blurred at the edge of the image captured by the first camera appears sharp in the image captured by the second camera.
In steps S11 and S12 of the embodiments of the present disclosure, the terminal device may determine the distance between itself and the target object, and when that distance is smaller than the preset distance threshold, start the first camera and the second camera to photograph the target object, obtaining a first image from the first camera and a second image from the second camera.
It should be noted that, in the embodiments of the present disclosure, the preset distance threshold may be the boundary distance at which blurring begins to appear, obtained by calibrating the camera-to-subject distance against the camera's imaging parameters and analyzing the resulting images. When the actual photographing distance is smaller than this boundary distance, a blurred region exists.
When the preset distance threshold is obtained by calibration against the camera's imaging parameters, the imaging parameter may be the aperture size. Different apertures pass different amounts of light through the lens, so the sharpness of the resulting first image differs, and the blurred region may differ between first images of different sharpness. In addition, when the actual photographing distance fluctuates within the range below the preset distance threshold, the change in distance causes only a slight change in the extent of the blurred region, so the blurred region obtained for the preset distance threshold during calibration can be taken by default as the blurred region produced in actual photographing.
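The disclosure does not specify how such calibration results are stored; as one possible arrangement (all numbers invented for illustration), the calibrated threshold and the default blurred-region extent could be kept per aperture:

```python
# Hypothetical calibration table keyed by aperture (f-number): for each
# aperture of the first camera, the calibrated boundary distance below which
# edge blurring appears and the blurred-region extent observed at that
# distance (here, a margin in pixels measured inward from the image edge).
CALIBRATION = {
    1.8: {"threshold_m": 0.12, "blur_margin_px": 60},
    2.2: {"threshold_m": 0.15, "blur_margin_px": 45},
}

def default_blur_margin(aperture: float, distance_m: float) -> int:
    entry = CALIBRATION[aperture]
    # Within the threshold the region varies only slightly with distance,
    # so the calibrated extent is used as the default (as noted above).
    return entry["blur_margin_px"] if distance_m < entry["threshold_m"] else 0
```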
In the embodiments of the present disclosure, the viewing-angle range of the second camera is larger than that of the first camera; if the focal lengths of the two cameras were the same, the second image would be the same size as the first image. When the distance between the terminal and the target object is smaller than the preset distance threshold, the blurred edge region of the first image obtained through the first camera may be sharper in the corresponding second image from the second camera.
In one embodiment, the first camera is a camera with a photosensitive element size greater than a size threshold, and the second camera is a wide-angle camera.
In this embodiment, the first camera may be a camera with a larger photosensitive element: because the pixels of a larger photosensitive element are spaced further apart and each pixel gathers more optical information, the images it produces are sharper, but when photographing at close range an edge-blurred area may still appear, as described above. Taking a mobile phone as the terminal, this camera may be the phone's rear camera, also called the main camera.
Correspondingly, the second camera is a wide-angle camera whose viewing-angle range is larger than that of the first camera. Because the focal length of a wide-angle camera is usually shorter, the second image it obtains is larger than the first image at the same photographing distance.
For example, if the first camera is a 1x main camera and the second camera is a 0.5x wide-angle camera, the focal length of the wide-angle camera is half that of the first camera. If, at the same photographing distance, the first image obtained by the first camera measures 50 x 50 (in mm), the second image obtained by the second camera measures 100 x 100.
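The sizes in this example follow from the inverse relation between focal length and the field covered at a fixed photographing distance; a minimal check of the numbers:

```python
# Field covered at a fixed distance scales inversely with focal length:
# halving the focal length doubles the captured field.
main_zoom = 1.0   # 1x main camera
wide_zoom = 0.5   # 0.5x wide-angle camera (half the focal length)

first_field_mm = 50
second_field_mm = first_field_mm * (main_zoom / wide_zoom)
assert second_field_mm == 100  # matches the 100 x 100 example above
```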
Fig. 2 is an exemplary view of the lenses in a mobile phone; as shown in Fig. 2, L1 denotes the first camera, which has the larger photosensitive element, and L2 denotes the wide-angle lens. Fig. 3 is an exemplary diagram of a blurred region: the image shown is a first image captured by the first camera, and the gray region labeled L3 is the blurred region.
In one embodiment, the determining the distance between the terminal and the target object includes:
determining the distance between the terminal and the target object through a time-of-flight (TOF) module.
In this embodiment, the terminal may use a built-in time-of-flight (TOF) module for distance measurement. A TOF module generally includes a light source and a photodetector: the light source emits an optical signal toward the target object, the photodetector receives the signal returned from the object, and the distance between the object and the TOF module is obtained from the measured (round-trip) flight time of the signal.
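The distance computation behind TOF ranging is a single line: the one-way distance is half the round-trip flight time multiplied by the speed of light. A minimal sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance(round_trip_time_s: float) -> float:
    # The emitted optical signal travels to the object and back, so the
    # one-way distance is (speed of light x round-trip time) / 2.
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a 1 ns round trip corresponds to roughly 0.15 m.
print(tof_distance(1e-9))  # ~0.1499 m
```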
It should be noted that the embodiments of the present disclosure do not limit the determination of the distance between the terminal and the target object to a TOF module. If the terminal includes other sensors usable for ranging, the distance may be determined with one or more of those distance sensors; such sensors include, but are not limited to, ultrasonic sensors and infrared sensors.
In step S13, after determining the blurred region in the first image, the terminal adjusts it using the image content of the second image, captured by the second camera, that corresponds to the blurred region, thereby obtaining a target image with improved sharpness in that region.
As mentioned above, when the terminal photographs at close range through the first camera and the second camera, the region that is blurred in the first image may be sharper in the second image. The content of the blurred region can therefore be adjusted using the second image to obtain a target image with improved sharpness in the blurred region.
In one embodiment, step S13 includes:
determining, according to the viewing-angle ranges of the first camera and the second camera, the pixel coordinate interval in the second image that corresponds to the blurred region in the first image;
and fusing the pixels in that pixel coordinate interval of the second image with the pixels of the blurred region in the first image to obtain the target image.
In this embodiment, to adjust the content of the blurred region of the first image, the pixel coordinate interval in the second image that corresponds to that region must be determined. In other words, the coordinates of the blurred region in the first image are mapped to a corresponding pixel coordinate interval in the second image.
As mentioned above, because the first camera and the second camera have different viewing-angle ranges, their focal lengths may also differ, and so may their imaging sizes. The first image and the second image therefore cannot be fused directly: the pixel coordinate interval of the second image that depicts the same captured scene as the blurred region must first be determined from the imaging parameters of the two cameras. According to those same imaging parameters, pixel-value interpolation and/or down-sampling is then applied to that interval, so that the processed interval describes the same captured scene with the same number of pixels as the blurred region; the processed pixels are then fused with the pixels of the blurred region to obtain the target image. The imaging parameters include, but are not limited to, the capture viewing angle and/or depth of field.
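A minimal sketch of this map-then-resample step, under the simplifying assumptions (not stated in the disclosure) that the two cameras share an optical axis and that the first-to-second coordinate mapping reduces to a single scale factor; OpenCV's `resize` stands in here for the interpolation/down-sampling described above:

```python
import numpy as np
import cv2  # OpenCV; any resampler could take its place


def map_and_resample(second_image: np.ndarray,
                     blur_box: tuple[int, int, int, int],
                     scale: float) -> np.ndarray:
    """Crop the part of the second image depicting the same scene as the
    blurred box (x0, y0, x1, y1) of the first image, then resample the crop
    to the box's pixel size.

    `scale` is the first-to-second coordinate factor implied by the viewing
    angles/focal lengths (2.0 in the 1x main / 0.5x wide example)."""
    x0, y0, x1, y1 = blur_box
    # Map the box into second-image coordinates, rounding outward so the
    # crop never falls short of the target region (rounding is needed
    # because the scaled coordinates need not be integers).
    sx0, sy0 = int(np.floor(x0 * scale)), int(np.floor(y0 * scale))
    sx1, sy1 = int(np.ceil(x1 * scale)), int(np.ceil(y1 * scale))
    crop = second_image[sy0:sy1, sx0:sx1]
    # Interpolate/compress the crop so that it describes the scene with
    # exactly as many pixels as the blurred box of the first image.
    return cv2.resize(crop, (x1 - x0, y1 - y0), interpolation=cv2.INTER_AREA)
```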
It should be noted that when the blurred region of the first image is mapped into the second image according to the focal-length ratio, the resulting coordinates may not be integers; in that case, the pixel coordinate interval in the second image may be determined by rounding up or rounding down.
After the pixel coordinate interval of the second image corresponding to the blurred region is determined, the pixels in that interval undergo intermediate processing and are then fused with the pixels of the blurred region in the first image to obtain the target image. The intermediate processing includes averaging the pixels in the corresponding pixel coordinate interval of the second image so as to obtain the same number of pixels as in the blurred region.
During pixel fusion, in one embodiment, a new pixel value may be computed from each pixel of the blurred region and the corresponding intermediate-processed pixel according to a predetermined equation, and the new pixel values substituted for the pixel values of the blurred portion of the first image to obtain the target image.
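The predetermined equation is not given in the disclosure; a fixed-weight blend is one common choice, assumed here purely for illustration:

```python
import numpy as np


def fuse(blurred_patch: np.ndarray, resampled_patch: np.ndarray,
         weight: float = 0.8) -> np.ndarray:
    # Weighted blend of the two sources: weight = 1.0 degenerates to pure
    # replacement, smaller weights keep some of the original content.
    blended = (weight * resampled_patch.astype(np.float32)
               + (1.0 - weight) * blurred_patch.astype(np.float32))
    return np.clip(blended, 0, 255).astype(np.uint8)
```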
In another embodiment, step S13 includes:
performing interpolation compression on the pixels in the corresponding pixel coordinate interval of the second image to obtain an interpolation result of the same size as the blurred portion of the first image;
and replacing the blurred portion of the first image with the interpolation result to obtain the target image.
In this embodiment, the content corresponding to the blurred region of the first image may be sharper in the second image. The blurred portion of the first image can therefore simply be replaced with the pixel values from the corresponding pixel coordinate interval of the second image.
When the second camera is a wide-angle camera, the second image may be larger than the first image, and the corresponding pixel coordinate interval of the second image may contain more pixels than the blurred portion of the first image. Continuing the main-camera and wide-angle example: in a first image of size 50 x 50, the region spanning coordinates 40 to 50 is blurred. Since the second image measures 100 x 100, that region maps, according to the viewing-angle ranges and depths of field of the two cameras, to the pixel coordinate interval 80 to 100 of the second image. The blurred region thus covers 10 pixels while the corresponding interval covers 20, so before replacement the pixels of that interval must be interpolated and compressed to produce an interpolation result of the same size as the blurred region of the first image.
For example, each pair of adjacent pixels in the 80-100 interval may be averaged to obtain 10 pixel values, which are then fused with the pixels of the blurred region in the first image. The interpolation compression method is of course not limited to averaging: bilinear interpolation or cubic interpolation may also be used, and the embodiments of the present disclosure do not limit this.
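For the one-dimensional numbers of this example, the adjacent-pair averaging can be written directly (pixel values invented for illustration):

```python
import numpy as np

# 20 pixel values standing in for the 80-100 interval of the second image.
wide_pixels = np.arange(20, dtype=np.float32)

# Average each pair of adjacent pixels: 20 values become 10, matching the
# 10-pixel blurred region of the first image.
compressed = wide_pixels.reshape(10, 2).mean(axis=1)
assert compressed.shape == (10,)
```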
It can be understood that in this embodiment, after interpolation compression of the pixels in the corresponding pixel coordinate interval of the second image yields an interpolation result of the same size as the blurred portion of the first image, the result directly replaces the blurred portion to give the target image, which has the advantage of a small amount of computation.
In the embodiments of the present disclosure, when the terminal takes a photograph, the first camera and the second camera, whose viewing-angle range is larger than that of the first camera, photograph the target object together to obtain a first image and a second image. Because the second camera's viewing-angle range is larger, the region that is blurred in the first image may be sharper in the second image. Exploiting this imaging advantage of the second camera, the disclosure adjusts the blurred region of the first image using the second image, yielding a target image with improved sharpness in that region, thereby improving the imaging quality of photographs and the user experience.
Fig. 4 is a diagram of a photographing apparatus according to an exemplary embodiment. Referring to Fig. 4, in an alternative embodiment, the apparatus is applied to a terminal comprising a first camera and a second camera, the viewing-angle range of the second camera being larger than that of the first camera, and the apparatus includes:
a determining module 101 configured to determine the distance between the terminal and a target object;
a photographing module 102 configured to, if the distance is smaller than a preset distance threshold, photograph the target object through the second camera to obtain a second image while the first camera photographs the target object to obtain a first image comprising a blurred region;
an adjusting module 103 configured to adjust the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved.
Optionally, the adjusting module 103 is specifically configured to determine, according to the viewing-angle ranges of the first camera and the second camera, the pixel coordinate interval in the second image that corresponds to the blurred region in the first image, and to fuse the pixels in that interval of the second image with the pixels of the blurred region in the first image to obtain the target image.
Optionally, the adjusting module 103 is specifically configured to perform interpolation compression on the pixels in the corresponding pixel coordinate interval of the second image to obtain an interpolation result of the same size as the blurred portion of the first image, and to replace the blurred portion of the first image with the interpolation result to obtain the target image.
Optionally, the determining module 101 is specifically configured to determine the distance between the terminal and the target object through a time-of-flight (TOF) module.
Optionally, the first camera is a camera whose photosensitive element is larger than a size threshold, and the second camera is a wide-angle camera.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 5 is a block diagram illustrating a terminal apparatus 800 according to an example embodiment. For example, the device 800 may be a cell phone, a computer, etc.
Referring to fig. 5, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed state of the device 800 and the relative positioning of components, such as its display and keypad; it may also detect a change in the position of the device 800 or one of its components, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and changes in its temperature. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for imaging applications. In some embodiments, the sensor assembly 814 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer-readable storage medium is provided, instructions in which, when executed by a processor of a terminal, enable the terminal to perform a photographing method, the terminal comprising a first camera and a second camera, the viewing-angle range of the second camera being larger than that of the first camera, the method comprising:
determining the distance between the terminal and a target object;
if the distance is smaller than a preset distance threshold, photographing the target object through the second camera to obtain a second image while photographing the target object through the first camera to obtain a first image comprising a blurred region;
and adjusting the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A photographing method, applied to a terminal comprising a first camera and a second camera, wherein the viewing-angle range of the second camera is larger than that of the first camera, the method comprising:
determining the distance between the terminal and a target object;
if the distance is smaller than a preset distance threshold, photographing the target object through the second camera to obtain a second image while photographing the target object through the first camera to obtain a first image comprising a blurred region;
and adjusting the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved.
2. The method according to claim 1, wherein the adjusting the blurred region using the image content of the second image that corresponds to the blurred region, to obtain the target image in which the sharpness of the blurred region is improved, comprises:
determining, according to the viewing-angle ranges of the first camera and the second camera, the pixel coordinate interval in the second image that corresponds to the blurred region in the first image;
and fusing the pixels in that pixel coordinate interval of the second image with the pixels of the blurred region in the first image to obtain the target image.
3. The method according to claim 2, wherein the fusing the pixels in the corresponding pixel coordinate interval of the second image with the pixels of the blurred region in the first image to obtain the target image comprises:
performing interpolation compression on the pixels in the corresponding pixel coordinate interval of the second image to obtain an interpolation result of the same size as the blurred portion of the first image;
and replacing the blurred portion of the first image with the interpolation result to obtain the target image.
4. The method of claim 1, wherein the determining the distance between the terminal and the target object comprises:
determining the distance between the terminal and the target object through a time-of-flight (TOF) module.
5. The method of claim 1, wherein the first camera is a camera having a photosensitive element size greater than a size threshold and the second camera is a wide-angle camera.
6. A photographing apparatus, applied to a terminal comprising a first camera and a second camera, wherein the viewing-angle range of the second camera is larger than that of the first camera, the apparatus comprising:
a determining module configured to determine the distance between the terminal and a target object;
a photographing module configured to, if the distance is smaller than a preset distance threshold, photograph the target object through the second camera to obtain a second image while the first camera photographs the target object to obtain a first image comprising a blurred region;
and an adjusting module configured to adjust the blurred region using the image content of the second image that corresponds to the blurred region, to obtain a target image in which the sharpness of the blurred region is improved.
7. The apparatus of claim 6, wherein the adjusting module is specifically configured to determine, according to the viewing-angle ranges of the first camera and the second camera, the pixel coordinate interval in the second image that corresponds to the blurred region in the first image, and to fuse the pixels in that interval of the second image with the pixels of the blurred region in the first image to obtain the target image.
8. The apparatus of claim 7, wherein the adjusting module is specifically configured to perform interpolation compression on the pixels in the corresponding pixel coordinate interval of the second image to obtain an interpolation result of the same size as the blurred portion of the first image, and to replace the blurred portion of the first image with the interpolation result to obtain the target image.
9. The apparatus of claim 6, wherein the determining module is specifically configured to determine the distance between the terminal and the target object through a time-of-flight (TOF) module.
10. The apparatus of claim 6, wherein the first camera is a camera having a photosensitive element size greater than a size threshold, and the second camera is a wide-angle camera.
11. A terminal, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the photographing method according to any one of claims 1 to 5.
12. A non-transitory computer-readable storage medium, instructions in which, when executed by a processor of a terminal, enable the terminal to perform the photographing method according to any one of claims 1 to 5.
Application CN202011031187.3A (priority date 2020-09-27, filed 2020-09-27): Photographing method and device, terminal and storage medium. Status: Pending. Published as CN112188096A.

Priority Applications (1)

Application Number: CN202011031187.3A; Priority Date: 2020-09-27; Filing Date: 2020-09-27; Title: Photographing method and device, terminal and storage medium

Publications (1)

Publication Number: CN112188096A; Publication Date: 2021-01-05

Family ID: 73944182

Family Applications (1)

Application Number: CN202011031187.3A; Title: Photographing method and device, terminal and storage medium; Priority Date: 2020-09-27; Filing Date: 2020-09-27

Country Status (1)

Country: CN; Publication: CN112188096A

Cited By (2)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN113810598A * | 2021-08-11 | 2021-12-17 | 荣耀终端有限公司 | Photographing method and device
CN115348390A * | 2022-08-23 | 2022-11-15 | 维沃移动通信有限公司 | Shooting method and shooting device

Patent Citations (6)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN103561205A * | 2013-11-15 | 2014-02-05 | 深圳市中兴移动通信有限公司 | Shooting method and shooting device
CN105100578A * | 2014-05-05 | 2015-11-25 | 南昌欧菲光电技术有限公司 | Image processing system and image processing method thereof
CN109257540A * | 2018-11-05 | 2019-01-22 | 浙江舜宇光学有限公司 | Photography correction method for a multi-camera lens group, and camera
CN110166680A * | 2019-06-28 | 2019-08-23 | Oppo广东移动通信有限公司 | Device imaging method and apparatus, storage medium and electronic device
CN110290324A * | 2019-06-28 | 2019-09-27 | Oppo广东移动通信有限公司 | Device imaging method and apparatus, storage medium and electronic device
CN111077661A * | 2018-10-18 | 2020-04-28 | 南昌欧菲生物识别技术有限公司 | Wide-angle lens, camera module and electronic device



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20210105)