CN111010496B - Image processing method and electronic equipment - Google Patents

Image processing method and electronic equipment

Info

Publication number
CN111010496B
CN111010496B (application CN201911344804.2A)
Authority
CN
China
Prior art keywords
image
projection
camera
determining
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911344804.2A
Other languages
Chinese (zh)
Other versions
CN111010496A (en)
Inventor
陈明
吴仆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd filed Critical Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN201911344804.2A priority Critical patent/CN111010496B/en
Publication of CN111010496A publication Critical patent/CN111010496A/en
Application granted granted Critical
Publication of CN111010496B publication Critical patent/CN111010496B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an image processing method and an electronic device, and relates to the technical field of image processing. The image processing method is applied to an electronic device and comprises the following steps: obtaining a first image through shooting by a first camera, and obtaining a second image through shooting by a second camera; determining a projection image according to the second image; and fusing the projection image with the first image to obtain a target image. In this scheme, the projection image determined from the second image is used to eliminate the projection on the first image, so that a projection-free image is obtained and the accuracy of projection elimination is improved.

Description

Image processing method and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and an electronic device.
Background
In everyday photography, if a reflective object such as glass is present in the framed scene, a projection is formed during shooting. The common practice at present is to segment the projection from a single image and then eliminate it.
However, this conventional single-image processing approach eliminates the projection poorly.
Disclosure of Invention
The embodiments of the present invention provide an image processing method and an electronic device, aiming to solve the problems that the existing projection elimination approach has a low accuracy rate and easily eliminates objects other than the actual projection.
In order to solve the foregoing technical problem, an embodiment of the present invention provides an image processing method applied to an electronic device, including:
a first image is obtained through shooting by a first camera, and a second image is obtained through shooting by a second camera;
determining a projection image according to the second image;
and fusing the projection image and the first image to obtain a target image.
An embodiment of the present invention further provides an electronic device, including:
the shooting module is used for obtaining a first image through shooting by the first camera and obtaining a second image through shooting by the second camera;
a determining module for determining a projection image according to the second image;
and the fusion module is used for fusing the projection image and the first image to obtain a target image.
An embodiment of the present invention further provides an electronic device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method described above.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the image processing method described above.
The beneficial effects of the invention are as follows:
In this scheme, the projection image determined from the second image is used to eliminate the projection on the first image, so that a projection-free image is obtained and the accuracy of projection elimination is improved.
Drawings
FIG. 1 is a flow chart illustrating an image processing method according to an embodiment of the present invention;
FIG. 2 is a detailed flow chart of an image processing method according to an embodiment of the invention;
FIG. 3 is a schematic diagram of a rear image captured by the rear camera;
FIG. 4 is a schematic diagram of a front image captured by the front camera;
FIG. 5 is a schematic diagram of the feature image determined in the front image;
FIG. 6 is a schematic diagram of the projection image;
FIG. 7 is a schematic diagram of the final target image after fusion;
FIG. 8 is one of the block diagrams of an electronic device according to an embodiment of the invention;
FIG. 9 is a second block diagram of an electronic device according to an embodiment of the invention;
FIG. 10 is a third block diagram of an electronic device according to an embodiment of the invention;
FIG. 11 is a block diagram of an electronic device according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides an image processing method, including:
step 101, a first image is obtained through shooting by a first camera, and a second image is obtained through shooting by a second camera;
the first camera and the second camera may be two different cameras on the electronic device, which is not specifically limited in this embodiment of the present invention. For example, optionally, one of the first camera and the second camera is a front camera, and the other is a rear camera.
Step 102, determining a projection image according to the second image;
and 103, fusing the projection image and the first image to obtain a target image.
Optionally, in the embodiments of the present invention, the front image captured by the front camera may be used to eliminate the projection in the rear image captured by the rear camera, and the rear image captured by the rear camera may likewise be used to eliminate the projection in the front image captured by the front camera. More generally, as long as the two cameras capture different scenes and content captured by one camera appears as a projection in the image captured by the other camera, the method can be used to eliminate that projection; the specific positional relationship between the two cameras is not limited.
It should be noted that the projection elimination function provided in the embodiment of the present invention may be enabled according to the user's needs. If the user finds that a reflective object that may cause a projection, such as glass, exists in the shooting environment, the user enables the function, and the electronic device then automatically acquires the two images and eliminates the projection. Specifically, before step 101, the electronic device may receive a first operation of the user, and in response to the first operation, start the first camera and the second camera.
The first operation may be a click operation, a long-press operation, or the like. For example, a projection elimination option may be added to the camera function of the electronic device; when the user opens the camera and taps the projection elimination button, the electronic device executes the projection elimination function, and the finally obtained image is the image with the projection eliminated.
It should be further noted that the electronic device may also automatically start the other camera according to features of the preview image of the camera. Specifically: before step 101, the electronic device starts the first camera, and if a projection feature exists on the preview interface of the first camera, it starts the second camera.
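By way of illustration only (the patent does not specify how the projection feature is detected in the preview), the following Python/OpenCV sketch shows one simple heuristic a preview pipeline could use: treating large, bright, weakly saturated regions as a hint that a glass reflection may be present. The function name, thresholds, and the OpenCV dependency are assumptions of this sketch, not part of the claimed method.

    import cv2
    import numpy as np

    def preview_has_projection_feature(preview_bgr, area_ratio=0.02):
        # Bright but weakly saturated pixels are a crude proxy for reflections
        # on glass; the thresholds below are purely illustrative.
        hsv = cv2.cvtColor(preview_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 0, 200), (180, 60, 255))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        return cv2.countNonZero(mask) > area_ratio * mask.size

If such a check returns true for the preview frame of the first camera, the electronic device would start the second camera.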
Optionally, it should be noted that a specific implementation manner of step 102 is as follows:
determining a feature image on the second image according to the first image;
and determining a projection image according to the feature image.
Specifically, the feature image on the second image is determined according to the first image as follows: performing feature matching between the first image and the second image to acquire the feature information in the second image that matches the first image;
and determining the feature information as the feature image on the second image.
It should be noted that scale-invariant image feature points (e.g., BRIEF descriptors, DoG keypoints) are first extracted from both images for image registration, and feature-point matching is then performed. In this way, the projection area on the first image and the corresponding area on the second image can be determined.
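A minimal sketch of this registration and matching step, assuming OpenCV is available and using ORB (which builds on BRIEF-style binary descriptors) as a stand-in for the scale-invariant features named above; the helper name and parameters are assumptions of the sketch.

    import cv2

    def match_features(first_img, second_img, max_matches=200):
        # Detect scale-invariant keypoints and binary descriptors in both images
        # (inputs assumed to be 8-bit BGR).
        gray1 = cv2.cvtColor(first_img, cv2.COLOR_BGR2GRAY)
        gray2 = cv2.cvtColor(second_img, cv2.COLOR_BGR2GRAY)
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(gray1, None)
        kp2, des2 = orb.detectAndCompute(gray2, None)
        # Cross-checked Hamming matching keeps only mutually best matches.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:max_matches]
        # pts1 lie in the projection area of the first image,
        # pts2 in the corresponding area of the second image.
        pts1 = [kp1[m.queryIdx].pt for m in matches]
        pts2 = [kp2[m.trainIdx].pt for m in matches]
        return pts1, pts2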
It should be further noted that, optionally, in order to ensure that the feature image on the second image matches the projection area on the first image exactly, after obtaining the feature image on the second image, the electronic device needs to align the feature image with the matched features on the first image.
The alignment is performed as follows: the feature image is cropped from the second image and then scaled and rotated, which completes the alignment with the projection area on the first image.
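A sketch of this cropping, scaling, and rotation step, assuming the matched points from the previous sketch: a transform estimated from the point pairs warps the matched region of the second image into the coordinate frame of the first image. The homography model and RANSAC are implementation assumptions, not requirements of the patent.

    import cv2
    import numpy as np

    def align_feature_image(second_img, pts2, pts1, first_img_shape):
        # Estimate a perspective transform from the second image's matched points
        # to the corresponding points on the first image (robust to outliers).
        src = np.float32(pts2).reshape(-1, 1, 2)
        dst = np.float32(pts1).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        h, w = first_img_shape[:2]
        # Warping effectively crops, scales, and rotates the feature image
        # onto the projection area of the first image.
        return cv2.warpPerspective(second_img, H, (w, h))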
It should be further noted that after the feature image is obtained, it is used to generate a matching projection image; specifically, the feature image is mapped to obtain the projection image that matches it.
The mapping here can be implemented as a pixel-value mapping, i.e. through a look-up table (LUT).
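A sketch of the LUT mapping, assuming an 8-bit BGR image: every pixel value is remapped through a 256-entry table so the aligned feature image takes on the dimmer, lower-contrast look of a projection. The particular attenuation curve is an assumption of the sketch; the patent does not specify the LUT contents.

    import cv2
    import numpy as np

    def apply_projection_lut(aligned_feature_img, gain=0.4):
        # A simple linear attenuation table; cv2.LUT applies it per channel.
        x = np.arange(256, dtype=np.float32)
        lut = np.clip(gain * x, 0, 255).astype(np.uint8)
        return cv2.LUT(aligned_feature_img, lut)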
Optionally, it should be noted that another specific implementation manner of step 102 is:
determining a first projection image corresponding to the second image according to the second image;
and determining a feature image corresponding to the first image in the first projection image, and determining the feature image as the projection image.
In this way, the second image is first mapped to obtain the first projection image corresponding to it, that is, the first projection image is the projection image of the entire second image; the features in the first projection image that match the first image are then determined, and these matched features are taken as the projection image.
It should be noted that, in order to ensure that the obtained projection image matches the projection area on the first image exactly, after determining the features in the first projection image that match the first image, the electronic device needs to align these matched features with the corresponding features on the first image.
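A short sketch of this alternative ordering, reusing the hypothetical helpers sketched above (apply_projection_lut, match_features, align_feature_image): the LUT is applied to the entire second image first, and the region matching the first image is located and aligned afterwards.

    def projection_image_alternative(first_img, second_img):
        # Map the whole second image first; this is the "first projection image".
        first_projection = apply_projection_lut(second_img)
        # Then locate and align the part of it that matches the first image.
        pts1, pts2 = match_features(first_img, second_img)
        return align_feature_image(first_projection, pts2, pts1, first_img.shape)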
Further, optionally, after the projection image is obtained, it is subtracted from the matching region of the first image to obtain a target image with the projection removed, and the target image is then displayed on the shooting preview interface.
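A sketch of this subtraction, assuming 8-bit BGR inputs that have already been brought into the same coordinate frame: only the region covered by the projection image is modified, and saturated subtraction keeps pixel values from wrapping below zero. The masking choice is an assumption of the sketch.

    import cv2

    def fuse_remove_projection(first_img, projection_img):
        # Non-zero pixels of the projection image mark the matched region.
        mask = cv2.cvtColor(projection_img, cv2.COLOR_BGR2GRAY) > 0
        target = first_img.copy()
        # cv2.subtract clamps negative results to 0 (saturated subtraction).
        target[mask] = cv2.subtract(first_img, projection_img)[mask]
        return target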
As shown in FIG. 2, a specific implementation of the embodiment of the present invention is described in detail below, taking the use of the front camera to eliminate the projection in the image captured by the rear camera as an example.
Step 201, shooting an image by using a rear camera and a front camera respectively;
It should be noted that in this step the rear camera and the front camera capture images simultaneously. The image captured by the rear camera may contain a projection, and the image captured by the front camera is the source used to eliminate the projection in the rear image. Specifically, the image captured by the rear camera is shown in FIG. 3, and the image captured by the front camera is shown in FIG. 4.
Step 202, matching key points and features of the front image and the rear image to complete image matching and alignment;
It should be noted that the purpose of this step is to perform feature matching between the front image and the rear image and to locate, on the front image, the region corresponding to the projection area on the rear image. First, scale-invariant image feature points (e.g., BRIEF, DoG) are extracted from both images for image registration, and feature-point matching is performed. In this way, the projection area on the rear image and the corresponding area on the front image can be determined; the feature image determined in the front image is shown in FIG. 5. After the features are matched, the images are aligned: the target area of the front image is cropped and then scaled and rotated, which completes the alignment with the projection area on the rear image.
Step 203, generating a projection image from the aligned area of the front image;
After the corresponding area on the front image has been aligned, the next task is to generate the projection image from the aligned image. This can be done with the LUT method: mapping the aligned area of the front image through a projection-effect LUT yields the projection image corresponding to the aligned area; a schematic diagram of the resulting projection image is shown in FIG. 6.
Step 204, fusing the rear image and the projection image;
In this step, the projection image obtained in step 203 is subtracted from the rear image and the result is fused, yielding the projection-free target image, which is displayed on the shooting preview interface; the final fused image is shown in FIG. 7.
It should be noted that in this way the front camera image can be used to accurately remove the projection from the picture captured by the rear camera, which improves the user's shooting experience.
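Chaining the hypothetical helpers from the earlier sketches gives a compact end-to-end illustration of steps 201 through 204; the file names in the usage comment are placeholders, and the whole pipeline is a sketch under the assumptions stated above rather than the patented implementation.

    import cv2

    def remove_projection(rear_img, front_img):
        pts_rear, pts_front = match_features(rear_img, front_img)     # step 202: match
        aligned = align_feature_image(front_img, pts_front, pts_rear,
                                      rear_img.shape)                 # step 202: align
        projection = apply_projection_lut(aligned)                    # step 203: LUT mapping
        return cv2.subtract(rear_img, projection)                     # step 204: fuse

    # Example usage:
    # rear = cv2.imread("rear.jpg")
    # front = cv2.imread("front.jpg")
    # target = remove_projection(rear, front)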
As shown in fig. 8 to 11, an embodiment of the present invention further provides an electronic device 800, including:
the shooting module 801 is used for obtaining a first image through shooting by a first camera and obtaining a second image through shooting by a second camera;
a determining module 802, configured to determine a projection image according to the second image;
and a fusion module 803, configured to fuse the projection image and the first image to obtain a target image.
Optionally, the determining module 802 includes:
a first determining unit 8021, configured to determine, according to the first image, a feature image on the second image;
a second determining unit 8022, configured to determine a projection image according to the feature image.
Optionally, the determining module 802 includes:
a third determining unit 8023, configured to determine, according to the second image, a first projection image corresponding to the second image;
a fourth determining unit 8024, configured to determine a feature image corresponding to the first image in the first projection image, and determine the feature image as the projection image.
Optionally, the fusing module 803 is configured to:
and subtracting the image matched with the projection image from the first image to obtain a target image.
Optionally, before the capturing module 801 obtains a first image by shooting with the first camera and obtains a second image by shooting with the second camera, the electronic device further includes:
a first starting module 804, configured to start the first camera;
a second starting module 805, configured to start a second camera if a projection feature exists on the preview interface of the first camera.
The electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the method embodiment of FIG. 1, and details are not repeated here to avoid repetition. The electronic device of the embodiment of the invention obtains a first image through shooting by the first camera and obtains a second image through shooting by the second camera, determines a projection image according to the second image, and fuses the projection image with the first image to obtain a target image; in this way, a projection-free image is obtained and the accuracy of projection elimination is improved.
Fig. 12 is a schematic diagram of a hardware structure of an electronic device for implementing an embodiment of the present invention.
The electronic device 120 includes, but is not limited to: a radio frequency unit 1210, a network module 1220, an audio output unit 1230, an input unit 1240, a sensor 1250, a display unit 1260, a user input unit 1270, an interface unit 1280, a memory 1290, a processor 1211, and a power supply 1212. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 12 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 1211 is configured to obtain a first image through shooting by a first camera, and obtain a second image through shooting by a second camera; determining a projection image according to the second image; and fusing the projection image and the first image to obtain a target image.
The electronic device of the embodiment of the invention obtains a first image through shooting by the first camera and obtains a second image through shooting by the second camera, determines a projection image according to the second image, and fuses the projection image with the first image to obtain a target image; in this way, a projection-free image is obtained and the accuracy of projection elimination is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1210 may be used for receiving and transmitting signals during information transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 1211 for processing, and transmits uplink data to the base station. Generally, the radio frequency unit 1210 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1210 may also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 1220, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1230 may convert audio data received by the radio frequency unit 1210 or the network module 1220 or stored in the memory 1290 into an audio signal and output as sound. Also, the audio output unit 1230 may also provide audio output related to a specific function performed by the electronic device 120 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1230 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1240 is used to receive audio or video signals. The input unit 1240 may include a graphics processing unit (GPU) 1241 and a microphone 1242; the graphics processor 1241 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1260. The image frames processed by the graphics processor 1241 may be stored in the memory 1290 (or other storage medium) or transmitted via the radio frequency unit 1210 or the network module 1220. The microphone 1242 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1210 and output.
The electronic device 120 also includes at least one sensor 1250, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 1261 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 1261 and/or the backlight when the electronic device 120 moves to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 1250 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1260 is used to display information input by a user or information provided to a user. The Display unit 1260 may include a Display panel 1261, and the Display panel 1261 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1270 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 1270 includes a touch panel 1271 and other input devices 1272. Touch panel 1271, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., user operations on touch panel 1271 or near touch panel 1271 using a finger, stylus, or any other suitable object or attachment). Touch panel 1271 may include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1211, receives a command from the processor 1211, and executes the command. In addition, the touch panel 1271 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to touch panel 1271, user input unit 1270 may include other input devices 1272. In particular, other input devices 1272 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, touch panel 1271 can be overlaid on display panel 1261, and when touch panel 1271 detects a touch operation thereon or nearby, it can be transmitted to processor 1211 for determining the type of touch event, and processor 1211 can then provide a corresponding visual output on display panel 1261 according to the type of touch event. Although in fig. 12, the touch panel 1271 and the display panel 1261 are implemented as two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1271 and the display panel 1261 may be integrated to implement the input and output functions of the electronic device, and are not limited herein.
The interface unit 1280 is an interface through which an external device is connected to the electronic apparatus 120. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1280 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 120 or may be used to transmit data between the electronic apparatus 120 and the external device.
The memory 1290 may be used for storing software programs and various data. The memory 1290 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1290 can include high-speed random access memory and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 1211 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 1290 and calling data stored in the memory 1290, thereby monitoring the entire electronic device. The processor 1211 may include one or more processing units; preferably, the processor 1211 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be understood that the modem processor may not be integrated into the processor 1211.
The electronic device 120 may further include a power supply 1212 (e.g., a battery) for powering the various components, and preferably, the power supply 1212 may be logically coupled to the processor 1211 via a power management system such that the power management system performs functions including managing charging, discharging, and power consumption.
In addition, the electronic device 120 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor 1211, a memory 1290, and a computer program stored in the memory 1290 and capable of running on the processor 1211, where the computer program, when executed by the processor 1211, implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and details are not repeated herein to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the preferred embodiments of the present invention have been described, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.

Claims (6)

1. An image processing method applied to an electronic device, comprising:
a first image is obtained through shooting by a first camera, and a second image is obtained through shooting by a second camera;
determining a projection image according to the second image; fusing the projection image with the first image to obtain a target image;
determining a projection image from the second image, comprising:
performing feature matching of the first image and the second image to acquire feature information matched with the first image in the second image;
determining the feature information as a feature image on the second image;
after the characteristic image is obtained, mapping the characteristic image by using a pixel lookup table mode to obtain a projection image matched with the characteristic image;
the fusing the projection image and the first image to obtain a target image, comprising:
subtracting the image matched with the projection image from the first image to obtain a target image;
the images shot by the two cameras have difference, and the image shot by one camera has projection in the image shot by the other camera.
2. The image processing method of claim 1, wherein determining a projection image from the second image comprises:
mapping the second image according to the second image, and determining a first projection image corresponding to the second image, wherein the first projection image is a projection image of all the second images;
and determining a characteristic image corresponding to the first image in the first projection image, and determining the characteristic image as a projection image.
3. The image processing method according to claim 1, further comprising, before the capturing a first image by a first camera and a second image by a second camera:
starting a first camera;
and if the preview interface of the first camera has the projection characteristic, starting a second camera.
4. An electronic device, comprising:
the shooting module is used for obtaining a first image through shooting by the first camera and obtaining a second image through shooting by the second camera;
a determining module for determining a projection image according to the second image; the fusion module is used for fusing the projection image and the first image to obtain a target image;
determining a projection image from the second image, comprising:
performing feature matching of the first image and the second image to acquire feature information matched with the first image in the second image;
determining the feature information as a feature image on the second image;
after the characteristic image is obtained, mapping the characteristic image by using a pixel lookup table mode to obtain a projection image matched with the characteristic image;
the fusing the projection image and the first image to obtain a target image, comprising:
subtracting the image matched with the projection image from the first image to obtain a target image;
the images shot by the two cameras have difference, and the image shot by one camera has projection in the image shot by the other camera.
5. The electronic device of claim 4, wherein the determining module comprises:
a third determining unit, configured to map the second image according to the second image, and determine a first projection image corresponding to the second image, where the first projection image is a projection image of all the second images;
and the fourth determining unit is used for determining a characteristic image corresponding to the first image in the first projection image and determining the characteristic image as the projection image.
6. The electronic device of claim 4, further comprising, before the capturing module captures a first image through a first camera and a second image through a second camera:
the first starting module is used for starting the first camera;
and the second starting module is used for starting the second camera if the projection characteristics exist on the preview interface of the first camera.
CN201911344804.2A 2019-12-24 2019-12-24 Image processing method and electronic equipment Active CN111010496B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911344804.2A CN111010496B (en) 2019-12-24 2019-12-24 Image processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911344804.2A CN111010496B (en) 2019-12-24 2019-12-24 Image processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN111010496A CN111010496A (en) 2020-04-14
CN111010496B true CN111010496B (en) 2022-07-08

Family

ID=70117779

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911344804.2A Active CN111010496B (en) 2019-12-24 2019-12-24 Image processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111010496B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112468722B (en) * 2020-11-19 2022-05-06 惠州Tcl移动通信有限公司 Shooting method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103473565A (en) * 2013-08-23 2013-12-25 华为技术有限公司 Image matching method and device
CN103826065A (en) * 2013-12-12 2014-05-28 小米科技有限责任公司 Image processing method and apparatus
WO2015021764A1 (en) * 2013-08-15 2015-02-19 小米科技有限责任公司 Image processing method and apparatus, and terminal device
CN107481201A (en) * 2017-08-07 2017-12-15 桂林电子科技大学 A kind of high-intensity region method based on multi-view image characteristic matching

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6801654B2 (en) * 2000-01-14 2004-10-05 Sony Corporation Picture processing apparatus, method and recording medium for a natural expansion drawing

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015021764A1 (en) * 2013-08-15 2015-02-19 小米科技有限责任公司 Image processing method and apparatus, and terminal device
CN103473565A (en) * 2013-08-23 2013-12-25 华为技术有限公司 Image matching method and device
CN103826065A (en) * 2013-12-12 2014-05-28 小米科技有限责任公司 Image processing method and apparatus
CN107481201A (en) * 2017-08-07 2017-12-15 桂林电子科技大学 A kind of high-intensity region method based on multi-view image characteristic matching

Also Published As

Publication number Publication date
CN111010496A (en) 2020-04-14

Similar Documents

Publication Publication Date Title
CN108668083B (en) Photographing method and terminal
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
US20220279116A1 (en) Object tracking method and electronic device
CN109743498B (en) Shooting parameter adjusting method and terminal equipment
CN111263071B (en) Shooting method and electronic equipment
CN108174103B (en) Shooting prompting method and mobile terminal
CN110109593B (en) Screen capturing method and terminal equipment
CN109639969B (en) Image processing method, terminal and server
CN108307106B (en) Image processing method and device and mobile terminal
CN107730460B (en) Image processing method and mobile terminal
CN108924414B (en) Shooting method and terminal equipment
CN109819168B (en) Camera starting method and mobile terminal
CN109523253B (en) Payment method and device
CN111401463B (en) Method for outputting detection result, electronic equipment and medium
CN108174110B (en) Photographing method and flexible screen terminal
CN108174109B (en) Photographing method and mobile terminal
CN110290263B (en) Image display method and mobile terminal
CN110602387B (en) Shooting method and electronic equipment
CN108924413B (en) Shooting method and mobile terminal
CN109104564B (en) Shooting prompting method and terminal equipment
CN108259756B (en) Image shooting method and mobile terminal
CN110825474A (en) Interface display method and device and electronic equipment
CN108243489B (en) Photographing control method and mobile terminal
CN108156386B (en) Panoramic photographing method and mobile terminal
CN111010496B (en) Image processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant