CN112770042B - Image processing method and device, computer readable medium, wireless communication terminal - Google Patents

Image processing method and device, computer readable medium, wireless communication terminal

Info

Publication number
CN112770042B
CN112770042B (application CN201911070040.2A)
Authority
CN
China
Prior art keywords
image
camera module
shooting
target
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911070040.2A
Other languages
Chinese (zh)
Other versions
CN112770042A (en)
Inventor
郑海清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Realme Chongqing Mobile Communications Co Ltd
Original Assignee
Realme Chongqing Mobile Communications Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Realme Chongqing Mobile Communications Co Ltd filed Critical Realme Chongqing Mobile Communications Co Ltd
Priority to CN201911070040.2A
Publication of CN112770042A
Application granted
Publication of CN112770042B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure relates to the field of terminal technologies, and in particular, to an image processing method, an image processing apparatus, a computer-readable medium, and a wireless communication terminal. The method is applied to an electronic device that comprises a first camera module and a second camera module, and the method comprises the following steps: responding to a trigger instruction of a target shooting mode, activating a first camera module corresponding to the target shooting mode, and activating a second camera module; acquiring a first image of a shooting target through the first camera module, and extracting corresponding shooting parameters; applying the shooting parameters to the second camera module to acquire a second image of the shooting target; and cropping the second image, and fusing the cropped image with the first image to obtain a fused image. The method can obtain a fused image with higher image quality in the target shooting mode.

Description

Image processing method and device, computer readable medium, wireless communication terminal
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an image processing method, an image processing apparatus, a computer-readable medium, and a wireless communication terminal.
Background
Current intelligent terminals are equipped with two, three, or four cameras to improve shooting results, and the corresponding camera is called in different shooting modes. For example, a macro camera is called to perform shooting in a macro shooting mode, the main camera is used in the general shooting mode, and a wide-angle camera is used when shooting a distant view.
In most cases, shooting in the macro shooting mode can only be performed with the macro camera. Because the photosensor of an existing macro camera has relatively few pixels, the picture quality is poor, the picture is not sharp enough, and the picture size is small.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an image processing method, an image processing apparatus, a computer readable medium, and a wireless communication terminal, which can simultaneously capture images using a plurality of cameras and process the captured images to improve image quality.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, an image processing method is provided, which is applied to an electronic device including a first camera module and a second camera module, and the method includes:
responding to a trigger instruction of a target shooting mode, activating a first camera module corresponding to the target shooting mode, and activating a second camera module;
acquiring a first image of a shooting target through the first camera module, and extracting corresponding shooting parameters;
applying the shooting parameters to the second camera module to acquire a second image of the shooting target;
and cropping the second image, and fusing the cropped image with the first image to obtain a fused image.
According to a second aspect of the present disclosure, there is provided an image processing apparatus applied to an electronic device including a first camera module and a second camera module, the apparatus including:
the instruction response module is used for responding to a trigger instruction of a target shooting mode, activating a first camera module corresponding to the target shooting mode and activating a second camera module;
the first camera module control module is used for acquiring a first image of a shooting target through the first camera module and extracting corresponding shooting parameters;
the second camera module control module is used for applying the shooting parameters to the second camera module so as to acquire a second image of the shooting target;
and the image fusion module is used for cutting the second image and fusing the cut image and the first image to obtain a fused image.
According to a third aspect of the present disclosure, there is provided a computer-readable medium on which a computer program is stored which, when executed by a processor, implements the above-described image processing method.
According to a fourth aspect of the present disclosure, there is provided a wireless communication terminal comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image processing method described above.
According to the image processing method provided by the embodiment of the disclosure, when a specified target shooting mode is entered, the first camera module and the second camera module preset for the target shooting mode are activated, and the shooting parameters of the first camera module are applied to the second camera module for shooting. The first image and the second image are then fused to obtain a fused image with higher image quality in the target shooting mode.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically illustrates a flow diagram of an image processing method in an exemplary embodiment of the disclosure;
FIG. 2 schematically illustrates a flow diagram of another image processing method in an exemplary embodiment of the disclosure;
fig. 3 schematically illustrates a composition diagram of an image processing apparatus in an exemplary embodiment of the present disclosure;
fig. 4 schematically shows a structural diagram of a computer system of a wireless communication device in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The present exemplary embodiment provides an image processing method that can be applied to a mobile intelligent terminal device, such as a mobile phone or a tablet computer, configured with a macro camera and a main camera, and that processes images in a macro shooting mode.
Referring to fig. 1, the image processing method described above may include the steps of:
s11, responding to a trigger instruction of a target shooting mode, activating a first camera module corresponding to the target shooting mode, and activating a second camera module;
s12, acquiring a first image of a shooting target through the first camera module, and extracting corresponding shooting parameters;
s13, applying the shooting parameters to the second camera module to acquire a second image of the shooting target;
and S14, cropping the second image, and fusing the cropped image with the first image to obtain a fused image.
In the image processing method provided by the present exemplary embodiment, on one hand, when the specified target shooting mode is entered, the first camera module and the second camera module preset for the target shooting mode are activated, and the shooting parameters of the first camera module are applied to the second camera module for shooting, so that the first image and the second image captured by the two camera modules can be obtained respectively. On the other hand, because the shooting parameters of the first camera module are applied to the second camera module, the second image has the same viewing angle and the same scene content as the first image. The first image and the second image are then fused, yielding a fused image with higher image quality in the target shooting mode.
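As a concrete illustration of steps S11 to S14, the following minimal Python/OpenCV sketch strings the steps together. The macro_cam and main_cam objects and their capture(), get_params() and apply_params() methods are hypothetical stand-ins for the camera modules, not an API defined by this disclosure, and the equal-weight blend is only a placeholder for the fusion algorithms described later.

```python
import cv2

def macro_capture_pipeline(macro_cam, main_cam):
    """Hypothetical sketch of steps S12-S14; both camera modules are assumed
    to have been activated already by the target-shooting-mode trigger (S11)."""
    # S12: capture the first image with the macro camera and read back its
    # shooting parameters (focal length, shutter speed, sensitivity, ...)
    first_image = macro_cam.capture()
    params = macro_cam.get_params()

    # S13: apply the same parameters to the main camera and capture
    main_cam.apply_params(params)
    second_image = main_cam.capture()

    # S14: bring the second image to the size of the first and fuse the two;
    # an equal-weight blend stands in for the fusion step detailed below
    h, w = first_image.shape[:2]
    second_resized = cv2.resize(second_image, (w, h))
    fused = cv2.addWeighted(first_image, 0.5, second_resized, 0.5, 0)
    return fused
```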
Hereinafter, each step of the image processing method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
In step S11, in response to a trigger instruction of a target shooting mode, a first camera module corresponding to the target shooting mode is activated, and a second camera module is activated.
In this example embodiment, the electronic device may be a terminal device such as a mobile phone or a tablet computer configured with a first camera module and a second camera module. The first camera module may be a macro camera serving as an auxiliary shooting component of the electronic device; the second camera module may be the main camera of the terminal device, for example, a 16-megapixel or 64-megapixel main camera.
For example, when the target shooting mode is the macro shooting mode, a trigger instruction may be generated upon entering the macro shooting mode, either when a user starts the camera application on the terminal device or when the camera application is called by a third-party application, so that the terminal device starts the first camera module (macro camera) and the second camera module (main camera).
In step S12, a first image of the shooting target is captured by the first camera module, and corresponding shooting parameters are extracted.
In this exemplary embodiment, the target environment or target object is photographed according to a touch operation of the user on the terminal device; at this time, the shooting target may be photographed with the macro camera to obtain the first image. Meanwhile, the current shooting parameters of the macro camera can be extracted. For example, the shooting parameters may include any one or more of focal length, contrast, shutter speed, exposure compensation, white balance, and sensitivity.
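For illustration only, the extracted shooting parameters could be carried in a simple container such as the following sketch; the field names and units are assumptions for this example, not values prescribed by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ShootingParams:
    """Hypothetical container for the parameters extracted in step S12."""
    focal_length_mm: float
    contrast: float
    shutter_speed_s: float           # e.g. 1/50 for 1/50 s
    exposure_compensation_ev: float
    white_balance_k: int             # colour temperature in Kelvin
    iso: int                         # sensitivity
```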
In step S13, the shooting parameters are applied to the second camera module to capture a second image of the shooting target.
In this exemplary embodiment, the shooting parameters used by the macro camera can be directly applied to the main camera, so that the main camera takes pictures according to those parameters, thereby acquiring a second image with the same viewing angle, focal length, shutter speed, and other shooting parameters as the first image. Since the main camera has a higher pixel count, a macro image of higher quality can be obtained.
In step S14, the second image is cropped, and the cropped image is fused with the first image to obtain a fused image.
In an embodiment of the present disclosure, after the second image captured by the main camera is acquired, the second image may be further compressed or cropped to obtain a processed image with the same size as the first image. Image fusion processing may then be performed on the processed image and the first image, and the fused image is used as the final macro image and displayed in a preview interface, or is output according to a touch operation of the user.
Because the second image has a higher pixel count and can contain more image details and pixel content of the shooting target, the second image can be used after fusion to optimize the details in the first image, thereby effectively improving the image quality of the macro image and enhancing its detail rendition.
For example, an image fusion algorithm based on feature extraction or a pixel-level image fusion algorithm may be employed, such as a Laplacian pyramid fusion algorithm, a Gaussian pyramid algorithm, or a weighted-average algorithm. The image fusion algorithm can be implemented by conventional technical means and is not specifically limited by this disclosure. Meanwhile, because the first image and the second image share the same viewing angle, focal length, exposure, and shutter speed, the amount of computation during image fusion can be effectively reduced and the processing speed improved.
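Below is a short sketch of one such pixel-level fusion, a Laplacian-pyramid blend written with OpenCV. The pyramid depth of 4 and the equal weighting of the two images are arbitrary choices for illustration, not parameters fixed by the disclosure, and the cropped or compressed second image is assumed to already match the size of the first.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels=4):
    """Build a Laplacian pyramid (finest level first, coarsest Gaussian last)."""
    gaussian = [img.astype(np.float32)]
    for _ in range(levels):
        gaussian.append(cv2.pyrDown(gaussian[-1]))
    pyramid = []
    for i in range(levels):
        size = (gaussian[i].shape[1], gaussian[i].shape[0])
        pyramid.append(gaussian[i] - cv2.pyrUp(gaussian[i + 1], dstsize=size))
    pyramid.append(gaussian[-1])
    return pyramid

def fuse_images(first, second_cropped, levels=4):
    """Average the Laplacian pyramids of two same-size images and collapse."""
    pyr_a = laplacian_pyramid(first, levels)
    pyr_b = laplacian_pyramid(second_cropped, levels)
    fused_pyr = [(a + b) / 2.0 for a, b in zip(pyr_a, pyr_b)]
    # Collapse the fused pyramid back into a single image
    fused = fused_pyr[-1]
    for layer in reversed(fused_pyr[:-1]):
        size = (layer.shape[1], layer.shape[0])
        fused = cv2.pyrUp(fused, dstsize=size) + layer
    return np.clip(fused, 0, 255).astype(np.uint8)
```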
Based on the above, in the present exemplary embodiment, region division may also be performed on the acquired first image. Specifically, the method may include:
step S21, carrying out image segmentation on the first image to obtain a plurality of image areas, and identifying an image area to be optimized;
and S22, updating the shooting parameters based on the area to be optimized to obtain the updated shooting parameters.
In this exemplary embodiment, after the first image is acquired, image segmentation may be performed on the first image to divide it into multiple image areas. Features such as image content and color in each image area are identified, the areas that need to be optimized are screened out according to a preset rule, and these are taken as the image areas to be optimized. The shooting parameters are then updated with a preset parameter update strategy according to the area to be optimized, and the updated shooting parameters are applied to the main camera so that the main camera acquires the second image according to the updated shooting parameters.
Specifically, in the present exemplary embodiment, an image segmentation algorithm may be used to identify a foreground region and a background region in the first image, and to identify the target image type in the foreground region, such as an animal, a person, a building, a plant, or a water drop; environmental information such as day, night, or light level may also be included. Specific shooting parameters are then updated according to the identified image type using a preset parameter update strategy. For example, when shooting a plant, the configured parameter update strategy may be a shutter speed of 1/50 s, a sensitivity of 200-400, and so on.
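A rough sketch of this region-based parameter update follows, assuming GrabCut for the foreground/background split and a hand-written lookup table for the parameter update strategy. Only the plant entry mirrors the example values quoted above; the other entry, the field names, and the classifier that would supply image_type are hypothetical.

```python
import cv2
import numpy as np

# Hypothetical parameter-update strategy keyed by the recognised image type;
# only the "plant" entry mirrors the example values given in the text above.
PARAM_UPDATE_STRATEGY = {
    "plant":  {"shutter_speed_s": 1 / 50, "iso": 300},
    "person": {"shutter_speed_s": 1 / 125, "iso": 200},
}

def foreground_mask(image, iterations=3):
    """Split the first image into foreground/background with GrabCut,
    initialised from a rectangle slightly inside the frame."""
    mask = np.zeros(image.shape[:2], np.uint8)
    h, w = image.shape[:2]
    rect = (int(0.05 * w), int(0.05 * h), int(0.9 * w), int(0.9 * h))
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(image, mask, rect, bgd, fgd, iterations, cv2.GC_INIT_WITH_RECT)
    # Definite and probable foreground pixels form the region to be optimised
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return fg.astype(np.uint8)

def update_params(params, image_type):
    """Overwrite the extracted shooting parameters (a dict here) with the
    strategy entry for the recognised target type; the classifier that
    determines image_type is assumed to exist elsewhere."""
    updated = dict(params)
    updated.update(PARAM_UPDATE_STRATEGY.get(image_type, {}))
    return updated
```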
Alternatively, in other exemplary embodiments of the present disclosure, after the first image is acquired and the shooting parameters are extracted, a second image may first be acquired according to those shooting parameters. Meanwhile, the region to be optimized of the first image is identified, another second image is acquired after the shooting parameters are updated, the segmented region corresponding to the region to be optimized is extracted from that second image, and the first image, the second image, and the optimized segmented region are fused to obtain the fused image.
By dividing the first image into areas, identifying the areas to be optimized, and updating the shooting parameters according to the optimization strategy, the acquired second image can optimize each image area more precisely, improving the image quality of the macro image.
In the present exemplary embodiment, referring to fig. 2, there is provided an image processing method, which may include the steps of:
s31, responding to a trigger instruction of a target shooting mode, activating a first camera module corresponding to the target shooting mode, and activating a second camera module;
s32, acquiring a first image of a shooting target through the first camera module, and extracting corresponding shooting parameters;
s33, applying the shooting parameters to the second camera module to acquire a second image of the shooting target;
and S34, compressing the second image to obtain a captured image corresponding to the shooting target.
In an embodiment of the disclosure, an image shot by the main camera may also be used as a final macro image and displayed in a preview interface of the terminal device. Alternatively, the captured image of the main camera may be output as a macro image according to the user's operation.
According to the image processing method provided by this embodiment of the disclosure, the macro camera and the main camera of the electronic device are started simultaneously when the macro shooting mode is entered, the macro camera captures the first image and the corresponding shooting parameters, and those shooting parameters are applied to the main camera, so that the main camera can acquire a second image with the same viewing angle and shooting parameters as the first image. The first image and the second image are then fused, so that the second image can be used to optimize the first image; the fused macro image has richer pixel content, the image quality is effectively improved, and a higher-definition macro image can be presented to the user in the preview interface.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the above figures are not intended to indicate or limit the temporal order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 3, an image processing apparatus 40 is also provided in the embodiment of the present example, applied to an electronic device including a first camera module and a second camera module, and comprising: an instruction response module 401, a first camera module control module 402, a second camera module control module 403, and an image fusion module 404. Wherein:
the instruction response module 401 may be configured to activate the first camera module and the second camera module in response to a trigger instruction of the target shooting mode.
The first camera module control module 402 may be configured to collect a first image of a shooting target through the first camera module, and extract corresponding shooting parameters.
The second camera module control module 403 may be configured to apply the shooting parameters to the second camera module to acquire a second image of the shooting target.
The image fusion module 404 may be configured to crop the second image and fuse the cropped image with the first image to obtain a fused image.
In an example of the present disclosure, the first camera module control module 402 may include: an image segmentation unit and a shooting parameter updating unit (not shown in the figure). Wherein:
the image segmentation unit may be configured to perform image segmentation on the first image to obtain a plurality of image regions, and identify an image region to be optimized.
The shooting parameter updating unit may be configured to update the shooting parameters based on the area to be optimized to obtain updated shooting parameters.
In an example of the present disclosure, the image segmentation unit may be configured to segment the first image to obtain a foreground image region and a background image region, so as to use the foreground image region as an image region to be optimized.
In an example of the present disclosure, the shooting parameter updating unit may be configured to identify a target image type corresponding to the area to be optimized, and update the shooting parameters according to a parameter update policy corresponding to the target image type to obtain updated shooting parameters.
In an example of the present disclosure, the image fusion module 404 may include: a compression processing unit (not shown in the figure).
The compression processing unit may be configured to perform compression processing on the second image to obtain a compressed image having the same size as the first image, and to fuse the compressed image with the first image to obtain a fused image.
In one example of the present disclosure, the apparatus 40 further includes: a second image processing module (not shown in the figure). Wherein:
the second image processing module may be configured to perform compression processing on the second image to obtain a captured image corresponding to the captured target.
In one example of the present disclosure, the apparatus 40 further includes: an image display module (not shown).
The image display module may be configured to present the fused image on a preview interface of the electronic device, or to output the fused image.
The specific details of each module in the image processing apparatus have been described in detail in the corresponding image processing method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Fig. 4 illustrates a schematic block diagram of a computer system of a wireless communication device suitable for implementing an embodiment of the present invention.
It should be noted that the computer system 500 of the electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiment of the present invention.
As shown in fig. 4, the computer system 500 includes a Central Processing Unit (CPU) 501, which can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for system operation are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other through a bus 504. An Input/Output (I/O) interface 505 is also connected to the bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read therefrom is installed into the storage section 508 as necessary.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program executes various functions defined in the system of the present application when executed by the Central Processing Unit (CPU) 501.
It should be noted that the computer readable medium shown in the embodiment of the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present invention, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software or hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
It should be noted that, as another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the foregoing embodiment; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 1.
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the above figures are not intended to indicate or limit the temporal order of the processes. In addition, it is also readily understood that these processes may be performed, for example, synchronously or asynchronously in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (7)

1. An image processing method, characterized in that the method is applied to an electronic device, wherein the electronic device comprises a first camera module and a second camera module, and the first camera module is a macro camera; the method comprises the following steps:
responding to a trigger instruction of a target shooting mode, activating a first camera module corresponding to the target shooting mode, and activating a second camera module;
acquiring a first image of a shooting target through the first camera module, and extracting corresponding shooting parameters; segmenting the first image to obtain a foreground image area and a background image area, taking the foreground image area as an image area to be optimized, and identifying the image area to be optimized; identifying the type of the target image corresponding to the area to be optimized, and updating the shooting parameters according to a parameter updating strategy corresponding to the type of the target image to obtain the updated shooting parameters;
applying the updated shooting parameters to the second camera module to acquire a second image of the shooting target;
and cropping the second image, and fusing the cropped image with the first image to obtain a fused image.
2. The method according to claim 1, wherein the cropping the second image and fusing the cropped image with the first image to obtain a fused image comprises:
and compressing the second image to obtain a compressed image with the same size as the first image, and fusing the compressed image with the first image to obtain a fused image.
3. The method of claim 1, wherein after applying the shooting parameters to the second camera module to acquire the second image of the shooting target, the method further comprises:
and compressing the second image to obtain a shot image corresponding to the shooting target.
4. The method of claim 1, further comprising:
and displaying the fused image on a preview interface of the electronic device, or outputting the fused image.
5. An image processing apparatus applied to an electronic device including a first camera module and a second camera module, the apparatus comprising:
the instruction response module is used for responding to a trigger instruction of a target shooting mode and activating the first camera module and the second camera module;
the first camera module control module is used for acquiring a first image of a shooting target through the first camera module and extracting corresponding shooting parameters, and comprises: the image segmentation unit is used for segmenting the first image to obtain a foreground image area and a background image area so as to take the foreground image area as an image area to be optimized; and the shooting parameter updating unit is used for updating the shooting parameters based on the area to be optimized so as to obtain the updated shooting parameters;
the second camera module control module is used for applying the updated shooting parameters to the second camera module so as to acquire a second image of the shooting target;
and the image fusion module is used for cutting the second image and fusing the cut image with the first image to obtain a fused image.
6. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the image processing method of any one of claims 1 to 4.
7. A wireless communication terminal, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the image processing method according to any one of claims 1 to 4.
CN201911070040.2A 2019-11-05 2019-11-05 Image processing method and device, computer readable medium, wireless communication terminal Active CN112770042B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911070040.2A CN112770042B (en) 2019-11-05 2019-11-05 Image processing method and device, computer readable medium, wireless communication terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911070040.2A CN112770042B (en) 2019-11-05 2019-11-05 Image processing method and device, computer readable medium, wireless communication terminal

Publications (2)

Publication Number Publication Date
CN112770042A CN112770042A (en) 2021-05-07
CN112770042B (en) 2022-11-15

Family

ID=75692932

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911070040.2A Active CN112770042B (en) 2019-11-05 2019-11-05 Image processing method and device, computer readable medium, wireless communication terminal

Country Status (1)

Country Link
CN (1) CN112770042B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095629B (en) * 2021-10-20 2023-10-31 浪潮金融信息技术有限公司 Camera operation system, method and medium for linux terminal
CN115019515B (en) * 2022-04-19 2023-03-03 北京拙河科技有限公司 Imaging control method and system
CN115314750B (en) * 2022-08-10 2023-09-29 润博全景文旅科技有限公司 Video playing method, device and equipment
CN116757983B (en) * 2023-07-03 2024-02-06 北京拙河科技有限公司 Main and auxiliary image fusion method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106456109A (en) * 2015-05-07 2017-02-22 深圳迈瑞生物医疗电子股份有限公司 Optimization method and device for area display effect, and ultrasonic diagnostic system
CN108055453A (en) * 2017-12-05 2018-05-18 深圳市金立通信设备有限公司 A kind of acquisition parameters update method and terminal
CN108377341A (en) * 2018-05-14 2018-08-07 Oppo广东移动通信有限公司 Photographic method, device, terminal and storage medium
CN108600630A (en) * 2018-05-10 2018-09-28 Oppo广东移动通信有限公司 Photographic method, device and terminal device
CN108764370A (en) * 2018-06-08 2018-11-06 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and computer equipment
CN110266946A * 2019-06-25 2019-09-20 普联技术有限公司 Automatic photographing-effect optimization method and device, storage medium and terminal device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105578028A (en) * 2015-07-28 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Photographing method and terminal
CN105847674B (en) * 2016-03-25 2019-06-07 维沃移动通信有限公司 A kind of preview image processing method and mobile terminal based on mobile terminal
CN106572249A (en) * 2016-10-17 2017-04-19 努比亚技术有限公司 Region enlargement method and apparatus
CN106993139B (en) * 2017-04-28 2019-10-15 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN107592467A (en) * 2017-10-20 2018-01-16 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN108024056B (en) * 2017-11-30 2019-10-29 Oppo广东移动通信有限公司 Imaging method and device based on dual camera
CN107835372A (en) * 2017-11-30 2018-03-23 广东欧珀移动通信有限公司 Imaging method, device, mobile terminal and storage medium based on dual camera
CN108156374B (en) * 2017-12-25 2020-12-08 努比亚技术有限公司 Image processing method, terminal and readable storage medium
CN109698908A * 2018-12-29 2019-04-30 努比亚技术有限公司 Method, terminal and storage medium for intelligently calling front and rear cameras for imaging
CN109495689B (en) * 2018-12-29 2021-04-13 北京旷视科技有限公司 Shooting method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN112770042A (en) 2021-05-07

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant