CN110855882B - Shooting processing method and device, storage medium and electronic equipment - Google Patents

Shooting processing method and device, storage medium and electronic equipment

Info

Publication number
CN110855882B
Authority
CN
China
Prior art keywords
output mode
target object
mobile terminal
distance
texture complexity
Prior art date
Legal status
Active
Application number
CN201911057830.7A
Other languages
Chinese (zh)
Other versions
CN110855882A (en)
Inventor
刘彬
Current Assignee
Realme Chongqing Mobile Communications Co Ltd
Original Assignee
Realme Chongqing Mobile Communications Co Ltd
Priority date
Filing date
Publication date
Application filed by Realme Chongqing Mobile Communications Co Ltd filed Critical Realme Chongqing Mobile Communications Co Ltd
Priority to CN201911057830.7A priority Critical patent/CN110855882B/en
Publication of CN110855882A publication Critical patent/CN110855882A/en
Application granted granted Critical
Publication of CN110855882B publication Critical patent/CN110855882B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a shooting processing method, a shooting processing device, a storage medium and electronic equipment, and relates to the technical field of mobile terminals. The shooting processing method comprises the following steps: acquiring preview image data of a camera of the mobile terminal in a first output mode, and detecting the distance between a target object in the current shooting scene and the mobile terminal according to the preview image data; if the distance between the target object and the mobile terminal is greater than a distance threshold, switching the output mode of the camera to a second output mode and outputting the shot image in the second output mode; wherein the pixel count of the first output mode is lower than that of the second output mode. The disclosed method can improve the shooting effect of the mobile terminal.

Description

Shooting processing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of mobile terminal technologies, and in particular, to a shooting processing method, a shooting processing apparatus, a storage medium, and an electronic device.
Background
With the development of mobile terminals, users place increasingly high demands on their photographing performance. At present, a camera on a mobile terminal can adjust its output mode according to the brightness of the current shooting environment.
For example, in a dark environment the camera is in a low-pixel output mode and outputs low-pixel image data; in a bright environment the camera switches to a high-pixel output mode and outputs high-pixel image data.
However, when only ambient brightness is considered, in many scenes the photographed subject remains unclear and its details are poorly rendered, resulting in a poor shooting effect.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a shooting processing method, a shooting processing apparatus, a storage medium, and an electronic device, thereby overcoming, at least to some extent, the problem of poor shooting effects on mobile terminals.
According to a first aspect of the present disclosure, there is provided a shooting processing method including: acquiring preview image data of a camera of the mobile terminal in a first output mode, and detecting the distance between a target object in a current shooting scene and the mobile terminal according to the preview image data; if the distance between the target object and the mobile terminal is greater than a distance threshold value, switching the output mode of the camera to a second output mode, and outputting the shot image by using the second output mode; wherein the pixels of the first output mode are lower than the pixels of the second output mode.
According to a second aspect of the present disclosure, there is provided a shooting processing apparatus including: the distance detection module is used for acquiring preview image data of a camera of the mobile terminal in a first output mode and detecting the distance between a target object in a current shooting scene and the mobile terminal according to the preview image data; the first shooting processing module is used for switching the output mode of the camera to a second output mode and outputting a shot image by utilizing the second output mode if the distance between the target object and the mobile terminal is greater than a distance threshold; wherein the pixels of the first output mode are lower than the pixels of the second output mode.
According to a third aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described photographing processing method.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to execute the above-described photographing processing method via execution of the executable instructions.
According to the technical solution provided by some embodiments of the disclosure, the distance from a target object in the current shooting scene to the mobile terminal is determined from the preview image data of the camera in the first output mode, and the output mode of the camera is automatically adjusted according to that distance. The output resolution is thereby changed, the sharpness and texture details of the images shot by the user are improved, and the shooting effect is enhanced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 schematically shows a flowchart of a photographing processing method according to an exemplary embodiment of the present disclosure;
fig. 2 schematically shows a flowchart of an entire photographing process procedure according to an exemplary embodiment of the present disclosure;
fig. 3 schematically shows a block diagram of a photographing processing apparatus according to an exemplary embodiment of the present disclosure;
fig. 4 schematically shows a block diagram of a photographing processing apparatus according to another exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates a block diagram of a second photographing processing module according to an exemplary embodiment of the present disclosure;
fig. 6 schematically illustrates a block diagram of a second photographing processing module according to another exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of a distance detection module according to an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a block diagram of a distance detection module according to another exemplary embodiment of the present disclosure;
fig. 9 schematically shows a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation. In addition, the terms "first" and "second" are used for distinguishing purposes only and should not be construed as limiting the present disclosure.
The photographing processing method described below may be implemented by a mobile terminal, that is, the mobile terminal may perform the respective steps of the photographing processing method of the exemplary embodiment of the present disclosure, in which case the photographing processing apparatus described below may be disposed in the mobile terminal.
The mobile terminal of the present disclosure includes, but is not limited to, a mobile phone, a tablet computer, and a smart wearable device.
Fig. 1 schematically shows a flowchart of a photographing processing method of an exemplary embodiment of the present disclosure. Referring to fig. 1, the photographing processing method may include the steps of:
In step S12, preview image data of a camera of the mobile terminal in a first output mode is acquired, and the distance between a target object in the current shooting scene and the mobile terminal is detected according to the preview image data.
After the camera is turned on and before shooting is performed, the camera is usually in a preview state, in which it outputs preview image data.
The present disclosure does not limit the kind of camera on the mobile terminal, as long as the camera supports multiple output modes. An output mode in the present disclosure may be understood as a mode in which captured image data is output at a certain pixel count. It should be understood that, in this description, different output modes correspond to different pixel counts and can therefore be considered to have different output resolutions.
When the camera is previewing, an output mode with a relatively low pixel count is often adopted; this output mode is referred to in the present disclosure as the first output mode. Correspondingly, the output mode with a relatively high pixel count is denoted the second output mode.
Taking a 64-megapixel camera as an example, the output modes may include a 16-megapixel output and a 64-megapixel output. Normally, when a 64-megapixel camera is in the preview state, it outputs at 16 megapixels. For such a camera, the first output mode of the present disclosure may be the mode that outputs image data at 16 megapixels, and the second output mode may be the mode that outputs image data at 64 megapixels.
The mobile terminal may detect a distance of a target object from the mobile terminal in a current shooting scene using the preview image data.
For the process of determining the target object, according to some embodiments of the present disclosure, the mobile terminal may identify each object in the current shooting scene using the preview image and determine the type of each object. Specifically, the type of each object may be determined using methods such as edge detection or a neural network; for example, the types may include buildings, people, cars, flowers and plants, and the like. Next, among the objects whose type belongs to a preset type, the object occupying the largest proportion of the preview image area may be determined as the target object. For example, suppose the preview image contains 5 objects, denoted object A, object B, object C, object D, and object E, and the preset type is "person". If object A and object D are people, object A occupies 25% of the preview image area, and object D occupies 30%, then object D is the target object. It should be understood that the preset type may be set by the user, and the present disclosure is not limited in this regard.
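The following is a minimal Python sketch of this selection rule, not the patent's implementation: the (object_type, area_ratio) detection format, the function name, and the preset type "person" are assumptions for illustration only.

```python
# Minimal sketch of the target-object selection rule: among the detected objects
# whose type belongs to a preset type, pick the one covering the largest share of
# the preview frame. Any detector (edge detection, a neural network, etc.) could
# supply the hypothetical (object_type, area_ratio) pairs used here.

def select_target_object(detections, preset_types=("person",)):
    """Return the (type, area_ratio) of the largest preset-type object, or None."""
    candidates = [d for d in detections if d[0] in preset_types]
    return max(candidates, key=lambda d: d[1]) if candidates else None

# The example from the text: objects A and D are people covering 25% and 30% of
# the frame, so object D (0.30) is chosen as the target object.
detections = [("person", 0.25), ("building", 0.40), ("car", 0.10),
              ("person", 0.30), ("flower", 0.05)]
print(select_target_object(detections))  # ('person', 0.3)
```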
For the process of determining the target object, according to other embodiments of the present disclosure, a preview image may be displayed on an interface of the mobile terminal. In this case, in response to a selection operation of the user for a region in the preview image, an object corresponding to the selected region may be determined as the target object. For example, if the preview image includes a cat, and the user clicks an area corresponding to the cat on the interface of the mobile terminal, the cat may be used as the target object.
It should be understood that the target object is an object that the user focuses on or is interested in the current shooting scene.
After the target object is determined, the distance from the target object to the mobile terminal can be detected using the preview image data. Specifically, the distance detection may be implemented using, for example, a similar-triangle method; however, the present disclosure does not limit the specific detection process, and other monocular ranging methods may also be used to detect the distance between the target object and the mobile terminal.
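As one hedged illustration of such a monocular estimate, a similar-triangle (pinhole-camera) calculation needs a known real-world size for the target and the camera's focal length expressed in pixels; the function and parameter names below are assumptions, not the patent's actual procedure.

```python
# Illustrative similar-triangle (pinhole model) distance estimate, one of the
# monocular ranging methods the text allows. Assumes the target's real-world
# height and the focal length in pixel units are known.

def estimate_distance_m(real_height_m, focal_length_px, apparent_height_px):
    """distance = real height * focal length / apparent height in the image."""
    return real_height_m * focal_length_px / apparent_height_px

# A 1.7 m person spanning 600 px with a 1500 px focal length is about 4.25 m away.
print(estimate_distance_m(1.7, 1500, 600))  # 4.25
```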
In step S14, if the distance between the target object and the mobile terminal is greater than a distance threshold, the output mode of the camera is switched to a second output mode and the shot image is output in the second output mode; the pixel count of the first output mode is lower than that of the second output mode.
In an exemplary embodiment of the present disclosure, the pixel count of the first output mode is lower than that of the second output mode. Still taking a 64-megapixel camera as an example, the first output mode may be a mode that outputs image data at 16 megapixels, and the second output mode may be a mode that outputs image data at 64 megapixels.
In step S14, the distance of the target object from the mobile terminal detected in step S12 may be compared with a distance threshold, and whether to switch the output mode of the camera may be determined according to the comparison result. The distance threshold may be set to a fixed value by the manufacturer before the mobile terminal leaves the factory, or may be set by the user, for example to 1 meter; the present disclosure is not limited in this regard.
According to some embodiments of the present disclosure, if a distance of the target object from the mobile terminal is greater than a distance threshold, the output mode of the camera may be switched from the first output mode to the second output mode, and the photographed image may be output using the second output mode.
According to further embodiments of the present disclosure, if the distance of the target object from the mobile terminal is less than or equal to the distance threshold, the texture complexity of the target object may be determined from the preview image data. Texture represents the visual characteristics of homogeneous regions in an image and reflects the slowly varying or periodically varying structural arrangement of an object's surface.
Specifically, the mean of the squared differences between the pixel value of each pixel in the target object and the pixel values of its adjacent pixels may be calculated, and this mean may be used as the texture complexity of the target object.
Those skilled in the art may envisage other methods of calculating texture complexity, such as analysis based on image gradients; such schemes also fall within the scope of the present disclosure.
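A small sketch of the squared-difference measure described above is given below. The choice of right/bottom neighbours and the grayscale input are assumptions, since the disclosure only specifies "adjacent pixel points".

```python
import numpy as np

# Texture complexity as the mean squared difference between each pixel and its
# adjacent pixels, computed here over right and bottom neighbours of a grayscale
# patch covering the target object.

def texture_complexity(gray_patch):
    p = gray_patch.astype(np.float64)
    dx = p[:, 1:] - p[:, :-1]   # differences with the right-hand neighbour
    dy = p[1:, :] - p[:-1, :]   # differences with the neighbour below
    return (np.sum(dx ** 2) + np.sum(dy ** 2)) / (dx.size + dy.size)

flat = np.full((64, 64), 128, dtype=np.uint8)                # smooth -> low complexity
noisy = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # busy -> high complexity
print(texture_complexity(flat), texture_complexity(noisy))
```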
Next, one output mode may be selected from the first output mode and the second output mode according to the texture complexity of the target object to output the photographed image.
In one case, if the texture complexity of the target object is less than or equal to a texture complexity threshold, the shot image is output in the first output mode. The texture complexity threshold may be set manually in advance; the present disclosure does not limit its specific value.
Further, in this case, the captured image output in the first output mode may also be converted into an image matching the pixel count of the second output mode and then output. Specifically, the captured image output in the first output mode may be fed into a pre-configured up-scaling (Upscale) hardware module to obtain an image at the pixel count of the second output mode. Still taking the 64-megapixel camera as an example, the 16-megapixel image output in the first output mode can be fed into this hardware module to obtain and output a 64-megapixel image. The camera thus keeps outputting at the high pixel count, ensuring a consistent output.
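The patent relies on a dedicated Upscale hardware module for this conversion; as a purely software stand-in, a simple resize illustrates the idea. The 16 MP and 64 MP resolutions below (4624x3472 and 9248x6944), the file names, and the function name are illustrative assumptions.

```python
from PIL import Image

# Software stand-in for the up-scaling step: convert a first-output-mode (16 MP)
# JPEG to the second output mode's pixel count (64 MP) so the saved image keeps a
# consistent high-resolution geometry. A real device would do this in hardware.

def upscale_to_second_mode(src_path, dst_path, target_size=(9248, 6944)):
    img = Image.open(src_path)
    img.resize(target_size).save(dst_path, "JPEG")  # default resampling filter

# upscale_to_second_mode("shot_16mp.jpg", "shot_64mp.jpg")
```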
In another case, if the texture complexity of the target object is greater than the texture complexity threshold, the output mode of the camera may be switched to the second output mode, and the photographed image may be output using the second output mode.
It is easily understood that the output data of the shot image may be encoded to generate, for example, a picture in JPEG format.
The shooting processing procedure of the present disclosure will be explained below with reference to fig. 2.
In step S202, the mobile terminal may detect a distance from the target object to the mobile terminal according to preview image data output by the camera previewing in the first output mode; in step S204, it is determined whether the distance between the target object and the mobile terminal is greater than a distance threshold, if so, step S210 is performed, and if not, step S206 is performed.
In step S206, the mobile terminal may determine the texture complexity of the target object; in step S208, it is determined whether the texture complexity is greater than a texture complexity threshold, if so, step S210 is performed, and if not, step S212 is performed.
In step S210, the mobile terminal switches the output mode of the camera from the first output mode to the second output mode, and outputs the captured image.
In step S212, the mobile terminal outputs the photographed image using the first output mode; in step S214, the captured image output in step S212 using the first output mode is converted into an image corresponding to the pixels of the second output mode and output.
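A compact sketch of this Fig. 2 branching logic follows; the threshold values, mode labels, and helper signature are placeholders rather than values fixed by the disclosure.

```python
# Decision flow of steps S202-S214: far subjects or texture-rich subjects are shot
# in the high-pixel second output mode; otherwise the first output mode is used and
# the result is up-scaled afterwards to match the second mode's pixel count.

FIRST_OUTPUT_MODE, SECOND_OUTPUT_MODE = "16MP", "64MP"

def choose_capture_path(distance_m, texture_complexity,
                        distance_threshold=1.0, texture_threshold=500.0):
    """Return (output_mode, upscale_afterwards) for the current shot."""
    if distance_m > distance_threshold:          # S204 -> S210
        return SECOND_OUTPUT_MODE, False
    if texture_complexity > texture_threshold:   # S208 -> S210
        return SECOND_OUTPUT_MODE, False
    return FIRST_OUTPUT_MODE, True               # S212 and S214

print(choose_capture_path(2.5, 120.0))  # ('64MP', False): distant subject
print(choose_capture_path(0.6, 80.0))   # ('16MP', True): close, low-texture subject
```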
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, the present exemplary embodiment also provides a shooting processing apparatus.
Fig. 3 schematically shows a block diagram of a photographing processing apparatus of an exemplary embodiment of the present disclosure. Referring to fig. 3, the photographing processing apparatus 3 according to an exemplary embodiment of the present disclosure may include a distance detection module 31 and a first photographing processing module 33.
Specifically, the distance detection module 31 may be configured to obtain preview image data of a camera of the mobile terminal in a first output mode, and detect a distance from a target object in a current shooting scene to the mobile terminal according to the preview image data; the first photographing processing module 33 may be configured to switch the output mode of the camera to a second output mode if the distance between the target object and the mobile terminal is greater than a distance threshold, and output the photographed image using the second output mode; wherein the pixels of the first output mode are lower than the pixels of the second output mode.
According to an exemplary embodiment of the present disclosure, referring to fig. 4, the photographing processing apparatus 4 may further include a second photographing processing module 41 compared to the photographing processing apparatus 3.
Specifically, the second photographing processing module 41 may be configured to perform: if the distance between the target object and the mobile terminal is smaller than or equal to the distance threshold, determining the texture complexity of the target object according to the preview image data; selecting an output mode from the first output mode and the second output mode to output a photographed image according to the texture complexity of the target object.
According to an exemplary embodiment of the present disclosure, referring to fig. 5, the second photographing processing module 41 includes a first photographing processing unit 501.
Specifically, the first photographing processing unit 501 may be configured to perform: and if the texture complexity of the target object is less than or equal to a texture complexity threshold, outputting the shot image by using the first output mode.
According to an exemplary embodiment of the present disclosure, the first photographing processing unit 501 may be further configured to perform: after the photographed image is output using the first output mode, the photographed image output using the first output mode is converted into an image corresponding to the pixels of the second output mode and output.
According to an exemplary embodiment of the present disclosure, referring to fig. 6, the second photographing processing module 41 includes a second photographing processing unit 601.
Specifically, the second shooting processing unit 601 may be configured to execute: and if the texture complexity of the target object is greater than the texture complexity threshold, switching the output mode of the camera to a second output mode, and outputting the shot image by using the second output mode.
According to an exemplary embodiment of the present disclosure, referring to fig. 7, the distance detection module 31 includes a first target object determination unit 701.
Specifically, the first target object determining unit 701 may be configured to perform: identifying each object in the current shooting scene by using the preview image data, and determining the type corresponding to each object; and determining the object which occupies the largest area proportion of the preview image from the objects of which the types belong to the preset types as the target object.
According to an exemplary embodiment of the present disclosure, referring to fig. 8, the distance detection module 31 includes a second target object determination unit 801.
Specifically, the second target object determination unit 801 may be configured to perform: and responding to the selection operation of the user on the preview image, and determining the object corresponding to the selection operation as the target object.
Since each functional module of the shooting processing apparatus corresponds to the steps of the method embodiment described above, it is not described here again.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium on which a program product capable of implementing the above-described method of this specification is stored. In some possible embodiments, aspects of the invention may also be implemented as a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the method section above.
The program product for implementing the above method according to an embodiment of the present invention may employ a portable compact disc read only memory (CD-ROM) and include program codes, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical disk, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
An electronic device 900 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 900 is embodied in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, a bus 930 connecting different system components (including the storage unit 920 and the processing unit 910), and a display unit 940.
Wherein the storage unit stores program code that is executable by the processing unit 910 to cause the processing unit 910 to perform steps according to various exemplary embodiments of the present invention described in the above section "exemplary methods" of the present specification. For example, the processing unit 910 may perform step S12 and step S14 as shown in fig. 1.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, and may further include a read only memory unit (ROM) 9203.
Storage unit 920 may also include a program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 930 can be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 1000 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (4)

1. A shooting processing method, characterized by comprising:
acquiring preview image data of a camera of a mobile terminal in a first output mode, identifying each object in a current shooting scene by using the preview image data, determining a type corresponding to each object, determining an object occupying the largest area proportion of a preview image from objects of which the types belong to preset types as a target object, and detecting the distance between the target object and the mobile terminal;
if the distance between the target object and the mobile terminal is larger than a distance threshold value, switching the output mode of the camera to a second output mode, and outputting a shot image by using the second output mode;
if the distance between the target object and the mobile terminal is smaller than or equal to the distance threshold, determining the texture complexity of the target object according to the preview image data, wherein the texture complexity is an average value of the sum of squares of differences between pixel values of each pixel point and adjacent pixel points in the target object, outputting a shot image by using the first output mode under the condition that the texture complexity of the target object is smaller than or equal to a texture complexity threshold, converting the shot image output by the first output mode into an image corresponding to a pixel of the second output mode and outputting the image, and switching the output mode of the camera into the second output mode under the condition that the texture complexity of the target object is larger than the texture complexity threshold, and outputting the shot image by using the second output mode;
wherein the pixels of the first output mode are lower than the pixels of the second output mode.
2. A shooting processing apparatus characterized by comprising:
the distance detection module is used for acquiring preview image data of a camera of the mobile terminal in a first output mode, identifying each object in a current shooting scene by using the preview image data, determining the type corresponding to each object, determining an object occupying the largest area proportion of a preview image from objects of which the types belong to preset types, taking the object as a target object, and detecting the distance between the target object and the mobile terminal;
the first shooting processing module is used for switching the output mode of the camera to a second output mode and outputting a shot image by utilizing the second output mode if the distance between the target object and the mobile terminal is greater than a distance threshold;
a second shooting processing module, configured to: if the distance between the target object and the mobile terminal is less than or equal to the distance threshold, determine the texture complexity of the target object according to the preview image data, wherein the texture complexity is the average value of the sum of squares of the differences between the pixel values of each pixel point and the adjacent pixel points in the target object; output a photographed image using the first output mode in a case where the texture complexity of the target object is equal to or less than a texture complexity threshold, and convert the photographed image output by the first output mode into an image corresponding to the pixels of the second output mode and output it; and, in a case where the texture complexity of the target object is greater than the texture complexity threshold, switch the output mode of the camera to the second output mode and output a photographed image using the second output mode;
wherein the pixels of the first output mode are lower than the pixels of the second output mode.
3. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the shooting processing method of claim 1.
4. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the photographic processing method of claim 1 via execution of the executable instructions.
CN201911057830.7A 2019-11-01 2019-11-01 Shooting processing method and device, storage medium and electronic equipment Active CN110855882B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911057830.7A CN110855882B (en) 2019-11-01 2019-11-01 Shooting processing method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911057830.7A CN110855882B (en) 2019-11-01 2019-11-01 Shooting processing method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110855882A CN110855882A (en) 2020-02-28
CN110855882B true CN110855882B (en) 2021-10-08

Family

ID=69597934

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911057830.7A Active CN110855882B (en) 2019-11-01 2019-11-01 Shooting processing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110855882B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115118871B (en) * 2022-02-11 2023-12-15 东莞市步步高教育软件有限公司 Shooting pixel mode switching method, shooting pixel mode switching system, terminal equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101815126A (en) * 2010-03-19 2010-08-25 中兴通讯股份有限公司 Method and device for automatically adjusting display scale
CN101977283A (en) * 2010-09-09 2011-02-16 华为终端有限公司 Method and apparatus for switching photographing mode
CN103024165A (en) * 2012-12-04 2013-04-03 华为终端有限公司 Method and device for automatically setting shooting mode
CN104539846A (en) * 2014-12-26 2015-04-22 小米科技有限责任公司 Picture shooting method, device and terminal
CN104777980A (en) * 2015-04-22 2015-07-15 广东欧珀移动通信有限公司 Method and device for electricity saving of high-resolution terminal
CN106161916A (en) * 2015-04-08 2016-11-23 联想(北京)有限公司 A kind of image-pickup method and electronic equipment
CN106210514A (en) * 2016-07-04 2016-12-07 广东欧珀移动通信有限公司 Take pictures focusing method, device and smart machine
CN106464801A (en) * 2014-05-21 2017-02-22 高通股份有限公司 System and method for determining image resolution
CN106488126A (en) * 2016-10-26 2017-03-08 惠州Tcl移动通信有限公司 A kind of can the photographic method of auto zoom picture size, system and its mobile terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5040493B2 (en) * 2006-12-04 2012-10-03 ソニー株式会社 Imaging apparatus and imaging method
US20100149338A1 (en) * 2008-12-16 2010-06-17 Mamigo Inc Method and apparatus for multi-user user-specific scene visualization
CN103440618A (en) * 2013-09-25 2013-12-11 云南大学 Block-based texture synthesis method and device
JP6897679B2 (en) * 2016-06-28 2021-07-07 ソニーグループ株式会社 Imaging device, imaging method, program
CN108322747B (en) * 2018-01-05 2020-07-10 中国软件与技术服务股份有限公司 Coding unit division optimization method for ultra-high definition video


Also Published As

Publication number Publication date
CN110855882A (en) 2020-02-28

Similar Documents

Publication Publication Date Title
CN110675404B (en) Image processing method, image processing apparatus, storage medium, and terminal device
CN109040523B (en) Artifact eliminating method and device, storage medium and terminal
CN110809101B (en) Image zooming processing method and device, electronic equipment and storage medium
CN110536078A (en) Handle the method and dynamic visual sensor of the data of dynamic visual sensor
EP4053784A1 (en) Image processing method and apparatus, electronic device, and storage medium
CN114943936B (en) Target behavior recognition method and device, electronic equipment and storage medium
CN103826064A (en) Image processing method, device and handheld electronic equipment
CN112306793A (en) Method and device for monitoring webpage
CN110855957B (en) Image processing method and device, storage medium and electronic equipment
JP7255841B2 (en) Information processing device, information processing system, control method, and program
CN111931781A (en) Image processing method and device, electronic equipment and storage medium
CN110855882B (en) Shooting processing method and device, storage medium and electronic equipment
CN110263301B (en) Method and device for determining color of text
CN112801882B (en) Image processing method and device, storage medium and electronic equipment
CN110933304B (en) Method and device for determining to-be-blurred region, storage medium and terminal equipment
CN109218620B (en) Photographing method and device based on ambient brightness, storage medium and mobile terminal
CN111447360A (en) Application program control method and device, storage medium and electronic equipment
CN110855881B (en) Shooting processing method and device, storage medium and electronic equipment
CN114255177B (en) Exposure control method, device, equipment and storage medium in imaging
CN113658073A (en) Image denoising processing method and device, storage medium and electronic equipment
CN114071024A (en) Image shooting method, neural network training method, device, equipment and medium
CN112911186B (en) Image storage method and device, electronic equipment and storage medium
CN116567194B (en) Virtual image synthesis method, device, equipment and storage medium
CN112288774B (en) Mobile detection method, mobile detection device, electronic equipment and storage medium
CN117631841A (en) Financial machine control method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant