CN117337578A - Method, electronic device, apparatus, and computer-readable storage medium for generating image - Google Patents


Info

Publication number
CN117337578A
Authority
CN
China
Prior art keywords
image
processing
camera image
display
initial
Prior art date
Legal status
Pending
Application number
CN202180098120.8A
Other languages
Chinese (zh)
Inventor
新井俊彦
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of CN117337578A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A method of generating an image includes: acquiring a reference camera image by controlling a camera assembly to capture at least one main object specified by a first operation input; displaying, on a display, an initial camera image obtained by performing initial image processing on the reference camera image; selecting, according to a second operation input, a processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.

Description

Method, electronic device, apparatus, and computer-readable storage medium for generating image
Technical Field
The present disclosure relates to a method of generating image data, an electronic device implementing the method, an apparatus, and a computer-readable medium having stored thereon program instructions for performing the method.
Background
Traditionally, electronic devices such as smartphones are equipped with digital cameras that can capture objects, such as people.
There are generally two ways of photographing: one focuses on the main object, and the other balances the main object and the background.
For example, a portrait photo should draw attention to the person. In this case, the background is not important; in some cases, the background may even interfere with the main object.
When taking photographs such as travel photos, however, the user wishes to capture the person and the background at the same time. In this case, the background should also be well captured (with respect to focus, brightness, and color).
One of these modes is then selected according to the user's intention, and each mode requires control of multiple settings. This leads to the following problems.
For example, the user needs to control various parameter settings, which requires experience and knowledge. Moreover, only the user knows the intent; the camera assembly of the electronic device cannot take the user's intent into account.
In particular, when a user photographs flowers with the camera assembly, in some cases the user wants to photograph the main-object flowers so as to emphasize them, and in other cases the user wants to photograph the main-object flowers together with the background.
When the user emphasizes the main-object flower, the flower is photographed so that it stands out from the background: exposure and white balance are adjusted to make the flower appear most attractive, and the background is blurred.
On the other hand, when the user photographs the main-object flower together with the background, the camera assembly adjusts exposure and white balance so that the entire image is correct and the whole scene is in focus.
However, whether "only the flower is the main object" or "the main-object flower together with the background" is intended is the user's choice. The camera assembly cannot recognize the user's intent.
Disclosure of Invention
The present disclosure aims to solve at least one of the above technical problems. Accordingly, there is a need to provide an electronic device and a method of controlling an electronic device.
According to the present disclosure, a method for generating an image includes:
acquiring a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input;
displaying, on a display, an initial camera image obtained by performing initial image processing on the reference camera image;
selecting, according to a second operation input, a processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and
generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
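As a rough sketch only, the four steps above can be expressed in Python; every name here (`ProcessingParameterSet`, `apply_processing`, `generate_image`) is a hypothetical stand-in for illustration and not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProcessingParameterSet:
    # Illustrative stand-ins for a parameter set's "processing contents".
    name: str
    hdr: bool
    background_blur: bool

def apply_processing(reference_image, params):
    # Stand-in for the real image pipeline: record which parameter
    # set was applied to the reference camera image.
    return {"pixels": reference_image, "processed_with": params.name}

def generate_image(capture, parameter_sets, selected_index, display):
    reference = capture()                                     # step 1: acquire reference image
    initial = apply_processing(reference, parameter_sets[0])  # initial image processing
    display.append(initial)                                   # step 2: display initial image
    selected = parameter_sets[selected_index]                 # step 3: second operation input
    return apply_processing(reference, selected)              # step 4: processed camera image

# The "plurality of processing parameter sets having different processing contents":
PARAMETER_SETS = [
    ProcessingParameterSet("main-object", hdr=False, background_blur=True),
    ProcessingParameterSet("balanced", hdr=True, background_blur=False),
]
```

Note that both the initial and the processed camera images are derived from the same reference camera image, so switching parameter sets does not require recapturing.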
According to the present disclosure, an electronic device includes:
a camera assembly configured to acquire a camera image by capturing an object;
a display;
at least one processor; and
at least one memory including program code;
the at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform:
acquiring a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input;
displaying, on the display, an initial camera image obtained by performing initial image processing on the reference camera image;
selecting, according to a second operation input, a processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and
generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
According to the present disclosure, an apparatus comprises:
a camera image acquisition unit configured to acquire a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input;
a display unit configured to display, on a display, an initial camera image obtained by performing initial image processing on the reference camera image;
a parameter selection unit configured to select, according to a second operation input, one processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and
a camera image generation unit configured to generate a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
According to the present disclosure, a computer readable medium includes program instructions stored thereon for performing at least the following:
acquiring a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input;
displaying, on a display, an initial camera image obtained by performing initial image processing on the reference camera image;
selecting, according to a second operation input, a processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and
generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
Drawings
These and/or other aspects and advantages of the embodiments of the present disclosure will become apparent and more readily appreciated from the following description taken in conjunction with the accompanying drawings.
FIG. 1 illustrates a plan view of a first face of an electronic device according to an embodiment of the present disclosure;
fig. 2 illustrates a plan view of a second face of an electronic device according to an embodiment of the present disclosure;
FIG. 3 illustrates a block diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 4 shows a block diagram of a processor according to an embodiment of the present disclosure;
FIG. 5A is a diagram showing a display example of an electronic device displaying a camera image weighted on the main object;
FIG. 5B is a diagram showing a display example of an electronic device displaying a camera image weighted on the main object and the background;
FIG. 6 is a diagram for explaining an example of image processing of a camera image weighted on the main object;
FIG. 7 is a diagram for explaining an example of image processing of a camera image weighted on the main object and the background;
FIG. 8A is a diagram showing a display example of an electronic device displaying a camera image weighted on the main object and a slider bar operated by user input;
FIG. 8B is a diagram showing a display example of an electronic device displaying a camera image weighted on the main object and the background and a slider bar operated by user input;
FIG. 9A is a diagram showing a display example of an electronic device displaying, as the initial camera image, a camera image weighted on the main object; and
FIG. 9B is a diagram showing a display example of an electronic device displaying, as the initial camera image, a camera image weighted on the main object and the background.
Detailed Description
Embodiments of the present disclosure will be described in detail, and examples of the embodiments are illustrated in the accompanying drawings. Throughout the description, identical or similar elements and elements having identical or similar functions are denoted by identical reference numerals. The embodiments described herein with reference to the drawings are illustrative, are intended to explain the present disclosure, and should not be construed as limiting the present disclosure.
Here, fig. 1 illustrates a plan view of a first side of an electronic device 10 according to an embodiment of the present disclosure, and fig. 2 illustrates a plan view of a second side of the electronic device 10 according to an embodiment of the present disclosure. The first side may be referred to as the back side of the electronic device 10 and the second side may be referred to as the front side of the electronic device 10.
As shown in fig. 1 and 2, the electronic device 10 may include a display 20 and a camera assembly 30. In the present embodiment, the camera assembly 30 includes a first primary camera 32, a second primary camera 34, and a secondary camera 36. The first and second primary cameras 32, 34 may capture images on a first side of the electronic device 10 and the secondary camera 36 may capture images on a second side of the electronic device 10. Thus, the first main camera 32 and the second main camera 34 are so-called external cameras (out-cameras), and the sub-camera 36 is a so-called internal camera (in-camera). For example, the electronic device 10 may be a mobile phone, tablet, personal digital assistant, or the like.
Note that the display 20 is, for example, a touch screen that accepts an operation input of the user U.
Fig. 3 shows a block diagram of the electronic device 10 according to the present embodiment. As shown in fig. 3, in addition to the display 20 and the camera assembly 30, the electronic device 10 may include a main processor 40, an image signal processor 42, a memory 44, a power circuit 46, and a communication circuit 48. The display 20, camera assembly 30, main processor 40, image signal processor 42, memory 44, power circuit 46, and communication circuit 48 are interconnected by bus 50.
The processor 100 includes a main processor 40 and an image signal processor 42.
As will be described later, the processor 100 acquires a reference camera image by controlling the camera assembly 30. Further, the processor 100 acquires a processed camera image by performing image processing on the reference camera image.
The main processor 40 executes one or more programs stored in the memory 44. By executing the programs, the main processor 40 implements various applications and data processing (including image data processing) of the electronic device 10. The main processor 40 may be one or more computer processors; it is not limited to a single central processing unit (CPU) core and may have a plurality of CPU cores. The main processor 40 may be the main CPU of the electronic device 10, an image processing unit (IPU), or a digital signal processor (DSP) provided together with the camera assembly 30.
The image signal processor 42 controls the camera assembly 30 and processes various image data captured by the camera assembly 30 to generate target image data. For example, the image signal processor 42 may perform a demosaicing process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process, and the like on image data captured by the camera assembly 30.
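As a simplified sketch of the idea only (stage names follow the list above; the placeholder lambdas merely tag the frame, since a real ISP implements these stages in dedicated hardware):

```python
def run_isp_pipeline(raw_frame, stages):
    """Apply each processing stage, in order, to the captured frame."""
    image = raw_frame
    for stage in stages:
        image = stage(image)
    return image

# Placeholder stages standing in for demosaicing, noise reduction,
# and auto white balance; each just marks the frame it receives.
stages = [
    lambda img: {**img, "demosaiced": True},
    lambda img: {**img, "noise_reduced": True},
    lambda img: {**img, "white_balanced": True},
]
```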
In this embodiment, the main processor 40 and the image signal processor 42 cooperate with each other to generate target image data of an object captured by the camera assembly 30. That is, the main processor 40 and the image signal processor 42 are configured to capture an image of an object through the camera assembly 30 and perform various image processes on the captured image data.
The memory 44 stores programs to be executed by the main processor 40 and the image signal processor 42, as well as various data. For example, data of the captured image is stored in the memory 44.
It is noted that images, including captured images and generated images, exist in the electronic device 10 in the form of image data (e.g., the data of the images are stored in the memory 44). Furthermore, the processor 100 processes images as image data.
In particular, for example, memory 44 includes program code. The memory 44 and program code are configured to, with the processor 100, cause the electronic device 10 to perform: acquiring a reference camera image by controlling the camera assembly 30 to capture at least one main object specified by a first operation input of a user; displaying an initial camera image obtained by performing initial image processing on the reference camera image on the display 20; selecting a processing parameter set from a plurality of processing parameter sets specifying a plurality of image processes having different processing contents according to a second operation input by a user; and generating a processed camera image by performing image processing specified in the selected processing parameter set on the reference camera image.
The memory 44 may include high-speed random access memory (RAM) and/or non-volatile memory such as flash memory and disk memory. That is, the memory 44 may include a non-transitory computer-readable medium storing a program.
The power supply circuit 46 may have a battery, such as a lithium-ion rechargeable battery, and a battery management unit (BMU) for managing the battery.
The communication circuit 48 is configured to receive and transmit data by wireless communication to communicate with a base station of a communication network system, the Internet, or other devices. The wireless communication may employ any communication standard or protocol, including, but not limited to, the Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Long Term Evolution (LTE), LTE-Advanced, and fifth-generation mobile communication technology (5G). The communication circuit 48 may include an antenna and radio frequency (RF) circuitry.
In the present embodiment, the image generation process is executed by, for example, the main processor 40 (processor 100) to generate image data.
Further, in the present embodiment, the program instructions of the image generation process are stored in a non-transitory computer-readable medium of the memory 44. When the program instructions are read from the memory 44 and executed in the main processor 40, the main processor 40 implements an image generation process.
Here, fig. 4 shows a block diagram of a processor 100 according to an embodiment of the present disclosure.
As shown in fig. 4, the processor 100 may be regarded as an apparatus. The apparatus comprises: a camera image acquisition unit 100a configured to acquire a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input of the user; a display unit 100b configured to display, on the display, an initial camera image obtained by performing initial image processing on the reference camera image; a parameter selection unit 100c configured to select, according to a second operation input of the user, one processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and a camera image generation unit 100d configured to generate a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
In this case, the electronic device 10 may be represented as an apparatus including the processor 100.
[ image data generation method ]
Next, an example of a method of generating image data by the electronic device 10 having the above-described configuration and functions according to the present embodiment will be described.
Here, fig. 5A is a diagram showing a display example of an electronic device that displays a camera image weighted on a main object. Further, fig. 5B is a diagram showing a display example of an electronic device that displays camera images weighted on a main object and a background.
The processing of the processor 100 of the electronic device 10 shown in fig. 3 will be mainly described below.
First, the processor 100 captures at least one main object Z designated by a first operation input of a user by controlling the camera assembly 30, thereby obtaining a reference camera image.
The processor 100 sets, as the main object Z, the object on which the camera assembly 30 has focused in the reference camera image.
In the examples shown in fig. 5A, 5B, 6 and 7, the main object Z is a flower, but may be an object other than a flower.
The processor 100 then displays an initial camera image on the display 20, the initial camera image being obtained by performing an initial image process on the reference camera image.
Then, the processor 100 selects one processing parameter set from among a plurality of processing parameter sets that specify a plurality of image processes having different processing contents, according to a second operation input by the user.
For example, camera assembly 30 has many features to achieve "focus on target" or "balanced focus on target and background".
For example, the image processing performed by the processor 100 on the reference camera image includes: turning the foreground function on or off based on the image of the main object Z in the reference camera image or on the entire reference camera image.
Here, the "foreground" function controls focus blur on the background so that the main object stands out from the background.
Further, the image processing performed by the processor 100 on the reference camera image includes: adjusting the white balance of the reference camera image based on the image of the main object Z in the reference camera image or on the entire reference camera image.
Further, the image processing performed by the processor 100 on the reference camera image includes: adjusting the brightness based on the image of the main object Z in the reference camera image or on the entire reference camera image.
Here, "extended depth of focus (EDOF)" processing captures multiple frames so that both the main object and the background are in good focus.
Further, the image processing performed by the processor 100 on the reference camera image includes: turning high dynamic range (HDR) processing on or off.
"HDR" captures multiple frames so that both the main object and the background have good brightness.
Here, for example, as shown in figs. 6 and 7, the second operation input is an input made by the user U after the initial camera image is displayed on the display 20. As described above, the display 20 is, for example, a touch screen that receives the operation input of the user U.
In this case, for example, according to the operation input of the user U, the processor 100 selects, from the plurality of processing parameter sets, either a first processing parameter set defining first image processing (figs. 5A, 6) or a second processing parameter set defining second image processing (figs. 5B, 7), whose processing content differs from that of the first image processing.
For example, the user U can switch between the first processing parameter set and the second processing parameter set by operating (touching) the button 20a (fig. 6) or the button 20b (fig. 7) displayed on the display 20.
Here, fig. 6 is a diagram for explaining an example of image processing on a camera image weighted on a main object.
In the case of "focusing on the main object", brightness is determined based on the main object. HDR is off (the background is not important). In addition, the aperture is opened to narrow the depth of field, or the foreground function is turned on. Further, the white balance gain is determined based on the main object.
On the other hand, fig. 7 is a diagram for explaining an example of image processing on camera images weighted on a main object and a background.
In the case of "focusing on both", brightness is determined based on the entire image. HDR is turned on for wide-dynamic-range scenes so that detail in bright and dark areas is not lost. In addition, the aperture is narrowed to obtain a wide depth of focus, and the foreground function is turned off so that the background is also captured. Further, the white balance gain is determined based on the entire image.
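The two settings described for figs. 6 and 7 can be tabulated as two parameter sets; the dictionary keys below are illustrative labels, not terms from the disclosure:

```python
# "Focus on the main object" (fig. 6): everything is judged from the main object.
FOCUS_MAIN_OBJECT = {
    "brightness_reference": "main_object",
    "hdr": False,                  # background brightness is not important
    "aperture": "open",            # narrow depth of field
    "foreground": True,            # blur the background so the main object stands out
    "wb_reference": "main_object",
}

# "Focus on both" (fig. 7): everything is judged from the entire image.
FOCUS_BOTH = {
    "brightness_reference": "whole_image",
    "hdr": True,                   # keep detail in bright and dark areas
    "aperture": "narrow",          # wide depth of focus
    "foreground": False,           # capture the background sharply too
    "wb_reference": "whole_image",
}
```

Selecting one of the two sets by a single second operation input thus flips all five settings at once, which is the point of the scheme.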
In this way, the processor 100 acquires a processed camera image by performing image processing specified in the selected processing parameter set on the reference camera image.
Finally, the processor 100 displays a processed camera image on the display 20, the processed camera image being obtained by performing an image processing defined in the selected set of processing parameters on the reference camera image.
Thus, with this scheme, the user can control various settings based on the balance between the main object and the background.
Here, fig. 8A is a diagram showing an example of a display of an electronic device that displays a camera image weighted on a main object and a slider bar for input operation by a user. Further, fig. 8B is a diagram showing an example of a display of an electronic device that displays a camera image weighted on a main object and a background and a slide bar for an input operation by a user.
For example, as shown in figs. 8A and 8B, the processor 100 may display on the display 20 a slider bar 20c with which the user U can continuously or stepwise adjust the content of the image processing applied to the reference camera image.
It is noted that the processor 100 may display a dial on the display 20 instead of the slider bar 20c for the same purpose.
In this way, a dial or slider bar may be used instead of buttons; the dial or slider bar controls the balance between the main object and the background.
As described above, in response to the operation input of the user U, the processor 100 can adjust the image weight of the main object Z in the image processing.
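One plausible way to realize such continuous or stepwise adjustment of the image weight (the linear blend and the mid-slider threshold for on/off features are assumptions, not the disclosure's formula) is:

```python
def blend_params(main_object_params, balanced_params, w):
    """w = 0.0: fully main-object weighted; w = 1.0: fully balanced with background."""
    blended = {}
    for key in main_object_params:
        a, b = main_object_params[key], balanced_params[key]
        if isinstance(a, bool):
            # On/off features (e.g. HDR) can only switch stepwise, here at mid-slider.
            blended[key] = b if w >= 0.5 else a
        else:
            # Numeric features (e.g. blur strength) blend continuously.
            blended[key] = (1 - w) * a + w * b
    return blended

# Hypothetical endpoint settings for the two ends of the slider:
MAIN = {"hdr": False, "background_blur_strength": 1.0}
BALANCED = {"hdr": True, "background_blur_strength": 0.0}
```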
Here, fig. 9A is a diagram showing an example of a display of an electronic device that displays a camera image weighted on a main object as an initial camera image. Further, fig. 9B shows a diagram of an example of a display of an electronic device that displays a camera image weighted on a main object and a background as an initial camera image.
For example, as shown in fig. 9A and 9B, the processor 100 may select a set of processing parameters to be performed for obtaining an initial camera image based on the proportion of the image region of the main object Z in the reference camera image.
More specifically, as shown in fig. 9A, when the proportion of the image area of the main object Z in the reference camera image is equal to or greater than a preset selection threshold, the processor 100 displays on the display 20 an initial camera image obtained by performing, as the initial image processing, the first image processing defined in the first processing parameter set (the initial state corresponds to a large main object).
In this case, the operation buttons 20a are displayed on the display 20.
On the other hand, as shown in fig. 9B, when the proportion of the image area of the main object Z in the reference camera image is smaller than the selection threshold, the processor 100 displays on the display 20 an initial camera image obtained by performing, as the initial image processing, the second image processing defined in the second processing parameter set (the initial state corresponds to a small main object).
In this case, the operation buttons 20b are displayed on the display 20.
In this way, the electronic device 10 can set the initial state to "focus on the main object" or "balance the main object and the background" based on the proportion of the main object in the image.
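The initial-state choice of figs. 9A and 9B reduces to a threshold test on the main object's share of the image; the value 0.3 below is an arbitrary illustration, since the disclosure only calls it a "preset selection threshold":

```python
SELECTION_THRESHOLD = 0.3  # assumed value; the disclosure leaves it unspecified

def select_initial_parameter_set(main_object_area, image_area,
                                 threshold=SELECTION_THRESHOLD):
    """Pick the initial processing parameter set from the main object's area ratio."""
    ratio = main_object_area / image_area
    # Large main object -> first set (emphasize it, fig. 9A);
    # small main object -> second set (balance with background, fig. 9B).
    return "main_object" if ratio >= threshold else "balanced"
```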
As described above, an electronic device according to the present disclosure includes: a camera assembly configured to acquire a camera image by capturing an object; a display; at least one processor; and at least one memory including program code. The at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform: acquiring a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input; displaying, on the display, an initial camera image obtained by performing initial image processing on the reference camera image; selecting, according to a second operation input, a processing parameter set from a plurality of processing parameter sets that specify a plurality of image processes having different processing contents; and generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
Accordingly, the electronic device according to the present disclosure solves the above technical problems through a user interface: the camera assembly controls the various parameters based on the user's command. The user can thus control multiple settings with a single action and obtain the desired image without difficulty.
In describing embodiments of the present disclosure, it should be understood that terms such as "center," "longitudinal," "transverse," "length," "width," "thickness," "upper," "lower," "front," "rear," "back," "left," "right," "vertical," "horizontal," "top," "bottom," "interior," "exterior," "clockwise," and "counterclockwise" should be construed to refer to directions or locations as described or illustrated in the drawings in question. These related terms are only used to simplify the description of the present disclosure and do not indicate or imply that the devices or elements referred to must have a particular orientation or must be constructed or operated in a particular orientation. Accordingly, these terms should not be construed as limiting the present disclosure.
Furthermore, terms such as "first" and "second" are used herein for descriptive purposes and are not intended to indicate or imply relative importance or significance or the number of technical features indicated. Thus, features defined as "first" and "second" may include one or more of the features. In the description of the present disclosure, "a plurality" means "two or more than two" unless otherwise indicated.
In the description of the embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted," "connected," "coupled," and the like are used broadly and may be, for example, a fixed connection, a removable connection, or an integral connection; a mechanical or electrical connection; a direct connection or an indirect connection through an intermediate structure; or internal communication between two elements, as can be understood by those skilled in the art according to the specific situation.
In embodiments of the present disclosure, unless specified or limited otherwise, structures in which a first feature is "on" or "under" a second feature may include embodiments in which the first feature is in direct contact with the second feature, and may also include embodiments in which the first feature and the second feature are not in direct contact with each other, but are contacted by additional features formed therebetween. Furthermore, embodiments in which a first feature is "on", "above" or "on top of" a second feature may include embodiments in which the first feature is "on", "above" or "on top of" the second feature "orthogonally or obliquely to the first feature, or simply means that the first feature is at a height greater than the height of the second feature; while a first feature "under", "beneath" or "at the bottom of" a second feature "may include embodiments where the first feature is" under "," beneath "or" at the bottom of "the second feature" orthogonally or obliquely, or simply meaning that the first feature is at a lower elevation than the second feature.
Various embodiments and examples are provided in the above description to implement the different structures of the present disclosure. In order to simplify the present disclosure, certain elements and arrangements are described above. However, these elements and arrangements are merely examples and are not intended to limit the present disclosure. Further, in different examples of the present disclosure, reference numerals and/or letters may be repeated. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations. In addition, the present disclosure provides examples of different processes and materials. However, those skilled in the art will appreciate that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment," "some embodiments," "an example embodiment," "an example," "a particular example," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above-identified phrases in various places throughout this specification are not necessarily all referring to the same embodiment or example of the disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in the flowcharts or otherwise described herein may be understood as comprising one or more modules, segments, or portions of code of executable instructions for implementing specific logical functions or steps of the process. Moreover, the scope of the preferred embodiments of the present disclosure includes other implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, as would be understood by those skilled in the art.
The logic and/or steps described elsewhere herein or shown in the flowcharts, for example a particular sequence of executable instructions for realizing a logical function, may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processor, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples of the computer-readable medium include, but are not limited to: an electronic connection (electronic device) with one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber device, and a portable Compact Disc Read-Only Memory (CD-ROM). Furthermore, the computer-readable medium may even be paper or another suitable medium on which the program can be printed: when necessary, the program can be obtained electronically, for example by optically scanning the paper or other medium and then compiling, interpreting, or otherwise processing it in a suitable manner, and can then be stored in a computer memory.
It should be understood that each part of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), and the like.
Those skilled in the art will appreciate that all or part of the steps in the above exemplary methods of the present disclosure may be implemented by instructing related hardware with programs. The programs may be stored in a computer-readable storage medium and, when run on a computer, perform one or a combination of the steps of the method embodiments of the present disclosure.
Furthermore, the functional units of the embodiments of the present disclosure may be integrated in one processing module, or each unit may exist physically alone, or two or more units may be integrated in one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. When implemented in the form of a software functional module and sold or used as a stand-alone product, the integrated module may be stored in a computer-readable storage medium.
The storage medium may be a read-only memory, a magnetic disk, a Compact Disc (CD), or the like.
Although embodiments of the present disclosure have been shown and described, it will be understood by those skilled in the art that these embodiments are illustrative and not to be construed as limiting the present disclosure, and that changes, modifications, substitutions, and alterations may be made in the embodiments without departing from the scope of the disclosure.
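Purely as an illustrative aid, and in no way the claimed implementation, the image-generation flow described in claim 1 — display an initially processed image, let a second operation input select one of several processing parameter sets, then apply the selected processing to the reference camera image — can be sketched as follows. All names (`ParameterSet`, `generate_image`, and so on) are hypothetical:

```python
# Illustrative sketch only; not the patented implementation.
from dataclasses import dataclass
from typing import Callable, Dict, List

Image = List[int]  # placeholder type for pixel data


@dataclass
class ParameterSet:
    """A processing parameter set defining one image processing."""
    name: str
    process: Callable[[Image], Image]


def generate_image(reference: Image,
                   parameter_sets: Dict[str, ParameterSet],
                   initial_key: str,
                   display: Callable[[Image], None],
                   second_operation_input: str) -> Image:
    # Display the initial camera image obtained by performing
    # initial image processing on the reference camera image.
    initial = parameter_sets[initial_key].process(reference)
    display(initial)

    # Select a processing parameter set according to the second
    # operation input (e.g. a tap on the touch screen).
    selected = parameter_sets[second_operation_input]

    # Generate the processed camera image from the *reference*
    # image, not from the already-processed initial image.
    return selected.process(reference)
```

The key point the sketch illustrates is that the selected processing is applied to the untouched reference camera image, so switching parameter sets never compounds earlier processing.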

Claims (20)

1. A method of generating an image, comprising:
acquiring a reference camera image by controlling a camera assembly to capture at least one main object specified by a first operation input;
displaying an initial camera image obtained by performing initial image processing on the reference camera image on a display;
selecting a processing parameter set from a plurality of processing parameter sets, the plurality of processing parameter sets specifying a plurality of image processes having different processing contents, according to a second operation input; and
generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
2. The method according to claim 1,
wherein the second operation input is an operation input made by a user after the initial camera image is displayed on the display.
3. The method according to claim 2,
wherein the display is a touch screen that accepts the second operation input of the user.
4. The method according to claim 1,
wherein the processed camera image is displayed on the display, the processed camera image being obtained by performing the image processing specified in the selected processing parameter set on the reference camera image.
5. The method according to claim 1,
wherein, in response to the second operation input, a first processing parameter set defining a first image processing or a second processing parameter set defining a second image processing, the processing content of which is different from the processing content of the first image processing, is selected from the plurality of processing parameter sets.
6. A method according to claim 3,
wherein a dial or a slider for receiving an operation input of the user for continuously or stepwise adjusting the content of the image processing applied to the reference camera image is displayed on the display.
7. The method according to claim 1,
wherein an object focused by the camera assembly in the reference camera image is set as the main object.
8. The method according to claim 1,
wherein a processing parameter set defining the image processing to be performed to acquire the initial camera image is selected based on a proportion of an image area of the main object in the reference camera image.
9. The method according to claim 5,
wherein,
when a proportion of the image area of the main object in the reference camera image is equal to or greater than a preset selection threshold, the initial camera image on which the first image processing has been performed is displayed on the display, the first image processing being defined as the initial image processing in the first processing parameter set; and
when the proportion of the image area of the main object in the reference camera image is less than the selection threshold, the initial camera image on which the second image processing has been performed is displayed on the display, the second image processing being defined as the initial image processing in the second processing parameter set.
10. An electronic device, comprising:
a camera assembly configured to acquire a camera image by capturing an object;
a display;
at least one processor; and
at least one memory including program code;
the at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform:
acquiring a reference camera image by controlling the camera assembly to capture at least one main object specified by a first operation input;
displaying an initial camera image obtained by performing initial image processing on the reference camera image on the display;
selecting a processing parameter set from a plurality of processing parameter sets, the plurality of processing parameter sets specifying a plurality of image processes having different processing contents, according to a second operation input; and
generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
11. An electronic device according to claim 10,
wherein the second operation input is an operation input made by a user after the initial camera image is displayed on the display.
12. An electronic device according to claim 11,
wherein the display is a touch screen that accepts the second operation input of the user.
13. The electronic device according to claim 10,
wherein the processor is configured to display the processed camera image on the display, the processed camera image being obtained by performing an image processing specified in the selected set of processing parameters on the reference camera image.
14. An electronic device according to claim 10,
wherein the processor is configured to select, in response to the second operation input, a first processing parameter set defining a first image processing or a second processing parameter set defining a second image processing from the plurality of processing parameter sets, the processing content of the second image processing being different from the processing content of the first image processing.
15. An electronic device according to claim 12,
wherein the processor is configured to display, on the display, a dial or a slider for receiving an operation input of the user for continuously or stepwise adjusting the content of the image processing applied to the reference camera image.
16. An electronic device according to claim 10,
wherein the processor is configured to set an object focused by the camera assembly in the reference camera image as the main object.
17. An electronic device according to claim 10,
wherein the processor is configured to select a processing parameter set defining the image processing to be performed to acquire the initial camera image based on a proportion of an image area of the main object in the reference camera image.
18. An electronic device according to claim 14,
wherein,
when the proportion of the image area of the main object in the reference camera image is equal to or greater than a preset selection threshold, the processor is configured to display, on the display, the initial camera image on which the first image processing has been performed, the first image processing being defined as the initial image processing in the first processing parameter set; and
when the proportion of the image area of the main object in the reference camera image is less than the selection threshold, the processor is configured to display, on the display, the initial camera image on which the second image processing has been performed, the second image processing being defined as the initial image processing in the second processing parameter set.
19. An apparatus, comprising:
a camera image acquisition unit configured to acquire a reference camera image by controlling a camera assembly to capture at least one main object specified by a first operation input;
a display unit configured to display an initial camera image obtained by performing initial image processing on the reference camera image on a display;
a parameter selection unit configured to select one processing parameter set from a plurality of processing parameter sets specifying a plurality of image processes having different processing contents, according to a second operation input; and
a camera image generation unit configured to generate a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
20. A computer readable medium comprising program instructions stored thereon for performing at least the following:
acquiring a reference camera image by controlling a camera assembly to capture at least one main object specified by a first operation input;
displaying an initial camera image obtained by performing initial image processing on the reference camera image on a display;
selecting a processing parameter set from a plurality of processing parameter sets, the plurality of processing parameter sets specifying a plurality of image processes having different processing contents, according to a second operation input; and
generating a processed camera image by performing the image processing specified in the selected processing parameter set on the reference camera image.
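As a non-authoritative reading of the selection rule recited in claims 8–9 and 17–18, the initial image processing could be chosen by comparing the proportion of the main object's image area in the reference camera image against a preset selection threshold. The following sketch uses hypothetical names and a hypothetical threshold value:

```python
# Illustrative sketch of the threshold-based selection; not the
# patented implementation. Names and the default threshold are
# assumptions introduced for illustration.
def select_initial_processing(main_object_area: int,
                              total_area: int,
                              selection_threshold: float = 0.5) -> str:
    """Return which parameter set defines the initial image processing."""
    proportion = main_object_area / total_area
    # Proportion >= threshold: the first image processing is used
    # (per claim 9, the processing defined in the first parameter set).
    if proportion >= selection_threshold:
        return "first_image_processing"
    # Otherwise the second image processing is used (per claim 9,
    # the processing defined in the second parameter set).
    return "second_image_processing"
```

For example, a main object covering 60% of the frame would select the first image processing, while one covering 10% would select the second.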
CN202180098120.8A 2021-06-30 2021-06-30 Method, electronic device, apparatus, and computer-readable storage medium for generating image Pending CN117337578A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/103750 WO2023272622A1 (en) 2021-06-30 2021-06-30 Method of generating an image, electronic device, apparatus, and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN117337578A true CN117337578A (en) 2024-01-02

Family

ID=84692171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180098120.8A Pending CN117337578A (en) 2021-06-30 2021-06-30 Method, electronic device, apparatus, and computer-readable storage medium for generating image

Country Status (2)

Country Link
CN (1) CN117337578A (en)
WO (1) WO2023272622A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8259208B2 (en) * 2008-04-15 2012-09-04 Sony Corporation Method and apparatus for performing touch-based adjustments within imaging devices
CN106303250A (en) * 2016-08-26 2017-01-04 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107155060A (en) * 2017-04-19 2017-09-12 北京小米移动软件有限公司 Image processing method and device
JP6946103B2 (en) * 2017-08-01 2021-10-06 キヤノン株式会社 Imaging device, light emission control device, its control method and program
CN108495043B (en) * 2018-04-28 2020-08-07 Oppo广东移动通信有限公司 Image data processing method and related device
CN110727810B (en) * 2019-10-15 2023-05-02 联想(北京)有限公司 Image processing method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2023272622A1 (en) 2023-01-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination