WO2022246752A1 - Method of generating an image, electronic device, apparatus, and computer readable storage medium - Google Patents


Info

Publication number
WO2022246752A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera image
recording
preview
image
processor
Prior art date
Application number
PCT/CN2021/096482
Other languages
French (fr)
Inventor
Masato Miyauchi
Jun Luo
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. filed Critical Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to CN202180098016.9A priority Critical patent/CN117378208A/en
Priority to PCT/CN2021/096482 priority patent/WO2022246752A1/en
Publication of WO2022246752A1 publication Critical patent/WO2022246752A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/665 Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure

Definitions

  • the present disclosure relates to a method of generating image data, an electronic device implementing such a method, a computer readable medium including program instructions stored thereon for performing such a method, and an apparatus.
  • the user can check the latest frame (the preview camera image) of the moving image in real time when shooting the moving image.
  • the recording camera image is generated using the processing result of the preview camera image. That is, the process of generating these camera images is shared.
  • the image quality of the recording camera image of the captured moving image is the same as the image quality of the preview camera image of the moving image.
  • the processing for generating the preview camera image prioritizes the processing time over the image quality, so that the recording camera image does not have sufficient image quality.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electronic device and a method of controlling the electronic device.
  • a method of generating an image including:
  • an electronic device includes:
  • an imaging module configured to capture a camera image
  • At least one memory including program code
  • the at least one memory and the program code configured to, with the at least one processor, cause the electronic device to perform:
  • the processor executes the preview process and the recording process separately.
  • an apparatus including:
  • a camera image acquiring unit configured to acquire a camera image, to generate a moving image, by controlling an imaging module
  • a preview camera image acquiring unit configured to acquire a preview camera image based on the camera image, by a preview process
  • a recording camera image acquiring unit configured to acquire a recording camera image based on the camera image, by a recording process
  • a computer readable medium comprising program instructions stored thereon for performing at least the following:
  • FIG. 1 is a diagram illustrating an example of an arrangement of an electronic device 100 and a subject 101 according to an embodiment of the present invention
  • FIG. 2 is a diagram illustrating an example of the configuration of the electronic device 100 shown in FIG. 1;
  • FIG. 3 is a diagram showing another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2;
  • FIG. 4 is a diagram showing still another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2;
  • FIG. 5 is a diagram showing a block diagram of the processor according to the embodiment of the present disclosure.
  • FIG. 6 is a diagram showing a first example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
  • FIG. 7 is a diagram showing a second example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
  • FIG. 1 is a diagram illustrating an example of an arrangement of an electronic device 100 and a subject 101 according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of the configuration of the electronic device 100 shown in FIG. 1.
  • the electronic device 100 includes a first camera module 10, a second camera module 20, and an image signal processor 30 that controls the first camera module 10 and the second camera module 20, and processes camera image data acquired from the camera modules 10 and 20.
  • the imaging module 102 is configured by the first camera module 10 and the second camera module 20.
  • the imaging module 102 is defined as one that captures at least a subject 101 and acquires a camera image.
  • the imaging module 102 includes the first camera module 10 that captures the subject 101 and acquires a first camera image, and the second camera module 20, that captures the subject 101 and acquires a second camera image.
  • the depth information of the camera image is calculated from the parallax of the first camera module 10 and the second camera module 20, and the depth image corresponding to the camera image is acquired based on the calculated depth information.
  • the first camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
  • the first camera module 10 includes, for example, a focus &OIS actuator 10f that actuates the master lens 10a, and a focus &OIS driver 10e that drives the focus &OIS actuator 10f, as shown in FIG. 2.
  • the first camera module 10 acquires a first camera image of the subjects 101, for example (FIG. 2) .
  • the second camera module 20 includes, for example, a master lens 20a that is capable of focusing on a subject, a master image sensor 20b that detects an image inputted via the master lens 20a, and a master image sensor driver 20c that drives the master image sensor 20b, as shown in FIG. 2.
  • the second camera module 20 includes, for example, a focus &OIS actuator 20f that actuates the master lens 20a, and a focus &OIS driver 20e that drives the focus &OIS actuator 20f, as shown in FIG. 2.
  • the second camera module 20 acquires a second camera image of the subjects 101, for example (FIG. 2) .
  • the electronic device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
  • the GNSS module 40 measures the current position of the electronic device 100, for example.
  • the CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
  • the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
  • the microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
  • the display module 45 displays predefined information.
  • the display module 45 is, for example, a touch panel.
  • the input module 46 receives a user’s input (a user’s operations) .
  • the input module 46 is included in, for example, the touch panel.
  • An IMU 47 detects, for example, the angular velocity and the acceleration of the electronic device 100.
  • the main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
  • the processor 103 is composed of the image signal processor 30 and the main processor 48.
  • the processor 103 is defined as a controller that controls the imaging module 102 and acquires a camera image.
  • the processor 103 controls the imaging module 102 to sequentially acquire camera images and depth images corresponding to a plurality of consecutive frames. Then, the processor 103 outputs the processed camera image (camera image data) by performing image processing on the acquired camera image based on the depth image.
  • the memory 49 stores a program and data required for the image signal processor 30 to control the first camera module 10 and the second camera module 20, acquired image data, and programs and data required for the main processor 48 to control the electronic device 100.
  • the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by the processor 103, the computer program implements a method for controlling the electronic device 100.
  • the images, such as the captured images and generated images, exist in the form of image data in the electronic device 100 (for example, the data of the images are stored in the memory 49). Furthermore, the processor 103 processes the images as image data.
  • the memory 49 includes program code.
  • the memory 49 and the program code configured to, with the processor 103, cause the electronic device 100 to perform: acquiring a camera image, to generate a moving image, by controlling an imaging module 102; acquiring a preview camera image based on the camera image, by a preview process; and acquiring a recording camera image based on the camera image, by a recording process, wherein the preview process and the recording process are executed separately by a processor 103.
  • the electronic device 100 communicates with the external calculator (the server) 200 via the wireless communication module 41.
  • the processor 103 of the electronic device 100 causes the external calculator 200 to perform calculation processing of the predetermined data (information) , by transmitting predetermined data (information) to the external calculator 200 via the wireless communication module 41.
  • the processor 103 of the electronic device 100 receives the data (information) calculated by the external calculator 200 from the calculator 200 via the wireless communication module 41.
  • the electronic device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be other types of electronic devices (for instance, a tablet computer and a PDA) including the imaging module 102.
  • the imaging module 102 includes the first camera module 10 and the second camera module 20.
  • the first camera module 10 captures the subject 101 to acquire a first camera image.
  • the second camera module 20 captures the subject 101 to acquire a second camera image.
  • the processor 103 acquires the camera image based on the first camera image and the second camera image. Then, the processor 103 calculates the depth information corresponding to the camera image, and the processor 103 acquires the depth image corresponding to the camera image based on the calculated depth information.
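The parallax-based depth calculation described above can be sketched as stereo triangulation. The following is a minimal illustration under assumed camera intrinsics, not the patented implementation; the function and parameter names (`focal_length_px`, `baseline_m`, `disparity_px`) are hypothetical.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate per-pixel depth from stereo disparity.

    depth = f * B / d, where f is the focal length in pixels, B is the
    baseline between the two camera modules in metres, and d is the
    per-pixel disparity (parallax) in pixels.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    # Suppress the divide-by-zero warning for pixels with no match,
    # and report them as infinitely far away.
    with np.errstate(divide="ignore"):
        depth = np.where(disparity > 0,
                         focal_length_px * baseline_m / disparity,
                         np.inf)
    return depth

# A 2x2 disparity map: a larger disparity means the point is closer.
disp = np.array([[10.0, 20.0], [40.0, 0.0]])
depth = depth_from_disparity(disp, focal_length_px=1000.0, baseline_m=0.02)
# depth is [[2.0, 1.0], [0.5, inf]] metres for these assumed intrinsics.
```

A real pipeline would first compute the disparity map by matching the first and second camera images; only the final disparity-to-depth step is shown here.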
  • the configuration of the imaging module 102 of the electronic device 100 is not limited to the configurations shown in FIGS. 1 and 2.
  • FIG. 3 is a diagram showing another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2.
  • the imaging module 102 may only include a first camera module 10.
  • the processor 103 calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
  • FIG. 4 is a diagram showing still another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2.
  • the imaging module 102 may further include an additional camera module 50.
  • the additional camera module 50 shown in FIG. 4 captures the subject 101 and acquires a difference image (the event camera image) whose pixel values correspond to changes over time in the captured camera image.
  • the additional camera module 50 is, for example, an event camera whose shooting timing and FPS (Frames Per Second) are close to those of the imaging module 102.
  • FIG. 5 illustrates a block diagram of the processor according to the embodiment of the present disclosure.
  • the processor 103 shown in FIG. 5 may be regarded as an apparatus.
  • the apparatus comprises: a camera image acquiring unit 103a configured to acquire a camera image, to generate a moving image, by controlling an imaging module 102; a preview camera image acquiring unit 103b configured to acquire a preview camera image based on the camera image, by a preview process; and a recording camera image acquiring unit 103c configured to acquire a recording camera image based on the camera image, by a recording process, wherein the preview process and the recording process are executed separately.
  • the electronic device 100 may be represented as an apparatus including the processor 103.
  • FIG. 6 is a diagram showing a first example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
  • the processor 103 captures a subject by controlling the imaging module 102 (S1) . Then, the processor 103 acquires a camera image in order to generate a moving image (S2) .
  • the processor 103 acquires a preview camera image for previewing at the time of shooting the moving image based on the camera image by the preview process PA.
  • the processor 103 acquires a recording camera image for recording based on the camera image by the recording process PB.
  • the processor 103 separately executes the preview process PA and the recording process PB.
  • the processor 103 executes the preview process PA and the recording process PB in parallel (simultaneously) .
  • the processor 103 acquires the preview depth value based on the camera image in the preview process PA (the preview depth calculation process S3A) .
  • the processor 103 acquires a preview depth image based on the preview depth value (the preview depth image acquisition process S4A) .
  • the processor 103 generates a preview camera image by executing image processing on the camera image based on the preview depth image (the preview image processing process S5A) .
  • the processor 103 executes the preview output process S6A that outputs the preview camera image in the preview process PA.
  • the processor 103 causes the display module to display the preview camera image.
  • the processor 103 acquires a recording depth value based on the camera image in the recording process PB (the recording depth calculation process S3B) .
  • the processor 103 acquires a recording depth image based on the recording depth value (the recording depth image acquisition process S4B) .
  • the processor 103 generates a recording camera image by executing image processing on the camera image based on the recording depth image (the recording image processing process S5B) .
  • the processor 103 executes the recording output process S6B for outputting the recording camera image.
  • the processor 103 stores the recording camera image in the memory 49.
  • the responsiveness of the process of generating the preview camera image can be prioritized, and the heavy process that prioritizes the image quality can be applied to the generation of the recording camera image, because the processor 103 executes the preview process PA and the recording process PB separately and in parallel.
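The separate, parallel execution of the preview process PA and the recording process PB can be sketched as follows. This is a minimal illustration only: the two function bodies are placeholders standing in for the fast path (S3A to S6A) and the heavy path (S3B to S6B) of FIG. 6, not the patented processing itself.

```python
from concurrent.futures import ThreadPoolExecutor

def preview_process(camera_image):
    # Lightweight path: prioritises latency over quality
    # (stands in for steps S3A-S6A in FIG. 6).
    return {"frame": camera_image, "quality": "fast"}

def recording_process(camera_image):
    # Heavy path: prioritises quality over latency
    # (stands in for steps S3B-S6B in FIG. 6).
    return {"frame": camera_image, "quality": "high"}

camera_image = "frame-0"  # placeholder for real image data

# Execute the two processes separately and in parallel, so the
# preview result is never blocked behind the heavy recording work.
with ThreadPoolExecutor(max_workers=2) as pool:
    preview_future = pool.submit(preview_process, camera_image)
    recording_future = pool.submit(recording_process, camera_image)
    preview = preview_future.result()      # shown on the display module
    recording = recording_future.result()  # stored in the memory
```

Because the two futures are independent, a slow recording pipeline delays only the stored result, not the on-screen preview, which is the behaviour the paragraph above describes.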
  • the first amount of camera-image data on which the processor 103 executes the preview depth calculation is set to be smaller than the second amount of camera-image data on which the processor 103 executes the recording depth calculation.
  • the first calculation amount for the camera image on which the processor 103 executes the preview depth calculation is set lower than the second calculation amount for the camera image on which the processor 103 executes the recording depth calculation.
  • the processor 103 can execute the process of generating the preview camera image faster.
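One common way to make the preview data amount smaller than the recording one, as described above, is to downsample the frame before the preview depth calculation. The sketch below uses a naive striding decimation and hypothetical frame dimensions; the patent does not specify this particular reduction method.

```python
import numpy as np

def downsample(image, factor):
    """Reduce the amount of image data by simple striding (every
    factor-th pixel in each dimension)."""
    return image[::factor, ::factor]

camera_image = np.zeros((1080, 1920))  # placeholder full-resolution frame

# Preview depth is computed on a reduced frame, so the first data
# amount is smaller and the preview path finishes faster.
preview_input = downsample(camera_image, factor=4)

# Recording depth uses the full-resolution frame for maximum quality.
recording_input = camera_image
```

With a factor of 4 the preview path touches 1/16 of the pixels, which is the kind of trade of data amount for responsiveness the paragraph above describes.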
  • the processor 103 may cause an external calculator 200 shown in FIG. 2 to generate a recording camera image based on the camera image, in the recording process PB.
  • the external calculator 200 acquires the recording depth value based on the camera image (the recording depth calculation process S3B) .
  • the external calculator 200 acquires a recording depth image based on the recording depth value (recording depth image acquisition process S4B) .
  • the external calculator 200 generates a recording camera image by performing image processing on the camera image based on the recording depth image (the recording image processing process S5B).
  • the processor 103 acquires the recording camera image generated by the external calculator 200.
  • the wireless communication module 41 shown in FIG. 2 transmits the camera image to the external calculator 200, and the wireless communication module 41 receives the recording camera image generated by the external calculator 200, in the recording process PB.
  • the amount of data processed by the processor 103 to generate the recording camera image can be reduced. Therefore, it is possible to improve the responsiveness of the process of generating the preview camera image, and it is possible to apply a heavy process that prioritizes the image quality to the generation of the recording camera image.
  • the processing speed of the external calculator 200 shown in FIG. 2 is set to be faster than the processing speed of the processor 103.
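The offloading described above can be sketched with an in-process stand-in for the wireless link: the two queues below play the role of wireless communication module 41, and the external calculator's processing is a purely illustrative stub, not the actual recording pipeline.

```python
import queue
import threading

# Simulated wireless link between the device and the external
# calculator (server) 200.
uplink = queue.Queue()    # device -> external calculator
downlink = queue.Queue()  # external calculator -> device

def external_calculator():
    """Server side: runs the heavy recording process (S3B-S5B) remotely."""
    camera_image = uplink.get()
    recording_image = f"high-quality({camera_image})"  # processing stub
    downlink.put(recording_image)

server = threading.Thread(target=external_calculator)
server.start()

# Device side: transmit the camera image, then receive the finished
# recording camera image while keeping only light work local.
uplink.put("frame-0")
recording_camera_image = downlink.get()
server.join()
```

In a real system the queues would be replaced by network transfers, and the benefit depends on the server being faster than the on-device processor, as the paragraph above assumes.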
  • the present invention is not limited to this example, and the preview camera image and the recording camera image may be generated based on the two camera images (the main camera image and sub camera image) .
  • FIG. 7 is a diagram showing a second example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
  • the main camera image and the sub camera image are acquired by the first camera module 10 and the second camera module 20 of the imaging module 102. Then, the processor 103 generates a preview camera image and a recording camera image based on these two camera images.
  • the processor 103 controls the imaging module 102 to image a subject in order to generate a moving image (S1R, S1L) .
  • the processor 103 acquires the main camera image and the sub camera image (S2R, S2L) .
  • the processor 103 acquires a preview depth value based on the main camera image and the sub camera image (the preview depth calculation process S3A) .
  • the processor 103 acquires the preview depth image based on the preview depth value (the preview depth image acquisition process S4A) .
  • the processor 103 generates a preview camera image by executing image processing on the main camera image based on the preview depth image (the preview image processing process S5A) .
  • the processor 103 executes the preview output process S6A that outputs the preview camera image in the preview process PA.
  • the processor 103 causes the display module to display the preview camera image.
  • the processor 103 acquires a recording depth value based on the main camera image and the sub camera image (the recording depth calculation process S3B) .
  • the processor 103 acquires a recording depth image based on the recording depth value (recording depth image acquisition process S4B) .
  • the processor 103 generates a recording camera image by executing image processing on the main camera image based on the recording depth image (the recording image processing process S5B) .
  • the processor 103 executes the recording output process S6B for outputting the recording camera image.
  • the processor 103 stores the recording camera image in the memory 49.
  • the responsiveness of the process of generating the preview camera image can be prioritized, and the heavy process that prioritizes the image quality can be applied to the generation of the recording camera image, because the processor 103 executes the preview process PA and the recording process PB separately and in parallel.
  • the electronic device comprises: an imaging module configured to capture a camera image; at least one processor; and at least one memory including program code.
  • the at least one memory and the program code configured to, with the at least one processor, cause the electronic device to perform: acquiring a camera image, to generate a moving image, by controlling the imaging module; acquiring a preview camera image based on the camera image, by a preview process; and acquiring a recording camera image based on the camera image, by a recording process.
  • the processor executes the preview process and the recording process separately.
  • the responsiveness of the process of generating the preview camera image can be prioritized, and the heavy process that prioritizes the image quality can be applied to the generation of the recording camera image, because the processor executes the preview process and the recording process separately and in parallel.
  • “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” and “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more”, unless otherwise specified.
  • the terms “mounted” , “connected” , “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electronic connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is "on" or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature "on” , “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on” , “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below” , “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below” , "under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • the computer readable medium includes, but is not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device, and a portable compact disk read-only memory (CDROM).
  • the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, and then the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may exist as separate physical entities, or two or more cells may be integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A method of generating an image includes: acquiring a camera image, to generate a moving image, by controlling an imaging module; acquiring a preview camera image based on the camera image, by a preview process; and acquiring a recording camera image based on the camera image, by a recording process, wherein the preview process and the recording process are executed separately by a processor.

Description

METHOD OF GENERATING AN IMAGE, ELECTRONIC DEVICE, APPARATUS, AND COMPUTER READABLE STORAGE MEDIUM TECHNICAL FIELD
The present disclosure relates to a method of generating image data, an electronic device implementing such a method, a computer readable medium including program instructions stored thereon for performing such a method, and an apparatus.
BACKGROUND
Conventionally, there are electronic devices such as smartphones equipped with a digital camera that captures a subject such as a person.
With such conventional electronic devices, the user can check the latest frame (the preview camera image) of the moving image in real time when shooting the moving image. However, in this case, the recording camera image is generated using the processing result of the preview camera image. That is, the process of generating these camera images is shared.
Therefore, the image quality of the recording camera image of the captured moving image is the same as the image quality of the preview camera image of the moving image.
In such a conventional technique, the processing for generating the preview camera image prioritizes the processing time over the image quality, so that the recording camera image does not have sufficient image quality.
On the other hand, in such a conventional technique, since the processing time of the preview camera image is limited, it is not possible to select heavy processing as the processing of the recording camera image.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electronic device and a method of controlling the electronic device.
In accordance with the present disclosure, a method of generating an image includes:
acquiring a camera image, to generate a moving image, by controlling an imaging module;
acquiring a preview camera image based on the camera image, by a preview process; and
acquiring a recording camera image based on the camera image, by a recording process,
wherein the preview process and the recording process are executed separately by a processor.
In accordance with the present disclosure, an electronic device includes:
an imaging module configured to capture a camera image;
at least one processor; and
at least one memory including program code;
the at least one memory and the program code configured to, with the at least one processor, cause the electronic device to perform:
acquiring a camera image, to generate a moving image, by controlling the imaging module;
acquiring a preview camera image based on the camera image, by a preview process; and
acquiring a recording camera image based on the camera image, by a recording process,
wherein the processor executes the preview process and the recording process separately.
In accordance with the present disclosure, an apparatus includes:
a camera image acquiring unit configured to acquire a camera image, to generate a moving image, by controlling an imaging module;
a preview camera image acquiring unit configured to acquire a preview camera image based on the camera image, by a preview process; and
a recording camera image acquiring unit configured to acquire a recording camera image based on the camera image, by a recording process,
wherein the preview process and the recording process are executed separately.
In accordance with the present disclosure, a computer readable medium comprising program instructions stored thereon for performing at least the following:
acquiring a camera image, to generate a moving image, by controlling an imaging module;
acquiring a preview camera image based on the camera image, by a preview process; and
acquiring a recording camera image based on the camera image, by a recording process,
wherein the preview process and the recording process are executed separately by a processor.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 is a diagram illustrating an example of an arrangement of an electronic device 100 and a subject 101 according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating an example of the configuration of the electronic device 100 shown in FIG. 1;
FIG. 3 is a diagram showing another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2;
FIG. 4 is a diagram showing still another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2;
FIG. 5 is a diagram showing a block diagram of the processor according to the embodiment of the present disclosure;
FIG. 6 is a diagram showing a first example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention; and
FIG. 7 is a diagram showing a second example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
FIG. 1 is a diagram illustrating an example of an arrangement of an electronic device 100 and a subject 101 according to an embodiment of the present invention. FIG. 2 is a diagram illustrating an example of the configuration of the electronic device 100 shown in FIG. 1.
As shown in FIG. 1 and FIG. 2, for example, the electronic device 100 includes a first camera module 10, a second camera module 20, and an image signal processor 30 that controls the first camera module 10 and the second camera module 20, and processes camera image data acquired from the camera modules 10 and 20.
In the examples of FIGS. 1 and 2, the imaging module 102 is configured by the first camera module 10 and the second camera module 20. The imaging module 102 is defined as one that captures at least a subject 101 and acquires a camera image.
Thus, as shown in FIG. 2, the imaging module 102 includes the first camera module 10 that captures the subject 101 and acquires a first camera image, and the second camera module 20, that captures the subject 101 and acquires a second camera image.
In the example of FIG. 2, the depth information of the camera image is calculated from the parallax of the first camera module 10 and the second camera module 20, and the depth image corresponding to the camera image is acquired based on the calculated depth information.
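The depth-from-parallax relationship described above can be sketched in a few lines. The focal length and baseline values below are hypothetical illustrations, not parameters taken from this disclosure.

```python
# Illustrative sketch: converting stereo parallax (disparity) to depth.
# The focal length (pixels) and baseline (meters) are hypothetical values.
def disparity_to_depth(disparity_px, focal_length_px=1000.0, baseline_m=0.012):
    """Depth (in meters) from disparity: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity corresponds to a point at infinity
    return focal_length_px * baseline_m / disparity_px

# A point seen with 20 px of disparity lies at 1000 * 0.012 / 20 = 0.6 m.
print(disparity_to_depth(20))
```

Larger disparity between the first and second camera images thus corresponds to a closer point, which is the basis of the depth image acquired by the processor 103.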
The first camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
Furthermore, the first camera module 10 includes, for example, a focus &OIS actuator 10f that actuates the master lens 10a, and a focus &OIS driver 10e that drives the focus &OIS actuator 10f, as shown in FIG. 2.
The first camera module 10 acquires a first camera image of the subject 101, for example (FIG. 2) .
The second camera module 20 includes, for example, a master lens 20a that is capable of focusing on a subject, a master image sensor 20b that detects an image inputted via the master lens 20a, and a master image sensor driver 20c that drives the master image sensor 20b, as shown in FIG. 2.
Furthermore, the second camera module 20 includes, for example, a focus &OIS actuator 20f that actuates the master lens 20a, and a focus &OIS driver 20e that drives the focus &OIS actuator 20f, as shown in FIG. 2.
The second camera module 20 acquires a second camera image of the subject 101, for example (FIG. 2) .
Furthermore, as shown in FIG. 2, for example, the electronic device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
The GNSS module 40 measures the current position of the electronic device 100, for example.
The CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
The microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
The display module 45 displays predefined information. The display module 45 is, for example, a touch panel.
The input module 46 receives a user’s input (a user’s operations) . The input module 46 is included in, for example, the touch panel.
An IMU 47 detects, for example, the angular velocity and the acceleration of the electronic device 100.
The main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
In the example of FIG. 2, the processor 103 is composed of the image signal processor 30 and the main processor 48. The processor 103 is defined as a controller that controls the imaging module 102 and acquires a camera image.
For example, the processor 103 controls the imaging module 102 to sequentially acquire camera images and depth images corresponding to a plurality of consecutive frames. Then, the processor 103 outputs the processed camera image (camera image data) by performing image processing on the acquired camera image based on the depth image.
The memory 49 stores a program and data required for the image signal processor 30 to control the first camera module 10 and the second camera module 20, acquired image data, and programs and data required for the main processor 48 to control the electronic device 100.
For example, the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by the processor 103, the computer program implements a method for controlling the electronic device 100.
Note that the images, such as the captured images and the generated images, exist in the form of image data in the electronic device 100 (for example, the data of the images are stored in the memory 49) . Furthermore, the processor 103 processes the images as image data.
In particular, for example, the memory 49 includes program code. The memory 49 and the program code are configured to, with the processor 103, cause the electronic device 100 to perform: acquiring a camera image, to generate a moving image, by controlling the imaging module 102; acquiring a preview camera image based on the camera image, by a preview process; and acquiring a recording camera image based on the camera image, by a recording process, wherein the preview process and the recording process are executed separately by the processor 103.
Here, for example, as shown in FIG. 2, the electronic device 100 communicates with the external calculator (the server) 200 via the wireless communication module 41.
For example, the processor 103 of the electronic device 100 causes the external calculator 200 to perform calculation processing of the predetermined data (information) , by transmitting predetermined data (information) to the external calculator 200 via the wireless communication module 41.
Then, the processor 103 of the electronic device 100 receives the data (information) calculated by the external calculator 200 from the calculator 200 via the wireless communication module 41.
The electronic device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be other types of electronic devices (for instance, a tablet computer and a PDA) including the imaging module 102.
As described above, in the examples shown in FIGS. 1 and 2, the imaging module 102 includes the first camera module 10 and the second camera module 20. The first camera module 10 captures the subject 101 to acquire a first camera image. The second camera module 20 captures the subject 101 to acquire a second camera image.
In the examples shown in FIGS. 1 and 2, the processor 103 acquires the camera image based on the first camera image and the second camera image. Then, the processor 103 calculates the depth information corresponding to the camera image, and the processor 103 acquires the depth image corresponding to the camera image based on the calculated depth information.
However, the configuration of the imaging module 102 of the electronic device 100 is not limited to the configurations shown in FIGS. 1 and 2.
For example, FIG. 3 is a diagram showing another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2.
Thus, instead of the examples shown in FIGS. 1 and 2, for example, as shown in FIG. 3, the imaging module 102 may only include a first camera module 10.
In the example shown in FIG. 3, the processor 103 calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
Next, FIG. 4 is a diagram showing still another example of the imaging module 102 of the electronic device 100 shown in FIGS. 1 and 2.
As shown in FIG. 4, the imaging module 102 may further include an additional camera module 50.
The additional camera module 50 shown in FIG. 4 captures the subject 101 and acquires a difference image (the event camera image) whose pixel values represent changes over time in the captured camera image.
The additional camera module 50 is, for example, an event camera whose shooting timing and FPS (Frames Per Second) are close to those of the imaging module 102.
Here, FIG. 5 illustrates a block diagram of the processor according to the embodiment of the present disclosure.
As shown in FIG. 5, the processor 103 may be implemented as an apparatus. The apparatus comprises: a camera image acquiring unit 103a configured to acquire a camera image, to generate a moving image, by controlling an imaging module 102; a preview camera image acquiring unit 103b configured to acquire a preview camera image based on the camera image, by a preview process; and a recording camera image acquiring unit 103c configured to acquire a recording camera image based on the camera image, by a recording process, wherein the preview process and the recording process are executed separately. In this case, the electronic device 100 may be represented as an apparatus including the processor 103.
Next, an example of a method of generating image data by the electronic device 100 having the above-described configuration and functions will now be described.
Hereinafter, the processing by the processor 103 of the electronic device 100 will be mainly described. In the example shown in FIG. 6 below, an example of generating a preview camera image and a recording camera image based on one camera image will be described.
Here, FIG. 6 is a diagram showing a first example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
As shown in FIG. 6, the processor 103 captures a subject by controlling the imaging module 102 (S1) . Then, the processor 103 acquires a camera image in order to generate a moving image (S2) .
Then, the processor 103 acquires a preview camera image for previewing at the time of shooting the moving image based on the camera image by the preview process PA.
Furthermore, the processor 103 acquires a recording camera image for recording based on the camera image by the recording process PB.
In the example shown in FIG. 6, the processor 103 separately executes the preview process PA and the recording process PB.
In particular, the processor 103 executes the preview process PA and the recording process PB in parallel (simultaneously) .
More specifically, the processor 103 acquires the preview depth value based on the camera image in the preview process PA (the preview depth calculation process S3A) .
Then, the processor 103 acquires a preview depth image based on the preview depth value (the preview depth image acquisition process S4A) .
Then, the processor 103 generates a preview camera image by executing image processing on the camera image based on the preview depth image (the preview image processing process S5A) .
Then, the processor 103 executes the preview output process S6A that outputs the preview camera image in the preview process PA.
For example, the processor 103 causes the display module to display the preview camera image.
On the other hand, the processor 103 acquires a recording depth value based on the camera image in the recording process PB (the recording depth calculation process S3B) .
Then, the processor 103 acquires a recording depth image based on the recording depth value (the recording depth image acquisition process S4B) .
Then, the processor 103 generates a recording camera image by executing image processing on the camera image based on the recording depth image (the recording image processing process S5B) .
Then, in the recording process PB, the processor 103 executes the recording output process S6B for outputting the recording camera image.
For example, the processor 103 stores the recording camera image in the memory 49.
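The two pipelines S3A–S6A and S3B–S6B can be sketched as two independently executed processes. All names and the toy arithmetic below are hypothetical stand-ins for the actual depth calculation and image processing, used only to show the separate, parallel structure of FIG. 6.

```python
# Minimal sketch of executing the preview process PA and the recording
# process PB separately and in parallel on one camera image.
import threading

def preview_process(camera_image, out):
    depth_value = sum(camera_image) / len(camera_image)    # S3A: preview depth value
    depth_image = [depth_value] * len(camera_image)        # S4A: preview depth image
    # S5A: light-weight, responsiveness-first processing guided by the depth image
    out["preview"] = [px + round(d) for px, d in zip(camera_image, depth_image)]

def recording_process(camera_image, out):
    depth_value = sum(camera_image) / len(camera_image)    # S3B: recording depth value
    depth_image = [depth_value] * len(camera_image)        # S4B: recording depth image
    # S5B: heavier, quality-first processing guided by the depth image
    out["recording"] = [px * 2 + round(d) for px, d in zip(camera_image, depth_image)]

camera_image = [1, 2, 3, 4]    # S1/S2: the captured camera image (toy data)
out = {}
threads = [threading.Thread(target=preview_process, args=(camera_image, out)),
           threading.Thread(target=recording_process, args=(camera_image, out))]
for t in threads:
    t.start()
for t in threads:
    t.join()
# S6A/S6B: the preview image would go to the display module,
# the recording image to the memory.
```

Both images derive from the same camera image, but through independent pipelines, so neither process constrains the other's processing time.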
In this way, since the processor 103 executes the preview process PA and the recording process PB separately and in parallel, the responsiveness of the process of generating the preview camera image can be prioritized, and heavy processing that prioritizes image quality can be applied to the generation of the recording camera image.
Here, for example, the first amount of data of the camera image from which the processor 103 calculates the preview depth value is set to be smaller than the second amount of data of the camera image from which the processor 103 calculates the recording depth value.
In particular, the first calculation amount for the camera image on which the processor 103 executes the preview depth calculation is set lower than the second calculation amount for the camera image on which the processor 103 executes the recording depth calculation.
Therefore, the processor 103 can execute the process of generating the preview camera image faster.
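One way the first amount of data can be made smaller than the second is to downsample the camera image before the preview depth calculation. The stride factor below is an illustrative assumption, not a value specified in the disclosure.

```python
# Sketch: the preview depth calculation can operate on a downsampled copy
# of the camera image, so the first data amount is smaller than the second.
def downsample(image_rows, stride=4):
    """Keep every `stride`-th row and column of a 2-D image (list of lists)."""
    return [row[::stride] for row in image_rows[::stride]]

full = [[0] * 16 for _ in range(16)]          # data used for the recording depth
preview_input = downsample(full, stride=4)    # smaller data for the preview depth
# 16 x 16 = 256 values for the recording depth vs 4 x 4 = 16 for the preview depth.
```

Reducing the input this way shortens the preview depth calculation while the recording process keeps the full-resolution data for quality.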
For example, the processor 103 may cause an external calculator 200 shown in FIG. 2 to generate a recording camera image based on the camera image, in the recording process PB.
More specifically, in the recording process PB, the external calculator 200 acquires the recording depth value based on the camera image (the recording depth calculation process S3B) .
Then, the external calculator 200 acquires a recording depth image based on the recording depth value (recording depth image acquisition process S4B) .
Then, the external calculator 200 generates a recording camera image by performing image processing on the camera image based on the recording depth image (the recording image processing process S5B) .
Then, the processor 103 acquires the recording camera image generated by the external calculator 200.
In this case, the wireless communication module 41 shown in FIG. 2 transmits the camera image to the external calculator 200, and the wireless communication module 41 receives the recording camera image generated by the external calculator 200, in the recording process PB.
As a result, the amount of data processed by the processor 103 to generate the recording camera image can be reduced. Therefore, it is possible to improve the responsiveness of the process of generating the preview camera image, and it is possible to apply a heavy process that prioritizes the image quality to the generation of the recording camera image.
For example, the processing speed of the external calculator 200 shown in FIG. 2 is set to be faster than the processing speed of the processor 103.
As a result, in the external calculator 200, heavy processing that prioritizes image quality can be applied to the generation of the recording camera image.
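The offloaded recording process can be sketched with the transport reduced to plain function calls; the serialization and the wireless transmission via the communication module are abstracted away, and all names below are hypothetical.

```python
# Sketch of offloading the recording process PB to an external calculator,
# with "transmit" and "receive" reduced to function calls for clarity.
def external_calculator(camera_image):
    """Stands in for the external calculator 200: heavy, quality-first processing."""
    depth = sum(camera_image) / len(camera_image)    # S3B/S4B on the server side
    return [px + depth for px in camera_image]       # S5B on the server side

def recording_process_offloaded(camera_image):
    payload = list(camera_image)                  # "transmit" the camera image
    recording_image = external_calculator(payload)
    return recording_image                        # "receive" the recording image

print(recording_process_offloaded([1, 2, 3]))    # → [3.0, 4.0, 5.0]
```

The device itself only sends the camera image and receives the finished recording camera image, which is what keeps its own workload small.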
[Modification example]
In the example shown in FIG. 6 described above, an example of generating a preview camera image and a recording camera image based on one camera image has been described. However, the present disclosure is not limited to this example, and the preview camera image and the recording camera image may be generated based on two camera images (the main camera image and the sub camera image) .
Therefore, an example of generating a preview camera image and a recording camera image based on two camera images (the main camera image and sub camera image) will be described below.
Here, FIG. 7 is a diagram showing a second example of a flow for acquiring a preview camera image and a recording camera image, at the time of moving image shooting according to the embodiment of the present invention.
In the example of FIG. 7, the main camera image and the sub camera image are acquired by the first camera module 10 and the second camera module 20 of the imaging module 102. Then, the processor 103 generates a preview camera image and a recording camera image based on these two camera images.
More specifically, first, as shown in FIG. 7, the processor 103 controls the imaging module 102 to image a subject in order to generate a moving image (S1R, S1L) . As a result, the processor 103 acquires the main camera image and the sub camera image (S2R, S2L) .
Then, in the preview process PA, the processor 103 acquires a preview depth value based on the main camera image and the sub camera image (the preview depth calculation process S3A) .
Then, the processor 103 acquires the preview depth image based on the preview depth value (the preview depth image acquisition process S4A) .
Then, the processor 103 generates a preview camera image by executing image processing on the main camera image based on the preview depth image (the preview image processing process S5A) .
Then, the processor 103 executes the preview output process S6A that outputs the preview camera image in the preview process PA.
For example, the processor 103 causes the display module to display the preview camera image.
On the other hand, in the recording process PB, the processor 103 acquires a recording depth value based on the main camera image and the sub camera image (the recording depth calculation process S3B) .
Then, the processor 103 acquires a recording depth image based on the recording depth value (recording depth image acquisition process S4B) .
Then, the processor 103 generates a recording camera image by executing image processing on the main camera image based on the recording depth image (the recording image processing process S5B) .
Then, in the recording process PB, the processor 103 executes the recording output process S6B for outputting the recording camera image.
For example, the processor 103 stores the recording camera image in the memory 49.
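The depth calculation from a main and a sub camera image can be illustrated with a toy disparity search: find the horizontal shift that best aligns the two images. Real implementations use per-pixel block matching over 2-D windows; this single-row version is only a sketch.

```python
# Toy sketch of the FIG. 7 variant: recover the disparity between a main
# and a sub camera image row, from which a depth value can be derived.
def best_disparity(main_row, sub_row, max_shift=4):
    """Return the shift (in pixels) minimizing the absolute difference."""
    def cost(shift):
        pairs = zip(main_row[shift:], sub_row[:len(sub_row) - shift])
        return sum(abs(a - b) for a, b in pairs)
    return min(range(max_shift + 1), key=cost)

main_row = [0, 0, 5, 9, 5, 0, 0, 0]
sub_row  = [0, 5, 9, 5, 0, 0, 0, 0]       # same scene shifted by one pixel
print(best_disparity(main_row, sub_row))  # → 1
```

The preview process would run such a search on reduced data for speed, while the recording process can afford a denser, higher-quality search.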
In this way, since the processor 103 executes the preview process PA and the recording process PB separately and in parallel, the responsiveness of the process of generating the preview camera image can be prioritized, and heavy processing that prioritizes image quality can be applied to the generation of the recording camera image.
As described above, the electronic device according to the present disclosure comprises: an imaging module configured to capture a camera image; at least one processor; and at least one memory including program code. The at least one memory and the program code are configured to, with the at least one processor, cause the electronic device to perform: acquiring a camera image, to generate a moving image, by controlling the imaging module; acquiring a preview camera image based on the camera image, by a preview process; and acquiring a recording camera image based on the camera image, by a recording process. The processor executes the preview process and the recording process separately.
As a result, since the processor executes the preview process and the recording process separately and in parallel, the responsiveness of the process of generating the preview camera image can be prioritized, and heavy processing that prioritizes image quality can be applied to the generation of the recording camera image.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "back" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be constructed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two” , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electronic connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment  in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by the hardware, software, firmware or their combination. In the above embodiments, a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system. For example, if it is realized by the hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
Those skilled in the art shall understand that all or parts of the steps in the above exemplifying method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs comprise one or a combination of the steps in the method embodiments of the present disclosure when run on a computer.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (20)

  1. A method of generating an image, comprising:
    acquiring a camera image, to generate a moving image, by controlling an imaging module;
    acquiring a preview camera image based on the camera image, by a preview process; and
    acquiring a recording camera image based on the camera image, by a recording process,
    wherein the preview process and the recording process are executed separately by a processor.
  2. The method according to claim 1,
    wherein the processor executes the preview process and the recording process in parallel.
  3. The method according to claim 1,
    wherein, in the recording process, the processor generates the recording camera image based on the camera image, using an external computer.
  4. The method according to claim 3,
    wherein, in the recording process, the camera image is transmitted to the external computer by a communication module, and the recording camera image generated by the external computer is received by the communication module.
  5. The method according to claim 1,
    wherein, in the preview process,
    the processor acquires a preview depth value based on the camera image,
    the processor acquires a preview depth image based on the preview depth value, and
    the processor generates a preview camera image by executing image processing on the camera image based on the preview depth image, and
    wherein, in the recording process,
    the processor acquires a recording depth value based on the camera image,
    the processor acquires a recording depth image based on the recording depth value, and
    the processor generates a recording camera image by executing image processing on the camera image based on the recording depth image.
  6. The method according to claim 5,
    wherein a first amount of data of the camera image on which the processor calculates the preview depth value is set to be smaller than a second amount of data of the camera image on which the processor calculates the recording depth value.
  7. The method according to claim 5,
    wherein a first calculation amount for the camera image on which the processor calculates the preview depth value is set lower than a second calculation amount for the camera image on which the processor calculates the recording depth value.
  8. The method according to claim 3,
    wherein, in the preview process,
    the processor acquires a preview depth value based on the camera image,
    the processor acquires a preview depth image based on the preview depth value, and
    the processor generates a preview camera image by executing image processing on the camera image based on the preview depth image, and
    wherein, in the recording process,
    the external computer acquires a recording depth value based on the camera image,
    the external computer acquires a recording depth image based on the recording depth value, and
    the external computer generates a recording camera image by performing image processing on the camera image based on the recording depth image.
  9. The method according to claim 3,
    wherein a processing speed of the external computer is faster than a processing speed of the processor.
  10. An electronic device comprising:
    an imaging module configured to capture a camera image;
    at least one processor; and
    at least one memory including program code;
    the at least one memory and the program code configured to, with the at least one processor, cause the electronic device to perform:
    acquiring a camera image, to generate a moving image, by controlling the imaging module;
    acquiring a preview camera image based on the camera image, by a preview process; and
    acquiring a recording camera image based on the camera image, by a recording process,
    wherein the processor executes the preview process and the recording process separately.
  11. The electronic device according to claim 10,
    wherein the processor executes the preview process and the recording process in parallel.
  12. The electronic device according to claim 10,
    wherein, in the recording process, the processor generates the recording camera image based on the camera image, using an external computer.
  13. The electronic device according to claim 12, further comprising a communication module configured to transmit the camera image to the external computer, and to receive the recording camera image generated by the external computer, in the recording process.
  14. The electronic device according to claim 10,
    wherein, in the preview process,
    the processor acquires a preview depth value based on the camera image,
    the processor acquires a preview depth image based on the preview depth value, and
    the processor generates a preview camera image by executing image processing on the camera image based on the preview depth image, and
    wherein, in the recording process,
    the processor acquires a recording depth value based on the camera image,
    the processor acquires a recording depth image based on the recording depth value, and
    the processor generates a recording camera image by executing image processing on the camera image based on the recording depth image.
  15. The electronic device according to claim 14,
    wherein a first amount of data of the camera image on which the processor calculates the preview depth value is set to be smaller than a second amount of data of the camera image on which the processor calculates the recording depth value.
  16. The electronic device according to claim 14,
    wherein a first calculation amount for the camera image on which the processor calculates the preview depth value is set lower than a second calculation amount for the camera image on which the processor calculates the recording depth value.
  17. The electronic device according to claim 12,
    wherein, in the preview process,
    the processor acquires a preview depth value based on the camera image,
    the processor acquires a preview depth image based on the preview depth value, and
    the processor generates a preview camera image by executing image processing on the camera image based on the preview depth image, and
    wherein, in the recording process,
    the external computer acquires a recording depth value based on the camera image,
    the external computer acquires a recording depth image based on the recording depth value, and
    the external computer generates a recording camera image by performing image processing on the camera image based on the recording depth image.
  18. The electronic device according to claim 12,
    wherein a processing speed of the external computer is faster than a processing speed of the processor.
  19. An apparatus comprising:
    a camera image acquiring unit configured to acquire a camera image, to generate a moving image, by controlling an imaging module;
    a preview camera image acquiring unit configured to acquire a preview camera image based on the camera image, by a preview process; and
    a recording camera image acquiring unit configured to acquire a recording camera image based on the camera image, by a recording process,
    wherein the preview process and the recording process are executed separately.
  20. A computer readable medium comprising program instructions stored thereon for performing at least the following:
    acquiring a camera image, to generate a moving image, by controlling an imaging module;
    acquiring a preview camera image based on the camera image, by a preview process; and
    acquiring a recording camera image based on the camera image, by a recording process,
    wherein the preview process and the recording process are executed separately by a processor.
PCT/CN2021/096482 2021-05-27 2021-05-27 Method of generating an image, electronic device, apparatus, and computer readable storage medium WO2022246752A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180098016.9A CN117378208A (en) 2021-05-27 2021-05-27 Method, electronic device, apparatus, and computer-readable storage medium for generating image
PCT/CN2021/096482 WO2022246752A1 (en) 2021-05-27 2021-05-27 Method of generating an image, electronic device, apparatus, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/096482 WO2022246752A1 (en) 2021-05-27 2021-05-27 Method of generating an image, electronic device, apparatus, and computer readable storage medium

Publications (1)

Publication Number Publication Date
WO2022246752A1 true WO2022246752A1 (en) 2022-12-01

Family

ID=84229421

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/096482 WO2022246752A1 (en) 2021-05-27 2021-05-27 Method of generating an image, electronic device, apparatus, and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN117378208A (en)
WO (1) WO2022246752A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105827935A (en) * 2015-07-23 2016-08-03 维沃移动通信有限公司 Terminal screenshot method and terminal
CN108184052A (en) * 2017-12-27 2018-06-19 努比亚技术有限公司 A kind of method of video record, mobile terminal and computer readable storage medium
CN108900790A (en) * 2018-06-26 2018-11-27 努比亚技术有限公司 Method of video image processing, mobile terminal and computer readable storage medium
US20210136297A1 (en) * 2019-11-01 2021-05-06 Samsung Electronics Co., Ltd. Method for providing preview and electronic device for displaying preview


Also Published As

Publication number Publication date
CN117378208A (en) 2024-01-09

Similar Documents

Publication Publication Date Title
CN109842753B (en) Camera anti-shake system, camera anti-shake method, electronic device and storage medium
CN110012224B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
KR102348504B1 (en) Method for reducing parallax of a plurality of cameras and electronic device supporting the same
US10740431B2 (en) Apparatus and method of five dimensional (5D) video stabilization with camera and gyroscope fusion
US7801428B2 (en) Shot image display system, image receiving device, control method for image receiving device, and server
CN109756671B (en) Electronic device for recording images using multiple cameras and method of operating the same
KR20190032061A (en) Electronic device performing image correction and operation method of thereof
US20120236164A1 (en) Imaging device and method of obtaining image
WO2017113937A1 (en) Mobile terminal and noise reduction method
US11081044B2 (en) Encoding method and apparatus, display apparatus, medium and signal transmission system
JP5932045B2 (en) Method and apparatus for conditional display of stereoscopic image pairs
CN105827961A (en) Mobile terminal and focusing method
WO2018076941A1 (en) Method and device for implementing panoramic photographing
CN105163035A (en) Mobile terminal shooting system and mobile terminal shooting method
CN107071277B (en) Optical drawing shooting device and method and mobile terminal
US20150288949A1 (en) Image generating apparatus, imaging apparatus, and image generating method
CN104221149A (en) Imaging element and imaging device
US10554891B2 (en) Image stabilization apparatus, image stabilization method, image capturing apparatus, image capturing system and non-transitory storage medium
CN116368814A (en) Spatial alignment transformation without FOV loss
JP2012194487A (en) Imaging device, imaging method and program
US11159725B2 (en) Image processing apparatus, image processing method, and recording medium
WO2022246752A1 (en) Method of generating an image, electronic device, apparatus, and computer readable storage medium
CN105210362A (en) Image adjustment device, image adjustment method, image adjustment program, and image capture device
WO2022178782A1 (en) Electric device, method of controlling electric device, and computer readable storage medium
WO2024055290A1 (en) Method of detecting flicker area in captured image, electronic device and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21942329

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180098016.9

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21942329

Country of ref document: EP

Kind code of ref document: A1