WO2022178782A1 - Electric device, method of controlling electric device, and computer readable storage medium - Google Patents

Info

Publication number
WO2022178782A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera image
image
camera
processor
new frame
Prior art date
Application number
PCT/CN2021/077968
Other languages
French (fr)
Inventor
Masato Miyauchi
Jun Luo
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. filed Critical Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2021/077968 priority Critical patent/WO2022178782A1/en
Priority to CN202180084542.XA priority patent/CN116686284A/en
Publication of WO2022178782A1 publication Critical patent/WO2022178782A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • H04N5/2226Determination of depth image, e.g. for foreground/background separation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images

Definitions

  • the present invention relates to an electric device, a method of controlling the electric device, and a computer readable storage medium.
  • a technique is widely used in which a photograph with bokeh (blurring of objects located in the foreground or background of a subject), like a photograph taken with a digital single lens reflex (DSLR) camera, is generated artificially from an image taken with a camera having a deep depth of field, such as the camera of a smartphone.
  • a bokeh image, in which a portion that needs attention is made clearer and the foreground and the background of that portion are blurred, is therefore produced by image processing.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electric device and a method of controlling the electric device.
  • an electric device includes:
  • an imaging module that takes a photograph of a subject, and acquires a camera image
  • a processor that controls the imaging module to acquire camera images and depth images, processes the acquired camera image based on the depth image, and outputs the processed camera image,
  • the processor controls the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
  • the processor determines whether the camera image of the new frame is good or bad.
  • the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
  • the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
  • the image processing of the camera image includes generating a bokeh image on the camera image.
  • the electric device further comprising an inertial measurement unit that detects the acceleration of the electric device,
  • the inertial measurement unit detects the acceleration of the electric device in a period from a first time when the camera image of the previous frame is captured to a second time when the camera image of the new frame is captured, and
  • the processor determines that the camera image of the new frame is good, when a maximum value of acceleration of the electric device detected by the inertial measurement unit is less than a preset first threshold value
  • the processor determines that the camera image of the new frame is bad, when the maximum value of the acceleration of the electric device detected by the inertial measurement unit is equal to or greater than the first threshold value.
  • the inertial measurement unit detects acceleration in the x-axis direction, y-axis direction, and z-axis direction of the electric device, respectively, and
  • the processor determines that the camera image of the new frame is good, when the maximum value of all accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device detected by the inertial measurement unit is less than the first threshold value,
  • the processor determines that the camera image of the new frame is bad, when the maximum value of any one of the accelerations of the electric device detected by the inertial measurement unit in the x-axis direction, the y-axis direction, and the z-axis direction is equal to or greater than the first threshold value.
  • the imaging module further includes an additional camera module that captures the subject and acquires a difference image having pixel values according to changes in the captured image over time
  • the additional camera module acquires the difference image in a period from a first time when the camera image of the previous frame is captured to a second time when the camera image of the new frame is captured
  • the processor acquires a sum of the maximum values of each pixel of the difference image acquired by the additional camera module
  • the processor determines that the camera image of the new frame is good, when a total value of the maximum values of all the pixels of the difference image is less than the preset second threshold value,
  • the processor determines that the camera image of the new frame is bad, when the total value of the maximum values of all the pixels of the difference image is equal to or greater than the second threshold value.
  • the imaging module captures the subject at the first time and acquires the camera image of the previous frame, after that, the imaging module captures the subject at the second time and acquires the camera image of the new frame,
  • the processor acquires differences, between the pixel value of the pixel of the camera image of the previous frame and the pixel value of the pixel of the camera image of the new frame, for each corresponding pixel,
  • the processor obtains a sum of the differences of all the pixels of the camera image
  • the processor determines that the camera image of the new frame is good, when a total value of the difference of all the pixels is less than a preset third threshold value,
  • the processor determines that the camera image of the new frame is bad, when the total value of the differences of all the pixels is equal to or more than the third threshold value.
  • the imaging module includes:
  • a first camera module that captures the subject and acquires a first camera image
  • a second camera module that captures the subject and acquires a second camera image
  • the processor acquires the camera image based on the first camera image and the second camera image
  • the processor calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
  • the imaging module includes a first camera module that captures the subject and acquires the camera image
  • the processor calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
  • a method for controlling an electric device including: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module to acquire camera images and depth images, processes the acquired camera image based on the depth image, and outputs the processed camera image,
  • controlling, by means of the processor, the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and determining, by means of the processor, whether the camera image of the new frame is good or bad, and
  • the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
  • the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
  • a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device including: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module to acquire camera images and depth images, processes the acquired camera image based on the depth image, and outputs the processed camera image, and
  • controlling, by means of the processor, the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and determining, by means of the processor, whether the camera image of the new frame is good or bad, and
  • the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
  • the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
  • FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention
  • FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1;
  • FIG. 3 is a diagram showing another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2;
  • FIG. 4 is a diagram showing still another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2;
  • FIG. 5 is a diagram showing an example of a flow of processing in which the electric device 100 shown in FIGS. 1 and 2 captures a subject and outputs a processed camera image;
  • FIG. 6 is a diagram showing an example of the relationship between the camera image of the acquired frame and the time
  • FIG. 7 is a diagram showing a relationship between a plurality of frames in which the processes shown in FIG. 5 are sequentially executed and determination results;
  • FIG. 8 is a diagram showing a specific example of executing the process shown in FIG. 5 for the frame (M+1) , the frame (M+2) , and the frame (M+3) shown in FIG. 7.
  • FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1.
  • the electric device 100 includes a first camera module 10, a second camera module 20, and an image signal processor 30 that controls the first camera module 10 and the second camera module 20, and processes camera image data acquired from the camera modules.
  • the imaging module 102 is configured by the first camera module 10 and the second camera module 20.
  • the imaging module 102 is defined as one that captures at least a subject 101 and acquires a camera image.
  • the imaging module 102 includes the first camera module 10 that captures the subject 101 and acquires a first camera image, and the second camera module 20, that captures the subject 101 and acquires a second camera image.
  • the depth information of the camera image is calculated from the parallax of the first camera module 10 and the second camera module 20, and the depth image corresponding to the camera image is acquired based on the calculated depth information.
  • the first camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
  • the first camera module 10 includes, for example, a focus & OIS actuator 10f that actuates the master lens 10a, and a focus & OIS driver 10e that drives the focus & OIS actuator 10f, as shown in FIG. 2.
  • the first camera module 10 acquires a first camera image of the subject 101, for example (FIG. 2).
  • the second camera module 20 includes, for example, a master lens 20a that is capable of focusing on a subject, a master image sensor 20b that detects an image inputted via the master lens 20a, and a master image sensor driver 20c that drives the master image sensor 20b, as shown in FIG. 2.
  • the second camera module 20 includes, for example, a focus & OIS actuator 20f that actuates the master lens 20a, and a focus & OIS driver 20e that drives the focus & OIS actuator 20f, as shown in FIG. 2.
  • the second camera module 20 acquires a second camera image of the subject 101, for example (FIG. 2).
  • the electric device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
  • the GNSS module 40 measures the current position of the electric device 100, for example.
  • the CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
  • the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
  • the microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
  • the display module 45 displays predefined information.
  • the display module 45 is, for example, a touch panel.
  • the input module 46 receives a user’s input (a user’s operations) .
  • the input module 46 is included in, for example, the touch panel.
  • the IMU 47 detects, for example, the angular velocity and the acceleration of the electric device 100.
  • the main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
  • the processor 103 is composed of the image signal processor 30 and the main processor 48.
  • the processor 103 is defined as a controller that controls the imaging module 102 and acquires a camera image.
  • the processor 103 controls the imaging module 102 to acquire a camera image and a depth image. Then, the processor 103 outputs the processed camera image (the camera image data including the bokeh image data) by processing the acquired camera image based on the depth image.
  • the memory 49 stores a program and data required for the image signal processor 30 to control the first camera module 10 and the second camera module 20, acquired image data, and programs and data required for the main processor 48 to control the electric device 100.
  • the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by the processor 103, the computer program implements a method for controlling the electric device 100.
  • the method comprises: controlling, by means of the processor 103, the imaging module 102 to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and determining, by means of the processor 103, whether the camera image of the new frame is good or bad, wherein the processor 103 calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor 103 determines that the camera image of the new frame is good; on the other hand, the processor 103 acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor 103 determines that the camera image of the new frame is bad.
  • the electric device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be other types of electric devices (for instance, a tablet computer and a PDA) including the imaging module 102.
  • the imaging module 102 includes the first camera module 10 and the second camera module 20.
  • the first camera module 10 captures the subject 101 to acquire a first camera image.
  • the second camera module 20 captures the subject 101 to acquire a second camera image.
  • the processor 103 acquires the camera image based on the first camera image and the second camera image. Then, the processor 103 calculates the depth information corresponding to the camera image, and the processor 103 acquires the depth image corresponding to the camera image based on the calculated depth information.
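The stereo depth acquisition described above can be sketched as follows. This is a minimal illustration of the standard pinhole disparity-to-depth relation, not the patent's actual implementation; the focal length and baseline values are hypothetical.

```python
# Hypothetical stereo parameters; the patent does not specify values.
FOCAL_LENGTH_PX = 1000.0   # focal length in pixels
BASELINE_M = 0.012         # baseline between camera modules 10 and 20, in meters

def depth_from_disparity(disparity_map):
    """Convert a disparity map (in pixels), obtained from the parallax of
    the two camera modules, into a depth image (in meters) using the
    pinhole model: depth = f * B / disparity. Zero disparity (no
    measurable parallax) is mapped to infinite depth."""
    return [[FOCAL_LENGTH_PX * BASELINE_M / d if d > 0 else float("inf")
             for d in row]
            for row in disparity_map]

depth_image = depth_from_disparity([[12.0, 6.0], [0.0, 24.0]])
# 12 px of disparity maps to 1000 * 0.012 / 12 = 1.0 m.
```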
  • the configuration of the imaging module 102 of the electric device 100 is not limited to the configurations shown in FIGS. 1 and 2.
  • FIG. 3 is a diagram showing another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2.
  • the imaging module 102 may only include a first camera module 10.
  • the processor 103 calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
  • FIG. 4 is a diagram showing still another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2.
  • the imaging module 102 may further include an additional camera module 50.
  • the additional camera module 50 shown in FIG. 4 captures the subject 101 and acquires a difference image having pixel values due to changes over time in the captured image.
  • the additional camera module 50 is, for example, an event camera whose shooting timing and FPS (frames per second) are close to those of the imaging module 102.
  • the processor 103 determines whether the camera image of the new frame is a good camera image (a valid camera image) or a bad camera image (an invalid camera image) based on the difference image acquired by the additional camera module 50.
  • the bad frame having this bad camera image includes a strong motion blurred image or a broken image.
  • the good frame with a good camera image does not include a strong motion blurred image and a broken image.
  • FIG. 5 is a diagram showing an example of a flow of processing in which the electric device 100 shown in FIGS. 1 and 2 captures a subject and outputs a processed camera image.
  • FIG. 6 is a diagram showing an example of the relationship between the camera image of the acquired frame and the time.
  • the new frame (N) is the latest frame at the time of shooting. Furthermore, the previous frame (N-1) is the frame immediately before the new frame (N) .
  • the imaging module 102 captures at least the subject 101 and acquires a camera image.
  • the processor 103 controls the imaging module 102 to acquire the camera image of a new frame (N) including an image of the subject 101.
  • in step S3 of FIG. 5, the processor 103 determines whether the acquired camera image of the new frame (N) is good or bad.
  • the bad frame having this bad camera image includes a strong motion blurred image or a broken image.
  • the good frame with a good camera image does not include a strong motion blurred image and a broken image.
  • when the processor 103 determines in step S3 of FIG. 5 that the camera image of the new frame (N) is good, the processor 103 calculates the depth information corresponding to the camera image in step S4 of FIG. 5. Furthermore, in step S5 of FIG. 5, the processor 103 acquires a depth image corresponding to the camera image based on the calculated depth information. Then, in step S6 of FIG. 5, the processor 103 processes the camera image based on the acquired depth image.
  • when the processor 103 determines in step S3 of FIG. 5 that the camera image of the new frame (N) is bad, the processor 103 does not calculate the depth information corresponding to the camera image in step S4 of FIG. 5. Then, in step S5 of FIG. 5, the processor 103 acquires a depth image corresponding to the camera image, based on the depth information corresponding to the camera image of the previous frame (N-1) before the new frame (N). Then, in step S6 of FIG. 5, the processor 103 processes the camera image based on the acquired depth image.
  • the image processing of the camera image includes generating a bokeh image on the camera image.
  • in step S7 of FIG. 5, the processor 103 outputs the processed camera image (the camera image data including the bokeh image data).
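The gating in steps S3 through S7 can be sketched as follows. The helpers `calc_depth` and `apply_bokeh` are hypothetical stand-ins for the depth calculation and bokeh generation, whose internals the patent does not define; only the skip-and-reuse control flow is taken from the description.

```python
# Hypothetical stand-ins for the patent's depth and bokeh processing.
def calc_depth(image):
    # Steps S4-S5: compute depth information and build a depth image.
    return [[1.0] * len(row) for row in image]

def apply_bokeh(image, depth):
    # Step S6: process the camera image based on the depth image.
    return [[(px, d) for px, d in zip(row_px, row_d)]
            for row_px, row_d in zip(image, depth)]

def process_frame(new_image, is_good, prev_depth):
    """One iteration of the FIG. 5 flow: if the new frame is good (S3),
    compute its depth (S4-S5); if it is bad, skip the depth calculation
    and reuse the previous frame's depth image (S5)."""
    depth = calc_depth(new_image) if is_good else prev_depth
    return apply_bokeh(new_image, depth), depth  # S6-S7: process and output

out_good, depth_good = process_frame([[1, 2]], is_good=True, prev_depth=None)
out_bad, depth_bad = process_frame([[3, 4]], is_good=False, prev_depth=depth_good)
# The bad frame reuses depth_good rather than recomputing depth.
```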
  • as a method for generating a depth image, a method using monocular, stereo, or TOF (time of flight) data is basically applied.
  • the present invention can also be applied to other depth image generation methods.
  • the electric device 100 includes the inertial measurement unit 47 that detects the acceleration of the electric device 100.
  • the inertial measurement unit 47 detects the acceleration of the electric device 100, in the period from the first time ta when the camera image of the previous frame (N-1) is captured to the second time tb when the camera image of the new frame (N) is captured.
  • the processor 103 determines whether the acquired camera image of the new frame is good or bad.
  • the processor 103 determines that the camera image of the new frame (N) is good, when the maximum value of the acceleration of the electric device 100 detected by the inertial measurement unit 47 is less than a preset first threshold value.
  • the processor 103 determines that the camera image of the new frame (N) is bad, when the maximum value of the acceleration of the electric device 100 detected by the inertial measurement unit 47 is equal to or greater than the first threshold value.
  • the processor 103 determines that the camera image of the new frame (N) is good, regardless of the maximum value of the acceleration of the electric device 100 detected by the inertial measurement unit 47, when the camera image of the new frame (N) is the camera image of the frame first acquired by capturing the subject 101.
  • the maximum value of the acceleration is an absolute value. In this determination method, attention is paid only to the amount of change in acceleration, not the direction of change in acceleration.
  • the total number of pixels of the camera image of the previous frame (N-1) is the same as the total number of pixels of the camera image of the new frame (N) .
  • the inertial measurement unit 47 may detect accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device 100, respectively.
  • the processor 103 determines that the camera image of the new frame (N) is good, when the maximum value of all accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device detected by the inertial measurement unit 47 is less than the first threshold value.
  • the processor 103 determines that the camera image of the new frame (N) is bad, when the maximum value of any one of the accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device 100 detected by the inertial measurement unit 47 is equal to or greater than the first threshold value.
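The IMU-based determination above can be sketched as follows, assuming the acceleration samples collected between the two capture times arrive as (x, y, z) tuples; the sample values and the first threshold are hypothetical, since the patent does not specify them.

```python
def frame_is_good_imu(accel_samples, first_threshold, is_first_frame=False):
    """Returns True (good) when the maximum absolute acceleration on every
    axis, sampled between the capture times of frame (N-1) and frame (N),
    is less than the first threshold; the first acquired frame is always
    treated as good. `accel_samples` is a list of (ax, ay, az) tuples."""
    if is_first_frame:
        return True
    return all(abs(a) < first_threshold
               for sample in accel_samples
               for a in sample)

samples = [(0.10, -0.20, 0.05), (0.30, 0.10, -0.15)]
frame_is_good_imu(samples, first_threshold=0.5)   # good: all |a| < 0.5
frame_is_good_imu(samples, first_threshold=0.25)  # bad: |0.30| >= 0.25
```

Using the absolute value matches the description's note that only the magnitude of the acceleration, not its direction, is considered.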
  • the imaging module 102 includes the additional camera module 50 that captures the subject 101 and acquires a difference image having pixel values corresponding to changes in the captured image over time.
  • the additional camera module 50 acquires the difference image, in the period from the first time ta when the camera image of the previous frame (N-1) is captured to the second time tb when the camera image of the new frame (N) is captured.
  • the processor 103 acquires the sum of the maximum values of each pixel of the difference image acquired by the additional camera module 50.
  • the processor 103 determines that the camera image of the new frame (N) is good, when the total value of the maximum values of all the pixels of the difference image is less than the preset second threshold value (when the fluctuation of the image is small) .
  • the processor 103 determines that the camera image of the new frame (N) is bad, when the total value of the maximum values of all the pixels of the difference image is equal to or greater than the second threshold value (when the fluctuation of the image is large) .
  • the processor 103 determines that the camera image of the new frame (N) is good, regardless of the total value of the maximum values of all the pixels of the difference image, when the camera image of the new frame (N) is the camera image of the frame first acquired by capturing the subject 101.
  • the maximum value of each pixel of the difference image is an absolute value. In this determination method, attention is paid only to the amount of change in the captured image, not the direction of change in the captured image.
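The determination based on the additional (event) camera module's difference images can be sketched as follows; the nested-list image representation, the sample values, and the second threshold are hypothetical.

```python
def frame_is_good_event(diff_images, second_threshold, is_first_frame=False):
    """Returns True (good) when the sum, over all pixels, of the maximum
    absolute difference value observed at that pixel between the two
    capture times is less than the second threshold; the first acquired
    frame is always treated as good. `diff_images` is a list of 2-D
    difference images (nested lists) from the additional camera module."""
    if is_first_frame:
        return True
    height = len(diff_images[0])
    width = len(diff_images[0][0])
    total = sum(max(abs(img[y][x]) for img in diff_images)
                for y in range(height)
                for x in range(width))
    return total < second_threshold

diffs = [[[1, -2], [0, 3]],
         [[2, 1], [-4, 0]]]
# Per-pixel maxima of |value|: 2, 2, 4, 3 -> total 11.
frame_is_good_event(diffs, second_threshold=12)  # good: small fluctuation
frame_is_good_event(diffs, second_threshold=11)  # bad: large fluctuation
```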
  • the difference between the pixel value of the pixel of the camera image of the previous frame (N-1) and the pixel value of the pixel of the camera image of the new frame (N) , may be used to determine whether the frame is good or bad.
  • the imaging module 102 acquires the camera image of the previous frame (N-1) by capturing the subject 101 at the first time ta.
  • the imaging module 102 acquires the camera image of the new frame (N) by capturing the subject 101 at the second time tb.
  • the processor 103 determines whether the acquired camera image of the new frame (N) is good or bad.
  • the processor 103 acquires the difference between the pixel value of the pixel of the camera image of the previous frame (N-1) and the pixel value of the pixel of the camera image of the new frame (N), for each corresponding pixel.
  • the processor 103 acquires the total value of the differences of all the pixels of the camera image.
  • the processor 103 determines that the camera image of the new frame (N) is good, when the total value of the difference of all the pixels is less than a preset third threshold value.
  • the processor 103 determines that the camera image of the new frame (N) is bad, when the total value of the difference of all the pixels is equal to or more than the third threshold value.
  • the processor 103 determines that the camera image of the new frame (N) is good, regardless of the total value of the differences of all the pixels, when the camera image of the new frame (N) is the camera image of the frame first acquired by capturing the subject 101.
  • the difference is an absolute value. In this determination method, attention is paid only to the amount of change in the difference, not the direction of change in the difference.
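The pixel-difference determination can be sketched as follows; the nested-list image representation, the sample pixel values, and the third threshold are hypothetical.

```python
def frame_is_good_pixel_diff(prev_image, new_image, third_threshold,
                             is_first_frame=False):
    """Returns True (good) when the sum of absolute per-pixel differences
    between the camera image of frame (N-1) and that of frame (N) is less
    than the third threshold; the first acquired frame is always treated
    as good. Both images must have the same total number of pixels."""
    if is_first_frame:
        return True
    total = sum(abs(p - q)
                for prev_row, new_row in zip(prev_image, new_image)
                for p, q in zip(prev_row, new_row))
    return total < third_threshold

prev = [[10, 20], [30, 40]]
new = [[12, 19], [30, 45]]
# |12-10| + |19-20| + |30-30| + |45-40| = 2 + 1 + 0 + 5 = 8.
frame_is_good_pixel_diff(prev, new, third_threshold=9)  # good
frame_is_good_pixel_diff(prev, new, third_threshold=8)  # bad
```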
  • FIG. 7 is a diagram showing a relationship between a plurality of frames in which the processes shown in FIG. 5 are sequentially executed and determination results.
  • FIG. 8 is a diagram showing a specific example of executing the process shown in FIG. 5 for the frame (M+1) , the frame (M+2) , and the frame (M+3) shown in FIG. 7.
  • the frame (M) , the frame (M+1) , the frame (M+2) , and the frame (M+3) indicate arbitrary continuous frames.
  • the frame (M+2) is determined to be a bad frame, and in the processing of this frame (M+2), the processing of calculating the depth is skipped. Then, the depth image of the frame (M+1) before the frame (M+2) is used as the depth image of the frame (M+2).
  • in the electric device, by skipping some processing of bad frames, a margin can be obtained for the computing power and allowable temperature of the processor (the chip).
  • the processing of the processor when a good frame is input can be executed at a higher speed.
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” and “second” may comprise one or more of this feature.
  • the term “a plurality of” means “two or more than two”, unless otherwise specified.
  • the terms “mounted” , “connected” , “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is "on" or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature "on” , “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on” , “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below” , “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below” , "under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) .
  • the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.

Abstract

An electric device according to the embodiments of the present disclosure includes: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module and acquires the camera image.

Description

ELECTRIC DEVICE, METHOD OF CONTROLLING ELECTRIC DEVICE, AND COMPUTER READABLE STORAGE MEDIUM TECHNICAL FIELD
The present invention relates to an electric device, a method of controlling the electric device, and a computer readable storage medium.
BACKGROUND
Conventionally, there are electric devices such as smartphones equipped with a digital camera that captures a subject such as a person.
At present, a technique is widely spread in which a photograph with bokeh images of objects that are located in a foreground or a background of a subject, like a photograph taken with a digital single lens reflex (DSLR) camera, is generated artificially from an image taken with a camera having a deep depth of field such as a camera of a smartphone.
If a photograph is taken by using a camera with a deep depth of field, like that of a smartphone, an image that is in focus from a short-distance portion to a long-distance portion may be obtained. A bokeh image, in which a portion that needs to get attention is made clearer and the foreground and the background of the portion is blurred, is therefore produced by image processing.
There is a problem that, in this image processing for the bokeh image, bad frames, which include for example strongly motion-blurred images or broken images, cannot yield proper results no matter how well they are processed. Processing such frames therefore only consumes energy and stores garbage data, particularly if the algorithm uses a rolling mean.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electric device and a method of controlling the electric device.
In accordance with the present disclosure, an electric device includes:
an imaging module that takes a photograph of a subject, and acquires a camera image; and
a processor that controls the imaging module to acquire camera images and depth images, and outputs a processed camera image by processing the acquired camera image based on the depth image,
wherein
the processor controls the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
the processor determines whether the camera image of the new frame is good or bad, and
wherein
the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
on the other hand, the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
In the electric device,
wherein the image processing of the camera image includes generating a bokeh image on the camera image.
In the electric device,
wherein the electric device further comprises an inertial measurement unit that detects the acceleration of the electric device,
wherein the inertial measurement unit detects the acceleration of the electric device in a period from a first time when the camera image of the previous frame is captured to a second time when the camera image of the new frame is captured, and
wherein
the processor determines that the camera image of the new frame is good, when a maximum value of acceleration of the electric device detected by the inertial measurement unit is less than a preset first threshold value,
on the other hand, the processor determines that the camera image of the new frame is bad, when the maximum value of the acceleration of the electric device detected by the inertial measurement unit is equal to or greater than the first threshold value.
In the electric device,
wherein the inertial measurement unit detects acceleration in the x-axis direction, y-axis direction, and z-axis direction of the electric device, respectively, and
wherein
the processor determines that the camera image of the new frame is good, when the maximum value of all accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device detected by the inertial measurement unit is less than the first threshold value,
on the other hand, the processor determines that the camera image of the new frame is bad, when the maximum value of any one of the accelerations of the electric device detected by the inertial measurement unit in the x-axis direction, the y-axis direction, and the z-axis direction is equal to or greater than the first threshold value.
In the electric device,
wherein the imaging module further includes an additional camera module that captures the subject and acquires a difference image having pixel values according to changes in the captured image over time,
wherein the additional camera module acquires the difference image in a period from a first time when the camera image of the previous frame is captured to a second time when the camera image of the new frame is captured,
wherein the processor acquires a sum of the maximum values of each pixel of the difference image acquired by the additional camera module, and
wherein
the processor determines that the camera image of the new frame is good, when a total value of the maximum values of all the pixels of the difference image is less than the preset second threshold value,
on the other hand, the processor determines that the camera image of the new frame is bad, when the total value of the maximum values of all the pixels of the difference image is equal to or greater than the second threshold value.
In the electric device,
wherein the imaging module captures the subject at the first time and acquires the camera image of the previous frame, after that, the imaging module captures the subject at the second time and acquires the camera image of the new frame,
wherein the processor acquires differences, between the pixel value of the pixel of the camera image of the previous frame and the pixel value of the pixel of the camera image of the new frame, for each corresponding pixel,
wherein the processor obtains a sum of the differences of all the pixels of the camera image, and
wherein
the processor determines that the camera image of the new frame is good, when a total value of the difference of all the pixels is less than a preset third threshold value,
on the other hand, the processor determines that the camera image of the new frame is bad, when the total value of the differences of all the pixels is equal to or greater than the third threshold value.
In the electric device,
wherein the imaging module includes:
a first camera module that captures the subject and acquires a first camera image, and
a second camera module that captures the subject and acquires a second camera image, and
wherein the processor acquires the camera image based on the first camera image and the second camera image, and
the processor calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
In the electric device,
wherein the imaging module includes a first camera module that captures the subject and acquires the camera image, and
wherein the processor calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
In accordance with the present disclosure, a method for controlling an electric device including: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module to acquire camera images and depth images, and outputs a processed camera image by processing the acquired camera image based on the depth image,
the method comprising:
controlling by means of the processor, the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
determining by means of the processor, whether the camera image of the new frame is good or bad, and
wherein
the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
on the other hand, the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
In accordance with the present disclosure, a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device including: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module to acquire camera images and depth images, and outputs a processed camera image by processing the acquired camera image based on the depth image, and
the method comprising:
controlling by means of the processor, the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
determining by means of the processor, whether the camera image of the new frame is good or bad, and
wherein
the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
on the other hand, the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1;
FIG. 3 is a diagram showing another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2;
FIG. 4 is a diagram showing still another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2;
FIG. 5 is a diagram showing an example of a flow of processing in which the electric device 100 shown in FIGS. 1 and 2 captures a subject and outputs a processed camera image;
FIG. 6 is a diagram showing an example of the relationship between the camera image of the acquired frame and the time;
FIG. 7 is a diagram showing a relationship between a plurality of frames in which the processes shown in FIG. 5 are sequentially executed and determination results; and
FIG. 8 is a diagram showing a specific example of executing the process shown in FIG. 5 for the frame (M+1) , the frame (M+2) , and the frame (M+3) shown in FIG. 7.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention. FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1.
As shown in FIG. 1 and FIG. 2, for example, the electric device 100 includes a first camera module 10, a second camera module 20, and an image signal processor 30 that controls the first camera module 10 and the second camera module 20, and processes camera image data acquired from the camera modules 10 and 20.
In the examples of FIGS. 1 and 2, the imaging module 102 is configured by the first camera module 10 and the second camera module 20. The imaging module 102 is defined as one that captures at least a subject 101 and acquires a camera image.
Thus, as shown in FIG. 2, the imaging module 102 includes the first camera module 10 that captures the subject 101 and acquires a first camera image, and the second camera module 20 that captures the subject 101 and acquires a second camera image.
In the example of FIG. 2, the depth information of the camera image is calculated from the parallax of the first camera module 10 and the second camera module 20, and the depth image corresponding to the camera image is acquired based on the calculated depth information.
The first camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
Furthermore, the first camera module 10 includes, for example, a focus & OIS actuator 10f that actuates the master lens 10a, and a focus & OIS driver 10e that drives the focus & OIS actuator 10f, as shown in FIG. 2.
The first camera module 10 acquires a first camera image of the subjects 101, for example (FIG. 2) .
The second camera module 20 includes, for example, a master lens 20a that is capable of focusing on a subject, a master image sensor 20b that detects an image inputted via the master lens 20a, and a master image sensor driver 20c that drives the master image sensor 20b, as shown in FIG. 2.
Furthermore, the second camera module 20 includes, for example, a focus & OIS actuator 20f that actuates the master lens 20a, and a focus & OIS driver 20e that drives the focus & OIS actuator 20f, as shown in FIG. 2.
The second camera module 20 acquires a second camera image of the subjects 101, for example (FIG. 2) .
Furthermore, as shown in FIG. 2, for example, the electric device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
The GNSS module 40 measures the current position of the electric device 100, for example.
The CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
The microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
The display module 45 displays predefined information. The display module 45 is, for example, a touch panel.
The input module 46 receives a user’s input (a user’s operations) . The input module 46 is included in, for example, the touch panel.
An IMU 47 detects, for example, the angular velocity and the acceleration of the electric device 100.
The main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
In the example of FIG. 2, the processor 103 is composed of the image signal processor 30 and the main processor 48. The processor 103 is defined as a controller that controls the imaging module 102 and acquires a camera image.
For example, the processor 103 controls the imaging module 102 to acquire a camera image and a depth image. Then, the processor 103 outputs the processed camera image (the camera image data including the bokeh image data) by processing the acquired camera image based on the depth image.
The memory 49 stores a program and data required for the image signal processor 30 to control the first camera module 10 and the second camera module 20, acquired image data, and programs and data required for the main processor 48 to control the electric device 100.
For example, the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by the processor 103, the computer program implements a method for controlling the electric device 100. For example, the method comprises: controlling by means of the processor 103, the imaging module 102 to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and determining by means of the processor 103, whether the camera image of the new frame is good or bad, and wherein the processor 103 calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good, on the other hand, the processor 103 acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor 103 determines that the camera image of the new frame is bad.
The electric device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be other types of electric devices (for instance, a tablet computer and a PDA) including the imaging module 102.
As described above, in the examples shown in FIGS. 1 and 2, the imaging module 102 includes the first camera module 10 and the second camera module 20. The first camera module 10 captures the subject 101 to acquire a first camera image. The second camera module 20 captures the subject 101 to acquire a second camera image.
In the examples shown in FIGS. 1 and 2, the processor 103 acquires the camera image based on the first camera image and the second camera image. Then, the processor 103 calculates the depth information corresponding to the camera image, and the processor 103 acquires the depth image corresponding to the camera image based on the calculated depth information.
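The depth calculation from the parallax of the two camera modules can be illustrated by the standard stereo relation depth = focal length × baseline / disparity. The following is a minimal sketch, not part of the disclosure, assuming a rectified stereo pair; the focal-length and baseline values and the function name are hypothetical illustration choices:

```python
# Minimal sketch: depth from stereo parallax (disparity).
# Assumes a rectified stereo pair; the focal length (in pixels) and
# baseline (in meters) below are hypothetical illustration values.

def depth_from_disparity(disparity_px, focal_length_px=1000.0, baseline_m=0.012):
    """Convert a disparity value (pixels) to a depth value (meters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A subject whose features shift by 40 px between the first and second
# camera images lies at focal * baseline / disparity = 0.3 m.
print(depth_from_disparity(40.0))  # 0.3
```

Applying this relation per pixel over a disparity map yields the depth image corresponding to the camera image.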
However, the configuration of the imaging module 102 of the electric device 100 is not limited to the configurations shown in FIGS. 1 and 2.
For example, FIG. 3 is a diagram showing another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2.
Thus, instead of the examples shown in FIGS. 1 and 2, for example, as shown in FIG. 3, the imaging module 102 may only include a first camera module 10.
In the example shown in FIG. 3, the processor 103 calculates the depth information corresponding to the camera image, and the processor 103 acquires the depth image corresponding to the camera image based on the calculated depth information.
Next, FIG. 4 is a diagram showing still another example of the imaging module 102 of the electric device 100 shown in FIGS. 1 and 2.
As shown in FIG. 4, the imaging module 102 may further include an additional camera module 50.
The additional camera module 50 shown in FIG. 4 captures the subject 101 and acquires a difference image having pixel values due to changes over time in the captured image.
The additional camera module 50 is, for example, an event camera whose shooting timing and FPS (frames per second) performance are close to those of the imaging module 102.
In the example shown in FIG. 4, as described later, the processor 103 determines whether the camera image of the new frame is a good camera image (a valid camera image) or a bad camera image (an invalid camera image) based on the difference image acquired by the additional camera module 50.
The bad frame having this bad camera image includes a strong motion blurred image or a broken image. On the other hand, the good frame with a good camera image does not include a strong motion blurred image and a broken image.
[Example of a method of controlling the electric device]
Next, an example of a method of controlling the electric device 100 having the above-described configuration and functions will now be described.
FIG. 5 is a diagram showing an example of a flow of processing in which the electric device 100 shown in FIGS. 1 and 2 captures a subject and outputs a processed camera image. FIG. 6 is a diagram showing an example of the relationship between the camera image of the acquired frame and the time.
Here, as shown in FIG. 6, the new frame (N) is the latest frame at the time of shooting. Furthermore, the previous frame (N-1) is the frame immediately before the new frame (N) .
In the example shown in FIG. 5, a control method for image processing the camera image of the new frame (N) shown in FIG. 6 will be described below.
Here, for example, as shown in step S1 of FIG. 5, the imaging module 102 captures at least the subject 101 and acquires a camera image.
Next, as shown in step S2 of FIG. 5, the processor 103 controls the imaging module 102 to acquire the camera image of a new frame (N) including an image of the subject 101.
Next, as shown in step S3 of FIG. 5, the processor 103 determines whether the acquired camera image of the new frame (N) is good or bad.
As described above, the bad frame having this bad camera image includes a strong motion blurred image or a broken image. On the other hand, the good frame with a good camera image does not include a strong motion blurred image and a broken image.
Next, when the processor 103 determines in step S3 of FIG. 5 that the camera image of the new frame (N) is good, the processor 103 calculates the depth information corresponding to the camera image in step S4 of FIG. 5. Furthermore, in step S5 of FIG. 5, the processor 103 acquires a depth image corresponding to the camera image based on the calculated depth information. Then, in step S6 of FIG. 5, the processor 103 processes the camera image based on the acquired depth image.
On the other hand, when the processor 103 determines in step S3 of FIG. 5 that the camera image of the new frame (N) is bad, the processor 103 does not calculate the depth information corresponding to the camera image in step S4 of FIG. 5. Then, in step S5 of FIG. 5, the processor 103 acquires a depth image corresponding to the camera image, based on the depth information corresponding to the camera image of the previous frame (N-1) before the new frame (N) . Then, in step S6 of FIG. 5, the processor 103 processes the camera image based on the acquired depth image.
The image processing of the camera image includes generating a bokeh image on the camera image.
Then, in step S7 of FIG. 5, the processor 103 outputs the processed camera image (the camera image data including the bokeh image data) .
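The flow of steps S1 to S7 described above can be sketched as follows. This is a minimal Python illustration, not part of the disclosure; frame_is_good, compute_depth, and apply_bokeh are hypothetical stand-ins for the good/bad determination, the depth calculation, and the bokeh processing:

```python
# Sketch of the per-frame flow of FIG. 5. The helper callables are
# hypothetical stand-ins for steps of the disclosure: frame_is_good()
# for step S3, compute_depth() for steps S4-S5, apply_bokeh() for S6.

def process_frame(camera_image, frame_is_good, compute_depth, apply_bokeh,
                  prev_depth):
    """Return (processed_image, depth_image) for one new frame.

    When the frame is judged bad, the expensive depth calculation is
    skipped and the previous frame's depth image is reused. The first
    frame of a capture (no previous depth) is always processed fully.
    """
    if frame_is_good(camera_image) or prev_depth is None:
        depth_image = compute_depth(camera_image)       # steps S4-S5
    else:
        depth_image = prev_depth                        # skip S4, reuse (N-1)
    processed = apply_bokeh(camera_image, depth_image)  # step S6
    return processed, depth_image                       # step S7
```

Skipping compute_depth for bad frames is what frees computing power and thermal headroom on the processor, as noted in the disclosure.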
As a method for generating a depth image, a method using monocular, stereo, or TOF (time of flight) data is basically applied. However, the present invention can also be applied to other depth image generation methods.
Here, some specific examples of the above-mentioned good/bad determination method of the frame shown in step S3 of FIG. 5 will be described below.
[First example of determination method]
As described above, in the example shown in FIG. 2, the electric device 100 includes the inertial measurement unit 47 that detects the acceleration of the electric device 100.
Therefore, in this first example of the determination method, first, the inertial measurement unit 47 detects the acceleration of the electric device 100, in the period from the first time ta when the camera image of the previous frame (N-1) was captured to the second time tb when the camera image of the new frame (N) was captured.
Next, the processor 103 determines whether the acquired camera image of the new frame is good or bad.
That is, more specifically, the processor 103 determines that the camera image of the new frame (N) is good, when the maximum value of the acceleration of the electric device 100 detected by the inertial measurement unit 47 is less than a preset first threshold value.
On the other hand, the processor 103 determines that the camera image of the new frame (N) is bad, when the maximum value of the acceleration of the electric device 100 detected by the inertial measurement unit 47 is equal to or greater than the first threshold value.
In addition, it should be noted that, when the camera image of the new frame (N) is the camera image of the frame first acquired by capturing the subject 101, the processor 103 determines that the camera image of the new frame (N) is good regardless of the maximum value of the acceleration of the electric device 100 detected by the inertial measurement unit 47.
In addition, it should be noted that the maximum value of the acceleration is an absolute value. In this determination method, attention is paid only to the amount of change in acceleration, not the direction of change in acceleration.
In addition, it should be noted that the total number of pixels of the camera image of the previous frame (N-1) is the same as the total number of pixels of the camera image of the new frame (N) .
Here, in the above determination method, more specifically, the inertial measurement unit 47 may detect accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device 100, respectively.
In this case, the processor 103 determines that the camera image of the new frame (N) is good, when the maximum value of all accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device detected by the inertial measurement unit 47 is less than the first threshold value.
On the other hand, the processor 103 determines that the camera image of the new frame (N) is bad, when the maximum value of any one of the accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device 100 detected by the inertial measurement unit 47 is equal to or greater than the first threshold value.
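This first, IMU-based determination can be sketched as follows. This is a hypothetical Python illustration; the threshold value and the per-axis sample format are assumptions, since the disclosure only requires a preset first threshold:

```python
# Sketch of the first determination method (IMU-based). The default
# first_threshold is a hypothetical illustration value; the disclosure
# only specifies a preset first threshold.

def frame_is_good_imu(accel_samples, first_threshold=15.0):
    """accel_samples: (ax, ay, az) tuples recorded between ta and tb.

    The frame is good only if the maximum absolute acceleration on every
    axis (x, y, and z) stays below the first threshold. Only the amount
    of change matters, not its direction, so absolute values are used.
    The first frame of a capture (no samples yet) is always good.
    """
    if not accel_samples:
        return True
    for axis in range(3):
        if max(abs(s[axis]) for s in accel_samples) >= first_threshold:
            return False  # strong shake on any one axis -> bad frame
    return True
```

A single axis exceeding the threshold is enough to mark the frame bad, matching the "any one of the accelerations" condition above.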
[Second example of determination method]
Alternatively, in the example shown in FIG. 4 described above, the imaging module 102 includes the additional camera module 50 that captures the subject 101 and acquires a difference image having pixel values corresponding to changes in the captured image over time.
In this second example of determination method, the additional camera module 50 acquires the difference image, in the period from the first time ta when the camera image of the previous frame (N-1) is captured to the second time tb when the camera image of the new frame (N) is captured.
Then, the processor 103 acquires the sum of the maximum values of each pixel of the difference image acquired by the additional camera module 50.
The processor 103 determines that the camera image of the new frame (N) is good, when the total value of the maximum values of all the pixels of the difference image is less than the preset second threshold value (when the fluctuation of the image is small) .
On the other hand, the processor 103 determines that the camera image of the new frame (N) is bad, when the total value of the maximum values of all the pixels of the difference image is equal to or greater than the second threshold value (when the fluctuation of the image is large) .
In addition, it should be noted that the processor 103 determines that the camera image of the new frame (N) is good, regardless of the total value of the maximum values of all the pixels of the difference image, when the camera image of the new frame (N) is the camera image of the frame first acquired by capturing the subject 101.
In addition, it should be noted that the maximum value of each pixel of the difference image is an absolute value. In this determination method, attention is paid only to the amount of change in the captured image, not the direction of change in the captured image.
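The second determination method can likewise be sketched as follows. The function name, the representation of the difference image as a 2-D list of per-pixel change magnitudes, and the threshold value are all assumptions for illustration; the actual additional camera module and its output format are as described above.

```python
# Hypothetical sketch of the second determination method: sum the per-pixel
# change magnitudes of the difference image acquired between frame (N-1) and
# frame (N), and compare the total against a second threshold.
SECOND_THRESHOLD = 1000  # illustrative value, not taken from the disclosure

def is_new_frame_good_by_diff_image(diff_image, is_first_frame=False,
                                    threshold=SECOND_THRESHOLD):
    """diff_image: 2-D list of per-pixel change magnitudes (absolute values)."""
    if is_first_frame:
        # The first frame acquired by capturing the subject is always good.
        return True
    # Absolute values: only the amount of change matters, not its direction.
    total = sum(abs(pixel) for row in diff_image for pixel in row)
    return total < threshold  # small total fluctuation -> good frame
```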
[Third example of determination method]
Alternatively, the difference, between the pixel value of the pixel of the camera image of the previous frame (N-1) and the pixel value of the pixel of the camera image of the new frame (N) , may be used to determine whether the frame is good or bad.
In this third example of determination method, first, the imaging module 102 acquires the camera image of the previous frame (N-1) by capturing the subject 101 at the first time ta.
After that, the imaging module 102 acquires the camera image of the new frame (N) by capturing the subject 101 at the second time tb.
Next, the processor 103 determines whether the acquired camera image of the new frame (N) is good or bad.
More specifically, the processor 103 calculates the difference between the pixel value of each pixel of the camera image of the previous frame (N-1) and the pixel value of the corresponding pixel of the camera image of the new frame (N) , for each corresponding pixel.
Next, the processor 103 acquires the total value of the differences of all the pixels of the camera image.
Next, the processor 103 determines that the camera image of the new frame (N) is good, when the total value of the difference of all the pixels is less than a preset third threshold value.
On the other hand, the processor 103 determines that the camera image of the new frame (N) is bad, when the total value of the difference of all the pixels is equal to or more than the third threshold value.
In addition, it should be noted that the processor 103 determines that the camera image of the new frame (N) is good, regardless of the total value of the differences of all the pixels, when the camera image of the new frame (N) is the camera image of the frame first acquired by capturing the subject 101.
In addition, it should be noted that the difference is an absolute value. In this determination method, attention is paid only to the amount of change in the difference, not the direction of change in the difference.
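The third determination method can be sketched as follows. The function name and the third threshold value are illustrative assumptions; the images are represented here as 2-D lists of pixel values, and, consistent with the note above, the previous and new camera images are assumed to have the same total number of pixels.

```python
# Hypothetical sketch of the third determination method: sum the absolute
# per-pixel differences between the camera image of frame (N-1) and the camera
# image of frame (N), and compare the total against a third threshold.
THIRD_THRESHOLD = 50  # illustrative value, not taken from the disclosure

def is_new_frame_good_by_frame_diff(prev_image, new_image, is_first_frame=False,
                                    threshold=THIRD_THRESHOLD):
    """prev_image, new_image: 2-D lists of pixel values of the same dimensions."""
    if is_first_frame or prev_image is None:
        # The first frame acquired by capturing the subject is always good.
        return True
    # Absolute differences: only the amount of change matters, not its direction.
    total = sum(
        abs(p_new - p_prev)
        for row_prev, row_new in zip(prev_image, new_image)
        for p_prev, p_new in zip(row_prev, row_new)
    )
    return total < threshold  # small total difference -> good frame
```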
Next, an example of the relationship between the determination results of a plurality of frames, determined by the above-mentioned determination methods, and the image processing of those frames will be described below.
FIG. 7 is a diagram showing a relationship between a plurality of frames in which the processes shown in FIG. 5 are sequentially executed and determination results. FIG. 8 is a diagram showing a specific example of executing the process shown in FIG. 5 for the frame (M+1) , the frame (M+2) , and the frame (M+3) shown in FIG. 7.
In FIGS. 7 and 8, the frame (M) , the frame (M+1) , the frame (M+2) , and the frame (M+3) indicate arbitrary continuous frames.
As shown in FIGS. 7 and 8, the frame (M+2) is determined to be a bad frame, and in the processing of this frame (M+2) , the processing of calculating the depth is skipped. Then, the depth image of the frame (M+1) before the frame (M+2) is used as the depth image of the frame (M+2) .
In this way, for a bad new frame, the depth image calculation is skipped and the depth image of the previous frame is reused. However, the input camera image of the bad new frame is still used as it is. Therefore, the frame dropping does not occur.
As described above, according to the electric device of the present disclosure, by skipping some processing of bad frames, a margin can be obtained in the computing power and allowable temperature of the processor (the chip) . As a result, the processing of the processor when a good frame is input can be executed at a higher speed.
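The per-frame flow described above, in which the depth calculation is skipped for bad frames and the previous depth image is reused while the input image is still processed, can be sketched as follows. All function names and the quality test are hypothetical stand-ins injected as parameters; the real determination would be one of the three methods described above, and the real depth calculation and bokeh generation are as disclosed elsewhere in this specification.

```python
# Hypothetical sketch of the frame-processing loop of FIGS. 7 and 8: good
# frames get a freshly calculated depth image; bad frames reuse the previous
# frame's depth image but their input camera image is still processed, so no
# frame is dropped.
def process_frames(frames, is_good, compute_depth, apply_bokeh):
    """frames: camera images in capture order.
    is_good(prev_image, new_image, is_first_frame) -> bool (one of the
    determination methods); compute_depth and apply_bokeh are stand-ins."""
    outputs = []
    prev_depth = None
    prev_image = None
    for index, image in enumerate(frames):
        if is_good(prev_image, image, index == 0) or prev_depth is None:
            depth = compute_depth(image)   # full depth calculation for good frames
        else:
            depth = prev_depth             # bad frame: reuse previous depth image
        # The input camera image is always used, so frame dropping does not occur.
        outputs.append(apply_bokeh(image, depth))
        prev_depth, prev_image = depth, image
    return outputs
```

Note that the expensive `compute_depth` call is the only step skipped for bad frames; every input image still yields an output image.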
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "back" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be constructed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two” , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable  instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by the hardware, software, firmware or their combination. In the above embodiments, a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system. For example, if it is realized by the hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
Those skilled in the art shall understand that all or parts of the steps in the above exemplifying method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs comprise one or a combination of the steps in the method embodiments of the present disclosure when run on a computer.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (10)

  1. An electric device comprising:
    an imaging module that takes a photograph of a subject, and acquires a camera image; and
    a processor that controls the imaging module to acquire camera images and depth images, processes the acquired camera image based on the depth image, and outputs the processed camera image,
    wherein
    the processor controls the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
    the processor determines whether the camera image of the new frame is good or bad, and
    wherein
    the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
    on the other hand, the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
  2. The electric device according to claim 1, wherein the image processing of the camera image includes generating a bokeh image on the camera image.
  3. The electric device according to claim 1, further comprising an inertial measurement unit that detects the acceleration of the electric device,
    wherein the inertial measurement unit detects the acceleration of the electric device in a period from a first time when the camera image of the previous frame is captured to a second time when the camera image of the new frame is captured, and
    wherein
    the processor determines that the camera image of the new frame is good, when a maximum value of acceleration of the electric device detected by the inertial measurement unit is less than a preset first threshold value,
    on the other hand, the processor determines that the camera image of the new frame is bad, when the maximum value of the acceleration of the electric device detected by the inertial measurement unit is equal to or greater than the first threshold value.
  4. The electric device according to claim 3,
    wherein the inertial measurement unit detects acceleration in the x-axis direction, y-axis direction, and z-axis direction of the electric device, respectively, and
    wherein
    the processor determines that the camera image of the new frame is good, when the maximum value of all accelerations in the x-axis direction, the y-axis direction, and the z-axis direction of the electric device detected by the inertial measurement unit is less than the first threshold value,
    on the other hand, the processor determines that the camera image of the new frame is bad, when the maximum value of any one of the accelerations of the electric device detected by the inertial measurement unit in the x-axis direction, the y-axis direction, and the z-axis direction is equal to or greater than the first threshold value.
  5. The electric device according to claim 1,
    wherein the imaging module further includes an additional camera module that captures the subject and acquires a difference image having pixel values according to changes in the captured image over time,
    wherein the additional camera module acquires the difference image in a period from a first time when the camera image of the previous frame is captured to a second time when the camera image of the new frame is captured,
    wherein the processor acquires a sum of the maximum values of each pixel of the difference image acquired by the additional camera module, and
    wherein
    the processor determines that the camera image of the new frame is good, when a total value of the maximum values of all the pixels of the difference image is less than a preset second threshold value,
    on the other hand, the processor determines that the camera image of the new frame is bad, when the total value of the maximum values of all the pixels of the difference image is equal to or greater than the second threshold value.
  6. The electric device according to claim 1,
    wherein the imaging module captures the subject at the first time and acquires the camera image of the previous frame, after that, the imaging module captures the subject at the second time and acquires the camera image of the new frame,
    wherein the processor acquires differences, between the pixel value of the pixel of the camera image of the previous frame and the pixel value of the pixel of the camera image of the new frame, for each corresponding pixel,
    wherein the processor obtains a sum of the differences of all the pixels of the camera image, and
    wherein
    the processor determines that the camera image of the new frame is good, when a total value of the difference of all the pixels is less than a preset third threshold value,
    on the other hand, the processor determines that the camera image of the new frame is bad, when the total value of the differences of all the pixels is equal to or more than the third threshold value.
  7. The electric device according to claim 1,
    wherein the imaging module includes:
    a first camera module that captures the subject and acquires a first camera image, and
    a second camera module that captures the subject and acquires a second camera image, and
    wherein the processor acquires the camera image based on the first camera image and the second camera image, and
    the processor calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
  8. The electric device according to claim 1,
    wherein the imaging module includes a first camera module that captures the subject and acquires the camera image, and
    wherein the processor calculates the depth information corresponding to the camera image, and acquires the depth image corresponding to the camera image based on the calculated depth information.
  9. A method for controlling an electric device including: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module to acquire camera images and depth images, processes the acquired camera image based on the depth image, and outputs the processed camera image,
    the method comprising:
    controlling by means of the processor, the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
    determining by means of the processor, whether the camera image of the new frame is good or bad, and
    wherein
    the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
    on the other hand, the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
  10. A computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device including: an imaging module that takes a photograph of a subject, and acquires a camera image; and a processor that controls the imaging module to acquire camera images and depth images, processes the acquired camera image based on the depth image, and outputs the processed camera image, and
    the method comprising:
    controlling by means of the processor, the imaging module to acquire a camera image of a new frame containing an image of the subject, after acquiring a camera image of the previous frame, and
    determining by means of the processor, whether the camera image of the new frame is good or bad, and
    wherein
    the processor calculates depth information corresponding to the camera image of the new frame, and acquires a depth image corresponding to the camera image based on the calculated depth information, when the processor determines that the camera image of the new frame is good,
    on the other hand, the processor acquires a depth image for the camera image of the new frame based on a depth image corresponding to the camera image of the previous frame, without calculating the depth information corresponding to the camera image of the new frame, when the processor determines that the camera image of the new frame is bad.
PCT/CN2021/077968 2021-02-25 2021-02-25 Electric device, method of controlling electric device, and computer readable storage medium WO2022178782A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/077968 WO2022178782A1 (en) 2021-02-25 2021-02-25 Electric device, method of controlling electric device, and computer readable storage medium
CN202180084542.XA CN116686284A (en) 2021-02-25 2021-02-25 Electronic device, method of controlling electronic device, and computer-readable storage medium


Publications (1)

Publication Number Publication Date
WO2022178782A1 true WO2022178782A1 (en) 2022-09-01

Family

ID=83047690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/077968 WO2022178782A1 (en) 2021-02-25 2021-02-25 Electric device, method of controlling electric device, and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN116686284A (en)
WO (1) WO2022178782A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102907105A (en) * 2010-06-09 2013-01-30 柯达公司 Video camera providing videos with perceived depth
US20160182887A1 (en) * 2014-12-23 2016-06-23 Mems Drive, Inc. Three dimensional imaging with a single camera
CN107071379A (en) * 2015-11-02 2017-08-18 联发科技股份有限公司 The enhanced method of display delay and mancarried device
US20200322544A1 (en) * 2019-04-04 2020-10-08 Samsung Electronics Co., Ltd. Electronic device, method, and computer-readable medium for providing bokeh effect in video
US20210042949A1 (en) * 2019-08-07 2021-02-11 Canon Kabushiki Kaisha Depth information generating apparatus, image capturing apparatus, depth information generating method, image processing apparatus, and image processing method


Also Published As

Publication number Publication date
CN116686284A (en) 2023-09-01

Similar Documents

Publication Publication Date Title
TWI808987B (en) Apparatus and method of five dimensional (5d) video stabilization with camera and gyroscope fusion
CN113454982B (en) Electronic device for stabilizing image and method of operating the same
US10171791B2 (en) Methods and apparatus for conditional display of a stereoscopic image pair
US20160191788A1 (en) Image processing apparatus and image pickup apparatus
CN102986233B (en) Image imaging device
US20120236164A1 (en) Imaging device and method of obtaining image
KR20210130773A (en) Image processing method and head mounted display device
KR20160090373A (en) Photographing method for dual-camera device and dual-camera device
US8965105B2 (en) Image processing device and method
US20120249543A1 (en) Display Control Apparatus and Method, and Program
US20140327744A1 (en) Image processing apparatus, method thereof, and non-transitory computer readable storage medium
US20140333724A1 (en) Imaging device, imaging method and program storage medium
KR20190014959A (en) Electronic device for playing movie based on movment information and operating mehtod thereof
JPWO2014141653A1 (en) Image generating apparatus, imaging apparatus, and image generating method
WO2018219274A1 (en) Method and apparatus for denoising processing, storage medium and terminal
WO2022178782A1 (en) Electric device, method of controlling electric device, and computer readable storage medium
CN115937291B (en) Binocular image generation method and device, electronic equipment and storage medium
WO2022000266A1 (en) Method for creating depth map for stereo moving image and electronic device
WO2022178781A1 (en) Electric device, method of controlling electric device, and computer readable storage medium
CN114910052A (en) Camera-based distance measurement method, control method and device and electronic equipment
CN114125298A (en) Video generation method and device, electronic equipment and computer readable storage medium
WO2022246752A1 (en) Method of generating an image, electronic device, apparatus, and computer readable storage medium
WO2024055290A1 (en) Method of detecting flicker area in captured image, electronic device and computer-readable storage medium
US20120188343A1 (en) Imaging apparatus
WO2022241728A1 (en) Image processing method, electronic device and non–transitory computer–readable media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21927220; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 202180084542.X; Country of ref document: CN)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21927220; Country of ref document: EP; Kind code of ref document: A1)