WO2022183443A1 - Method of suggesting shooting position for electronic device and electronic device - Google Patents

Method of suggesting shooting position for electronic device and electronic device Download PDF

Info

Publication number
WO2022183443A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
primary object
returns
score
electronic device
Prior art date
Application number
PCT/CN2021/079099
Other languages
French (fr)
Inventor
Teruchika MIURA
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. filed Critical Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2021/079099 priority Critical patent/WO2022183443A1/en
Priority to CN202180095208.4A priority patent/CN116998159A/en
Publication of WO2022183443A1 publication Critical patent/WO2022183443A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Definitions

  • the present disclosure relates to a method of suggesting a shooting position for an electronic device, and an electronic device implementing such method.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide a method of suggesting a shooting position for an electronic device and an electronic device implementing such method.
  • a method of suggesting a shooting position for an electronic device, the electronic device having a camera and a display for displaying an image captured by the camera in real time, the method may include:
  • the guide image being determined based on a score indicating the result of evaluating the composition.
  • an electronic device may include: a processor and a memory for storing instructions, wherein the instructions, when executed by the processor, cause the processor to perform the method according to the present disclosure.
  • a computer-readable storage medium on which a computer program is stored, wherein the computer program is executed by a computer to implement the method according to the present disclosure, may be provided.
  • FIG. 1 is a circuit diagram illustrating an example of a configuration of an electronic device according to an embodiment of the present disclosure
  • FIG. 2 shows a situation where the electronic device is capturing an image of a plurality of objects placed on a plane P;
  • FIG. 3 is a flowchart illustrating a method of suggesting a shooting position for the electronic device according to an embodiment of the present disclosure
  • FIG. 4 shows an example of an RGB image (left drawing) and a depth image (right drawing) captured by the electronic device
  • FIG. 5 is a flowchart illustrating a method of detecting the objects in the depth image in detail
  • FIG. 6 is a flowchart illustrating a method of determining guide direction for moving the electronic device, calculating a score of the composition and displaying a guide image in detail;
  • FIG. 7 shows the guide direction toward the primary object along the Z axis
  • FIG. 8 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
  • FIG. 9 shows the guide direction in the X-Y plane
  • FIG. 10 is a diagram for explaining a method of determining whether the primary object is located at an appropriate position in the X-Y plane;
  • FIG. 11 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
  • FIG. 12 shows the guide direction of rotation around the primary object along the Y axis
  • FIG. 13 is a diagram for explaining angles θ a and θ t in the X-Z plane for determining whether the primary object and the secondary object are properly arranged in the Z axis direction;
  • FIG. 14 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
  • FIG. 15 shows the guide direction of rotation around the primary object along the X axis
  • FIG. 16 is a diagram for explaining a method of determining whether a depression angle of the camera is appropriate
  • FIG. 17 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
  • FIG. 18 shows a guide image displayed over an image captured by the camera.
  • FIG. 1 is a circuit diagram illustrating an example of a configuration of the electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 is a mobile device such as a smartphone in this embodiment, but may be other types of electronic device equipped with one or more camera modules.
  • the electronic device 100 includes a camera module 10, a range sensor module 20, and an image signal processor 30 that controls the camera module 10 and the range sensor module 20.
  • the image signal processor 30 performs image processing on an image acquired from the camera module 10 and a depth image acquired from the range sensor module 20.
  • the camera module 10 includes a master camera module 11 and a slave camera module 12 to be used for binocular stereo viewing, as shown in FIG. 1.
  • the master camera module 11 includes a first lens 11a that is capable of focusing on an object, a first image sensor 11b that detects an image inputted via the first lens 11a, and a first image sensor driver 11c that drives the first image sensor 11b, as shown in FIG. 1.
  • the slave camera module 12 includes a second lens 12a that is capable of focusing on an object, a second image sensor 12b that detects an image inputted via the second lens 12a, and a second image sensor driver 12c that drives the second image sensor 12b, as shown in FIG. 1.
  • the master camera module 11 captures a master camera image.
  • the slave camera module 12 captures a slave camera image.
  • the master camera image and the slave camera image may be a color image such as an RGB image, or a monochrome image.
  • the range sensor module 20 captures a depth image. Specifically, the range sensor module 20 acquires time of flight (ToF) depth information by emitting pulsed light toward an object and detecting reflection light from the object.
  • the ToF depth information indicates an actual distance between the electronic device 100 and the object.
  • the image signal processor 30 controls the master camera module 11, the slave camera module 12, and the range sensor module 20 to capture an image.
  • An image with bokeh can be obtained based on the master camera image, the slave camera image, and the ToF depth information.
  • the electronic device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
  • the GNSS module 40 measures a current position of the electronic device 100.
  • the wireless communication module 41 performs wireless communications with the Internet.
  • the CODEC 42 bi-directionally performs encoding and decoding, using a predetermined encoding/decoding method.
  • the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42.
  • the microphone 44 outputs sound data to the CODEC 42 based on inputted sound.
  • the display module 45 displays various information such as a master camera image so that a user can check it.
  • the display module 45 displays an image captured by the camera module 10 in real time.
  • the input module 46 inputs information via a user’s operation. For example, the input module 46 inputs an instruction to capture and store an image displayed on the display module 45.
  • the IMU 47 detects the angular velocity and the acceleration of the electronic device 100.
  • a posture of the electronic device 100 can be grasped by a measurement result of the IMU 47.
  • the display module 45 may display an image according to the posture of the electronic device 100.
  • the main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
  • the memory 49 stores data of images captured by the camera module 10, data of depth images captured by the range sensor module 20, and a program which runs on the image signal processor 30 and/or the main processor 48.
  • one of the master camera module 11 and the slave camera module 12 can be omitted.
  • the range sensor module 20 can also be omitted.
  • the camera module 10 is not essential to the present disclosure.
  • the range sensor module 20 is not essential.
  • FIG. 2 shows a situation where the electronic device 100 is capturing an image of a first object S1 and a second object S2.
  • the objects S1 and S2 are placed on a plane P such as a table surface or a floor surface.
  • the X axis is a horizontal axis in a preview image displayed on the display module 45.
  • the Y axis is a vertical axis in the preview image.
  • the X-Y plane is parallel to a display of the display module 45.
  • the Z axis is an axis orthogonal to both the X axis and the Y axis.
  • the Z axis indicates the depth direction.
  • the origin of the XYZ coordinate system may be located at the camera module 10.
  • the XYZ coordinate system may be updated according to a posture of the electronic device 100 detected by the IMU 47.
  • FIG. 3 is a flowchart which shows the method according to an embodiment of the present disclosure.
  • the image signal processor 30 acquires an image captured by the camera module 10.
  • the image I1 is acquired as shown in FIG. 4.
  • the first object S1 and the second object S2 are placed on the plane P.
  • An RGB image may be acquired as the image.
  • the image may be captured with either the master camera module 11 or the slave camera module 12.
  • the image signal processor 30 acquires a depth image captured by the range sensor module 20.
  • the depth image I2 is acquired as shown in FIG. 4.
  • the depth image I2 corresponds to the image I1 acquired in the step S1. Similar to the image I1, the first object S1 and the second object S2 are placed on the plane P in the depth image I2.
  • a depth image can be acquired by performing stereo processing with the master camera image and the slave camera image when the electronic device 100 is not provided with the range sensor module 20.
  • a depth image can be acquired by performing “moving stereo processing” by moving a single camera when the camera module 10 has only one of the master camera module 11 and the slave camera module 12 or when the electronic device 100 is not provided with the range sensor module 20.
  • Visual SLAM (Simultaneous Localization and Mapping) can be used as the moving stereo processing technique.
  • the image signal processor 30 detects a plurality of objects in the depth image I2 captured in the step S2 and selects a primary object and a secondary object from the plurality of objects. The details of this step will be described with reference to FIG. 5 which is a flowchart showing an example of the step S3.
  • the image signal processor 30 detects a plane on which the objects are placed.
  • For example, the RANSAC (RANdom SAmple Consensus) method can be used in order to detect the plane.
  • the image signal processor 30 determines whether a plane is detected or not. If yes, proceed to the step S33. Otherwise, end the detection process flow. The detection process flow also ends if the detected plane is not flat.
  • the image signal processor 30 excludes the plane T and the background B.
  • the image signal processor 30 performs a clustering process to group small objects together.
  • For example, DBSCAN (Density-Based Spatial Clustering of Applications with Noise) can be used for the clustering process.
  • the image signal processor 30 ignores small objects to remove noise. As a result, information on position and size of each remaining object in the depth image is obtained.
  • the image signal processor 30 selects a primary object and a secondary object from the objects left after the step S35. For example, when the electronic device 100 is provided with the range sensor module 20, the processor 30 selects an object with the shortest distance to the electronic device 100 as the primary object and selects an object with the second shortest distance to the electronic device 100 as the secondary object. In this example, the first object S1 is selected as the primary object and the second object S2 is selected as the secondary object.
  • the processor 30 may select the primary and secondary objects based on their sizes.
  • step S4 following the step S3 will be described.
  • the image signal processor 30 determines a guide direction for moving the electronic device 100 based on object information.
  • the object information is information related to the primary object and the secondary object which are selected in the step S3. Specifically, the object information indicates at least one of a position of the selected object (s) , a size of the selected object (s) , and a distance between the selected object (s) and the camera module 10.
  • the main processor 48 evaluates a composition of the image based on the object information. Specifically, the main processor 48 calculates a score which indicates how good the composition is.
  • the ISP 30 may perform the step S5.
  • the display module 45 displays a guide UI with the image.
  • the guide UI indicates the guide direction determined in the step S4.
  • the display module 45 displays a guide image with the image.
  • the guide image is an image determined by the score.
  • the guide image may be an image with the highest score among images captured by the camera module 10 and stored in the memory 49.
  • the guide image may be overlaid on the image.
  • the ISP 30 may perform the step S7.
  • FIG. 6 shows a flowchart which is executed every time an image is captured by the camera module 10.
  • the image signal processor 30 determines whether the camera module 10 is close enough to the primary object S1. If it is determined that the camera module 10 is not close enough to the primary object S1, as shown in FIG. 7, the image signal processor 30 determines a direction toward the primary object along the Z axis to guide the camera module 10 closer to the primary object S1 as the guide direction GD.
  • the image signal processor 30 may calculate a ratio of a size of the primary object S1 in the image to an overall size of the image.
  • the size of the primary object S1 may be an area of a shape that surrounds the primary object S1 (e.g., a rectangular frame) .
  • the image signal processor 30 determines that the camera module 10 is close enough to the primary object S1 if the ratio is greater than a predetermined value (e.g., 14%) .
  • the image signal processor 30 may acquire a distance between the camera module 10 and the primary object S1 from the depth image which is obtained from the range sensor module 20. The image signal processor 30 determines that the camera module 10 is close enough to the primary object S1 if the distance is shorter than a predetermined distance (e.g., 30cm) .
  • the main processor 48 calculates the score based on the ratio.
  • the score is calculated by an equation (1) .
  • s is a size of the primary object
  • s_t is a size of the image
  • Clip is a function which returns the value of the ratio s/s_t if the ratio is between 0 and 1, returns 0 if the ratio is less than 0 and returns 1 if the ratio is more than 1
  • Round is a function which rounds an argument and returns an integer value.
  • the score calculated by the equation (1) is an integer between 0 and 100.
  • the main processor 48 may acquire a distance between the primary object S1 and the camera module 10 from the depth image and then calculates a score based on the distance.
  • the score is calculated by an equation (2) .
  • Clip is a function which returns the value of the ratio z/z_t if the ratio is between 0 and 1, returns 0 if the ratio is less than 0 and returns 1 if the ratio is more than 1
  • Round is a function which rounds an argument and returns an integer value.
  • the score calculated by the equation (2) is also an integer between 0 and 100.
  • the electronic device 100 displays, as a guide UI, a message for asking a user of the electronic device 100 to move the camera 10 closer to the primary object S1.
  • For example, a guide UI1, “Move Closer!”, is displayed as shown in FIG. 8.
  • the guide UI1 may be overlaid on a real-time image displayed on the display module 45.
  • the guide UI1 makes it easy for the user to know the direction along the Z axis in which he/she should move the electronic device 100.
  • the user follows the guide UI1 to move the electronic device 100 (the camera module 10) closer to the primary object.
  • the guide UI1 may be a voice message.
  • the image signal processor 30 determines whether the primary object S1 is located at an appropriate position in the X-Y plane in the image captured by the camera module 10. If it is determined that the primary object S1 is not located at an appropriate position in the X-Y plane, as shown in FIG. 9, the image signal processor 30 determines a direction in the X-Y plane to guide the camera module 10 to the appropriate position as the guide direction GD.
  • FIG. 10 is a diagram for explaining a method of determining whether the primary object is located at an appropriate position in the X-Y plane.
  • W is a horizontal length of the image and H is a vertical length of the image.
  • an RGB image is virtually divided into nine equal parts by two horizontal lines HL1, HL2 and two vertical lines VL1, VL2.
  • An intersection of the horizontal line HL1 and the vertical line VL1 is a target point T.
  • the target point T is a lower thirds grid point.
  • a shooting position is suggested so that a reference point of the primary object is located at the target point T or in a region which includes the target point T.
  • the image signal processor 30 calculates a center C1 of gravity of the primary object S1 as the reference point and determines that the primary object S1 is located at an appropriate position in the X-Y plane if the center C1 is located within a region centered on the target point T.
  • the image signal processor 30 may calculate a first reference point of the primary object S1 and a second reference point of the secondary object S2.
  • the first reference point may be a center C1 of gravity of the primary object S1 and the second reference point may be a center C2 of gravity of the secondary object S2.
  • the image signal processor 30 determines that the primary object S1 is located at an appropriate position in the X-Y plane if the first reference point of the primary object S1 (e.g., the center C1) is located within a predetermined region which includes the target point T and the second reference point of the secondary object S2 (e.g., the center C2) is located on the opposite side of the primary object.
  • the center C2 is located on a diagonal of the image as shown in FIG. 10.
  • the diagonal passes through the primary object S1.
  • If it is determined that the primary object S1 is not located at an appropriate position in the X-Y plane, the process proceeds to the step S52.
  • the main processor 48 calculates the score based on the reference point of the primary object S1 and the target point T of the image. For example, the score is calculated by an equation (3) .
  • x_t is the X-coordinate of the target point
  • y_t is the Y-coordinate of the target point
  • x_p is the X-coordinate of the reference point
  • y_p is the Y-coordinate of the reference point
  • W is a horizontal length of the image
  • H is a vertical length of the image
  • Round is a function which rounds an argument and returns an integer value.
  • the score calculated by the equation (3) is an integer between 100 and 200.
  • the electronic device 100 displays, as a guide UI, an arrow indicating the determined direction.
  • For example, a guide UI2, i.e., a phone icon with an arrow, is displayed as shown in FIG. 11.
  • the guide UI2 may be overlaid on a real-time image displayed on the display module 45.
  • the guide UI2 makes it easy for the user to know the direction in the X-Y plane in which he/she should move the electronic device 100.
  • the user follows the guide UI2 to move the electronic device 100 (the camera module 10) in the X-Y plane so that the primary object is located at the corner of the screen.
  • the guide UI2 may be a voice message.
  • a frame M1 indicating the primary object S1 may be displayed as shown in FIG. 11 to make it easy for a user of the electronic device 100 to recognize the primary object.
  • the image signal processor 30 determines whether the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction. If it is determined that the primary object S1 and the secondary object S2 are not properly arranged in the Z axis direction, as shown in FIG. 12, the image signal processor 30 determines, as the guide direction GD, a direction of rotation around the primary object S1 along the Y axis to guide the camera module 10 to a position where the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction.
  • FIG. 13 is a diagram for explaining the angle θ a and a target angle θ t in the X-Z plane.
  • the angle θ a is an angle between a line L1 and a line L2 which is parallel to the X axis.
  • the line L1 connects the primary object S1 and the secondary object S2.
  • the line L1 is a line which connects the center C1 and the center C2.
  • the target angle θ t is an angle to define a range for determination.
  • the target angle θ t is 45°.
  • the image signal processor 30 calculates the angle θ a and then determines that the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction if the angle θ a is within a predetermined range defined by the target angle θ t .
  • the predetermined range is a range centered around the target angle θ t (e.g., θ t ± 10°) .
  • the main processor 48 calculates the score based on a difference between the angle θ a and the target angle θ t .
  • the score is calculated by an equation (4) .
  • θ t is the target angle
  • θ a is the angle
  • diff is a function which returns an absolute value of a difference between θ t and θ a
  • Round is a function which rounds an argument and returns an integer value
  • the score calculated by the equation (4) is also an integer between 200 and 300.
  • the electronic device 100 displays, as a guide UI, an icon which rotates around a virtual axis parallel to the Y axis.
  • the virtual axis passes through the primary object S1.
  • For example, a guide UI3, i.e., an animated phone icon which rotates around the virtual axis, is displayed as shown in FIG. 14.
  • the guide UI3 may be overlaid on a real-time image displayed on the display module 45.
  • the guide UI3 makes it easy for the user to know the direction of rotation in the X-Z plane in which he/she should move the electronic device 100.
  • the user follows the guide UI3 to move the electronic device 100 (the camera module 10) in the X-Z plane so that the primary object and the secondary object are located in a diagonal position.
  • the guide UI3 may be a voice message.
  • a frame M1 indicating the primary object S1 may be displayed as shown in FIG. 14 to make it easy for a user of the electronic device 100 to recognize the primary object.
  • the image signal processor 30 determines whether a depression angle of the camera module 10 is appropriate. If it is determined that the depression angle is not appropriate, as shown in FIG. 15, the image signal processor 30 determines, as the guide direction GD, a direction of rotation around the primary object S1 along the X axis to guide the camera module 10 to a position where the depression angle is appropriate.
  • FIG. 16 is a diagram for explaining a method of determining whether the depression angle is appropriate.
  • the distance Dh is a vertical distance between the center C1 and the center C2. If a ratio of the distance Dh to a vertical length H of the image is in a predetermined range (e.g., 20% ± 10%) , it is determined that the depression angle of the camera module 10 is appropriate.
  • the image signal processor 30 calculates the distance Dh and then determines that the depression angle of the camera module 10 is appropriate if the ratio is within the predetermined range.
  • the main processor 48 calculates the score based on the distance Dh.
  • the score is calculated by an equation (5) .
  • R is a target range
  • Dh is the distance
  • H is a vertical length of the image
  • Clip is a function which returns the value of its argument if it is between 0 and 1, returns 0 if it is less than 0 and returns 1 if it is more than 1
  • Round is a function which rounds an argument and returns an integer value.
  • the target range R is 0.2, for example.
  • the score calculated by the equation (5) is an integer between 300 and 400.
  • the electronic device 100 displays, as a guide UI, an icon which rotates around a virtual axis parallel to the X axis.
  • the virtual axis passes through the primary object S1.
  • For example, a guide UI4, i.e., an animated phone icon which rotates around the virtual axis, is displayed as shown in FIG. 16.
  • the guide UI4 may be overlaid on a real-time image displayed on the display module 45.
  • the guide UI4 makes it easy for the user to know the direction of rotation around the virtual axis in which he/she should move the electronic device 100. According to the guide UI4, the user easily moves the electronic device 100 (the camera module 10) around the virtual axis so that the appropriate depression angle can be obtained.
  • the guide UI4 may be a voice message.
  • a frame M1 indicating the primary object S1 may be displayed as shown in FIG. 16 to make it easy for a user of the electronic device 100 to recognize the primary object.
  • the electronic device 100 displays, as a guide UI, a message to inform a user that the electronic device 100 is in a good position for shooting the primary and secondary objects S1 and S2.
  • For example, a guide UI5, “Ready!”, is displayed.
  • the guide UI5 may be a voice message.
  • the step S7 will be described in detail with reference to FIG. 6.
  • in the step S71, the main processor 48 determines whether a current score calculated in one of the steps S51 to S54 is greater than the previous score stored in the memory 49. If the current score is greater than the previous score, the process proceeds to the step S72; otherwise, it proceeds to the step S73.
  • the main processor 48 updates the image and the score stored in the memory 49.
  • the previous image may be overwritten with the current image.
  • the previous score may be overwritten with the current score.
  • the display module 45 displays the guide image stored in the memory 49.
  • the guide image is displayed with the image.
  • the guide image GI may be overlaid on the image displayed on the electronic device 100.
  • the steps described above may be performed by the ISP 30 or the main processor 48.
  • the electronic device 100 can strengthen the sense of distance by applying bokeh processing to the image.
  • the user can quickly compose an image by referring to a guide image which is obtained by evaluating a composition of the image based on the object information.
  • the user can shorten time to reach the target composition of the image.
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” and “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more than two” , unless otherwise specified.
  • the terms “mounted” , “connected” , “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is "on" or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature "on” , “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on” , “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below” , “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below” , "under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • examples of the computer readable medium include, but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device, and a portable compact disk read-only memory (CDROM) .
  • the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.

Abstract

Disclosed is a method of suggesting a shooting position for an electronic device. The electronic device has a camera and a display for displaying an image captured by the camera in real time. The method includes acquiring an image captured by the camera, acquiring a depth image corresponding to the image, detecting a plurality of objects in the depth image and selecting a primary object and a secondary object from the plurality of objects, evaluating a composition of the image based on object information related to the primary object and the secondary object, and displaying a guide image with the image on the display. The guide image is determined based on a score indicating the result of evaluating the composition.

Description

METHOD OF SUGGESTING SHOOTING POSITION FOR ELECTRONIC DEVICE AND ELECTRONIC DEVICE
TECHNICAL FIELD
The present disclosure relates to a method of suggesting a shooting position for an electronic device, and an electronic device implementing such method.
BACKGROUND
In recent years, images captured by an electronic device with a camera, such as a smartphone, are often uploaded to a Social Networking Service (SNS) and the like. When shooting an object such as a dish on a table with a smartphone, it is preferable for the user to shoot an image with a composition having a sense of distance or sense of depth and, further, to shoot an image which can emphasize the sense of distance by applying bokeh processing.
It is difficult for a user who is unfamiliar with capturing images to determine the appropriate shooting position for shooting an image with a sense of distance. It may be expected that it is the job of the smartphone to process an image captured by the user so as to change it into an image with a sense of distance. However, changing the composition of an image captured by the user to a composition with a sense of distance is not an easy feat.
Further, it is required that the user can quickly compose an image with a sense of distance.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide a method of suggesting a shooting position for an electronic device and an electronic device implementing such method.
In accordance with the present disclosure, a method of suggesting a shooting position for an electronic device, the electronic device having a camera and a display for displaying an image captured by the camera in real time, the method may include:
acquiring an image captured by the camera;
acquiring a depth image corresponding to the image;
detecting a plurality of objects in the depth image and selecting a primary object and a secondary object from the plurality of objects;
evaluating a composition of the image based on object information related to the primary object and the secondary object; and
displaying a guide image with the image on the display, the guide image being determined based on a score indicating the result of evaluating the composition.
In accordance with the present disclosure, an electronic device may include: a processor and a memory for storing instructions, wherein the instructions, when executed by the processor, cause the processor to perform the method according to the present disclosure.
In accordance with the present disclosure, a computer-readable storage medium, on which a computer program is stored, wherein the computer program is executed by a computer to implement the method according to the present disclosure, may be provided.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 is a circuit diagram illustrating an example of a configuration of an electronic device according to an embodiment of the present disclosure;
FIG. 2 shows a situation where the electronic device is capturing an image of a plurality of objects placed on a plane P;
FIG. 3 is a flowchart illustrating a method of suggesting a shooting position for the  electronic device according to an embodiment of the present disclosure;
FIG. 4 shows an example of an RGB image (left drawing) and a depth image (right drawing) captured by the electronic device;
FIG. 5 is a flowchart illustrating a method of detecting the objects in the depth image in detail;
FIG. 6 is a flowchart illustrating a method of determining guide direction for moving the electronic device, calculating a score of the composition and displaying a guide image in detail;
FIG. 7 shows the guide direction toward the primary object along the Z axis;
FIG. 8 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
FIG. 9 shows the guide direction in the X-Y plane;
FIG. 10 is a diagram for explaining a method of determining whether the primary object is located at an appropriate position in the X-Y plane;
FIG. 11 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
FIG. 12 shows the guide direction of rotation around the primary object along the Y axis;
FIG. 13 is a diagram for explaining angles θ a and θ t in the X-Z plane for determining whether the primary object and the secondary object are properly arranged in the Z axis direction;
FIG. 14 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI;
FIG. 15 shows the guide direction of rotation around the primary object along the X axis;
FIG. 16 is a diagram for explaining a method of determining whether a depression angle of the camera is appropriate;
FIG. 17 is an example of an image displayed on the electronic device, the image including an RGB image of the objects and the guide UI; and
FIG. 18 shows a guide image displayed over an image captured by the camera.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
<Electronic device 100>
An electronic device 100 will be described with reference to FIG. 1. FIG. 1 is a circuit diagram illustrating an example of a configuration of the electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 is a mobile device such as a smartphone in this embodiment, but may be other types of electronic device equipped with one or more camera modules.
As shown in FIG. 1, the electronic device 100 includes a camera module 10, a range sensor module 20, and an image signal processor 30 that controls the camera module 10 and the range sensor module 20. The image signal processor 30 performs image processing on an image acquired from the camera module 10 and a depth image acquired from the range sensor module 20.
The camera module 10 includes a master camera module 11 and a slave camera module 12 to be used for binocular stereo viewing, as shown in FIG. 1.
The master camera module 11 includes a first lens 11a that is capable of focusing on an object, a first image sensor 11b that detects an image inputted via the first lens 11a, and a first image sensor driver 11c that drives the first image sensor 11b, as shown in FIG. 1.
The slave camera module 12 includes a second lens 12a that is capable of focusing on an object, a second image sensor 12b that detects an image inputted via the second lens 12a, and a second image sensor driver 12c that drives the second image sensor 12b, as shown in FIG. 1.
The master camera module 11 captures a master camera image. Similarly, the slave camera module 12 captures a slave camera image. The master camera image and the slave camera image may be a color image such as an RGB image, or a monochrome image.
The range sensor module 20 captures a depth image. Specifically, the range sensor module 20 acquires time of flight (ToF) depth information by emitting pulsed light toward an object and detecting reflection light from the object. The ToF depth information indicates an actual distance between the electronic device 100 and the object.
The image signal processor 30 controls the master camera module 11, the slave camera module 12, and the range sensor module 20 to capture an image. An image with bokeh can be obtained based on the master camera image, the slave camera image, and the ToF depth information.
Furthermore, as shown in FIG. 1, the electronic device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
The GNSS module 40 measures a current position of the electronic device 100. The wireless communication module 41 performs wireless communications with the Internet. The CODEC 42 bi-directionally performs encoding and decoding, using a predetermined encoding/decoding method. The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42. The microphone 44 outputs sound data to the CODEC 42 based on inputted sound.
The display module 45 displays various information such as a master camera image so that a user can check it. The display module 45 displays an image captured by the camera module 10 in real time.
The input module 46 inputs information via a user’s operation. For example, the input module 46 inputs an instruction to capture and store an image displayed on the display module 45.
The IMU 47 detects the angular velocity and the acceleration of the electronic device 100. A posture of the electronic device 100 can be grasped by a measurement result of the IMU 47. The display module 45 may display an image according to the posture of the electronic device 100.
The main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
The memory 49 stores data of images captured by the camera module 10, data of depth images captured by the range sensor module 20, and a program which runs on the image signal processor 30 and/or the main processor 48.
Regarding the configuration of the electronic device 100, one of the master camera module 11 and the slave camera module 12 can be omitted. The range sensor module 20 can also be omitted. As such, the camera module 10 is not essential to the present disclosure. Also, the range sensor module 20 is not essential.
Here, the XYZ coordinate system used by the electronic device 100 will be described with reference to FIG. 2. FIG. 2 shows a situation where the electronic device 100 is capturing an image of a first object S1 and a second object S2. The objects S1 and S2 are placed on a plane P such as a table surface or a floor surface.
The X axis is a horizontal axis in a preview image displayed on the display module 45. The Y axis is a vertical axis in the preview image. The X-Y plane is parallel to a display of the display module 45. The Z axis is an axis orthogonal to both the X axis and the Y axis. The Z axis indicates the depth direction. The origin of the XYZ coordinate system may be located at the camera module 10. The XYZ coordinate system may be updated according to a posture of the electronic device 100 detected by the IMU 47.
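For reference only, the following minimal Python sketch (not part of the disclosure) shows how a pixel of the preview image can be mapped into this XYZ coordinate system when its depth is known, using a pinhole camera model; the intrinsic parameters fx, fy, cx, cy are assumptions of the sketch.

```python
def pixel_to_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (along the Z axis, in metres) into the
    camera-centred XYZ coordinate system described above.

    fx, fy (focal lengths in pixels) and cx, cy (principal point) are illustrative
    intrinsics, not values taken from the disclosure.
    """
    x = (u - cx) * depth_m / fx   # X: horizontal axis of the preview image
    y = (v - cy) * depth_m / fy   # Y: vertical axis of the preview image
    z = depth_m                   # Z: depth direction
    return x, y, z
```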
< Method of suggesting a shooting position >
A method of suggesting a shooting position to a user of the electronic device 100 who will try to capture an image of the first and second objects S1 and S2 will be described. FIG. 3 is a flowchart which shows the method according to an embodiment of the present disclosure.
In the step S1, the image signal processor 30 acquires an image captured by the camera module 10. In this step, the image I1 is acquired as shown in FIG. 4. The first object S1 and the second object S2 are placed on the plane P.
An RGB image may be acquired as the image. The image may be captured with either the master camera module 11 or the slave camera module 12.
In the step S2, the image signal processor 30 acquires a depth image captured by the range sensor module 20. In this step, the depth image I2 is acquired as shown in FIG. 4. The depth image I2 corresponds to the image I1 acquired in the step S1. Similar to the image I1, the first object S1 and the second object S2 are placed on the plane P in the depth image I2.
Alternatively, a depth image can be acquired by performing stereo processing with the master camera image and the slave camera image when the electronic device 100 is not provided with the range sensor module 20.
Alternatively, a depth image can be acquired by performing “moving stereo processing” by moving a single camera when the camera module 10 has only one of the master camera module 11 and the slave camera module 12 or when the electronic device 100 is not provided with the range sensor module 20. Visual SLAM (Simultaneous Localization and Mapping) can be used as the moving stereo processing technique.
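As an illustration of the stereo alternative only, a depth map could be derived from a rectified master/slave image pair by block matching, for example with OpenCV as sketched below; OpenCV, the focal length and the baseline are assumptions of the sketch, not part of the disclosure.

```python
import cv2
import numpy as np

def depth_from_stereo(master_gray, slave_gray, focal_px=1400.0, baseline_m=0.012):
    """Estimate a depth map in metres from a rectified 8-bit grayscale image pair.

    focal_px and baseline_m are illustrative placeholder values.
    """
    # Block-matching stereo: disparity (in pixels) between the master and slave views.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(master_gray, slave_gray).astype(np.float32) / 16.0

    depth = np.zeros_like(disparity)
    valid = disparity > 0
    # Pinhole stereo relation: depth = focal_length * baseline / disparity.
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```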
In the step S3, the image signal processor 30 detects a plurality of objects in the depth image I2 captured in the step S2 and selects a primary object and a secondary object from the plurality of objects. The details of this step will be described with reference to FIG. 5 which is a flowchart showing an example of the step S3.
In the step S31, the image signal processor 30 detects a plane on which the objects are placed. In this step, for example, the RANSAC (RANdom SAmple Consensus) method can be used in order to detect the plane.
In the step S32, the image signal processor 30 determines whether a plane is detected or not. If yes, proceed to the step S33. Otherwise, end the detection process flow. The detection process flow also ends if the detected plane is not flat.
In the step S33, the image signal processor 30 excludes the plane T and the background B.
In the step S34, the image signal processor 30 performs a clustering process to group small objects together. For example, DBSCAN (Density-based spatial clustering of applications with noise) can be used for the clustering process.
In the step S35, the image signal processor 30 ignores small objects to remove noise. As a result, information on position and size of each remaining object in the depth image is obtained.
In the step S36, the image signal processor 30 selects a primary object and a secondary object from the objects left after the step S35. For example, when the electronic device 100 is provided with the range sensor module 20, the processor 30 selects an object with the shortest distance to the electronic device 100 as the primary object and selects an object with the second shortest distance to the electronic device 100 as the secondary object. In this example, the first object S1 is selected as the primary object and the second object S2 is selected as the secondary object.
Alternatively, the processor 30 may select the primary and secondary objects based on their sizes.
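For illustration only, a minimal Python sketch of the steps S31 to S36 under simplifying assumptions: the supporting plane is approximated by the dominant point height instead of a full RANSAC fit, the clustering uses scikit-learn's DBSCAN (the disclosure names DBSCAN only as an example), and all thresholds are placeholders rather than values from the disclosure.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def select_primary_and_secondary(points_xyz,
                                 plane_tol_m=0.01,
                                 eps_m=0.03,
                                 min_points=200):
    """points_xyz: (N, 3) array of 3D points from the depth image, camera at origin.

    Returns (primary, secondary) as point arrays, or None if fewer than two
    objects remain. All thresholds are illustrative.
    """
    # Steps S31/S33 (simplified): treat the dominant height as the supporting plane
    # and drop points within plane_tol_m of it; a RANSAC plane fit could be used instead.
    plane_height = np.median(points_xyz[:, 1])
    remaining = points_xyz[np.abs(points_xyz[:, 1] - plane_height) > plane_tol_m]

    # Step S34: cluster the remaining points into object candidates.
    labels = DBSCAN(eps=eps_m, min_samples=10).fit_predict(remaining)

    # Step S35: ignore noise (label -1) and clusters that are too small.
    clusters = [remaining[labels == k] for k in set(labels) if k != -1]
    clusters = [c for c in clusters if len(c) >= min_points]
    if len(clusters) < 2:
        return None

    # Step S36: primary = nearest object to the camera, secondary = second nearest.
    clusters.sort(key=lambda c: np.linalg.norm(c.mean(axis=0)))
    return clusters[0], clusters[1]
```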
Returning to the flowchart of FIG. 3, the step S4 following the step S3 will be described.
In the step S4, the image signal processor 30 determines a guide direction for moving the electronic device 100 based on object information. The object information is information related  to the primary object and the secondary object which are selected in the step S3. Specifically, the object information indicates at least one of a position of the selected object (s) , a size of the selected object (s) , and a distance between the selected object (s) and the camera module 10.
In the step S5, the main processor 48 evaluates a composition of the image based on the object information. Specifically, the main processor 48 calculates a score which indicates how good the composition is. The ISP 30 may perform the step S5.
In the step S6, the display module 45 displays a guide UI with the image. The guide UI indicates the guide direction determined in the step S4.
In the step S7, the display module 45 displays a guide image with the image. The guide image is an image determined by the score. For example, the guide image may be an image with the highest score among images captured by the camera module 10 and stored in the memory 49. The guide image may be overlaid on the image. The ISP 30 may perform the step S7.
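A minimal sketch of the bookkeeping behind the step S7, assuming the guide image is simply the best-scoring frame captured so far; the class and method names are illustrative only.

```python
class GuideImageStore:
    """Keeps the captured frame with the highest composition score seen so far."""

    def __init__(self):
        self.best_score = -1
        self.best_frame = None

    def update(self, frame, score):
        # Overwrite the stored guide image only when the new score is higher.
        if score > self.best_score:
            self.best_score = score
            self.best_frame = frame.copy()

    def guide_image(self):
        # The returned frame can be overlaid on the live preview as the guide image.
        return self.best_frame
```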
The details of the steps S4 to S7 will be described with reference to FIG. 6. FIG. 6 shows a flowchart which is executed every time an image is captured by the camera module 10.
In the step S41, the image signal processor 30 determines whether the camera module 10 is close enough to the primary object S1. If it is determined that the camera module 10 is not close enough to the primary object S1, as shown in FIG. 7, the image signal processor 30 determines a direction toward the primary object along the Z axis to guide the camera module 10 closer to the primary object S1 as the guide direction GD.
In order to determine whether the camera module 10 is close enough to the primary object S1, the image signal processor 30 may calculate a ratio of a size of the primary object S1 in the image to an overall size of the image. The size of the primary object S1 may be an area of a shape that surrounds the primary object S1 (e.g., a rectangular frame) . The image signal processor 30 determines that the camera module 10 is close enough to the primary object S1 if the ratio is greater than a predetermined value (e.g., 14%) .
Alternatively, the image signal processor 30 may acquire a distance between the camera module 10 and the primary object S1 from the depth image which is obtained from the range sensor module 20. The image signal processor 30 determines that the camera module 10 is close enough to the primary object S1 if the distance is shorter than a predetermined distance (e.g., 30cm) .
If it is determined that the camera module 10 is not close enough to the primary object S1, the process proceeds to the step S51.
In the step S51, the main processor 48 calculates the score based on the ratio. For example, the score is calculated by an equation (1) .
Score = Round (100 × Clip (s/s_t, 0, 1) ) ... (1) ,
where s is a size of the primary object, s_t is a size of the image, Clip is a function which returns the value of the ratio s/s_t if the ratio is between 0 and 1, returns 0 if the ratio is less than 0 and returns 1 if the ratio is more than 1, and Round is a function which rounds an argument and returns an integer value.
The score calculated by the equation (1) is an integer between 0 and 100.
Alternatively, in the step S51, the main processor 48 may acquire a distance between the primary object S1 and the camera module 10 from the depth image and then calculates a score based on the distance. For example, the score is calculated by an equation (2) .
Score = Round (100 × Clip (z/z_t, 0, 1) ) ... (2) ,
where z is a distance between the primary object S1 and the camera module 10, z_t is a predetermined value, Clip is a function which returns the value of the ratio z/z_t if the ratio is between 0 and 1, returns 0 if the ratio is less than 0 and returns 1 if the ratio is more than 1, and Round is a function which rounds an argument and returns an integer value.
The score calculated by the equation (2) is also an integer between 0 and 100.
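The closeness test of the step S41 and the scores of equations (1) and (2) could be sketched as follows; the 14% and 30 cm values are the example thresholds given above, and the form of equation (2) follows the literal description in the text rather than the original equation image.

```python
def clip(value, lo=0.0, hi=1.0):
    """Clip as described in the text: bound value to the interval [lo, hi]."""
    return max(lo, min(hi, value))

def close_enough(object_area, image_area, distance_m=None,
                 ratio_threshold=0.14, distance_threshold_m=0.30):
    """Step S41: close enough when the primary object fills more than 14% of the
    frame, or (when depth is available) is nearer than 30 cm (example values)."""
    if distance_m is not None:
        return distance_m < distance_threshold_m
    return object_area / image_area > ratio_threshold

def score_eq1(object_area, image_area):
    """Equation (1): integer score in [0, 100] based on the size ratio s/s_t."""
    return round(100 * clip(object_area / image_area))

def score_eq2(distance_m, target_distance_m):
    """Equation (2), as reconstructed from the text: integer score in [0, 100]
    based on the ratio z/z_t."""
    return round(100 * clip(distance_m / target_distance_m))
```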
In the step S61, the electronic device 100 displays, as a guide UI, a message asking a user of the electronic device 100 to move the camera module 10 closer to the primary object S1. For example, a guide UI1, “Move Closer!”, is displayed as shown in FIG. 8. The guide UI1 may be overlaid on a real-time image displayed on the display module 45. The guide UI1 makes it easy for the user to know the direction along the Z axis in which he/she should move the electronic device 100. The user follows the guide UI1 to move the electronic device 100 (the camera module 10) closer to the primary object.
Optionally, the guide UI1 may be a voice message.
Otherwise (i.e., in case that the camera 10 is close enough to the primary object S1) , the process proceeds to the step S42.
In the step S42, the image signal processor 30 determines whether the primary object S1 is located at an appropriate position in the X-Y plane in the image captured by the camera module 10. If it is determined that the primary object S1 is not located at an appropriate position in the X-Y plane, as shown in FIG. 9, the image signal processor 30 determines a direction in the X-Y plane to guide the camera module 10 to the appropriate position as the guide direction GD.
FIG. 10 is a diagram for explaining a method of determining whether the primary object is located at an appropriate position in the X-Y plane. W is a horizontal length of the image and H is a vertical length of the image.
In this example, an RGB image is virtually divided into nine equal parts by two horizontal lines HL1, HL2 and two vertical lines VL1, VL2. An intersection of the horizontal line HL1 and the vertical line VL1 is a target point T. The target point T is a lower thirds grid point. A shooting position is suggested so that a reference point of the primary object is located at the target point T or in a region which includes the target point T.
For example, the image signal processor 30 calculates a center C1 of gravity of the primary object S1 as the reference point and determines that the primary object S1 is located at an appropriate position in the X-Y plane if the center C1 is located within a region centered on the target point T.
Alternatively, the image signal processor 30 may calculate a first reference point of the primary object S1 and a second reference point of the secondary object S2. The first reference point may be a center C1 of gravity of the primary object S1 and the second reference point may be a center C2 of gravity of the secondary object S2. The image signal processor 30 determines that the primary object S1 is located at an appropriate position in the X-Y plane if the first reference point of the primary object S1 (e.g., the center C1) is located within a predetermined region which includes the target point T and the second reference point of the secondary object S2 (e.g., the center C2) is located on the opposite side of the primary object. In other words, the center C2 is located on a diagonal of the image as shown in FIG. 10. Here, the diagonal passes through the primary object S1.
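A sketch of the position test of the step S42 under stated assumptions: the target point T is taken here as the left lower-thirds grid point, and the size of the region around T is an illustrative tolerance, since the disclosure does not give a numeric value for it.

```python
def lower_thirds_target(width, height):
    """Target point T: assumed here to be the intersection of the left vertical
    third line (x = W/3) and the lower horizontal third line (y = 2H/3)."""
    return width / 3.0, 2.0 * height / 3.0

def primary_position_ok(c1, c2, width, height, tol=0.08):
    """Step S42: the primary center of gravity c1 lies near the target point T and
    the secondary center of gravity c2 lies on the opposite side of the image,
    toward the other corner of the diagonal through the primary object.

    tol is an illustrative tolerance expressed as a fraction of the image size.
    """
    tx, ty = lower_thirds_target(width, height)
    near_target = (abs(c1[0] - tx) <= tol * width and
                   abs(c1[1] - ty) <= tol * height)
    # The secondary object should sit on the far side of the image center
    # relative to the primary object, i.e. along the diagonal through S1.
    opposite_side = ((c2[0] - width / 2.0) * (c1[0] - width / 2.0) < 0 and
                     (c2[1] - height / 2.0) * (c1[1] - height / 2.0) < 0)
    return near_target and opposite_side
```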
If it is determined that the primary object S1 is not located at an appropriate position in the X-Y plane, the process proceeds to the step S52.
In the step S52, the main processor 48 calculates the score based on the reference point of the primary object S1 and the target point T of the image. For example, the score is calculated by an equation (3) .
Score = 100+Round (100×(1-sqrt (((x_t-x_p)/W)^2+((y_t-y_p)/H)^2)/sqrt (2))) ... (3)
where x_t is the X-coordinate of the target point, y_t is the Y-coordinate of the target point, x_p is the X-coordinate of the reference point, y_p is the Y-coordinate of the reference point, W is the horizontal length of the image, H is the vertical length of the image, and Round is a function which rounds an argument and returns an integer value.
The score calculated by the equation (3) is an integer between 100 and 200.
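A minimal sketch of this position-based score is given below, assuming equation (3) normalizes the offsets between the reference point and the target point by the image dimensions so that the result stays in the 100-200 band; the example coordinates are illustrative only.

import math

# Illustrative sketch; (xp, yp) is the reference point of the primary object
# and (xt, yt) is the target point, both in pixel coordinates.
def score_position(xp, yp, xt, yt, W, H):
    dx = (xt - xp) / W  # horizontal offset normalized by image width
    dy = (yt - yp) / H  # vertical offset normalized by image height
    d = math.sqrt(dx * dx + dy * dy) / math.sqrt(2.0)  # 0 (on target) .. 1
    return 100 + round(100 * (1.0 - d))  # integer between 100 and 200

print(score_position(xp=200, yp=500, xt=213, yt=480, W=640, H=720))  # 198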
In the step S62, the electronic device 100 displays, as a guide UI, an arrow indicating the determined direction. For example, a guide UI2, i.e., a phone icon with the arrow is displayed as shown in FIG. 11. The guide UI2 may be overlaid on a real-time image displayed on the display module 45. The guide UI2 makes it easy for the user to know the direction in the X-Y plane in which he/she should move the electronic device 100. The user follows the guide UI2 to move the electronic device 100 (the camera module 10) in the X-Y plane so that the primary object is located at the corner of the screen.
Optionally, the guide UI2 may be a voice message.
Optionally, a frame M1 indicating the primary object S1 may be displayed as shown in FIG. 11 to make it easy for a user of the electronic device 100 to recognize the primary object.
Otherwise (i.e., in case that the primary object S1 is located at an appropriate position in the X-Y plane) , the process proceeds to the step S43.
In the step S43, the image signal processor 30 determines whether the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction. If it is determined that the primary object S1 and the secondary object S2 are not properly arranged in the Z axis direction, as shown in FIG. 12, the image signal processor 30 determines, as the guide direction GD, a direction of rotation around the primary object S1 along the Y axis to guide the camera module 10 to a position where the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction.
In the present disclosure, whether the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction is determined based on an angle θ_a formed by the positional relationship between the primary object S1 and the secondary object S2. FIG. 13 is a diagram for explaining the angle θ_a and a target angle θ_t in the X-Z plane. The angle θ_a is the angle between a line L1 and a line L2 which is parallel to the X axis. The line L1 connects the primary object S1 and the secondary object S2; for example, the line L1 connects the center C1 and the center C2. The target angle θ_t defines the range used for this determination. For example, the target angle θ_t is 45°.
More specifically, first, the image signal processor 30 calculates the angle θ_a and then determines that the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction if the angle θ_a is within a predetermined range defined by the target angle θ_t. The predetermined range is a range centered around the target angle θ_t (e.g., θ_t ± 10°).
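The angle check might be sketched as below in Python, assuming the centers C1 and C2 are available as three-dimensional points whose Z coordinates come from the depth image; the ±10° tolerance follows the example given above, and the function names are assumptions.

import math

# Illustrative sketch; c1 and c2 are (x, y, z) positions of the primary and
# secondary objects, with z taken from the depth image.
def arrangement_angle(c1, c2):
    # Angle between the line C1-C2 and a line parallel to the X axis,
    # measured in the X-Z plane, in degrees.
    dx = c2[0] - c1[0]
    dz = c2[2] - c1[2]
    return math.degrees(math.atan2(abs(dz), abs(dx)))

def is_arranged_z(c1, c2, theta_t=45.0, tol=10.0):
    return abs(arrangement_angle(c1, c2) - theta_t) <= tol

print(is_arranged_z((0.0, 0.0, 1.0), (0.5, 0.1, 1.6)))  # about 50 deg -> True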
If it is determined that the primary object S1 and the secondary object S2 are not properly arranged in the Z axis direction, the process proceeds to the step S53.
In the step S53, the main processor 48 calculates the score based on a difference between the angle θ_a and the target angle θ_t. For example, the score is calculated by an equation (4) .
Score = 200+Round (100×(1-diff (θ_t, θ_a)/90°)) ... (4)
where θ_t is the target angle, θ_a is the angle, diff is a function which returns the absolute value of the difference between θ_t and θ_a, and Round is a function which rounds an argument and returns an integer value.
The score calculated by the equation (4) is also an integer between 200 and 300.
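Continuing the sketch above, the angle-based score could be computed as follows, assuming the angle θ_a stays between 0° and 90° so that the result remains in the 200-300 band; the normalization by 90° is an assumption.

# Illustrative sketch of equation (4); theta values are in degrees.
def score_angle(theta_a, theta_t=45.0):
    diff = abs(theta_t - theta_a)  # absolute difference between the angles
    return 200 + round(100 * (1.0 - diff / 90.0))  # integer between 200 and 300

print(score_angle(60.0))  # 283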
In the step S63, the electronic device 100 displays, as a guide UI, an icon which rotates around a virtual axis parallel to the Y axis. The virtual axis passes through the primary object S1. For example, a guide UI3, i.e., an animated phone icon which rotates around the virtual axis, is displayed as shown in FIG. 14. The guide UI3 may be overlaid on a real-time image displayed on the display module 45. The guide UI3 makes it easy for the user to know the direction of rotation in the X-Z plane in which he/she should move the electronic device 100. The user follows the guide UI3 to move the electronic device 100 (the camera module 10) in the X-Z plane so that the primary object and the secondary object are located in a diagonal arrangement.
Optionally, the guide UI3 may be a voice message.
Optionally, a frame M1 indicating the primary object S1 may be displayed as shown in FIG. 14 to make it easy for a user of the electronic device 100 to recognize the primary object.
Otherwise (i.e., in case that the primary object S1 and the secondary object S2 are properly arranged in the Z axis direction) , the process proceeds to the step S44.
In the step S44, the image signal processor 30 determines whether a depression angle of the camera module 10 is appropriate. If it is determined that the depression angle is not appropriate, as shown in FIG. 15, the image signal processor 30 determines, as the guide direction GD, a direction of rotation around the primary object S1 along the X axis to guide the camera module 10 to a position where the depression angle is appropriate.
In the present disclosure, whether the depression angle is appropriate is determined based on a distance Dh along the Y axis between the primary object S1 and the secondary object S2. FIG. 16 is a diagram for explaining a method of determining whether the depression angle is appropriate. In this example, the distance Dh is a vertical distance between the center C1 and the center C2. If a ratio of the distance Dh to a vertical length H of the image is in a predetermined range (e.g., 20%±10%) , it is determined that the depression angle of the camera module 10 is appropriate.
More specifically, first, the image signal processor 30 calculates the distance Dh and then determines that the depression angle of the camera module 10 is appropriate if the ratio is within the predetermined range.
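As an illustration, the depression-angle check could be sketched as follows in Python, with the 20% ± 10% window taken from the example above; the variable names are assumptions.

# Illustrative sketch; c1_y and c2_y are the vertical pixel coordinates of
# the centers C1 and C2, and H is the vertical length of the image.
def is_depression_ok(c1_y, c2_y, H, target=0.20, tol=0.10):
    dh = abs(c1_y - c2_y)  # vertical distance Dh between the two centers
    return abs(dh / H - target) <= tol

print(is_depression_ok(c1_y=480, c2_y=330, H=720))  # ratio ~0.21 -> True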
If it is determined that the depression angle of the camera module 10 is not appropriate, the process proceeds to the step S54.
In the step S54, the main processor 48 calculates the score based on the distance Dh. For example, the score is calculated by an equation (5) .
Score = 300+Round (100×(1-Clip (|R-Dh/H|, 0, 1))) ... (5)
where R is a target range, Dh is the distance, H is a vertical length of the image, Clip is a function which returns the value of |R-Dh/H| if the value is between 0 and 1, returns 0 if the value is less than 0 and returns 1 if the value is more than 1, and Round is a function which rounds an argument and returns an integer value.
The target range R is 0.2, for example.
The score calculated by the equation (5) is an integer between 300 and 400.
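A matching sketch for the score of equation (5) is given below, with R = 0.2 as in the example; the 300-point offset keeps the result in the stated 300-400 band.

# Illustrative sketch of equation (5).
def clip(value, lower=0.0, upper=1.0):
    return max(lower, min(upper, value))

def score_depression(dh, H, R=0.2):
    # dh: vertical distance between C1 and C2; H: vertical length of the image
    return 300 + round(100 * (1.0 - clip(abs(R - dh / H))))  # 300 to 400

print(score_depression(dh=150, H=720))  # 399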
In the step S64, the electronic device 100 displays, as a guide UI, an icon which rotates around a virtual axis parallel to the X axis. The virtual axis passes through the primary object S1. For example, a guide UI4, i.e., an animated phone icon which rotates around the virtual axis, is displayed as shown in FIG. 16. The guide UI4 may be overlaid on a real-time image displayed on the display module 45. The guide UI4 makes it easy for the user to know the direction of rotation around the virtual axis in which he/she should move the electronic device 100. Following the guide UI4, the user can easily move the electronic device 100 (the camera module 10) around the virtual axis to obtain the appropriate depression angle.
Optionally, the guide UI4 may be a voice message.
Optionally, a frame M1 indicating the primary object S1 may be displayed as shown in FIG. 16 to make it easy for a user of the electronic device 100 to recognize the primary object.
Otherwise (i.e., in case that the depression angle of the camera module 10 is appropriate) , the process proceeds to the step S65.
In the step S65, the electronic device 100 displays, as a guide UI, a message informing the user that the electronic device 100 is in a good position for shooting the primary and secondary objects S1 and S2. For example, a guide UI5, “Ready!”, is displayed superimposed on a real-time image as shown in FIG. 17.
Optionally, the guide UI5 may be a voice message.
Next, the step S7 will be described in detail with reference to FIG. 6.
In the step S71, the main processor 48 determines whether a current score calculated in one of the steps S51 to S54 is greater than the previous score stored in the memory 49. If the current score is greater than the previous score, the process proceeds to the step S72, otherwise it proceeds to the step S73.
In the step S72, the main processor 48 updates the image and the score stored in the memory 49. The previous image may be overwritten with the current image; similarly, the previous score may be overwritten with the current score.
In the step S73, the display module 45 displays the guide image stored in the memory 49. The guide image is displayed together with the image. As shown in FIG. 18, the guide image GI may be overlaid on the image displayed on the electronic device 100.
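The keep-the-best-frame behavior of the steps S71 to S73 may be sketched as a small loop body in Python; the data structure and function name are illustrative assumptions.

# Illustrative sketch of the steps S71 to S73: keep the highest-scoring frame
# seen so far and return it as the guide image to overlay on the preview.
best = {"score": -1, "image": None}

def update_and_get_guide(current_image, current_score):
    # S71: compare the current score with the stored score.
    if current_score > best["score"]:
        # S72: overwrite the stored image and score with the current ones.
        best["score"] = current_score
        best["image"] = current_image
    # S73: the stored image is displayed as the guide image.
    return best["image"]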
The steps described above may be performed by the ISP 30 or the main processor 48.
According to the present disclosure described above, it is possible to suggest a shooting position to a user of the electronic device 100 so that he/she can easily capture an image of a composition having a sense of distance, considering the positional relationship among the primary object S1, the secondary object S2 and the camera module 10. As a result, the user can capture an image with a sense of distance between the objects even if he/she does not have special skills or experience in shooting images. Further, the electronic device 100 can strengthen the sense of distance by applying bokeh processing to the image.
Further, according to the present disclosure, the user can quickly compose an image by referring to a guide image which is obtained by evaluating the composition of the image based on the object information. In other words, the user can shorten the time needed to reach the target composition of the image.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings under discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two” , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in  which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by the hardware, software, firmware or their combination. In the above embodiments, a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system. For example, if it is realized by the hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination  of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
Those skilled in the art shall understand that all or parts of the steps in the above exemplifying method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs comprise one or a combination of the steps in the method embodiments of the present disclosure when run on a computer.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (15)

  1. A method of suggesting a shooting position for an electronic device, the electronic device having a camera and a display for displaying an image captured by the camera in real time, the method comprising:
    acquiring an image captured by the camera;
    acquiring a depth image corresponding to the image;
    detecting a plurality of objects in the depth image and selecting a primary object and a secondary object from the plurality of objects;
    evaluating a composition of the image based on object information related to the primary object and the secondary object; and
    displaying a guide image with the image on the display, the guide image being determined based on a score indicating the evaluating result of the composition.
  2. The method according to claim 1, wherein the guide image is overlaid on the image captured by the camera.
  3. The method according to claim 1 or 2, wherein the guide image is an image with the highest score among images captured by the camera.
  4. The method according to any one of claims 1 to 3, wherein the evaluating a composition of the image based on object information related to the primary object and the secondary object comprises calculating the score based on a ratio of a size of the primary object in the image to a size of the image.
  5. The method according to claim 4, wherein the score is calculated by an equation (1) ,
    Score = Round (100×Clip (s/s_t, 0, 1) ) ... (1) ,
    where s is a size of the primary object, s_t is a size of the image, Clip is a function which returns a value of a ratio of s/s_t if the ratio is between 0 and 1, returns 0 if the ratio is less than 0 and returns 1 if the ratio is more than 1, and Round is a function which rounds an argument and returns an integer value.
  6. The method according to any one of claims 1 to 3, wherein the evaluating a composition of the image based on object information related to the primary object and the secondary object comprises calculating the score based on a distance between the primary object and the camera.
  7. The method according to claim 6, wherein the score is calculated by an equation (2) ,
    Score = Round (100×Clip (z/z_t, 0, 1) ) ... (2) ,
    where z is a distance between the primary object and the camera module, z_t is a predetermined value, Clip is a function which returns a value of a ratio of z/z_t if the ratio is between 0 and 1, returns 0 if the ratio is less than 0 and returns 1 if the ratio is more than 1, and Round is a function which rounds an argument and returns an integer value.
  8. The method according to any one of claims 1 to 3, wherein the evaluating a composition of the image based on object information related to the primary object and the secondary object comprises calculating the score based on a reference point of the primary object and a target point of the image.
  9. The method according to claim 8, wherein the score is calculated by an equation (3) ,
    Score = 100+Round (100×(1-sqrt (((x_t-x_p)/W)^2+((y_t-y_p)/H)^2)/sqrt (2))) ... (3) ,
    where x_t is an X-coordinate of the target point, y_t is a Y-coordinate of the target point, x_p is an X-coordinate of the reference point, y_p is a Y-coordinate of the reference point, W is a horizontal length of the image, H is a vertical length of the image, and Round is a function which rounds an argument and returns an integer value.
  10. The method according to any one of claims 1 to 3, wherein the evaluating a composition of the image based on object information related to the primary object and the secondary object comprises calculating the score based on a difference between an angle in the X-Z plane and a target angle, the angle being an angle between a first line connecting the primary object and the secondary object and a second line parallel to the X axis.
  11. The method according to claim 10, wherein the score is calculated by an equation (4) ,
    Score = 200+Round (100×(1-diff (θ_t, θ_a)/90°)) ... (4) ,
    where θ_t is the target angle, θ_a is the angle, diff is a function which returns the absolute value of the difference between θ_t and θ_a, and Round is a function which rounds an argument and returns an integer value.
  12. The method according to any one of claims 1 to 3, wherein the evaluating a composition of the image based on object information related to the primary object and the secondary object comprises calculating the score based on a distance along the Y axis between the primary object and the secondary object.
  13. The method according to claim 12, wherein the score is calculated by an equation (5) ,
    Score = 300+Round (100×(1-Clip (|R-Dh/H|, 0, 1))) ... (5) ,
    where R is a target range, Dh is the distance, H is a vertical length of the image, Clip is a function which returns a value of |R-Dh/H| if the value is between 0 and 1, returns 0 if the value is less than 0 and returns 1 if the value is more than 1, and Round is a function which rounds an argument and returns an integer value.
  14. An electronic device comprising a processor and a memory for storing instructions, wherein the instructions, when executed by the processor, cause the processor to perform the method according to any one of claims 1 to 13.
  15. A computer-readable storage medium, on which a computer program is stored, wherein the computer program is executed by a computer to implement the method according to any one of claims 1 to 13.
PCT/CN2021/079099 2021-03-04 2021-03-04 Method of suggesting shooting position for electronic device and electronic device WO2022183443A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/079099 WO2022183443A1 (en) 2021-03-04 2021-03-04 Method of suggesting shooting position for electronic device and electronic device
CN202180095208.4A CN116998159A (en) 2021-03-04 2021-03-04 Method for suggesting shooting position of electronic equipment and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/079099 WO2022183443A1 (en) 2021-03-04 2021-03-04 Method of suggesting shooting position for electronic device and electronic device

Publications (1)

Publication Number Publication Date
WO2022183443A1 true WO2022183443A1 (en) 2022-09-09

Family

ID=83153897

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/079099 WO2022183443A1 (en) 2021-03-04 2021-03-04 Method of suggesting shooting position for electronic device and electronic device

Country Status (2)

Country Link
CN (1) CN116998159A (en)
WO (1) WO2022183443A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105981368A (en) * 2014-02-13 2016-09-28 谷歌公司 Photo composition and position guidance in an imaging device
CN108289169A (en) * 2018-01-09 2018-07-17 北京小米移动软件有限公司 Image pickup method, device, electronic equipment and storage medium
CN108366203A (en) * 2018-03-01 2018-08-03 北京金山安全软件有限公司 Composition method, composition device, electronic equipment and storage medium
CN109040605A (en) * 2018-11-05 2018-12-18 北京达佳互联信息技术有限公司 Shoot bootstrap technique, device and mobile terminal and storage medium
US20190109981A1 (en) * 2017-10-11 2019-04-11 Adobe Systems Incorporated Guided image composition on mobile devices
US20200184215A1 (en) * 2018-12-07 2020-06-11 International Business Machines Corporation Photographic results by composition analysis using deep learning neural networks
CN111432114A (en) * 2019-12-31 2020-07-17 武汉星巡智能科技有限公司 Grading method, device and equipment based on shooting composition and storage medium
CN109196852B (en) * 2016-11-24 2021-02-12 华为技术有限公司 Shooting composition guiding method and device

Also Published As

Publication number Publication date
CN116998159A (en) 2023-11-03

Similar Documents

Publication Publication Date Title
US9661214B2 (en) Depth determination using camera focus
EP3989177A1 (en) Method for controlling multiple virtual characters, device, apparatus, and storage medium
WO2019184889A1 (en) Method and apparatus for adjusting augmented reality model, storage medium, and electronic device
CN103576160B (en) Distance measuring device and the image processing apparatus including it
CN106846410B (en) Driving environment imaging method and device based on three dimensions
CN110148178B (en) Camera positioning method, device, terminal and storage medium
WO2020042968A1 (en) Method for acquiring object information, device, and storage medium
US9921054B2 (en) Shooting method for three dimensional modeling and electronic device supporting the same
CN111256676B (en) Mobile robot positioning method, device and computer readable storage medium
CN113052919A (en) Calibration method and device of visual sensor, electronic equipment and storage medium
WO2022134475A1 (en) Point cloud map construction method and apparatus, electronic device, storage medium and program
KR20160096966A (en) Method for notifying environmental context information, electronic apparatus and storage medium
CN109754439B (en) Calibration method, calibration device, electronic equipment and medium
WO2019196871A1 (en) Modeling method and related device
CN110738185A (en) Form object identification method and device and storage medium
CN112308103A (en) Method and device for generating training sample
KR20220085834A (en) Electronic devices and focusing methods
WO2022183443A1 (en) Method of suggesting shooting position for electronic device and electronic device
WO2022174432A1 (en) Method of suggesting shooting position for electronic device and electronic device
CN113066134A (en) Calibration method and device of visual sensor, electronic equipment and storage medium
CN116258810A (en) Rendering method, device, equipment and storage medium of pavement elements
US9904355B2 (en) Display method, image capturing method and electronic device
CN112950535A (en) Video processing method and device, electronic equipment and storage medium
CN108540726B (en) Method and device for processing continuous shooting image, storage medium and terminal
CN112683262A (en) Positioning method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21928528

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180095208.4

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21928528

Country of ref document: EP

Kind code of ref document: A1