CN107018322B - Control method and control device for rotary camera auxiliary composition and electronic device - Google Patents

Control method and control device for rotary camera auxiliary composition and electronic device

Info

Publication number
CN107018322B
CN107018322B (application CN201710137956.XA)
Authority
CN
China
Prior art keywords
depth
foreground
image
processing
main image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710137956.XA
Other languages
Chinese (zh)
Other versions
CN107018322A (en)
Inventor
Sun Jianbo (孙剑波)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710137956.XA priority Critical patent/CN107018322B/en
Publication of CN107018322A publication Critical patent/CN107018322A/en
Application granted granted Critical
Publication of CN107018322B publication Critical patent/CN107018322B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a control method for rotary-camera-assisted composition. The control method comprises the following steps: processing scene data to obtain a current foreground type; finding, in a memory, a current composition suggestion corresponding to the current foreground type; and controlling the rotary camera to rotate so as to obtain a scene image that conforms to the current composition suggestion. The invention further discloses a control device and an electronic device. The control method, control device, and electronic device determine the current foreground type from depth information, obtain the composition suggestion corresponding to that foreground type, and control the rotary camera to rotate according to the suggestion, thereby obtaining a scene image with a proper composition.

Description

Control method and control device for rotary camera auxiliary composition and electronic device
Technical Field
The present invention relates to imaging technologies, and in particular, to a method, a device, and an electronic device for controlling a rotary camera to assist in composition.
Background
Composition is a relatively specialized photographic skill. Many ordinary consumers lack this skill and do not know how to adjust the lens orientation, or adjust it inaccurately, so the resulting images have a poor visual effect.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. To this end, the present invention provides a control method, a control device, and an electronic device for rotary-camera-assisted composition.
A control method for rotary-camera-assisted composition is used to control an electronic device. The electronic device comprises an imaging device and a memory; the imaging device comprises a rotary camera and is used to collect scene data, and the memory stores a plurality of foreground types and corresponding composition suggestions. The control method comprises the following steps:
processing the scene data to obtain a current foreground type;
finding, in the memory, a current composition suggestion corresponding to the current foreground type; and
controlling the rotary camera to rotate so as to obtain a scene image that conforms to the current composition suggestion.
A control device for rotary-camera-assisted composition is used to control an electronic device. The electronic device comprises an imaging device and a memory; the imaging device comprises a rotary camera and is used to collect scene data, and the memory stores a plurality of foreground types and corresponding composition suggestions. The control device comprises a processing module, a finding module, and a control module.
The processing module is configured to process the scene data to obtain a current foreground type.
The finding module is used for finding a current composition suggestion corresponding to the current foreground type in the memory.
The control module is used to control the rotary camera to rotate so as to obtain a scene image that conforms to the current composition suggestion.
An electronic apparatus includes an imaging apparatus, a memory, and the control apparatus.
The imaging device comprises a rotary camera and is used to collect scene data; the scene data comprises a cached main image.
The memory stores a plurality of foreground types and corresponding composition suggestions.
The control method, control device, and electronic device for rotary-camera-assisted composition determine the current foreground type from depth information, obtain the current composition suggestion corresponding to that foreground type, and control the rotary camera to rotate according to the suggestion, thereby obtaining a scene image with a proper composition.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart illustrating a method for controlling a rotary camera assisted composition according to an embodiment of the present invention.
Fig. 2 is a schematic plan view of an electronic device according to an embodiment of the present invention.
Fig. 3 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 4 is a functional block diagram of a processing module of the control device in accordance with certain embodiments of the present invention.
FIG. 5 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 6 is a functional block diagram of a processing unit in accordance with certain embodiments of the present invention.
FIG. 7 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 8 is a schematic diagram of another functional block of the processing unit in accordance with certain embodiments of the present invention.
FIG. 9 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 10 is a functional block diagram of an acquisition unit in accordance with certain embodiments of the present invention.
FIG. 11 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 12 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 13 is another functional block diagram of a processing module in accordance with certain implementations of the invention.
FIG. 14 is a schematic diagram of a cached main image in accordance with certain embodiments of the invention.
FIG. 15 is a schematic diagram of a current composition suggestion of certain embodiments of the present invention.
FIG. 16 is a schematic illustration of an image of a scene in accordance with certain embodiments of the invention.
Description of the main element symbols:
electronic device 100, control device 10, processing module 12, processing unit 122, first processing subunit 1222, second processing subunit 1224, third processing subunit 1226, fourth processing subunit 1228, acquisition unit 124, fifth processing subunit 1242, finding subunit 1244, determination unit 126, determination subunit 1262, finding module 14, control module 16, imaging device 20, rotary camera 22, memory 30.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
Referring to fig. 1 and fig. 2, the method for controlling the auxiliary composition of the rotary camera 22 according to the embodiment of the invention can be used for controlling the electronic device 100. The electronic device 100 includes an imaging device 20 and a memory 30. The imaging device 20 includes a rotating camera 22. The imaging device 20 is used to acquire scene data. The memory 30 stores a plurality of foreground types and corresponding composition suggestions. The control method comprises the following steps:
S12: processing the scene data to obtain a current foreground type;
S14: finding, in the memory 30, a current composition suggestion corresponding to the current foreground type; and
S16: controlling the rotating camera 22 to rotate to obtain a scene image that conforms to the current composition suggestion.
Referring to fig. 2 again, the control device 10 for composition assistance by the rotating camera 22 according to the embodiment of the present invention can be used to control the electronic device 100. The control device 10 includes a processing module 12, a finding module 14, and a control module 16. The processing module 12 is used to process the scene data to obtain a current foreground type. The finding module 14 is used to find, in the memory 30, a current composition suggestion corresponding to the current foreground type. The control module 16 is used to control the rotary camera 22 to rotate so as to obtain a scene image that conforms to the current composition suggestion.
That is, the control method according to the embodiment of the present invention may be implemented by the control device 10 according to the embodiment of the present invention, wherein the step S12 may be implemented by the processing module 12, the step S14 may be implemented by the finding module 14, and the step S16 may be implemented by the control module 16.
In some embodiments, the control device 10 according to the embodiment of the present invention may be applied to the electronic device 100 according to the embodiment of the present invention, or the electronic device 100 according to the embodiment of the present invention may include the control device 10 according to the embodiment of the present invention.
The control method, the control device 10 and the electronic device 100 of the embodiment of the present invention determine the current foreground type by using the depth information, thereby obtaining a current composition suggestion corresponding to the current foreground type and controlling the rotating camera 22 to rotate according to the current composition suggestion to obtain a scene image with a proper composition.
In some embodiments, the electronic device 100 comprises a mobile phone or a tablet computer. In the embodiment of the present invention, the electronic device 100 is a mobile phone.
Referring to fig. 3, in some embodiments, the scene data includes a cached main image, and step S12 includes the following steps:
S122: processing the scene data to obtain depth information of the cached main image;
S124: acquiring a foreground portion of the cached main image according to the depth information; and
S126: determining the current foreground type from the foreground portion.
Referring to FIG. 4, in some embodiments, the scene data includes a cached main image. The processing module 12 comprises a processing unit 122, an obtaining unit 124, and a determining unit 126. The processing unit 122 is configured to process the scene data to obtain depth information of the cached main image. The obtaining unit 124 is configured to obtain a foreground portion of the cached main image according to the depth information. The determining unit 126 is configured to determine the current foreground type from the foreground portion.
That is, step S122 may be implemented by the processing unit 122, step S124 may be implemented by the acquiring unit 124, and step S126 may be implemented by the determining unit 126.
In this way, the foreground portion of the cached main image can be obtained and taken as the subject, from which the current foreground type is determined.
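A hedged sketch of how steps S122, S124, and S126 might chain together, assuming the depth information is already available as a per-pixel depth map aligned with the RGB image; the type labels and the 1.0 m cutoff are invented for illustration.

```python
import numpy as np

def current_foreground_type(cached_main: np.ndarray, depth: np.ndarray) -> str:
    # S122: here the depth information is simply the given depth map.
    # S124: take the region near the smallest depth as the foreground portion.
    mask = (depth - depth.min()) < 1.0
    # S126: decide a type from the foreground portion (trivial stand-in rule).
    coverage = mask.mean()
    return "close-up" if coverage > 0.5 else "distant-subject"

rgb = np.zeros((2, 2, 3))            # dummy cached main image
depth = np.array([[1.0, 1.2],
                  [4.0, 4.2]])       # metres
print(current_foreground_type(rgb, depth))  # distant-subject
```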
Referring to fig. 5, in some embodiments, the scene data includes a depth image corresponding to the cached main image, and step S122 includes the following steps:
s1222: processing the depth image to obtain depth data of the cached main image; and
s1224: the depth data is processed to obtain depth information.
Referring to fig. 6, in some embodiments, the scene data includes a depth image corresponding to a cached main image, and the processing unit 122 includes a first processing sub-unit 1222 and a second processing sub-unit 1224. The first processing subunit 1222 is configured to process the depth image to obtain depth data of the buffered main image. The second processing subunit 1224 is configured to process the depth data to obtain depth information.
That is, step S1222 may be implemented by the first processing subunit 1222, and step S1224 may be implemented by the second processing subunit 1224.
In this way, the depth information of the cached main image can be quickly obtained by using the depth image.
It will be appreciated that the cached main image is an RGB color image, while the depth image contains a large amount of depth data, that is, the depth information of each person and object in the scene, including the depth magnitude and/or depth range. Since the color information of the cached main image and the depth information of the depth image correspond one-to-one, the depth information of the cached main image can be obtained.
In some embodiments, the depth image corresponding to the cached main image can be obtained by structured-light depth ranging or by a time-of-flight (TOF) depth camera.
When structured light depth ranging is used to obtain a depth image, the imaging device 20 includes a camera and a projector.
It is understood that structured-light depth ranging uses a projector to project light with a certain structured pattern onto the object surface; the surface shape of the measured object modulates the pattern, forming a three-dimensional image of light bars on the surface. The camera captures this pattern as a two-dimensional distorted light-bar image. The degree of distortion of the light bars depends on the relative position of the projector and the camera and on the surface profile or height of the object: the displacement along a light bar is proportional to the surface height, a kink indicates a change of plane, and a discontinuity shows a physical gap in the surface. When the relative position between the projector and the camera is fixed, the three-dimensional contour of the object surface can be reconstructed from the coordinates of the distorted two-dimensional light-bar image, and the depth information is thereby acquired. Structured-light depth ranging offers high resolution and measurement accuracy.
When a TOF depth camera is used to obtain the depth image, the imaging device 20 includes a TOF depth camera.
It can be understood that a TOF depth camera emits modulated infrared light from its light-emitting unit toward the object and records, through its sensor, the phase change of the light reflected back from the object; given the speed of light, the depth of the whole scene can then be acquired in real time within the range of one modulation wavelength. When calculating depth information, a TOF depth camera is unaffected by the gray level and surface features of the subject, computes the depth quickly, and offers high real-time performance.
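The phase-based measurement described above reduces to a one-line formula: the light travels out and back, and one full 2π phase cycle corresponds to one modulation wavelength. The 20 MHz modulation frequency below is a typical but assumed value.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_depth(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Depth from the phase shift of the reflected modulated light:
    d = c * phi / (4 * pi * f_mod); the 4*pi accounts for the round
    trip (factor 2) and the full phase cycle (factor 2*pi)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Largest depth measurable without phase wrap-around: c / (2 * f_mod)."""
    return C / (2.0 * mod_freq_hz)

print(round(tof_depth(math.pi, 20e6), 3))   # 3.747 (m)
print(round(unambiguous_range(20e6), 3))    # 7.495 (m)
```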
Referring to fig. 7, in some embodiments, the scene data includes a cached sub-image corresponding to the cached main image, and step S122 includes the following steps:
S1226: processing the cached main image and the cached sub-image to obtain depth data of the cached main image; and
S1228: processing the depth data to obtain the depth information.
Referring to fig. 8, in some embodiments, the scene data includes a cached sub-image corresponding to the cached main image, and the processing unit 122 includes a third processing sub-unit 1226 and a fourth processing sub-unit 1228. The third processing sub-unit 1226 is configured to process the cached main image and the cached sub-image to obtain depth data of the cached main image. The fourth processing subunit 1228 is configured to process the depth data to obtain the depth information.
That is, step S1226 may be implemented by the third processing subunit 1226, and step S1228 may be implemented by the fourth processing subunit 1228.
In this way, the depth information of the cached main image can be acquired by processing the cached main image and the cached sub-image.
In some embodiments, the imaging device 20 includes a primary camera and a secondary camera.
It is understood that the depth information may also be obtained by binocular stereo ranging, in which case the scene data includes the cached main image and a cached sub-image: the main camera captures the cached main image and the sub-camera captures the cached sub-image. Binocular stereo ranging uses two identical cameras to image the same object from different positions, obtaining a stereo image pair; a matching algorithm finds the corresponding image points of the pair, from which the disparity is calculated, and the depth information is finally recovered by triangulation. In this way, the depth information of the cached main image is obtained by matching the cached main image and the cached sub-image as a stereo image pair.
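The triangulation step at the end of the binocular pipeline is the standard rectified-stereo relation Z = f * B / d; the focal length, baseline, and disparity values below are illustrative.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a matched point from a rectified stereo pair:
    Z = f * B / d, where f is the focal length in pixels, B the distance
    between the two cameras, and d the horizontal disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Larger disparity means a closer object:
print(stereo_depth(1000.0, 0.05, 25.0))  # 2.0 m
print(stereo_depth(1000.0, 0.05, 50.0))  # 1.0 m
```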
Referring to fig. 9, in some embodiments, step S124 includes the following steps:
s1242: obtaining the most front point of the cached main image according to the depth information; and
s1244: and finding a region which is adjacent to the most front point and continuously changes in depth as a foreground part.
Referring to fig. 10, in some embodiments, the obtaining unit 124 includes a fifth processing subunit 1242 and a finding subunit 1244. The fifth processing sub-unit 1242 is arranged to obtain the foremost point of the cached main image from the depth information. The find sub-unit 1244 is used to find an area adjoining the frontmost point and continuously changing in depth as the foreground portion.
That is, step S1242 may be implemented by the fifth processing subunit 1242, and step S1244 may be implemented by the finding subunit 1244.
In this way, a physically connected foreground portion of the cached main image is obtained, that is, a foreground that is contiguous in the real scene. Taking this physically connected foreground as the subject makes the extent of the foreground portion intuitive.
Specifically, the foremost point of the cached main image is obtained from the depth information; this point marks the start of the foreground portion. Diffusing outward from the foremost point, the regions that adjoin it and change continuously in depth are collected and merged with it into the foreground region.
It should be noted that the foremost point is the pixel corresponding to the object with the smallest depth, that is, the object with the smallest object distance, the one closest to the imaging device 20. Adjacent means that two pixels are connected. The depth changes continuously when the depth difference between two adjacent pixels is smaller than a preset difference; in that case, the depth between those two adjacent pixels is considered continuous.
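The diffusion from the foremost point can be read as a flood fill over the depth map: start at the smallest-depth pixel and absorb any 4-adjacent pixel whose depth step stays below the preset difference. A minimal sketch, with the 0.5 m step as an assumed threshold:

```python
from collections import deque
import numpy as np

def grow_foreground(depth: np.ndarray, max_step: float) -> np.ndarray:
    """Flood-fill from the foremost (smallest-depth) pixel, adding every
    4-adjacent pixel whose depth differs from its neighbour by less than
    max_step (the 'continuously changing depth' criterion)."""
    h, w = depth.shape
    mask = np.zeros((h, w), dtype=bool)
    start = np.unravel_index(np.argmin(depth), depth.shape)  # foremost point
    mask[start] = True
    queue = deque([start])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(depth[ny, nx] - depth[y, x]) < max_step):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

depth = np.array([[1.0, 1.1, 5.0],
                  [1.2, 1.3, 5.1],
                  [5.2, 5.3, 5.4]])  # near object top-left, backdrop elsewhere
print(int(grow_foreground(depth, 0.5).sum()))  # 4 pixels form the foreground
```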
Referring to fig. 11, in some embodiments, step S124 may include the following steps:
s1246: obtaining the most front point of the cached main image according to the depth information; and
s1248: an area having a depth difference from the frontmost point smaller than a predetermined threshold is found as the foreground portion.
In this way, a foreground portion of the cached main image is obtained that need not be physically connected in the real scene but stands in a certain logical relationship; for example, in a scene where a hawk dives down to snatch a chick, the hawk and the chick are not physically connected, yet logically they can be judged to belong together.
Specifically, the foremost point of the cached main image is obtained from the depth information; this point marks the start of the foreground portion. Diffusing outward from the foremost point, the regions whose depth difference from it is smaller than the predetermined threshold are collected and merged with it into the foreground region.
In some embodiments, the predetermined threshold may be a value set by the user. The user can thus determine the extent of the foreground portion as desired, so as to obtain an ideal composition suggestion and realize an ideal composition.
In some embodiments, the predetermined threshold may instead be a value determined by the control device 10; this is not limited here. A threshold determined by the control device 10 may be a fixed value stored internally, or a value calculated according to conditions such as the depth of the foremost point.
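Unlike the region growing of steps S1242/S1244, this embodiment needs no connectivity: every pixel whose depth is within the predetermined threshold of the foremost point joins the foreground, so the hawk and the chick can both be kept. A sketch with an assumed 1.0 m threshold:

```python
import numpy as np

def foreground_by_threshold(depth: np.ndarray, threshold: float) -> np.ndarray:
    """Keep every pixel whose depth differs from the foremost point's depth
    by less than the predetermined threshold; no adjacency is required."""
    front = depth.min()              # depth of the foremost point
    return (depth - front) < threshold

depth = np.array([[1.0, 1.2, 6.0],
                  [6.1, 1.4, 6.2]])  # two near patches on a far backdrop
print(int(foreground_by_threshold(depth, 1.0).sum()))  # 3 foreground pixels
```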
In some embodiments, step S124 may include the steps of:
finding a region whose depth lies within a predetermined interval as the foreground portion.
In this way, a foreground portion having a depth in an appropriate range can be obtained.
It can be understood that in some shooting situations the subject is not the frontmost object but lies slightly behind it; for example, when a person sits behind a computer, the computer is nearer to the camera, yet the person is the subject. Taking the region whose depth falls within the predetermined interval as the foreground portion therefore effectively avoids selecting the wrong subject.
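The interval variant is a plain band-pass on depth; the near and far bounds below are invented to mirror the person-behind-a-computer example.

```python
import numpy as np

def foreground_in_interval(depth: np.ndarray, near: float, far: float) -> np.ndarray:
    """Keep pixels whose depth lies in [near, far), so a nearer obstruction
    (the computer) and the distant background (the wall) are both excluded."""
    return (depth >= near) & (depth < far)

# computer at ~0.5 m, person at ~2 m, wall at 8 m (all assumed values)
depth = np.array([0.5, 0.6, 2.0, 2.1, 2.2, 8.0])
print(int(foreground_in_interval(depth, 1.5, 3.0).sum()))  # 3 pixels: the person
```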
Referring to fig. 12, in some embodiments, step S126 includes the following steps:
S1262: determining the foreground type according to the size, shape, and/or position of the foreground portion in combination with the size, shape, and/or position of the background portion of the cached main image.
Referring to fig. 13, in some embodiments, the determination unit 126 includes a determination subunit 1262. The determining subunit 1262 is configured to determine the foreground type based on the size, shape, and/or position of the foreground portion in combination with the size, shape, and/or position of the background portion of the cached main image.
That is, step S1262 may be implemented by the determination subunit 1262.
In this way, the foreground type may be determined by the foreground portion alone or by the matching relationship between the foreground portion and the background portion.
It can be understood that the foreground portion, as the subject of the image, can serve as the most important determining factor when the foreground type is determined; that is, the foreground type is determined from characteristics of the foreground portion such as its size, shape, and content.
In some embodiments, the foreground types include symmetric, nine-grid (rule-of-thirds), diagonal, and triangular types, among others. For example, if the foreground portion consists of two objects that are symmetric left and right, the corresponding foreground type can be determined to be the symmetric type.
In some embodiments, to improve composition quality, the background portion, or the matching relationship between the foreground and background portions, may also be consulted when determining the foreground type, yielding a more accurate foreground type and, in turn, a more ideal composition suggestion.
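The patent does not spell out the matching rules, so the following classifier is purely an illustrative guess: two similar-area foreground regions on opposite halves of the frame map to the symmetric type, a single region to the nine-grid type, and anything else to the diagonal type.

```python
def classify_foreground_type(regions: list, image_width: int) -> str:
    """Map simple foreground geometry to a stored foreground type.
    Each region is a dict with 'cx' (centroid x in pixels) and 'area'."""
    if len(regions) == 2:
        a, b = regions
        similar = min(a["area"], b["area"]) / max(a["area"], b["area"]) > 0.7
        opposite = (a["cx"] - image_width / 2) * (b["cx"] - image_width / 2) < 0
        if similar and opposite:
            return "symmetric"
    if len(regions) == 1:
        return "nine-grid"
    return "diagonal"

# Two lion-sized regions, one on each side of a 640-px-wide frame:
lions = [{"cx": 100, "area": 900}, {"cx": 500, "area": 1000}]
print(classify_foreground_type(lions, 640))  # symmetric
```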
Referring to fig. 14-16 together, in one embodiment the control device 10 controls the imaging device 20 to capture the cached main image shown in fig. 14 and a corresponding depth image (not shown). Processing the depth information yields the foreground portion of the cached main image, which consists of two lions, and the current foreground type is determined from them: in fig. 14 the two lions are similar in shape, the left lion is too close to the edge, and excess space is left to the right of the right lion. For this foreground type a symmetric composition may be selected, as shown in fig. 15, and the rotary camera 22 is controlled to rotate so that the two lions appear symmetric in the image, thereby obtaining the scene image shown in fig. 16.
In the description of the embodiments of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
In the description of the embodiments of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as being fixedly connected, detachably connected, or integrally connected; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. Specific meanings of the above terms in the embodiments of the present invention can be understood by those of ordinary skill in the art according to specific situations.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processing module-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, may each exist physically on their own, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented as a software functional module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art may make variations, modifications, substitutions, and alterations to the above embodiments within the scope of the present invention.

Claims (8)

1. A control method for rotating-camera-assisted composition, used to control an electronic device, characterized in that the electronic device comprises an imaging device and a memory, the imaging device comprises a rotating camera and is used to collect scene data, and the memory stores a plurality of foreground types and corresponding composition suggestions; the control method comprises the following steps:
processing the scene data to obtain a current foreground type;
finding, in the memory, a current composition suggestion corresponding to the current foreground type; and
controlling the rotating camera to rotate to obtain a scene image that conforms to the current composition suggestion;
the scene data comprises a cached main image, and the step of processing the scene data to obtain the current foreground type comprises the following steps:
processing the scene data to acquire depth information of the cached main image;
acquiring a foreground portion of the cached main image according to the depth information; and
determining the current foreground type according to the foreground portion;
the scene data comprises a depth image corresponding to the cached main image, the depth image being obtained by structured-light depth ranging or by a time-of-flight depth camera, and the step of processing the scene data to obtain the depth information of the cached main image comprises the following steps:
processing the depth image to obtain depth data of the cached main image; and
processing the depth data to obtain the depth information;
the step of acquiring the foreground portion of the cached main image according to the depth information comprises the following steps:
acquiring the foremost point of the cached main image according to the depth information; and
finding, as the foreground portion, an area whose depth differs from that of the foremost point by less than a predetermined threshold.
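The depth-thresholded foreground extraction recited in claim 1 (take the foremost point, keep everything within a predetermined depth difference of it) can be sketched as follows. This is an illustrative NumPy sketch, not the patented implementation; the 0.5 m threshold and the toy depth map are assumptions made for the example.

```python
import numpy as np

def extract_foreground(depth, threshold=0.5):
    """Segment the foreground of a depth map (values in meters) by taking
    the foremost (nearest) point and keeping every pixel whose depth
    differs from it by less than a predetermined threshold."""
    foremost = np.nanmin(depth)             # foremost point of the cached main image
    mask = (depth - foremost) < threshold   # area close in depth to the foremost point
    return mask, foremost

# Toy 4x4 depth map: a near object (~1.0 m) against a far background (~3.0 m)
depth = np.array([
    [3.0, 3.0, 3.0, 3.0],
    [3.0, 1.0, 1.1, 3.0],
    [3.0, 1.2, 1.0, 3.0],
    [3.0, 3.0, 3.0, 3.0],
])
mask, foremost = extract_foreground(depth, threshold=0.5)
print(foremost)      # 1.0
print(mask.sum())    # 4 foreground pixels
```

The same mask works whether the depth image came from structured-light ranging or a time-of-flight camera, since both ultimately yield per-pixel depth data.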
2. The control method according to claim 1, wherein the step of determining the current foreground type according to the foreground portion comprises the following step:
determining the foreground type according to the matching relationship between the size and shape of the foreground portion and the size, shape, and/or position of the background portion of the cached main image.
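Claim 2 leaves the matching relation abstract. As a purely hypothetical illustration, a foreground type could be derived from the mask's area fraction and bounding-box aspect ratio; the type names and thresholds below are invented for this sketch and do not come from the patent.

```python
import numpy as np

def classify_foreground(mask):
    """Hypothetical classifier: derive a foreground type from the size and
    shape of the boolean foreground mask. Type names and thresholds are
    illustrative assumptions, not the patented matching relation."""
    area_ratio = mask.mean()          # fraction of the frame covered by foreground
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return "empty"
    height = ys.max() - ys.min() + 1  # bounding-box extent of the foreground
    width = xs.max() - xs.min() + 1
    aspect = height / width
    if area_ratio > 0.5:
        return "close-up"             # foreground dominates the frame
    if aspect > 1.5:
        return "portrait-like"        # tall, narrow foreground
    return "landscape-subject"

example_mask = np.zeros((6, 6), dtype=bool)
example_mask[:, 2] = True             # a tall, narrow foreground column
print(classify_foreground(example_mask))   # portrait-like
```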
3. A control device for rotating-camera-assisted composition, used to control an electronic device, characterized in that the electronic device comprises an imaging device and a memory, the imaging device comprises a rotating camera and is used to collect scene data, and the memory stores a plurality of foreground types and corresponding composition suggestions; the control device comprises:
a processing module configured to process the scene data to obtain a current foreground type;
a finding module configured to find, in the memory, a current composition suggestion corresponding to the current foreground type; and
a control module configured to control the rotating camera to rotate so as to obtain a scene image that conforms to the current composition suggestion;
the scene data comprises a cached main image, and the processing module comprises:
a processing unit configured to process the scene data to obtain depth information of the cached main image;
an acquiring unit configured to acquire a foreground portion of the cached main image according to the depth information; and
a determining unit configured to determine the current foreground type according to the foreground portion;
the scene data comprises a depth image corresponding to the cached main image, the depth image being obtained by structured-light depth ranging or by a time-of-flight depth camera, and the processing unit comprises:
a first processing subunit configured to process the depth image to obtain depth data of the cached main image; and
a second processing subunit configured to process the depth data to obtain the depth information;
the acquiring unit is configured to acquire the foremost point of the cached main image according to the depth information, and to find, as the foreground portion, an area whose depth differs from that of the foremost point by less than a predetermined threshold.
4. The control device according to claim 3, wherein the determining unit comprises:
a determining subunit configured to determine the foreground type according to the matching relationship between the size and shape of the foreground portion and the size, shape, and/or position of the background portion of the cached main image.
5. An electronic device, comprising:
an imaging device comprising a rotating camera, the imaging device being used to collect scene data, the scene data comprising a cached main image;
a memory storing a plurality of foreground types and corresponding composition suggestions; and
the control device according to claim 3 or 4.
6. The electronic device according to claim 5, wherein the electronic device is a cell phone or a tablet computer.
7. The electronic device of claim 5, wherein the imaging device comprises a camera and a projector.
8. The electronic device of claim 5, wherein the imaging device comprises a TOF depth camera.
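Taken together, the claims describe a loop: classify the foreground, look up the composition suggestion stored for that foreground type, and rotate the camera until the scene conforms to it. A minimal hypothetical sketch of that lookup-and-rotate step follows; the type names, the encoding of a suggestion as a target horizontal position, and the gain are all invented for illustration.

```python
# Hypothetical sketch of the claimed control flow: the memory maps foreground
# types to composition suggestions; here a suggestion is encoded as a target
# normalized horizontal position (0..1) for the subject in the frame.
COMPOSITION_SUGGESTIONS = {
    "portrait-like": 1.0 / 3.0,   # place the subject on the left third line
    "close-up": 0.5,              # center the subject
    "landscape-subject": 0.5,
}

def rotation_step(foreground_type, subject_x, gain_deg=20.0):
    """Return a pan angle in degrees that nudges subject_x (normalized image
    coordinate) toward the stored target; positive means rotate right."""
    target_x = COMPOSITION_SUGGESTIONS[foreground_type]
    return gain_deg * (target_x - subject_x)

print(rotation_step("close-up", 0.25))   # 5.0: rotate right to center the subject
```

A real controller would iterate this step against live scene data until the subject reaches the suggested position, at which point the scene image conforms to the current composition suggestion.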
CN201710137956.XA 2017-03-09 2017-03-09 Control method and control device for rotary camera auxiliary composition and electronic device Expired - Fee Related CN107018322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710137956.XA CN107018322B (en) 2017-03-09 2017-03-09 Control method and control device for rotary camera auxiliary composition and electronic device


Publications (2)

Publication Number Publication Date
CN107018322A CN107018322A (en) 2017-08-04
CN107018322B true CN107018322B (en) 2020-02-11

Family

ID=59440675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710137956.XA Expired - Fee Related CN107018322B (en) 2017-03-09 2017-03-09 Control method and control device for rotary camera auxiliary composition and electronic device

Country Status (1)

Country Link
CN (1) CN107018322B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111935393A (en) * 2020-06-28 2020-11-13 百度在线网络技术(北京)有限公司 Shooting method, shooting device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5115139B2 (en) * 2007-10-17 2013-01-09 ソニー株式会社 Composition determination apparatus, composition determination method, and program
JP5359645B2 (en) * 2009-07-23 2013-12-04 ソニー株式会社 Composition determination apparatus, imaging system, composition determination method, and program
CN103067705B (en) * 2012-12-19 2016-06-08 宁波大学 A kind of multi-view depth video preprocess method
CN104935832B (en) * 2015-03-31 2019-07-12 浙江工商大学 For the video keying method with depth information
CN106327473A (en) * 2016-08-10 2017-01-11 北京小米移动软件有限公司 Method and device for acquiring foreground images


Similar Documents

Publication Publication Date Title
CN106851123B (en) Exposure control method, exposure control device and electronic device
CN106851124B (en) Image processing method and device based on depth of field and electronic device
CN106993112B (en) Background blurring method and device based on depth of field and electronic device
CN108076286B (en) Image blurring method and device, mobile terminal and storage medium
EP0684585B1 (en) Image forming method and apparatus
JP4548850B2 (en) Method and apparatus for selecting a stereoscopic image
JP2011060216A (en) Device and method of processing image
CN105979165A (en) Blurred photos generation method, blurred photos generation device and mobile terminal
EP1534021A1 (en) Device and method for displaying stereo image
CN106991378B (en) Depth-based face orientation detection method and device and electronic device
US20120293659A1 (en) Parameter determining device, parameter determining system, parameter determining method, and recording medium
CN106851107A (en) Switch control method, control device and the electronic installation of camera assisted drawing
JP2013513095A (en) Method and system for obtaining an improved stereo image of an object
CN112889272B (en) Depth image acquisition method, depth image acquisition device and electronic device
CN107493427A (en) Focusing method, device and the mobile terminal of mobile terminal
CN106998389A (en) Control method, control device and the electronic installation of auto composition
US20180213156A1 (en) Method for displaying on a screen at least one representation of an object, related computer program, electronic display device and apparatus
CN106875433A (en) Cut control method, control device and the electronic installation of composition
EP3189493B1 (en) Depth map based perspective correction in digital photos
CN106973224B (en) Auxiliary composition control method, control device and electronic device
CN107018322B (en) Control method and control device for rotary camera auxiliary composition and electronic device
CN107025636B (en) Image defogging method and device combined with depth information and electronic device
JPH1021401A (en) Three-dimensional information processor
CN106991696B (en) Backlight image processing method, backlight image processing device and electronic device
JP2006031101A (en) Image generation method and device therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Haibin Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Haibin Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200211