CN110233968B - Image shooting control method and device and image shooting system - Google Patents

Image shooting control method and device and image shooting system

Info

Publication number
CN110233968B
CN110233968B
Authority
CN
China
Prior art keywords
image
scene
unit
control
shooting unit
Prior art date
Legal status
Active
Application number
CN201910542009.8A
Other languages
Chinese (zh)
Other versions
CN110233968A (en)
Inventor
浦汉来
Current Assignee
Shanghai Moxiang Network Technology Co ltd
Original Assignee
Shanghai Moxiang Network Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Moxiang Network Technology Co ltd filed Critical Shanghai Moxiang Network Technology Co ltd
Priority to CN201910542009.8A priority Critical patent/CN110233968B/en
Publication of CN110233968A publication Critical patent/CN110233968A/en
Application granted granted Critical
Publication of CN110233968B publication Critical patent/CN110233968B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present application provide an image shooting control method, an image shooting control device, and an image shooting system. The image shooting control method comprises the following steps: identifying an image scene shot by an image shooting unit; and adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image. According to the embodiments of the present application, the shooting control parameters can be adjusted automatically according to the requirements of the application scene, so that the image shooting unit captures images meeting the shooting requirements of different application scenes.

Description

Image shooting control method and device and image shooting system
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to an image shooting control method and device and an image shooting system.
Background
For hardware devices, images can often be captured by way of function extension or component extension. In different application environments, however, the relevant parameters of the hardware device must be adjusted manually in the prior art to meet the shooting requirements of different application scenes. Such adjustment is difficult, especially when shooting via component extension, and for the many users at a non-professional level, manual adjustment demands a high degree of professional skill, so the parameters of the hardware device cannot be adjusted effectively.
Therefore, it is desirable to provide a technical solution that automatically adjusts the relevant parameters of the hardware device according to the requirements of the application scene, so that the image capturing unit captures images meeting the shooting requirements of different application scenes.
Disclosure of Invention
In view of the above, an object of the present invention is to provide an image capturing control method and apparatus, and an image capturing system, which overcome the above-mentioned shortcomings in the prior art.
The embodiment of the application provides an image shooting control method, which comprises the following steps:
identifying an image scene shot by an image shooting unit;
and adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image.
Optionally, in any embodiment of the present application, adaptively adjusting shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image includes: and adaptively adjusting optical control parameters when the image shooting unit shoots the image according to the identified image scene, wherein the shooting control parameters comprise the optical control parameters.
Optionally, in any embodiment of the present application, adaptively adjusting shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image includes: and adaptively adjusting the attitude control parameters of the image shooting unit according to the identified image scene so as to control the image shooting unit to shoot the image, wherein the shooting control parameters comprise the attitude control parameters.
Optionally, in any embodiment of the present application, if the image capturing unit is disposed on the supporting component or the electronic device;
correspondingly, the self-adaptive adjustment of shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises the following steps: and adaptively adjusting and controlling attitude control parameters of the support component or the electronic equipment according to the identified image scene so as to control the image shooting unit to shoot the image.
Optionally, in any embodiment of the present application, if the image capturing unit is disposed on a support assembly or an electronic device, and the support assembly or the electronic device is in a motion state; correspondingly, the adaptive adjustment of shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises the following steps: adaptively adjusting and controlling attitude control parameters of the support assembly or the electronic device according to the recognized image scene and the motion state of the support assembly or the electronic device, so as to control the image shooting unit to shoot the image.
Optionally, in any embodiment of the present application, identifying an image scene captured by an image capturing unit includes:
extracting local features of the image shot by the image shooting unit, and quantizing the extracted local features into visual words;
and counting the frequency of the visual words to obtain a visual word distribution histogram, and identifying the image scene shot by the image shooting unit according to the visual word distribution histogram.
The embodiment of the application provides an image shooting control device, it includes:
the scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the parameter adjusting module is used for adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image.
The embodiment of the application provides an image shooting system, it includes: an image capturing unit and an image capturing control device; the image capture control apparatus includes:
the scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameter of the supporting component according to the identified image scene so as to control the image shooting unit to shoot the image.
Optionally, in any embodiment of the present application, the parameter adjusting module is further configured to adaptively adjust an optical control parameter when the image capturing unit captures an image according to the identified image scene, where the capture control parameter includes the optical control parameter.
Optionally, in any embodiment of the present application, the parameter adjusting module is further configured to adaptively adjust an attitude control parameter of the image capturing unit according to the identified image scene, so as to control the image capturing unit to capture an image, where the capture control parameter includes the attitude control parameter.
Optionally, in any embodiment of the present application, if the image capturing unit is disposed on the supporting component or the electronic device;
correspondingly, the parameter adjusting module is further configured to adaptively adjust and control the attitude control parameter of the supporting component or the electronic device according to the identified image scene, so as to control the image capturing unit to capture an image.
Optionally, in any embodiment of the present application, if the image capturing unit is disposed on a support assembly or an electronic device, and the support assembly or the electronic device is in a motion state; correspondingly, the parameter adjusting module is further configured to adaptively adjust and control an attitude control parameter of the support component or the electronic device according to the identified image scene and the motion state of the support component or the electronic device, so as to control the image capturing unit to capture an image.
Optionally, in any embodiment of the present application, the scene recognition module is further configured to perform local feature extraction on the image captured by the image capturing unit, and quantize the extracted local features into visual words; counting the frequency of the visual words to obtain a visual word distribution histogram; and identifying the image scene shot by the image shooting unit according to the visual word distribution histogram.
In the embodiment of the application, the image scene shot by the image shooting unit is identified; and the shooting control parameters are adaptively adjusted according to the identified image scene so as to control the image shooting unit to shoot images, so that the shooting control parameters can be automatically adjusted according to the requirements of application scenes, and the image shooting unit can shoot images meeting the shooting requirements of different application scenes.
Drawings
Some specific embodiments of the present application will be described in detail hereinafter by way of illustration and not limitation with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
fig. 1 is a schematic flowchart of an image capturing control method according to a first embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an image capturing control method according to a second embodiment of the present application;
fig. 3 is a schematic flowchart of an image capturing control method according to a third embodiment of the present application;
fig. 4 is a schematic flowchart of an image capturing control method according to a fourth embodiment of the present application;
fig. 5 is a schematic structural diagram of an image capture control apparatus according to a fifth embodiment of the present application;
fig. 6 is a schematic structural diagram of an image capture control apparatus according to a sixth embodiment of the present application;
fig. 7 is a schematic structural diagram of an image capture control apparatus according to a seventh embodiment of the present application;
fig. 8 is a schematic structural diagram of an image capture control apparatus according to an eighth embodiment of the present application;
FIG. 9 is a schematic structural diagram of an image capturing system according to a ninth embodiment of the present application;
fig. 10 is a schematic structural diagram of an image capturing system according to a tenth embodiment of the present application;
fig. 11 is a schematic structural diagram of an image capturing system according to an eleventh embodiment of the present application;
fig. 12 is a schematic structural diagram of an image capturing system according to a twelfth embodiment of the present application.
Detailed Description
It is not necessary for any particular embodiment of the invention to achieve all of the above advantages at the same time.
In the embodiment of the application, the image scene shot by the image shooting unit is identified; and the shooting control parameters are adaptively adjusted according to the identified image scene so as to control the image shooting unit to shoot images, so that the shooting control parameters can be automatically adjusted according to the requirements of application scenes, and the image shooting unit can shoot images meeting the shooting requirements of different application scenes.
The following further describes specific implementations of embodiments of the present application with reference to the drawings of the embodiments of the present application.
Fig. 1 is a schematic flowchart of an image capturing control method according to a first embodiment of the present disclosure; as shown in fig. 1, it includes:
s101, identifying an image scene shot by an image shooting unit;
in this embodiment, the image capturing unit may be a camera or any structure capable of capturing an image. The camera may be a general camera, a depth camera, or the like, and is not particularly limited herein.
Optionally, in this embodiment, when the image scene captured by the image capturing unit is identified in step S101, the following steps S111 to S121 may be specifically implemented:
s111, extracting local features of the image shot by the image shooting unit, and quantizing the extracted local features into visual words;
and S121, counting the frequency of the visual words to obtain a visual word distribution histogram, and identifying the image scene shot by the image shooting unit according to the visual word distribution histogram.
Through the processing in steps S111 to S121, each image can ultimately be regarded as a word-frequency vector: each image represents a probability distribution over a number of topics, and each topic represents a probability distribution over a number of visual words. This effectively achieves dimensionality reduction of the data and reduces the difficulty and complexity of data processing.
When step S121 is implemented, the visual word distribution histogram is classified by a classifier to recognize the image scene captured by the image capturing unit. In a specific implementation, an image scene material library may be configured in advance, in which each of a plurality of image scene materials has a corresponding visual-word material distribution histogram. The classifier then classifies the visual word distribution histogram to recognize the image scene captured by the image capturing unit; alternatively, the visual-word material distribution histograms may be compared directly against the visual word distribution histogram to achieve rapid recognition of the image scene.
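As a rough illustration of steps S111-S121 and the direct histogram comparison just described, the following sketch quantizes local features into visual words, builds a normalized word distribution histogram, and matches it against scene-material histograms. It assumes a pre-built codebook of visual words; all function names are illustrative, not from the patent.

```python
import numpy as np

def quantize_to_words(local_features, codebook):
    """Assign each local feature to its nearest visual word (codebook row)."""
    # Euclidean distance from every feature to every codebook entry
    dists = np.linalg.norm(local_features[:, None, :] - codebook[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

def word_histogram(word_ids, vocab_size):
    """Count visual-word frequencies and normalize into a distribution histogram."""
    hist = np.bincount(word_ids, minlength=vocab_size).astype(float)
    return hist / max(hist.sum(), 1.0)

def recognize_scene(image_hist, material_hists, labels):
    """Direct comparison against pre-built scene-material histograms."""
    dists = [np.linalg.norm(image_hist - m) for m in material_hists]
    return labels[int(np.argmin(dists))]
```

In practice the codebook would be learned (e.g. by clustering local descriptors from the image scene material library), and a trained classifier could replace the nearest-histogram comparison.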
Further, considering that visual words with a higher occurrence frequency contribute more to the classifier's decision than visual words with a lower occurrence frequency, a feature function is further introduced to enhance the high-frequency visual words and thereby improve the accuracy of image scene recognition.
Specifically, the feature function may be implemented as a binary decision: if the word frequency of a visual word is greater than a set word-frequency threshold, the visual word is a high-frequency visual word; otherwise it is a low-frequency visual word. The classifier then performs image scene recognition on the distribution histogram of the retained high-frequency visual words.
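The binary feature function can be sketched as follows (a minimal illustration; the threshold value and function name are assumptions, not from the patent):

```python
def high_frequency_words(histogram, freq_threshold):
    """Binary feature function: keep only visual words whose frequency exceeds
    the set word-frequency threshold; low-frequency words are zeroed out."""
    return [f if f > freq_threshold else 0.0 for f in histogram]
```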
Here, it should be noted that recognizing the image scene through the above steps S111 to S121 is only an example, not the only possible implementation. It will be obvious to those skilled in the art that any other technical means capable of recognizing the image scene may be adopted.
The image scenes identified after the above step S101 are, for example, seaside, city, cloudy day, sunny day, rainy day, and other possible scenes.
And S102, adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image.
In this embodiment, after the image scene is identified in step S101, the shooting control parameters are adaptively adjusted to match the image scene, and the image shooting unit is controlled to shoot the image according to the adjusted shooting control parameters.
Further, the shooting control parameters are control parameters in a broad sense. They may be directly related to the image capturing unit, that is, any parameters on the image capturing unit itself that may affect image capture; alternatively, they may be indirectly related, that is, parameters that do not belong to the image capturing unit but may still affect image capture. Detailed examples are given in the following description of the embodiments, which is not intended to be limiting; those skilled in the art may flexibly choose which shooting control parameters to adaptively adjust according to the needs of a specific scene.
Fig. 2 is a schematic flowchart of an image capturing control method according to a second embodiment of the present application; as shown in fig. 2, in the present embodiment, an example of adaptively adjusting the optical control parameter when the image capturing unit captures an image is described. Specifically, the flow of the image capturing control method may include:
s201, identifying an image scene shot by an image shooting unit;
in this embodiment, the image capturing unit may be a camera or any structure capable of capturing an image. The camera may be a general camera, a depth camera, or the like, and is not particularly limited herein.
In this embodiment, when the image scene captured by the image capturing unit is identified in step S201, the following steps S211 to S231 may be implemented:
s211, extracting color features from the image based on a color histogram of an RGB space, and extracting texture features from the image based on a multichannel Gabor filter;
in step S211, when extracting color features from an image based on the color histogram of the RGB space, specifically, equal-interval quantization may be adopted, so that R, G, B is divided into four levels according to respective ranges.
The Gabor filter used in step S211 may be specifically implemented by a Gabor function kernel, and a multi-channel Gabor filter is obtained by configuring multi-scale parameters in the Gabor function.
S221, carrying out normalization processing on the color features and the texture features to obtain combined features;
in step S221, the color feature and the texture feature are normalized to obtain a feature vector with a combined feature, for example, 60 dimensions, the first 12 dimensions are color feature vectors, and the last 48 dimensions are texture feature vectors, so that the color feature and the texture feature are effectively normalized.
And S231, classifying the combined features by using a support vector machine so as to identify the image scene shot by the image shooting unit.
When the combined features are classified with the support vector machine in step S231, the model may be tuned continuously toward an optimal model, and the combined features are classified by the optimal model, so as to improve the accuracy of scene recognition.
From the processing of S211-S231 it can be seen that, because color features and texture features are integrated, a detailed description of the image is achieved, so that the image scene is recognized as accurately as possible.
S202, adaptively adjusting optical control parameters when the image shooting unit shoots the image according to the identified image scene.
In this embodiment, to implement step S202, an adjustable relationship mapping between an image scene and an optical control parameter may be established in advance, and the optical control parameter that can be adaptively adjusted in a certain image scene is determined through the adjustable relationship mapping.
In this embodiment, the optical control parameter may be at least one of exposure, filter control, or white balance.
Specifically, if the image scene is, for example, a cloudy day, the exposure level is adaptively raised to prevent underexposure, which would degrade the quality of the captured image. Conversely, if the image scene is a sunny day with ample sunshine, the exposure level is adaptively lowered to prevent overexposure. Image quality is thus improved overall.
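The adjustable relationship mapping from recognized scene to exposure can be sketched as a simple lookup; the scene names and exposure-compensation (EV) values below are illustrative assumptions, not values from the patent:

```python
# Hypothetical scene -> exposure-compensation mapping (values in EV stops).
SCENE_EXPOSURE_EV = {
    "cloudy": +1.0,  # raise exposure to prevent underexposure
    "sunny":  -1.0,  # lower exposure to prevent overexposure
    "rainy":  +0.5,
}

def exposure_adjustment(scene, default_ev=0.0):
    """Return the adaptive exposure compensation for a recognized scene."""
    return SCENE_EXPOSURE_EV.get(scene, default_ev)
```

The same table-driven pattern extends to other optical control parameters, such as white balance presets or filter selection.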
The example shown in fig. 2 takes the adjustment of optical control parameters when the image capturing unit captures an image as an example. As described above, a shooting control parameter need not be a parameter of the image capturing unit itself; it may be any parameter that can affect image capture. Thus, in different application scenarios, a parameter affecting image capture may also belong to a structural extension component, or to an electronic device that structurally comprises the image capturing unit.
Therefore, when the shooting control parameters are adaptively adjusted according to the recognized image scene to control the image shooting unit to shoot the image, the attitude control parameters of the image shooting unit may be adaptively adjusted according to the recognized image scene, the shooting control parameters including the attitude control parameters. An attitude control parameter controls the attitude of the structural extension component or of the electronic device, and thereby influences how the image shooting unit shoots the image. Exemplary embodiments are described in detail below with reference to fig. 3 and 4.
Fig. 3 is a schematic flowchart of an image capturing control method according to a third embodiment of the present application; as shown in fig. 3, in this embodiment the structural extension component of the image capturing unit is a support assembly on which the image capturing unit is disposed, such as, but not limited to, a pan-tilt head. Specifically, in the present embodiment, the image capturing control method includes:
s301, identifying an image scene shot by an image shooting unit;
in this embodiment, the step S301 may refer to the steps S111 to S121, or the steps S213 to S233. In fact, in order to improve the accuracy of image scene recognition as much as possible, the image scene recognition may be performed through the steps S111-S121 and the steps S213-S233, for example, statistical analysis is performed on the image scenes recognized through the steps S111-S121 and the steps S213-S233, so as to obtain a final image scene.
S302, adaptively adjusting and controlling the posture control parameters of the supporting component according to the identified image scene so as to control the image shooting unit to shoot the image.
Similar to the second embodiment, an adjustable relationship mapping between image scenes and attitude control parameters may be established in advance, and the attitude control parameters that can be adaptively adjusted in a given image scene are determined through the adjustable relationship mapping. Further, whether an attitude control parameter is actually adjusted may be governed by configuration items.
For example, when the product form of the support assembly is a pan-tilt head, the attitude control parameter is at least one of a pitch attitude control parameter, a pan attitude control parameter, or a roll attitude control parameter. The pitching motion of the pan-tilt head is adjusted by the pitch attitude control parameter, the panning motion by the pan attitude control parameter, and the rolling motion by the roll attitude control parameter. Because the image shooting unit is mounted on the pan-tilt head, its attitude changes with the attitude of the head under the control of these parameters, so that image shooting in various application scenes can be satisfied.
Specifically, a pitch control motor, a pan control motor and a roll control motor are provided on the pan-tilt head, and these motors are controlled respectively through the pitch attitude control parameter, the pan attitude control parameter and the roll attitude control parameter, thereby adjusting the attitude of the pan-tilt head.
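A minimal sketch of how the three attitude control parameters might be packaged and clamped before driving the pan-tilt head's motors; the data structure, names, and per-update step limit are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class AttitudeControlParams:
    """Pitch / pan / roll targets (degrees) for the head's three motors."""
    pitch: float = 0.0
    pan: float = 0.0
    roll: float = 0.0

def motor_commands(params, max_step=5.0):
    """Clamp each axis to a per-update step limit before driving the motors."""
    clamp = lambda v: max(-max_step, min(max_step, v))
    return {"pitch_motor": clamp(params.pitch),
            "pan_motor": clamp(params.pan),
            "roll_motor": clamp(params.roll)}
```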
Here, it should be noted that the pan-tilt head is only an example of the support assembly. In fact, "support assembly" is meant in a broad sense: it can be virtually any structure capable of supporting the image capturing unit. For instance, the image capturing unit may be fixed to a bicycle handlebar or to a helmet, in which case the handlebar or the helmet serves as the support assembly.
In another embodiment, if the image capturing unit is disposed on an electronic device, the electronic device is, for example, an unmanned aerial vehicle. The image capturing unit may be a component of the electronic device or an accessory of the electronic device. In this case, correspondingly, adaptively adjusting the shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises: and adaptively adjusting and controlling the attitude control parameter of the electronic equipment according to the identified image scene so as to control the image shooting unit to shoot the image.
If the electronic device is an unmanned aerial vehicle, the attitude control parameter is at least one of a pitch attitude control parameter, a yaw attitude control parameter, or a roll attitude control parameter. Specifically, as with the attitude control of the pan-tilt head described above, a pitch control motor, a yaw control motor and a roll control motor may be provided on the unmanned aerial vehicle, and these motors are controlled respectively through the pitch, yaw and roll attitude control parameters to adjust the attitude of the unmanned aerial vehicle.
It should be noted that, the specific product form of the electronic device may also be a tracker, and the principle of the attitude control of the electronic device is similar to that of an unmanned aerial vehicle.
Fig. 4 is a schematic flowchart of an image capturing control method according to a fourth embodiment of the present application; as shown in fig. 4, in this embodiment the structural extension component of the image capturing unit is a support assembly on which the image capturing unit is disposed, and the support assembly is in a motion state. The support assembly includes, but is not limited to, a pan-tilt head, and the head is mounted on a moving structure (such as a bicycle). Specifically, in this embodiment, the method includes:
s401, identifying an image scene shot by an image shooting unit;
in this embodiment, the step S401 can be implemented by referring to the steps S111 to S121, or the steps S213 to S233. In fact, in order to improve the accuracy of image scene recognition as much as possible, the image scene recognition may be performed through the steps S111 to S121 and the steps S213 to S233, for example, statistical analysis is performed on the image scenes recognized respectively in the steps S111 to S121 and the steps S213 to S233, for example, euclidean distances are calculated on the image scenes recognized respectively in the steps S111 to S121 and the steps S213 to S233, and the closer the distance, the more accurate the image scene recognition is expressed, and the larger the possibility of the image scene is expressed.
S402, adaptively adjusting and controlling the posture control parameters of the supporting component according to the recognized image scene and the motion state of the supporting component so as to control the image shooting unit to shoot the image.
In this embodiment, a motion sensor, such as a gyroscope or an acceleration sensor, may be disposed on the support assembly, and the motion state of the support assembly is determined from the sensor data. For example, if the pan-tilt head is fixed to a bicycle and the recognized image scene indicates that the bicycle is being ridden, then, to avoid image blur caused by shaking of the image capturing unit, the attitude of the head is stabilized through at least one of the pitch, pan and roll attitude control parameters, keeping the attitude of the image capturing unit as steady as possible so that high-quality images are captured.
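One way such gyroscope-based stabilization might look, as a simplified sketch: counter-rotate each axis against the measured angular rate on the support assembly. The gain, axis names and units are illustrative assumptions, not from the patent.

```python
def stabilizing_correction(gyro_rates, dt, gain=1.0):
    """Given gyroscope angular rates (deg/s per axis) measured on the support
    assembly over a control interval dt (s), return counter-rotation attitude
    corrections (deg) that hold the camera's attitude steady while the
    bicycle (or other carrier) moves."""
    return {axis: -gain * rate * dt for axis, rate in gyro_rates.items()}
```

A real controller would close the loop (e.g. PID on the attitude error) rather than apply a single proportional counter-rotation.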
Similarly, if the electronic device is a drone, the attitude control parameter is at least one of a pitch attitude control parameter, a yaw attitude control parameter, or a roll attitude control parameter. As with the attitude control of the pan-tilt head, a pitch control motor, a yaw control motor and a roll control motor may be provided on the drone and controlled respectively through the pitch, yaw and roll attitude control parameters, thereby adjusting the drone's attitude and capturing high-quality images.
Here, it should be noted that in some application scenarios the optical control parameters and the attitude control parameters may also be adjusted simultaneously, combining the embodiments of figs. 2, 3 and 4.
Fig. 5 is a schematic structural diagram of an image capture control apparatus according to a fifth embodiment of the present application; as shown in fig. 5, it includes:
the first scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the parameter adjusting module is used for adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image.
Optionally, in any embodiment of the present application, the first scene recognition module is further configured to perform local feature extraction on the image captured by the image capturing unit, and quantize the extracted local features into visual words; counting the frequency of the visual words to obtain a visual word distribution histogram; and identifying the image scene shot by the image shooting unit according to the visual word distribution histogram.
Optionally, in this embodiment, the first scene recognition module may specifically include:
the characteristic extraction unit is used for extracting local characteristics of the image shot by the image shooting unit and quantizing the extracted local characteristics into visual words;
and the scene recognition unit counts the frequency of the visual words to obtain a visual word distribution histogram and recognizes the image scene shot by the image shooting unit according to the visual word distribution histogram.
Through the processing of the scene recognition module, each image can be regarded as a word-frequency vector: each image represents a probability distribution over a number of topics, and each topic represents a probability distribution over a number of visual words. Dimensionality reduction of the data is thereby effectively achieved, and the difficulty and complexity of data processing are reduced.
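The bag-of-visual-words pipeline described above can be sketched as follows; the local descriptors and the 32-word codebook are random placeholders for illustration (a real codebook would be clustered from the descriptors of a training set):

```python
import numpy as np

def quantize_to_visual_words(descriptors, codebook):
    """Assign each local descriptor to its nearest codebook entry (visual word)."""
    # Pairwise squared distances between descriptors (N x D) and codebook (K x D).
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)  # index of the nearest visual word per descriptor

def visual_word_histogram(descriptors, codebook):
    """Count visual-word frequencies and normalize into a distribution histogram."""
    words = quantize_to_visual_words(descriptors, codebook)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()  # each image becomes a word-frequency vector

# Illustrative data: 200 local descriptors (e.g., 128-d SIFT-like features)
# and a 32-word codebook.
rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 128))
codebook = rng.normal(size=(32, 128))
hist = visual_word_histogram(descriptors, codebook)
```

The resulting histogram can then be compared against per-scene reference histograms to identify the image scene captured by the image capturing unit.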
Fig. 6 is a schematic structural diagram of an image capture control apparatus according to a sixth embodiment of the present application; as shown in fig. 6, it includes:
the second scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the first parameter adjusting module is used for adaptively adjusting optical control parameters when the image shooting unit shoots the image according to the identified image scene, wherein the shooting control parameters comprise the optical control parameters.
In this embodiment, the second scene recognition module specifically includes:
the characteristic extraction unit is used for extracting color characteristics from the image based on a color histogram of an RGB space and extracting texture characteristics from the image based on a multichannel Gabor filter;
the normalization unit is used for performing normalization processing on the color features and the texture features to obtain combined features;
and the scene recognition unit is used for classifying the combined features by using a vector machine so as to recognize the image scene shot by the image shooting unit.
When extracting color features from an image based on a color histogram of the RGB space, equal-interval quantization may specifically be used, so that each of the R, G, and B channels is divided into four levels according to its range.
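A minimal sketch of this equal-interval quantization, assuming 8-bit channels so that each channel value maps to one of four levels; with a 4-bin histogram per channel this yields a 12-dimensional color feature, consistent with the 12-dimensional color vector mentioned below:

```python
import numpy as np

def color_histogram_12d(image):
    """Equal-interval quantization of an RGB image (uint8): each of R, G, B is
    split into four levels, giving a 4-bin histogram per channel (12-d total)."""
    bins = np.minimum(image // 64, 3)          # 0..255 -> levels 0..3 per channel
    feats = []
    for c in range(3):                         # R, G, B channels
        h = np.bincount(bins[..., c].ravel(), minlength=4).astype(float)
        feats.append(h / h.sum())              # normalize each channel histogram
    return np.concatenate(feats)               # 12-dimensional color feature

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
color_feat = color_histogram_12d(img)
```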
The used Gabor filter can be specifically realized by a Gabor function kernel, and a multi-channel Gabor filter is obtained by configuring multi-scale parameters in the Gabor function.
The color feature and the texture feature are normalized to obtain a combined feature. For example, the combined feature may be a 60-dimensional feature vector, in which the first 12 dimensions are the color feature vector and the remaining 48 dimensions are the texture feature vector, so that the normalization and combination of the color feature and the texture feature are effectively realized.
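The following sketch shows one layout consistent with the 12 + 48 split described above. The exact filter bank (4 scales x 6 orientations, with the mean and standard deviation of each response as statistics) and the unit-norm normalization are assumptions for illustration, since the text does not fix them:

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lam):
    """Real part of a Gabor function kernel at one scale (sigma, lam) and orientation."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_texture_48d(gray):
    """Texture feature from a 4-scale x 6-orientation Gabor bank:
    mean and std of each filter response, 24 filters x 2 stats = 48 dims."""
    feats = []
    for sigma, lam in [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]:  # 4 scales
        for k in range(6):                                               # 6 orientations
            kern = gabor_kernel(9, sigma, k * np.pi / 6, lam)
            # naive 'valid' convolution via sliding windows
            win = np.lib.stride_tricks.sliding_window_view(gray, kern.shape)
            resp = (win * kern).sum(axis=(2, 3))
            feats += [resp.mean(), resp.std()]
    return np.array(feats)

def combined_feature_60d(color_12d, texture_48d):
    """Normalize each part to unit L2 norm and concatenate:
    first 12 dims are the color feature, last 48 dims the texture feature."""
    c = color_12d / (np.linalg.norm(color_12d) + 1e-12)
    t = texture_48d / (np.linalg.norm(texture_48d) + 1e-12)
    return np.concatenate([c, t])              # 60-dimensional combined feature

rng = np.random.default_rng(3)
gray = rng.normal(size=(32, 32))
texture = gabor_texture_48d(gray)
feature = combined_feature_60d(rng.random(12), texture)
```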
When the combined features are classified by a support vector machine, the model can be continuously tuned to obtain an optimal model, and the combined features are then classified by the optimal model, thereby improving the accuracy of scene recognition.
Based on the structure of the second scene recognition module, the combination of the color feature and the texture feature yields a detailed description of the image, so that the image scene can be recognized as accurately as possible.
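As an illustration of SVM-based classification of the combined features, a minimal Pegasos-style linear SVM trained by sub-gradient descent; this is a stand-in for the iteratively tuned model mentioned above (a production system would likely use a full SVM library, possibly with kernels), and the two-cluster training data is synthetic:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style sub-gradient training of a binary linear SVM (labels +/-1)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)                # decaying step size
            if y[i] * X[i].dot(w) < 1:           # margin violated: hinge-loss step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                                # only regularization shrinkage
                w = (1 - eta * lam) * w
    return w

def classify(w, x):
    """Predict the class (+1 or -1) of one combined feature vector."""
    return 1 if x.dot(w) >= 0 else -1

# Illustrative: two separable clusters of 60-d combined features,
# standing in for two image-scene classes.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(+1, 1, (40, 60)), rng.normal(-1, 1, (40, 60))])
y = np.array([1] * 40 + [-1] * 40)
w = train_linear_svm(X, y)
accuracy = np.mean([classify(w, x) == t for x, t in zip(X, y)])
```

For more than two scene classes, one such binary classifier per scene (one-vs-rest) is a common arrangement.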
Fig. 7 is a schematic structural diagram of an image capture control apparatus according to a seventh embodiment of the present application. As shown in fig. 7, when a structural extension component of the image capturing unit serves as a support component, the image capturing unit is disposed on the support component, which includes, but is not limited to, a pan/tilt head. The image capture control device includes:
the third scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the second parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameter of the supporting component or the electronic component according to the identified image scene so as to control the image shooting unit to shoot the image.
In this embodiment, the third scene recognition module may multiplex the first scene recognition module or the second scene recognition module, or include the first scene recognition module and the second scene recognition module, for example, perform statistical analysis on image scenes respectively recognized by the first scene recognition module and the second scene recognition module to obtain a final image scene.
Similar to the second embodiment, an adjustable relationship mapping between image scenes and attitude control parameters may be established in advance, and the attitude control parameters that can be adaptively adjusted in a given image scene are determined through this mapping. Further, whether the attitude control parameters are actually adjusted may be determined by configuration items.
For example, when the product form of the support assembly is a pan/tilt head, the attitude control parameter is at least one of a pitch attitude control parameter, a pan attitude control parameter, or a roll attitude control parameter. Accordingly, the pitch motion of the pan/tilt head is adjusted by the pitch attitude control parameter, the pan motion by the pan attitude control parameter, and the roll motion by the roll attitude control parameter. Because the image capturing unit is disposed on the pan/tilt head, the attitude of the image capturing unit changes with the attitude of the pan/tilt head under the control of the attitude control parameters, so that image capture in a variety of application scenarios can be satisfied.
Specifically, a pitch control motor, a pan control motor, and a roll control motor are configured on the pan/tilt head, and these motors are respectively controlled through the pitch, pan, or roll attitude control parameters, thereby adjusting the attitude of the pan/tilt head.
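The pre-established scene-to-parameter mapping and the per-motor control described above can be sketched as follows; the scene names, angle values, motor identifiers, and the `send_motor_command` callback are all hypothetical placeholders:

```python
from dataclasses import dataclass

@dataclass
class AttitudeParams:
    pitch: float  # degrees, drives the pitch control motor
    pan: float    # degrees, drives the pan control motor
    roll: float   # degrees, drives the roll control motor

# Hypothetical pre-established mapping from a recognized image scene to the
# attitude control parameters that may be adaptively adjusted for it.
SCENE_TO_ATTITUDE = {
    "portrait":  AttitudeParams(pitch=-5.0, pan=0.0, roll=0.0),
    "landscape": AttitudeParams(pitch=10.0, pan=0.0, roll=0.0),
    "cycling":   AttitudeParams(pitch=0.0,  pan=0.0, roll=0.0),  # hold level
}

def apply_attitude(scene, send_motor_command):
    """Look up the scene's attitude parameters and issue one command per motor."""
    params = SCENE_TO_ATTITUDE.get(scene)
    if params is None:
        return False                      # unknown scene: leave attitude unchanged
    send_motor_command("pitch_motor", params.pitch)
    send_motor_command("pan_motor", params.pan)
    send_motor_command("roll_motor", params.roll)
    return True

issued = []
apply_attitude("landscape", lambda motor, angle: issued.append((motor, angle)))
```

Whether a given parameter is actually applied could additionally be gated by the configuration items mentioned above.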
It should be noted here that the pan/tilt head is only an example of the support component. In fact, the support component is meant in a broad sense: it can be virtually any structure capable of supporting the image capturing unit. For example, the image capturing unit may be fixed to a bicycle handlebar or to a helmet, in which case the handlebar or the helmet serves as the support component.
In another embodiment, the image capturing unit may be disposed on an electronic device, for example an unmanned aerial vehicle. The image capturing unit may be a component of the electronic device or an accessory of the electronic device. In this case, adaptively adjusting the shooting control parameters according to the identified image scene to control the image capturing unit comprises: adaptively adjusting the attitude control parameter of the electronic device according to the identified image scene, so as to control the image capturing unit to capture the image.
If the electronic device is an unmanned aerial vehicle, the attitude control parameter is at least one of a pitch attitude control parameter, a yaw attitude control parameter, or a roll attitude control parameter. Specifically, similar to the attitude control of the pan/tilt head described above, a pitch control motor, a yaw control motor, and a roll control motor can be configured on the unmanned aerial vehicle, and these motors are respectively controlled through the above pitch, yaw, or roll attitude control parameters, thereby adjusting the attitude of the unmanned aerial vehicle.
It should be noted that, the specific product form of the electronic device may also be a tracker, and the principle of the attitude control of the electronic device is similar to that of an unmanned aerial vehicle.
Fig. 8 is a schematic structural diagram of an image capture control apparatus according to an eighth embodiment of the present application. As shown in fig. 8, when a structural extension component of the image capturing unit serves as a support component, the image capturing unit is disposed on the support component and the support component is in a moving state; the support component includes, but is not limited to, a pan/tilt head, and the pan/tilt head is disposed on a moving structure (such as a bicycle). Specifically, in the present embodiment, the image capturing control apparatus includes:
the fourth scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the third parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameters of the supporting component or the electronic equipment according to the identified image scene and the motion state of the supporting component so as to control the image shooting unit to shoot the image.
In this embodiment, the fourth scene recognition module may reuse the first scene recognition module or the second scene recognition module, or include both, for example performing statistical analysis on the image scenes respectively recognized by the first and second scene recognition modules to obtain a final image scene. For example, the Euclidean distance between the results respectively identified by the first and second scene recognition modules may be calculated; the closer the distance, the more consistent the two recognition results are, and the more likely the corresponding image scene is.
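The Euclidean-distance fusion of the two recognizers' results can be sketched as follows, assuming each module outputs a per-scene score vector; the scene list and the agreement threshold are illustrative assumptions:

```python
import numpy as np

SCENES = ["portrait", "landscape", "night", "cycling"]

def fuse_scene_results(scores_a, scores_b, agreement_threshold=0.5):
    """Fuse two recognizers' per-scene score vectors. A small Euclidean distance
    between them indicates the two modules agree, so the fused result is trusted."""
    scores_a = np.asarray(scores_a, dtype=float)
    scores_b = np.asarray(scores_b, dtype=float)
    distance = float(np.linalg.norm(scores_a - scores_b))
    fused = (scores_a + scores_b) / 2.0           # average the two score vectors
    scene = SCENES[int(fused.argmax())]           # final image scene
    confident = bool(distance < agreement_threshold)  # closer -> more reliable
    return scene, distance, confident

scene, dist, ok = fuse_scene_results([0.7, 0.1, 0.1, 0.1],
                                     [0.6, 0.2, 0.1, 0.1])
```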
In this embodiment, a motion sensor, such as a gyroscope or an acceleration sensor, may be disposed on the support assembly, and the motion state of the support assembly may be determined from the motion sensor data. For example, if the pan/tilt head is fixed to a bicycle and the recognized image scene indicates that the bicycle is being ridden, then, in order to avoid image blurring caused by shaking of the image capturing unit, the attitude of the pan/tilt head is stabilized by at least one of the pitch attitude control parameter, the pan attitude control parameter, and the roll attitude control parameter, so that the attitude of the image capturing unit is kept as stable as possible and a high-quality image is captured.
Similarly, if the electronic device is an unmanned aerial vehicle, the attitude control parameter is at least one of a pitch attitude control parameter, a yaw attitude control parameter, or a roll attitude control parameter. Specifically, similar to the attitude control of the pan/tilt head described above, a pitch control motor, a yaw control motor, and a roll control motor can be configured on the unmanned aerial vehicle; these motors are respectively controlled by the above pitch, yaw, or roll attitude control parameters, thereby adjusting the attitude of the unmanned aerial vehicle and capturing a high-quality image.
Here, it should be noted that in some application scenarios, the optical control parameter and the attitude control parameter may be adjusted simultaneously in combination with fig. 5, 6, and 7.
FIG. 9 is a schematic structural diagram of an image capturing system according to a ninth embodiment of the present application; as shown in fig. 9, the image capturing system includes: an image capturing unit and an image capturing control device; wherein the image capturing control apparatus includes:
the first scene recognition module is used for recognizing the image scene shot by the image shooting unit according to a preset scene recognition model;
and the parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameter of the supporting component according to the identified image scene so as to control the image shooting unit to shoot the image.
Optionally, in any embodiment of the present application, the first scene recognition module is further configured to perform local feature extraction on the image captured by the image capturing unit, and quantize the extracted local features into visual words; counting the frequency of the visual words to obtain a visual word distribution histogram; and identifying the image scene shot by the image shooting unit according to the visual word distribution histogram.
Optionally, in this embodiment, the first scene recognition module may specifically include:
the characteristic extraction unit is used for extracting local characteristics of the image shot by the image shooting unit and quantizing the extracted local characteristics into visual words;
and the scene recognition unit counts the frequency of the visual words to obtain a visual word distribution histogram and recognizes the image scene shot by the image shooting unit according to the visual word distribution histogram.
Through the processing of the scene recognition module, each image can be regarded as a word-frequency vector: each image represents a probability distribution over a number of topics, and each topic represents a probability distribution over a number of visual words. Dimensionality reduction of the data is thereby effectively achieved, and the difficulty and complexity of data processing are reduced.
Fig. 10 is a schematic structural diagram of an image capturing system according to a tenth embodiment of the present application; as shown in fig. 10, the image capturing system includes: an image capturing unit and an image capturing control device; wherein the image capturing control apparatus includes:
the second scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the first parameter adjusting module is used for adaptively adjusting optical control parameters when the image shooting unit shoots the image according to the identified image scene, wherein the shooting control parameters comprise the optical control parameters.
In this embodiment, the second scene recognition module specifically includes:
the characteristic extraction unit is used for extracting color characteristics from the image based on a color histogram of an RGB space and extracting texture characteristics from the image based on a multichannel Gabor filter;
the normalization unit is used for performing normalization processing on the color features and the texture features to obtain combined features;
and the scene recognition unit is used for classifying the combined features by using a vector machine so as to recognize the image scene shot by the image shooting unit.
When extracting color features from an image based on a color histogram of the RGB space, equal-interval quantization may specifically be used, so that each of the R, G, and B channels is divided into four levels according to its range.
The used Gabor filter can be specifically realized by a Gabor function kernel, and a multi-channel Gabor filter is obtained by configuring multi-scale parameters in the Gabor function.
The color feature and the texture feature are normalized to obtain a combined feature. For example, the combined feature may be a 60-dimensional feature vector, in which the first 12 dimensions are the color feature vector and the remaining 48 dimensions are the texture feature vector, so that the normalization and combination of the color feature and the texture feature are effectively realized.
When the combined features are classified by a support vector machine, the model can be continuously tuned to obtain an optimal model, and the combined features are then classified by the optimal model, thereby improving the accuracy of scene recognition.
Based on the structure of the second scene recognition module, the combination of the color feature and the texture feature yields a detailed description of the image, so that the image scene can be recognized as accurately as possible.
Fig. 11 is a schematic structural diagram of an image capturing system according to an eleventh embodiment of the present application. As shown in fig. 11, when a structural extension component of the image capturing unit serves as a support component, the image capturing unit is disposed on the support component, which includes, but is not limited to, a pan/tilt head. The image capturing system includes: an image capturing unit and an image capturing control device; wherein the image capturing control device includes:
the third scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the second parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameter of the supporting component or the electronic component according to the identified image scene so as to control the image shooting unit to shoot the image.
In this embodiment, the third scene recognition module may multiplex the first scene recognition module or the second scene recognition module, or include the first scene recognition module and the second scene recognition module, for example, perform statistical analysis on image scenes respectively recognized by the first scene recognition module and the second scene recognition module to obtain a final image scene.
Similar to the second embodiment, an adjustable relationship mapping between image scenes and attitude control parameters may be established in advance, and the attitude control parameters that can be adaptively adjusted in a given image scene are determined through this mapping. Further, whether the attitude control parameters are actually adjusted may be determined by configuration items.
For example, when the product form of the support assembly is a pan/tilt head, the attitude control parameter is at least one of a pitch attitude control parameter, a pan attitude control parameter, or a roll attitude control parameter. Accordingly, the pitch motion of the pan/tilt head is adjusted by the pitch attitude control parameter, the pan motion by the pan attitude control parameter, and the roll motion by the roll attitude control parameter. Because the image capturing unit is disposed on the pan/tilt head, the attitude of the image capturing unit changes with the attitude of the pan/tilt head under the control of the attitude control parameters, so that image capture in a variety of application scenarios can be satisfied.
Specifically, a pitch control motor, a pan control motor, and a roll control motor are configured on the pan/tilt head, and these motors are respectively controlled through the pitch, pan, or roll attitude control parameters, thereby adjusting the attitude of the pan/tilt head.
It should be noted here that the pan/tilt head is only an example of the support component. In fact, the support component is meant in a broad sense: it can be virtually any structure capable of supporting the image capturing unit. For example, the image capturing unit may be fixed to a bicycle handlebar or to a helmet, in which case the handlebar or the helmet serves as the support component.
In another embodiment, the image capturing unit may be disposed on an electronic device, for example an unmanned aerial vehicle. The image capturing unit may be a component of the electronic device or an accessory of the electronic device. In this case, adaptively adjusting the shooting control parameters according to the identified image scene to control the image capturing unit comprises: adaptively adjusting the attitude control parameter of the electronic device according to the identified image scene, so as to control the image capturing unit to capture the image.
If the electronic device is an unmanned aerial vehicle, the attitude control parameter is at least one of a pitch attitude control parameter, a yaw attitude control parameter, or a roll attitude control parameter. Specifically, similar to the attitude control of the pan/tilt head described above, a pitch control motor, a yaw control motor, and a roll control motor can be configured on the unmanned aerial vehicle, and these motors are respectively controlled through the above pitch, yaw, or roll attitude control parameters, thereby adjusting the attitude of the unmanned aerial vehicle.
It should be noted that, the specific product form of the electronic device may also be a tracker, and the principle of the attitude control of the electronic device is similar to that of an unmanned aerial vehicle.
Fig. 12 is a schematic structural diagram of an image capturing system according to a twelfth embodiment of the present application. As shown in fig. 12, when a structural extension component of the image capturing unit serves as a support component, the image capturing unit is disposed on the support component and the support component is in a moving state; the support component includes, but is not limited to, a pan/tilt head, and the pan/tilt head is disposed on a moving structure (such as a bicycle). The image capturing system includes: an image capturing unit and an image capturing control device; wherein the image capturing control device includes:
the fourth scene recognition module is used for recognizing the image scene shot by the image shooting unit;
and the third parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameters of the supporting component or the electronic equipment according to the identified image scene and the motion state of the supporting component so as to control the image shooting unit to shoot the image.
In this embodiment, the fourth scene recognition module may reuse the first scene recognition module or the second scene recognition module, or include both, for example performing statistical analysis on the image scenes respectively recognized by the first and second scene recognition modules to obtain a final image scene. For example, the Euclidean distance between the results respectively identified by the first and second scene recognition modules may be calculated; the closer the distance, the more consistent the two recognition results are, and the more likely the corresponding image scene is.
In this embodiment, a motion sensor, such as a gyroscope or an acceleration sensor, may be disposed on the support assembly, and the motion state of the support assembly may be determined from the motion sensor data. For example, if the pan/tilt head is fixed to a bicycle and the recognized image scene indicates that the bicycle is being ridden, then, in order to avoid image blurring caused by shaking of the image capturing unit, the attitude of the pan/tilt head is stabilized by at least one of the pitch attitude control parameter, the pan attitude control parameter, and the roll attitude control parameter, so that the attitude of the image capturing unit is kept as stable as possible and a high-quality image is captured.
Similarly, if the electronic device is an unmanned aerial vehicle, the attitude control parameter is at least one of a pitch attitude control parameter, a yaw attitude control parameter, or a roll attitude control parameter. Specifically, similar to the attitude control of the pan/tilt head described above, a pitch control motor, a yaw control motor, and a roll control motor can be configured on the unmanned aerial vehicle; these motors are respectively controlled by the above pitch, yaw, or roll attitude control parameters, thereby adjusting the attitude of the unmanned aerial vehicle and capturing a high-quality image.
Here, it should be noted that in some application scenarios, the optical control parameter and the attitude control parameter may be adjusted simultaneously in combination with fig. 8, 9, and 10.
In the above embodiments, the image capturing control device may be configured on a controller of the support component (e.g., a pan/tilt head) or of the electronic device (e.g., an unmanned aerial vehicle or a tracker). Of course, according to the requirements of the application scenario, it may actually be configured on any data processing unit capable of implementing the above technical solution.
Thus, particular embodiments of the present subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
In the 1990s, an improvement in a technology could clearly be distinguished as an improvement in hardware (e.g., an improvement in a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement in a method flow). However, as technology advances, many of today's improvements in method flows can be regarded as direct improvements in hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Thus, it cannot be said that an improvement in a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming, without requiring a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, this programming is now mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the source code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functionality can be implemented by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be considered a hardware component, and the means included therein for performing various functions may also be considered structures within the hardware component. Indeed, means for performing the functions may be regarded both as software modules for performing the method and as structures within the hardware component.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (12)

1. An image capturing control method, characterized by comprising:
identifying an image scene shot by an image shooting unit;
adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image;
wherein the identifying the image scene shot by the image shooting unit comprises:
step S111, extracting local features of the image shot by the image shooting unit and quantizing the extracted local features into visual words; step S121, counting the frequency of the visual words to obtain a visual word distribution histogram and identifying the image scene shot by the image shooting unit according to the visual word distribution histogram; wherein step S121 comprises configuring an image scene material library in advance, the image scene material library comprising a plurality of image scene materials and corresponding visual word material distribution histograms, and a classifier recognizes the image scene by comparing the visual word distribution histogram against the visual word material distribution histograms;
step S213, extracting color features from the image based on a color histogram of the RGB space, and extracting texture features from the image based on a multi-channel Gabor filter, wherein the Gabor filter is obtained by configuring multi-scale parameters in a Gabor function; step S223, normalizing the color features and the texture features to obtain a combined feature, wherein the combined feature is a 60-dimensional feature vector, the first 12 dimensions being a color feature vector and the last 48 dimensions being a texture feature vector; step S233, classifying the combined feature by using a vector machine so as to identify the image scene shot by the image shooting unit;
and performing statistical analysis on the image scenes respectively identified in steps S111-S121 and steps S213-S233.
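Steps S111-S121 describe a bag-of-visual-words pipeline: quantize local descriptors to the nearest visual word, count word frequencies into a distribution histogram, and compare it against the material library's histograms. The sketch below is a minimal numpy illustration under assumed inputs — the codebook, descriptors, scene names, and histogram-intersection comparison are hypothetical stand-ins, not values or choices taken from the patent:

```python
import numpy as np

def quantize(descriptors, codebook):
    """Assign each local descriptor to its nearest visual word (cf. step S111)."""
    # Pairwise distances: shape (num_descriptors, num_words).
    d = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1)

def word_histogram(words, vocab_size):
    """Count visual-word frequencies into a normalized distribution histogram (cf. step S121)."""
    h = np.bincount(words, minlength=vocab_size).astype(float)
    return h / max(h.sum(), 1.0)

def recognize_scene(hist, material_hists):
    """Compare the image's histogram against each scene material's histogram;
    histogram intersection is one simple classifier choice (an assumption here)."""
    scores = {scene: np.minimum(hist, mh).sum() for scene, mh in material_hists.items()}
    return max(scores, key=scores.get)

# Hypothetical 4-word codebook in a 2-D descriptor space.
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
descriptors = np.array([[0.1, 0.0], [0.9, 0.1], [0.95, 0.05], [0.1, 0.9]])
hist = word_histogram(quantize(descriptors, codebook), len(codebook))

# Hypothetical pre-configured scene material library (step S121).
library = {
    "landscape": np.array([0.25, 0.5, 0.25, 0.0]),
    "portrait":  np.array([0.7, 0.0, 0.0, 0.3]),
}
print(recognize_scene(hist, library))
```

A real implementation would build the codebook by clustering many training descriptors (e.g. k-means over SIFT-like features); the tiny fixed arrays above only exercise the quantize-count-compare structure.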
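Steps S213-S233 build a 60-dimensional combined feature: 12 color dimensions from an RGB histogram and 48 texture dimensions from a multi-channel Gabor filter bank, normalized and concatenated before classification. A hedged sketch of that feature construction, assuming 4 bins per RGB channel for the 12 color dimensions and a placeholder 48-dimensional texture vector (the patent does not specify the exact binning, scales, or orientations):

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam):
    """2-D Gabor function; varying (sigma, lam) over scales and theta over
    orientations yields a multi-channel filter bank (cf. step S213)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(xr**2 + (-x * np.sin(theta) + y * np.cos(theta))**2)
                  / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def color_histogram_rgb(img, bins_per_channel=4):
    """12-dim color feature: a histogram per RGB channel (binning is assumed)."""
    feats = [np.histogram(img[..., c], bins=bins_per_channel, range=(0, 256))[0]
             for c in range(3)]
    return np.concatenate(feats).astype(float)

def combined_feature(color_vec, texture_vec):
    """Step S223: normalize each part, then concatenate into the 60-dim vector
    (first 12 dims color, last 48 dims texture)."""
    def norm(v):
        n = np.linalg.norm(v)
        return v / n if n else v
    return np.concatenate([norm(color_vec), norm(texture_vec)])

img = np.random.default_rng(0).integers(0, 256, size=(32, 32, 3))
color = color_histogram_rgb(img)  # 12 dims
# Placeholder texture statistics, e.g. 4 scales x 6 orientations x 2 stats = 48.
texture = np.random.default_rng(1).random(48)
feat = combined_feature(color, texture)
print(feat.shape)  # (60,)
```

The combined vector would then go to the claimed vector-machine classifier (step S233); filtering the image with each `gabor_kernel` and summarizing the responses is omitted here for brevity.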
2. The method of claim 1, wherein adaptively adjusting shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises: and adaptively adjusting optical control parameters when the image shooting unit shoots the image according to the identified image scene, wherein the shooting control parameters comprise the optical control parameters.
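Claim 2's optical-parameter adaptation can be pictured as a scene-indexed lookup. The scene names and parameter values below are illustrative assumptions, not values from the patent:

```python
# Hypothetical mapping from a recognized image scene to optical control parameters.
OPTICAL_PRESETS = {
    "night":     {"iso": 1600, "shutter_s": 1 / 15,  "white_balance": "tungsten"},
    "landscape": {"iso": 100,  "shutter_s": 1 / 250, "white_balance": "daylight"},
    "portrait":  {"iso": 200,  "shutter_s": 1 / 125, "white_balance": "auto"},
}

def adjust_optical_parameters(scene, presets=OPTICAL_PRESETS):
    """Adaptively select optical control parameters for the identified scene,
    falling back to a neutral default for unrecognized scenes."""
    return presets.get(scene, {"iso": 400, "shutter_s": 1 / 60, "white_balance": "auto"})

print(adjust_optical_parameters("night")["iso"])  # 1600
```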
3. The method of claim 1, wherein adaptively adjusting shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises: and adaptively adjusting the attitude control parameters of the image shooting unit according to the identified image scene so as to control the image shooting unit to shoot the image, wherein the shooting control parameters comprise the attitude control parameters.
4. The method of claim 3, wherein if the image shooting unit is disposed on a support assembly or an electronic device;
correspondingly, adaptively adjusting shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises: adaptively adjusting attitude control parameters of the support assembly or the electronic device according to the identified image scene, so as to control the image shooting unit to shoot the image.
5. The method according to claim 3, wherein if the image shooting unit is disposed on a support assembly or an electronic device, and the support assembly or the electronic device is in a motion state; correspondingly, adaptively adjusting shooting control parameters according to the identified image scene to control the image shooting unit to shoot the image comprises: adaptively adjusting attitude control parameters of the support assembly or the electronic device according to the identified image scene and the motion state of the support assembly or the electronic device, so as to control the image shooting unit to shoot the image.
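Claims 3-5 adjust attitude control parameters from both the recognized scene and the motion state of the support assembly or electronic device. A hedged sketch under assumed values — the pitch angles and stabilization gains are invented for illustration and do not appear in the patent:

```python
from dataclasses import dataclass

@dataclass
class AttitudeParams:
    pitch_deg: float           # target camera pitch for the support assembly
    stabilization_gain: float  # how aggressively the support corrects shake

def attitude_for(scene: str, in_motion: bool) -> AttitudeParams:
    """Combine the identified image scene with the support assembly's motion
    state (cf. claim 5): raise the stabilization gain while the support moves."""
    base = {
        "landscape": AttitudeParams(pitch_deg=-5.0, stabilization_gain=0.3),
        "portrait":  AttitudeParams(pitch_deg=0.0,  stabilization_gain=0.5),
    }.get(scene, AttitudeParams(pitch_deg=0.0, stabilization_gain=0.4))
    if in_motion:
        base.stabilization_gain = min(1.0, base.stabilization_gain + 0.4)
    return base

print(attitude_for("landscape", in_motion=True))
```

In a gimbal-style support assembly, `stabilization_gain` would feed the attitude control loop; the scene lookup mirrors claim 3 and the `in_motion` adjustment mirrors claim 5.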
6. An image capture control apparatus, comprising:
the first scene recognition module is used for recognizing the image scene shot by the image shooting unit; the first scene recognition module comprises a first feature extraction unit and a first scene recognition unit, wherein the first feature extraction unit is used for extracting local features of the image shot by the image shooting unit and quantizing the extracted local features into visual words; the first scene recognition unit counts the frequency of the visual words to obtain a visual word distribution histogram and recognizes the image scene shot by the image shooting unit according to the visual word distribution histogram; the first scene recognition unit is further used for configuring an image scene material library in advance, the image scene material library comprising a plurality of image scene materials and corresponding visual word material distribution histograms, and a classifier recognizes the image scene by comparing the visual word distribution histogram against the visual word material distribution histograms;
the second scene recognition module is used for recognizing the image scene shot by the image shooting unit; the second scene recognition module comprises a second feature extraction unit, a normalization unit and a second scene recognition unit, wherein the second feature extraction unit is used for extracting color features from the image based on a color histogram of the RGB space and extracting texture features from the image based on a multi-channel Gabor filter, the Gabor filter being obtained by configuring multi-scale parameters in a Gabor function; the normalization unit is used for normalizing the color features and the texture features to obtain a combined feature, wherein the combined feature is a 60-dimensional feature vector, the first 12 dimensions being a color feature vector and the last 48 dimensions being a texture feature vector; and the second scene recognition unit is used for classifying the combined feature by using a vector machine so as to recognize the image scene shot by the image shooting unit;
the fourth scene recognition module is used for recognizing the image scenes shot by the image shooting unit and further used for performing statistical analysis on the image scenes respectively recognized by the first scene recognition module and the second scene recognition module;
and the parameter adjusting module is used for adaptively adjusting shooting control parameters according to the identified image scene so as to control the image shooting unit to shoot the image.
7. The apparatus of claim 6, wherein the parameter adjusting module is further configured to adaptively adjust an optical control parameter when the image capturing unit captures the image according to the identified image scene, and the capture control parameter comprises the optical control parameter.
8. The apparatus of claim 6, wherein the parameter adjusting module is further configured to adaptively adjust an attitude control parameter of the image capturing unit according to the identified image scene to control the image capturing unit to capture the image, and the capture control parameter comprises the attitude control parameter.
9. The apparatus of claim 8, wherein if the image capturing unit is disposed on a support assembly or an electronic device;
correspondingly, the parameter adjusting module is further configured to adaptively adjust the attitude control parameters of the support assembly or the electronic device according to the identified image scene, so as to control the image capturing unit to capture the image.
10. The apparatus according to claim 8, wherein if the image capturing unit is disposed on a supporting component or an electronic device, the supporting component or the electronic device is in a motion state; correspondingly, the parameter adjusting module is further configured to adaptively adjust and control an attitude control parameter of the support component or the electronic device according to the identified image scene and the motion state of the support component or the electronic device, so as to control the image capturing unit to capture an image.
11. An image capture system, comprising: the device comprises an image shooting unit, an image shooting control device and a supporting component or electronic equipment, wherein the image shooting unit is arranged on the supporting component or the electronic equipment; the image capture control apparatus includes:
the first scene recognition module is used for recognizing the image scene shot by the image shooting unit; the first scene recognition module comprises a first feature extraction unit and a first scene recognition unit, wherein the first feature extraction unit is used for extracting local features of the image shot by the image shooting unit and quantizing the extracted local features into visual words; the first scene recognition unit counts the frequency of the visual words to obtain a visual word distribution histogram and recognizes the image scene shot by the image shooting unit according to the visual word distribution histogram; the first scene recognition unit is further used for configuring an image scene material library in advance, the image scene material library comprising a plurality of image scene materials and corresponding visual word material distribution histograms, and a classifier recognizes the image scene by comparing the visual word distribution histogram against the visual word material distribution histograms;
the second scene recognition module is used for recognizing the image scene shot by the image shooting unit; the second scene recognition module comprises a second feature extraction unit, a normalization unit and a second scene recognition unit, wherein the second feature extraction unit is used for extracting color features from the image based on a color histogram of the RGB space and extracting texture features from the image based on a multi-channel Gabor filter, the Gabor filter being obtained by configuring multi-scale parameters in a Gabor function; the normalization unit is used for normalizing the color features and the texture features to obtain a combined feature, wherein the combined feature is a 60-dimensional feature vector, the first 12 dimensions being a color feature vector and the last 48 dimensions being a texture feature vector; and the second scene recognition unit is used for classifying the combined feature by using a vector machine so as to recognize the image scene shot by the image shooting unit;
the fourth scene recognition module is used for recognizing the image scenes shot by the image shooting unit and further used for performing statistical analysis on the image scenes respectively recognized by the first scene recognition module and the second scene recognition module;
and the parameter adjusting module is used for adaptively adjusting and controlling the attitude control parameters of the supporting component or the electronic equipment according to the identified image scene so as to control the image shooting unit to shoot the image.
12. The system of claim 11, wherein the support assembly or the electronic device is in motion; correspondingly, the parameter adjusting module is further configured to adaptively adjust and control an attitude control parameter of the support component or the electronic device according to the identified image scene and the motion state of the support component or the electronic device, so as to control the image capturing unit to capture an image.
CN201910542009.8A 2019-06-21 2019-06-21 Image shooting control method and device and image shooting system Active CN110233968B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910542009.8A CN110233968B (en) 2019-06-21 2019-06-21 Image shooting control method and device and image shooting system


Publications (2)

Publication Number Publication Date
CN110233968A CN110233968A (en) 2019-09-13
CN110233968B true CN110233968B (en) 2021-04-06

Family

ID=67856923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910542009.8A Active CN110233968B (en) 2019-06-21 2019-06-21 Image shooting control method and device and image shooting system

Country Status (1)

Country Link
CN (1) CN110233968B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104216949A (en) * 2014-08-13 2014-12-17 中国科学院计算技术研究所 Method and system for expressing clustering of image features by fusion of space information
CN106357983A (en) * 2016-11-15 2017-01-25 上海传英信息技术有限公司 Photographing parameter adjustment method and user terminal
CN107888828A (en) * 2017-11-22 2018-04-06 网易(杭州)网络有限公司 Space-location method and device, electronic equipment and storage medium
CN108093174A (en) * 2017-12-15 2018-05-29 北京臻迪科技股份有限公司 Patterning process, device and the photographing device of photographing device
CN108289169A (en) * 2018-01-09 2018-07-17 北京小米移动软件有限公司 Image pickup method, device, electronic equipment and storage medium
CN108475075A (en) * 2017-05-25 2018-08-31 深圳市大疆创新科技有限公司 A kind of control method, device and holder
CN108710847A (en) * 2018-05-15 2018-10-26 北京旷视科技有限公司 Scene recognition method, device and electronic equipment
CN109241820A (en) * 2018-07-10 2019-01-18 北京二郎神科技有限公司 The autonomous image pickup method of unmanned plane based on space exploration

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011223296A (en) * 2010-04-09 2011-11-04 Sony Corp Imaging control apparatus and imaging control method



Similar Documents

Publication Publication Date Title
US11093754B2 (en) Method, system and apparatus for selecting frames of a video sequence
US11151384B2 (en) Method and apparatus for obtaining vehicle loss assessment image, server and terminal device
US11847826B2 (en) System and method for providing dominant scene classification by semantic segmentation
US10924677B2 (en) Electronic device and method for providing notification related to image displayed through display and image stored in memory based on image analysis
CN109348089B (en) Night scene image processing method and device, electronic equipment and storage medium
US20200058075A1 (en) Method and apparatus for obtaining vehicle loss assessment image, server and terminal device
US10262397B2 (en) Image de-noising using an equalized gradient space
CN110929805B (en) Training method, target detection method and device for neural network, circuit and medium
CN110139169B (en) Video stream quality evaluation method and device and video shooting system
US10657657B2 (en) Method, system and apparatus for detecting a change in angular position of a camera
WO2016165060A1 (en) Skin detection based on online discriminative modeling
US11417003B2 (en) Method and apparatus for tracking eyes of user and method of generating inverse-transform image
CN111598065B (en) Depth image acquisition method, living body identification method, apparatus, circuit, and medium
CN109618102B (en) Focusing processing method and device, electronic equipment and storage medium
CN112889068A (en) Neural network object recognition for image processing
US20240029285A1 (en) Adaptive face depth image generation
CN110233968B (en) Image shooting control method and device and image shooting system
CN111526341A (en) Monitoring camera
WO2023001107A1 (en) Photographic image processing method and device
US10885346B2 (en) Method, system and apparatus for selecting frames of a video sequence
CN109919851A (en) A kind of flating obscures removing method and device
US20230040122A1 (en) Electronic device and method for supporting deblurring of image data
US11743450B1 (en) Quick RGB-IR calibration verification for a mass production process
US20230269479A1 (en) Image pickup apparatus that performs automatic shooting, control method therefor, and storage medium
AU2023202005B1 (en) Image rotation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant