CN108696697B - Camera device control method and device - Google Patents


Info

Publication number
CN108696697B
Authority
CN
China
Prior art keywords
data
video
camera device
camera
video acquisition
Prior art date
Legal status
Active
Application number
CN201810785265.5A
Other languages
Chinese (zh)
Other versions
CN108696697A
Inventor
常树磊
郑旭东
王月
满春刚
李祥熙
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810785265.5A
Publication of CN108696697A
Application granted
Publication of CN108696697B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 - Server components or server architectures
    • H04N21/218 - Source of audio or video content, e.g. local disk arrays
    • H04N21/21805 - Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the application disclose a control method, device and system for a camera device. Before video acquisition is performed on a video acquisition scene according to a video acquisition task, machine position data can be planned in advance, for that task, for the camera devices at different positions in the video acquisition scene. When the scene is captured according to the task, the machine position data corresponding to a camera device can be determined from the position of that camera device in the scene, and the displacement information included in the machine position data can indicate how the camera device should reasonably move within the scene to satisfy the video acquisition task while executing it. By sending the corresponding machine position data to the camera device, the camera device can move automatically under the indication of the machine position data and acquire, at the indicated positions, the video data required by the video acquisition task, thereby eliminating the influence of personal experience.

Description

Camera device control method and device
Technical Field
The present application relates to the field of data processing, and in particular, to a method and an apparatus for controlling an image capturing apparatus.
Background
Video playing modes such as live broadcasting and recorded broadcasting require videos to be acquired, where a video can be generated from video data collected by cameras in a video acquisition scene. At present, video data of a captured object is collected by camera operators carrying shoulder-mounted or handheld cameras.
The angle and position from which video data of the captured object is collected mainly depend on the experience of the camera operators. In a video acquisition scene that requires multiple cameras, the coordination among the cameras is likewise accomplished through the operators' experience.
Because the collection of video data currently depends too heavily on the human experience of the camera operators, the collected video data may fail to meet the collection requirement or the playing requirement.
Disclosure of Invention
In order to solve the above technical problem, the present application provides a control method, device and system for a camera device, which not only eliminate the influence of personal experience but also enable video data meeting the requirements of a video acquisition task to be acquired through the camera device.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an embodiment of the present application provides an imaging apparatus control method, including:
the data processing equipment determines the machine position data of the camera device for the video acquisition task according to the position of the camera device in a video acquisition scene; the machine position data comprises displacement information of the camera device in the video acquisition scene;
and the data processing equipment sends the machine position data to the camera device so that the camera device can collect the video data required by the video collection task in the video collection scene according to the machine position data.
In a second aspect, an embodiment of the present application provides an image pickup apparatus control apparatus, including a first determination unit and a first transmission unit:
the first determining unit is used for determining the machine position data of the camera device for the video acquisition task according to the position of the camera device in the video acquisition scene; the machine position data comprises displacement information of the camera device in the video acquisition scene;
the first sending unit is used for sending the machine position data to the camera device so that the camera device can collect the video data required by the video collection task in the video collection scene according to the machine position data.
In a third aspect, an embodiment of the present application provides an image pickup apparatus control apparatus, including a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the image pickup apparatus control method according to any one of the first aspect according to an instruction in the program code.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium for storing program code for executing the image pickup apparatus control method according to any one of the first aspect.
According to the above technical solutions, before video acquisition is performed on a video acquisition scene according to a video acquisition task, the data processing device can plan machine position data in advance, for that task, for the camera devices at different positions in the video acquisition scene. When video acquisition is performed on the scene according to the task, the data processing device can determine the machine position data corresponding to a camera device according to the position of that camera device in the scene, and the displacement information included in the machine position data can indicate how the camera device should reasonably move within the scene to satisfy the video acquisition task while executing it. Therefore, by sending the corresponding machine position data to the camera device, the camera device can move automatically under the indication of the machine position data and acquire, at the indicated positions, the video data required by the video acquisition task. It can be seen that, when video data is acquired, the position at which the camera device acquires video data no longer needs to be determined or adjusted through the personal experience of camera operators; instead, displacement control of the camera device during video data acquisition can be completed through the determined machine position data. This not only eliminates the influence of personal experience but also allows video data meeting the requirements of the video acquisition task to be obtained through the camera device.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is an exemplary diagram of an application scenario of a control method of an image capturing apparatus according to an embodiment of the present application;
fig. 2 is a flowchart of a method for controlling an image capturing apparatus according to an embodiment of the present disclosure;
fig. 3 is a flowchart of a method for adjusting a camera parameter according to an embodiment of the present disclosure;
fig. 4 is an exemplary diagram of an application scenario of a control method of an image capturing apparatus according to an embodiment of the present application;
fig. 5 is an architecture diagram of an application system of a control method for a camera device according to an embodiment of the present disclosure;
fig. 6 is a flowchart of a method for controlling an image capturing apparatus according to an embodiment of the present application;
fig. 7a is a structural diagram of a control device of an image pickup apparatus according to an embodiment of the present application;
fig. 7b is a structural diagram of a control device of an image capturing apparatus according to an embodiment of the present application;
fig. 7c is a structural diagram of a control device of an image capturing apparatus according to an embodiment of the present application;
fig. 7d is a structural diagram of a control device of an image capturing apparatus according to an embodiment of the present application;
fig. 7e is a structural diagram of a control device of an image capturing apparatus according to an embodiment of the present application;
fig. 8 is a structural diagram of an image pickup apparatus control device according to an embodiment of the present application;
fig. 9 is a structural diagram of an imaging device control apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the traditional video data collection process, determining from which angle and position each camera collects video data of the captured object, and determining how the cameras cooperate with one another, both rely on the personal experience of the camera operators. This excessive reliance on personal experience means that the collected video data may fail to meet the collection requirement or the playing requirement.
Therefore, the embodiment of the present application provides a camera device control method, which can plan the machine position data of the camera device at each position in the video capture scene in advance, and the displacement information included in the machine position data can indicate how to reasonably move the camera device in the video capture scene to meet the video capture task when the camera device executes the video capture task. Therefore, when video acquisition is carried out on a video acquisition scene according to the video acquisition task, displacement control of the camera device in the video data acquisition process can be completed through the determined machine position data, and therefore video data required by the video acquisition task is acquired.
The image pickup apparatus control method can be applied to the system shown in fig. 1, which may include a camera device 102 and a camera device coordination control system 103. At least one camera device 102 may be arranged in the video capture scene 101 for a video capture task. Each camera device 102 may include a camera and a control component, and the control component may be configured to control the camera to move according to the indication of the machine position data, so that the camera device 102 acquires the video data required by the video capture task in the video capture scene according to the machine position data. The camera device coordination control system 103 may be configured in a terminal device, which may include a mobile phone, a tablet computer, a personal digital assistant (PDA), a point-of-sale (POS) terminal, a vehicle-mounted computer, and the like; the camera device coordination control system 103 may also be deployed in a server.
Generally, for one video capture task, before video capture, the machine position data of the cameras at different positions in the video capture scene can be planned in advance and stored in the camera coordination control system 103, the storage condition of the machine position data in the camera coordination control system 103 can be as shown by 104 in fig. 1, and in 104, the cameras 102 at different positions have corresponding machine position data based on the different positions of the cameras 102 in the video capture scene 101.
In this way, when a certain camera device 102 needs to be controlled, the coordination control system 103 may acquire the position of the camera device 102 in the video capture scene 101, and determine the machine position data of the camera device 102 for the video capture task according to the position of the camera device 102 in the video capture scene 101, for example, if the position of the camera device 102 in the video capture scene 101 is position 1, then the machine position data of the camera device 102 in position 1 for the video capture task may be determined according to 104. The coordination control system 103 may send the determined machine position data to the camera 102, so that when performing the video capture task, the camera 102 may move by itself under the instruction of the machine position data and capture the video data required by the video capture task at the instructed position.
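Purely for illustration, the following is a minimal sketch (in Python) of how such a store of pre-planned machine position data, keyed by position in the video capture scene, and the determine-and-send step could be organized. The class and field names, the segment format and the send_to_camera stub are assumptions made for the example and are not prescribed by this application.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class MachinePositionData:
    """Pre-planned displacement information for one camera position."""
    # (start_time_s, end_time_s, (x, y)) segments: where the camera should be and when
    segments: List[Tuple[float, float, Tuple[float, float]]] = field(default_factory=list)


class CoordinationControlSystem:
    """Holds machine position data planned before capture, keyed by position (cf. 104 in fig. 1)."""

    def __init__(self) -> None:
        self.planned: Dict[str, MachinePositionData] = {}

    def plan(self, position: str, data: MachinePositionData) -> None:
        self.planned[position] = data

    def determine_and_send(self, camera_position: str) -> MachinePositionData:
        # S201: determine the machine position data from the camera device's position
        data = self.planned[camera_position]
        # S202: send it to the camera device (the transmission system is abstracted away here)
        self.send_to_camera(camera_position, data)
        return data

    def send_to_camera(self, camera_position: str, data: MachinePositionData) -> None:
        # Stand-in for the wired or wireless transmission system
        print(f"sending to camera at {camera_position}: {data.segments}")


if __name__ == "__main__":
    system = CoordinationControlSystem()
    system.plan("position 1", MachinePositionData([(0.0, 10.0, (0.0, 0.0)),
                                                   (10.0, 20.0, (3.0, 1.5))]))
    system.determine_and_send("position 1")
```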
In the embodiment of the application, the video data corresponding to the video that is expected to be played to the audience can be acquired by executing the video acquisition task, and the video acquisition task may refer to a task of acquiring video data executed in a live broadcast process or a task of acquiring video data executed in a recording process.
A video capture scene may refer to a scene that is encompassed by a video capture task. For example, when some programs are live, the video capture task is to capture a video hosted by a host in a studio, which can be used as a video capture scene.
It should be noted that there may be one or more video capture scenes. If the video acquisition task only comprises one scene, the video acquisition scene can be one scene, for example, the video acquisition task only acquires the video hosted by the host in the studio a, the scene included in the video acquisition task is the studio a, and at this time, the video acquisition scene can be the studio a; if the video collection task includes multiple scenes, the multiple video collection scenes may be provided, for example, in an integrated art program, the scenes are frequently changed, if the program is started outdoors and some game links in the program need to be performed in a stadium, the video collection task is to collect outdoor starting videos and collect game link videos in the stadium, the scenes included in the video collection task are outdoors and the stadium, and at this time, the video collection scenes may be outdoors and the stadium.
The camera device may be a device for acquiring video data, where the video data may include dynamic video data and still picture data, and the video data is mainly taken as the dynamic video data in the embodiment of the present application.
In the process of performing video acquisition for a video acquisition task, the position of a camera device in the video acquisition scene may change over time; for example, a certain camera device is at position A from t1 to t2 and at position B from t2 to t3. The machine position data mentioned in the embodiments of the application can reflect the positions of the camera device in the video acquisition scene at different moments of the video acquisition performed for the video acquisition task; for example, the machine position data can identify that the camera device is at position A from t1 to t2 and moves to position B at t2.
It should be noted that the machine position data mentioned in the embodiments of the present application may be the complete machine position data corresponding to a camera device at a certain position over the whole process of video acquisition for the video acquisition task, so that the camera device can move automatically under the indication of the complete machine position data throughout the acquisition. Of course, in some cases the whole acquisition process may be too long and require a break in the middle, or multiple scenes may need to be changed during the acquisition, so that the whole process is interrupted. During a break or a scene change the camera device does not need to be controlled; to ensure that the camera device can remain at rest during such periods, the machine position data may be segmented, so that the camera device moves by itself under the indication of the machine position data corresponding to the acquisition processes in the different time periods.
The machine position data can comprise displacement information, and the displacement information can indicate the camera device to reasonably move in a video acquisition scene to meet the video acquisition task when the camera device executes the video acquisition task. During the video capture process, the position and/or angle of the camera in the video capture scene may change with time, and in order to indicate a reasonable moving position and/or angle of the camera in the video capture scene, the displacement information may include angular displacement information and/or position displacement information.
The position displacement information can represent the displacement of the camera device from the position of a certain moment to the position of the next moment, and the camera device can be indicated to be adjusted to the corresponding position at a certain moment according to the position displacement information, so that the camera device can reasonably move in a video acquisition scene to meet the video acquisition task.
For example, the machine position data identifies that the camera device is at position A from t1 to t2 and moves to position B from t2 to t3. From the machine position data it can be seen that between t1 and t2 the camera device is always at position A, so the position displacement information between t1 and t2 may be 0; between t2 and t3 the camera device is always at position B, i.e. at t2 the camera device needs to move from the previous position A to position B, in which case the position displacement information may be the displacement from position A to position B.
The angular displacement information may represent the change between the angle of the camera device at a certain moment and its angle at the next moment, and the camera device may be instructed to adjust to the corresponding angle at that moment according to the angular displacement information.
For example, the camera device stays at position A throughout the video acquisition process, but its angle may differ at different moments: from t1 to t2 the angle is 15 degrees upwards, and from t2 to t3 the angle is 15 degrees to the left. Since the displacement information includes angular displacement information, the camera device can be controlled, according to the displacement information, to adjust at t2 from 15 degrees up to 15 degrees left.
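As an illustrative aside, the displacement information described above can be thought of as the differences between consecutive entries of a timed schedule of positions and angles. The sketch below, written under that assumption, derives the position displacement and angular displacement at each segment boundary; the two-dimensional position and single-axis angle model is a simplification chosen for the example, not a format specified by the application.

```python
from typing import List, Tuple

# One schedule entry: (start_time_s, end_time_s, (x, y) position, angle_degrees)
Segment = Tuple[float, float, Tuple[float, float], float]


def displacements(schedule: List[Segment]) -> List[Tuple[float, Tuple[float, float], float]]:
    """Return (switch_time, position displacement, angular displacement) at each segment boundary."""
    result = []
    for prev, cur in zip(schedule, schedule[1:]):
        _, t_switch, (px, py), prev_angle = prev
        _, _, (cx, cy), cur_angle = cur
        # At t_switch the camera must move from the previous position/angle to the next one
        result.append((t_switch, (cx - px, cy - py), cur_angle - prev_angle))
    return result


if __name__ == "__main__":
    # Stays at A (0, 0) from t1 to t2 tilted 15 degrees up, then moves to B (4, 2) and levels out
    schedule = [(0.0, 10.0, (0.0, 0.0), 15.0),
                (10.0, 20.0, (4.0, 2.0), 0.0)]
    print(displacements(schedule))  # [(10.0, (4.0, 2.0), -15.0)]
```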
It should be noted that, in the embodiment of the present application, if a video capture scene includes one camera device, then displacement control of the camera device in the video data capture process may be completed through the determined machine position data of the camera device; if the video acquisition scene comprises a plurality of camera devices, then, for each camera device in the video acquisition scene, corresponding machine position data can be determined according to the position of the camera device in the video acquisition scene, each camera device can be independently controlled, and a plurality of camera devices can be simultaneously controlled.
Since the method of independently controlling each camera device is basically the same, simultaneous control of multiple camera devices is generally realized by combining the independent control of each device. In the embodiments of the present application, for convenience of description, the control of one camera device is taken as an example to describe the camera device control method.
Next, a method for controlling an imaging apparatus according to an embodiment of the present application will be described with reference to the drawings.
Referring to fig. 2, fig. 2 shows a flowchart of an image pickup apparatus control method, and the embodiment corresponding to fig. 2 and subsequent embodiments may be executed by a data processing apparatus, such as the image pickup apparatus coordination control system 103 shown in fig. 1, the method including:
s201, the data processing equipment determines the machine position data of the camera device for the video acquisition task according to the position of the camera device in the video acquisition scene.
The machine position data can include displacement information of the camera device in a video capture scene.
For a certain video acquisition task, before video acquisition, machine position data can be planned for the camera devices at different positions in the video acquisition scene according to the content that the video acquisition task needs to present, and the machine position data are stored in the camera device coordination control system. If a certain camera device needs to be controlled, the camera device coordination control system can acquire the position of the camera device in the video acquisition scene, and can therefore determine the machine position data of the camera device for the video acquisition task according to that position.
For example, the video capture scene includes three cameras A, B and C, the three cameras are located at positions 1, 2 and 3 in the video capture scene, respectively, the machine position data 1 of the camera a located at the position 1, the machine position data 2 of the camera B located at the position 2, and the machine position data 3 of the camera C located at the position 3 are planned in advance, and if it is necessary to control the camera a located at the position 1, the camera coordination control system can determine that the machine position data of the camera a is the machine position data 1 according to the position 1 of the camera a in the video capture scene.
S202, the data processing equipment sends the machine position data to the camera device, so that the camera device can collect video data required by the video collection task in the video collection scene according to the machine position data.
After the machine position data corresponding to the camera device is determined, the camera device coordination control system can send the determined machine position data to the camera device, and a control assembly in the camera device can control the camera to move to a corresponding position in a video acquisition scene according to the machine position data, so that video data required by a video acquisition task are acquired.
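For illustration only, a control component receiving such machine position data could step through its timed segments and drive the camera to each indicated position, along the lines of the sketch below. The segment format and the move_camera_to stub are assumptions standing in for the real actuator interface of the camera device.

```python
import time
from typing import List, Tuple

# (start_time_s, end_time_s, (x, y)) relative to the start of the capture task
Segment = Tuple[float, float, Tuple[float, float]]


def move_camera_to(position: Tuple[float, float]) -> None:
    # Stand-in for the actuator interface of the real control component
    print(f"moving camera to {position}")


def execute_machine_position_data(segments: List[Segment], task_start: float) -> None:
    """Move the camera to each indicated position when its segment becomes due."""
    for seg_start, _seg_end, position in segments:
        delay = task_start + seg_start - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # wait until this segment of the capture task begins
        move_camera_to(position)


if __name__ == "__main__":
    execute_machine_position_data([(0.0, 1.0, (0.0, 0.0)), (1.0, 2.0, (4.0, 2.0))],
                                  task_start=time.monotonic())
```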
One implementation manner of sending the camera position data to the camera device by the camera device coordination control system may be: and the camera device coordination control system transmits the position data to the camera device through the transmission system. The transmission system may be a wired transmission system or a wireless transmission system.
When the transmission system is a wired transmission system, the transmission system may include a camera device coordination control system interface, a camera device interface, and a transmission line, and both ends of the transmission line may be connected to the coordination control system interface and the camera device interface, respectively, so as to realize information transmission between the camera device coordination control system and the camera device.
When the transmission system is a wireless transmission system, the structure of the transmission system may include an antenna of the camera device coordination control system and an antenna of the camera device, and the antenna of the camera device coordination control system and the antenna of the camera device are respectively used as a receiver and a transmitter, so as to realize information transmission between the camera device coordination control system and the camera device.
According to the above technical solution, before video acquisition is performed on a video acquisition scene according to a video acquisition task, machine position data can be planned in advance, for that task, for the camera devices at different positions in the video acquisition scene. When video acquisition is performed according to the task, the data processing device can determine the machine position data corresponding to a camera device according to the position of that camera device in the scene, and the displacement information included in the machine position data can indicate how the camera device should reasonably move within the scene to satisfy the video acquisition task while executing it. Therefore, the data processing device can send the corresponding machine position data to the camera device, the camera device can move automatically under the indication of the machine position data, and the video data required by the video acquisition task is acquired at the indicated positions. It can be seen that, when video data is acquired, the position at which the camera device acquires video data no longer needs to be determined or adjusted through the personal experience of camera operators; instead, displacement control of the camera device during video data acquisition can be completed through the determined machine position data, which not only eliminates the influence of personal experience but also allows video data meeting the requirements of the video acquisition task to be obtained through the camera device.
It should be noted that, in the embodiment corresponding to fig. 2, machine position data is planned in advance for the camera devices at different positions in the video capture scene, and the machine position data corresponding to a camera device is sent according to its position, so that the camera device can complete the video capture task under the indication of the machine position data. However, in some cases, as video acquisition proceeds, the machine position data required by the acquisition task may change at a certain moment due to an emergency; in such a case, in order to further meet the requirements of the video acquisition task, the machine position data may be updated. Alternatively, during the acquisition of video data, if the machine position data is adjusted according to the video data already acquired, the acquired video data can better meet the requirements of the video acquisition task.
For example, for a certain video acquisition task, the camera device has already obtained the machine position data and starts to acquire video data under its indication, and a video can be generated from that video data. Based on the video, the director may find that adjusting the angle of the camera device at a certain moment would produce a better video effect and better satisfy the requirements of the video acquisition task.
In this case, the present embodiment provides an image pickup apparatus control method, that is, after S202, the method further includes: updating the machine position data and sending the updated machine position data to the camera device. The machine position data may be updated by re-planning the machine position data of the camera device and replacing, in the camera device coordination control system, the machine position data after the moment of the change with the newly planned machine position data.
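Assuming the segment-based representation sketched earlier, updating the machine position data could amount to keeping the segments before the moment of change and appending the newly planned ones, as in the following illustrative snippet; the function name and data shapes are assumptions, not the application's specified procedure.

```python
from typing import List, Tuple

Segment = Tuple[float, float, Tuple[float, float]]  # (start_time_s, end_time_s, (x, y))


def update_machine_position_data(current: List[Segment],
                                 change_time: float,
                                 replanned: List[Segment]) -> List[Segment]:
    """Keep the segments that end before the change and append the newly planned ones."""
    kept = [seg for seg in current if seg[1] <= change_time]
    return kept + replanned


if __name__ == "__main__":
    current = [(0.0, 10.0, (0.0, 0.0)), (10.0, 20.0, (4.0, 2.0))]
    updated = update_machine_position_data(current, 10.0, [(10.0, 20.0, (6.0, 1.0))])
    print(updated)  # the segment after the change now targets (6.0, 1.0)
```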
The control method of the image capturing apparatus described in the embodiment corresponding to fig. 2 is mainly implemented based on the machine-position data, and in the embodiment corresponding to fig. 3, the control method of the image capturing apparatus based on the image capturing parameters will be described.
In the process of video acquisition for the video acquisition task, the camera device can acquire the video data required by the task according to the indication of the machine position data, but different acquisition positions may correspond to different environmental information. During video acquisition, the imaging parameters of the camera device need to match the environmental information in order to ensure that the acquired video data meets the requirements of the video acquisition task. Once the imaging parameters do not match the environmental information, the acquired video data may fail to meet those requirements and would need to be processed afterwards to obtain video data that does.
Therefore, in order to ensure that the acquired video data meet the requirements of a video acquisition task, avoid subsequent operations such as processing the acquired video data and the like, and improve the acquisition efficiency of the video data, when the camera device is controlled, the camera device not only needs to be controlled to move according to the machine position data, but also needs to be controlled to adjust the camera parameters, so that the camera device can acquire the video according to the machine position data and with the appropriate camera parameters.
Referring to fig. 3, fig. 3 shows a method for adjusting imaging parameters, the method comprising:
s301, the data processing equipment acquires environmental information of the position acquired by the camera device.
The acquisition position may be a position where an acquisition object to be acquired by the camera device is located in a video acquisition scene, and the acquisition object may be an object to be included in video data required by a video acquisition task, and the acquisition object may be a person or an object.
For example, as shown in fig. 4, for example, the camera device 102 includes one camera device, and if the object to be captured by the camera device 102 is shown as 105 in fig. 4, the capture position is the position where the object 105 is located in the video capture scene.
The environment information may refer to information that may affect video quality when the image pickup device performs video pickup, for example, illuminance, a distance between a pickup position and the image pickup device, and the like.
And S302, the data processing equipment determines corresponding image pickup parameters according to the environment information.
The camera parameters may be parameters used by the camera device when performing video acquisition for a video acquisition task. The imaging parameters may include aperture parameters, focal parameters, and the like.
The environmental information of the acquisition position can influence the imaging parameters of the camera device: different acquisition positions may correspond to different environmental information. When the imaging parameters of the camera device suit the environmental information of the acquisition position, the camera device can acquire video data of high video quality. Therefore, to ensure video quality, the imaging parameters adopted by the camera device may differ for different environmental information, and after the environmental information is determined, the imaging parameters corresponding to it need to be determined according to that environmental information.
Before video acquisition is carried out, the environmental information in a video acquisition scene can be determined, and the corresponding relation between the environmental information of each position in the video acquisition scene and the shooting parameters can be planned in advance according to the environmental information in the video acquisition scene, so that the implementation mode of determining the corresponding shooting parameters can be as follows: and determining the camera shooting parameters corresponding to the environment information according to the environment information and the corresponding relation.
Taking illuminance as the environmental information, if the acquisition position is position A, the illuminance at position A may be detected by a detection device. After the camera device coordination control system obtains the illuminance at position A from the detection device, imaging parameters such as the aperture may be determined according to the illuminance and the correspondence. If the illuminance value is small, that is, the light at acquisition position A is dim, then in order to acquire video data corresponding to a clear video, the aperture determined according to the illuminance value may be a large aperture, for example an aperture parameter of 2; if the illuminance value is large, that is, the light at acquisition position A is strong, the aperture determined according to the illuminance value may be a small aperture, for example an aperture parameter of 7.
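By way of example only, the correspondence between environmental information and imaging parameters could be expressed as a small lookup table from illuminance to an aperture parameter, as sketched below. The lux thresholds and aperture values are illustrative assumptions; only the general direction (dim light mapped to a large aperture, strong light to a small one) follows the text above.

```python
def aperture_from_illuminance(lux: float) -> float:
    """Darker acquisition positions get a larger aperture (smaller f-number)."""
    table = [
        (200.0, 2.0),          # dim light             -> large aperture, parameter 2
        (1000.0, 4.0),         # ordinary indoor light -> parameter 4
        (float("inf"), 7.0),   # strong light          -> small aperture, parameter 7
    ]
    for threshold, aperture in table:
        if lux < threshold:
            return aperture
    return table[-1][1]


if __name__ == "__main__":
    print(aperture_from_illuminance(120.0))   # 2.0 -- dim position A
    print(aperture_from_illuminance(5000.0))  # 7.0 -- strongly lit position A
```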
And S303, the data processing equipment sends the image pickup parameters to the image pickup device so that the image pickup device can be adjusted according to the image pickup parameters.
After the camera shooting parameters adaptive to the environment information are determined, the camera shooting device coordination control system can send the camera shooting parameters to the camera shooting device, so that the camera shooting device can be adjusted according to the camera shooting parameters.
By the control method of the camera device provided by the embodiment corresponding to fig. 3, the video quality corresponding to the acquired video data can be ensured, the subsequent operations such as processing the acquired video data can be avoided, and the acquisition efficiency of the video data can be improved.
It should be noted that in the embodiment corresponding to fig. 3 the imaging parameters have been determined, so that the camera device can be adjusted according to them. However, in some cases the environmental information of the acquisition position changes at a certain moment for some reason. In order to better adapt to the change of the environmental information and further ensure that the acquired video data meets the requirements of the video acquisition task, the imaging parameters can also be updated.
For example, when the imaging parameters are determined for the video acquisition task, the weather is clear and the illumination in the video acquisition scene is sufficient; at this time, for a certain acquisition position, if the obtained environmental information is illuminance A, the imaging parameters are determined according to illuminance A so that the camera device can be adjusted accordingly. However, as video acquisition proceeds, the weather may suddenly change from clear to cloudy, so that for the same acquisition position the environmental information changes: the obtained environmental information may become illuminance B, which is smaller than illuminance A. In order to adapt the imaging parameters to illuminance B and ensure that the acquired video data meets the requirements of the video acquisition task, the imaging parameters can be updated.
In this case, an image pickup apparatus control method is provided on the basis of the embodiment corresponding to fig. 3, that is, after S303, the method further includes: and updating the image pickup parameters and sending the updated image pickup parameters to the image pickup device. The manner of updating the imaging parameters may be: when the change of the environmental information is detected, re-determining the camera shooting parameters according to the changed environmental information, and updating the camera shooting parameters; alternatively, the imaging parameters are manually input, and the original imaging parameters are updated using the input imaging parameters.
The embodiment corresponding to fig. 3 can be implemented based on the embodiment corresponding to fig. 2, that is, when the image capturing apparatus is controlled, the image capturing apparatus can be simultaneously controlled according to the position data and the image capturing parameters, and at this time, the system architecture diagram applying the image capturing apparatus control method can be as shown in fig. 4.
The embodiments corresponding to fig. 1 to fig. 4 describe how the camera device acquires the machine position data and/or the camera parameters, and acquires the video data required by the video acquisition task in the video acquisition scene according to the acquired machine position data and/or the acquired camera parameters. In the following embodiments, a process of the camera device after acquiring the required video data will be described.
In some cases, although video capture tasks differ, their video capture scenes are similar and the contents expressed by the video data are also similar, so the camera device may employ the same machine position data and imaging parameters for different video capture tasks. That is, when a new video capture task needs to be executed, if it is similar to a previously completed task in terms of the video capture scene, the content expressed by the video data, and so on, the camera device may employ the same machine position data and imaging parameters to capture the video data required by the new task in the video capture scene.
In this case, in order to enable the camera device to directly obtain the machine position data and imaging parameters that have been used before, and to avoid having to fully execute S201-S202 and S301-S303 every time to obtain them, thereby improving video acquisition efficiency, the camera device may return data to the camera device coordination control system after acquiring the video data. The returned data includes the video data, the machine position data and the imaging parameters, and the video data can reflect the video capture scene and the content expressed by the video data. After obtaining the data returned by the camera device, the camera device coordination control system can store the correspondence among the video data, the machine position data and the imaging parameters. In this way, when a new video capture task needs to be executed, the camera device coordination control system can determine, according to the stored video data, whether the camera device can adopt the previously used machine position data and imaging parameters for the new task. If so, the corresponding machine position data and imaging parameters can be determined according to the stored correspondence and sent to the camera device, so that the camera device can acquire the required video data according to them.
In this way, when the video data required by a video capture task is similar to previously acquired video data, the camera device can directly obtain the previously used machine position data and imaging parameters, which avoids fully executing the steps of obtaining machine position data and imaging parameters every time and improves video acquisition efficiency.
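A minimal sketch of such reuse, with assumed names and an assumed scene descriptor used as the lookup key, is given below; how similarity between capture tasks is actually judged is not specified here and is abstracted into an exact-match lookup.

```python
from typing import Any, Dict, Optional, Tuple


class CaptureRecordStore:
    """Stores the correspondence among video data (described here by a scene
    descriptor), machine position data and imaging parameters for later reuse."""

    def __init__(self) -> None:
        self._records: Dict[str, Tuple[Any, Any]] = {}

    def record(self, scene_descriptor: str, machine_position_data: Any, imaging_params: Any) -> None:
        # Called after the camera device returns its data (cf. S607/S608 below)
        self._records[scene_descriptor] = (machine_position_data, imaging_params)

    def lookup_similar(self, scene_descriptor: str) -> Optional[Tuple[Any, Any]]:
        # For a new, similar task, return the previously used data instead of re-planning
        return self._records.get(scene_descriptor)


if __name__ == "__main__":
    store = CaptureRecordStore()
    store.record("studio A, single host", {"segments": [(0.0, 10.0, (0.0, 0.0))]}, {"aperture": 4.0})
    print(store.lookup_similar("studio A, single host"))
```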
Next, how the data returned by the image pickup apparatus is obtained will be described. In one implementation, the camera device may return data to the camera device coordination control system via the transmission system. If the transmission system is a wireless transmission system, the wireless transmission system adopted may be one with a filtering frequency of 1.95-2.7 GHz and a bandwidth of 800 MHz.
It can be understood that the transmission system may have one or more receivers, and the signal of a receiver may be good at some moments and poor at others; likewise, at the same moment, the signals of different receivers may differ in quality.
In this case, in order to further reduce the influence of signal quality on the acquisition of data by the camera device coordination control system and to improve the reliability of the transmission system, the transmission system may include a transmitter and a plurality of receivers, where the distance d between the receivers is greater than or equal to λ/2 (λ being the operating wavelength), so as to ensure that the fading characteristics of the receivers' output signals are independent of one another; that is, when the signal of one receiver is poor, the signals of the other receivers are not necessarily poor at that moment. In this way, when the camera device returns data, the camera device coordination control system may receive the data through the plurality of receivers, select from them a receiver whose signal satisfies a preset condition as the target receiver, and take the data received by the target receiver as the data returned by the camera device. A signal satisfying the preset condition is a signal of good quality, so the camera device coordination control system can reliably obtain the data returned by the camera device.
Therefore, by arranging a plurality of receivers to receive the data returned by the camera device at the same time, the camera device coordination control system can, when the signal of one or some receivers is poor, select a receiver with a better signal from the plurality of receivers to obtain the returned data, which reduces the influence of signal quality on the data acquisition and improves the reliability of the transmission system.
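As a rough illustration of the receiver-diversity idea above, the sketch below checks the d >= λ/2 spacing condition and selects, among the receivers whose signal satisfies a preset quality threshold, the one with the best signal as the target receiver. Signal quality is represented here as a plain number and all names are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Receiver:
    name: str
    position_m: float       # position along a line, in metres
    signal_quality: float   # e.g. an SNR-like figure; purely illustrative
    data: bytes


def spacing_ok(receivers: List[Receiver], wavelength_m: float) -> bool:
    """Check the d >= lambda/2 spacing so the receivers fade roughly independently."""
    positions = sorted(r.position_m for r in receivers)
    return all(b - a >= wavelength_m / 2 for a, b in zip(positions, positions[1:]))


def select_target(receivers: List[Receiver], min_quality: float) -> Optional[Receiver]:
    """Pick the best receiver among those whose signal satisfies the preset condition."""
    candidates = [r for r in receivers if r.signal_quality >= min_quality]
    return max(candidates, key=lambda r: r.signal_quality, default=None)


if __name__ == "__main__":
    rx = [Receiver("rx1", 0.00, 8.0, b"frame"),
          Receiver("rx2", 0.08, 23.5, b"frame")]    # ~2.4 GHz => lambda/2 is about 6 cm
    print(spacing_ok(rx, wavelength_m=0.125))        # True
    target = select_target(rx, min_quality=15.0)
    print(target.name if target else "no receiver meets the preset condition")
```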
It can be understood that, during transmission in the transmission system, signal attenuation may occur, resulting in carrier distortion or loss and affecting the acquisition, by the camera device coordination control system, of the data returned by the camera device.
In this case, the image pickup apparatus can return data to the image pickup apparatus cooperative control system by a modulation technique, that is, modulate the returned data into a narrowband carrier signal, and return the narrowband carrier signal to the image pickup apparatus cooperative control system through the transmission system. Thus, after the receiver receives the return data, the original signal can be reconstructed from the carrier signal, and any carrier distortion or loss due to signal corruption can be compensated for using the error correction features of the modulation technique.
The modulation technique may include Coded Orthogonal Frequency Division Multiplexing (COFDM), for example.
The above embodiments have been described mainly based on one image pickup apparatus, but it is needless to say that the image pickup apparatus may include a plurality of image pickup apparatuses, and in this case, each image pickup apparatus may be independently controlled or a plurality of image pickup apparatuses may be simultaneously controlled by the control method described above. When a plurality of camera devices are controlled simultaneously, all the camera devices can be allocated to reach the correct positions within a short time before video acquisition is carried out, and the camera devices are enabled to enter the correct working state, so that video data required by a video acquisition task is acquired in a video acquisition scene.
Next, the image pickup apparatus control method will be described with reference to a specific application scenario. For a live video acquisition task, a director first determines the video that needs to be acquired, and then all the camera devices are dispatched to their corresponding positions; in the process of acquiring the video, each camera device needs to be controlled to move according to its corresponding machine position data, so that the video data required by the video acquisition task is acquired in the video acquisition scene.
To this end, the following embodiment provides a camera device control method applying the system architecture shown in fig. 5, which includes a camera device 102, a transmission system 106 and a camera device coordination control system 103, where the camera device 102 includes a position control system 1021, a camera 1022 and an imaging parameter control system 1023.
Based on this system, referring to fig. 6, the image pickup apparatus control method includes:
s601, the camera device coordination control system acquires the position of the camera device in a video acquisition scene through the transmission system.
Specifically, the camera device coordination control system 103 obtains the position of the camera 1022 in the video capture scene through the transmission system 106.
And S602, the camera device coordination control system determines the machine position data of the camera device for the video acquisition task according to the position.
The position data includes displacement information of the camera device in the video capture scene, and the position data may reflect how the camera 1022 should be controlled by the position control system 1021 to move in the video capture scene in order to capture video data required by the video capture task when the video capture is performed.
And S603, the camera device coordination control system sends the machine position data to a position control system.
After the position control system 1021 acquires the position data, the position control system 1021 can control the camera 1022 to move in the video capture scene according to the position data, so as to capture the video data required by the video capture task.
And S604, the camera device coordination control system acquires the environmental information of the position acquired by the camera device.
And S605, the camera device coordination control system determines corresponding camera parameters according to the environment information.
And S606, sending the image pickup parameters to an image pickup parameter control system by the image pickup device coordination control system.
In this way, the imaging parameter control system 1023 can adjust the camera 1022 according to the imaging parameters.
Through S601-S606, the position control system 1021 can control the camera 1022 to reasonably move according to the position data, and the camera parameter control system 1023 can adjust the camera 1022 to appropriate camera parameters according to the camera parameters, so that the camera can acquire video data required by the video acquisition task. In this embodiment, the sequence of the process of controlling the camera to move in S601-S603 and the process of controlling the camera to adjust the shooting parameters in S604-S606 is not limited.
And S607, the camera device coordination control system acquires the data returned by the camera device.
Wherein the data comprises video data, machine position data and camera parameters. That is, after the camera 1022 is controlled according to the position data and the camera parameters to capture the required video data in the video capture scene, the camera 1022 may return the captured video data to the camera coordination control system 103, while the position control system 1021 returns the position data to the camera coordination control system 103, and the camera parameter control system 1023 returns the camera parameters to the camera coordination control system 103.
And S608, the camera device coordination control system records the corresponding relation among the video data, the machine position data and the camera parameters.
Thus, when the subsequent video acquisition task is executed, if the subsequent video acquisition task is similar to the video data acquired at this time, the position control system 1021 and the camera parameter control system 1023 can directly acquire the used machine position data and camera parameters, so that the step of completely executing and acquiring the machine position data and the camera parameters every time is avoided, and the video acquisition efficiency is improved.
According to the above technical solution, before video acquisition is performed on a video acquisition scene according to the video acquisition task, machine position data can be planned in advance, for that task, for the camera devices at different positions in the video acquisition scene. When video acquisition is performed on the scene according to the task, the machine position data corresponding to a camera device can be determined according to the position of that camera device in the scene, and the displacement information included in the machine position data can indicate how the camera device should reasonably move within the scene to satisfy the video acquisition task while executing it. Therefore, the corresponding machine position data can be sent to the camera device, the camera device can move automatically under the indication of the machine position data, and the video data required by the video acquisition task is acquired at the indicated positions. It can be seen that, when video data is acquired, the position at which the camera device acquires video data no longer needs to be determined or adjusted through the personal experience of camera operators; instead, displacement control of the camera device during video data acquisition can be completed through the determined machine position data, which not only eliminates the influence of personal experience but also allows video data meeting the requirements of the video acquisition task to be obtained through the camera device.
Based on the method for controlling the image capturing apparatus provided in the foregoing embodiment, referring to fig. 7a, the embodiment corresponding to fig. 7a provides an image capturing apparatus control apparatus 700, where the apparatus 700 includes a first determining unit 701 and a first transmitting unit 702:
the first determining unit 701 is configured to determine, according to a position of a camera in a video capture scene, machine position data of the camera for a video capture task; the machine position data comprises displacement information of the camera device in the video acquisition scene;
the first sending unit 702 is configured to send the machine position data to the camera device, so that the camera device collects video data required by the video collection task in the video collection scene according to the machine position data.
In one implementation, referring to fig. 7b, the apparatus 700 further includes a first updating unit 703:
the first updating unit 703 is configured to update the machine position data, and send the updated machine position data to the image capturing apparatus.
In one implementation, referring to fig. 7c, the apparatus 700 further includes a first obtaining unit 704, a second determining unit 705, and a second sending unit 706:
the first obtaining unit 704 is configured to obtain environmental information of a position acquired by the image capturing apparatus;
the second determining unit 705 is configured to determine a corresponding imaging parameter according to the environment information;
the second sending unit 706 is configured to send the image capturing parameters to the image capturing apparatus, so that the image capturing apparatus performs adjustment according to the image capturing parameters.
It should be noted that fig. 7c is only an exemplary structure, and does not limit the structure of the apparatus 700.
In one implementation, referring to fig. 7d, the apparatus 700 further comprises a second updating unit 707:
the second updating unit 707 is configured to update the image capturing parameters and send the updated image capturing parameters to the image capturing apparatus.
In one implementation, referring to fig. 7e, the apparatus 700 further includes a second obtaining unit 708 and a recording unit 709:
the second obtaining unit 708 is configured to obtain data returned by the image capturing apparatus, where the data includes video data, machine position data, and image capturing parameters;
the recording unit 709 is configured to record the corresponding relationship among the video data, the machine position data, and the image capturing parameters.
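A minimal sketch of how the recording unit 709 might record the correspondence among the video data, the machine position data, and the image capturing parameters is given below; the file format and field names are assumptions made for illustration:

```python
import json
from pathlib import Path

def record_correspondence(returned: dict, index_path: str = "capture_index.jsonl") -> None:
    # Append one index entry linking the video data to the machine position data and
    # the image capturing parameters under which it was collected.
    entry = {
        "video_file": returned["video_file"],
        "machine_position_data": returned["machine_position_data"],
        "image_capturing_parameters": returned["image_capturing_parameters"],
    }
    with Path(index_path).open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```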
In one implementation, the second obtaining unit 708 is configured to receive data returned by the image capturing apparatus via a set plurality of receivers, and to take the data received by a target receiver as the data returned by the image capturing apparatus, where the target receiver is the receiver, among the plurality of receivers, whose signal meets a preset condition.
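The receiver-selection rule described above can be illustrated with the following sketch, in which signal strength is used as an assumed example of the "preset condition"; the application itself does not fix a concrete criterion:

```python
def pick_target_receiver(receivers, min_signal_dbm=-70.0):
    # receivers: list of dicts such as {"receiver_id": "rx1", "signal_dbm": -62.5, "data": b"..."}
    # The threshold below stands in for the "preset condition"; it is an assumed example.
    candidates = [r for r in receivers if r["signal_dbm"] >= min_signal_dbm]
    if not candidates:
        return None
    # Among qualifying receivers, take the one with the strongest signal as the target receiver.
    return max(candidates, key=lambda r: r["signal_dbm"])
```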
In one implementation, the second obtaining unit 708 is configured to obtain the data returned by the image capturing apparatus through a modulation technique.
In one implementation, the camera device includes a plurality of camera devices.
According to the technical scheme provided by this embodiment, before video acquisition is performed on a video acquisition scene according to a video acquisition task, machine position data can be planned in advance, for that task, for the camera devices at different positions in the video acquisition scene, so that the camera device control apparatus 700 can be used to control the movement of the camera devices. Specifically, when video acquisition is performed on the scene according to the task, the first determining unit 701 can determine the machine position data corresponding to a camera device according to the camera device's position in the scene, and the displacement information included in the machine position data can indicate how the camera device should reasonably move within the scene, in a manner that meets the video acquisition task, while executing that task. The first sending unit 702 can then send the corresponding machine position data to the camera device, so that the camera device moves under the indication of the machine position data and collects the video data required by the video acquisition task at the indicated positions. Thus, when video data is collected, the position of the camera device is no longer determined or adjusted through a photographer's personal experience; instead, displacement control of the camera device during video data collection is completed through the determined machine position data, so that the influence of human experience is eliminated and video data meeting the requirements of the video acquisition task can be obtained through the camera device.
The embodiment corresponding to fig. 8 also provides an image pickup device control apparatus, which is described below with reference to the accompanying drawings. Referring to fig. 8, an embodiment of the present application provides an image capture device control apparatus 800. The apparatus 800 may be a server, which may vary considerably in configuration and performance, and may include one or more Central Processing Units (CPUs) 822 (e.g., one or more processors), a memory 832, and one or more storage media 830 (e.g., one or more mass storage devices) storing an application 842 or data 844. The memory 832 and the storage medium 830 may be transient or persistent storage. The program stored in the storage medium 830 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Further, the central processor 822 may be configured to communicate with the storage medium 830 and execute, on the image pickup device control apparatus 800, the series of instruction operations in the storage medium 830.
The camera control apparatus 800 may also include one or more power supplies 826, one or more wired or wireless network interfaces 850, one or more input-output interfaces 858, and/or one or more operating systems 841, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
The steps performed by the server in the above embodiments may be based on the server structure shown in fig. 8.
The CPU 822 is configured to execute the following steps:
determining machine position data of a camera device aiming at a video acquisition task according to the position of the camera device in a video acquisition scene; the machine position data comprises displacement information of the camera device in the video acquisition scene;
and sending the machine position data to the camera device so that the camera device can collect the video data required by the video collection task in the video collection scene according to the machine position data.
Referring to fig. 9, the embodiment corresponding to fig. 9 provides an image capturing device control apparatus 900. The apparatus 900 may also be a terminal device, and the terminal device may be any terminal device including a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Point of Sale (POS) terminal, a vehicle-mounted computer, and the like. The following description takes a mobile phone as an example of the terminal device:
fig. 9 is a block diagram illustrating a partial structure of the mobile phone related to the terminal device provided in an embodiment of the present application. Referring to fig. 9, the mobile phone includes: a Radio Frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a sensor 950, an audio circuit 960, a wireless fidelity (WiFi) module 970, a processor 980, and a power supply 990. Those skilled in the art will appreciate that the mobile phone structure shown in fig. 9 does not constitute a limitation on the mobile phone, which may include more or fewer components than those shown, combine certain components, or arrange the components differently.
The following describes each component of the mobile phone in detail with reference to fig. 9:
the RF circuit 910 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, it receives downlink information from a base station and forwards it to the processor 980 for processing, and transmits uplink data to the base station. In general, the RF circuit 910 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 920 may be used to store software programs and modules, and the processor 980 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 920. The memory 920 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data or a phonebook), and the like. Further, the memory 920 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 930 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone. Specifically, the input unit 930 may include a touch panel 931 and other input devices 932. The touch panel 931, also referred to as a touch screen, may collect a touch operation performed by a user on or near it (for example, an operation performed by the user on or near the touch panel 931 using a finger, a stylus, or any other suitable object or accessory) and drive a corresponding connection device according to a preset program. Optionally, the touch panel 931 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 980, and can receive and execute commands sent by the processor 980. In addition, the touch panel 931 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave types. The input unit 930 may include other input devices 932 in addition to the touch panel 931. In particular, the other input devices 932 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 940 may be used to display information input by the user or information provided to the user and the various menus of the mobile phone. The display unit 940 may include a display panel 941, and optionally, the display panel 941 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 931 may cover the display panel 941; when the touch panel 931 detects a touch operation on or near it, the touch operation is transmitted to the processor 980 to determine the type of the touch event, and the processor 980 then provides a corresponding visual output on the display panel 941 according to the type of the touch event. Although in fig. 9 the touch panel 931 and the display panel 941 are shown as two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 931 and the display panel 941 may be integrated to implement the input and output functions of the mobile phone.
The mobile phone may also include at least one sensor 950, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor, which adjusts the brightness of the display panel 941 according to the brightness of ambient light, and a proximity sensor, which turns off the display panel 941 and/or the backlight when the mobile phone is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize the posture of the mobile phone (such as switching between landscape and portrait modes, related games, and magnetometer posture calibration), vibration-recognition related functions (such as a pedometer and tapping), and the like. Other sensors that can be configured on the mobile phone, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described in detail here.
The audio circuit 960, speaker 961, and microphone 962 may provide an audio interface between the user and the mobile phone. The audio circuit 960 may transmit an electrical signal converted from received audio data to the speaker 961, and the speaker 961 converts the electrical signal into a sound signal for output; on the other hand, the microphone 962 converts a collected sound signal into an electrical signal, which the audio circuit 960 receives and converts into audio data; the audio data is then output to the processor 980 for processing and, for example, transmitted to another mobile phone through the RF circuit 910, or output to the memory 920 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 970, the mobile phone can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 9 shows the WiFi module 970, it is understood that it is not an essential component of the mobile phone and may be omitted as needed without departing from the essence of the invention.
The processor 980 is the control center of the mobile phone. It connects the various parts of the entire mobile phone using various interfaces and lines, and performs the various functions of the mobile phone and processes data by running or executing the software programs and/or modules stored in the memory 920 and calling the data stored in the memory 920, thereby monitoring the mobile phone as a whole. Optionally, the processor 980 may include one or more processing units; preferably, the processor 980 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It is understood that the modem processor may alternatively not be integrated into the processor 980.
The mobile phone also includes a power supply 990 (such as a battery) for supplying power to the various components. Preferably, the power supply may be logically connected to the processor 980 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment, the processor 980 included in the terminal device further has the following functions:
determining machine position data of a camera device aiming at a video acquisition task according to the position of the camera device in a video acquisition scene; the machine position data comprises displacement information of the camera device in the video acquisition scene;
and sending the machine position data to the camera device so that the camera device can collect the video data required by the video collection task in the video collection scene according to the machine position data.
An embodiment of the present application further provides a computer-readable storage medium for storing a program code, where the program code is configured to execute any one implementation of the imaging apparatus control method described in the foregoing embodiments.
The terms "first," "second," "third," "fourth," and the like in the description of the application and the above-described figures, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (13)

1. An image pickup apparatus control method, characterized by comprising:
the data processing equipment determines the machine position data of the camera device for the video acquisition task according to the position of the camera device in a video acquisition scene; the machine position data comprises displacement information of the camera device in the video acquisition scene;
the data processing equipment sends the machine position data to the camera device so that the camera device can collect video data required by the video collection task in the video collection scene according to the machine position data;
the method further comprises the following steps:
the data processing equipment acquires environment information of the position at which the camera device performs acquisition, wherein the environment information comprises information that influences video quality when the camera device acquires video;
the data processing equipment determines corresponding shooting parameters according to the environment information, wherein the shooting parameters at least comprise aperture parameters and focal length parameters;
and the data processing equipment sends the shooting parameters to the camera device, so that the camera device is adjusted according to the shooting parameters.
2. The method according to claim 1, wherein after the data processing equipment sends the machine position data to the camera device, the method further comprises:
and the data processing equipment updates the machine position data and sends the updated machine position data to the camera device.
3. The method of claim 1, further comprising:
the data processing apparatus updates the imaging parameters and transmits the updated imaging parameters to the imaging device.
4. The method according to any one of claims 1-3, further comprising:
the data processing equipment acquires data returned by the camera device, wherein the data comprises video data, machine position data and shooting parameters;
and the data processing equipment stores the corresponding relation among the video data, the machine position data and the shooting parameters.
5. The method according to claim 4, wherein the acquiring, by the data processing equipment, of the data returned by the camera device comprises:
the data processing equipment receives data returned by the camera device according to a plurality of receivers;
and the data processing equipment takes the data received by a target receiver as the data returned by the camera device, wherein the target receiver is the receiver of which the signals meet the preset condition.
6. The method according to claim 4, wherein the acquiring, by the data processing equipment, of the data returned by the camera device comprises:
the data processing device acquires the data returned by the camera device through a modulation technology.
7. The method according to any one of claims 1-3, wherein the camera device comprises a plurality of camera devices.
8. An image pickup apparatus control apparatus, characterized in that the apparatus comprises a first determination unit and a first transmission unit:
the first determining unit is used for determining the machine position data of the camera device for the video acquisition task according to the position of the camera device in the video acquisition scene; the machine position data comprises displacement information of the camera device in the video acquisition scene;
the first sending unit is used for sending the machine position data to the camera device so that the camera device can collect video data required by the video collection task in the video collection scene according to the machine position data;
the device also comprises a first acquisition unit, a second determination unit and a second sending unit:
the first acquisition unit is used for acquiring environment information of the position at which the camera device performs acquisition, wherein the environment information comprises information that influences video quality when the camera device acquires video;
the second determining unit is used for determining corresponding shooting parameters according to the environment information, wherein the shooting parameters at least comprise aperture parameters and focal length parameters;
the second sending unit is used for sending the shooting parameters to the camera device, so that the camera device is adjusted according to the shooting parameters.
9. The apparatus of claim 8, further comprising a first updating unit:
the first updating unit is used for updating the machine position data and sending the updated machine position data to the camera device.
10. The apparatus of claim 8, further comprising a second updating unit:
and the second updating unit is used for updating the shooting parameters and sending the updated shooting parameters to the camera device.
11. The apparatus according to any one of claims 8-10, further comprising a second acquisition unit and a recording unit:
the second acquisition unit is used for acquiring data returned by the camera device, wherein the data comprises video data, machine position data and shooting parameters;
and the recording unit is used for recording the corresponding relation among the video data, the machine position data and the shooting parameters.
12. An image pickup device control apparatus characterized by comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the image pickup apparatus control method according to any one of claims 1 to 7 in accordance with an instruction in the program code.
13. A computer-readable storage medium characterized by storing a program code for executing the image pickup apparatus control method according to any one of claims 1 to 7.
CN201810785265.5A 2018-07-17 2018-07-17 Camera device control method and device Active CN108696697B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810785265.5A CN108696697B (en) 2018-07-17 2018-07-17 Camera device control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810785265.5A CN108696697B (en) 2018-07-17 2018-07-17 Camera device control method and device

Publications (2)

Publication Number Publication Date
CN108696697A CN108696697A (en) 2018-10-23
CN108696697B true CN108696697B (en) 2021-08-27

Family

ID=63850703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810785265.5A Active CN108696697B (en) 2018-07-17 2018-07-17 Camera device control method and device

Country Status (1)

Country Link
CN (1) CN108696697B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110490930B (en) * 2019-08-21 2022-12-13 谷元(上海)文化科技有限责任公司 Calibration method for camera position
CN114500826B (en) * 2021-12-09 2023-06-27 成都市喜爱科技有限公司 Intelligent shooting method and device and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016058767A (en) * 2014-09-05 2016-04-21 株式会社 日立産業制御ソリューションズ Picture imaging device, picture management device, and picture management system
WO2017007456A1 (en) * 2015-07-07 2017-01-12 Halliburton Energy Services, Inc. Semi-autonomous monitoring system
JP2017537333A (en) * 2015-09-16 2017-12-14 エスゼット ディージェイアイ オスモ テクノロジー カンパニー リミテッドSZ DJI Osmo Technology Co., Ltd. Shooting system
CN107992076A (en) * 2017-12-13 2018-05-04 王俊梅 A kind of method that unmanned plane search concealment background hides Eye-controlling focus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2987026B1 (en) * 2013-04-05 2020-03-25 Andra Motion Technologies Inc. System and method for controlling an equipment related to image capture
JP2017223879A (en) * 2016-06-17 2017-12-21 キヤノン株式会社 Focus detector, focus control device, imaging apparatus, focus detection method, and focus detection program

Also Published As

Publication number Publication date
CN108696697A (en) 2018-10-23

Similar Documents

Publication Publication Date Title
CN110139139B (en) Service processing method, terminal, server and storage medium
CN106412681B (en) Live bullet screen video broadcasting method and device
CN107038681B (en) Image blurring method and device, computer readable storage medium and computer device
CN109040643B (en) Mobile terminal and remote group photo method and device
CN108366207B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN107820014B (en) Shooting method, mobile terminal and computer storage medium
US20170142451A1 (en) Video remote-commentary synchronization method and system, and terminal device
CN108605085B (en) Method for acquiring shooting reference data and mobile terminal
CN108574778B (en) Screen projection brightness adjusting method and related product
CN108924414B (en) Shooting method and terminal equipment
CN109168013B (en) Method, device and equipment for extracting frame and computer readable storage medium
CN107124556B (en) Focusing method, focusing device, computer readable storage medium and mobile terminal
EP3627823A1 (en) Image selection method and related product
CN112019929A (en) Volume adjusting method and device
CN106851119B (en) Picture generation method and equipment and mobile terminal
CN109803110B (en) Image processing method, terminal equipment and server
CN112691363A (en) Cross-terminal switching method and related device for cloud games
CN108696697B (en) Camera device control method and device
CN110177209B (en) Video parameter regulation and control method, device and computer readable storage medium
CN110187769B (en) Preview image viewing method, equipment and computer readable storage medium
CN115514876B (en) Image fusion method, electronic device, storage medium and computer program product
CN108848321B (en) Exposure optimization method, device and computer-readable storage medium
EP3249999B1 (en) Intelligent matching method for filter and terminal
CN111028192B (en) Image synthesis method and electronic equipment
CN110163036B (en) Image recognition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant