CN116546315A - Method and device for controlling shooting point position through pan-tilt command - Google Patents


Publication number
CN116546315A
Authority
CN
China
Prior art keywords: camera, point, information, point position, shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310266759.3A
Other languages
Chinese (zh)
Inventor
温建伟
邓迪旻
武海兵
肖占中
袁潮
Current Assignee
Beijing Zhuohe Technology Co Ltd
Original Assignee
Beijing Zhuohe Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhuohe Technology Co Ltd
Priority claimed from CN202310266759.3A
Publication of CN116546315A
Legal status: Pending

Landscapes

  • Studio Devices (AREA)

Abstract

The invention discloses a method and a device for controlling a camera shooting point position through a pan-tilt command. The method comprises the following steps: acquiring camera point-position information and pan-tilt command information; refining the pan-tilt command information to obtain action information and function information; executing a point-position moving operation according to the action information and the point-position information; and activating the command function of the moved camera point position according to the function information. The invention solves the technical problem that, in the prior art, the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable, so that erroneous steering input parameters often damage the camera's rotation mechanism, impairing use of the camera and delaying normal security and monitoring work.

Description

Method and device for controlling shooting point position through pan-tilt command
Technical Field
The invention relates to the field of high-precision camera control, and in particular to a method and a device for controlling a camera shooting point position through a pan-tilt command.
Background
With the continuous development of intelligent technology, smart devices are used ever more widely in people's daily life, work and study; intelligent technological means improve people's quality of life and increase learning and working efficiency.
At present, in the camera control process of a high-precision imaging system, an input instruction for a camera is generally entered, either remotely or locally, so that a camera in the high-precision camera array is displaced and the desired camera angle is obtained. However, in the prior art, the camera steering process merely executes the steering operation as instructed; whether the camera displacement is reasonable cannot be verified and calculated against the camera's limit positions and limit operating conditions. As a result, erroneous steering input parameters often damage the camera's rotation mechanism, impairing use of the camera and delaying normal security and monitoring work.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide a method and a device for controlling a camera point position through a pan-tilt command, so as at least to solve the technical problem that, in the prior art, the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable, so that erroneous steering input parameters often damage the camera's rotation mechanism, impairing use of the camera and delaying normal security and monitoring work.
According to one aspect of the embodiments of the present invention, there is provided a method for controlling a camera point position through a pan-tilt command, including: acquiring camera point-position information and pan-tilt command information; refining the pan-tilt command information to obtain action information and function information; executing a point-position moving operation according to the action information and the point-position information; and activating the command function of the moved camera point position according to the function information.
Optionally, before the acquiring of the camera point-position information and the pan-tilt command information, the method further includes: acquiring a point-position limit and a point-position unit angle in the high-precision imaging system; and generating the camera point-position information from the point-position limit and the point-position unit angle.
Optionally, the executing of the point-position moving operation according to the action information and the point-position information includes: acquiring pointing data from the action information; generating a point-position operation strategy from the pointing data and the point-position information; and executing the point-position operation according to the point-position operation strategy.
Optionally, after the executing of the point-position moving operation according to the action information and the point-position information, the method further includes: generating point-position operation verification data from the point-position limit and the point-position operation strategy; and inputting the verification data into a camera-movement verification formula to obtain a point-position verification result, the verification formula being Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
According to another aspect of the embodiments of the present invention, there is also provided a device for controlling a camera point position through a pan-tilt command, including: an acquisition module, configured to acquire camera point-position information and pan-tilt command information; a refining module, configured to refine the pan-tilt command information to obtain action information and function information; an execution module, configured to execute a point-position moving operation according to the action information and the point-position information; and an activation module, configured to activate the command function of the moved camera point position according to the function information.
Optionally, the device further includes: an acquisition module, configured to acquire the point-position limit and the point-position unit angle in the high-precision imaging system; and a generation module, configured to generate the camera point-position information from the point-position limit and the point-position unit angle.
Optionally, the execution module includes: an acquisition unit, configured to acquire pointing data from the action information; a generation unit, configured to generate a point-position operation strategy from the pointing data and the point-position information; and an execution unit, configured to execute the point-position operation according to the point-position operation strategy.
Optionally, the device further includes: the generation unit, further configured to generate point-position operation verification data from the point-position limit and the point-position operation strategy; and a verification unit, configured to input the verification data into a camera-movement verification formula to obtain a point-position verification result, the verification formula being Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
According to another aspect of the embodiments of the present invention, there is further provided a non-volatile storage medium containing a stored program, wherein, when the program runs, it controls the device in which the non-volatile storage medium is located to execute a method for controlling a camera point position through a pan-tilt command.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device including a processor and a memory, the memory storing computer-readable instructions and the processor being configured to run the computer-readable instructions, wherein the computer-readable instructions, when run, perform a method for controlling a camera point position through a pan-tilt command.
In the embodiments of the invention, camera point-position information and pan-tilt command information are acquired; the pan-tilt command information is refined to obtain action information and function information; a point-position moving operation is executed according to the action information and the point-position information; and the command function of the moved camera point position is activated according to the function information. In this way, the embodiments solve the technical problem that, in the prior art, the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable, so that erroneous steering input parameters often damage the camera's rotation mechanism, impairing use of the camera and delaying normal security and monitoring work.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of a method for controlling camera points by pan/tilt commands according to an embodiment of the present invention;
fig. 2 is a block diagram of an apparatus for controlling a camera point location through a pan/tilt command according to an embodiment of the present invention;
fig. 3 is a block diagram of a terminal device for performing the method according to the invention according to an embodiment of the invention;
fig. 4 is a memory unit for holding or carrying program code for implementing a method according to the invention, according to an embodiment of the invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of protection of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of a method for controlling an imaging point by a pan/tilt command, it should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be performed in an order different from that herein.
Example 1
Fig. 1 is a flowchart of a method for controlling an imaging point location by a pan/tilt command according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, acquiring camera point position information and cradle head command information.
Specifically, in order to solve the prior-art problem that the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable (so that erroneous steering input parameters damage the camera's rotation mechanism, impair use of the camera and delay normal security and monitoring work), camera point-position information and pan-tilt command information are acquired through the operation-information transmission module of the camera-array imaging equipment of the high-precision imaging system. The camera point-position information includes data such as the angle and position parameters of the camera lens, and the pan-tilt command information includes the user's planned position for a camera in the imaging system and the expected function parameters.
Optionally, before the acquiring of the camera point-position information and the pan-tilt command information, the method further includes: acquiring a point-position limit and a point-position unit angle in the high-precision imaging system; and generating the camera point-position information from the point-position limit and the point-position unit angle.
Specifically, the camera point-position information in the embodiment of the invention includes the camera's rotation limit (travel) parameter and the unit angle. The camera's rotation limit position can be obtained from the limit-angle travel parameter, so that a subsequent check can protect the camera's movement from damage when a rotation is executed.
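The patent does not specify a data structure for this point-position information; the following Python sketch is one minimal, hypothetical reading in which the limit and the unit angle are bundled with the current angle (all names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class PointInfo:
    """Hypothetical container for camera point-position information."""
    limit_deg: float       # rotation limit position ("point position limit")
    unit_angle_deg: float  # smallest angular step ("point position unit angle")
    current_deg: float = 0.0

def build_point_info(limit_deg: float, unit_angle_deg: float) -> PointInfo:
    # The patent only states that the point info is "generated" from the limit
    # and the unit angle; bundling them with the current angle is one reading.
    if limit_deg <= 0 or unit_angle_deg <= 0:
        raise ValueError("limit and unit angle must be positive")
    return PointInfo(limit_deg, unit_angle_deg)

info = build_point_info(170.0, 0.1)
```

Keeping the limit next to the current angle makes the later safety check a single comparison rather than a lookup into separate state.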
Step S104, refining the pan-tilt command information to obtain action information and function information.
Specifically, in the embodiment of the present invention, the pan-tilt command information needs to be converted into action information and function information; that is, when the user inputs pan-tilt command information, it contains both the intended position change and the intended function change of the pan-tilt camera. For example, the position change may be that the camera's sector-shaped visible range changes from (A, B) to (B, C); the pan-tilt camera then moves to the target coordinates according to the imaging system's instruction to achieve the new detection purpose. As another example, the function change may be activating the infrared night-vision function on a camera in which it is currently inactive, which correspondingly reduces daytime shooting capability over the monitored area while increasing night-time shooting capability.
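As a rough illustration of this refining step, the hypothetical sketch below splits one command dictionary into its action part and its function part; the key names are invented for illustration, since the patent defines no concrete command format:

```python
def refine_command(cmd: dict) -> tuple[dict, dict]:
    """Split pan-tilt command information into action and function parts."""
    # Keys describing movement (hypothetical names) go to action info;
    # everything else is treated as function info (e.g. night vision).
    action_keys = {"target_range", "target_deg", "pointing"}
    action = {k: v for k, v in cmd.items() if k in action_keys}
    function = {k: v for k, v in cmd.items() if k not in action_keys}
    return action, function

action, function = refine_command(
    {"target_range": ("B", "C"), "ir_night_vision": True})
```

A real controller would validate the command against a schema first; the split itself is the essence of the refining step.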
Step S106, executing the point-position moving operation according to the action information and the point-position information.
Optionally, the executing of the point-position moving operation according to the action information and the point-position information includes: acquiring pointing data from the action information; generating a point-position operation strategy from the pointing data and the point-position information; and executing the point-position operation according to the point-position operation strategy.
Specifically, when the camera moves, the pointing data is extracted in real time from the action information in the pan-tilt command information, so that the parameters in the point-position information can be used to change the shooting range of the imaging equipment: the pointing data is acquired from the action information, a point-position operation strategy is generated from the pointing data and the point-position information, and the point-position operation is executed according to that strategy.
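One plausible concrete form of such an operation strategy, assuming the unit angle from the point-position information is the smallest movable step, is a list of unit-angle steps toward the commanded pointing. This is a sketch under that assumption, not the patent's actual algorithm:

```python
def plan_point_moves(current_deg: float, target_deg: float,
                     unit_angle_deg: float) -> list[float]:
    """Decompose a pointing target into signed unit-angle steps."""
    delta = target_deg - current_deg
    n_steps = round(abs(delta) / unit_angle_deg)   # nearest whole step count
    step = unit_angle_deg if delta >= 0 else -unit_angle_deg
    return [step] * n_steps

moves = plan_point_moves(0.0, 1.0, 0.5)
```

The rounding means the strategy can only reach targets to within half a unit angle, which matches the idea of the unit angle as the device's resolution.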
Optionally, after the executing of the point-position moving operation according to the action information and the point-position information, the method further includes: generating point-position operation verification data from the point-position limit and the point-position operation strategy; and inputting the verification data into a camera-movement verification formula to obtain a point-position verification result, the verification formula being Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
Specifically, in order to reduce motion damage to the high-precision imaging system, prolong its service life and reduce maintenance costs, the embodiment of the invention generates point-position operation verification data from the point-position limit and the point-position operation strategy, and inputs the verification data into a camera-movement verification formula to obtain a point-position verification result. The verification formula is Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
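Taken literally, the formula can be computed as below. Note that the patent states neither the unit of C nor how Y should be thresholded, so treating C as degrees and returning the bare value Y are assumptions made here for illustration:

```python
import math

def verify_point_move(limit_pos: float, steering_deg: float) -> float:
    """Literal reading of the patent's check Y = mov(P) * tan(C).

    limit_pos plays the role of mov(P), the point-position limit;
    steering_deg plays the role of C, the steering parameter (assumed
    to be in degrees, which the patent does not specify).
    """
    return limit_pos * math.tan(math.radians(steering_deg))

y = verify_point_move(170.0, 45.0)
```

A practical controller would presumably compare Y against an allowed range before moving, but the patent leaves that comparison unspecified.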
Step S108, activating the command function of the moved camera point position according to the function information.
Specifically, in the embodiment of the invention, after the camera equipment has completed the moving operation and the movement has been verified, the function information in the pan-tilt command information must be activated, so as to achieve the technical effect of executing the pan-tilt command in full.
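Steps S102 to S108 can be tied together in a minimal end-to-end sketch; here a simple limit check stands in for the verification formula, and every field name is illustrative rather than taken from the patent:

```python
def handle_pantilt_command(state: dict, cmd: dict) -> dict:
    """Hypothetical end-to-end sketch of steps S102-S108."""
    # S104: refine the command into action and function information
    action = cmd.get("action", {})
    function = cmd.get("function", {})
    # S106: execute the move, guarded by the point-position limit
    target = action.get("target_deg", state["pan_deg"])
    if abs(target) > state["limit_deg"]:
        raise ValueError("move would exceed the point-position limit")
    state["pan_deg"] = target
    # S108: activate the commanded functions only after the verified move
    state["active"] = {name for name, on in function.items() if on}
    return state

state = {"pan_deg": 0.0, "limit_deg": 170.0, "active": set()}
state = handle_pantilt_command(
    state, {"action": {"target_deg": 30.0},
            "function": {"ir_night_vision": True}})
```

Activating functions only after the guarded move mirrors the patent's ordering, in which activation follows movement and verification.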
Through this embodiment, the invention solves the technical problem that, in the prior art, the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable, so that erroneous steering input parameters often damage the camera's rotation mechanism, impairing use of the camera and delaying normal security and monitoring work.
Example two
Fig. 2 is a block diagram of an apparatus for controlling a camera point by a pan/tilt command according to an embodiment of the present invention, as shown in fig. 2, the apparatus includes:
the acquiring module 20 is configured to acquire camera point location information and pan/tilt command information.
Specifically, in order to solve the prior-art problem that the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable (so that erroneous steering input parameters damage the camera's rotation mechanism, impair use of the camera and delay normal security and monitoring work), camera point-position information and pan-tilt command information are acquired through the operation-information transmission module of the camera-array imaging equipment of the high-precision imaging system. The camera point-position information includes data such as the angle and position parameters of the camera lens, and the pan-tilt command information includes the user's planned position for a camera in the imaging system and the expected function parameters.
Optionally, the device further includes: an acquisition module, configured to acquire the point-position limit and the point-position unit angle in the high-precision imaging system; and a generation module, configured to generate the camera point-position information from the point-position limit and the point-position unit angle.
Specifically, the camera point-position information in the embodiment of the invention includes the camera's rotation limit (travel) parameter and the unit angle. The camera's rotation limit position can be obtained from the limit-angle travel parameter, so that a subsequent check can protect the camera's movement from damage when a rotation is executed.
And the refining module 22 is configured to refine the pan-tilt command information to obtain action information and function information.
Specifically, in the embodiment of the present invention, the pan-tilt command information needs to be converted into action information and function information; that is, when the user inputs pan-tilt command information, it contains both the intended position change and the intended function change of the pan-tilt camera. For example, the position change may be that the camera's sector-shaped visible range changes from (A, B) to (B, C); the pan-tilt camera then moves to the target coordinates according to the imaging system's instruction to achieve the new detection purpose. As another example, the function change may be activating the infrared night-vision function on a camera in which it is currently inactive, which correspondingly reduces daytime shooting capability over the monitored area while increasing night-time shooting capability.
And the execution module 24 is used for executing the camera point position moving operation according to the action information and the camera point position information.
Optionally, the execution module includes: an acquisition unit, configured to acquire pointing data from the action information; a generation unit, configured to generate a point-position operation strategy from the pointing data and the point-position information; and an execution unit, configured to execute the point-position operation according to the point-position operation strategy.
Specifically, when the camera moves, the pointing data is extracted in real time from the action information in the pan-tilt command information, so that the parameters in the point-position information can be used to change the shooting range of the imaging equipment: the pointing data is acquired from the action information, a point-position operation strategy is generated from the pointing data and the point-position information, and the point-position operation is executed according to that strategy.
Optionally, the device further includes: the generation unit, further configured to generate point-position operation verification data from the point-position limit and the point-position operation strategy; and a verification unit, configured to input the verification data into a camera-movement verification formula to obtain a point-position verification result, the verification formula being Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
Specifically, in order to reduce motion damage to the high-precision imaging system, prolong its service life and reduce maintenance costs, the embodiment of the invention generates point-position operation verification data from the point-position limit and the point-position operation strategy, and inputs the verification data into a camera-movement verification formula to obtain a point-position verification result. The verification formula is Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
And the activating module 26 is used for activating the command function of the camera shooting point position after moving according to the function information.
Specifically, in the embodiment of the invention, after the camera equipment has completed the moving operation and the movement has been verified, the function information in the pan-tilt command information must be activated, so as to achieve the technical effect of executing the pan-tilt command in full.
Through this embodiment, the invention solves the technical problem that, in the prior art, the camera steering process merely executes steering operations as instructed and cannot verify, against the camera's limit positions and limit operating conditions, whether the displacement is reasonable, so that erroneous steering input parameters often damage the camera's rotation mechanism, impairing use of the camera and delaying normal security and monitoring work.
According to another aspect of the embodiments of the present invention, there is further provided a non-volatile storage medium containing a stored program, wherein, when the program runs, it controls the device in which the non-volatile storage medium is located to execute a method for controlling a camera point position through a pan-tilt command.
Specifically, the method comprises: acquiring camera point-position information and pan-tilt command information; refining the pan-tilt command information to obtain action information and function information; executing a point-position moving operation according to the action information and the point-position information; and activating the command function of the moved camera point position according to the function information. Optionally, before the acquiring of the camera point-position information and the pan-tilt command information, the method further includes: acquiring a point-position limit and a point-position unit angle in the high-precision imaging system; and generating the camera point-position information from the point-position limit and the point-position unit angle. Optionally, the executing of the point-position moving operation includes: acquiring pointing data from the action information; generating a point-position operation strategy from the pointing data and the point-position information; and executing the point-position operation according to the strategy. Optionally, after the executing of the point-position moving operation, the method further includes: generating point-position operation verification data from the point-position limit and the point-position operation strategy; and inputting the verification data into a camera-movement verification formula to obtain a point-position verification result, the verification formula being Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device including a processor and a memory, the memory storing computer-readable instructions and the processor being configured to run the computer-readable instructions, wherein the computer-readable instructions, when run, perform a method for controlling a camera point position through a pan-tilt command.
Specifically, the method comprises: acquiring camera point-position information and pan-tilt command information; refining the pan-tilt command information to obtain action information and function information; executing a point-position moving operation according to the action information and the point-position information; and activating the command function of the moved camera point position according to the function information. Optionally, before the acquiring of the camera point-position information and the pan-tilt command information, the method further includes: acquiring a point-position limit and a point-position unit angle in the high-precision imaging system; and generating the camera point-position information from the point-position limit and the point-position unit angle. Optionally, the executing of the point-position moving operation includes: acquiring pointing data from the action information; generating a point-position operation strategy from the pointing data and the point-position information; and executing the point-position operation according to the strategy. Optionally, after the executing of the point-position moving operation, the method further includes: generating point-position operation verification data from the point-position limit and the point-position operation strategy; and inputting the verification data into a camera-movement verification formula to obtain a point-position verification result, the verification formula being Y = mov(P) × tan(C), where Y is the verification result, mov(P) is the point-position limit, and C is the point-position steering parameter in the point-position operation strategy.
The embodiment numbers above are merely for description and do not imply any ranking of the embodiments.
In the foregoing embodiments of the present invention, each embodiment is described with its own emphasis; for any part not detailed in one embodiment, refer to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units is only a logical functional division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the couplings, direct couplings, or communication connections shown or discussed may be implemented through certain interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, Fig. 3 is a schematic hardware structure of a terminal device according to an embodiment of the present application. As shown in Fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between these elements. The memory 33 may include high-speed RAM and may also include non-volatile memory (NVM), such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of this embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Alternatively, the input device 30 may include a variety of input devices, for example at least one of a user-oriented user interface, a device-oriented device interface, a programmable software interface, a camera, a sensor, and a transceiver. Optionally, the device-oriented device interface may be a wired interface for data transmission between devices, or a hardware insertion interface (such as a USB interface or a serial port) for data transmission between devices. Optionally, the user-oriented user interface may be, for example, control keys, a voice input device for receiving voice input, or a touch-sensing device (e.g., a touch screen or touch pad with touch-sensing functionality) for receiving user touch input. Optionally, the programmable software interface may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip. Optionally, the transceiver may be a radio-frequency transceiver chip, a baseband processing chip, or a transceiver antenna with a communication function. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, an audio output device, or the like.
In this embodiment, the processor of the terminal device may implement the functions of each module of the data processing apparatus described above; for specific functions and technical effects, refer to the foregoing embodiments, which are not repeated here.
Fig. 4 is a schematic hardware structure of a terminal device according to another embodiment of the present application, and shows a specific implementation of the embodiment of Fig. 3. As shown in Fig. 4, the terminal device of this embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation of the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and video. The memory 42 may include random access memory (RAM) and may also include non-volatile memory, such as at least one disk memory.
Optionally, a processor 41 is provided in a processing assembly 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. Which components the terminal device specifically includes is set according to actual requirements and is not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply assembly 44 provides power to the various components of the terminal device. Power supply components 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal devices.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action but also the duration and pressure associated with it.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, audio assembly 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing assembly 40 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: volume button, start button and lock button.
The sensor assembly 48 includes one or more sensors for providing status assessments of various aspects of the terminal device. For example, the sensor assembly 48 may detect the open/closed state of the terminal device, the relative positioning of components, and the presence or absence of user contact with the terminal device. The sensor assembly 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 48 may also include a camera or the like.
The communication component 43 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log into a GPRS network and establish communication with a server through the Internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 in the embodiment of Fig. 4 may be implemented as the input device 30 in the embodiment of Fig. 3.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as standalone products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part that contributes to the prior art, or in whole or in part, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make various modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations shall also fall within the protection scope of the present invention.

Claims (10)

1. A method for controlling a camera shooting point position through a pan/tilt command, comprising:
acquiring camera shooting point position information and pan/tilt command information;
refining the pan/tilt command information to obtain action information and function information;
executing a camera shooting point position moving operation according to the action information and the camera shooting point position information; and
activating, according to the function information, a command function of the camera shooting point position after the movement.
2. The method of claim 1, wherein before the acquiring of the camera shooting point position information and the pan/tilt command information, the method further comprises:
acquiring a camera shooting point position limit and a camera shooting point position unit angle of a high-precision shooting system; and
generating the camera shooting point position information according to the camera shooting point position limit and the camera shooting point position unit angle.
3. The method of claim 2, wherein the executing of the camera shooting point position moving operation according to the action information and the camera shooting point position information comprises:
acquiring pointing data in the action information;
generating a camera shooting point position operation strategy from the pointing data and the camera shooting point position information; and
executing the camera shooting point position operation according to the camera shooting point position operation strategy.
4. The method according to claim 3, wherein after the executing of the camera shooting point position moving operation according to the action information and the camera shooting point position information, the method further comprises:
generating camera shooting point position operation verification data according to the camera shooting point position limit and the camera shooting point position operation strategy; and
inputting the camera shooting point position operation verification data into a camera movement verification formula to obtain a camera shooting point position verification result, wherein the verification formula comprises:
Y = Mov(P) × tan(C), where Y is the verification result, Mov(P) is the camera shooting point position limit, and C is the point position steering parameter in the camera shooting point position operation strategy.
5. A device for controlling a camera shooting point position through a pan/tilt command, comprising:
an acquisition module, configured to acquire camera shooting point position information and pan/tilt command information;
a refining module, configured to refine the pan/tilt command information to obtain action information and function information;
an execution module, configured to execute a camera shooting point position moving operation according to the action information and the camera shooting point position information; and
an activation module, configured to activate, according to the function information, a command function of the camera shooting point position after the movement.
6. The apparatus of claim 5, wherein the apparatus further comprises:
the acquisition module is further configured to acquire a camera shooting point position limit and a camera shooting point position unit angle of a high-precision shooting system; and
a generation module, configured to generate the camera shooting point position information according to the camera shooting point position limit and the camera shooting point position unit angle.
7. The apparatus of claim 6, wherein the execution module comprises:
an acquisition unit, configured to acquire pointing data in the action information;
a generation unit, configured to generate a camera shooting point position operation strategy from the pointing data and the camera shooting point position information; and
an execution unit, configured to execute the camera shooting point position operation according to the camera shooting point position operation strategy.
8. The apparatus of claim 7, wherein the apparatus further comprises:
the generation unit is further configured to generate camera shooting point position operation verification data according to the camera shooting point position limit and the camera shooting point position operation strategy; and
a verification unit, configured to input the camera shooting point position operation verification data into a camera movement verification formula to obtain a camera shooting point position verification result, wherein the verification formula comprises:
Y = Mov(P) × tan(C), where Y is the verification result, Mov(P) is the camera shooting point position limit, and C is the point position steering parameter in the camera shooting point position operation strategy.
9. A non-volatile storage medium, characterized in that the non-volatile storage medium comprises a stored program, wherein the program, when run, controls a device in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device, comprising a processor and a memory, wherein the memory stores computer readable instructions which, when executed by the processor, perform the method of any one of claims 1 to 4.
CN202310266759.3A 2023-03-15 2023-03-15 Method and device for controlling shooting point position through pan-tilt command Pending CN116546315A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310266759.3A CN116546315A (en) 2023-03-15 2023-03-15 Method and device for controlling shooting point position through pan-tilt command


Publications (1)

Publication Number Publication Date
CN116546315A true CN116546315A (en) 2023-08-04

Family

ID=87442405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310266759.3A Pending CN116546315A (en) 2023-03-15 2023-03-15 Method and device for controlling shooting point position through pan-tilt command

Country Status (1)

Country Link
CN (1) CN116546315A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN105959545A * | 2016-05-24 | 2016-09-21 | Beijing Xiaomi Mobile Software Co., Ltd. | Camera and camera control method and device
CN108900770A * | 2018-07-17 | 2018-11-27 | Guangdong Genius Technology Co., Ltd. | Method, apparatus, smartwatch and mobile terminal for controlling camera rotation
JP2023007056A * | 2021-07-01 | 2023-01-18 | Canon Inc. | Imaging control device, imaging control method, and program



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination