CN111121774B - Infrared positioning camera capable of detecting self posture in real time - Google Patents

Infrared positioning camera capable of detecting self posture in real time

Info

Publication number
CN111121774B
CN111121774B (application CN202010036314.2A)
Authority
CN
China
Prior art keywords
camera
attitude
imu
calibration
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010036314.2A
Other languages
Chinese (zh)
Other versions
CN111121774A (en)
Inventor
周清会
汤代理
毛佳红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Manheng Digital Technology Co ltd
Original Assignee
Shanghai Manheng Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Manheng Digital Technology Co ltd
Priority to CN202010036314.2A
Publication of CN111121774A
Application granted
Publication of CN111121774B
Status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

Abstract

The application relates to an infrared positioning camera capable of detecting its own pose in real time, and relates to the technical field of computer vision. The method comprises the following steps: acquiring IMU data of a camera and performing data preprocessing; performing IMU attitude calculation on the preprocessed data; performing IMU attitude calibration to calibrate an attitude angle; when not in the camera calibration state, determining whether the IMU attitude is calibrated; calculating the deviation between the real-time attitude angle and the calibration attitude angle; when the deviation is judged to be larger than a preset threshold value, calculating an overflow ratio and recording a maximum deviation value; and judging that the overflow ratio and the maximum deviation value meet a threshold condition, and determining different camera attitude states. By recording the camera attitude angle calculated by the IMU at camera calibration time and judging changes in camera pose from changes in the angle, the application determines whether the tracking and positioning system needs to recalibrate the camera. This overcomes the problem that existing tracking and positioning systems cannot judge when the camera should be recalibrated, improves the convenience of system use, and reduces the professional requirements on system operators.

Description

Infrared positioning camera capable of detecting self posture in real time
Technical Field
The application relates to the technical field of computer vision, and in particular to an infrared positioning camera capable of detecting its own pose in real time.
Background
In recent years, with the spread of computers, computer vision technology relying on camera images has developed rapidly. Distance information about three-dimensional objects in space is the most basic content in three-dimensional imaging, three-dimensional object reconstruction, and computer-aided design, and has wide practical application value. Binocular stereo vision is an important form of machine vision: based on the parallax principle, it acquires three-dimensional geometric information of an object from multiple images. In a machine vision system, binocular vision generally obtains two digital images of the surrounding scene simultaneously from two cameras at different angles, and can recover the three-dimensional geometric information of an object based on the parallax principle to reconstruct the three-dimensional shape and position of the scene. Computing the parallax requires calibration of the cameras: after the cameras are calibrated, the spatial transformation relationship between the cameras, that is, their relative position and pose, is accurately determined, so the parallax between the two digital images can be calculated. However, once a camera undergoes even a slight displacement or pose change, the calibrated parallax is no longer accurate, leading to large errors in, or outright failure of, three-dimensional reconstruction, and the cameras must be recalibrated. Moreover, after a slight change in camera pose, the resulting increase in or failure of three-dimensional reconstruction cannot be detected automatically; whether the cameras need to be recalibrated can only be judged by manually comparing the computed results with the actual scene, which greatly increases the difficulty of using the system and raises the professional requirements on system users.
Therefore, it is desirable to provide a positioning method and system for an infrared positioning camera capable of detecting its own pose in real time, which records the camera attitude angle calculated by the IMU during camera calibration, judges changes in camera pose from changes in the angle, and thereby judges whether the tracking and positioning system needs to recalibrate the camera. This overcomes the problem that existing tracking and positioning systems cannot judge when the camera should be recalibrated, improves the convenience of system use, and reduces the professional requirements on system operators.
Disclosure of Invention
According to a first aspect of some embodiments of the present application, there is provided a positioning method for an infrared positioning camera capable of detecting its own pose in real time, applied in a terminal (e.g., an electronic device, etc.). The method may include: acquiring IMU data of a camera and performing data preprocessing; performing IMU attitude calculation on the preprocessed data; performing IMU attitude calibration to calibrate an attitude angle; when not in the camera calibration state, determining whether the IMU attitude is calibrated; calculating the deviation between the real-time attitude angle and the calibration attitude angle; when the deviation is judged to be larger than a preset threshold value, calculating an overflow ratio and recording a maximum deviation value; and judging that the overflow ratio and the maximum deviation value meet a threshold condition, and determining different camera attitude states.
In some embodiments, the method may further comprise: determining that the proportion of the different camera pose states meets a threshold condition, sending a camera calibration prompt, and performing camera recalibration.
In some embodiments, the acquiring and pre-processing IMU data of the camera further comprises: acquiring IMU data of a camera, wherein the IMU data comprises accelerometer and gyroscope data; carrying out zero calibration on the XY axis of the accelerometer, and subtracting zero drift; high-pass filtering is performed.
In some embodiments, the performing IMU pose solution on the preprocessed data further includes: and calculating the yaw angle, the pitch angle and the roll angle of the camera by utilizing a 6-axis attitude Kalman filtering algorithm.
In some embodiments, the performing IMU pose calibration to calibrate the pose angle further comprises: when IMU attitude calibration is carried out, the attitude angles of the camera, including a pitch angle and a roll angle, are saved; and when the IMU attitude calibration is completed, obtaining a calibration attitude angle by averaging all attitude angles.
In some embodiments, when not in the camera calibration state, the method may further comprise: determining that the IMU attitude is calibrated, including the attitude angle is calibrated, and determining different camera attitude states; and determining that the IMU attitude is not calibrated, including the attitude angle is not calibrated, and determining that the attitude state of the camera is unknown.
In some embodiments, the method may further comprise: acquiring the deviation between the real-time attitude angle and the calibration attitude angle; when the deviation is judged to be larger than a preset threshold value, recording the number of overflowing camera frames and recording the maximum deviation value, including pitch angle deviation and roll angle deviation; and calculating the ratio of the overflow camera frame number to the total camera frame number to obtain the overflow ratio.
In some embodiments, the determining the different camera pose states further comprises: when the overflow ratio is not more than 0.08 and the maximum deviation value is not more than 2 degrees, marking the camera attitude state as stable; when the overflow ratio is not more than 0.5, marking the camera attitude state as relatively stable; and when neither threshold condition is satisfied, marking the camera attitude state as unstable.
In some embodiments, the method may further comprise: sending a camera calibration prompt when the number of stable cameras accounts for 50% of the total number of cameras; sending a camera calibration prompt when the relatively stable number of cameras accounts for 80% of the total number of cameras; and when the threshold condition is not met, ending the calculation of the camera data of the frame, and entering the calculation of the camera data of the next frame.
According to a second aspect of some embodiments of the present application, there is provided an infrared positioning camera capable of detecting its own posture in real time, the system comprising: a memory configured to store data and instructions; a processor in communication with the memory, wherein the processor, when executing instructions in the memory, is configured to: acquiring IMU data of a camera and performing data preprocessing; performing IMU attitude calculation on the preprocessed data; performing IMU attitude calibration to calibrate an attitude angle; when the IMU is not in the camera calibration state, determining that the IMU posture is calibrated; calculating the deviation between the real-time attitude angle and the calibration attitude angle; when the deviation is judged to be larger than a preset threshold value, calculating an overflow ratio and recording a maximum deviation value; and judging that the overflow ratio and the maximum deviation value meet a threshold condition, and determining different camera attitude states.
Therefore, the infrared positioning camera capable of detecting its own pose in real time records the camera attitude angle calculated by the IMU during camera calibration, judges changes in camera pose from changes in the angle, and further judges whether the tracking and positioning system needs to recalibrate the camera. This solves the problem that existing tracking and positioning systems cannot judge when the camera should be recalibrated, improves the convenience of system use, and reduces the professional requirements on system operators.
Drawings
For a better understanding and appreciation of some embodiments of the present application, reference will now be made to the description of embodiments taken in conjunction with the accompanying drawings, in which like reference numerals designate corresponding parts in the figures.
Fig. 1 is an exemplary schematic diagram of a tracking and positioning system provided in accordance with some embodiments of the present application.
Fig. 2 is an exemplary flowchart of a positioning method of an infrared positioning camera that can detect its own pose in real time according to some embodiments of the present application.
Fig. 3 is an exemplary flowchart of determining whether an attitude angle is calibrated, provided in accordance with some embodiments of the present application.
Fig. 4 is an exemplary flow diagram for determining different camera pose states provided in accordance with some embodiments of the present application.
Fig. 5 is an exemplary flow diagram of sending calibration camera prompts provided in accordance with some embodiments of the present application.
Detailed Description
The following description, with reference to the accompanying drawings, is provided to facilitate a comprehensive understanding of various embodiments of the application as defined by the claims and their equivalents. These embodiments include various specific details for ease of understanding, but these are to be considered exemplary only. Accordingly, those skilled in the art will appreciate that various changes and modifications may be made to the various embodiments described herein without departing from the scope and spirit of the present application. In addition, descriptions of well-known functions and constructions will be omitted herein for brevity and clarity.
The terms and phrases used in the following specification and claims are not to be limited to the literal meaning, but are merely for the clear and consistent understanding of the application. Accordingly, it will be appreciated by those skilled in the art that the description of the various embodiments of the present application is provided for illustration only and not for the purpose of limiting the application as defined by the appended claims and their equivalents.
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the accompanying drawings in some embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It is to be understood that the terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only, and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The expressions "first", "second", "the first", and "the second" are used to modify the corresponding elements without regard to order or importance; they are used only to distinguish one element from another and do not limit the corresponding elements.
A terminal according to some embodiments of the present application may be an electronic device, which may include one or a combination of several of a camera, a personal computer (PC, e.g., tablet, desktop, notebook, netbook, palmtop PDA), a client device, a virtual reality device (VR), a rendering machine, a smartphone, a mobile phone, an e-book reader, a Portable Multimedia Player (PMP), an audio/video player (MP3/MP4), a video camera, and a wearable device, etc. According to some embodiments of the present application, the wearable device may include an accessory type (e.g., watch, ring, bracelet, glasses, or Head Mounted Device (HMD)), an integrated type (e.g., electronic garment), a decorative type (e.g., skin pad, tattoo, or built-in electronic device), and the like, or a combination of several. In some embodiments of the present application, the electronic device may be flexible, not limited to the above devices, or may be a combination of one or more of the above devices. In this application, the term "user" may indicate a person using an electronic device or a device using an electronic device (e.g., an artificial intelligence electronic device).
The embodiment of the application provides a positioning method of an infrared positioning camera capable of detecting the posture of the infrared positioning camera in real time. In order to facilitate understanding of the embodiments of the present application, the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Fig. 1 is an exemplary schematic diagram of a tracking and positioning system provided in accordance with some embodiments of the present application. As shown in fig. 1, the tracking and positioning system 100 may include a network 110, a server 120, a camera 130, and the like. Specifically, the server 120 and the camera 130 are in the same network environment, which may be any shared network environment, for example, the same local area network, such as the network served by the same router. Further, the server 120 may be connected to the network 110 in a wired (e.g., network cable) or wireless manner, and the camera 130 may establish a communication connection with the network 110 in a wired or wireless (e.g., WIFI) manner. In some embodiments, the server 120 may detect the calibration status of the camera 130 in real time, and/or further send a calibration camera prompt, and the like.
According to some embodiments of the present application, the server 120 may include a server, which is a kind of computer that offers faster computation and higher load capacity than an ordinary computer, at a correspondingly higher price. In a network environment, a server provides computing or application services to other clients (e.g., terminals such as PCs, smartphones, and ATMs, and large devices such as transportation systems). A server has high-speed CPU computing capability, long-term reliable operation, strong I/O data throughput, and good expansibility. The services a server may provide include, but are not limited to, responding to service requests, undertaking services, and securing services. As an electronic device, the server has an extremely complex internal structure, similar to that of a general computer; by way of example, the internal structure of the server may include a central processing unit (CPU), a hard disk, a memory, a system bus, and the like. In some embodiments, the central processor may include a processing module, which may include an acquisition unit, a processing unit, a determination unit, a control unit, and the like. The acquisition unit may acquire information from the camera 130, for example IMU data and the like. The processing unit may perform IMU attitude resolution and/or IMU attitude calibration, etc., on the preprocessed data. The determination unit may determine, from the IMU attitude calibration, whether the attitude angle is calibrated; calculate the deviation between the real-time attitude angle and the calibration attitude angle; determine whether the IMU attitude is calibrated; determine different camera attitude states; determine the proportion of the different camera attitude states; and the like. The control unit may recalibrate the camera, etc., in accordance with the calibration camera prompt.
In some embodiments of the present application, the tracking and positioning system 100 may omit one or more elements, or may further include one or more other elements. By way of example, the tracking and positioning system 100 may include a plurality of cameras 130, or a plurality of servers 120, and the like. The network 110 is not limited to a local area network; it may be another type of communication network, which may include a computer network (e.g., a local area network (LAN) or a wide area network (WAN)), the internet, and/or a telephone network, etc., or a combination of several. In some embodiments, the network 110 may be another type of wireless communication network. The wireless communication may include microwave communication and/or satellite communication, among others. The wireless communication may also include cellular communication, such as Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), third-generation mobile communication (3G), fourth-generation mobile communication (4G), fifth-generation mobile communication (5G), sixth-generation mobile communication (6G), Long Term Evolution (LTE), LTE-Advanced (LTE-A), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), and the like, or a combination of several.
In some embodiments, WIFI may be replaced by other types of wireless communication technology. According to some embodiments of the present application, the wireless communication may include wireless local area network (WiFi), Bluetooth, Bluetooth Low Energy (BLE), ZigBee, Near Field Communication (NFC), magnetic secure transmission, radio frequency, and body area network (BAN), and the like, or a combination of several. The wireless communication may also include a global navigation satellite system (GNSS), such as the Global Positioning System (GPS), the BeiDou navigation satellite system, or Galileo (the European global satellite navigation system). The wired communication may include a Universal Serial Bus (USB), a High-Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), and/or plain old telephone service (POTS), and the like, or a combination of several.
It should be noted that the above description of the tracking and positioning system 100 is merely for convenience of description and should not limit the scope of the present application. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the principles of the system, which may be combined in any manner or combined with other elements to form a subsystem for use in a field of application in which the method and system described above is practiced. For example, the server 120 may be integrated with the camera 130 in one device. Such variations are within the scope of the present application.
Fig. 2 is an exemplary flowchart of a positioning method of an infrared positioning camera that can detect its own pose in real time according to some embodiments of the present application. As shown in fig. 2, the process 200 may be implemented by the tracking and positioning system 100. In some embodiments, the positioning method 200 of the infrared positioning camera capable of detecting the self-gesture in real time can be automatically started or started through instructions. The instructions may include system instructions, device instructions, user instructions, action instructions, and the like, or a combination of the several. As an example, the system instructions may be generated from information acquired by a sensor, and the like. The user instructions may include voice, gestures, actions, user interfaces, virtual keys, and/or the like, or a combination of the several.
At 201, IMU data of a camera is acquired and data preprocessing is performed. Operation 201 may be implemented by the acquisition unit, the processing unit, and/or the camera 130 of the server 120 of the tracking and positioning system 100. In some embodiments, the acquisition unit may acquire IMU data of the camera 130, on which the processing unit may perform data preprocessing. The IMU data may include accelerometer data (ax, ay, az) and gyroscope data (gx, gy, gz). The data preprocessing may include zero calibration of the accelerometer's XY axes, subtracting the zero drift, and the like. The data preprocessing may further include high-pass filtering.
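As an illustration of operation 201, the following is a minimal preprocessing sketch in Python. The patent does not specify the zero-offset measurement procedure, the sample rate, the filter order or cutoff, or which signal is high-pass filtered, so accel_zero_xy, fs, cutoff, and the choice to filter the gyroscope stream are all assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, lfilter

def preprocess_imu(accel, gyro, accel_zero_xy, fs=100.0, cutoff=0.1):
    """Preprocess raw IMU samples (illustrative sketch).

    accel, gyro   : (N, 3) arrays of accelerometer (ax, ay, az) and
                    gyroscope (gx, gy, gz) samples
    accel_zero_xy : zero-drift offsets for the accelerometer X and Y axes,
                    e.g. averaged while the camera is known to be at rest
    fs, cutoff    : sample rate and high-pass cutoff in Hz (assumed values)
    """
    accel = accel.astype(float).copy()
    # Zero calibration of the XY axes: subtract the recorded zero drift.
    accel[:, 0] -= accel_zero_xy[0]
    accel[:, 1] -= accel_zero_xy[1]
    # High-pass filtering; applied here to the gyro stream to suppress
    # slowly varying bias (which signal is filtered is an assumption).
    b, a = butter(1, cutoff / (fs / 2.0), btype="highpass")
    gyro_filtered = lfilter(b, a, gyro, axis=0)
    return accel, gyro_filtered
```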
At 202, IMU pose solution is performed on the preprocessed data. Operation 202 may be implemented by a processing unit of the server 120 of the tracking and positioning system 100. In some embodiments, the processing unit may perform IMU pose solution on the preprocessed data. As an example, the processing unit may calculate the camera's attitude angles (Yaw, Pitch, Roll), i.e., yaw, pitch, and roll, using a 6-axis attitude Kalman filtering algorithm. According to some embodiments of the application, 6-axis data cannot accurately resolve the yaw angle (Yaw), so this value is not used subsequently; the processing unit may calculate only the pitch angle (Pitch) and roll angle (Roll) of the camera.
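A common way to realize such a 6-axis attitude Kalman filter is a per-axis filter over the state (angle, gyro bias), with the gyro rate as the process input and the accelerometer-derived angle as the measurement. The sketch below follows that textbook form; the noise parameters, class name, and helper function are assumptions, not the patent's exact formulation.

```python
import numpy as np

class AngleKalman:
    """Per-axis Kalman filter over the state [angle, gyro_bias] (illustrative)."""

    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.angle = 0.0           # filtered angle estimate, degrees
        self.bias = 0.0            # estimated gyro bias, degrees/s
        self.P = np.zeros((2, 2))  # state covariance
        self.q_angle, self.q_bias, self.r = q_angle, q_bias, r_measure

    def update(self, rate, angle_meas, dt):
        # Predict: integrate the bias-corrected gyro rate.
        self.angle += dt * (rate - self.bias)
        self.P[0, 0] += dt * (dt * self.P[1, 1] - self.P[0, 1] - self.P[1, 0] + self.q_angle)
        self.P[0, 1] -= dt * self.P[1, 1]
        self.P[1, 0] -= dt * self.P[1, 1]
        self.P[1, 1] += self.q_bias * dt
        # Correct with the accelerometer-derived angle measurement.
        S = self.P[0, 0] + self.r
        K0, K1 = self.P[0, 0] / S, self.P[1, 0] / S
        y = angle_meas - self.angle
        self.angle += K0 * y
        self.bias += K1 * y
        P00, P01 = self.P[0, 0], self.P[0, 1]
        self.P[0, 0] -= K0 * P00
        self.P[0, 1] -= K0 * P01
        self.P[1, 0] -= K1 * P00
        self.P[1, 1] -= K1 * P01
        return self.angle

def accel_pitch_roll(ax, ay, az):
    """Pitch and roll (degrees) from the gravity direction seen by the accelerometer."""
    pitch = np.degrees(np.arctan2(-ax, np.hypot(ay, az)))
    roll = np.degrees(np.arctan2(ay, az))
    return pitch, roll

# Usage sketch: one filter per axis, fed at the IMU sample interval dt, e.g.
#   pitch = pitch_kf.update(gy, pitch_acc, dt); roll = roll_kf.update(gx, roll_acc, dt)
# Yaw is left out: without a magnetometer, 6-axis data cannot observe it reliably.
```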
At 203, IMU attitude calibration is performed to calibrate the attitude angle. Operation 203 may be implemented by a processing unit and a determination unit of the server 120 of the tracking and positioning system 100. In some embodiments, while camera calibration is being performed, the processing unit may save the attitude angles of the camera, where the attitude angle (Pitch, Roll) includes the pitch angle and the roll angle. Further, when the camera calibration is completed, the determination unit may average all the saved attitude angles (Pitch, Roll) to obtain the calibration attitude angle (Pitch_cal, Roll_cal).
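Operation 203 then reduces to accumulating the (Pitch, Roll) samples produced while calibration runs and averaging them. A sketch under the same assumptions:

```python
class AttitudeCalibrator:
    """Collects attitude angles during camera calibration and averages them."""

    def __init__(self):
        self.pitch_samples = []
        self.roll_samples = []

    def add_sample(self, pitch, roll):
        # Save the attitude angle (Pitch, Roll) while calibration is running.
        self.pitch_samples.append(pitch)
        self.roll_samples.append(roll)

    def finish(self):
        # Average all saved angles to obtain (Pitch_cal, Roll_cal).
        pitch_cal = sum(self.pitch_samples) / len(self.pitch_samples)
        roll_cal = sum(self.roll_samples) / len(self.roll_samples)
        return pitch_cal, roll_cal
```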
At 204, when not in the camera calibration state, it is determined whether the IMU pose is calibrated. Operation 204 may be implemented by a determination unit of the server 120 of the tracking and positioning system 100. In some embodiments, the determination unit may determine, when not in the camera calibration state, whether the IMU pose is calibrated. As an example, the determination unit may determine whether the attitude angle (Pitch, Roll) has been calibrated to (Pitch_cal, Roll_cal), and the like.
According to some embodiments of the present application, determining whether the attitude angle is calibrated may be implemented through process 300. Fig. 3 is an exemplary flowchart for determining whether the attitude angle is calibrated, provided according to some embodiments of the present application. As shown in fig. 3, the process 300 may be implemented by the tracking and positioning system 100.
At 301, IMU data for the camera, including accelerometer and gyroscope data, is acquired. Operation 301 may be implemented by an acquisition unit of the server 120 of the tracking and positioning system 100. As an example, the acquisition unit may acquire IMU data of the camera, including accelerometer and gyroscope data.
At 302, the pitch and roll angles of the camera are calculated by IMU attitude calculation. Operation 302 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may calculate the pitch angle and the roll angle of the camera using a 6-axis attitude Kalman filter algorithm.
At 303, when IMU attitude calibration is performed, the attitude angles of the camera, including pitch and roll angles, are saved. Operation 303 may be implemented by a processing unit of the server 120 of the tracking and positioning system 100. As an example, when performing IMU pose calibration, the processing unit may save the pose angles of the camera, including pitch and roll angles.
At 304, when IMU attitude calibration is complete, a calibration attitude angle is obtained by averaging all attitude angles. Operation 304 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, when the IMU attitude calibration is completed, the determination unit may obtain the calibration attitude angle by averaging all the attitude angles.
At 305, it is determined whether the attitude angle has been calibrated. Operation 305 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may determine whether the IMU pose is calibrated, i.e., whether the attitude angle is calibrated.
If the pose angle is calibrated, operation 306 is entered. At 306, different camera pose states are determined. Operation 306 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, if the pose angle is calibrated, the determination unit may further determine the different camera pose states. The camera pose states may include stable, relatively stable, unstable, and the like. The determination of the different camera pose states may be performed by process 400.
If the pose angle is not calibrated, operation 307 is entered. At 307, the camera pose state is marked as unknown. Operation 307 may be implemented by a processing unit of the server 120 of the tracking and positioning system 100. As an example, if the pose angle is uncalibrated, the processing unit may mark the camera pose state as unknown. The camera pose states can subsequently be used to determine the proportion of different camera pose states, and the like.
At 205, the deviation of the real-time attitude angle from the calibration attitude angle is calculated. Operation 205 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. In some embodiments, the determination unit may calculate the deviation of the real-time attitude angle from the calibration attitude angle. The deviations include a pitch angle deviation d_Pitch = Pitch_cal − Pitch and a roll angle deviation d_Roll = Roll_cal − Roll.
At 206, when the deviation is judged to be greater than the preset threshold, the overflow ratio is calculated and the maximum deviation value is recorded. Operation 206 may be implemented by the determination unit and the processing unit of the server 120 of the tracking and positioning system 100. In some embodiments, when the determination unit judges that the attitude angle deviation d_Pitch or d_Roll is greater than a preset (empirical) threshold, it determines that the angle has changed, and the processing unit may increment the overflow camera frame count m_overflowCount and record the maximum deviation values, including the pitch angle deviation value d_Pitch_max and the roll angle deviation value d_Roll_max. Further, the determination unit may calculate an overflow ratio Rate = m_overflowCount / m_frameCount, the ratio of the overflow camera frame count to the total camera frame count, where m_frameCount is the total camera frame count.
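The per-frame bookkeeping of operations 205 and 206 could look as follows; the variable names mirror the text (m_overflowCount, m_frameCount, d_Pitch_max, d_Roll_max), while the threshold value is an assumed placeholder, since the patent only calls it an empirical threshold.

```python
class DeviationTracker:
    """Tracks attitude-angle deviations against the calibration attitude (illustrative)."""

    def __init__(self, pitch_cal, roll_cal, threshold_deg=1.0):  # threshold assumed
        self.pitch_cal, self.roll_cal = pitch_cal, roll_cal
        self.threshold = threshold_deg
        self.m_overflow_count = 0
        self.m_frame_count = 0
        self.d_pitch_max = 0.0
        self.d_roll_max = 0.0

    def add_frame(self, pitch, roll):
        self.m_frame_count += 1
        d_pitch = abs(self.pitch_cal - pitch)
        d_roll = abs(self.roll_cal - roll)
        if d_pitch > self.threshold or d_roll > self.threshold:
            # Angle changed this frame: count it and keep the maximum deviations.
            self.m_overflow_count += 1
            self.d_pitch_max = max(self.d_pitch_max, d_pitch)
            self.d_roll_max = max(self.d_roll_max, d_roll)

    @property
    def overflow_ratio(self):
        # Rate = m_overflowCount / m_frameCount
        return self.m_overflow_count / self.m_frame_count if self.m_frame_count else 0.0
```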
At 207, it is determined that the overflow ratio and the maximum deviation value satisfy a first threshold condition, and different camera pose states are determined. Operation 207 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. In some embodiments, the determination unit may determine the different camera pose states when it judges that the overflow ratio and the maximum deviation value satisfy the first threshold condition. The first threshold condition may include an overflow-ratio condition, a maximum-deviation-value condition, and the like.
According to some embodiments of the present application, determining the different camera pose states may be implemented by process 400. Fig. 4 is an exemplary flow diagram for determining different camera pose states provided according to some embodiments of the present application. As shown in fig. 4, the process 400 may be implemented by the tracking and positioning system 100.
At 401, the deviation of the real-time attitude angle from the calibration attitude angle is calculated. Operation 401 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may calculate the deviations of the real-time attitude angle from the calibration attitude angle, including the pitch angle deviation and the roll angle deviation.
At 402, a change in camera angle is determined when the deviation is judged to be greater than an empirical threshold. Operation 402 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may determine that the camera angle has changed when the deviation is judged to be greater than the preset threshold.
At 403, the number of overflow camera frames is recorded, and the maximum deviation values, including the pitch angle deviation and roll angle deviation, are recorded. Operation 403 may be implemented by the processing unit of the server 120 of the tracking and positioning system 100. As an example, the processing unit may record the number of overflow camera frames and record the maximum deviation values, including the pitch angle deviation and the roll angle deviation.
At 404, the ratio of the overflow camera frame count to the total camera frame count is calculated to obtain the overflow ratio. Operation 404 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may calculate the ratio of the overflow camera frame count to the total camera frame count, yielding the overflow ratio.
At 405, the overflow ratio and the maximum deviation value are judged. Operation 405 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may judge whether the overflow ratio and the maximum deviation value satisfy the threshold condition.
At 406, when the overflow ratio is not more than 0.08 and the maximum deviation value is not more than 2°, the camera pose state is marked as stable. Operation 406 may be implemented by the determination unit and the processing unit of the server 120 of the tracking and positioning system 100. As an example, when the determination unit judges that the overflow ratio is not more than 0.08 and the maximum deviation value is not more than 2°, the processing unit may mark the camera attitude state as stable.
At 407, when the overflow ratio is not more than 0.5, the camera pose state is marked as relatively stable. Operation 407 may be implemented by the determination unit and the processing unit of the server 120 of the tracking and positioning system 100. As an example, when the determination unit judges that the overflow ratio is not more than 0.5, the processing unit may mark the camera pose state as relatively stable.
At 408, otherwise, the camera pose state is marked as unstable. Operation 408 may be implemented by the determination unit and the processing unit of the server 120 of the tracking and positioning system 100. As an example, the processing unit may mark the camera pose state as unstable when the determination unit judges that neither threshold condition is satisfied.
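Collecting the thresholds of operations 405 through 408 into one helper (the 0.08, 2° and 0.5 values come directly from the text; the function name is illustrative):

```python
def classify_camera_state(overflow_ratio, d_pitch_max, d_roll_max):
    """Map the overflow statistics of one camera to a pose state."""
    max_dev = max(d_pitch_max, d_roll_max)
    if overflow_ratio <= 0.08 and max_dev <= 2.0:
        return "stable"
    if overflow_ratio <= 0.5:
        return "relatively stable"
    return "unstable"
```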
According to some embodiments of the present application, the process 200 may further include an operation 208: determining that the proportion of the different camera pose states satisfies a second threshold condition, and sending a calibration camera prompt. Operation 208 may be implemented by the determination unit, the processing unit, the control unit, etc. of the server 120 of the tracking and positioning system 100. In some embodiments, the determination unit may determine that the proportion of the different camera pose states satisfies the second threshold condition, the processing unit may send a calibration camera prompt, and the control unit may perform a recalibration of the camera. The second threshold condition may include a stable-camera count ratio and/or a relatively-stable-camera count ratio, and/or the like.
According to some embodiments of the present application, determining whether the proportion of the different camera pose states satisfies the threshold condition may be implemented by process 500. Fig. 5 is an exemplary flow diagram for sending a calibration camera prompt provided according to some embodiments of the present application. As shown in fig. 5, the process 500 may be implemented by the tracking and positioning system 100.
At 501, different camera pose states are acquired. Operation 501 may be implemented by an acquisition unit of the server 120 of the tracking and positioning system 100. As an example, the acquisition unit may acquire different camera pose states. The different camera pose states may include unknown, stable, relatively stable, and unstable, among others.
At 502, the proportion of the different camera pose states is determined. Operation 502 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may determine the ratio of each camera pose state to the total number of cameras.
At 503, it is determined that the number of stable cameras accounts for 50% of the total number of cameras. Operation 503 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may determine that the number of stable cameras accounts for 50% of the total number of cameras.
At 504, a calibration camera prompt is sent. Operation 504 may be implemented by the processing unit and the control unit of the server 120 of the tracking and positioning system 100. As an example, the processing unit may send a calibration camera prompt and the control unit may perform a recalibration of the camera.
At 505, it is determined that the number of relatively stable cameras accounts for 80% of the total number of cameras. Operation 505 may be implemented by the determination unit of the server 120 of the tracking and positioning system 100. As an example, the determination unit may determine that the number of relatively stable cameras accounts for 80% of the total number of cameras.
At 506, a calibration camera prompt is sent. Operation 506 may be implemented by the processing unit and the control unit of the server 120 of the tracking and positioning system 100. As an example, the processing unit may send a calibration camera prompt and the control unit may perform a recalibration of the camera.
At 507, when the threshold condition is not satisfied, the calculation of the current frame's camera data is ended, and the calculation of the next frame's camera data begins. Operation 507 may be implemented by the determination unit and the control unit of the server 120 of the tracking and positioning system 100. As an example, when the determination unit judges that the threshold condition is not satisfied, the control unit may end the calculation for the current frame of camera data and proceed to the calculation for the next frame of camera data.
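A sketch of the decision in process 500, taking the 50% and 80% proportions literally from the text; how per-camera states are gathered and how the prompt reaches the operator are assumptions, and send_calibration_prompt is a hypothetical hook.

```python
def send_calibration_prompt():
    # Hypothetical notification hook; a real system might alert the operator UI.
    print("Camera pose change detected: please recalibrate the cameras.")

def check_recalibration(states):
    """states: list of per-camera pose states
    ('unknown', 'stable', 'relatively stable', 'unstable')."""
    total = len(states)
    if total == 0:
        return False
    stable_ratio = states.count("stable") / total
    rel_stable_ratio = states.count("relatively stable") / total
    # Proportions as stated in the text: 50% stable or 80% relatively stable
    # triggers a calibration prompt; otherwise processing moves on to the
    # next frame of camera data.
    if stable_ratio >= 0.5 or rel_stable_ratio >= 0.8:
        send_calibration_prompt()
        return True
    return False
```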
It should be noted that the above descriptions of the processes 200, 300, 400, and 500 are only for convenience of description, and should not limit the scope of the present application. It will be understood by those skilled in the art that various modifications and changes in form and detail may be made in the functions implementing the above-described processes and operations based on the principles of the present system, in any combination of operations or in combination with other operations constituting sub-processes without departing from the principles. For example, the process 200 may perform operations such as camera re-calibration. Such variations are within the scope of the present application.
In summary, the infrared positioning camera capable of detecting its own pose in real time records the camera attitude angle calculated by the IMU during camera calibration, judges changes in camera pose from changes in the angle, and further judges whether the tracking and positioning system needs to recalibrate the camera. This solves the problem that existing tracking and positioning systems cannot judge when the camera should be recalibrated, improves the convenience of system use, and reduces the professional requirements on system operators.
It is to be noted that the above-described embodiments are merely examples, and the present application is not limited to such examples, but various changes may be made.
It should be noted that, in the present specification, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Finally, it should be noted that the series of processes described above includes not only processes performed in time series in the order described herein, but also processes performed in parallel or individually, rather than in time series.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware associated with computer program instructions, and the program can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
While the invention has been described with reference to a number of illustrative embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (9)

1. A positioning method of an infrared positioning camera capable of detecting self posture in real time is characterized by comprising the following steps:
acquiring IMU data of a camera and performing data preprocessing;
performing IMU attitude calculation on the preprocessed data;
performing IMU attitude calibration to calibrate an attitude angle;
when the IMU is not in the camera calibration state, determining that the IMU posture is calibrated;
calculating the deviation between the real-time attitude angle and the calibration attitude angle;
when the deviation is judged to be larger than a preset threshold value, calculating an overflow ratio and recording a maximum deviation value;
judging that the overflow ratio and the maximum deviation value meet the threshold condition, determining different camera attitude states, wherein the method also comprises the following steps,
acquiring the deviation between the real-time attitude angle and the calibration attitude angle;
when the deviation is judged to be larger than a preset threshold value, recording the number of overflowing camera frames and recording the maximum deviation value, including pitch angle deviation and roll angle deviation;
and calculating the ratio of the overflow camera frame number to the total camera frame number to obtain the overflow ratio.
2. The method of claim 1, further comprising:
and determining that the ratio of different camera postures meets a threshold condition, sending a camera calibration prompt, and performing camera recalibration.
3. The method of claim 2, wherein the acquiring and data pre-processing IMU data for a camera further comprises:
acquiring IMU data of a camera, wherein the IMU data comprises accelerometer and gyroscope data;
carrying out zero calibration on the XY axis of the accelerometer, and subtracting zero drift;
high-pass filtering is performed.
4. The method of claim 3, wherein performing IMU attitude solution on the preprocessed data further comprises:
and calculating the yaw angle, the pitch angle and the roll angle of the camera by utilizing a 6-axis attitude Kalman filtering algorithm.
5. The method of claim 4, wherein the performing IMU pose calibration to calibrate pose angles further comprises:
when IMU attitude calibration is carried out, the attitude angles of the camera, including a pitch angle and a roll angle, are saved;
and when the IMU attitude calibration is completed, obtaining a calibration attitude angle by averaging all attitude angles.
6. The method of claim 5, when not in a camera calibration state, further comprising:
determining that the IMU attitude is calibrated, including the attitude angle is calibrated, and determining different camera attitude states;
and determining that the IMU attitude is not calibrated, including the attitude angle is not calibrated, and determining that the attitude state of the camera is unknown.
7. The method of claim 1, wherein the determining the different camera pose states further comprises:
when the overflow ratio is not more than 0.08 and the maximum deviation value is not more than 2 degrees, marking the attitude state of the camera to be stable;
when the overflow ratio is not more than 0.5, marking the attitude state of the camera as relatively stable;
when the threshold condition is not satisfied, the camera pose state is marked as unstable.
8. The method of claim 7, further comprising:
sending a camera calibration prompt when the number of stable cameras accounts for 50% of the total number of cameras;
sending a camera calibration prompt when the relatively stable number of cameras accounts for 80% of the total number of cameras;
and when the threshold condition is not met, ending the calculation of the camera data of the frame, and entering the calculation of the camera data of the next frame.
9. An infrared positioning camera capable of detecting self-attitude in real time, comprising:
a memory configured to store data and instructions;
a processor in communication with the memory, wherein the processor, when executing instructions in the memory, is configured to:
acquiring IMU data of a camera and performing data preprocessing;
performing IMU attitude calculation on the preprocessed data;
performing IMU attitude calibration to calibrate an attitude angle;
when the IMU is not in the camera calibration state, determining that the IMU posture is calibrated;
calculating the deviation between the real-time attitude angle and the calibration attitude angle;
when the deviation is judged to be larger than a preset threshold value, calculating an overflow ratio and recording a maximum deviation value;
judging that the overflow ratio and the maximum deviation value meet a threshold condition, and determining different camera attitude states; acquiring the deviation between the real-time attitude angle and the calibration attitude angle;
when the deviation is judged to be larger than a preset threshold value, recording the number of overflowing camera frames and recording the maximum deviation value, including pitch angle deviation and roll angle deviation;
and calculating the ratio of the overflow camera frame number to the total camera frame number to obtain the overflow ratio.
CN202010036314.2A 2020-01-14 2020-01-14 Infrared positioning camera capable of detecting self posture in real time Active CN111121774B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010036314.2A 2020-01-14 2020-01-14 Infrared positioning camera capable of detecting self posture in real time


Publications (2)

Publication Number Publication Date
CN111121774A (en) 2020-05-08
CN111121774B (en) 2021-04-06

Family

ID=70489302

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010036314.2A 2020-01-14 2020-01-14 Infrared positioning camera capable of detecting self posture in real time (CN111121774B, Active)

Country Status (1)

Country Link
CN (1) CN111121774B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116156310A * 2023-01-10 2023-05-23 University of Jinan Wearable camera gesture monitoring and recognition system and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102997915A * 2011-09-15 2013-03-27 Beijing Institute of Automation Control Equipment POS post-processing method with combination of closed-loop forward filtering and reverse smoothing
US9031782B1 * 2012-01-23 2015-05-12 The United States Of America As Represented By The Secretary Of The Navy System to use digital cameras and other sensors in navigation
US9182236B2 * 2013-10-25 2015-11-10 Novatel Inc. System for post processing GNSS/INS measurement data and camera image data
CN104197933B * 2014-09-16 2017-11-07 Institute of Optics and Electronics, Chinese Academy of Sciences High magnitude slides enhancing and the extracting method of fixed star in a kind of range of telescope
CN104243833B * 2014-09-30 2017-12-08 Jingchen Zhiyun (Wuhan) Technology Co., Ltd. Camera posture adjusting method and device
CN104732518B * 2015-01-19 2017-09-01 Beijing University of Technology A kind of PTAM improved methods based on intelligent robot terrain surface specifications
CN105180963B * 2015-07-22 2018-02-16 Beihang University Unmanned plane telemetry parameter modification method based on online calibration
CN106705965A * 2017-01-12 2017-05-24 Suzhou Zhongde Ruibo Intelligent Technology Co., Ltd. Scene three-dimensional data registration method and navigation system error correction method
CN110349219A * 2018-04-04 2019-10-18 Hangzhou Hikvision Digital Technology Co., Ltd. A kind of Camera extrinsic scaling method and device
CN108592919B * 2018-04-27 2019-09-17 Baidu Online Network Technology (Beijing) Co., Ltd. Drawing and localization method, device, storage medium and terminal device

Also Published As

Publication number Publication date
CN111121774A (en) 2020-05-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: Room 809-6, building 2, No. 500, Shunqing Road, Songjiang District, Shanghai 201103

Patentee after: SHANGHAI MANHENG DIGITAL TECHNOLOGY Co.,Ltd.

Address before: 201103 room 1202, building 3, No. 518, Xinzhuan Road, Xinqiao Town, Songjiang District, Shanghai

Patentee before: SHANGHAI MANHENG DIGITAL TECHNOLOGY Co.,Ltd.