CN117670994A - Image processing method, calibration system and related equipment - Google Patents

Image processing method, calibration system and related equipment

Info

Publication number
CN117670994A
CN117670994A (application CN202211005411.0A)
Authority
CN
China
Prior art keywords
imu
images
positional relationship
relative positional
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211005411.0A
Other languages
Chinese (zh)
Inventor
庄开元
周开城
杨兵
谢鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211005411.0A priority Critical patent/CN117670994A/en
Priority to PCT/CN2023/104689 priority patent/WO2024041202A1/en
Publication of CN117670994A publication Critical patent/CN117670994A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses an image processing method applicable to electronic device positioning scenarios. The method comprises: acquiring a plurality of images, wherein each image includes a calibration body and an electronic device rigidly connected to the calibration body, the electronic device bearing light spots on its exterior and containing an inertial measurement unit (IMU) device inside; determining a first relative positional relationship between the calibration body and the light spot based on the plurality of images; acquiring a second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data; and determining a third relative positional relationship between the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship, the third relative positional relationship being used to position the electronic device. By using a calibration body rigidly connected to the electronic device as an intermediate medium, the third relative positional relationship between the light spot and the IMU device is determined. According to the third relative positional relationship, the measurement data output by the IMU can be accurately expressed in the coordinate system of the handle light spots, improving the positioning accuracy of the electronic device.

Description

Image processing method, calibration system and related equipment
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an image processing method, a calibration system, and related devices.
Background
With the popularity of augmented reality (AR), virtual reality (VR), and mixed reality (MR) technologies, AR, VR, and MR devices have come into wide use in many contexts, such as work and entertainment. Unlike conventional terminal devices, current AR, VR, and MR devices use independent handles as the primary interaction scheme.
Currently, handles are often positioned using an outside-in tracking technique: a camera on the helmet captures specially designed light spots on the handle, and positioning is performed in combination with the inertial measurement unit (IMU) carried by the handle itself. However, because the handle carries few light spots (typically only tens) and the machining accuracy is limited, the origin of the handle light-spot coordinate system does not coincide with the origin of the IMU coordinate system; that is, there is a positional deviation between the IMU coordinate system and the handle coordinate system. As a result, the IMU data cannot directly reflect the motion of the handle, which degrades the positioning accuracy of the handle.
Therefore, how to accurately represent the measurement data output by the IMU in the coordinate system of the handle light spots is a technical problem to be solved.
Disclosure of Invention
The present application provides an image processing method for achieving extrinsic calibration between the external light spots of an electronic device and its IMU coordinate system, thereby improving the positioning accuracy of the electronic device.
A first aspect of the embodiments of the present application provides an image processing method, which may be applied to electronic device positioning scenarios, for example virtual reality (VR)/augmented reality (AR) gaming, learning, or competition scenarios. The method may be performed by an image processing device, or by a component of an image processing device (e.g., a processor, a chip, or a system-on-chip). The method comprises: acquiring a plurality of images, wherein each image comprises a calibration body and an electronic device with light spots, the calibration body is rigidly connected to the electronic device, an inertial measurement unit (IMU) device is disposed inside the electronic device, and the positions of the electronic device differ in at least two of the images; determining a first relative positional relationship between the calibration body and the light spot based on the plurality of images; acquiring IMU data collected by the IMU device, the IMU data being related to the plurality of images; acquiring a second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data; and determining a third relative positional relationship between the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship, the third relative positional relationship being used to position the electronic device.
In the embodiments of the present application, a calibration body rigidly connected to the electronic device (e.g., a handle, an aircraft, etc.) is constructed and used as an intermediate medium or reference to determine the third relative positional relationship between the light spot and the IMU device. According to the third relative positional relationship, the measurement data output by the IMU can be accurately expressed in the coordinate system of the handle light spots, improving the positioning accuracy of the electronic device.
Optionally, in a possible implementation of the first aspect, determining the first relative positional relationship between the calibration body and the light spot based on the plurality of images comprises: acquiring first coordinates of the calibration body in an image coordinate system based on the plurality of images; acquiring second coordinates of the light spot in the image coordinate system based on the plurality of images; and determining the first relative positional relationship between the calibration body and the light spot based on the first coordinates and the second coordinates.
In this possible implementation, the image serves as a reference coordinate system that associates the relative positions of the calibration body and the light spots outside the electronic device, yielding the first relative positional relationship.
Optionally, in a possible implementation of the first aspect, acquiring the second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data comprises: determining an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data; and determining the second relative positional relationship between the calibration body and the IMU device based on the error.
In this possible implementation, the second relative positional relationship is obtained from the error between the first motion trajectory of the calibration body and the second motion trajectory of the IMU device, so that the second relative positional relationship between the calibration body and the IMU device can be determined unambiguously.
Optionally, in a possible implementation of the first aspect, the method further comprises: determining internal parameters of the IMU device based on the error and the IMU data.
In this possible implementation, the internal parameters of the IMU device may also be determined from the error, i.e., the trajectory error. For example, the IMU internal parameters include one or more of the following: zero-offset error, scale error, and inter-axis angle error of the angular velocity and linear acceleration sensors.
A second aspect of the embodiments of the present application provides an image processing apparatus, comprising: an acquiring unit configured to acquire a plurality of images, wherein each image comprises a calibration body and an electronic device with light spots, the calibration body is rigidly connected to the electronic device, an inertial measurement unit (IMU) device is disposed inside the electronic device, and the positions of the electronic device differ in at least two of the images; and a determining unit configured to determine a first relative positional relationship between the calibration body and the light spot based on the plurality of images. The acquiring unit is further configured to acquire IMU data collected by the IMU device, the IMU data being related to the plurality of images, and to acquire a second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data. The determining unit is further configured to determine a third relative positional relationship between the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship, the third relative positional relationship being used to position the electronic device.
Optionally, in a possible implementation of the second aspect, the determining unit is specifically configured to: acquire first coordinates of the calibration body in an image coordinate system based on the plurality of images; acquire second coordinates of the light spot in the image coordinate system based on the plurality of images; and determine the first relative positional relationship between the calibration body and the light spot based on the first and second coordinates.
Optionally, in a possible implementation of the second aspect, the acquiring unit is specifically configured to: determine an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data; and determine the second relative positional relationship between the calibration body and the IMU device based on the error.
Optionally, in a possible implementation manner of the second aspect, the determining unit is further configured to determine an internal parameter of the IMU device based on the error and the IMU data.
A third aspect of the embodiments of the present application provides a calibration system, which may be applied to electronic device positioning scenarios, for example virtual reality (VR)/augmented reality (AR) gaming, learning, or competition scenarios. The calibration system comprises: a camera, an electronic device with light spots, a calibration body, and an image processing device; the electronic device is rigidly connected to the calibration body, and an inertial measurement unit (IMU) device is disposed inside the electronic device. The camera is configured to collect a plurality of images of the calibration body and the electronic device during motion. The calibration body is used to determine a third relative positional relationship between the light spot and the IMU device, the third relative positional relationship being used to position the electronic device. The image processing device is configured to acquire the plurality of images and the IMU data collected by the IMU device during the motion, and to determine the third relative positional relationship based on the plurality of images and the IMU data.
Optionally, in a possible implementation of the third aspect, the image processing device is specifically configured to: determine a first relative positional relationship between the calibration body and the light spot based on the plurality of images; acquire a second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data; and determine the third relative positional relationship based on the first relative positional relationship and the second relative positional relationship.
Optionally, in a possible implementation of the third aspect, the image processing device is specifically configured to: determine an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data; and determine the second relative positional relationship between the calibration body and the IMU device based on the error.
Optionally, in a possible implementation of the third aspect, the image processing device is further configured to determine internal parameters of the IMU device based on the error and the IMU data.
A fourth aspect of the present application provides an image processing apparatus, comprising: a processor coupled to a memory for storing a program or instructions which, when executed by the processor, cause the image processing apparatus to implement the method of the first aspect or any possible implementation of the first aspect.
A fifth aspect of the present application provides a computer readable medium having stored thereon a computer program or instructions which, when run on a computer, cause the computer to perform the method of the first aspect or any possible implementation of the first aspect.
A sixth aspect of the present application provides a computer program product which, when executed on a computer, causes the computer to perform the method of the first aspect or any of the possible implementations of the first aspect.
For the technical effects of the second, third, fourth, fifth, and sixth aspects, or of any of their possible implementations, reference may be made to the technical effects of the first aspect or of its corresponding possible implementations, which are not repeated here.
From the above technical solutions, the present application has the following advantages: by constructing a calibration body rigidly connected to the electronic device (e.g., a handle, an aircraft, etc.) and using the calibration body as an intermediate medium or reference, the third relative positional relationship between the light spot and the IMU device is determined. According to the third relative positional relationship, the measurement data output by the IMU can be accurately expressed in the coordinate system of the handle light spots, improving the positioning accuracy of the electronic device.
Drawings
FIG. 1 is a schematic structural diagram of a calibration system according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 4 is another schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The present application provides an image processing method for achieving extrinsic calibration between an electronic device and the IMU coordinate system, improving the positioning accuracy of the electronic device.
For ease of understanding, related terms and concepts primarily related to embodiments of the present application are described below.
1. External parameters and internal parameters of IMU and electronic equipment
The external parameters of the IMU and the electronic device can be understood as the relative positional relationship between the IMU coordinate system and the coordinate system of the electronic device.
The IMU internal parameters include one or more of the following: zero-offset error, scale error, and inter-axis angle error of the angular velocity and linear acceleration sensors.
2. Calibration body
The calibration body in the embodiment of the present application may also be referred to as a three-dimensional calibration body, and the calibration body may be a hexahedron, a sphere, a cube, or the like, which is not limited herein.
The calibration plate serves as a fixed reference coordinate system. Because the size and pattern of the calibration plate are known, the correspondence between 3D coordinates and 2D coordinates can be obtained from them; an object in an image containing the calibration plate can then be converted from 2D image coordinates to 3D coordinates via the calibration plate, as the sketch below illustrates.
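By way of illustration only (this sketch is not part of the original disclosure), the following Python snippet shows how the known size and pattern of a calibration plate yield the 2D-3D correspondence described above. The board dimensions and camera intrinsic matrix are assumed values, and the OpenCV chessboard detector stands in for whatever plate pattern is actually used.

    import cv2
    import numpy as np

    # Assumptions for illustration: a 9x6 chessboard with 25 mm squares and a
    # pinhole camera with intrinsic matrix K and no lens distortion.
    PATTERN = (9, 6)
    SQUARE_MM = 25.0
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    DIST = np.zeros(5)

    # Known 3D coordinates of the board corners in the board's own frame.
    obj_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    obj_pts[:, :2] = (np.mgrid[0:PATTERN[0], 0:PATTERN[1]]
                      .T.reshape(-1, 2) * SQUARE_MM)

    def board_pose(image_bgr):
        """Return the board-to-camera pose (rvec, tvec) from one image."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if not found:
            return None
        # Detected 2D corners + known 3D board geometry -> 6-DoF pose (PnP).
        ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, DIST)
        return (rvec, tvec) if ok else None

Because the plate geometry is known, any object detected in the same image can then be expressed in 3D via this pose.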
Illustratively, taking the calibration body as a hexahedron, each face of the calibration body carries a calibration plate (chart) pattern. The patterns on the six faces may be the same or different. Of course, to facilitate the subsequent determination of the relative positional relationships and to tell the six faces apart quickly, the patterns on the six faces may differ from one another. The pattern or type of the calibration plate may be set according to actual needs and is not limited herein.
3. Outside-in tracking technique
In the outside-in tracking technique, the position-capturing end sits at some distance from the tracked object and performs positioning by capturing specific positional parameters of the positioned object (signals may be emitted externally and received by the positioned object, or emitted by the positioned object and captured externally). It is the most traditional positioning technique and offers high positioning accuracy, low latency, and low cost. However, an external signal transmitting or receiving device must be installed before use, which is less convenient, and the positioning range is fixed.
4. Electronic device with light spot
The electronic device here is the positioned object of the outside-in tracking technique described above; the light spots on the electronic device may be light-emitting points, reflective points, or the like.
Currently, electronic devices with light spots are usually positioned with the outside-in tracking technique: a camera on the helmet captures the specially designed light spots on the electronic device, and positioning is performed in combination with the IMU of the electronic device. However, because the electronic device carries few light spots (typically only tens) and the machining accuracy is limited, the origin of the device's light-spot coordinate system does not coincide with the origin of the IMU coordinate system; that is, there is a positional deviation between the IMU coordinate system and the coordinate system of the electronic device. As a result, the IMU data cannot directly reflect the motion of the electronic device, which degrades its positioning accuracy.
Therefore, how to accurately represent the measurement data output by the IMU in the coordinate system of the light spots of the electronic device is a technical problem to be solved.
To solve the above technical problem, an embodiment of the present application provides an image processing method: a calibration body rigidly connected to the electronic device (e.g., a handle) is constructed and used as an intermediate medium or reference to determine the third relative positional relationship between the light spot and the IMU device. According to the third relative positional relationship, the measurement data output by the IMU can be accurately expressed in the coordinate system of the handle light spots, improving the positioning accuracy of the electronic device.
Before describing the method provided by the embodiments of the present application, an application scenario to which it applies is described. The application scenario may be as shown in fig. 1 and includes: a camera 101, a calibration body 102, an electronic device 103 with light spots, and an image processing device 104.
The calibration body 102 is fixedly connected (which may also be referred to as rigidly connected) to the electronic device 103, and the electronic device contains an IMU device inside.
The camera 101 may be a standalone camera device, or a camera component located in another device (e.g., a VR/AR helmet or VR/AR glasses), which is not specifically limited here. The camera 101 is mainly used to capture a plurality of images of the calibration body 102 and the electronic device 103 during motion.
In addition, to ensure the accuracy of subsequent electronic device positioning, the position of the camera 101 is kept fixed while the plurality of images are captured.
The calibration body 102 is mainly used to determine a third relative positional relationship between the electronic device 103 and the IMU coordinate system, where the IMU coordinate system takes the IMU device as its coordinate origin; the third relative positional relationship is used to position the electronic device 103. For the calibration body 102, reference may also be made to the description in the related terms, which is not repeated here.
The electronic device 103 contains an IMU device inside and bears light spots (e.g., light-emitting or reflective points) outside. In some scenarios the electronic device 103 may also be referred to as the device to be positioned. During movement of the electronic device 103, the calibration body 102 rigidly connected to it moves as well.
The image processing device 104 is used for acquiring a plurality of images and IMU data acquired by the IMU device in the movement process of the electronic device; and determining a third relative positional relationship of the light spot and the IMU device based on the plurality of images and the IMU data.
Optionally, the calibration body 102 and the electronic device 103 may be rigidly connected in a plurality of ways. For example, they may be connected by a hard metal piece. As another example, the calibration body 102 may be regarded as a platform on which a mounting bracket holds the electronic device. It will be appreciated that the rigid connection ensures that the relative positions of the calibration body 102 and the electronic device 103 do not change during movement, so that the third relative positional relationship between the external light spots of the electronic device 103 and the internal IMU device can be determined via the calibration body 102.
Optionally, the image processing device is specifically configured to: determine a first relative positional relationship between the calibration body and the light spot based on the plurality of images; acquire a second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data; and determine the third relative positional relationship based on the first relative positional relationship and the second relative positional relationship.
Optionally, the image processing device is specifically configured to: determine an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data; and determine the second relative positional relationship between the calibration body and the IMU device based on the error.
Optionally, the image processing apparatus 104 may also obtain internal parameters of the IMU device based on the plurality of images and the IMU data.
In the scene shown in fig. 1, the image processing device 104 is connected to the camera 101 and to the electronic device 103, respectively, by wired or wireless means.
Typically, the image processing device 104 is connected to the camera 101 in a wired manner, such as optical fiber, universal serial bus (USB), or high-definition multimedia interface (HDMI), and to the electronic device 103 in a wireless manner, such as Bluetooth, a cellular communication network, or wireless fidelity (Wi-Fi).
The electronic device in the embodiment of the application is a device internally provided with an IMU device and externally provided with a light spot. The electronic device may be a handle, an aircraft, a cell phone, a tablet (pad), a wearable electronic device, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, or the like. The specific examples are not limited herein.
The image processing device in the embodiment of the present application may be a server, a mobile phone, a tablet computer (pad), a portable game machine, a palm computer (personal digital assistant, PDA), a notebook computer, an ultra mobile personal computer (ultra mobile personal computer, UMPC), a handheld computer, a netbook, a vehicle-mounted media playing device, a wearable electronic device, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, or the like with sufficient computing power. The specific examples are not limited herein.
The image processing method provided in the embodiments of the present application is described in detail below. The method may be performed by an image processing device, or by a component of an image processing device such as a processor, a chip, or a system-on-chip. Referring to fig. 2, a flowchart of the image processing method according to an embodiment of the present application may include steps 201 to 205, which are described in detail below. The method can be applied to the application scenario shown in fig. 1, and also to scenarios such as high-precision map downloading and global positioning for terminals such as unmanned vehicles, unmanned aerial vehicles, autonomous robots, AR glasses, and VR helmets.
In step 201, a plurality of images are acquired.
The image processing device may acquire the plurality of images by controlling a camera to photograph the electronic device and the calibration body while they are in motion. The camera may be internal to the image processing device, or a camera device external to it (e.g., on an AR/VR helmet or glasses).
Each of the plurality of images includes the calibration body and the electronic device with light spots; the calibration body is rigidly connected to the electronic device; an IMU device is disposed inside the electronic device; and the positions of the electronic device differ in at least two of the images. The plurality of images can also be understood as images captured by the camera while the electronic device and the calibration body move.
Step 202, determining a first relative positional relationship between the calibration volume and the light spot based on the plurality of images.
After acquiring the plurality of images, the image processing device determines the first relative positional relationship between the calibration body and the light spot based on them.
Specifically, the image processing device may acquire first coordinates of the calibration body in the image coordinate system based on the plurality of images, acquire second coordinates of the light spot in the image coordinate system based on the plurality of images, and determine the first relative positional relationship based on the first and second coordinates.
The above process can also be understood as mapping the calibration body and the light spots of the electronic device into the same image coordinate system to obtain the first and second coordinates, and then determining the first relative positional relationship from the two sets of coordinates in that common coordinate system.
Alternatively, each image may correspond to one coordinate, so that the plurality of images corresponds to a set of coordinates; the first coordinates then comprise a plurality of coordinates, which may also be referred to as a first coordinate array, and the second coordinates likewise comprise a plurality of coordinates, which may be referred to as a second coordinate array.
For example, the first and second coordinates may be solved using a perspective-three-point (P3P) algorithm or the like, and the first relative positional relationship between them then determined, as in the sketch below.
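As a minimal sketch of this step (assumed names and data, not the patented implementation), the pose of the calibration body and of the light-spot constellation can each be solved with OpenCV's P3P variant and then combined into the first relative positional relationship; note that cv2.SOLVEPNP_P3P requires exactly four 2D-3D point pairs.

    import cv2
    import numpy as np

    def pose_p3p(obj_pts, img_pts, K, dist=None):
        """Solve an object-to-camera pose with OpenCV's P3P solver.

        obj_pts: (4, 3) known 3D points (plate corners, or nominal light-spot
                 positions from the device model); img_pts: (4, 2) pixels.
        """
        dist = np.zeros(5) if dist is None else dist
        ok, rvec, tvec = cv2.solvePnP(
            np.asarray(obj_pts, np.float32),
            np.asarray(img_pts, np.float32),
            K, dist, flags=cv2.SOLVEPNP_P3P)
        if not ok:
            raise RuntimeError("P3P failed")
        R, _ = cv2.Rodrigues(rvec)
        T = np.eye(4)          # 4x4 homogeneous pose, object -> camera
        T[:3, :3], T[:3, 3] = R, tvec.ravel()
        return T

    # Per image: with T_cb (calibration body -> camera) and T_sp
    # (light spots -> camera), one estimate of the first relative
    # positional relationship is np.linalg.inv(T_cb) @ T_sp.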
And 203, acquiring IMU data acquired by the IMU device.
The image processing device may also obtain IMU data from the IMU device inside the electronic device; the IMU data are related to the plurality of images. Specifically, the IMU device collects IMU data while the electronic device moves and transmits them to the image processing device.
The IMU data and the plurality of images are acquired during the movement of the electronic device.
Of course, to improve the accuracy of the subsequent determination of the third relative positional relationship, the plurality of images may be aligned with the timestamps of the IMU data. By way of example, the image processing device and the electronic device are connected via Bluetooth: the image processing device controls the camera to capture the plurality of images while acquiring IMU data through the Bluetooth module, synchronizes the Bluetooth timestamp to the camera so as to calibrate the camera time system against the IMU time system, and then aligns the plurality of images and the IMU data by timestamp.
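A minimal sketch of the timestamp alignment, assuming the Bluetooth synchronization above has already produced a known camera-to-IMU clock offset (all names are illustrative):

    import numpy as np

    def align_imu_to_frames(frame_ts, imu_ts, imu_samples, offset_s=0.0):
        """Resample IMU channels onto camera frame timestamps.

        frame_ts: (F,) image timestamps in the camera clock (seconds).
        imu_ts: (N,) IMU timestamps in the IMU clock (seconds, ascending).
        imu_samples: (N, C) IMU channels, e.g. gyro xyz + accel xyz.
        offset_s: assumed camera-to-IMU clock offset from the Bluetooth sync.
        """
        imu_in_cam_clock = imu_ts + offset_s
        # Linear interpolation of each channel at the frame times.
        return np.column_stack([
            np.interp(frame_ts, imu_in_cam_clock, imu_samples[:, c])
            for c in range(imu_samples.shape[1])
        ])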
The relative positional relationships in the embodiments of the present application (e.g., the first, second, and third relative positional relationships) may be expressed as a rotation-translation (RT) matrix, as a coordinate array, or in other forms, which are not limited herein.
The IMU data in the embodiments of the present application depend on the structure of the IMU device, which is not specifically limited herein. For example, for a three-axis or six-axis IMU, the IMU data may include three-axis attitude angles and accelerations; for a nine-axis IMU, the IMU data may include three-axis magnetometer readings in addition to the three-axis attitude angles and accelerations.
Step 204, obtaining a second relative positional relationship of the calibration body and the IMU device based on the plurality of images and the IMU data.
After the image processing apparatus acquires the plurality of images and the IMU data, a second relative positional relationship of the calibration body and the IMU device may be acquired based on the plurality of images and the IMU data.
Specifically, the image processing device may determine an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device from the plurality of images and the IMU data, and determine the second relative positional relationship between the calibration body and the IMU device based on that error.
The above process can be understood as follows. The first motion trajectory of the calibration body is determined from the plurality of images generated during the motion; the second motion trajectory is solved by integrating the IMU data corresponding to the plurality of images (using default IMU internal and external parameters). A loss function is then established to represent the difference between the first and second motion trajectories, and the IMU external parameters are trained with the goal of driving the loss value below a certain threshold, yielding corrected IMU external parameters. In other words, by continuously adjusting the IMU parameters, the difference between the two trajectories becomes smaller and smaller; when the difference falls below the threshold, or the two trajectories coincide, the IMU external parameters at that point are determined to be the second relative positional relationship.
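The following is a simplified sketch of such an optimization (assumed names; gravity compensation and lever-arm effects are glossed over for brevity). The residual is the pointwise difference between the camera-derived trajectory of the calibration body and the IMU-derived trajectory mapped through the candidate extrinsic; a standard least-squares solver plays the role of the loss-minimization loop described above.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def residuals(params, traj_body, traj_imu):
        """params: 6-vector (rotation vector, translation) taking points in
        the IMU frame into the calibration-body frame."""
        R = Rotation.from_rotvec(params[:3]).as_matrix()
        t = params[3:]
        pred = traj_imu @ R.T + t   # IMU trajectory expressed in body frame
        return (pred - traj_body).ravel()

    def calibrate_extrinsic(traj_body, traj_imu):
        """traj_body: (N, 3) positions from the images; traj_imu: (N, 3)
        positions integrated from IMU data at the same timestamps."""
        sol = least_squares(residuals, np.zeros(6),
                            args=(traj_body, traj_imu))
        # Refined extrinsic ~ the second relative positional relationship.
        return sol.x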
Step 205, determining a third relative positional relationship of the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship.
After the image processing apparatus obtains the first relative positional relationship and the second relative positional relationship, a third relative positional relationship between the light spot and the IMU device may be determined based on the first relative positional relationship and the second relative positional relationship.
This step can also be understood as the image processing device using the calibration body as an intermediate medium or reference to determine the third relative positional relationship between the light spot and the IMU device.
Specifically, when the relative positional relationships are expressed as matrices, the third relative positional relationship is obtained by multiplying the matrices of the first and second relative positional relationships.
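A minimal sketch of this composition, with illustrative frame names (T_a_b denotes the transform taking coordinates from frame a to frame b):

    import numpy as np

    def rt_matrix(R, t):
        """Build a 4x4 homogeneous rotation-translation (RT) matrix."""
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        return T

    # T_spot_body: first relative positional relationship (spots -> body)
    # T_body_imu:  second relative positional relationship (body -> IMU)
    def third_relationship(T_spot_body, T_body_imu):
        # Chaining through the calibration body gives spots -> IMU.
        return T_body_imu @ T_spot_body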
In addition, the timing between the steps in the embodiments of the present application is not limited. For example, step 203 may follow step 202 or may precede step 201, and is not specifically limited herein.
In this embodiment, the third relative positional relationship between the light spot and the IMU device is determined by constructing a calibration body rigidly connected to the electronic device (e.g., a handle) and using it as an intermediate medium or reference. On the one hand, the measurement data output by the IMU can be accurately expressed in the coordinate system of the handle light spots according to the third relative positional relationship, improving the positioning accuracy of the electronic device. On the other hand, the positional deviation between the IMU coordinate system and the electronic device can be determined from the third relative positional relationship and corrected, so that the IMU data directly reflect the motion of the electronic device.
Optionally, the IMU internal parameters may also be corrected using the error and the IMU data in step 204. The IMU internal and external parameters are trained with the goal of driving the loss function below a certain threshold, yielding corrected internal and external parameters. In other words, by continuously adjusting the IMU internal and external parameters, the difference between the first and second motion trajectories becomes smaller and smaller; when the difference falls below the threshold, or the two trajectories coincide, the IMU external parameters at that point are determined to be the second relative positional relationship, and the IMU internal parameters at that point are determined to be the adjusted internal parameters (a simplified sketch of this joint refinement follows). For the IMU internal parameters, reference may be made to the description in the related terms, which is not repeated here.
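As an illustrative extension of the earlier extrinsic sketch (again with assumed names and a deliberately simplified motion model), the parameter vector can simply be widened to include internal parameters such as accelerometer bias and scale, so the same least-squares loop refines intrinsics and extrinsics together:

    import numpy as np
    from scipy.optimize import least_squares

    def integrate_positions(accel, dt, bias, scale):
        """Doubly integrate gravity-compensated accelerations (sketch).
        accel: (N, 3) raw samples; bias, scale: (3,) internal parameters."""
        a = (accel - bias) * scale
        v = np.cumsum(a * dt, axis=0)
        return np.cumsum(v * dt, axis=0)

    def residuals(params, accel, dt, traj_body):
        bias, scale, t = params[:3], params[3:6], params[6:9]
        pred = integrate_positions(accel, dt, bias, scale) + t
        return (pred - traj_body).ravel()

    # Initial guess: zero bias, unit scale, zero offset.
    x0 = np.concatenate([np.zeros(3), np.ones(3), np.zeros(3)])
    # sol = least_squares(residuals, x0, args=(accel, dt, traj_body))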
In this manner, on the one hand, high-precision calibration of the IMU internal parameters is achieved through the three-dimensional calibration body. Because the electronic device has few light spots, poses computed from them jitter considerably, and an accurate pose of the electronic device is difficult to obtain; this would introduce excessive noise into the IMU data optimization and make the internal-parameter calibration inaccurate. The embodiments of the present application therefore introduce the three-dimensional calibration body as a stable positioning device, so that the motion state of the IMU device can be judged accurately and a good internal-parameter optimization effect obtained. On the other hand, with the three-dimensional calibration body, the pose of a calibration plate on the body can be obtained with a single camera regardless of the attitude of the IMU.
In addition, to improve calibration efficiency, the relative positional relationships between the faces of the calibration body can be determined offline or online, so that the pose of the calibration body can still be computed when only one calibration plate is captured in an image.
By way of example, taking the calibration body as a hexahedron, the six faces carry different calibration patterns so that the plates can be distinguished easily. The determination mainly proceeds as follows: the camera captures a plurality of images of the calibration body, and one of the calibration plates is selected as the origin of the coordinate system; the plate selected as the origin is called the first calibration plate, and the unselected plates are called the other calibration plates. For each captured image, the six calibration plates are traversed, their positions in the image are computed, and the relative positions between the plates are calculated. After processing the plurality of images, the positional relationships of all calibration plates relative to the coordinate origin are obtained. Thus, the pose of the calibration body can be computed even when only one calibration plate is captured in an image, as in the sketch below.
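A minimal sketch of this offline chaining step (assumed data layout: one dict per image mapping a plate id to its plate-to-camera pose, e.g. from the PnP sketch earlier):

    import numpy as np

    def invert(T):
        """Invert a 4x4 homogeneous transform."""
        R, t = T[:3, :3], T[:3, 3]
        Ti = np.eye(4)
        Ti[:3, :3], Ti[:3, 3] = R.T, -R.T @ t
        return Ti

    def chain_plate_extrinsics(detections, origin_id):
        """Accumulate each plate's pose relative to the origin plate.

        detections: list of {plate_id: 4x4 plate-to-camera pose}, one per image.
        Returns {plate_id: 4x4 plate-to-origin transform}.
        """
        T_origin = {origin_id: np.eye(4)}
        changed = True
        while changed:                      # propagate via co-visible plates
            changed = False
            for det in detections:
                known = [p for p in det if p in T_origin]
                if not known:
                    continue
                ref = known[0]              # camera -> origin via a known plate
                T_cam_origin = T_origin[ref] @ invert(det[ref])
                for pid, T_plate_cam in det.items():
                    if pid not in T_origin:
                        T_origin[pid] = T_cam_origin @ T_plate_cam
                        changed = True
        return T_origin

Once this table is built, seeing any single plate in an image is enough to recover the pose of the whole calibration body.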
Having described the image processing method of the embodiments of the present application, the image processing apparatus of the embodiments of the present application is described below. Referring to fig. 3, one embodiment of the image processing apparatus includes:
an acquiring unit 301, configured to acquire a plurality of images, where each image includes a calibration body and an electronic device with a light spot, the calibration body is rigidly connected to the electronic device, an inertial measurement unit IMU device is included in the electronic device, and positions of the electronic devices in at least two images are different;
a determining unit 302, configured to determine a first relative positional relationship between the calibration body and the light spot based on the plurality of images;
the acquiring unit 301 is further configured to acquire IMU data acquired by the IMU device, where the IMU data is related to a plurality of images;
the acquiring unit 301 is further configured to acquire a second relative positional relationship between the calibration body and the IMU device based on the plurality of images and the IMU data;
the determining unit 302 is further configured to determine a third relative positional relationship between the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship, where the third relative positional relationship is used to position the electronic device.
Optionally, the determining unit 302 is specifically configured to acquire, based on the plurality of images, a first coordinate of the calibration body on the image coordinate system; a determining unit 302, specifically configured to obtain second coordinates of the light spot on the image coordinate system based on the plurality of images; the determining unit 302 is specifically configured to determine a first relative positional relationship between the calibration body and the light spot based on the first coordinate and the second coordinate.
Optionally, the acquiring unit 301 is specifically configured to determine an error between the first motion trajectory of the calibration body and the second motion trajectory of the IMU device based on the plurality of images and the IMU data; the obtaining unit 301 is specifically configured to determine a second relative positional relationship between the calibration body and the IMU device based on the error.
Optionally, the determining unit 302 is further configured to determine an intrinsic parameter of the IMU device based on the error and the IMU data.
In this embodiment, operations performed by each unit in the image processing apparatus are similar to those described in the embodiments shown in fig. 1 to 2, and are not described here again.
In this embodiment, the determining unit 302 determines the third relative positional relationship between the light spot and the IMU device by using a calibration body rigidly connected to the electronic device (e.g., a handle, an aircraft, etc.) as an intermediate medium or reference. According to the third relative positional relationship, the measurement data output by the IMU can be accurately expressed in the coordinate system of the handle light spots, improving the positioning accuracy of the electronic device.
Referring to fig. 4, a schematic structural diagram of another image processing device provided in the present application is shown. The image processing device may include a processor 401, a memory 402, and a communication port 403, interconnected by wires, with program instructions and data stored in the memory 402.
The memory 402 stores program instructions and data corresponding to the steps executed by the image processing apparatus in the embodiment corresponding to fig. 1 and 2.
A processor 401 for executing steps executed by the image processing apparatus as shown in any of the embodiments shown in fig. 1 and 2.
The communication port 403 may be used to receive and transmit data, and is used to perform the steps related to acquiring, transmitting, and receiving in any of the embodiments shown in fig. 1 and 2.
In one implementation, the image processing device may include more or fewer components than in fig. 4, which is only exemplary and not limiting.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as standalone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Claims (15)

1. An image processing method, the method comprising:
acquiring a plurality of images, wherein each image comprises a calibration body and electronic equipment with light spots, the calibration body is rigidly connected with the electronic equipment, an Inertial Measurement Unit (IMU) device is arranged in the electronic equipment, and the positions of the electronic equipment in at least two images in the plurality of images are different;
determining a first relative positional relationship of the calibration volume and the light spot based on the plurality of images;
acquiring IMU data acquired by the IMU device, wherein the IMU data is related to the plurality of images;
acquiring a second relative positional relationship of the calibration body and the IMU device based on the plurality of images and the IMU data;
and determining a third relative positional relationship between the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship, wherein the third relative positional relationship is used for positioning the electronic equipment.
2. The method of claim 1, wherein the determining a first relative positional relationship of the calibration volume and the spot based on the plurality of images comprises:
acquiring a first coordinate of the calibration body on an image coordinate system based on the plurality of images;
acquiring second coordinates of the light spot on the image coordinate system based on the plurality of images;
and determining a first relative position relation between the calibration body and the light spot based on the first coordinate and the second coordinate.
3. The method of claim 1 or 2, wherein the acquiring a second relative positional relationship of the calibration volume and the IMU device based on the plurality of images and the IMU data comprises:
determining an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data;
and determining a second relative positional relationship of the calibration body and the IMU device based on the error.
4. A method according to claim 3, characterized in that the method further comprises:
and determining an internal parameter of the IMU device based on the error and the IMU data.
5. An image processing apparatus, characterized in that the image processing apparatus comprises:
the device comprises an acquisition unit, a detection unit and a display unit, wherein the acquisition unit is used for acquiring a plurality of images, each image comprises a calibration body and electronic equipment with light spots, the calibration body is rigidly connected with the electronic equipment, an Inertial Measurement Unit (IMU) device is arranged in the electronic equipment, and the positions of the electronic equipment in at least two images are different;
a determining unit configured to determine a first relative positional relationship between the calibration body and the light spot based on the plurality of images;
the acquisition unit is further used for acquiring IMU data acquired by the IMU device, and the IMU data are related to the plurality of images;
the acquisition unit is further used for acquiring a second relative position relationship between the calibration body and the IMU device based on the plurality of images and the IMU data;
the determining unit is further configured to determine a third relative positional relationship between the light spot and the IMU device based on the first relative positional relationship and the second relative positional relationship, where the third relative positional relationship is used to position the electronic device.
6. The image processing device according to claim 5, characterized in that the determination unit is in particular adapted to obtain a first coordinate of the calibration volume on an image coordinate system based on the plurality of images;
the determining unit is specifically configured to acquire second coordinates of the light spot on the image coordinate system based on the plurality of images;
the determining unit is specifically configured to determine a first relative positional relationship between the calibration body and the light spot based on the first coordinate and the second coordinate.
7. The image processing apparatus according to claim 5 or 6, wherein the acquisition unit is specifically configured to determine an error between a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data;
the acquisition unit is specifically configured to determine a second relative positional relationship between the calibration body and the IMU device based on the error.
8. The image processing apparatus of claim 7, wherein the determining unit is further configured to determine an internal reference of the IMU device based on the error and the IMU data.
9. A calibration system, the calibration system comprising: camera, electronic device with light spot, calibration body and image processing device; the electronic equipment is rigidly connected with the calibration body, and an Inertial Measurement Unit (IMU) device is arranged in the electronic equipment;
the camera is used for collecting a plurality of images of the calibration body and the electronic equipment in the motion process;
the calibration body is used for determining a third relative position relation between the light spot and the IMU device, and the third relative position relation is used for positioning the electronic equipment;
the image processing equipment is used for acquiring the plurality of images and IMU data acquired by the IMU device in the moving process; and determining the third relative positional relationship based on the plurality of images and the IMU data.
10. The calibration system according to claim 9, wherein the image processing device is specifically configured to determine a first relative positional relationship of the calibration volume and the light spot based on the plurality of images;
the image processing equipment is specifically used for acquiring a second relative position relation between the calibration body and the IMU device based on the plurality of images and the IMU data;
the image processing device is specifically configured to determine the third relative positional relationship based on the first relative positional relationship and the second relative positional relationship.
11. Calibration system according to claim 9 or 10, wherein the image processing device is in particular configured to determine an error of a first motion trajectory of the calibration body and a second motion trajectory of the IMU device based on the plurality of images and the IMU data;
the image processing device is specifically configured to determine a second relative positional relationship between the calibration body and the IMU device based on the error.
12. The calibration system of claim 11, wherein the image processing apparatus is further configured to determine an internal reference of the IMU device based on the error and the IMU data.
13. An image processing apparatus, characterized by comprising: a processor coupled to a memory for storing a program or instructions that, when executed by the processor, cause the image processing apparatus to perform the method of any of claims 1 to 4.
14. A computer storage medium comprising computer instructions which, when run on an image processing apparatus, cause the image processing apparatus to perform the method of any of claims 1 to 4.
15. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the method according to any of claims 1 to 4.
CN202211005411.0A 2022-08-22 2022-08-22 Image processing method, calibration system and related equipment Pending CN117670994A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211005411.0A CN117670994A (en) 2022-08-22 2022-08-22 Image processing method, calibration system and related equipment
PCT/CN2023/104689 WO2024041202A1 (en) 2022-08-22 2023-06-30 Image processing method, calibration system, and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211005411.0A CN117670994A (en) 2022-08-22 2022-08-22 Image processing method, calibration system and related equipment

Publications (1)

Publication Number Publication Date
CN117670994A

Family

ID=90012441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211005411.0A Pending CN117670994A (en) 2022-08-22 2022-08-22 Image processing method, calibration system and related equipment

Country Status (2)

Country Link
CN (1) CN117670994A (en)
WO (1) WO2024041202A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107153369B (en) * 2017-05-23 2019-08-20 北京小鸟看看科技有限公司 It is a kind of to position the localization method of object, system and wear display equipment
US10621751B2 (en) * 2017-06-16 2020-04-14 Seiko Epson Corporation Information processing device and computer program
CN113256728B (en) * 2020-02-13 2024-04-12 纳恩博(北京)科技有限公司 IMU equipment parameter calibration method and device, storage medium and electronic device
CN114549285A (en) * 2022-01-21 2022-05-27 广东虚拟现实科技有限公司 Controller positioning method and device, head-mounted display equipment and storage medium

Also Published As

Publication number Publication date
WO2024041202A1 (en) 2024-02-29

Similar Documents

Publication Publication Date Title
CN106774844B (en) Method and equipment for virtual positioning
CN110411476B (en) Calibration adaptation and evaluation method and system for visual inertial odometer
CN111156998A (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN102980556A (en) Distance measuring method and device
US10380761B2 (en) Locating method, locator, and locating system for head-mounted display
US12008173B2 (en) Multi-sensor handle controller hybrid tracking method and device
US10817047B2 (en) Tracking system and tacking method using the same
CN110530356B (en) Pose information processing method, device, equipment and storage medium
CN105182319A (en) Target positioning system and target positioning method based on radio frequency and binocular vision
CN109003305A (en) A kind of positioning and orientation method and device
EP3557378B1 (en) Tracking system for tracking and rendering virtual object corresponding to physical object and the operating method for the same
CN112485785A (en) Target detection method, device and equipment
CN110262667B (en) Virtual reality equipment and positioning method
Maciejewski et al. Design and evaluation of a steamvr tracker for training applications–simulations and measurements
CN105184268A (en) Gesture recognition device, gesture recognition method, and virtual reality system
CN107229055B (en) Mobile equipment positioning method and mobile equipment positioning device
CN108489338B (en) Infrared seeker line-of-sight angular rate precision testing method and system
JP6858387B1 (en) Orbit calculation device, orbit calculation method, orbit calculation program
JP6924455B1 (en) Trajectory calculation device, trajectory calculation method, trajectory calculation program
CN117670994A (en) Image processing method, calibration system and related equipment
US20190228583A1 (en) Systems and methods for tracking object location and orientation in virtual reality environments using ultra-wideband signals, inertia measurement units, and reflective markers
CN107257547B (en) Equipment positioning method and device
CN110285811A (en) The fusion and positioning method and device of satellite positioning and inertial navigation
CN115813556A (en) Surgical robot calibration method and device, surgical robot and storage medium
WO2018196221A1 (en) Interaction method, device and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication