CN111432113B - Data calibration method, device and storage medium - Google Patents
Data calibration method, device and storage medium
- Publication number: CN111432113B
- Application number: CN201911398712.2A
- Authority: CN (China)
- Prior art keywords: structured light, line laser, light module, offset, autonomous mobile
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G06T7/30 (G — Physics; G06 — Computing; G06T — Image data processing or generation; G06T7/00 — Image analysis): determination of transform parameters for the alignment of images, i.e. image registration
- G01C21/16 (G — Physics; G01 — Measuring, testing; G01C — Measuring distances, levels or bearings; surveying; navigation; G01C21/00 — Navigation): dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- H04N23/57 (H — Electricity; H04 — Electric communication technique; H04N — Pictorial communication, e.g. television; H04N23/00 — Cameras or camera modules comprising electronic image sensors): mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
- H04N23/682 (H04N23/60 — Control of cameras or camera modules; H04N23/68 — Control for stable pick-up of the scene, e.g. compensating for camera body vibrations): vibration or motion blur correction
Abstract
Embodiments of the present application provide a data calibration method, a data calibration device, and a storage medium. In these embodiments, a camera module is combined with line laser transmitters, exploiting the high detection precision of line laser to provide a structured light module with high measurement precision. By further integrating the IMU, the offset of the structured light module relative to an autonomous mobile device can be measured when the module is applied to such a device, and the data collected by the camera module can then be calibrated according to that offset. This mitigates the degradation of measurement data caused by relative motion or shaking, and greatly improves the usable precision of the sensor data on top of the already high precision of the structured light module's measurements.
Description
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a data calibration method, device, and storage medium.
Background
With the development of artificial intelligence technology, robots are becoming increasingly intelligent. A robot can navigate autonomously to a target area and complete the corresponding tasks, which has made robots more and more popular.
At present, robots are generally equipped with sensors such as cameras and lidars, which collect information about the environment around the robot to assist it in obstacle avoidance, three-dimensional reconstruction, and similar tasks. In practical applications, however, the use of some sensor data is not ideal; for example, spatial data reconstructed in three dimensions from the sensor data may be inaccurate.
Disclosure of Invention
Aspects of the present application provide a data calibration method, device, and storage medium, which are used to calibrate sensor data and improve the accuracy with which that data can be used.
An embodiment of the present application provides a structured light module, including: a camera module, line laser transmitters distributed on both sides of the camera module, and a main control unit that controls the camera module and the line laser transmitters. The line laser transmitters emit line laser outward under the control of the main control unit, and the camera module collects, under the control of the main control unit, environment images detected by the line laser. The structured light module further includes an inertial measurement unit (IMU) that can measure the offset of the structured light module relative to an autonomous mobile device when the module is applied to such a device. The main control unit is further configured to calibrate the environment image according to the offset.
An embodiment of the present application further provides an autonomous mobile device, including a device body on which a main controller and a structured light module are arranged. The structured light module includes: a camera module, line laser transmitters distributed on both sides of the camera module, and a main control unit that controls the camera module and the line laser transmitters; an inertial measurement unit (IMU) is also arranged on the structured light module. The line laser transmitters emit line laser outward under the control of the main control unit; the camera module collects, under the control of the main control unit, environment images detected by the line laser; and the IMU measures the offset of the structured light module relative to the autonomous mobile device. The main control unit is further configured to calibrate the environment image according to the offset and provide the calibrated environment image to the main controller, which uses it for function control of the autonomous mobile device.
An embodiment of the present application further provides another autonomous mobile device, including a device body on which a main controller, an inertial measurement unit (IMU), and a structured light module are arranged. The structured light module includes a camera module and line laser transmitters distributed on both sides of the camera module, and the IMU is arranged on the structured light module. The line laser transmitters emit line laser outward under the control of the main controller; the camera module collects, under the control of the main controller, environment images detected by the line laser; and the IMU measures the offset of the structured light module relative to the autonomous mobile device. The main controller is further configured to calibrate the environment image according to the offset and to use the calibrated environment image for function control of the autonomous mobile device.
An embodiment of the present application further provides a data calibration method applicable to an autonomous mobile device provided with a structured light module, the structured light module including a camera module and line laser transmitters distributed on both sides of the camera module, with an inertial measurement unit (IMU) further installed on the structured light module. The method includes: collecting, by the IMU, the offset of the structured light module relative to the autonomous mobile device; and calibrating, according to the offset, an environment image acquired by the camera module, where the environment image is detected by line laser emitted by the line laser transmitters.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by one or more processors, causes the one or more processors to implement the steps of the data calibration method provided by the embodiments of the present application.
In the embodiments of the present application, the camera module is combined with line laser transmitters, exploiting the high detection precision of line laser to provide a structured light module with high measurement precision. By further integrating an inertial measurement unit (IMU), the offset of the structured light module relative to the autonomous mobile device can be measured when the module is applied to such a device, and the data collected by the camera module can then be calibrated according to that offset. This mitigates the degradation of measurement data caused by relative motion or shaking, and greatly improves the usable precision of the sensor data on top of the already high precision of the structured light module's measurements.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1a is a schematic structural diagram of a structured light module according to an exemplary embodiment of the present disclosure;
FIG. 1b is a schematic diagram illustrating an operating principle of a line laser transmitter according to an exemplary embodiment of the present disclosure;
fig. 1c is a schematic diagram illustrating a mounting position relationship of each device in a structured light module according to an exemplary embodiment of the present disclosure;
fig. 1d is a schematic diagram illustrating a relationship between a line laser of a line laser transmitter and a field angle of a camera module according to an exemplary embodiment of the present application;
FIG. 1e is a front view of a structured light module according to an exemplary embodiment of the present disclosure;
FIG. 1f is a bottom view of a structured light module according to an exemplary embodiment of the present disclosure;
FIG. 1g is a top view of a structured light module according to an exemplary embodiment of the present disclosure;
FIG. 1h is a rear view of a structured light module according to an exemplary embodiment of the present disclosure;
FIG. 1i is an exploded view of a structured light module according to an exemplary embodiment of the present disclosure;
FIG. 1j is a schematic diagram of a device coordinate system and a sensor coordinate system provided in an exemplary embodiment of the present application;
FIG. 2a is a schematic diagram of another structured light module according to an exemplary embodiment of the present disclosure;
fig. 2b is a schematic structural diagram of a main control unit according to an exemplary embodiment of the present disclosure;
fig. 2c is a schematic structural diagram of a laser driving circuit according to an exemplary embodiment of the present disclosure;
fig. 3a is a schematic structural diagram of an autonomous mobile device according to an exemplary embodiment of the present application;
fig. 3b is a schematic diagram of an autonomous mobile device controlling the structured light module according to an exemplary embodiment of the present application;
fig. 3c is an exploded view of a device body and a collision plate according to an exemplary embodiment of the present disclosure;
fig. 3d is an exploded view of a structured light module and a collision plate according to an exemplary embodiment of the present disclosure;
fig. 4a is a schematic structural diagram of another autonomous mobile device provided in an exemplary embodiment of the present application;
FIG. 4b is a schematic diagram of a main controller according to an exemplary embodiment of the present disclosure;
fig. 4c is a schematic diagram of an autonomous mobile device controlling the structured light module according to an exemplary embodiment of the present application;
fig. 5 is a schematic flowchart of a light source distinguishing method according to an exemplary embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to specific embodiments and the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort fall within the protection scope of the present application.
Aiming at the problem that the use of sensor data is not ideal, in the embodiments of the present application the camera module is combined with line laser transmitters, exploiting the high detection precision of line laser to provide a structured light module with high measurement precision. By further integrating the IMU, the offset of the structured light module relative to the autonomous mobile device can be measured when the module is applied to such a device, and the data collected by the camera module can then be calibrated according to that offset. This mitigates the degradation of measurement data caused by relative motion or shaking, and greatly improves the usable precision of the sensor data on top of the already high precision of the structured light module's measurements.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1a is a schematic structural diagram of a structured light module according to an exemplary embodiment of the present disclosure. As shown in fig. 1a, the structured light module 100 includes: a camera module 101, line laser transmitters 102 distributed on both sides of the camera module 101, an IMU 108, and a main control unit 103.
In the present embodiment, the implementation form of the line laser transmitter 102 is not limited; it may be any device or product form capable of emitting line laser, for example, but not limited to, a laser tube. The line laser transmitter 102 emits line laser outward so that the environment can be detected. The main control unit 103 controls the operation of the line laser transmitter 102, for example the timing and the emission power with which it emits line laser; accordingly, the line laser transmitter 102 emits line laser outward under the control of the main control unit 103. As shown in fig. 1b, the line laser transmitter 102 may emit laser planes FAB and ECD under the control of the main control unit 103; when a laser plane reaches an obstacle, a line of laser light is formed on the obstacle's surface, i.e. lines AB and CD in fig. 1b. The IMU 108 is not shown in fig. 1b.
In the present embodiment, the implementation form of the camera module 101 is also not limited; any vision device capable of acquiring environment images is applicable. For example, the camera module 101 may include, but is not limited to, a monocular camera or a binocular camera. The wavelength of the line laser emitted by the line laser transmitter 102 is likewise not limited, and the color of the line laser may differ with its wavelength; the camera module 101 should therefore be one that can capture the line laser emitted by the line laser transmitter 102. Depending on that wavelength, the camera module 101 may be, for example, an infrared camera, an ultraviolet camera, a starlight camera, or a high-definition camera. The camera module 101 captures environment images within its field angle, which comprises a vertical field angle and a horizontal field angle. The field angle of the camera module 101 is not limited in this embodiment, and a camera module with a suitable field angle may be selected according to the application requirements.
In this embodiment, the line laser emitted by the line laser transmitter 102 lies within the field of view of the camera module 101. The line laser helps detect information such as the contour, height, and/or width of objects within that field of view, and the camera module 101 collects the environment images detected by the line laser. As long as the line laser lies within the field of view of the camera module 101, the angle between the laser line segment formed on an object's surface and the horizontal plane is not limited: the line laser may be parallel or perpendicular to the horizontal plane, or form any other angle with it, as the application requires.
In this embodiment, the main control unit 103 also controls the operation of the camera module 101, for example its exposure frequency, exposure duration, and operating frequency; accordingly, the camera module 101 collects the environment images detected by the line laser under the control of the main control unit 103. Fig. 1d shows the relationship between the line laser emitted by the line laser transmitters 102 and the field angle of the camera module 101. The letter K denotes the camera module, and the letters J and L denote the line laser transmitters on its two sides; Q denotes the intersection point, within the field angle of the camera module, of the line lasers emitted by the transmitters on the two sides; straight lines KP and KM denote the two boundaries of the camera module's horizontal field of view, and angle PKM denotes its horizontal field angle. In fig. 1d, the straight line JN denotes the center line of the line laser emitted by transmitter J, and the straight line LQ denotes the center line of the line laser emitted by transmitter L. The IMU 108 is not shown in fig. 1d.
Based on the environment images collected by the camera module 101, the distance from the structured light module 100, or from the device on which it is mounted, to an object in front (e.g., an obstacle) can be computed, as can information such as the object's height, width, shape, or contour; three-dimensional reconstruction can then be performed. Using the triangulation principle, the distance between the line laser transmitter and the object in front of it can be calculated through trigonometric functions.
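For illustration only, a minimal Python sketch of this triangulation, assuming the camera's optical axis is perpendicular to the installation baseline and the laser point's horizontal pixel offset from the principal point is known (the function and parameter names are hypothetical, not from the patent):

```python
import math

def laser_point_depth(baseline_mm, emit_angle_deg, pixel_u, focal_px):
    # Angle of the camera ray relative to the installation baseline,
    # assuming the optical axis is perpendicular to that baseline.
    view_angle = math.radians(90.0) - math.atan2(pixel_u, focal_px)
    emit_angle = math.radians(emit_angle_deg)
    # Law of sines in the triangle formed by the camera, the line laser
    # transmitter, and the laser point on the obstacle; the result is the
    # perpendicular distance of the laser point from the baseline.
    return (baseline_mm * math.sin(view_angle) * math.sin(emit_angle)
            / math.sin(view_angle + emit_angle))
```

With the values quoted for fig. 1c below (30 mm mechanical distance, 56.3-degree emission angle), a laser point seen on the camera's center line comes out at 30 × tan 56.3° ≈ 45 mm from the installation baseline, consistent with the position of intersection point H.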
In some application scenarios, the structured light module 100 is applied to an autonomous mobile device, i.e., any device capable of moving autonomously in its working environment, such as a robot, a cleaner, or an unmanned vehicle. In such a scenario, the structured light module 100 is mounted on the autonomous mobile device, and the environment images it collects need to be transformed from the coordinate system of the structured light module 100 (the sensor coordinate system for short) to the device coordinate system of the autonomous mobile device. In practice, the structured light module 100 moves together with the autonomous mobile device, and during this movement the module may shake for various reasons, causing the sensor coordinate system to shift relative to the device coordinate system. Taking a robot as an example, the structured light module 100 may be mounted on the collision plate at the front of the robot; the collision plate is a movable component that is prone to shaking, and as the robot moves, shaking of the collision plate shakes the structured light module 100, so that the sensor coordinate system is offset relative to the device coordinate system of the robot (also called the robot coordinate system). Such an offset causes the high-precision data collected by the structured light module 100 to deviate when transformed into the device coordinate system, making the measurement data less useful and preventing the high-precision advantage of the structured light module 100 from being fully exploited.
In view of the above, in the present embodiment an IMU 108 is mounted on the structured light module 100. An IMU is a device that measures the three-axis attitude angles and/or accelerations of an object. In general, an IMU may include, but is not limited to, at least one of a gyroscope, an accelerometer, and a magnetometer. A gyroscope is a sensor that measures angular velocity; an accelerometer is a sensor that measures linear acceleration; a magnetometer is a sensor that measures the strength and direction of a magnetic field and can thus provide heading. The specific implementation form of the IMU is not limited in the embodiments of the present application: any IMU that can measure the offset of the structured light module 100 relative to the autonomous mobile device is applicable, for example, but not limited to, a three-axis IMU, a six-axis IMU, or a nine-axis IMU. Optionally, a three-axis IMU may employ a three-axis gyroscope, a three-axis accelerometer, or a three-axis magnetometer. A six-axis IMU may employ, but is not limited to, a three-axis gyroscope and a three-axis accelerometer; alternatively it may employ a three-axis gyroscope and a three-axis magnetometer, a three-axis accelerometer and a three-axis magnetometer, and so on.
The mounting position of the IMU 108 is likewise not limited in the embodiments of the present application; it may be mounted, for example, on the front, rear, left, or right side, or on the upper or lower portion, of the structured light module 100. Wherever it is located on the structured light module 100, the IMU 108 can measure the offset of the structured light module 100 relative to the autonomous mobile device when the module is applied to such a device. The type of offset is not limited and varies with the type of the IMU 108 and the application requirements. For example, when the IMU 108 is a three-axis gyroscope, the offset mainly refers to the rotational offset of the structured light module 100 relative to the autonomous mobile device; when it is a three-axis accelerometer, the offset mainly refers to a displacement offset; and when it is a three-axis magnetometer, the offset mainly refers to an azimuth (heading) offset, and so on.
The way in which the IMU 108 obtains the offset is also not limited in the present embodiment; any implementation that obtains the offset of the structured light module 100 relative to the autonomous mobile device through the IMU 108 is applicable. Naturally, when the IMU 108 is implemented with different types of sensors, the way each sensor detects the offset differs. For example, if the IMU 108 employs a three-axis gyroscope, an integration period may be set and the instantaneous angular velocity values measured by the gyroscope integrated over that period to obtain an angular offset, which represents the rotational offset of the structured light module 100 relative to the autonomous mobile device. Similarly, if the IMU 108 employs a three-axis accelerometer, an integration period may be set and the instantaneous acceleration values measured by the accelerometer integrated over that period to obtain a distance offset, which represents the translational offset of the structured light module 100 relative to the autonomous mobile device, and so on. The integration period may be set according to the application scenario or requirements; for example, the instantaneous angular velocity values measured by the gyroscope may be integrated once every 3 ms, 5 ms, 7 ms, or another duration to obtain the angular offset.
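A minimal sketch of such integration, assuming uniformly sampled readings within one integration period (the function names and array shapes are assumptions for illustration):

```python
import numpy as np

def angular_offset(gyro_samples, dt):
    # gyro_samples: (N, 3) instantaneous angular velocities (rad/s) sampled
    # within one integration period; dt: sampling interval in seconds.
    return np.asarray(gyro_samples).sum(axis=0) * dt  # angular offset (rad)

def displacement_offset(accel_samples, dt):
    # accel_samples: (N, 3) instantaneous linear accelerations (m/s^2).
    # Double integration: acceleration -> velocity -> displacement. Drift
    # accumulates quickly, one reason the integration period is kept short.
    velocity = np.cumsum(np.asarray(accel_samples), axis=0) * dt
    return velocity.sum(axis=0) * dt  # displacement offset (m)
```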
The implementation form of the main control unit 103 is not limited in the embodiments of the present application; it may be, but is not limited to, a CPU, a GPU, an MCU, a processing chip implemented on an FPGA or a CPLD, a single-chip microcomputer, or the like. In this embodiment, the main control unit 103 controls the camera module 101 and the line laser transmitter 102, and in addition calibrates the environment images collected by the camera module 101 according to the offset measured by the IMU 108. The main control unit 103 is also electrically connected to the IMU 108, which transmits the measured offset to it. The specific way in which the offset is used to calibrate the environment image is not limited; any implementation that can calibrate the environment image using the offset is applicable to the embodiments of the present application.
In an optional embodiment, when calibrating the environment image, the main control unit 103 is specifically configured to: calibrate the first transformation matrix using the offset, and transform the environment image into the device coordinate system using the calibrated first transformation matrix, thereby achieving the calibration of the environment image.
The first transformation matrix is the transformation matrix between the sensor coordinate system of the structured light module 100 and the device coordinate system of the autonomous mobile device. It is determined, to some extent, by the structural parameters with which the structured light module 100 is mounted on the autonomous mobile device, including but not limited to the mounting height, mounting angle, and horizontal mounting position. By default the first transformation matrix is a fixed value; however, because the structured light module 100 is mounted on the autonomous mobile device, shaking may occur while the device travels, and this shaking changes the structural parameters. A change in the structural parameters means that the first transformation matrix should not be a fixed value but should change dynamically.
In the above optional embodiment, the offset of the structured light module 100 relative to the autonomous mobile device is embodied in the first transformation matrix. By calibrating the first transformation matrix, it can follow the dynamic changes of this offset, i.e. the dynamic changes of the structural parameters. The environment images acquired by the structured light module 100 can then be accurately transformed into the device coordinate system of the autonomous mobile device based on the dynamically updated first transformation matrix. This alleviates, to a certain extent, the inaccuracy that shaking would otherwise introduce into the transformation of the environment image from the sensor coordinate system to the device coordinate system.
The specific way in which the main control unit 103 calibrates the first transformation matrix is not limited in the embodiments of the present application; any workable implementation is applicable. In an optional embodiment, when calibrating the first transformation matrix, the main control unit 103 is specifically configured to: compute the product of the offset and the second transformation matrix to obtain an offset matrix in the sensor coordinate system; then compute the product of the offset matrix and the first transformation matrix to obtain the calibrated first transformation matrix. The second transformation matrix is the transformation matrix from the IMU coordinate system of the IMU 108 to the sensor coordinate system, and is determined by the structural parameters with which the IMU 108 is mounted on the structured light module 100, including but not limited to the mounting height, mounting angle, and horizontal mounting position. The IMU 108 is fixedly mounted relative to the structured light module 100 and, in theory, does not shift relative to it, which means the second transformation matrix is accurate. The offset measured by the IMU 108 can therefore be transformed directly into the sensor coordinate system of the structured light module 100 using the second transformation matrix, yielding the offset matrix in the sensor coordinate system, which is then used to calibrate the first transformation matrix.
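A sketch of the two matrix products just described, with all matrices taken as 4 × 4 homogeneous transforms (the multiplication order here follows the wording above; an actual implementation would fix the convention to match its coordinate definitions):

```python
import numpy as np

def calibrate_first_transform(T1, T2, imu_offset):
    # T1: first transformation matrix, sensor frame -> device frame
    # T2: second transformation matrix, IMU frame -> sensor frame
    # imu_offset: offset measured by the IMU 108, as a homogeneous transform
    offset_in_sensor = T2 @ imu_offset  # offset matrix in the sensor frame
    return offset_in_sensor @ T1        # calibrated first transformation matrix
```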
The embodiments of the present application do not limit how the sensor coordinate system, the device coordinate system, and the IMU coordinate system are established. Optionally, the device coordinate system and the sensor coordinate system may be established as shown in fig. 1j, but are not limited to this. As shown in fig. 1j, the device coordinate system takes the center of gravity of the autonomous mobile device as the coordinate origin O1; this center lies inside the device, so the coordinate axes inside the device are drawn with dashed lines and the parts of the coordinate system outside it with solid lines. The central axis in the horizontal forward direction is taken as the x1 axis, the horizontal line pointing left and perpendicular to the x1 axis as the y1 axis, and the vertically upward line of the autonomous mobile device as the z1 axis. The structured light module is located on the collision plate at the front of the autonomous mobile device, with the camera module on the device's x1 axis. For ease of computation, the sensor coordinate system takes the camera module as its coordinate origin; its x2 axis coincides with the x1 axis of the device coordinate system, and its y2 and z2 axes are parallel to the y1 and z1 axes, respectively. Of course, the sensor coordinate system could also take the line laser transmitter as its origin; no limitation is placed on this.
To transform the coordinates of a point from a source coordinate system to a target coordinate system, the point's coordinates in the source system are multiplied by a transformation matrix, which represents the positional relationship between the two systems, to obtain its coordinates in the target system. In the embodiments of the present application, the first and second transformation matrices are both such transformation matrices, and their implementation form is not limited. Optionally, each may be composed of a rotation matrix and a translation matrix spliced together. Going from the source system to the target system requires a series of rotation and translation operations: the change of angle during rotation of the coordinate axes is represented by a rotation matrix R, and the change of distance during their movement by a translation matrix t. The rotation matrix R is an m × m square matrix, the translation matrix t is an m × 1 matrix, and the transformation matrix T spliced from R and t is an (m + 1) × (m + 1) square matrix whose last row is filled with zeros except for a final 1, where m is a positive integer. This description of the matrix structure is merely exemplary and not limiting. For three-dimensional space, for example, m may be taken as 3, so the first and second transformation matrices are 4 × 4 matrices.
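For example, with m = 3 the splicing described above can be sketched as follows (a minimal illustration; the function name is hypothetical):

```python
import numpy as np

def splice_transform(R, t):
    # R: (3, 3) rotation matrix; t: (3,) translation vector.
    # Returns the (4, 4) transformation matrix T whose last row is
    # zeros except for a trailing 1, as described above.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```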
When the first transformation matrix is calibrated, the physical meaning of the offset depends on the IMU. When the IMU 108 is a three-axis gyroscope, the measured offset ΔT1 represents the variation ΔR1 of the rotation matrix, and the translation part is left as the identity; when the IMU is a three-axis accelerometer, the measured offset ΔT2 represents the variation Δt2 of the translation matrix, and the rotation part is left as the identity; when the IMU comprises a three-axis gyroscope and a three-axis accelerometer, the offset ΔT31 measured by the gyroscope represents the variation ΔR3 of the rotation matrix, and the offset ΔT32 measured by the accelerometer represents the variation Δt3 of the translation matrix. When the IMU comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer, the orientation offset ΔT41 measured by the magnetometer can be converted into an angular offset which, together with the offset ΔT42 measured by the gyroscope, represents the variation ΔR4 of the rotation matrix, while the offset ΔT43 measured by the accelerometer represents the variation Δt4 of the translation matrix; alternatively, the orientation offset ΔT41 can be converted into a distance offset which, together with the accelerometer offset ΔT43, represents the variation Δt4 of the translation matrix. The offset matrix in the sensor coordinate system is then obtained as the product of the second transformation matrix and the offset, and the calibrated first transformation matrix as the product of the offset matrix and the first transformation matrix.
When the environment image is calibrated, the coordinates of its points in the sensor coordinate system are multiplied by the calibrated first transformation matrix to obtain their coordinates in the device coordinate system of the autonomous mobile device. Further, the calibrated environment image may be transformed into the world coordinate system by multiplying it by a third transformation matrix, where the third transformation matrix is the transformation matrix from the device coordinate system of the autonomous mobile device to the world coordinate system. Based on the calibrated environment image, various function control can be performed on the autonomous mobile device: the structured light module provides high accuracy to begin with, and combining it with calibration for the module's offset relative to the autonomous mobile device guarantees the accuracy and precision of the environment image, which is conducive to precise control of the autonomous mobile device.
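A sketch of applying these products to the points of an environment image (array shapes and names are assumptions for illustration):

```python
import numpy as np

def sensor_to_device(points_sensor, T1_calibrated):
    # points_sensor: (N, 3) point coordinates in the sensor coordinate system.
    pts_h = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    return (T1_calibrated @ pts_h.T).T[:, :3]  # coordinates in the device frame

def device_to_world(points_device, T3):
    # T3: third transformation matrix, device frame -> world frame.
    pts_h = np.hstack([points_device, np.ones((len(points_device), 1))])
    return (T3 @ pts_h.T).T[:, :3]  # coordinates in the world frame
```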
In the embodiments of the present application, the total number of line laser transmitters 102 is not limited and may be, for example, two or more. The number of line laser transmitters 102 on each side of the camera module 101 is likewise not limited: there may be one or more on each side, and the numbers on the two sides may be the same or different. Fig. 1a illustrates one line laser transmitter 102 on each side of the camera module 101, but this is not limiting. For example, 2 line laser transmitters 102 may be arranged on the left side of the camera module 101 and 1 on the right side; or 2, 3, or 5 line laser transmitters 102 may be arranged on each of the left and right sides.
In this embodiment, the distribution of the line laser transmitters 102 on the two sides of the camera module 101 is also not limited; it may be, for example, uniform or non-uniform, symmetric or asymmetric. Uniform or non-uniform distribution may refer to the line laser transmitters 102 on the same side of the camera module 101, or may equally be understood as applying to the transmitters on both sides taken as a whole. Symmetric or asymmetric distribution mainly refers to the line laser transmitters 102 on the two sides taken as a whole, where symmetry covers both the numbers of transmitters and their mounting positions. For example, in the structured light module shown in fig. 1b, there are two line laser transmitters 102, symmetrically distributed on the two sides of the camera module 101.
In the embodiments of the present application, the mounting position relationship between the line laser transmitters 102 and the camera module 101 is not limited; any arrangement in which the line laser transmitters 102 are distributed on the two sides of the camera module 101 is applicable. This mounting relationship is related to the application scenario of the structured light module 100 and can be determined flexibly according to it. It covers the following aspects:
Installation height: the line laser transmitters 102 and the camera module 101 may be mounted at different heights. For example, the transmitters on both sides may be higher than the camera module 101, or the camera module 101 higher than the transmitters on both sides; or the transmitter on one side may be higher and the one on the other side lower than the camera module 101. They may, of course, also be mounted at the same height, which is preferred. For example, in actual use the structured light module 100 may be installed on a device (e.g., an autonomous mobile device such as a robot, a cleaner, or an unmanned vehicle); in that case the line laser transmitters 102 and the camera module 101 are at the same distance from the device's working surface (e.g., the ground), such as 47 mm, 50 mm, 10 cm, 30 cm, or 50 cm.
Installation distance: the installation distance is a mechanical distance (or referred to as a baseline distance) between the line laser transmitter 102 and the camera module 101. The mechanical distance between the line laser transmitter 102 and the camera module 101 can be flexibly set according to the application requirements of the structured light module 100. The size of the measurement blind area can be determined to some extent by information such as the mechanical distance between the line laser transmitter 102 and the camera module 101, the detection distance required to be satisfied by the device (e.g., a robot) where the structured light module 100 is located, and the diameter of the device. For the device (for example, a robot) where the structural optical module 100 is located, the diameter is fixed, and the mechanical distance between the measurement range and the line laser transmitter 102 and the camera module 101 can be flexibly set according to the requirement, which means that the mechanical distance and the blind area range are not fixed values. On the premise of ensuring the measurement range (or performance) of the equipment, the range of the blind area should be reduced as much as possible, however, the larger the mechanical distance between the line laser transmitter 102 and the camera module 101 is, the larger the controllable distance range is, which is beneficial to better control of the size of the blind area.
In some application scenarios the structured light module 100 is applied to a sweeping robot, where it may be mounted, for example, on the collision plate or on the robot body. For a sweeping robot, the following exemplary values give a reasonable range for the mechanical distance between the line laser transmitter 102 and the camera module 101: it may be greater than 20 mm; further optionally, greater than 30 mm; further still, greater than 41 mm. Note that this range applies not only when the structured light module 100 is used on a sweeping robot, but also to other devices carrying the structured light module 100 whose size is close or similar to that of a sweeping robot.
Emission angle: the emission angle is the angle between the center line of the line laser emitted by an installed line laser transmitter 102 and the installation baseline, where the installation baseline is the straight line on which the line laser transmitter 102 and the camera module 101 lie when they are mounted at the same height. The emission angle of the line laser transmitter 102 is not limited in this embodiment; it is related to the detection distance that the device (e.g., a robot) carrying the structured light module 100 must satisfy, to the radius of that device, and to the mechanical distance between the line laser transmitter 102 and the camera module 101. Once these quantities are determined, the emission angle of the line laser transmitter 102 follows directly from the trigonometric relationships, i.e., it is a fixed value.
Conversely, if a specific emission angle is required, it can be achieved by adjusting the detection distance that the device (e.g., a robot) carrying the structured light module 100 must satisfy and the mechanical distance between the line laser transmitter 102 and the camera module 101. In some application scenarios, once the detection distance and the radius of the device are determined, the emission angle of the line laser transmitter 102 may be varied within a certain range, for example 50-60 degrees, by adjusting the mechanical distance between the line laser transmitter 102 and the camera module 101, although this is not limiting.
Fig. 1c shows an example of the structured light module 100 applied to a sweeping robot. In fig. 1c, the letter B denotes the camera module, and the letters A and C denote the line laser transmitters on its two sides; H denotes the intersection point, within the field angle of the camera module, of the line lasers emitted by the transmitters on the two sides; lines BD and BE denote the two boundaries of the camera module's horizontal field of view, and angle DBE denotes its horizontal field angle. Line AG denotes the center line of the line laser emitted by transmitter A, and line CF denotes the center line of the line laser emitted by transmitter C. Line BH denotes the center line of the camera module's field angle; that is, in fig. 1c the center lines of the line lasers emitted on both sides intersect the center line of the camera module's field angle. The IMU 108 is not shown in fig. 1c.
In the embodiments of the present application, the horizontal and vertical field angles of the camera module employed are not limited. Optionally, the horizontal field angle may range from 60 to 75 degrees; for example, it may be 69.49 or 67.4 degrees. Correspondingly, the vertical field angle may range from 60 to 100 degrees; for example, it may be 77.74 or 80 degrees.
In fig. 1c, the radius of the sweeping robot is 175 mm and its diameter 350 mm; the line laser transmitters A and C are symmetrically distributed on the two sides of camera module B, with a mechanical distance of 30 mm between transmitter A or C and camera module B; the horizontal field angle DBE of camera module B is 67.4 degrees; and with a detection distance of 308 mm for the sweeping robot, the emission angle of transmitter A or C is 56.3 degrees. As shown in fig. 1c, the straight line IH through point H is 45 mm from the installation baseline and 35 mm from the tangent to the edge of the sweeping robot, and this region is the field-of-view blind area. The values shown in fig. 1c are exemplary only and not limiting.
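As a worked check of these values, the 56.3-degree emission angle is consistent with the right triangle formed by the 30 mm mechanical distance and the 45 mm distance from the installation baseline to intersection point H:

\tan\theta = \frac{45\ \mathrm{mm}}{30\ \mathrm{mm}} = 1.5, \qquad \theta = \arctan(1.5) \approx 56.3^{\circ}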
For convenience of use, the structured light module 100 provided in the embodiments of the present application further includes, besides the camera module 101, the line laser transmitters 102 distributed on its two sides, and the main control unit 103, some carrying structures for bearing these components. The carrying structure may be implemented in various ways, which are not limited here. In some optional embodiments it includes a fixing base, and may further include a fixing cover used together with the base. The structure of the structured light module 100 with the base and cover is described below with reference to figs. 1e-1i, which are respectively a front view, a bottom view, a top view, a rear view, and an exploded view of the structured light module 100; because of the viewing angles, not every view shows all components, so only some components are labeled in figs. 1e-1i. As shown in figs. 1e-1i, the structured light module 100 further includes a fixing base 104. The camera module 101, the IMU 108, and the line laser transmitters 102 are assembled on the fixing base 104, and the main control unit 103 is fixed behind it.
Further optionally, as shown in fig. 1i, the fixing base 104 includes a main body portion 105 and end portions 106 on its two sides. The camera module 101 and the IMU 108 are mounted on the main body portion 105, and the line laser transmitters 102 are mounted on the end portions 106. The end face of each end portion 106 faces the reference plane, so that the center lines of the line laser transmitters 102 and the center line of the camera module 101 intersect at one point; the reference plane is the plane perpendicular to the end face, or end-face tangent, of the main body portion 105.
In an optional embodiment, to facilitate fixing and to reduce the impact of the components on the appearance of the structured light module 100, a groove 111 is formed in the middle of the main body portion 105, as shown in fig. 1i, and the camera module 101 is installed in the groove 111; the end portions 106 are provided with mounting holes 109, in which the line laser transmitters 102 are mounted. Further optionally, as shown in fig. 1i, a fixing cover 107 is fitted above the fixing base 104; a cavity is formed between the fixing cover 107 and the fixing base 104 to accommodate the camera module 101, the IMU 108, and the connecting wires between the line laser transmitters 102 and the main control unit 103. The fixing cover 107, the main control unit 103, the IMU 108, and the fixing base 104 may be fastened together by fixing members; in fig. 1i the fixing members are illustrated as screws 110, but they are not limited to screws.
In an optional embodiment, the lens of the camera module 101 sits inside the outer edge of the groove 111, i.e. the lens is recessed within the groove 111, which prevents it from being scratched or knocked and thus helps protect it.
In the embodiments of the present application, the shape of the end face of the main body portion 105 is not limited; it may be, for example, a flat surface, or a curved surface recessed inward or bulging outward, and it varies with the device on which the structured light module 100 is installed. For example, if the structured light module 100 is applied to an autonomous mobile device with a circular or elliptical outline, the end face of the main body portion 105 may be an inwardly recessed curved surface adapted to that outline; if the structured light module 100 is applied to an autonomous mobile device with a square or rectangular outline, the end face may be a plane matching that outline. An autonomous mobile device with a circular or elliptical outline may be, for example, a sweeping robot or a window-cleaning robot of that shape; correspondingly, one with a square or rectangular outline may be a sweeping robot or window-cleaning robot of that shape.
In an optional embodiment, for an autonomous mobile device with a circular or elliptical outline on which the structured light module 100 is mounted, the radius of the curved end face of the main body portion 105 is the same as, or approximately the same as, the radius of the device, so as to better match its appearance and make maximal use of its space. For example, if the device's outline is circular with a radius of 170 mm, the radius of the curved surface of the main body portion may be 170 mm or approximately 170 mm, for example in the range of 170 mm to 172 mm, but is not limited thereto.
Further, when the structured light module is applied to an autonomous mobile device with a circular or elliptical outline, the emission angle of its line laser transmitters is mainly determined by the detection distance required by the device, the radius of the device, and so on. In this scenario the end face, or end-face tangent, of the module's main body portion is parallel to the installation baseline, so the emission angle of a line laser transmitter can equally be defined as: the angle between the center line of its emitted line laser and the end face (or end-face tangent) of the main body portion. In some application scenarios, once the detection distance and radius of the autonomous mobile device are determined, the emission angle may be implemented in the range of 50-60 degrees, but is not limited thereto. As shown in figs. 1e-1i, there are two line laser transmitters 102, symmetrically distributed on the two sides of the camera module 101. The detection distance required by the autonomous mobile device refers to the distance range over which it needs to detect environmental information, mainly a certain range in front of the device.
The structured light module provided by the above embodiments of the present application is structurally stable and compact, fits the appearance of the complete machine, saves considerable space, and can support multiple types of autonomous mobile devices.
In addition to the structured light module described above, embodiments of the present application provide another structured light module. Fig. 2a is a schematic structural diagram of another structured light module according to an exemplary embodiment of the present disclosure. The structured light module 200 includes: line laser transmitters 201, a camera module 202, an IMU 208, and a main control unit 203, where the line laser transmitters 201 are distributed on both sides of the camera module 202.
Further, as shown in fig. 2a, the structured light module 200 further includes a laser driving circuit 204. The laser driving circuit 204 is electrically connected between the main control unit 203 and the line laser transmitter 201. In the embodiment of the present application, the number of laser driving circuits 204 is not limited: different line laser emitters 201 may share one laser driving circuit 204, or each line laser emitter 201 may correspond to its own laser driving circuit 204; the latter is preferred. Fig. 2a illustrates the case of one laser driving circuit per line laser transmitter. As shown in fig. 2a, the structured light module 200 includes two line laser emitters 201, denoted 201a and 201b, and the corresponding laser driving circuits 204, denoted 204a and 204b.
In this embodiment, the laser driving circuit 204 is mainly configured to amplify the control signal sent by the main control unit 203 to the line laser transmitter 201 and to provide the amplified signal to the line laser transmitter 201 so as to control it. In the embodiment of the present application, the circuit structure of the laser driving circuit 204 is not limited; any circuit structure that can amplify a signal and provide the amplified signal to the line laser transmitter 201 is suitable for the embodiment of the present application.
In an alternative embodiment, as shown in fig. 2c, a circuit structure of the laser driving circuit 204a or 204b includes: a first amplifying circuit 2041 and a second amplifying circuit 2042. The first amplifying circuit 2041 is electrically connected to the main control unit 203; an on-off control signal sent by the main control unit 203 to the line laser transmitter 201 is amplified by the first amplifying circuit 2041 and then enters the line laser transmitter 201 to drive it to start working. The second amplifying circuit 2042 is likewise electrically connected to the main control unit 203; a current control signal sent by the main control unit 203 to the line laser transmitter 201 is amplified by the second amplifying circuit 2042 and then enters the line laser transmitter 201 to control its working current.
Further, as shown in fig. 2c, the first amplifying circuit 2041 includes a transistor Q1. The base of transistor Q1 is connected to a resistor R27; the junction of R27 and the base is grounded through a capacitor C27, and a resistor R29 is connected in parallel across C27. The other end of resistor R27 serves as the input of the first amplifying circuit and is electrically connected to a first IO interface of the main control unit 203. The on-off control signal output by this first IO interface is filtered by capacitor C27 and amplified by transistor Q1 to drive the line laser transmitter 201 to start working. The main control unit 203 includes at least two first IO interfaces, each electrically connected to one laser driving circuit 204 and used for outputting an on-off control signal to that laser driving circuit (e.g., 204a or 204b). In fig. 2c, the on-off control signal output by the main control unit 203 to the laser driving circuit 204a through its first IO interface is denoted LD_L_EMIT_CTRL, and the on-off control signal to the laser driving circuit 204b is denoted LD_R_EMIT_CTRL.
Further, as shown in fig. 2c, the second amplifying circuit 2042 includes a MOS transistor Q7. The gate of Q7 is connected to resistors R37 and R35; the junction of R37 and R35 is grounded through a capacitor C29, and the other end of resistor R35 serves as the input of the second amplifying circuit and is electrically connected to a second IO interface of the main control unit. The drain of Q7 is grounded through a resistor R31, and the source of Q7 is electrically connected to the emitter of transistor Q1. The output of the laser driving circuit is taken between the collector of transistor Q1 and the power supply of the laser driving circuit, and is used for connecting the line laser emitter. After the Pulse Width Modulation (PWM) signal output by the second IO interface of the main control unit is filtered by the filter circuit formed by resistor R35 and capacitor C29, the working current of the line laser emitter can be controlled by changing the gate voltage of Q7. The main control unit 203 includes at least two second IO interfaces, each electrically connected to one laser driving circuit 204 and used for outputting a PWM signal to that laser driving circuit (e.g., 204a or 204b). In fig. 2c, the PWM signal output from the main control unit 203 to the laser driving circuit 204a via the second IO interface is denoted LD_L_PWM, and the PWM signal output to the laser driving circuit 204b is denoted LD_R_PWM. Further, as shown in fig. 2c, J1 denotes the control interface of line laser transmitter 201a and J2 denotes the control interface of line laser transmitter 201b; the pin connections between J1, J2 and the laser driving circuits 204a, 204b are shown in fig. 2c. That is, the pins LD_L_CATHOD (cathode) and LD_L_ANODE (anode) of J1 are respectively connected to the corresponding pins in the laser driving circuit 204a, and the pins LD_R_CATHOD (cathode) and LD_R_ANODE (anode) of J2 are respectively connected to the corresponding pins in the laser driving circuit 204b.
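The duty-ratio-to-current relationship described above can be made concrete with a small firmware sketch. This is an illustration only: the register stand-in, tick counts, and the linear current model are assumptions and are not taken from the patent; a real implementation would write the duty ratio into whatever MCU timer generates the LD_L_PWM / LD_R_PWM signals.

```c
#include <stdint.h>

#define PWM_PERIOD_TICKS 1000U  /* timer auto-reload value (assumed) */
#define I_MAX_MA         150U   /* laser current at 100% duty (assumed) */

/* Stand-in for a hardware timer compare register; on a real MCU this
 * would be a memory-mapped register driving the PWM output pin. */
static volatile uint32_t pwm_compare_ld_l;

/* Set the PWM duty ratio (0-100%); after RC filtering by R35/C29 the
 * resulting gate voltage on Q7 determines the laser working current. */
static void set_laser_duty(uint32_t duty_percent)
{
    if (duty_percent > 100U)
        duty_percent = 100U;
    pwm_compare_ld_l = (PWM_PERIOD_TICKS * duty_percent) / 100U;
}

/* Request a target working current, assuming (for illustration) that the
 * current scales roughly linearly with duty ratio up to I_MAX_MA. */
static void set_laser_current_ma(uint32_t target_ma)
{
    if (target_ma > I_MAX_MA)
        target_ma = I_MAX_MA;
    set_laser_duty((target_ma * 100U) / I_MAX_MA);
}

int main(void)
{
    set_laser_current_ma(75U);  /* 75 mA -> 50% duty under this model */
    return (int)pwm_compare_ld_l;
}
```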
In the embodiment of the present application, the implementation form of the main control unit 203 is not limited; for example, it may be, but is not limited to, a CPU, a GPU, an MCU, a processing chip implemented based on an FPGA or CPLD, or a single chip microcomputer.
In an optional embodiment, the main control unit 203 is implemented by a single chip microcomputer; in other words, the main control unit 203 takes the form of a single chip microcomputer. Optionally, as shown in fig. 2b, an implementation structure of the main control unit 203 includes a main control board 20b.
In the embodiment of the present application, the implementation structure of the main control board 20b is not limited; any circuit board capable of realizing the control function is suitable, for example an FPGA board card or a single chip microcomputer. Optionally, to reduce implementation cost, an inexpensive, cost-effective single chip microcomputer may be used as the main control board.
As shown in fig. 2b, the main control board 20b includes a plurality of IO interfaces (pins). Among these, some IO interfaces may be used as test interfaces and connected to the debugging and burning module 21b. The debugging and burning module 21b is used to complete the burning of the configuration file and, after a successful burn, the testing of the hardware functions. The connection relationship between the debugging and burning module 21b and the main control board 20b is: the 2nd pin 21b_pin2 of the debugging and burning module 21b is electrically connected with the 23rd pin 20b_pin23 of the main control board 20b, and the 3rd pin 21b_pin3 of the debugging and burning module 21b is electrically connected with the 24th pin 20b_pin24 of the main control board 20b. The pins 21b_pin3 and 20b_pin24 belong to the IO interfaces used for testing.
As shown in fig. 2b, the IO interfaces of the main control board 20b include an interface for controlling the IMU to measure the offset; the connection relationship between the IMU 208 and the main control board 20b is: 208_pin1 - 20b_pin41.
As shown in fig. 2b, the IO interfaces of the main control board 20b include interfaces for clock signals; these may be electrically connected to the clock control circuit 22b and are responsible for receiving the clock signals it provides. The clock control circuit 22b includes: a resistor R9; a crystal oscillator Y1 connected in parallel with R9; a capacitor C37 connected in parallel with Y1; and a capacitor C38 connected in series with C37, where both C37 and C38 are grounded. Output terminals of the clock control circuit 22b are led out from the two ends of resistor R9 and are electrically connected with the clock signal interfaces on the main control board 20b. The clock control circuit 22b further includes a resistor R10 connected to a +3V supply; R10 is grounded through a capacitor C40, and an output led out between R10 and C40 is electrically connected with the asynchronous reset (NRST) pin of the main control board 20b. Further, the clock control circuit 22b includes a resistor R5, one end of which is grounded through a capacitor C26 and the other end through a capacitor C18; the +3V supply and the processor of the autonomous mobile device are connected between R5 and C18, and an output led out between R5 and C26 is electrically connected with the VDDA pin of the main control board 20b. The crystal oscillator Y1 in the clock control circuit 22b provides a high-frequency pulse which, after frequency division, becomes the internal clock signal of the main control board 20b; this clock signal serves as the control signal coordinating the operation of each component. In addition, when the structured light module is mounted on an autonomous mobile device, the clock control circuit 22b may be connected to the processor of the autonomous mobile device so that the device can control the structured light module. The connection relationship between the clock control circuit 22b and the main control board 20b is: one end of R9 is connected to 20b_pin2 and the other end to 20b_pin3; 20b_pin4 is connected between R10 and C40; and 20b_pin5 is connected between R5 and C26. Here 20b_pin2 denotes the 2nd pin of the main control board 20b, 20b_pin3 the 3rd pin, 20b_pin4 the 4th pin (NRST), and 20b_pin5 the 5th pin (VDDA).
In the embodiment of the present application, the connection mode between the camera module 202 and the main control board 20b is not limited: the camera module 202 may be connected to the main control board 20b directly, or through an FPC (Flexible Printed Circuit) cable 23b.
When the camera module 202 is connected to the main control board 20b through the FPC cable 23b, the connection relationship between the FPC cable 23b and the main control board 20b is: 23b_pinx - 20b_pinx, where "-" represents a connection relationship, 23b_pinx represents the x-th pin on the FPC cable 23b, and 20b_pinx represents the x-th pin on the main control board 20b. It should be noted that fig. 2b only shows some of the pins of the main control board 20b; this does not mean that the main control board 20b includes only these pins. The pin names, pin numbers, and connection relationships shown in figs. 2b and 2c are merely exemplary and should not be construed as limiting the circuit configuration of the present application.
As shown in figs. 2a to 2c, the connection relationship between the laser driving circuit 204 (taking 204a and 204b as examples) and the main control board 20b is as follows. In fig. 2c, J1 is connected to the line laser transmitter 201a in fig. 2a and serves as its control interface; J2 is connected to the line laser transmitter 201b in fig. 2a and serves as its control interface. As shown in fig. 2b, the laser driving circuit 204a includes pins LD_L_CATHOD and LD_L_ANODE, which are electrically connected to the pins LD_L_CATHOD and LD_L_ANODE of J1, respectively; the laser driving circuit 204b includes pins LD_R_CATHOD and LD_R_ANODE, which are electrically connected to the pins LD_R_CATHOD and LD_R_ANODE of J2, respectively. In fig. 2b, 20b_pin28 is connected to the LD_L_EMIT_CTRL terminal of the laser driving circuit 204a to control the on/off of the line laser transmitter 201a: when 20b_pin28 is high, the line laser transmitter 201a is on; when 20b_pin28 is low, it is off. Likewise, 20b_pin27 is connected to the LD_R_EMIT_CTRL terminal of the laser driving circuit 204b to control the on/off of the line laser transmitter 201b: when 20b_pin27 is high, the line laser transmitter 201b is on; when low, it is off. In fig. 2b, 20b_pin26 is connected to the LD_L_PWM terminal of the laser driving circuit 204a to control the working current of the line laser transmitter 201a: 20b_pin26 outputs a PWM signal whose duty ratio can be raised from 0% to 100%, and as the duty ratio increases, the working current of the line laser transmitter 201a increases, so the working current can be controlled by adjusting the duty ratio of the PWM signal on 20b_pin26. Similarly, 20b_pin25 is connected to the LD_R_PWM terminal of the laser driving circuit 204b, and the working current of the line laser transmitter 201b can be controlled by adjusting the duty ratio of the PWM signal output by 20b_pin25.
In the above embodiments of the present application, the manner in which the line laser transmitters located on both sides of the camera module operate is not limited. In an optional embodiment, the main control unit 203 is specifically configured to control the line laser emitters on the two sides of the camera module to work alternately, and to control the camera module 202 to alternately set the working mode of its lens so as to match the line laser emitter 201 currently in the working state.
Further optionally, when controlling the camera module 202 to alternately set the working mode of its lens, the main control unit 203 is specifically configured to: control the lens of the camera module to work in right-half mode when the line laser transmitter on the left side of the camera module works, and control the lens to work in left-half mode when the line laser transmitter on the right side works.
Further optionally, the main control unit 203 may control the exposure of the camera module 202 and control the line laser transmitter on one side to work at each exposure, so that the line laser transmitters on the two sides work alternately. Specifically, the main control unit 203 may send an on-off control signal and a PWM signal to the line laser transmitter through the laser driving circuit 204a or 204b shown in fig. 2c to drive it to work.
Of course, in addition to controlling the line laser emitters on both sides of the camera module to operate alternately, the line laser emitters on both sides of the camera module may also be controlled to operate simultaneously. Under the condition that the line laser transmitters on the two sides of the camera module work simultaneously, the lens of the camera module works in a full-width mode.
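To make the alternating scheme concrete, the following C sketch pairs each exposure with one laser side and the matching half lens mode. All function names are hypothetical stand-ins for the control paths described above (camera control and the laser driving circuits); it is a minimal illustration under those assumptions, not the patent's implementation.

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { LASER_LEFT, LASER_RIGHT } laser_side_t;
typedef enum { LENS_LEFT_HALF, LENS_RIGHT_HALF, LENS_FULL } lens_mode_t;

/* Stub hardware hooks (names assumed): real versions would toggle the
 * LD_x_EMIT_CTRL lines and write the camera's lens-mode registers. */
static void laser_enable(laser_side_t s, bool on)
{
    printf("laser %c -> %s\n", s == LASER_LEFT ? 'L' : 'R', on ? "on" : "off");
}
static void camera_set_lens_mode(lens_mode_t m) { printf("lens mode %d\n", (int)m); }
static void camera_expose(void)                 { printf("expose\n"); }

/* Alternate the two line laser transmitters across exposures; the left
 * laser pairs with the right-half lens mode and vice versa. */
static void capture_alternating(unsigned frames)
{
    laser_side_t side = LASER_LEFT;
    while (frames--) {
        camera_set_lens_mode(side == LASER_LEFT ? LENS_RIGHT_HALF
                                                : LENS_LEFT_HALF);
        laser_enable(side, true);
        camera_expose();
        laser_enable(side, false);
        side = (side == LASER_LEFT) ? LASER_RIGHT : LASER_LEFT;
    }
}

int main(void)
{
    capture_alternating(4);           /* four alternating exposures */
    camera_set_lens_mode(LENS_FULL);  /* simultaneous mode uses the full width */
    return 0;
}
```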
Based on the above structured light module, an exemplary embodiment of the present application further provides a schematic structural diagram of an autonomous mobile device. As shown in fig. 3a, the device includes a device body 300 on which a main controller 301 and a structured light module 302 are arranged.
The structured light module includes: a camera module 302a, line laser transmitters 302b distributed on both sides of the camera module 302a, and a main control unit 302c for controlling the camera module 302a and the line laser transmitters 302b. The line laser transmitter 302b emits line laser outward under the control of the main control unit 302c; the camera module 302a collects, under the control of the main control unit 302c, the environment image detected by the line laser; and the main controller 301 uses the environment image to perform function control of the autonomous mobile device.
In the embodiment of the present application, the autonomous mobile device further includes an inertial measurement unit IMU 308 mounted on the structured light module 302; the specific mounting position of the IMU 308 on the structured light module 302 is not limited. For example, the IMU 308 may be mounted on, but is not limited to, the front, back, top, bottom, left, or right side of the structured light module 302. The IMU 308 is used to measure the offset of the structured light module 302 relative to the autonomous mobile device. The main control unit 302c is further configured to calibrate the environment image according to the offset and provide the calibrated environment image to the main controller 301, and the main controller 301 is configured to perform function control on the autonomous mobile device using the calibrated environment image. For a detailed description of the structured light module, reference is made to the foregoing embodiments, which are not repeated herein.
In the embodiment of the present application, the autonomous mobile device may be any mechanical device capable of moving through space with a high degree of autonomy in its environment, for example a robot, a purifier, or an unmanned aerial vehicle. Robots may include sweeping robots, glass-cleaning robots, family companion robots, greeting robots, and the like.
Of course, the shape of the autonomous mobile device may vary with its implementation, and this embodiment does not limit the implementation form. Taking the outer contour shape as an example, it may be an irregular shape or a regular shape such as a circle, an ellipse, a square, a triangle, a drop shape, or a D shape. Irregular shapes include, for example, the outer contour of a humanoid robot, the outer contour of an unmanned ground vehicle, and the outer contour of an unmanned aerial vehicle.
In the embodiment of the present application, the implementation form of the main controller 301 is not limited; it may be, for example, but not limited to, a CPU, GPU, or MCU. The specific manner in which the main controller 301 performs function control on the autonomous mobile device according to the environment image is also not limited. For example, the main controller 301 may control the autonomous mobile device to implement various perception-based functions according to the environment image, such as object recognition, tracking, and classification in visual algorithms. In addition, based on the high detection precision of the line laser, functions such as positioning and map building with strong real-time performance, strong robustness, and high precision can be realized; the constructed high-precision environment map can in turn provide comprehensive support for motion planning, path navigation, and localization. Of course, the main controller 301 may also perform travel control on the autonomous mobile device according to the environment image, for example controlling it to continue forward, move backward, or turn.
Similarly, the embodiment of the present application does not limit the implementation form of the main control unit 302c; it may be, for example, but is not limited to, a processor such as a CPU, GPU, or MCU. Nor is the manner in which the main control unit 302c controls the structured light module 302 limited: any implementation that achieves the functions of the structured light module 302 is applicable to the embodiments of the present application. For example, when the main control unit 302c is an MCU, after power-on the MCU initializes the input/output (IO) interfaces, which are the links for information exchange between the MCU and the structured light module, and configures the structured light module 302 through an I2C (Inter-Integrated Circuit) interface.
The principle of cooperation between the MCU and the structured light module 302 is described below, taking the main control unit 302c as an MCU and the IMU as a three-axis gyroscope plus a three-axis accelerometer. As shown in fig. 3b, upon power-on the MCU initializes the IO interfaces and configures the structured light module 302 via the I2C interface. After initialization, the MCU controls the structured light module 302 through the I2C interface, thereby controlling the camera module 302a and the line laser transmitters 302b. The MCU sends a trigger signal to the camera module 302a through the I2C interface; on receiving it, the camera module 302a starts exposure and simultaneously sends an exposure synchronization (LED STROBE) signal to the MCU. After receiving the LED STROBE signal, the MCU drives the right line laser transmitter 302b to emit laser through the laser driving circuit 304c on the rising edge of the signal, and turns the right line laser transmitter 302b off on the falling edge. After exposure is completed, the camera module 302a triggers the MCU through the Digital Video Port (DVP) on the main control board to acquire the picture data, which the MCU then processes; the processing mainly calibrates the picture data according to the offset measured by the IMU 308, so as to accurately convert the picture data from the coordinate system of the structured light module 302 to the coordinate system of the autonomous mobile device, and then reports the converted picture data to the main controller 301 of the autonomous mobile device through the serial interface. After this left-half-mode exposure of the camera module 302a is completed, the MCU similarly sends a trigger signal through the I2C interface; the camera module 302a starts exposure and sends the LED STROBE signal to the MCU. After receiving the LED STROBE signal, the MCU drives the left line laser transmitter 302b to emit laser through the laser driving circuit 304c on the rising edge of the signal, and turns the left line laser transmitter 302b off on the falling edge. After exposure is completed, the camera module 302a again triggers the MCU through the DVP to acquire the picture data; the MCU calibrates it according to the offset measured by the IMU 308, converts it from the coordinate system of the structured light module 302 to the coordinate system of the autonomous mobile device, and reports it to the main controller 301 through the serial interface. The above process repeats until the operation ends.
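The edge-driven part of this flow can be summarized in a short sketch. The interrupt plumbing and function names below are assumptions for illustration; the essential logic, taken from the description above, is that the rising edge of LED STROBE turns the currently selected line laser on, the falling edge turns it off, and the selected side then alternates.

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { SIDE_LEFT, SIDE_RIGHT } side_t;

/* The right laser is driven on the first exposure, as in the flow above. */
static side_t active_side = SIDE_RIGHT;

/* Hypothetical hook forwarding the on-off control signal through the
 * laser driving circuit (the LD_x_EMIT_CTRL lines). */
static void laser_emit_ctrl(side_t s, bool on)
{
    printf("%s laser %s\n", s == SIDE_LEFT ? "left" : "right", on ? "on" : "off");
}

/* Would be called from the GPIO interrupt on each LED STROBE edge. */
static void on_strobe_edge(bool rising)
{
    if (rising) {
        laser_emit_ctrl(active_side, true);   /* exposure has started */
    } else {
        laser_emit_ctrl(active_side, false);  /* exposure has finished */
        active_side = (active_side == SIDE_LEFT) ? SIDE_RIGHT : SIDE_LEFT;
    }
}

int main(void)
{
    /* Simulate two exposures: a rising then a falling edge, twice. */
    on_strobe_edge(true);  on_strobe_edge(false);
    on_strobe_edge(true);  on_strobe_edge(false);
    return 0;
}
```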
In the embodiment of the present application, the specific position of the structured light module 302 on the device body 300 is not limited; for example, it may be located, without limitation, at the front, back, left, right, top, middle, or bottom of the device body 300. Further, the structured light module 302 may be disposed at a middle, top, or bottom position in the height direction of the device body 300.
In an optional embodiment, the autonomous mobile device moves forward to perform a task, and in order to better detect the environmental information ahead, the structured light module 302 is disposed on the front side of the device body 300; the front side is the side that the device body faces during the forward movement of the autonomous mobile device.
In another alternative embodiment, in order to protect the structured light module 302 from damage by external forces, a striking plate 305 is further installed on the front side of the device body 300, located outside the structured light module 302. Fig. 3c is an exploded view of the device body 300 and the striking plate 305; in fig. 3c the autonomous mobile device is illustrated as a sweeping robot, but is not limited thereto. The structured light module 302 may or may not be mounted on the striking plate 305. The striking plate 305 is provided with windows in the area corresponding to the structured light module 302 to expose the camera module 302a and the line laser transmitters 302b of the structured light module; further optionally, windows are opened in the striking plate at the positions corresponding to the camera module 302a and the line laser transmitters 302b respectively. As shown in fig. 3c, the striking plate 305 is provided with windows 31, 32, and 33, where windows 31 and 33 correspond to the line laser emitters 302b and window 32 corresponds to the camera module 302a.
In yet another alternative embodiment, the structured light module 302 is mounted on the inner sidewall of the striking plate 305. Fig. 3d is an exploded view of the structured light module 302 and the striking plate 305.
In yet another alternative embodiment, the distance from the center of the structured light module to the working surface on which the autonomous mobile device is located is in the range of 30 to 60 mm. To reduce the spatial blind area of the autonomous mobile device while keeping the angle of view sufficiently large, the distance from the center of the structured light module to the working surface may further optionally be 47 mm.
Further, in addition to the various components mentioned above, the autonomous mobile device of the present embodiment may also include some basic components, such as one or more memories, communication components, power components, drive components, and so forth.
Wherein the one or more memories are primarily for storing a computer program executable by the master controller to cause the master controller to control the autonomous mobile device to perform a corresponding task. In addition to storing computer programs, the one or more memories may be configured to store various other data to support operations on the autonomous mobile device. Examples of such data include instructions for any application or method operating on the autonomous mobile device, map data of the environment/scene in which the autonomous mobile device is located, operating modes, operating parameters, and so forth.
The communication component is configured to facilitate wired or wireless communication between its host device and other devices. The host device can access a wireless network based on a communication standard such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may further support Near Field Communication (NFC), Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Bluetooth (BT), and other technologies.
Alternatively, the drive assembly may include drive wheels, drive motors, universal wheels, and the like. Optionally, as shown in fig. 3c, the autonomous mobile device of this embodiment may be implemented as a sweeping robot; in that case it may further include a cleaning assembly, which may include a cleaning motor, a cleaning brush, a dusting brush, a dust collection fan, and the like. The basic components and their configurations differ between autonomous mobile devices, and the embodiments of the present application are only some examples.
Based on the above structured light module, an embodiment of the present application further provides a schematic structural diagram of another autonomous mobile device. As shown in fig. 4a, the device includes a device body 400 on which a main controller 401, an inertial measurement unit IMU 408, and a structured light module 402 are arranged. The structured light module 402 includes a camera module 402a and line laser transmitters 402b distributed on both sides of the camera module. For a detailed description of the structured light module 402, reference is made to the foregoing embodiments, which are not repeated herein.
The embodiment of the present application does not limit the way in which the main controller 401 is electrically connected to the structured light module 402, and any way in which the operation of the structured light module 402 can be controlled is applicable to the embodiment of the present application. Specifically, the main controller 401 is electrically connected to the camera module 402a and the line laser transmitter 402b, respectively. On one hand, the main controller 401 controls the line laser emitter 402b to emit line laser, for example, may control the time and the emission power of the line laser emitter 402b to emit line laser; on the other hand, the camera module 402a is controlled to collect the environment image detected by the line laser, for example, the exposure frequency, the exposure duration, the working frequency, etc. of the camera module 402a can be controlled; further, the main controller 401 is also responsible for performing function control on the autonomous mobile device according to the environment image.
In the embodiment of the present application, the inertial measurement unit IMU 408 is mounted on the structured light module 402, and its specific mounting position on the structured light module 402 is not limited. For example, the IMU 408 may be mounted on, but is not limited to, the front, back, top, bottom, left, or right side of the structured light module 402. The IMU 408 is used to measure the offset of the structured light module 402 relative to the autonomous mobile device. The main controller 401 is further configured to calibrate the environment image according to the offset and perform function control on the autonomous mobile device using the calibrated environment image.
In the embodiment of the present application, the autonomous mobile device may be any mechanical device capable of moving through space with a high degree of autonomy in its environment, for example a robot, a purifier, or an unmanned aerial vehicle. Robots may include sweeping robots, glass-cleaning robots, family companion robots, greeting robots, and the like.
Of course, the shape of the autonomous mobile device may vary with its implementation, and this embodiment does not limit the implementation form. Taking the outer contour shape as an example, it may be an irregular shape or a regular shape such as a circle, an ellipse, a square, a triangle, a drop shape, or a D shape. Irregular shapes include, for example, the outer contour of a humanoid robot, the outer contour of an unmanned ground vehicle, and the outer contour of an unmanned aerial vehicle.
The embodiment of the present application does not limit the specific manner in which the main controller 401 performs function control on the autonomous mobile device according to the environment image. For example, the main controller 401 may control the autonomous mobile device to implement various perception-based functions according to the environment image, such as object recognition, tracking, and classification in visual algorithms. In addition, based on the high detection precision of the line laser, functions such as positioning and map building with strong real-time performance, strong robustness, and high precision can be realized; the constructed high-precision environment map can in turn provide comprehensive support for motion planning, path navigation, and localization. Of course, the main controller 401 may also perform travel control on the autonomous mobile device according to the environment image, for example controlling it to continue forward, move backward, or turn.
In the embodiment of the present application, the implementation form of the main controller 401 is not limited, and for example, the implementation form may be, but is not limited to, a CPU, a GPU, an MCU, a processing chip implemented based on an FPGA or a CPLD, or a single chip microcomputer.
In an optional embodiment, the main controller 401 is implemented by a single chip microcomputer; in other words, the main controller 401 takes the form of a single chip microcomputer. Optionally, as shown in fig. 4b, one implementation structure of the main controller 401 includes a main control board 40b.
In the embodiment of the present application, the implementation structure of the main control board 40b is not limited; any circuit board capable of realizing the control function is suitable, for example an FPGA board card or a single chip microcomputer. Optionally, to reduce implementation cost, an inexpensive, cost-effective single chip microcomputer may be used as the main control board.
As shown in fig. 4b, the main control board 40b includes a plurality of IO interfaces (pins). Among these, some IO interfaces may be used as test interfaces and connected to the debugging and burning module 41b. The debugging and burning module 41b is used to complete the burning of the configuration file and, after a successful burn, the testing of the hardware functions. The connection relationship between the debugging and burning module 41b and the main control board 40b is: the 2nd pin 41b_pin2 of the debugging and burning module 41b is electrically connected with the 23rd pin 40b_pin23 of the main control board 40b, and the 3rd pin 41b_pin3 of the debugging and burning module 41b is electrically connected with the 24th pin 40b_pin24 of the main control board 40b. The pins 41b_pin3 and 40b_pin24 belong to the IO interfaces used for testing.
As shown in fig. 4b, the IO interfaces of the main control board 40b include an interface for controlling the IMU to measure the offset; the connection relationship between the IMU 408 and the main control board 40b is: 408_pin1 - 40b_pin41.
As shown in fig. 4b, the IO interfaces of the main control board 40b include interfaces for clock signals; these may be electrically connected to the clock control circuit 42b and are responsible for receiving the clock signals it provides. The clock control circuit 42b includes: a resistor R9; a crystal oscillator Y1 connected in parallel with R9; a capacitor C37 connected in parallel with Y1; and a capacitor C38 connected in series with C37, where both C37 and C38 are grounded. Output terminals of the clock control circuit 42b are led out from the two ends of resistor R9 and are electrically connected with the clock signal interfaces on the main control board 40b. The clock control circuit 42b further includes a resistor R10 connected to a +3V supply; R10 is grounded through a capacitor C40, and an output led out between R10 and C40 is electrically connected with the asynchronous reset (NRST) pin of the main control board 40b. Further, the clock control circuit 42b includes a resistor R5, one end of which is grounded through a capacitor C26 and the other end through a capacitor C18; the +3V supply and the processor of the autonomous mobile device are connected between R5 and C18, and an output led out between R5 and C26 is electrically connected with the VDDA pin of the main control board 40b. The crystal oscillator Y1 in the clock control circuit 42b provides a high-frequency pulse which, after frequency division, becomes the internal clock signal of the main control board 40b; this clock signal serves as the control signal coordinating the operation of each component. In addition, when the structured light module 402 is installed on an autonomous mobile device, the clock control circuit 42b may be connected to the main controller 401 to enable the autonomous mobile device to control the structured light module 402. The connection relationship between the clock control circuit 42b and the main control board 40b is: one end of R9 is connected to 40b_pin2 and the other end to 40b_pin3; 40b_pin4 is connected between R10 and C40; and 40b_pin5 is connected between R5 and C26. Here 40b_pin2 denotes the 2nd pin of the main control board 40b, 40b_pin3 the 3rd pin, 40b_pin4 the 4th pin (NRST), and 40b_pin5 the 5th pin (VDDA).
In the embodiment of the present application, the connection mode between the camera module 402a and the main control board 40b is not limited: the camera module 402a may be connected to the main control board 40b directly, or through an FPC (Flexible Printed Circuit) cable 43b.
When the camera module 402a is connected to the main control board 40b through the FPC cable 43b, the connection relationship between the FPC cable 43b and the main control board 40b is: 43b_pinx - 40b_pinx, where "-" represents a connection relationship, 43b_pinx represents the x-th pin on the FPC cable 43b, 40b_pinx represents the x-th pin on the main control board 40b, and x is a natural number greater than or equal to 0.
Further, as shown in fig. 4c, the structured light module 402 used by the autonomous mobile device may further include a laser driving circuit 402c, whose structure is similar to the laser driving circuit 204a or 204b shown in fig. 2c and is not repeated here. Fig. 4b exemplifies the connection relationship between the laser driving circuits 402c and the main control board 40b, taking the case where the structured light module 402 includes two laser driving circuits 402c. J1 in fig. 4b is connected to the left line laser transmitter 402b in fig. 4c and serves as its control interface; J2 in fig. 4b is connected to the right line laser transmitter 402b in fig. 4c and serves as its control interface. As shown in fig. 4b, the laser driving circuit 402c driving the left line laser emitter 402b includes pins LD_L_CATHOD and LD_L_ANODE, electrically connected to the pins LD_L_CATHOD and LD_L_ANODE of J1, respectively; the laser driving circuit 402c driving the right line laser emitter 402b includes pins LD_R_CATHOD and LD_R_ANODE, electrically connected to the pins LD_R_CATHOD and LD_R_ANODE of J2, respectively. In fig. 4b, 40b_pin28 is connected to the LD_L_EMIT_CTRL terminal of the laser driving circuit 402c driving the left line laser transmitter 402b to control its on/off: when 40b_pin28 is high, the left line laser transmitter 402b is on; when low, it is off. Likewise, 40b_pin27 is connected to the LD_R_EMIT_CTRL terminal of the laser driving circuit 402c driving the right line laser transmitter 402b: when 40b_pin27 is high, the right line laser transmitter 402b is on; when low, it is off. In fig. 4b, 40b_pin26 is connected to the LD_L_PWM terminal of the laser driving circuit 402c driving the left line laser transmitter 402b to control its current; 40b_pin26 is PWM-controlled, the duty ratio of the PWM can be raised from 0% to 100%, and the current of the left line laser transmitter 402b increases with the duty ratio, so the current can be controlled according to the duty ratio on 40b_pin26. Similarly, 40b_pin25 is connected to the LD_R_PWM terminal of the laser driving circuit 402c driving the right line laser transmitter 402b; it is also PWM-controlled, and the current of the right line laser transmitter 402b can be controlled according to the duty ratio on 40b_pin25. The pin names, pin numbers, and connection relationships shown in fig. 4b are merely exemplary and should not be construed as limiting the circuit configuration of the present application.
In an optional embodiment, the main controller 401 is specifically configured to: perform exposure control on the camera module 402a and acquire the synchronization signal generated at each exposure of the camera module 402a; control the line laser transmitters 402b to work alternately according to the synchronization signal; and mark each environment image acquired by the camera module 402a as left or right.
In this embodiment, the synchronization signal is a time reference signal provided to other devices or components that need to process information synchronously, for example, an exposure synchronization (LED STROBE) signal provides a time reference for the camera module 402a and the line laser transmitter 402b, and is a trigger signal for triggering the line laser transmitter 402b to emit line laser. The synchronization signal may be, but is not limited to, a switching signal, a continuous pulse signal, and the like.
In the above embodiments of the present application, the manner in which the line laser emitters 402b on both sides of the camera module 402a operate is not limited. Optionally, the main controller 401 controls the line laser transmitters 402b to work alternately according to the synchronization signal, and controls the camera module 402a to alternately set the working mode of its lens to match the line laser transmitter 402b currently in the working state.
Further optionally, when controlling the camera module 402a to alternately set the working mode of its lens, the main controller 401 is specifically configured to: control the lens of the camera module 402a to work in right-half mode when the line laser transmitter 402b on the left side works, and control the lens to work in left-half mode when the line laser transmitter 402b on the right side works.
Further optionally, the main controller 401 may control the exposure of the camera module 402a and control the line laser emitter 402b on one side to work at each exposure, so that the line laser emitters 402b on the two sides work alternately. Specifically, the main controller 401 may send an on-off control signal and a PWM signal to the line laser transmitter 402b through the laser driving circuit 402c to drive it to work.
Of course, in addition to controlling the line laser emitters 402b on both sides of the camera module 402a to operate alternately, the line laser emitters 402b on both sides of the camera module 402a may also be controlled to operate simultaneously. In the case where the line laser transmitters 402b located at both sides of the camera module 402a are simultaneously operated, the lens of the camera module 402a is operated in the full-width mode.
In the embodiment of the present application, when the line laser transmitters 402b work alternately and the camera module 402a alternately sets its lens working mode, the way in which the main controller 401 marks the environment images collected by the camera module 402a as left or right is not limited. For example, when the lens of the camera module 402a operates in left-half mode, the right line laser emitter 402b emits laser, the camera module 402a collects an environment image, and the main controller 401 marks the collected environment image as a left-half environment image.
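One way to represent this marking is a small tag attached to each frame. The sketch below is an illustration only; the types and field names are assumptions, not from the patent. It records whether a frame is a left-half or right-half environment image based on which laser was active.

```c
#include <stdint.h>

typedef enum { FRAME_LEFT_HALF, FRAME_RIGHT_HALF } frame_tag_t;

/* A captured frame plus the left/right mark added by the main controller. */
typedef struct {
    const uint8_t *pixels;  /* DVP frame buffer (assumed representation) */
    uint32_t       length;
    frame_tag_t    tag;
} tagged_frame_t;

/* Lens in left-half mode means the right laser was on, so the frame is
 * marked as a left-half environment image, and vice versa. */
static tagged_frame_t tag_frame(const uint8_t *buf, uint32_t len,
                                int right_laser_was_on)
{
    tagged_frame_t f;
    f.pixels = buf;
    f.length = len;
    f.tag    = right_laser_was_on ? FRAME_LEFT_HALF : FRAME_RIGHT_HALF;
    return f;
}

int main(void)
{
    static const uint8_t dummy[4] = {0};
    tagged_frame_t f = tag_frame(dummy, sizeof dummy, 1);
    return f.tag == FRAME_LEFT_HALF ? 0 : 1;
}
```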
The principle of cooperation between the MCU and the structured light module 402 is described below, taking the main controller 401 as an MCU and the IMU 408 as a three-axis gyroscope plus a three-axis accelerometer. As shown in fig. 4c, upon power-on the MCU initializes the IO interfaces and configures the structured light module 402 via the I2C interface. After initialization, the MCU controls the structured light module 402 through the I2C interface, thereby controlling the camera module 402a and the line laser emitters 402b in the structured light module 402. The MCU sends a trigger signal to the camera module 402a through the I2C interface; on receiving it, the camera module 402a starts exposure and simultaneously sends an exposure synchronization (LED STROBE) signal to the MCU. After receiving the LED STROBE signal, the MCU drives the right line laser emitter 402b to emit laser through the laser driving circuit 402c on the rising edge of the signal, and turns the right line laser emitter 402b off on the falling edge. After exposure is completed, the camera module 402a triggers the MCU through the DVP on the main control board to acquire the image data, which the MCU processes; the processing mainly calibrates the image data according to the offset measured by the IMU 408 (the three-axis gyroscope and three-axis accelerometer), so as to accurately convert the image data from the coordinate system of the structured light module 402 to the coordinate system of the autonomous mobile device. Similarly, the MCU then sends a trigger signal to the camera module 402a through I2C; the camera module 402a starts exposure and sends the LED STROBE signal to the MCU. After receiving the LED STROBE signal, the MCU drives the left line laser emitter 402b to emit laser through the laser driving circuit 402c on the rising edge of the signal, and turns the left line laser emitter 402b off on the falling edge. After exposure is completed, the camera module 402a again triggers the MCU through the DVP to acquire the image data, and the MCU calibrates it according to the offset measured by the IMU 408 to convert it from the coordinate system of the structured light module 402 to that of the autonomous mobile device. The above process repeats until the operation ends.
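Stripped of the hardware details, each exposure goes through the same acquire-calibrate-report pipeline. The sketch below restates that flow with stubbed, assumed function names and an assumed frame size; only the ordering of the steps is taken from the description above.

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

typedef struct { double roll, pitch, yaw; } imu_offset_t;

/* Stub hooks (all names assumed): real versions would read the DVP frame,
 * query the gyroscope/accelerometer, apply the coordinate transform, and
 * write to the serial port, respectively. */
static size_t dvp_read_frame(uint8_t *buf, size_t cap)
{
    (void)buf; return cap;                 /* pretend a full frame arrived */
}
static imu_offset_t imu_read_offset(void)
{
    imu_offset_t o = { 0.0, 0.0, 0.01 };   /* a small yaw shake, say */
    return o;
}
static void calibrate_frame(uint8_t *buf, size_t len, imu_offset_t off)
{
    (void)buf; (void)len;
    printf("calibrating with yaw offset %.3f rad\n", off.yaw);
}
static void serial_report(const uint8_t *buf, size_t len)
{
    (void)buf; printf("reporting %zu bytes to main controller\n", len);
}

/* One exposure's worth of the pipeline described above. */
static void process_one_exposure(void)
{
    static uint8_t frame[320 * 240];          /* resolution assumed */
    size_t len = dvp_read_frame(frame, sizeof frame);
    imu_offset_t off = imu_read_offset();     /* offset vs. device body */
    calibrate_frame(frame, len, off);         /* sensor -> device frame */
    serial_report(frame, len);                /* to the main controller */
}

int main(void) { process_one_exposure(); return 0; }
```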
In the embodiment of the present application, the specific position of the structured light module 402 on the device body 400 is not limited; for example, it may be located, without limitation, at the front, back, left, right, top, middle, or bottom of the device body 400. Further, the structured light module 402 may be disposed at a middle, top, or bottom position in the height direction of the device body 400.
In an alternative embodiment, the autonomous mobile device moves forward to perform a task, and the structured light module 402 is disposed on the front side of the device body 400 for better detecting the environmental information in front; the front side is a side toward which the device body 400 faces during forward movement of the autonomous mobile device.
In another alternative embodiment, in order to protect the structured light module 402 from external forces, a striking plate is further installed on the front side of the device body 400, located outside the structured light module 402. An exploded view of the device body and the striking plate can be seen in fig. 3c. The structured light module 402 may or may not be mounted on the striking plate. A window is opened in the area of the striking plate corresponding to the structured light module 402 to expose the camera module 402a and the line laser emitters 402b in the structured light module 402. Further optionally, windows are opened in the striking plate at the positions corresponding to the camera module 402a and the line laser emitters 402b respectively.
In yet another alternative embodiment, the structured light module 402 is mounted on the inner sidewall of the striking plate.
In yet another alternative embodiment, the distance from the center of the structured light module 402 to the working surface on which the autonomous mobile device is located is in the range of 30 to 60 mm. To reduce the spatial blind area of the autonomous mobile device while keeping the angle of view sufficiently large, the distance from the center of the structured light module 402 to the working surface may further optionally be 47 mm.
Further, in addition to the various components mentioned above, the autonomous mobile device of the present embodiment may also include some basic components, such as one or more memories, communication components, power components, drive components, and so forth.
Wherein the one or more memories are primarily for storing a computer program executable by the master controller to cause the master controller to control the autonomous mobile device to perform a corresponding task. In addition to storing computer programs, the one or more memories may be configured to store various other data to support operations on the autonomous mobile device. Examples of such data include instructions for any application or method operating on the autonomous mobile device, map data of the environment/scene in which the autonomous mobile device is located, operating modes, operating parameters, and so forth.
The communication component is configured to facilitate wired or wireless communication between its host device and other devices. The host device can access a wireless network based on a communication standard such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may further support Near Field Communication (NFC), Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Bluetooth (BT), and other technologies.
Alternatively, the drive assembly may include drive wheels, drive motors, universal wheels, and the like. Optionally, the autonomous mobile device of this embodiment may be implemented as a sweeping robot; in that case it may further include a cleaning assembly, which may include a cleaning motor, a cleaning brush, a dust collection fan, and the like. The basic components and their configurations differ between autonomous mobile devices, and the embodiments of the present application are only some examples.
The embodiment of the present application further provides a data calibration method, applicable to an autonomous mobile device on which a structured light module is installed, the structured light module including a camera module and line laser transmitters distributed on both sides of the camera module; an inertial measurement unit IMU is also arranged on the structured light module. As shown in fig. 5, the method includes:
51. Collect, using the IMU, the offset of the structured light module relative to the autonomous mobile device;
52. Calibrate, according to the offset, the environment image collected by the camera module, where the environment image is detected by means of the line laser emitted by the line laser transmitter.
The method provided by the embodiment of the present application is applicable to any autonomous mobile device equipped with the structured light module described above, which includes a camera module and line laser emitters distributed on its two sides. The IMU may belong either to the structured light module or to the autonomous mobile device; this is not limited. The structured light module may or may not include a main control unit. For a detailed description of the structured light module, reference is made to the foregoing embodiments, which are not repeated herein.
In some application scenarios, the environment image collected by the structured light module needs to be transformed from the coordinate system of the structured light module (the sensor coordinate system for short) to the device coordinate system of the autonomous mobile device. In practice, the structured light module moves along with the autonomous mobile device and may shake during the movement for various reasons, causing the sensor coordinate system to shift relative to the device coordinate system. Taking a robot as an example, the structured light module is installed on the striking plate at the front side of the robot; the striking plate is a movable component with a large amount of shake, and during the robot's movement the shaking of the striking plate causes the structured light module to shake, so that the sensor coordinate system shifts relative to the device coordinate system of the robot (the robot coordinate system). This shift introduces deviations when the high-precision data collected by the structured light module is transformed into the device coordinate system, making the use of the measurement data unsatisfactory and preventing the high-precision advantage of the structured light module from being fully exploited.
Based on the above considerations, in step 51, the offset of the structured light module relative to the autonomous mobile device may be collected by the IMU. An IMU is a device that measures the three-axis attitude angles and/or accelerations of an object. Generally, the IMU may include, but is not limited to, at least one of a gyroscope, an accelerometer, and a magnetometer. The gyroscope is a sensor that measures angular velocity; the accelerometer measures the linear acceleration of a moving body; the magnetometer measures the strength and direction of a magnetic field and can locate the orientation of a device. The embodiments of the present application do not limit the specific implementation form of the IMU: any IMU that can measure the offset of the structured light module relative to the autonomous mobile device is applicable, for example, and without limitation, a three-axis IMU, a six-axis IMU, or a nine-axis IMU. Optionally, where the IMU is a three-axis IMU, it may employ a three-axis gyroscope, a three-axis accelerometer, or a three-axis magnetometer. Where the IMU is a six-axis IMU, it typically employs a three-axis gyroscope and a three-axis accelerometer, but is not limited thereto; it may instead employ a three-axis gyroscope and a three-axis magnetometer, a three-axis accelerometer and a three-axis magnetometer, and so on.
In this embodiment of the application, the type of the offset is not limited; with different application requirements, the type of IMU may differ and the offset may differ accordingly. For example, when the IMU is a three-axis gyroscope, the offset here refers primarily to the rotational offset of the structured light module relative to the autonomous mobile device; when the IMU is a three-axis accelerometer, the offset refers primarily to the displacement offset of the structured light module relative to the autonomous mobile device; when the IMU is a three-axis magnetometer, the offset refers primarily to the azimuthal offset of the structured light module relative to the autonomous mobile device; and so on.
In this embodiment of the application, the implementation by which the IMU obtains the offset is likewise not limited; any implementation in which the offset of the structured light module relative to the autonomous mobile device is obtained by the IMU is applicable. Of course, when the IMU is implemented with different types of sensors, the way the offset is detected may also vary. For example, if the IMU employs a three-axis gyroscope, an integration period may be set, and the instantaneous angular velocity values measured by the gyroscope integrated within that period to obtain an angular offset, which represents the rotational offset of the structured light module relative to the autonomous mobile device. For another example, if the IMU employs a three-axis accelerometer, an integration period may be set, and the instantaneous acceleration values measured by the accelerometer integrated twice within that period (first to velocity, then to displacement) to obtain a distance offset, which represents the translational offset of the structured light module relative to the autonomous mobile device. The integration period may be set according to the application scenario or requirement; for example, the instantaneous angular velocity values measured by the gyroscope may be integrated once every 3 ms, 5 ms, 7 ms, or some other period to obtain the angular offset.
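For illustration only, the following is a minimal sketch (in Python with NumPy; the function name, sample layout, and sample rate are assumptions, not part of the patented method) of integrating gyroscope and accelerometer samples over one integration period:

```python
import numpy as np

def integrate_imu(gyro_samples, accel_samples, dt):
    """gyro_samples: (N, 3) rad/s; accel_samples: (N, 3) m/s^2; dt: sample spacing in s."""
    # Angular offset: first-order integration of angular velocity over the period.
    angle_offset = np.sum(gyro_samples, axis=0) * dt            # radians, per axis
    # Distance offset: integrate acceleration to velocity, then velocity to displacement.
    velocity = np.cumsum(accel_samples, axis=0) * dt            # m/s, per axis
    distance_offset = np.sum(velocity, axis=0) * dt             # metres, per axis
    return angle_offset, distance_offset

# Example: a 5 ms integration period sampled at 1 kHz (5 samples),
# with a steady 0.2 rad/s shake about the vertical axis.
gyro = np.tile([0.0, 0.0, 0.2], (5, 1))
accel = np.zeros((5, 3))
angle, distance = integrate_imu(gyro, accel, dt=0.001)   # angle[2] == 0.001 rad
```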
For step 52, this embodiment of the application does not limit the specific implementation of calibrating the environment image with the offset; any implementation that can calibrate the environment image by means of the offset is applicable.
In an alternative embodiment, calibrating the environment image includes: calibrating the first transformation matrix with the offset, and transforming the environment image into the device coordinate system with the calibrated first transformation matrix, thereby calibrating the environment image.
The first transformation matrix is the transformation matrix from the sensor coordinate system in which the structured light module is located to the device coordinate system in which the autonomous mobile device is located. It is determined by the structural parameters with which the structured light module is mounted to the autonomous mobile device, including but not limited to mounting height, mounting angle, and horizontal mounting position. By default the first transformation matrix is a fixed value. However, because the structured light module is mounted on the autonomous mobile device, it shakes while the device travels, and the shaking changes the structural parameters; a change in the structural parameters means that the first transformation matrix should not be a fixed value but should change dynamically. A minimal sketch of building a default first transformation matrix from such parameters follows.
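In the sketch below, the parameter names, the choice of a pitch angle about the lateral axis, and the axis conventions are all assumptions for illustration, not taken from the description:

```python
import numpy as np

def default_T1(mount_height, pitch_rad, forward_offset):
    """Assumed default sensor->device transform built from mounting parameters."""
    T1 = np.eye(4)
    # Mounting angle: modelled here as a pitch rotation about the lateral (Y) axis.
    T1[:3, :3] = np.array([
        [np.cos(pitch_rad),  0.0, np.sin(pitch_rad)],
        [0.0,                1.0, 0.0],
        [-np.sin(pitch_rad), 0.0, np.cos(pitch_rad)],
    ])
    # Horizontal mounting position and mounting height as the translation part.
    T1[:3, 3] = [forward_offset, 0.0, mount_height]
    return T1
```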
In the above optional embodiment, the offset of the structured light module relative to the autonomous mobile device is reflected in the first transformation matrix by calibrating that matrix. The first transformation matrix can then follow the dynamic change of the module's offset relative to the device, that is, follow the dynamic change of the structural parameters. The environment image acquired by the structured light module can accordingly be converted accurately into the device coordinate system of the autonomous mobile device based on the dynamically changing first transformation matrix, which to a certain extent solves the problem of inaccurate use of the environment image caused by the errors that shaking would otherwise introduce in its conversion from the sensor coordinate system to the device coordinate system.
In this embodiment of the application, the specific manner of calibrating the first transformation matrix is not limited; any implementation that can calibrate the first transformation matrix is applicable. In an optional embodiment, calibrating the first transformation matrix includes: calculating the product of the offset and the second transformation matrix to obtain an offset matrix in the sensor coordinate system; and calculating the product of that offset matrix and the first transformation matrix to obtain the calibrated first transformation matrix. Here the second transformation matrix is the transformation matrix from the IMU coordinate system in which the IMU is located to the sensor coordinate system, and it is determined by the structural parameters with which the IMU is mounted to the structured light module, including but not limited to mounting height, mounting angle, and horizontal mounting position. The IMU is fixedly mounted relative to the structured light module and, in theory, does not shift relative to it, which means the second transformation matrix remains accurate. The second transformation matrix can therefore be used to transform the offset measured by the IMU directly into the sensor coordinate system in which the structured light module is located, yielding the offset matrix in that coordinate system; this offset matrix is then used to calibrate the first transformation matrix.
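A minimal sketch of the two products just described, assuming 4 × 4 homogeneous transforms and one reading of the multiplication order (the names and the order are assumptions):

```python
import numpy as np

def calibrate_first_transform(T1, T2, delta_T):
    """T1: sensor->device; T2: IMU->sensor; delta_T: offset measured by the IMU."""
    offset_in_sensor = T2 @ delta_T     # offset matrix in the sensor coordinate system
    return offset_in_sensor @ T1        # calibrated first transformation matrix
```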
In the embodiment of the present application, the establishment manners of the sensor coordinate system, the device coordinate system, and the IMU coordinate system are not limited. For details, please refer to the foregoing embodiments, which are not described herein.
When the coordinates of a position point in a source coordinate system are converted to a target coordinate system, the coordinates in the source coordinate system are multiplied by a transformation matrix, which represents the positional relationship between the two coordinate systems, to obtain the coordinates in the target coordinate system. In this embodiment of the application, the first transformation matrix and the second transformation matrix are both transformation matrices, and the implementation form of a transformation matrix is not limited. Optionally, each may be composed of a rotation matrix and a translation matrix spliced together: moving from the source coordinate system to the target coordinate system requires a series of rotation and translation operations, where the change of angle during the rotation of the coordinate axes is represented by a rotation matrix R and the change of distance during the movement of the coordinate axes is represented by a translation matrix t. The rotation matrix R is an m × m square matrix and the translation matrix t is an m × 1 matrix; the transformation matrix T obtained by splicing R and t is an (m+1) × (m+1) square matrix whose last row is filled with 0 except for a final element of 1, where m is a positive integer. This description of the matrix structure of the first and second transformation matrices is merely exemplary and not limiting. For example, for three-dimensional space m may be taken as 3, making the first and second transformation matrices 4 × 4 matrices.
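As a sketch for the three-dimensional case (m = 3), the splicing described above can be written as follows; the helper name is illustrative only:

```python
import numpy as np

def homogeneous(R, t):
    """Splice a 3x3 rotation matrix R and a length-3 translation t into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R        # rotation block
    T[:3, 3] = t         # translation column
    return T             # last row is [0, 0, 0, 1]

# Converting a point p from the source to the target coordinate system:
# p_target = (T @ np.append(p_source, 1.0))[:3]
```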
When the first transformation matrix is calibrated, the physical meaning of the offset depends on the IMU. When the IMU is a three-axis gyroscope, the measured offset ΔT1 represents the variation ΔR1 of the rotation matrix, and the translation part Δt1 takes its identity (no-offset) value; when the IMU is a three-axis accelerometer, the measured offset ΔT2 represents the variation Δt2 of the translation matrix, and the rotation part ΔR2 takes its identity value; when the IMU is a three-axis gyroscope plus a three-axis accelerometer, the offset ΔT31 measured by the gyroscope represents the variation ΔR3 of the rotation matrix, and the offset ΔT32 measured by the accelerometer represents the variation Δt3 of the translation matrix. When the IMU is a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer, the direction offset ΔT41 measured by the magnetometer can be converted into an angular offset, which together with the offset ΔT42 measured by the gyroscope represents the variation ΔR4 of the rotation matrix, while the offset ΔT43 measured by the accelerometer represents the variation Δt4 of the translation matrix; alternatively, the direction offset ΔT41 can be converted into a distance offset and, together with the offset ΔT43 measured by the accelerometer, represent the variation Δt4 of the translation matrix. Whichever IMU is used, the final offset is obtained by splicing the rotation part and the translation part, and this offset reflects the change of the transformation matrix: the offset matrix in the sensor coordinate system is obtained as the product of the second transformation matrix and the offset, and the calibrated first transformation matrix as the product of the offset matrix and the first transformation matrix.
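A sketch only, assuming small-angle shake and a Z·Y·X composition order (one common convention, not mandated by the description): the rotation variation from the angular offset and the translation variation from the distance offset are spliced into an offset matrix.

```python
import numpy as np

def offset_matrix(angle_offset, distance_offset):
    """angle_offset: (rx, ry, rz) in radians; distance_offset: (dx, dy, dz) in metres."""
    rx, ry, rz = angle_offset
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx),  np.cos(rx)]])
    Ry = np.array([[ np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz),  np.cos(rz), 0],
                   [0, 0, 1]])
    dT = np.eye(4)
    dT[:3, :3] = Rz @ Ry @ Rx        # variation of the rotation matrix
    dT[:3, 3] = distance_offset      # variation of the translation matrix
    return dT
```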
When the environment image is calibrated, the coordinates of the environment image in the device coordinate system of the autonomous mobile device are obtained by multiplying each coordinate point of the environment image in the sensor coordinate system by the calibrated first transformation matrix. Further, the environment image may be transformed into the world coordinate system by multiplying the calibrated environment image by a third transformation matrix, where the third transformation matrix is the transformation matrix from the device coordinate system of the autonomous mobile device to the world coordinate system. Based on the calibrated environment image, various functional controls can be performed on the autonomous mobile device: the structured light module itself offers higher precision, and combining it with calibration of the module's offset relative to the autonomous mobile device ensures the accuracy and precision of the environment image, so that the autonomous mobile device can be controlled accurately.
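A minimal end-to-end sketch, assuming 4 × 4 homogeneous matrices and treating the environment image as a set of 3-D points extracted from it (an illustration, not the patented pipeline):

```python
import numpy as np

def to_world(points_sensor, T1_calibrated, T3):
    """points_sensor: (N, 3) points in the sensor coordinate system."""
    homo = np.hstack([points_sensor, np.ones((points_sensor.shape[0], 1))])
    in_device = (T1_calibrated @ homo.T).T    # device coordinate system
    in_world = (T3 @ in_device.T).T           # world coordinate system
    return in_world[:, :3]
```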
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 51 to 52 may be device a; for another example, the execution subject of step 51 may be device a, and the execution subject of step 52 may be device B; and so on.
In addition, some of the flows described in the above embodiments and drawings include multiple operations in a specific order, but it should be clearly understood that these operations may be executed out of the order presented here or in parallel. Sequence numbers such as 51 and 52 merely distinguish different operations and do not themselves represent any execution order. The flows may also include more or fewer operations, which may be executed sequentially or in parallel. It should be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, and the like; they do not represent a sequential order, nor do they require that "first" and "second" be of different types.
Accordingly, the present application also provides a computer readable storage medium storing a computer program, which when executed, can implement the steps that can be performed by the autonomous mobile apparatus in the above method embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
Claims (13)
1. A structured light module, comprising: a camera module, line laser transmitters distributed on two sides of the camera module, and a main control unit for controlling the camera module and the line laser transmitters to work;
the line laser transmitter transmits line laser outwards under the control of the main control unit; the camera module is used for collecting an environment image detected by the line laser under the control of the main control unit;
the structured light module further includes: an inertial measurement unit IMU for measuring an offset of the structured light module with respect to a device body of an autonomous mobile device when the structured light module is applied to the autonomous mobile device; the main control unit is further configured to: and calibrating the environment image according to the offset.
2. The module of claim 1, wherein the IMU is a three-axis IMU or a six-axis IMU.
3. The module of claim 2, wherein, in the case where the IMU is a three-axis IMU, the IMU employs a three-axis gyroscope, or a three-axis accelerometer;
and under the condition that the IMU is a six-axis IMU, the IMU adopts a three-axis gyroscope and a three-axis accelerometer.
4. A module according to any one of claims 1 to 3, wherein the main control unit, when calibrating the environment image, is specifically configured to:
calibrating the first transformation matrix by using the offset, and transforming the environment image into an equipment coordinate system by using the calibrated first transformation matrix so as to calibrate the environment image;
the first transformation matrix is a transformation matrix from a sensor coordinate system where the structured light module is located to an equipment coordinate system where the autonomous mobile equipment is located.
5. The module according to claim 4, wherein the main control unit, when calibrating the first transformation matrix, is specifically configured to:
calculating the product of the offset and a second transformation matrix to obtain an offset matrix under the sensor coordinate system; calculating the product of the offset matrix and the first transformation matrix to obtain a calibrated first transformation matrix;
and the second transformation matrix is a transformation matrix from the IMU coordinate system where the IMU is located to the sensor coordinate system.
6. An autonomous mobile device, comprising: a device body on which a main controller and a structured light module are arranged; the structured light module comprises a camera module, line laser transmitters distributed on two sides of the camera module, and a main control unit for controlling the camera module and the line laser transmitters to work; an inertial measurement unit IMU is further mounted on the structured light module;
the line laser transmitter transmits line laser outwards under the control of the main control unit; the camera module is used for collecting an environment image detected by the line laser under the control of the main control unit;
the IMU is used for measuring the offset of the structured light module relative to the equipment body; the main control unit is further configured to: calibrating the environment image according to the offset, and providing the calibrated environment image to the main controller; the main controller is used for performing function control on the autonomous mobile equipment by using the calibrated environment image.
7. An autonomous mobile device, comprising: a device body on which a main controller, an inertial measurement unit IMU, and a structured light module are arranged; the structured light module comprises a camera module and line laser transmitters distributed on two sides of the camera module; the IMU is mounted on the structured light module;
the line laser transmitter transmits line laser outwards under the control of the main controller; the camera module is used for collecting an environment image detected by the line laser under the control of the main controller;
the IMU is used for measuring the offset of the structured light module relative to the equipment body; the master controller is further configured to: and calibrating the environment image according to the offset, and performing function control on the autonomous mobile equipment by using the calibrated environment image.
8. A data calibration method, applicable to an autonomous mobile device, characterized in that a structured light module is installed on the autonomous mobile device, the structured light module comprising a camera module and line laser transmitters distributed on two sides of the camera module; an inertial measurement unit IMU is further mounted on the structured light module; the method comprises:
acquiring, with the IMU, an offset of the structured light module relative to a device body of the autonomous mobile device;
and calibrating the environment image collected by the camera module according to the offset, wherein the environment image is detected by the line laser emitted by the line laser emitter.
9. The method of claim 8, wherein calibrating the environmental image collected by the camera module according to the offset comprises:
calibrating the first transformation matrix by using the offset, and transforming the environment image into an equipment coordinate system by using the calibrated first transformation matrix so as to calibrate the environment image;
the first transformation matrix is a transformation matrix from a sensor coordinate system where the structured light module is located to an equipment coordinate system where the autonomous mobile equipment is located.
10. The method of claim 9, wherein calibrating the first transformation matrix with the offset comprises:
calculating the product of the offset and a second transformation matrix to obtain an offset matrix under the sensor coordinate system; calculating the product of the offset matrix and the first transformation matrix to obtain a calibrated first transformation matrix;
and the second transformation matrix is a transformation matrix from the IMU coordinate system where the IMU is located to the sensor coordinate system.
11. The method according to any one of claims 8-10, further comprising:
and performing function control on the autonomous mobile equipment by using the calibrated environment image.
12. The method of claim 11, wherein using the calibrated ambient image for functional control of the autonomous mobile device comprises:
converting the calibrated environment image into a world coordinate system according to a third transformation matrix; and performing function control on the autonomous mobile equipment by utilizing the environmental image in the world coordinate system.
13. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by one or more processors, causes the one or more processors to implement the steps of the method of any one of claims 8-12.