CN115060229A - Method and device for measuring a moving object


Info

Publication number
CN115060229A
CN115060229A (application CN202111166946.1A)
Authority
CN
China
Prior art keywords
coordinate system
coordinate
moving object
axis
measuring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111166946.1A
Other languages
Chinese (zh)
Inventor
高磊雯
Current Assignee
Xi'an Honor Device Co.,Ltd.
Original Assignee
Xi'an Honor Device Co.,Ltd.
Priority date
Filing date
Publication date
Application filed by Xi'an Honor Device Co.,Ltd. filed Critical Xi'an Honor Device Co.,Ltd.
Priority to CN202111166946.1A
Publication of CN115060229A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C 3/24: Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with fixed angles and a base of variable length in the observation station, e.g. in the instrument
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 15/00: Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
    • G01P 15/02: Measuring acceleration, deceleration or shock by making use of inertia forces using solid seismic masses
    • G01P 15/03: Measuring acceleration, deceleration or shock by making use of inertia forces using solid seismic masses by using non-electrical means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P 3/36: Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G01P 3/38: Devices characterised by the use of optical means using photographic means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P 3/64: Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P 3/68: Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light

Abstract

The application provides a method and a device for measuring a moving object, which can acquire kinematic parameters of the moving object in a user-defined coordinate system according to user requirements, improving measuring flexibility and user experience. The method includes: measuring a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system; measuring a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system; reconstructing the second coordinate system based on a difference between the first coordinate system and the second coordinate system; and measuring the moving object in the second coordinate system based on binocular stereo vision to obtain motion parameters of the moving object.

Description

Method and device for measuring a moving object
Technical Field
The application relates to the technical field of stereoscopic vision measurement, in particular to a measuring method and a measuring device for a moving object.
Background
Binocular stereo vision, an important form of machine vision, is a method of acquiring three-dimensional geometric information of an object based on the parallax principle: an imaging device (e.g., a camera) acquires two images of the object to be measured from different positions, and the positional deviation between corresponding points of the object in the two images is calculated.
The user may measure a moving object based on binocular stereo vision techniques, using kinematic parameters to describe the motion of the object, e.g., using kinematic parameters such as displacement, velocity, acceleration, angular velocity, and angular acceleration. The motion of the object is relative and the user needs to specify a reference coordinate system when describing the motion of the object using kinematic parameters.
In existing methods for measuring a moving object based on the binocular stereo vision technology, the kinematic parameters of the moving object can generally be obtained only in the camera coordinate system (i.e., the coordinate system constructed from the horizontal line connecting the two cameras and their optical-path centers) or in the moving object's own coordinate system. The kinematic parameters in other coordinate systems (such as a ground coordinate system or an arbitrary user-defined coordinate system) cannot be obtained flexibly according to user requirements, which degrades the user experience.
Disclosure of Invention
The application provides a measuring method and a measuring device of a moving object, which can acquire kinematic parameters of the moving object in a user-defined coordinate system according to user requirements, and improve the flexibility of measurement and user experience.
In a first aspect, the present application provides a method for measuring a moving object, including: measuring a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system; measuring a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system; reconstructing a second coordinate system based on a difference between the first coordinate system and the second coordinate system; and measuring the moving object under the second coordinate system based on the binocular stereo vision to obtain the motion parameters of the moving object.
The camera coordinate system is a coordinate system constructed by a horizontal connecting line of the two cameras and the optical path centers of the two cameras.
The first calibration object is used for obtaining a camera coordinate system. The second calibration object is used for obtaining a user-defined coordinate system.
The user-defined coordinate system may be a coordinate system with orthogonal coordinate axes or an arbitrary coordinate system with non-orthogonal coordinate axes, for example, a ground coordinate system or an arbitrary coordinate system.
The second calibration object may be a cube or a cuboid, which is not limited in the embodiment of the present application.
The motion parameters may include an angle through which the moving object's own coordinate system has rotated relative to the second coordinate system when the moving object touches the ground.
The moving object measuring method provided by the embodiment of the application is used for measuring the moving object based on the binocular stereo vision technology, the user-defined coordinate system can be reconstructed according to the camera coordinate system, the motion of the measured object can be described under the user-defined coordinate system, the kinematic parameters of the moving object under the user-defined coordinate system can be obtained according to the user requirements, and the measuring flexibility and the user experience are improved.
With reference to the first aspect, in certain implementations of the first aspect, the difference between the first coordinate system and the second coordinate system includes: a first angle through which a first coordinate axis of the first coordinate system rotates, in a first direction, until it is parallel to a fourth coordinate axis of the second coordinate system; a second angle through which a second coordinate axis of the first coordinate system rotates, in the first direction, until it is parallel to a fifth coordinate axis of the second coordinate system; and a third angle through which a third coordinate axis of the first coordinate system rotates, in the first direction, until it is parallel to a sixth coordinate axis of the second coordinate system, wherein the first direction is clockwise or anticlockwise. Reconstructing the second coordinate system based on the difference between the first coordinate system and the second coordinate system includes: rotating the first, second and third coordinate axes of the first coordinate system in the first direction through the first, second and third angles, respectively, to reconstruct the second coordinate system.
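As a minimal sketch of this reconstruction step (the 30° angle, the choice of rotation axis, and the use of Rodrigues' formula are illustrative assumptions, not details from the application): rotating the camera-frame axes through the calibrated angles yields the axes of the user-defined frame.

```python
import numpy as np

def rotation_about(axis, angle_rad):
    """Rodrigues' formula: rotation matrix about a unit axis."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(angle_rad) * K + (1.0 - np.cos(angle_rad)) * (K @ K)

# First (camera) coordinate system: axes stored as the columns of the identity.
camera_axes = np.eye(3)

# Assumed calibration result: the user-defined frame is the camera frame
# rotated 30 degrees anticlockwise about the camera's third axis.
R = rotation_about([0.0, 0.0, 1.0], np.deg2rad(30.0))
user_axes = R @ camera_axes   # reconstructed second coordinate system
```

The same `R` then maps any point measured in the camera frame into the reconstructed frame.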
With reference to the first aspect, in certain implementations of the first aspect, the motion parameters include an angle through which the moving object's own coordinate system has rotated relative to the second coordinate system when the moving object touches the ground.
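For illustration only (the 45° touchdown pose below is an invented example), such a rotation angle can be read off the relative rotation matrix between the object's own axes and the second coordinate system via the matrix trace:

```python
import numpy as np

theta = np.deg2rad(45.0)
# Object's own axes at the moment of ground contact, expressed in the
# second coordinate system: here rotated 45 degrees about the z-axis.
obj_axes = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])
ref_axes = np.eye(3)              # second (user-defined) coordinate system

R_rel = obj_axes @ ref_axes.T     # relative rotation between the two frames
# Total rotation angle from the trace identity: tr(R) = 1 + 2*cos(angle).
angle_deg = np.degrees(np.arccos((np.trace(R_rel) - 1.0) / 2.0))
```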
With reference to the first aspect, in certain implementations of the first aspect, the second calibration object includes a first marker point, a second marker point and a third marker point thereon; measuring the second calibration object to obtain the second coordinate system includes: obtaining a fourth coordinate axis according to the first marker point and the second marker point; obtaining a fifth coordinate axis according to the first marker point and the third marker point; obtaining a sixth coordinate axis according to the fourth coordinate axis and the fifth coordinate axis; and obtaining the second coordinate system according to the fourth coordinate axis, the fifth coordinate axis and the sixth coordinate axis.
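A sketch of how three marker points could yield three axes (the triangulated marker coordinates are invented for illustration, and deriving the remaining axis by a cross product is one plausible reading of the step):

```python
import numpy as np

# Assumed triangulated 3D positions of the marker points on the
# second calibration object (values are illustrative).
p1 = np.array([0.0, 0.0, 0.0])   # first marker point (origin)
p2 = np.array([2.0, 0.0, 0.0])   # second marker point
p3 = np.array([0.0, 1.0, 0.0])   # third marker point

fourth_axis = (p2 - p1) / np.linalg.norm(p2 - p1)   # from points 1 and 2
fifth_axis = (p3 - p1) / np.linalg.norm(p3 - p1)    # from points 1 and 3
sixth_axis = np.cross(fourth_axis, fifth_axis)      # normal to the other two
sixth_axis = sixth_axis / np.linalg.norm(sixth_axis)
```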
With reference to the first aspect, in certain implementations of the first aspect, before measuring the second calibration object to obtain the second coordinate system, the method further includes: detecting a coordinate-system-establishing instruction from a user; in response to the instruction, displaying a first interface that shows an image of the second calibration object; and detecting a clicking operation of the user on the image of the second calibration object, the clicking operation being used to mark the first marker point, the second marker point and the third marker point.
With reference to the first aspect, in certain implementations of the first aspect, the second calibration object is a cuboid or a cube, and the second coordinate system is a ground coordinate system.
With reference to the first aspect, in certain implementations of the first aspect, the obtaining motion parameters of the moving object by measuring the moving object in the second coordinate system based on binocular stereo vision includes: detecting a measurement instruction of a user; responding to the measurement instruction, and acquiring an image of the moving object; and measuring the moving object under the second coordinate system according to the image of the moving object to obtain the motion parameters of the moving object.
According to the moving object measuring method of the present application, the moving object is measured based on the binocular stereo vision technology, a human-computer interaction operation is provided for the user, and the motion of the measured object is described in the user-defined coordinate system in a manner transparent to the user, improving measuring flexibility and user experience.
In a second aspect, the present application provides a measuring device for a moving object, including: an acquisition module, configured to measure a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system, and to measure a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system; and a processing module, configured to reconstruct the second coordinate system based on a difference between the first coordinate system and the second coordinate system. The acquisition module is further configured to measure the moving object in the second coordinate system based on binocular stereo vision to obtain motion parameters of the moving object.
With reference to the second aspect, in certain implementations of the second aspect, the difference between the first coordinate system and the second coordinate system includes: a first angle through which a first coordinate axis of the first coordinate system rotates, in a first direction, until it is parallel to a fourth coordinate axis of the second coordinate system; a second angle through which a second coordinate axis of the first coordinate system rotates, in the first direction, until it is parallel to a fifth coordinate axis of the second coordinate system; and a third angle through which a third coordinate axis of the first coordinate system rotates, in the first direction, until it is parallel to a sixth coordinate axis of the second coordinate system, wherein the first direction is clockwise or anticlockwise. The processing module is further configured to rotate the first, second and third coordinate axes of the first coordinate system in the first direction through the first, second and third angles, respectively, to reconstruct the second coordinate system.
With reference to the second aspect, in certain implementations of the second aspect, the motion parameters include an angle through which the moving object's own coordinate system has rotated relative to the second coordinate system when the moving object touches the ground.
With reference to the second aspect, in certain implementations of the second aspect, the second calibration object includes a first marker point, a second marker point and a third marker point thereon; the acquisition module is further configured to: obtain a fourth coordinate axis according to the first marker point and the second marker point; obtain a fifth coordinate axis according to the first marker point and the third marker point; obtain a sixth coordinate axis according to the fourth coordinate axis and the fifth coordinate axis; and obtain the second coordinate system according to the fourth coordinate axis, the fifth coordinate axis and the sixth coordinate axis.
With reference to the second aspect, in certain implementations of the second aspect, the apparatus further includes a detection module. The detection module is configured to detect a coordinate-system-establishing instruction from a user; the processing module is further configured to display, in response to the instruction, a first interface that shows an image of the second calibration object; and the detection module is further configured to detect a clicking operation of the user on the image of the second calibration object, the clicking operation being used to mark the first marker point, the second marker point and the third marker point.
With reference to the second aspect, in certain implementations of the second aspect, the second calibration object is a cuboid or a cube, and the second coordinate system is a ground coordinate system.
With reference to the second aspect, in certain implementations of the second aspect, the apparatus further includes a detection module; the detection module is further configured to: detecting a measurement instruction of a user; the acquisition module is further configured to: responding to the measurement instruction, and acquiring an image of the moving object; and measuring the moving object under the second coordinate system according to the image of the moving object to obtain the motion parameters of the moving object.
In a third aspect, the present application provides a measuring device for a moving object, including a processor, coupled to a memory, and configured to execute instructions in the memory to implement the method in any one of the possible implementations of the first aspect. Optionally, the measuring device of the moving object further comprises a memory. Optionally, the measuring device of the moving object further comprises a communication interface, and the processor is coupled with the communication interface.
In a fourth aspect, the present application provides a processor comprising: input circuit, output circuit and processing circuit. The processing circuit is configured to receive a signal via the input circuit and transmit a signal via the output circuit, so that the processor performs the method of any one of the possible implementations of the first aspect.
In a specific implementation, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be transistors, gate circuits, flip-flops, various logic circuits, and the like. The input signal received by the input circuit may, for example and without limitation, be received and input by a receiver; the signal output by the output circuit may, for example and without limitation, be output to and transmitted by a transmitter; and the input circuit and the output circuit may be the same circuit, acting as the input circuit and the output circuit at different times. The specific implementations of the processor and the various circuits are not limited in this application.
In a fifth aspect, the present application provides a processing apparatus comprising a processor and a memory. The processor is configured to read instructions stored in the memory, and may receive signals via the receiver and transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, there are one or more processors and one or more memories.
Alternatively, the memory may be integrated with the processor, or provided separately from the processor.
In a specific implementation process, the memory may be a non-transitory (non-transitory) memory, such as a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips.
It will be appreciated that in the associated data-interaction processes, for example, sending indication information may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, data output by the processor may be output to a transmitter, and input data received by the processor may come from a receiver. The transmitter and the receiver may be collectively referred to as a transceiver.
The processing device in the fifth aspect may be a chip, the processor may be implemented by hardware or software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, which may be integrated with the processor, located external to the processor, or stand-alone.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program (which may also be referred to as code or instructions) which, when executed on a computer, causes the computer to perform the method in any one of the possible implementations of the first aspect.
In a seventh aspect, the present application provides a computer program product comprising: computer program (also called code, or instructions), which when executed, causes a computer to perform the method of any of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic diagram of a coordinate system of an object according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a scene for measuring a moving object according to an embodiment of the present disclosure;
fig. 3 is a schematic view of another scene for measuring a moving object according to an embodiment of the present disclosure;
fig. 4 is a schematic view of a damaged scene of a mobile phone body according to an embodiment of the present disclosure;
fig. 5 is a schematic view of another damaged mobile phone body according to an embodiment of the present application;
FIG. 6 is a schematic view of a scene for measuring a damaged posture of a mobile phone body;
FIG. 7 is a schematic view of another scenario for measuring a damaged posture of a mobile phone body;
FIG. 8 is a schematic view of another scenario for measuring a damaged posture of a mobile phone body;
fig. 9 is a schematic flowchart of a method for measuring a moving object according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a measuring head coordinate system obtained according to an embodiment of the present application;
fig. 11 is a schematic diagram of obtaining a ground coordinate system according to an embodiment of the present application;
FIG. 12 is a schematic diagram of obtaining a custom coordinate system according to an embodiment of the present application;
FIG. 13 is a schematic flow chart of another method for measuring a moving object according to an embodiment of the present disclosure;
FIG. 14 is a schematic interface diagram of a measurement software provided in an embodiment of the present application;
FIG. 15 is a schematic diagram of an interface for customizing a coordinate system function according to an embodiment of the present disclosure;
FIG. 16 is a schematic diagram of an interface for providing another custom coordinate system function according to an embodiment of the present application;
fig. 17 is a schematic flowchart of another method for measuring a moving object according to an embodiment of the present application;
FIG. 18 is a schematic block diagram of a measuring device for a moving object according to an embodiment of the present disclosure;
fig. 19 is a schematic block diagram of another measuring device for a moving object according to an embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same function and effect. For example, a first compensation angle and a second compensation angle merely distinguish different compensation angles, without limiting their order. Those skilled in the art will appreciate that the terms "first", "second", etc. do not limit quantity or execution order, nor do they denote relative importance.
It is noted that the words "exemplary," "for example," and "such as" are used herein to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
Further, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b and c" may represent: a, or b, or c, or a and b, or a and c, or b and c, or a, b and c, where a, b and c may each be single or multiple.
Binocular stereo vision is an important form of machine vision. Based on the parallax principle, it uses imaging devices (e.g., cameras) to acquire two images of an object to be measured from different positions and obtains the object's three-dimensional geometric information by calculating the positional deviation between corresponding points in the two images. Binocular stereo vision fuses the images obtained by two "eyes" and observes the differences between them, giving a clear sense of depth; it can also establish correspondences between features, matching the projections of the same physical point in space across the different images.
The binocular stereo vision method has the advantages of high efficiency, proper precision, simple system structure, low cost and the like, and is very suitable for the detection and quality control of on-line and non-contact products on a manufacturing site. For the measurement of moving objects, the binocular stereo vision method is a more effective measurement method because the image acquisition is completed instantaneously.
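For a rectified, parallel stereo rig, this instantaneous depth recovery reduces to a single formula, Z = f * B / d; the focal length, baseline and pixel coordinates below are assumed example values, not parameters from the application:

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
f_pixels = 1200.0                 # focal length in pixels (assumed)
baseline_m = 0.10                 # distance between the two cameras (assumed)
x_left, x_right = 640.0, 610.0    # matched point's column in each image

disparity = x_left - x_right      # positional deviation between the two images
depth_m = f_pixels * baseline_m / disparity   # distance to the point, in metres
```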
Specifically, a user can attach marker points to the moving object (i.e., the object to be measured) and photograph it with high-speed cameras. From the captured images, the motion trajectory of each marker point over time is obtained. From the trajectories and motion time of the marker points, and based on a three-dimensional spatial coordinate system covering the binocular shooting range in which the moving object is located, a series of kinematic parameters of the moving object, such as velocity, angle, angular velocity, acceleration and angular acceleration, can then be calculated, so that the user can understand the motion of the object from these parameters and study it further.
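Assuming a uniformly sampled trajectory, velocity and acceleration can be recovered from the tracked marker positions by finite differences; the free-fall trajectory and 100 fps frame rate below are made-up illustrative values:

```python
import numpy as np

t = np.arange(0.0, 0.05, 0.01)       # capture timestamps at 100 fps
z = -0.5 * 9.81 * t**2               # marker height: free fall from rest

v = np.gradient(z, t, edge_order=2)  # velocity by finite differences
a = np.gradient(v, t, edge_order=2)  # acceleration by differencing velocity
```

For this quadratic trajectory the second-order differences are exact, so the recovered acceleration is the constant -9.81 m/s².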
Since the motion of the object is relative, when describing the motion of the object using kinematic parameters, a coordinate system of reference needs to be specified. Therefore, in order to better understand the embodiments of the present application, the coordinate system related to the embodiments of the present application is described below.
1. Ground coordinate system: also called a geodetic coordinate system, it is fixed to the surface of the earth and reflects the motion characteristics of a moving object relative to the earth. Illustratively, the ground coordinate system may include three coordinate axes OX, OY and OZ, where the OX axis points in an arbitrary direction within the ground plane, the OZ axis points vertically upward, and the OY axis is perpendicular to the plane formed by the OX and OZ axes, forming a right-handed coordinate system.
2. Measuring head coordinate system: it may also be referred to as a "camera coordinate system" and refers to a coordinate system constructed by a horizontal connection line of two cameras and the optical path center thereof in the binocular stereo vision method.
3. Object's own coordinate system: a coordinate system established by taking a certain point of the object as the origin and defining coordinate axes along straight lines parallel to the object's edges.
The object's own coordinate system may define coordinate axes with straight lines parallel to the edges of the object, so the object's own coordinate system is related to the shape of the object itself. For example, if the object is a cuboid or a cube, the angles between the coordinate axes are all right angles; if the object is a triangular prism, the angles between the coordinate axes may not be right angles.
The coordinate system of the object itself may include three coordinate axes of x, y and z, wherein the x axis, the y axis and the z axis in xyz may satisfy the following two conditions:
1) the x axis, the y axis or the z axis is a straight line where any one side of the edge of the object is located or a straight line parallel to the edge of the object;
2) the x-axis, y-axis, and z-axis may intersect at a point.
By way of example, fig. 1 shows a schematic representation of an object's own coordinate system. As shown in fig. 1, the object is a cuboid with 8 vertices (a, b, c, d, e, f, g and h) and 6 faces (abcd, abef, aehd, bfgc, efgh and dcgh); the labelled edges are ab, cd, ef, gh, ad, bc, eh and fg.
The coordinate system of the object may use any vertex of the 8 vertices as an origin, may use any point in the 6 planes as an origin, and may use any point on the 8 edges as an origin.
In fig. 1, a coordinate system is established with point d as the origin, a coordinate axis x is defined with a straight line where the side dc is located, a coordinate axis y is defined with a straight line where the side da is located, and a coordinate axis z is defined with a straight line where the side dh is located, so as to obtain an object coordinate system xyz.
It should be understood that the establishment of the coordinate system xyz of the object itself is only an example, and the embodiment of the present application does not limit this.
4. Arbitrary coordinate system: a coordinate system whose origin and coordinate axes are chosen arbitrarily by the user. It satisfies the condition that one coordinate axis is perpendicular to the plane formed by the other two coordinate axes, and the included angle between those two in-plane axes can lie anywhere in the range (0°, 180°); that is, the angle between the two in-plane axes of the user-selected coordinate system can be arbitrary.
Any coordinate system may include the above-mentioned ground coordinate system, the measurement head coordinate system, or the object coordinate system itself, and may also be another coordinate system defined by the user, which is not limited in this embodiment of the present application.
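Expressing a measured point in such a (possibly non-orthogonal) user-defined frame amounts to solving a small linear system; the 60° basis and the example point below are illustrative assumptions:

```python
import numpy as np

# User-chosen, non-orthogonal basis: 60 degrees between the first two axes,
# third axis perpendicular to their plane (axes stored as columns).
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([np.cos(np.pi / 3), np.sin(np.pi / 3), 0.0])
e3 = np.cross(e1, e2)
e3 = e3 / np.linalg.norm(e3)
B = np.column_stack([e1, e2, e3])

p_camera = np.array([2.0, np.sqrt(3.0), 1.0])  # point in the camera frame
p_user = np.linalg.solve(B, p_camera)          # coordinates in the user frame
```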
Next, a measurement method based on binocular stereo vision is described in detail, taking the measuring head coordinate system as an example and referring to fig. 2 and 3.
Illustratively, fig. 2 shows a schematic view of a scene for measuring a moving object based on the binocular stereo vision technology. As shown in fig. 2, the scene includes a camera 1, a camera 2, a bracket, a wall, and a measured object. The camera 1 and the camera 2 can shoot the measured object to obtain images of it. The measured object moves from top to bottom along the wall. The camera 1 and the camera 2 are fixed on the bracket, and as the measured object moves, each camera can rotate up and down and left and right around its fixed point on the bracket to keep the object in view.
When the measured object moves to the position shown in fig. 2, a measuring head coordinate system abc is constructed from the horizontal line connecting the camera 1 and the camera 2 and their optical path centers (i.e., the optical path 1 and the optical path 2); the measuring head coordinate system abc includes an a-axis, a b-axis and a c-axis. The camera 1 and the camera 2 photograph the object to be measured to obtain two images of it, and from the position deviation between corresponding points of the object in the two images, the three-dimensional coordinates (x₁, y₁, z₁) of the object at the position shown in fig. 2 can be obtained, with the measuring head coordinate system abc as the reference coordinate system.
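The position deviation between corresponding points in the two images is the stereo disparity. Below is a minimal sketch of how depth, and from it the 3D coordinates in the measuring head frame, can be recovered from that disparity — assuming an idealized rectified pinhole camera pair with focal length f (in pixels) and baseline B, parameters not given in the source:

```python
def triangulate(u_left: float, u_right: float, v: float,
                f: float, baseline: float) -> tuple:
    """Recover 3D coordinates of a point from its pixel positions in a
    rectified stereo pair (hypothetical pinhole model; f in pixels,
    baseline in meters, image coordinates relative to the principal point)."""
    disparity = u_left - u_right        # position deviation between corresponding points
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    z = f * baseline / disparity        # depth along the optical axis
    x = u_left * z / f                  # horizontal offset in the measuring head frame
    y = v * z / f                       # vertical offset in the measuring head frame
    return (x, y, z)
```

For instance, with an assumed f of 700 px and a 0.12 m baseline, a 70-pixel disparity corresponds to a depth of 1.2 m.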
When the measured object moves downwards along the wall, the viewing angles of the camera 1 and the camera 2 rotate downwards around the fixed point of the bracket to follow it. When the object reaches the position shown in fig. 3, the optical paths of the camera 1 and the camera 2 have changed, so the coordinate system formed by the horizontal line between the two cameras and their optical path centers (i.e., the optical path 3 and the optical path 4) changes from the measuring head coordinate system abc to the measuring head coordinate system a₁b₁c₁, which includes an a₁-axis, a b₁-axis and a c₁-axis. The camera 1 and the camera 2 photograph the object to obtain two images, and from the position deviation between corresponding points of the object in the two images, the three-dimensional coordinates (x₂, y₂, z₂) of the object at the position shown in fig. 3 can be obtained, with the measuring head coordinate system a₁b₁c₁ as the reference coordinate system.
From the three-dimensional coordinates (x₁, y₁, z₁) and the three-dimensional coordinates (x₂, y₂, z₂), the displacement of the measured object can be obtained, and from the displacement and the time interval between the image captures of the cameras (camera 1 and camera 2), its velocity can be obtained.
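The displacement-to-velocity step can be sketched as follows — a minimal illustration assuming both coordinate triples are expressed in one common reference frame (as the following paragraph notes, the measuring head frame itself may move between captures):

```python
import math

def displacement_and_speed(p1, p2, dt):
    """Displacement vector between two 3D positions captured dt seconds
    apart, and the average speed over the interval."""
    d = tuple(b - a for a, b in zip(p1, p2))
    distance = math.sqrt(sum(c * c for c in d))
    return d, distance / dt
```

For example, moving from (0, 0, 0) to (3, 4, 0) in 0.5 s gives displacement (3, 4, 0) and an average speed of 10 units per second.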
Therefore, the kinematic parameters obtained by measuring the moving object based on the binocular stereo vision are based on the measuring head coordinate system, and the measuring head coordinate system can change along with the movement of the measured object.
In the following, a mobile phone drop test scene is taken as an example, and a method for measuring a moving object based on a binocular stereo vision technology is described in detail.
The mobile phone drop test mainly tests whether the phone body remains undamaged after the phone drops from a certain height. The phone may fall from that height in free fall. Body damage can include a broken screen, a cracked shell, or an intact shell with damaged internal devices, and the like.
Illustratively, the drop height can be 1 meter, and the phone can fall onto any of its 6 faces, namely the front, back, top, bottom, left and right faces. The phone can also fall onto its 3 kinds of edges, corresponding respectively to its length, width and height.
Studying the posture of the phone when it falls from a certain height and collides with the ground, damaging the body, makes it convenient to subsequently reinforce the damaged part, and then to verify by a further drop test whether dropping the phone from the same height in the posture that originally caused the damage still damages the body, thereby testing the effectiveness of the reinforcement of the damaged part.
For example, fig. 4 is a schematic diagram illustrating a damaged mobile phone body. The phone falls freely from a certain height with its front facing the tester, and the screen breaks when the phone collides with the ground. The tester needs to know the posture of the phone at the moment of touchdown, for example, the angle between the long side of the phone and the ground, the angle between the short side of the phone and the ground, whether the screen is inclined, and the like.
As another example, fig. 5 shows a schematic view of a scene in which another mobile phone body is damaged. The phone falls freely from a certain height with its front facing the tester; the body is not damaged by the first collision with the ground, but the phone flips over after the collision and touches the ground a second time, breaking the screen. The tester needs to know the flip angle of the phone after the first touchdown and its posture at the second touchdown, for example, the angle, angular velocity and angular acceleration of the flip within 0.1 ms after touchdown, and the posture at the second touchdown after 0.5 ms has elapsed, and so on.
The embodiment of the application measures, based on binocular stereoscopic vision, the motion of the phone as it falls from a certain height and collides with the ground, and its posture at the moment the body is damaged.
In one possible implementation manner, the posture of the phone body when damaged can be obtained based on binocular stereo vision, with the measuring head coordinate system as the reference coordinate system. Illustratively, fig. 6 shows a schematic diagram of a scene for measuring the posture of the phone body when it is damaged, based on binocular stereo vision. As shown in fig. 6, the scene includes a camera 1, a camera 2, a cradle, and a mobile phone. The camera 1 and/or the camera 2 may have processing and display functions in addition to the photographing function. It will be appreciated that the phone is shown in fig. 6 in the form of a rectangular parallelepiped in order to highlight its pose.
When the phone, falling freely, collides with the ground, the body is damaged. The camera 1 and the camera 2 can capture images of the phone's touchdown, and a measuring head coordinate system a₂b₂c₂ is constructed from the horizontal line connecting the camera 1 and the camera 2 and their optical path centers (namely the optical path 5 and the optical path 6); the measuring head coordinate system a₂b₂c₂ includes an a₂-axis, a b₂-axis and a c₂-axis. From the images of the phone's touchdown taken by the camera 1 and the camera 2, the touchdown posture of the phone with the measuring head coordinate system a₂b₂c₂ as the reference coordinate system can be obtained, namely the angles of the phone relative to the a₂-axis, b₂-axis and c₂-axis.
It should be understood that measurement software is installed on the camera 1 and/or the camera 2. With this software, a tester can set the time interval between successive image captures by the camera 1 and the camera 2, display the images of the phone taken by the two cameras, calculate the posture of the phone from those images, and establish the object's own coordinate system.
The tester can use the measurement software to establish the phone's own coordinate system x₁y₁z₁, which includes an x₁-axis, a y₁-axis and a z₁-axis. When the phone touches the ground, the angles of the phone relative to the a₂-axis, b₂-axis and c₂-axis may be the angles of the x₁-axis, y₁-axis and z₁-axis relative to the a₂-axis, b₂-axis and c₂-axis. Taking clockwise as positive, the touchdown angles of the phone shown in fig. 6 relative to the a₂-axis, b₂-axis and c₂-axis may be, for example, (-45°, -15°, -30°). These angles may be displayed by camera 1 and/or camera 2.
After the developer reinforces the phone body, the tester can drop the phone freely to the ground in the above touchdown posture to perform a drop test and verify the effectiveness of the reinforcement of the damaged part. Since the above touchdown posture of the phone was measured with the measuring head coordinate system a₂b₂c₂ as the reference coordinate system, the tester needs to set the initial posture of the phone using the measuring head coordinate system a₂b₂c₂, and judge from the images taken by the camera 1 and the camera 2 whether the body is damaged when the phone falls to the ground.
In another possible implementation manner, the posture of the damaged phone body can be obtained based on binocular stereo vision, with the phone's own coordinate system as the reference coordinate system. Illustratively, fig. 7 shows a schematic diagram of another scene for measuring the posture of the phone body when it is damaged, based on binocular stereo vision. The scene includes a camera 1, a camera 2, a support, a mobile phone, a tester, and a desktop computer. The desktop computer comprises a display screen and a host; the display screen has a display function, and the host can have a processing function. The camera 1 and the camera 2 may have only a photographing function. It should be understood that the desktop computer in the embodiment of the present application is merely an example.
The software displayed on the desktop computer may be measurement software matched with the camera 1 and the camera 2, and may be connected to them in a wired or wireless manner. Once connected, the software can set the time interval between successive image captures by the camera 1 and the camera 2, display the images of the phone taken by the two cameras, calculate the posture of the phone from those images, and establish the object's own coordinate system.
The menu bar of the measurement software includes New, Edit and View. It should be understood that the menu bar is merely an example, and the embodiment of the present application does not limit the functions it includes. The interface of the measurement software also displays a save icon, an undo icon, a minimize icon, a maximize icon and a close icon; by clicking these icons, a tester can save a file, undo an operation, minimize the interface, maximize the interface or close the interface, and so on.
As shown in fig. 7, a tester may establish a phone coordinate system xyz through the measurement software and set it as the reference coordinate system. After the phone touches the ground, its posture changes, and its coordinate system changes from xyz to x₁y₁z₁. With the phone coordinate system xyz as the reference coordinate system, the touchdown posture of the phone is given by the angles of the x₁-axis, y₁-axis and z₁-axis of the coordinate system x₁y₁z₁ at touchdown relative to the x-axis, y-axis and z-axis. Taking clockwise as positive, the touchdown angles of the phone shown in fig. 7 relative to the x-axis, y-axis and z-axis may be (-10°, -5°, -10°), which may be displayed on the display screen of the desktop computer.
After the developer reinforces the phone body, the tester needs to drop the phone freely to the ground in the above touchdown posture to perform a drop test and verify the effectiveness of the reinforcement of the damaged part. Because the above touchdown posture was measured with the phone's own coordinate system xyz as the reference coordinate system, the tester needs to set the initial posture of the phone using the coordinate system xyz, and judge from the images taken by the camera 1 and the camera 2 whether the body is damaged when the phone falls to the ground.
In the above methods for measuring moving objects based on the binocular stereo vision technology, only the kinematic parameters of the moving object in the camera coordinate system (i.e., the coordinate system constructed from the horizontal line connecting the two cameras and their optical path centers) or in the moving object's own coordinate system can be obtained; the kinematic parameters of the moving object in other coordinate systems (e.g., the ground coordinate system or an arbitrary user-defined coordinate system) cannot be obtained flexibly according to user requirements, resulting in a poor user experience.
Currently, a tester can measure a moving object with the ground coordinate system as the reference coordinate system by placing a calibration object within the shooting angles of the camera 1 and the camera 2. Illustratively, fig. 8 shows a schematic diagram of a scene for measuring the posture of the phone body when it is damaged, based on binocular stereo vision. As shown in fig. 8, the scene for measuring the moving object includes a camera 1, a camera 2, a support, a calibration object A, a mobile phone, a tester, and a desktop computer. The desktop computer comprises a display screen and a host; the display screen has a display function, the host can have a processing function, and measurement software is installed on the desktop computer. The camera 1 and the camera 2 may have only a photographing function. It should be understood that the desktop computer in the embodiment of the present application is merely an example.
The tester places the calibration object A on the ground within the viewing angles of the camera 1 and the camera 2, and the object to be tested falls to the ground also within those viewing angles. The calibration object A is a cube on which a mark point 1, a mark point 2 and a mark point 3 can be arranged. The line through mark points 1 and 2 is parallel to one edge of the cube, and the line through mark points 1 and 3 is parallel to another edge of the cube. Since the edges and faces of the calibration object A are parallel or perpendicular to the coordinate axes of the ground coordinate system, the calibration object A can be used to establish the ground coordinate system. A mark point 4, a mark point 5 and a mark point 6 are pasted on the phone and can be used to establish the phone's own coordinate system x₁y₁z₁.
From the images of the calibration object A and the phone taken by the camera 1 and the camera 2, the tester can use the measurement software to establish a ground coordinate system uvw and the measured object's own coordinate system x₁y₁z₁, and then, from the ground coordinate system uvw and the measured object's own coordinate system x₁y₁z₁, obtain the touchdown attitude of the phone, namely the angles of the u-axis, v-axis and w-axis relative to the x₁-axis, y₁-axis and z₁-axis.
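The touchdown attitude above is a set of axis-to-axis angles. A minimal sketch of how the angle between one reference axis and one object axis could be computed from their direction vectors (an illustration, not the measurement software's actual algorithm):

```python
import math

def axis_angle_deg(a, b):
    """Unsigned angle in degrees between two coordinate-axis direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    # clamp to guard against rounding just outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

For example, a u-axis along (1, 0, 0) and an x₁-axis along (1, 1, 0) make an angle of 45°.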
However, this method has two disadvantages:
1) The camera 1 and the camera 2 have a limited effective measurement volume. Both the calibration object A and the measured object must lie within this volume, which is small, so the calibration object A occupies space that would otherwise be available for measurement.
2) The calibration object A is placed on the ground, and a measured object freely falls to the ground from a certain height and possibly collides with the calibration object A, so that the position or the posture of the calibration object A is changed, and further, the reference coordinate system is changed.
In view of this, embodiments of the present application provide a method and a system for measuring a moving object, in measuring the moving object based on a binocular stereo vision technology, a coordinate system defined by a user may be used as a reference coordinate system to describe the motion of the object, and at the same time, an effective measurement volume of a binocular camera may not be occupied, and the reference coordinate system is not easily changed, thereby improving user experience.
Fig. 9 illustrates a method 900 for measuring a moving object according to an embodiment of the present application, where the method 900 may be performed by a terminal device installed with measurement software, for example, a mobile phone, a tablet computer, a Personal Computer (PC), a desktop computer, a smart screen, an Artificial Intelligence (AI) device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, and the like. It should be understood that the embodiments of the present application do not limit the specific technology and the specific device form adopted by the terminal device. The method 900 may also be performed by an imaging device installed with measurement software, which may have a photographing function, and may also have a function of processing operations and a function of displaying.
The method 900 of the embodiment of the present application may be applied to measuring a scene of a moving object based on stereoscopic binocular vision, for example, any one of the scenes of fig. 2 to 5 described above.
The method 900 may include the steps of:
S901: obtain a measuring head coordinate system from the image of the calibration object B taken by the camera 1 or the camera 2.
The terminal device may calculate the measurement head coordinate system abc from the image of the calibration object B provided by the manufacturer captured by the camera 1 or the camera 2.
Fig. 10 shows a schematic diagram of an exemplary method for obtaining the coordinate system of a measuring head. As shown in fig. 10, the calibration object B is provided by the manufacturer and made of a precision ceramic material, whose dimensional change in heated, cooled, dry and humid environments is small enough to be negligible.
The image of the calibration object B can be used to calculate the coordinate system of the measuring head. The calibration object B is provided with a plurality of marking points, the size and the number of the marking points are fixed, and the arrangement of the marking points and the distance between different marking points are precisely calculated.
The tester can place the calibration object B in the viewing angles of the cameras 1 and 2 and on the ground. The tester can control the cameras 1 and 2 to take one set of images of the calibration object B at the position (i.e., two images at the position). Then the tester tilts the calibration object B forward, backward, leftward, rightward, leftward forward, leftward backward, rightward forward, rightward backward, upward pitch, and downward pitch, respectively, and captures images of the calibration object B through the camera 1 and the camera 2, respectively, so as to obtain twelve sets of images. It should be understood that camera 1 and camera 2 together capture thirteen sets of images of the calibration object B.
A tester may import the thirteen sets of images of the calibration object B into the measurement software (fig. 10 shows only some of the thirteen sets). From the position difference of the mark points on the calibration object B in each set of images, the actual position difference of the mark points on the calibration object B, and the viewing angle of the cameras, the terminal device (i.e., a desktop computer) can establish one equation per set, forming a system of thirteen equations, and solve this system to obtain the measuring head coordinate system abc.
S902: construct a tester-defined coordinate system from the measuring head coordinate system.
The method for the terminal equipment to establish the coordinate system customized by the tester comprises the following steps:
1) Obtain a tester-defined coordinate system from the image of the calibration object C taken by the camera 1 or the camera 2. The calibration object C is used for establishing the custom coordinate system.
2) From the measuring head coordinate system abc and the tester-defined coordinate system u₁v₁w₁, obtain a first compensation angle, a second compensation angle and a third compensation angle, where the first compensation angle is the angle through which coordinate axis a rotates clockwise until parallel to coordinate axis u₁, the second compensation angle is the angle through which coordinate axis b rotates clockwise until parallel to coordinate axis v₁, and the third compensation angle is the angle through which coordinate axis c rotates clockwise until parallel to coordinate axis w₁.
Illustratively, the first compensation angle may be referred to as a roll angle, the second compensation angle may be referred to as a pitch angle, and the third compensation angle may be referred to as a heading angle.
Alternatively, the first compensation angle may be the angle through which coordinate axis a rotates counterclockwise until parallel to coordinate axis u₁, the second compensation angle may be the angle through which coordinate axis b rotates counterclockwise until parallel to coordinate axis v₁, and the third compensation angle may be the angle through which coordinate axis c rotates counterclockwise until parallel to coordinate axis w₁.
3) From the measuring head coordinate system abc, the first compensation angle, the second compensation angle and the third compensation angle, construct the tester-defined coordinate system u₁v₁w₁.
The terminal device can rotate the a-axis, b-axis and c-axis of the measuring head coordinate system abc clockwise by the first compensation angle, the second compensation angle and the third compensation angle respectively, to obtain the tester-defined coordinate system u₁v₁w₁.
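Step 3) can be sketched with elementary rotation matrices. This illustration assumes the three compensation rotations are applied about the x-, y- and z-directions in a fixed order; the source does not specify the exact rotation convention:

```python
import math

def rot(axis, deg):
    """3x3 rotation matrix about the x-, y- or z-axis by deg degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return {"x": [[1, 0, 0], [0, c, -s], [0, s, c]],
            "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
            "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]]}[axis]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def apply_compensation(axis_vec, first, second, third):
    """Rotate one measuring-head axis by the three compensation angles
    (assumed order: about x, then y, then z)."""
    for m in (rot("x", first), rot("y", second), rot("z", third)):
        axis_vec = mat_vec(m, axis_vec)
    return axis_vec
```

For instance, with compensation angles (0, 0, 90), the a-axis direction (1, 0, 0) rotates to (0, 1, 0).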
The coordinate system customized by the tester may be a coordinate system whose coordinate axes are all mutually perpendicular (e.g., a ground coordinate system), or an arbitrary coordinate system whose coordinate axes are not all at right angles.
In one possible implementation, the calibration object C may be used to establish a coordinate system with orthogonal axes, for example, a ground coordinate system. The calibration object C may be the above-mentioned calibration object A, which is used to establish a ground coordinate system, or another calibration object from which a ground coordinate system can be established; the embodiment of the present application is not limited in this respect. From the images of the calibration object C taken by the camera 1 and the camera 2, the terminal device can obtain a coordinate system u₁v₁w₁ with orthogonal coordinate axes.
Illustratively, FIG. 11 shows a schematic diagram of obtaining a ground coordinate system. As shown in fig. 11, the calibration object C is a cube, and the calibration object C may have a mark point 1, a mark point 2, and a mark point 3. The straight lines of the mark points 1 and 2 are parallel to the straight line of the side of the cube, and the straight lines of the mark points 1 and 3 are parallel to the straight line of the side of the cube.
The tester can place the calibration object C in the view of the cameras 1 and 2 and on the ground. The tester can control the camera 1 and the camera 2 to take two images of the calibration object C at the position. The tester can import the image of the calibration object C on the measurement software, and then determine the coordinate axis by clicking the mark points to obtain the ground coordinate system.
For example, the tester clicks mark point 1 on the calibration object C and then clicks mark point 2. On detecting this operation, the terminal device displays the u₁-axis pointing from mark point 1 to mark point 2. The tester then clicks mark point 1 and then mark point 3; on detecting this operation, the terminal device displays the w₁-axis pointing from mark point 1 to mark point 3. The terminal device can display, in the measurement software, the normal to the plane determined by the u₁-axis and the w₁-axis; this normal passes through mark point 1, and the v₁-axes in both directions along it are shown, with mark point 1 as their origin. The tester can click the v₁-axis in either direction to determine it as the v₁-axis of the coordinate system u₁v₁w₁. As shown in fig. 11, after detecting the tester's operation of clicking the coordinate axes, the terminal device displays the coordinate system u₁v₁w₁.
It should be understood that the tester can click any point of the calibration object C in its image to determine the origin of the coordinate system u₁v₁w₁. When the tester performs no such click, the origin may default to mark point 1.
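The axis construction described above — u₁ from mark point 1 toward mark point 2, w₁ from mark point 1 toward mark point 3, and v₁ along the normal of their plane — can be sketched with a cross product, assuming the 3D positions of the mark points have already been triangulated:

```python
import math

def _sub(a, b):
    return [x - y for x, y in zip(a, b)]

def _normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def frame_from_markers(p1, p2, p3):
    """User-defined frame with origin at mark point 1: u1 points toward
    mark point 2, w1 toward mark point 3, and v1 is one of the two normals
    to the u1-w1 plane (the tester picks the direction by clicking)."""
    u1 = _normalize(_sub(p2, p1))
    w1 = _normalize(_sub(p3, p1))
    v1 = _normalize(_cross(w1, u1))   # one of the two possible normal directions
    return u1, v1, w1
```

Note that u₁ and w₁ need not be orthogonal to each other; only v₁ is guaranteed perpendicular to their plane, matching the definition of the arbitrary coordinate system above.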
If the object to be measured is a mobile phone, a mark point 4, a mark point 5 and a mark point 6 can be pasted on the phone, and the terminal device can obtain the measured object's own coordinate system x₁y₁z₁ by the method above.
In another possible implementation, the calibration object C may be used to establish a coordinate system whose coordinate axes are not all at right angles. The tester places the calibration object C within the viewing angles of the camera 1 and the camera 2, and from the images of the calibration object C taken by the two cameras, the terminal device can obtain an arbitrary coordinate system u₁v₁w₁. The angles between the coordinate axes of this arbitrary coordinate system are not limited in the embodiment of the application.
It should be understood that the arbitrary coordinate system is the coordinate system in which the tester wishes to describe the pose of the object under test.
Illustratively, fig. 12 shows a schematic diagram of a scenario for obtaining a tester-defined coordinate system. The calibration object C is a triangular pyramid, on which a mark point 7, a mark point 8 and a mark point 9 are pasted. The line through mark points 7 and 8 is parallel to one edge of the triangular pyramid, and the line through mark points 7 and 9 is parallel to another edge of the triangular pyramid.
The tester can place the calibration object C within the view of the cameras 1 and 2 and on the ground, and control the camera 1 and the camera 2 to take two images of the calibration object C at that position. The tester can import the images of the calibration object C into the measurement software and then determine the coordinate axes by clicking the mark points, obtaining the tester-defined arbitrary coordinate system u₁v₁w₁.
For the specific operation process, reference may be made to the operations above, which are not repeated here. As shown in fig. 12, after detecting the tester's operation of clicking the coordinate axes, the terminal device displays the coordinate system u₁v₁w₁, which is an arbitrary coordinate system defined by the tester.
S903: from the image of the measured object taken by the camera 1 or the camera 2, describe the posture of the measured object with the tester-defined coordinate system as the reference coordinate system.
From the image of the measured object taken by the camera 1 or the camera 2, the terminal device can obtain the measured object's own coordinate system x₁y₁z₁ from the mark points on the object, and from the object's own coordinate system x₁y₁z₁ and the tester-defined coordinate system u₁v₁w₁, obtain the posture of the measured object.
If the measured object is a mobile phone, its posture can be represented by angles i, j and k: the angle through which coordinate axis x₁ rotates clockwise until parallel to coordinate axis u₁ is the angle i between the phone's wide side and the ground; the angle through which coordinate axis y₁ rotates clockwise until parallel to coordinate axis v₁ is the angle j between the phone's height and the ground; and the angle through which coordinate axis z₁ rotates clockwise until parallel to coordinate axis w₁ is the angle k between the phone's long side and the ground.
The moving object measurement method provided by the embodiment of the application measures a moving object based on the binocular stereoscopic vision technology, and can construct any tester-defined coordinate system from the measuring head coordinate system, so that the motion of the measured object can be described in that coordinate system. Kinematic parameters of the moving object in a user-defined coordinate system can thus be obtained according to user requirements, improving measurement flexibility and user experience. In addition, the method avoids the need to place a calibration object within the viewing angles of the camera 1 and the camera 2, saving measurement space, and avoids the problem of the reference coordinate system changing when the falling measured object collides with a calibration object.
In the method 900, the terminal device can obtain the posture of the measured object as it falls to the ground. When the object continues to move after landing, the terminal device can keep capturing images of it by high-speed photography to obtain kinematic parameters such as its velocity, acceleration, angle, angular velocity and angular acceleration. If the measured object is a mobile phone and the body is damaged by a secondary collision during this motion, the posture of the phone at the moment of damage can also be obtained.
Illustratively, the terminal device may obtain a time interval t between adjacent photographing of the camera 1 and the camera 2, may also obtain three-dimensional information of the object to be measured in two photographing, may calculate a displacement s of the object to be measured in the time interval t according to the three-dimensional information of the object to be measured, and obtains the velocity v according to the displacement s and the time interval t.
Exemplarily, the terminal device may obtain the initial posture of the object to be measured, with the tester-defined coordinate system as the reference coordinate system, from the three-dimensional information of the object to be measured at the first shot, and may obtain the motion posture of the object to be measured, with the same reference coordinate system, from the three-dimensional information at the second shot. The angle of motion θ of the object to be measured may be obtained from the initial posture and the motion posture, and the angular velocity may be obtained from the angle θ and the time interval t.
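The angle-and-angular-velocity step can be illustrated with rotation matrices; representing the initial and motion postures as 3x3 rotation matrices is an assumption for illustration, as the patent leaves the posture representation open:

```python
import numpy as np

def angle_and_angular_speed(R_init, R_motion, t):
    """Rotation angle between two postures and the mean angular speed.

    R_init, R_motion: 3x3 rotation matrices of the measured object's own
    frame relative to the tester-defined reference frame at the first and
    second shot (an assumed representation); t: the shot interval in s.
    """
    R_rel = np.asarray(R_motion) @ np.asarray(R_init).T   # relative rotation
    # rotation angle recovered from the trace of the relative rotation
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    return theta, theta / t

# Hypothetical sample: a 90-degree turn about the z axis over 0.05 s
R0 = np.eye(3)
R1 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
theta, omega = angle_and_angular_speed(R0, R1, 0.05)
```

Angular acceleration follows the same pattern, differencing two successive angular velocities over the shot interval.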
The method 900 can be integrated in measurement software, and a tester can measure a moving object under a coordinate system defined by the tester through human-computer interaction. From the perspective of a tester operating measurement software, another measurement method of a moving object is provided in the embodiments of the present application.
Fig. 13 is a diagram illustrating another method 1300 for measuring a moving object according to an embodiment of the present application, where the method may be performed by a terminal device or an imaging device having measurement software installed therein. The method is applicable to scenes in which a moving object is measured based on binocular stereo vision, for example, any one of the scenes in fig. 2 to 5 described above.
The method 1300 may include the steps of:
S1301, detecting the operation of a tester clicking the custom coordinate system option, and displaying the interface of the custom coordinate system.
The custom coordinate system option can be added to the measurement software, and a tester can define any coordinate system as the reference coordinate system by clicking the custom coordinate system option.
Exemplarily, fig. 14 is an interface schematic diagram of measurement software provided in an embodiment of the present application. As shown in interface a in fig. 14, the custom coordinate system option may be in the edit menu bar. It should be understood that other options may be included in the edit menu bar, which is not limited in this embodiment of the present application.
When the terminal device detects that the tester clicks the edit menu bar, the custom coordinate system option may be displayed. When the terminal device detects that the tester clicks the custom coordinate system option, an interface of the custom coordinate system may be displayed, as shown in interface b in fig. 14. In interface b in fig. 14, a prompt message may be displayed on the interface, the content of which may include "please place the calibration object with the marked point within the viewing angle of the binocular camera", for prompting the tester to place the calibration object within the viewing angle of the binocular camera.
The calibration object to which the marker is attached may be the calibration object C described above.
S1302, when detecting that a tester imports an image captured by the camera 1 or the camera 2, displaying the image, wherein the image includes a calibration object to which at least three mark points are attached, and the edges and planes of the calibration object are parallel or perpendicular to the coordinate axes of the tester-defined coordinate system.
Illustratively, the tester-defined coordinate system is a ground coordinate system. The calibration object may be the calibration object C described above. Fig. 15 illustrates an interface diagram of the custom coordinate system function. As shown in interface a in fig. 15, when the terminal device detects that the tester imports an image of the calibration object C captured by the camera 1 or the camera 2, the image of the calibration object C is displayed. The calibration object C includes a mark point 1, a mark point 2, and a mark point 3.
S1303, detecting the operation of the tester clicking the mark point 1 of the at least three mark points, detecting the operation of the tester clicking the mark point 2 of the at least three mark points, and displaying the coordinate axis u1 pointing from the mark point 1 to the mark point 2.
S1304, detecting the operation of the tester clicking the mark point 1 of the at least three mark points, detecting the operation of the tester clicking the mark point 3 of the at least three mark points, and displaying the coordinate axis w1 pointing from the mark point 1 to the mark point 3.
For example, as shown in interface a in fig. 15, after the terminal device displays the image of the calibration object C, a prompt message may be displayed, the content of which may include "please define a coordinate axis", for prompting the tester to define the coordinate axes; the tester may click the mark point 1 on the image of the calibration object C. As shown in interface b in fig. 15, the tester clicks the mark point 2 on the image of the calibration object C; the terminal device detects the operation of the tester clicking the mark point 1, detects the operation of the tester clicking the mark point 2, and displays the coordinate axis u1 pointing from the mark point 1 to the mark point 2.
As shown in interface a in fig. 16, when the terminal device detects that the tester clicks the mark point 1 and detects that the tester clicks the mark point 3, the coordinate axis w1 pointing from the mark point 1 to the mark point 3 is displayed.
S1305, after detecting the coordinate axis u1 and the coordinate axis w1, displaying the normal of the plane in which the coordinate axis u1 and the coordinate axis w1 lie, and displaying coordinate axes in two directions on the normal; the normal may pass through the mark point 1.
Illustratively, as shown in interface b in fig. 16, the terminal device displays the normal of the plane in which the coordinate axis u1 and the coordinate axis w1 lie; the normal may pass through the mark point 1. The terminal device displays coordinate axes in two directions on the normal and displays a prompt message, which may include "please select the direction of the third coordinate axis", for prompting the tester to determine the third coordinate axis.
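Geometrically, the normal displayed in S1305 is the cross product of the two user-defined axes; the sketch below illustrates this under the assumption that the mark points have already been reconstructed as 3-D coordinates (the function name and sample marker layout are invented for illustration):

```python
import numpy as np

def third_axis_candidates(mark1, mark2, mark3):
    """The two candidate directions for the third coordinate axis.

    u1 points from mark point 1 to mark point 2 and w1 from mark point 1
    to mark point 3; the normal of the plane they span passes through
    mark point 1 and is their cross product, so the tester only has to
    pick one of its two directions.
    """
    p1, p2, p3 = (np.asarray(p, float) for p in (mark1, mark2, mark3))
    u1 = p2 - p1
    w1 = p3 - p1
    n = np.cross(u1, w1)            # normal of the plane spanned by u1, w1
    n = n / np.linalg.norm(n)       # normalise to a unit direction
    return n, -n                    # the two selectable directions

# Hypothetical marker layout on the calibration object (metres)
n_pos, n_neg = third_axis_candidates([0, 0, 0], [0.1, 0, 0], [0, 0.1, 0])
```

Presenting both `n` and `-n` mirrors the interface: the normal direction is ambiguous until the tester clicks one of the two displayed axes.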
S1306, after detecting the operation of the tester clicking either of the coordinate axes in the two directions, displaying the coordinate system u1v1w1.
Illustratively, as shown in interface c in fig. 16, after the terminal device detects that the tester clicks either of the coordinate axes in the two directions, the terminal device displays the coordinate system u1v1w1.
S1307, when detecting the operation of the tester clicking to measure the moving object using the coordinate system u1v1w1, describing the posture of the measured object with the coordinate system u1v1w1 as the reference coordinate system.
As shown in interface c in fig. 16, after the terminal device displays the coordinate system u1v1w1, query information may be displayed. The query information may include "the custom coordinate system is completed; measure the moving object using the custom coordinate system?" and is used to ask whether the tester will measure the moving object using the custom coordinate system; a yes option and a no option are provided at the same time. When the terminal device detects that the tester clicks the yes option, interface d in fig. 16 is displayed, which is the interface for measuring the moving object, and the reference coordinate system is u1v1w1.
S1307 may include steps 2) and 3) in S902 described above and S903.
According to the moving object measuring method, a human-computer interaction interface is provided for the tester in the process of measuring the moving object based on the binocular stereo vision technology; the motion of the measured object is described with the tester-defined coordinate system as the reference coordinate system without the tester perceiving the underlying conversion, improving measurement flexibility and user experience.
As an alternative embodiment, an option of adding a ground coordinate system to the measurement software may be added, and a developer of the measurement software may construct the ground coordinate system by using the above methods S901 and S902 during development. And when the terminal equipment detects that the tester clicks the ground coordinate system, displaying the coordinate system constructed by the developer.
The embodiment of the application provides another method for measuring a moving object, which can comprise the following steps: based on the method of the S901, a measurement head coordinate system is obtained, the measurement head coordinate system is used as a reference coordinate system to describe the posture of the measured object, and the kinematic parameters of the measured object are stored; based on the steps 1) and 2) of S902, a coordinate system defined by a tester, a first compensation angle, a second compensation angle, and a third compensation angle are obtained, and a kinematic parameter of the object to be measured in the measurement head coordinate system is converted into a kinematic parameter in the coordinate system defined by the tester, so that the posture of the object to be measured is described in the coordinate system defined by the tester as a reference coordinate system.
The tester can use the measurement software to obtain the kinematic parameters of the tested object under the measurement head coordinate system, and obtain the coordinate system customized by the tester based on the option of the customized coordinate system in the method 1300, when the terminal device detects that the tester measures the moving object under the customized coordinate system, the kinematic parameters of the tested object under the measurement head coordinate system are converted to the kinematic parameters under the customized coordinate system of the tester, and are displayed on the interface of the measurement software.
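The conversion of kinematic parameters from the measurement head coordinate system to the tester's custom coordinate system can be illustrated as a rotation assembled from the three compensation angles; the x-y-z rotation order below is an assumption, since the patent does not fix the rotation convention:

```python
import numpy as np

def compensation_rotation(a1, a2, a3):
    """Rotation assembled from the three compensation angles (radians).

    Assumes the angles are successive rotations about the x, y and z axes
    of the measurement head coordinate system; the patent does not fix
    the rotation convention, so this order is an assumption.
    """
    cx, sx = np.cos(a1), np.sin(a1)
    cy, sy = np.cos(a2), np.sin(a2)
    cz, sz = np.cos(a3), np.sin(a3)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def to_custom_frame(v_head, a1, a2, a3):
    """Express a vector measured in the head frame in the custom frame."""
    return compensation_rotation(a1, a2, a3) @ np.asarray(v_head, float)

# With all compensation angles zero the two frames coincide
v = to_custom_frame([0.0, 0.0, -9.8], 0.0, 0.0, 0.0)
```

Because the conversion is a fixed rotation, stored head-frame parameters can be re-expressed in any later custom frame without remeasuring, which is what the passage above describes.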
Fig. 17 is a diagram illustrating another method 1700 for measuring a moving object according to an embodiment of the present application, where the method may be performed by a terminal device or an imaging device having measurement software installed therein. The method is applicable to scenes in which a moving object is measured based on binocular stereo vision, for example, any one of the scenes in fig. 2 to 5 described above.
The method 1700 may include the steps of:
S1701, measuring a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system.
The first calibration object may be the calibration object B in S901 of the method 900, and the first coordinate system is a camera coordinate system, i.e., a measurement head coordinate system.
The camera coordinate system (measurement head coordinate system) may include a first coordinate axis, a second coordinate axis, and a third coordinate axis. For example, the measurement head coordinate system abc includes an a-axis, a b-axis, and a c-axis.
The method for the terminal device to obtain the first coordinate system may refer to S901 in the method 900, which is not described herein again.
S1702, measuring a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system.
The second calibration object may be the calibration object C in step 1) of S902 of the method 900 described above. The user-defined coordinate system may be a coordinate system with orthogonal coordinate axes or an arbitrary coordinate system with non-orthogonal coordinate axes, for example, a ground coordinate system or an arbitrary coordinate system.
The second calibration object may be a cube or a cuboid, which is not limited in the embodiment of the present application.
The user-defined coordinate system may include a fourth coordinate axis, a fifth coordinate axis, and a sixth coordinate axis. For example, an arbitrary coordinate system u1v1w1 includes a u1 axis, a v1 axis, and a w1 axis.
The method for the terminal device to obtain the second coordinate system may refer to step 1) of S902 in the method 900 described above, which is not described herein again.
S1703, reconstructing a second coordinate system based on a difference between the first coordinate system and the second coordinate system.
The difference between the first coordinate system and the second coordinate system may include: the first coordinate axis in the first coordinate system is rotated to a first angle parallel to a fourth coordinate axis in the second coordinate system in a first direction, the second coordinate axis in the first coordinate system is rotated to a second angle parallel to a fifth coordinate axis in the second coordinate system in the first direction, and the third coordinate axis in the first coordinate system is rotated to a third angle parallel to a sixth coordinate axis in the second coordinate system in the first direction, wherein the first direction may be a clockwise direction or a counterclockwise direction.
And the terminal equipment rotates the first coordinate axis, the second coordinate axis and the third coordinate axis of the first coordinate system according to the first angle, the second angle and the third angle respectively and according to the first direction, and reconstructs a second coordinate system.
Illustratively, the difference between the first coordinate system abc and the arbitrary coordinate system u1v1w1 may include: the a axis rotates clockwise through a first angle to become parallel to the u1 axis, the b axis rotates clockwise through a second angle to become parallel to the v1 axis, and the c axis rotates in the first direction through a third angle to become parallel to the w1 axis.
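The per-axis differences used in S1703 are angles between corresponding axes of the two coordinate systems; a minimal way to compute such an angle, assuming each axis is available as a 3-D direction vector, is:

```python
import numpy as np

def axis_angle(axis_a, axis_b):
    """Angle (radians) between two coordinate axes given as 3-D vectors.

    This is the per-axis difference used when reconstructing the second
    coordinate system from the first; representing each axis as a
    direction vector is an assumption for illustration.
    """
    a, b = np.asarray(axis_a, float), np.asarray(axis_b, float)
    cos_ang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos_ang, -1.0, 1.0))

# Hypothetical sample: the a axis versus a u1 axis at right angles to it
angle = axis_angle([1, 0, 0], [0, 1, 0])
```

The sign (clockwise or counterclockwise, i.e. the "first direction") must still be determined separately, for example from the cross product of the two axes.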
And S1704, measuring the moving object under a second coordinate system based on binocular stereo vision to obtain the motion parameters of the moving object.
The motion parameter may include an angle of rotation of the own coordinate system of the moving object with respect to the second coordinate system when the moving object touches the ground. Illustratively, the moving object may be the mobile phone, and based on binocular stereoscopic vision, the terminal device may obtain the posture of the coordinate system of the mobile phone itself relative to the user-defined coordinate system.
The moving object measuring method provided by the embodiment of the application is used for measuring the moving object based on the binocular stereo vision technology, the user-defined coordinate system can be reconstructed according to the camera coordinate system, the motion of the measured object can be described under the user-defined coordinate system, the kinematic parameters of the moving object under the user-defined coordinate system can be obtained according to the user requirements, and the measuring flexibility and the user experience are improved.
Optionally, before the step S1702 of measuring the second calibration object and obtaining the second coordinate system, the method 1700 further includes: detecting a coordinate system establishing instruction of a user; responding to an instruction for establishing a coordinate system, and displaying a first interface, wherein the first interface displays an image of a second calibration object; and detecting the clicking operation of the user on the image of the second calibration object, wherein the clicking operation is used for marking the first marking point, the second marking point and the third marking point.
The instruction for the user to establish the coordinate system may be an operation of the tester in S1301 of the method 1300 to click the user-defined coordinate system option, or may be another instruction for establishing the coordinate system, which is not limited in this embodiment of the present application.
The first interface may be the interface a in fig. 15 described above.
The clicking operation of the user on the image of the second calibration object may be the operations of S1303, S1304, and S1305 in the method 1300 described above. The second coordinate system may be the coordinate system u1v1w1 shown in interface c of fig. 16, i.e. the second coordinate system may be the ground coordinate system.
Optionally, based on binocular stereo vision, measuring the moving object in the second coordinate system to obtain the motion parameters of the moving object, including: detecting a measurement instruction of a user; responding to the measurement instruction, and acquiring an image of the moving object; and measuring the moving object under the second coordinate system according to the image of the moving object to obtain the motion parameters of the moving object.
The measurement instruction of the user may be the operation, in S1307 of the method 1300 described above, of the tester clicking to measure the moving object using the coordinate system u1v1w1. Illustratively, the measurement instruction of the user may be an operation of clicking an option as shown in interface c in fig. 16.
The terminal device may acquire an image of a moving object using a binocular camera or two monocular cameras (e.g., the camera 1 and the camera 2 described above), measure the moving object in the second coordinate system according to the image of the moving object, and obtain a motion parameter of the moving object.
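Obtaining three-dimensional information from the two cameras rests on stereo triangulation; the following is a minimal sketch for an ideal rectified stereo rig (the patent does not give the camera model, so the focal length, baseline, and pixel coordinates here are hypothetical):

```python
import numpy as np

def triangulate(f, baseline, xl, xr, yl):
    """3-D position of a point from a rectified stereo image pair.

    f: focal length in pixels; baseline: camera separation in metres;
    xl, xr: the point's horizontal pixel coordinates in the left and
    right images; yl: its vertical coordinate in the left image.
    Assumes an ideal rectified rig; a real setup needs calibrated
    intrinsics and distortion correction.
    """
    disparity = xl - xr                 # shrinks as the point moves away
    z = f * baseline / disparity        # depth along the optical axis
    x = xl * z / f
    y = yl * z / f
    return np.array([x, y, z])

# Hypothetical numbers: f = 1000 px, baseline = 0.1 m, disparity = 50 px
p = triangulate(1000.0, 0.1, 100.0, 50.0, 0.0)
```

Triangulating the same marker points in successive frame pairs yields the positions from which the motion parameters above are derived.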
According to the moving object measuring method, the moving object is measured based on the binocular stereo vision technology, man-machine interaction operation is provided for a user, the movement of the measured object is described in a user-defined coordinate system under the condition that the user does not perceive the moving object, and measuring flexibility and user experience are improved.
The sequence numbers of the above processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not be limited in any way to the implementation process of the embodiments of the present application.
The method for measuring a moving object according to the embodiment of the present application is described in detail above with reference to fig. 1 to 17, and the device for measuring a moving object according to the embodiment of the present application is described in detail below with reference to fig. 18 and 19.
Fig. 18 shows a measurement apparatus 1800 for a moving object according to an embodiment of the present application, where the apparatus 1800 includes: an acquisition module 1810 and a processing module 1820. The obtaining module 1810 is configured to: measuring a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system; measuring a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system; the processing module 1820 is configured to: reconstructing a second coordinate system based on a difference between the first coordinate system and the second coordinate system; the acquisition module 1810 is further configured to: and measuring the moving object under the second coordinate system based on the binocular stereo vision to obtain the motion parameters of the moving object.
Optionally, the difference between the first coordinate system and the second coordinate system comprises: a first coordinate axis in the first coordinate system rotates to a first angle parallel to a fourth coordinate axis in the second coordinate system according to a first direction, a second coordinate axis in the first coordinate system rotates to a second angle parallel to a fifth coordinate axis in the second coordinate system according to the first direction, and a third coordinate axis in the first coordinate system rotates to a third angle parallel to a sixth coordinate axis in the second coordinate system according to the first direction, wherein the first direction is clockwise or anticlockwise; the processing module 1820 is further configured to: and rotating the first coordinate axis, the second coordinate axis and the third coordinate axis of the first coordinate system according to the first angle, the second angle and the third angle respectively in the first direction to reconstruct a second coordinate system.
Optionally, the motion parameter includes an angle of rotation of the own coordinate system of the moving object with respect to the second coordinate system when the moving object touches the ground.
Optionally, the second calibration object includes a first marker point, a second marker point, and a third marker point; the acquisition module 1810 is further configured to: obtaining a fourth coordinate axis according to the first marker point and the second marker point; obtaining a fifth coordinate axis according to the first marker point and the third marker point; obtaining a sixth coordinate axis according to the fourth coordinate axis and the fifth coordinate axis; and obtaining a second coordinate system according to the fourth coordinate axis, the fifth coordinate axis, and the sixth coordinate axis.
Optionally, the apparatus 1800 further comprises a detection module; the detection module is used for: detecting a coordinate system establishing instruction of a user; the processing module 1820 is further configured to: responding to an instruction for establishing a coordinate system, and displaying a first interface, wherein the first interface displays an image of a second calibration object; the detection module is further configured to: and detecting the clicking operation of the user on the image of the second calibration object, wherein the clicking operation is used for marking the first marking point, the second marking point and the third marking point.
Optionally, the second calibration object is a cuboid or a cube, and the second coordinate system is a ground coordinate system.
Optionally, the apparatus 1800 further comprises a detection module; the detection module is further configured to: detecting a measurement instruction of a user; the acquisition module 1810 is further configured to: responding to the measurement instruction, and acquiring an image of the moving object; and measuring the moving object under the second coordinate system according to the image of the moving object to obtain the motion parameters of the moving object.
It should be appreciated that the apparatus 1800 herein is embodied in the form of functional modules. The term module herein may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an alternative example, it may be understood by those skilled in the art that the apparatus 1800 may be specifically a terminal device or an imaging device in the foregoing method embodiment, or functions of the terminal device or the imaging device in the foregoing method embodiment may be integrated in the apparatus 1800, and the apparatus 1800 may be configured to perform each flow and/or step corresponding to the terminal device or the imaging device in the foregoing method embodiment, and details are not described herein again to avoid repetition.
The apparatus 1800 has functions of implementing corresponding steps executed by the terminal device or the imaging device in the method embodiments; the above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In an embodiment of the present application, the apparatus 1800 in fig. 18 may also be a chip or a chip system, such as: system on chip (SoC).
Fig. 19 is a schematic block diagram of another measuring device 1900 for a moving object according to an embodiment of the present application. The apparatus 1900 includes a processor 1910, a transceiver 1920, and a memory 1930. The processor 1910, the transceiver 1920, and the memory 1930 are in communication with each other via an internal connection, the memory 1930 is configured to store instructions, and the processor 1910 is configured to execute the instructions stored in the memory 1930 to control the transceiver 1920 to transmit and/or receive signals.
It should be understood that apparatus 1900 may be embodied as the terminal device or the imaging device in the foregoing method embodiment, or the functions of the terminal device or the imaging device in the foregoing method embodiment may be integrated in apparatus 1900, and apparatus 1900 may be configured to perform each step and/or flow corresponding to the terminal device or the imaging device in the foregoing method embodiment. Alternatively, the memory 1930 may include read-only memory and random access memory, and provide instructions and data to the processor. The portion of memory may also include non-volatile random access memory. For example, the memory may also store device type information. The processor 1910 may be configured to execute the instructions stored in the memory, and when the processor executes the instructions, the processor may execute the steps and/or the flows corresponding to the terminal device or the imaging device in the method embodiments described above.
It should be understood that in the embodiments of the present application, the processor 1910 may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in a memory, and a processor executes the instructions in the memory and, in combination with its hardware, performs the steps of the above method. To avoid repetition, details are not described here.
The application also provides a computer-readable storage medium for storing a computer program for implementing the method corresponding to the terminal device or the imaging device in the above method embodiments.
The application also provides a chip system, which is used for supporting the terminal device or the imaging device in the above method embodiment to realize the functions shown in the embodiment of the application.
The present application also provides a computer program product including a computer program (also referred to as code, or instructions) which, when run on a computer, can execute the method corresponding to the terminal device or the imaging device shown in the above method embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

1. A method of measuring a moving object, comprising:
measuring a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system;
measuring a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system;
reconstructing the second coordinate system based on a difference between the first coordinate system and the second coordinate system;
and measuring the moving object under the second coordinate system based on binocular stereo vision to obtain the motion parameters of the moving object.
2. The method of claim 1, wherein the difference between the first coordinate system and the second coordinate system comprises:
a first coordinate axis of the first coordinate system is rotated in a first direction by a first angle to be parallel to a fourth coordinate axis of the second coordinate system, a second coordinate axis of the first coordinate system is rotated in the first direction by a second angle to be parallel to a fifth coordinate axis of the second coordinate system, and a third coordinate axis of the first coordinate system is rotated in the first direction by a third angle to be parallel to a sixth coordinate axis of the second coordinate system, wherein the first direction is clockwise or counterclockwise;
the reconstructing the second coordinate system based on the difference between the first coordinate system and the second coordinate system comprises:
rotating the first coordinate axis, the second coordinate axis and the third coordinate axis of the first coordinate system in the first direction by the first angle, the second angle and the third angle, respectively, to reconstruct the second coordinate system.
3. The method of claim 1 or 2, wherein the motion parameters comprise an angle by which the moving object's own coordinate system is rotated relative to the second coordinate system when the moving object touches the ground.
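As an illustration of the axis-rotation reconstruction in claim 2: the three measured angles define three basic rotations whose composition maps the camera axes onto the user-defined axes. The sketch below is a minimal, hypothetical implementation (the composition order and the right-handed counterclockwise convention are assumptions for illustration; `reconstruct_axes` is not a name from this application):

```python
import numpy as np

def rotation_matrix(axis: str, angle: float) -> np.ndarray:
    """Basic right-handed (counterclockwise) rotation about one axis, in radians."""
    c, s = np.cos(angle), np.sin(angle)
    if axis == "x":
        return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])
    if axis == "y":
        return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def reconstruct_axes(first_angle: float, second_angle: float,
                     third_angle: float) -> np.ndarray:
    """Rotate the camera axes (columns of the identity matrix) by the three
    measured angles to obtain the axes of the user-defined coordinate system."""
    # Composition order is an assumption; the claim only fixes a common direction.
    r = (rotation_matrix("z", third_angle)
         @ rotation_matrix("y", second_angle)
         @ rotation_matrix("x", first_angle))
    return r @ np.eye(3)  # columns are the reconstructed axes
```

For a clockwise first direction, the same sketch applies with the three angles negated.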
4. The method of any one of claims 1-3, wherein the second calibration object has a first mark point, a second mark point and a third mark point thereon;
the measuring a second calibration object to obtain a second coordinate system comprises:
obtaining a fourth coordinate axis according to the first mark point and the second mark point;
obtaining a fifth coordinate axis according to the first mark point and the third mark point;
obtaining a sixth coordinate axis according to the fourth coordinate axis and the fifth coordinate axis;
and obtaining the second coordinate system according to the fourth coordinate axis, the fifth coordinate axis and the sixth coordinate axis.
5. The method of claim 4, wherein prior to measuring the second calibration object to obtain the second coordinate system, the method further comprises:
detecting a coordinate system establishing instruction from a user;
displaying a first interface in response to the coordinate system establishing instruction, wherein the first interface displays an image of the second calibration object;
and detecting a clicking operation by the user on the image of the second calibration object, wherein the clicking operation is used to mark the first mark point, the second mark point and the third mark point.
6. The method according to any one of claims 1 to 5, wherein the second calibration object is a cuboid or a cube and the second coordinate system is a ground coordinate system.
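As an illustration of the frame construction in claim 4: the fourth axis follows the direction from the first mark point to the second, the fifth lies in the plane of the three points, and the sixth is normal to that plane. A minimal sketch, assuming Gram-Schmidt orthogonalization and a cross product for the sixth axis (the function name is hypothetical, not from this application):

```python
import numpy as np

def frame_from_points(p1, p2, p3) -> np.ndarray:
    """Build an orthonormal coordinate frame from three marked points.

    Rows of the result are unit vectors along the fourth, fifth and sixth axes."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    fourth = p2 - p1
    fourth /= np.linalg.norm(fourth)
    # Project out the component along `fourth` so the axes stay orthogonal
    # even when the marked points do not form an exact right angle.
    v = p3 - p1
    fifth = v - np.dot(v, fourth) * fourth
    fifth /= np.linalg.norm(fifth)
    sixth = np.cross(fourth, fifth)  # normal to the plane of the three points
    return np.stack([fourth, fifth, sixth])
```

With a cuboid calibration object as in claim 6, the three points can be taken as one ground corner and its two adjacent ground corners, making the result a ground coordinate system.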
7. The method according to any one of claims 1-6, wherein the measuring the moving object under the second coordinate system based on binocular stereo vision to obtain the motion parameters of the moving object comprises:
detecting a measurement instruction from a user;
acquiring an image of the moving object in response to the measurement instruction;
and measuring the moving object under the second coordinate system according to the image of the moving object to obtain the motion parameters of the moving object.
8. A measuring device for a moving object, comprising:
an acquisition module, configured to: measure a first calibration object to obtain a first coordinate system, wherein the first coordinate system is a camera coordinate system; and measure a second calibration object to obtain a second coordinate system, wherein the second coordinate system is a user-defined coordinate system;
a processing module, configured to reconstruct the second coordinate system based on a difference between the first coordinate system and the second coordinate system;
the acquisition module is further configured to: measure the moving object under the second coordinate system based on binocular stereo vision to obtain the motion parameters of the moving object.
9. The device of claim 8, wherein the difference between the first coordinate system and the second coordinate system comprises:
a first coordinate axis of the first coordinate system is rotated in a first direction by a first angle to be parallel to a fourth coordinate axis of the second coordinate system, a second coordinate axis of the first coordinate system is rotated in the first direction by a second angle to be parallel to a fifth coordinate axis of the second coordinate system, and a third coordinate axis of the first coordinate system is rotated in the first direction by a third angle to be parallel to a sixth coordinate axis of the second coordinate system, wherein the first direction is clockwise or counterclockwise;
the processing module is further configured to:
rotate the first coordinate axis, the second coordinate axis and the third coordinate axis of the first coordinate system in the first direction by the first angle, the second angle and the third angle, respectively, to reconstruct the second coordinate system.
10. The device of claim 8 or 9, wherein the motion parameters comprise an angle by which the moving object's own coordinate system is rotated relative to the second coordinate system when the moving object touches the ground.
11. The device of any one of claims 8-10, wherein the second calibration object has a first mark point, a second mark point and a third mark point thereon;
the acquisition module is further configured to:
obtain a fourth coordinate axis according to the first mark point and the second mark point;
obtain a fifth coordinate axis according to the first mark point and the third mark point;
obtain a sixth coordinate axis according to the fourth coordinate axis and the fifth coordinate axis;
and obtain the second coordinate system according to the fourth coordinate axis, the fifth coordinate axis and the sixth coordinate axis.
12. The device of claim 11, further comprising a detection module;
the detection module is used for:
detecting a coordinate system establishing instruction from a user;
the processing module is further configured to:
display a first interface in response to the coordinate system establishing instruction, wherein the first interface displays an image of the second calibration object;
the detection module is further configured to:
detect a clicking operation by the user on the image of the second calibration object, wherein the clicking operation is used to mark the first mark point, the second mark point and the third mark point.
13. The device of any one of claims 8-12, wherein the second calibration object is a cuboid or a cube, and the second coordinate system is a ground coordinate system.
14. The device of any one of claims 8-13, further comprising a detection module;
the detection module is further configured to:
detect a measurement instruction from a user;
the acquisition module is further configured to:
acquire an image of the moving object in response to the measurement instruction;
and measure the moving object under the second coordinate system according to the image of the moving object to obtain the motion parameters of the moving object.
15. A measuring device for a moving object, comprising: a processor coupled with a memory, wherein the memory is configured to store a computer program that, when invoked by the processor, causes the device to perform the method of any one of claims 1 to 7.
16. A chip system, comprising: a processor, configured to call and run a computer program from a memory, so that a device on which the chip system is installed implements the method of any one of claims 1 to 7.
17. A computer-readable storage medium for storing a computer program comprising instructions for implementing the method of any one of claims 1 to 7.
18. A computer program product, comprising computer program code which, when run on a computer, causes the computer to carry out the method of any one of claims 1 to 7.
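The binocular stereo measurement recited in claims 7 and 14 ultimately rests on triangulating matched image points. A minimal sketch for an ideally rectified stereo pair under the pinhole model (pixel coordinates are taken relative to the principal point; the focal length and baseline values in the example are illustrative assumptions, not parameters from this application):

```python
def triangulate_rectified(x_left: float, x_right: float, y: float,
                          focal_px: float, baseline_m: float):
    """Recover a 3D point (camera coordinates, meters) from matched pixel
    coordinates in a rectified stereo pair."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from disparity
    x = x_left * z / focal_px               # back-project to metric X
    y3d = y * z / focal_px                  # back-project to metric Y
    return x, y3d, z
```

Triangulating the same object point in successive frames, then expressing it in the reconstructed second coordinate system, yields motion parameters such as velocity or the touchdown rotation angle of claim 3.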
CN202111166946.1A 2021-09-30 2021-09-30 Method and device for measuring a moving object Pending CN115060229A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111166946.1A CN115060229A (en) 2021-09-30 2021-09-30 Method and device for measuring a moving object

Publications (1)

Publication Number Publication Date
CN115060229A true CN115060229A (en) 2022-09-16

Family

ID=83197134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111166946.1A Pending CN115060229A (en) 2021-09-30 2021-09-30 Method and device for measuring a moving object

Country Status (1)

Country Link
CN (1) CN115060229A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186257A (en) * 2011-12-30 2013-07-03 百度在线网络技术(北京)有限公司 Simulator rotating vector sensing data acquisition method and device
CN105118055A (en) * 2015-08-11 2015-12-02 北京电影学院 Camera positioning correction calibration method and system
CN110321005A (en) * 2019-06-14 2019-10-11 深圳传音控股股份有限公司 A kind of method, apparatus, AR equipment and storage medium improving AR equipment virtual article display effect
CN111699445A (en) * 2018-07-13 2020-09-22 深圳配天智能技术研究院有限公司 Robot kinematics model optimization method and system and storage device
CN112180362A (en) * 2019-07-05 2021-01-05 北京地平线机器人技术研发有限公司 Conversion pose determination method and device between radar and camera and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination