CN115937321A - Attitude detection method and device for electronic equipment - Google Patents


Info

Publication number
CN115937321A
CN115937321A (application number CN202211185821.8A)
Authority
CN
China
Prior art keywords
corner points
coordinate system
corner
camera
image
Prior art date
Legal status
Granted
Application number
CN202211185821.8A
Other languages
Chinese (zh)
Other versions
CN115937321B (en)
Inventor
高磊雯
王斌
王雪松
张东升
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211185821.8A priority Critical patent/CN115937321B/en
Publication of CN115937321A publication Critical patent/CN115937321A/en
Application granted granted Critical
Publication of CN115937321B publication Critical patent/CN115937321B/en
Legal status: Active

Landscapes

  • Image Processing (AREA)

Abstract

The embodiments of the present application disclose an attitude detection method and device for an electronic device, relate to the technical field of attitude detection, and can quickly and accurately detect the attitude of an electronic device during a fall. The specific scheme is as follows: original position information of the electronic device in a first spatial coordinate system is determined according to a first image and a second image. The first spatial coordinate system is the camera coordinate system of a first camera. The original position information is corrected to obtain corrected position information; the correction compensates for errors in the original position information introduced by aberrations of the first camera and a second camera. Attitude information of the electronic device at a first moment is then obtained from the corrected position information.

Description

Attitude detection method and device for electronic equipment
Technical Field
The present application relates to the field of attitude detection technologies, and in particular, to an attitude detection method and apparatus for an electronic device.
Background
Electronic devices inevitably fall during use. To analyze in advance problems such as functional failure caused by a fall, the attitude information of the device during a simulated fall must be acquired accurately.
In current schemes, the attitude information of an electronic device during a simulated fall is obtained by filming the fall and analyzing the footage.
However, problems such as aberration of the filming equipment (e.g., the cameras) can make the attitude information obtained by this analysis inaccurate.
Disclosure of Invention
The embodiments of the present application provide an attitude detection method and device for an electronic device, which can quickly and accurately detect the attitude of the device during a fall.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, an attitude detection method is provided, applied to detecting attitude information during a fall of an electronic device. At a first moment during the fall, a first camera photographs the electronic device to obtain a first image and a second camera photographs it to obtain a second image; the poses of the two cameras differ. The method comprises: determining original position information of the electronic device in a first spatial coordinate system according to the first image and the second image, the first spatial coordinate system being the camera coordinate system of the first camera; correcting the original position information to obtain corrected position information, the correction compensating for errors introduced by aberrations of the first and second cameras; and obtaining the attitude information of the electronic device at the first moment from the corrected position information.
In this way, by correcting the spatial information obtained from the camera images, points that should lie in the same plane are moved onto one plane, and points that should lie in the same row or column are moved onto one line. The attitude information computed from the corrected positions is therefore more accurate.
Optionally, a checkerboard is disposed on at least one surface of the electronic device, and the checkerboard includes at least three corner points.
Optionally, the first image includes image information of the at least three corner points as seen from the pose of the first camera, and the second image includes image information of the at least three corner points as seen from the pose of the second camera.
Optionally, determining original position information of the electronic device in the first spatial coordinate system according to the first image and the second image includes: and determining original position information of at least three corner points in a first space coordinate system according to the image information of the at least three corner points included in the first image and the second image.
For example, the checkerboard is attached to the electronic device, so that the corner points on the checkerboard provide several quick and accurate ways to construct the device coordinate system. In addition, prior information about the corner points, such as coplanarity and uniformity, can be used to correct the position information in the camera coordinate system and thus obtain more accurate attitude information.
Optionally, correcting the original position information to obtain the corrected position information includes: projecting the at least three corner points onto a fitted optimal plane according to their original position information in the first spatial coordinate system.
Optionally, the best-fit plane is determined according to the original position information of all corner points.
This implements the coplanarity correction.
Optionally, the checkerboard includes N corner points in the same row. After the at least three corner points are corrected onto the fitted optimal plane, the method further comprises: determining a first direction according to the original position information of all corner points, the first direction indicating, in the ideal case, the direction of the straight line on which the corner points of one checkerboard row lie in the first spatial coordinate system; determining a first reference line from the first direction and a first corner point, where the first corner point is one of the N corner points and is the corner point nearest to the image center in the first spatial coordinate system; and projecting those of the N corner points not on the first reference line onto the first reference line, to obtain corrected position information of the N corner points.
Optionally, before projecting the corner points onto the first reference line, the method further comprises: determining M reference lines corresponding to the M rows of corner points on the checkerboard, and adjusting the spacing of all other adjacent reference lines according to the distance between the two reference lines closest to the image center, so that the adjusted spacing between any adjacent reference lines is the same.
This achieves the row alignment of the uniformity correction.
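The row-alignment step can be sketched in a few lines. The following is a minimal illustration, not the patent's actual implementation: it assumes the corner points have already been corrected onto the fitted plane and grouped by row, takes a common row direction fitted from all corners, and orthogonally projects each row's points onto a reference line through that row's anchor corner (the equal-spacing adjustment of the reference lines is omitted for brevity). The function name `align_rows` and its arguments are illustrative.

```python
import numpy as np

def align_rows(rows, row_dir, anchor_idx):
    """Project each row of 3D corner points onto a reference line.

    rows: list of (N, 3) arrays, one per checkerboard row.
    row_dir: vector (3,) -- common row direction fitted from all corners.
    anchor_idx: (row, col) index of the corner nearest the image centre.
    """
    row_dir = row_dir / np.linalg.norm(row_dir)
    aligned = []
    for pts in rows:
        # Each reference line passes through the row's point in the anchor
        # column and runs along the common row direction.
        origin = pts[anchor_idx[1]]
        # Orthogonal projection of every point onto that line.
        offsets = pts - origin
        aligned.append(origin + (offsets @ row_dir)[:, None] * row_dir)
    return aligned
```

After this step every row's corner points are collinear, which is what the uniformity prior of the checkerboard demands.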
Optionally, the checkerboard includes P corner points in the same column. After the at least three corner points are corrected onto the fitted optimal plane, the method further comprises: determining a second direction according to the original position information of all corner points, the second direction indicating, in the ideal case, the direction of the straight line on which the corner points of one checkerboard column lie in the first spatial coordinate system; determining a first reference column from the second direction and a second corner point, where the second corner point is one of the P corner points and is the corner point nearest to the image center in the first spatial coordinate system; and projecting those of the P corner points not on the first reference column onto the first reference column, to obtain corrected position information of the P corner points.
Optionally, before projecting the corner points onto the first reference column, the method further comprises: determining Q reference columns corresponding to the Q columns of corner points on the checkerboard, and adjusting the spacing of all other adjacent reference columns according to the distance between the two reference columns closest to the image center, so that the adjusted spacing between any adjacent reference columns is the same.
This achieves the column alignment of the uniformity correction.
Through the coplanarity correction and the uniformity correction, the spatial distribution of the corner points captured by the cameras better matches the ideal state. The correction also significantly reduces position errors caused by aberrations and similar effects, so the acquired attitude information is more accurate.
Optionally, obtaining the attitude information of the electronic device at the first moment according to the corrected position information includes: obtaining attitude information of the electronic device in a reference coordinate system according to a preset correspondence between the first spatial coordinate system and the reference coordinate system, together with the corrected position information. The reference coordinate system is a first device coordinate system or the geodetic coordinate system. The first device coordinate system is the device coordinate system corresponding to the position of the electronic device before the fall begins; the device coordinate system is fixed to the electronic device, so that when the attitude of the device changes, the directions of its axes change as well.
In this way, by determining the mapping between the camera coordinate system and the reference coordinate system in advance, the attitude information of the electronic device in the reference coordinate system can be computed even though the cameras do not capture the ground as a reference.
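To make the step from corrected corner positions to attitude concrete: one standard way to recover the rotation between matched point sets expressed in two coordinate systems is the Kabsch (SVD) algorithm. The sketch below is an illustrative assumption, not necessarily the patent's exact computation; `estimate_rotation` is a hypothetical helper.

```python
import numpy as np

def estimate_rotation(ref_pts, cam_pts):
    """Kabsch/SVD: proper rotation R with cam_pts ~ R @ ref_pts (after centring).

    ref_pts, cam_pts: (N, 3) matched corner coordinates in the reference
    coordinate system and the camera coordinate system.
    """
    p = ref_pts - ref_pts.mean(axis=0)
    q = cam_pts - cam_pts.mean(axis=0)
    u, _, vt = np.linalg.svd(q.T @ p)          # cross-covariance matrix
    d = np.sign(np.linalg.det(u @ vt))         # guard against reflections
    return u @ np.diag([1.0, 1.0, d]) @ vt
```

The returned rotation matrix (or its Euler-angle/quaternion form) is one common representation of the device's attitude at the first moment.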
In a second aspect, an attitude detection apparatus is provided that includes one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and store computer instructions. When the computer instructions are executed by the one or more processors, the apparatus performs the method provided in any design of the first aspect.
In a third aspect, there is provided a computer readable storage medium comprising computer instructions which, when executed, perform the method as provided in any one of the first aspect and its possible designs.
In a fourth aspect, a chip system is provided that includes a processor and a communication interface. The processor is configured to invoke and execute a computer program stored in a storage medium, so as to perform the method provided in any design of the first aspect.
In a fifth aspect, a computer program product is provided that comprises instructions which, when run on a computer, cause the computer to perform the method provided in any design of the first aspect.
It should be understood that the technical features of the solutions in the second to fifth aspects correspond to those of the first aspect and its possible designs, so similar beneficial effects are achieved and are not repeated here.
Drawings
FIG. 1 is a schematic diagram of a mobile phone drop scenario;
FIG. 2 is a schematic diagram of a method for detecting the drop attitude of a mobile phone;
FIG. 3 is a schematic diagram of device coordinate system calibration for a cell phone;
FIG. 4 is a logic diagram of coordinate transformation during attitude information acquisition;
fig. 5 is a schematic composition diagram of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a checkerboard arrangement provided in an embodiment of the present application;
fig. 7 is a schematic diagram of shooting and imaging during a drop simulation process of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic simulation diagram of different corner points in a camera space, which is not coplanar according to an embodiment of the present application;
fig. 9 is a schematic diagram illustrating simulation of uneven distribution of different corner points in a camera space according to an embodiment of the present application;
FIG. 10 is a schematic illustration of coplanarity correction provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a uniformity correction provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of a row calibration provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of a column calibration provided in an embodiment of the present application;
fig. 14 is a schematic flowchart of a method for detecting an attitude according to an embodiment of the present disclosure;
FIG. 15 is a schematic diagram illustrating an exemplary embodiment of a detecting device;
fig. 16 is a schematic composition diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
As electronic devices develop, more and more functions are integrated into them. As shown in fig. 1, during use an electronic device inevitably drops at times (e.g., onto a hard surface such as the ground), which can affect its normal operation.
To prevent a fall from impairing the functions of the electronic device, its attitude information during the fall can be collected and studied, and the arrangement of the components inside the device optimized on that basis. Then, even if the device is dropped, its functions are not significantly affected.
The following describes, with reference to the example of fig. 2, acquiring attitude information during a falling process of an electronic device.
As shown in fig. 2, in a scene simulating a fall of the electronic device, at least two cameras at different positions may be provided. For example, a camera 21 and a camera 22 shown in fig. 2 are provided. The cameras 21 and 22 may be high speed cameras capable of capturing instantaneous images of high speed moving objects, such as electronic devices during a fall.
The camera 21 and the camera 22 may capture the falling process of the electronic device, so as to determine the postures of the electronic device at different times based on the images obtained by capturing.
For example, consider time A during the fall of the electronic device. The cameras 21 and 22 photograph the device in the air (or in contact with the ground) at time A and acquire corresponding images: the camera 21 captures image 23 and the camera 22 captures image 24. It is understood that the poses of the cameras 21 and 22 may differ. The image 23 then indicates the attitude of the falling device as seen from the pose of the camera 21, and the image 24 indicates its attitude as seen from the pose of the camera 22.
Comprehensive analysis of the two images therefore yields the attitude information of the electronic device in three-dimensional space (e.g., the geodetic coordinate system) at time A.
Generally, to describe changes in the attitude of the electronic device accurately, a device coordinate system centered on the device may be constructed and bound to the device. For example, the XOY plane of the device coordinate system may correspond to the plane of the display screen or of the back cover. The attitude of the device can then be described precisely by computing how the device coordinate system changes during the fall.
Figure 3 shows a schematic of a device coordinate system. In this example, the device coordinate system is realized by placing at least three markers, not all on the same straight line, on the electronic device in advance.
As shown in fig. 3, a marker 31, a marker 32, and a marker 33 may be placed on the electronic device, with the line through markers 31 and 32 perpendicular to the line through markers 32 and 33. A device coordinate system can then be constructed from the three markers: for example, the plane formed by markers 31, 32, and 33 corresponds to the XOY plane. The Z-axis can be set, as needed, perpendicular to the XOY plane (i.e. the plane of the display screen), pointing either inward or outward.
It can be understood, in conjunction with fig. 2, that to describe the attitude of the electronic device accurately, the cameras must first photograph the device and convert its attitude into data in the camera coordinate system (the attitude information shown in images 23 and 24). The position data in the camera coordinate system is then converted into attitude information in the geodetic coordinate system based on the fixed relationship between the two.
That is, as shown in fig. 4, acquiring the attitude information requires the conversion chain device coordinate system - camera coordinate system - geodetic coordinate system.
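This conversion chain can be written as a product of homogeneous 4x4 transforms. The sketch below is illustrative, not from the patent: the matrices `T_cam_dev` and `T_geo_cam` are assumed to come from the checkerboard detection and a one-time camera-to-ground calibration, respectively.

```python
import numpy as np

def homogeneous(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def device_to_geodetic(T_geo_cam, T_cam_dev, pts_dev):
    """Map points from the device frame into the geodetic frame.

    T_cam_dev: device -> camera transform (from the checkerboard corners).
    T_geo_cam: camera -> geodetic transform (cameras are static, so this
               is calibrated once). Their product maps device coordinates
               straight into the geodetic frame.
    """
    pts_h = np.c_[pts_dev, np.ones(len(pts_dev))]   # homogeneous coordinates
    return (T_geo_cam @ T_cam_dev @ pts_h.T).T[:, :3]
```

Because matrix multiplication composes the two steps, the intermediate camera-frame coordinates never need to be materialized explicitly.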
Clearly, describing the attitude of the electronic device accurately requires, first, an accurate calibration of the device coordinate system and, second, an accurate conversion from the device coordinate system to the camera coordinate system.
However, in current technical solutions the device coordinate system is mostly set manually by developers, and its X and Y axes may not be exactly perpendicular, so the device coordinate system is not set accurately enough.
Furthermore, the images captured by any camera exhibit aberration, which makes the result of converting the device coordinate system into the camera coordinate system inaccurate.
Both problems can make the acquired attitude information of the electronic device inaccurate.
On this basis, to obtain more accurate attitude information of the electronic device during a fall, the embodiments of the present application provide a technical solution: the device coordinate system is set quickly and accurately by means of a checkerboard, and the result of converting the device coordinate system into the camera coordinate system is corrected using prior information about the checkerboard (such as coplanarity and uniformity), so that the attitude information during the fall is obtained quickly and accurately.
The embodiments provided in the present application will be described in detail below.
It should be noted that the technical solution provided by the embodiments of the present application can be applied to determining the attitude of an electronic device during a fall. For example, the electronic device may be a portable mobile device such as a mobile phone, a tablet computer, a personal digital assistant (PDA), an augmented reality (AR) or virtual reality (VR) device, or a media player; it may also be a wearable electronic device such as a smart watch. The embodiments of the present application do not limit the specific form of the device.
As an example, fig. 5 is a schematic hardware composition diagram of an electronic device 100 provided in an embodiment of the present application.
As shown in fig. 5, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) connector 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The components described above enable the electronic device to provide a variety of functions to the user. With the solution provided by the embodiments of the present application, the attitude information of the device during a fall can be acquired, and its analysis can guide the specific arrangement of the components shown in fig. 5 inside the device, so that a fall does not significantly affect the functions those components provide.
In the following description, an electronic device that needs to determine attitude information during a fall may be referred to as a device under test. For example, the device under test may be a mobile phone.
In the technical solution provided by the embodiments of the present application, a checkerboard can be attached to the device under test (e.g., a mobile phone), and the device coordinate system can be calibrated accurately based on it. The checkerboard includes black cells and white cells in an alternating arrangement; the black and white cells are rectangles of the same size, for example squares.
As an example, fig. 6 shows a schematic diagram of a checkerboard setting on a mobile phone.
In different implementations, the checkerboard may be attached to the display screen of the mobile phone and/or to its rear cover. When checkerboards are attached to both the display screen and the rear cover, the checkerboard cannot be fully occluded by rotation during the fall, so the attitude information can be determined more reliably.
In the example shown in fig. 6, a checkerboard of 4 x 4 square cells is attached to the display screen of the mobile phone.
The checkerboard includes a plurality of corner points, where a corner point is a vertex shared by adjacent white and black cells. For example, fig. 6 shows 9 corner points: A1-A3, B1-B3, and C1-C3. Corner points A1-A3 lie in the same row (row A), B1-B3 in row B, and C1-C3 in row C. Likewise, corner points A1, B1, and C1 lie in the 1st column, A2, B2, and C2 in the 2nd column, and A3, B3, and C3 in the 3rd column.
In practice, the corner points on the attached checkerboard can be distinguished by character (numeral) marks or other means, so that corner points in the same row or column can be identified reliably after the camera captures them.
The device coordinate system can thus be constructed from two corner points in a row and two corner points in a column. For example, corner points A1 and A2 in row A may define the X-axis, which coincides with the line of row A; corner points A1 and B1 in column 1 may define the Y-axis, which coincides with the line of column 1.
Of course, other combinations of corner points may be used to construct the X-Y coordinate system, such as any three or more corner points lying on two mutually perpendicular lines.
In conjunction with the foregoing description, when a three-dimensional device coordinate system is needed, the Z direction can be set perpendicular to the X-Y plane, pointing inward or outward as required.
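As a concrete illustration of building the device frame from corner points (the function name and arguments are illustrative, not from the patent): take one corner as the origin, its row neighbour for X, its column neighbour for Y, re-orthogonalize Y against X, and obtain Z from the cross product.

```python
import numpy as np

def device_frame(a1, a2, b1):
    """Right-handed device coordinate frame from three checkerboard corners.

    a1: origin corner; a2: next corner in the same row (X direction);
    b1: next corner in the same column (Y direction).
    Returns (origin, 3x3 matrix whose columns are the X, Y, Z unit axes).
    """
    x = a2 - a1
    x = x / np.linalg.norm(x)
    y = b1 - a1
    y = y - (y @ x) * x            # make Y exactly orthogonal to X
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)             # Z perpendicular to the screen plane
    return a1, np.column_stack([x, y, z])
```

The re-orthogonalization step is what removes the manual-marker problem mentioned earlier, where the X and Y axes may not be exactly perpendicular. Flipping the sign of `z` gives the inward-pointing variant.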
It can be understood that, in the embodiments of the present application, since the checkerboard itself provides many corner points from which a plane coordinate system with two mutually perpendicular axes can be constructed, identifying the device coordinate system from the checkerboard is both more accurate and simpler.
In addition, the checkerboard has the properties that all corner points are coplanar (coplanarity) and that the spacing between adjacent corner points is the same (uniformity). Identifying the device coordinate system with a checkerboard therefore also plays a key role in the subsequent correction in the camera coordinate system, described in detail later. In the embodiments of the present application, the coplanarity and uniformity of the checkerboard corner points may also be collectively referred to as prior information.
Referring to fig. 7 in conjunction with the description of fig. 2, after the device coordinate system is identified by the checkerboard, the dropping process of the mobile phone can be photographed by at least two cameras with different poses.
Take the camera 21 as an example. At time A, the camera 21 photographs the falling phone and acquires the corresponding image 23, which includes the checkerboard attached to the phone and its corner points. It should be noted that, in practice, the image 23 also includes the phone itself, omitted from fig. 7 for simplicity.
Similarly, by shooting with the camera 22, an image 24 including a checkerboard and its upper corner points corresponding to the corresponding pose of the camera 22 can be acquired.
Based on the images 23 and 24, the original positions of the corner points in space can be obtained by a fitting calculation.
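A common form of such a fitting calculation is linear (DLT) triangulation, given each camera's 3x4 projection matrix from calibration. The sketch below is an illustrative assumption rather than the patent's stated method:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one corner point from two views.

    P1, P2: 3x4 camera projection matrices of the two cameras.
    uv1, uv2: pixel coordinates (u, v) of the same corner in each image.
    Returns the 3D point in the shared world frame.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]            # dehomogenise
```

Running this over the matched corner points of images 23 and 24 yields the raw spatial positions that the subsequent coplanarity and uniformity corrections operate on.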
Recall the uniformity and coplanarity of the checkerboard corner points described above. Ideally, the positions of the corner points in space should exhibit the same uniformity and coplanarity. In practice, however, because of camera aberration and similar effects, the original positions of the corner points in space are neither uniform nor coplanar.
For example, fig. 8 illustrates the lack of coplanarity in the corner distribution after actual shooting. The example shows the spatial distribution of 9 corner points together with the ideal coplanar plane. The 9 corner points do not lie exactly in one plane: some lie above it (positive z direction) and some below it (negative z direction).
Fig. 9 illustrates the lack of uniformity in the corner distribution after actual shooting. The example shows the spatial distribution of 9 corner points together with a grid of the ideal uniform spacing. From the perspective of fig. 9, the distances between the 9 corner points differ from one another.
Because the corner points captured in practice lack uniformity and coplanarity, the conversion from the device coordinate system to the camera coordinate system realized through the camera images incurs a large error.
In the embodiment of the application, the corner point distribution in the image captured by the camera can be corrected based on the uniformity and coplanarity of the corner point distribution on the checkerboard. In this way, the large error introduced when converting the device coordinate system to the camera coordinate system through camera shooting is compensated, and the accuracy of the acquired mobile phone attitude information is improved.
In the embodiment of the present application, the correction of the corner point distribution may include coplanarity correction and uniformity correction.
Exemplarily, referring to fig. 10, an example of coplanarity correction provided in the embodiments of the present application is shown.
As shown in fig. 10, before coplanarity correction, the original positions of the respective corner points in space are not coplanar due to factors such as photographing aberrations of the two cameras.
In the coplanarity correction process, a fitted optimal plane can be calculated from the original positions of the corner points in space. The fitted optimal plane may be obtained by a least squares calculation. The fitted optimal plane represents the plane in which the corner points would ideally lie in space.
According to the fitted optimal plane, the original position of each corner point is projected onto it, thereby acquiring the coplanarity-corrected position information of each corner point. As the post-correction effect in fig. 10 illustrates, after coplanarity correction all corner points lie in the calculated fitted optimal plane. That is, the coplanarity correction gives all corner points the coplanarity property.
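As an illustration of how such a coplanarity correction could be carried out, the following Python/NumPy sketch fits the optimal plane by least squares (via SVD of the centered points) and projects every corner point onto it. The data and function names are hypothetical, not the patent's implementation.

```python
import numpy as np

def fit_optimal_plane(points):
    """Least-squares optimal plane through N >= 3 points.

    Returns (centroid, unit_normal): the normal is the singular vector
    of the centered points with the smallest singular value.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def coplanarity_correction(points):
    """Project each corner point onto the fitted optimal plane."""
    centroid, normal = fit_optimal_plane(points)
    signed_dist = (points - centroid) @ normal   # distance along the normal
    return points - np.outer(signed_dist, normal)

# A 3x3 corner grid perturbed out of the z = 0 plane, mimicking aberration.
grid = np.array([[x, y, 0.0] for y in range(3) for x in range(3)])
perturbed = grid.copy()
perturbed[:, 2] = [0.02, -0.01, 0.015, -0.02, 0.01, -0.015, 0.02, -0.01, 0.005]
corrected = coplanarity_correction(perturbed)
```

After the projection, refitting the plane to the corrected points yields zero signed distance for every corner point, and reapplying the correction changes nothing.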
Based on the coplanarity correction as shown in fig. 10, the positions of the corner points may be further subjected to uniformity correction, so that the corner points have uniformity characteristics.
For example, the uniformity correction in the present application may include: row alignment and column alignment. Each alignment process (e.g., row alignment or column alignment) may also include a homogenization adjustment of the spacing between different rows/columns.
After coplanarity correction, the corner points belonging to the same row are not collinear, as shown at 1101 in fig. 11. For example, corner A1, corner A2, and corner A3 are not collinear in the image shown at 1101. Corner B1, corner B2 and corner B3 are not collinear in the image shown at 1101. Corner C1, corner C2, and corner C3 are not collinear in the image shown at 1101.
There are also cases where the corner points belonging to the same column are not collinear. For example, corner A1, corner B1, and corner C1 are not collinear in the image shown at 1101. Corner A2, corner B2, and corner C2 are not collinear in the image shown at 1101.
Based on the correction scheme provided by the embodiment of the application, row alignment processing can be performed on the corner points identified as belonging to the same row, so that those corner points are aligned onto the same straight line.
Illustratively, fig. 12 shows a schematic diagram of the row alignment process. Specifically, the row alignment process may include:
1. determining the reference row direction according to all corner points on the plane;
2. for a plurality of corner points identified as belonging to the same row, determining that row's reference line according to the reference row direction and the corner point closest to the image center;
3. projecting the other corner points of the same row onto the reference line.
As an example, the direction of the rows is obtained by least squares calculation according to the coplanarity-corrected position information of the corner points A1-C3. For example, the reference row direction may be a first direction.
From the identification of the different corner points, a plurality of corner points on the same row may be determined. For example, the corner points identified as A1, A2 and A3 may be in a same row. The corner points identified as B1, B2 and B3 may be in the same line. The corner points identified as C1, C2 and C3 may be in a same row.
Then, based on the determined row direction and the corner points in any row, the reference line on which the corner points of that row should lie can be determined.
It will be appreciated that, in the camera imaging process, the closer a point is to the optical center (i.e., the closer to the image center), the smaller the aberration. Therefore, in this example, the corner point closest to the image center among the corner points in a row, together with the above reference row direction, is used to determine that row's reference line.
In the following example, the center of the image is coincident with the position of the corner point B2.
For the corner points A1, A2 and A3, the corner point A2 is closest to the image center. Then, the line in the first direction passing through the corner point A2 (e.g., the reference line 1201) is the row alignment target line for the corner points A1, A2 and A3.
Therefore, the row alignment of the corner points A1, A2 and A3 can be realized by projecting the corner points A1 and A3 onto the reference line 1201.
Similarly, for the corner points B1, B2 and B3, the corner point B2 is closest to the image center. Then, the line in the first direction passing through the corner point B2 (e.g., the reference line 1202) is the row alignment target line for the corner points B1, B2 and B3.
Therefore, the row alignment of the corner points B1, B2 and B3 can be realized by projecting the corner points B1 and B3 onto the reference line 1202.
For the corner points C1, C2 and C3, the corner point C2 is closest to the image center. Then, the line in the first direction passing through the corner point C2 (e.g., the reference line 1203) is the row alignment target line for the corner points C1, C2 and C3.
Therefore, the row alignment of the corner points C1, C2 and C3 can be realized by projecting the corner points C1 and C3 onto the reference line 1203.
It should be noted that after the ideal alignment of the rows, the distance between the rows should be uniform.
Therefore, in other embodiments of the present application, after the reference lines of the rows are determined according to the row alignment scheme, a homogenization process may be applied to the reference lines before the projection of the corresponding corner points is performed.
For example, the distance between the two reference lines closest to the image center may be taken as the target adjustment distance, and the positions of the other reference lines adjusted so that the spacing between reference lines is uniform.
As an example, in the example shown in fig. 12, the image center coincides with the corner point B2, and the determined reference lines 1201-1203, ordered from nearest to farthest from the image center, are: reference line 1202, reference line 1203, reference line 1201. Therefore, the position of the reference line 1201 can be adjusted using the distance between the reference line 1202 and the reference line 1203 as the target adjustment distance.
For example, the distance between the reference line 1201 and the reference line 1202 is adjusted to the above target adjustment distance, thereby obtaining the reference line 1204 as the adjusted alignment target line for the corner points A1, A2 and A3.
Thus, the alignment of the lines of the corner point A1, the corner point A2, and the corner point A3 can be realized by projecting the corner point A1 and the corner point A3 to the reference line 1204.
In this way, row alignment of all corner points can be realized. In the example shown in fig. 12, after the row alignment process is completed, the corner points in the same row are aligned to the corresponding reference lines, and the spacing between the reference lines has been adjusted to be uniform, i.e., the effect shown at 1102 in fig. 11 is obtained.
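The three row-alignment steps above could be sketched as follows in Python/NumPy. This is a simplified illustration under stated assumptions: the corner points are already coplanarity-corrected and expressed in 2-D plane coordinates, the row labels come from the corner identifications, and the spacing homogenization between rows is omitted for brevity.

```python
import numpy as np

def row_align(points, row_ids, image_center):
    """Row alignment sketch: snap each row's corner points onto a reference
    line through the row's corner point nearest the image center.

    points: (N, 2) coplanarity-corrected corner positions in the plane.
    row_ids: length-N labels saying which row each corner belongs to.
    """
    # 1. Reference row direction: dominant direction of within-row offsets.
    offsets = np.vstack([points[row_ids == r] - points[row_ids == r].mean(axis=0)
                         for r in np.unique(row_ids)])
    direction = np.linalg.svd(offsets)[2][0]          # unit row direction

    aligned = points.copy()
    for r in np.unique(row_ids):
        idx = np.flatnonzero(row_ids == r)
        # 2. Anchor on the corner closest to the image center (least aberration).
        dists = np.linalg.norm(points[idx] - image_center, axis=1)
        anchor = points[idx[np.argmin(dists)]]
        # 3. Project the row's corners onto the line through the anchor.
        rel = points[idx] - anchor
        aligned[idx] = anchor + np.outer(rel @ direction, direction)
    return aligned, direction

# Hypothetical 3x3 corner grid with small vertical jitter; B2 = corners[4].
corners = np.array([[0.0, 0.02], [1.0, -0.01], [2.0, 0.01],
                    [0.0, 1.015], [1.0, 0.99], [2.0, 1.0],
                    [0.0, 2.0],  [1.0, 2.01], [2.0, 1.98]])
row_ids = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
aligned, direction = row_align(corners, row_ids, image_center=corners[4])
```

After the call, the corner points of each row share the same coordinate along the direction perpendicular to the reference row direction, i.e., each row is collinear.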
After the row alignment is completed, column alignment can be performed on each corner point.
The process of column alignment is similar to the process of row alignment. Illustratively, the column alignment process may specifically include:
A. determining the reference column direction according to all corner points on the plane;
B. for a plurality of corner points identified as belonging to the same column, determining that column's reference column according to the reference column direction and the corner point closest to the image center;
C. projecting the other corner points of the same column onto the reference column.
As an example, the description of the column alignment process is provided in connection with fig. 13.
The direction of the columns is obtained by least squares calculation according to the coplanarity-corrected position information of the corner points A1-C3. For example, the reference column direction may be a second direction. In some implementations, the second direction may be perpendicular to the first direction.
From the identification of the different corner points, a plurality of corner points on the same column may be determined. For example, the corner points identified as A1, B1, and C1 may be in a column. The corner points identified as A2, B2 and C2 may be in the same column. The corner points identified as A3, B3 and C3 may be in a column.
Then, based on the determined directions of the respective columns and the corner points in any column, the reference position of the column direction where the corner point of the column should be located can be determined.
It will be appreciated that, in the camera imaging process, the closer a point is to the optical center (i.e., the closer to the image center), the smaller the aberration. Therefore, in this example, the corner point closest to the image center among the corner points in a column, together with the above reference column direction, is used to determine that column's reference column.
In the following example, the center of the image is coincident with the position of the corner point B2.
For the corner points A1, B1 and C1, the corner point B1 is closest to the image center. Then, the line in the second direction passing through the corner point B1 (e.g., the reference column 1301) is the column alignment target line for the corner points A1, B1 and C1.
Therefore, the column alignment of the corner points A1, B1 and C1 can be realized by projecting the corner points A1 and C1 onto the reference column 1301.
Similarly, for the corner points A2, B2 and C2, the corner point B2 is closest to the image center. Then, the line in the second direction passing through the corner point B2 (e.g., the reference column 1302) is the column alignment target line for the corner points A2, B2 and C2.
Therefore, the column alignment of the corner points A2, B2 and C2 can be realized by projecting the corner points A2 and C2 onto the reference column 1302.
For corner A3, corner B3 and corner C3. In this example, the three corner points are already on the reference column 1303, so no further column alignment is required.
It should be understood that, in other embodiments of the present application, the column alignment process may include a homogenization step, as illustrated for row alignment: after the reference column of each column is determined according to the column alignment scheme, a homogenization process may be applied to the reference columns before the projection of the corresponding corner points is performed. The two processes are handled in the same way and are not described again here.
Thus, after column alignment shown in fig. 13 is performed on 1102 in fig. 11, uniformity correction for each corner point already in the same plane can be completed, so as to obtain the effect shown in 1103 in fig. 11.
Through the exemplary illustrations of fig. 10 to fig. 13, the coplanarity correction and uniformity correction provided by the present application can compensate the spatial positions of the corner points acquired from the camera images, bringing them closer to the imaging of an ideal checkerboard.
Therefore, combining the coplanarity correction and uniformity correction scheme with the attitude acquisition scheme during the dropping of the mobile phone as shown in fig. 2 can overcome the error in the conversion from the device coordinate system to the camera coordinate system caused by camera imaging errors, so that the acquired attitude information of the mobile phone is more accurate.
In the following, a specific implementation of the attitude detection method during the dropping of the mobile phone provided by the embodiment of the present application is illustrated with reference to the above descriptions of the schemes in fig. 6 to fig. 13.
Fig. 14 is a schematic flowchart of an attitude detection method according to an embodiment of the present application. For example, the attitude information of the mobile phone during the fall is detected by two cameras with different poses as shown in fig. 2. As shown in fig. 14, the method may include:
S1401: calibrating internal and external parameters of the first camera and the second camera.
The camera internal parameters (intrinsics matrix) correspond to the camera itself and can be used to describe the relationship between the actual position of an object and its projected position on the imaging plane. Once the camera is manufactured, its internal parameters do not change with the position and attitude of the camera.
The camera external parameters (extrinsics) are related to the position and attitude of the camera during shooting.
In the embodiment of the application, the calibration of the first camera and the second camera can be performed after the environment construction is completed and before the attitude information of the mobile phone in the dropping process is detected. For example, based on Zhang's calibration method, the calibration board is set to different poses, and about 15 pictures are collected by each of the first camera and the second camera and numbered sequentially, where the images shot by the first camera and the second camera with the same number are images collected at the same time. Thus, based on the 15 sets of images, the internal reference matrix K of the first camera and the external reference matrix of the second camera relative to the first camera (i.e., the pose of the second camera relative to the first camera) can be calculated and acquired. The external reference matrix of the second camera relative to the first camera may include: a rotation matrix R of the second camera relative to the first camera, and a translation matrix T of the second camera relative to the first camera.
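The meaning of the calibrated external parameters can be illustrated with a small NumPy sketch using synthetic values (the R and T below are made up for illustration and are not calibration output): a point expressed in the first camera's coordinate system maps into the second camera's coordinate system through R and T.

```python
import numpy as np

# Hypothetical extrinsics: the second camera is rotated 30 degrees about the
# vertical axis and shifted 0.5 m along x, relative to the first camera.
a = np.deg2rad(30.0)
R = np.array([[np.cos(a), 0.0, np.sin(a)],
              [0.0,       1.0, 0.0      ],
              [-np.sin(a), 0.0, np.cos(a)]])
T = np.array([0.5, 0.0, 0.0])

def cam1_to_cam2(p):
    """Express a point given in camera-1 coordinates in camera-2 coordinates."""
    return R @ p + T

def cam2_to_cam1(p):
    """Inverse transform: R is orthonormal, so its inverse is its transpose."""
    return R.T @ (p - T)

p1 = np.array([0.1, -0.2, 2.0])     # a corner point, 2 m in front of camera 1
p2 = cam1_to_cam2(p1)
```

The round trip through both transforms recovers the original point, which is the defining property of the rotation/translation pair.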
S1402: arranging a checkerboard comprising at least three corner points on the surface of the mobile phone.
In conjunction with the description of fig. 6, before the mobile phone starts to fall off, a checkerboard including at least three corner points may be attached to the display and/or the rear case.
In some embodiments, the corner points on the checkerboard may be given distinguishing identifications through characters (numbers) or other means; according to the identifications of the different corner points, it can be determined which corner points belong to the same row and which belong to the same column.
In this example, the checkerboard setting is completed, and the device coordinate system of the mobile phone can be correspondingly constructed.
For example, at least two corner points in the same row and at least two corner points in the same column of the checkerboard are selected according to the identifications. The X axis is formed by the straight line on which the at least two corner points of the same row lie, and the Y axis by the straight line on which the at least two corner points of the same column lie. Based on the determined X axis and Y axis, the Z axis of the device coordinate system can be set; for example, the positive direction of the Z axis is determined according to the right-hand coordinate system convention. The X, Y and Z axes intersect at point O.
This completes the construction of the device coordinate system of the handset (denoted as (O_m - X_m Y_m Z_m)). During the motion of the mobile phone, the origin and the directions of the coordinate axes of this coordinate system remain fixed to the mobile phone.
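A minimal sketch of constructing such a right-handed device frame from corner points might look as follows. The corner coordinates are hypothetical, and a Gram-Schmidt step is added because measured row and column directions are generally not exactly perpendicular.

```python
import numpy as np

def build_device_frame(row_pts, col_pts, origin):
    """Construct a right-handed device coordinate frame from checkerboard
    corner points (a sketch; assumes two points per direction).

    X axis: along the line through two same-row corner points.
    Y axis: along the line through two same-column corner points,
            re-orthogonalized against X.
    Z axis: X x Y, per the right-hand rule.
    """
    x = row_pts[1] - row_pts[0]
    x = x / np.linalg.norm(x)
    y = col_pts[1] - col_pts[0]
    y = y - (y @ x) * x          # Gram-Schmidt: remove the X component
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)           # unit length, since x and y are orthonormal
    return origin, np.stack([x, y, z])

# Hypothetical corner positions in mm, with slight measurement skew.
origin, axes = build_device_frame(
    row_pts=np.array([[0.0, 0.0, 0.0], [30.0, 0.5, 0.0]]),
    col_pts=np.array([[0.0, 0.0, 0.0], [0.3, 30.0, 0.0]]),
    origin=np.zeros(3))
```

The resulting axis matrix is orthonormal with determinant +1, i.e., a proper right-handed rotation.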
S1403: acquiring the first image and the second image in the dropping process of the mobile phone. The first image is acquired by the first camera shooting the mobile phone at a first moment. The second image is acquired by the second camera shooting the mobile phone at the first moment.
Illustratively, the first time is taken as time a. The first camera and the second camera can continuously track the mobile phone to shoot in the process that the mobile phone is simulated to fall. Then, at the time a, the image captured by the first camera is the first image, and the image captured by the second camera is the second image.
In other embodiments of the application, when pose information of a mobile phone at multiple moments in a dropping process needs to be detected, the first camera and the second camera can shoot the mobile phone at multiple moments simultaneously.
Therefore, multiple groups of first images and second images of the checkerboard including the corner points, which are shot by the two cameras at different times, can be obtained.
In some implementations, after the first and second images are acquired, the first and second images may be pre-processed. The preprocessing may include smoothing. For example, the first image and the second image are subjected to gaussian filtering processing, so that shooting noise in the first image and shooting noise in the second image are removed preliminarily, and the accuracy of subsequent mobile phone attitude information calculation is improved.
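As a rough illustration of the smoothing step, a separable Gaussian filter can be written in plain NumPy. This is a pedagogical sketch; a real pipeline would more likely call a library routine such as an OpenCV Gaussian blur.

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """Discrete 1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def gaussian_smooth(image, sigma=1.0):
    """Separable Gaussian filtering: filter along rows, then along columns."""
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    pad = len(k) // 2
    padded = np.pad(image, pad, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

# Synthetic noisy image standing in for a captured frame.
noisy = np.random.default_rng(0).normal(128.0, 20.0, size=(32, 32))
smooth = gaussian_smooth(noisy, sigma=1.0)
```

Smoothing preserves the image size and reduces the pixel-level noise, which is the point of the preprocessing step.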
S1404: acquiring original position information of each corner point of the checkerboard on the mobile phone at the first moment according to the first image and the second image.
In this example, the position information of the corner points in the two-dimensional coordinates corresponding to the pose of any camera may be acquired based on a preset image processing mechanism according to the acquired image captured by the camera.
Illustratively, the preset image processing mechanism may include: binarizing the image using a local-average adaptive thresholding method; applying image dilation to the white pixel regions, so that the black quadrilateral regions shrink and the connections between them are separated; performing quadrilateral detection, computing the convex hull of each contour and judging whether it has exactly four vertices; removing interfering contours using constraints such as aspect ratio, perimeter and area, so as to screen out the black quadrilateral region contours; and sorting the valid quadrilaterals according to the known number of corner points, taking the two diagonally opposite vertices of two adjacent quadrilaterals and using the midpoint of the line connecting them as the corner point. In this way, the position information of each corner point under the corresponding camera pose can be obtained.
In some embodiments, on the basis of the corner point information, a sub-pixel level optimization process may be further performed, so as to improve the accuracy of detecting the two-dimensional pixel coordinate position of the checkerboard corner points.
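The last step of the pipeline — taking the midpoint between the two nearest diagonally opposite vertices of two adjacent quadrilaterals — can be shown in isolation with hypothetical quadrilaterals (the earlier thresholding and contour-detection steps are assumed to have produced them):

```python
import numpy as np

def corner_from_quads(quad_a, quad_b):
    """Estimate a checkerboard corner as the midpoint of the segment joining
    the two nearest vertices of two diagonally adjacent quadrilaterals.

    quad_a, quad_b: (4, 2) vertex arrays of the detected black quads.
    """
    # Pairwise distances between all vertices of the two quads.
    d = np.linalg.norm(quad_a[:, None, :] - quad_b[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    return (quad_a[i] + quad_b[j]) / 2.0

# Two dilation-shrunken black squares that diagonally meet near (10, 10).
quad_a = np.array([[2.0, 2.0], [9.0, 2.0], [9.0, 9.0], [2.0, 9.0]])
quad_b = np.array([[11.0, 11.0], [18.0, 11.0], [18.0, 18.0], [11.0, 18.0]])
corner = corner_from_quads(quad_a, quad_b)
```

Because the dilation step shrinks each black square symmetrically, the midpoint of the gap recovers the original shared checkerboard corner.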
In this example, for the first image of the first camera, the above-mentioned preset image processing mechanism is applied, and the first pixel coordinate position information of each corner point on the first image can be obtained. Similarly, for the second image of the second camera, the preset image processing mechanism is applied to obtain the second pixel coordinate position information of each corner point on the second image.
Based on the first pixel coordinate position information and the second pixel coordinate position information, the corresponding overdetermined equations are solved according to the pinhole camera model and triangulation, obtaining the position information of each corner point in space.
It will be appreciated that the process may be derived with reference to internal and external references of different cameras. That is, the position information of each corner point is the position information in the geodetic coordinate system.
As an example, for the shooting result of the first camera, the first pixel coordinate position information of corner point i may be denoted (x_0i, y_0i), where i is a positive integer used to identify different corner points. The coordinates of the same corner point in the camera coordinate system of the first camera may be denoted (X_icl, Y_icl, Z_icl).
Thus, the relationship between (x_0i, y_0i) and (X_icl, Y_icl, Z_icl) may be as shown in the following formula (1).
Formula (1):
Z_icl · (x_0i, y_0i, 1)^T = K · (X_icl, Y_icl, Z_icl)^T
wherein K is the internal reference matrix of the first camera.
Similarly, for the shooting result of the second camera, the second pixel coordinate position information of corner point i may be denoted (x_1i, y_1i), where i is a positive integer used to identify different corner points. The same corner point is still expressed in the camera coordinate system of the first camera as (X_icl, Y_icl, Z_icl).
Thus, the relationship between (x_1i, y_1i) and (X_icl, Y_icl, Z_icl) may be as shown in the following formula (2).
Formula (2):
Z'_icl · (x_1i, y_1i, 1)^T = K' · (R · (X_icl, Y_icl, Z_icl)^T + T)
wherein K' is the internal reference matrix of the second camera, Z'_icl is the depth of the corner point in the camera coordinate system of the second camera, R is the rotation matrix of the second camera relative to the first camera, and T is the translation matrix of the second camera relative to the first camera.
Based on formula (1) and formula (2), the spatial coordinate information of each corner point in the first camera coordinate system can be calculated and acquired through the following formula (3). Eliminating the depths from formula (1) and formula (2) yields four linear equations in (X_icl, Y_icl, Z_icl), which can be stacked as A · (X_icl, Y_icl, Z_icl)^T = b and solved in the least-squares sense.
Formula (3):
(X_icl, Y_icl, Z_icl)^T = (A^T A)^(-1) A^T b
wherein A is the 4×3 coefficient matrix and b is the 4×1 vector obtained by stacking, for each camera, the two equations x · p_3 − p_1 and y · p_3 − p_2 applied to the point, with p_k denoting the k-th row of that camera's projection matrix (K · [I | 0] for the first camera and K' · [R | T] for the second) and (x, y) the corresponding pixel coordinates.
thus, the original location information at the first time, i.e., (X) can be obtained icl ,Y icl ,Z icl ). It will be appreciated that the raw position information may be position information based on the camera space of the first camera.
S1405: carrying out coplanarity correction and uniformity correction on the original position information of each corner point at the first moment, and acquiring the corrected position information of each corner point at the first moment.
For example, the specific implementation of the coplanarity correction and the uniformity correction in this step may refer to the descriptions of fig. 10 to fig. 13 in the foregoing examples, and details are not repeated here.
S1406: determining the pose information of the mobile phone at the first moment according to the corrected position information of each corner point at the first moment. It is to be understood that the corrected position information acquired after the correction in S1405 may correspond to the position information of each corner point in the camera space of the first camera at the first moment.
In this example, the corrected position information at the first time may be converted into position information in a reference coordinate system (e.g., a geodetic coordinate system) according to a relationship between a camera space of the first camera and the reference coordinate system.
For example, the relationship between the camera space of the first camera and the reference coordinate system may be acquired in advance. As an example, the reference coordinate system is the coordinate system established at the position (denoted as the zero position) where the mobile phone has not yet started the simulated fall.
The relationship between the camera space of the first camera and the reference coordinate system may be obtained as follows:
Determine the zero-position attitude of the mobile phone, and take the device coordinate system of the mobile phone at that position as the reference coordinate system (O_0 - X_0 Y_0 Z_0). Capture images and solve the unit corner points P_j0 (X_j0, Y_j0, Z_j0) in the three coordinate axis directions of the reference coordinate system (O_0 - X_0 Y_0 Z_0). Exemplarily, j = 1, 2, 3 correspond to three attitude solution key points, respectively.
Obtain the coordinates (X_jcl, Y_jcl, Z_jcl) of the key points in the first camera coordinate system (O_cl - X_cl Y_cl Z_cl). After the attitude of the mobile phone changes, capture images and solve the spatial coordinates P_jm (X_jm, Y_jm, Z_jm) of the three attitude solution key points in the device coordinate system of the mobile phone. According to the relationship P_jm = R_0m × P_j0 + T_0m between the same point in the reference coordinate system and the device coordinate system, the equation is solved to obtain the attitude information of the mobile phone. In this example, the attitude information of the mobile phone may include the rotation matrix R_0m of the device coordinate system of the handset relative to the reference coordinate system, wherein T_0m is the translation matrix of the mobile phone body coordinate system relative to the reference coordinate system.
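One standard way to solve P_jm = R_0m × P_j0 + T_0m from matched key points is the SVD-based Kabsch (orthogonal Procrustes) method; it is named here as a stand-in solver, since the text does not specify one, and the rotation/translation values below are synthetic.

```python
import numpy as np

def solve_pose(P0, Pm):
    """Solve Pm = R0m @ P0 + T0m from matched key points (Kabsch method).

    P0, Pm: (N, 3) coordinates of the same points in the reference frame
    and after the attitude change. Returns (R0m, T0m).
    """
    c0, cm = P0.mean(axis=0), Pm.mean(axis=0)
    H = (P0 - c0).T @ (Pm - cm)               # cross-covariance of the points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # keep det=+1
    R = Vt.T @ D @ U.T
    return R, cm - R @ c0

# Synthetic check: key points rotated 40 degrees about z and translated.
a = np.deg2rad(40.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
T_true = np.array([0.1, -0.2, 0.3])
P0 = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0], [0.0, 0.0, 0.0]])
Pm = P0 @ R_true.T + T_true        # apply P_jm = R_true @ P_j0 + T_true row-wise
R_est, T_est = solve_pose(P0, Pm)
```

With exact correspondences the recovered rotation and translation match the ground truth; with noisy points the same formula gives the least-squares optimal rigid fit.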
It can be understood that, by way of the above example, the acquired attitude information of the mobile phone can be identified by the rotation matrix R_0m relative to the reference coordinate system.
In other embodiments of the present application, the rotation matrix R_0m relative to the reference coordinate system may also be converted into Euler angles in the geodetic coordinate system for identification.
Illustratively, according to a ZXY rotation sequence (first rotating φ about the Z axis, then rotating ψ about the X axis, and finally rotating θ about the Y axis, i.e., R_0m = R_z(φ) · R_x(ψ) · R_y(θ)), the relationship between the elements r_ij of the rotation matrix and the Euler angles is shown in the following formula (4) - formula (6).
Formula (4):
φ = arctan(−r_12 / r_22)
Formula (5):
ψ = arcsin(r_32)
Formula (6):
θ = arctan(−r_31 / r_33)
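Assuming the ZXY order denotes the intrinsic sequence R = Rz(φ) · Rx(ψ) · Ry(θ) (one possible reading of the rotation convention; other conventions change the element indices), the Euler-angle extraction can be checked numerically:

```python
import numpy as np

def rot_zxy(phi, psi, theta):
    """R = Rz(phi) @ Rx(psi) @ Ry(theta): rotate about Z by phi,
    then about X by psi, finally about Y by theta."""
    cz, sz = np.cos(phi), np.sin(phi)
    cx, sx = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    return Rz @ Rx @ Ry

def euler_zxy(R):
    """Extract (phi, psi, theta); r_ij in the comments are 1-indexed."""
    phi = np.arctan2(-R[0, 1], R[1, 1])    # phi   = arctan(-r12 / r22)
    psi = np.arcsin(R[2, 1])               # psi   = arcsin(r32)
    theta = np.arctan2(-R[2, 0], R[2, 2])  # theta = arctan(-r31 / r33)
    return phi, psi, theta

angles = (0.3, -0.2, 0.5)
recovered = euler_zxy(rot_zxy(*angles))
```

Away from the gimbal-lock case (ψ = ±90°), building the rotation from three angles and extracting them again reproduces the inputs.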
thus, with the scheme shown in fig. 14, the attitude information at any time during the dropping process of the mobile phone can be detected.
In the above scheme, fast and accurate identification of the device coordinate system is realized through the checkerboard calibration on the mobile phone, and accurate conversion from the device coordinate system to the camera coordinate system is realized through the correction based on prior information. Through the relationship between the camera space of the first camera and the reference coordinate system determined in advance in S1406, the attitude information based on the reference coordinate system can be obtained smoothly even if the captured images contain no ground reference.
It is understood that the implementation logic of the above scheme may be preset in the corresponding detection device or detection apparatus. In this way, with the detection device or the detection apparatus as an execution body, the posture detection of the electronic device including the mobile phone in the falling process can be performed according to the posture detection scheme provided in the above embodiments.
The scheme provided by the embodiment of the application is mainly introduced from the perspective of the detection device. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed in hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
Fig. 15 is a schematic diagram of a detection apparatus 1500. As shown in fig. 15, the detection apparatus 1500 may include: a processor 1501 and memory 1502. The memory 1502 is used to store computer-executable instructions. For example, in some embodiments, the processor 1501, when executing instructions stored in the memory 1502, can cause the detection apparatus 1500 to perform any of the techniques described in any of the above embodiments.
It should be noted that all relevant contents of each step related to the method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
Fig. 16 shows a schematic diagram of a chip system 1600. The chip system 1600 may include: a processor 1601 and a communication interface 1602 for supporting a detection device or detection apparatus to implement the functions referred to in the above embodiments. In one possible design, the system-on-chip further includes a memory for storing necessary program instructions and data. The chip system may be constituted by a chip, or may include a chip and other discrete devices. It should be noted that, in some implementations of the present application, the communication interface 1602 may also be referred to as an interface circuit.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The functions or actions or operations or steps, etc., in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions described in accordance with the embodiments of the present application are all or partially generated upon loading and execution of computer program instructions on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wire (e.g., coaxial cable, fiber optic, digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or can comprise one or more data storage devices, such as servers, data centers, etc., that can be integrated with the medium. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid State Disk (SSD)), among others.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations may be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to include such modifications and variations.

Claims (14)

1. A method for detecting the posture of an electronic device, characterized in that the method is applied to detecting posture information of the electronic device during a fall; at a first moment during the fall of the electronic device, the electronic device is photographed by a first camera to obtain a first image and photographed by a second camera to obtain a second image, the poses of the first camera and the second camera being different; the method comprises:
determining original position information of the electronic device in a first spatial coordinate system according to the first image and the second image, wherein the first spatial coordinate system is the camera coordinate system of the first camera;
performing correction processing on the original position information to obtain corrected position information, wherein the correction processing is used to correct errors in the original position information introduced by aberrations of the first camera and the second camera; and
acquiring posture information of the electronic device at the first moment according to the corrected position information.
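Claim 1 determines 3D position information from two images taken by cameras with different poses, i.e. a stereo setup. The patent itself does not give an implementation; outside the patent text, the standard way to recover a 3D point from two calibrated views is linear (DLT) triangulation. The sketch below (Python with NumPy; all names are ours, and the projection matrices are assumed to be pre-calibrated) illustrates the idea:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one corner point seen in two views.

    P1, P2 : 3x4 projection matrices of the first and second cameras,
             expressed in the first camera's coordinate system (the
             "first spatial coordinate system" of claim 1).
    x1, x2 : pixel coordinates (u, v) of the same corner in each image.
    Returns the 3D point in the first camera's coordinate system.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector belonging
    # to the smallest singular value of A.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Repeating this for every checkerboard corner yields the "original position information" that the subsequent claims correct.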
2. The method of claim 1, wherein a checkerboard is provided on at least one face of the electronic device, the checkerboard comprising at least three corner points.
3. The method of claim 2, wherein the first image comprises image information of the at least three corner points corresponding to a pose of the first camera;
the second image comprises image information of the at least three corner points corresponding to the pose of the second camera.
4. The method of claim 3, wherein the determining original position information of the electronic device in the first spatial coordinate system according to the first image and the second image comprises:
determining original position information of the at least three corner points in the first spatial coordinate system according to the image information of the at least three corner points included in the first image and the second image.
5. The method of claim 4, wherein the performing correction processing on the original position information to obtain corrected position information comprises:
correcting the at least three corner points onto a best-fit plane according to the original position information of the at least three corner points in the first spatial coordinate system.
6. The method of claim 5, wherein the best-fit plane is determined from the original position information of all corner points.
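Claims 5 and 6 correct the corner points onto a best-fit plane determined from the original positions of all corners. A common reading of this step, sketched below under our own naming, fits the least-squares plane via SVD and orthogonally projects each corner onto it:

```python
import numpy as np

def correct_to_best_fit_plane(points):
    """Project raw corner coordinates onto their least-squares plane.

    points : (N, 3) array of original corner positions in the first
             camera's coordinate system.
    The plane is fitted to all corners (cf. claim 6) and each corner
    is replaced by its orthogonal projection onto that plane.
    """
    centroid = points.mean(axis=0)
    # The plane normal is the right singular vector belonging to the
    # smallest singular value of the centered point cloud.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    # Signed distance of every corner from the plane, then remove it.
    d = (points - centroid) @ normal
    return points - np.outer(d, normal)
```

After this correction all corners are exactly coplanar, which removes the out-of-plane scatter that lens aberrations introduce into the triangulated positions.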
7. The method according to claim 5 or 6, wherein the checkerboard comprises N corner points in the same row;
after the correcting the at least three corner points onto a best-fit plane according to the original position information of the at least three corner points in the first spatial coordinate system, the method further comprises:
determining a first direction according to the original position information of all corner points, wherein the first direction indicates the direction, in the first spatial coordinate system, of the straight line on which the corner points in the same row of the checkerboard would ideally lie;
determining a first reference line according to the first direction and a first corner point, wherein the first corner point is one of the N corner points and is the corner point, among the N corner points, closest to the center of the image in the first spatial coordinate system; and
projecting the corner points of the N corner points that are not on the first reference line onto the first reference line, so as to acquire corrected position information of the N corner points.
8. The method according to claim 7, wherein before the projecting the corner points of the N corner points that are not on the first reference line onto the first reference line, the method further comprises:
determining M reference lines respectively corresponding to the M rows of corner points on the checkerboard; and
adjusting the distances between the other adjacent reference lines according to the distance between the two reference lines closest to the center of the image, wherein the adjusted distances between any adjacent reference lines are the same.
9. The method according to any one of claims 5-8, wherein the checkerboard comprises P corner points in the same column;
after the correcting the at least three corner points onto a best-fit plane according to the original position information of the at least three corner points in the first spatial coordinate system, the method further comprises:
determining a second direction according to the original position information of all corner points, wherein the second direction indicates the direction, in the first spatial coordinate system, of the straight line on which the corner points in the same column of the checkerboard would ideally lie;
determining a first reference column according to the second direction and a second corner point, wherein the second corner point is one of the P corner points and is the corner point, among the P corner points, closest to the center of the image in the first spatial coordinate system; and
projecting the corner points of the P corner points that are not on the first reference column onto the first reference column, so as to acquire corrected position information of the P corner points.
10. The method according to claim 9, wherein before the projecting the corner points of the P corner points that are not on the first reference column onto the first reference column, the method further comprises:
determining Q reference columns respectively corresponding to the Q columns of corner points on the checkerboard; and
adjusting the distances between the other adjacent reference columns according to the distance between the two reference columns closest to the center of the image, wherein the adjusted distances between any adjacent reference columns are the same.
11. The method according to any one of claims 1-10, wherein the acquiring posture information of the electronic device at the first moment according to the corrected position information comprises:
acquiring posture information of the electronic device in a reference coordinate system according to a preset correspondence between the first spatial coordinate system and the reference coordinate system, and according to the corrected position information;
the reference coordinate system includes: a first device coordinate system, or a geodetic coordinate system;
the first device coordinate system is the device coordinate system corresponding to the position of the electronic device before the fall begins; the device coordinate system is fixed on the electronic device, and when the electronic device is in different postures, the axial directions of the device coordinate system are not all the same.
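Claim 11 maps the corrected positions from the first camera's coordinate system to a reference coordinate system via a preset correspondence. If that correspondence is a pre-calibrated rigid transform (rotation R and translation t; this parameterization is our assumption, not stated in the claim), the mapping can be sketched as:

```python
import numpy as np

def camera_to_reference(points_cam, R, t):
    """Map corrected corner positions from the first camera's coordinate
    system into a reference coordinate system (e.g. the device frame
    before the fall, or a geodetic frame): p_ref = R @ p_cam + t.

    R (3x3 rotation) and t (3-vector translation) stand in for the
    "preset correspondence" between the two coordinate systems.
    """
    return points_cam @ R.T + t
```

The device's posture at the first moment can then be read off from the orientation of the transformed checkerboard corners in the reference frame.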
12. An attitude detection apparatus, characterized in that the apparatus comprises one or more processors and one or more memories; the one or more memories are coupled to the one or more processors and store computer instructions;
the computer instructions, when executed by the one or more processors, cause the apparatus to perform the method of any of claims 1-11.
13. A computer-readable storage medium, comprising computer instructions which, when executed, cause the method according to any one of claims 1-11 to be performed.
14. A chip system, characterized in that the chip system comprises a processor and a communication interface; the processor is configured to invoke and run a computer program stored in a storage medium, so as to perform the method according to any one of claims 1-11.
CN202211185821.8A 2022-09-27 2022-09-27 Gesture detection method and device of electronic equipment Active CN115937321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211185821.8A CN115937321B (en) 2022-09-27 2022-09-27 Gesture detection method and device of electronic equipment

Publications (2)

Publication Number Publication Date
CN115937321A true CN115937321A (en) 2023-04-07
CN115937321B CN115937321B (en) 2023-09-22

Family

ID=86699475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211185821.8A Active CN115937321B (en) 2022-09-27 2022-09-27 Gesture detection method and device of electronic equipment

Country Status (1)

Country Link
CN (1) CN115937321B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112446896A (en) * 2021-02-01 2021-03-05 苏州澳昆智能机器人技术有限公司 Conveying material falling monitoring method, device and system based on image recognition
CN112785646A (en) * 2021-01-26 2021-05-11 联想(北京)有限公司 Landing pose determining method and electronic equipment
CN112800860A (en) * 2021-01-08 2021-05-14 中电海康集团有限公司 Event camera and visual camera cooperative high-speed scattered object detection method and system
CN113542575A (en) * 2020-04-15 2021-10-22 荣耀终端有限公司 Device pose adjusting method, image shooting method and electronic device
CN113536892A (en) * 2021-05-13 2021-10-22 泰康保险集团股份有限公司 Gesture recognition method and device, readable storage medium and electronic equipment
US20220215573A1 (en) * 2019-05-13 2022-07-07 Changsha Intelligent Driving Institute Corp. Ltd Camera pose information detection method and apparatus, and corresponding intelligent driving device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PANG HONGFENG ET AL.: "A new misalignment calibration method of portable geomagnetic field vector measurement system", Measurement *
WU LIYUAN: "Research on digital photogrammetry technology for instantaneous deformation monitoring of aircraft structures", CNKI (知网) *

Also Published As

Publication number Publication date
CN115937321B (en) 2023-09-22

Similar Documents

Publication Publication Date Title
CN110458898B (en) Camera calibration board, calibration data acquisition method, distortion correction method and device
CN104006825B (en) The system and method that machine vision camera is calibrated along at least three discontinuous plans
JP2016167229A (en) Coordinate transformation parameter determination device, coordinate transformation parameter determination method, and computer program for coordinate transformation parameter determination
CN108805938B (en) Detection method of optical anti-shake module, mobile terminal and storage medium
CN109920004B (en) Image processing method, device, calibration object combination, terminal equipment and calibration system
CN109920003B (en) Camera calibration detection method, device and equipment
CN106534665A (en) Image display device and image display method
US10552984B2 (en) Capture device calibration methods and systems
US20220044443A1 (en) Fisheye camera calibration system, method and electronic device
CN110136207B (en) Fisheye camera calibration system, fisheye camera calibration method, fisheye camera calibration device, electronic equipment and storage medium
CN108344401A (en) Localization method, device and computer readable storage medium
CN113329179B (en) Shooting alignment method, device, equipment and storage medium
CN109949232A (en) Measurement method, system, electronic equipment and medium of the image in conjunction with RTK
EP3967969B1 (en) Fisheye camera calibration system, method and apparatus, electronic device, and storage medium
CN111699513B (en) Calibration plate, internal parameter calibration method, machine vision system and storage device
CN113763478B (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN115937321B (en) Gesture detection method and device of electronic equipment
CN113052912A (en) Camera calibration method and device
CN116051652A (en) Parameter calibration method, electronic equipment and storage medium
CN111353945A (en) Fisheye image correction method, fisheye image correction device and storage medium
CN113538590A (en) Zoom camera calibration method and device, terminal equipment and storage medium
WO2022184929A1 (en) Calibration method of a portable electronic device
CN111223139B (en) Target positioning method and terminal equipment
CN110838147B (en) Camera module detection method and device
CN113920144B (en) Real-scene photo ground vision field analysis method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant