CN111176445B - Interactive device identification method, terminal equipment and readable storage medium - Google Patents

Interactive device identification method, terminal equipment and readable storage medium

Info

Publication number
CN111176445B
CN111176445B (application CN201911342720.5A)
Authority
CN
China
Prior art keywords
sub-marker, marker, interaction device, image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911342720.5A
Other languages
Chinese (zh)
Other versions
CN111176445A (en)
Inventor
胡永涛
戴景文
贺杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Virtual Reality Technology Co Ltd
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd
Priority to CN201911342720.5A
Publication of CN111176445A
Application granted
Publication of CN111176445B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The embodiment of the application provides an identification method for an interaction device, where the interaction device includes a first interaction device and a second interaction device, the first interaction device is provided with a first marker, the second interaction device is provided with a second marker, the first marker is distinguishable from the second marker, and both markers comprise a plurality of sub-markers. According to the method, the sub-markers in a target image are first identified so as to recognize the first marker and the second marker; the current pose information of the interaction device corresponding to each marker is then obtained; finally, whether each interaction device corresponds to the left hand or the right hand is determined from its current pose information, so that the dual interaction devices are accurately identified and tracked and interaction accuracy is improved. The embodiment of the application also provides a terminal device and a computer-readable storage medium.

Description

Interactive device identification method, terminal equipment and readable storage medium
Technical Field
The present invention relates to the field of interactive device control, and more particularly, to an interactive device identification method, a terminal device, and a readable storage medium.
Background
In recent years, with the advancement of technology, technologies such as Augmented Reality (AR) and Virtual Reality (VR) have gradually become research hot spots at home and abroad.
In conventional technology, when augmented reality or mixed reality is displayed by superimposing virtual content on a real scene image, a user can control the virtual content or interact with it through a single controller. In some scenes, however, when the user needs to interact through dual controllers, recognition of the dual controllers needs to be improved to ensure interaction accuracy.
Disclosure of Invention
The embodiment of the application provides an identification method for an interaction device, which can identify dual interaction devices and improve interactivity.
In one aspect, an embodiment of the present application provides a method for identifying an interaction device, applied to a terminal device, where the interaction device includes a first interaction device and a second interaction device, the first interaction device is provided with a first marker, the second interaction device is provided with a second marker, the first marker is different from the second marker, and the first marker and the second marker each include a plurality of sub-markers. The method comprises the following steps: acquiring a target image containing sub-markers; identifying the sub-markers in the target image, grouping them, and obtaining a grouping result; based on the grouping result, determining the current pose information of the interaction device to which the sub-markers classified into the same group belong, according to the image coordinates of those sub-markers in the target image; and determining whether that interaction device corresponds to the left hand or the right hand according to the current pose information.
In another aspect, embodiments of the present application provide a terminal device that includes one or more processors, a memory, and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the one or more processors to perform the above-described method.
In yet another aspect, embodiments of the present application provide a computer readable storage medium having program code stored therein, the program code being executable by a processor to perform the method described above.
According to the identification method of the interaction device, the terminal device, and the readable storage medium, the sub-markers in a target image are identified; after the first marker and the second marker are recognized, the pose information of the interaction device corresponding to each marker is acquired; finally, whether each interaction device corresponds to the left hand or the right hand is determined according to its pose information, so that the dual interaction devices are accurately identified and tracked and interaction accuracy is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an identification tracking system according to an embodiment of the present application.
Fig. 2 is a block diagram of a structure of a terminal device according to an embodiment of the present application.
Fig. 3 is a flow chart of an identification method of an interaction device according to an embodiment of the present application.
Fig. 4 is a flowchart of another method for identifying an interaction device according to an embodiment of the present application.
Fig. 5 is a flow chart of an embodiment of step S203 of fig. 4.
Fig. 6 is a flow chart of an embodiment of step S205 of fig. 4.
Fig. 7 is a flow chart of another embodiment of step S205 of fig. 4.
Fig. 8 is a schematic diagram of the arrangement of the sub-markers of the first marker according to the first rule and the arrangement of the sub-markers of the second marker according to the second rule according to the embodiment of the present application.
Fig. 9 is a schematic diagram of the first and second spatial coordinate systems established according to the first and second rules shown in fig. 8.
Fig. 10 is another schematic diagram of establishing a first spatial coordinate system and a second spatial coordinate system according to the first rule and the second rule shown in fig. 8.
Fig. 11 is a flow chart of an embodiment of step S207 of fig. 4.
Fig. 12 is a block diagram of a configuration of an identification device of an interaction device according to an embodiment of the present application.
Fig. 13 is a block diagram of a readable storage medium according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the following description will make clear and complete descriptions of the technical solutions in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application.
Referring to fig. 1 and fig. 2 together, an identification tracking system 10 of an interaction device 200 according to an embodiment of the present application includes a terminal device 100 and the interaction device 200. The terminal device 100 and the interaction device 200 may be connected wirelessly, e.g. via Bluetooth, Wi-Fi (Wireless Fidelity), or ZigBee, or by a wired method such as a data line; they may also operate without a connection. Of course, the connection manner of the terminal device 100 and the interaction device 200 is not limited in the embodiment of the present application.
In some embodiments, the terminal device 100 may be a head-mounted display device, or a mobile device such as a mobile phone or tablet. When the terminal device 100 is a head-mounted display device, it may be an integrated head-mounted display device. Alternatively, the terminal device 100 may be an intelligent terminal, such as a mobile phone, connected to an external head-mounted display device; that is, the terminal device 100 serves as the processing and storage device of the head-mounted display device and may be inserted into or connected to the external head-mounted display device, which then displays the virtual content.
In one embodiment, the terminal device 100 includes one or more processors 110, memory 130. Wherein the memory 130 stores one or more application programs 150, the one or more application programs 150 being configurable to be executed by the one or more processors 110 to implement the method of identifying the interactive device 200.
Processor 110 may include one or more processing cores. The processor 110 connects the various parts of the terminal device 100 using various interfaces and lines, and performs the various functions of the terminal device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 130 and by invoking data stored in the memory 130. Alternatively, the processor 110 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), or Programmable Logic Array (PLA). The processor 110 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU is responsible for rendering and drawing display content; the modem handles wireless communication. It will be appreciated that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 130 may include Random Access Memory (RAM) or Read-Only Memory (ROM). Memory 130 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 130 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the various method embodiments described below, and so on. The data storage area may store data created by the terminal device 100 in use.
In the embodiment of the present application, the terminal device 100 further includes a camera for capturing an image of a real object and capturing a scene image of a target scene. The camera may be an infrared camera or a visible light camera, and the specific type is not limited in the embodiments of the present application.
In one embodiment, the terminal device is a head-mounted display device, and may further include one or more of the following components in addition to the above-mentioned processor, memory, and image sensor: a display module, an optical module, a communication module, and a power supply.
The display module may include a display control unit. The display control unit receives the display image of the virtual content rendered by the processor and projects it onto the optical module, so that the user can view the virtual content through the optical module. The display device may be a display screen, a projection device, or the like, used to display the image.
The optical module may adopt an off-axis optical system or a waveguide optical system as a transmissive and reflective lens, so that the display image shown by the display device can be projected through the optical module to the user's eyes. The user thus sees the display image projected by the display device through the optical module. In some embodiments, the user can also observe the real environment through the optical module, so that the user's eyes perceive the display image of the virtual content overlaid on the real environment, producing an augmented-reality scene effect.
The communication module may be a Bluetooth, Wi-Fi, ZigBee, or similar module, and the head-mounted display device may establish a communication connection with the terminal device through the communication module. A head-mounted display device connected in this way can exchange information and instructions with the terminal device. For example, the head-mounted display device may receive image data transmitted by the terminal device through the communication module, and generate and display virtual content of the virtual world from the received image data.
The power supply powers the whole head-mounted display device and ensures the normal operation of each of its components.
With continued reference to fig. 1, the interaction device 200 includes a first interaction device 210 and a second interaction device 230, the first interaction device 210 is provided with a first marker 211, the second interaction device 230 is provided with a second marker 231, and the first marker 211 is distinguished from the second marker 231. Wherein the terminal device 100 can identify the first interaction means 210 and the second interaction means 230 by the first and second markers 211 and 231 which are distinguished, i.e. the first and second interaction means 210 and 230 are distinguished by the first and second markers 211 and 231.
The user enters a preset virtual scene using the terminal device 100, and when the interaction device 200 is within the visual range of the camera, the camera captures a target image including the first interaction device 210 and the second interaction device 230. The processor 110 acquires the first marker 211 and the second marker 231 contained in the target image together with their related information, and calculates the relative position information between each of the first interaction device 210 and the second interaction device 230 and the terminal device 100. In some embodiments, it may be determined whether the user's left hand or right hand corresponds to the first interaction device 210 or the second interaction device 230, so as to accurately identify and track the interaction device 200 and thereby improve interaction accuracy.
Based on the identification tracking system of the interaction device described above, an embodiment of the application provides an identification method for the interaction device, applied to the terminal device of the identification tracking system. The specific identification method is described below.
Referring to fig. 3, an embodiment of the present application provides an identification method of an interaction device, which may be applied to the terminal device, where the identification method of the interaction device is used to identify the first interaction device and the second interaction device and track at least one interaction device, and the method may include:
step S101: a target image is acquired that includes a sub-marker.
Specifically, when part or all of the first interaction device and the second interaction device is within the field of view of the camera of the terminal device, the camera can acquire an image containing the markers on the interaction devices; this image is the target image, and the processor obtains it from the camera. Further, the target image should contain at least a plurality of sub-markers, each of which belongs to either the first marker or the second marker.
Step S103: identifying the sub-markers in the target image, grouping the sub-markers in the target image and obtaining a grouping result.
In some embodiments, the sub-markers in the target image may be grouped according to the arrangement information and/or the identity information of the sub-markers in the target image, and the grouping result may be obtained. In embodiments of the present application, the markers comprise a first marker and a second marker, and thus the grouping result comprises a first group and a second group. The arrangement information of the sub-markers in the target image refers to their distribution in the target image, for example, the relative positional relationship of the other sub-markers with respect to a given sub-marker. The identity information of the sub-markers in the target image may be used to identify each sub-marker, and may include, for example, the number of the sub-marker.
Step S105: based on the grouping result, determining the current pose information of the interaction device to which the sub-markers classified into the same group belong according to the image coordinates of the sub-markers classified into the same group in the target image.
In this embodiment of the present application, the first group of sub-markers may be first identified and judged, for example, by identifying the identity of the sub-markers and judging the arrangement rule thereof, so as to determine that the first group of sub-markers belongs to the first marker or the second marker. Correspondingly, the sub-markers of the second group are identified and judged, for example, the identities of the sub-markers are identified, the arrangement rules of the sub-markers are judged, and then the markers of the second group are determined to belong to the first markers or the second markers.
In some embodiments, the first marker may include a plurality of sub-markers, each with different identity information, and the second marker may likewise include a plurality of sub-markers, each with different identity information. The identity information may be represented by the number of the sub-marker, for example 1, 2, 3, …, but is not limited thereto. The identity information of a sub-marker may be determined by the number of feature points it contains, for example, a sub-marker containing 1 feature point corresponds to number 1, a sub-marker containing 2 feature points corresponds to number 2, and so on. The identity information may also be determined according to the shape and color of the feature points, the shape of the sub-marker, and the like, which is not limited herein.
The sub-markers of the first marker may have different identity information from those of the second marker. In some embodiments, the first marker and the second marker may also comprise sub-markers with the same identity information but with different arrangement rules.
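The feature-point-count scheme described above can be sketched as follows; the assumption that the identity number simply equals the feature-point count is illustrative only, since the text also allows identities based on the shape and color of feature points:

```python
def sub_marker_id(feature_points):
    """Derive a sub-marker's identity number from its detected feature points.

    Illustrative assumption: the identity number equals the number of
    feature points (1 point -> number 1, 2 points -> number 2, ...).
    """
    return len(feature_points)

# Hypothetical detections: each sub-marker is a list of (x, y) feature points.
detections = [[(10, 12)], [(40, 41), (45, 44)], [(70, 12), (72, 15), (75, 11)]]
numbers = [sub_marker_id(points) for points in detections]
```

Here `numbers` evaluates to `[1, 2, 3]`, one identity number per detected sub-marker.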
As an embodiment, when the sub-markers of the first group are determined to belong to the first marker, it may be directly determined that the sub-markers of the second group belong to the second marker. Alternatively, whether the sub-markers of the second group belong to the second marker may be verified by the same method used to determine that the sub-markers of the first group belong to the first marker.
Further, according to the image coordinates in the target image of the sub-markers classified into the same group, the current pose information of the interaction device to which they belong is determined. It should be noted that the pose information of the interaction device includes both its position information and its attitude information, and the attitude information may include the rotation direction and rotation angle of the interaction device relative to the terminal device.
If the sub-markers classified into the same group belong to the first marker, the interaction device to which they belong is the first interaction device, so the current pose information of the first interaction device is determined and acquired according to the image coordinates of the first marker in the target image. In other embodiments, if the sub-markers classified into the same group belong to the second marker, the interaction device to which they belong is the second interaction device, so the current pose information of the second interaction device is determined and acquired according to the image coordinates of the second marker in the target image.
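For the case where the two markers use disjoint sets of identity numbers, mapping a group of sub-markers to its interaction device can be sketched as below; the concrete number sets are hypothetical, and when the number sets coincide the arrangement rules described elsewhere must be used instead:

```python
def device_for_group(group_numbers, first_numbers, second_numbers):
    """Map a group of identified sub-marker numbers to the interaction
    device it belongs to, assuming the two markers' number sets differ."""
    numbers = set(group_numbers)
    if numbers <= set(first_numbers):
        return "first interaction device"
    if numbers <= set(second_numbers):
        return "second interaction device"
    return None  # ambiguous; fall back to arrangement rules

# Hypothetical numbering: first marker uses 1-3, second marker uses 4-6.
device = device_for_group([1, 2], first_numbers=[1, 2, 3], second_numbers=[4, 5, 6])
```

With these inputs `device` is `"first interaction device"`; a group mixing numbers from both sets would return `None`.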
Step S107: and determining that the interaction devices belonging to the sub-markers classified into the same group correspond to the left hand or the right hand according to the current pose information.
In the embodiment of the application, according to the pose information of the interaction device, it can be determined whether the interaction device corresponds to the left hand or the right hand. For example, the user may hold one interaction device in each hand; which of the two devices is held by the left hand and which by the right can then be determined from the pose information of the first interaction device and the second interaction device respectively.
According to this identification method of the interaction device, the sub-markers in the target image are identified so as to recognize the first marker and the second marker; the current pose information of the interaction device corresponding to each marker is then obtained; finally, whether each interaction device corresponds to the left hand or the right hand is determined according to its current pose information, so that the dual interaction devices are accurately identified and tracked and interaction accuracy is improved.
Referring to fig. 4, another method for identifying an interaction device is provided in an embodiment of the present application, which may be applied to the terminal device, and is used for identifying any one of the first interaction device and the second interaction device and tracking the identified interaction device. The method may include:
Step S201: when the first interaction device and the second interaction device start to be used, the reference corresponding states of the first interaction device and the second interaction device are calibrated.
As an embodiment, the reference corresponding state of the first interaction device and the second interaction device may be determined according to the relative positional relationship when the first interaction device and the second interaction device start to use, based on which step S201 may include the steps of: when the first interaction device and the second interaction device start to be used, acquiring a reference image containing a first marker and a second marker, and determining a first relative position of the first marker and the second marker in the reference image; and determining a second relative position of the first interaction device and the second interaction device according to the first relative position, and calibrating reference corresponding states of the first interaction device and the second interaction device according to the second relative position.
When the first interaction device and the second interaction device start to be used, a reference image containing the first marker and the second marker can be acquired through the terminal equipment. The reference image is an image when the first interaction device and the second interaction device are in an initial state. In the embodiment of the present application, the first interaction device and the second interaction device are in the initial state, which is understood as a state when the first interaction device and the second interaction device start to be used.
In one embodiment, when the terminal device collects the reference image, it may first determine whether the pose information of the first interaction device and the second interaction device is within a preset reference range; if so, it collects the reference image and calibrates the reference corresponding states of the first interaction device and the second interaction device. The reference range can be preset as the range of poses in which the interaction device roughly faces the terminal device, so as to prevent the interaction device from being flipped over or used improperly, which would make subsequent recognition and tracking inaccurate.
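The reference-range check above can be sketched as follows; the yaw/pitch parametrization and the 30-degree bound are hypothetical choices, since the text only requires "a preset reference range":

```python
def pose_in_reference_range(yaw_deg, pitch_deg, max_angle_deg=30.0):
    """Return True when the interaction device roughly faces the terminal
    device, i.e. its rotation stays within a preset reference range.

    The yaw/pitch angles and the 30-degree bound are illustrative
    assumptions, not values taken from the patent.
    """
    return abs(yaw_deg) <= max_angle_deg and abs(pitch_deg) <= max_angle_deg

# A device tilted 10 degrees left and 5 degrees down still qualifies;
# one flipped by 170 degrees does not, so no reference image is taken.
ok = pose_in_reference_range(10.0, -5.0)
flipped = pose_in_reference_range(170.0, 0.0)
```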
Further, the second relative positions of the first and second interaction means may be determined by determining the first relative positions of the first and second markers in the reference image. For example, if the first marker is located to the left of the second marker in the reference image, it may be determined that the first interaction device is located to the left of the second interaction device; if the first marker is located on the right side of the second marker in the reference image, it may be determined that the first interaction device is located on the right side of the second interaction device; the "left" and "right" are determined based on the left and right sides of the reference image, for example, the first marker is located on the left side of the reference image and the second marker is located on the right side of the reference image, and it may be determined that the first marker is located on the left side of the second marker.
Further, the reference corresponding states of the first interaction device and the second interaction device are calibrated according to the second relative position, wherein the reference corresponding states are used for representing the left hand or the right hand of the first interaction device and the second interaction device respectively. Taking the second relative positional relationship as an example that the first interaction device is located at the left side of the second interaction device, the reference corresponding state may be that the first interaction device corresponds to the left hand and the second interaction device corresponds to the right hand. It should be noted that, the reference corresponding state at this time may also be that the first interaction device corresponds to the right hand and the second interaction device corresponds to the left hand.
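The calibration of the reference corresponding state described above can be sketched as follows, assuming image x grows to the right and adopting the convention that the marker further left corresponds to the left hand (as the text notes, the opposite convention is equally valid):

```python
def calibrate_reference_state(first_marker_x, second_marker_x):
    """Calibrate the reference corresponding state from the horizontal
    positions of the two markers in the reference image.

    Illustrative convention: the device whose marker lies further left in
    the reference image is taken to correspond to the left hand.
    """
    if first_marker_x < second_marker_x:
        return {"first": "left hand", "second": "right hand"}
    return {"first": "right hand", "second": "left hand"}

# Hypothetical marker centers: the first marker at x=120 lies to the left
# of the second marker at x=480 in the reference image.
state = calibrate_reference_state(120, 480)
```

With these inputs `state` maps the first interaction device to the left hand and the second to the right hand.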
Step S202: a target image is acquired that includes a sub-marker.
Step S202 is similar to step S101, and will not be described here.
Step S203: identifying the sub-markers in the target image, grouping the sub-markers in the target image and obtaining a grouping result.
In some embodiments, the sub-markers of the first marker have non-repeating identity information and the sub-markers of the second marker have non-repeating identity information, where the identity information of the sub-markers of the first marker corresponds one-to-one with, and is the same as, the identity information of the sub-markers of the second marker. Further, the identity information of each sub-marker in the target image can be obtained by identifying the sub-markers in the target image; sub-markers with the same identity information are then classified into different groups.
For example, continuing with the numbering scheme above, the sub-markers of the first marker are numbered 1, 2, …, and the sub-markers of the second marker are likewise numbered 1, 2, …; for convenience of description, "A" denotes a sub-marker belonging to the first marker and "B" denotes one belonging to the second marker. During identification of the target image, the sub-markers of groups A and B are not obviously distinguished: the number information identified in the target image is 1, 2, …, with each number appearing twice, and the sub-markers sharing the same identified number can then be divided into different groups.
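The duplicate-number split can be sketched as below. This is a sketch only: it separates the two copies of each number into different groups, while deciding which resulting group is "A" and which is "B" still requires the arrangement rules or the distance-based grouping discussed in this document:

```python
def split_by_identity(sub_markers):
    """Split detected sub-markers into two groups so that no group contains
    a repeated identity number.

    sub_markers: list of (number, (x, y)) tuples; each number is assumed to
    occur at most twice, once per marker.
    """
    group_a, group_b, seen = [], [], set()
    for number, xy in sub_markers:
        if number in seen:
            group_b.append((number, xy))  # second occurrence of this number
        else:
            seen.add(number)
            group_a.append((number, xy))  # first occurrence of this number
    return group_a, group_b

# Hypothetical detection: numbers 1 and 2 each appear twice in the image.
detections = [(1, (12, 40)), (2, (30, 42)), (1, (310, 38)), (2, (330, 44))]
group_a, group_b = split_by_identity(detections)
```

Each returned group then contains the numbers 1 and 2 exactly once.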
As an embodiment, after identifying the sub-markers in the target image, the sub-markers may be grouped according to the relative distance between each sub-marker in the target image and a reference sub-marker, and the grouping result may be obtained. Based on this, referring to fig. 5, step S203 may include steps S301 to S305:
step S301: determining the reference sub-marker.
Specifically, the reference sub-marker is any one of the sub-markers in the target image; it serves as a reference for all the sub-markers in the target image and, in the present embodiment, as the reference point for grouping them.
For example, the target image includes a plurality of sub-markers, and one of them is randomly selected as the reference sub-marker. When the reference sub-marker is actually determined, it is not yet known whether it belongs to the first marker or the second marker.
Step S303: relative distance information between the fiducial sub-marker and other sub-markers in the target image is obtained.
After determining the reference sub-marker, the reference sub-marker can be used as a reference to acquire the relative distance information between other sub-markers in the target image and the reference sub-marker.
Step S305: sub-markers in the target image with the relative distance information being greater than the distance threshold are grouped into one group, and sub-markers with the relative distance information being less than or equal to the distance threshold are grouped into another group.
The sub-markers in the target image comprise sub-markers whose relative distance information is greater than the distance threshold and sub-markers whose relative distance information is less than or equal to the distance threshold. Since the first marker and the second marker belong to different interaction devices, if the reference sub-marker is a sub-marker of the first marker, the relative distances between the reference sub-marker and the other sub-markers of the first marker will generally be significantly smaller than the relative distances between the reference sub-marker and the sub-markers of the second marker.
For example, if the reference sub-marker belongs to the first marker, the relative distances between the reference sub-marker and the other sub-markers of the first marker may be less than the distances between the reference sub-marker and the sub-markers of the second marker. Thus, by comparing the relative distance information with the distance threshold, sub-markers whose relative distance information is greater than the distance threshold are grouped together and sub-markers whose relative distance information is less than or equal to the distance threshold are grouped together, so that the sub-markers of the first marker and those of the second marker can be roughly separated. In some embodiments, the distance threshold may be the maximum relative distance between any sub-marker of the first marker and the other sub-markers of the first marker, or the maximum relative distance between any sub-marker of the second marker and the other sub-markers of the second marker. Grouping by the relative distance information between the reference sub-marker and the other sub-markers completes the grouping of the sub-markers in the target image more quickly, thereby improving the recognition speed of the interaction device.
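The distance-based grouping of steps S301 to S305 can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function name, the tuple layout of the detections, and the use of the first detection as the reference sub-marker are assumptions made for the example.

```python
import math

def group_by_distance(detections, distance_threshold):
    """Sketch of steps S301-S305: split detected sub-markers into two
    groups by their distance to a reference sub-marker.

    detections: list of (identity, (x, y)) image positions.
    The first detection stands in here for the randomly chosen
    reference sub-marker (S301).
    """
    _, (rx, ry) = detections[0]
    near_group, far_group = [], []
    for identity, (x, y) in detections:
        # S303: relative distance to the reference sub-marker
        distance = math.hypot(x - rx, y - ry)
        # S305: compare against the distance threshold
        if distance <= distance_threshold:
            near_group.append((identity, (x, y)))
        else:
            far_group.append((identity, (x, y)))
    return near_group, far_group
```

If the reference sub-marker belongs to the first marker, `near_group` roughly collects the first marker's sub-markers and `far_group` the second marker's, matching the rationale above.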
Step S205: from the grouping result, it is determined that the sub-markers classified into the same group belong to the first marker or the second marker.
In the embodiment of the application, whether the first group of sub-markers belongs to the first marker or the second marker can be determined by identifying and judging the sub-markers of the first group, for example by identifying the identities of the sub-markers and judging their arrangement rule. Correspondingly, the sub-markers of the second group are identified and judged in the same way, so as to determine whether the second group belongs to the first marker or the second marker.
As an embodiment, after the sub-markers in the target image are identified and grouped, whether the sub-markers classified into the same group belong to the first marker or the second marker may be determined through historical frame image information of the markers. Based on this, referring to fig. 6, step S205 may further include steps S401 to S409:
step S401: a historical frame image is acquired that includes a first marker and a second marker.
The history frame image is the target image of the previous frame or frames relative to the target image acquired in step S202; that is, it is an image containing the first marker and the second marker that was acquired at an earlier time node. Further, the history frame image already contains the recognition results of the first marker and the second marker; in other words, for each sub-marker in the history frame image it is already known whether it belongs to the first marker or the second marker.
Step S403: first image coordinates of the sub-markers of the first marker and second image coordinates of the sub-markers of the second marker in the history frame image are identified.
The image coordinate system is a plane coordinate system established by a plane where the target image is located.
Further, coordinates of the sub-markers of the first marker in the history frame image in the image coordinate system are identified and recorded as first image coordinates. Coordinates of the sub-markers of the second marker in the history frame image in the image coordinate system are identified and recorded as second image coordinates.
Step S405: the current image coordinates of the sub-markers classified into the same group in the target image are acquired.
Coordinates of the sub-markers classified into the same group in the target image are acquired and recorded as current image coordinates. The sub-markers classified into the same group are the first group or the second group from step S203; for convenience of description, the following takes the first group as an example. The second group of sub-markers is processed in a similar way, which is not repeated here for brevity. The image coordinate system of the target image in step S405 and that of the history frame image in step S403 are the same coordinate system.
Step S407: and acquiring a first displacement degree of the current image coordinate relative to the first image coordinate.
Wherein the current image coordinates may be used to identify the positions of the first group of sub-markers in the current target image, and the first image coordinates may be used to identify the positions of the sub-markers of the first marker in the history frame image. Further, the degree of displacement between the first group of sub-markers and the sub-markers of the first marker may be obtained, i.e. the first displacement degree of the current image coordinates with respect to the first image coordinates.
As an embodiment, the first displacement degree may be a variance of an amount of displacement performed by a sub-marker of the first marker in the history frame image when the sub-marker of the first marker is displaced to correspond to (or overlap with) a sub-marker of the first group in the current target image. Alternatively, the first degree of displacement may be an average of the amounts of displacement performed by the sub-markers of the first marker when they are displaced to correspond to (or coincide with) the sub-markers of the first group.
Step S408: and obtaining the second displacement degree of the current image coordinate relative to the second image coordinate.
Wherein the second image coordinates may be used to identify the positions of the sub-markers of the second marker in the history frame image. Further, the degree of displacement between the first group of sub-markers and the sub-markers of the second marker may be obtained, i.e. the second displacement degree of the current image coordinates with respect to the second image coordinates.
As an embodiment, the second displacement degree may be a variance of an amount of displacement performed by a sub-marker of the second marker in the history frame image when the sub-marker is displaced to correspond to (or overlap with) the sub-marker of the first group of the current target image. Alternatively, the second degree of displacement may be an average of the amounts of displacement performed by the sub-markers of the second marker when they are displaced to correspond to (or coincide with) the sub-markers of the first group.
Step S409: determining that the sub-markers classified into the same group belong to the first marker or the second marker according to the first displacement degree and the second displacement degree.
Wherein, whether the first group of sub-markers belongs to the first marker or the second marker can be determined according to the magnitudes of the first displacement degree and the second displacement degree. If the first displacement degree is smaller than the second displacement degree, the first group of sub-markers is determined to be the sub-markers of the first marker; if the second displacement degree is smaller than the first displacement degree, the first group of sub-markers is determined to be the sub-markers of the second marker.
For example, when the first displacement degree is the variance of the displacement amounts performed by the sub-markers of the first marker, the variance value reflects how far the individual displacement amounts deviate from their average during the movement: the higher the deviation, the lower the degree of correspondence between the sub-markers of the first marker and the sub-markers of the first group; the lower the deviation, the higher the degree of correspondence.
Further, by comparing the first displacement degree with the second displacement degree, the marker that corresponds more closely to the first group of sub-markers can be found, and it is thereby determined whether the first group of sub-markers belongs to the first marker or the second marker. Since the history frame image is the target image of the previous frame or frames relative to the target image acquired in step S202, the displacement of the first marker between the history frame image and the target image acquired in step S202 is small. Therefore, whether a group of sub-markers belongs to the first marker or the second marker can be judged by comparing the first displacement degree with the second displacement degree; identifying the sub-markers in the current target image with reference to the history frame image speeds up distinguishing the first marker from the second marker.
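The displacement-degree comparison of steps S401 to S409 can be sketched with the variance variant described above. The function names, and the assumption that sub-markers are matched by their identity numbers so that the coordinate lists line up index by index, are illustrative rather than taken from the patent.

```python
import math

def displacement_variance(hist_coords, curr_coords):
    """Variance of the per-sub-marker displacement magnitudes between a
    history frame and the current target image (same-numbered
    sub-markers assumed aligned by index)."""
    mags = [math.hypot(cx - hx, cy - hy)
            for (hx, hy), (cx, cy) in zip(hist_coords, curr_coords)]
    mean = sum(mags) / len(mags)
    return sum((m - mean) ** 2 for m in mags) / len(mags)

def assign_group(curr_coords, first_hist, second_hist):
    first_degree = displacement_variance(first_hist, curr_coords)    # S407
    second_degree = displacement_variance(second_hist, curr_coords)  # S408
    # S409: the smaller displacement degree indicates the closer match
    return "first" if first_degree < second_degree else "second"
```

A group that moved rigidly with the first marker produces near-zero variance against the first marker's historical coordinates, so it is assigned to the first marker.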
As an embodiment, the arrangement of the sub-markers of the first marker and the arrangement of the sub-markers of the second marker are mirror images of each other. Taking as an example a first rule by which the sub-markers of the first marker are arranged on the first interaction device and a second rule by which the sub-markers of the second marker are arranged on the second interaction device, and referring to fig. 7, step S205 further includes steps S501 to S509. Steps S501 to S509 can replace steps S401 to S409, or can verify the determination result of steps S401 to S409 to improve the accuracy of judging whether the sub-markers classified into the same group belong to the first marker or the second marker.
Specifically, the sub-markers of the first marker are arranged on the first interaction device according to a first rule, and the sub-markers of the second marker are arranged on the second interaction device according to a second rule, the second rule and the first rule being mirror images of each other. That the two rules are mirror images of each other means that sub-markers with the same identity information in the first marker and the second marker are arranged as mirror images of each other. Steps S501 to S509 include:
step S501: and constructing a first space coordinate system according to the first rule, and constructing a second space coordinate system according to the second rule.
In an embodiment of the present application, the sub-markers of the first marker are arranged according to the first rule. Further, as shown in fig. 8, the first rule is to construct a plane with 1, 3, 5 and 7 of the "A" group of sub-markers, with 4 and 8 distributed to the left of the plane in the figure and 2 and 6 to the right. It should be noted that the first rule also specifies that 1, 2, 3, 4 and 5, 6, 7, 8 are arranged clockwise; "left", "right" and "clockwise" in the description of fig. 8 all refer to the orientation of fig. 8 and are definitions made for description only, not limitations.
In this embodiment, the sub-markers of the second marker are arranged according to a second rule, which numbers the sub-markers of the second marker in counterclockwise order and is the mirror image of the first rule. Further, as shown in fig. 8, the second rule is to construct a plane with 1, 3, 5 and 7 of the "B" group of sub-markers, with 2 and 6 distributed to the left of the plane and 4 and 8 to the right. It should be noted that the second rule also specifies that 1, 2, 3, 4, 5, 6, 7, 8 are arranged counterclockwise; "left", "right" and "counterclockwise" in the description of fig. 8 all refer to the orientation of fig. 8 and are definitions made for description only, not limitations.
As shown in fig. 8, the first rule refers to the arrangement rule of 1 to 8 (the sub-markers of the first marker) of the "a" group, and the second rule refers to the arrangement rule of 1 to 8 (the sub-markers of the second marker) of the "B" group. 1-8 arranged according to the first rule and 1-8 arranged according to the second rule are mirror images of each other, and the identity information of the corresponding sub-markers is the same.
Further, the sub-markers of the first marker are arranged on the first interaction device according to the first rule, the sub-markers of the second marker are arranged on the second interaction device according to the second rule, and the two rules are mirror images of each other, so the arrangement of the sub-markers of the first marker is the mirror image of the arrangement of the sub-markers of the second marker. The first space coordinate system is constructed according to the first rule and the second space coordinate system according to the second rule: the X axis and the Y axis of each coordinate system can be determined by the same establishment mechanism, after which the Z axis is determined.
Since the first space coordinate system and the second space coordinate system follow the same establishment rule while the first arrangement rule and the second arrangement rule are mirror images, the first space coordinate system and the second space coordinate system are mirror images of each other.
For example, as shown in fig. 9, the establishment mechanism of the first space coordinate system and the second space coordinate system is: the midpoint between sub-marker 6 and sub-marker 8 is taken as the origin to establish the X axis and the Y axis, the positive direction of the X axis is the direction from the origin toward sub-marker 8, and the positive direction of the Y axis is the direction from the origin toward sub-marker 7. The Z axis is then determined in the usual rectangular coordinate system manner. Further, sub-markers with the same identity information have the same coordinate values in the second space coordinate system as in the first space coordinate system; in other words, same-numbered sub-markers have identical coordinates in the two coordinate systems.
In other embodiments, as shown in fig. 10, the establishment mechanism of the first space coordinate system and the second space coordinate system is: the midpoint between sub-marker 6 and sub-marker 8 is taken as the origin to establish the X axis and the Y axis, the positive direction of the X axis is the direction from the origin toward sub-marker 8, and the positive direction of the Y axis is the direction from the origin toward sub-marker 7; the Z axes of the first and second space coordinate systems are then each determined according to the right-hand rule, so that the Z axes of the two coordinate systems point in opposite directions.
Step S503: and acquiring the arrangement rule of the sub-markers classified into the same group in the target image.
In the embodiment of the application, the arrangement rule of the sub-markers classified into the same group in the target image is acquired. Alternatively, the arrangement rule may be the distribution of some of the sub-markers classified into the same group relative to a reference plane, which is determined by the positional relationship among the remaining sub-markers; for example, the reference plane may be the plane formed by connecting sub-markers 1, 3, 5 and 7 of the group.
Step S505: a first degree of matching between the arrangement rule and the first rule of the sub-markers classified into the same group in the target image is obtained.
For example, the first matching degree is the degree to which the positions of 2, 4, 6, 8 relative to the plane connecting 1, 3, 5, 7 in the arrangement rule agree with the positions of 2, 4, 6, 8 relative to the plane connecting 1, 3, 5, 7 in the "A" group of sub-markers under the first rule. Optionally, the first matching degree is the number of sub-markers among 2, 4, 6, 8 whose position relative to the plane connecting 1, 3, 5, 7 is the same as in the "A" group of sub-markers.
Step S507: and acquiring a second matching degree between the arrangement rule and a second rule of the sub-markers classified into the same group in the target image.
For example, the second matching degree is the degree to which the positions of 2, 4, 6, 8 relative to the plane connecting 1, 3, 5, 7 in the arrangement rule agree with the positions of 2, 4, 6, 8 relative to the plane connecting 1, 3, 5, 7 in the "B" group of sub-markers under the second rule. Optionally, the second matching degree is the number of sub-markers among 2, 4, 6, 8 whose position relative to the plane connecting 1, 3, 5, 7 is the same as in the "B" group of sub-markers.
Step S509: determining that the sub-markers classified into the same group belong to the first marker or the second marker according to the first matching degree and the second matching degree.
According to the first matching degree and the second matching degree, it can be determined whether the arrangement rule of the sub-markers classified into the same group is closer to the first rule or to the second rule, and thereby whether those sub-markers belong to the first marker or the second marker. If the first matching degree is greater than the second matching degree, the arrangement rule is closer to the first rule and the sub-markers classified into the same group belong to the first marker. If the second matching degree is greater than the first matching degree, the arrangement rule is closer to the second rule and the sub-markers classified into the same group belong to the second marker.
For example, as shown in fig. 8, suppose the arrangement rule of the sub-markers classified into the same group is that 2 and 6 lie to the right of the plane connecting 1, 3, 5, 7, and 4 and 8 lie to its left. Comparing this arrangement rule with the first rule (the arrangement rule of the group A sub-markers), the positions of 2, 4, 6, 8 relative to the plane connecting 1, 3, 5, 7 all match, so the first matching degree is 4; comparing it with the second rule (the arrangement rule of the group B sub-markers), none of the positions match, so the second matching degree is 0. The first matching degree is therefore greater than the second matching degree, and the sub-markers classified into the same group belong to the first marker. By acquiring and comparing the first matching degree and the second matching degree, the marker to which a group of sub-markers belongs can be judged accurately, improving both the identification speed and the accuracy of the interaction device.
As an implementation, a matching threshold may be set. For example, in matching the positions of 2, 4, 6, 8 relative to the plane connecting 1, 3, 5, 7 against the first rule and the second rule, when the positions of at least two of 2, 4, 6, 8 correspond to the first rule or to the second rule, the first matching degree and the second matching degree may be used to judge whether the sub-markers classified into the same group belong to the first marker or the second marker. If fewer than two of 2, 4, 6, 8 correspond to either rule, the grouping result needs to be corrected, for example through steps S401 to S409.
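The matching-degree computation of steps S505 to S509 can be sketched as counting, for sub-markers 2, 4, 6 and 8, how many lie on the same side of the plane through 1, 3, 5 and 7 as each rule prescribes. The rule tables below encode the fig. 8 description ('L'/'R' for left/right of the plane); the names and data layout are assumptions made for illustration.

```python
# Side of the plane through sub-markers 1, 3, 5, 7, per fig. 8:
FIRST_RULE = {2: "R", 4: "L", 6: "R", 8: "L"}   # "A" group arrangement
SECOND_RULE = {2: "L", 4: "R", 6: "L", 8: "R"}  # mirror image of the first

def matching_degree(observed_sides, rule):
    """Number of sub-markers whose observed plane side agrees with the
    rule (steps S505 / S507)."""
    return sum(1 for number, side in observed_sides.items()
               if rule.get(number) == side)

def classify_group(observed_sides, threshold=2):
    """Step S509 with the matching threshold discussed above: require at
    least `threshold` agreements before deciding."""
    first = matching_degree(observed_sides, FIRST_RULE)
    second = matching_degree(observed_sides, SECOND_RULE)
    if max(first, second) < threshold:
        return "undecided"  # fall back to steps S401-S409
    return "first marker" if first > second else "second marker"
```

With the fig. 8 example (2 and 6 to the right of the plane, 4 and 8 to the left) the first matching degree is 4 and the second is 0, so the group is assigned to the first marker.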
Alternatively, in some embodiments, if the sub-markers of the first marker and the sub-markers of the second marker differ significantly, for example in size or shape, the sub-markers can be directly identified in the target image as belonging to the first marker or the second marker, and step S203 may be omitted.
Step S207: and determining the current pose information of the interaction device to which the sub-markers classified into the same group belong according to the images of the sub-markers classified into the same group in the target image.
Taking the case where the sub-markers classified into the same group belong to the first marker, so that the interaction device to which they belong is the first interaction device, the pose information of that interaction device is determined according to the images of those sub-markers in the target image. Referring to fig. 11, step S207 includes steps S601 to S605:
s601: the pixel coordinates of the sub-markers of the first marker in the image coordinate system are obtained.
The pixel coordinates are the coordinates of the sub-markers of the first marker in the image coordinate system corresponding to the target image. Alternatively, the processor may acquire the pixel coordinates of all the sub-markers.
S603: physical coordinates of the sub-markers of the first marker are obtained.
The physical coordinates are the coordinates of the sub-markers of the first marker in the physical coordinate system corresponding to the first interaction device, and represent the actual positions of the sub-markers on the first interaction device. The physical coordinates of the sub-markers may be preset and then looked up according to the identified sub-markers. For example, a plurality of sub-markers are arranged on the marker surface of the first interaction device, a certain point on the marker surface is selected as the origin, and a physical coordinate system is established; the physical coordinate of each sub-marker on each coordinate axis is then a preset value. For example, an XYZ coordinate system is established with a certain point of the controller as the origin, and the position of each sub-marker (and of the feature points it contains) relative to the origin can be measured, thereby determining its physical coordinates (X0, Y0, Z0).
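A small sketch of the preset physical-coordinate lookup described above. The coordinate values and the unit (millimetres) are invented for illustration; in practice each entry comes from measuring the sub-marker's position on the controller relative to the chosen origin.

```python
# Hypothetical preset physical coordinates (in millimetres) of the first
# marker's sub-markers in the first interaction device's own coordinate
# system, measured offline relative to the chosen origin.
PHYSICAL_COORDS = {
    1: (0.0, 30.0, 0.0),
    2: (21.0, 21.0, 5.0),
    3: (30.0, 0.0, 0.0),
    # ... one entry per sub-marker on the marker surface
}

def physical_coordinate(identity):
    """Step S603: look up the preset (X0, Y0, Z0) of an identified
    sub-marker by its identity number."""
    return PHYSICAL_COORDS[identity]
```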
S605: and acquiring current pose information of the first interaction device according to the pixel coordinates and the physical coordinates.
After the pixel coordinates and the physical coordinates of the sub-markers of the first marker in the target image are acquired, the relative position and attitude information between the terminal equipment and the first interaction device is obtained from them. Specifically, the mapping parameters between the image coordinate system and the physical coordinate system are obtained according to the pixel coordinates and physical coordinates of each sub-marker together with the pre-acquired internal parameters of the terminal equipment.
For example, by processing the obtained pixel coordinates and physical coordinates of the sub-markers, together with the pre-acquired internal parameters of the terminal device, with a preset algorithm (such as an SVD-based algorithm), the rotation parameter and the translation parameter between the image coordinate system and the physical coordinate system can be obtained.
It should be noted that the rotation parameter and the translation parameter characterize the current pose information between the terminal device and the first interaction device. The rotation parameter represents the rotation state between the image coordinate system and the physical coordinate system, that is, the rotational degrees of freedom of the first interaction device about the coordinate axes of the physical coordinate system. The translation parameter represents the translation state between the two coordinate systems, that is, the translational degrees of freedom of the first interaction device along the coordinate axes of the physical coordinate system. Together, the rotation parameter and the translation parameter constitute six-degree-of-freedom information of the first interaction device in the physical coordinate system and represent its rotation and movement states; that is, the angles, distances and the like between the field of view of the terminal equipment and each coordinate axis of the physical coordinate system can be obtained.
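As a greatly simplified illustration of recovering rotation and translation parameters from point correspondences, the following sketch solves the 2D rigid-alignment problem in closed form; the full method in the text solves the 3D case using the camera's internal parameters (e.g. via an SVD-based or PnP-style algorithm). Everything here, including the planar simplification and the function name, is an assumption made for the example.

```python
import math

def estimate_rigid_2d(physical, pixel):
    """Closed-form 2D rigid alignment: find the rotation angle and
    translation that map the preset physical coordinates onto the
    observed pixel coordinates (a planar stand-in for step S605)."""
    n = len(physical)
    # Centroids of both point sets
    pcx = sum(x for x, _ in physical) / n
    pcy = sum(y for _, y in physical) / n
    qcx = sum(x for x, _ in pixel) / n
    qcy = sum(y for _, y in pixel) / n
    # Accumulate the cross-covariance terms of the centered points
    s_cos = s_sin = 0.0
    for (px, py), (qx, qy) in zip(physical, pixel):
        ax, ay = px - pcx, py - pcy
        bx, by = qx - qcx, qy - qcy
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)  # rotation parameter
    # Translation parameter: move the rotated physical centroid onto
    # the pixel centroid
    tx = qcx - (pcx * math.cos(theta) - pcy * math.sin(theta))
    ty = qcy - (pcx * math.sin(theta) + pcy * math.cos(theta))
    return theta, (tx, ty)
```

The recovered angle and offset play the role of the rotation and translation parameters between the two coordinate systems discussed above.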
Step S208: and determining that the interaction devices belonging to the sub-markers classified into the same group correspond to the left hand or the right hand according to the current pose information.
In this embodiment, taking as an example that the interaction device to which the sub-markers classified into the same group belong is the first interaction device, the current pose information of the first interaction device may be determined first, and then whether the first interaction device corresponds to the left hand or the right hand may be determined according to the pose information of the first interaction device in the reference image and the reference corresponding state.
The pose information of the first interaction device in the reference image and the reference corresponding state may be the pose information of the first interaction device acquired in step S201 and the calibrated reference corresponding state. Whether the first interaction device corresponds to the left hand or the right hand is then determined according to the pose information of the first interaction device in the reference image and the current pose information of the interaction device. Correspondingly, when the interaction device to which the sub-markers classified into the same group belong is the second interaction device, whether the second interaction device corresponds to the left hand or the right hand is determined in a similar manner, which is not repeated here.
In one embodiment, after the current pose information of the interaction device to which the sub-markers classified into the same group belong is obtained, first historical pose information of the first interaction device and second historical pose information of the second interaction device are obtained from a history frame image containing the first marker and the second marker; the pose changes between the current pose information and the first historical pose information, and between the current pose information and the second historical pose information, are then determined respectively; finally, according to the pose changes, the interaction device to which the sub-markers classified into the same group belong is determined to be the first interaction device or the second interaction device. In this case, step S205 may be omitted.
The pose change is used for reflecting the change of the first historical pose information and the current pose information and the change of the second historical pose information and the current pose information. By comparing the change of the first historical pose information with the change of the current pose information and the change of the second historical pose information with the change of the current pose information, the pose change of the interaction device to which the sub-markers belonging to the same group belong can be determined to be consistent with the pose change of the first interaction device or the pose change of the second interaction device, and therefore the interaction device to which the sub-markers belonging to the same group belong is determined to be the first interaction device or the second interaction device.
Further, whether the interaction device to which the sub-markers classified into the same group belong corresponds to the left hand or the right hand is determined according to the reference corresponding state. In other embodiments, the approximate movement route of that interaction device may be obtained by comparing its pose information in the history frame image with its current pose information, so as to determine whether it corresponds to the left hand or the right hand.
Step S209: and determining and executing the corresponding control instruction according to the corresponding state of the interaction device to which the sub-markers belonging to the same group belong.
The identification method of the interaction device can also determine a control instruction according to the determined pose information of the interaction device and the hand to which the interaction device corresponds, thereby implementing different controls with the interaction device. The pose information of the interaction device may be the pose information obtained in step S207.
For example, if it is determined that the first interaction device corresponds to the left hand, a first control instruction is determined according to the pose information of the first interaction device; if the first interaction device corresponds to the right hand, a second control instruction is determined according to the same pose information. The first control instruction and the second control instruction are different; that is, interaction devices with the same pose but corresponding to different hands yield different control instructions. The terminal device then executes the first control instruction or the second control instruction depending on whether the currently corresponding hand is the left hand or the right hand, thereby realizing diversified control of virtual content.
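A minimal sketch of this hand-dependent instruction selection follows; the instruction names are placeholders chosen for illustration, and the mapping from hand to instruction is only an assumption.

```python
def select_control_instruction(hand, pose_info):
    """Same pose, different hands -> different control instructions (sketch).

    'first_instruction' / 'second_instruction' are placeholder names for
    the first and second control instructions of the patent text.
    """
    if hand == "left":
        return ("first_instruction", pose_info)
    if hand == "right":
        return ("second_instruction", pose_info)
    raise ValueError("hand must be 'left' or 'right'")
```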
Referring to fig. 12, a block diagram of an identification device 500 of an interaction device according to an embodiment of the present application is shown; the identification device is applied to a terminal device. The identification device 500 includes an image acquisition module 510, a sub-marker grouping module 530, a pose information acquisition module 550, and a correspondence determining module 570.
Wherein the image acquisition module 510 is configured to acquire a target image containing sub-markers. The sub-marker grouping module 530 is configured to identify the sub-markers in the target image, group them, and obtain a grouping result. The pose information acquisition module 550 is configured to determine the current pose information of the interaction device to which the sub-markers classified into the same group belong according to the images of those sub-markers in the target image. The correspondence determining module 570 is configured to determine, according to the current pose information, whether that interaction device corresponds to the left hand or the right hand.
In some embodiments, the sub-marker grouping module 530 includes a reference determination unit, a distance acquisition unit, and a grouping unit. The reference determination unit is configured to determine a reference sub-marker, which is any one of the sub-markers in the target image. The distance acquisition unit is configured to acquire relative distance information between the reference sub-marker and the other sub-markers in the target image. The grouping unit is configured to classify sub-markers whose relative distance information is greater than a distance threshold into one group, and sub-markers whose relative distance information is less than or equal to the distance threshold into another group.
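The distance-threshold grouping performed by these units can be sketched as follows, assuming each sub-marker is represented by its 2D image coordinates and the first detected point serves as the reference sub-marker (consistent with "any one of the sub-markers"); the function name is illustrative.

```python
import math

def group_by_distance(points, threshold):
    """Group sub-marker image points relative to a reference sub-marker.

    The reference is taken as the first point; points farther than
    `threshold` from it form one group, the rest (including the
    reference itself) form the other.
    """
    ref = points[0]
    near, far = [], []
    for p in points:
        d = math.dist(ref, p)  # Euclidean distance in image coordinates
        (far if d > threshold else near).append(p)
    return near, far
```

Sub-markers of the marker held in one hand cluster tightly, while those of the marker held in the other hand lie far from the reference, which is why a single threshold can separate them.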
In other embodiments, the first marker comprises a plurality of mutually distinguishable sub-markers and the second marker comprises a plurality of mutually distinguishable sub-markers, where the sub-markers of the second marker are identical to those of the first marker but arranged differently. The sub-marker grouping module 530 includes an identity information identifying unit and a grouping unit. The identity information identifying unit is configured to identify the sub-markers in the target image and obtain the identity information of each sub-marker in the target image. The grouping unit is configured to classify sub-markers having the same identity information into different groups, respectively.
Further, the arrangement of the sub-markers of the first marker and the arrangement of the sub-markers of the second marker are mirror images of each other. A first space coordinate system is constructed according to a first rule and a second space coordinate system is constructed according to a second rule, where the first rule is the arrangement rule of the sub-markers of the first marker, the second rule is the arrangement rule of the sub-markers of the second marker, and the second space coordinate system is mirror-symmetric to the first space coordinate system. The sub-markers of the first marker have non-repeating identity information, the sub-markers of the second marker have non-repeating identity information, and sub-markers with the same coordinate values in the first and second space coordinate systems have the same identity information.
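The identity correspondence between the two mirror-symmetric coordinate systems can be illustrated with a short sketch. The choice of the x-axis as the mirrored axis, and the dictionary representation, are assumptions made for illustration only.

```python
def build_mirror_identity_map(first_ids_at_coords):
    """Given {(x, y): identity} for the first marker's sub-markers in the
    first coordinate system, the second marker assigns the same identity
    to the sub-marker at the same coordinates in its own (mirrored)
    coordinate system. Expressing the second marker's layout back in the
    FIRST system flips one axis (here x, as an assumption).
    """
    # Same coordinates in the second (mirrored) system -> same identities.
    second_in_second_system = dict(first_ids_at_coords)
    # Physical layout of the second marker seen from the first system.
    second_physical = {(-x, y): ident for (x, y), ident in first_ids_at_coords.items()}
    return second_in_second_system, second_physical
```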
In some embodiments, the identification device 500 includes a marker determining module for determining whether the sub-markers classified into the same group belong to the first marker or the second marker. The marker determining module may include a preset rule acquisition unit, a current rule acquisition unit, a matching degree acquisition unit, and a determining unit. The preset rule acquisition unit is configured to acquire the first rule and the second rule. The current rule acquisition unit is configured to acquire the arrangement rule of the sub-markers classified into the same group in the target image. The matching degree acquisition unit is configured to acquire a first matching degree between that arrangement rule and the first rule, and a second matching degree between that arrangement rule and the second rule. The determining unit is configured to determine, according to the first matching degree and the second matching degree, whether the sub-markers classified into the same group belong to the first marker or the second marker.
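One simple way to realize the two matching degrees is to treat an arrangement as an ordered sequence of sub-marker identities and score positional overlap. The patent leaves the matching metric unspecified, so this score is only an assumed stand-in.

```python
def matching_degree(arrangement, rule):
    """Fraction of observed sub-marker identities whose position matches a
    rule sequence (assumed metric; the patent does not fix one)."""
    if not arrangement:
        return 0.0
    hits = sum(1 for a, r in zip(arrangement, rule) if a == r)
    return hits / len(arrangement)

def classify_group(arrangement, first_rule, second_rule):
    """Attribute the group to whichever rule it matches better."""
    m1 = matching_degree(arrangement, first_rule)
    m2 = matching_degree(arrangement, second_rule)
    return "first" if m1 >= m2 else "second"
```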
In other embodiments, the marker determining module may include a history frame unit, a history frame image coordinate acquisition unit, a current image coordinate acquisition unit, a first displacement degree unit, a second displacement degree unit, and a determining unit. The history frame unit is configured to acquire a history frame image containing the first marker and the second marker. The history frame image coordinate acquisition unit is configured to identify, in the history frame image, first image coordinates of the sub-markers of the first marker and second image coordinates of the sub-markers of the second marker. The current image coordinate acquisition unit is configured to acquire the current image coordinates of the sub-markers classified into the same group in the target image. The first displacement degree unit is configured to acquire a first displacement degree of the current image coordinates relative to the first image coordinates, and the second displacement degree unit is configured to acquire a second displacement degree of the current image coordinates relative to the second image coordinates. The determining unit is configured to determine, according to the first displacement degree and the second displacement degree, whether the sub-markers classified into the same group belong to the first marker or the second marker.
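The displacement-degree comparison can be sketched as below, taking each displacement degree as the average movement of index-aligned image coordinates since the history frame. This averaging is an assumption; the patent does not fix the exact measure.

```python
import math

def classify_by_displacement(current_coords, first_hist_coords, second_hist_coords):
    """Attribute a group of sub-markers to the first or second marker by
    the smaller average image-coordinate displacement since the history
    frame (illustrative; coordinate lists are assumed index-aligned)."""
    def avg_disp(cur, hist):
        return sum(math.dist(c, h) for c, h in zip(cur, hist)) / len(cur)

    d1 = avg_disp(current_coords, first_hist_coords)
    d2 = avg_disp(current_coords, second_hist_coords)
    return "first" if d1 <= d2 else "second"
```

Because markers move little between adjacent frames, the group is closest to its own marker's previous position, which is what makes this comparison discriminative.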
The pose information acquisition module 550 includes a pixel coordinate acquisition unit, a physical coordinate acquisition unit, and a current pose acquisition unit. If the interaction device to which the sub-markers classified into the same group belong is the first interaction device, the pixel coordinate acquisition unit is configured to acquire the pixel coordinates of the sub-markers of the first marker in the image coordinate system. The physical coordinate acquisition unit is configured to acquire the physical coordinates of the sub-markers of the first marker, that is, their coordinates in the first space coordinate system. The current pose acquisition unit is configured to acquire the current pose information of the first interaction device according to the pixel coordinates and the physical coordinates. It should be noted that these three units may likewise be used to acquire the pose of the second marker; the process is similar to that for the first marker and is not repeated herein.
In some embodiments, the identification device 500 further includes a calibration module (not shown in the figure) for calibrating the reference corresponding states of the first interaction device and the second interaction device when they begin to be used. Specifically, the calibration module includes a first relative position unit, a second relative position unit, and a calibration unit. The first relative position unit is configured to acquire, when the first interaction device and the second interaction device begin to be used, a reference image containing the first marker and the second marker, and to determine a first relative position of the first marker and the second marker in the reference image. The second relative position unit is configured to determine a second relative position between the first interaction device and the second interaction device according to the first relative position. The calibration unit is configured to calibrate the reference corresponding states of the first interaction device and the second interaction device according to the second relative position, where the reference corresponding states represent whether the first interaction device and the second interaction device respectively correspond to the left hand or the right hand.
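A minimal sketch of this calibration follows, reducing the relative position to the horizontal image coordinates of the two markers. The rule "the marker appearing further left corresponds to the left hand" is an assumption for illustration; the patent does not fix the exact calibration rule.

```python
def calibrate_reference_state(first_marker_x, second_marker_x):
    """Calibrate the reference corresponding state from the markers'
    relative positions in the reference image (sketch).

    Assumption: with an outward-facing camera, the marker with the
    smaller image x-coordinate is held in the left hand.
    """
    if first_marker_x < second_marker_x:
        return {"first": "left", "second": "right"}
    return {"first": "right", "second": "left"}
```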
Further, the correspondence determining module 570 includes a reference position determining unit and a correspondence determining unit. The reference position determining unit is configured to determine, according to the reference image, the reference position information of the first interaction device and the second interaction device when they begin to be used. The correspondence determining unit is configured to determine whether the first interaction device corresponds to the left hand or the right hand according to the reference position information, the current pose information, and the reference corresponding state.
As an embodiment, the identification device 500 further includes an instruction determining module for determining and executing the control instruction of the interaction device. The instruction determining module includes a first instruction acquisition unit, a second instruction acquisition unit, and an execution instruction determining unit. The first instruction acquisition unit is configured to determine a first control instruction according to the pose information of the first interaction device if the first interaction device is determined to correspond to the left hand. The second instruction acquisition unit is configured to determine a second control instruction according to the pose information of the first interaction device if the first interaction device is determined to correspond to the right hand. The execution instruction determining unit is configured to execute the first control instruction or the second control instruction.
It should be noted that, in the units of the above modules, the controls relating to the first interaction device may equally be applied to the second interaction device; that is, the above units control the second interaction device in the same way as the first interaction device, which is not repeated herein.
In summary, according to the identification method and identification device of the interaction device provided in the embodiments of the present application, the sub-markers in the target image are identified; after the first marker and the second marker are recognized, the pose information of the corresponding interaction device is obtained; and finally the pose information is used to determine whether the interaction device corresponds to the left hand or the right hand. The interaction device is thereby accurately identified and tracked, and the accuracy of interaction is improved.
Referring to fig. 13, a block diagram of a computer readable storage medium according to an embodiment of the present application is shown. The computer readable storage medium 600 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer readable storage medium 600 comprises a non-transitory computer-readable storage medium. The computer readable storage medium 600 has storage space for program code 610 that performs any of the method steps described above. The program code may be read from or written into one or more computer program products. The program code 610 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (9)

1. An identification method of an interaction device, characterized by being applied to a terminal device, wherein the interaction device comprises a first interaction device and a second interaction device, the first interaction device is provided with a first marker, the second interaction device is provided with a second marker, the first marker is distinguishable from the second marker, and the first marker and the second marker each comprise a plurality of sub-markers; the method comprises the following steps:
acquiring a target image containing a sub-marker;
identifying sub-markers in the target image, grouping the sub-markers in the target image and obtaining a grouping result;
acquiring pixel coordinates of the sub-markers classified into the same group in an image coordinate system based on the grouping result;
acquiring physical coordinates of the same group of sub-markers, wherein the physical coordinates are used for representing the real physical positions of the sub-markers on the interaction device;
according to the pixel coordinates, the physical coordinates and the internal parameters of the terminal equipment, carrying out operation through a preset algorithm to obtain the current pose information of the interaction device to which the sub-markers belonging to the same group belong; the current pose information comprises a rotation parameter between an image coordinate system and a physical coordinate system of the terminal equipment and a translation parameter between the image coordinate system and the physical coordinate system of the terminal equipment; and
and determining that the interaction device to which the sub-markers belonging to the same group belong corresponds to the left hand or the right hand according to the current pose information.
2. The method of claim 1, wherein the identifying the sub-markers in the target image, grouping the sub-markers in the target image and obtaining the grouping result, comprises:
determining a fiducial sub-marker, the fiducial sub-marker being any one of the sub-markers in the target image;
acquiring relative distance information between the reference sub-marker and other sub-markers in the target image; and
sub-markers with the relative distance information greater than a distance threshold are grouped into one group, and sub-markers with the relative distance information less than or equal to the distance threshold are grouped into another group.
3. The method of claim 1, wherein the first markers comprise mutually distinguishable sub-markers and the second markers comprise mutually distinguishable sub-markers, wherein the sub-markers of the first markers are identical to the sub-markers of the second markers, but the sub-markers of the first markers are arranged differently from the sub-markers of the second markers;
the identifying the sub-markers in the target image, grouping the sub-markers in the target image and obtaining a grouping result comprises the following steps:
identifying sub-markers in the target image, and acquiring identity information of each sub-marker in the target image; and
sub-markers with the same identity information are respectively classified into different groups.
4. The method according to claim 2 or 3, wherein after the obtaining of the grouping result, the method further comprises:
Determining that the sub-markers classified into the same group belong to the first marker or the second marker according to the grouping result, comprising:
acquiring a first rule and a second rule, wherein the first rule is an arrangement rule of sub-markers of the first marker, and the second rule is an arrangement rule of sub-markers of the second marker;
acquiring the arrangement rule of the sub-markers classified into the same group in the target image;
acquiring a first matching degree between the arrangement rule of the sub-markers classified into the same group in the target image and the first rule, and a second matching degree between the arrangement rule of the sub-markers classified into the same group in the target image and the second rule; and
and determining that the sub-markers classified into the same group belong to the first marker or the second marker according to the first matching degree and the second matching degree.
5. The method of claim 1, wherein prior to the capturing the target image comprising the sub-marker, the method further comprises:
when the first interaction device and the second interaction device start to be used, acquiring a reference image containing the first marker and the second marker, and determining a first relative position of the first marker and the second marker in the reference image;
determining a second relative position between the first interaction device and the second interaction device according to the first relative position; and
and calibrating reference corresponding states of the first interaction device and the second interaction device according to the second relative position, wherein the reference corresponding states are used for representing that the first interaction device and the second interaction device respectively correspond to a left hand or a right hand.
6. The method of claim 5, wherein the determining, according to the current pose information, whether the interaction device to which the sub-markers belonging to the same group belong corresponds to a left hand or a right hand, comprises:
acquiring first historical pose information of the first interaction device and second historical pose information of the second interaction device according to a historical frame image containing the first marker and the second marker;
determining pose changes between the current pose information and the first and second historical pose information respectively;
and determining that the interaction device to which the sub-marker belonging to the same group belongs is the first interaction device or the second interaction device according to the pose change, and determining that the interaction device to which the sub-marker belonging to the same group belongs corresponds to the left hand or the right hand according to the reference corresponding state.
7. The method of claim 1, wherein after the obtaining the grouping result, the method further comprises:
determining that the sub-markers classified into the same group belong to the first marker or the second marker according to the grouping result, comprising:
acquiring a historical frame image comprising the first marker and the second marker;
identifying first image coordinates of a sub-marker of the first marker and second image coordinates of a sub-marker of the second marker in the history frame image;
acquiring current image coordinates of the sub-markers classified into the same group in the target image;
acquiring a first displacement degree of the current image coordinate relative to the first image coordinate;
acquiring a second displacement degree of the current image coordinate relative to the second image coordinate; and
and determining that the sub-markers classified into the same group belong to a first marker or a second marker according to the first displacement degree and the second displacement degree.
8. A terminal device, comprising:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to perform the method of any of claims 1-7.
9. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a program code, which is callable by a processor for executing the method according to any one of claims 1 to 7.
CN201911342720.5A 2019-12-23 2019-12-23 Interactive device identification method, terminal equipment and readable storage medium Active CN111176445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911342720.5A CN111176445B (en) 2019-12-23 2019-12-23 Interactive device identification method, terminal equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111176445A CN111176445A (en) 2020-05-19
CN111176445B (en) 2023-07-14

Family

ID=70655640

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911342720.5A Active CN111176445B (en) 2019-12-23 2019-12-23 Interactive device identification method, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111176445B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765498A (en) * 2018-05-30 2018-11-06 百度在线网络技术(北京)有限公司 Monocular vision tracking, device and storage medium
CN208722146U (en) * 2018-09-30 2019-04-09 广东虚拟现实科技有限公司 Wearable device for auxiliary positioning tracking
CN110120062A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Image processing method and device
CN110443853A (en) * 2019-07-19 2019-11-12 广东虚拟现实科技有限公司 Scaling method, device, terminal device and storage medium based on binocular camera

Also Published As

Publication number Publication date
CN111176445A (en) 2020-05-19

Similar Documents

Publication Publication Date Title
US11928838B2 (en) Calibration system and method to align a 3D virtual scene and a 3D real world for a stereoscopic head-mounted display
CN112527102B (en) Head-mounted all-in-one machine system and 6DoF tracking method and device thereof
CN106062862B (en) System and method for immersive and interactive multimedia generation
CN110443853B (en) Calibration method and device based on binocular camera, terminal equipment and storage medium
US11880956B2 (en) Image processing method and apparatus, and computer storage medium
JP6008397B2 (en) AR system using optical see-through HMD
US11244511B2 (en) Augmented reality method, system and terminal device of displaying and controlling virtual content via interaction device
CN110335307B (en) Calibration method, calibration device, computer storage medium and terminal equipment
CN109710056A (en) The display methods and device of virtual reality interactive device
US20200278763A1 (en) Rendering device and rendering method
US11127156B2 (en) Method of device tracking, terminal device, and storage medium
US20210056370A1 (en) Optical tag based information apparatus interaction method and system
US11380063B2 (en) Three-dimensional distortion display method, terminal device, and storage medium
CN111427452B (en) Tracking method of controller and VR system
CN110737414B (en) Interactive display method, device, terminal equipment and storage medium
CN110874868A (en) Data processing method and device, terminal equipment and storage medium
CN110688002B (en) Virtual content adjusting method, device, terminal equipment and storage medium
CN108769668A (en) Method for determining position and device of the pixel in VR display screens in camera imaging
CN110874135B (en) Optical distortion correction method and device, terminal equipment and storage medium
CN110874867A (en) Display method, display device, terminal equipment and storage medium
CN110737326A (en) Virtual object display method and device, terminal equipment and storage medium
CN111176445B (en) Interactive device identification method, terminal equipment and readable storage medium
CN111913560A (en) Virtual content display method, device, system, terminal equipment and storage medium
GB2581248A (en) Augmented reality tools for lighting design
CN110598605B (en) Positioning method, positioning device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant