CN112642141B - Simulated shooting system and coordinate conversion method thereof - Google Patents


Info

Publication number: CN112642141B
Authority: CN (China)
Legal status: Active
Application number: CN202011399076.8A
Original language: Chinese (zh)
Other versions: CN112642141A
Inventors: 张家华, 陈慧, 黄俊全, 吴东升
Current and original assignee: Beijing Leyard Equipment Technology Co Ltd
Application filed by Beijing Leyard Equipment Technology Co Ltd
Priority to CN202011399076.8A
Publication of CN112642141A; application granted; publication of CN112642141B

Classifications

    • A63F 13/219 — Input arrangements for video game devices characterised by their sensors, purposes or types, for aiming at specific areas on the display, e.g. light-guns
    • A63F 13/213 — Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/25 — Output arrangements for video game devices
    • A63F 13/837 — Shooting of targets


Abstract

The application provides a simulated shooting system and a coordinate conversion method thereof. The simulated shooting system comprises a display screen, a first optical marker combination, a plurality of optical motion capture cameras, and a data processing device. The first optical marker combination comprises at least three optical marker points, which cooperate to mark the position and posture of the display screen. The optical motion capture cameras capture the optical marker points in the first optical marker combination to form a motion capture image. The data processing device calculates the position and posture of the display screen from the motion capture image, and then calculates the coordinate conversion relation between the virtual space displayed by the display screen and the real space. With this simulated shooting system, the position and posture of the display screen can still be rapidly determined after the screen is moved, and the coordinate conversion relation between the virtual space and the real space can then be re-established, enabling rapid deployment of diversified simulated shooting scenes.

Description

Simulated shooting system and coordinate conversion method thereof
Technical Field
The application relates to the technical field of simulated shooting, in particular to a simulated shooting system and a coordinate conversion method thereof.
Background
In shooting games and simulated military training, targets are displayed on a screen and a shooter fires at them with a handheld simulated gun. An inertial sensor in the simulated gun, or an optical motion capture camera, captures the position and posture of the gun to determine its firing direction; back-end processing equipment then determines whether the shot hit the target according to the firing direction of the simulated gun and the coordinates of the target displayed on the screen.
To ensure that the back-end processing equipment can accurately determine the aimed position or target when the shooter fires, the spatial position of the display screen must be calibrated, and after calibration the screen must remain strictly fixed. In real shooting games and simulated military training, however, the screen position often needs to be adjusted, and multiple screens may even need to be arranged to suit the actual venue and training objectives; conventional measurement and positioning methods cannot quickly and accurately re-locate a display screen after its position has been adjusted.
Disclosure of Invention
To solve the above technical problem or at least partially solve the above technical problem, the present application provides a simulated shooting system and a coordinate conversion method thereof.
In one aspect, the present application provides a simulated shooting system comprising a display screen, a first optical marker assembly, a plurality of optical motion capture cameras, and a data processing device;
the first optical mark combination comprises at least three optical mark points, which cooperate to mark the position and posture of the display screen;
the optical motion capture cameras are used to capture the optical mark points in the first optical mark combination to form a motion capture image;
the data processing device is used to calculate the position and posture of the display screen from the motion capture image, and to calculate the coordinate conversion relation between the virtual space displayed by the display screen and the real space according to the position and posture of the display screen.
Optionally, the optical marker points in the first optical marker combination are all fixed on the display screen.
Optionally, the simulated shooting system further comprises a distance meter independent of the display screen;
the optical mark points in the first optical mark combination are fixed on the distance meter;
the data processing device is used to determine the position and posture of the display screen from the motion capture image and at least two measured distances;
wherein each measured distance is the distance, measured by the distance meter, from the light-emitting point of the distance meter to a pre-selected point on the display screen.
Optionally, the simulated shooting system further comprises a rigid body independent of the display screen; the rigid body is provided with a linear mark;
the optical mark points in the first optical mark combination are fixed on the rigid body;
the data processing device is used to determine the position and posture of the display screen from the motion capture image and at least two measured distances;
wherein: the measured distance is the distance from a specific point on the linear mark to a pre-selected point on the display screen when the linear mark is aligned with the pre-selected point.
Optionally, the simulated shooting system further comprises a simulated firearm and a second optical marker combination mounted on the simulated firearm;
the second optical mark combination comprises at least three optical mark points; optical mark points in the second optical mark combination are used for calibrating the position and the posture of the simulated gun;
the motion capture image captured by the optical motion capture cameras comprises information on the optical marker points in the second optical marker combination;
the data processing device is further used to determine the position and posture of the simulated gun from the motion capture image, and to determine the pointing direction of the simulated firearm from its position and posture.
Optionally, the optical mark points are active light-emitting points, and the first optical mark combination and the second optical mark combination have different light-code (blink-pattern) information;
the data processing device stores the light-code information of the first optical mark combination and the second optical mark combination.
In another aspect, the present application provides a coordinate conversion method for a simulated shooting system, used to determine the coordinate conversion relationship between the virtual space displayed on a display screen in the simulated shooting system and the real space, comprising:
acquiring a motion capture image formed by the optical motion capture cameras capturing the optical mark points in a first optical mark combination, wherein the first optical mark combination comprises at least three optical mark points that cooperate to calibrate the position and posture of the display screen;
determining the spatial position and posture of the display screen from the motion capture image;
and determining the coordinate conversion relation between the virtual space and the real space according to the spatial position and the posture of the display screen.
Optionally, the optical marker points in the first optical marker combination are all fixed on a distance meter independent of the display screen;
the method further comprises: acquiring the measured distance output by the distance meter, the measured distance being the distance from the light-emitting point of the distance meter to a pre-selected point on the display screen;
and determining the spatial position and posture of the display screen from the motion capture image comprises: determining the spatial position and posture of the display screen from the motion capture image and at least two measured distances.
Optionally, each optical mark point is an active light emitting point; the simulated shooting system comprises a plurality of display screens;
the simulated shooting system comprises a plurality of first optical mark combinations in one-to-one correspondence with the display screens, and the light-code (blink-pattern) information formed by the optical mark points in each first optical mark combination is different;
determining the spatial position and posture of the display screen from the motion capture image comprises:
determining the spatial position and posture of each display screen separately from the motion capture image and the light-code information of each first optical mark combination.
Optionally, the simulated shooting system further comprises a simulated firearm and a second optical marker combination mounted on the simulated firearm;
each second optical mark combination comprises at least three optical mark points; the optical mark points in the second optical mark combination are used to calibrate the position and posture of the simulated gun;
the method further comprises: determining the aiming direction of the simulated gun from the motion capture image;
and determining the target which is aimed by the simulated gun and displayed in the virtual space according to the aiming direction and the coordinate conversion relation.
In the simulated shooting system provided by the application, the position and posture of the display screen can be rapidly determined using the first optical marker combination and the optical motion capture cameras together with the corresponding data processing device, and the coordinate conversion relation between the virtual space and the real space can likewise be rapidly determined. With this system, the position and posture of the display screen can still be rapidly determined after the screen is moved, and the coordinate conversion relation between the virtual space and the real space re-established, enabling rapid deployment of diversified simulated shooting scenes.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic structural diagram of a simulated shooting system provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of a simulated shooting system provided by an embodiment of the present application;
FIG. 3 is a flow chart of a coordinate transformation method of a simulated shooting system provided by an embodiment of the application;
wherein: 11-display screen, 12-first optical marker combination, 13-optical motion capture camera, 14-data processing device, 15-range finder, 16-simulated gun, 17-second optical marker combination.
Detailed Description
In order that the above-mentioned objects, features and advantages of the present application may be more clearly understood, the solution of the present application will be further described below. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, but the present application may be practiced in other ways than those described herein; it is to be understood that the embodiments described in this specification are only some embodiments of the present application and not all embodiments.
The embodiments of the present application provide a simulated shooting system and a coordinate conversion method thereof, which can rapidly determine the coordinate conversion relation between the virtual space displayed by a display screen and the real space according to the position of the display screen, thereby enabling free movement and diversified deployment of the display screen.
Fig. 1 is a schematic structural diagram of a simulated shooting system provided in an embodiment of the present application. As shown in Fig. 1, the simulated shooting system provided by the embodiment of the present application includes a display screen 11, a first optical marker combination 12, a plurality of optical motion capture cameras 13, and a data processing device 14.
In the embodiment of the application, the display screen 11 may be arranged arbitrarily within a set spatial range, and may be of various types, such as an active light-emitting flat display screen, a curved display screen, a spherical screen, or a passive projection screen.
The first optical marker combination 12 is used to mark the position and posture of the display screen 11. It contains at least three optical marker points, which cooperate to mark the position and posture of the display screen 11.
The optical motion capture cameras 13 are used to capture the optical marker points in the first optical marker combination 12 to form a motion capture image. In the embodiment of the present application, a plurality of optical motion capture cameras 13 are arranged relatively uniformly, and the position of each camera is calibrated in order to establish the coordinate correspondence between the motion capture image and the real space.
The data processing device 14 is used to determine the spatial position and posture of the display screen 11 from the motion capture image collected by the optical motion capture cameras 13; it then determines the coordinate conversion relation between the virtual space displayed by the display screen 11 and the real space according to that position and posture.
According to the foregoing analysis, once the relative positional relationship between the optical marker points in the first optical marker combination 12 and the display screen 11 is known, the coordinates of the optical marker points in real space can be determined from the motion capture image, and the position and posture of the display screen 11 can then be obtained by the corresponding coordinate transformation.
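As an illustration of this kind of pose recovery (a sketch added for clarity, not part of the patent text; all function names are hypothetical), the screen's rotation and translation can be obtained from three non-collinear marker points whose screen-local coordinates are known and whose world coordinates are triangulated from the motion capture images:

```python
import math

def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def norm(a):
    n = math.sqrt(sum(x * x for x in a))
    return [x / n for x in a]

def frame(p1, p2, p3):
    """Orthonormal basis (three vectors) built from three non-collinear points."""
    x = norm(sub(p2, p1))
    z = norm(cross(sub(p2, p1), sub(p3, p1)))
    y = cross(z, x)
    return [x, y, z]

def pose_from_markers(local_pts, world_pts):
    """Rotation matrix R and translation t such that world = R @ local + t.

    local_pts: the three marker points in the screen's own frame (known fixture).
    world_pts: the same points as triangulated by the motion capture cameras.
    """
    L = frame(*local_pts)   # basis expressed in the screen's local frame
    W = frame(*world_pts)   # the same basis observed in world coordinates
    # R maps local basis vectors onto world basis vectors: R = W_cols @ L_cols^T
    R = [[sum(W[k][i] * L[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    t = [world_pts[0][i] - sum(R[i][j] * local_pts[0][j] for j in range(3))
         for i in range(3)]
    return R, t
```

With more than three markers, a least-squares fit (e.g. the Kabsch algorithm) would be more robust to measurement noise.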
Once the position and posture of the physical screen are determined, and the mapping between the pixels of the display screen 11 and the virtual space it displays is known, the coordinate conversion relation between the virtual space displayed on the display screen 11 and the real space can be determined from the position and posture of the display screen 11.
As can be seen from the foregoing analysis, in the embodiment of the present application, the position and the posture of the display screen 11 can be quickly determined by using the first optical marker combination 12 and the optical motion capture camera 13 in combination with the corresponding data processing device 14, and thus the coordinate transformation relationship between the virtual space and the real space can be quickly determined.
With this simulated shooting system, the position and posture of the display screen 11 can still be quickly determined after the screen is moved, and the coordinate conversion relation between the virtual space and the real space then re-established, so that diversified simulated shooting scenes can be deployed rapidly.
In the embodiment of the present application, the optical mark points in the first optical mark combination 12 are disposed in the following manners.
In the first case, the optical mark points in the first optical mark combination 12 are all fixed on the display screen 11, and the relative position relationship between the optical mark points in the first optical mark combination 12 and the display screen 11 is determined.
In this case, if a plurality of display screens 11 need to be arranged, a plurality of sets of first optical marker combinations 12 need to be provided; alternatively, it is necessary to set the optical marker points in the first optical marker set 12 as detachable marker points and set each optical marker point at a specific position on the display screen 11.
Fig. 2 is a schematic diagram of a simulated shooting system provided by an embodiment of the present application. As shown in Fig. 2, in the second case the simulated shooting system further includes a range finder 15 in addition to the first optical marker combination 12, the optical motion capture cameras 13, the data processing device 14, and the display screen 11. The optical marker points in the first optical marker combination 12 are fixed on the range finder 15, which is independent of the display screen 11.
In this case, the relative positional relationship between the light-emitting point of the range finder 15 and the optical marker points in the first optical marker combination 12 is already determined.
Correspondingly, the data processing device 14 may determine the position and posture of the display screen 11 from the motion capture image and at least two measured distances, where each measured distance is the distance, measured by the range finder 15, from its light-emitting point to a pre-selected point on the display screen 11.
In this second case, the data processing device 14 can determine the position and posture of the range finder 15 from the motion capture image, and hence the spatial coordinates of its light-emitting point and the direction along which the emitted beam extends. Once the measured distance from the light-emitting point to a pre-selected point on the display screen 11 is obtained, the spatial coordinates of each pre-selected point can be determined from the measured distance and the pointing direction of the range finder 15.
The position and attitude of the display screen 11 can be determined from the spatial coordinates of the preselected points and the positions of the two preselected points on the display screen 11.
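As a concrete sketch of this geometry (illustrative only; the patent does not prescribe an algorithm, and the assumption of an upright screen is mine), each pre-selected point lies at the rangefinder's emitter position plus the measured distance along the beam direction, and two such points then fix the horizontal position and yaw of a vertical screen:

```python
import math

def point_from_rangefinder(emitter, direction, distance):
    """World coordinates of the pre-selected screen point hit by the beam.

    emitter: beam origin; direction: pointing vector (both recovered from the
    rangefinder's marker pose); distance: the rangefinder's reading.
    """
    n = math.sqrt(sum(d * d for d in direction))
    return [e + distance * d / n for e, d in zip(emitter, direction)]

def screen_pose_2d(pA, pB, xA, xB):
    """Horizontal position and yaw of an upright screen from two measured points.

    pA, pB: measured points' horizontal world coordinates (x, y);
    xA, xB: the same points' offsets along the screen surface (known geometry).
    Assumes the screen is vertical, so two points fix the remaining freedom.
    """
    yaw = math.atan2(pB[1] - pA[1], pB[0] - pA[0])
    # world position of the screen's local origin (offset 0 along the surface)
    origin = [pA[0] - xA * math.cos(yaw), pA[1] - xA * math.sin(yaw)]
    return origin, yaw
```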
In the embodiment of the present application, the range finder 15 may be a laser range finder. With the foregoing method, the positions and postures of a plurality of display screens 11 can be determined using a single range finder 15 and a single first optical marker combination 12, thereby reducing the number of first optical marker combinations 12 required.
Of course, in practical applications a rigid body bearing a linear mark may also be provided; when measuring, the linear mark is aligned with a pre-selected point on the display screen 11, and a measuring tool such as a tape measure is used to measure the distance from a specific point on the line to the pre-selected point, thereby determining the measured distance.
The present embodiment provides a simulated shooting system, which comprises a simulated firearm 16 and a second optical marker combination 17, in addition to the aforementioned display screen 11, first optical marker combination 12, a plurality of optical motion capture cameras 13 and data processing device 14.
The second optical mark combination 17 comprises at least three optical mark points. The optical marker points in the second optical marker combination 17 are fixed on the simulated firearm 16 and are used for calibrating the position and the posture of the simulated firearm 16.
When capturing the optical marker points in the first optical marker combination 12 to form the motion capture image, the optical motion capture cameras 13 simultaneously capture the optical marker points in the second optical marker combination 17 in the same image. The data processing device 14 is also operable to determine the position and posture of the simulated firearm 16 from the motion capture image, and to determine the pointing direction of the simulated firearm 16 from that position and posture.
In practical applications of this simulated shooting system, the optical motion capture cameras 13 can measure both the position of the display screen 11 and the pointing direction of the simulated firearm 16, so that a single motion capture setup measures all of the key quantities in the system.
Of course, in other applications of the embodiments of the present application, the simulated firearm 16 may also carry an inertial sensor used to determine its pointing direction. When the simulated firearm 16 is fired, the target in the virtual space at which it is actually aimed can be determined from its pointing direction and the coordinate conversion relationship identified above.
In the embodiment of the present application, the optical marker points are active light-emitting points. By adjusting light-emitting characteristics such as the blink frequency of each active light-emitting point, the first optical marker combination 12 and the second optical marker combination 17 can be given different light-code (blink-pattern) information; the data processing device 14 stores the light-code information of each first optical marker combination 12 and second optical marker combination 17.
When the data processing device 14 processes the motion capture image captured by the optical motion capture cameras 13, the first optical marker combination 12 and the second optical marker combination 17 can be identified from the stored light-code information and the image content, and the display screen 11 and the simulated firearm 16 can thus be distinguished.
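A minimal sketch of how such light-code matching might look (hypothetical names and a frequency-based encoding; the patent only specifies that the blink patterns of the combinations differ): the blink frequencies observed for a tracked cluster of marker points are compared against the stored table:

```python
def identify_combination(observed_freqs, known_patterns, tol=0.5):
    """Match observed blink frequencies (Hz) of a marker cluster to a stored table.

    known_patterns maps a name (e.g. 'screen_1', 'gun_1') to the blink
    frequencies of its marker points. Returns the matching name, or None if
    no stored pattern matches within the tolerance.
    """
    obs = sorted(observed_freqs)
    for name, freqs in known_patterns.items():
        ref = sorted(freqs)
        if len(ref) == len(obs) and all(abs(a - b) <= tol for a, b in zip(obs, ref)):
            return name
    return None
```

In practice the tolerance would be set by the camera frame rate and the separation between the frequencies assigned to different combinations.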
Of course, in other applications, if the relative spatial arrangements of the optical marker points in the first optical marker combination 12 and the second optical marker combination 17 differ, the points in both combinations may instead be passive (reflective) marker points; after receiving the motion capture image, the data processing device 14 identifies the corresponding first optical marker combination 12 and second optical marker combination 17 from the positional relationships among the passive points.
In addition to providing the simulated shooting system described above, the embodiment of the present application also provides a coordinate transformation method for the simulated shooting system, where the coordinate transformation method is used to determine the coordinate transformation relationship between the virtual space and the real space displayed by the display screen 11 in the simulated shooting system.
Fig. 3 is a flowchart of a coordinate transformation method of a simulated shooting system according to an embodiment of the present application. As shown in fig. 3, the method provided by the embodiment of the present application includes steps S101 to S103.
S101: and acquiring a dynamic capture image formed by capturing the optical mark points in the first optical mark combination by the optical dynamic capture camera.
In step S101, the first optical mark combination includes at least three optical mark points, and the position of each optical mark point relative to the display screen can be determined directly or by calculation; the optical mark points cooperate to calibrate the position and posture of the display screen.
In the embodiment of the application, the simulated shooting system is provided with a plurality of uniformly arranged optical motion capture cameras; the position of each camera is calibrated so as to establish the coordinate correspondence between the motion capture images it acquires and the real space.
S102: and determining the spatial position and the posture of the display screen according to the motion capture image.
Determining the spatial position and posture of the display screen from the motion capture image means that, after the image is acquired, the position of each optical mark point is first determined from it, and the spatial position and posture of the display screen are then determined from the position of each optical mark point relative to the screen.
S103: and determining the coordinate conversion relation between the virtual space and the real space according to the spatial position and the posture of the display screen.
Once the spatial position and posture of the display screen are determined, the coordinates of each pixel of the screen in the spatial coordinate system are also determined, and the coordinates in the virtual coordinate system of the content displayed by each pixel can be obtained from the graphics card's rendering. With the position of every pixel known in both coordinate systems, the coordinate conversion relation between the virtual space and the real space can be determined.
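The pixel-to-real-space half of this mapping can be sketched as follows (an illustrative assumption, not the patent's specification: the screen surface is the local x-y plane of its pose, with a uniform physical pixel pitch):

```python
def pixel_to_world(u, v, R, t, pitch):
    """World coordinates of pixel (u, v) on a screen with pose (R, t).

    R: 3x3 rotation of the screen's local frame; t: world position of the
    screen's local origin; pitch: physical pixel spacing in metres. The
    screen surface is assumed to be the local x-y plane (z = 0).
    """
    local = [u * pitch, v * pitch, 0.0]
    return [t[i] + sum(R[i][j] * local[j] for j in range(3)) for i in range(3)]
```

Pairing each pixel's world coordinates with the virtual-space coordinates of the content it displays yields the coordinate conversion relation between the two spaces.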
The coordinate conversion relation between the virtual space and the real space can be used to calculate the aimed position of the simulated gun in the simulated shooting system, and thus to determine whether the simulated gun hits a specific target when fired.
In the embodiment of the application, the optical mark points in the first optical mark combination may be fixed on a distance meter independent of the display screen. In this case, step S102 specifically includes steps S1021 and S1022.
S1021: and acquiring the measuring distance output by the distance measuring instrument.
The measured distance in step S1021 is the distance from the light-emitting point of the distance meter to a pre-selected point on the display screen, i.e. the distance obtained after the pre-selected point is brought onto the line along which the distance meter's beam travels.
S1022: and determining the spatial position and the posture of the display screen according to the dynamic captured image and the at least two measuring distances.
Because the optical mark points of the first optical mark combination are fixed on the distance meter, the pointing direction of the distance meter can be determined from the first optical mark combination, and the spatial coordinates of specific points on the display screen can then be determined from that pointing direction and the measured distances. Once the spatial coordinates of at least two specific points are known, the spatial position and posture of the display screen can be determined from the relative positions of those points on the screen and the size of the screen.
In an embodiment of the application, each optical marker point is an active light-emitting point, and the simulated shooting system may include multiple display screens. The system then includes multiple first optical marker combinations in one-to-one correspondence with the display screens, and the light-signal information formed by the marker points of each first optical marker combination is different.
In the corresponding step S102, determining the spatial position and attitude of the display screen from the motion capture image specifically includes: determining the spatial position and attitude of each display screen from the motion capture image and the light-signal information of each first optical marker combination.
Because the light-signal information of each first optical marker combination differs, even when multiple marker combinations are in view, the emission-frequency pattern of the marker points identifies which first optical marker combination is being observed, and therefore which display screen it belongs to, so that the spatial position and attitude of each display screen can be determined separately.
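For illustration only, the identification step can be sketched as matching observed blink frequencies against registered light-signal signatures. The signature values, screen names, and tolerance below are hypothetical:

```python
def identify_screen(observed_freqs, signatures, tol=0.5):
    """Decide which display screen an observed marker combination belongs
    to, by comparing its blink frequencies (Hz) against the registered
    light-signal signatures. Returns the screen id, or None if no match."""
    obs = sorted(observed_freqs)
    for screen_id, sig in signatures.items():
        ref = sorted(sig)
        if len(ref) == len(obs) and all(abs(a - b) <= tol
                                        for a, b in zip(obs, ref)):
            return screen_id
    return None

# Hypothetical registered signatures, one per display screen
SIGNATURES = {
    "screen_1": (10.0, 20.0, 30.0),
    "screen_2": (12.0, 24.0, 36.0),
}
```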
In particular applications of embodiments of the present application, the simulated shooting system may further include a simulated firearm and a second optical marker combination mounted on the simulated firearm. Each second optical marker combination likewise includes at least three optical marker points, which cooperate to mark the position and attitude of the simulated firearm.
When the simulated firearm and the second optical marker combination are present, the aforementioned coordinate conversion method may further include steps S104 and S105.
S104: determine the aiming direction of the simulated firearm from the motion capture image.
S105: determine the target displayed in the virtual space at which the simulated firearm is aimed, according to the aiming direction and the coordinate conversion relationship.
Determining the aiming direction of the simulated firearm from the motion capture image proceeds as follows: the position of each optical marker point in the second optical marker combination is determined from the motion capture image; the muzzle direction of the simulated firearm is determined from those positions together with the known locations of the marker points on the firearm; and the aiming direction is then determined from the muzzle direction.
Once the aiming direction is determined, the target pointed to along that direction in the virtual space can be determined using the coordinate conversion relationship of step S103.
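As a sketch of step S105 (illustrative only, with the screen-plane parameters assumed known from the earlier pose estimation), the real-space aim point can be found by intersecting the aiming ray with the screen plane; the resulting point would then be mapped into the virtual space via the coordinate conversion relationship:

```python
import numpy as np

def aim_point_on_screen(muzzle_pos, aim_dir, screen_origin, screen_normal):
    """Intersect the firearm's aiming ray with the screen plane.

    Returns the real-space hit point, or None if the ray is parallel to
    the plane or the screen lies behind the muzzle.
    """
    d = np.asarray(aim_dir, dtype=float)
    d = d / np.linalg.norm(d)
    n = np.asarray(screen_normal, dtype=float)
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:          # ray parallel to the screen plane
        return None
    t = np.dot(np.asarray(screen_origin, dtype=float)
               - np.asarray(muzzle_pos, dtype=float), n) / denom
    if t < 0:                      # screen is behind the muzzle
        return None
    return np.asarray(muzzle_pos, dtype=float) + t * d
```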
It is noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. The terms "comprises," "comprising," or any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
The above description is merely exemplary of the present application and is presented to enable those skilled in the art to understand and practice it. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A simulated shooting system, comprising a display screen, a first optical marker combination, a plurality of optical motion capture cameras, and a data processing device;
the first optical marker combination comprises at least three optical marker points; the optical marker points in the first optical marker combination cooperate to mark the position and attitude of the display screen;
the optical motion capture cameras are used to capture the optical marker points in the first optical marker combination to form a motion capture image;
the data processing device is used to calculate the position and attitude of the display screen from the motion capture image and to acquire the coordinates, in a virtual space, of the content displayed at each pixel of the display screen; and to calculate, from the position and attitude of the display screen and those coordinates, the coordinate conversion relationship between the virtual space displayed by the display screen and the real space, the coordinate conversion relationship being used to calculate the position in the virtual space targeted by a simulated firearm of the simulated shooting system.
2. The simulated shooting system of claim 1, wherein
the optical marker points in the first optical marker combination are all fixed on the display screen.
3. The simulated shooting system of claim 2, further comprising a range finder independent of said display screen;
the optical marker points in the first optical marker combination are fixed on the range finder;
the data processing device is used to determine the position and attitude of the display screen from the motion capture image and at least two measured distances;
wherein each measured distance is the straight-line distance, measured by the range finder, from the light-emitting point of the range finder to a pre-selected point on the display screen.
4. The simulated shooting system of claim 3, further comprising a rigid body separate from said display screen, the rigid body bearing a linear mark;
the optical marker points in the first optical marker combination are fixed on the rigid body;
the data processing device is used to determine the position and attitude of the display screen from the motion capture image and at least two measured distances;
wherein each measured distance is the distance from a specific point on the linear mark to a pre-selected point on the display screen when the linear mark is aligned with that pre-selected point.
5. The simulated shooting system of any of claims 1-4, further comprising a simulated firearm and a second optical marker combination mounted on the simulated firearm;
the second optical marker combination comprises at least three optical marker points; the optical marker points in the second optical marker combination are used to calibrate the position and attitude of the simulated firearm;
the motion capture image captured by the optical motion capture cameras includes information on the optical marker points in the second optical marker combination;
the data processing device is further used to determine the position and attitude of the simulated firearm from the motion capture image, and to determine the pointing direction of the simulated firearm from its position and attitude.
6. The simulated shooting system of claim 5, wherein:
each optical marker point is an active light-emitting point; the first optical marker combination and the second optical marker combination have different light-signal information;
the data processing device stores the light-signal information of the first optical marker combination and the second optical marker combination.
7. A coordinate conversion method of a simulated shooting system, for determining the coordinate conversion relationship between a virtual space displayed by a display screen of the simulated shooting system and a real space, characterized by comprising:
acquiring a motion capture image formed by an optical motion capture camera capturing optical marker points in a first optical marker combination; the first optical marker combination comprises at least three optical marker points, which cooperate to calibrate the position and attitude of the display screen;
determining the spatial position and attitude of the display screen from the motion capture image;
and determining the coordinate conversion relationship between the virtual space and the real space from the spatial position and attitude of the display screen, the coordinate conversion relationship being used to calculate the position in the virtual space targeted by a simulated firearm of the simulated shooting system.
8. The coordinate conversion method of the simulated shooting system of claim 7, wherein
the optical marker points in the first optical marker combination are all fixed on a range finder independent of the display screen;
the method further comprises: acquiring the measured distance output by the range finder, the measured distance being the distance from the light-emitting point of the range finder to a pre-selected point on the display screen;
and determining the spatial position and attitude of the display screen from the motion capture image comprises: determining the spatial position and attitude of the display screen from the motion capture image and at least two measured distances.
9. The coordinate conversion method of the simulated shooting system of claim 7, wherein: each optical marker point is an active light-emitting point, and the simulated shooting system comprises a plurality of display screens;
the simulated shooting system comprises a plurality of first optical marker combinations in one-to-one correspondence with the display screens, and the light-signal information formed by the optical marker points of each first optical marker combination is different;
determining the spatial position and attitude of the display screen from the motion capture image comprises:
determining the spatial position and attitude of each display screen from the motion capture image and the light-signal information of each first optical marker combination.
10. The coordinate conversion method of the simulated shooting system of claim 7, wherein
the simulated shooting system further comprises a simulated firearm and a second optical marker combination mounted on the simulated firearm;
each second optical marker combination comprises at least three optical marker points; the optical marker points in the second optical marker combination are used to calibrate the position and attitude of the simulated firearm;
the method further comprises: determining the aiming direction of the simulated firearm from the motion capture image;
and determining, from the aiming direction and the coordinate conversion relationship, the target displayed in the virtual space at which the simulated firearm is aimed.
CN202011399076.8A 2020-12-02 2020-12-02 Simulated shooting system and coordinate conversion method thereof Active CN112642141B (en)

Publications (2)

Publication Number Publication Date
CN112642141A CN112642141A (en) 2021-04-13
CN112642141B true CN112642141B (en) 2022-02-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant