CN104121892A - Method, device and system for acquiring light gun shooting target position - Google Patents

Method, device and system for acquiring light gun shooting target position

Info

Publication number
CN104121892A
CN104121892A, CN201410325348A
Authority
CN
China
Prior art keywords
binocular camera
coordinate system
light source
camera
light gun
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410325348.8A
Other languages
Chinese (zh)
Other versions
CN104121892B (en)
Inventor
李乐
周琨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huanchuang Technology Co ltd
Original Assignee
SHENZHEN TVPALY TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHENZHEN TVPALY TECHNOLOGY Co Ltd filed Critical SHENZHEN TVPALY TECHNOLOGY Co Ltd
Priority to CN201410325348.8A priority Critical patent/CN104121892B/en
Publication of CN104121892A publication Critical patent/CN104121892A/en
Application granted granted Critical
Publication of CN104121892B publication Critical patent/CN104121892B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention, applicable to the field of electronic equipment, provides a method for acquiring the target position at which a light gun is shooting. The method comprises: acquiring in real time the position of a first light source of the light gun in the images generated by a binocular camera; calculating, from that position, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera; and determining the position of a second light source of the light gun on a target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen. Compared with the prior art, the method establishes a determinate positional relationship among the light gun, the binocular camera, and the target screen, so that the position of the second light source of the light gun on the target screen is obtained more accurately.

Description

Method, apparatus, and system for acquiring the target position of a light gun shot
Technical field
The invention belongs to the field of electronic equipment, and in particular relates to a method, apparatus, and system for acquiring the target position of a light gun shot.
Background technology
As an important prop in motion-sensing shooting games, the light gun simulates a bullet with a beam of light. It frees shooting games from pure key-press control and, by exploiting the rectilinear propagation of light to simulate gunfire, provides a more vivid shooting effect.
To accurately obtain the target position at which the light gun shoots on the screen, the light gun must be located effectively. The existing common practice combines optical sensing with motion sensors: a visible-light camera is mounted at the target-screen end and a visible luminous ball is mounted on the handheld terminal; the camera judges the spatial position of the handheld terminal from the apparent size of the luminous ball, while sensors such as an accelerometer and a gyroscope on the controller provide the movement angle of the handheld terminal, from which the corresponding control instruction is derived.
However, because the spatial relationship among the handheld terminal, the camera, and the target screen is not known accurately, the target position obtained by the target screen is imprecise when the handheld terminal sends a control instruction, for example when the handheld terminal is a light gun emitting a firing instruction.
Summary of the invention
The object of the embodiments of the present invention is to provide a method, apparatus, and system for acquiring the target position of a light gun shot, so as to solve the prior-art problem that the target position obtained by the target screen is imprecise because the spatial relationship among the handheld terminal, the camera, and the target screen is inaccurate.
An embodiment of the present invention is implemented as a method for acquiring the target position of a light gun shot, wherein a first light source and a second light source are provided on the light gun and a binocular camera is provided on the target screen, the method comprising:
acquiring in real time the position of the first light source of the light gun in the images generated by the binocular camera;
calculating, from the position of the first light source in the images generated by the binocular camera, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera;
determining the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
Another aspect of the embodiments of the present invention provides an apparatus for acquiring the target position of a light gun shot, wherein a first light source and a second light source are provided on the light gun and a binocular camera is provided on the target screen, the apparatus comprising:
a real-time position acquiring unit, configured to acquire in real time the position of the first light source of the light gun in the images generated by the binocular camera;
an attitude relationship matrix calculating unit, configured to calculate, from the position of the first light source in the images generated by the binocular camera, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera;
a position determining unit, configured to determine the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
Another aspect of the embodiments of the present invention provides a system for acquiring the target position of a light gun shot. The system comprises a light gun, a binocular camera, and a controller. A first light source and a second light source are provided on the light gun, the binocular camera can be mounted on the target screen at which the light gun shoots, and the controller is connected to the target screen, the light gun, and the binocular camera respectively. The controller acquires in real time the position of the first light source of the light gun in the images generated by the binocular camera; calculates, from that position, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera; and determines the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
In the embodiments of the present invention, the position of the first light source on the light gun in the images generated by the binocular camera mounted on the target screen determines the first attitude relationship matrix describing the spatial relationship between the light gun and the binocular camera. From this first attitude relationship matrix and the pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen, the position of the second light source of the light gun on the target screen is then determined. Compared with the prior art, the scheme of the present invention establishes a determinate positional relationship among the light gun, the binocular camera, and the target screen, so that the position of the second light source of the light gun on the target screen is obtained more accurately.
Brief description of the drawings
Fig. 1 is a flowchart of the method for acquiring the target position of a light gun shot provided by an embodiment of the present invention;
Fig. 2 is a structural diagram of the system for acquiring the target position of a light gun shot provided by an embodiment of the present invention;
Fig. 3 is a flowchart of the calibration that generates the second attitude relationship matrix, provided by an embodiment of the present invention;
Fig. 4 is a structural diagram of the apparatus for acquiring the target position of a light gun shot provided by an embodiment of the present invention.
Embodiment
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit it.
The embodiments of the present invention are applicable to various interactive devices that use light as a control terminal, such as shooting game machines and other game machines. The light gun may be understood as any handheld terminal controlled by a light signal; since such terminals appear in game equipment mainly in the form of a light gun, they are referred to as light guns in this application. The firing point of the light gun is mapped onto the target screen so that the corresponding control instruction acts on that screen position. By improving the accuracy of the light gun's target position and of the responding coordinates, the present invention further improves the user experience.
Fig. 1 shows the flow of the method for acquiring the target position of a light gun shot provided by an embodiment of the present invention. A first light source and a second light source are provided on the light gun, and a binocular camera is provided on the target screen. The method is detailed as follows:
In step S101, the position of the first light source of the light gun in the images generated by the binocular camera is acquired in real time.
Specifically, the first light source and the second light source provided on the light gun are used for calibration and for shooting respectively. The first light source may be an infrared light source or a visible light source. When the first light source is an infrared light source, the binocular camera receiving it is an infrared camera; when the first light source is a visible light source, the binocular camera may be an ordinary camera.
The first light source may consist of three infrared or visible light sources, or of more than three. Using multiple first light sources further improves the positioning accuracy; four infrared light sources are generally selected.
The second light source is used for shooting, i.e. for emitting a light signal from the light gun toward the target screen, and may be a laser source. A laser travels in only one direction and the divergence of its beam is extremely small, only about 0.001 radian, so the beam is nearly parallel; because laser light is directional, a large number of photons are emitted within a very small spatial range, the energy density is high, and the corresponding brightness is also very high. Compared with focusing an ordinary light source, a laser therefore better meets the requirements of shooting.
The target screen may be an ordinary television or game machine screen, or the screen of other liquid-crystal display equipment. Multiple calibration points may be preset on the screen for calibrating the current attitude and position of the light gun. In a preferred embodiment, four calibration points are placed at the four corners of the target screen and a further calibration point is placed at the center of the screen, which achieves a better calibration effect.
The binocular camera works on the principle of simulating human visual processing of a scene: two cameras in fixed relative positions image the same scene (here, the first light source of the light gun) simultaneously from different angles, and the three-dimensional information of the scene is obtained by computing the parallax of corresponding points. An illustrative sketch of this principle follows.
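As an illustration of the stereo principle just described (not one of the patent's own formulas), the depth of a point seen by two parallel cameras follows directly from its disparity; the focal length f (in pixels) and baseline B used below are assumptions standing in for the factory calibration of the binocular camera.

```python
def depth_from_disparity(x_left: float, x_right: float, f: float, B: float) -> float:
    """Depth of a point from its horizontal image coordinates in two parallel cameras."""
    disparity = x_left - x_right        # parallax between the two images, in pixels
    return f * B / disparity            # distance of the point from the camera baseline
```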
The images generated by the binocular camera are two two-dimensional images, and each two-dimensional image contains the position of the first light source of the light gun in that image.
The position of the first light source in each two-dimensional image is obtained by first filtering out the environment through synchronized exposure. Specifically, when the first light source is lit it notifies the recognizer by an infrared signal and the recognizer exposes once; when the first light source goes out, the recognizer exposes a second time. Subtracting the two frames yields a frame-difference image in which only the first light source remains, the ambient light having been filtered out by the subtraction. Image recognition on the frame-difference image then yields the position of the first light source in the image, as sketched below.
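A minimal sketch of this synchronized-exposure detection, assuming 8-bit grayscale frames; the threshold value and the centroid step are illustrative choices, not specified by the patent.

```python
import numpy as np

def locate_light_source(frame_on, frame_off, threshold=40):
    """Return the (x, y) pixel position of the light source, or None if it is not found."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)   # frame-difference image
    mask = diff > threshold                  # only the light source survives the subtraction
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()              # centroid of the bright blob
```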
In step S102, the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera is calculated from the position of the first light source in the images generated by the binocular camera.
Because the first light source appears in the two images generated by the binocular camera, and because the two cameras (generally mounted in parallel) are at different positions, the position of the first light source in the two images exhibits parallax. From the known baseline distance between the parallel-mounted cameras and the positions of the first light source in the two images, the spatial attitude information of the light gun can be obtained.
Specifically and optionally, the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera may be calculated according to the following formulas:
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w
where p_0 and p_1 are the positions of the first light source in the two images generated by the binocular camera; Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; K_0, K_1, M_0, and M_1 are determined when the binocular recognizer leaves the factory. M_t is the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera, and P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun.
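The formulas above define M_t only implicitly. The sketch below shows one way to recover it numerically; it is not the patent's own solver. The six pose parameters of M_t are fitted by non-linear least squares so that the projections of the known marker coordinates P_w through Mc_0 and Mc_1 match the observed image positions p_0 and p_1. The function names and the choice of scipy's least_squares are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def homogeneous_pose(params):
    """Build a 4x4 pose matrix from a 3-element rotation vector and a 3-element translation."""
    M = np.eye(4)
    M[:3, :3] = Rotation.from_rotvec(params[:3]).as_matrix()
    M[:3, 3] = params[3:]
    return M

def solve_gun_pose(Mc0, Mc1, P_w, p0, p1, x0=np.zeros(6)):
    """Least-squares estimate of the 4x4 pose M_t from both cameras' observations.
    Mc0, Mc1: 3x4 projection matrices; P_w: 4xN homogeneous marker coordinates (gun frame);
    p0, p1: 2xN observed pixel positions of the markers in each camera image."""
    def project(Mc, M_t):
        q = Mc @ M_t @ P_w                  # p = Mc * M_t * P_w (patent formula), homogeneous
        return q[:2] / q[2]
    def residuals(params):
        M_t = homogeneous_pose(params)
        return np.concatenate([(project(Mc0, M_t) - p0).ravel(),
                               (project(Mc1, M_t) - p1).ravel()])
    return homogeneous_pose(least_squares(residuals, x0).x)
```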
As shown in Fig. 2, an infrared LED light source is provided on the light gun (acting as the locator) as the first light source. The coordinate system of the light gun locator is Ow, the coordinate system of the binocular infrared camera acting as the sensor is Oc, and the coordinate system of the target screen is Os. The parameter matrix of the ray emitted by the laser source serving as the light gun's second light source is L, and the 2D position at which the light gun hits the target screen is (pixX, pixY).
Solving for M_t is a real-time process: while the light gun is in use, its position and orientation in space change constantly, so the change of its spatial relationship must be recorded continuously. The first attitude relationship matrix M_t between the coordinate system of the light gun and the coordinate system of the binocular camera may therefore be recalculated at a predetermined update period.
In step S103, the position of the second light source of the light gun on the target screen is determined from the calculated first attitude relationship matrix and the pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
Specifically, in the step of determining the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and the pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen, the position of the second light source of the light gun on the target screen is calculated according to the following formulas:
L_c = M_t * L * M_t^T
c = M * L_c * M^T * P
pixX = c(1)/c(4)
pixY = c(2)/c(4)
where Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun; L is the ray parameter matrix of the light gun; M_t is the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera; M is the second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen; M_t^T is the transpose of M_t and M^T is the transpose of M; P is the plane parameter vector of the target screen; c(1), c(2), and c(4) are the 1st, 2nd, and 4th components of the vector c; and (pixX, pixY) is the coordinate position on the target screen.
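As a sketch, the formulas of step S103 can be evaluated directly once the matrices are known. The snippet below assumes M_t, L, and M are 4x4 homogeneous matrices and P is a 4-element plane parameter vector, and translates the patent's 1-based components c(1)..c(4) into Python's 0-based indexing.

```python
import numpy as np

def screen_hit_point(M_t, L, M, P):
    """Map the light-gun ray onto the target screen using the patent's formulas."""
    L_c = M_t @ L @ M_t.T          # L_c = M_t * L * M_t^T, ray parameters in the camera frame
    c = M @ L_c @ M.T @ P          # c = M * L_c * M^T * P, homogeneous result in the screen frame
    pixX = c[0] / c[3]             # c(1)/c(4) in the patent's 1-based notation
    pixY = c[1] / c[3]             # c(2)/c(4)
    return pixX, pixY
```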
In the embodiment of the present invention, the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera is obtained in real time and combined with the previously obtained second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen, so that the coordinate at which the light gun shoots any position of the target screen is obtained effectively and the accuracy of aimed shooting is improved.
Because the second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen describes the positional relationship between the binocular camera and the target screen, and because the binocular camera, once placed above, below, to the left of, or to the right of the target screen, generally does not move again, the matrix can be calibrated once in advance. As shown in Fig. 3, the calibration process comprises the following steps:
In step S301, the second light source of the light gun is aimed at and fired at the coordinates (pixX, pixY) of a plurality of calibration points on the target screen, and at each shot the first attitude relationship matrix M_t between the coordinate system of the light gun and the coordinate system of the binocular camera is calculated from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w
In step S302, from the calculated first attitude relationship matrices M_t and the coordinates of the plurality of calibration points, the second attitude relationship matrix M between the coordinate system of the binocular camera and the coordinate system of the target screen is calculated by the formulas
L_c = M_t * L * M_t^T
c = M * L_c * M^T * P
pixX = c(1)/c(4)
pixY = c(2)/c(4)
where Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun; L is the ray parameter matrix of the light gun; M_t^T is the transpose of M_t and M^T is the transpose of M; P is the plane parameter vector of the target screen; and c(1), c(2), and c(4) are the 1st, 2nd, and 4th components of the vector c.
When solving for the second attitude relationship matrix, calibration is performed with the calibration points on the television screen shown in Fig. 2. The coordinate positions of the four calibration points are known, i.e. pixX and pixY in the formulas are known; by shooting the plurality of calibration points, the second attitude relationship matrix is obtained by iterating the computation formulas. After this one-time calibration, the calibrated second attitude relationship matrix can be used in all subsequent computations.
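A minimal sketch of this one-time calibration, under the same assumptions as the previous snippet: for each calibration shot, the already-computed M_t and the known calibration-point coordinates (pixX, pixY) are collected, and the six pose parameters of the second attitude relationship matrix M are fitted iteratively by least squares. The data layout and solver choice are illustrative, not the patent's prescribed procedure.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def calibrate_screen_pose(shots, L, P, x0=np.zeros(6)):
    """shots: list of (M_t, pixX, pixY) tuples, one per calibration shot;
    M_t is the 4x4 gun-to-camera pose computed at that shot."""
    def to_pose(params):
        M = np.eye(4)
        M[:3, :3] = Rotation.from_rotvec(params[:3]).as_matrix()
        M[:3, 3] = params[3:]
        return M
    def residuals(params):
        M = to_pose(params)                        # candidate camera-to-screen pose
        errs = []
        for M_t, px, py in shots:
            L_c = M_t @ L @ M_t.T                  # patent formula, ray in the camera frame
            c = M @ L_c @ M.T @ P
            errs += [c[0] / c[3] - px, c[1] / c[3] - py]
        return np.array(errs)
    return to_pose(least_squares(residuals, x0).x)
```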
Fig. 4 is a structural diagram of the apparatus for acquiring the target position of a light gun shot provided by an embodiment of the present invention. A first light source and a second light source are provided on the light gun, and a binocular camera is provided on the target screen. The apparatus comprises:
a real-time position acquiring unit 401, configured to acquire in real time the position of the first light source of the light gun in the images generated by the binocular camera;
an attitude relationship matrix calculating unit 402, configured to calculate, from the position of the first light source in the images generated by the binocular camera, the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera;
a position determining unit 403, configured to determine the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and the pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
Preferably, the attitude relationship matrix calculating unit 402 is specifically configured to calculate the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w
where Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; M_t is the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera, and P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun.
Further, the apparatus also comprises:
a first attitude relationship matrix calculating unit, configured to aim the second light source of the light gun at the coordinates (pixX, pixY) of a plurality of calibration points on the target screen and shoot, and at each shot to calculate the first attitude relationship matrix M_t between the coordinate system of the light gun and the coordinate system of the binocular camera from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w ;
a second attitude relationship matrix calculating unit, configured to calculate, from the calculated first attitude relationship matrices M_t and the coordinates of the plurality of calibration points, the second attitude relationship matrix M between the coordinate system of the binocular camera and the coordinate system of the target screen by the formulas
L_c = M_t * L * M_t^T
c = M * L_c * M^T * P
pixX = c(1)/c(4)
pixY = c(2)/c(4)
where Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun; L is the ray parameter matrix of the light gun; M_t^T is the transpose of M_t and M^T is the transpose of M; P is the plane parameter vector of the target screen; and c(1), c(2), and c(4) are the 1st, 2nd, and 4th components of the vector c.
Preferably, the real-time position acquiring unit comprises:
a first exposure subunit, configured such that when the first light source is lit and notifies the binocular camera by an infrared signal, the binocular camera exposes once to obtain a first frame image;
a second exposure subunit, configured such that when the first light source goes out, the binocular camera exposes a second time to obtain a second frame image;
a recognition and detection subunit, configured to subtract the first frame image and the second frame image to obtain a frame-difference image, and to obtain the position of the first light source in the image by recognition and detection on the frame-difference image.
Preferably, the first light source is an infrared light source or a visible light source, and the second light source is a laser source.
The apparatus for acquiring the target position of a light gun shot shown in Fig. 4 corresponds to the method for acquiring the target position of a light gun shot shown in Fig. 1 and Fig. 3, and is not described again here.
Fig. 2 is a structural diagram of the system for acquiring the target position of a light gun shot according to an embodiment of the present invention. The system comprises a light gun, a binocular camera, and a controller. A first light source and a second light source are provided on the light gun, the binocular camera can be mounted on the target screen at which the light gun shoots, and the controller is connected to the target screen, the light gun, and the binocular camera respectively. The controller acquires in real time the position of the first light source of the light gun in the images generated by the binocular camera; calculates, from that position, the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera; and determines the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and the pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
Preferably, the controller is specifically configured such that when the first light source is lit and notifies the binocular camera by an infrared signal, the binocular camera exposes once to obtain a first frame image; when the first light source goes out, the binocular camera exposes a second time to obtain a second frame image; and the first frame image and the second frame image are subtracted to obtain a frame-difference image, from which the position of the first light source in the image is obtained by recognition and detection.
The system for acquiring the target position of a light gun shot described in the embodiment of the present invention corresponds to the method for acquiring the target position of a light gun shot described with reference to Fig. 1 and Fig. 3.
The foregoing are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. A method for acquiring the target position of a light gun shot, characterized in that a first light source and a second light source are provided on the light gun and a binocular camera is provided on the target screen, the method comprising:
acquiring in real time the position of the first light source of the light gun in the images generated by the binocular camera;
calculating, from the position of the first light source in the images generated by the binocular camera, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera;
determining the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
2. The method according to claim 1, characterized in that the step of calculating, from the position of the first light source in the images generated by the binocular camera, the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera comprises:
calculating the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w
wherein Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; M_t is the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera, and P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun.
3. The method according to claim 1, characterized in that, before the step of determining the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and the pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen, the method further comprises:
aiming the second light source of the light gun at the coordinates (pixX, pixY) of a plurality of calibration points on the target screen and shooting, and at each shot calculating the first attitude relationship matrix M_t between the coordinate system of the light gun and the coordinate system of the binocular camera from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w ;
calculating, from the calculated first attitude relationship matrices M_t and the coordinates of the plurality of calibration points, the second attitude relationship matrix M between the coordinate system of the binocular camera and the coordinate system of the target screen by the formulas
L_c = M_t * L * M_t^T
c = M * L_c * M^T * P
pixX = c(1)/c(4)
pixY = c(2)/c(4)
wherein Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun; L is the ray parameter matrix of the light gun; M_t^T is the transpose of M_t and M^T is the transpose of M; P is the plane parameter vector of the target screen; and c(1), c(2), and c(4) are the 1st, 2nd, and 4th components of the vector c.
4. The method according to claim 1, characterized in that the step of acquiring in real time the position of the first light source of the light gun in the images generated by the binocular camera comprises:
when the first light source is lit and notifies the binocular camera by an infrared signal, the binocular camera exposing once to obtain a first frame image;
when the first light source goes out, the binocular camera exposing a second time to obtain a second frame image;
subtracting the first frame image and the second frame image to obtain a frame-difference image, and obtaining the position of the first light source in the image by recognition and detection on the frame-difference image.
5. An apparatus for acquiring the target position of a light gun shot, characterized in that a first light source and a second light source are provided on the light gun and a binocular camera is provided on the target screen, the apparatus comprising:
a real-time position acquiring unit, configured to acquire in real time the position of the first light source of the light gun in the images generated by the binocular camera;
an attitude relationship matrix calculating unit, configured to calculate, from the position of the first light source in the images generated by the binocular camera, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera;
a position determining unit, configured to determine the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
6. The apparatus according to claim 5, characterized in that the attitude relationship matrix calculating unit is specifically configured to:
calculate the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w
wherein Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; M_t is the first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera, and P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun.
7. The apparatus according to claim 5, characterized in that the apparatus further comprises:
a first attitude relationship matrix calculating unit, configured to aim the second light source of the light gun at the coordinates (pixX, pixY) of a plurality of calibration points on the target screen and shoot, and at each shot to calculate the first attitude relationship matrix M_t between the coordinate system of the light gun and the coordinate system of the binocular camera from the positions p_0, p_1 of the first light source in the images generated by the binocular camera and the formulas
p_0 = Mc_0 * M_t * P_w
p_1 = Mc_1 * M_t * P_w ;
a second attitude relationship matrix calculating unit, configured to calculate, from the calculated first attitude relationship matrices M_t and the coordinates of the plurality of calibration points, the second attitude relationship matrix M between the coordinate system of the binocular camera and the coordinate system of the target screen by the formulas
L_c = M_t * L * M_t^T
c = M * L_c * M^T * P
pixX = c(1)/c(4)
pixY = c(2)/c(4)
wherein Mc_0 = K_0 * M_0 and Mc_1 = K_1 * M_1, K_0 being the intrinsic parameter matrix of camera 0 of the binocular camera, M_0 the attitude relationship matrix of camera 0 with respect to the coordinate system of the binocular camera, K_1 the intrinsic parameter matrix of camera 1, and M_1 the attitude relationship matrix of camera 1 with respect to the coordinate system of the binocular camera; P_w is the three-dimensional coordinate of the first light source on the light gun in the coordinate system of the light gun; L is the ray parameter matrix of the light gun; M_t^T is the transpose of M_t and M^T is the transpose of M; P is the plane parameter vector of the target screen; and c(1), c(2), and c(4) are the 1st, 2nd, and 4th components of the vector c.
8. The apparatus according to claim 5, characterized in that the real-time position acquiring unit comprises:
a first exposure subunit, configured such that when the first light source is lit and notifies the binocular camera by an infrared signal, the binocular camera exposes once to obtain a first frame image;
a second exposure subunit, configured such that when the first light source goes out, the binocular camera exposes a second time to obtain a second frame image;
a recognition and detection subunit, configured to subtract the first frame image and the second frame image to obtain a frame-difference image, and to obtain the position of the first light source in the image by recognition and detection on the frame-difference image.
9. A system for acquiring the target position of a light gun shot, characterized in that the system comprises a light gun, a binocular camera, and a controller; a first light source and a second light source are provided on the light gun; the binocular camera can be mounted on the target screen at which the light gun shoots; and the controller is connected to the target screen, the light gun, and the binocular camera respectively; wherein the controller acquires in real time the position of the first light source of the light gun in the images generated by the binocular camera; calculates, from the position of the first light source in the images generated by the binocular camera, a first attitude relationship matrix between the coordinate system of the light gun and the coordinate system of the binocular camera; and determines the position of the second light source of the light gun on the target screen from the calculated first attitude relationship matrix and a pre-calibrated second attitude relationship matrix between the coordinate system of the binocular camera and the coordinate system of the target screen.
10. The system according to claim 9, characterized in that the controller is specifically configured such that when the first light source is lit and notifies the binocular camera by an infrared signal, the binocular camera exposes once to obtain a first frame image; when the first light source goes out, the binocular camera exposes a second time to obtain a second frame image; and the first frame image and the second frame image are subtracted to obtain a frame-difference image, from which the position of the first light source in the image is obtained by recognition and detection.
CN201410325348.8A 2014-07-09 2014-07-09 Method, device and system for acquiring light gun shooting target position Active CN104121892B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410325348.8A CN104121892B (en) 2014-07-09 2014-07-09 Method, device and system for acquiring light gun shooting target position

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410325348.8A CN104121892B (en) 2014-07-09 2014-07-09 Method, device and system for acquiring light gun shooting target position

Publications (2)

Publication Number Publication Date
CN104121892A true CN104121892A (en) 2014-10-29
CN104121892B CN104121892B (en) 2017-01-25

Family

ID=51767405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410325348.8A Active CN104121892B (en) 2014-07-09 2014-07-09 Method, device and system for acquiring light gun shooting target position

Country Status (1)

Country Link
CN (1) CN104121892B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104436643A (en) * 2014-11-17 2015-03-25 深圳市欢创科技有限公司 Method, device and system for outputting aim point of light gun on display screen
CN105844199A (en) * 2016-03-30 2016-08-10 乐视控股(北京)有限公司 Method and device for determining aiming positions of game guns on display screen
CN109173252A (en) * 2018-09-19 2019-01-11 深圳华侨城文化旅游科技股份有限公司 A kind of screen shooting game localization method, storage medium and device
CN111698467A (en) * 2020-05-08 2020-09-22 北京中广上洋科技股份有限公司 Intelligent tracking method and system based on multiple cameras

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02110313A (en) * 1988-10-20 1990-04-23 Oki Electric Ind Co Ltd Detection of target position by use of camera
US20050202870A1 (en) * 2003-12-26 2005-09-15 Mitsuru Kawamura Information processing device, game device, image generation method, and game image generation method
CN1710614A (en) * 2005-06-16 2005-12-21 上海交通大学 Method for evaluating parameter of 3-D motion of human climbs based on model
KR20070102942A (en) * 2006-04-17 2007-10-22 이문기 Sighting device using virtual camera
CN101158883A (en) * 2007-10-09 2008-04-09 深圳先进技术研究院 Virtual gym system based on computer visual sense and realize method thereof
TW201241396A (en) * 2011-04-06 2012-10-16 Wei-Kai Liou Leaser guide interactive electronic whiteboard technology apply to military firing training and the first person shooting (F.P.S) game system
US20130293548A1 (en) * 2009-11-16 2013-11-07 Sony Corporation Information processing apparatus, information processing method, program, and information processing system
US20140184496A1 (en) * 2013-01-03 2014-07-03 Meta Company Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02110313A (en) * 1988-10-20 1990-04-23 Oki Electric Ind Co Ltd Detection of target position by use of camera
US20050202870A1 (en) * 2003-12-26 2005-09-15 Mitsuru Kawamura Information processing device, game device, image generation method, and game image generation method
CN1710614A (en) * 2005-06-16 2005-12-21 上海交通大学 Method for evaluating parameter of 3-D motion of human climbs based on model
KR20070102942A (en) * 2006-04-17 2007-10-22 이문기 Sighting device using virtual camera
CN101158883A (en) * 2007-10-09 2008-04-09 深圳先进技术研究院 Virtual gym system based on computer visual sense and realize method thereof
US20130293548A1 (en) * 2009-11-16 2013-11-07 Sony Corporation Information processing apparatus, information processing method, program, and information processing system
TW201241396A (en) * 2011-04-06 2012-10-16 Wei-Kai Liou Leaser guide interactive electronic whiteboard technology apply to military firing training and the first person shooting (F.P.S) game system
US20140184496A1 (en) * 2013-01-03 2014-07-03 Meta Company Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104436643A (en) * 2014-11-17 2015-03-25 深圳市欢创科技有限公司 Method, device and system for outputting aim point of light gun on display screen
CN104436643B (en) * 2014-11-17 2017-05-31 深圳市欢创科技有限公司 Method, device and system for outputting aim point of light gun on display screen
CN105844199A (en) * 2016-03-30 2016-08-10 乐视控股(北京)有限公司 Method and device for determining aiming positions of game guns on display screen
CN109173252A (en) * 2018-09-19 2019-01-11 深圳华侨城文化旅游科技股份有限公司 A kind of screen shooting game localization method, storage medium and device
CN111698467A (en) * 2020-05-08 2020-09-22 北京中广上洋科技股份有限公司 Intelligent tracking method and system based on multiple cameras

Also Published As

Publication number Publication date
CN104121892B (en) 2017-01-25

Similar Documents

Publication Publication Date Title
CN106550228B (en) The equipment for obtaining the depth map of three-dimensional scenic
CN106780601B (en) Spatial position tracking method and device and intelligent equipment
US10306134B2 (en) System and method for controlling an equipment related to image capture
US10008028B2 (en) 3D scanning apparatus including scanning sensor detachable from screen
TW201932914A (en) Augmented reality display with active alignment
US20210152802A1 (en) Apparatus and method for generating a representation of a scene
CN108234984A (en) Binocular depth camera system and depth image generation method
CN105378794A (en) 3d recording device, method for producing 3d image, and method for setting up 3d recording device
JP2017516157A (en) Wearable projection device, focus adjustment method thereof, and projection method
WO2018028152A1 (en) Image acquisition device and virtual reality device
KR101624416B1 (en) System for calculating point of impact and method of thereof
CN104121892A (en) Method, device and system for acquiring light gun shooting target position
TWI458532B (en) System and method for detecting a shot direction of a light gun
US20190236847A1 (en) Method and system for aligning digital display of images on augmented reality glasses with physical surrounds
CN110880161B (en) Depth image stitching and fusion method and system for multiple hosts and multiple depth cameras
CN102628693A (en) Method for registering camera spindle and laser beam in parallel
CN110458104B (en) Human eye sight direction determining method and system of human eye sight detection system
CN104190078B (en) A kind of target identification method, the apparatus and system of light gun shooting
TWI486052B (en) Three-dimensional image processing device and three-dimensional image processing method
KR102185322B1 (en) System for detecting position using ir stereo camera
TW201537137A (en) Twin image guiding-tracking shooting system and method
CN104436643A (en) Method, device and system for outputting aim point of light gun on display screen
CN203719535U (en) Positioning system for multiple CCD (charge-coupled device) large-scene indicating target
KR101754975B1 (en) apparatus for controlling position of battery
JP2016099638A5 (en)

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000, Floor 1801, Block C, Minzhi Stock Commercial Center, North Station Community, Minzhi Street, Longhua District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Huanchuang Technology Co.,Ltd.

Address before: 3A, Maikelong Building, No. 6 Gaoxin South Sixth Road, Nanshan District, Shenzhen, Guangdong Province, 518000

Patentee before: SHENZHEN CAMSENSE TECHNOLOGIES Co.,Ltd.

CP03 Change of name, title or address