KR101552403B1 - Imaging apparatus in which the image viewpoint changes according to the user's posture - Google Patents

Imaging apparatus in which the image viewpoint changes according to the user's posture

Info

Publication number
KR101552403B1
Authority
KR
South Korea
Prior art keywords
user
image
screen
unit
posture
Prior art date
Application number
KR1020130136685A
Other languages
Korean (ko)
Other versions
KR20150054342A (en)
Inventor
제영호
조문철
임용묵
Original Assignee
주식회사 제이디솔루션
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 제이디솔루션
Priority to KR1020130136685A
Publication of KR20150054342A
Application granted
Publication of KR101552403B1


Abstract

[0001] The present invention relates to an imaging apparatus in which the image viewpoint changes according to the position of a user. More particularly, it relates to an imaging apparatus comprising a screen for outputting an image; an output unit for outputting the image to the screen; a firing unit that is grasped by the user and irradiates a shooting beam onto the screen; a shot detection unit for detecting the shooting beam of the firing unit irradiated onto the screen; a position confirmation unit for checking the position and posture of the user relative to the screen; a database storing image data for each position and posture of the user; and a control unit that controls the output unit to output, from the database, the image data corresponding to the position and posture information identified by the position confirmation unit, so that an image corresponding to the user's position and posture is output.
According to the present invention as described above, a sense of three-dimensionality and realism is conveyed through an image that follows the user's position and posture, satisfying consumers' needs.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to a video apparatus in which the image viewpoint changes according to the user's posture.

The present invention relates to a video apparatus, and more particularly, to a video apparatus in which the image viewpoint changes according to the user's posture: a simulation is performed using an image output on a screen, and the image is moved according to the user's position and posture so as to provide a sense of three-dimensionality and realism.

In general, from arcade game machines to 3D online games, today's game industry is recognized as having virtually limitless market potential.

In particular, the growth of screen-golf venues, where golf can be played on a screen installed in a dedicated space, has accelerated the growth of the game industry by drawing adults back into game rooms.

Against this backdrop, there have been attempts to bring outdoor activities such as shooting indoors, particularly as a substitute for the shooting practice of athletes or the marksmanship training of soldiers. To introduce such a simulated shooting system, however, it is necessary to build a simulated gun that can be used indoors and a system that operates it effectively.

To this end, prior-art Korean Patent No. 10-0572006 allows the user to experience a sense of three-dimensionality and realism by enabling shooting in a virtual space that simulates real situations using multimedia and computer technology.

In such a conventional image system, however, although the 3D virtual space displayed on the screen conveys a stereoscopic effect visually, the viewpoint remains fixed at the center of the screen and the next image is output from that fixed viewpoint regardless of the user.

As a result, although the user perceives three-dimensionality and realism visually, the image does not respond to the user's position and posture, and the system is used with a diminished sense of realism. There is therefore a pressing need for a technique that overcomes this limitation.

SUMMARY OF THE INVENTION Accordingly, the present invention has been made keeping in mind the above problems occurring in the prior art, and an object of the present invention is to provide a video apparatus in which the image viewpoint changes according to the user's posture, the apparatus including a shot detection unit for detecting the shooting beam of a firing unit irradiated onto a screen, a position confirmation unit for checking the position and posture of the user, a database storing image data for each position and posture of the user, and a control unit that controls an output unit to output, from the database, the image data corresponding to the position and posture information identified by the position confirmation unit, so that an image corresponding to the user's position and posture is output and the user can feel three-dimensionality and realism through an image that follows his or her position and posture.

According to an aspect of the present invention, there is provided an imaging apparatus including a screen for outputting an image; an output unit for outputting the image to the screen; a firing unit that is grasped by the user and irradiates a shooting beam onto the screen; a shot detection unit for detecting the shooting beam of the firing unit irradiated onto the screen; a position confirmation unit for checking the position and posture of the user relative to the screen; a database storing image data for each position and posture of the user; and a control unit that controls the output unit to output, from the database, the image data corresponding to the position and posture information identified by the position confirmation unit, so that an image corresponding to the user's position and posture is output.

Preferably, the position confirmation unit is a position sensor mounted on at least one of the chest, waist, and knee of the user; the signal of the position sensor is transmitted to the control unit, which checks the position and posture information of the user and controls the output unit so that the image is moved accordingly.

The position sensor includes a chest position sensor for mounting on a user's chest, a waist position sensor for mounting on a user's waist, and a knee position sensor for mounting on a user's knee.

The knee position sensor may include a first knee position sensor mounted on the left knee of the user and a second knee position sensor mounted on the user's right knee.

The position sensor is not particularly limited, and for example, sensors such as a gyro sensor may be used.

Alternatively, the position confirmation unit may be a camera that captures the user from the direction of the screen to acquire an image. The image obtained by the camera is transmitted to the control unit, which checks the position and posture information of the user and controls the output unit so that the image is moved accordingly.

The control unit controls the output unit to output an image of the user's position and posture to a part of the screen in real time, so that the posture and behavior of the user can be confirmed in real time.

As described above, with the video apparatus according to the present invention in which the image viewpoint changes according to the user's posture, a sense of three-dimensionality and realism is conveyed through an image that follows the user's position and posture, making it a useful and effective invention.

FIG. 1 is a view showing a video apparatus according to the present invention;
FIG. 2 is a view showing a position confirmation unit of the video apparatus according to the present invention;
FIG. 3 is a view showing another embodiment of the position confirmation unit according to the present invention;
FIG. 4 is a view showing an image change according to a standing shooting posture in the video apparatus according to the present invention;
FIG. 5 is a view showing an image change according to a kneeling shooting posture in the video apparatus according to the present invention;
FIG. 6 is a view showing an image change according to a lying shooting posture in the video apparatus according to the present invention.

Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings.

It should be noted that the embodiments described herein are illustrative only and do not limit the scope of the present invention; various modifications are possible within the scope of the present invention.

FIG. 1 is a view showing a video apparatus according to the present invention, FIG. 2 is a view showing a position confirmation unit of the video apparatus according to the present invention, FIG. 3 is a view showing another embodiment of the position confirmation unit according to the present invention, FIG. 4 is a view showing an image change according to a standing shooting posture in the video apparatus according to the present invention, FIG. 5 is a view showing an image change according to a kneeling shooting posture in the video apparatus according to the present invention, and FIG. 6 is a view showing an image change according to a lying shooting posture in the video apparatus according to the present invention.

As shown in the drawings, the imaging apparatus 10, in which the image viewpoint changes according to the user's posture, includes a screen 100, an output unit 200, a firing unit 300, a shot detection unit 400, a position confirmation unit 500, a database 600, and a control unit 700.

The screen 100 is provided for outputting an image, and the output unit 200 is provided for outputting an image to the screen 100.

The firing unit 300 is grasped by the user and irradiates a shooting beam onto the screen 100, aiming the beam at the image output on the screen 100.

Here, the firing unit 300 emits a laser beam, and the shooting beam is irradiated onto the screen 100 toward a target in the displayed image.

The shot detection unit 400 is provided to sense the shooting beam of the firing unit 300 irradiated onto the screen 100.

The shot detection unit 400 can determine a hit by extracting the coordinate information of the spot of the shooting beam irradiated by the firing unit 300 and comparing that spot information with the coordinates of the target on the screen 100.
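
The text above does not specify how the spot and target coordinates are compared; the following is a minimal Python sketch, assuming screen-pixel coordinates and a simple distance threshold (both of which are assumptions, not part of the patent).

    from dataclasses import dataclass
    import math

    @dataclass
    class Point:
        x: float  # assumed screen coordinates in pixels
        y: float

    def is_hit(beam_spot: Point, target: Point, tolerance_px: float = 25.0) -> bool:
        """Compare the detected beam spot with the on-screen target coordinates.

        The distance-threshold rule and the pixel tolerance are illustrative
        assumptions; the text only states that the two coordinates are compared.
        """
        return math.hypot(beam_spot.x - target.x, beam_spot.y - target.y) <= tolerance_px

    # A spot detected about 11 px from the target centre counts as a hit here.
    print(is_hit(Point(310.0, 205.0), Point(300.0, 200.0)))  # True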

The position confirmation unit 500 is provided to check the position and posture of the user relative to the screen 100, and the database 600 stores image data for each position and posture of the user.

The control unit 700 controls the output unit 200 to output, from the database 600, the image data corresponding to the position and posture information identified by the position confirmation unit 500, so that an image corresponding to the user's position and posture is output.

Here, as shown in FIG. 2, the position confirmation unit 500 is a position sensor mounted on at least one of the chest, waist, and knee of the user.

The signal of the position sensor serving as the position confirmation unit 500 is transmitted to the control unit 700, which checks the position and posture information of the user and controls the output unit 200 so that the image is moved accordingly.

In other words, the control unit 700 receives the signal of the position sensor serving as the position confirmation unit 500 and controls the output unit 200 to output the corresponding image data from the database 600, so that the image is moved.

The position sensor serving as the position confirmation unit 500 includes a chest position sensor 510, a waist position sensor 520, and a knee position sensor 530.

The chest position sensor 510 is mounted on the user's chest, the waist position sensor 520 is mounted on the user's waist, and the knee position sensor 530 is mounted on the user's knee.

The knee position sensor 530 includes a first knee position sensor 532 mounted on the user's left knee and a second knee position sensor 534 mounted on the user's right knee.

In one embodiment, the database 600 stores image data corresponding to the signals of the respective position sensors, and the control unit 700 controls the output unit 200 to output the image data corresponding to the signals it receives.

In another embodiment, the signals of the position sensors are transmitted to the controller 700 at predetermined time intervals, and are preferably transmitted at intervals of approximately 0.5 to 1.5 seconds.

The control unit 700 uses the transmitted position sensor signals to check the current position and posture information of the user and compares it with the previously received position and posture information.
The difference between the two is determined, and the control unit 700 controls the output unit 200 so that the image is moved by that difference.
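
As a rough illustration of this comparison, the following Python sketch averages the per-sensor displacement between two consecutive readings and maps it to an image offset; the sensor layout, the averaging rule, and the gain factor are assumptions, since the text only specifies comparing the current reading with the previous one and moving the image by the difference.

    from typing import Dict, Tuple

    Reading = Dict[str, Tuple[float, float]]  # sensor name -> (x, y) position

    def viewpoint_delta(prev: Reading, curr: Reading, gain: float = 1.0) -> Tuple[float, float]:
        """Average the per-sensor displacement and map it to an image pan/tilt offset."""
        dx = sum(curr[k][0] - prev[k][0] for k in curr) / len(curr)
        dy = sum(curr[k][1] - prev[k][1] for k in curr) / len(curr)
        return gain * dx, gain * dy

    # Two consecutive readings taken roughly 0.5 to 1.5 s apart, as in the text
    # (the numeric values are illustrative only).
    prev = {"chest": (0.0, 1.4), "waist": (0.0, 1.0), "left_knee": (0.0, 0.5)}
    curr = {"chest": (0.25, 1.1), "waist": (0.25, 0.8), "left_knee": (0.25, 0.5)}

    pan, tilt = viewpoint_delta(prev, curr)
    # output_unit.move_image(pan, tilt)  # hypothetical call into the output unit (200)
    print(pan, tilt)  # prints the averaged (dx, dy) offset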

As described above, the position sensors mounted on each part of the user make it possible to determine whether the user is in a standing posture, a kneeling posture, a lying posture, or the like, and, as shown in FIGS. 4 to 6, to output an image that matches that posture. This provides a sense of realism beyond simple shooting and satisfies the user's expectations.
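
The text does not state how the posture is derived from the sensor signals; the short sketch below classifies standing, kneeling, and lying postures from the chest-sensor height alone, with purely illustrative thresholds.

    def classify_posture(chest_height_m: float) -> str:
        """Classify the posture from the chest position sensor height (illustrative thresholds)."""
        if chest_height_m > 1.1:
            return "standing"
        if chest_height_m > 0.6:
            return "kneeling"
        return "lying"

    print(classify_posture(1.4))  # standing
    print(classify_posture(0.8))  # kneeling
    print(classify_posture(0.3))  # lying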

Here, a gyro sensor is preferably used as the position sensor, although any position sensor may naturally be used.

Meanwhile, as shown in FIG. 3, the position confirmation unit 500' of another embodiment is a camera that captures the user from the direction of the screen to acquire an image.

The image obtained by the camera serving as the position confirmation unit 500' is transmitted to the control unit 700, which checks the position and posture information of the user and controls the output unit 200 so that the image is moved accordingly.

In one embodiment, when the image obtained by the camera serving as the position confirmation unit 500' is transmitted, the control unit 700 extracts coordinate information on the user's position and controls the output unit 200 to output, from the database 600, the image data corresponding to the extracted coordinate information.
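
A minimal sketch of this coordinate-to-image-data lookup follows; the way the user's position is quantized into database keys, and the key and value names, are assumptions made for illustration only.

    from typing import Dict, Tuple

    # (grid_x, grid_y) -> identifier of the stored image data (illustrative contents)
    database: Dict[Tuple[int, int], str] = {
        (0, 0): "view_center",
        (1, 0): "view_right",
        (-1, 0): "view_left",
    }

    def lookup_image_data(user_x_m: float, user_y_m: float, cell_m: float = 0.5) -> str:
        """Quantize the user's position to a grid cell and fetch the matching view."""
        key = (round(user_x_m / cell_m), round(user_y_m / cell_m))
        return database.get(key, "view_center")

    print(lookup_image_data(0.6, 0.1))  # view_right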

Meanwhile, in another embodiment, the camera serving as the position confirmation unit 500' captures the user from the direction of the screen to acquire an image, and coordinate information on the user's position is extracted from the acquired image.

Here, the camera transmits images at a predetermined interval; after the coordinate information in one image is confirmed, it is compared with the coordinate information in the previous image and the difference is determined.

The control unit 700 then controls the output unit 200 so that the image is moved by the difference in the coordinate information, satisfying the user's expectations.

To this end, a coordinate extraction unit that extracts coordinate information from each image and an operation processing unit that compares the coordinate information extracted from successive images and determines the difference are provided.

In addition, the operation processing unit can determine the user's movement path by tracking the coordinate information of the user's position in each image over time, and transmits the movement-path information to the control unit 700 so that the output unit 200 can be controlled to move the image accordingly.
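
The following Python sketch illustrates the coordinate extraction unit and operation processing unit working together; how the user's coordinates are actually obtained from a camera frame is not specified in the text, so extract_user_coords is a hypothetical placeholder, and only the frame-to-frame comparison and the movement-path tracking follow the description.

    from typing import List, Tuple

    Coord = Tuple[float, float]

    def extract_user_coords(frame) -> Coord:
        """Hypothetical coordinate extraction unit: return the user's (x, y) in one frame."""
        return frame["user_xy"]  # assumes upstream person detection has already run

    class OperationProcessingUnit:
        def __init__(self) -> None:
            self.path: List[Coord] = []  # movement path reported to the control unit (700)

        def process(self, frame) -> Coord:
            """Compare the new coordinates with the previous ones and return the difference."""
            xy = extract_user_coords(frame)
            if self.path:
                prev_x, prev_y = self.path[-1]
                delta = (xy[0] - prev_x, xy[1] - prev_y)
            else:
                delta = (0.0, 0.0)
            self.path.append(xy)
            return delta  # the control unit moves the image by this amount

    # Two illustrative frames: the user steps 0.25 m to the right.
    opu = OperationProcessingUnit()
    print(opu.process({"user_xy": (1.0, 2.0)}))   # (0.0, 0.0)
    print(opu.process({"user_xy": (1.25, 2.0)}))  # (0.25, 0.0)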

Here, the camera serving as the position confirmation unit 500' may be provided at the upper center of the screen 100 to capture the user, or cameras may be provided at both upper corners of the screen 100.

When cameras are provided at both upper corners of the screen 100, the user's coordinates are confirmed in the image captured by each camera, and the average of the two sets of coordinate information is used to reduce the error before the image is moved.
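
As a small illustration of that averaging step, the sketch below fuses the coordinates reported by the two upper-corner cameras by simple per-axis averaging; the text only states that the average is used to reduce the error, and the numeric values are illustrative.

    from typing import Tuple

    Coord = Tuple[float, float]

    def fuse_two_cameras(left_cam: Coord, right_cam: Coord) -> Coord:
        """Average the user coordinates reported by the two upper-corner cameras."""
        return ((left_cam[0] + right_cam[0]) / 2.0,
                (left_cam[1] + right_cam[1]) / 2.0)

    print(fuse_two_cameras((1.25, 2.0), (1.75, 2.5)))  # (1.5, 2.25)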

The control unit 700 controls the output unit 200 to output an image of the user's position and posture to a part of the screen 100 in real time.

This allows the user to check his or her posture and behavior in real time and correct the posture accordingly.

The firing unit 300 takes the form of a firearm, for example a pistol, a rifle, a machine gun, a grenade launcher, or a recoilless gun.

The control unit 700 controls the output unit 200 such that the image corresponding to the selected one of the pistol, rifle, machine gun, grenade launcher, and recoilless gun is output to the screen 100.

In other words, when the user grasps any one of the pistol, rifle, machine gun, grenade launcher, or recoilless gun, the control unit 700 outputs the corresponding image data from the database 600, so that the user feels as if actually handling that weapon, satisfying the user's expectations.

Further, the firing unit 300, whether a pistol, rifle, machine gun, grenade launcher, or recoilless gun, may further include a recoil unit (not shown) interlocked with a trigger lever (not shown).

It is preferable that such a recoil unit generates rearward recoil using compressed air, giving the user a realistic sensation of performing actual shooting.

10: imaging apparatus 100: screen
200: output unit 300: firing unit
400: shot detection unit 500: position confirmation unit
600: database 700: control unit

Claims (7)

An imaging apparatus in which the image viewpoint changes according to the user's posture, comprising:
A screen for outputting an image;
An output unit for outputting the image to the screen;
A firing unit that is grasped by the user and irradiates a shooting beam onto the screen;
A shot detection unit for sensing the shooting beam of the firing unit irradiated onto the screen;
A position confirmation unit for checking the position and posture of the user relative to the screen;
A database storing image data for each position and posture of the user; and
A control unit for controlling the output unit to output, from the database, the image data corresponding to the position and posture of the user identified by the position confirmation unit,
wherein the control unit controls the output unit to output an image of the user's position and posture in real time on a part of the screen so that the user's posture and behavior can be checked in real time.
The apparatus according to claim 1, wherein the position confirmation unit is
a position sensor mounted on at least one of the chest, waist, and knee of the user,
and wherein the signal of the position sensor is transmitted to the control unit, which checks the position and posture information of the user and controls the output unit so that the image is moved.
The apparatus according to claim 2, wherein the position sensor includes:
a chest position sensor to be mounted on the user's chest;
a waist position sensor to be mounted on the user's waist; and
a knee position sensor to be mounted on the user's knee.
The apparatus of claim 3, wherein the knee position sensor comprises:
a first knee position sensor mounted on the left knee of the user; and
a second knee position sensor mounted on the right knee of the user.
The apparatus according to claim 2,
wherein the position sensor is a gyro sensor.
The apparatus according to claim 1, wherein the position confirmation unit is
a camera that captures the user from the direction of the screen to acquire an image,
and wherein the image obtained by the camera is transmitted to the control unit, which checks the position and posture information of the user and controls the output unit so that the image is moved.
delete
KR1020130136685A 2013-11-12 2013-11-12 Imaging apparatus in which the image viewpoint changes according to the user's posture KR101552403B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130136685A KR101552403B1 (en) 2013-11-12 2013-11-12 Imaging apparatus in which the image viewpoint changes according to the user's posture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130136685A KR101552403B1 (en) 2013-11-12 2013-11-12 Imaging apparatus in which the image viewpoint changes according to the user's posture

Publications (2)

Publication Number Publication Date
KR20150054342A KR20150054342A (en) 2015-05-20
KR101552403B1 (en) 2015-10-14

Family

ID=53390515

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130136685A KR101552403B1 (en) 2013-11-12 2013-11-12 Imaging apparatus in which the image viewpoint changes according to the user's posture

Country Status (1)

Country Link
KR (1) KR101552403B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108970112A (en) * 2018-07-05 2018-12-11 腾讯科技(深圳)有限公司 The method of adjustment and device of posture, storage medium, electronic device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004324974A (en) * 2003-04-24 2004-11-18 Babcock Hitachi Kk Image shooting training device
JP2011044160A (en) * 2005-12-12 2011-03-03 Sony Computer Entertainment Inc Method and system for enabling depth and direction detection when interfacing with computer program

Also Published As

Publication number Publication date
KR20150054342A (en) 2015-05-20

Similar Documents

Publication Publication Date Title
US7722465B2 (en) Image generation device, image display method and program product
US6951515B2 (en) Game apparatus for mixed reality space, image processing method thereof, and program storage medium
US7637817B2 (en) Information processing device, game device, image generation method, and game image generation method
JP3413127B2 (en) Mixed reality device and mixed reality presentation method
KR101366444B1 (en) Virtual reality shooting system for real time interaction
CN105188867B (en) The client-side processing of role's interaction in remote game environment
TWI448318B (en) Virtual golf simulation apparatus and sensing device and method used for the same
JP5039808B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US20190219364A1 (en) Method And Apparatus For Providing Live-Fire Simulation Game
WO2013111146A2 (en) System and method of providing virtual human on human combat training operations
KR20110056019A (en) Screen shooting simulation system and method thereof
KR101552403B1 (en) Imaging apparatus in which the image viewpoint changes according to the user's posture
JP5433318B2 (en) Video game equipment
KR101117404B1 (en) The Shooting Training System of Moving for Real Direction
CN111228791A (en) Real person AR shooting game equipment, and shooting fighting system and method based on AR technology
TW201620597A (en) Wearable grenade throwing simulation system
KR101938458B1 (en) shooting method with rotating mapped images
KR20150137261A (en) Target tracking using image generation method and apparatus.
JP2003240494A (en) Training system
CN110108159B (en) Simulation system and method for large-space multi-person interaction
KR102262886B1 (en) Haptic type sports recognition system using Infrared depth camera sensor
RU114772U1 (en) INTERACTIVE TYPE WITH POSITION CONTROL ARROW
Pinheiro et al. RealShooting: Expanding the experience of point-and-click target shooting games
KR101068799B1 (en) Controller being used in apparatus for screen shooting and driving method thereof
KR101349350B1 (en) Simulation method and system for video shooting using posture recognition

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
FPAY Annual fee payment

Payment date: 20181004

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20190904

Year of fee payment: 5