WO2021005655A1 - Head-mounted display - Google Patents

Head-mounted display

Info

Publication number
WO2021005655A1
Authority
WO
WIPO (PCT)
Prior art keywords
golf
head
mounted display
display
camera
Prior art date
Application number
PCT/JP2019/026857
Other languages
English (en)
Japanese (ja)
Inventor
塩川 淳司 (Junji Shiokawa)
橋本 康宣 (Yasunobu Hashimoto)
Original Assignee
マクセル株式会社 (Maxell, Ltd.)
Application filed by マクセル株式会社 (Maxell, Ltd.)
Priority to PCT/JP2019/026857
Publication of WO2021005655A1

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63B — APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 — Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 — Indicating or scoring devices for games or players, or for other sports activities

Definitions

  • the present invention relates to a head-mounted display.
  • In particular, it relates to head-mounted display technology for golf.
  • A golf simulation device analyzes the ball hit by the player and calculates its flight distance and drop point, so that the player can be made to feel as if he or she is actually playing on a course simulated on a display.
  • Patent Document 1 describes a technique for analyzing a ball hit by a player and calculating the flight distance and falling point of the ball.
  • The device of Patent Document 1 includes a trigger sensor that detects that a golf ball has been hit, a CCD camera that captures the golf ball after impact, a PC that stores various data including aerodynamic coefficient data under arbitrary conditions and calculates the trajectory and flight distance of the golf ball from data such as the ball speed obtained from the image data captured by the CCD camera together with the stored data, and a monitor that displays the trajectory and flight distance of the golf ball based on the calculation results. In particular, in Patent Document 1, the aerodynamic coefficients are measured using a golf ball rotating device that actually spins the ball, and the trajectory and flight distance of the golf ball are calculated from them.
  • However, many golf simulation devices, including that of Patent Document 1, are not premised on being used on an actual golf course. For example, there is no mention of assisting the player in, say, addressing the ball in the right direction on an actual golf course.
  • An object of the present invention is to provide a head-mounted display for golf that assists the player in taking an address toward the correct target direction on an actual golf course.
  • Another object of the present invention is to provide a head-mounted display for golf that can shorten the time spent searching for a ball and, as a result, reduce the number of lost balls.
  • One aspect of the head-mounted display for golf of the present invention includes a camera that captures a target position in response to a predetermined operation by the user, a storage device that stores the target position in an internal coordinate system managed by the head-mounted display, a sensor that detects the user's movements, a controller that integrates the output of the sensor, and a transmissive display that, based on the integration result, displays a first virtual object indicating the direction of the target position.
  • Another aspect of the head-mounted display for golf of the present invention includes a camera that captures, at a high frame rate, the moment the club strikes the golf ball; a controller that analyzes the images captured by the camera to calculate the ball speed and spin amount, and calculates a predicted fall point of the golf ball based on the ball speed and spin amount; and a transmissive display that displays a second virtual object at the predicted fall point.
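  As a rough illustration of how a predicted fall point could follow from ball speed and spin, the following Python sketch estimates carry distance with a vacuum-trajectory baseline and a crude spin-dependent lift factor. The patent itself relies on measured aerodynamic coefficient data; the constants and the linear lift model here are purely illustrative assumptions.

```python
import math

def predict_fall_point(ball_speed_mps, launch_angle_deg, spin_rpm):
    """Roughly estimate the carry distance (metres) of a golf ball.

    A minimal ballistic sketch: drag and lift are folded into one
    empirical backspin factor. The 4000 rpm cap and 15% lift gain are
    assumed values, not figures from the patent.
    """
    g = 9.81
    angle = math.radians(launch_angle_deg)
    # Vacuum-trajectory carry as the baseline.
    carry = ball_speed_mps ** 2 * math.sin(2 * angle) / g
    # Backspin adds lift (longer carry) up to a point; crude linear factor.
    lift_factor = 1.0 + min(spin_rpm, 4000) / 4000 * 0.15
    return carry * lift_factor
```

  A real implementation would integrate the full trajectory with measured drag and lift coefficients, as the cited prior art does.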
  • As a result, the time spent searching for the ball can be shortened and lost balls can be reduced.
  • FIG. 1 is a diagram showing an example of the appearance of the HMD of Example 1.
  • FIG. 2 is an explanatory diagram of the operation of setting the launch direction of a ball in the HMD of Example 1.
  • FIG. 3 is a diagram showing an example of the display at the time of address in the HMD of Example 1.
  • FIG. 4 is a diagram showing an example of the display of the HMD of Example 1.
  • FIG. 5 is a diagram showing an example of the hardware configuration of the HMD of Example 1.
  • FIG. 6 is a diagram showing an example of the functional blocks of the HMD of Example 1.
  • FIG. 7 is a flowchart showing the processing operation of the HMD of Example 1.
  • FIG. 8 is a diagram showing an application that automatically registers the score through cooperation between the HMD and a smartphone in Example 2.
  • The CPU (Central Processing Unit) is a processing unit including one or more processors.
  • the processor may include hardware circuits that perform some or all of the processing.
  • A process may be described with a program as the subject of the operation, but since a program performs its specified processing by being executed by the CPU while using storage resources (for example, memory) and the like, the actual subject of the processing is the CPU. Therefore, a process described with a program as the subject may equally be described with the processor as the subject.
  • a hardware circuit such as an Application Specific Integrated Circuit (ASIC) or a Field-Programmable Gate Array (FPGA) may perform a part or all of the processing performed by the processor.
  • FIG. 1 is a diagram showing an example of the appearance of the golf head-mounted display of Example 1 (hereinafter sometimes referred to simply as the head-mounted display or HMD).
  • The head-mounted display (HMD) 100 has a transmissive display 101, a controller 102, a camera 103, a frame 104 that supports each part and is worn by the user, and a button 105.
  • The transmissive display 101 is a kind of head-up display; its display device uses a half mirror so that the outside world can be seen through it.
  • Although FIG. 1 shows a display for both eyes, a display device may be attached to only one eye. A display using a holographic element may also be used. By using a half mirror made of an optical multilayer film, the outside can be seen through while only the necessary information is displayed on the surface of the display plate.
  • the camera 103 includes a color image camera 103a and a distance image camera 103b.
  • The distance image camera 103b is a camera capable of measuring three-dimensional information using the flight time of light, such as a TOF (Time-of-Flight) camera.
  • FIG. 2 is an explanatory diagram of the operation of setting the launch direction of the ball in the HMD of the first embodiment. To aid understanding of the invention, the method of using the HMD when playing golf is described first, followed by how it is realized by the HMD of the embodiment.
  • The player marks the target position, namely the launch direction, on the tee ground with a predetermined movement, for example a movement of the tip of the hand 201, as seen through the display 101 of the HMD 100.
  • the marked target position is captured by the camera 103 and stored in the internal coordinate system managed by the HMD.
  • the player can recognize the green 203 and the flag 204 through the transmissive display 101 of the HMD 100.
  • The depth coordinate of the point pointed to by the finger at this time is set to a predetermined value, for example, 200 yards ahead.
  • Reference numeral 202 denotes an edge of the display 101 of the HMD 100.
  • the camera 103 starts high-speed imaging and performs imaging at a high frame rate.
  • The HMD 100 calculates the marked direction in its internal coordinate system using a 3D sensor, and displays an arrow toward the coordinates of the marked direction.
  • The start of high-speed imaging may also be instructed manually, for example by using the button 105 of the HMD.
  • FIG. 3 is a diagram showing an example of a display at the time of addressing on the head-mounted display of the first embodiment.
  • the player can recognize the club 301 and the ball 302 through the transmissive display 101 of the HMD 100.
  • The HMD 100 displays an arrow 311 indicating the marked direction described in FIG. 2. When the player faces the marked direction, the arrow is highlighted by changing its color, or the density, brightness, or solid/dotted rendering of the display. If the player is addressing to the right of the marked direction, arrow 313 is displayed; if the player is addressing to the left of the marked direction, arrow 312 is displayed.
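  The arrow selection above can be sketched as a simple angle comparison. This Python fragment is an assumption-laden illustration: the 2-degree tolerance and the function and label names are invented for the example, and angles are treated as compass-style headings in degrees.

```python
def aim_arrow(heading_deg, target_deg, tolerance_deg=2.0):
    """Pick which AR arrow to show while the player addresses the ball.

    Returns "on_target" when the player faces the marked direction,
    "aim_left" when addressing to the right of the mark (arrow 313),
    and "aim_right" when addressing to the left (arrow 312).
    """
    # Signed smallest difference in [-180, 180), robust across 0/360.
    diff = (heading_deg - target_deg + 180) % 360 - 180
    if abs(diff) <= tolerance_deg:
        return "on_target"
    return "aim_left" if diff > 0 else "aim_right"
```

  The modulo trick keeps the comparison correct even when the marked direction straddles north (0/360 degrees).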
  • The color image camera 103a of the HMD 100 photographs the ball 302 and the club 301, and the HMD 100 detects by image recognition that the ball 302 and the club 301 are in the address state. If clubs have been registered in the system by the user in advance, the club in use can be accurately determined by comparison with the registered club images. Determining the club in use provides information necessary for a more accurate calculation of the flight distance of the ball.
  • the HMD analyzes the ball launch speed, the head speed, the meet rate, and the spin amount from the image.
  • the existing simulation technique may be applied.
  • The HMD 100 of the first embodiment stores the launch direction (the direction of the target position) designated by the player on the tee ground, fairway, or the like, using the internal coordinate system of the HMD 100. Even if the player rotates his or her body when taking the address, the direction marked earlier can be displayed at address time as an AR (Augmented Reality) virtual object based on the stored launch-direction coordinates, so the address can be taken in the correct direction.
  • FIG. 4 is a diagram showing an example of the display of the HMD of the first embodiment. The state of hitting the ball has been image-processed, and the results of analyzing the ball launch speed, head speed, meet rate, and spin amount from the images are shown in the display area 401 of the display 101.
  • FIG. 4 also shows the predicted fall point 402 of the ball, calculated from the image-processed ball launch speed, head speed, meet rate, and spin amount, displayed as an AR object.
  • The player can thereby identify the drop point of the ball on the actual course, so the time spent searching for the ball can be shortened and, as a result, lost balls can be reduced.
  • FIG. 5 is a diagram showing an example of the hardware configuration of the head-mounted display of the first embodiment.
  • The HMD 100 of the first embodiment basically has the same configuration as a general-purpose computer (information processing device). That is, as shown in FIG. 5, the HMD 100 includes a controller 102, a camera 103, a display 101, a voice interface (I/F) 540, a communication I/F 520, a sensor 530, a line-of-sight detection device 510, a button 550, and a bus 560 that electrically connects each part.
  • The controller 102 performs various processes according to predetermined programs such as the image processing unit 601 shown in FIG. 6.
  • the controller 102 AR-displays virtual objects such as the arrow 311 in FIG. 3, the display area 401 in FIG. 4, and the fall prediction point 402 at a predetermined position on the display 101, for example.
  • the controller 102 of the first embodiment includes a CPU 501, a RAM 502, and a ROM 503.
  • The CPU 501 realizes various functions by loading a program previously stored in the ROM 503 into the RAM 502 and executing it.
  • the RAM 502 and the ROM 503 are collectively referred to as a storage device 630 (see FIG. 6) when it is not necessary to distinguish them.
  • the controller 102 is arranged on the frame 104, for example.
  • the camera 103 includes a color image camera 103a and a distance image camera 103b.
  • the color image camera 103a captures a shooting range including the user's field of view and acquires a color image.
  • the distance image camera 103b acquires a distance image in a shooting range substantially the same as that of the color image camera 103a.
  • the camera 103 (color image camera 103a and distance image camera 103b) captures the marked target position shown in FIG. 2 and stores it in the storage device 630 in the internal coordinate system managed by the HMD by the controller 102.
  • The camera 103 is arranged at a position where it can capture this shooting range.
  • Both or either of the color image camera 103a and the distance image camera 103b may have a memory (not shown) for temporarily storing images when imaging at a high frame rate.
  • the display 101 is a transmissive device that displays the image acquired by the camera 103 and the display data generated in the HMD 100.
  • The display 101 is composed of, for example, a transmissive liquid crystal device, an organic EL device, an optical scanning type device using MEMS (micro-electro-mechanical systems), or the like.
  • The device is not limited to these; any device can be used as long as it realizes a transmissive display structure in which the far side of the display 101 can be seen through while an image is displayed on it.
  • the transmissive display 101 is supported in front of one or both eyes of the user.
  • the display 101 can take an arbitrary shape.
  • the display 101 may include right and left display panels, and the display 101 may display one or more UI objects of the graphical user I / F.
  • The voice I/F 540 is, for example, a voice input/output device such as a microphone, a speaker, or a buzzer.
  • the voice I / F 540 inputs a sound from the outside world and outputs a sound created in the HMD 100 or a sound such as a voice or music transmitted via the communication I / F 520.
  • The voice I/F 540 may be omitted.
  • The communication I/F 520 includes a coding circuit, a decoding circuit, an antenna, and the like, and transmits and receives data to and from other, external devices via a network.
  • The communication I/F 520 is an I/F that connects to the network via an access point or the like (not shown), or via a base station or the like of a mobile telephone communication network (not shown).
  • the HMD 100 transmits / receives data to / from each server connected to the network via the communication I / F 520.
  • the button 550 is for turning on / off the power of the HMD 100, switching the operation mode, and inputting the start timing of a predetermined operation to the HMD 100 by the user.
  • Button 550 corresponds to button 105 in FIG.
  • The connection between the HMD 100 and the access point is performed by, for example, a wireless communication method such as Wi-Fi (registered trademark) or Bluetooth (registered trademark), or by another communication method.
  • The connection between the HMD 100 and a base station of the mobile telephone communication network is performed by, for example, the W-CDMA (registered trademark) (Wideband Code Division Multiple Access) method, the GSM (registered trademark) (Global System for Mobile communications) method, the LTE (Long Term Evolution) method, or another communication method such as 5G.
  • The sensor 530 detects the current position, tilt, speed, user operation, and the like of the HMD 100.
  • The sensor 530 includes, for example, a position information acquisition sensor such as a GPS receiver 531a, a gyro sensor 531b, an acceleration sensor 531c, a geomagnetic sensor 531d, a touch sensor 531e, and the like.
  • the sensor 530 does not have to include all of them.
  • the line-of-sight detection device 510 detects the user's line-of-sight direction.
  • the line-of-sight detection device 510 is realized by, for example, a line-of-sight detection camera that detects the line-of-sight direction of the user.
  • the line-of-sight detection camera is attached so as to include the iris, pupil, etc. of the user's eye in the imaging range.
  • FIG. 6 is a functional block diagram of the function related to the virtual object display processing of the HMD 100 of the first embodiment.
  • the controller 102 realizes the functions of the image processing unit 601, the display control unit 610, and the audio output control unit 625.
  • the display control unit 610 includes a space recognition unit 611, an instruction reception unit 622, a display data generation unit 623, and a display correction unit 624.
  • Each function is realized by the CPU 501 loading the program stored in the ROM 503 into the RAM 502 and executing it.
  • the storage device 630 stores color image data 631, distance image data 632, spatial recognition data 633, virtual object data (virtual OJT data) 634, audio data 636, and map data 638.
  • the map data 638 is not always necessary in the first embodiment.
  • the color image data 631 is an image acquired by the color image camera 103a.
  • the distance image data 632 is an image acquired by the distance image camera 103b.
  • the image processing unit 601 stores the color image and the distance image acquired by the color image camera 103a and the distance image camera 103b in the storage device 630 as the color image data 631 and the distance image data 632, respectively.
  • the color image and the distance image are acquired substantially synchronously.
  • The space recognition unit 611 recognizes the surrounding real space in the internal coordinate system generated based on the output from the sensor 530 inside the HMD, and stores the result in the storage device 630 as space recognition data 633. The recognition is performed using color image data 631 and distance image data 632 acquired at substantially the same time.
  • The space recognition unit 611 generates space recognition data 633, which is three-dimensional data (a three-dimensional map) of the structures in the shooting range, from each image data at predetermined time intervals according to the user's scanning operation, and stores it in the storage device 630.
  • The surrounding scan is performed, for example, immediately after startup as an initial setting, periodically, or when the user operates a button.
  • The map data 638 is data in an absolute coordinate system based on GNSS (Global Navigation Satellite System); map information of the golf course, downloaded via the Internet through the communication I/F 520, is stored in the storage device 630.
  • the map data includes the course layout of each hole of the golf course.
  • The course layout also includes information such as the distance of the hole from the teeing ground, and the position and size of obstacles such as bunkers, ponds, and OB areas.
  • The space recognition data 633 is created, for example, in a world coordinate system (a coordinate system including absolute coordinates) that defines the entire three-dimensional space.
  • Within this world coordinate system, the local coordinate system (internal coordinate system) of the HMD 100 is defined, with its origin and axial directions specified by the position and orientation (initial posture) of the HMD 100 main body at the time the instruction to start spatial recognition is received.
  • For example, the origin is a predetermined position on the display 101 of the HMD 100, the plane of the display 101 is the xy plane, and the z-axis direction is the direction perpendicular to the xy plane (the plane of the display 101).
  • The displacement and rotation of the local coordinate system of the HMD 100 relative to the world coordinate system, caused by the user's scanning motion, are calculated using the data obtained by the various sensors 530.
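  The sensor-integration step can be illustrated with a minimal dead-reckoning sketch in Python. This is a one-dimensional toy, not the HMD's actual fusion pipeline: each sample holds an assumed (yaw rate, forward acceleration) pair, and the double integration for position is what makes drift accumulate, which motivates the calibration described in Example 3.

```python
def integrate_pose(samples, dt):
    """Dead-reckon a pose in the local coordinate system.

    samples: iterable of (yaw_rate_dps, accel_forward_mps2) pairs.
    Returns (yaw_deg, distance_m). Gyro error integrates once into
    heading; accelerometer error integrates twice into position.
    """
    yaw = 0.0          # degrees
    velocity = 0.0     # m/s along the forward axis
    distance = 0.0     # metres travelled
    for yaw_rate, accel in samples:
        yaw += yaw_rate * dt
        velocity += accel * dt
        distance += velocity * dt
    return yaw, distance
```

  Any constant bias in the samples grows linearly in yaw and quadratically in distance, which is why the internal coordinate system slowly diverges from the GPS-based absolute one.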
  • Spatial recognition is performed using existing technology such as Spatial Mapping. That is, the HMD 100 of the first embodiment scans the surroundings with the color image camera 103a and the distance image camera 103b, and the space recognition unit 611 generates three-dimensional data from the result using an application such as Spatial Mapping.
  • the spatial recognition data 633 is held as, for example, mesh data.
  • The space recognition unit 611 can recognize the material and type of the structures within the shooting range through spatial understanding. That is, it can recognize whether a structure is, for example, a "wall", a "floor", or a "ceiling".
  • the space recognition unit 611 stores these recognition results in the storage device 630 as attribute data of the space recognition data 633.
  • the instruction receiving unit 622 receives display instructions and operation instructions for the virtual object displayed on the display 101.
  • the display instruction and the operation instruction include, for example, those by the line of sight (gaze) and those by the movement of the fingers (gesture).
  • the line-of-sight direction information used for gaze is detected using, for example, the line-of-sight detection device 510.
  • Gestures include, for example, click events (air taps), tap and hold, bloom, etc. for operation points of virtual objects.
  • the instruction receiving unit 622 detects the movement of the fingers in the gesture frame provided within the shooting range of the color image camera 103a and the distance image camera 103b, and detects display instructions, operation instructions, and the like.
  • the instruction reception unit 622 extracts the data of the instructed virtual object from the virtual object data 634, and causes the display data generation unit 623 described later to generate display data.
  • the instruction reception unit 622 detects the operation and notifies the display data generation unit 623 described later.
  • The display data generation unit 623 generates display data from the virtual object data 634 so that the instructed virtual object is displayed at a predetermined position on the display 101 in a predetermined shape, according to the user's instruction received via the instruction reception unit 622. For example, in addition to the arrow 311 in FIG. 3 indicating the marking in the direction pointed to by the fingertip 201 in FIG. 2, display data is generated so that virtual objects such as the display area 401 and the predicted fall point 402 in FIG. 4, calculated by the HMD, are displayed on the display 101.
  • FIG. 7 is a flowchart showing the processing operation of the HMD 100.
  • In step S71, the instruction receiving unit 622 acquires the position (marking position) specified by the user with the fingertip 201, as shown in FIG. 2, from the image that the image processing unit 601 acquires from the camera 103, referenced against the three-dimensional data of the structures in the shooting range generated by the space recognition unit 611.
  • Alternatively, the marking position may simply be stored by detecting, via the camera 103, the operation of marking the target position.
  • In step S72, the display data generation unit 623 calculates the marked launch direction from the integrated values of the outputs of the sensors 530, and displays a target arrow on the display 101 toward the marked direction coordinates in the internal coordinate system.
  • An arrow pointing in the marked direction is read from the virtual object data 634, processed in 3D, and displayed on the display 101 of the HMD 100.
  • In step S73, the image processing unit 601 of the controller 102 detects the address state from the positional relationship between the ball and the club head in the color image from the color image camera 103a.
  • Alternatively, the outputs of the sensors 530 may be used to detect that the player has entered the address state.
  • The arrow toward the marked direction is virtually displayed on the display 101.
  • In step S74, the controller 102 controls the camera 103 to shoot at a high frame rate, and the camera 103 starts high-frame-rate shooting.
  • In step S75, the image processing unit 601 of the controller 102 detects the shot state by image processing and calculates the ball speed and the spin amount.
  • In step S76, the image processing unit 601 of the controller 102 calculates the predicted fall point from the calculated ball speed and spin amount.
  • In step S77, under the control of the display control unit 610 of the controller 102, the display data generation unit 623 reads the display data for displaying the calculated fall prediction point from the virtual object data 634 and displays it as a virtual object on the display 101.
  • As a result, the player can take an address toward the correct target direction.
  • In addition, the time spent searching for the ball can be shortened and, as a result, lost balls can be reduced.
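  The linear flow of steps S71 to S77 can be sketched as a small state machine. This Python fragment is an illustrative sketch only: the phase names are invented labels for the flowchart steps, not identifiers from the patent.

```python
from enum import Enum, auto

class Phase(Enum):
    MARK_TARGET = auto()   # S71: store the marked launch direction
    SHOW_ARROW = auto()    # S72: AR arrow toward the mark
    WAIT_ADDRESS = auto()  # S73: detect ball and club at address
    HIGH_SPEED = auto()    # S74: switch camera to high frame rate
    ANALYZE_SHOT = auto()  # S75/S76: ball speed, spin -> fall point
    SHOW_FALL = auto()     # S77: AR-display the predicted fall point

def next_phase(phase: Phase) -> Phase:
    """Advance the linear shot-processing flow of FIG. 7 by one step."""
    order = list(Phase)
    i = order.index(phase)
    # The last phase holds until a new mark restarts the flow.
    return order[min(i + 1, len(order) - 1)]
```

  A real controller would also handle restarts (a new mark for the next shot) and error paths, which the flowchart omits.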
  • Example 2 is an application that uses the map data 638 of the HMD.
  • FIG. 8 shows an application for automatically registering the score by linking the HMD 100 and a smartphone 801.
  • the HMD 100 shown in FIG. 1 and the smartphone 801 are connected by wireless communication such as Bluetooth.
  • The controller 102 of the HMD 100 identifies the hole information corresponding to the current position of the player from the map data 638 and the GPS receiver 531a.
  • The GPS information from the GPS receiver 531a includes information indicating the current position. In the example of FIG. 8, it is displayed that the player is on the second hole.
  • The screen 810 of the smartphone 801 displays the hole number and scorecard information indicating the par and the distance (in yards) of the hole, acquired from the map data 638.
  • the scorecard information does not have to include information about distance.
  • The controller 102 transmits the scorecard information corresponding to the identified hole to an external device such as the smartphone 801 via the communication I/F 520.
  • the smartphone 801 displays the scorecard information transmitted from the HMD 100 as shown in FIG.
  • Since the HMD 100 shown in FIG. 8 can recognize the action of hitting a ball by image processing as described in the first embodiment, if the hole number is identified from the player's position by the GPS receiver 531a, the number of strokes on each hole can be entered automatically. Since shots and putts can be distinguished based on the shape of the club photographed by the camera 103 and the head speed calculated by image processing, the number of shots and the number of putts on each hole can also be entered automatically. That is, the controller 102 identifies the hole number being played by the user based on the GPS information from the GPS receiver 531a and the map data 638, and analyzes the images taken by the camera 103 for the identified hole. As a result, the number of shots and putts on each hole can be calculated and stored in the storage device 630, or the score can be displayed on the display 101.
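  The shot/putt discrimination and per-hole tally described above can be sketched as follows. The 3 m/s head-speed threshold and the function names are assumptions for illustration; the patent only states that club shape and head speed are used.

```python
def classify_stroke(club_shape, head_speed_mps):
    """Classify a detected swing as a shot or a putt.

    club_shape is the label from image recognition of the club; the
    3 m/s head-speed threshold is an assumed illustrative value.
    """
    if club_shape == "putter" or head_speed_mps < 3.0:
        return "putt"
    return "shot"

def tally_hole(strokes):
    """Count shots and putts for one hole from classified swings."""
    shots = sum(1 for s in strokes if s == "shot")
    putts = sum(1 for s in strokes if s == "putt")
    return {"strokes": shots + putts, "shots": shots, "putts": putts}
```

  Attributing each tally to the hole identified from GPS and map data then yields the automatically filled scorecard.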
  • FIG. 9 is a diagram showing an example of the display of the HMD of the second embodiment.
  • FIG. 9 shows an example in which the results of analyzing the ball launch speed, head speed, meet rate, and spin amount from the image are displayed on the display 101.
  • By providing the HMD 100 with the GPS receiver 531a and the map data 638, it is possible to determine at which point on which hole of the golf course the player is. Based on the GPS receiver 531a, the map data 638, and the predicted ball fall point described in the first embodiment, when the predicted fall point corresponds to a bunker, a pond, an OB, a one-penalty area, or the like, the display control unit 610 of the controller 102 controls the display 101 to change the color of the fall-point display or to display characters.
  • The controller 102 can also calculate the distance from the player's position before the shot to the edge of the green, from the GPS information of the GPS receiver 531a and the map data 638, and display it on the display 101.
  • According to the second embodiment, the score can be registered by linking the HMD and the smartphone. Further, the number of shots and putts can be automatically registered from the GPS information, the map data, and the shot and putt actions that the HMD can recognize.
  • the absolute coordinate system is a coordinate system obtained from latitude and longitude information obtained from satellites.
  • the absolute coordinate system is shown by x', y', and z'.
  • The internal coordinate system is the internal coordinate system of the HMD 100, which continuously performs inertial calculation from the accelerometer, gyro, and compass to calculate its own position and orientation.
  • FIG. 10B shows the internal coordinate system with x, y, and z. This internal coordinate system gradually deviates from the absolute coordinate system obtained from GPS information due to errors in the gyro, compass, and accelerometer, and due to integration error. In particular, the error of the compass is large, and therefore, as shown in FIG. 10B, the accumulated deviation in the rotation direction is large.
  • The bunker at position 101b in the absolute coordinate system is grasped as being at position 101a in the internal coordinate system.
  • Although the predicted fall point 402 is near the bunker in the absolute coordinate system, the ball is predicted to fall somewhere other than the bunker, so it cannot be determined or displayed that the ball may have entered the bunker. This is because the bunker, the map data of the hole, and the predicted fall point 402 are calculated in the internal coordinate system. The error between this internal coordinate system and the absolute coordinate system, which accumulates over time, must therefore be corrected.
  • FIG. 11 is a diagram showing correction of the coordinate system, explaining the method of aligning the internal coordinate system of the HMD 100 with the absolute coordinate system.
  • FIG. 11 shows the positions P1 (x'1, y'1), P2 (x'2, y'2), and P3 (x'3, y'3) where the player hit shots, on the map data of the golf course.
  • Locus 1 and locus 2 represent straight lines connecting the positions where the player hit shots.
  • The dash-dot line indicates a line parallel to the x-axis of the absolute coordinate system.
  • Locus 1 between position P1 and position P2 can be represented by an angle θ to the x-axis of the absolute coordinate system and a length L.
  • The internal coordinate system is corrected (calibrated) by replacing the angle to the x-axis when moving from point P1 to point P2, obtained by integrating the outputs of the internal sensors 530, with θ, and the corresponding distance with L.
  • the calibration timing may be at regular time intervals, shot points, and the like.
  • The calibration may also be performed by another method. For example, the player manually identifies positions that are easy to identify on the course, such as a bunker position or the pin position of the hole, and the calibration is performed so that the bunker and pin positions virtually displayed based on the map data are aligned with the positions specified by the user.
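  The θ/L replacement of FIG. 11 amounts to estimating a rotation (and, optionally, a scale) between the internally integrated track and the GPS-fixed shot points. The following Python sketch shows that estimation from two corresponding point pairs; the function names and the scale term are assumptions added for the example.

```python
import math

def calibrate(internal_p1, internal_p2, gps_p1, gps_p2):
    """Estimate the rotation and scale correcting the internal
    coordinate system against two GPS-fixed shot points (cf. FIG. 11).

    Points are (x, y) tuples. Returns (rotation_rad, scale) mapping
    internal displacements onto absolute ones.
    """
    def polar(p, q):
        # Angle to the x-axis and length of the segment p -> q.
        dx, dy = q[0] - p[0], q[1] - p[1]
        return math.atan2(dy, dx), math.hypot(dx, dy)

    ang_int, len_int = polar(internal_p1, internal_p2)
    ang_abs, len_abs = polar(gps_p1, gps_p2)
    rotation = ang_abs - ang_int          # replace the angle with theta
    scale = len_abs / len_int if len_int else 1.0  # replace length with L
    return rotation, scale
```

  Applying the returned rotation and scale to subsequent internal displacements keeps AR objects such as the predicted fall point aligned with the absolute-coordinate map data.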
  • FIG. 12 is a flowchart showing the processing operation of the HMD 100.
  • steps that perform the same processing as in FIG. 7 are given the same reference numerals; steps S71 to S75 and steps S76 to S77 correspond to the processing of steps S71 to S77 described with reference to FIG. 7.
  • the process of step S121 is performed between steps S75 and S76.
  • step S121 is executed by the display correction unit 624 to correct the internal coordinates as described with reference to FIG. 11; it stores the current shot point (absolute coordinates).
  • this records the absolute coordinates of position P2 in FIG. 11 and then calculates the slope relative to the previous shot point (absolute coordinates), i.e., the slope of the segment from position P1 in FIG. 11.
  • the slope obtained by integrating the output of the sensor 530 is then corrected as described above.
  • by aligning the internal coordinate system of the HMD with the absolute coordinate system, the player can grasp the correct situation around the ball-fall prediction point.
  • the player can likewise identify the correct shot target from the map data before taking a shot.
  • 100: Head-mounted display, 101: Display, 102: Controller, 103: Camera, 104: Frame, 105: Button, 501: CPU, 502: RAM, 503: ROM, 510: Line-of-sight detector, 520: Communication I/F, 530: Sensor, 540: Voice I/F.
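The correction described above, replacing the sensor-integrated angle and distance between two shot points with the angle θ and length L measured in the absolute coordinate system, amounts to estimating a rotation and a scale factor between the two coordinate systems. A minimal sketch of that idea (function names are illustrative, not from the publication):

```python
import math

def shot_segment(p_start, p_end):
    """Angle to the x-axis and length of the segment joining two shot points."""
    dx, dy = p_end[0] - p_start[0], p_end[1] - p_start[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)

def calibration(p1_abs, p2_abs, p1_int, p2_int):
    """Rotation and scale mapping the internally integrated P1->P2 segment
    onto the same segment measured in absolute coordinates."""
    theta_abs, l_abs = shot_segment(p1_abs, p2_abs)  # theta and L of FIG. 11
    theta_int, l_int = shot_segment(p1_int, p2_int)  # from sensor integration
    return theta_abs - theta_int, l_abs / l_int

def to_absolute(p_int, p1_int, p1_abs, rot, scale):
    """Re-express a point tracked in internal coordinates in absolute ones."""
    dx, dy = p_int[0] - p1_int[0], p_int[1] - p1_int[1]
    return (p1_abs[0] + scale * (dx * math.cos(rot) - dy * math.sin(rot)),
            p1_abs[1] + scale * (dx * math.sin(rot) + dy * math.cos(rot)))

# The sensor track says the player moved 90 units along +y, while the
# absolute shot points show a 100-unit move along +x: correct the drift.
rot, scale = calibration((0, 0), (100, 0), (0, 0), (0, 90))
print(to_absolute((0, 90), (0, 0), (0, 0), rot, scale))  # approx. (100.0, 0.0)
```

Such a correction would be recomputed at each shot point (as in step S121), so residual drift accumulates only between consecutive shots.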

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

In order to assist the user in taking an address toward the correct target direction on a real-world golf course, this head-mounted display for golf comprises: a camera that captures a target position in response to a predetermined action by the user; a memory device that stores the target position in an internal coordinate system managed by the head-mounted display; a sensor that detects the user's movements; a controller that integrates the output of the sensor; and a transparent display that, based on the integration result of the sensor output, displays a first virtual object indicating the direction toward the target position.
PCT/JP2019/026857 2019-07-05 2019-07-05 Head-mounted display WO2021005655A1 (fr)
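The data flow in the abstract, storing a target position captured by the camera, integrating sensor output as the user moves, then rendering a guide object toward the target, reduces to computing the target's bearing relative to the wearer's current pose. A minimal sketch under those assumptions (names are illustrative, not from the publication):

```python
import math

def target_bearing(user_pos, user_heading, target_pos):
    """Signed angle (radians) from the wearer's facing direction to the stored
    target; all quantities live in the HMD's internal coordinate system."""
    dx = target_pos[0] - user_pos[0]
    dy = target_pos[1] - user_pos[1]
    rel = math.atan2(dy, dx) - user_heading
    # Wrap to (-pi, pi] so the sign cleanly separates "turn left" from "turn right".
    return math.atan2(math.sin(rel), math.cos(rel))

# Wearer at the origin facing +x; the stored target sits 45 degrees to the left,
# so the display would offset the first virtual object by that angle.
print(math.degrees(target_bearing((0, 0), 0.0, (10, 10))))  # approx. 45.0
```

Here `user_pos` and `user_heading` stand in for the result of integrating the sensor output, and `target_pos` for the camera-captured position held in the memory device.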

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/026857 WO2021005655A1 (fr) 2019-07-05 2019-07-05 Head-mounted display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/026857 WO2021005655A1 (fr) 2019-07-05 2019-07-05 Head-mounted display

Publications (1)

Publication Number Publication Date
WO2021005655A1 true WO2021005655A1 (fr) 2021-01-14

Family

ID=74113954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026857 WO2021005655A1 (fr) 2019-07-05 2019-07-05 Head-mounted display

Country Status (1)

Country Link
WO (1) WO2021005655A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3139827U (ja) * 2007-10-30 2008-03-06 株式会社パー七十二プラザ Golf information transmitting and receiving device
JP2012095914A (ja) * 2010-11-04 2012-05-24 Ns Solutions Corp Golf player support system, user terminal device, golf player support method, and program
US20120295739A1 (en) * 2007-12-03 2012-11-22 Julius Young Machine and Method for Comprehensive Golf Training and Instruction
JP2015503399A (ja) * 2011-12-30 2015-02-02 Nike Innovate CV System for tracking a golf ball and displaying an enhanced image of the golf ball
JP2015190850A (ja) * 2014-03-28 2015-11-02 Seiko Epson Corp Error estimation method, motion analysis method, error estimation device, and program
JP2017102768A (ja) * 2015-12-03 2017-06-08 Seiko Epson Corp Information processing device, display device, information processing method, and program
US20180036621A1 (en) * 2014-10-28 2018-02-08 Mats NORDSTROM Method and device for providing guiding for executing a golf swing
JP2018079315A (ja) * 2016-11-15 2018-05-24 ストロークプレイ Device and method for measuring flight data of a flying object using a high-speed video camera, and computer-readable recording medium storing a program for executing the same
US20180256962A1 (en) * 2017-03-07 2018-09-13 vSports, LLC Mixed reality sport simulation and training system

Similar Documents

Publication Publication Date Title
US9703102B2 (en) Information processing device including head mounted display
JP6396027B2 (ja) Program and game device
US20160292924A1 (en) System and method for augmented reality and virtual reality applications
JP2021520978A (ja) Method and apparatus for controlling interaction between a virtual object and a thrown object, and computer program
KR101898782B1 (ko) Object tracking device
TW201501751A (zh) Motion analysis device
US11100713B2 (en) System and method for aligning virtual objects on peripheral devices in low-cost augmented reality/virtual reality slip-in systems
EP3156110A1 (fr) Information processing device including a head-mounted display
KR20180095588A (ko) Method and apparatus for motion analysis of sports equipment
JP2002233665A (ja) Game device, game method, and readable storage medium
KR101270489B1 (ko) Golf simulation HMD
KR102232253B1 (ko) Posture comparison and correction method using an application that superimposes two golf images and their result data
JP2015231445A (ja) Program and image generation device
CN112370795B (zh) Head-mounted-device-based racket-swinging ball sport method, apparatus, and device
WO2015048890A1 (fr) System and method for augmented reality and virtual reality applications
US10286285B2 (en) Display method, display apparatus, motion analysis system, motion analysis program, and recording medium
CN105850109A (zh) Information processing device, recording medium, and information processing method
KR20160106671A (ko) Motion analysis device, motion analysis system, motion analysis method, display method of motion analysis information, and program
CN107106900A (zh) Detection device, detection system, motion analysis system, recording medium, and analysis method
US10096260B2 (en) Golf play assisting system
JP2012055450A (ja) Golf support device
WO2021005655A1 (fr) Head-mounted display
US10653948B2 (en) Calibration of a magnetometer for augmented reality experience
US20220339495A1 (en) Golf analysis device with calibration function
JP2013009789A (ja) Camera system, imaging system, and imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19936735

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19936735

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP