CN110732133A - method and device for remotely controlling game view angle based on intelligent glasses - Google Patents


Info

Publication number
CN110732133A
CN110732133A (application CN201910228695.1A)
Authority
CN
China
Prior art keywords
mouse
data
game
mouse event
motion data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910228695.1A
Other languages
Chinese (zh)
Inventor
马兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ingenic Semiconductor Co Ltd
Original Assignee
Beijing Ingenic Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ingenic Semiconductor Co Ltd filed Critical Beijing Ingenic Semiconductor Co Ltd
Publication of CN110732133A publication Critical patent/CN110732133A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements characterised by their sensors using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/212 Input arrangements characterised by their sensors using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105 Features characterized by input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data for mapping control signals received from the input arrangement into game commands

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention provides a method and device for remotely controlling a game view angle based on intelligent glasses. The method comprises the steps of obtaining motion data of the head of a user through a gravity sensor in the intelligent glasses, converting the motion data into mouse event data, and transmitting the mouse event data to a game host computer to control a game in the game host computer.

Description

method and device for remotely controlling game view angle based on intelligent glasses
Technical Field
The invention relates to the technical field of data processing, and in particular to a method and a device for remotely controlling a game view angle based on intelligent glasses.
Background
More and more people play first-person shooting games on mobile phones and computers, such as CrossFire (CF), CS, Fire Line Assault, Temple Run, etc. However, whether playing on an Android phone or a desktop, the view is generally controlled remotely by swinging the mobile phone up and down or left and right, or through the on-screen controls of the mobile phone.
For example, in CrossFire at the first-person view angle, the keyboard controls the character to move left, right, forward and backward, and mouse clicks control shooting. Wearable glasses could likewise control the crosshair of the game character: shaking the head left and right moves it left and right, raising the head moves forward, lowering the head moves backward, and tapping a touch pad shoots. To realize this, only a USB line is required to connect the glasses to the main device.
However, no effective solution has been proposed at present for how to implement the conversion of specific control instructions.
Disclosure of Invention
The embodiment of the invention provides a method for remotely controlling a game view angle based on intelligent glasses, so as to achieve the purpose of controlling a game through the intelligent glasses. The method comprises the following steps:
acquiring motion data of the head of a user through a gravity sensor in the intelligent glasses;
converting the motion data into mouse event data;
and transmitting the mouse event data to a game host to realize control of the game in the game host.
In some embodiments, the motion data includes acceleration data and angle data.
In some implementations, converting the motion data into mouse event data includes:
converting the motion data into a mouse event according to a preset correspondence, wherein the correspondence is a pre-established correspondence between head motion data and mouse events.
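As an illustration of such a preset correspondence, the sketch below maps one head-motion sample to a relative mouse event through a lookup table; the axis names, event names, and gain value are hypothetical assumptions, not values from the patent:

```python
def motion_to_mouse_event(axis, acceleration_g):
    """Map one head-motion sample to a (mouse_event, value) pair
    using a pre-established correspondence table (illustrative only)."""
    # Pre-established correspondence: head-motion axis -> relative mouse axis.
    correspondence = {
        "x": "REL_X",  # head left/right -> cursor left/right
        "y": "REL_Y",  # head up/down    -> cursor up/down
    }
    if axis not in correspondence:
        return None  # no mapping defined for this axis
    gain = 100  # assumed scale from acceleration (g) to cursor counts
    return (correspondence[axis], int(round(acceleration_g * gain)))
```

A real implementation would build this table from calibration data rather than a fixed gain.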
In some embodiments, transmitting the mouse event to a game host includes:
transmitting the mouse event to the host through a USB data line using the HID protocol of a USB mouse in the intelligent glasses.
In some embodiments, the mouse event includes mouse movement speed, movement distance, movement angle and movement direction.
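Since the events are sent using the HID protocol of a USB mouse, they can be packed like a standard 3-byte boot-protocol mouse report (button bitmap, signed X delta, signed Y delta). The sketch below shows that packing; it is a simplified illustration of the general HID boot format, not the patent's actual report descriptor:

```python
import struct

def pack_boot_mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a 3-byte USB HID boot-protocol mouse report:
    byte 0 = button bitmap, byte 1 = signed X delta, byte 2 = signed Y delta."""
    # Clamp deltas to the signed 8-bit range used by the boot protocol.
    dx = max(-127, min(127, dx))
    dy = max(-127, min(127, dy))
    return struct.pack("Bbb", buttons & 0x07, dx, dy)
```

For example, a left-click with a small diagonal move packs to `b"\x01\x0a\xfb"`.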
The embodiment of the invention also provides a device for remotely controlling the game view angle based on intelligent glasses, so as to achieve the purpose of controlling the game through the intelligent glasses. The device comprises:
the acquisition module is used for acquiring the motion data of the head of the user through a gravity sensor in the intelligent glasses;
the conversion module is used for converting the motion data into mouse event data;
and the control module is used for transmitting the mouse event to a game host so as to control the game in the game host.
In some embodiments, the motion data includes acceleration data and angle data.
In some embodiments, the conversion module is specifically configured to convert the motion data into a mouse event according to a preset correspondence, where the correspondence is a pre-established correspondence between head motion data and mouse events.
In some embodiments, the control module is specifically configured to transmit the mouse event to the host through a USB data line via the HID protocol of a USB mouse in the intelligent glasses.
In some embodiments, the mouse event includes mouse movement speed, movement distance, movement angle and movement direction.
In the embodiment of the invention, a gravity sensor is added to the intelligent glasses to sense the head movement of the user, and the head movement is converted into mouse instructions (such as forward, backward, left and right movement) so as to realize control of the game. In this way, the user can control the game through head movement without a mouse, achieving the technical effect of simple and efficient game control.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this application, do not limit the invention. In the drawings:
fig. 1 is a schematic structural diagram of smart glasses according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for game control via smart glasses according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method for remotely controlling a game perspective based on smart glasses according to an embodiment of the present invention;
fig. 4 is a block diagram illustrating an apparatus for remotely controlling a game viewing angle based on smart glasses according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings.
In this example, it is considered that, to realize control of a game based on a wearable device, the game can be driven by the MPU sensor in the wearable device, and the mouse connected to the host can be replaced through the HID mouse protocol inside the wearable device's system. Specifically, the acceleration data and angle data generated by the MPU sensor may be converted into instructions in the mouse protocol, and then transmitted to the host over USB.
However, although the above-mentioned way can theoretically accomplish the control of the game by the wearable device, the following problems still need to be solved:
1) The HID protocol code inside the mouse device itself needs to be used, and this protocol code needs to be ported and merged with the MPU driver.
2) The hardware interface of the mouse end needs to be kept consistent with the wearable device end, and the interface connected to the host needs to be a USB interface, which may cause circuit incompatibility or increased power consumption, and complete simulation cannot be guaranteed.
3) There is a distance limitation, and when the user moves too vigorously, the USB connection cable may be torn off.
The mouse used in current games is mainly the opto-mechanical type. When the mouse moves, it drives a rubber ball to roll; the rolling ball rubs against the roller shafts of grating wheels mounted in the horizontal and vertical directions inside the mouse, driving the grating wheels to rotate. The rim of each grating wheel is grid-shaped, with an infrared light-emitting tube on one side of the grid and an infrared receiving component on the other, so that the movement of the mouse is converted into rotation of the horizontal and vertical grating wheels in different directions and at different speeds.
For a mouse, there are several positioning methods as follows:
1) Trackball positioning. The operating principle of trackball positioning is similar to that of the grating mouse; only the way the rollers are moved changes. The ball seat is fixed, and the trackball is stirred directly by hand to control the movement of the mouse pointer. When the trackball is rubbed, it drives the roller shafts on its left, right, upper and lower sides; the roller shafts carry grating wheels, and pulse signals generated by the light-emitting tube and receiving component perform the positioning. Because the trackball's rollers are large and have a long stroke, this positioning mode allows very accurate operation. A further advantage of the trackball is stability: operation and positioning are controlled by the fingers, so positioning is not affected by movement of the hand.
2) Optical positioning. Light-emitting diodes are arranged inside an optical mouse, and the light they emit illuminates the surface under the mouse (which is why the bottom of an optical mouse always glows). Part of the light reflected from the surface passes through a set of optical lenses to a light-sensing device (a micro-imager) for imaging, so the movement track of the mouse is recorded as a set of continuous images shot at high speed. Finally, a dedicated image-analysis chip (a DSP, digital signal processor) inside the mouse analyzes this series of images; by analyzing how the positions of feature points change across the images, it judges the moving direction and moving distance of the mouse and completes the positioning of the cursor.
3) Laser positioning is also a positioning mode of the optical mouse, characterized by using a laser instead of the ordinary light emitted by a light-emitting diode. Laser light is produced by stimulated emission and, compared with ordinary light, has extremely high monochromaticity and directionality; the laser used for positioning is mainly invisible light. Because the reflectivity of ordinary light differs between surfaces of different colors, an ordinary optical mouse suffers from "color blindness" on surfaces where the reflectivity is low and the DSP (digital signal processor) cannot identify the image; in addition, ordinary light fails or jumps on the surface of transparent materials. Since laser light has a nearly single wavelength, the surface condition can be identified better and the sensitivity is effectively improved, so a laser-positioning mouse can effectively solve these problems.
4) Blue positioning. Blue positioning (BlueTrack) is a newer accurate positioning mode developed by Microsoft. A mouse using BlueTrack technology uses blue visible light, but instead of the diffuse-reflection principle it uses the specular-reflection point-imaging principle of a laser engine. The blue light source passes through a high-angle collimating lens and is shone onto the surface of any object; the reflected light enters a converging lens and is transmitted into a CMOS chip for blue-light positioning processing. Like a high-speed continuous-shooting camera, the optical sensor (CMOS detector) takes thousands of pictures every second and transmits them to an image-processing chip, which compares each picture to finally obtain the movement track of the mouse.
The above are the principles of several existing ways of determining the moving direction and moving distance of a mouse.
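The image-comparison step shared by the optical, laser and BlueTrack approaches can be illustrated with a toy one-dimensional version: find the shift between two successive sensor frames that minimizes their difference. The real DSP works on 2-D images at high frame rates; this sketch only illustrates the idea:

```python
def estimate_shift(prev, curr, max_shift=3):
    """Estimate 1-D surface displacement between two successive sensor
    frames by finding the shift with minimum squared difference, a
    simplified stand-in for the 2-D feature matching a mouse DSP does."""
    best_shift, best_err = 0, float("inf")
    n = len(prev)
    for s in range(-max_shift, max_shift + 1):
        # Compare prev[i] against curr[i + s] over the overlapping region.
        overlap = [(prev[i], curr[i + s]) for i in range(n) if 0 <= i + s < n]
        err = sum((a - b) ** 2 for a, b in overlap) / len(overlap)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```

If the surface pattern in the second frame appears two pixels later, the estimator reports a shift of 2.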
Under a Linux system, most USB mouse drivers are roughly divided into three parts: a USB device-driver part, which acquires the coordinates and action events transmitted by the mouse device through the USB protocol; an input-subsystem part; and an interrupt-reporting part. These are described as follows:
1) Configuration. Whether under Windows or the Linux kernel, this process is complex: the system establishes the device, configuration, interface, setting and endpoint description information for the device and sets up a channel for transmission with the USB device (all following the USB protocol). After that, the USB device driver matches these interfaces and performs initialization tasks on the device, for example allocating a urb (USB request block, the structure that will carry the mouse data), in preparation for receiving data.
2) The interrupt part. When data arrive, the USB core layer triggers the callback function that the device driver registered through the API provided by the kernel; the transfer is submitted to the HCD (USB host controller driver), which schedules the receiving and sending of data.
3) The input subsystem. The driver registers an input device with the input subsystem; when the coordinate data of the mouse are obtained, the coordinates and action events are reported to the system's udev node, and the remaining work is finished by the system layer.
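The host side's decoding of a mouse packet into input events can be sketched in userspace as follows; the event names mirror those of the Linux input subsystem, but the function itself is a simplified illustration, not driver code:

```python
import struct

def parse_boot_mouse_report(report: bytes):
    """Decode a 3-byte boot-protocol mouse report into the list of
    events a host driver would report to the input subsystem
    (simplified userspace sketch)."""
    buttons, dx, dy = struct.unpack("Bbb", report)
    events = []
    if buttons & 0x01:
        events.append(("BTN_LEFT", 1))   # left button pressed
    if dx:
        events.append(("REL_X", dx))     # relative X motion
    if dy:
        events.append(("REL_Y", dy))     # relative Y motion
    return events
```

In a real driver the equivalent calls would be `input_report_key`/`input_report_rel` followed by `input_sync`.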
Based on this, in this example it is considered that distance and angle data can be calculated from the accelerations and angular velocities generated by the movement of the wearable device, and the events generated by a mouse can be replaced with these data. The mouse driver on the host-system side is not affected; only this replacement work is needed for game control to be realized through the wearable device.
For the mouse, its interior generally includes two modules: one part is the DSP chip and grating mechanical sensor, which generate digital signals from physical movement and then calculate the coordinates and distance of each point along the movement; the other part is the USB device part, which transmits the data to the host through the USB line, following the USB device-side protocol.
The wearable device (such as glasses) is considered as a whole, with 3 axes extending in the three-dimensional directions; for example, the z axis carries the received gravity value. The 3-axis accelerometer is programmably controlled within the ranges of ±2 g, ±4 g, ±8 g and ±16 g. A mean-value sampling operation can be carried out on the accelerometer data at the HAL layer to obtain the instantaneous acceleration of the wearer's head shake; this is converted into the moving distance and moving speed of the mouse, and then written back into the mouse source code through an ioctl character-device interface.
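The scale conversion and mean-value sampling described above can be sketched as follows; the signed 16-bit raw format and the 32768 divisor follow typical MPU-style accelerometers and are assumptions, not values stated in the patent:

```python
def raw_to_g(raw, full_scale_g=2):
    """Convert a signed 16-bit accelerometer sample to g for a
    programmable +/-2g..+/-16g full-scale range (MPU-style scaling)."""
    return raw * full_scale_g / 32768.0

def mean_sample(raws, full_scale_g=2):
    """Mean-value sampling over a window, as the HAL layer might do
    to smooth jitter before converting to a mouse distance."""
    return sum(raw_to_g(r, full_scale_g) for r in raws) / len(raws)
```

For instance, a raw reading of 16384 at the ±2 g setting corresponds to exactly 1 g.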
Based on the consideration that the HID protocol code can be ported into the kernel of the wearable device, the wearable device can become a wearable mobile mouse. Accordingly, the transmitted data source is the head-shake data generated by the MPU. The final result is classified through calculation; the moving speed, moving distance, angle and moving direction are packaged in a structure according to the HID protocol, and a switch judgment can be made so that the head-movement data are converted into events of the mouse moving up, down, left or right. That is, each event generated by the MPU needs to correspond to a mouse event, so that it can finally be converted into ordinary mouse data and sent to the host, because the host system supports the mouse protocol but not the MPU protocol.
Specifically, the MPU produces acceleration data and angular-velocity data at every moment. Movement along each axis produces acceleration components in different directions, and the instantaneous motion angle can be calculated from these components. By collecting the acceleration data and analyzing its change over time, it can be judged whether the wearable device is moving or static, whether it has moved to a point and stopped, and what the specific displacement is; the mouse motion data are simulated from these data and then packed in the mouse data format.
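One simple way to judge moving versus static from the collected acceleration data is to look at the spread of the magnitude over a short window; the threshold below is an assumed value, and the function is only a sketch of the idea:

```python
def is_moving(accel_window_g, threshold=0.05):
    """Judge moving vs. static from the variation of acceleration
    magnitude (in g) over a short sample window.
    The 0.05 g threshold is an illustrative assumption."""
    if len(accel_window_g) < 2:
        return False  # not enough samples to see any variation
    spread = max(accel_window_g) - min(accel_window_g)
    return spread > threshold
```

A steady ~1 g reading (gravity only) is classified as static, while a fluctuating window indicates head motion.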
For example, when the MPU detects that the head moves 10 cm to the left, an acceleration of 0.6 g is generated in the x-axis direction with an angle of 0. After this data is calculated, it can be mapped manually in a program to the movement-distance event and angle event of the mouse: the event "head moves 10 cm to the left" is mapped at the host side to "mouse moves 10 cm to the left". For the host, the command still appears to be sent by a mouse and is reported through the driver.
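That mapping can be sketched as a small function; the field names and the direct centimetre-for-centimetre mapping are illustrative assumptions rather than the patent's actual encoding:

```python
import math

def head_motion_to_event(dx_cm, ax_g, ay_g=0.0):
    """Map a measured head displacement and its acceleration components
    to a mouse movement event (distance, angle, direction).
    All field names and the 1:1 cm mapping are illustrative."""
    # Instantaneous motion angle from the acceleration components.
    angle = math.degrees(math.atan2(ay_g, ax_g)) if (ax_g or ay_g) else 0.0
    direction = "left" if dx_cm < 0 else "right"
    return {"event": "mouse_move", "distance_cm": abs(dx_cm),
            "angle_deg": angle, "direction": direction}
```

With the patent's example values (10 cm left, 0.6 g on x, angle 0), this yields a "mouse moves 10 cm left" event.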
In this way, mouse simulation can be realized by connecting the wearable device to the host computer.
As shown in fig. 1, in this example, smart glasses for games are provided, which may include:
1) a wearing part 101, used for fitting the smart glasses on the user's head;
2) a gravity sensor 102 for sensing head movement of a user;
3) and the MPU 103 is connected with the gravity sensor 102 and used for acquiring the head motion data and converting the head motion data into mouse control data, wherein the mouse control data is transmitted to a host computer to control a game.
That is, a gravity sensor is added to the wearable device (such as intelligent glasses) to sense the head movement of the user, and the head movement is converted into mouse instructions (such as forward, backward, left and right movement), so that the game is controlled. In this way, the user can control the game through head movement without a mouse, achieving the technical effect of simple and efficient game control.
The gravity sensor 102 may include an acceleration sensor and an angular-velocity sensor. With these two sensors, the moving speed, moving distance, moving angle, moving direction and the like of the user's head movement can be effectively obtained, converted into the corresponding mouse events, and packaged as mouse events, thereby implementing control of the game.
In order to realize the control of the game in the host, a USB data line 104 may be provided on the smart glasses as shown in fig. 1, and the USB data line may be connected to the MPU 103 for transmitting the mouse control data to the host.
The USB data line may transmit data to the host via the HID protocol, and the smart glasses may further be provided with a Kernel module, wherein the HID protocol code is located in the Kernel.
The host may be any device capable of carrying and playing games, for example, a mobile terminal, a desktop computer, etc.
Further, considering that the main function of the intelligent glasses is viewing, or technologies such as VR and AR, the mouse function is not their most important role. Therefore, a mouse-function control switch can be provided, through which the intelligent glasses control whether the mouse function is enabled: the mouse-simulation function of the glasses can be started when game control is needed, and closed when it is not, so as to adapt to different scene requirements.
Based on the smart glasses, the method steps as shown in fig. 2 can be adopted to realize control:
s1: triggering a glasses motion sensor through head motion to obtain acceleration data of three axes of x, y and z axes;
s2, reporting to HAL layer through android standard interface, simulating mouse motion track algorithm, sampling acceleration data according to fixed time to obtain average acceleration values, thereby calculating moving distance in each direction, then simulating mouse positioning photoelectric value by 3 acceleration values to obtain simulated coordinate.
S3: writing the group of data from the Hal layer to a mouse protocol code of a Driver layer, and then transmitting the data to a host by using a usb protocol in an interrupt transmission mode through the usb;
s4: after receiving the data, the Kernel of the host system considers that the data are data sent by the mouse and reports the data to the system through the driver so as to complete the simulation.
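Steps S1 to S4 can be strung together in a minimal end-to-end sketch: raw 3-axis samples are averaged, scaled to cursor deltas, and written into a boot-protocol style mouse report that the host would decode as an ordinary mouse packet. The window size, gain, and report format are illustrative assumptions:

```python
import struct

def pipeline(raw_samples, full_scale_g=2, gain=50):
    """End-to-end sketch of S1-S4: raw 3-axis accelerometer samples ->
    averaged acceleration (g) -> cursor deltas -> 3-byte mouse report.
    Constants (full scale, gain) are illustrative, not from the patent."""
    # S1/S2: average the raw signed 16-bit samples per axis, scale to g.
    n = len(raw_samples)
    avg = [sum(s[i] for s in raw_samples) / n * full_scale_g / 32768.0
           for i in range(3)]
    # S2: turn averaged x/y acceleration into simulated cursor deltas.
    dx = max(-127, min(127, int(avg[0] * gain)))
    dy = max(-127, min(127, int(avg[1] * gain)))
    # S3: write the data into a boot-protocol style mouse report.
    # S4: the host kernel would decode this as an ordinary mouse packet.
    return struct.pack("Bbb", 0, dx, dy)
```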
Through the intelligent glasses and the specific implementation provided by the above example, the sense of reality of the game can be enhanced and the aim of controlling the game with a first-person head view angle, instead of a hand-operated mouse, is achieved, increasing the interest of the game. Specifically, the gravity sensor in the glasses replaces the mouse on the host, which can increase the freshness of the game and make aiming in the game more accurate.
Because the HID protocol is used, the driver involved in the device and the method can be based on a Linux platform or a Windows system, so compatibility with various systems and devices can be realized.
Based on the above smart glasses for games, a method for remotely controlling a game view angle based on the smart glasses is provided in this example.
Where the context permits, a reference to an element, component or step, etc. should not be construed as limited to only one of the elements, components or steps, but may refer to one or more of them.
Although the flow described below includes a number of operations that occur in a particular order, it should be appreciated that the processes may include more or fewer operations that may be performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment). As shown in FIG. 3, the method includes:
step 301: acquiring motion data of the head of a user through a gravity sensor in the intelligent glasses;
step 302: converting the motion data into mouse event data;
step 303: and transmitting the mouse event to a game host to realize the control of the game in the game host.
In some embodiments, the motion data may include, but is not limited to, acceleration data and angle data.
In some embodiments, converting the motion data into mouse event data in step 302 may include converting the motion data into a mouse event according to a preset correspondence, wherein the correspondence is a pre-established correspondence between head motion data and mouse events.
When transmitting the control command, transmitting the mouse event to the game host may include: transmitting the mouse event to the host through a USB data line using the HID protocol of a USB mouse in the intelligent glasses.
The mouse events described above may include, but are not limited to: mouse movement speed, movement distance, movement angle and movement direction.
Based on the same inventive concept, a device for remotely controlling a game view angle based on smart glasses is further provided in the embodiments of the present invention, as described in the following embodiments. Since the principle by which the device solves the problem is similar to that of the method for remotely controlling a game view angle based on smart glasses, the implementation of the device can refer to the implementation of the method, and the repetition is not described again. As shown in fig. 4, the device includes:
An obtaining module 401, configured to obtain motion data of a head of a user through a gravity sensor in smart glasses;
a conversion module 402, configured to convert the motion data into mouse event data;
and a control module 403, configured to transmit the mouse event to a game host, so as to control a game in the game host.
In some embodiments, the motion data may include acceleration data and angle data.
In some embodiments, the conversion module may be specifically configured to convert the motion data into a mouse event according to a preset correspondence, where the correspondence is a pre-established correspondence between the head motion data and the mouse event.
In some embodiments, the control module may be specifically configured to transmit the mouse event to the host through a USB data line via the HID protocol of a USB mouse in the smart glasses.
In some embodiments, the mouse event may include mouse movement speed, movement distance, movement angle and movement direction.
In another embodiment, software is provided, which is used to implement the technical solutions described in the above embodiments and preferred embodiments.
In another embodiment, a storage medium is provided, which stores the above software; the storage medium includes, but is not limited to, optical disks, floppy disks, hard disks, removable memory, and the like.
From the above description, it can be seen that the embodiments of the present invention achieve the following technical effects: a gravity sensor is added to the intelligent glasses to sense the head movement of the user, and the head movement is converted into mouse instructions (such as forward, backward, left and right movement) so as to control the game. In this way, the user can control the game through head movement without a mouse, achieving the technical effect of simple and efficient game control.
It will be apparent to those skilled in the art that the modules or steps of the embodiments of the invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented with program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be fabricated separately into individual integrated-circuit modules, or multiple of them may be fabricated into a single integrated-circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for remotely controlling a game view angle based on intelligent glasses, comprising:
acquiring motion data of the head of a user through a gravity sensor in the intelligent glasses;
converting the motion data into mouse event data;
and transmitting the mouse event data to a game host to control a game running on the game host.
2. The method of claim 1, wherein the motion data comprises: acceleration data and angle data.
3. The method of claim 1, wherein converting the motion data into mouse event data comprises:
converting the motion data into a mouse event according to a preset correspondence, wherein the correspondence is a pre-established correspondence between head motion data and mouse events.
4. The method of claim 1, wherein transmitting the mouse event to a game host comprises:
transmitting the mouse event to the game host over a USB data cable using the USB HID mouse protocol implemented in the intelligent glasses.
5. The method of claim 4, wherein the mouse event comprises: mouse movement speed, movement distance, movement angle, and movement direction.
6. A device for remotely controlling a game view angle based on intelligent glasses, comprising:
an acquisition module, configured to acquire motion data of the head of a user through a gravity sensor in the intelligent glasses;
a conversion module, configured to convert the motion data into mouse event data; and
a control module, configured to transmit the mouse event data to a game host to control a game running on the game host.
7. The apparatus of claim 6, wherein the motion data comprises: acceleration data and angle data.
8. The apparatus of claim 6, wherein the conversion module is specifically configured to convert the motion data into a mouse event according to a preset correspondence, the correspondence being a pre-established correspondence between head motion data and mouse events.
9. The apparatus of claim 6, wherein the control module is specifically configured to transmit the mouse event to the game host over a USB data cable using the USB HID mouse protocol implemented in the intelligent glasses.
10. The apparatus of claim 9, wherein the mouse event comprises: mouse movement speed, movement distance, movement angle, and movement direction.
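Claims 4 and 9 send mouse events to the host over USB using the standard HID mouse protocol. A boot-protocol HID mouse report is three bytes: a button bitmap followed by signed X and Y displacements. A minimal sketch of packing such a report (the function name and clamping policy are illustrative assumptions; the byte layout follows the HID boot-mouse specification):

```python
import struct

def hid_mouse_report(dx, dy, buttons=0):
    """Pack a 3-byte USB HID boot-protocol mouse report.

    byte 0: button bitmap (bit 0 = left, bit 1 = right, bit 2 = middle)
    byte 1: signed X displacement, clamped to -127..127
    byte 2: signed Y displacement, clamped to -127..127
    """
    clamp = lambda v: max(-127, min(127, int(v)))
    return struct.pack("Bbb", buttons & 0x07, clamp(dx), clamp(dy))

print(hid_mouse_report(20, 10).hex())  # 00140a
```

A device such as the glasses described here would write successive reports of this form to its USB HID endpoint; the host's generic mouse driver then interprets them as ordinary pointer movement, which is what lets the glasses control the game without any host-side software.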
CN201910228695.1A 2018-07-20 2019-03-25 Method and device for remotely controlling game view angle based on intelligent glasses Pending CN110732133A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2018108032013 2018-07-20
CN201810803201 2018-07-20

Publications (1)

Publication Number Publication Date
CN110732133A true CN110732133A (en) 2020-01-31

Family

ID=69236680

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910228695.1A Pending CN110732133A (en) 2018-07-20 2019-03-25 method and device for remotely controlling game view angle based on intelligent glasses

Country Status (1)

Country Link
CN (1) CN110732133A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024031614A1 (en) * 2022-08-12 2024-02-15 Tze Yuk MAK A human-interface device and a method for human-machine interface

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010062214A1 (en) * 2008-11-01 2010-06-03 Malygin Viktor Nikolaevich System for video games with a 2d/3d monitor
CN201945946U (en) * 2011-01-20 2011-08-24 叶尔肯·拜山 Head control mouse
CN205360552U (en) * 2016-01-05 2016-07-06 赵大同 Three -dimensional interactive game system of 3D bore hole
US20180357978A1 (en) * 2016-01-25 2018-12-13 Hiscene Information Technology Co., Ltd Method and devices used for implementing augmented reality interaction and displaying



Similar Documents

Publication Publication Date Title
CN110650354B (en) Live broadcast method, system, equipment and storage medium for virtual cartoon character
US11625103B2 (en) Integration of artificial reality interaction modes
US9939911B2 (en) Computer interface for remotely controlled objects and wearable articles with absolute pose detection component
CN106662925B (en) Multi-user gaze projection using head mounted display devices
EP3000020B1 (en) Hologram anchoring and dynamic positioning
EP2984541B1 (en) Near-plane segmentation using pulsed light source
TW202004421A (en) Eye tracking with prediction and late update to GPU for fast foveated rendering in an HMD environment
CN110647239A (en) Gesture-based projection and manipulation of virtual content in an artificial reality environment
US20130342572A1 (en) Control of displayed content in virtual environments
CN105393158A (en) Shared and private holographic objects
CN104380347A (en) Video processing device, video processing method, and video processing system
WO2013151947A1 (en) Touch sensitive user interface
CN110709897A (en) Shadow generation for image content inserted into an image
CN110806797A (en) Method and device for controlling game based on head movement
CN110732133A (en) method and device for remotely controlling game view angle based on intelligent glasses
US20230349693A1 (en) System and method for generating input data from pose estimates of a manipulated object by using light data and relative motion data
CN110732134A (en) intelligent glasses for games
CN110806798A (en) Control method and device based on HID protocol
CN110806796A (en) Control method and device
CN110806811A (en) Method and device for generating mouse control instruction through MPU
CN113467625A (en) Virtual reality control device, helmet and interaction method
CN116866541A (en) Virtual-real combined real-time video interaction system and method
WO2023133304A1 (en) Mobile device holographic calling with front and back camera capture
CN109316738A (en) A kind of human-computer interaction game system based on AR

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200131