CN112052725B - Interaction method and interaction system of intelligent glasses and intelligent shoes - Google Patents

Interaction method and interaction system of intelligent glasses and intelligent shoes

Info

Publication number
CN112052725B
CN112052725B (Application CN202010738595.6A)
Authority
CN
China
Prior art keywords
intelligent
control instruction
shoes
intelligent glasses
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010738595.6A
Other languages
Chinese (zh)
Other versions
CN112052725A (en
Inventor
蔡清来
许金泰
郭献招
杨鑫杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiangwei Zhilian Fujian Technology Co ltd
Original Assignee
Xiangwei Zhilian Fujian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiangwei Zhilian Fujian Technology Co ltd filed Critical Xiangwei Zhilian Fujian Technology Co ltd
Priority to CN202010738595.6A priority Critical patent/CN112052725B/en
Publication of CN112052725A publication Critical patent/CN112052725A/en
Application granted granted Critical
Publication of CN112052725B publication Critical patent/CN112052725B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • AHUMAN NECESSITIES
    • A43FOOTWEAR
    • A43BCHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
    • A43B13/00Soles; Sole-and-heel integral units
    • A43B13/14Soles; Sole-and-heel integral units characterised by the constructive form
    • A43B13/18Resilient soles
    • A43B13/20Pneumatic soles filled with a compressible fluid, e.g. air, gas
    • AHUMAN NECESSITIES
    • A43FOOTWEAR
    • A43BCHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
    • A43B3/00Footwear characterised by the shape or the use
    • A43B3/34Footwear characterised by the shape or the use with electrical or electronic arrangements
    • AHUMAN NECESSITIES
    • A43FOOTWEAR
    • A43BCHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
    • A43B3/00Footwear characterised by the shape or the use
    • A43B3/34Footwear characterised by the shape or the use with electrical or electronic arrangements
    • A43B3/36Footwear characterised by the shape or the use with electrical or electronic arrangements with light sources
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047Optimisation of routes or paths, e.g. travelling salesman problem
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Abstract

The application discloses an interaction method between intelligent glasses and intelligent shoes, comprising the following steps: when the intelligent glasses and the intelligent shoes establish network communication and enter an interaction mode, a first control instruction is acquired through the intelligent glasses; the scene category corresponding to the first control instruction is determined, and display content associated with the first control instruction is shown on a virtual reality interaction interface through the intelligent glasses; and if the user performs the action associated with the scene category, a second control instruction is generated and transmitted to the intelligent shoes, which receive it and execute the corresponding control action. The intelligent glasses connect wirelessly to shoes with different functions and send control commands to them, so that in different usage scenes of the glasses the user can make the corresponding intelligent shoes change accordingly, obtaining a better user experience.

Description

Interaction method and interaction system of intelligent glasses and intelligent shoes
Technical Field
The application relates to the technical field of wearable apparel, and in particular to an interaction method of intelligent glasses and intelligent shoes.
Background
With the development of technology, intelligent glasses are gradually entering people's lives. Intelligent glasses, like smart phones, have an independent operating system on which users can install software, games and other programs provided by software service providers. "Intelligent glasses" is the collective name for glasses that can complete functions such as map navigation, interaction with friends, taking photos and videos, and starting video calls with friends through voice or motion control, and that can access the Internet through a wireless network or a mobile communication network.
Running is a convenient, everyday form of physical exercise and an effective aerobic workout. Glasses that make running more scientific and more fun have already appeared, such as the "Recon Jet": while wearing them, the user can see sports data and external data at a glance. The Recon Jet has a built-in global positioning system, camera and maps, as well as a three-axis accelerometer, gyroscope and altimeter, plus Bluetooth and ANT+ sensors. When connected to a mobile phone, it can control music playback, answer calls, and so on.
However, such intelligent glasses generally operate independently and are not interconnected with running shoes; in particular, they lack interactive control of running shoes with special functions, such as inflatable shoes, vibrating shoes and luminous shoes.
Disclosure of Invention
The application aims to overcome the above defects and provide an interaction method and an interaction system for intelligent glasses and intelligent shoes.
In order to achieve the above object, the technical solution of the present application is:
an interaction method of intelligent glasses and intelligent shoes, comprising: when the intelligent glasses and the intelligent shoes establish network communication and enter an interaction mode, a first control instruction is acquired through the intelligent glasses; the scene category corresponding to the first control instruction is determined, and display content associated with the first control instruction is shown on the virtual reality interaction interface through the intelligent glasses; and if the user performs the action associated with the scene category, a second control instruction is generated and transmitted to the intelligent shoes, which receive the second control instruction and execute the corresponding control action.
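The two-stage flow above (first instruction selects a scene category; the scene category plus the user's associated action produces a second instruction for the shoes) can be sketched in Python as follows. All names, instruction strings and mappings are illustrative stand-ins, not part of the patent:

```python
# Illustrative sketch of the claimed two-stage interaction flow.
# The instruction vocabulary and action names are hypothetical.

SCENE_OF_INSTRUCTION = {
    "navigate to park": "map_navigation",
    "play music": "audio_video",
}

# (scene category, user's associated action) -> second control instruction
SHOE_ACTION_OF_SCENE = {
    ("map_navigation", "running"): "inflate_and_light",
    ("audio_video", "dancing"): "light_and_vibrate",
    ("audio_video", "resting"): "vibrate_only",
}

def handle_first_instruction(instruction: str, user_action: str):
    """Determine the scene category for a first control instruction and,
    if the user performs the action associated with that scene, return
    the second control instruction to transmit to the smart shoes."""
    scene = SCENE_OF_INSTRUCTION.get(instruction)
    if scene is None:
        return None, None
    second = SHOE_ACTION_OF_SCENE.get((scene, user_action))
    return scene, second

scene, cmd = handle_first_instruction("navigate to park", "running")
```

Here the dictionaries stand in for whatever recognition and scene logic the glasses actually run; the point is only that the shoe command depends jointly on the scene category and the sensed user action.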
Preferably, the first control instruction includes any one of a voice control signal, a touch screen control signal and a gesture control signal.
Preferably, the intelligent glasses and the intelligent shoes establish network communication through Bluetooth, UWB, IBEACON, BLE MESH and/or a wireless AP.
Preferably, the scene category includes any one of map navigation, interaction with friends, photo and video taking, video and audio appreciation, and video conversation with friends.
Preferably, the map navigation allows the user to view the current road name and current traffic conditions, and to perform route planning according to an input address.
Preferably, when the scene category is map navigation, the intelligent glasses judge whether the user is in a running state; if so, a second control instruction is generated and transmitted to the intelligent shoes, and on receiving it the intelligent shoes perform the inflation/deflation and lighting actions.
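The patent only states that an acceleration sensor detects whether the user is running; one plausible way to do that is to estimate step cadence from the magnitude of the acceleration signal. The thresholds and cadence cutoff below are illustrative assumptions, not values from the patent:

```python
# Hypothetical running detector: counts foot-strike peaks in a window of
# acceleration magnitudes and classifies "running" by step cadence.
# All thresholds are illustrative, not taken from the patent.

def is_running(accel_magnitudes, sample_hz=50, g=9.8):
    """Return True if the acceleration window (magnitudes in m/s^2)
    shows a step cadence consistent with running."""
    steps = 0
    above = False
    for a in accel_magnitudes:
        if a > 1.5 * g and not above:   # impact peak of a foot strike
            steps += 1
            above = True
        elif a < 1.1 * g:               # back near rest: re-arm detector
            above = False
    window_s = len(accel_magnitudes) / sample_hz
    cadence = steps / window_s * 60.0 if window_s else 0.0
    return cadence >= 120.0             # ~120+ steps/min suggests running
```

A real implementation would filter the signal and debounce over several windows; this sketch only shows the shape of the decision the glasses (or shoes) would make before generating the second control instruction.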
Preferably, the intelligent glasses comprise:
a first acquisition module, used for acquiring a control instruction of the user, wherein the control instruction contains target information, and the target information includes the position or type of the destination the user needs to reach;
a second acquisition module, used for acquiring the starting position of the user and planning a path from the starting position to the destination position;
a third acquisition module, used for acquiring a live-action image of the real scene seen by the user through the intelligent glasses, identifying a preset image area in the live-action image through image processing, and determining the preset scene area in the real scene corresponding to that preset image area;
a projection module, used for projecting a direction marker of the path in the preset scene area through augmented reality, wherein the direction marker indicates the direction or track to be travelled from the user's current position to the destination position along the path, and the current position lies on the path between the starting position and the destination position.
preferably, the smart glasses further comprise
The acceleration sensor is used for sensing the motion state of a user and detecting whether the user runs or not;
and the wireless transmission module is used for being connected with the intelligent shoes in a wireless way and controlling the intelligent shoes to be inflated or luminous.
An interaction system of intelligent glasses and intelligent shoes comprises: a wireless transmission module, used for establishing network communication between the intelligent glasses and the intelligent shoes by means of Bluetooth, UWB, iBeacon, BLE Mesh and/or a wireless AP; an acquisition module, used for acquiring a first control instruction of the user, the first control instruction including a voice control signal, a touch screen control signal, a gesture control signal or the like; a determining module, used for determining the scene category corresponding to the control instruction, the scene category including any one of map navigation, interaction with friends, taking photos and videos, and video calls with friends; and an interaction module, used for enabling the intelligent shoes to receive the control instruction and feedback information from the interaction interface and to perform the related actions.
By adopting the above technical scheme, the application has the following beneficial effects: the intelligent glasses connect wirelessly to shoes with different functions and send control commands to them, so that in different usage scenes of the glasses the user can make the corresponding intelligent shoes change accordingly, obtaining a better user experience.
Drawings
FIG. 1 is a schematic flow chart of an interaction method of intelligent glasses and intelligent shoes according to the present application;
fig. 2 is a schematic flow chart of an interaction method of intelligent glasses and intelligent shoes according to an embodiment of the application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the following detailed description. It should be understood that the detailed description is presented merely to illustrate the application, and is not intended to limit the application.
In addition, in the description of the present application, it should be understood that the terms "center", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "axial", "radial", "circumferential", etc. indicate orientations or positional relationships based on the drawings, are merely for convenience in describing the present application and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present application.
As shown in fig. 1, the interaction method of intelligent glasses and intelligent shoes includes: the intelligent glasses and the intelligent shoes establish network communication by means of Bluetooth, UWB, iBeacon, BLE Mesh, a wireless AP (WIFI AP) or the like; when the glasses enter an interaction mode, a first control instruction is acquired through the intelligent glasses; the scene category corresponding to the first control instruction is determined, and the display content associated with the first control instruction is shown on the virtual reality interaction interface through the intelligent glasses; the scene category includes any one of map navigation, interaction with friends, taking photos and videos, and video calls with friends; if the user performs the action associated with a specific scene category, a second control instruction is generated and transmitted to the intelligent shoes, which receive the second control instruction and execute the corresponding control action.
As shown in fig. 2, this embodiment mainly describes the case where the specific scene category is map navigation and the associated action is running, but the method is not limited to the map navigation scene. When the specific scene is map navigation, the intelligent glasses comprise:
a first acquisition module, used for acquiring a control instruction of the user, wherein the control instruction contains target information, and the target information includes the position or type of the destination the user needs to reach; the first acquisition module comprises an analysis module, used for identifying the target information through a semantic analysis algorithm and determining the destination position according to the position or type of the destination contained in the target information;
a second acquisition module, used for acquiring the starting position of the user and planning a path from the starting position to the destination position;
a third acquisition module, used for acquiring a live-action image of the real scene seen by the user through the intelligent glasses, identifying a preset image area in the live-action image through image processing, and determining the preset scene area in the real scene corresponding to that preset image area;
a projection module, used for projecting a direction marker of the path in the preset scene area through augmented reality, wherein the direction marker indicates the direction or track to be travelled from the user's current position to the destination position along the path, and the current position lies on the path between the starting position and the destination position;
the acceleration sensor is used for sensing the motion state of a user and detecting whether the user runs or not; the acceleration sensor can also be arranged on the intelligent shoes and used for sensing the motion state of a user, detecting whether running is performed or not, and transmitting the detected signals to the intelligent glasses.
And the wireless transmission module is used for being connected with the shoes in a wireless way and controlling the inflation of the inflatable shoes or the luminescence of the luminous shoes.
The intelligent shoes are common intelligent shoes on the market, comprising at least a wireless transmission module, used for connecting wirelessly with the intelligent glasses and receiving the second control instruction, together with an airbag inflation/deflation module, a light-emitting module, or both; they may also have other functions such as vibration.
After the intelligent glasses and the intelligent shoes establish network communication through Bluetooth, the user inputs the name of a place by voice; the analysis module in the first acquisition module of the intelligent glasses identifies the target information through a semantic analysis algorithm and determines the destination position according to the position or type of the destination contained in the target information; a path from the starting position to the destination position is planned, and a direction marker of the path is projected using augmented reality, the direction marker indicating the direction or track to be travelled from the user's current position to the destination position along the path.
At this moment, the acceleration sensor of the intelligent glasses senses the motion state of the user and detects whether the user is running. Only when both conditions are met, a planned route exists and the accelerometer senses running, do the intelligent glasses transmit the inflation or lighting command; likewise, the glasses transmit the command to stop inflation or lighting only after both conditions cease to be met, which prevents false triggering by the user, for example during an occasional pause while running. The inflatable shoes can also rely on a pressure sensor to adjust comfort automatically. After the user finishes running, the accelerometer senses the end-of-run state and the user terminates the route planning; the intelligent glasses then transmit a stop-inflation or stop-lighting command to the inflatable or luminous shoes so that they stop inflating or emitting light. Because the functional shoes are switched on automatically to inflate or emit light when a route is planned, they are fashionable and convenient, and can wrap the feet better or improve safety when running at night.
In addition, the acceleration sensor can also be arranged on the intelligent shoes, which then sense the motion state of the user through the acceleration sensor and detect whether the user is running.
When the specific scene category is video and audio appreciation and the associated action is dancing, the intelligent glasses judge whether the user is dancing during playback; if the user performs the dancing associated action, a second control instruction is generated and transmitted to the intelligent shoes to control their lighting and vibration, and the user obtains a better experience from light flicker matched to the rhythm of the music. The same scene category can have different associated actions: if the associated action is resting, a second control instruction is generated and transmitted to the intelligent shoes to control only their vibration, massaging the sole. The associated action is not limited to a specific scene category and can be changed according to the user's requirements.
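Matching the light flicker to the rhythm of the music, as in the dancing scene above, amounts to scheduling one flash per beat. The patent does not say how this synchronisation works; the sketch below simply assumes a known tempo (BPM) and derives flash times from it:

```python
# Hypothetical beat-synchronised flash scheduler for the dancing scene.
# Assumes the music tempo (BPM) is already known; the patent does not
# specify how tempo is obtained or how commands are timestamped.

def flash_schedule(bpm: float, duration_s: float):
    """Return the times (in seconds) at which the shoes should flash so
    that the light blinks once per beat for the given duration."""
    if bpm <= 0 or duration_s <= 0:
        return []
    interval = 60.0 / bpm
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 3))
        t += interval
    return times
```

The glasses would then transmit each flash as a lighting command at the scheduled time (or send the whole schedule ahead, if the shoes can buffer it, a design choice the patent leaves open).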
The descriptions of the foregoing embodiments each have their own emphasis; for parts not described or detailed in one embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of modules or elements described above is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The foregoing description is only illustrative of the preferred embodiments of the present application and is not intended to limit the scope of the application, which is defined by the appended claims.

Claims (10)

1. An interaction method of intelligent glasses and intelligent shoes, characterized by comprising the following steps: when the intelligent glasses and the intelligent shoes establish network communication and enter an interaction mode, acquiring a first control instruction through the intelligent glasses; determining a scene category corresponding to the first control instruction, and displaying display content associated with the first control instruction on a virtual reality interaction interface through the intelligent glasses; and if the user performs the action associated with the scene category, generating a second control instruction and transmitting it to the intelligent shoes, wherein the intelligent shoes receive the second control instruction and execute the corresponding control action.
2. The method for interacting the intelligent glasses with the intelligent shoes according to claim 1, wherein: the first control instruction comprises any one of a voice control signal, a touch screen control signal and a gesture control signal.
3. The method for interacting the intelligent glasses with the intelligent shoes according to claim 1, wherein: the intelligent glasses and the intelligent shoes establish network communication in a Bluetooth, UWB, IBEACON, BLE MESH and/or wireless AP mode.
4. The method for interacting the intelligent glasses with the intelligent shoes according to claim 1, wherein: the scene category comprises any one of map navigation, interaction with friends, taking photos and videos, video and audio appreciation, and video calls with friends.
5. The method for interacting the intelligent glasses and the intelligent shoes according to claim 4, wherein: the map navigation allows viewing the current road name and current traffic conditions, and performing path planning according to an input address.
6. The method for interacting the intelligent glasses with the intelligent shoes according to claim 1, wherein: the related actions comprise any one of running, dancing and resting.
7. The method for interacting the intelligent glasses with the intelligent shoes according to claim 1, wherein: when the scene category is map navigation and the associated action is running, the intelligent glasses judge whether the user is in a running state; if so, a second control instruction is generated and transmitted to the intelligent shoes, and on receiving the second control instruction the intelligent shoes perform the inflation/deflation action and the light-emitting action.
8. The method for interacting with the intelligent glasses and the intelligent shoes according to claim 7, wherein: the intelligent glasses comprise
The first acquisition module is used for acquiring a control instruction of a user, wherein the control instruction comprises target information, and the target information comprises the position or type of a destination which the user needs to reach;
the second acquisition module is used for acquiring the initial position of the user and planning a path from the initial position to the destination position;
the third acquisition module is used for acquiring a live-action image of a real scene seen by the user through the intelligent glasses, identifying a preset image area in the live-action image through image processing of the live-action image, and determining a preset scene area corresponding to the preset image area in the real scene according to the preset image area;
the projection module is used for projecting a direction mark of the path in the preset scene area through an augmented reality technology, wherein the direction mark is used for indicating a direction or a track to be travelled from a current position of a user to the destination position along the path, and the current position is positioned between a starting position of the user and the path of the destination position.
9. The method for interacting the intelligent glasses and the intelligent shoes according to claim 8, wherein: the intelligent glasses further comprise
The acceleration sensor is used for sensing the motion state of a user and detecting whether the user runs or not;
and the wireless transmission module is used for being connected with the intelligent shoes in a wireless way and controlling the intelligent shoes to be inflated and/or luminous.
10. An interaction system of intelligent glasses and intelligent shoes, characterized by comprising: a wireless transmission module for establishing network communication between the intelligent glasses and the intelligent shoes; an acquisition module for acquiring a control instruction of a user; a determining module for determining the scene category corresponding to the control instruction; and an interaction module for enabling the intelligent shoes to receive the control instruction and feedback information from the interaction interface and perform the related actions.
CN202010738595.6A 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes Active CN112052725B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010738595.6A CN112052725B (en) 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes


Publications (2)

Publication Number Publication Date
CN112052725A CN112052725A (en) 2020-12-08
CN112052725B true CN112052725B (en) 2023-09-01

Family

ID=73602511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010738595.6A Active CN112052725B (en) 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes

Country Status (1)

Country Link
CN (1) CN112052725B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112587378B (en) * 2020-12-11 2022-06-07 中国科学院深圳先进技术研究院 Exoskeleton robot footprint planning system and method based on vision and storage medium
CN112731839A (en) * 2020-12-25 2021-04-30 深圳市倍轻松科技股份有限公司 Linkage control method and system between intelligent devices and computer storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104407697A (en) * 2014-11-17 2015-03-11 联想(北京)有限公司 Information processing method and wearing type equipment
CN106858883A (en) * 2017-03-03 2017-06-20 厦门精图信息技术有限公司 VR experience footwear based on panoramic table
KR101757377B1 (en) * 2016-04-06 2017-07-12 한국기계연구원 Smart shoes having function of information transfer and Method of providing information using the same
CN110038274A (en) * 2019-05-21 2019-07-23 福建工程学院 A kind of wired home nobody instruct body building method
US10542385B1 (en) * 2019-01-09 2020-01-21 International Business Machines Corporation Location determination using device coordination

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017119642A1 (en) * 2016-01-05 2017-07-13 엘지전자 주식회사 Smart shoe and method for processing data therefor
CN106027796B (en) * 2016-06-30 2019-11-05 京东方科技集团股份有限公司 A kind of information processing method, terminal and running shoes
US10639540B2 (en) * 2017-05-09 2020-05-05 Google Llc Augmented and/or virtual reality footwear
US10824132B2 (en) * 2017-12-07 2020-11-03 Saudi Arabian Oil Company Intelligent personal protective equipment
US11122852B2 (en) * 2018-05-31 2021-09-21 Nike, Inc. Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback


Also Published As

Publication number Publication date
CN112052725A (en) 2020-12-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant