CN112052725A - Interaction method and interaction system of intelligent glasses and intelligent shoes

Interaction method and interaction system of intelligent glasses and intelligent shoes

Info

Publication number
CN112052725A
CN112052725A CN202010738595.6A CN202010738595A CN112052725A CN 112052725 A CN112052725 A CN 112052725A CN 202010738595 A CN202010738595 A CN 202010738595A CN 112052725 A CN112052725 A CN 112052725A
Authority
CN
China
Prior art keywords
intelligent
control instruction
glasses
shoes
interaction
Prior art date
Legal status
Granted
Application number
CN202010738595.6A
Other languages
Chinese (zh)
Other versions
CN112052725B (en)
Inventor
蔡清来
许金泰
郭献招
杨鑫杰
Current Assignee
Xiangwei Zhilian Fujian Technology Co ltd
Original Assignee
Xiangwei Zhilian Fujian Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xiangwei Zhilian Fujian Technology Co ltd filed Critical Xiangwei Zhilian Fujian Technology Co ltd
Priority to CN202010738595.6A
Publication of CN112052725A
Application granted
Publication of CN112052725B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • A HUMAN NECESSITIES
    • A43 FOOTWEAR
    • A43B CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
    • A43B13/00 Soles; Sole-and-heel integral units
    • A43B13/14 Soles; Sole-and-heel integral units characterised by the constructive form
    • A43B13/18 Resilient soles
    • A43B13/20 Pneumatic soles filled with a compressible fluid, e.g. air, gas
    • A HUMAN NECESSITIES
    • A43 FOOTWEAR
    • A43B CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
    • A43B3/00 Footwear characterised by the shape or the use
    • A43B3/34 Footwear characterised by the shape or the use with electrical or electronic arrangements
    • A HUMAN NECESSITIES
    • A43 FOOTWEAR
    • A43B CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
    • A43B3/00 Footwear characterised by the shape or the use
    • A43B3/34 Footwear characterised by the shape or the use with electrical or electronic arrangements
    • A43B3/36 Footwear characterised by the shape or the use with electrical or electronic arrangements with light sources
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047 Optimisation of routes or paths, e.g. travelling salesman problem
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Strategic Management (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Game Theory and Decision Science (AREA)
  • Software Systems (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Development Economics (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optics & Photonics (AREA)
  • Computational Linguistics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an interaction method for intelligent glasses and intelligent shoes, which comprises the following steps: when network communication is established between the intelligent glasses and the intelligent shoes and an interaction mode is entered, a first control instruction is obtained through the intelligent glasses; once the scene category corresponding to the first control instruction is determined, display content associated with the first control instruction is shown on the virtual reality interaction interface of the intelligent glasses; and if the user is performing the action associated with that scene category, a second control instruction is generated and transmitted to the intelligent shoes, which receive the second control instruction and carry out the corresponding control action. The intelligent glasses connect wirelessly to the functional shoes and send control commands to them. In the different practical scenes presented by the intelligent glasses, the user can control the intelligent shoes to make the corresponding changes, obtaining a better user experience.

Description

Interaction method and interaction system of intelligent glasses and intelligent shoes
Technical Field
The invention relates to the technical field of garment materials, in particular to an interaction method of intelligent glasses and intelligent shoes.
Background
With the development of technology, smart glasses are gradually entering people's lives. Smart glasses are the general name for glasses that, like a smartphone, have an independent operating system, allow users to install programs such as software and games provided by software service providers, can complete functions such as map navigation, interaction with friends, photo and video shooting, and video calls with friends through voice or motion control, and can access a wireless network through a mobile communication network.
Running is a convenient everyday form of physical exercise and an effective mode of aerobic training. Glasses that make running more scientific and more fun, such as the "Recon Jet", are already available; when worn, they display exercise data and external data directly in the wearer's view. The Recon Jet has a built-in global positioning system, camera and map, as well as a three-axis accelerometer, gyroscope and altimeter, plus Bluetooth and ANT+ sensor connectivity. By connecting to a mobile phone, it can also control music playback, answer calls, and perform similar functions.
However, such smart glasses generally operate on their own and are not interconnected with running shoes; in particular, they lack interactive control over running shoes with special functions, such as inflatable shoes, vibrating shoes and luminous shoes.
Disclosure of Invention
The invention aims to overcome the above defects and provides an interaction method and an interaction system for intelligent glasses and intelligent shoes.
In order to achieve this purpose, the technical solution of the invention is as follows:
An interaction method of intelligent glasses and intelligent shoes comprises the following steps: when network communication is established between the intelligent glasses and the intelligent shoes and an interaction mode is entered, a first control instruction is obtained through the intelligent glasses; once the scene category corresponding to the first control instruction is determined, display content associated with the first control instruction is shown on the virtual reality interaction interface of the intelligent glasses; and if the user is performing the action associated with that scene category, a second control instruction is generated and transmitted to the intelligent shoes, which receive the second control instruction and carry out the corresponding control action.
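To make this flow concrete, the following is a minimal sketch in Python; all object, method and command names here (interaction_loop, get_control_instruction, build_shoe_command and so on) are illustrative assumptions, not interfaces defined by the invention.

```python
# Illustrative sketch of the claimed interaction flow; names are assumptions, not the patent's API.
def interaction_loop(glasses, shoe):
    glasses.connect(shoe)                    # establish network communication (e.g. Bluetooth)
    glasses.enter_interaction_mode()

    first_instruction = glasses.get_control_instruction()   # voice / touch screen / gesture
    scene = glasses.determine_scene(first_instruction)      # e.g. "map_navigation"
    if scene is None:
        return

    glasses.display(scene.content_for(first_instruction))   # show on the VR interaction interface

    if glasses.user_in_associated_action(scene):              # e.g. running while navigating
        second_instruction = scene.build_shoe_command()
        glasses.transmit(shoe, second_instruction)             # second control instruction
        shoe.execute(second_instruction)                       # inflate/deflate, emit light, vibrate, ...
```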
Preferably, the first control instruction includes any one of a voice control signal, a touch screen control signal and a gesture control signal.
Preferably, the intelligent glasses and the intelligent shoes establish network communication via Bluetooth, UWB, iBeacon, BLE Mesh and/or a Wi-Fi AP.
Preferably, the scene category includes any one of map navigation, interaction with friends, photo and video shooting, video and audio appreciation, and video conversation with friends.
Preferably, the map navigation allows the user to view the current road name and current traffic conditions, and to plan a route according to an input address.
Preferably, when the scene category is map navigation, the intelligent glasses judge whether the user is in a running state; if so, a second control instruction is generated and transmitted to the intelligent shoes, and on receiving the second control instruction the intelligent shoes perform inflation and deflation actions and light-emitting actions.
Preferably, the intelligent glasses comprise:
a first acquisition module, used for acquiring a control instruction of the user, wherein the control instruction comprises target information and the target information comprises the position or type of a destination the user needs to reach;
a second acquisition module, used for acquiring the starting position of the user and planning a path from the starting position to the destination position;
a third acquisition module, used for acquiring a live-action image of the real scene seen by the user through the intelligent glasses, identifying a preset image area in the live-action image by image processing, and determining the preset scene area in the real scene that corresponds to the preset image area;
and a projection module, used for projecting a direction identifier of the path onto the preset scene area by means of augmented reality, wherein the direction identifier indicates the direction or track to be traveled along the path from the user's current position to the destination position, and the current position lies on the path between the starting position and the destination position.
preferably, the smart glasses further comprise
The acceleration sensor is used for sensing the motion state of the user and detecting whether the user runs;
and the wireless transmission module is used for being wirelessly connected with the intelligent shoes and controlling the intelligent shoes to inflate or emit light.
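The module breakdown above could be organised roughly as the following skeleton. This is only a schematic sketch under the assumption of simple in-process modules; every class and method name is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Path:
    start: tuple                                      # starting position of the user
    destination: tuple                                # destination position resolved from the control instruction
    waypoints: list = field(default_factory=list)     # planned route between them

class FirstAcquisitionModule:
    def get_instruction(self):
        """Acquire the user's control instruction, including the destination's position or type."""
        ...

class SecondAcquisitionModule:
    def plan_path(self, start, destination) -> Path:
        """Acquire the starting position and plan a path to the destination position."""
        ...

class ThirdAcquisitionModule:
    def locate_scene_region(self, live_image):
        """Identify the preset image area in the live-action image and map it to the real scene."""
        ...

class ProjectionModule:
    def project_direction(self, path: Path, scene_region):
        """Project an AR direction identifier for the path onto the preset scene area."""
        ...

class AccelerationSensor:
    def is_running(self) -> bool:
        """Sense the user's motion state and report whether the user is running."""
        ...

class WirelessModule:
    def send(self, command):
        """Send an inflation or light-emitting command to the paired intelligent shoes."""
        ...
```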
An interaction system of intelligent glasses and intelligent shoes comprises a wireless transmission module, an acquisition module and a determining module. The wireless transmission module is used for establishing network communication between the intelligent glasses and the intelligent shoes, via Bluetooth, UWB, iBeacon, BLE Mesh and/or a Wi-Fi AP; the acquisition module is used for acquiring a first control instruction of the user, where the first control instruction comprises a voice control signal, a touch screen control signal, a gesture control signal or the like; the determining module is used for determining the scene category corresponding to the control instruction, where the scene category comprises any one of map navigation, interaction with friends, photo and video shooting, and video conversation with friends; and the intelligent shoes receive the control instruction and the feedback information of the interaction interface and perform the associated actions.
By adopting the above technical scheme, the invention has the following beneficial effects: the intelligent glasses connect wirelessly to the functional shoes and send control commands to them. In the different practical scenes presented by the intelligent glasses, the user can control the intelligent shoes to make the corresponding changes, obtaining a better user experience.
Drawings
FIG. 1 is a schematic flow chart illustrating a method for interacting smart glasses with smart shoes according to the present application;
FIG. 2 is a schematic flow chart of an interaction method between smart glasses and smart shoes according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the detailed description and specific examples, while indicating the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
In addition, in the description of the present invention, it is to be understood that the terms "center", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "axial", "radial", "circumferential", etc., indicate orientations and positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention.
As shown in FIG. 1, an interaction method of intelligent glasses and intelligent shoes comprises: establishing network communication between the intelligent glasses and the intelligent shoes via Bluetooth, UWB, iBeacon, BLE Mesh and/or a Wi-Fi AP, and, when an interaction mode is entered, obtaining a first control instruction through the intelligent glasses; once the scene category corresponding to the first control instruction is determined, displaying the display content associated with the first control instruction on the virtual reality interaction interface of the intelligent glasses, where the scene category comprises any one of map navigation, interaction with friends, photo and video shooting, and video conversation with friends; and, if the user performs the action associated with the specific scene category, generating a second control instruction and transmitting it to the intelligent shoes, which receive the second control instruction and carry out the corresponding control action.
As shown in FIG. 2, the present embodiment mainly describes the case in which the specific scene category is map navigation and the associated action is running, although the method is not limited to the map navigation scene. When the specific scene is map navigation, the intelligent glasses comprise:
a first acquisition module, used for acquiring a control instruction of the user, wherein the control instruction comprises target information and the target information comprises the position or type of a destination the user needs to reach; the first acquisition module comprises an analysis module, which identifies the target information through a semantic analysis algorithm and determines the destination position from the position or type of destination the user needs to reach;
a second acquisition module, used for acquiring the starting position of the user and planning a path from the starting position to the destination position;
a third acquisition module, used for acquiring a live-action image of the real scene seen by the user through the intelligent glasses, identifying a preset image area in the live-action image by image processing, and determining the preset scene area in the real scene that corresponds to the preset image area;
a projection module, used for projecting a direction identifier of the path onto the preset scene area by means of augmented reality, wherein the direction identifier indicates the direction or track to be traveled along the path from the user's current position to the destination position, and the current position lies on the path between the starting position and the destination position;
an acceleration sensor, used for sensing the motion state of the user and detecting whether the user is running; the acceleration sensor may instead be arranged on the intelligent shoes, sensing the user's motion state, detecting whether the user is running, and transmitting the detected signal to the intelligent glasses;
and a wireless transmission module, used for connecting wirelessly to the shoes and controlling the inflation of inflatable shoes or the light emission of luminous shoes.
The intelligent shoes are ordinary intelligent shoes available on the market and comprise at least a wireless transmission module, which connects wirelessly to the intelligent glasses and receives the second control instruction, together with an airbag inflation-and-deflation module, a light-emitting module, or both; they may also provide other functions such as vibration.
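On the shoe side, handling of a received second control instruction might look like the sketch below; the command dictionary format and the module interfaces are assumptions made for illustration only.

```python
class SmartShoe:
    """Hypothetical shoe-side controller; module names are assumptions, not the patent's terms."""

    def __init__(self, airbag, leds, motor=None):
        self.airbag = airbag      # airbag inflation-and-deflation module
        self.leds = leds          # light-emitting module
        self.motor = motor        # optional vibration module

    def on_command(self, command: dict):
        """Handle a second control instruction received over the wireless link."""
        action = command.get("action")
        if action == "inflate":
            self.airbag.inflate(target_pressure=command.get("pressure"))
        elif action == "deflate":
            self.airbag.deflate()
        elif action == "light_on":
            self.leds.on(pattern=command.get("pattern", "steady"))
        elif action == "light_off":
            self.leds.off()
        elif action == "vibrate" and self.motor is not None:
            self.motor.vibrate(intensity=command.get("intensity", 0.5))
```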
After network communication is established between the intelligent glasses and the intelligent shoes through Bluetooth, the user speaks the name of a place. The analysis module within the first acquisition module of the intelligent glasses identifies the target information through a semantic analysis algorithm and determines the position or type of the destination the user needs to reach; a path is then planned from the starting position to the destination position, and a direction identifier of the path is projected using augmented reality, where the direction identifier indicates the direction or track to be traveled along the path from the user's current position to the destination position.
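Chained together, the steps just described form a navigation pipeline along the following lines; the helper attributes (analysis, planner, vision, projector and so on) are hypothetical names, not components specified by the invention.

```python
def navigate(glasses, voice_text):
    # First acquisition + analysis module: resolve the spoken place name to a destination position.
    destination = glasses.analysis.parse_destination(voice_text)   # semantic analysis (assumed helper)

    # Second acquisition module: plan a path from the current starting position.
    start = glasses.locator.current_position()
    path = glasses.planner.plan_path(start, destination)

    # Third acquisition module: find the preset scene region in the live view the user sees.
    live_image = glasses.camera.capture()
    scene_region = glasses.vision.locate_scene_region(live_image)

    # Projection module: overlay the direction identifier onto that region with augmented reality.
    glasses.projector.project_direction(path, scene_region)
    return path
```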
At this point, the acceleration sensor of the intelligent glasses senses the motion state of the user and detects whether the user is running. Once both conditions are met, i.e. a route has been planned and running has been sensed, the intelligent glasses transmit an inflation or light-emitting command. The inflatable shoes can also rely on a pressure sensor to adjust their comfort level automatically. When the user finishes running, the acceleration sensor senses the end of running and the user ends the route planning; at this moment the intelligent glasses transmit a command to the inflatable or luminous shoes to stop inflating or emitting light. The functional shoes are thus switched automatically into an inflated or luminous state whenever a route is planned, which is fashionable and convenient and serves to wrap the feet or improve safety when running at night.
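The start/stop behaviour described here, in which inflation and lighting follow both the planned route and the sensed running state, could be sketched as follows; the accelerometer threshold and the one-second polling interval are placeholder assumptions.

```python
import time

# Assumed accelerometer magnitude (m/s^2) treated as "running"; a real detector would average over a window.
RUN_THRESHOLD = 12.0

def navigation_session(glasses, shoe, path):
    running = False
    while not path.finished():
        if glasses.accelerometer.read_magnitude() > RUN_THRESHOLD:
            if not running:
                running = True
                glasses.wireless.send(shoe, {"action": "inflate"})    # wrap the foot once the run starts
                glasses.wireless.send(shoe, {"action": "light_on"})   # improve visibility for night running
        else:
            running = False
        time.sleep(1.0)

    # Route planning has ended and the run is over: stop inflating and switch the lights off.
    glasses.wireless.send(shoe, {"action": "deflate"})
    glasses.wireless.send(shoe, {"action": "light_off"})
```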
In addition, the acceleration sensor can also be arranged on the intelligent shoes, which then sense the motion state of the user through the acceleration sensor and detect whether the user is running.
When the specific scene category is audio and video appreciation and the associated action is dancing, the intelligent glasses judge during playback whether the user is dancing; if the user is performing the associated dancing action, a second control instruction is generated and transmitted to the intelligent shoes, controlling them to emit light and vibrate, so that the flashing of the lights matches the rhythm of the music and the user gets a better experience. If the associated action is resting, a second control instruction is generated and transmitted to the intelligent shoes, controlling only their vibration so as to massage the soles of the feet. The associated action is not limited to a specific scene category and can be changed according to the user's requirements.
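The mapping from scene category and associated action to shoe behaviour might be expressed as a small dispatch table, as in the hypothetical sketch below; the command fields are assumptions and the beat synchronisation is only indicated by a flag.

```python
from typing import Optional

def shoe_command_for(scene: str, associated_action: str) -> Optional[dict]:
    """Map a (scene, associated action) pair to a second control instruction; purely illustrative."""
    if scene == "audio_video" and associated_action == "dancing":
        # Emit light and vibrate, flashing in time with the rhythm of the music.
        return {"action": "light_and_vibrate", "sync": "music_beat"}
    if scene == "audio_video" and associated_action == "resting":
        # Vibration only, used to massage the soles while the user rests.
        return {"action": "vibrate", "intensity": 0.3}
    if scene == "map_navigation" and associated_action == "running":
        return {"action": "inflate_and_light"}
    return None   # no shoe action for other combinations
```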
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above description is only a preferred embodiment of the present invention, and should not be taken as limiting the scope of the invention, and all equivalent variations and modifications made in the claims of the present invention should be included in the scope of the present invention.

Claims (10)

1. An interaction method of intelligent glasses and intelligent shoes, characterized in that the method comprises the following steps: when network communication is established between the intelligent glasses and the intelligent shoes and an interaction mode is entered, obtaining a first control instruction through the intelligent glasses; once the scene category corresponding to the first control instruction is determined, displaying display content associated with the first control instruction on the virtual reality interaction interface through the intelligent glasses; and, if the user is performing the action associated with that scene category, generating a second control instruction and transmitting it to the intelligent shoes, which receive the second control instruction and carry out the corresponding control action.
2. The interaction method of intelligent glasses and intelligent shoes according to claim 1, characterized in that: the first control instruction comprises any one of a voice control signal, a touch screen control signal and a gesture control signal.
3. The interaction method of intelligent glasses and intelligent shoes according to claim 1, characterized in that: the intelligent glasses and the intelligent shoes establish network communication via Bluetooth, UWB, iBeacon, BLE Mesh and/or a Wi-Fi AP.
4. The interaction method of intelligent glasses and intelligent shoes according to claim 1, characterized in that: the scene category comprises any one of map navigation, interaction with friends, photo and video shooting, audio and video appreciation, and video calls with friends.
5. The interaction method of intelligent glasses and intelligent shoes according to claim 4, characterized in that: the map navigation allows the user to view the current road name and current traffic conditions, and to plan a route according to an input address.
6. The interaction method of intelligent glasses and intelligent shoes according to claim 1, characterized in that: the associated action comprises any one of running, dancing and resting.
7. The interaction method of intelligent glasses and intelligent shoes according to claim 1, characterized in that: when the scene category is map navigation and the associated action is running, the intelligent glasses judge whether the user is in a running state; if so, a second control instruction is generated and transmitted to the intelligent shoes, and when the intelligent shoes receive the second control instruction they carry out the inflation and deflation action and the light-emitting action.
8. The interaction method of intelligent glasses and intelligent shoes according to claim 7, characterized in that the intelligent glasses comprise:
a first acquisition module, used for acquiring a control instruction of the user, wherein the control instruction comprises target information and the target information comprises the position or type of a destination the user needs to reach;
a second acquisition module, used for acquiring the starting position of the user and planning a path from the starting position to the destination position;
a third acquisition module, used for acquiring a live-action image of the real scene seen by the user through the intelligent glasses, identifying a preset image area in the live-action image by image processing, and determining the preset scene area in the real scene that corresponds to the preset image area;
and a projection module, used for projecting a direction identifier of the path onto the preset scene area by means of augmented reality, wherein the direction identifier indicates the direction or track to be traveled along the path from the user's current position to the destination position, and the current position lies on the path between the starting position and the destination position.
9. The interaction method of intelligent glasses and intelligent shoes according to claim 8, characterized in that the intelligent glasses further comprise:
an acceleration sensor, used for sensing the motion state of the user and detecting whether the user is running;
and a wireless transmission module, used for connecting wirelessly to the intelligent shoes and controlling their inflation and/or light emission.
10. An interaction system of intelligent glasses and intelligent shoes, characterized in that it comprises: a wireless transmission module, used for establishing network communication between the intelligent glasses and the intelligent shoes; an acquisition module, used for acquiring a control instruction of the user; a determining module, used for determining the scene category corresponding to the control instruction; and the intelligent shoes, which receive the control instruction and the feedback information of the interaction interface and perform the associated actions.
CN202010738595.6A 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes Active CN112052725B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010738595.6A CN112052725B (en) 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010738595.6A CN112052725B (en) 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes

Publications (2)

Publication Number Publication Date
CN112052725A true CN112052725A (en) 2020-12-08
CN112052725B CN112052725B (en) 2023-09-01

Family

ID=73602511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010738595.6A Active CN112052725B (en) 2020-07-28 2020-07-28 Interaction method and interaction system of intelligent glasses and intelligent shoes

Country Status (1)

Country Link
CN (1) CN112052725B (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104407697A (en) * 2014-11-17 2015-03-11 联想(北京)有限公司 Information processing method and wearing type equipment
US20180360157A1 (en) * 2016-01-05 2018-12-20 Lg Electronics Inc. Smart shoe and method for processing data therefor
KR101757377B1 (en) * 2016-04-06 2017-07-12 한국기계연구원 Smart shoes having function of information transfer and Method of providing information using the same
US20180200598A1 (en) * 2016-06-30 2018-07-19 Boe Technology Group Co., Ltd. Method, terminal and running shoe for prompting a user to adjust a running posture
CN106858883A (en) * 2017-03-03 2017-06-20 厦门精图信息技术有限公司 VR experience footwear based on panoramic table
US20180326286A1 (en) * 2017-05-09 2018-11-15 Google Llc Augmented and/or virtual reality footwear
US20190179286A1 (en) * 2017-12-07 2019-06-13 Saudi Arabian Oil Company Intelligent Personal Protective Equipment
US20200297063A1 (en) * 2018-05-31 2020-09-24 Nike, Inc. Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback
US10542385B1 (en) * 2019-01-09 2020-01-21 International Business Machines Corporation Location determination using device coordination
CN110038274A (en) * 2019-05-21 2019-07-23 福建工程学院 A kind of wired home nobody instruct body building method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112587378A (en) * 2020-12-11 2021-04-02 中国科学院深圳先进技术研究院 Exoskeleton robot footprint planning system and method based on vision and storage medium
CN112587378B (en) * 2020-12-11 2022-06-07 中国科学院深圳先进技术研究院 Exoskeleton robot footprint planning system and method based on vision and storage medium
CN112731839A (en) * 2020-12-25 2021-04-30 深圳市倍轻松科技股份有限公司 Linkage control method and system between intelligent devices and computer storage medium

Also Published As

Publication number Publication date
CN112052725B (en) 2023-09-01

Similar Documents

Publication Publication Date Title
US11599187B2 (en) Home and portable augmented reality and virtual reality game consoles
US10216264B2 (en) Signal acquiring device, virtual reality apparatus and control method thereof
CN104303130B (en) Electronic system and its operating method with augmented reality mechanism
US9216347B2 (en) Portable device, virtual reality system and method
US20140266570A1 (en) System and method for haptic based interaction
CN102600613B (en) Game system,operation device and game processing method
CN112052725B (en) Interaction method and interaction system of intelligent glasses and intelligent shoes
US20140266571A1 (en) System and method for haptic based interaction
WO2016133158A1 (en) Footwear, audio output system, and output control method
US20110292348A1 (en) Balloon and balloon control method
CN104520787A (en) Headset computer (HSC) as auxiliary display with ASR and HT input
CN107534824A (en) Message processing device, information processing method and program
US20070231778A1 (en) Dance training method and system using sensor-equipped shoes and portable wireless terminal
CN105455304A (en) Intelligent insole system
JP6242473B1 (en) Method for providing virtual space, program for causing computer to execute the method, and information processing apparatus for executing the program
WO2021136266A1 (en) Virtual image synchronization method and wearable device
CN108628515A (en) A kind of operating method and mobile terminal of multimedia content
CN109040968A (en) Road conditions based reminding method, mobile terminal and computer readable storage medium
US20100041454A1 (en) Portable dance game system
JP2018207151A (en) Display device, reception device, program, and control method of reception device
JP2018055416A (en) Display device, head-mounted display device, method for controlling display device, and program
CN110470293B (en) Navigation method and mobile terminal
CN209514548U (en) AR searcher, the articles search system based on AR searcher
WO2022111648A1 (en) Vr interaction method and apparatus
JP2020201575A (en) Display controller, display control method, and display control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant