CN107340853B - Remote presentation interaction method and system based on virtual reality and gesture recognition - Google Patents


Info

Publication number
CN107340853B
CN107340853B (application CN201611025123.6A)
Authority
CN
China
Prior art keywords
remote
user
space
local space
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201611025123.6A
Other languages
Chinese (zh)
Other versions
CN107340853A (en)
Inventor
裴明涛
李佩霖
梁玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT
Priority to CN201611025123.6A
Publication of CN107340853A
Application granted
Publication of CN107340853B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Abstract

The invention discloses a telepresence interaction method and system based on virtual reality and gesture recognition, belonging to the technical field of human-computer interaction. The system comprises a telepresence robot equipped with a binocular camera and a remotely interactable device in the local space; interactive VR glasses in the remote space, consisting of VR glasses, a smartphone, and a binocular camera; a wireless network connecting the two spaces; and a computing server. The telepresence robot captures two real-time video streams of the local space and transmits them over the wireless network to the user's smartphone in the remote space, where they are displayed in split-screen mode; through the VR glasses and smartphone, the user perceives the local space in virtual reality. The binocular camera on the interactive VR glasses captures the user's gesture data, and the user interacts with the remotely interactable device in the local space according to the position of the hand and the specific gesture performed. The user thereby obtains a genuinely immersive and interactive experience.

Description

Remote presentation interaction method and system based on virtual reality and gesture recognition
Technical Field
The invention relates to the design of a telepresence interaction mode based on virtual reality technology and gesture recognition technology, and in particular to a telepresence interaction method and system based on virtual reality and gesture recognition, belonging to the technical field of human-computer interaction.
Background
Many remote interaction modes and systems are currently available, such as video conferencing, teleoperated robots, and telepresence robots. Most of these systems adopt the traditional remote interaction mode: the user in the remote space interacts with the local space through a keyboard, mouse, joystick, and graphical user interface. Such systems are typically designed for a specific task and require a trained operator.
With the development of touch-screen technology, some remote interaction systems use tablet computers, smartphones, and other mobile devices as interaction equipment, superimposing a graphical user interface on the live video image so that the user interacts by tapping virtual buttons. User interfaces based on touchable live video have also been proposed: the live video image of the local space is obtained through a telepresence robot, and the user in the remote space touches that image directly, thereby interacting with objects and the environment in the local space.
The development of virtual reality technology makes a more realistic and immersive interaction mode possible. In some virtual reality systems a user can interact with a virtual scene through natural gestures, but not yet with a real scene.
To achieve a more immersive telepresence, we propose a telepresence interaction method and system based on virtual reality technology and gesture recognition technology.
Disclosure of Invention
The invention aims to solve the lack of immersion and the non-intuitive interaction mode of existing remote interaction technology, and provides a telepresence interaction method and system based on virtual reality technology and gesture recognition technology.
The core idea of the invention is as follows. The telepresence robot captures two real-time video streams of the local space through its binocular camera and transmits them over a wireless network to the smartphone of a user in the remote space, where they are displayed in split-screen mode; through VR glasses and the smartphone, the user perceives the local space in virtual reality and obtains a genuinely immersive experience. When the user needs to interact with the telepresence robot, real-time video of the user's gestures is transmitted to a computing server, where gesture recognition is performed to control the robot. When the user needs to interact with a remotely interactable device, the device is first identified and a correspondence is established between its image in the field of view and the actual device; the binocular camera on the interactive VR glasses captures the user's gesture data, the gesture is recognized, and a virtual hand is drawn in the two split-screen video images on the phone according to the captured data. The user controls the movement of the virtual hand by moving the real hand and changes its gesture by changing the real gesture, and can thus carry out telepresence interaction with the remotely interactable device in the local space through the virtual hand.
A remote presentation interaction method and system based on virtual reality technology and gesture recognition technology comprises a remote presentation interaction method based on virtual reality technology and gesture recognition technology and a remote presentation interaction system based on virtual reality technology and gesture recognition technology; a remote presentation interaction method based on a virtual reality technology and a gesture recognition technology is called the method for short, and a remote presentation interaction system based on the virtual reality technology and the gesture recognition technology is called the system for short;
the system comprises a local space, a remote space, a wireless network for connecting the local space and the remote space, and a computing server; the computing server can be located in a local space, a remote space or any networking place; the local space comprises a remote presentation robot and a remote interactive device; the remote space also comprises interactive VR glasses, and the interactive VR glasses comprise VR glasses, a smart phone and a binocular camera; the user is located in a remote space;
the remote interactable device has three attributes, an identifier (id), a driver (activator), a wireless communication network; wherein, the mark refers to a unique characteristic which can be identified by a computer vision method;
the driver is a motor and a relay which convert electric energy into mechanical energy or other energy forms; the wireless communication network comprises Wifi;
wherein the identification of the remotely interactable device by computer vision methods is typically performed by extracting image features, including color, texture, appearance, and shape; if the image characteristics are the same, namely two remote interactive devices with the same appearance are identified, the positions of the devices are required to be identified so as to distinguish the two devices, or the position parameters of the devices are acquired by using a positioning technology; for example, two curtains are arranged in a room, and the curtains at different positions can be distinguished through a positioning technology, so that the control of different curtains is realized;
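As an illustrative sketch only (not part of the patent), the appearance-plus-position identification described above might look like the following. The coarse color-histogram feature, the tolerance value, and the device-registry structure are all assumptions made for the example:

```python
from math import sqrt

def color_histogram(pixels, bins=4):
    """Quantize RGB pixels into a coarse, normalized color histogram."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def histogram_distance(h1, h2):
    """Euclidean distance between two normalized histograms."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

def identify_device(region_pixels, region_center, devices, feature_tol=0.1):
    """Match a detected image region against registered devices.

    devices: list of dicts with 'id', 'histogram', 'position' (x, y).
    When several devices share the same appearance (the two-curtains case
    in the text), fall back to the device nearest the region's position.
    """
    h = color_histogram(region_pixels)
    candidates = [d for d in devices
                  if histogram_distance(h, d["histogram"]) < feature_tol]
    if len(candidates) == 1:
        return candidates[0]["id"]
    if len(candidates) > 1:  # identical appearance: disambiguate by position
        return min(candidates,
                   key=lambda d: (d["position"][0] - region_center[0]) ** 2
                               + (d["position"][1] - region_center[1]) ** 2)["id"]
    return None
```

A production system would use richer features (texture, shape) and a real localization method; the sketch only shows the decision structure the patent describes.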
the remote interactive device is completely independent and can be remotely and directly controlled by a user, and the identification of the device is mainly realized by a computer vision method;
the telepresence robot is also a remotely interactable device, consisting of a mobile base and a robot head, wherein the robot head is equipped with a binocular camera;
the connection relationship of the components of the system is as follows:
the local space and the remote space are connected through a wireless network; the computing server is connected with the local space and the remote space through a wireless network;
the functions of the components of the system are as follows:
the function of the computing server is to recover the three-dimensional information of the user gesture and the local space and perform gesture recognition;
the function of the remote interactive device in the local space is to realize remote interaction; wherein the function of the identification is to distinguish and identify the device; the function of the driver is to realize automatic control; the wireless communication network functions to enable the remote interactive apparatus to connect with the internet;
the remote presentation robot in the local space has the function of moving in the local space to acquire a real-time video image of the local space;
the function of the interactive VR glasses in the remote space is to provide the user with a virtual reality perception of the local space in the remote space and interact with the remote interactive device in the local space through gestures;
a remote presentation interaction method based on a virtual reality technology and a gesture recognition technology comprises the following steps:
Step one: acquire a live real-time video image.
Specifically: a live real-time video image of the local space is obtained by moving the telepresence robot in the local space, i.e. the binocular camera mounted on the robot head captures two live video streams of the local space;
the two streams are shot by the left and right cameras of the binocular camera respectively.
Step two: the user perceives the local space in virtual reality.
Specifically: the two live video streams captured by the telepresence robot in step one are transmitted over the wireless network to the smartphone in the remote space and displayed in split-screen mode, i.e. the left half of the phone screen shows the image from the left camera of the robot's binocular camera and the right half shows the image from the right camera. Through the VR glasses and smartphone, the user in the remote space perceives the local space in virtual reality and obtains a genuinely immersive experience.
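The split-screen composition of step two can be sketched as follows; representing each frame as a row-major list of pixel rows is an assumption made purely for illustration:

```python
def split_screen(left_frame, right_frame):
    """Compose the phone's split-screen image: the left-camera frame fills
    the left half of the screen and the right-camera frame the right half,
    joined row by row. Frames are row-major lists of pixel rows."""
    if len(left_frame) != len(right_frame):
        raise ValueError("frames must have the same height")
    return [lrow + rrow for lrow, rrow in zip(left_frame, right_frame)]
```

Viewed through the VR-glasses lenses, each eye then sees only its own half, which is what produces the stereoscopic effect described in the text.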
Step three: according to the user's needs, determine whether to interact with the telepresence robot or with the remotely interactable device through gestures. Specifically:
Step 3.A: if the user needs to interact with the telepresence robot, the steps are as follows:
Step 3.A1: capture two real-time video streams of the user's gestures through the binocular camera on the interactive VR glasses;
Step 3.A2: send the captured gesture video streams to the computing server over the wireless network;
Step 3.A3: recover the three-dimensional information of the user's gesture on the computing server using a stereoscopic vision method, and perform gesture recognition;
Step 3.A4: control the telepresence robot to move forward, move backward, turn left, or turn right according to the gesture recognition result.
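Step 3.A4 amounts to mapping a recognized gesture label to one of the four motion commands. A minimal sketch follows; the gesture vocabulary and the confidence threshold are hypothetical, since the patent names the four motions but not the gestures that trigger them:

```python
# Hypothetical gesture vocabulary (assumption, not specified by the patent).
GESTURE_TO_COMMAND = {
    "palm_forward": "forward",
    "palm_back":    "backward",
    "point_left":   "turn_left",
    "point_right":  "turn_right",
}

def command_for(gesture_label, confidence, threshold=0.8):
    """Translate a recognized gesture into a motion command for the
    telepresence robot; low-confidence or unknown gestures map to 'stop'."""
    if confidence < threshold:
        return "stop"
    return GESTURE_TO_COMMAND.get(gesture_label, "stop")
```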
Step 3.B: if the user needs to interact with the remotely interactable device, the steps are as follows:
Step 3.B1: capture two real-time video streams of the user's gestures through the binocular camera on the interactive VR glasses;
Step 3.B2: send the two streams to the computing server over the wireless network;
Step 3.B3: recover the three-dimensional information of the user's gesture on the computing server using a stereoscopic vision method, and perform gesture recognition; meanwhile, the two local-space video streams captured by the telepresence robot are also sent to the computing server, and the three-dimensional information of the local space is recovered using the same stereoscopic vision method.
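The stereoscopic recovery in step 3.B3 can be illustrated with standard rectified-stereo triangulation, where depth is focal length times baseline divided by disparity. This is one common stereoscopic-vision method, not necessarily the exact one the inventors used, and the camera parameters below are placeholders:

```python
def triangulate(x_left, x_right, y, focal_px, baseline_m, cx, cy):
    """Recover a 3-D point (metres, camera frame) from a matched pixel pair
    in a rectified stereo rig: Z = f * B / disparity, then back-project."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("zero or negative disparity: point at infinity or bad match")
    z = focal_px * baseline_m / disparity
    x = (x_left - cx) * z / focal_px
    y3 = (y - cy) * z / focal_px
    return (x, y3, z)
```

Applied to matched hand pixels it yields the gesture's 3-D information; applied to matched local-space pixels it yields the local-space 3-D information used in step 3.B4.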
Step 3.B4: merge the three-dimensional information of the virtual hand into the three-dimensional information of the local space, and draw the virtual hand in the corresponding two-dimensional image of the local space.
Specifically: recovering the three-dimensional information of the user's gesture yields the spatial position of the hand relative to the user; this position is reused as the position of the virtual hand relative to the telepresence robot, so the three-dimensional information of the virtual hand can be merged into that of the local space and the virtual hand drawn in the corresponding two-dimensional image.
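Step 3.B4 can be sketched as two operations: reuse the user-relative hand position as the robot-relative virtual-hand position, then project it into the local-space image with a pinhole model. The aligned-frames assumption and the camera intrinsics are illustrative:

```python
def virtual_hand_in_local_space(hand_in_user_frame):
    """The hand's position relative to the user is reused, unchanged, as the
    virtual hand's position relative to the robot (frames assumed aligned,
    as the text's user-to-robot correspondence suggests)."""
    return hand_in_user_frame

def project_to_image(point_3d, focal_px, cx, cy):
    """Pinhole projection of the virtual hand into one of the two local-space
    video images, giving the pixel at which to draw it on the phone screen."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point behind the camera")
    return (cx + focal_px * x / z, cy + focal_px * y / z)
```

Running the projection once per eye (left and right camera intrinsics) places the virtual hand in both halves of the split-screen display.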
Step 3.B5: the user controls the movement of the virtual hand by moving his or her own hands. Whether the virtual hand touches a remotely interactable device in the local space is judged from the three-dimensional information of the local space and the position of the virtual hand; when it does, the gesture recognition result of step 3.B3 is used to operate the device.
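The touch test of step 3.B5 can be sketched as a distance threshold in the reconstructed 3-D space; the 5 cm radius and the device registry are assumed values for illustration:

```python
def touches(hand_pos, device_pos, radius=0.05):
    """The virtual hand 'touches' a device when their reconstructed 3-D
    positions (metres) are within `radius` of each other (assumed threshold)."""
    return sum((h - d) ** 2 for h, d in zip(hand_pos, device_pos)) <= radius ** 2

def interact(hand_pos, gesture, devices, radius=0.05):
    """Dispatch the recognized gesture (step 3.B3) to the first device the
    virtual hand touches; devices maps id -> 3-D position."""
    for dev_id, pos in devices.items():
        if touches(hand_pos, pos, radius):
            return (dev_id, gesture)
    return None
```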
Thus, through steps one to three, the telepresence interaction method based on virtual reality technology and gesture recognition technology is completed.
Advantageous effects
Compared with the prior art, the telepresence interaction method based on virtual reality technology and gesture recognition technology of the invention has the following beneficial effects:
(1) by equipping the telepresence robot with a binocular camera, two live video streams of the local space are captured, transmitted over the wireless network to the user's phone in the remote space, and displayed in split-screen mode, so that the user can perceive the local space in virtual reality through VR glasses and truly feel present on the scene;
(2) through the interactive VR glasses in the remote space, the user can move the virtual hand drawn in the local-space two-dimensional image by moving his or her own hand, and thereby operate the remotely interactable device in the local space.
Drawings
FIG. 1 is a diagram of a remote presentation interaction method and system based on virtual reality and gesture recognition;
FIG. 2 is a remote presentation robot architecture in a remote presentation interaction method and system based on virtual reality and gesture recognition according to the present invention;
FIG. 3 is an interactive VR glasses configuration for a telepresence interaction method and system based on virtual reality and gesture recognition in accordance with the present invention;
In fig. 1, the telepresence interaction system consists of: 101, the local space where the telepresence robot is located; 102, the remote space where the user is located; 103 and 104, wireless networks; 105, the internet; 106, an interactable device in the local space; 107, the telepresence robot; 108, interactive VR glasses consisting of VR glasses, a smartphone, and a binocular camera; 109, the user; and 110, the computing server, which can be placed in the local space, the remote space, or any networked location.
Fig. 2, telepresence robot structure: 201 is the binocular camera, 202 is the mobile base.
Fig. 3, interactive VR glasses structure: 301 is the smartphone, 302 is the binocular camera, 303 are the VR glasses.
Detailed Description
The invention is further illustrated and described in detail below with reference to the figures and examples.
Examples
This embodiment details a specific implementation of the telepresence interaction method and system based on virtual reality and gesture recognition in a remote-companionship scenario.
Fig. 1 is a schematic diagram of the system: 101 is the local space where the telepresence robot is located; 102 is the remote space where the user is located; 103 and 104 are wireless networks and 105 is the internet, which together connect the local and remote spaces; 106 is an interactable device in the local space; 107 is the telepresence robot, used to move in the local space and capture its real-time video images; 108 is the interactive VR glasses, composed of VR glasses, a smartphone, and a binocular camera; 109 is the user; 110 is the computing server, which can be placed in the local space, the remote space, or any networked place.
The telepresence robot consists of a binocular camera and a mobile base, as shown in fig. 2: binocular camera 201 captures two real-time video streams of the local space, and mobile base 202 moves the robot. The interactive VR glasses consist of VR glasses, a smartphone, and a binocular camera, as shown in fig. 3: smartphone 301 receives the two video streams transmitted by binocular camera 201 and displays them in split-screen mode, VR glasses 303 together with smartphone 301 let the user perceive the local space in virtual reality, and binocular camera 302 captures the user's hand data for interaction with the remotely interactable device in the local space.
The user makes agreed control gestures with both hands to steer the telepresence robot to a door fitted with a coded lock. The system automatically identifies the coded lock and establishes the correspondence between the image of the lock and the actual lock. Through the VR glasses, the user sees the keypad of the access-control system in the local space and directly reaches out (controlling the virtual hand) to press a key; the system computes the position of the user's finger, maps it into the reconstructed local three-dimensional space, and judges which key was pressed. According to the correspondence established between the key image and the remotely interactable device, a control instruction is sent over the wireless network to the device's actuator so that the device performs the corresponding action, realizing gesture-based telepresence interaction. Concretely: when the user enters the correct password, the door opens; the user then steers the telepresence robot into the room and up to a light switch using the agreed control gestures; seeing the switch through the VR glasses, the user reaches out directly (controlling the virtual hand) to press it, and the system computes the finger position, maps it into the reconstructed local three-dimensional space, and judges whether the finger has reached the switch.
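The key-press judgment in this example, mapping the reconstructed fingertip position to a particular keypad key, can be sketched as a nearest-key search. The key positions, reach threshold, and passcode comparison are assumptions introduced for the sketch:

```python
def pressed_key(finger_pos, key_centers, max_dist=0.02):
    """Map the reconstructed 3-D fingertip position (metres) onto the nearest
    keypad key; return None when no key is within `max_dist` (assumed 2 cm)."""
    best, best_d2 = None, max_dist ** 2
    for label, center in key_centers.items():
        d2 = sum((f - c) ** 2 for f, c in zip(finger_pos, center))
        if d2 <= best_d2:
            best, best_d2 = label, d2
    return best

def entry_matches(pressed_sequence, passcode):
    """True when the sequence of pressed keys spells the passcode."""
    return "".join(pressed_sequence) == passcode
```

The same nearest-target test, with the switch's position in place of key centers, covers the light-switch interaction at the end of the scenario.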
While the foregoing describes a preferred embodiment of the invention, the invention is not limited to the embodiment and drawings disclosed herein. Equivalents and modifications that do not depart from the spirit of the disclosure are considered within the scope of the invention.

Claims (2)

1. A remote presentation interaction method based on a virtual reality technology and a gesture recognition technology is characterized in that: the method comprises the following steps:
the method comprises the following steps of firstly, acquiring a field real-time video image, specifically: the method comprises the following steps of obtaining a live real-time video image of a local space by moving a telepresence robot in the local space, namely: a binocular camera equipped on the robot head is used for acquiring two paths of local space on-site real-time video images;
the two paths of local space on-site real-time video images are respectively shot by a left camera and a right camera of the binocular camera;
step two, the user perceives the local space in a virtual reality mode, which specifically comprises the following steps: the wireless network transmits the two paths of on-site real-time video images acquired by the telepresence robot in the first step to a smart phone in a remote space and performs split-screen display, namely, the left side of a screen of the smart phone displays an image shot by a left camera in a binocular camera on the telepresence robot, and the right side of the screen of the smart phone displays an image shot by a right camera in the binocular camera on the telepresence robot; a user in a remote space can perceive a local space in a virtual reality mode through VR glasses and a smart phone to obtain real immersive experience;
Step three: according to the user's needs, determine whether to interact with the telepresence robot or with the remotely interactable device through gestures. Specifically:
Step 3.A: if the user needs to interact with the telepresence robot, specifically:
Step 3.A1: capture two real-time video streams of the user's gestures through the binocular camera on the interactive VR glasses;
Step 3.A2: send the captured gesture video streams to the computing server over the wireless network;
Step 3.A3: recover the three-dimensional information of the user's gesture on the computing server using a stereoscopic vision method, and perform gesture recognition;
Step 3.A4: control the telepresence robot to move forward, move backward, turn left, or turn right according to the gesture recognition result;
Step 3.B: if the user needs to interact with the remotely interactable device, specifically:
Step 3.B1: capture two real-time video streams of the user's gestures through the binocular camera on the interactive VR glasses;
Step 3.B2: send the two streams to the computing server over the wireless network;
Step 3.B3: recover the three-dimensional information of the user's gesture on the computing server using a stereoscopic vision method, and perform gesture recognition; meanwhile, the two local-space video streams captured by the telepresence robot are also sent to the computing server, and the three-dimensional information of the local space is recovered using the same stereoscopic vision method;
Step 3.B4: merge the three-dimensional information of the virtual hand into the three-dimensional information of the local space, and draw the virtual hand in the corresponding two-dimensional image of the local space;
specifically: recovering the three-dimensional information of the user's gesture yields the spatial position of the hand relative to the user; this position is reused as the position of the virtual hand relative to the telepresence robot, so the three-dimensional information of the virtual hand can be merged into that of the local space and the virtual hand drawn in the corresponding two-dimensional image;
Step 3.B5: the user controls the movement of the virtual hand by moving his or her own hands; whether the virtual hand touches a remotely interactable device in the local space is judged from the three-dimensional information of the local space and the position of the virtual hand, and when it does, the gesture recognition result of step 3.B3 is used to operate the device.
2. A telepresence interaction system on which the telepresence interaction method based on virtual reality technology and gesture recognition technology of claim 1 relies, characterized in that: the system comprises a local space, a remote space, a wireless network connecting the two spaces, and a computing server;
the computing server can be located in the local space, the remote space, or any networked place; the local space contains a telepresence robot and a remotely interactable device; the remote space contains interactive VR glasses, which consist of VR glasses, a smartphone, and a binocular camera; the user is located in the remote space; the remotely interactable device has three attributes: an identifier, an actuator, and a wireless communication network;
the identifier is a unique characteristic that can be recognized by computer vision methods; the actuator is a motor or relay that converts electric energy into mechanical or other forms of energy; the wireless communication network includes Wi-Fi;
identification of the remotely interactable device by computer vision is typically performed by extracting image features, including color, texture, appearance, and shape; if the image features are identical, i.e. two remotely interactable devices have the same appearance, the positions of the devices must be recognized to distinguish them, or their position parameters acquired using a positioning technique;
the connection relationship of the components of the system is as follows:
the local space and the remote space are connected through a wireless network; the computing server is connected with the local space and the remote space through a wireless network;
the functions of the components of the system are as follows:
the function of the computing server is to recover the three-dimensional information of the user gesture and the local space and perform gesture recognition;
the function of the remotely interactable device in the local space is to realize remote interaction: the identifier serves to distinguish and identify the device, the actuator realizes automatic control, and the wireless communication network connects the device to the internet;
the function of the telepresence robot in the local space is to move in the local space and acquire real-time video images of the local space;
the function of the interactive VR glasses in the remote space is to give the user a virtual-reality perception of the local space from the remote space and to let the user interact with the remotely interactable device in the local space through gestures.
CN201611025123.6A 2016-11-18 2016-11-18 Remote presentation interaction method and system based on virtual reality and gesture recognition Expired - Fee Related CN107340853B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611025123.6A CN107340853B (en) 2016-11-18 2016-11-18 Remote presentation interaction method and system based on virtual reality and gesture recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611025123.6A CN107340853B (en) 2016-11-18 2016-11-18 Remote presentation interaction method and system based on virtual reality and gesture recognition

Publications (2)

Publication Number Publication Date
CN107340853A CN107340853A (en) 2017-11-10
CN107340853B true CN107340853B (en) 2020-04-14

Family

ID=60222423

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611025123.6A Expired - Fee Related CN107340853B (en) 2016-11-18 2016-11-18 Remote presentation interaction method and system based on virtual reality and gesture recognition

Country Status (1)

Country Link
CN (1) CN107340853B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108108211B (en) * 2017-11-20 2021-06-25 福建天泉教育科技有限公司 Method and terminal for performing remote interaction in virtual reality scene
CN108399799A (en) * 2018-02-11 2018-08-14 广州特种机电设备检测研究院 A kind of elevator inspection training system and method based on virtual reality technology
US11366514B2 (en) 2018-09-28 2022-06-21 Apple Inc. Application placement based on head position
CN109391802A (en) * 2018-12-20 2019-02-26 北京伊神华虹系统工程技术有限公司 A kind of method and apparatus for realizing real-time VR function
WO2020135719A1 (en) * 2018-12-29 2020-07-02 广东虚拟现实科技有限公司 Virtual content interaction method and system
CN109872519A (en) * 2019-01-13 2019-06-11 上海萃钛智能科技有限公司 A kind of wear-type remote control installation and its remote control method
CN110308797A (en) * 2019-07-09 2019-10-08 西北工业大学 Underwater robot environmental interaction system based on body-sensing technology mechanical arm and virtual reality technology
WO2021061351A1 (en) 2019-09-26 2021-04-01 Apple Inc. Wearable electronic device presenting a computer-generated reality environment
CN116360601A (en) 2019-09-27 2023-06-30 苹果公司 Electronic device, storage medium, and method for providing an augmented reality environment
CN110955328B (en) * 2019-10-24 2022-10-04 北京小米移动软件有限公司 Control method and device of electronic equipment and storage medium
CN111438673B (en) * 2020-03-24 2022-04-22 西安交通大学 High-altitude operation teleoperation method and system based on stereoscopic vision and gesture control
CN111565307A (en) * 2020-04-29 2020-08-21 昆明埃舍尔科技有限公司 Remote space synchronization guidance method and system based on MR
CN111716365B (en) * 2020-06-15 2022-02-15 山东大学 Immersive remote interaction system and method based on natural walking
CN112276914B (en) * 2020-12-28 2021-03-16 佛山冠博机械科技发展有限公司 Industrial robot based on AR technology and man-machine interaction method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102699914A (en) * 2012-05-15 2012-10-03 郑州大学 Robot
CN105045398A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Virtual reality interaction device based on gesture recognition
CN105291138A (en) * 2015-11-26 2016-02-03 华南理工大学 Visual feedback platform improving virtual reality immersion degree
CN105922262A (en) * 2016-06-08 2016-09-07 北京行云时空科技有限公司 Robot and remote control equipment and remote control method thereof
CN106101687A (en) * 2016-07-25 2016-11-09 深圳市同盛绿色科技有限公司 VR image capturing device and VR image capturing apparatus based on mobile terminal thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6950534B2 (en) * 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US9643314B2 (en) * 2015-03-04 2017-05-09 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment

Also Published As

Publication number Publication date
CN107340853A (en) 2017-11-10

Similar Documents

Publication Publication Date Title
CN107340853B (en) Remote presentation interaction method and system based on virtual reality and gesture recognition
CN107168537B (en) Cooperative augmented reality wearable operation guidance method and system
KR101918262B1 (en) Method and system for providing mixed reality service
CN105760106B (en) A kind of smart home device exchange method and device
CN104410883B (en) The mobile wearable contactless interactive system of one kind and method
CN106468917B (en) A kind of long-range presentation exchange method and system of tangible live real-time video image
US20130063560A1 (en) Combined stereo camera and stereo display interaction
US20140257532A1 (en) Apparatus for constructing device information for control of smart appliances and method thereof
CN104656893B (en) The long-distance interactive control system and method in a kind of information physical space
CN108255454B (en) Splicing processor and visual interaction method of splicing processor
CN107741782B (en) Equipment virtual roaming method and device
CN107452119A (en) virtual reality real-time navigation method and system
CN107817701B (en) Equipment control method and device, computer readable storage medium and terminal
CN104699233A (en) Screen operation control method and system
CN106896920B (en) Virtual reality system, virtual reality equipment, virtual reality control device and method
CN103294024A (en) Intelligent home system control method
CN102929547A (en) Intelligent terminal contactless interaction method
JP2021078104A (en) Program relating to web-based remote assistance system with context and content-aware 3d hand gesture visualization
WO2019028855A1 (en) Virtual display device, intelligent interaction method, and cloud server
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
KR20150097049A (en) self-serving robot system using of natural UI
CN109788359B (en) Video data processing method and related device
US20210245368A1 (en) Method for virtual interaction, physical robot, display terminal and system
CN112099681B (en) Interaction method and device based on three-dimensional scene application and computer equipment
CN111492396A (en) Mixed reality service providing method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200414

Termination date: 20201118