CN112925467A - Commercial cleaning robot human-computer interaction system - Google Patents

Commercial cleaning robot human-computer interaction system

Info

Publication number: CN112925467A
Application number: CN202110109958.4A
Authority: CN (China)
Prior art keywords: map, robot, human, track, clicking
Legal status: Pending (the status listed is an assumption, not a legal conclusion)
Inventors: 刘奇, 马祥祥, 陈剑
Original and current assignee: Jiangsu Tianze Robot Technology Co., Ltd.
Priority date / Filing date: 2021-01-27
Publication date: 2021-06-08
Other languages: Chinese (zh)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/20: Drawing from basic elements, e.g. lines or circles
    • G06T 11/206: Drawing of charts or graphs
    • G06T 11/60: Editing figures and text; Combining figures or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a commercial cleaning robot human-computer interaction system, which comprises an access layer, a platform layer and a business logic layer that are communicatively connected with each other, wherein the platform layer comprises an ROS-based robot and the business logic layer comprises a client and an ROS server. The ROS server links with the client by receiving requests from the client to establish the business system, and after performing business processing the ROS server transmits data to the human-computer interface for display. The business system comprises map building navigation, map editing and task state viewing, and a business function is selected by clicking the corresponding icon button displayed on the human-computer interface; map building navigation, map editing and task state viewing are drawn and rendered based on OpenGL ES. The invention has better maintainability and expansibility, displays the map track and the robot position more directly and attractively, and greatly improves the usability, user experience, maintainability, expandability and iterability of the human-computer interaction system.

Description

Commercial cleaning robot human-computer interaction system
Technical Field
The invention belongs to the technical field of human-computer interaction, and particularly relates to a commercial cleaning robot human-computer interaction system.
Background
Human-computer interaction (HCI) is the technical discipline concerned with communication and mutual understanding between people and computers, with the goal of having the computer complete information management, service, processing and other functions for people to the greatest possible extent, so that the computer truly becomes a harmonious assistant in people's work and study. Human-computer interaction is one of the key technologies applied to intelligent robots; with the progress of science and technology it has great development value in building functional scenarios across different industries, and it has become an important medium for creating interactive spaces for market users.
Robot human-computer interaction systems are being applied more and more widely, but the prior art has the following problems:
1. Existing robots lack a good human-computer interaction system, so users cannot operate them intuitively. Existing products all need a simple, easily understood and convenient operation interface, so that the user can use the robot normally without being aware of the specific implementation details of the product; such a commercial robot human-computer interaction system is currently lacking;
2. Existing robot software engineering is not fully integrated with the ROS system, so the maintainability, iterability and upgradability of the software are poor, which violates the high-cohesion, low-coupling principle of software engineering;
3. Existing robot human-computer interaction systems lack attractive interfaces and graphic rendering, cannot give the software personality and appeal, and cannot provide simple and comfortable operation.
Disclosure of Invention
The invention aims to solve the above problems by providing a commercial cleaning robot human-computer interaction system through which the robot can be operated intuitively, simply and conveniently and which has better maintainability.
In order to achieve the purpose, the invention adopts the following technical scheme:
a commercial cleaning robot human-computer interaction system comprises an access layer, a platform layer and a business logic layer which are in communication connection with each other, wherein the platform layer comprises a robot based on an ROS (reactive oxygen species), the business logic layer comprises a client and an ROS server, the ROS server is linked with the client to establish a business system by receiving a request of the client, and the ROS server transmits data to a human-computer interface for display after carrying out business processing on the business system; the business system comprises map building navigation, map editing and task state viewing, and the selection of the business system is carried out by clicking a corresponding icon button on a human-computer interface display; drawing navigation, map editing and task state viewing are drawn and rendered based on opengles. The commercial robot man-machine interaction system is compiled through the android operating system and the android sdk, usability of user operability is greatly improved, and user experience is improved through interactive experience of key feedback, key vibration and the like.
Further, the access layer includes user authentication and rights management. By providing user authentication and rights management, the invention improves the security and reliability of the system.
Further, the platform layer also comprises a server that can communicate with the client; the server is an enterprise-class server. The server can be selected according to actual conditions; the invention uses an enterprise-class server, which improves fault tolerance, scalability, the fault early-warning function and online diagnosis capability.
Further, the map building navigation comprises the following steps:
Map building: the robot is remotely controlled by clicking the new-map button displayed on the human-computer interface; the required linear and angular running speeds of the robot are transmitted through a TCP network channel and are converted into wheel speeds according to the wheel spacing and wheel diameter so that the robot advances (a minimal sketch of this conversion follows the list);
Navigation: a full-coverage track is generated for the newly built map; the user selects the map track to be run and clicks to start navigation, and navigation is carried out according to the ROS move_base navigation algorithm;
Selecting a map track: the selected map and track are displayed on the human-computer interface and then transmitted to the robot through the TCP network channel.
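As a rough illustration of the speed conversion mentioned in the map-building step above, the sketch below converts a commanded linear/angular velocity into left and right wheel speeds for a differential-drive base; the wheel spacing and wheel diameter values are illustrative assumptions, not figures taken from the patent.

```python
# Hedged sketch: converting the linear/angular velocity received over the TCP channel
# into left/right wheel speeds for a differential-drive robot.

import math

WHEEL_SPACING_M = 0.40   # distance between the two drive wheels (assumed)
WHEEL_DIAMETER_M = 0.15  # drive wheel diameter (assumed)

def to_wheel_speeds(linear_mps: float, angular_rps: float) -> tuple[float, float]:
    """Return (left_rpm, right_rpm) for a differential-drive base."""
    v_left = linear_mps - angular_rps * WHEEL_SPACING_M / 2.0
    v_right = linear_mps + angular_rps * WHEEL_SPACING_M / 2.0
    circumference = math.pi * WHEEL_DIAMETER_M
    return v_left / circumference * 60.0, v_right / circumference * 60.0

# Example: 0.3 m/s forward while turning at 0.5 rad/s
print(to_wheel_speeds(0.3, 0.5))
```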
Further, the map editing includes:
Electronic fence: the user clicks the electronic fence button on the human-computer interface, clicks any number of points on the map to generate a closed-loop irregular polygon, and after clicking to confirm the polygon is sent to the robot, which generates a full-coverage track according to a full-coverage path planning algorithm (a small fence-polygon sketch follows this list);
Point-to-go: the user selects a point on the map and, after clicking to confirm, the point is sent to the robot; the robot generates a point-to-point track according to a point-to-point path planning algorithm and navigates to the destination along that track;
Eraser: after clicking the eraser, the user draws on the map to erase the corresponding noise points or obstacles;
Virtual wall: after clicking the virtual wall, the user draws on the map to generate the corresponding virtual wall;
Path planning modification: the user clicks path planning modification and moves existing track points with a finger; the corresponding path planning modification is made by monitoring the user's touch events, and the track is redrawn and re-rendered with OpenGL ES;
Track drawing: the user clicks track drawing and taps the map with a finger; the human-computer interaction system connects the corresponding track points by monitoring the user's touch events, and the track is redrawn and re-rendered with OpenGL ES. The 2D and 3D graphic rendering based on OpenGL ES makes the display of map tracks, the robot position and so on more direct and attractive, giving visual impact.
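As a rough illustration of the electronic-fence geometry described in the list above, the sketch below closes the clicked points into a polygon and uses a simple ray-casting test to check whether a map cell lies inside the fence; the function names and data format are assumptions for illustration, not the patent's own implementation.

```python
# Hedged sketch: closed-loop fence polygon from clicked points, plus an inside test
# that a coverage planner could use to keep the cleaning track within the fence.

from typing import List, Tuple

Point = Tuple[float, float]

def close_fence(points: List[Point]) -> List[Point]:
    """Close the clicked points into a loop (first vertex repeated at the end)."""
    if len(points) < 3:
        raise ValueError("an electronic fence needs at least three points")
    return points if points[0] == points[-1] else points + [points[0]]

def inside_fence(x: float, y: float, polygon: List[Point]) -> bool:
    """Ray-casting test: True if (x, y) lies inside the closed fence polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

fence = close_fence([(1.0, 1.0), (4.0, 1.2), (4.5, 3.0), (1.2, 3.5)])
print(inside_fence(2.5, 2.0, fence))   # True: this cell would be covered
print(inside_fence(5.0, 2.0, fence))   # False: outside the fence
```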
Further, task state viewing works as follows: the robot publishes its current running state to a corresponding topic through the ROS publish/subscribe system; the human-computer interaction system obtains the current running state of the robot by subscribing to that topic, and the robot running-state interface is drawn with an Android Canvas.
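A minimal sketch of the subscription just described, using rospy; the topic name /robot_status and the std_msgs/String payload are assumptions for illustration, since the patent only states that the state is published to "a corresponding topic".

```python
# Hedged sketch of the HMI-side subscriber for the robot running state.

import rospy
from std_msgs.msg import String

def on_status(msg: String) -> None:
    # In the real system this would update the task-state screen
    # (drawn with an Android Canvas on the HMI side).
    rospy.loginfo("robot running state: %s", msg.data)

rospy.init_node("hmi_task_state_viewer")
rospy.Subscriber("/robot_status", String, on_status, queue_size=10)
rospy.spin()
```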
Further, the client sends a request to link to the ROS server; the ROS server accepts the request and replies to the client to establish a TCP long connection for accessing the business system, and after business processing the ROS server transmits data to the human-computer interface, where it is displayed to the user.
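A minimal sketch of such a TCP long connection on the server side, assuming a newline-delimited request format; the port number, framing and acknowledgement reply are illustrative assumptions, not details from the patent.

```python
# Hedged sketch of the long-lived TCP link between the client (HMI app) and the ROS server.

import socket

HOST, PORT = "0.0.0.0", 9000  # assumed listening address of the ROS server

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, addr = srv.accept()          # the client requests the link
    with conn:
        conn.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        buf = b""
        while True:                    # long connection: keep serving requests
            data = conn.recv(4096)
            if not data:
                break
            buf += data
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                # dispatch to the business system (mapping, map editing, task state)
                conn.sendall(b"ack:" + line + b"\n")
```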
Further, the map is built from the laser, camera and odometer data; when the map is satisfactory, the user clicks to finish map building and the map is stored persistently; the map can be edited after it has been built.
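The patent only states that the finished map is stored persistently; the sketch below shows one common way to do this for an occupancy grid, namely the ROS map_server PGM-plus-YAML layout. The file names, resolution, origin and thresholds are illustrative assumptions.

```python
# Hedged sketch: persisting a finished occupancy grid as a PGM image plus YAML descriptor.

import numpy as np

def save_map(grid: np.ndarray, resolution: float, origin, name: str = "map") -> None:
    """grid: occupancy values 0..100 (-1 = unknown), as in nav_msgs/OccupancyGrid."""
    h, w = grid.shape
    img = np.full((h, w), 205, dtype=np.uint8)   # unknown -> grey
    img[grid == 0] = 254                          # free -> near white
    img[grid == 100] = 0                          # occupied -> black
    with open(f"{name}.pgm", "wb") as f:
        f.write(f"P5\n{w} {h}\n255\n".encode())
        f.write(np.flipud(img).tobytes())         # image rows run top-down
    with open(f"{name}.yaml", "w") as f:
        f.write(f"image: {name}.pgm\nresolution: {resolution}\n"
                f"origin: [{origin[0]}, {origin[1]}, 0.0]\n"
                f"negate: 0\noccupied_thresh: 0.65\nfree_thresh: 0.196\n")

save_map(np.zeros((200, 200), dtype=np.int8), 0.05, (-5.0, -5.0))
```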
Further, for the eraser and the virtual wall, the RGB values of the map pixels are redefined according to the conversion relation between the cost map and the corresponding pixels, and a new map is generated after the user clicks to confirm.
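As a rough illustration of this pixel rewriting, the sketch below repaints the cells covered by an eraser or virtual-wall stroke; the grey values (254 for free, 0 for occupied, 205 for unknown) follow the common occupancy-map convention and are assumptions, not values stated in the patent.

```python
# Hedged sketch of the eraser / virtual-wall edit: painted cells are re-assigned the
# pixel value that the cost map maps to "free" or "occupied".

import numpy as np

FREE_PIXEL = 254      # eraser: noise/obstacle removed -> free space
OCCUPIED_PIXEL = 0    # virtual wall: painted cells become obstacles

def apply_stroke(map_img: np.ndarray, stroke_cells, mode: str) -> np.ndarray:
    """Return a new map image with the painted cells rewritten."""
    value = FREE_PIXEL if mode == "eraser" else OCCUPIED_PIXEL
    edited = map_img.copy()
    for row, col in stroke_cells:
        edited[row, col] = value
    return edited

grid = np.full((100, 100), 205, dtype=np.uint8)
grid = apply_stroke(grid, [(50, c) for c in range(20, 80)], mode="virtual_wall")
```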
Compared with the prior art, the invention has the following advantages:
1. In the commercial cleaning robot human-computer interaction system, the human-computer interface is implemented with the Android operating system and the Android SDK, which greatly improves operability and ease of use, and interactive feedback such as key feedback and key vibration improves the user experience of the human-computer interaction system;
2. The ROS-based human-computer interaction system has better maintainability and expansibility and is friendlier to developers;
3. Thanks to the OpenGL ES 2D and 3D graphic rendering, the map track, the robot position and so on are displayed more directly and attractively, giving visual impact, and through the human-computer interaction system users can use existing cleaning robot products more simply and conveniently; the usability, user experience, maintainability, expandability and iterability of the human-computer interaction system are greatly improved.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
Drawings
FIG. 1 is a flow chart of the operation of the present invention;
FIG. 2 is a functional block diagram of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
As shown in FIG. 1 and FIG. 2, a commercial cleaning robot human-computer interaction system includes an access layer, a platform layer and a business logic layer which are communicatively connected with each other. The platform layer includes a server and an ROS-based robot, and the business logic layer includes a client and an ROS server. The ROS server links with the client by receiving requests from the client to establish the business system, and after performing business processing the ROS server transmits data to the human-computer interface for display. The business system comprises map building navigation, map editing and task state viewing, and a business function is selected by clicking the corresponding icon button displayed on the human-computer interface; map building navigation, map editing and task state viewing are drawn and rendered based on OpenGL ES. The commercial robot human-computer interaction system is written with the Android operating system and the Android SDK, which can greatly improve operability and ease of use, and interactive feedback such as key feedback and key vibration improves the user experience.
The access layer of this embodiment comprises user authentication and rights management; providing user authentication and rights management improves the security and reliability of the system.
The server of this embodiment can communicate with the client and is an enterprise-class server. The server can be selected according to actual conditions; this embodiment uses an enterprise-class server, which improves fault tolerance, scalability, the fault early-warning function and online diagnosis capability.
In this embodiment, the client sends a request to link to the ROS server; the ROS server accepts the request and replies to the client to establish a TCP long connection for accessing the business system, and after business processing the ROS server transmits data to the human-computer interface, where it is displayed to the user.
The map building navigation in the embodiment comprises the following steps:
Map building: when the user wants to build a new map, he clicks the new-map button and remotely controls the robot to move forward, backward, left and right through the app; the app transmits the required linear velocity and angular velocity of the robot through the TCP network channel, and these are converted into left and right wheel speeds according to the wheel spacing and wheel diameter of the robot so that it advances; the robot builds the map from the laser, camera and odometer data; when the map is satisfactory, the user clicks to finish map building, the map is stored persistently, and it can be edited afterwards;
Navigation: after the map is built, the user can edit it, and the robot generates a full-coverage track according to the existing algorithm; the user selects the map track to be run and clicks to start navigation, and the human-computer interaction system navigates according to the ROS move_base navigation algorithm (see the move_base sketch after this list);
Selecting a map track: the user selects the map and track to be used in the human-computer interaction system, and after the selection is completed they are transmitted to the robot through the TCP network channel.
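A minimal sketch of the navigation step above, sending each point of a selected track to the standard ROS move_base action server through actionlib; the frame name, node name and sample track are illustrative assumptions.

```python
# Hedged sketch: driving the robot along a selected track via move_base goals.

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def follow_track(track):
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    for x, y in track:
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0   # keep a fixed heading for simplicity
        client.send_goal(goal)
        client.wait_for_result()

rospy.init_node("track_follower")
follow_track([(1.0, 0.5), (2.0, 0.5), (2.0, 1.5)])
```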
The map editing of the embodiment comprises the following steps:
Electronic fence: the human-computer interaction system renders the current map with OpenGL ES; after clicking the electronic fence button the user clicks any number of points on the map to generate a closed-loop irregular polygon, which is sent to the robot after clicking to confirm, and the robot generates a full-coverage track according to the existing full-coverage path planning algorithm;
Point-to-go: the user selects the point on the map the robot should go to; after clicking to confirm, the point is sent to the robot, the robot generates a point-to-point track according to the existing point-to-point path planning algorithm, and it navigates to the destination along that track;
Eraser: after the user clicks the eraser, he can draw on the map to erase the corresponding noise points or obstacles; the algorithm redefines the RGB values of the map pixels according to the conversion relation between the cost map and the corresponding pixels, and a new map is generated after clicking to confirm;
Virtual wall: after clicking the virtual wall, the user can draw on the map to generate the corresponding virtual wall; the algorithm redefines the RGB values of the map pixels according to the conversion relation between the cost map and the corresponding pixels, and a new map is generated after clicking to confirm;
Path planning modification: the user clicks path planning modification and moves existing track points with a finger; the human-computer interaction system makes the corresponding path planning modification by monitoring the user's touch events, and the track is redrawn and re-rendered with OpenGL ES (a small sketch of this interaction follows the list);
Track drawing: the user clicks track drawing and taps the map with a finger; the human-computer interaction system connects the corresponding track points by monitoring the user's touch events, and the track is redrawn and re-rendered with OpenGL ES. The 2D and 3D graphic rendering based on OpenGL ES makes the display of map tracks, the robot position and so on more direct and attractive, giving users visual impact.
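As a rough illustration of the path-planning-modification interaction above, the sketch below replaces the track vertex nearest to the touched map position with that position, after which the track would be redrawn; the data structures are assumptions for illustration only.

```python
# Hedged sketch of "path planning modification": move the nearest existing track point
# to the position the user touched, then re-render the track.

def move_nearest_point(track, touch_xy):
    """Return a new track with the vertex closest to touch_xy replaced by touch_xy."""
    tx, ty = touch_xy
    idx = min(range(len(track)),
              key=lambda i: (track[i][0] - tx) ** 2 + (track[i][1] - ty) ** 2)
    edited = list(track)
    edited[idx] = (tx, ty)
    return edited          # the HMI would then redraw this track with OpenGL ES

track = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(move_nearest_point(track, (1.1, 0.4)))
```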
Task state viewing in this embodiment works as follows: the robot publishes its current running state to a corresponding topic through the ROS publish/subscribe system; the human-computer interaction system obtains the current running state of the robot by subscribing to that topic, the robot running-state interface is drawn with an Android Canvas, and the user can view the current task state of the robot. The ROS-based human-computer interaction system has better maintainability and expansibility and is friendlier to developers.
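For completeness, a minimal sketch of the robot-side publisher that such a task-state view would subscribe to; the topic name /robot_status, the 1 Hz rate and the JSON-in-String payload are assumptions for illustration, not details given in the patent.

```python
# Hedged sketch: the robot publishes its current running state to a status topic.

import json
import rospy
from std_msgs.msg import String

rospy.init_node("robot_status_publisher")
pub = rospy.Publisher("/robot_status", String, queue_size=10)
rate = rospy.Rate(1)  # publish the running state once per second

while not rospy.is_shutdown():
    state = {"task": "cleaning", "progress": 0.42, "battery": 0.87}  # example payload
    pub.publish(String(data=json.dumps(state)))
    rate.sleep()
```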
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (9)

1. A commercial cleaning robot human-computer interaction system, characterized by comprising an access layer, a platform layer and a business logic layer which are communicatively connected with each other, wherein the platform layer comprises an ROS-based robot and the business logic layer comprises a client and an ROS server; the ROS server links with the client by receiving requests from the client to establish the business system, and after performing business processing the ROS server transmits data to the human-computer interface for display; the business system comprises map building navigation, map editing and task state viewing, and a business function is selected by clicking the corresponding icon button displayed on the human-computer interface; map building navigation, map editing and task state viewing are drawn and rendered based on OpenGL ES.
2. The commercial cleaning robot human-computer interaction system according to claim 1, characterized in that the access layer comprises user authentication and rights management.
3. The commercial cleaning robot human-computer interaction system according to claim 1, characterized in that the platform layer further comprises a server in communication with the client, the server being an enterprise-class server.
4. The commercial cleaning robot human-computer interaction system according to claim 1, characterized in that the map building navigation comprises:
Map building: clicking the new-map button displayed on the human-computer interface and remotely controlling the robot; the required linear and angular running speeds of the robot are transmitted through a TCP network channel and converted into wheel speeds according to the wheel spacing and wheel diameter so that the robot advances;
Navigation: generating a full-coverage track for the newly built map, selecting the map track to be run, clicking to start navigation, and navigating according to the ROS move_base navigation algorithm;
Selecting a map track: displaying the selected map and track on the human-computer interface and transmitting them to the robot through the TCP network channel.
5. The commercial cleaning robot human-computer interaction system according to claim 1, characterized in that the map editing comprises:
Electronic fence: clicking the electronic fence displayed on the human-computer interface, clicking any number of points on the map to generate a closed-loop irregular polygon, sending the polygon to the robot after confirmation, and generating a full-coverage track by the robot according to a full-coverage path planning algorithm;
Point-to-go: selecting a point on the map, sending it to the robot after confirmation, generating a point-to-point track by the robot according to a point-to-point path planning algorithm, and navigating to the destination along that track;
Eraser: clicking the eraser and drawing on the map to erase the corresponding noise points or obstacles;
Virtual wall: clicking the virtual wall and drawing on the map to generate the corresponding virtual wall;
Path planning modification: clicking path planning modification, moving existing track points with a finger, making the corresponding path planning modification by monitoring touch events, and redrawing and re-rendering the path with OpenGL ES;
Track drawing: clicking track drawing, tapping the map with a finger, connecting the corresponding track points by monitoring touch events, and redrawing and re-rendering the track with OpenGL ES.
6. The commercial cleaning robot human-computer interaction system according to claim 1, characterized in that the task state viewing is: the robot publishes its current running state to a corresponding topic through the ROS publish/subscribe system, the human-computer interaction system subscribes to the topic to obtain the current running state of the robot, and the robot running-state interface is drawn with an Android Canvas.
7. The commercial cleaning robot human-computer interaction system according to claim 1, characterized in that the client sends a request to link to the ROS server, the ROS server accepts the request and replies to the client to establish a TCP long connection for accessing the business system, and the ROS server performs business processing and transmits data to the human-computer interface for display to the user.
8. The commercial cleaning robot human-computer interaction system according to claim 4, characterized in that the map is built from laser, camera and odometer data; when the map is satisfactory, map building is finished by clicking and the map is stored persistently; the map can be edited after it has been built.
9. The commercial cleaning robot human-computer interaction system according to claim 5, characterized in that the eraser and the virtual wall redefine the RGB values of the map pixels according to the conversion relation between the cost map and the corresponding pixels, and a new map is generated after clicking to confirm.
Application CN202110109958.4A, priority date 2021-01-27, filing date 2021-01-27: Commercial cleaning robot human-computer interaction system. Status: Pending. Publication: CN112925467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110109958.4A CN112925467A (en) 2021-01-27 2021-01-27 Commercial cleaning robot human-computer interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110109958.4A CN112925467A (en) 2021-01-27 2021-01-27 Commercial cleaning robot human-computer interaction system

Publications (1)

Publication Number Publication Date
CN112925467A 2021-06-08

Family

ID=76166833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110109958.4A Pending CN112925467A (en) 2021-01-27 2021-01-27 Commercial cleaning robot human-computer interaction system

Country Status (1)

Country Link
CN (1) CN112925467A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419193A (en) * 2022-01-24 2022-04-29 北京思明启创科技有限公司 Image drawing method, image drawing device, electronic equipment and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106020208A (en) * 2016-07-27 2016-10-12 湖南晖龙股份有限公司 Robot remote control method based on ROS operating system and remote control system thereof
CN108333974A (en) * 2018-03-15 2018-07-27 珠海金萝卜智动科技有限公司 A kind of all-purpose robot control system and method based on ROS
CN108873913A (en) * 2018-08-22 2018-11-23 深圳乐动机器人有限公司 From mobile device work compound control method, device, storage medium and system
CN109276833A (en) * 2018-08-01 2019-01-29 吉林大学珠海学院 A kind of robot patrol fire-fighting system and its control method based on ROS
CN110448232A (en) * 2019-08-14 2019-11-15 成都普诺思博科技有限公司 Intelligent cleaning robot management system based on cloud platform
CN110531725A (en) * 2019-09-19 2019-12-03 上海机器人产业技术研究院有限公司 A kind of map sharing method based on cloud
CN112099487A (en) * 2020-08-06 2020-12-18 盐城工学院 Map construction and simultaneous positioning method based on ROS

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106020208A (en) * 2016-07-27 2016-10-12 湖南晖龙股份有限公司 Robot remote control method based on ROS operating system and remote control system thereof
CN108333974A (en) * 2018-03-15 2018-07-27 珠海金萝卜智动科技有限公司 A kind of all-purpose robot control system and method based on ROS
CN109276833A (en) * 2018-08-01 2019-01-29 吉林大学珠海学院 A kind of robot patrol fire-fighting system and its control method based on ROS
CN108873913A (en) * 2018-08-22 2018-11-23 深圳乐动机器人有限公司 From mobile device work compound control method, device, storage medium and system
CN110448232A (en) * 2019-08-14 2019-11-15 成都普诺思博科技有限公司 Intelligent cleaning robot management system based on cloud platform
CN110531725A (en) * 2019-09-19 2019-12-03 上海机器人产业技术研究院有限公司 A kind of map sharing method based on cloud
CN112099487A (en) * 2020-08-06 2020-12-18 盐城工学院 Map construction and simultaneous positioning method based on ROS

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419193A (en) * 2022-01-24 2022-04-29 北京思明启创科技有限公司 Image drawing method, image drawing device, electronic equipment and computer-readable storage medium
CN114419193B (en) * 2022-01-24 2023-03-10 北京思明启创科技有限公司 Image drawing method, image drawing device, electronic equipment and computer-readable storage medium

Similar Documents

Publication Publication Date Title
Gadre et al. End-user robot programming using mixed reality
Gossow et al. Interactive markers: 3-d user interfaces for ros applications [ros topics]
EP2645267A1 (en) Application sharing
Manring et al. Augmented reality for interactive robot control
CN110216683A (en) A kind of cooperation robot teaching method based on game paddle
CN112925467A (en) Commercial cleaning robot human-computer interaction system
CN116197899A (en) Active robot teleoperation system based on VR
US20220358258A1 (en) Computer-aided design methods and systems
Li et al. Towards robust exocentric mobile robot tele-operation in mixed reality
JPH09190549A (en) Interactive video presenting device
CN114791765B (en) ROS intelligent vehicle interaction method based on mixed reality technology
CN115185368A (en) Mobile robot interactive operation system based on Hololens
Mitterberger et al. Extended reality collaboration: Virtual and mixed reality system for collaborative design and holographic-assisted on-site fabrication
Fang et al. A survey on HoloLens AR in support of human-centric intelligent manufacturing
Ressler et al. Integrating active tangible devices with a synthetic environment for collaborative engineering
Fang et al. Assisted human-robot interaction for industry application based augmented reality
CN212135182U (en) Laser machine operating system
Naef et al. Autoeval mkII-interaction design for a VR design review system
Villa et al. Cobity: A Plug-And-Play Toolbox to Deliver Haptics in Virtual Reality
JPH1166351A (en) Method and device for controlling object operation inside three-dimensional virtual space and recording medium recording object operation control program
JP3615840B2 (en) Image display device
JP4347017B2 (en) Information processing method and image processing method
Aarizou ROS-based web application for an optimized multi-robots multi-users manipulation
KR100946672B1 (en) Method and system for providing 3d scene navigation using of camera haptic interface
Zheng et al. ARCrowd-a tangible interface for interactive crowd simulation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination