CN108032302A - Design method for mutually mapping computer vision, touch, and smell with time - Google Patents

Design method for mutually mapping computer vision, touch, and smell with time

Info

Publication number
CN108032302A
CN108032302A
Authority
CN
China
Prior art keywords
time
smell
data
tactile
computer vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711404039.XA
Other languages
Chinese (zh)
Inventor
胡明建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201711404039.XA
Publication of CN108032302A
Legal status: Pending


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40005Vision, analyse image at one station during manipulation at next station

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

A design method for mutually mapping computer vision, touch, and smell with time belongs to the technical fields of robotics, artificial intelligence, computers, image processing, olfactory-sensor technology, touch sensors, mathematics, and so on. The main technique is to record simultaneously the image captured by a camera, the data collected by the olfactory sensor, and the data collected by the touch sensor, and then to process them computationally so that the image of the material photographed by the camera is mapped to the smell that was smelled and the sensation that was touched. When the data are saved by computation, they are saved simultaneously, and the time is saved together with them; this time serves as the mapping. Thus, when a previously stored smell is received, it can be found in the database, and through the recorded time the corresponding image and sensation are found at the same time. When the image is seen, the corresponding smell and sensation are found through the time mapping; when the sensation is touched, the corresponding image and smell can likewise be found through the time mapping.

Description

Design method for mutually mapping computer vision, touch, and smell with time
Technical field
The design method for mutually mapping computer vision, touch, and smell with time belongs to the technical fields of robotics, artificial intelligence, computers, image processing, olfactory-sensor technology, touch sensors, mathematics, and so on. The main technique is to record simultaneously the image captured by the camera, the data collected by the olfactory sensor, and the data collected by the touch sensor; then, through computational processing, the image of the material photographed by the camera is mapped to the smell that was smelled and the sensation that was touched. When the data are saved by computation, they are saved simultaneously, and the time is saved together with them; this time serves as the mapping. Thus, when a previously stored smell is received, it can be found in the database, and through the recorded time the corresponding image and sensation are found at the same time. When the image is seen, the corresponding smell and sensation are found through the time mapping; when the sensation is touched, the corresponding image and smell can likewise be found through the time mapping.
Background technology
Intelligent robots are the most complex robots, and the machine friends that mankind most longs to create. But producing an intelligent robot is not easy; merely making a machine simulate human walking has cost scientists decades, even a century, of effort. In 1920 the Czechoslovak writer Karel Čapek, in his science-fiction play, coined the word "robot" from Robota (Czech, originally "forced labor, drudgery") and Robotnik (Polish, originally "worker"). At the 1939 New York World's Fair, Westinghouse Electric exhibited the household robot Elektro. It was controlled by cable, could walk, could speak 77 words, and could even smoke, but it was still far from really doing housework. Nevertheless, it made people's hopes for domestic robots more concrete. In 1942 the American science-fiction master Asimov proposed the "Three Laws of Robotics". Although this was a creation of science fiction, it later became the default research and development principle of academia.
In 1948 Norbert Wiener published Cybernetics: or Control and Communication in the Animal and the Machine, expounding the common laws of communication and control functions in machines and of the nerves and sensory capabilities of humans, and was the first to propose automated factories with the computer at their core. In 1954 Marvin Minsky put forward his view of intelligent machines at the Dartmouth conference: an intelligent machine "can create an abstract model of its surrounding environment and, if it encounters a problem, can find a way out from the abstract model". This definition influenced the research direction of intelligent robots for the following 30 years. In 1956 the American George Devol produced the world's first programmable robot and registered a patent. This manipulator could perform different work according to different programs and therefore had versatility and flexibility. In 1959 Devol and the American inventor Joseph Engelberger jointly produced the first industrial robot, and then founded the world's first robot manufacturing company, Unimation. Because of Engelberger's research, development, and promotion of industrial robots, he is known as the "father of the industrial robot". In 1962 the American company AMF produced the "VERSTRAN" (meaning versatile handling), which, together with the Unimate produced by Unimation, became the first truly commercialized industrial robots; they were exported to countries all over the world and started a worldwide upsurge in robots and robot research.
The application of sensors improved the operability of robots. People tried to install various sensors on robots, including the touch sensor used by Ernst in 1961, the pressure sensor used by Tomovic and Boni in 1962 in the world's earliest "dexterous hand", and the visual sensing systems that McCarthy began to add to robots in 1963; he then helped MIT launch the world's first robot system that carried a visual sensor and could identify and position building blocks. In 1965 the Applied Physics Laboratory of Johns Hopkins University developed the Beast robot. Beast could correct its own position according to the environment using devices such as a sonar system and photoelectric tubes. In the mid-1960s, robot laboratories were set up one after another at MIT, Stanford University, and the University of Edinburgh in Britain. The United States began research on second-generation robots with sensors, robots that could "feel", and set out toward artificial intelligence. In 1968 the Stanford Research Institute announced that they had successfully developed the robot Shakey. It had visual sensors and could find and grasp building blocks according to a person's instructions, but the computer controlling it was the size of a room. Shakey can be regarded as the world's first intelligent robot, opening the prelude to third-generation robot research and development. The laboratory of Ichiro Kato at Waseda University in Japan developed the first robot that walked on two feet. Kato devoted many years to research on humanoid robots and is known as the "father of humanoid robots". Japanese experts have always been good at the technology of humanoid robots and entertainment robots; taken further, this later gave rise to Honda's ASIMO and Sony's QRIO. The first time in the world that a robot and a minicomputer worked together was the birth of the robot T3 of the American company Cincinnati Milacron. The American company Unimation released the popular industrial robot PUMA, which marked the full maturity of industrial robot technology; PUMA still works on factory front lines to this day. Engelberger later launched the robot HelpMate, which could bring meals, deliver medicine, and deliver mail to patients in hospitals. In the same year he also predicted: "I will let robots mop the floor, cook, go out and wash my car, and check security." The famous Chinese scholar Professor Zhou Haizhong predicted in the article On Robots that by the middle of the 21st century, nanorobots will thoroughly change the work and lifestyle of mankind. The Danish company Lego released the robot (Mindstorms) kit, which made building a robot as easy as playing with building blocks and freely assemblable, and robots began to enter the private world. In 1999 the Japanese company Sony released the robot dog AIBO, which sold out immediately; from then on, entertainment robots became one of the ways for robots to step into ordinary families. In 2002 the American company iRobot launched the vacuum-cleaning robot Roomba, which can avoid obstacles, automatically design its route of travel, and drive to the charging dock automatically when its power is insufficient. Roomba is currently the world's best-selling and most commercialized domestic robot. In June 2006 Microsoft released Microsoft Robotics Studio; the trend of robot modularization and platform unification became more and more obvious, and Bill Gates predicted that domestic robots would soon sweep the globe. Therefore, the important input information for an intelligent robot is vision, smell, and touch, and present technology does not map vision, smell, and touch to one another well; hence the present invention.
Content of the invention
Owing to the rapid development of artificial intelligence and robotics, in existing technology humans can use cameras to identify many things well, and can use an electronic nose to distinguish various smells well; machine tactile discrimination likewise matches or exceeds human ability. In each single modality, machine recognition already exceeds human recognition. In human behavior, however, the senses are linked: when a person sees a durian, they remember its smell and at the same time remember the sensation of touching the small cones on its exterior that prick the hand; when they smell the odor of a durian, they remember the durian and at the same time remember the sensation of touching its shape; when they touch the shape of a durian, they remember that it is a durian and at the same time remember the durian's smell. Present computer technology has no way to realize this kind of association. The present invention solves exactly this problem and realizes this function of mutual mapping; the key point of the invention is that the modalities are linked through time, and the mapping is realized through time.
The design method for mutually mapping computer vision, touch, and smell with time consists of five parts: 1 is the computer-vision processing part, 2 is the machine-smell processing part, 3 is the machine-touch processing part, 4 is establishing the correspondence, and 5 is the method of storing with time as the mapping. The computer-vision processing part mainly uses a video camera to capture pictures, processes the pictures with various algorithms, and then either compares features extracted from the pictures with stored data or identifies them with deep-learning methods. The smell-processing method uses an electronic nose to identify gases in the air, converts this information into data, and stores the data. Touch processing contacts the surface of a material through sensors on a manipulator and then identifies various data about the material: shape, temperature, size, weight, and hardness. The correspondence between image, smell, and touch is established as follows: when the camera photographs a material, the features of the material are extracted; at the same time the electronic nose smells the odor the material gives off, and the sensors on the manipulator contact the material. The various data are then saved in their respective spaces. The storage can proceed in any order because time is used as the mapping, which is the most important point: it allows the machine to remember when the event occurred, using the time of memory as the association among the three modalities, which gives the association uniqueness and real-time validity.
The design therefore works as follows. When the smell sensor receives a signal, the data are first matched against the data stored in the database. If they match, the smell is known. If they do not match, the source is located by the concentration gradient of the smell; the machine vision is moved to photograph it according to the concentration gradient, and picture processing starts. On arrival, the manipulator is used to feel the tactile features of the material, and this is processed computationally; it is then determined whether to store the event. If the event is to be stored, the time is fetched from the timer, and the material features after picture processing, the data after smell processing, and the data after touch processing are stored separately, each in its own space, with the time linking them together. If the mechanical nose later smells an odor, the odor is compared with the data stored in the database; if matching data are found, the originally remembered time is recalled, and the corresponding picture and tactile data are looked up according to that time. Likewise, if computer vision sees a thing, it is matched against the stored data; if it matches, the originally remembered time is recalled, and the time is searched to find the corresponding smell and touch. If the manipulator contacts a material, the generated data are matched against the stored data; if they match, the originally remembered time is recalled, and the corresponding image and smell are looked up according to that time. In this way there is a one-to-one correspondence, and it is known when the event occurred.
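As a minimal illustrative sketch (not part of the patent text), the time-keyed storage and smell-matching flow described above can be modeled as three stores sharing one timestamp key. The feature vectors, the distance metric, and the match threshold are all assumptions, not specified by the patent:

```python
import math

# Three modality stores keyed by a shared timestamp.
odor_db, image_db, touch_db = {}, {}, {}

def store_event(timestamp, image_feat, odor_feat, touch_feat):
    """Part 5 of the method: save all three modalities under the same time."""
    image_db[timestamp] = image_feat
    odor_db[timestamp] = odor_feat
    touch_db[timestamp] = touch_feat

def _distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recall_by_smell(odor_feat, threshold=0.5):
    """Match an incoming odor against the odor store; on a match, recall
    the remembered time and fetch the corresponding image and touch data."""
    if not odor_db:
        return None
    t, stored = min(odor_db.items(), key=lambda kv: _distance(kv[1], odor_feat))
    if _distance(stored, odor_feat) > threshold:
        # Unmatched smell: the patent would instead localize the source by its
        # concentration gradient, move the camera, and store a new event.
        return None
    return {"time": t, "image": image_db[t], "touch": touch_db[t]}

store_event("2013-01-01 12:12",
            image_feat=[0.9, 0.1], odor_feat=[0.7, 0.3], touch_feat=[0.2, 0.8])
hit = recall_by_smell([0.72, 0.28])  # close to the stored odor vector
```

Because all three modalities share the timestamp key, a match in any one store yields the time, and the time yields the other two records in constant time.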
Brief description of the drawings
Fig. 1 is a schematic diagram of the design method for mutually mapping computer vision, touch, and smell with time. a-1 represents the odor database, a-2 the image database, and a-3 the touch database; b-1 represents the correspondence between smell and image, b-2 the correspondence between image and touch, and b-3 the correspondence between smell and touch; c-1 represents the time portion of the odor database and c-2 its data portion; d-1 represents the time portion of the image database and d-2 its data portion; e-1 represents the time portion of the touch database and e-2 its data portion; the circles below represent many further entries.
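As a hypothetical sketch of the database layout in Fig. 1 (the class and field names are invented for illustration), each modality database holds records split into a time portion and a data portion, with the time portion acting as the shared lookup key:

```python
from dataclasses import dataclass, field
from typing import Any, List, Optional

@dataclass
class Record:
    time: str   # time portion (c-1 / d-1 / e-1): shared key across databases
    data: Any   # data portion (c-2 / d-2 / e-2): the modality's stored features

@dataclass
class ModalityDB:
    name: str
    records: List[Record] = field(default_factory=list)

    def add(self, time: str, data: Any) -> None:
        self.records.append(Record(time, data))

    def find_by_time(self, time: str) -> Optional[Any]:
        # Scan the time portion for a matching timestamp, return its data portion.
        return next((r.data for r in self.records if r.time == time), None)

odor_db = ModalityDB("odor")    # a-1
image_db = ModalityDB("image")  # a-2
touch_db = ModalityDB("touch")  # a-3
odor_db.add("2013-01-01 12:12", [0.7, 0.3])
image_db.add("2013-01-01 12:12", [0.9, 0.1])
```

Keeping the three databases separate but time-aligned mirrors the b-1/b-2/b-3 correspondences in the figure: no cross-links are stored, only the shared timestamps.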
Embodiment
After the robot starts, the system enters the working state, and its machine touch, computer vision, and machine smell also enter the normal state. The robot's manipulator contacts exterior materials and continuously transmits tactile data, the olfactory system continuously receives exterior odors, and the camera continuously photographs various materials. The design method for mutually mapping computer vision, touch, and smell with time consists of five parts: 1 is the computer-vision processing part, 2 is the machine-smell processing part, 3 is the machine-touch processing part, 4 is establishing the correspondence, and 5 is the method of storing with time as the mapping. The computer-vision processing part mainly uses a video camera to capture pictures, processes the pictures with various algorithms, and then either compares features extracted from the pictures with stored data or identifies them with deep-learning methods. The smell-processing method uses an electronic nose to identify gases in the air, converts this information into data, and stores the data. Touch processing contacts the surface of a material through sensors on a manipulator and then identifies various data about the material: shape, temperature, size, weight, and hardness. The correspondence between image, smell, and touch is established as follows: when the camera photographs a material, the features of the material are extracted; at the same time the electronic nose smells the odor the material gives off, and the sensors on the manipulator contact the material. The various data are then saved in their respective spaces. The storage can proceed in any order because time is used as the mapping, which is the most important point: it allows the machine to remember when the event occurred, using the time of memory as the association among the three modalities, which gives the association uniqueness and real-time validity.
The design therefore works as follows. When the smell sensor receives a signal, the data are first matched against the data stored in the database; if they match, the smell is known. If they do not match, the source is located by the concentration gradient of the smell, the machine vision is moved to photograph it according to the concentration gradient, and picture processing starts. On arrival, the manipulator is used to feel the tactile features of the material, and this is processed computationally; it is then determined whether to store the event. If the event is to be stored, the time is fetched from the timer, and the material features after picture processing, the data after smell processing, and the data after touch processing are stored separately, each in its own space, with the time linking them together. If the mechanical nose later smells an odor, the odor is compared with the data stored in the database; if matching data are found, the originally remembered time is recalled, and the corresponding picture and tactile data are looked up according to that time. Likewise, if computer vision sees a thing, it is matched against the stored data; if it matches, the originally remembered time is recalled, and the time is searched to find the corresponding smell and touch. If the manipulator contacts a material, the generated data are matched against the stored data; if they match, the originally remembered time is recalled, and the corresponding image and smell are looked up according to that time. In this way there is a one-to-one correspondence, and it is known when the event occurred.
As shown in the drawing, when storage is needed, the data are stored simultaneously according to the time. For example, at 12:12 on January 1, 2013, the smell of a durian is smelled: the time 2013-01-01 12:12 and the durian odor data are stored together, and at the same time the time 2013-01-01 12:12 and the durian image features are stored together. The durian is then touched, producing tactile data, and the time 2013-01-01 12:12 and these tactile data are stored together. An algorithm is then established to create the correspondence. If an incoming smell matches the durian odor data in the database, this smell is displayed; at the same time the time 2013-01-01 12:12 is used to search the time portion of the image storage space, the matching time is found, and the durian is displayed; at the same time the time 2013-01-01 12:12 is used to search the touch database, and the durian's tactile data are found. For the same reason, if the durian is seen, the durian's image features are looked up in the image database and displayed, while the time 2013-01-01 12:12 is used to search the odor database, find the odor data, and display them, and at the same time to search the touch database and find the durian's tactile data. Likewise, if the durian is touched, the durian's tactile data are obtained, the touch database is searched, the durian's tactile data are found, and the corresponding time 2013-01-01 12:12 is used to find the durian's image features and the durian's smell. In this way the mapping according to the time correspondence is realized, and it is known when the event occurred.
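The durian walkthrough above (store once under 2013-01-01 12:12, then recall across modalities) can be sketched as follows; the string "features" stand in for real sensor data and are purely illustrative:

```python
# One event stored under one timestamp, across three modality stores.
t = "2013-01-01 12:12"
odor_db = {t: "durian-odor-features"}
image_db = {t: "durian-image-features"}
touch_db = {t: "durian-touch-features"}

def cross_modal_recall(query_db, other_dbs, query):
    """Match the query in its own modality, recall the remembered time,
    and use that time to look up the corresponding entries elsewhere."""
    for time, data in query_db.items():
        if data == query:  # matched against stored data
            return time, {name: db[time] for name, db in other_dbs.items()}
    return None, None

# Smelling the durian recalls when it was recorded, plus its image and feel.
when, recalled = cross_modal_recall(
    odor_db, {"image": image_db, "touch": touch_db}, "durian-odor-features")
```

The same function works symmetrically: querying `image_db` or `touch_db` with the other stores passed as `other_dbs` recalls the remaining two modalities through the shared time.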

Claims (1)

1. A design method for mutually mapping computer vision, touch, and smell with time, characterized in that the design method for mutually mapping computer vision, touch, and smell with time consists of five parts: 1 is a computer-vision processing part, 2 is a machine-smell processing part, 3 is a machine-touch processing part, 4 is establishing the correspondence, and 5 is the method of storing with time as the mapping; the computer-vision processing part mainly uses a video camera to capture pictures, processes the pictures with various algorithms, and then compares features extracted from the pictures with stored data or identifies them with deep-learning methods; the smell-processing method uses an electronic nose to identify gases in the air, converts this information into data, and stores the data; touch processing contacts the surface of a material through sensors on a manipulator and then identifies the material's shape, temperature, size, weight, hardness, and other data; the correspondence between image, smell, and touch is established: when the camera photographs a material, the features of the material are extracted, while the electronic nose smells the odor the material gives off and the sensors on the manipulator contact the material; the various data are then saved in their respective spaces; the storage may proceed in any order, with time as the mapping, which is the most important point, allowing the machine to remember when the event occurred, using the time of memory as the association among the three, possessing uniqueness and real-time validity; therefore the following design is used: when the smell sensor receives a signal, the data are first matched against the data stored in the database; if they match, the smell is known; if they do not match, the source is located by the concentration gradient of the smell, the machine vision is moved to photograph it according to the concentration gradient, and picture processing starts; on arrival, the manipulator feels the tactile features of the material and this is processed computationally; it is then determined whether to store; if the event is to be stored, the time is fetched from the timer, and the material features after picture processing, the data after smell processing, and the data after touch processing are stored separately, each in its own space, with the time linking them together; if the mechanical nose later smells an odor, the odor is compared with the data stored in the database; if matching data are found, the originally remembered time is recalled, and the corresponding picture and tactile data are looked up according to that time; likewise, if computer vision sees a thing, it is matched against the stored data; if it matches, the originally remembered time is recalled, and the time is searched to find the corresponding smell and touch; if the manipulator contacts a material, the generated data are matched against the stored data; if they match, the originally remembered time is recalled, and the corresponding image and smell are looked up according to that time; in this way there is a one-to-one correspondence, and it is known when the event occurred.
CN201711404039.XA 2017-12-22 2017-12-22 Design method for mutually mapping computer vision, touch, and smell with time Pending CN108032302A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711404039.XA CN108032302A (en) 2017-12-22 2017-12-22 Design method for mutually mapping computer vision, touch, and smell with time

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711404039.XA CN108032302A (en) 2017-12-22 2017-12-22 Design method for mutually mapping computer vision, touch, and smell with time

Publications (1)

Publication Number Publication Date
CN108032302A true CN108032302A (en) 2018-05-15

Family

ID=62100505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711404039.XA Pending CN108032302A (en) 2017-12-22 2017-12-22 The design method that a kind of computer vision tactile smell is mutually mapped with the time

Country Status (1)

Country Link
CN (1) CN108032302A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112720448A (en) * 2019-10-14 2021-04-30 防灾科技学院 Positioning robot for self-recognition and positioning system thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006059815A1 (en) * 2004-11-30 2006-06-08 Electronics And Telecommunications Research Institute Olfactory information encoding apparatus and method, and scent code generating apparatus and method
CN102770820A (en) * 2009-12-22 2012-11-07 Atonarp株式会社 Robot
CN103003761A (en) * 2010-07-22 2013-03-27 吉拉吉尔斯芬两合公司 System and method for processing visual, auditory, olfactory, and/or haptic information
CN103608749A (en) * 2011-04-26 2014-02-26 加利福尼亚大学董事会 Systems and devices for recording and reproducing senses
CN106997388A (en) * 2017-03-30 2017-08-01 宁波亿拍客网络科技有限公司 An image and non-image marking method, device, and application method
CN107370660A (en) * 2017-06-07 2017-11-21 维沃移动通信有限公司 An information perception method and mobile terminal


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112720448A (en) * 2019-10-14 2021-04-30 防灾科技学院 Positioning robot for self-recognition and positioning system thereof
CN112720448B (en) * 2019-10-14 2022-11-15 防灾科技学院 Positioning robot for self-recognition and positioning system thereof

Similar Documents

Publication Publication Date Title
US10445939B2 (en) Tactile interaction in virtual environments
Dario et al. Humanoids and personal robots: Design and experiments
Burdea Keynote address: haptics feedback for virtual reality
Li et al. Survey on mapping human hand motion to robotic hands for teleoperation
JP6439817B2 (en) Adapting object handover from robot to human based on cognitive affordance
CN107972069A Design method for mutually mapping computer vision and machine touch with time
Pyo et al. Service robot system with an informationally structured environment
CN110026987A Method, device, equipment and storage medium for generating a grasping trajectory of a robotic arm
CN109591013A A flexible assembly simulation system and its implementation method
CN109732593A A remote control method and device for a robot, and terminal device
Hong et al. Multiply: A multisensory object-centric embodied large language model in 3d world
Shidujaman et al. “roboquin”: A mannequin robot with natural humanoid movements
CN108032302A Design method for mutually mapping computer vision, touch, and smell with time
CN107643820A VR passive humanoid robot and its implementation method
Oniga et al. Intelligent human-machine interface using hand gestures recognition
Nguyen et al. Reaching development through visuo-proprioceptive-tactile integration on a humanoid robot-a deep learning approach
CN107891448A Design method for mutually mapping computer vision, hearing, and touch with time
CN108247601A Semantic grasping robot based on deep learning
Brock et al. A framework for learning and control in intelligent humanoid robots
Luo Intelligent Textiles for Physical Human-Environment Interactions
CN108108437A Design method for mutually mapping computer vision, hearing, and smell with time
CN108115729A Design method for mutually mapping computer vision, hearing, touch, and smell with time
CN108520074A Design method for associating robot vision, taste, and hearing with time
CN108115728A Design method for mutually mapping machine hearing, touch, and smell with time
Lin et al. Action recognition for human-marionette interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180515