CN107891448A - Design method for mutually mapping computer vision, hearing and touch through time - Google Patents

Design method for mutually mapping computer vision, hearing and touch through time

Info

Publication number
CN107891448A
CN107891448A CN201711423150.3A
Authority
CN
China
Prior art keywords
time
data
sound
hearing
tactile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711423150.3A
Other languages
Chinese (zh)
Inventor
胡明建 (Hu Mingjian)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201711423150.3A priority Critical patent/CN107891448A/en
Publication of CN107891448A publication Critical patent/CN107891448A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/007 Means or methods for designing or fabricating manipulators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A design method for mutually mapping computer vision, hearing and touch through time belongs to the technical fields of robotics, artificial intelligence, computing, image processing, tactile sensing, auditory sensing and mathematics. Its core technique is to record, at the same moment, the image captured by a camera, the data collected by an auditory sensor and the data collected by a tactile sensor, and then to process them by computation so that the properties of the material seen in the image, the sensation of touching that material and the sound the material makes are associated with one another. When the results are stored, the acquisition time is stored together with them, and this time serves as the mapping key. Thus, when a previously stored tactile sensation is received again, it can be found in the database and, through the recorded time, the corresponding image and sound are found at the same time; when the image is seen, the corresponding touch sensation and sound are found through the time mapping; and when such a sound is heard, the corresponding image and touch sensation are likewise found through the time mapping.

Description

Design method for mutually mapping computer vision, hearing and touch through time
Technical field
The design method for mutually mapping computer vision, hearing and touch through time belongs to the technical fields of robotics, artificial intelligence, computing, image processing, tactile sensing, auditory sensing and mathematics. Its core technique is to record, at the same moment, the image captured by a camera, the data collected by an auditory sensor and the data collected by a tactile sensor, and then to process them by computation so that the properties of the material seen in the image, the sensation of touching that material and the sound the material makes correspond to one another. When the results are stored, the acquisition time is stored together with them, and this time serves as the mapping key. Thus, when a previously stored tactile sensation is received again, it can be found in the database and, through the recorded time, the corresponding image and sound are found at the same time; when the image is seen, the corresponding touch sensation and sound are found through the time mapping; and when such a sound is heard, the corresponding image and touch sensation are likewise found through the time mapping.
Background art
Intelligent robots are the most complex robots and the machine companions that humans most long to create. Producing an intelligent robot is not easy, however; merely making a machine imitate human walking has taken scientists decades, even a century, of effort. In 1920 the Czechoslovak writer Karel Čapek, in his science fiction play, coined the word "robot" from Robota (Czech, originally meaning "forced labour, drudgery") and Robotnik (Polish, originally meaning "worker"). At the 1939 New York World's Fair, Westinghouse Electric exhibited the household robot Elektro. It was controlled by cable, could walk, could speak 77 words and could even smoke, but it was still far from actually doing housework; nevertheless, it made people's hope for domestic robots more concrete. In 1942 the American science fiction master Asimov proposed the "Three Laws of Robotics"; although a creation of science fiction, they later became a default development principle of the research community. In 1948 Norbert Wiener published "Cybernetics: Or Control and Communication in the Animal and the Machine", elaborating the common laws of communication and control in machines and in the human nervous and sensory systems, and was the first to propose computer-centred automated factories. In 1954 Marvin Minsky put forward his view of intelligent machines at the Dartmouth meeting: an intelligent machine "can create an abstract model of its surrounding environment and, if it runs into problems, can find a way out from the abstract model". This definition influenced the research direction of intelligent robots for the following thirty years. In 1956 the American George Devol built the world's first programmable robot and registered a patent; this manipulator could do different work according to different programs and therefore had versatility and flexibility. In 1959 Devol, together with the American inventor Joseph Engelberger, built the first industrial robot, and they then founded the world's first robot manufacturing company, Unimation. Because of Engelberger's development and promotion of industrial robots, he is also called the "father of the industrial robot". In 1962 the American company AMF produced "VERSTRAN" (meaning versatile transfer), which, together with Unimation's Unimate, became the first truly commercialized industrial robots; they were exported to countries all over the world and started a worldwide surge of interest in robots and robot research. The application of sensors improved the operability of robots. People tried to fit robots with all kinds of sensors, including the tactile sensor used by Ernst in 1961, the pressure sensor used by Abramovich and Boni in 1962 in the world's earliest "dexterous hand", and the vision sensing system that McCarthy began to add to robots in 1963; in 1964 he helped MIT launch the world's first robot system with a vision sensor that could recognize and locate building blocks. In 1965 the Beast robot was developed in the Applied Physics Laboratory of Johns Hopkins University; using sonar, photocells and other devices, Beast could correct its own position according to the environment. In the mid-1960s MIT, Stanford University, the University of Edinburgh and others successively established robotics laboratories; the United States began research on the second generation of robots, robots with sensors that could "feel", and set out towards artificial intelligence. In 1968 the Stanford Research Institute announced that it had successfully developed the robot Shakey. It had vision sensors and could find and grasp building blocks according to human instructions, but the computer that controlled it was the size of a room. Shakey can be regarded as the world's first intelligent robot, opening the prelude to the development of third-generation robots. The laboratory of Ichiro Kato at Waseda University in Japan developed the first robot that walked on two feet; Kato devoted many years to research on humanoid robots and is described as the "father of the humanoid robot". Japanese experts have been particularly good at humanoid robot and entertainment robot technology, which later went further and gave rise to Honda's ASIMO and Sony's QRIO. The first time in the world that a robot and a minicomputer worked hand in hand, the robot T3 of the American company Cincinnati Milacron was born. The American company Unimation launched the popular industrial robot PUMA, which marked the full maturity of industrial robot technology; PUMA still works on factory front lines today. Engelberger went on to launch the robot Helpmate, which can bring meals, deliver medicine and carry mail to patients in hospital. In the same year he also predicted: "I am going to let robots scrub floors, cook, go out to wash my car and check security." The famous Chinese scholar Professor Zhou Haizhong predicted in the article "On Robots" that by the middle of the 21st century nanorobots will thoroughly change the way humans work and live. The Danish company Lego released the Mindstorms robot kit, which made building a robot as easy and as freely combinable as playing with building blocks and brought robots into the private world. In 1999 Sony of Japan released the robot dog AIBO, which sold out immediately; from then on, entertainment robots became one of the routes by which robots enter ordinary households. In 2002 the American company iRobot launched the vacuum-cleaning robot Roomba, which can avoid obstacles, plan its route automatically and drive back to its charging dock when the battery is low. Roomba is currently the best-selling and most commercialized domestic robot in the world (the authorized distributor of iRobot for the Beijing area is Beijing Microgrid Zhihong Science and Technology Co., Ltd.). In June 2006 Microsoft released Microsoft Robotics Studio; the trend towards modular, platform-based robots became more and more obvious, and Bill Gates predicted that domestic robots would soon sweep the globe. The important input information for an intelligent robot is therefore vision, touch and hearing, yet present technology does not map vision, touch and hearing to one another well; hence the present invention.
Content of the invention
Owing to the rapid development of artificial intelligence and robotics, in existing technology a camera can already be used to recognize things quite well, the tactile sensors of a manipulator can distinguish various sensations well, and in auditory discrimination machines can also exceed humans; in individual technologies the recognition ability of humans has already been surpassed. In human behaviour, when a person sees a dog, they recall the feeling of touching it and at the same time recall its bark; when they touch a dog, they recall that it is a dog and also recall its bark; when they hear a dog's bark, they recall that it is a dog and at the same time recall the dog's smell. Present computer technology has no way of achieving this associative effect, so the present invention solves exactly this problem and realizes this mutual mapping function; the most important point of the invention is to connect the senses through time and to realize the mapping through time. A design method for mutually mapping machine vision, hearing and touch with time is characterized in that the design method for mutually mapping computer vision, hearing and touch through time consists of five parts: 1, the computer vision processing part; 2, the mechanical touch processing part; 3, the mechanical hearing processing part; 4, establishing the correspondence; and 5, the method of storing with time as the mapping key. The computer vision processing part mainly uses a video camera to capture images, processes the images with various algorithms, and then compares the features extracted from the image with stored data or identifies them by a deep learning method. The touch processing method uses the various sensors on the manipulator to contact the material and produce data; this information is converted into data and then stored. Auditory processing processes the frequencies of the sound signal and makes them suitable for computer processing through analog-to-digital conversion. To establish the correspondence between image, touch and hearing, when the camera photographs a thing its features are extracted, while at the same time the manipulator goes to touch the material and perceives its shape, size, temperature, hardness and other data, and the arrangement of sound sensors at different positions determines the sound that the material makes. It is also possible that the sound made by the thing is first heard through the arrangement of sound sensors, the manipulator then goes to touch the material and obtains various data, and the camera then turns to photograph the material and features are extracted; or the manipulator first touches the material and obtains various tactile information while the sound made by the material is heard, and the camera then photographs it and extracts features. Storing with time as the mapping key is the most important point: it lets the machine remember when the thing happened, and using time as the memory link of the three senses gives the record uniqueness and real-time character. The following design is therefore used. When the manipulator touches a material, various sensing data are produced and are first matched against the data stored in the database; if they match, it is known what the thing is. If they do not match, this information is first stored temporarily, the machine vision is moved to photograph the thing, image processing is started, and at the same time the sound made by the thing is received and processed; this is computed and it is then decided whether to store it. If this phenomenon needs to be stored, the time is fetched from the timer, and the material properties after image processing, the obtained tactile data and the data after auditory processing are each stored in a space set up for them and are mapped to one another through the time. If the manipulator later touches a material, its tactile data are compared with the data stored in the database; if such data are found, the originally remembered time is recalled and the corresponding picture and sound are found according to this time. Similarly, if the computer vision sees a thing, it is matched against the stored data; if it matches, the originally remembered time is recalled and the time is searched to find the corresponding touch sensation and sound. If a sound is heard, it is matched against the stored data; if it matches, the originally remembered time is recalled and the corresponding image and sensation are found according to this time. In this way the senses can correspond to one another, and it is also known when the thing happened.
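The following is a minimal Python sketch of the storage-and-retrieval flow described above. It is not part of the original disclosure: the names (TimeMappedMemory, store_event, recall_from_touch, observe_touch) are illustrative assumptions, and exact equality stands in for the feature matching or deep-learning recognition that the description leaves open.

```python
from datetime import datetime
from typing import Dict, Optional, Tuple


class TimeMappedMemory:
    """Time-keyed cross-modal store: one record per modality under the same timestamp."""

    def __init__(self) -> None:
        # One store per modality; every record is keyed by the capture time.
        self.visual: Dict[datetime, object] = {}    # image features
        self.auditory: Dict[datetime, object] = {}  # processed sound data
        self.tactile: Dict[datetime, object] = {}   # shape, size, temperature, hardness, ...

    def store_event(self, t: datetime, image_feat, sound_data, touch_data) -> None:
        # Store all three modalities under the same timestamp, which acts as the mapping key.
        self.visual[t] = image_feat
        self.auditory[t] = sound_data
        self.tactile[t] = touch_data

    @staticmethod
    def _match(query, store: Dict[datetime, object]) -> Optional[datetime]:
        # Exact equality stands in for feature comparison or deep-learning recognition.
        for t, data in store.items():
            if data == query:
                return t
        return None

    def recall_from_touch(self, touch_data) -> Optional[Tuple[datetime, object, object]]:
        # A known tactile reading recalls the remembered time, then the image and the sound.
        t = self._match(touch_data, self.tactile)
        if t is None:
            return None
        return t, self.visual.get(t), self.auditory.get(t)

    def recall_from_vision(self, image_feat):
        # The other directions work the same way through the shared time key.
        t = self._match(image_feat, self.visual)
        return None if t is None else (t, self.tactile.get(t), self.auditory.get(t))

    def recall_from_sound(self, sound_data):
        t = self._match(sound_data, self.auditory)
        return None if t is None else (t, self.visual.get(t), self.tactile.get(t))

    def observe_touch(self, touch_data, capture_image, capture_sound, clock=datetime.now):
        # Flow from the description: match first; if the touch is unknown, capture the
        # other two modalities, take the current time from the timer, and store all three.
        recalled = self.recall_from_touch(touch_data)
        if recalled is not None:
            return recalled
        t = clock()
        self.store_event(t, capture_image(), capture_sound(), touch_data)
        return t, self.visual[t], self.auditory[t]
```

Keying every modality's record by the same timestamp is what lets a match in any single modality recover the other two.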
Brief description of the drawings
Fig. 1 is a schematic diagram of the design method for mutually mapping computer vision, hearing and touch through time. a-1 denotes the visual database, a-2 the auditory database and a-3 the tactile database; b-1 denotes the correspondence between vision and hearing, b-2 the correspondence between hearing and touch, and b-3 the correspondence between touch and vision; c-1 denotes the data part of the visual database and c-2 its time part; d-1 denotes the time part of the auditory database and d-2 its data part; e-1 denotes the time part of the tactile database and e-2 its data part; the circles below indicate that many more entries follow.
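Read alongside Fig. 1, the record layout can be sketched as below. This is only an illustrative reading of the figure; the Record and SenseDatabase names are assumptions, with each database (a-1, a-2, a-3) pairing a time part (c-2, d-1, e-1) with a data part (c-1, d-2, e-2) and linking to the other databases through the shared time.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, List


@dataclass
class Record:
    time: datetime   # time part of a record (c-2, d-1, e-1 in Fig. 1)
    data: Any        # data part of a record (c-1, d-2, e-2 in Fig. 1)


@dataclass
class SenseDatabase:
    name: str                               # "visual" (a-1), "auditory" (a-2) or "tactile" (a-3)
    records: List[Record] = field(default_factory=list)

    def lookup_by_time(self, t: datetime) -> List[Any]:
        # The shared time is what realizes the correspondences b-1, b-2 and b-3
        # between the three databases.
        return [r.data for r in self.records if r.time == t]
```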
Embodiment
After a robot starts, the system enters the working state and its computer vision, mechanical hearing and mechanical touch also enter their normal state. From this moment the robot's tactile system continuously receives contact stimuli from the outside world, the auditory system continuously receives external sounds, and the camera continuously photographs various materials. The design method for mutually mapping computer vision, hearing and touch through time consists of five parts: 1, the computer vision processing part; 2, the mechanical touch processing part; 3, the mechanical hearing processing part; 4, establishing the correspondence; and 5, the method of storing with time as the mapping key. The computer vision processing part mainly uses a video camera to capture images, processes the images with various algorithms, and then compares the features extracted from the image with stored data or identifies them by a deep learning method. The touch processing method uses the various sensors on the manipulator to contact the material and produce data; this information is converted into data and then stored. Auditory processing processes the frequencies of the sound signal and makes them suitable for computer processing through analog-to-digital conversion. To establish the correspondence between image, touch and hearing, when the camera photographs a thing its features are extracted, while at the same time the manipulator goes to touch the material and perceives its shape, size, temperature, hardness and other data, and the arrangement of sound sensors at different positions determines the sound that the material makes. It is also possible that the sound made by the thing is first heard through the arrangement of sound sensors, the manipulator then goes to touch the material and obtains various data, and the camera then turns to photograph the material and features are extracted; or the manipulator first touches the material and obtains various tactile information while the sound made by the material is heard, and the camera then photographs it and extracts features. Storing with time as the mapping key is the most important point: it lets the machine remember when the thing happened, and using time as the memory link of the three senses gives the record uniqueness and real-time character. The following design is therefore used. When the manipulator touches a material, various sensing data are produced and are first matched against the data stored in the database; if they match, it is known what the thing is. If they do not match, this information is first stored temporarily, the machine vision is moved to photograph the thing, image processing is started, and at the same time the sound made by the thing is received and processed; this is computed and it is then decided whether to store it. If this phenomenon needs to be stored, the time is fetched from the timer, and the material properties after image processing, the obtained tactile data and the data after auditory processing are each stored in a space set up for them and are mapped to one another through the time. If the manipulator later touches a material, its tactile data are compared with the data stored in the database; if such data are found, the originally remembered time is recalled and the corresponding picture and sound are found according to this time. Similarly, if the computer vision sees a thing, it is matched against the stored data; if it matches, the originally remembered time is recalled and the time is searched to find the corresponding touch sensation and sound. If a sound is heard, it is matched against the stored data; if it matches, the originally remembered time is recalled and the corresponding image and sensation are found according to this time. In this way the senses correspond to one another and it is also known when the thing happened. As shown in the drawing, when storage is needed the data are stored together according to the time. For example, at 12:12 on 01 January 2013 the sensation of touching a dog is obtained, so this tactile data is stored together with the time 12:12, 01 January 2013; at the same time the image features of the dog are stored together with 12:12, 01 January 2013; and at this moment the dog's bark is heard, so the data of this sound are also stored together with the time 12:12, 01 January 2013. An algorithm is then established to build the correspondence. If the data produced by touching a dog match data in the database, this sensation is presented, and at the same time the time part of the storage space, 12:12 on 01 January 2013, is used to search the image features; the matching time is found and the dog's image is presented, while the same time, 12:12 on 01 January 2013, is used to search the auditory database, so the dog's sound can be found. For the same reason, if the dog is seen, its image features can be looked up in the image database and presented, and at the same time the time 12:12, 01 January 2013 is used to search the tactile database, the tactile data are found and presented, and this time is also used to search the auditory database, so the dog's sound can be found. Similarly, if the dog's bark is heard, the time 12:12, 01 January 2013 is used to find the image features of the dog and the sensory data of touching the dog. In this way the mapping according to the time correspondence is realized, and it is also known when the thing happened.
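As a hypothetical walk-through of the dog example above (not part of the original disclosure), the sketch below reuses the TimeMappedMemory class assumed in the earlier sketch: the tactile, visual and auditory data are stored under 12:12 on 01 January 2013, and a later matching touch recalls that time and, through it, the image and the sound. All feature values are placeholders.

```python
from datetime import datetime

memory = TimeMappedMemory()  # class from the earlier sketch

# 01 January 2013, 12:12 - the dog is touched, seen and heard at the same moment.
t_dog = datetime(2013, 1, 1, 12, 12)
memory.store_event(
    t_dog,
    image_feat=("dog", "image-features"),   # placeholder visual features
    sound_data=("dog", "bark-spectrum"),    # placeholder auditory data
    touch_data=("soft", "warm", "furry"),   # placeholder tactile data
)

# Later, the manipulator touches something that matches the stored tactile data:
# the remembered time is recalled and the image and sound are found through it.
recalled = memory.recall_from_touch(("soft", "warm", "furry"))
if recalled is not None:
    when, image, sound = recalled
    print(when)   # 2013-01-01 12:12:00 - so the system also knows when it happened
    print(image)  # ('dog', 'image-features')
    print(sound)  # ('dog', 'bark-spectrum')

# Hearing the stored bark works the same way, returning the time, image and touch data.
print(memory.recall_from_sound(("dog", "bark-spectrum")))
```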

Claims (1)

1. A design method for mutually mapping computer vision, hearing and touch through time, characterized in that the design method for mutually mapping computer vision, hearing and touch through time consists of five parts: 1, the computer vision processing part; 2, the mechanical touch processing part; 3, the mechanical hearing processing part; 4, establishing the correspondence; and 5, the method of storing with time as the mapping key; the computer vision processing part mainly uses a video camera to capture images, processes the images with various algorithms, and then compares the features extracted from the image with stored data or identifies them by a deep learning method; the touch processing method uses the various sensors on the manipulator to contact the material and produce data, converts this information into data and then stores it; auditory processing processes the frequencies of the sound signal and makes them suitable for computer processing through analog-to-digital conversion; to establish the correspondence between image, touch and hearing, when the camera photographs a thing its features are extracted, while at the same time the manipulator goes to touch the material and perceives its shape, size, temperature, hardness and other data, and the arrangement of sound sensors at different positions determines the sound that the material makes; it is also possible that the sound made by the thing is first heard through the arrangement of sound sensors, the manipulator then goes to touch the material and obtains various data, and the camera then turns to photograph the material and features are extracted; or the manipulator first touches the material and obtains various tactile information while the sound made by the material is heard, and the camera then photographs it and extracts features; storing with time as the mapping key is the most important point, letting the machine remember when the thing happened, and using time as the memory link of the three senses gives the record uniqueness and real-time character; the following design is therefore used: when the manipulator touches a material, various sensing data are produced and are first matched against the data stored in the database; if they match, it is known what the thing is; if they do not match, this information is first stored temporarily, the machine vision is moved to photograph the thing, image processing is started, and at the same time the sound made by the thing is received and processed; this is computed and it is then decided whether to store it; if this phenomenon needs to be stored, the time is fetched from the timer, and the material properties after image processing, the obtained tactile data and the data after auditory processing are each stored in a space set up for them and are mapped to one another through the time; if the manipulator later touches a material, its tactile data are compared with the data stored in the database; if such data are found, the originally remembered time is recalled and the corresponding picture and sound are found according to this time; similarly, if the computer vision sees a thing, it is matched against the stored data; if it matches, the originally remembered time is recalled and the time is searched to find the corresponding touch sensation and sound; if a sound is heard, it is matched against the stored data; if it matches, the originally remembered time is recalled and the corresponding image and sensation are found according to this time; in this way the senses can correspond to one another, and it is also known when the thing happened.
CN201711423150.3A 2017-12-25 2017-12-25 Design method for mutually mapping computer vision, hearing and touch through time Pending CN107891448A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711423150.3A CN107891448A (en) 2017-12-25 2017-12-25 Design method for mutually mapping computer vision, hearing and touch through time

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711423150.3A CN107891448A (en) 2017-12-25 2017-12-25 Design method for mutually mapping computer vision, hearing and touch through time

Publications (1)

Publication Number Publication Date
CN107891448A true CN107891448A (en) 2018-04-10

Family

ID=61808307

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711423150.3A Pending CN107891448A (en) 2017-12-25 2017-12-25 Design method for mutually mapping computer vision, hearing and touch through time

Country Status (1)

Country Link
CN (1) CN107891448A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1343551A (en) * 2000-09-21 2002-04-10 上海大学 Hierarchical modular model for robot's visual sense
CN101573734A (en) * 2006-09-04 2009-11-04 N·I·帕申科 Method and system for simulating and operating a single virtual space
CN103608749A (en) * 2011-04-26 2014-02-26 加利福尼亚大学董事会 Systems and devices for recording and reproducing senses
CN104871160A (en) * 2012-09-28 2015-08-26 加利福尼亚大学董事会 Systems and methods for sensory and cognitive profiling
CN105556506A (en) * 2013-10-25 2016-05-04 英特尔公司 Apparatus and methods for capturing and generating user experiences
CN106030707A (en) * 2014-02-14 2016-10-12 唐纳德·詹姆士·德里克 System for audio analysis and perception enhancement
CN106415616A (en) * 2014-05-24 2017-02-15 宫崎洋彰 Autonomous thinking pattern generator
KR20160020136A (en) * 2014-08-13 2016-02-23 박현정 Training system for treating disaster using virtual reality and role playing game
CN106778959A (en) * 2016-12-05 2017-05-31 宁波亿拍客网络科技有限公司 Specific marker and method system for perception and recognition based on computer vision
CN107370660A (en) * 2017-06-07 2017-11-21 维沃移动通信有限公司 Information perception method and mobile terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113942009A (en) * 2021-09-13 2022-01-18 苏州大学 Robot bionic hand grabbing method and system
CN113942009B (en) * 2021-09-13 2023-04-18 苏州大学 Robot bionic hand grabbing method

Similar Documents

Publication Publication Date Title
US10445939B2 (en) Tactile interaction in virtual environments
CN110026987B (en) Method, device and equipment for generating grabbing track of mechanical arm and storage medium
JP6439817B2 (en) Adapting object handover from robot to human based on cognitive affordance
CN107972069A (en) Design method for mutually mapping computer vision and mechanical touch through time
US10864453B2 (en) Automatic mobile robot for facilitating activities to improve child development
CN107408146A (en) Monitoring
US20220026991A1 (en) Method and Arrangement for Handling Haptic Feedback
Magnenat-Thalmann et al. Context aware human-robot and human-agent interaction
CN107891448A (en) The design method that a kind of computer vision sense of hearing tactile is mutually mapped with the time
Hong et al. Multiply: A multisensory object-centric embodied large language model in 3d world
Nguyen et al. Reaching development through visuo-proprioceptive-tactile integration on a humanoid robot-a deep learning approach
CN108032302A (en) Design method for mutually mapping computer vision, touch and smell through time
Oniga et al. Intelligent human-machine interface using hand gestures recognition
Luo Intelligent Textiles for Physical Human-Environment Interactions
CN108108437A (en) Design method for mutually mapping computer vision, hearing and smell through time
CN108115729A (en) Design method for mutually mapping computer vision, hearing, touch and smell through time
CN108520074A (en) Design method for associating robot vision, taste and hearing with time
CN107871016A (en) Design method for mutually mapping computer vision and mechanical taste through time
CN107967307A (en) Design method for mutually mapping computer vision and mechanical hearing through time
CN108115728A (en) Design method for mutually mapping mechanical hearing, touch and smell through time
CN107832803A (en) Design method for mutually mapping computer vision and mechanical smell through time
CN105955255A (en) A control method, device, controller and control system
Ramirez et al. An extension of spatial and tactile perception based on haptics
TWM474176U (en) Non-contact real-time interaction system
Zheng Sentimental soft robotics as companion artefacts

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180410