CN102895093A - Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor - Google Patents

Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor

Info

Publication number
CN102895093A
Authority
CN
China
Prior art keywords
rgb
user
sensor
assistant robot
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011104139344A
Other languages
Chinese (zh)
Inventor
冷春涛 (Leng Chuntao)
李宝顺 (Li Baoshun)
黄怡 (Huang Yi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN2011104139344A
Publication of CN102895093A
Legal status: Pending

Abstract

The invention discloses a walker aid robot tracking system and a walker aid robot tracking method based on an RGB-D (red, green and blue-depth) sensor. The tracking system comprises an RGB-D sensor module, a central controller, a motion control module and a walker aid robot executing module. The RGB-D sensor module acquires human body feature values of the user. According to a received user command, the central controller controls the RGB-D sensor module to search for a target that matches the human body feature values stored in memory; if a match is found, it determines the position value of the matched target and sends a motion command to the motion control module. The motion control module receives the motion command, controls the walker aid robot executing module and the associated mechanisms to complete the target tracking task, and feeds the relevant signals back to the central controller. Because the human body tracking function is realized with the RGB-D sensor, the walker aid robot can follow the user whenever the user does not need walking assistance, so that it is available for use at any time; this greatly improves the portability of the walker aid robot and improves the user experience.

Description

Walker aid robot tracking system and method based on an RGB-D sensor
Technical field
The present invention relates to the field of automation, and in particular to a tracking system and method for a walker aid robot based on an RGB-D sensor.
Background art
Under the concept of robots for assisting the elderly and the disabled, an intelligent walker aid robot mainly serves elderly and disabled people with walking disorders, poor or weak eyesight, or blindness, and its main goals are to provide the walking assistance needed in daily life and work and to provide day-to-day companionship and care.
Common walking aids include crutches and walking frames, which are widely used by elderly and disabled people who have difficulty walking, and by people with weak eyesight or blindness who have difficulty perceiving their surroundings. However, the functions of these common walking aids are very simple and their comfort and safety are limited, which to some extent restricts an elderly person's confidence in going out independently. An intelligent walking aid can offer additional services on top of the functions of a common walking aid: by perceiving the environment it can help the elderly person make decisions and avoid dangers, it can communicate with the elderly person, and it can monitor the person's condition to a certain extent. This greatly improves the elderly person's ability to live independently, while the companionship and care it provides help keep the person healthy.
The following function of a walker aid robot means that when the user feels full of strength and walks independently without the robot's assistance, the robot automatically follows close to the user, so that the user can use it again at any time.
The prior art discloses "a friendly walking-aid robot" (Chinese invention patent application No. 200810150388.8). By perceiving whether the elderly person tends to stumble and how the dynamic relationship and motion state between the person and the robot change, this invention provides friendly assistance to the elderly person. However, when the user is full of strength and walks independently without the robot's assistance, the robot cannot automatically follow the user, which makes it inconvenient to use.
The prior art also discloses "a human body tracking method and device" (Chinese invention patent application No. 200810116432.3), which improves the overall tracking ability by fusing spatial position information into the color information. However, this invention still tracks essentially by color information, so in poorly lit environments, or when human bodies with the same color features are close together, the tracking performance deteriorates.
Summary of the invention
The object of the present invention is to remedy the deficiencies of the prior art by providing a tracking solution for a walker aid robot, so that the walker aid robot can serve as a daily care tool for the elderly. The present invention is achieved through the following technical solutions:
The invention discloses a walker aid robot tracking system based on an RGB-D sensor. The system comprises an RGB-D sensor module, a central controller, a motion control module and a walker aid robot executing module. The RGB-D sensor module is used to acquire the human body feature values of the user. According to the received user command, the central controller controls the RGB-D sensor to search for a target that matches the human body feature values stored in the database; if the match succeeds, it determines the position value of the matched target and sends a motion command to the motion control module. The motion control module receives the motion command, controls the walker aid robot executing module and the associated mechanisms to accomplish the target tracking task, and feeds the relevant signals back to the central controller.
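By way of illustration only, the data flow between these four modules can be sketched in Python. Everything in the sketch is an assumption made for readability: the class names, the (lateral offset, distance) position format, the stub return values and the controller gains are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MotionCommand:
    linear: float     # forward speed set-point
    angular: float    # turning-rate set-point

class RGBDSensorModule:
    """Stub sensor module: searches the scene for a body matching stored feature values."""
    def search(self, stored_features) -> Optional[Tuple[float, float]]:
        # Would return the matched target's (lateral offset, distance) in the sensor
        # frame, or None when no body matches the registered feature values.
        return (0.2, 1.5)

class MotionControlModule:
    """Stub motion control module (lower-computer and motor-drive chain)."""
    def execute(self, cmd: MotionCommand) -> dict:
        # Drives the executing module and feeds back position/speed/current signals.
        return {"position": 0.0, "speed": cmd.linear, "current": 0.0}

class CentralController:
    def __init__(self, sensor, motion, feature_db):
        self.sensor, self.motion, self.feature_db = sensor, motion, feature_db

    def on_follow_command(self, user_id: str) -> Optional[dict]:
        position = self.sensor.search(self.feature_db[user_id])
        if position is None:
            return None                                   # no match: do not move
        x, z = position
        # Placeholder rule: keep roughly 1 m behind the user and steer toward them.
        cmd = MotionCommand(linear=0.5 * (z - 1.0), angular=-1.2 * x)
        return self.motion.execute(cmd)                   # feedback flows back here

controller = CentralController(RGBDSensorModule(), MotionControlModule(),
                               feature_db={"user1": {"color": None, "shape": None}})
print(controller.on_follow_command("user1"))              # {'position': 0.0, 'speed': 0.25, ...}
```

The point of the sketch is the interface pattern: the sensor module answers a search request, the controller turns a successful match into a motion command, and the motion control module returns feedback signals.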
Further, the RGB-D sensor module consists of an RGB camera and a depth sensor. The RGB camera captures color feature values of the user's whole body to build a color feature library; the depth sensor acquires the three-dimensional shape of the user's whole body to build a shape feature library for the user.
Further, the motion control module comprises a lower-computer submodule and a motor-drive submodule. The lower-computer submodule receives the command information sent by the central controller and, through the relevant motion control algorithm, outputs a control signal to the motor-drive submodule; the motor-drive submodule receives the control signal from the lower-computer submodule and controls the walker aid robot executing module and the associated mechanisms to perform the target tracking task.
Further, the system also comprises a human-computer interaction module for interaction between the user and the walker aid robot; it can receive instructions, including voice instructions issued by the user through voice interaction.
Further, the RGB-D sensor module, the central controller, the motion control module and the walker aid robot executing module are all mounted on the walker aid robot body.
The invention also discloses a tracking method for a walker aid robot based on an RGB-D sensor, the method comprising the following steps:
S1. First register the user's human body feature values and record them in the database of the central controller;
S2. When the central controller receives a command to start the following function, it controls the RGB-D sensor to search for a target matching the human body feature values stored in the database; if the match succeeds, it determines the position value of the matched target and sends a motion command to the motion control module;
S3. The motion control module receives the motion command, controls the walker aid robot executing module and the associated mechanisms to accomplish the target tracking task, and feeds the relevant signals back to the central controller.
Further, the human body feature values in step S1 include color feature values and shape feature values. The search process of the RGB-D sensor in step S2 is specifically: perform RGB threshold analysis on the color of the user's whole body captured by the RGB camera to obtain the color feature values that are unique to this user's body, and build a color feature library; use the depth sensor to acquire the three-dimensional shape of the user's whole body and build a shape feature library for the user.
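The patent does not say which statistics make up the color and shape feature values. The NumPy sketch below therefore assumes, purely for illustration, per-channel statistics after a simple brightness threshold for the color feature, and coarse silhouette plus depth-profile statistics for the shape feature; the body mask is assumed to be supplied by the sensor module.

```python
import numpy as np

def color_feature(rgb: np.ndarray, body_mask: np.ndarray) -> np.ndarray:
    """Illustrative 'color feature value': per-channel mean and standard deviation of
    the body pixels, after discarding very dark and very bright pixels with a simple
    threshold (a stand-in for the RGB threshold analysis). Assumes a non-empty mask."""
    pixels = rgb[body_mask]                                   # (N, 3) RGB samples on the body
    keep = (pixels.min(axis=1) > 20) & (pixels.max(axis=1) < 235)
    pixels = pixels[keep]
    return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

def shape_feature(depth: np.ndarray, body_mask: np.ndarray) -> np.ndarray:
    """Illustrative 'shape feature value': coarse silhouette statistics taken from the
    depth image (body height and width in pixels, plus a four-band depth profile)."""
    ys, xs = np.nonzero(body_mask)
    height = ys.max() - ys.min() + 1
    width = xs.max() - xs.min() + 1
    bands = np.array_split(np.sort(depth[body_mask]), 4)      # coarse depth profile
    return np.array([height, width] + [b.mean() for b in bands], dtype=float)

# Registering a user: store both feature vectors in the controller's database, e.g.
#   rgb: HxWx3 uint8 image, depth: HxW depth map in metres, body_mask: HxW bool mask
#   database[user_id] = {"color": color_feature(rgb, body_mask),
#                        "shape": shape_feature(depth, body_mask)}
```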
Further, step S3 is specifically: after the motion control module receives the motion command, the lower-computer submodule outputs a control signal to the motor-drive submodule through the relevant motion control algorithm; the motor-drive submodule receives the control signal from the lower-computer submodule and drives the motors and the walker aid robot executing module to accomplish the target tracking task.
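For a wheeled walker aid with a differential drive, the lower-computer step described above typically amounts to converting the commanded forward speed and turning rate into per-wheel set-points for the motor-drive submodule. The sketch below assumes such a drive; the wheel geometry values are placeholders, not values from the patent.

```python
def wheel_commands(linear: float, angular: float,
                   wheel_base: float = 0.45, wheel_radius: float = 0.10):
    """Illustrative lower-computer step for a differential-drive walker aid: convert the
    controller's movement instruction (forward speed in m/s, turn rate in rad/s) into
    left/right wheel angular-velocity set-points for the motor-drive submodule."""
    v_left = linear - angular * wheel_base / 2.0
    v_right = linear + angular * wheel_base / 2.0
    return v_left / wheel_radius, v_right / wheel_radius      # rad/s per wheel

# The motor-drive submodule would apply these set-points and feed back the measured
# signals to the central controller, e.g.
#   feedback = {"position": odometry, "speed": measured_speed, "current": motor_current}
```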
Further, the relevant signals fed back in step S3 include signals such as position, speed and current.
Further, in step S2, receiving the command to start the following function includes receiving a voice instruction issued by the user through voice interaction.
By realizing the human body tracking function with the RGB-D sensor, the present invention allows the walker aid robot to follow the user when the user does not need walking assistance, so that it is available for use at any time. This greatly improves the portability of the walker aid robot and improves the user experience.
Brief description of the drawings
Fig. 1 is a block diagram of the modules of the system according to an embodiment of the invention;
Fig. 2 is a workflow diagram of the tracking algorithm module according to an embodiment of the invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below with reference to the accompanying drawings. The embodiments are implemented on the premise of the technical solution of the present invention and detailed implementations are given, but the protection scope of the present invention is not limited to the following embodiments.
Fig. 1 is a block diagram of the modules of the system according to the embodiment of the invention. As shown in Fig. 1, the walker aid robot tracking system of the invention comprises an RGB-D sensor module, a tracking algorithm module, an embedded central controller, a human-computer interaction module, an embedded motion control module and a walker aid robot executing module. The RGB-D sensor module, the embedded central controller, the embedded motion control module and the walker aid robot executing module are all mounted on the walker aid robot body. The tracking algorithm module is installed in the embedded central controller and is the software module that realizes human body tracking. The human-computer interaction module is mainly the medium through which the user interacts with the walker aid robot. The embedded motion control module comprises a lower-computer submodule and a motor-drive submodule.
The embedded motion control module comprises a lower-computer submodule and a motor-drive submodule. While the following function is active, the lower-computer submodule receives the target velocity command information sent by the embedded central controller and, through the relevant motion control algorithm, outputs a control signal to the motor-drive submodule; the motor-drive submodule receives the control signal from the lower-computer submodule, drives the motors and the associated actuators, and feeds back signals such as position, speed and current.
The RGB-D sensor module consists of an RGB camera and a depth sensor. An RGB-D sensor is a sensor that can simultaneously obtain the color values (RGB) of the environment and the depth values (Depth) of objects, combining the advantages of a range sensor and a vision sensor. The module captures the color features of the user's whole body with the RGB camera and models the three-dimensional features of the user's whole body with the depth sensor.
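Because every color pixel has an associated depth reading, a matched body can be placed in space with the standard pinhole back-projection. The intrinsic parameters below are typical illustrative values for a 640x480 RGB-D sensor, not values disclosed in the patent.

```python
import numpy as np

def pixel_to_point(u: int, v: int, depth_m: float,
                   fx: float = 525.0, fy: float = 525.0,
                   cx: float = 319.5, cy: float = 239.5) -> np.ndarray:
    """Back-project one image pixel (u, v) with its depth reading into a 3-D point in
    the sensor frame using the pinhole camera model."""
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# The matched user's position value can then be taken, for example, as the centroid
# of the back-projected body pixels.
```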
Fig. 2 is a workflow diagram of the tracking algorithm module according to the embodiment of the invention. As shown in Fig. 2, the method mainly performs RGB threshold analysis on the color of the user's whole body obtained by the RGB camera, obtains through statistical analysis the color feature values that are unique to this user's body, and builds a color feature library; it also uses the depth sensor to acquire the three-dimensional shape of the user's whole body and builds a shape feature library for the user. In practical use, the user profile is recorded first, that is, the color feature values and shape feature values of the user's body are recorded in the memory of the embedded central controller. When the following function is started, the tracking algorithm module searches the database for the human body whose color feature values and shape feature values have the highest degree of match with the recorded values, and takes it as the tracking target.
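The patent only states that the body with the highest degree of match between the recorded color and shape feature values is chosen as the tracking target. A minimal scoring rule under that description might look like the following; the weighting, the inverse-distance form and the acceptance threshold are assumptions for illustration.

```python
import numpy as np

def match_score(candidate: dict, registered: dict,
                w_color: float = 0.5, w_shape: float = 0.5) -> float:
    """Illustrative matching score: inverse distance between the candidate's and the
    registered color/shape feature vectors, combined with assumed weights."""
    d_color = np.linalg.norm(candidate["color"] - registered["color"])
    d_shape = np.linalg.norm(candidate["shape"] - registered["shape"])
    return w_color / (1.0 + d_color) + w_shape / (1.0 + d_shape)

def select_target(candidates: list, registered: dict, threshold: float = 0.6):
    """Return the detected body with the highest score, or None when no body matches
    well enough (i.e. the match fails and no movement instruction is issued)."""
    if not candidates:
        return None
    best = max(candidates, key=lambda c: match_score(c, registered))
    return best if match_score(best, registered) >= threshold else None
```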
In this embodiment, the user first registers his or her own feature values, including color feature values and shape feature values, with the RGB-D sensor, and these are recorded in the memory of the embedded central controller. When the user needs to turn on the following function, the user issues the corresponding command through the human-computer interaction interface. The embedded central controller receives the user command and controls the RGB-D sensor to search the database for an object matching the stored human body feature values. If the match succeeds, it determines the position value of the matched object and sends a motion command to the embedded motion control module. After the embedded motion control module receives the command, the lower-computer submodule converts the received command into control signals for driving the motors and outputs them to the motor-drive submodule; the motor-drive submodule receives the control signals from the lower-computer submodule, drives the motors and the associated actuators, and feeds back signals such as position, speed and current.
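Taken together, the embodiment amounts to a simple follow loop: search for the registered user in each frame, issue a movement instruction when the match succeeds, and stop when the target is lost. The sketch below assumes stub interfaces of the same kind as the earlier sketch, with (linear, angular) tuples as commands; the gains and the roughly 1 m following distance are placeholders, not values from the patent.

```python
import time

def follow_loop(sensor, motion, registered, is_enabled, period_s: float = 0.1):
    """Illustrative outer loop for the embodiment above. 'sensor.search' is assumed to
    return the matched user's (lateral offset, distance) or None; 'motion.execute' is
    assumed to take (linear, angular) set-points and return the fed-back position,
    speed and current signals; 'is_enabled' is the follow-function switch."""
    while is_enabled():
        position = sensor.search(registered)           # look for the registered user
        if position is None:
            motion.execute((0.0, 0.0))                 # target lost: stop and wait
        else:
            x, z = position
            cmd = (0.6 * (z - 1.0), -1.5 * x)          # follow at about 1 m, steer toward user
            feedback = motion.execute(cmd)             # position / speed / current signals
            # The central controller can log or monitor the feedback here.
        time.sleep(period_s)
```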
The main function of the human-computer interaction module in the present invention is to allow the user to turn on the following function through a friendly interface. The main implementation is voice interaction, that is, the walker aid robot receives the voice instruction issued by the user through speech recognition technology.
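The embodiment leaves the speech recognizer unspecified; whichever recognizer is used, its output still has to be mapped onto the follow commands. The dispatch step below is a hypothetical illustration: the trigger phrases and the controller method names are assumptions, not part of the disclosure.

```python
FOLLOW_KEYWORDS = ("follow me", "start following")       # illustrative trigger phrases
STOP_KEYWORDS = ("stop following", "stay here")

def dispatch_voice_command(recognized_text: str, controller) -> None:
    """Map a phrase returned by the speech recognizer onto follow-mode commands."""
    text = recognized_text.lower().strip()
    if any(keyword in text for keyword in FOLLOW_KEYWORDS):
        controller.start_following()
    elif any(keyword in text for keyword in STOP_KEYWORDS):
        controller.stop_following()
```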
Compared with the prior art, the outstanding feature of the present invention is that an RGB-D sensor is used to realize the user tracking function of the walker aid robot. Because the user's color features and shape features are combined, the user recognition rate is high, which greatly improves the intelligence of user tracking, improves the portability of the walker aid robot and improves the user experience. When the user is full of strength and does not need the walker aid robot's assistance, the robot can follow the user at any time, giving the user freedom and convenience.
The above are only preferred embodiments of the present invention and the technical principles applied. Any variation or replacement that can easily be conceived by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A walker aid robot tracking system based on an RGB-D sensor, characterized in that the system comprises an RGB-D sensor module, a central controller, a motion control module and a walker aid robot executing module, wherein the RGB-D sensor module is used to acquire human body feature values of the user; the central controller, according to a received user command, controls the RGB-D sensor to search for a target matching the human body feature values stored in the database and, if the match succeeds, determines the position value of the matched target and sends a motion command to the motion control module; and the motion control module receives the motion command, controls the walker aid robot executing module and the associated mechanisms to accomplish the target tracking task, and feeds the relevant signals back to the central controller.
2. The walker aid robot tracking system based on an RGB-D sensor according to claim 1, characterized in that the RGB-D sensor module consists of an RGB camera and a depth sensor, wherein the RGB camera captures color feature values of the user's whole body to build a color feature library, and the depth sensor acquires the three-dimensional shape of the user's whole body to build a shape feature library for the user.
3. The walker aid robot tracking system based on an RGB-D sensor according to claim 1, characterized in that the motion control module comprises a lower-computer submodule and a motor-drive submodule; the lower-computer submodule receives the command information sent by the central controller and, through the relevant motion control algorithm, outputs a control signal to the motor-drive submodule; and the motor-drive submodule receives the control signal from the lower-computer submodule and controls the walker aid robot executing module and the associated mechanisms to perform the target tracking task.
4. The walker aid robot tracking system based on an RGB-D sensor according to claim 1, characterized in that the system further comprises a human-computer interaction module for interaction between the user and the walker aid robot, which can receive instructions including voice instructions issued by the user through voice interaction.
5. The walker aid robot tracking system based on an RGB-D sensor according to any one of claims 1 to 4, characterized in that the RGB-D sensor module, the central controller, the motion control module and the walker aid robot executing module are all mounted on the walker aid robot body.
6. A tracking method for a walker aid robot based on an RGB-D sensor, characterized in that the method comprises the following steps:
S1. First register the user's human body feature values and record them in the database of the central controller;
S2. When the central controller receives a command to start the following function, control the RGB-D sensor to search for a target matching the human body feature values stored in the database; if the match succeeds, determine the position value of the matched target and send a motion command to the motion control module;
S3. The motion control module receives the motion command, controls the walker aid robot executing module and the associated mechanisms to accomplish the target tracking task, and feeds the relevant signals back to the central controller.
7. The tracking method for a walker aid robot based on an RGB-D sensor according to claim 6, characterized in that the human body feature values in step S1 include color feature values and shape feature values; and the search process of the RGB-D sensor in step S2 is specifically: performing RGB threshold analysis on the color of the user's whole body captured by the RGB camera to obtain the color feature values that are unique to this user's body and build a color feature library, and using the depth sensor to acquire the three-dimensional shape of the user's whole body and build a shape feature library for the user.
8. The tracking method for a walker aid robot based on an RGB-D sensor according to claim 6, characterized in that step S3 is specifically: after the motion control module receives the motion command, the lower-computer submodule outputs a control signal to the motor-drive submodule through the relevant motion control algorithm; and the motor-drive submodule receives the control signal from the lower-computer submodule and drives the motors and the walker aid robot executing module to accomplish the target tracking task.
9. The tracking method for a walker aid robot based on an RGB-D sensor according to claim 6, characterized in that the relevant signals fed back in step S3 include signals such as position, speed and current.
10. The tracking method for a walker aid robot based on an RGB-D sensor according to any one of claims 6 to 9, characterized in that in step S2, receiving the command to start the following function includes receiving a voice instruction issued by the user through voice interaction.
CN2011104139344A 2011-12-13 2011-12-13 Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor Pending CN102895093A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011104139344A CN102895093A (en) 2011-12-13 2011-12-13 Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor

Publications (1)

Publication Number Publication Date
CN102895093A true CN102895093A (en) 2013-01-30

Family

ID=47567789

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011104139344A Pending CN102895093A (en) 2011-12-13 2011-12-13 Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor

Country Status (1)

Country Link
CN (1) CN102895093A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020024599A1 (en) * 2000-08-17 2002-02-28 Yoshio Fukuhara Moving object tracking apparatus
US20030004911A1 (en) * 2001-07-02 2003-01-02 Wong Judy Shuk-Ching Device and method to enhance verification of biometric features
JP2003271181A (en) * 2002-03-15 2003-09-25 Sony Corp Information processor, information processing method, recording medium and program
CN101385677A (en) * 2008-10-16 2009-03-18 上海交通大学 Blind guiding method and device based on moving body track
CN101881153A (en) * 2010-06-04 2010-11-10 中国石油天然气股份有限公司 Conventional logging information fusion visualization method and system thereof
CN101986673A (en) * 2010-09-03 2011-03-16 浙江大学 Intelligent mobile phone blind-guiding device and blind-guiding method
CN102048612A (en) * 2011-01-07 2011-05-11 东华大学 Blind-guidance robot based on machine vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
M. Jamshidi et al.: "Mobile Robot Navigation and Target Tracking", Proc. of the 2011 6th International Conference on System of Systems Engineering *
W. Choi et al.: "Detecting and Tracking People using an RGB-D Camera via Multiple Detector Fusion", IEEE International Conference on Computer Vision Workshops *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105182757B (en) * 2015-06-05 2019-03-05 普天智能照明研究院有限公司 A kind of movable-type intelligent house keeper robot control method
CN105182757A (en) * 2015-06-05 2015-12-23 普天智能照明研究院有限公司 Mobile intelligent housekeeper robot control method
CN105335696B (en) * 2015-08-26 2018-05-22 湖南信息职业技术学院 A kind of intelligence based on the identification of 3D abnormal gaits behavioral value is helped the elderly robot and implementation method
CN105335696A (en) * 2015-08-26 2016-02-17 湖南信息职业技术学院 3D abnormal gait behavior detection and identification based intelligent elderly assistance robot and realization method
CN105425795A (en) * 2015-11-26 2016-03-23 纳恩博(北京)科技有限公司 Method for planning optimal following path and apparatus
CN105326629A (en) * 2015-11-26 2016-02-17 哈尔滨博强机器人技术有限公司 Walking-assist robot adapting to adjustment and interaction
CN106681326A (en) * 2017-01-04 2017-05-17 京东方科技集团股份有限公司 Seat, method of controlling seat movement, and motion control system for seat
US10682270B2 (en) 2017-01-04 2020-06-16 Boe Technology Group Co., Ltd. Seat, motion control method thereof and motion control system thereof
CN109460031A (en) * 2018-11-28 2019-03-12 科大智能机器人技术有限公司 A kind of system for tracking of the automatic tractor based on human bioequivalence
CN110515384A (en) * 2019-09-09 2019-11-29 深圳市三宝创新智能有限公司 A kind of the human body follower method and robot of view-based access control model mark
CN111880575A (en) * 2020-08-10 2020-11-03 重庆依塔大数据研究院有限公司 Control method and device based on color tracking, storage medium and robot
CN111880575B (en) * 2020-08-10 2023-03-24 重庆依塔大数据研究院有限公司 Control method and device based on color tracking, storage medium and robot
CN116309590A (en) * 2023-05-22 2023-06-23 四川新迎顺信息技术股份有限公司 Visual computing method, system, electronic equipment and medium based on artificial intelligence
CN116309590B (en) * 2023-05-22 2023-08-04 四川新迎顺信息技术股份有限公司 Visual computing method, system, electronic equipment and medium based on artificial intelligence

Legal Events

Code: Description
C06 / PB01: Publication
C10 / SE01: Entry into substantive examination (entry into force of request for substantive examination)
DD01: Delivery of document by public notice (Addressee: Li Baoshun; Document name: First Notification of an Office Action)
WD01: Invention patent application deemed withdrawn after publication (Application publication date: 2013-01-30)