WO2017092620A1 - Smart device, intelligent control method therefor and computer storage medium - Google Patents

Smart device, intelligent control method therefor and computer storage medium

Info

Publication number
WO2017092620A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
smart device
smart
location
electronic device
Prior art date
Application number
PCT/CN2016/107306
Other languages
English (en)
Chinese (zh)
Inventor
陆见微
王野
蒲立
陈子冲
Original Assignee
纳恩博(北京)科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 纳恩博(北京)科技有限公司 filed Critical 纳恩博(北京)科技有限公司
Publication of WO2017092620A1


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers

Definitions

  • the invention relates to a smart device, an intelligent control method, and a computer storage medium.
  • a smart device including:
  • a receiving unit configured to receive a first signal sent by the first electronic device, where the first signal is used to at least represent a device identifier of the first electronic device;
  • a calculating unit configured to calculate, after the receiving unit receives the first signal, a first location of the first electronic device by using the first signal;
  • a first driving unit configured to drive the smart device; and
  • a control unit configured to control the first driving unit according to the first position to move the smart device to a first area by a driving force generated by the first driving unit, wherein the first area corresponds to the first position.
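  • purely as an illustration of how the recited units could fit together, the following Python sketch mirrors the claim language; all class, method, and field names, the tuple-based locations, and the lookup-table localization are assumptions and are not part of the disclosure.

```python
# Illustrative sketch only: hypothetical names, not the patented implementation.
from dataclasses import dataclass
from typing import Dict, Tuple

Location = Tuple[float, float]


@dataclass
class FirstSignal:
    device_id: str   # the first signal at least represents a device identifier
    payload: dict    # any additional data the first electronic device sends


class CalculatingUnit:
    """Derives the first location of the sender from the first signal."""

    def __init__(self, known_positions: Dict[str, Location]):
        self.known_positions = known_positions  # assumed registry: id -> location

    def first_location(self, signal: FirstSignal) -> Location:
        return self.known_positions[signal.device_id]


class FirstDrivingUnit:
    def power_on(self) -> None:
        print("first driving unit on")

    def move_to(self, target: Location) -> None:
        print(f"generating driving force toward the first area around {target}")


class ControlUnit:
    """Controls the first driving unit according to the first position."""

    def __init__(self, driving_unit: FirstDrivingUnit):
        self.driving_unit = driving_unit

    def go_to_first_area(self, first_location: Location) -> None:
        self.driving_unit.power_on()
        self.driving_unit.move_to(first_location)


# The receiving unit would hand a parsed first signal to the other units:
calc = CalculatingUnit({"rain-sensor-202": (4.0, 1.5)})
control = ControlUnit(FirstDrivingUnit())
control.go_to_first_area(calc.first_location(FirstSignal("rain-sensor-202", {})))
```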
  • the smart device further includes: an image acquisition unit configured to acquire a first image of the first area after the smart device moves to the first area.
  • the smart device further comprises a transmitting unit configured to transmit the first image to at least one second electronic device.
  • the calculating unit further includes a path planning unit configured to parse the first location of the first electronic device according to the first signal, retrieve a map covering at least the first area corresponding to the first location and the area of the second location where the smart device is located, and plan a path according to the map, the first area, and the second location.
  • the smart device further includes an image acquisition unit configured to search, during the movement of the smart device to the first area, for the first electronic device or for a target device located in the first area.
  • the control unit is configured to turn on the first driving unit while the smart device moves to the first area.
  • the smart device further includes a mechanical operating unit and a second driving unit that drives the mechanical operating unit;
  • the mechanical operating unit is configured to operate on at least one operable target device
  • the operable target device is the first electronic device or other device in the first area.
  • the present invention also provides an intelligent control method for controlling a smart device, the intelligent control method including:
  • receiving a first signal sent by the first electronic device, wherein the first signal is at least used to represent a device identifier of the first electronic device;
  • the method further includes:
  • the first image of the first area is acquired.
  • the method further includes:
  • calculating the first location of the first electronic device includes: parsing the first location of the first electronic device according to the first signal, and retrieving a map covering at least the first area corresponding to the first location and the second location where the smart device is located;
  • a path is planned according to the map, the first area, and the second location.
  • the method further includes:
  • Image acquisition is maintained during the movement of the smart device to the first area.
  • the smart device includes a first driving unit
  • the method further includes:
  • the first driving unit is turned on while the smart device moves to the first area.
  • the method further includes:
  • after the smart device reaches the first area, operating at least one operable target device, wherein the operable target device is the first electronic device or another device in the first area.
  • a computer storage medium comprising a set of instructions that, when executed, cause at least one processor to perform operations comprising:
  • receiving a first signal sent by the first electronic device, wherein the first signal is at least used to represent a device identifier of the first electronic device;
  • the smart device, the intelligent control method, and the computer storage medium according to the embodiments of the present invention can form a network with surrounding electronic devices based on the first signal, acquire required information from the surrounding environment, and navigate as required to the first area corresponding to the first location where the first electronic device is located; thus, the degree of intelligence of the autonomous control of the smart device is improved.
  • FIG. 1 is a schematic functional block diagram of a smart device in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic block diagram of an intelligent control method for controlling a smart device according to another embodiment of the present invention.
  • Figure 3 shows a schematic block diagram of the calculation steps in intelligent control in accordance with an embodiment of the present invention.
  • FIG. 1 is a schematic functional block diagram of a smart device 100 in accordance with an embodiment of the present invention.
  • the smart device may be the smart robot 100.
  • the intelligent robot 100 is capable of communicating with the surrounding first electronic devices 202, 204 and the second electronic devices 302, 304.
  • the specific examples of the first electronic devices 202, 204 may be a rainwater sensing device 202 disposed outdoors and an infrared sensing device 204 disposed at an indoor doorway.
  • the specific examples of the second electronic devices 302, 304 may be a memory 302 and a handheld smart terminal 304, such as a smart phone that is currently widely used.
  • the rainwater sensing device 202 and the infrared sensing device 204 are merely exemplary and are not intended to limit the particular type or number of the first electronic devices 202, 204.
  • memory 302 and handheld smart terminal 304 are merely exemplary and are not intended to limit the particular type or number of second electronic devices 302, 304.
  • the first electronic device and the second electronic device may be any type and number of other electronic devices contemplated by the particular application, without departing from the scope of the invention as defined by the appended claims.
  • the intelligent robot 100 may include a receiving unit 102, a computing unit 104, a first driving unit 106, and a control unit 108;
  • the receiving unit 102 is configured to receive the first signal sent by the first electronic device, where the first signal is used to at least represent a device identifier of the first electronic device;
  • the calculating unit 104 is configured to: after the receiving unit receives the first signal, calculate a first location of the first electronic device by using the first signal;
  • the first driving unit 106 is configured to drive the smart device
  • the control unit 108 is configured to control the first driving unit 106 according to the first position to move the smart device to a first area by a driving force generated by the first driving unit 106; wherein the first area corresponds to the first position.
  • the number of the first electronic devices is not limited in this embodiment.
  • one smart device can receive signals sent from multiple electronic devices, but may process the first signal of only one of those electronic devices at a time.
  • this embodiment does not exclude a scenario in which the first signals of a plurality of electronic devices are processed together.
  • processing may be performed based on processing priorities corresponding to different first electronic devices. For example, when the first signals sent by two first electronic devices are received and one of those devices has the higher processing priority, that first electronic device is selected for subsequent processing, and the processing is the same as when only the first signal of that first electronic device is received. Alternatively, when the first signals of two first electronic devices are received and the two first electronic devices are adjacent, the first area to be reached by the smart device may be determined from both.
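  • as an illustration of the priority rule above, the short sketch below (not part of the disclosure; the priority table is invented) selects the first electronic device with the highest processing priority and handles its first signal as if it were the only one received.

```python
# Hypothetical priority table; the disclosure does not define concrete values.
PRIORITIES = {"rain-sensor-202": 2, "ir-sensor-204": 1}


def select_first_signal(signals):
    """signals: list of (device_id, signal) pairs received at the same time."""
    return max(signals, key=lambda pair: PRIORITIES.get(pair[0], 0))


received = [("ir-sensor-204", {"event": "motion"}),
            ("rain-sensor-202", {"event": "rain"})]
device_id, signal = select_first_signal(received)
print(device_id)  # rain-sensor-202 is processed; the other signal is deferred
```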
  • the receiving unit 102 is configured to receive the first signal and the second signal sent by the rain sensing device 202 and the infrared sensing device 204, wherein the first signal and the second signal respectively represent the rainwater sensing device 202 And the device identification of the infrared sensing device 204.
  • the calculating unit 104 is configured to calculate, by the first signal and the second signal, the rainwater sensing device 202 and the infrared sensing device 204 after the receiving unit 102 receives the first signal and the second signal The first position and the second position.
  • the first driving unit 106 is configured to enable the smart robot 100 to be driven.
  • in the standby state, the first driving unit 106 is powered off to save power, and when the smart robot 100 needs to be driven, the first driving unit 106 is turned on.
  • the control unit 108 is configured to be able to turn on and control the first driving unit 106 according to the first position or the second position to navigate the smart robot 100 to the first area or the second area, wherein the first area or the second area respectively corresponds to the first position or the second position.
  • the first area or the second area corresponding to the first position or the second position means that the first area or the second area may be an area including the first position or the second position, may be an area located near the first position or the second position, or may be an area determined by the first position or the second position that the smart device 100 needs to go to in order to perform a certain function or operation.
  • the first area may be a window sill area corresponding to the location of the rainwater sensing device 202.
  • the second area may be a doorway area near the doorway of the room corresponding to the location of the infrared sensing device 204.
  • the calculating unit 104 includes a path planning unit. The path planning unit parses the first location of the first electronic device, for example the first location of the rainwater sensing device 202, according to the first signal, then retrieves a map covering at least the first area and the third location where the intelligent robot 100 is located, and plans a path according to the map, the first area, and the third location.
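  • the disclosure does not name a planning algorithm; as one possible sketch, the path planning unit could run a breadth-first search over an occupancy-grid map, as below (the grid, cell coordinates, and function names are illustrative assumptions).

```python
# Minimal grid path planner: 0 = free cell, 1 = obstacle.  Illustrative only.
from collections import deque


def plan_path(grid, start, goal):
    """Return a list of grid cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:                       # reconstruct the planned path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and nxt not in came_from:
                came_from[nxt] = cell
                queue.append(nxt)
    return None


# third location of the robot -> first area near the rainwater sensing device
floor_map = [[0, 0, 0, 0],
             [1, 1, 0, 1],
             [0, 0, 0, 0]]
print(plan_path(floor_map, start=(2, 0), goal=(0, 3)))
```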
  • the intelligent robot 100 may further include an image acquisition unit 110 according to an embodiment of the present invention.
  • the image acquisition unit 110 is configured to be able to acquire a first image or a second image of the first region or the second region after the smart robot 100 is navigated to the first region or the second region;
  • the image acquisition unit 110 can be used for photographing or recording after reaching the target area, so that the images can be stored or transmitted to verify the environmental status, and it can also be used for video navigation during the navigation process.
  • the image acquisition unit 110 can operate in different modes of operation in the two processes. For example, while the smart robot 100 is navigating to the first area or the second area, the image acquisition unit 110 may operate in a low frequency, low resolution mode of operation, and after the smart robot 100 reaches the first area or the second area, the image acquisition unit 110 can operate in a high frequency, high resolution mode of operation for verification purposes.
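  • the two working modes can be represented, for illustration only, by the following sketch; the concrete frame rates and resolutions are invented, since the description only requires the navigation mode to be lower-frequency and lower-resolution than the verification mode.

```python
# Illustrative camera modes; actual values are not specified in the disclosure.
from dataclasses import dataclass


@dataclass
class WorkingMode:
    fps: int
    resolution: tuple


FIRST_MODE = WorkingMode(fps=5, resolution=(320, 240))        # while navigating
SECOND_MODE = WorkingMode(fps=30, resolution=(1920, 1080))    # after arrival


class ImageAcquisitionUnit:
    def __init__(self):
        self.mode = FIRST_MODE

    def on_arrival(self):
        """Switch to the high-frequency, high-resolution verification mode."""
        self.mode = SECOND_MODE


camera = ImageAcquisitionUnit()
camera.on_arrival()
print(camera.mode)  # WorkingMode(fps=30, resolution=(1920, 1080))
```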
  • the intelligent robot 100 may also include a transmitting unit 112; the transmitting unit 112 is configured to enable the first or second image to be transmitted to the memory 302 and the handheld smart terminal 304.
  • application software capable of controlling the robot 100 can be installed on the smart terminal 304, so that the relevant person can also control the robot 100 through the application software.
  • the intelligent robot 100 may further include a mechanical operation unit 114 and a second drive unit 116 that drives the mechanical operation unit.
  • the mechanical operating unit 114 is configured to operate on at least one operable target device.
  • the mechanical operating unit 114 is, for example, a mechanical arm.
  • the operable target device is the first electronic device or other operable device in the first area.
  • the operable target device may be a window in a window sill area
  • the operation performed by the mechanical operating unit 114 may be, for example, switching a switch of a device located in the first area, or performing a window closing action.
  • the second driving unit 116 and the first driving unit 106 may also be the same driving unit, or two driving units sharing part of their driving components, according to a specific design; that is, the navigation travel of the robot 100 and the action of the mechanical operating unit 114 can be performed by different drive components, by the same drive component, or by partially shared drive components.
  • the robot can also send instructions to the target device to implement a certain function, so that the robot interacts with the target device to form an intelligent interconnection network with the surrounding devices to complete various complicated tasks.
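  • the transport between the robot and the target device is left open by the description; one illustrative assumption is a JSON command addressed by device identifier, as sketched below (the message format and the send() placeholder are not part of the disclosure).

```python
import json


def make_command(device_id: str, action: str, **params) -> str:
    """Build a hypothetical command message for a target device."""
    return json.dumps({"target": device_id, "action": action, "params": params})


def send(message: str) -> None:
    # Placeholder for whatever link (Wi-Fi, Bluetooth, home bus) the network uses.
    print("sending:", message)


send(make_command("window-actuator-01", "close"))
```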
  • the calculating unit 104 calculates the first position of the rainwater sensing device 202, and plans a path to the first region based on the map;
  • the control unit 108 turns on the image acquisition unit 110 in the first working mode of low frequency and low resolution according to the calculation result of the calculating unit 104, and simultaneously turns on and controls the first driving unit 106 to navigate the robot 100 to the window sill region according to the path planned by the calculating unit 104;
  • the control unit 108 switches the image acquisition unit 110 from the first working mode to the second working mode of high frequency and high resolution, and analyzes the state of the window according to the collected image data;
  • the control unit 108 then controls the mechanical operating unit 114 (such as a robot arm) to close the window.
  • the image is also sent to the memory 302 and/or the handheld smart terminal 304.
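  • the rainy-day scenario of the preceding paragraphs can be summarised, purely for illustration, by the stubbed-out sequence below; every function name is a placeholder rather than an interface of the robot 100.

```python
# End-to-end sketch of the window-closing scenario; all steps are stubs.
def handle_rain_signal(signal):
    first_location = locate(signal)    # calculating unit 104
    path = plan(first_location)        # path planning unit
    set_camera_mode("low")             # first working mode while moving
    navigate(path)                     # control unit 108 + first driving unit 106
    set_camera_mode("high")            # second working mode on arrival
    image = capture()                  # image acquisition unit 110
    if window_open(image):
        close_window()                 # mechanical operating unit 114
    upload(image)                      # to memory 302 and/or terminal 304


# Trivial stand-ins so the sketch runs end to end.
def locate(signal): return (4.0, 1.5)
def plan(loc): return ["third location", loc]
def set_camera_mode(mode): print("camera mode:", mode)
def navigate(path): print("navigating along", path)
def capture(): return "image-bytes"
def window_open(img): return True
def close_window(): print("closing the window")
def upload(img): print("uploading the first image")


handle_rain_signal({"device_id": "rain-sensor-202"})
```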
  • the robot 100 according to the present invention can automatically navigate to the window sill area and the doorway area corresponding to the positions of the rainwater sensing device 202 and the infrared sensing device 204, take a picture or video there, and transmit the obtained image data to the memory 302 and/or the handheld smart terminal 304, so that it is stored in the memory 302 and can be viewed by the relevant person on the smart terminal 304.
  • the robot 100 according to the present invention is capable of communicating with surrounding electronic devices and autonomously navigating to a target area for photographing forensics or performing some functional operation, realizing intelligent autonomous control.
  • FIG. 2 is a block diagram of an intelligent control method for controlling the smart device 100, in accordance with another embodiment of the present invention.
  • the intelligent control method will be described below by taking an example in which the intelligent robot 100 in the above embodiment performs a window closing operation when it is raining.
  • the intelligent control method includes:
  • Step 1002 Receive a first signal sent by the first electronic device, where the first signal is used to at least represent a device identifier of the first electronic device.
  • Step 1004 Calculate a first location of the first electronic device according to the received first signal.
  • Step 1006 Control the first driving unit according to the first position to move the smart device to a first area by a driving force generated by the first driving unit; wherein the first area corresponds to the first position.
  • in step 1002, the first signal sent by the rainwater sensing device 202 can be received, wherein the first signal represents the device identification of the rainwater sensing device 202.
  • in step 1004, the first location may be calculated according to the received first signal.
  • in step 1006, the first driving unit 106 may be powered on and turned on; the driving force generated by the first driving unit then moves the smart device to the first area, that is, the intelligent robot 100 is navigated to a window area according to the first location, wherein the window area corresponds to the first position.
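  • step 1004 does not state how the first location is derived from the first signal; one simple assumption, sketched below, is that the device identifier indexes a table of registered installation positions, optionally refined with an RSSI range estimate from a log-distance path-loss model (both the table and the model parameters are illustrative).

```python
# Hypothetical registry of installation positions, keyed by device identifier.
REGISTERED_POSITIONS = {"rain-sensor-202": (4.0, 1.5),
                        "ir-sensor-204": (0.5, 6.0)}


def first_location(device_id: str):
    return REGISTERED_POSITIONS[device_id]


def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -40.0, n: float = 2.0):
    """Rough range estimate (metres) from received signal strength."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))


print(first_location("rain-sensor-202"))   # (4.0, 1.5)
print(round(rssi_to_distance(-60.0), 2))   # 10.0 m under the assumed model
```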
  • the image capturing unit 110 of the intelligent robot 100 is always kept on in the first working mode of low frequency and low resolution to look for the window.
  • the image capturing unit 110 then captures a first image of the window region in the second working mode of high frequency and high resolution.
  • the method further includes transmitting the first image to the memory 302 and/or the handheld smart terminal 304.
  • the intelligent control method may further include controlling the smart robot 100 to close the window when the window is found to be in an open state.
  • FIG. 3 illustrates an example of a computing step 1004 in the above described intelligent control method in accordance with an embodiment of the present invention.
  • step 1004 may include a path planning step, the path planning step comprising:
  • Step 100422 Parsing a first location of the first electronic device according to the first signal
  • Step 100424 retrieve a map including at least an area covering a first area corresponding to the first location and a second location where the smart device is located;
  • Step 100426 Plan a path according to the map and the first area and the second location.
  • in step 100422, specifically, the first location of the rainwater sensing device 202 is parsed according to the first signal;
  • in step 100424, specifically, a map covering at least the window area corresponding to the first location and the third location where the smart robot 100 is located may be acquired;
  • in step 100426, the path may be planned according to the map, the first area, and the third location.
  • Embodiments of the present invention also provide a computer storage medium comprising a set of instructions that, when executed, cause at least one processor to perform operations including:
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising the instruction device.
  • the instruction device implements the functions specified in one or more flows of the flowchart and/or in one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing;
  • the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or in one or more blocks of the block diagram.
  • the embodiment of the invention discloses a smart device, an intelligent control method, and a computer storage medium, which can form a network with surrounding electronic devices based on a first signal, obtain required information from the surrounding environment, and navigate to a first area corresponding to the first location where the first electronic device is located; thus, the degree of intelligence of the autonomous control of the smart device is improved.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a smart device (100), comprising: a receiving unit (102) configured to receive a first signal sent by at least one first electronic device (202, 204), the first signal indicating a device identifier of the first electronic device (202, 204); a calculating unit (104) configured to calculate a first position of the first electronic device (202, 204) from the first signal after the receiving unit (102) receives the first signal; a first driving unit (106) configured to drive the smart device (100); and a control unit (108) configured to control the first driving unit (106) according to the first position, so as to navigate the smart device (100) to a first area, the first area corresponding to the first position. The present invention also relates to an intelligent control method for controlling the smart device (100) and a computer storage medium.
PCT/CN2016/107306 2015-12-01 2016-11-25 Smart device, intelligent control method therefor and computer storage medium WO2017092620A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510864522.0A CN105527869B (zh) 2015-12-01 2015-12-01 Smart device and intelligent control method therefor
CN201510864522.0 2015-12-01

Publications (1)

Publication Number Publication Date
WO2017092620A1 (fr) 2017-06-08

Family

ID=55770161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/107306 WO2017092620A1 (fr) 2015-12-01 2016-11-25 Dispositif intelligent, son procédé de commande intelligent et support de stockage informatique

Country Status (2)

Country Link
CN (1) CN105527869B (fr)
WO (1) WO2017092620A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105527869B (zh) * 2015-12-01 2019-03-01 纳恩博(北京)科技有限公司 Smart device and intelligent control method therefor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004151924A (ja) * 2002-10-30 2004-05-27 Sony Corp Autonomous mobile robot and control method therefor
CN1954974A (zh) * 2005-10-28 2007-05-02 Lg电子株式会社 Mobile robot and mobile robot charging station return system
CN102845285A (zh) * 2012-06-07 2013-01-02 常熟理工学院 Solar-powered automatic flower-watering robot
CN104199452A (zh) * 2014-09-26 2014-12-10 上海未来伙伴机器人有限公司 Mobile robot, mobile robot system, and movement and communication methods
CN105527869A (zh) * 2015-12-01 2016-04-27 纳恩博(北京)科技有限公司 Smart device and intelligent control method therefor

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2998816B1 (fr) * 2001-06-12 2018-12-05 iRobot Corporation Multimodal coverage for an autonomous robot
CN101957617A (zh) * 2010-10-13 2011-01-26 东莞市泰斗微电子科技有限公司 Home appliance control device and control method
CN103389486B (zh) * 2012-05-07 2017-04-19 联想(北京)有限公司 Control method and electronic device
CN103778143B (zh) * 2012-10-22 2017-09-01 联想(北京)有限公司 Map construction method, electronic device and system
CN103198572B (zh) * 2013-02-19 2016-01-13 东莞宇龙通信科技有限公司 Intelligent control method and system for devices within an area
CN104887155B (zh) * 2015-05-21 2017-05-31 南京创维信息技术研究院有限公司 Intelligent floor sweeper
CN105005305B (zh) * 2015-07-22 2018-07-31 上海思依暄机器人科技股份有限公司 Controlled robot, remote control device, robot system and applicable method

Also Published As

Publication number Publication date
CN105527869A (zh) 2016-04-27
CN105527869B (zh) 2019-03-01

Similar Documents

Publication Publication Date Title
US11961285B2 (en) System for spot cleaning by a mobile robot
JP7395229B2 (ja) 状況認識のためのモバイル清掃ロボット人工知能
KR102639675B1 (ko) 이동 로봇 시스템, 이동 로봇 및 이동 로봇 시스템의 제어 방법
US10335949B2 (en) System for operating mobile robot based on complex map information and operating method thereof
US20220032458A1 (en) Systems, apparatus, and methods for robotic learning and execution of skills
US9399290B2 (en) Enhancing sensor data by coordinating and/or correlating data attributes
KR101857952B1 (ko) 청소로봇을 원격으로 제어하기 위한 원격 제어 장치, 제어 시스템 및 제어 방법
González-Jiménez et al. Technical improvements of the Giraff telepresence robot based on users' evaluation
US10612934B2 (en) System and methods for robotic autonomous motion planning and navigation
US20230057965A1 (en) Robot and control method therefor
JP2011201002A (ja) ロボット装置、ロボット装置の遠隔制御方法及びプログラム
US11014243B1 (en) System and method for instructing a device
KR100962593B1 (ko) 영역 기반의 청소기 제어 방법 및 장치, 그 기록 매체
US11833684B2 (en) Systems, apparatus, and methods for robotic learning and execution of skills
JP2017054475A (ja) 遠隔操作装置、方法及びプログラム
KR102370873B1 (ko) 로봇 원격 제어 방법 및 시스템
CN110146098A (zh) 一种机器人地图扩建方法、装置、控制设备和存储介质
Petersson et al. Systems integration for real-world manipulation tasks
KR20210004487A (ko) 환기 상황을 자동으로 파악할 수 있는 인공 지능 장치 및 그의 동작 방법
KR20150097049A (ko) 네추럴 ui를 이용한 자율서빙 로봇 시스템
WO2023115927A1 (fr) Procédé de mappage de robot en nuage, système, dispositif et support d'enregistrement
KR20220151460A (ko) 통신 기반 로봇 제어 방법 및 시스템
CN114505840A (zh) 一种自主操作箱式电梯的智能服务机器人
KR102656581B1 (ko) 이동 로봇, 그를 포함하는 시스템 및 이동 로봇의 제어 방법
WO2017092620A1 (fr) Dispositif intelligent, son procédé de commande intelligent et support de stockage informatique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16869930

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16869930

Country of ref document: EP

Kind code of ref document: A1