WO2021125415A1 - Space monitoring robot through 360-degree space photography - Google Patents
Space monitoring robot through 360-degree space photography
- Publication number
- WO2021125415A1 (PCT/KR2019/018370)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- space
- robot
- degree
- processor
- monitoring
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present disclosure relates to a space monitoring robot through 360-degree space photography and, more specifically, to a mobile robot that monitors various spaces such as a home or an office through a processor capable of monitoring 360 degrees simultaneously, and that may include a processor for caring for a subject of care.
- the main application markets for monitoring robots include smart homes, medical care, business management, retail, facility management and operation, and smart factories. Such a robot is a fusion of robotics and communication in which the user remotely controls the robot, or the robot moves by itself, allowing the user to look around or talk with another person as if actually present.
- the present disclosure addresses the above-described problems; its core contribution is a robot that can open a new market for spatial monitoring by overcoming the line-of-sight limit of existing monitoring robots while being priced lower than a telepresence robot.
- existing home monitoring robots have blind spots because they monitor with a fixed field of view, whereas the robot of the present invention can monitor various spaces in real time because it is able to monitor 360 degrees.
- the present invention includes monitoring the entire space in real time by processing images input through a plurality of lenses and combining them into one image (a minimal stitching sketch follows below).
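The patent does not give an implementation for the multi-lens merge; the sketch below only illustrates the idea with off-the-shelf image stitching. The camera indices and the use of OpenCV are illustrative assumptions, not details from the patent.

```python
# Illustrative sketch: merge frames from several lenses into one panorama.
import cv2

def grab_panorama(camera_indices=(0, 1)):
    """Capture one frame per lens and stitch them into a single image."""
    frames = []
    for idx in camera_indices:
        cap = cv2.VideoCapture(idx)
        ok, frame = cap.read()
        cap.release()
        if ok:
            frames.append(frame)
    if len(frames) < 2:
        raise RuntimeError("need at least two frames to stitch")
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

if __name__ == "__main__":
    cv2.imwrite("panorama.jpg", grab_panorama())
```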
- a monitoring robot equipped with such a 360-degree omnidirectional camera has the advantage of being able to review, without blind areas, every space where the robot has been located, because past recordings can be checked with a timeline function.
- the robot is equipped with a high-performance microphone and a high-quality speaker for two-way conversation, and includes a noise-cancelling conversation function, a one-way focused conversation function, and removal of background noise and distracting voices.
- the robot includes a laser distance measuring sensor that can measure the distance between the camera and an object.
- the robot further includes an autonomous driving function that allows it to move by itself through camera image recognition.
- by interlocking IoT devices with these autonomous driving functions and voice recognition technology, the robot can perform intelligent space security and monitoring missions beyond the limits of existing monitoring robots, and it includes the ability to actively monitor objects.
- the most important part of the monitoring robot is the camera function that can monitor the space through 360 degrees; the robot combines the images from two or more cameras into a 360-degree omnidirectional monitoring view, and it includes movement path analysis and subject-following techniques (a following-behavior sketch appears below).
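As a rough illustration of the subject-following idea, and not the patented method, the horizontal position of a tracked subject in an equirectangular 360-degree frame can be mapped to a bearing and turned into a simple follow command. The gains, distances, and function names below are assumptions.

```python
# Sketch: bearing of a subject in a 360-degree panorama and a follow command.
import math

def bearing_from_panorama(cx_pixels: float, frame_width: int) -> float:
    """Map a horizontal pixel position in a 360-degree panorama to a bearing
    in degrees, where 0 is straight ahead and positive is clockwise."""
    return (cx_pixels / frame_width) * 360.0 - 180.0

def follow_command(bearing_deg: float, distance_m: float,
                   keep_distance_m: float = 1.5) -> dict:
    """Small proportional controller: turn toward the subject and move
    forward only while it is farther than the desired following distance."""
    turn_rate = 0.01 * bearing_deg                           # rad/s, arbitrary gain
    forward = 0.3 if distance_m > keep_distance_m else 0.0   # m/s
    return {"linear": forward, "angular": turn_rate}

# Example: subject centred at pixel 2880 of a 3840-pixel-wide panorama,
# measured 3.2 m away by the distance sensor.
print(follow_command(bearing_from_panorama(2880, 3840), 3.2))
```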
- since the monitoring robot is developed in a form capable of autonomous driving, unlike existing robots, it does not function only while a person is connected; it performs monitoring and surveillance tasks at any time and in any space designated by the user.
- obstacle recognition and mapping algorithms for autonomous driving are included, and mapping correction algorithms are included through fusion of the gyro sensor, encoder sensor, and camera sensor (a simple fusion sketch follows below).
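The patent names the sensors but not the fusion scheme. One common way to realize the gyro/encoder part of such a correction is a complementary filter; the sketch below is an illustrative assumption, including the gain and wheel geometry.

```python
# Minimal complementary-filter sketch for gyro/encoder heading correction.
import math

class HeadingEstimator:
    def __init__(self, wheel_base_m: float, alpha: float = 0.98):
        self.wheel_base = wheel_base_m   # distance between the two drive wheels
        self.alpha = alpha               # how much the gyro is trusted
        self.heading = 0.0               # radians

    def update(self, gyro_rate: float, d_left_m: float, d_right_m: float,
               dt: float) -> float:
        """Blend gyro integration with wheel-encoder odometry."""
        gyro_heading = self.heading + gyro_rate * dt
        odo_heading = self.heading + (d_right_m - d_left_m) / self.wheel_base
        self.heading = self.alpha * gyro_heading + (1 - self.alpha) * odo_heading
        # wrap the angle into (-pi, pi]
        self.heading = math.atan2(math.sin(self.heading), math.cos(self.heading))
        return self.heading
```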
- the monitoring robot can perform various services through 360-degree omnidirectional monitoring and autonomous driving. It includes an auto-charging function by which the robot finds the charger and docks to charge automatically.
- the robot contains a shock absorber and a sensing device to prevent it from overturning or being damaged.
- it includes a situation-aware speed control algorithm: by judging the surrounding situation through a distance sensor that detects nearby obstacles, the robot keeps its speed stable when there are many obstacles, even if the user commands the maximum speed, while ensuring quick driving when no obstacles are present (a minimal controller sketch follows below).
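A minimal sketch of such a situation-aware speed cap, assuming a list of obstacle distances from the distance sensor; the radius, speeds, and scaling rule are placeholder values, not taken from the patent.

```python
# Sketch: clamp the commanded speed by how close and numerous obstacles are.
def safe_speed(commanded_mps: float, obstacle_distances_m: list[float],
               caution_radius_m: float = 1.0, max_mps: float = 1.2) -> float:
    nearby = [d for d in obstacle_distances_m if d < caution_radius_m]
    if not nearby:
        return min(commanded_mps, max_mps)      # open space: allow quick driving
    closest = min(nearby)
    cap = max_mps * (closest / caution_radius_m)   # slow down as obstacles approach
    cap /= (1 + 0.5 * (len(nearby) - 1))           # crowded surroundings slow further
    return min(commanded_mps, cap)

# Even if the user commands full speed, obstacles at 0.4 m and 0.7 m
# limit the robot to roughly 0.32 m/s.
print(safe_speed(1.2, [0.4, 0.7, 2.5]))
```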
- representative services developed through image processing in the monitoring robot include an object tracking algorithm, which analyzes the movement path of the object and converts it into data in order to check the amount of movement or an abnormal state of the object, and an emergency notification service (see the path-analysis sketch below).
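The path-to-data conversion could look like the following sketch, which accumulates tracked positions, totals the movement amount, and flags prolonged inactivity. The time window and distance threshold are illustrative assumptions, not values from the patent.

```python
# Sketch: turn a tracked path into a movement amount and an abnormality flag.
import math
from dataclasses import dataclass

@dataclass
class PathPoint:
    t: float   # seconds
    x: float   # metres
    y: float   # metres

def movement_amount(path: list[PathPoint]) -> float:
    """Total distance covered along the recorded path."""
    return sum(math.dist((a.x, a.y), (b.x, b.y))
               for a, b in zip(path, path[1:]))

def looks_abnormal(path: list[PathPoint], window_s: float = 1800,
                   min_move_m: float = 0.5) -> bool:
    """Flag the object if it has barely moved during the last window_s seconds."""
    if not path:
        return False
    cutoff = path[-1].t - window_s
    recent = [p for p in path if p.t >= cutoff]
    return movement_amount(recent) < min_move_m
```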
- the robot also includes a security function in which it moves to a location by itself, checks the status there, and transmits an image.
- the monitoring robot includes a departure notification technology that confirms the departure of the object by sending a notification to a registered terminal when the object moves into a movement-prohibited area set by the user, or when it leaves the robot's field of view (a geofence-check sketch follows below).
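A simple way to picture the departure check, assuming the prohibited areas are axis-aligned rectangles in the robot's map frame and that "out of view" means no detection for a timeout; the areas, timeout, and notify callback are placeholders.

```python
# Sketch: prohibited-area and field-of-view departure check.
import time

PROHIBITED_AREAS = [(4.0, 1.0, 6.0, 3.0)]   # (x_min, y_min, x_max, y_max), example only
OUT_OF_VIEW_TIMEOUT_S = 30

def in_prohibited_area(x: float, y: float) -> bool:
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for x0, y0, x1, y1 in PROHIBITED_AREAS)

def check_departure(position, last_seen_ts: float, notify) -> None:
    """position is (x, y) in the map frame, or None when the object is not detected."""
    if position is not None and in_prohibited_area(*position):
        notify("object entered a movement-prohibited area")
    elif position is None and time.time() - last_seen_ts > OUT_OF_VIEW_TIMEOUT_S:
        notify("object left the robot's field of view")
```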
- a space monitoring robot through 360-degree space photography includes a camera for acquiring a 360-degree image, a sensor unit for detecting obstacles, a moving means for moving the robot, and a processor that extracts and analyzes information about the omnidirectional space using the image acquired through the camera and checks for abnormalities of the space and the object.
- the processor controls the moving means so that the robot moves to a specific area to care for the space and the object.
- the processor analyzes the movement path of the object through the 360-degree omnidirectional image acquired through the camera.
- the processor transmits the acquired image to the server and to the user terminal, and transmits an abnormal-state message to the user terminal and the manager terminal when an abnormal state of the space or the object is confirmed (a notification sketch follows below).
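The transmission path is not specified in the patent; the sketch below assumes a plain HTTP upload to a hypothetical server endpoint that relays the abnormal-state message to the user and manager terminals. The URL and payload fields are assumptions.

```python
# Sketch: forward an abnormal-state event and a snapshot to a relay server.
import requests

SERVER_URL = "https://example.com/api/alerts"   # hypothetical endpoint

def send_abnormal_state(robot_id: str, description: str, image_path: str) -> None:
    with open(image_path, "rb") as img:
        requests.post(
            SERVER_URL,
            data={"robot_id": robot_id,
                  "event": "abnormal_state",
                  "description": description,
                  "targets": "user_terminal,manager_terminal"},
            files={"snapshot": img},
            timeout=5,
        )
```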
- the robot can take the first response by itself, thereby reducing the burden on the administrator; and since the entire space can be recorded at the time an event occurs, a situation monitoring service without blind spots can be provided.
- FIGS. 1 and 2 are diagrams for explaining a space monitoring robot through 360-degree space photography according to an embodiment of the present invention.
- FIG. 3 is a block diagram illustrating a space monitoring robot through 360-degree space photography according to an embodiment of the present invention.
- FIG. 4 is a flowchart illustrating a process in which a space monitoring robot is driven through 360-degree space photography.
- FIGS. 1 and 2 are diagrams for explaining a space monitoring robot through 360-degree space photography according to an embodiment of the present invention.
- the moving means may be a means for moving the body of the robot.
- the moving means may include a wheel-like component.
- the camera may acquire images for 360 degrees.
- the camera may acquire a 360-degree image while rotating through the driving unit, or may acquire a 360-degree image by merging images obtained from two or more cameras in real time.
- the controller may control components included in the mobile space monitoring robot.
- the controller may extract and analyze information about the omnidirectional space by using the image acquired through the camera.
- while the monitoring robot, which can move through the space by itself, is moving, the entire space can be monitored at once through the omnidirectional camera, unlike existing monitoring robots.
- the camera continuously collects omnidirectional images of the space and transmits them to the server, so the user can access the server at any time and review the omnidirectional images of the space, either from the past or in real time.
- the present invention continuously checks the movement path of the object.
- in a conventional design the robot would have to be driven continuously to keep the object in view, but in the present invention the object can be continuously monitored without driving the robot.
- FIG. 3 is a block diagram illustrating a space monitoring robot through 360-degree space photography according to an embodiment of the present invention.
- the space monitoring robot 100 through 360-degree space photography may include a communication unit 110, a processor 120, a position detection sensor 130, a camera 140, and a moving means 150.
- the communication unit 110 may perform communication with various types of external devices according to various wired and wireless communication methods under the control of the processor 120 .
- the communication unit 110 may communicate with various servers through a network such as the Internet.
- the position detection sensor 130 may detect an obstacle and detect the position of the robot.
- the position detection sensor 130 may include various sensors such as an infrared sensor, an ultraviolet sensor, a GPS sensor, and the like.
- the camera 140 captures a 360-degree external image under the control of the processor 120.
- the camera 140 may capture a 360-degree surrounding image while the space monitoring robot 100 is moving. The captured image data may be provided to the processor 120.
- the moving means 150 may move the space monitoring robot 100 through 360-degree space photography.
- the moving means 150 may include a rotating unit such as a wheel.
- the processor 120 controls the overall operation of the space monitoring robot through 360-degree space photography.
- the processor 120 builds a spatial map and maps the location of the charging unit.
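Assuming the spatial map is a simple occupancy grid (the patent does not fix a map representation), building it and registering the charging unit location might look like this sketch; the resolution and class names are illustrative.

```python
# Sketch: occupancy-grid space map with a stored charger location.
import numpy as np

class SpaceMap:
    def __init__(self, width_m: float, height_m: float, resolution_m: float = 0.05):
        self.resolution = resolution_m
        self.grid = np.zeros((int(height_m / resolution_m),
                              int(width_m / resolution_m)), dtype=np.uint8)
        self.charger_cell = None

    def mark_obstacle(self, x_m: float, y_m: float) -> None:
        self.grid[int(y_m / self.resolution), int(x_m / self.resolution)] = 1

    def set_charger(self, x_m: float, y_m: float) -> None:
        """Remember where the charging unit is so the robot can dock later."""
        self.charger_cell = (int(y_m / self.resolution), int(x_m / self.resolution))
```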
- the processor 120 acquires a 360-degree omnidirectional image.
- the processor 120 performs autonomous driving of the robot and senses an obstacle or an object.
- the processor 120 controls the robot to move to a specific area to care for the space and the object.
- the processor 120 analyzes the movement path of the object through the 360-degree omnidirectional image.
- the processor 120 transmits the acquired image to the server and transmits it to the user terminal.
- the processor 120 checks the abnormal state of the space and the object.
- when the abnormal state is confirmed, the processor 120 notifies the user terminal and the manager terminal of the abnormal state.
- the storage unit (not shown) stores service space map data and stores various programs and data for the operation of the space monitoring robot 100 through 360-degree space photography.
- the charging unit (not shown) is provided separately and interworks with the robot so that the robot can dock by itself and be charged.
- the robot 100 may move to the region where an abnormal state of the object has occurred, based on data transmitted from a wearable sensor, and may photograph and transmit the state of the object.
- FIG. 4 is a flowchart illustrating a process in which a space monitoring robot is driven through 360-degree space photography.
- the space monitoring robot through 360-degree space photography builds a space map and maps the location of the charging unit (300).
- the robot acquires a 360-degree omnidirectional image (310).
- the robot performs autonomous driving and senses an obstacle or an object (320).
- the robot moves to a specific area to care for the space and the object (330).
- the robot analyzes the movement path of the object through the 360-degree omnidirectional image (340).
- the robot transmits the acquired image to the server and to the user terminal (350).
- the robot checks the abnormal state of the space and the object (360).
- when an abnormal state is confirmed, an abnormal-state message is transmitted (notified) to the user terminal and the manager terminal (370). Accordingly, the manager or guardian can grasp the situation in real time.
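Putting the steps of FIG. 4 together, a high-level control loop could be organized as below. Every helper called here (build_space_map, acquire_panorama, and so on) is a placeholder name rather than an API defined by the patent.

```python
# Sketch of the flow of FIG. 4 (steps 300-370) as one loop.
def monitoring_loop(robot):
    space_map = robot.build_space_map()          # 300: build map, register charger
    while robot.is_running():
        frame = robot.acquire_panorama()         # 310: 360-degree omnidirectional image
        obstacles, objects = robot.sense(frame)  # 320: autonomous driving and sensing
        robot.drive(space_map, obstacles)
        target = robot.select_care_target(objects)
        if target:
            robot.move_to(target.area)           # 330: move to the specific area
        path = robot.analyze_path(frame, target) # 340: movement-path analysis
        robot.upload(frame, path)                # 350: send to server and user terminal
        state = robot.assess_state(frame, path)  # 360: check for an abnormal state
        if state.is_abnormal:                    # 370: notify user and manager terminals
            robot.notify(["user_terminal", "manager_terminal"], state)
```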
- the described embodiments may be configured by selectively combining all or part of each of the embodiments so that various modifications can be made.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Fuzzy Systems (AREA)
- Evolutionary Computation (AREA)
- Human Computer Interaction (AREA)
- Alarm Systems (AREA)
Abstract
Description
Claims (4)
- A space monitoring robot through 360-degree space photography, comprising: a camera that acquires a 360-degree image; a sensor unit for detecting an obstacle; a moving means for moving the robot; and a processor that extracts and analyzes information about the omnidirectional space using the image acquired through the camera and checks for abnormalities of the space and an object.
- The space monitoring robot through 360-degree space photography of claim 1, wherein the processor controls the moving means so that the robot moves to a specific area to care for the space and the object.
- The space monitoring robot through 360-degree space photography of claim 1, wherein the processor analyzes the movement path of the object through the 360-degree omnidirectional image acquired through the camera.
- The space monitoring robot through 360-degree space photography of claim 1, wherein the processor transmits the acquired image to a server and to a user terminal, and transmits an abnormal-state message to the user terminal and a manager terminal when an abnormal state of the space or the object is confirmed.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20190171730 | 2019-12-20 | ||
KR10-2019-0171730 | 2019-12-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021125415A1 true WO2021125415A1 (ko) | 2021-06-24 |
Family
ID=76437169
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/018370 WO2021125415A1 (ko) | 2019-12-20 | 2019-12-24 | 360도 공간 촬영을 통한 공간 모니터링 로봇 |
Country Status (2)
Country | Link |
---|---|
US (1) | US20210187744A1 (ko) |
WO (1) | WO2021125415A1 (ko) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN118269113A (zh) * | 2022-12-29 | 2024-07-02 | Oppo广东移动通信有限公司 | Companion care method, apparatus, system, robot, and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP4455417B2 (ja) * | 2005-06-13 | 2010-04-21 | 株式会社東芝 | Mobile robot, program, and robot control method |
US9014848B2 (en) * | 2010-05-20 | 2015-04-21 | Irobot Corporation | Mobile robot system |
US9530058B2 (en) * | 2014-12-11 | 2016-12-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Visual-assist robots |
-
2019
- 2019-12-24 WO PCT/KR2019/018370 patent/WO2021125415A1/ko active Application Filing
- 2019-12-26 US US16/727,470 patent/US20210187744A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR20090012542A (ko) * | 2007-07-30 | 2009-02-04 | 주식회사 마이크로로봇 | Home monitoring system using a robot |
US10029370B2 (en) * | 2012-12-21 | 2018-07-24 | Crosswing Inc. | Control system for mobile robot |
- KR20170107341A (ko) * | 2016-03-15 | 2017-09-25 | 엘지전자 주식회사 | Mobile robot and control method of the mobile robot |
- KR20180098891A (ko) * | 2017-02-27 | 2018-09-05 | 엘지전자 주식회사 | Mobile robot and control method thereof |
- KR20190134554A (ko) * | 2019-11-15 | 2019-12-04 | 엘지전자 주식회사 | Method for identifying a dynamic obstacle and robot implementing the same |
Also Published As
Publication number | Publication date |
---|---|
US20210187744A1 (en) | 2021-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10102730B2 (en) | Monitoring apparatus for monitoring a targets exposure to danger | |
US9679458B2 (en) | Information processing system, information processing apparatus, information processing method, information processing program, portable communication terminal, and control method and control program of portable communication terminal | |
WO2017018744A1 (ko) | 무인 스마트카를 이용한 공익서비스 시스템 및 방법 | |
WO2020095379A1 (ja) | 画像処理装置、画像処理方法、及びコンピュータ可読媒体 | |
WO2021112283A1 (ko) | 홈 로봇 및 홈 디바이스 연동을 통한 스마트 홈 안전 모니터링 시스템 | |
KR102335994B1 (ko) | 드론 감시를 위한 촬영 감시 장치들의 통합 제어 장치 | |
WO2019221416A1 (ko) | 실시간 현장 동영상 중계를 이용한 시각장애인 안내 서비스 제공 방법 | |
WO2021125415A1 (ko) | 360도 공간 촬영을 통한 공간 모니터링 로봇 | |
CN107195167A (zh) | 受控设备及应用该受控设备的通信系统和方法 | |
WO2013157801A1 (ko) | 네트워크를 통한 현장 모니터링 방법, 및 이에 사용되는 관리 서버 | |
WO2024136078A1 (ko) | 폐쇄망 cctv용 정보 전송 시스템 | |
WO2023113394A1 (ko) | 범죄 예방을 위한 스마트 드론의 운용 방법 | |
WO2023096394A1 (ko) | 자세유형 판별을 위한 서버 및 그 동작방법 | |
KR20150103859A (ko) | 복합센서모듈을 이용한 센서 네트워크 시스템 | |
WO2022097805A1 (ko) | 이상 이벤트 탐지 방법, 장치 및 시스템 | |
Kogut et al. | Using video sensor networks to command and control unmanned ground vehicles | |
WO2017204598A1 (ko) | 촬영된 영상에 관한 데이터 프로토콜을 설정하는 단말기 및 방법 | |
WO2022145594A1 (ko) | 딥러닝 객체 추적을 통한 무인주차 관제 시스템 및 방법 | |
WO2014035053A1 (ko) | 초광각 카메라를 이용한 카메라 시스템 | |
KR20070061081A (ko) | 보안 로봇 및 로봇을 이용한 보안 방법 | |
WO2023128015A1 (ko) | 비전카메라를 이용한 크레인 안전관리 시스템 | |
CN108259831B (zh) | 一种摄像头及摄像头基站 | |
Seljanko | Low-cost electronic equipment architecture proposal for urban search and rescue robot | |
CN106200485A (zh) | 基于Android系统的公共管廊监控机器人 | |
WO2015182967A1 (ko) | 매칭시스템 및 매칭방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19956611; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19956611; Country of ref document: EP; Kind code of ref document: A1 |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19956611; Country of ref document: EP; Kind code of ref document: A1 |
 | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13/12/2023) |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19956611; Country of ref document: EP; Kind code of ref document: A1 |