WO2021109806A1 - Service robot, display control method therefor, controller and storage medium - Google Patents
Service robot, display control method therefor, controller and storage medium
- Publication number
- WO2021109806A1 (PCT/CN2020/127751)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- service robot
- controller
- distance
- user
- display screen
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
- B25J11/001—Manipulators having means for high-level communication with users, with emotions simulating means
- B25J11/008—Manipulators for service tasks
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
- B25J19/026—Acoustical sensing devices
Definitions
- the present disclosure relates to the field of artificial intelligence, and in particular to a service robot and a display control method therefor, a controller, and a computer-readable storage medium.
- a service robot, including:
- a human body recognition sensor, used to detect whether a user appears within a predetermined range around the service robot, and to output a start signal to the controller when a user appears within that range;
- a controller, used to control the mounting device to start running when the start signal is received;
- a mounting device, used to start running according to the control instructions of the controller.
- the mounting device includes at least one distance sensor, wherein:
- the distance sensor is used to measure the user's distance relative to the service robot after being started;
- the controller is used to determine the position of the user relative to the service robot according to the user's distance relative to the service robot and the setting position of the distance sensor, and to control the head of the service robot to rotate in the horizontal direction to the position corresponding to the user.
- the mounting device includes a plurality of distance sensors, wherein:
- the sum of the detection ranges of all the distance sensors covers the detection range of the human body recognition sensor.
- the human body recognition sensor and all distance sensors are provided on the service robot.
- the distance sensors are all arranged on the same horizontal plane at a predetermined distance from the human body recognition sensor.
- the distance sensors are arranged symmetrically on both sides of the vertical plane that passes through the human body recognition sensor and is perpendicular to that horizontal plane.
- alternatively, one distance sensor is arranged on the vertical plane that passes through the human body recognition sensor, and the remaining distance sensors are arranged symmetrically on both sides of that vertical plane.
- the mounting device includes a first display screen, wherein:
- the first display screen is set on the head of the service robot
- the controller is used to control the first display screen to rotate in the horizontal direction to the position corresponding to the user according to the distance and orientation of the user relative to the service robot, and to control the first display screen to make a corresponding expression change.
- the mounting device further includes a camera, and the camera is arranged above the first display screen, wherein:
- the camera is used to capture a camera image according to the instructions of the controller when the controller receives the start signal;
- the controller is used to identify the face area in the camera picture, and adjust the pitch and horizontal angle of the first display screen according to the position of the face area in the camera picture so that the face area is located in the center area of the camera picture.
- the mounting device further includes a second display screen, wherein:
- the second display screen is used to display the corresponding service content to the user according to the instruction of the controller when the controller receives the start signal.
- the controller includes a single-chip microcomputer and a robot processor, wherein:
- the single-chip microcomputer and the robot processor are connected through the CAN bus;
- the single-chip microcomputer is respectively connected with the pitch mechanism, the distance sensor and the first display screen;
- the robot processor is respectively connected with the camera and the second display screen.
- a display control method for a service robot including:
- receiving a start signal sent by a human body recognition sensor, where the human body recognition sensor detects whether a user appears within a predetermined range around the service robot, and outputs a start signal to the controller when a user appears within that range;
- controlling the mounting device to start operation includes: controlling the distance sensor to start operation, and measuring the distance of the user relative to the service robot.
- the display control method of the service robot further includes: determining the position of the user relative to the service robot according to the user's distance relative to the service robot and the setting position of the distance sensor, and controlling the head of the service robot to rotate horizontally to the position corresponding to the user.
- the display control method of the service robot further includes: controlling the first display screen to rotate in the horizontal direction to the position corresponding to the user according to the distance and orientation of the user relative to the service robot, and controlling the first display screen to make a corresponding expression change.
- controlling the mounting device to start running includes: controlling the camera to start running and capture a camera image.
- the display control method of the service robot further includes: identifying the face area in the camera image, and adjusting the pitch angle and horizontal angle of the first display screen according to the position of the face area in the camera image so that the face area is located in the center area of the camera image.
- a controller including:
- the signal receiving module is used to receive the start signal sent by the human body recognition sensor, where the human body recognition sensor detects whether a user appears within a predetermined range around the service robot, and outputs a start signal to the controller when a user appears within that range;
- the mounting control module is used to control the mounting equipment to start operation when the start signal is received;
- the controller is used to perform operations implementing the display control method of the service robot according to any of the foregoing embodiments.
- a controller including a memory and a processor, wherein:
- the memory is used to store instructions;
- the processor is configured to execute the instructions, so that the controller executes operations for implementing the display control method of the service robot according to any of the foregoing embodiments.
- a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions that, when executed by a processor, implement the display control method of the service robot described in any of the above embodiments.
- FIG. 1 is a schematic diagram of some embodiments of the service robot of the present disclosure.
- Fig. 2 is a schematic diagram of the positions of the various modules of the service robot in the service robot in some embodiments of the present disclosure.
- Fig. 3 is a schematic diagram of other embodiments of the service robot of the present disclosure.
- FIG. 4 is a schematic diagram of the detection area of the sensor in some embodiments of the disclosure.
- FIG. 5 is a schematic diagram of still other embodiments of the service robot of the present disclosure.
- Fig. 6 is a schematic diagram of the rotation of the first display screen of the service robot in some embodiments of the present disclosure.
- FIG. 7 is a schematic diagram of some embodiments of the display control method of the service robot of the present disclosure.
- Fig. 8 is a schematic diagram of some embodiments of the controller of the present disclosure.
- Fig. 9 is a schematic diagram of other embodiments of the controller of the present disclosure.
- the inventor discovered that in the related art the display screen of a service robot is fixed.
- the display screen of such a robot is always on, and makes no corresponding interaction as a person approaches or moves away.
- the present disclosure provides a service robot and a display control method thereof, a controller, and a storage medium, so as to improve the display interaction of the service robot.
- Figure 1 is a schematic diagram of some embodiments of a service robot of the present disclosure.
- Fig. 2 is a schematic diagram of the positions of the various modules of the service robot in the service robot in some embodiments of the present disclosure.
- the service robot of the present disclosure may include a human body recognition sensor 1, a controller 2, and a mounting device 3. Among them:
- the human body recognition sensor 1 is used to detect whether a user appears in a predetermined range around the service robot; and when a user appears in a predetermined range around the service robot, it outputs a start signal to the controller 2.
- the human body recognition sensor 1 may also be used to output a start signal to the controller 2 when the distance between the user and the service robot is less than or equal to a predetermined distance.
- the human body recognition sensor may be implemented as an infrared pyroelectric sensor.
- the human body recognition sensor 1 is a human body recognition sensor with a detection angle of 120° and a detection distance of 2 meters located on the chest of the service robot.
- the controller 2 is used to control the mounted device 3 to start running when the start signal is received.
- the human body recognition sensor 1 may also be used to output a standby signal to the controller 2 when the distance between the user and the service robot is greater than a predetermined distance.
- the controller 2 can also be used to control the mounted device 3 to be in a standby state when a standby signal is received.
- the mounting device 3 is used to start operation according to the control instruction of the controller 2.
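The wake/standby behaviour described above can be sketched as a small control loop. This is an illustrative Python sketch only; the class names, the signal strings, and the single `predetermined_distance_m` threshold are assumptions made for the example, not the patent's implementation.

```python
class MountedDevice:
    """Stand-in for mounting device 3: tracks whether it is running."""
    def __init__(self):
        self.running = False

    def start(self):
        self.running = True

    def standby(self):
        self.running = False


class Controller:
    """Stand-in for controller 2: starts the mounted device on a start
    signal and idles it on a standby signal."""
    def __init__(self, device):
        self.device = device

    def on_signal(self, signal):
        if signal == "start":
            self.device.start()
        elif signal == "standby":
            self.device.standby()


def human_body_sensor(distance_m, predetermined_distance_m=2.0):
    """Stand-in for human body recognition sensor 1: emits 'start' when a
    user is within the predetermined range, 'standby' otherwise."""
    return "start" if distance_m <= predetermined_distance_m else "standby"
```

For example, `Controller(MountedDevice())` driven with `human_body_sensor(1.5)` would start the device, and a later reading of `3.0` metres would return it to standby.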
- Fig. 3 is a schematic diagram of other embodiments of the service robot of the present disclosure.
- in some embodiments of the present disclosure, the mounting device 3 may include at least one of a second display screen 31 and a first display screen 32.
- the second display screen 31 and the first display screen 32 are used for displaying corresponding content to the user according to the instruction of the controller 2 when the controller receives the start signal.
- the second display screen 31 is located on the chest of the service robot; the human body recognition sensor 1 is located below the second display screen 31; and the first display screen 32 is located on the head of the service robot.
- the second display screen 31 may be a service content display screen
- the first display screen 32 may be an emoticon screen
- the mounting device 3 of the embodiment of FIG. 1 or FIG. 3 may further include a distance sensor 33, wherein:
- the distance sensor 33 is used to measure the user's distance relative to the service robot after being started.
- the distance sensor 33 may be implemented as an ultrasonic sensor, an optical sensor, or the like.
- the mounting device includes at least one distance sensor, wherein the sum of the detection ranges of all the distance sensors covers the detection range of the human body recognition sensor.
- the human body recognition sensor and all distance sensors are arranged on the service robot; the distance sensors are all arranged on the same horizontal plane at a predetermined distance from the human body recognition sensor.
- the distance sensors are arranged symmetrically on both sides of the vertical plane that passes through the human body recognition sensor.
- alternatively, one distance sensor is arranged on the vertical plane passing through the human body recognition sensor, and the remaining distance sensors are arranged symmetrically on both sides of that vertical plane.
- the controller 2 is used to determine the position of the user relative to the service robot according to the distance of the user relative to the service robot and the setting position of the distance sensor, and control the head of the service robot to rotate in the horizontal direction to the position corresponding to the user.
- the controller 2 can also be used to control the first display screen to rotate in the horizontal direction to the position corresponding to the user according to the distance and orientation of the user relative to the service robot, and to control the first display screen to make a corresponding expression change.
- FIG. 4 is a schematic diagram of the detection area of the sensor in some embodiments of the disclosure.
- Figure 4 is a schematic cross-sectional view of the chest of the service robot.
- the human body recognition sensor 1 is a sensor on the chest of the service robot with a detection angle of 120° and a detection distance of 2 meters; the distance sensor 33 may include a first ultrasonic sensor 331, a second ultrasonic sensor 332, and a third ultrasonic sensor 333 on the chest, each with a detection distance of 2 meters and a detection angle of 60°.
- the first ultrasonic sensor 331, the second ultrasonic sensor 332 and the third ultrasonic sensor 333 are located under the second display screen 31, and the human body recognition sensor 1 is located under the second ultrasonic sensor 332.
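The bearing of a detected user can be estimated from which of the three chest-mounted ultrasonic sensors reports the nearest reading. The sketch below is a hypothetical illustration: the mounting bearings of ±60° and 0° are an assumption chosen so the three 60° cones tile the 120° pyroelectric field of view; the patent does not specify numeric mounting angles.

```python
# Assumed mounting bearings of the three ultrasonic sensors (degrees,
# 0 = straight ahead, negative = robot's left). Illustrative values only.
SENSOR_BEARINGS_DEG = {"left": -60.0, "center": 0.0, "right": 60.0}
MAX_RANGE_M = 2.0  # detection distance of each sensor per the embodiment


def user_bearing(readings_m):
    """Return (bearing_deg, distance_m) of the nearest detected user,
    or None if no sensor reports a target within range.

    readings_m maps sensor name -> measured distance in metres, or None
    when that sensor sees nothing."""
    hits = {name: d for name, d in readings_m.items()
            if d is not None and d <= MAX_RANGE_M}
    if not hits:
        return None
    name = min(hits, key=hits.get)  # nearest reading wins
    return SENSOR_BEARINGS_DEG[name], hits[name]
```

The controller would then command the left-right rotation mechanism toward the returned bearing, e.g. `user_bearing({"left": None, "center": 1.2, "right": 1.8})` yields `(0.0, 1.2)`.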
- the mounting device 3 of the embodiment of FIG. 1 or FIG. 3 may further include a left-right rotation mechanism 34, wherein:
- the left-right rotation mechanism 34 is used to rotate left and right according to the instructions of the controller 2, so as to drive the first display screen 32 to rotate to the corresponding orientation in the horizontal direction.
- the left-right rotation mechanism 34 may be implemented as a first steering gear.
- the left and right rotation mechanism 34 is located in the chest of the robot and supports the neck of the robot.
- rotation of the left-right rotation mechanism 34 realizes rotation of the robot neck in the horizontal direction, that is, left-right rotation of the service robot head (including the first display screen).
- the mounting device 3 described in the embodiment of FIG. 1 or FIG. 3 may further include a camera 35, wherein:
- the camera 35 is arranged above the first display screen 32.
- the camera 35 is used to capture a camera image according to the instruction of the controller when the controller receives the start signal.
- the controller 2 is used to identify the face area in the camera picture, and adjust the pitch and horizontal angle of the first display screen 32 according to the position of the face area in the camera picture so that the face area is located in the center area of the camera picture.
- the mounting device 3 in the embodiment of FIG. 1 or FIG. 3 may further include a pitch mechanism 36, wherein:
- the controller 2 is configured to calculate the distance between the position of the face area in the camera image and the position of the central area of the camera image according to the position of the face area in the camera image, and convert the distance into the pitch of the first display screen 32 Angle and horizontal angle adjustment angle.
- the pitch mechanism 36 and the left and right rotation mechanism 34 are used to perform pitch rotation and left and right rotation according to the adjustment angle according to the instructions of the controller 2, so as to drive the first display screen 32 to perform pitch movement and left and right rotation, so that the face area Located in the center area of the camera screen.
- the pitch mechanism 36 may be implemented as a second steering gear.
- the left and right rotation mechanism 34 is located in the chest of the robot and supports the neck of the robot.
- the rotation of the left and right rotation mechanism 34 can realize the horizontal rotation of the robot neck
- the pitch mechanism 36 is located at the fixed position of the head and neck of the robot.
- rotation of the pitch mechanism 36 controls the pitch rotation of the robot head.
- the cooperation of the left-right rotation mechanism 34 and the pitch mechanism 36 realizes rotation of the robot head in two degrees of freedom, horizontal and vertical.
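The conversion from a face's pixel position to head-adjustment angles can be sketched with a simple linear mapping. The camera resolution, fields of view, and sign convention below are assumptions made for the example; the patent only states that the pixel offset is converted into pitch and horizontal adjustment angles.

```python
# Assumed camera parameters (illustrative; not specified in the patent).
IMG_W, IMG_H = 640, 480          # image size in pixels
HFOV_DEG, VFOV_DEG = 60.0, 45.0  # horizontal / vertical field of view


def centering_adjustment(face_cx, face_cy):
    """Return (yaw_deg, pitch_deg) to rotate the head so the face centre
    moves toward the image centre.

    Assumed convention: positive yaw turns the head right, positive
    pitch tilts it up; image y grows downward, hence the sign flip."""
    dx = face_cx - IMG_W / 2
    dy = IMG_H / 2 - face_cy
    yaw = dx / IMG_W * HFOV_DEG    # linear (pinhole-like) approximation
    pitch = dy / IMG_H * VFOV_DEG
    return yaw, pitch
```

A face already at the image centre yields a zero adjustment, while a face 160 px to the right of centre maps to a 15° yaw under these assumed parameters.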
- the service robot provided by the above embodiments of the present disclosure is mainly intended for service scenarios.
- the user's position is detected through the cooperation of the pyroelectric and ultrasonic sensors, the user's image is captured by the camera, the processor performs face detection, and the detection result is used as the basis for adjusting the pitch and horizontal angles of the display screen, so that the first display screen always faces the user. The user feels watched and psychologically respected, which gives the robot more anthropomorphic features and thereby enhances the user experience.
- the first display screen of the robot of the present disclosure is in a standby state when there is no user.
- when a user approaches, the first display screen lights up, so that the user can feel the anthropomorphic communication mode of the robot when using it.
- FIG. 5 is a schematic diagram of still other embodiments of the service robot of the present disclosure.
- the controller of the present disclosure may include a single chip computer 21 and a robot processor 22, wherein:
- the single-chip computer 21 and the robot processor 22 are connected through the CAN bus.
- the CAN bus may be a robot CAN bus.
- the single-chip microcomputer 21 is connected to the pitch mechanism 36, the left and right rotation mechanism 34, the human body recognition sensor 1, the distance sensor 33, and the first display screen 32, respectively.
- the robot processor 22 is connected to the camera 35 and the second display screen 31 respectively.
- the first ultrasonic sensor 331, the second ultrasonic sensor 332, and the third ultrasonic sensor 333 use the RS-485 interface, and communication between the ultrasonic sensors and the single-chip computer 21 passes through a 485-to-UART (Universal Asynchronous Receiver/Transmitter) conversion circuit.
- the first display screen 32 uses RS232 communication
- the single-chip computer 21 sends the content to be displayed through UART
- the UART to RS232 circuit converts the UART level to RS232 level.
- the first display screen 32 receives the instruction and changes the displayed content according to the instruction information.
- the service robot may further include a comparator 11, wherein:
- the comparator 11 is connected to the human body recognition sensor 1 and the single-chip computer 21 respectively.
- the human body recognition sensor 1 outputs high and low levels. When a person approaches, the output level of the pyroelectric sensor changes. The output level is compared by the comparator 11, which outputs the comparison result to the single-chip computer 21 for identification.
- the service robot is in a standby state when there are no users around.
- the human body recognition sensor 1 outputs a high level
- the comparator 11 compares and outputs a TTL level.
- the single-chip computer 21 reports the user-approach information to the robot processor 22 through the robot CAN bus, and the robot processor 22 wakes up the devices mounted on the robot.
- the robot processor 22 outputs the service content to be displayed to the second display screen 31 for display through the HDMI interface.
- the camera 35 above the first display screen 32 communicates with the robot processor 22 through a USB interface.
- the service robot may further include a first driving circuit 341 and a second driving circuit 361, wherein:
- the first driving circuit 341 is used to drive the left-right rotation mechanism 34 to rotate in the horizontal direction according to the instruction of the single-chip computer 21.
- the second driving circuit 361 is used for driving the pitch mechanism 36 to rotate in the vertical direction according to the instruction of the single chip computer 21.
- each ultrasonic sensor has a detection distance of 2 meters and a detection angle of 60°. When the robot is awakened, the distance between the user and the robot is less than 2 meters, and the ultrasonic sensors measure the user's distance from the robot. If the first ultrasonic sensor 331 detects a user at a distance of less than 2 meters, the single-chip computer 21 sends an instruction to the first drive circuit 341 to drive the left-right rotation mechanism 34 to rotate in the horizontal direction so that the first display screen 32 faces the direction of the first ultrasonic sensor 331, and the single-chip computer 21 sends an instruction to the first display screen 32 to display the preset content.
- Fig. 6 is a schematic diagram of the rotation of the first display screen of the service robot in some embodiments of the present disclosure.
- when the left-right rotation mechanism 34 rotates, the first display screen 32 faces the user, and the user enters the field of view captured by the camera 35, as shown in FIG. 6.
- the camera 35 transmits the collected images to the robot processor 22 via USB; the robot processor 22 runs a face detection algorithm to select the nearest face, calculates the offset between the face area and the center area of the camera image, and adjusts the rotation angles of the left-right rotation mechanism 34 and the pitch mechanism 36 according to the two offset components, movement 1 and movement 2.
- the robot processor 22 sends instructions through the CAN bus to the single-chip computer 21 to rotate the left-right rotation mechanism 34 and the pitch mechanism 36. After receiving the instructions, the single-chip computer 21 controls the pitch mechanism 36 to rotate in the direction opposite to movement 1 and controls the left-right rotation mechanism 34 to rotate in the direction opposite to movement 2, gradually locating the face area in the center.
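The gradual centring described above is a closed loop: each cycle the processor measures the remaining face offset and the servos rotate by a fraction of it in the opposite direction. The sketch below simulates that loop in image-pixel units; the gain, tolerance, and step count are illustrative assumptions, and real servo dynamics are not modelled.

```python
def center_face(face_pos, image_center=(320, 240), gain=0.5, tol=1.0,
                max_steps=50):
    """Simulated centring loop. Returns the accumulated head pose
    (yaw_px, pitch_px) in image-pixel units once the face offset is
    within tolerance of the image centre.

    face_pos is the face centre the camera would see with the head in
    its initial pose; rotating the head shifts the observed face by the
    same amount in the opposite direction (a simplifying assumption)."""
    yaw, pitch = 0.0, 0.0
    for _ in range(max_steps):
        dx = (face_pos[0] - yaw) - image_center[0]    # horizontal offset ("movement 2")
        dy = (face_pos[1] - pitch) - image_center[1]  # vertical offset ("movement 1")
        if abs(dx) <= tol and abs(dy) <= tol:
            break
        yaw += gain * dx      # rotate opposite the measured offset
        pitch += gain * dy
    return yaw, pitch
```

With a gain below 1 the offset shrinks geometrically each cycle, which is why the face settles into the centre area rather than oscillating past it.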
- the first display screen 32 is facing the user at all times, so that the user can experience the feeling of being watched.
- the above-mentioned embodiments of the present disclosure can be used for service robots.
- the first display screen 32 of the robot is in a standby state when there is no user. When a user approaches to within 2 meters, the pyroelectric sensor detects that someone is approaching and wakes up the robot.
- the distance sensor 33 located on the chest of the robot measures the distance and orientation of the user.
- the first display screen 32 rotates to the corresponding orientation in the horizontal direction and makes corresponding expression changes.
- the camera located on the first display screen 32 recognizes the face area, and the pitch angle of the first display screen 32 is adjusted according to the position of the face in the image so that the face area is located in the center area of the camera image. Through the rotation and expression changes of the first display screen 32, the user feels attended to, thereby enhancing the interactive experience.
- FIG. 7 is a schematic diagram of some embodiments of the display control method of the service robot of the present disclosure. Preferably, this embodiment can be executed by the service robot of the present disclosure or the controller of the present disclosure. The method includes the following steps:
- Step 71: receive a start signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 detects whether a user appears within a predetermined range around the service robot, and outputs a start signal to the controller 2 when a user appears within that range (for example, when the user's distance from the service robot is less than or equal to the predetermined distance);
- Step 72: when the start signal is received, control the mounting device 3 to start running.
- in step 72, controlling the mounting device 3 to start running may further include: controlling the distance sensor 33 to start running and measuring the user's distance relative to the service robot; determining the position of the user relative to the service robot according to that distance and the setting position of the distance sensor; and controlling the head of the service robot to rotate horizontally to the position corresponding to the user.
- in step 72, controlling the mounting device 3 to start running may further include: controlling the first display screen 32 to rotate horizontally to the position corresponding to the user according to the distance and orientation of the user relative to the service robot, and controlling the first display screen to make a corresponding expression change.
- controlling the mounting device 3 to start running may further include: controlling the camera 35 to start running and capture a camera image; and identifying the face area in the camera image and adjusting the pitch and horizontal angles of the first display screen 32 according to the position of the face area in the camera image so that the face area is located in the center area of the camera image.
- adjusting the pitch angle and horizontal angle of the first display screen 32 according to the position of the face area in the camera image may include: calculating the distance between the position of the face area in the camera image and the position of the center area of the camera image; converting that distance into adjustment angles for the pitch angle and horizontal angle of the first display screen 32; and controlling the pitch mechanism 36 and the left-right rotation mechanism 34 to perform pitch rotation and left-right rotation according to the adjustment angles, so as to drive the first display screen 32 to pitch and rotate left and right until the face area is located in the center area of the camera image.
- the display control method of the service robot may further include: receiving a standby signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 outputs the standby signal to the controller 2 when the user's distance from the service robot is greater than the predetermined distance; and controlling the mounting device 3 to be in a standby state when the standby signal is received.
- the service robot display control method may further include: controlling the second display screen 31 and the first display screen 32 to display corresponding content to the user.
- the service robot display control method provided by the foregoing embodiments of the present disclosure, it can be applied to service robots.
- the purpose of the foregoing embodiments of the present disclosure is to improve the interactivity of the service robot's display interaction, so that users can perceive the robot's anthropomorphic manner of communication when using it, making the robot more lively and vivid and thereby improving the user experience.
- the above-mentioned embodiments of the present disclosure can better personify the service robot by improving the robot's display interactivity, so that the user experiences eye contact similar to that of communicating with a person.
- Fig. 8 is a schematic diagram of some embodiments of the controller of the present disclosure.
- the controller of the present disclosure (for example, the controller 2 in the embodiment of FIG. 1 and FIG. 3) may include a signal receiving module 201 and a mounting control module 202, wherein:
- the signal receiving module 201 is used to receive the start signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 detects whether a user appears within a predetermined range around the service robot and, when a user appears within that range (for example, when the distance between the user and the service robot is less than or equal to the predetermined distance), outputs the start signal to the controller 2.
- the mounting control module 202 is configured to control the mounting device 3 to start operation when the start signal is received.
- the mounting control module 202 can be used to, when the start signal is received, control the distance sensor 33 to start operation and measure the distance of the user relative to the service robot; determine the bearing of the user relative to the service robot according to that distance and the installation position of the distance sensor; and, according to the distance and bearing of the user relative to the service robot, control the first display screen 32 to rotate in the horizontal direction to the bearing corresponding to the user and to make a corresponding change of expression.
- the mounting control module 202 can be used to, when the start signal is received, control the camera 35 to start and run and capture the camera image; identify the face area in the camera image; and, according to the position of the face area in the camera image, adjust the pitch angle and horizontal angle of the first display screen 32 so that the face area is located in the central area of the camera image.
- when adjusting the pitch and horizontal angles of the first display screen 32 according to the position of the face area in the camera image so that the face area is located in the central area of the camera image, the mounting control module 202 can be used to calculate the distance between the position of the face area in the camera image and the position of the central area of the camera image, convert that distance into adjustment angles for the pitch angle and horizontal angle of the first display screen 32, and, according to the adjustment angles, control the pitch mechanism 36 and the left-right rotation mechanism 34 to perform pitch rotation and left-right rotation, so as to drive the first display screen 32 to pitch and rotate left and right until the face area is located in the central area of the camera image.
- the controller 2 may also be used to receive the standby signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 outputs the standby signal to the controller 2 when the distance between the user and the service robot is greater than the predetermined distance; and, when the standby signal is received, to control the mounted device 3 to enter a standby state.
- the controller 2 may also be used to control the second display screen 31 and the first display screen 32 to display corresponding content to the user.
- the controller 2 may be used to perform operations for implementing the service robot display control method described in any of the above embodiments (for example, the embodiment of FIG. 7).
- the controller provided by the foregoing embodiments of the present disclosure can be applied to service robots. The purpose of the foregoing embodiments of the present disclosure is to improve the interactivity of the service robot's display interaction, so that users can perceive the robot's anthropomorphic manner of communication when using it, making the robot more lively and vivid and thereby improving the user experience.
- Fig. 9 is a schematic diagram of other embodiments of the controller of the present disclosure.
- the controller of the present disclosure (for example, the controller 2 in the embodiment of FIG. 1 and FIG. 3) may include a memory 208 and a processor 209, wherein:
- the memory 208 is used to store instructions.
- the processor 209 is configured to execute the instructions, so that the controller 2 executes operations for implementing the service robot display control method described in any of the foregoing embodiments (for example, the embodiment of FIG. 7).
- the above-mentioned embodiments of the present disclosure can better personify the service robot by improving the robot's display interactivity, so that the user experiences eye contact similar to that of communicating with a person.
- a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the service robot display control method described in any of the above-mentioned embodiments (for example, the embodiment of FIG. 7).
- the interactivity of the service robot's display interaction can be improved, so that the user perceives the robot's anthropomorphic manner of communication when using it, making the robot more lively and vivid, thereby improving the user experience.
- the above-mentioned embodiments of the present disclosure can better personify the service robot by improving the robot's display interactivity, so that the user experiences eye contact similar to that of communicating with a person.
- the controller described above can be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or any suitable combination thereof.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A service robot, a display control method thereof, a controller, and a storage medium. The service robot display control method includes: receiving a start signal sent by a human body recognition sensor (1), where the human body recognition sensor (1) outputs the start signal to a controller (2) when a user appears within a predetermined range around the service robot; and, when the start signal is received, controlling a mounted device (3) to start operation. The robot's first display screen (32) is in a standby state when no user is present; when a user approaches the robot, the first display screen (32) starts up and brightens, so that the user perceives the robot's anthropomorphic manner of communication when using it.
Description
Cross-Reference to Related Applications
This application is based on and claims priority to CN application No. 201911220522.1, filed on December 3, 2019, the disclosure of which is hereby incorporated into this application in its entirety.
The present disclosure relates to the field of artificial intelligence, and in particular to a service robot, a display control method thereof, a controller, and a storage medium.
The development of artificial intelligence technology has driven the commercialization of service robots. Display interaction is one of the important interaction modes of a service robot.
Summary of the Invention
According to one aspect of the present disclosure, a service robot is provided, including:
a human body recognition sensor, configured to detect whether a user appears within a predetermined range around the service robot, and to output a start signal to a controller when a user appears within the predetermined range around the service robot;
a controller, configured to control a mounted device to start operation when the start signal is received; and
a mounted device, configured to start operation according to a control instruction of the controller.
In some embodiments of the present disclosure, the mounted device includes at least one distance sensor, wherein:
the distance sensor is configured to measure the distance of the user relative to the service robot after starting operation;
the controller is configured to determine the bearing of the user relative to the service robot according to the distance of the user relative to the service robot and the installation position of the distance sensor, and to control the head of the service robot to rotate in the horizontal direction to the bearing corresponding to the user.
In some embodiments of the present disclosure, the mounted device includes a plurality of distance sensors, wherein:
the sum of the detection ranges of all the distance sensors covers the detection range of the human body recognition sensor.
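This coverage requirement — the union of the distance sensors' angular detection ranges containing the human body recognition sensor's range — can be checked with a simple interval-union test. The 120°/60° sector values below match the example given later in this disclosure, while the function name and the coordinate convention (degrees relative to the robot's forward axis) are assumptions for illustration:

```python
def covers(pir_range, sensor_ranges):
    """Check that the union of the distance sensors' angular detection
    ranges covers the human body recognition sensor's angular range.

    pir_range: (lo, hi) angular range of the human body recognition sensor
    sensor_ranges: list of (lo, hi) ranges, one per distance sensor
    """
    lo, hi = pir_range
    pos = lo
    for s_lo, s_hi in sorted(sensor_ranges):
        if s_lo > pos:
            return False  # gap before this sensor's range begins
        pos = max(pos, s_hi)
    return pos >= hi

# Three 60-degree ultrasonic sectors vs. a 120-degree PIR sector
pir = (-60, 60)
ultrasonics = [(-90, -30), (-30, 30), (30, 90)]
assert covers(pir, ultrasonics)
```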
In some embodiments of the present disclosure, the human body recognition sensor and all the distance sensors are arranged on the service robot.
In some embodiments of the present disclosure, the distance sensors are all arranged on the same horizontal plane at a predetermined distance from the human body recognition sensor.
In some embodiments of the present disclosure, when the mounted device includes an even number of distance sensors, the distance sensors are arranged symmetrically on both sides of the vertical plane of said horizontal plane that passes through the human body recognition sensor.
In some embodiments of the present disclosure, when the mounted device includes an odd number of distance sensors, one distance sensor is arranged on the vertical plane of said horizontal plane that passes through the human body recognition sensor, and the other distance sensors are arranged symmetrically on both sides of that vertical plane.
In some embodiments of the present disclosure, the mounted device includes a first display screen, wherein:
the first display screen is arranged on the head of the service robot;
the controller is configured to control, according to the distance and bearing of the user relative to the service robot, the first display screen to rotate in the horizontal direction to the bearing corresponding to the user, and to control the first display screen to make a corresponding change of expression.
In some embodiments of the present disclosure, the mounted device further includes a camera arranged above the first display screen, wherein:
the camera is configured to capture the camera image according to an instruction of the controller when the controller receives the start signal;
the controller is configured to identify the face area in the camera image and, according to the position of the face area in the camera image, adjust the pitch angle and horizontal angle of the first display screen so that the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, the mounted device further includes a second display screen, wherein:
the second display screen is configured to display corresponding service content to the user according to an instruction of the controller when the controller receives the start signal.
In some embodiments of the present disclosure, the controller includes a microcontroller and a robot processor, wherein:
the microcontroller and the robot processor are connected via a CAN bus;
the microcontroller is connected to the pitch mechanism, the distance sensor, and the first display screen, respectively;
the robot processor is connected to the camera and the second display screen, respectively.
According to another aspect of the present disclosure, a service robot display control method is provided, including:
receiving a start signal sent by a human body recognition sensor, where the human body recognition sensor detects whether a user appears within a predetermined range around the service robot and outputs the start signal to a controller when a user appears within the predetermined range around the service robot; and
controlling a mounted device to start operation when the start signal is received.
In some embodiments of the present disclosure, controlling the mounted device to start operation includes: controlling a distance sensor to start operation and measuring the distance of the user relative to the service robot.
In some embodiments of the present disclosure, the service robot display control method further includes: determining the bearing of the user relative to the service robot according to the distance of the user relative to the service robot and the installation position of the distance sensor, and controlling the head of the service robot to rotate in the horizontal direction to the bearing corresponding to the user.
In some embodiments of the present disclosure, the service robot display control method further includes: controlling the first display screen to rotate in the horizontal direction to the bearing corresponding to the user according to the distance and bearing of the user relative to the service robot, and controlling the first display screen to make a corresponding change of expression.
In some embodiments of the present disclosure, controlling the mounted device to start operation includes: controlling a camera to start operation and capturing the camera image.
In some embodiments of the present disclosure, the service robot display control method further includes: identifying the face area in the camera image and, according to the position of the face area in the camera image, adjusting the pitch angle and horizontal angle of the first display screen so that the face area is located in the central area of the camera image.
According to another aspect of the present disclosure, a controller is provided, including:
a signal receiving module, configured to receive a start signal sent by a human body recognition sensor, where the human body recognition sensor detects whether a user appears within a predetermined range around the service robot and outputs the start signal to the controller when a user appears within the predetermined range around the service robot; and
a mounting control module, configured to control a mounted device to start operation when the start signal is received;
wherein the controller is configured to perform operations implementing the service robot display control method described in any of the above embodiments.
According to another aspect of the present disclosure, a controller is provided, including a memory and a processor, wherein:
the memory is configured to store instructions;
the processor is configured to execute the instructions, so that the controller performs operations implementing the service robot display control method described in any of the above embodiments.
According to another aspect of the present disclosure, a computer-readable storage medium is provided, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the service robot display control method described in any of the above embodiments.
In order to explain the technical solutions in the embodiments of the present disclosure or the related art more clearly, the drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present disclosure; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of some embodiments of the service robot of the present disclosure.
Fig. 2 is a schematic diagram of the positions of the modules of the service robot in some embodiments of the present disclosure.
Fig. 3 is a schematic diagram of other embodiments of the service robot of the present disclosure.
Fig. 4 is a schematic diagram of sensor detection areas in some embodiments of the present disclosure.
Fig. 5 is a schematic diagram of further embodiments of the service robot of the present disclosure.
Fig. 6 is a schematic diagram of the rotation of the first display screen of the service robot in some embodiments of the present disclosure.
Fig. 7 is a schematic diagram of some embodiments of the service robot display control method of the present disclosure.
Fig. 8 is a schematic diagram of some embodiments of the controller of the present disclosure.
Fig. 9 is a schematic diagram of other embodiments of the controller of the present disclosure.
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only some of the embodiments of the present disclosure, not all of them. The following description of at least one exemplary embodiment is in fact merely illustrative and in no way limits the present disclosure or its application or use. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
Unless specifically stated otherwise, the relative arrangement of components and steps, numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure.
At the same time, it should be understood that, for ease of description, the dimensions of the parts shown in the drawings are not drawn to actual scale.
Techniques, methods, and devices known to a person of ordinary skill in the relevant art may not be discussed in detail but, where appropriate, should be regarded as part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely exemplary, not as a limitation. Therefore, other examples of the exemplary embodiments may have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be discussed further in subsequent drawings.
Through research and development, the inventors found that, in the related art, the display screen of a service robot is fixed: when the robot is running, the display stays constantly lit and does not interact as a person approaches or moves away.
In view of at least one of the above technical problems, the present disclosure provides a service robot, a display control method thereof, a controller, and a storage medium, which improve the interactivity of the service robot's display interaction.
Fig. 1 is a schematic diagram of some embodiments of the service robot of the present disclosure. Fig. 2 is a schematic diagram of the positions of the modules of the service robot in some embodiments of the present disclosure.
As shown in Fig. 1, the service robot of the present disclosure may include a human body recognition sensor 1, a controller 2, and a mounted device 3, wherein:
the human body recognition sensor 1 is configured to detect whether a user appears within a predetermined range around the service robot, and to output a start signal to the controller 2 when a user appears within the predetermined range around the service robot.
In some embodiments of the present disclosure, the human body recognition sensor 1 may also be configured to output the start signal to the controller 2 when the distance between the user and the service robot is less than or equal to a predetermined distance.
In some embodiments of the present disclosure, the human body recognition sensor may be implemented as a pyroelectric infrared sensor.
In some embodiments of the present disclosure, as shown in Fig. 2, the human body recognition sensor 1 is a human body recognition sensor located on the chest of the service robot with a detection angle of 120° and a detection distance of 2 meters.
The controller 2 is configured to control the mounted device 3 to start operation when the start signal is received.
In some embodiments of the present disclosure, the human body recognition sensor 1 may also be configured to output a standby signal to the controller 2 when the distance between the user and the service robot is greater than the predetermined distance.
The controller 2 may also be configured to control the mounted device 3 to enter a standby state when the standby signal is received.
The mounted device 3 is configured to start operation according to a control instruction of the controller 2.
Fig. 3 is a schematic diagram of other embodiments of the service robot of the present disclosure. As shown in Fig. 3, in some embodiments of the present disclosure, the mounted device 3 may include at least one of a second display screen 31 and a first display screen 32.
The second display screen 31 and the first display screen 32 are configured to display corresponding content to the user according to an instruction of the controller 2 when the controller receives the start signal.
In some embodiments of the present disclosure, as shown in Fig. 2, the second display screen 31 is located on the chest of the service robot; the human body recognition sensor 1 is located below the second display screen 31, and the first display screen 32 is located on the head of the service robot.
In some embodiments of the present disclosure, the second display screen 31 may be a service content display screen, and the first display screen 32 may be an expression screen.
In some embodiments of the present disclosure, as shown in Fig. 2 and Fig. 3, the mounted device 3 of the embodiment of Fig. 1 or Fig. 3 may further include a distance sensor 33, wherein:
the distance sensor 33 is configured to measure the distance of the user relative to the service robot after starting operation.
In some embodiments of the present disclosure, the distance sensor 33 may be implemented as a distance sensor such as an ultrasonic sensor or an optical sensor.
In some embodiments of the present disclosure, as shown in Fig. 2 and Fig. 4, the mounted device includes at least one distance sensor, wherein the sum of the detection ranges of all the distance sensors covers the detection range of the human body recognition sensor. That is, the union of the detection ranges of all the distance sensors can cover the detection range of the human body recognition sensor.
In some embodiments of the present disclosure, as shown in Fig. 2 and Fig. 4, the human body recognition sensor and all the distance sensors are arranged on the service robot; the distance sensors are all arranged on the same horizontal plane at a predetermined distance from the human body recognition sensor.
In some embodiments of the present disclosure, when the mounted device includes an even number of distance sensors, the distance sensors are arranged symmetrically on both sides of the vertical plane of said horizontal plane that passes through the human body recognition sensor.
In some embodiments of the present disclosure, as shown in Fig. 2 and Fig. 4, when the mounted device includes an odd number of distance sensors, one distance sensor is arranged on the vertical plane of said horizontal plane that passes through the human body recognition sensor, and the other distance sensors are arranged symmetrically on both sides of that vertical plane.
The controller 2 is configured to determine the bearing of the user relative to the service robot according to the distance of the user relative to the service robot and the installation position of the distance sensor, and to control the head of the service robot to rotate in the horizontal direction to the bearing corresponding to the user.
In some embodiments of the present disclosure, the controller 2 may also be configured to control, according to the distance and bearing of the user relative to the service robot, the first display screen to rotate in the horizontal direction to the bearing corresponding to the user, and to control the first display screen to make a corresponding change of expression.
Fig. 4 is a schematic diagram of sensor detection areas in some embodiments of the present disclosure, showing a cross-section of the service robot's chest. As shown in Fig. 2 and Fig. 4, the human body recognition sensor 1 is located on the chest of the service robot with a detection angle of 120° and a detection distance of 2 meters; the distance sensor 33 may include a first ultrasonic sensor 331, a second ultrasonic sensor 332, and a third ultrasonic sensor 333, each located on the chest with a detection distance of 2 meters and a detection angle of 60°.
As shown in Fig. 2 and Fig. 4, the first ultrasonic sensor 331, the second ultrasonic sensor 332, and the third ultrasonic sensor 333 are located below the second display screen 31, and the human body recognition sensor 1 is located below the second ultrasonic sensor 332.
In some embodiments of the present disclosure, as shown in Fig. 3, the mounted device 3 of the embodiment of Fig. 1 or Fig. 3 may further include a left-right rotation mechanism 34, wherein:
the left-right rotation mechanism 34 is configured to rotate left and right according to an instruction of the controller 2, so as to drive the first display screen 32 to rotate in the horizontal direction to the corresponding bearing.
In some embodiments of the present disclosure, as shown in Fig. 2, the left-right rotation mechanism 34 may be implemented as a first servo. The left-right rotation mechanism 34 is located inside the robot's chest and supports the robot's neck; its rotation turns the robot's neck in the horizontal direction, that is, it enables left-right rotation of the service robot's head (including the first display screen).
In some embodiments of the present disclosure, as shown in Fig. 2 or Fig. 3, the mounted device 3 of the embodiment of Fig. 1 or Fig. 3 may further include a camera 35, wherein:
as shown in Fig. 2, the camera 35 is arranged above the first display screen 32.
The camera 35 is configured to capture the camera image according to an instruction of the controller when the controller receives the start signal.
The controller 2 is configured to identify the face area in the camera image and, according to the position of the face area in the camera image, adjust the pitch angle and horizontal angle of the first display screen 32 so that the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, as shown in Fig. 2 or Fig. 3, the mounted device 3 of the embodiment of Fig. 1 or Fig. 3 may further include a pitch mechanism 36, wherein:
the controller 2 is configured to calculate, according to the position of the face area in the camera image, the distance between the position of the face area and the position of the central area of the camera image, and to convert that distance into adjustment angles for the pitch angle and horizontal angle of the first display screen 32.
The pitch mechanism 36 and the left-right rotation mechanism 34 are configured to perform pitch rotation and left-right rotation according to the adjustment angles under the instruction of the controller 2, so as to drive the first display screen 32 to pitch and rotate left and right until the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, the pitch mechanism 36 may be implemented as a second servo.
In some embodiments of the present disclosure, as shown in Fig. 2, the left-right rotation mechanism 34 is located inside the robot's chest and supports the robot's neck; its rotation turns the robot's neck in the horizontal direction. The pitch mechanism 36 is located where the robot's head is fixed to the neck; its rotation controls the pitch of the robot's head. Through the cooperation of the left-right rotation mechanism 34 and the pitch mechanism 36, the robot's head can rotate in two degrees of freedom, horizontal and vertical.
The service robot provided by the above embodiments of the present disclosure detects the user's bearing through the cooperation of the pyroelectric and ultrasonic sensors, captures the user's image with the camera, and performs face detection with the processor, using the detection result as the basis for adjusting the pitch and horizontal angles of the first display screen. The first display screen can thus face the user at all times, giving the user a sense of being looked at; the user feels psychologically respected, the robot gains more anthropomorphic characteristics, and the user experience is improved.
In the present disclosure, the robot's first display screen is in a standby state when no user is present; when a user approaches the robot, the first display screen starts up and brightens, so that the user perceives the robot's anthropomorphic manner of communication when using it.
Fig. 5 is a schematic diagram of further embodiments of the service robot of the present disclosure. As shown in Fig. 5, the controller of the present disclosure (for example, the controller 2 in the embodiments of Fig. 1 and Fig. 3) may include a microcontroller 21 and a robot processor 22, wherein:
the microcontroller 21 and the robot processor 22 are connected via a CAN bus.
In some embodiments of the present disclosure, as shown in Fig. 5, the CAN bus may be the robot's CAN bus.
As shown in Fig. 5, the microcontroller 21 is connected to the pitch mechanism 36, the left-right rotation mechanism 34, the human body recognition sensor 1, the distance sensor 33, and the first display screen 32, respectively.
As shown in Fig. 5, the robot processor 22 is connected to the camera 35 and the second display screen 31, respectively.
In some embodiments of the present disclosure, as shown in Fig. 5, the first ultrasonic sensor 331, the second ultrasonic sensor 332, and the third ultrasonic sensor 333 use RS-485 interfaces; communication between the ultrasonic sensors and the microcontroller 21 requires conversion through a 485-to-UART (Universal Asynchronous Receiver/Transmitter) circuit.
In some embodiments of the present disclosure, as shown in Fig. 5, the first display screen 32 uses RS232 communication. The microcontroller 21 sends the content to be displayed via UART; a UART-to-RS232 circuit converts the UART levels to RS232 levels, and on receiving the instruction the first display screen 32 changes the displayed content according to the instruction information.
In some embodiments of the present disclosure, as shown in Fig. 5, the service robot may further include a comparator 11, wherein:
the comparator 11 is connected to the human body recognition sensor 1 and the microcontroller 21, respectively.
The human body recognition sensor 1 outputs a high or low level. When someone approaches, the output level of the pyroelectric sensor changes; the level output by the human body recognition sensor 1 is compared by the comparator 11, and the comparator 11 outputs the comparison result to the microcontroller 21 for identification.
The service robot is in a standby state when no user is nearby. When a user approaches to within 2 meters, the human body recognition sensor 1 outputs a high level, which the comparator 11 converts to a TTL level. The microcontroller 21 reports the user-approach information to the robot processor 22 via the robot's internal CAN bus, and the robot processor 22 wakes the devices mounted on the robot, outputting the service content to be displayed to the second display screen 31 via HDMI.
In some embodiments of the present disclosure, as shown in Fig. 5, the robot processor 22 outputs the content to be displayed to the second display screen 31 via an HDMI interface.
In some embodiments of the present disclosure, as shown in Fig. 5, the camera 35 on the first display screen 32 communicates with the processor via a USB interface.
In some embodiments of the present disclosure, as shown in Fig. 5, the service robot may further include a first drive circuit 341 and a second drive circuit 361, wherein:
the first drive circuit 341 is configured to drive the left-right rotation mechanism 34 to rotate in the horizontal direction according to an instruction of the microcontroller 21.
The second drive circuit 361 is configured to drive the pitch mechanism 36 to rotate in the vertical direction according to an instruction of the microcontroller 21.
In some embodiments of the present disclosure, as shown in Fig. 4, each ultrasonic sensor has a detection distance of 2 meters and a detection angle of 60°. Once the robot has been woken, the user is within 2 meters of the robot, and the ultrasonic sensors measure the distance between the user and the robot. If the first ultrasonic sensor 331 detects a user at a distance of less than 2 meters, the microcontroller 21 instructs the first drive circuit 341 to drive the left-right rotation mechanism 34 to rotate in the horizontal direction so that the first display screen 32 faces the direction of the first ultrasonic sensor 331, and the microcontroller 21 sends an instruction to the first display screen 32 to display preset content.
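The bearing selection just described — turn toward the sector of the ultrasonic sensor that reports a user within 2 meters, preferring the nearest reading — can be sketched as follows. The sensor labels and bearing values are hypothetical placements for illustration, not taken from the patent:

```python
def user_bearing(readings, sensor_bearings, max_range_m=2.0):
    """Pick the bearing of the distance sensor reporting the nearest user
    within range; returns None if no sensor currently sees a user.

    readings: {sensor_id: distance_m or None}
    sensor_bearings: {sensor_id: bearing_deg relative to the robot's forward axis}
    """
    best = None
    for sensor_id, distance in readings.items():
        if distance is not None and distance <= max_range_m:
            if best is None or distance < readings[best]:
                best = sensor_id
    return None if best is None else sensor_bearings[best]

bearings = {"left": -60.0, "center": 0.0, "right": 60.0}  # hypothetical layout
readings = {"left": None, "center": 1.2, "right": 1.8}    # metres
assert user_bearing(readings, bearings) == 0.0
```

The returned bearing would then be handed to the left-right rotation mechanism as the horizontal rotation target.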
Fig. 6 is a schematic diagram of the rotation of the first display screen of the service robot in some embodiments of the present disclosure. After the left-right rotation mechanism 34 finishes rotating, the first display screen 32 faces the user, and the user enters the field of view captured by the camera 35, as shown in Fig. 6. The camera 35 transmits the captured image to the robot processor 22 via USB; the robot processor 22 runs a face detection algorithm and selects the nearest face, then calculates the distance between the face area and the central area of the camera 35's image and, in two steps (movement 1 and movement 2), determines the rotation angle adjustments for the left-right rotation mechanism 34 and the pitch mechanism 36. The robot processor 22 sends instructions via the CAN bus to the microcontroller 21 to rotate the left-right rotation mechanism 34 and the pitch mechanism 36; on receiving the instructions, the microcontroller 21 controls the pitch mechanism 36 to rotate in the direction opposite to movement 1 and the left-right rotation mechanism 34 to rotate in the direction opposite to movement 2, gradually bringing the face area to the center of the camera 35's crosshair target area.
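The "gradually bringing the face to the crosshair center" behavior — rotating the mechanisms opposite to the measured offset on each iteration — can be modeled as a simple proportional loop. This is an illustrative sketch only; the gain, step count, tolerance, and function name are hypothetical and not specified by the patent:

```python
def center_face(face_offset, gain=0.5, steps=20, tol=1.0):
    """Iteratively apply yaw/pitch corrections opposite to the face's
    offset from the image center until the face is (near) centered.

    face_offset: initial (dx, dy) offset in degree-equivalent units
    gain: fraction of the remaining offset corrected per iteration
    """
    dx, dy = face_offset
    for _ in range(steps):
        if abs(dx) <= tol and abs(dy) <= tol:
            break  # face is within the crosshair's central tolerance
        # Rotate in the direction opposite to the measured offset
        dx -= gain * dx
        dy -= gain * dy
    return dx, dy

dx, dy = center_face((16.0, -8.0))
assert abs(dx) <= 1.0 and abs(dy) <= 1.0
```

In the robot this loop closes through hardware: each correction moves the servos, the camera re-measures the face position, and the residual offset shrinks toward zero.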
Through the posture adjustment of the first display screen 32 in the above embodiments of the present disclosure, the first display screen 32 always faces the user, giving the user the experience of being looked at.
The above embodiments of the present disclosure can be used for service robots. The robot's first display screen 32 is in a standby state when no user is present. When a user approaches the robot to within 2 meters, the pyroelectric sensor detects the approach and wakes the robot. The distance sensor 33 on the robot's chest measures the user's distance and bearing, and the first display screen 32 rotates in the horizontal direction to the corresponding bearing and makes a corresponding change of expression. The camera on the first display screen 32 identifies the face area and adjusts the pitch angle of the first display screen 32 according to the face's position in the image, so that the face area is located in the central area of the camera image. Through the rotation and expression changes of the first display screen 32, the user feels attended to, which improves the interaction experience.
Fig. 7 is a schematic diagram of some embodiments of the service robot display control method of the present disclosure. Preferably, this embodiment may be performed by the service robot or the controller of the present disclosure. The method includes the following steps:
Step 71: receive a start signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 detects whether a user appears within a predetermined range around the service robot and outputs the start signal to the controller 2 when a user appears within the predetermined range around the service robot (for example, when the distance between the user and the service robot is less than or equal to the predetermined distance).
Step 72: control the mounted device 3 to start operation when the start signal is received.
In some embodiments of the present disclosure, in step 72, controlling the mounted device 3 to start operation may further include: controlling the distance sensor 33 to start operation and measuring the distance of the user relative to the service robot; determining the bearing of the user relative to the service robot according to that distance and the installation position of the distance sensor; and controlling the head of the service robot to rotate in the horizontal direction to the bearing corresponding to the user.
In some embodiments of the present disclosure, in step 72, controlling the mounted device 3 to start operation may further include: controlling the first display screen 32 to rotate in the horizontal direction to the bearing corresponding to the user according to the distance and bearing of the user relative to the service robot, and controlling the first display screen to make a corresponding change of expression.
In some embodiments of the present disclosure, in step 72, controlling the mounted device 3 to start operation may further include: controlling the camera 35 to start operation and capturing the camera image; identifying the face area in the camera image; and, according to the position of the face area in the camera image, adjusting the pitch angle and horizontal angle of the first display screen 32 so that the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, adjusting the pitch angle and horizontal angle of the first display screen 32 according to the position of the face area in the camera image, so that the face area is located in the central area of the camera image, may include: calculating, according to the position of the face area in the camera image, the distance between the position of the face area and the position of the central area of the camera image, and converting that distance into adjustment angles for the pitch angle and horizontal angle of the first display screen 32; and controlling, according to the adjustment angles, the pitch mechanism 36 and the left-right rotation mechanism 34 to perform pitch rotation and left-right rotation, so as to drive the first display screen 32 to pitch and rotate left and right until the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, the service robot display control method may further include: receiving a standby signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 outputs the standby signal to the controller 2 when the distance between the user and the service robot is greater than the predetermined distance; and controlling the mounted device 3 to enter a standby state when the standby signal is received.
In some embodiments of the present disclosure, the service robot display control method may further include: controlling the second display screen 31 and the first display screen 32 to display corresponding content to the user.
The service robot display control method provided by the above embodiments of the present disclosure can be applied to service robots. The purpose of the above embodiments of the present disclosure is to improve the interactivity of the service robot's display interaction, so that users can perceive the robot's anthropomorphic manner of communication when using it, making the robot more lively and vivid and thereby improving the user experience.
The above embodiments of the present disclosure can better personify the service robot by improving the robot's display interactivity, so that the user experiences eye contact similar to that of communicating with a person.
Fig. 8 is a schematic diagram of some embodiments of the controller of the present disclosure. The controller of the present disclosure (for example, the controller 2 in the embodiments of Fig. 1 and Fig. 3) may include a signal receiving module 201 and a mounting control module 202, wherein:
the signal receiving module 201 is configured to receive the start signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 detects whether a user appears within a predetermined range around the service robot and outputs the start signal to the controller 2 when a user appears within the predetermined range around the service robot (for example, when the distance between the user and the service robot is less than or equal to the predetermined distance).
The mounting control module 202 is configured to control the mounted device 3 to start operation when the start signal is received.
In some embodiments of the present disclosure, the mounting control module 202 can be used to, when the start signal is received, control the distance sensor 33 to start operation and measure the distance of the user relative to the service robot; determine the bearing of the user relative to the service robot according to that distance and the installation position of the distance sensor; and, according to the distance and bearing of the user relative to the service robot, control the first display screen 32 to rotate in the horizontal direction to the bearing corresponding to the user and to make a corresponding change of expression.
In some embodiments of the present disclosure, the mounting control module 202 can be used to, when the start signal is received, control the camera 35 to start operation and capture the camera image; identify the face area in the camera image; and, according to the position of the face area in the camera image, adjust the pitch angle and horizontal angle of the first display screen 32 so that the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, when adjusting the pitch angle and horizontal angle of the first display screen 32 according to the position of the face area in the camera image so that the face area is located in the central area of the camera image, the mounting control module 202 can be used to calculate the distance between the position of the face area in the camera image and the position of the central area of the camera image, convert that distance into adjustment angles for the pitch angle and horizontal angle of the first display screen 32, and, according to the adjustment angles, control the pitch mechanism 36 and the left-right rotation mechanism 34 to perform pitch rotation and left-right rotation, so as to drive the first display screen 32 to pitch and rotate left and right until the face area is located in the central area of the camera image.
In some embodiments of the present disclosure, the controller 2 may also be configured to receive the standby signal sent by the human body recognition sensor 1, where the human body recognition sensor 1 outputs the standby signal to the controller 2 when the distance between the user and the service robot is greater than the predetermined distance, and to control the mounted device 3 to enter a standby state when the standby signal is received.
In some embodiments of the present disclosure, the controller 2 may also be configured to control the second display screen 31 and the first display screen 32 to display corresponding content to the user.
In some embodiments of the present disclosure, the controller 2 may be configured to perform operations implementing the service robot display control method described in any of the above embodiments (for example, the embodiment of Fig. 7).
The controller provided by the above embodiments of the present disclosure can be applied to service robots. The purpose of the above embodiments of the present disclosure is to improve the interactivity of the service robot's display interaction, so that users can perceive the robot's anthropomorphic manner of communication when using it, making the robot more lively and vivid and thereby improving the user experience.
Fig. 9 is a schematic diagram of other embodiments of the controller of the present disclosure. The controller of the present disclosure (for example, the controller 2 in the embodiments of Fig. 1 and Fig. 3) may include a memory 208 and a processor 209, wherein:
the memory 208 is configured to store instructions.
The processor 209 is configured to execute the instructions, so that the controller 2 performs operations implementing the service robot display control method described in any of the above embodiments (for example, the embodiment of Fig. 7).
The above embodiments of the present disclosure can better personify the service robot by improving the robot's display interactivity, so that the user experiences eye contact similar to that of communicating with a person.
According to another aspect of the present disclosure, a computer-readable storage medium is provided, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the service robot display control method described in any of the above embodiments (for example, the embodiment of Fig. 7).
The computer-readable storage medium provided by the above embodiments of the present disclosure can improve the interactivity of the service robot's display interaction, so that users can perceive the robot's anthropomorphic manner of communication when using it, making the robot more lively and vivid and thereby improving the user experience.
The above embodiments of the present disclosure can better personify the service robot by improving the robot's display interactivity, so that the user experiences eye contact similar to that of communicating with a person.
The controller described above can be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or any suitable combination thereof, for performing the functions described in this application.
The present disclosure has thus been described in detail. Some details well known in the art have not been described in order to avoid obscuring the concepts of the present disclosure. Based on the above description, those skilled in the art can fully understand how to implement the technical solutions disclosed herein.
A person of ordinary skill in the art can understand that all or part of the steps for implementing the above embodiments can be completed by hardware, or by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The description of the present disclosure is given for the sake of illustration and description, and is not exhaustive or intended to limit the present disclosure to the disclosed form. Many modifications and variations will be apparent to a person of ordinary skill in the art. The embodiments were chosen and described in order to better explain the principles and practical applications of the present disclosure, and to enable a person of ordinary skill in the art to understand the present disclosure and thereby design various embodiments, with various modifications, suited to particular uses.
Claims (16)
- A service robot, comprising: a human body recognition sensor, configured to detect whether a user appears within a predetermined range around the service robot, and to output a start signal to a controller when a user appears within the predetermined range around the service robot; a controller, configured to control a mounted device to start operation when the start signal is received; and a mounted device, configured to start operation according to a control instruction of the controller.
- The service robot according to claim 1, wherein the mounted device comprises at least one distance sensor, wherein: the distance sensor is configured to measure the distance of the user relative to the service robot after starting operation; and the controller is configured to determine the bearing of the user relative to the service robot according to the distance of the user relative to the service robot and the installation position of the distance sensor, and to control the head of the service robot to rotate in the horizontal direction to the bearing corresponding to the user.
- The service robot according to claim 2, wherein the mounted device comprises a plurality of distance sensors, wherein: the sum of the detection ranges of all the distance sensors covers the detection range of the human body recognition sensor.
- The service robot according to claim 3, wherein: the human body recognition sensor and all the distance sensors are arranged on the service robot; and the distance sensors are all arranged on the same horizontal plane at a predetermined distance from the human body recognition sensor.
- The service robot according to claim 4, wherein: when the mounted device comprises an even number of distance sensors, the distance sensors are arranged symmetrically on both sides of the vertical plane of said horizontal plane that passes through the human body recognition sensor; and when the mounted device comprises an odd number of distance sensors, one distance sensor is arranged on the vertical plane of said horizontal plane that passes through the human body recognition sensor, and the other distance sensors are arranged symmetrically on both sides of that vertical plane.
- The service robot according to any one of claims 2-5, wherein the mounted device comprises a first display screen, wherein: the first display screen is arranged on the head of the service robot; and the controller is configured to control, according to the distance and bearing of the user relative to the service robot, the first display screen to rotate in the horizontal direction to the bearing corresponding to the user, and to control the first display screen to make a corresponding change of expression.
- The service robot according to claim 6, wherein the mounted device further comprises a camera arranged above the first display screen, wherein: the camera is configured to capture the camera image when the controller receives the start signal; and the controller is configured to identify the face area in the camera image and, according to the position of the face area in the camera image, adjust the pitch angle and horizontal angle of the first display screen so that the face area is located in the central area of the camera image.
- The service robot according to claim 7, wherein the mounted device further comprises a second display screen, wherein: the second display screen is configured to display corresponding service content to the user according to an instruction of the controller when the controller receives the start signal.
- The service robot according to claim 8, wherein the controller comprises a microcontroller and a robot processor, wherein: the microcontroller and the robot processor are connected via a CAN bus; the microcontroller is connected to the pitch mechanism, the distance sensor, and the first display screen, respectively; and the robot processor is connected to the camera and the second display screen, respectively.
- A service robot display control method, comprising: receiving a start signal sent by a human body recognition sensor, where the human body recognition sensor detects whether a user appears within a predetermined range around the service robot and outputs the start signal to a controller when a user appears within the predetermined range around the service robot; and controlling a mounted device to start operation when the start signal is received.
- The service robot display control method according to claim 10, wherein controlling the mounted device to start operation comprises: controlling a distance sensor to start operation and measuring the distance of the user relative to the service robot; the service robot display control method further comprising: determining the bearing of the user relative to the service robot according to the distance of the user relative to the service robot and the installation position of the distance sensor, and controlling the head of the service robot to rotate in the horizontal direction to the bearing corresponding to the user.
- The service robot display control method according to claim 10 or 11, further comprising: controlling the first display screen to rotate in the horizontal direction to the bearing corresponding to the user according to the distance and bearing of the user relative to the service robot, and controlling the first display screen to make a corresponding change of expression.
- The service robot display control method according to claim 10 or 11, wherein controlling the mounted device to start operation comprises: controlling a camera to start operation and capturing the camera image; the service robot display control method further comprising: identifying the face area in the camera image and, according to the position of the face area in the camera image, adjusting the pitch angle and horizontal angle of the first display screen so that the face area is located in the central area of the camera image.
- A controller, comprising: a signal receiving module, configured to receive a start signal sent by a human body recognition sensor, where the human body recognition sensor detects whether a user appears within a predetermined range around the service robot and outputs the start signal to the controller when a user appears within the predetermined range around the service robot; and a mounting control module, configured to control a mounted device to start operation when the start signal is received; wherein the controller is configured to perform operations implementing the service robot display control method according to any one of claims 10-13.
- A controller, comprising a memory and a processor, wherein: the memory is configured to store instructions; and the processor is configured to execute the instructions, so that the controller performs operations implementing the service robot display control method according to any one of claims 10-13.
- A computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the service robot display control method according to any one of claims 10-13.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/777,250 US20220402142A1 (en) | 2019-12-03 | 2020-11-10 | Service robot and display control method thereof, controller and storage medium |
EP20895347.1A EP4046759A4 (en) | 2019-12-03 | 2020-11-10 | SERVICE ROBOT AND DISPLAY CONTROL METHOD THEREOF, CONTROL AND STORAGE MEDIUM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911220522.1A CN110861107B (zh) | 2019-12-03 | 2019-12-03 | 服务机器人及其显示控制方法、控制器和存储介质 |
CN201911220522.1 | 2019-12-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021109806A1 true WO2021109806A1 (zh) | 2021-06-10 |
Family
ID=69657295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/127751 WO2021109806A1 (zh) | 2019-12-03 | 2020-11-10 | 服务机器人及其显示控制方法、控制器和存储介质 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220402142A1 (zh) |
EP (1) | EP4046759A4 (zh) |
CN (1) | CN110861107B (zh) |
WO (1) | WO2021109806A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114190295A (zh) * | 2021-12-20 | 2022-03-18 | 珠海一微半导体股份有限公司 | 一种多宠物机器人控制方法、系统及芯片 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110861107B (zh) * | 2019-12-03 | 2020-12-22 | 北京海益同展信息科技有限公司 | 服务机器人及其显示控制方法、控制器和存储介质 |
CN111469137B (zh) * | 2020-04-10 | 2022-09-06 | 京东科技信息技术有限公司 | 机器人 |
CN118519518A (zh) * | 2023-02-20 | 2024-08-20 | 华为技术有限公司 | 显示屏控制方法、设备和系统 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007221300A (ja) * | 2006-02-15 | 2007-08-30 | Fujitsu Ltd | ロボット及びロボットの制御方法 |
CN106355242A (zh) * | 2016-09-26 | 2017-01-25 | 苏州小璐机器人有限公司 | 一种基于人脸检测的互动机器人 |
CN206170100U (zh) * | 2016-11-14 | 2017-05-17 | 上海木爷机器人技术有限公司 | 机器人 |
CN107932511A (zh) * | 2017-11-29 | 2018-04-20 | 芜湖星途机器人科技有限公司 | 自动控制人脸姿态的机器人 |
CN109062482A (zh) * | 2018-07-26 | 2018-12-21 | 百度在线网络技术(北京)有限公司 | 人机交互控制方法、装置、服务设备及存储介质 |
CN209022077U (zh) * | 2018-09-11 | 2019-06-25 | 小白智能科技(长春)股份有限公司 | 小白人智媒体设备 |
CN109940638A (zh) * | 2019-04-26 | 2019-06-28 | 北京猎户星空科技有限公司 | 机器人、机器人控制方法、装置、存储介质和控制器 |
CN110154056A (zh) * | 2019-06-17 | 2019-08-23 | 常州摩本智能科技有限公司 | 服务机器人及其人机交互方法 |
CN110653812A (zh) * | 2018-06-29 | 2020-01-07 | 深圳市优必选科技有限公司 | 一种机器人的交互方法、机器人及具有存储功能的装置 |
CN110861107A (zh) * | 2019-12-03 | 2020-03-06 | 北京海益同展信息科技有限公司 | 服务机器人及其显示控制方法、控制器和存储介质 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8918209B2 (en) * | 2010-05-20 | 2014-12-23 | Irobot Corporation | Mobile human interface robot |
JP6686583B2 (ja) * | 2016-03-17 | 2020-04-22 | カシオ計算機株式会社 | ロボット及びプログラム |
CN106054895A (zh) * | 2016-07-11 | 2016-10-26 | 湖南晖龙股份有限公司 | 智能营业厅机器人及其室内行走偏航自动校正方法 |
KR102639904B1 (ko) * | 2016-12-23 | 2024-02-26 | 엘지전자 주식회사 | 공항용 로봇 및 그의 동작 방법 |
CN106956274A (zh) * | 2017-03-28 | 2017-07-18 | 旗瀚科技有限公司 | 一种机器人唤醒方法 |
KR102391322B1 (ko) * | 2017-06-30 | 2022-04-26 | 엘지전자 주식회사 | 이동 로봇 |
CN207189673U (zh) * | 2017-08-25 | 2018-04-06 | 科沃斯机器人股份有限公司 | 自移动机器人 |
CN107825438A (zh) * | 2017-12-06 | 2018-03-23 | 山东依鲁光电科技有限公司 | 城市智能综合服务机器人及其运行方法 |
CN107891430A (zh) * | 2017-12-23 | 2018-04-10 | 河南智盈电子技术有限公司 | 一种基于物联网的安防机器人 |
WO2019147235A1 (en) * | 2018-01-24 | 2019-08-01 | Ford Global Technologies, Llc | Path planning for autonomous moving devices |
CN108346243A (zh) * | 2018-02-10 | 2018-07-31 | 佛山市建金建电子科技有限公司 | 一种优惠券领票机器人 |
CN208262848U (zh) * | 2018-03-19 | 2018-12-21 | 山西安信恒创机器人技术有限公司 | 一种智能服务机器人 |
KR102012968B1 (ko) * | 2018-08-07 | 2019-08-27 | 주식회사 서큘러스 | 인터렉션 로봇의 제어 방법 및 제어 서버 |
CN111230927A (zh) * | 2018-11-28 | 2020-06-05 | 天津工业大学 | 一种基于红外和超声传感器的跟踪机器人 |
-
2019
- 2019-12-03 CN CN201911220522.1A patent/CN110861107B/zh active Active
-
2020
- 2020-11-10 US US17/777,250 patent/US20220402142A1/en active Pending
- 2020-11-10 EP EP20895347.1A patent/EP4046759A4/en active Pending
- 2020-11-10 WO PCT/CN2020/127751 patent/WO2021109806A1/zh unknown
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007221300A (ja) * | 2006-02-15 | 2007-08-30 | Fujitsu Ltd | ロボット及びロボットの制御方法 |
CN106355242A (zh) * | 2016-09-26 | 2017-01-25 | 苏州小璐机器人有限公司 | 一种基于人脸检测的互动机器人 |
CN206170100U (zh) * | 2016-11-14 | 2017-05-17 | 上海木爷机器人技术有限公司 | 机器人 |
CN107932511A (zh) * | 2017-11-29 | 2018-04-20 | 芜湖星途机器人科技有限公司 | 自动控制人脸姿态的机器人 |
CN110653812A (zh) * | 2018-06-29 | 2020-01-07 | 深圳市优必选科技有限公司 | 一种机器人的交互方法、机器人及具有存储功能的装置 |
CN109062482A (zh) * | 2018-07-26 | 2018-12-21 | 百度在线网络技术(北京)有限公司 | 人机交互控制方法、装置、服务设备及存储介质 |
CN209022077U (zh) * | 2018-09-11 | 2019-06-25 | 小白智能科技(长春)股份有限公司 | 小白人智媒体设备 |
CN109940638A (zh) * | 2019-04-26 | 2019-06-28 | 北京猎户星空科技有限公司 | 机器人、机器人控制方法、装置、存储介质和控制器 |
CN110154056A (zh) * | 2019-06-17 | 2019-08-23 | 常州摩本智能科技有限公司 | 服务机器人及其人机交互方法 |
CN110861107A (zh) * | 2019-12-03 | 2020-03-06 | 北京海益同展信息科技有限公司 | 服务机器人及其显示控制方法、控制器和存储介质 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4046759A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114190295A (zh) * | 2021-12-20 | 2022-03-18 | 珠海一微半导体股份有限公司 | 一种多宠物机器人控制方法、系统及芯片 |
Also Published As
Publication number | Publication date |
---|---|
EP4046759A1 (en) | 2022-08-24 |
CN110861107A (zh) | 2020-03-06 |
US20220402142A1 (en) | 2022-12-22 |
EP4046759A4 (en) | 2023-11-22 |
CN110861107B (zh) | 2020-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021109806A1 (zh) | 服务机器人及其显示控制方法、控制器和存储介质 | |
EP2534554B1 (en) | Input command | |
Matsumoto et al. | Behavior recognition based on head pose and gaze direction measurement | |
US9064144B2 (en) | Method and apparatus for recognizing location of user | |
CN109605363B (zh) | 机器人语音操控系统及方法 | |
WO2023071884A1 (zh) | 注视检测方法、电子设备的控制方法及相关设备 | |
CN110779150B (zh) | 一种基于毫米波的空调器控制方法、装置及空调器 | |
CN207292469U (zh) | 多姿态传感器云台装置、相机和无人机飞行器 | |
CN107103309A (zh) | 一种基于图像识别的学生坐姿检测与纠正系统 | |
US9934735B2 (en) | Display control method and electronic device | |
CN102541085A (zh) | 具有球形底部的视频目标跟踪和姿态控制的装置和方法 | |
TW201506776A (zh) | 螢幕顯示模式的調整方法與電子裝置 | |
JP5403522B2 (ja) | 制御装置、ロボット、制御方法、ならびに、プログラム | |
US10013802B2 (en) | Virtual fitting system and virtual fitting method | |
CN113160260B (zh) | 一种头眼双通道智能人机交互系统及运行方法 | |
JP2010082714A (ja) | コミュニケーションロボット | |
TWI574801B (zh) | Intelligent robot control method | |
JPH05298015A (ja) | 視線検出システムおよび情報処理システム | |
US11727719B2 (en) | System and method for detecting human presence based on depth sensing and inertial measurement | |
CN108064136B (zh) | 一种智能雨伞 | |
CN215814080U (zh) | 一种头眼双通道智能人机交互系统 | |
US11574532B2 (en) | Visible-light-image physiological monitoring system with thermal detecting assistance | |
KR20210115842A (ko) | 인공지능 영상처리로 사용자를 인식하고 추종하는 이동체 장치 및 동작방법 | |
CN110555331B (zh) | 脸部辨识系统与方法 | |
TWI826189B (zh) | 具六自由度之控制器追蹤系統及方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20895347 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020895347 Country of ref document: EP Effective date: 20220518 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |