WO2018028200A1 - Electronic machine device - Google Patents

Electronic machine device

Info

Publication number
WO2018028200A1
WO2018028200A1 (application PCT/CN2017/076922)
Authority
WO
WIPO (PCT)
Prior art keywords
action
user
processing device
image
motion
Prior art date
Application number
PCT/CN2017/076922
Other languages
English (en)
French (fr)
Inventor
Han Yang (韩阳)
Original Assignee
BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority to US15/561,770 priority Critical patent/US20180245923A1/en
Publication of WO2018028200A1 publication Critical patent/WO2018028200A1/zh

Links

Images

Classifications

    • G01C 21/10: Navigation; navigational instruments not provided for in groups G01C1/00-G01C19/00, by using measurements of speed or acceleration
    • G01C 21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/20: Instruments for performing navigational calculations
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0246: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, namely a video camera in combination with image processing means
    • G05D 1/028: Control of position or course in two dimensions, specially adapted to land vehicles, using an RF signal provided by a source external to the vehicle
    • G06F 3/012: Head tracking input arrangements
    • G06T 2207/30196: Indexing scheme for image analysis; subject of image: human being; person
    • H04W 4/02: Services making use of location information
    • H04W 4/024: Guidance services
    • H04W 4/14: Short messaging services, e.g. short message service [SMS] or unstructured supplementary service data [USSD]
    • H04W 4/80: Services using short-range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low-energy communication

Definitions

  • Embodiments of the present disclosure relate to an electronic machine device.
  • In the prior art, a guiding robot recognizes objects based on a large amount of image data, determines the destination that the user wants to reach, and leads the user there along a pre-stored path.
  • Such a leading robot can only walk in a fixed area and lead the user to a designated place; it must plan a trajectory in advance according to the current location and the destination, and then guide along the planned path. If the user needs to reach a place for which no path has been planned, the guiding robot cannot complete the task.
  • An electronic machine apparatus comprises an image capture device, a processing device, and a control device, wherein the image capture device is configured to collect motion information of a user and generate a captured image;
  • the processing device is configured to acquire, based on the captured image, a first action that the user wants to perform; determine a second action of the electronic machine device based on the first action; and generate a control command based on the second action and transmit it to the control device. The control device controls the electronic machine device to perform the second action based on the control command.
  • the processing device determines whether the user changes from the initial action to the first action based on the acquired image, wherein the initial action and the first action are different types of actions.
  • the image capture device collects motion information of the user and generates at least a first captured image and a second captured image; the processing device compares the amount of image-information change between the first captured image and the second captured image, and determines, based on that amount of change, whether the user changes from the initial action to the first action.
  • the processing device separately performs information extraction on the first captured image and the second captured image, and determines, based on the amount of change in image information between the extracted information, whether the user changes from the initial action to the first action.
  • the processing device separately performs binarization on the first captured image and the second captured image, and determines, based on the amount of image-information change between the binarized first captured image and the binarized second captured image, whether the user changes from the initial action to the first action.
  • the image capture device collects motion information of the user and generates at least a first captured image and a second captured image; the processing device analyzes the position change information of the user in the first captured image and the second captured image, and determines whether the user changes from the initial action to the first action based on the position change information.
  • the processing device analyzes the coordinate position change information of the user in the first captured image and the second captured image, and determines, according to the coordinate position change information, whether the user changes from the initial action to the first action.
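A minimal sketch of such a coordinate-based check (helper names and the pixel threshold are illustrative, not taken from the patent): the user's centroid is located in two captured frames, and a shift larger than a threshold indicates that a displacement action has begun.

```python
def centroid(mask):
    """Centroid (x, y) of the nonzero pixels of a binary mask (list of rows)."""
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None
    return (xs / n, ys / n)

def changed_to_displacement(mask1, mask2, threshold=2.0):
    """True if the user's centroid moved farther than `threshold` pixels."""
    c1, c2 = centroid(mask1), centroid(mask2)
    if c1 is None or c2 is None:
        return False
    return ((c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2) ** 0.5 > threshold
```

In practice the binary masks would come from the foreground or contour extraction discussed later in the description; here they stand in for the user's silhouette in consecutive frames.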
  • the electronic machine device may further include a wireless signal transmitting device configured to transmit a wireless signal toward the user and receive the wireless signal returned from the user; the processing device determines, based on the amount of change between the transmitted wireless signal and the returned wireless signal, whether the user changes from the initial action to the first action.
  • the first action is a displacement action
  • the processing device determines a motion direction and a motion speed of the first action; based on the motion direction and motion speed of the first action, it determines the motion direction and motion speed of the electronic machine device, so that the motion direction and motion speed of the second action match those of the first action.
  • the processing device further acquires the location of the user and determines the motion direction and motion speed of the second action based on that location, so that the electronic machine device performs the second action while remaining at a predetermined distance in front of or to the side of the user.
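One way to sketch this "stay a fixed distance ahead" behavior (the geometry, gain, and lead distance are assumptions for illustration, not specified by the patent): the device aims at a target point a lead distance ahead of the user along the user's heading, and matches the user's speed plus a correction term that closes any remaining gap.

```python
import math

def second_action(user_pos, user_heading, user_speed, robot_pos,
                  lead=1.5, gain=0.8):
    """Return (direction_radians, speed) for the device's second action."""
    # Target point: `lead` metres ahead of the user along the user's heading.
    tx = user_pos[0] + lead * math.cos(user_heading)
    ty = user_pos[1] + lead * math.sin(user_heading)
    dx, dy = tx - robot_pos[0], ty - robot_pos[1]
    dist = math.hypot(dx, dy)
    # Match the user's speed, plus a proportional term that closes the gap.
    speed = user_speed + gain * dist
    return math.atan2(dy, dx), speed
```

When the device already sits at the target point, the correction term vanishes and it simply mirrors the user's speed; when it lags behind, the extra term speeds it up until the predetermined distance is restored.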
  • the electronic machine device may further include a first sensor, wherein the first sensor is configured to recognize the brightness of ambient light and notify the processing device when the ambient-light brightness is greater than a first brightness threshold; the processing device stops execution of the second action based on the brightness notification.
  • the electronic machine device may further include a second sensor, wherein the second sensor is configured to identify an obstacle within a predetermined range around the electronic machine device and, when an obstacle is identified, send an obstacle notification to the processing device; the processing device changes the direction and/or speed of the second action based on the obstacle notification.
  • the electronic machine device may further include a third sensor and a prompting device, wherein the third sensor detects a radio signal within a predetermined range and, when the radio signal is detected, notifies the prompting device; the prompting device prompts the user with information based on the radio-signal notification.
  • the electronic machine device may further include a fourth sensor, wherein the second action is a displacement action; the fourth sensor detects the position of the user within a predetermined range and, when the user's location is detected, transmits location information to the processing device; the processing device determines a path from the electronic machine device to the location based on the location information, and determines the displacement action toward the user based on the path.
  • the fourth sensor detects a plurality of pieces of position information of the user within a predetermined time and transmits them to the processing device; the processing device determines, based on the plurality of pieces of position information, whether the user's position has changed; when it determines that there is no position change, it determines a path of the electronic machine device to the position based on the position information, and determines the displacement action in the direction of the user based on the path.
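A hedged sketch of this stability check (the helper names and the radius are illustrative assumptions): the device approaches the user only if all the sampled positions stay within a small radius of the first sample during the sampling window.

```python
def user_is_stationary(positions, radius=0.5):
    """True if every sampled (x, y) stays within `radius` of the first sample."""
    if not positions:
        return False
    x0, y0 = positions[0]
    return all(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 <= radius
               for x, y in positions)

def plan_if_stationary(positions, radius=0.5):
    """Return an approach target (the last sample), or None if the user moves."""
    return positions[-1] if user_is_stationary(positions, radius) else None
```

Returning None for a moving user mirrors the description's point: path planning toward a position that keeps changing would only waste processing resources.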
  • the electronic machine device may further include a storage unit, wherein the first action is a plurality of consecutive actions; the processing device determines a plurality of consecutive second actions of the electronic machine device based on the consecutive first actions, and generates a movement path based on the plurality of consecutive second actions; the storage unit is configured to store the movement path.
  • the electronic machine device may further include a function button, wherein the storage unit stores at least one movement path; the function button is configured to determine, based on an input of the user, the movement path corresponding to that input; the processing device determines the second action of the electronic machine device based on the movement path and the first action.
  • the electronic machine device may further include a second sensor configured to identify an obstacle within a predetermined range around the electronic machine device and, in response to identifying the obstacle, transmit an obstacle notification to the processing device; the processing device determines a second action of the electronic machine device based on the obstacle notification, so as to cause the electronic machine device to avoid the obstacle.
  • the processing device changes the movement path based on the second action and sends the changed movement path to the storage unit; the storage unit stores the changed movement path.
  • the second sensor transmits an obstacle notification to the processing device; the processing device determines the second action of the electronic machine device based on the obstacle notification, together with the movement path and the first action.
  • the electronic machine device does not need to plan a path in advance; it can determine the actions that need to be performed according to the actions of the user and complete various service tasks.
  • FIG. 1 shows a schematic structural view of an electronic machine device according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram showing the outline design of an electronic machine device according to an embodiment of the present disclosure
  • FIG. 3 illustrates another structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure
  • FIG. 4 shows a third structural schematic view of an electronic machine device in accordance with an embodiment of the present disclosure
  • FIG. 5 shows a fourth structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure
  • FIG. 6 illustrates a flowchart of obstacle processing in accordance with an embodiment of the present disclosure.
  • an electronic machine device refers to a machine device that is based on digital and logical computing and that can move by itself without external commands, such as an artificial intelligence device, a robot, or a robotic pet.
  • FIG. 1 shows a schematic structural view of an electronic machine device according to an embodiment of the present disclosure.
  • FIG. 2 shows a schematic diagram of an outline design of an electronic machine device in accordance with an embodiment of the present disclosure.
  • the electronic machine device 100 includes an image capture device 110, a processing device 120, and a control device 130.
  • the electronic machine device may include a driving device, which may include a power component such as a motor and moving parts such as wheels or tracks, and which can, according to instructions, start, stop, go straight, turn, climb over obstacles, and so on.
  • Embodiments of the present disclosure are not limited to a particular type of drive device.
  • the image capture device 110 is configured to collect motion information of the user and generate a captured image.
  • Image capture device 110 may, for example, include one or more cameras, video cameras, and the like.
  • the image capture device 110 may collect images in a fixed direction, or may rotate to capture image information at different positions and different angles.
  • the image capture device 110 can be configured to capture not only an image of visible light, but also an image of infrared light for use in a nighttime environment.
  • the image acquired by the image capture device 110 can be stored in a storage device automatically or according to a user's instruction.
  • the processing device 120 is configured to acquire a first action that the user wants to perform based on the image acquired by the image capture device 110, then determine a second action of the electronic machine device based on the first action, and generate a control command based on the second action and send it to the control device.
  • Processing device 120 may be, for example, a general purpose processor, such as a central processing unit (CPU), or a special purpose processor, such as a programmable logic circuit (PLC), a field programmable gate array (FPGA), or the like.
  • the control device 130 controls the electronic machine device to perform the second action based on the control command.
  • the control device 130 can, for example, control the electronic machine device to walk, turn on a specific internal function, or emit a sound.
  • the control commands can be stored in a predetermined storage device and read into the control device 130 when the electronic machine device is in operation.
  • as shown in FIG. 2, the electronic machine device 100 includes wheels 210, function buttons 220, a light source 230, and the like.
  • the electronic machine device 100 can acquire images through the image capture device 110.
  • the electronic machine device 100 provides various function buttons 220 through which the user can input commands.
  • Light source 230 can be turned on as needed for illumination or the like; it can be, for example, a brightness-adjustable LED light.
  • the various functional components in FIG. 2 are not all necessary for the embodiments of the present disclosure; those skilled in the art will understand that functional components may be added or removed according to actual needs.
  • the function button 220 can be replaced with a touch screen or the like.
  • FIG. 3 illustrates another structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure.
  • the structure and operation of an electronic machine device that can move by itself according to an embodiment of the present disclosure will be described below with reference to FIG.
  • the processing device 120 determines a first action of the user and determines a second action of the electronic machine device based on the first action.
  • the first action may be, for example, a displacement action, a gesture action, or the like.
  • the processing device 120 determines the action direction and action speed of the displacement action and, based on the action direction and action speed of the first action, determines the motion direction and motion speed of the electronic machine device, so that the motion direction and motion speed of the second action of the electronic machine device match those of the first action of the user.
  • In this way, the electronic machine device can lead the way, provide illumination, and so on while the user walks.
  • the processing device 120 may also determine the action direction and action speed of other gesture actions of the user, and determine the motion direction and motion speed of the electronic machine device based on them, so that the motion direction and motion speed of the second action of the electronic machine device match those of the user's first action.
  • the electronic machine device can, for example, deliver a medical device or the like according to the gesture of the user.
  • the embodiments of the present disclosure will be described below taking only the user's displacement action as an example.
  • when the processing device 120 determines the user's walking action, in order to protect the safety of the user, for example when the user is a child or an elderly person, the device may assist the user by leading the way or act as the user's companion.
  • the electronic machine device can walk in front of or to the side of the user. If there is no pre-stored path inside the electronic machine device, the device does not know where the user wants to go; the image capture device 110 can then continuously collect images containing the user and, by analyzing and comparing multiple images, determine the direction of the user's movement.
  • the electronic machine device can also determine the speed of the user's movement from parameters such as the amount of change between images and the elapsed time.
  • after determining the user's direction and speed of movement, the electronic machine device can determine its own direction and speed of movement so that it maintains a close distance to the user, remaining at a predetermined distance in front of or to the side of the user. This avoids the device being too far from the user to serve as a companion, or so close that it collides with the user.
  • when the electronic machine device guides the user, it can also turn on a light source such as a night light, so that the user can see the road while walking at night, improving safety.
  • the processing device 120 may further acquire the location of the user, for example by analyzing the coordinates of the user in the captured image, or based on indoor positioning technologies such as Wi-Fi, Bluetooth, ZigBee, RFID, and the like. Based on the user's position, the motion direction and motion speed of the second action can be determined more accurately, so that the electronic machine device keeps moving at a predetermined distance in front of or to the side of the user.
  • the electronic machine device 100 may further include a first sensor 140, which may be, for example, an ambient light sensor capable of recognizing the brightness of ambient light.
  • when the ambient-light brightness exceeds the first brightness threshold, the processing device 120 is notified, and the processing device 120 stops execution of the second action based on the brightness notification. For example, when the user turns on the indoor lighting, the user may no longer need the electronic machine device to lead the way and illuminate; the device can therefore stop moving after recognizing the indoor lighting, or return to a preset default location.
  • the electronic machine device 100 may further include a second sensor 150, which may be, for example, a radar sensor, an infrared sensor, or a distance sensor that can sense an obstacle within a predetermined range around the electronic machine device.
  • the processing device 120 can, for example, analyze the signal to determine if an obstacle is present on the forward route.
  • the processing device 120 changes the direction and/or speed of the second action based on whether an obstacle is present.
  • the second sensor itself may also have processing capabilities to determine if an obstacle is present and to feed back information about the presence or absence of the obstacle to the processing device 120.
  • the radar sensor transmits a radar signal to its surroundings and determines whether there are obstacles nearby by detecting a change in the frequency or amplitude of the returned signal.
  • the infrared sensor transmits an infrared signal to the surroundings and determines the distance between the object ahead and itself based on the returned signal, so that the processing device 120 can determine whether the user's walking is affected and whether the walking direction needs to be changed.
  • the processing device 120 may change the direction of the second action performed by the device, or may issue a warning to alert the user.
  • the electronic machine device may further include a third sensor 160 and a prompting device 170; the third sensor may be, for example, a wireless signal sensor that can detect a radio signal within a predetermined range and notify the prompting device 170 when the radio signal is detected.
  • the prompting device 170 can be, for example, a speaker, an LED light, or the like, which can draw the user's attention to remind the user.
  • when the wireless signal sensor of the electronic machine device senses an incoming call or a short message on the mobile phone, it can notify the user of the incoming-call or short-message information, so that important calls are not missed when the phone's volume is low or the phone is muted.
  • the electronic machine device can also broadcast the incoming-call or short-message information, subject to the user's preset settings.
  • the electronic machine device may further include a fourth sensor 180, which may be a sensor such as an infrared sensor that can detect the position of the user within a predetermined range.
  • the fourth sensor 180 may, for example, transmit user location information to the processing device 120 when the location of the user is detected.
  • the processing device 120 determines a path of the electronic device to the user location based on the location information and determines a displacement action to walk toward the user based on the path.
  • the electronic machine device can determine the location of the user and then bring the user the item he or she wants based on the user's instructions.
  • the infrared sensor can determine the position of the user by detecting the temperature and the distance, and can also determine the location of the user by combining the temperature with the physical contour to avoid misjudgment.
  • the fourth sensor 180 may detect a plurality of pieces of location information of the user within a predetermined time period and transmit them to the processing device 120.
  • the processing device 120 determines whether the user's position has changed based on the plurality of pieces of location information. When it determines that there is no position change, it determines a path of the electronic machine device to the location based on the location information and determines a displacement action in the direction of the user based on the path. For example, if multiple images taken within 10 seconds indicate that the user stays in one position, the user has not changed position.
  • the processing device 120 can then determine the distance between the user and the electronic machine device to locate the user, and deliver the item the user wants. If the multiple captured images show the user continuously moving, the user's position is changing; in that case the electronic machine device need not deliver the object, avoiding the waste of processing resources caused by chasing a changing user position.
  • by determining the first action of the user, the second action of the electronic machine device is determined so that the second action is coordinated with the first action. This solves the technical problem of enabling the electronic machine device to guide the user even when no path is preset, ensuring that the device can perform the corresponding tasks at any time according to the needs of the user.
  • FIG. 4 shows a third structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure.
  • the structure and operation of an electronic machine device that can move by itself according to an embodiment of the present disclosure will be described below with reference to FIG.
  • the processing device 120 may determine whether the user changes from the initial action to the first action based on the captured image, where the initial action and the first action are different types of actions. That is, the processing device 120 can determine whether the user's action has changed.
  • different types of actions, or action changes, refer to two actions with different attributes. For example, eating and walking, getting up and sleeping, and studying and playing all belong to different types of actions. Conversely, if the user turns from lying on the left side to lying on the back or the right side while asleep, the action changes but still belongs to the sleeping action, and thus does not count as a different type of action as defined by the present disclosure.
  • the image capture device 110 collects motion information of the user and generates a first captured image and a second captured image, or more captured images.
  • the processing device 120 compares the image information change amount between the first captured image and the second captured image, or among the plurality of captured images, and determines whether the user changes from the initial action to the first action based on the image information change amount.
  • the first acquired image and the second acquired image may be two consecutive frames of images, and the processing device 120 can effectively identify whether the user has an action change by comparing the preceding and succeeding frames.
  • to judge and compare the image information change amount, the processing device 120 can compare two or more captured images directly, or it can perform information extraction on the first captured image and the second captured image separately, extract the important information in each image, and determine whether the user changes from the initial action to the first action based on the image information change amount between the extracted information.
  • for example, the first captured image and the second captured image are each binarized, and the change from the initial action to the first action is determined based on the image information change amount between the binarized first and second captured images.
  • alternatively, the background information in the images is removed, and the comparison is performed on the foreground information to determine whether the user's action has changed.
  • alternatively, contour extraction is performed on all images, and the comparison is performed on the contour information to determine the change between the two frames, and so on. In this way, the amount of computation can be effectively reduced and the processing efficiency improved.
  • when judging the image information change amount, the judgment can be made based on the overall content of the processed images. For example, after binarizing the first captured image and the second captured image, the pixel values in each image are accumulated, and the difference between the accumulated pixel values of the two images is compared against a preset threshold.
  • the threshold can be set to a value between 20% and 40% according to actual needs.
  • if the difference is greater than the preset threshold, the user may be considered to have changed from the initial action to the first action.
  • if the difference is less than the preset threshold, the user may be considered to remain in the initial action. For example, if the user merely rolls over during sleep, the difference between the accumulated pixel value of the latter frame and that of the former frame is 15%, and the user can be considered to remain in the sleeping action.
  • the image capture device 110 continuously collects motion information of the user and generates at least a first acquired image and a second captured image.
  • the processing device 120 analyzes the user's position change information in the first captured image and the second captured image, and determines whether the user changes from the initial action to the first action based on the position change information.
  • for example, the processing device 120 sets a unified coordinate system for each image captured by the image capture device 110: after the user enters the sleeping action, the head-of-bed position on the bed surface is taken as the origin, the direction along the bed surface from the head of the bed to the foot of the bed is set as the X-axis (abscissa), and the direction perpendicular to the bed surface from the head of the bed toward the ceiling is set as the Y-axis (ordinate). Thus, when the user's action changes, whether the user has switched from one type of action to another can be determined from the change in the user's coordinates; for example, to reduce computation, only the change along the Y-axis may be checked.
  • the coordinate change threshold may be set in advance based on historical data, for example to a value between 5% and 20%.
  • when the ordinate of the user's head changes from 10 cm to 50 cm and the change exceeds the threshold, the user can be considered to have changed from the sleeping action to the getting-up action.
  • when the ordinate of the user's head changes from 10 cm to 12 cm and the change is less than the threshold, it can be judged that the user is still asleep.
  • the electronic machine device can also determine whether the user changes from the initial action to the first action by means of a wireless signal transmitting device.
  • as shown in FIG. 2, a wireless signal transmitting device 240 may further be disposed on the electronic machine device 100.
  • the wireless signal transmitting device 240 may be, for example, a radar transmitting transducer, an ultrasonic transmitter, an infrared signal transmitter, or the like.
  • the wireless signal transmitting device 240 can transmit various wireless signals to the user and receive wireless signals returned from the user.
  • the wireless signal transmitting device 240 may also transmit signals not at the user directly, but toward the user's possible action area around the user, to determine whether the user performs the corresponding action.
  • the processing device 120 can determine the image information change amount between the wireless signal transmitted by the wireless signal transmitting device 240 and the wireless signal returned from the user. Since the strength of the returned signal varies with whether the transmitted wireless signal is blocked and by what kind of object, whether the user changes from the initial action to the first action can be determined based on the change between the transmitted and returned signals.
  • the image information change amount may be a signal frequency change amount, a signal amplitude change amount, a combination of the two, or the like.
  • for example, when the frequency change amount is 200-500 Hz, the frequency change is small and the action has not changed; when the frequency change amount is 1000-3000 Hz, the frequency change is large, and the user's action can be considered to have changed from the initial action to the first action.
  • by analyzing the captured images containing the user's actions to determine whether the user changes from one action to another, and determining the electronic machine device's next action according to that change, the device can effectively anticipate what the user wants to do or where the user wants to go, and can thus provide timely and accurate service to the user.
  • FIG. 5 shows a fourth structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure.
  • the electronic machine device 100 may further include a storage unit 190 in addition to the image capture device 110, the processing device 120, and the control device 130.
  • the electronic machine device may be trained to memorize at least one stored path.
  • the image capture device 110 can collect a plurality of first actions, and the first action can be a plurality of consecutive actions, for example, multiple displacement actions.
  • the processing device 120 determines a plurality of consecutive second actions of the electronic device based on the plurality of consecutive first actions, and generates a movement path based on the plurality of consecutive second actions. That is to say, after the processing device 120 guides the user, the path can be memorized and sent to the storage unit 190, and the storage unit 190 stores the path.
  • the electronic machine device 100 can also be provided with a plurality of function buttons 220, which can receive the user's input and determine the movement path stored in the storage unit 190 that corresponds to the input.
  • based on the user's selection input, the processing device 120 may determine the second action of the electronic machine device from the movement path and the user's first action. For example, by default, the processing device 120 can lead the user along the stored movement path, but it must also consider the user's first action: if the user suddenly changes direction while walking, the electronic machine device 100 can also change its second action as needed to meet the user's needs.
  • the electronic machine device also has a function of identifying an obstacle.
  • FIG. 6 shows a flow chart of one example of an obstacle processing method according to an embodiment of the present disclosure.
  • the electronic machine device may further include a second sensor 150, which may be, for example, a sensor that transmits radar signals; by transmitting wireless signals to the surroundings and analyzing the returned signals, it can detect whether there is an obstacle within a predetermined range around the electronic machine device.
  • in step 601, the processing device 120 may retrieve a route pre-stored in the storage unit 190.
  • in step 602, the processing device 120 can control the electronic machine device to walk along the set route.
  • in step 603, the second sensor 150 can be used to identify obstacles.
  • in step 604, it is determined whether there is an obstacle.
  • in step 605, when it is determined that there is an obstacle ahead on the route, an obstacle notification is sent to the processing device 120, and the processing device 120 determines the second action of the electronic machine device based on the obstacle notification so that the device avoids the obstacle.
  • in step 606, if no obstacle is identified, the second sensor 150 may send a no-obstacle notification to the processing device 120; based on the no-obstacle notification, the processing device 120 still determines its second action from the movement path pre-stored in the storage unit 190 and the user's first action, while instructing the second sensor 150 to continue detecting obstacles.
  • in step 607, after avoiding the obstacle, the processing device 120 may record the current movement path that bypasses the obstacle.
  • in step 608, the processing device 120 may also send the newly recorded movement path to the storage unit 190, which stores it for later selection and use by the user.
  • alternatively, after the electronic machine device avoids the obstacle, the processing device 120 can instruct it to continue walking along the previously retrieved set route.
  • alternatively, the previously stored path can be directly updated with the newly recorded path, after which the processing device 120 can determine the second action of the electronic machine device based on the updated movement path or according to a further selection by the user.
  • by training the electronic machine device to store one or more movement paths, the device can also act according to the path selected by the user's input while effectively avoiding obstacles. This makes the functions of the electronic machine device more powerful and adaptable to users' different needs.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Navigation (AREA)
  • Telephone Function (AREA)

Abstract

Electronic machine equipment (100), comprising an image capture device (110), a processing device (120), and a control device (130). The image capture device (110) is configured to capture action information of a user and generate captured images. The processing device (120) is configured to obtain, based on the captured images, a first action that the user intends to perform, determine a second action of the electronic machine equipment (100) based on the first action, generate a control instruction based on the second action, and send it to the control device (130). The control device (130) controls, based on the control instruction, the electronic machine equipment (100) to perform the second action. The electronic machine equipment (100) can determine, from the user's actions, the actions it needs to perform and accomplish various service tasks without planning a path in advance.

Description

Electronic machine equipment

Technical Field

Embodiments of the present disclosure relate to electronic machine equipment.

Background Art

In recent years, robots with various functions have appeared in people's lives, for example sweeping robots and guide robots. A guide robot recognizes things on the basis of a large amount of image data, determines the destination the user wants to reach, and leads the user there based on a pre-stored path.

However, a guide robot in the prior art can only walk within a fixed area and lead the user to a designated place, and it needs to plan a route in advance according to the current location and the destination and guide the user along the planned path. When the user wishes to go to a place the robot has never been, the guide robot cannot complete the task.

Summary of the Invention

An object of the embodiments of the present disclosure is to provide electronic machine equipment that solves the above technical problems.

According to at least one embodiment of the present disclosure, electronic machine equipment is provided, including an image capture device, a processing device, and a control device, wherein the image capture device is configured to capture action information of a user and generate captured images; the processing device is configured to obtain, based on the captured images, a first action that the user intends to perform, determine a second action of the electronic machine equipment based on the first action, generate a control instruction based on the second action, and send it to the control device; and the control device controls, based on the control instruction, the electronic machine equipment to perform the second action.
For example, the processing device determines, based on the captured images, whether the user changes from an initial action to the first action, wherein the initial action and the first action are different types of actions.

For example, the image capture device captures the user's action information and generates at least a first captured image and a second captured image in succession; the processing device compares the image information change amount between the first captured image and the second captured image, and determines, based on the image information change amount, whether the user changes from the initial action to the first action.

For example, the processing device performs information extraction on the first captured image and the second captured image separately, and determines that the user changes from the initial action to the first action based on the image information change amount between the extracted information.

For example, the processing device binarizes the first captured image and the second captured image separately, and determines that the user changes from the initial action to the first action based on the image information change amount between the binarized first captured image and second captured image.

For example, the image capture device captures the user's action information and generates at least a first captured image and a second captured image in succession; the processing device analyzes the user's position change information in the first captured image and the second captured image, and determines, based on the position change information, whether the user changes from the initial action to the first action.

For example, the processing device analyzes the user's coordinate position change information in the first captured image and the second captured image, and determines, based on the coordinate position change information, whether the user changes from the initial action to the first action.

For example, the electronic machine equipment may further include a wireless signal transmitting device, wherein the wireless signal transmitting device is configured to transmit a wireless signal toward the user and receive a wireless signal returned from the user; the processing device determines the image information change amount between the transmitted wireless signal and the returned wireless signal, and determines, based on the image information change amount, whether the user changes from the initial action to the first action.
For example, the first action is a displacement action; the processing device determines, based on the first action, an action direction and an action speed of the first action, and determines a motion direction and a motion speed of the electronic machine equipment based on the action direction and action speed of the first action, so that the motion direction and motion speed of the second action match the action direction and action speed of the first action.

For example, the processing device further obtains the user's position, and determines the motion direction and motion speed of the second action based on the user's position, so that the electronic machine equipment performs the second action while remaining at a predetermined distance in front of or beside the user.

For example, the electronic machine equipment may further include a first sensor, wherein the first sensor is configured to recognize the brightness of ambient light and, when the ambient light brightness is greater than a first brightness threshold, notify the processing device; the processing device stops the performance of the second action based on the brightness notification.

For example, the electronic machine equipment may further include a second sensor, wherein the second sensor is configured to recognize an obstacle within a predetermined range around the electronic machine equipment and, when the obstacle is recognized, send an obstacle notification to the processing device; the processing device changes the direction and/or speed of the second action based on the obstacle notification.

For example, the electronic machine equipment may further include a third sensor and a prompt device, wherein the third sensor detects radio signals within a predetermined range and, when a radio signal is detected, notifies the prompt device; the prompt device gives the user an information reminder based on the radio signal notification.

For example, the electronic machine equipment may further include a fourth sensor, wherein the second action is a displacement action; the fourth sensor detects the user's position within a predetermined range and, when the user's location is detected, sends position information to the processing device; the processing device determines a path from the electronic machine equipment to the position based on the position information, and determines the displacement action toward the user based on the path.

For example, the fourth sensor detects multiple pieces of position information of the user within a predetermined time and sends them to the processing device; the processing device determines, based on the multiple pieces of position information, whether the user's position has changed; when it is determined that there is no position change, it determines a path from the electronic machine equipment to the position based on the position information, and determines the displacement action toward the user based on the path.
For example, the electronic machine equipment may further include a storage unit, wherein the first action is multiple consecutive actions; the processing device determines multiple consecutive second actions of the electronic machine equipment based on the multiple consecutive first actions, and generates a movement path based on the multiple consecutive second actions; the storage unit is configured to store the movement path.

For example, the electronic machine equipment may further include a function button, and the storage unit includes at least one movement path, wherein the function button is configured to determine, based on the user's input, the movement path corresponding to the input, and the processing device determines the second action of the electronic machine equipment based on the movement path and the first action.

For example, the electronic machine equipment may further include a second sensor, wherein the second sensor is configured to recognize an obstacle within a predetermined range around the electronic machine equipment and, in response to recognizing the obstacle, send an obstacle notification to the processing device; the processing device determines the second action of the electronic machine equipment based on the obstacle notification so that the electronic machine equipment avoids the obstacle.

For example, the processing device changes the movement path based on the second action and sends the changed movement path to the storage unit; the storage unit stores the changed movement path.

For example, in response to not recognizing the obstacle, the second sensor sends a no-obstacle notification to the processing device; based on the no-obstacle notification, the processor determines the second action of the electronic machine equipment from the movement path and the first action.

Through the embodiments of the present disclosure, the electronic machine equipment can determine the actions it needs to perform according to the user's actions and accomplish various service tasks without planning a path in advance.
Brief Description of the Drawings

In order to explain the technical solutions of the embodiments of the present disclosure more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are merely exemplary embodiments of the present disclosure.

FIG. 1 shows a structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure;

FIG. 2 shows a schematic diagram of the exterior design of an electronic machine device according to an embodiment of the present disclosure;

FIG. 3 shows another structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure;

FIG. 4 shows a third structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure;

FIG. 5 shows a fourth structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure;

FIG. 6 shows a flowchart of obstacle processing according to an embodiment of the present disclosure.
Detailed Description

Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the drawings. Note that in this specification and the drawings, substantially identical steps and elements are denoted by the same reference numerals, and repeated explanations of these steps and elements will be omitted.

In the following embodiments of the present disclosure, electronic machine equipment refers to a machine device that works on the basis of digital and logical computing and can move by itself without external instructions, for example an artificial intelligence device, a robot, or a machine pet.
FIG. 1 shows a structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure. FIG. 2 shows a schematic diagram of the exterior design of an electronic machine device according to an embodiment of the present disclosure. Referring to FIG. 1, the electronic machine device 100 includes an image capture device 110, a processing device 120, and a control device 130.

The electronic machine device may include a drive device, which may include a power component such as a motor and moving components such as wheels or tracks, and can start, stop, go straight, turn, climb over obstacles, and so on according to instructions. Embodiments of the present disclosure are not limited to a specific type of drive device.

The image capture device 110 is configured to capture the user's action information and generate captured images. The image capture device 110 may include, for example, one or more cameras or video cameras. It may capture images in a fixed direction, or may rotate to capture image information at different positions and angles. For example, the image capture device 110 may be arranged to capture not only visible-light images but also infrared images, making it suitable for night environments. As another example, the images captured by the image capture device 110 may be stored in a storage device immediately or stored there according to the user's instructions.

The processing device 120 is configured to obtain, based on the images captured by the image capture device 110, the first action that the user intends to perform, then determine the second action of the electronic machine device based on the first action, generate a control instruction based on the second action, and send it to the control device. The processing device 120 may be, for example, a general-purpose processor such as a central processing unit (CPU), or a dedicated processor such as a programmable logic circuit (PLC) or a field-programmable gate array (FPGA).

The control device 130 controls, based on the control instruction, the electronic machine device to perform the second action. For example, the control device 130 may control the electronic machine device to walk, enable a specific internal function, or emit a sound. The control instruction may be stored in a predetermined storage device and read into the control device 130 when the electronic machine device operates.

Referring to FIG. 2, the illustrated example of the electronic machine device 100 includes wheels 210, function buttons 220, a light source 230, and so on. The electronic machine device 100 can capture images through the image capture device 110, and users can input instructions through the various function buttons 220. The light source 230 can be turned on as needed for illumination or the like, and may be, for example, an LED lamp with adjustable brightness. Of course, the various functional components in FIG. 2 are not required by the embodiments of the present disclosure; those skilled in the art will understand that functional components can be added or removed according to actual needs. For example, the function buttons 220 may be replaced with a touch screen or the like.
FIG. 3 shows another structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure. The structure and operation of an electronic machine device that can move by itself according to an embodiment of the present disclosure will be described below with reference to FIG. 3.

According to an embodiment of the present disclosure, the processing device 120 determines the user's first action and determines the second action of the electronic machine device based on the first action. The first action may be, for example, a displacement action or a gesture action. The processing device 120 determines the action direction and action speed of the displacement action, and determines the motion direction and motion speed of the electronic machine device based on the action direction and action speed of the first action, so that the motion direction and motion speed of the device's second action match those of the user's first action. Thus, for example, the electronic machine device can guide and illuminate the way for the user while the user walks. Of course, the processing device 120 may also determine the action direction and action speed of other gesture actions of the user and determine the device's motion direction and speed accordingly, so that the second action matches the first action. For example, when the user is performing surgery, the electronic machine device can pass medical instruments to the user according to the user's gestures. In the following, embodiments of the present disclosure are described taking the user's displacement action as an example.

For example, after the processing device 120 determines the user's walking action, the device can guide the way for the user or accompany the user in order to ensure the user's safety, for example when the user is a child or an elderly person. When guiding, the electronic machine device can walk in front of or beside the user. If no path is pre-stored inside the electronic machine device at this time, so that the place the user wants to go is unknown, the image capture device 110 can continuously capture images containing the user, and the user's motion direction can be determined by analyzing the images and comparing multiple images. The electronic machine device can also determine the user's motion speed from parameters such as the amount of change between multiple images and the elapsed time. Having determined the user's motion direction and speed, the electronic machine device can determine its own motion direction and speed so as to stay relatively close to the user and remain at a predetermined distance in front of or beside the user, avoiding being too far away to provide companionship, or so close as to collide with the user. In addition, while guiding the user, the electronic machine device can also turn on a light source such as a night light for illumination, so that the user can see the road clearly when walking at night, improving safety.
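The direction-and-speed estimate described above — comparing the user's position across consecutive captured images and the time elapsed between them — can be sketched roughly as follows. This is a minimal illustration, not the disclosure's implementation; the function name and the assumption that the user's (x, y) position has already been extracted from each image are ours:

```python
import math

def user_velocity(pos_prev, pos_next, dt_s):
    """Estimate the user's motion direction (a unit vector) and speed from
    the user's (x, y) position in two consecutive captured images and the
    time dt_s (seconds) between the frames."""
    dx = pos_next[0] - pos_prev[0]
    dy = pos_next[1] - pos_prev[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        # No displacement between frames: no direction, zero speed.
        return (0.0, 0.0), 0.0
    return (dx / dist, dy / dist), dist / dt_s

direction, speed = user_velocity((0.0, 0.0), (3.0, 4.0), 2.0)
print(direction, speed)  # (0.6, 0.8) 2.5
```

The device would then choose its own motion direction and speed to match this estimate while holding the predetermined offset in front of or beside the user.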
According to an example of the present disclosure, the processing device 120 may further obtain the user's position, for example by analyzing the user's coordinates in the captured images, or by determining the user's position based on indoor positioning technologies such as Wi-Fi, Bluetooth, ZigBee, or RFID. Based on the user's position, the motion direction and motion speed of its own second action can be determined more accurately, so that the electronic machine device keeps moving at a predetermined distance in front of or beside the user.

Referring to FIG. 3, the electronic machine device 100 may further include a first sensor 140, which may be, for example, an ambient light sensor capable of recognizing the brightness of ambient light. When the ambient light brightness is greater than a first brightness threshold, the processing device 120 is notified, and based on the brightness notification the processing device 120 stops performing the second action. For example, after the user turns on an indoor light, the user may no longer need the electronic machine device to guide and illuminate the way; therefore, after recognizing that the indoor light is on, the device can stop moving or return to a preset default position.

Referring to FIG. 3, the electronic machine device 100 may further include a second sensor 150, which may be, for example, a radar sensor, an infrared sensor, or a distance sensor, and can sense obstacles within a predetermined range around the electronic machine device. For example, after the processing device 120 receives an obstacle detection signal returned by the second sensor 150, it may analyze the signal to determine whether there is an obstacle on the forward route, and change the direction and/or speed of the second action depending on whether an obstacle exists. As another example, the second sensor itself may have the processing capability to determine whether an obstacle exists and feed that information back to the processing device 120. For example, a radar sensor transmits radar signals to its surroundings and judges whether there is an obstacle nearby from changes in the frequency or amplitude of the returned signals. An infrared sensor transmits infrared signals to the surroundings and judges from the returned signals the distance between objects ahead and itself, so that the processing device 120 can judge whether the user's walking is affected and whether the walking direction needs to change. When an obstacle is judged to exist, the processing device 120 can change the direction of its own second action, and can also issue an alarm to alert the user.

In addition, referring to FIG. 3, the electronic machine device may further include a third sensor 160 and a prompt device 170. The third sensor may be, for example, a wireless signal sensor that can detect radio signals within a predetermined range and notify the prompt device 170 when a radio signal is detected. The prompt device 170 may be, for example, a speaker or an LED lamp that can attract the user's attention to give the user a reminder. For example, when the user is not carrying the mobile phone and the wireless signal sensor of the electronic machine device senses an incoming call or text message, the device can notify the user of the incoming call or message, so that an important call is not missed because the phone's volume is low or the phone is muted. Of course, the electronic machine device can also announce incoming-call or text-message information if the user has set this up in advance.

In addition, referring to FIG. 3, the electronic machine device may further include a fourth sensor 180, which may be a sensor capable of detecting the user's position within a predetermined range, for example an infrared sensor. When the user's location is detected, the fourth sensor 180 may, for example, send the user's position information to the processing device 120. The processing device 120 determines the path from the electronic machine device to the user's position based on the position information, and determines the displacement action of walking toward the user based on the path. For example, the electronic machine device can determine the user's position and then deliver the item he/she wants according to the user's instruction. An infrared sensor may judge the user's position by detecting temperature and distance, or by combining temperature with physical contours to avoid misjudgment.
In addition, according to an example of the present disclosure, the fourth sensor 180 can detect multiple pieces of position information of the user within a predetermined time and send them to the processing device 120. The processing device 120 determines, based on the multiple pieces of position information, whether the user's position has changed. When it is determined that the position has not changed, the processing device 120 determines the path from the electronic machine device to the position based on the position information, and determines the displacement action toward the user based on the path. For example, if multiple images captured within 10 seconds all show the user fixed in one place, the user's position has not changed. At this time, the processing device 120 can judge the distance between the user and the electronic machine device to determine the user's position and deliver the item he/she wants. If analysis of the multiple captured images shows that the user keeps moving, the user's position is still changing; in that case the electronic machine device need not deliver the item, avoiding the waste of processing resources caused by constantly locating the user.

In the embodiments of the present disclosure, by determining the user's first action to derive the second action of the electronic machine device, the second action is kept consistent with the first action. This solves the technical problem of enabling the electronic machine device to guide the user even without a preset path, and ensures that the device can perform the corresponding task at any time according to the user's needs.
FIG. 4 shows a third structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure. The structure and operation of an electronic machine device that can move by itself according to an embodiment of the present disclosure will be described below with reference to FIG. 4.

In an embodiment of the present disclosure, the processing device 120 may determine, based on the captured images, whether the user changes from an initial action to the first action, where the initial action and the first action are different types of actions. That is, the processing device 120 can determine whether the user's action has changed. In the embodiments of the present disclosure, different types of actions, or an action change, means that the two successive actions have different attributes. For example, an eating action and a walking action, a getting-up action and a sleeping action, or a learning action and a playing action are all different types of actions. Conversely, if the user turns from lying on the left side to lying flat or on the right side during sleep, although the action changes, it is still a sleeping action and therefore does not constitute different types of actions as defined in the present disclosure.

For example, the image capture device 110 captures the user's action information and generates a first captured image and a second captured image, or more captured images. The processing device 120 compares the image information change amount between the first captured image and the second captured image, or among the multiple captured images, and determines whether the user changes from the initial action to the first action based on the image information change amount. For example, the first captured image and the second captured image may be two consecutive frames, and the processing device 120 can effectively recognize whether the user's action has changed by comparing the preceding and succeeding frames.

To judge and compare the image information change amount, the processing device 120 can compare two or more captured images directly, or it can perform information extraction on the first captured image and the second captured image separately, extract the important information in the images, and determine that the user changes from the initial action to the first action based on the image information change amount between the extracted information. For example, the first captured image and the second captured image are each binarized, and the change from the initial action to the first action is determined from the image information change amount between the binarized first and second captured images. Alternatively, the background information in the images is removed, and the comparison is performed on the foreground information to determine whether the user's action has changed. Alternatively, contour extraction is performed on all images, and the comparison is performed on the contour information to determine the change between the two frames, and so on. In this way, the amount of computation can be effectively reduced and the processing efficiency improved.

When judging the image information change amount, the judgment can be made from the overall content of the processed images. For example, after binarizing the first captured image and the second captured image, the pixel values in each image are accumulated, and the difference between the accumulated pixel values of the two images is compared against a preset threshold. The threshold can be set to a value between 20% and 40% according to actual needs. When the difference is greater than the preset threshold, the user can be considered to have changed from the initial action to the first action; when the difference is less than the preset threshold, the user can be considered to remain in the initial action. For example, if the user merely rolls over during sleep and the difference between the accumulated pixel values of the latter and former frames is 15%, the user can be considered to remain in the sleeping action.
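The accumulated-pixel comparison described above (binarize both frames, accumulate the pixel values, and compare the relative difference against a preset 20-40% threshold) can be sketched as follows. This is a hedged illustration, not the disclosure's implementation: the grayscale cutoff of 128 and the 30% threshold are example values we chose, the threshold sitting within the disclosed range, and real frames would come from the image capture device 110:

```python
def binarize(image, cutoff=128):
    """Binarize a grayscale image (rows of 0-255 values) into 0/1 pixels."""
    return [[1 if px >= cutoff else 0 for px in row] for row in image]

def action_changed(frame_a, frame_b, threshold=0.30):
    """Accumulate the binarized pixel values of two frames and report an
    action change when the relative difference exceeds the preset
    threshold (the disclosure suggests a value between 20% and 40%)."""
    sum_a = sum(map(sum, binarize(frame_a)))
    sum_b = sum(map(sum, binarize(frame_b)))
    if sum_a == 0:
        return sum_b > 0
    return abs(sum_b - sum_a) / sum_a > threshold

# A roll-over during sleep changes few binarized pixels,
# so no action change is reported.
sleeping = [[200, 200, 10], [200, 10, 10]]
rolled   = [[200, 200, 10], [10, 200, 10]]
print(action_changed(sleeping, rolled))  # False
```

A small change such as the 15% roll-over example falls below the threshold and is classified as remaining in the initial (sleeping) action.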
In addition, according to other embodiments of the present disclosure, whether the user changes from the initial action to the first action can also be judged from the change in the user's position between successive images. For example, the image capture device 110 continuously captures the user's action information and generates at least a first captured image and a second captured image in succession. The processing device 120 analyzes the user's position change information in the two images and judges, based on the position change information, whether the user changes from the initial action to the first action. For example, the processing device 120 sets a unified coordinate system for every image captured by the image capture device 110: after the user enters the sleeping action, the head-of-bed position on the bed surface is taken as the origin, the direction along the bed surface from the head of the bed to the foot of the bed is set as the X-axis (abscissa), and the direction perpendicular to the bed surface from the head of the bed toward the ceiling is set as the Y-axis (ordinate). Thus, when the user's action changes, whether the user has switched from one type of action to another can be determined from the change in the user's coordinates. For example, to reduce computation, only the change along the Y-axis may be checked to find whether the user has gone from the initial action to the first action. For example, a coordinate change threshold can be preset based on historical data, e.g. a value between 5% and 20%. When the ordinate of the user's head changes from 10 cm to 50 cm and the change exceeds the threshold, the user can be considered to have changed from the sleeping action to the getting-up action; when the ordinate of the user's head changes from 10 cm to 12 cm and the change is below the threshold, the user can be judged to be still asleep.
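The coordinate-based judgment in the preceding paragraph can be sketched in a few lines. The 20% relative-change threshold is one example value from the disclosed 5-20% range, and the helper name is ours:

```python
def changed_action(y_before_cm, y_after_cm, threshold=0.20):
    """Judge an action change from the ordinate (height above the bed
    surface) of the user's head in two consecutive captured images,
    using a preset relative-change threshold (e.g. 5-20%)."""
    if y_before_cm == 0:
        return y_after_cm != 0
    return abs(y_after_cm - y_before_cm) / y_before_cm > threshold

print(changed_action(10, 50))  # True: sleeping -> getting up
print(changed_action(10, 12))  # False: still asleep
```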
In addition, the electronic machine device can also judge whether the user changes from the initial action to the first action by means of a wireless signal transmitting device. As shown in FIG. 2, a wireless signal transmitting device 240 may further be disposed on the electronic machine device 100; it may be, for example, a radar transmitting transducer, an ultrasonic transmitter, an infrared signal transmitter, or the like. The wireless signal transmitting device 240 can transmit various wireless signals toward the user and receive the wireless signals returned from the user. Of course, the wireless signal transmitting device 240 may also transmit signals not at the user directly, but toward the user's possible action area around the user, to judge whether the user performs the corresponding action. The processing device 120 can judge the image information change amount between the wireless signal transmitted by the wireless signal transmitting device 240 and the wireless signal returned from the user. Since the strength of the returned signal varies with whether the transmitted signal is blocked and by what kind of object, whether the user changes from the initial action to the first action can be determined from the change between the transmitted and returned signals. The above change amount may be a signal frequency change amount, a signal amplitude change amount, a combination of the two, or the like. For example, when the frequency change amount is 200-500 Hz, the frequency change is small and the action has not changed; when the frequency change amount is 1000-3000 Hz, the frequency change is large, and the user's action can be considered to have changed from the initial action to the first action.
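The example frequency bands above can be mapped to a decision as in the sketch below. The "indeterminate" branch for values outside the two disclosed bands is our assumption, since the disclosure only gives the two example ranges:

```python
def classify_frequency_shift(delta_hz):
    """Map the frequency change amount of the returned wireless signal to
    an action-change decision, using the disclosure's example bands:
    200-500 Hz means a small change (action unchanged); 1000-3000 Hz
    means a large change (initial action -> first action)."""
    if 200 <= delta_hz <= 500:
        return "unchanged"
    if 1000 <= delta_hz <= 3000:
        return "changed"
    # Outside the two example bands: no rule given in the disclosure.
    return "indeterminate"

print(classify_frequency_shift(300))   # unchanged
print(classify_frequency_shift(1500))  # changed
```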
In the embodiments of the present disclosure, by analyzing the captured images containing the user's actions to determine whether the user changes from one action to another, and determining the electronic machine device's own next action according to the change, the device can effectively anticipate what the user wants to do or where the user wants to go, and can provide service to the user in a more timely and accurate manner.
FIG. 5 shows a fourth structural schematic diagram of an electronic machine device according to an embodiment of the present disclosure. Referring to FIG. 5, in addition to the image capture device 110, the processing device 120, and the control device 130, the electronic machine device 100 may further include a storage unit 190.

In an embodiment of the present disclosure, the electronic machine device can be trained to memorize at least one stored path. The image capture device 110 can capture multiple first actions; the first action may be multiple consecutive actions, for example multiple displacement actions. The processing device 120 determines multiple consecutive second actions of the electronic machine device based on the multiple consecutive first actions, and generates a movement path based on the multiple consecutive second actions. That is, after guiding the user, the processing device 120 can memorize the guiding path and send it to the storage unit 190, which stores the movement path.

In addition, multiple function buttons 220 may be provided on the electronic machine device 100; they can receive the user's input and determine the movement path stored in the storage unit 190 that corresponds to the input. Based on the user's selection input, the processing device 120 can determine the device's second action from the movement path and the user's first action. For example, by default the processing device 120 can lead the user along the stored movement path, but it must also consider the user's first action: if the user suddenly changes direction while walking, the electronic machine device 100 can also change its second action as needed to meet the user's needs.
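The default-follow-but-user-overrides behaviour just described can be sketched as a simple priority rule; `user_turn` is our hypothetical representation of an observed direction change in the user's first action:

```python
def next_second_action(path_step, user_turn=None):
    """By default follow the stored movement path, but let the user's
    first action take precedence: if the user suddenly changes direction,
    the device changes its second action to match."""
    return user_turn if user_turn is not None else path_step

print(next_second_action("forward"))          # forward (follow stored path)
print(next_second_action("forward", "left"))  # left (user changed direction)
```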
According to an example of the present disclosure, the electronic machine device also has the function of recognizing obstacles. FIG. 6 shows a flowchart of one example of an obstacle processing method according to an embodiment of the present disclosure. The electronic machine device may further include the second sensor 150, which may be, for example, a sensor that transmits radar signals; by transmitting wireless signals to the surroundings and analyzing the returned signals, it can judge whether there is an obstacle within a predetermined range around the electronic machine device.

In step 601, the processing device 120 may retrieve a route pre-stored in the storage unit 190.

In step 602, the processing device 120 may control the electronic machine device to walk along the set route.

In step 603, the second sensor 150 may be used to recognize obstacles.

In step 604, it is judged whether there is an obstacle.

In step 605, when it is judged that there is an obstacle ahead on the route, an obstacle notification is sent to the processing device 120, and the processing device 120 determines the second action of the electronic machine device based on the obstacle notification so that the device avoids the obstacle.

In step 606, if no obstacle is recognized, the second sensor 150 may also send a no-obstacle notification to the processing device 120; based on the no-obstacle notification, the processing device 120 still determines its second action from the movement path pre-stored in the storage unit 190 and the user's first action, while instructing the second sensor 150 to continue detecting obstacles.

In step 607, after avoiding the obstacle, the processing device 120 may record the current movement path that bypasses the obstacle.

In step 608, the processing device 120 may also send the newly recorded movement path to the storage unit 190, which stores the new movement path for the user to select and use later.

Alternatively, after the electronic machine device avoids the obstacle, the processing device 120 may instruct it to continue walking along the previously retrieved set route.

Alternatively, the previously stored path may be directly updated with the newly recorded path, after which the processing device 120 can determine the second action of the electronic machine device based on the updated movement path or according to a further selection by the user.
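The flow of steps 601-608 can be condensed into the following sketch. `obstacle_at` and `detour_of` are stand-ins for the second sensor 150 and the processing device's avoidance planning; both names are ours, for illustration only:

```python
def follow_route(route, obstacle_at, detour_of):
    """Walk a stored route step by step: when the sensor reports an
    obstacle ahead, substitute a detour and record it; otherwise keep the
    original waypoint. Returns the path actually walked, which can then
    be stored as the new movement path."""
    walked = []
    for waypoint in route:
        if obstacle_at(waypoint):               # step 604: obstacle ahead?
            walked.extend(detour_of(waypoint))  # step 605: avoid it
        else:
            walked.append(waypoint)             # step 606: keep the route
    return walked                               # steps 607-608: path to store

route = ["A", "B", "C"]
new_path = follow_route(route, lambda w: w == "B", lambda w: ["B1", "B2"])
print(new_path)  # ['A', 'B1', 'B2', 'C']
```

The returned walked path corresponds to the new movement path recorded in steps 607-608, which could replace or be stored alongside the originally retrieved route.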
In the embodiments of the present disclosure, by training the electronic machine device to store one or more movement paths, the device can also act according to the path selected by the user's input while effectively avoiding obstacles. This makes the functions of the electronic machine device more powerful and adaptable to users' different needs.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two, and that software modules may be placed in any form of computer storage medium. To clearly illustrate the interchangeability of hardware and software, the composition and steps of the examples have been described above generally in terms of function. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present disclosure.

Those skilled in the art should understand that various modifications, combinations, partial combinations, and substitutions of the present disclosure may be made depending on design requirements and other factors, as long as they fall within the scope of the appended claims and their equivalents.

The present application claims priority to Chinese patent application No. 201610652816.1 filed on August 10, 2016; the entire disclosure of the above Chinese patent application is incorporated herein by reference as part of the present application.

Claims (20)

  1. Electronic machine equipment, comprising: an image capture device, a processing device, and a control device,
    wherein the image capture device is configured to capture action information of a user and generate captured images;
    the processing device is configured to obtain, based on the captured images, a first action that the user intends to perform, determine a second action of the electronic machine equipment based on the first action, and generate a control instruction based on the second action and send it to the control device;
    the control device controls, based on the control instruction, the electronic machine equipment to perform the second action.
  2. The electronic machine equipment according to claim 1, wherein the processing device determines, based on the captured images, whether the user changes from an initial action to the first action, wherein the initial action and the first action are different types of actions.
  3. The electronic machine equipment according to claim 2, wherein the image capture device captures the user's action information and generates at least a first captured image and a second captured image in succession;
    the processing device compares an image information change amount between the first captured image and the second captured image, and determines, based on the image information change amount, whether the user changes from the initial action to the first action.
  4. The electronic machine equipment according to claim 3, wherein the processing device performs information extraction on the first captured image and the second captured image separately, and determines that the user changes from the initial action to the first action based on the image information change amount between the extracted information.
  5. The electronic machine equipment according to claim 4, wherein the processing device binarizes the first captured image and the second captured image separately, and determines that the user changes from the initial action to the first action based on the image information change amount between the binarized first captured image and second captured image.
  6. The electronic machine equipment according to any one of claims 2-5, wherein the image capture device captures the user's action information and generates at least a first captured image and a second captured image in succession;
    the processing device analyzes position change information of the user in the first captured image and the second captured image, and determines, based on the position change information, whether the user changes from the initial action to the first action.
  7. The electronic machine equipment according to claim 6, wherein
    the processing device analyzes coordinate position change information of the user in the first captured image and the second captured image, and determines, based on the coordinate position change information, whether the user changes from the initial action to the first action.
  8. The electronic machine equipment according to any one of claims 2-7, further comprising: a wireless signal transmitting device,
    wherein the wireless signal transmitting device is configured to transmit a wireless signal toward the user and receive a wireless signal returned from the user;
    the processing device determines an image information change amount between the transmitted wireless signal and the returned wireless signal, and determines, based on the image information change amount, whether the user changes from the initial action to the first action.
  9. The electronic machine equipment according to any one of claims 1-8, wherein
    the first action is a displacement action, and the processing device determines, based on the first action, an action direction and an action speed of the first action;
    a motion direction and a motion speed of the electronic machine equipment are determined based on the action direction and action speed of the first action, so that the motion direction and motion speed of the second action match the action direction and action speed of the first action.
  10. The electronic machine equipment according to claim 9, wherein
    the processing device further obtains the user's position, and determines the motion direction and motion speed of the second action based on the user's position, so that the electronic machine equipment performs the second action while remaining at a predetermined distance in front of or beside the user.
  11. The electronic machine equipment according to any one of claims 1-10, further comprising a first sensor,
    wherein the first sensor is configured to recognize the brightness of ambient light and, when the ambient light brightness is greater than a first brightness threshold, notify the processing device;
    the processing device stops performance of the second action based on the brightness notification.
  12. The electronic machine equipment according to any one of claims 1-11, further comprising a second sensor,
    wherein the second sensor is configured to recognize an obstacle within a predetermined range around the electronic machine equipment and, when the obstacle is recognized, send an obstacle notification to the processing device;
    the processing device changes the direction and/or speed of the second action based on the obstacle notification.
  13. The electronic machine equipment according to any one of claims 1-12, further comprising a third sensor and a prompt device,
    wherein the third sensor detects a radio signal within a predetermined range and, when the radio signal is detected, notifies the prompt device;
    the prompt device gives the user an information reminder based on the radio signal notification.
  14. The electronic machine equipment according to any one of claims 1-13, further comprising a fourth sensor,
    wherein the second action is a displacement action,
    the fourth sensor detects the user's position within a predetermined range and, when the user's location is detected, sends position information to the processing device;
    the processing device determines a path from the electronic machine equipment to the position based on the position information, and determines the displacement action toward the user based on the path.
  15. The electronic machine equipment according to claim 14, wherein
    the fourth sensor detects multiple pieces of position information of the user within a predetermined time, and sends the multiple pieces of position information to the processing device;
    the processing device determines, based on the multiple pieces of position information, whether the user's position has changed; when it is determined that there is no position change, it determines a path from the electronic machine equipment to the position based on the position information, and determines the displacement action toward the user based on the path.
  16. The electronic machine equipment according to any one of claims 1-15, further comprising a storage unit,
    wherein the first action is multiple consecutive actions, and the processing device determines multiple consecutive second actions of the electronic machine equipment based on the multiple consecutive first actions, and generates a movement path based on the multiple consecutive second actions;
    the storage unit is configured to store the movement path.
  17. The electronic machine equipment according to any one of claims 1-16, further comprising a function button,
    wherein the storage unit stores at least one movement path,
    the function button is configured to determine, based on the user's input, the movement path corresponding to the input,
    the processing device determines the second action of the electronic machine equipment based on the movement path and the first action.
  18. The electronic machine equipment according to claim 17, further comprising a second sensor,
    wherein the second sensor is configured to recognize an obstacle within a predetermined range around the electronic machine equipment and, in response to recognizing the obstacle, send an obstacle notification to the processing device;
    the processing device determines the second action of the electronic machine equipment based on the obstacle notification, so that the electronic machine equipment avoids the obstacle.
  19. The electronic machine equipment according to claim 18, wherein
    the processing device changes the movement path based on the second action, and sends the changed movement path to the storage unit;
    the storage unit stores the changed movement path.
  20. The electronic machine equipment according to claim 18 or 19, wherein,
    in response to not recognizing the obstacle, the second sensor sends a no-obstacle notification to the processing device;
    the processor determines, based on the no-obstacle notification, the second action of the electronic machine equipment from the movement path and the first action.
PCT/CN2017/076922 2016-08-10 2017-03-16 Electronic machine equipment WO2018028200A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/561,770 US20180245923A1 (en) 2016-08-10 2017-03-16 Electronic machine equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610652816.1 2016-08-10
CN201610652816.1A CN106092091B (zh) Electronic machine equipment

Publications (1)

Publication Number Publication Date
WO2018028200A1 true WO2018028200A1 (zh) 2018-02-15

Family

ID=57455394

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/076922 WO2018028200A1 (zh) 2016-08-10 2017-03-16 电子机器设备

Country Status (3)

Country Link
US (1) US20180245923A1 (zh)
CN (1) CN106092091B (zh)
WO (1) WO2018028200A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106092091B (zh) 2016-08-10 2019-07-02 京东方科技集团股份有限公司 Electronic machine equipment
ES2901649T3 (es) 2016-12-23 2022-03-23 Gecko Robotics Inc Robot de inspección
US11307063B2 (en) 2016-12-23 2022-04-19 Gtc Law Group Pc & Affiliates Inspection robot for horizontal tube inspection having vertically positionable sensor carriage
JP6681326B2 (ja) * 2016-12-27 2020-04-15 本田技研工業株式会社 Work system and work method
US10713487B2 (en) 2018-06-29 2020-07-14 Pixart Imaging Inc. Object determining system and electronic apparatus applying the object determining system
CN108958253A (zh) * 2018-07-19 2018-12-07 北京小米移动软件有限公司 Control method and device for a sweeping robot
CA3126283A1 (en) * 2019-03-08 2020-09-17 Gecko Robotics, Inc. Inspection robot
CN110277163A (zh) * 2019-06-12 2019-09-24 合肥中科奔巴科技有限公司 Vision-based in-bed state recognition, monitoring, and early-warning system for the elderly and patients
WO2022225725A1 (en) 2021-04-20 2022-10-27 Gecko Robotics, Inc. Flexible inspection robot
EP4327047A1 (en) 2021-04-22 2024-02-28 Gecko Robotics, Inc. Systems, methods, and apparatus for ultra-sonic inspection of a surface

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110158476A1 (en) * 2009-12-24 2011-06-30 National Taiwan University Of Science And Technology Robot and method for recognizing human faces and gestures thereof
CN103809734A * 2012-11-07 2014-05-21 联想(北京)有限公司 Control method for electronic device, controller, and electronic device
CN104842358A * 2015-05-22 2015-08-19 上海思岚科技有限公司 Autonomously movable multifunctional robot
CN104985599A * 2015-07-20 2015-10-21 百度在线网络技术(北京)有限公司 Artificial-intelligence-based intelligent robot control method and system, and intelligent robot
US20160161945A1 (en) * 2013-06-13 2016-06-09 Samsung Electronics Co., Ltd. Cleaning robot and method for controlling the same
CN105796289A * 2016-06-03 2016-07-27 京东方科技集团股份有限公司 Blind-guiding robot
CN106092091A * 2016-08-10 2016-11-09 京东方科技集团股份有限公司 Electronic machine equipment

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4087104B2 (ja) * 2001-11-20 2008-05-21 シャープ株式会社 Group robot system
JP3879848B2 (ja) * 2003-03-14 2007-02-14 松下電工株式会社 Autonomous mobile device
ATE524784T1 (de) * 2005-09-30 2011-09-15 Irobot Corp Companion robot for personal interaction
US9144360B2 (en) * 2005-12-02 2015-09-29 Irobot Corporation Autonomous coverage robot navigation system
JP4528295B2 (ja) * 2006-12-18 2010-08-18 株式会社日立製作所 Guide robot device and guide system
CN103370039B (zh) * 2011-02-23 2015-10-14 株式会社村田制作所 Walking assistance vehicle
US9229450B2 (en) * 2011-05-31 2016-01-05 Hitachi, Ltd. Autonomous movement system
WO2014122751A1 (ja) * 2013-02-07 2014-08-14 富士機械製造株式会社 Movement assistance robot
KR102094347B1 (ko) * 2013-07-29 2020-03-30 삼성전자주식회사 Automatic cleaning system, cleaning robot, and control method thereof
ES2613138T3 (es) * 2013-08-23 2017-05-22 Lg Electronics Inc. Cleaning robot and method for controlling the same
WO2015052588A2 (en) * 2013-10-10 2015-04-16 Itay Katz Systems, devices, and methods for touch-free typing
CN111603094B (zh) * 2014-02-28 2022-05-13 三星电子株式会社 Cleaning robot and remote controller
KR102328252B1 (ko) * 2015-02-13 2021-11-19 삼성전자주식회사 Cleaning robot and control method thereof
US20160345137A1 (en) * 2015-05-21 2016-11-24 Toshiba America Business Solutions, Inc. Indoor navigation systems and methods
KR102431996B1 (ko) * 2015-10-12 2022-08-16 삼성전자주식회사 Robot cleaner and control method thereof
US20170108874A1 (en) * 2015-10-19 2017-04-20 Aseco Investment Corp. Vision-based system for navigating a robot through an indoor space
GB201518652D0 (en) * 2015-10-21 2015-12-02 F Robotics Acquisitions Ltd Domestic robotic system and method
JP6697768B2 (ja) * 2016-06-29 2020-05-27 パナソニックIpマネジメント株式会社 Walking support robot and walking support method

Also Published As

Publication number Publication date
US20180245923A1 (en) 2018-08-30
CN106092091B (zh) 2019-07-02
CN106092091A (zh) 2016-11-09

Similar Documents

Publication Publication Date Title
WO2018028200A1 (zh) Electronic machine equipment
KR102348041B1 (ko) Control method of a robot system including a plurality of mobile robots
US8972054B2 (en) Robot apparatus, information providing method carried out by the robot apparatus and computer storage media
US9747802B2 (en) Collision avoidance system and method for an underground mine environment
CN111479662A (zh) Artificial intelligence mobile robot that learns obstacles, and control method therefor
JP6816767B2 (ja) Information processing device and program
EP3051810B1 (en) Surveillance
JP5318623B2 (ja) Remote operation device and remote operation program
KR20180075176A (ko) Mobile robot and control method thereof
CN104842358A (zh) Autonomously movable multifunctional robot
WO2018046015A1 (zh) Vehicle alarm method and device, and terminal
CN105058389A (zh) Robot system, robot control method, and robot
CN107962573A (zh) Companion robot and robot control method
US11200786B1 (en) Canine assisted home monitoring
US11960285B2 (en) Method for controlling robot, robot, and recording medium
JP2019139467A (ja) Information processing device, information processing method, and program
KR20180098040A (ко) Mobile robot and control method thereof
CN107485335B (zh) Recognition method and device, electronic equipment, and storage medium
US20240142997A1 (en) Method for controlling robot, robot, and recording medium
US10593058B2 (en) Human radar
JP2012110996A (ja) Robot movement control system, robot movement control program, and robot movement control method
JP6621220B2 (ja) Information processing device, information processing method, and program
US20160379416A1 (en) Apparatus and method for controlling object movement
JP7374581B2 (ja) Robot, image processing method, and program
JP2019139733A (ja) Information processing system, information processing device, information processing method, and program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 15561770

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17838333

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC , EPO FORM 1205A DATED 12.06.2019.

122 Ep: pct application non-entry in european phase

Ref document number: 17838333

Country of ref document: EP

Kind code of ref document: A1