WO2018028200A1 - Electronic machine device - Google Patents
Electronic machine device
- Publication number
- WO2018028200A1 (PCT/CN2017/076922)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- action
- user
- processing device
- image
- motion
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/024—Guidance services
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/028—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/12—Messaging; Mailboxes; Announcements
- H04W4/14—Short messaging services, e.g. short message services [SMS] or unstructured supplementary service data [USSD]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
Definitions
- Embodiments of the present disclosure relate to an electronic machine device.
- in the prior art, a guiding robot identifies objects based on a large amount of image data, determines the destination that the user wants to reach, and leads the user there along a pre-stored path.
- the leading robot in the prior art can only walk in a fixed area and lead the user to designated places; it must plan its trajectory in advance according to the current location and the destination, and guide the user along the planned path. When the user's needs fall outside the planned paths, the guiding robot cannot complete the task.
- an electronic machine apparatus comprising: an image capture device, a processing device, and a control device, wherein the image capture device is configured to collect motion information of a user and generate a captured image;
- the processing device is configured to acquire, based on the captured image, a first action that the user wants to perform; determine a second action of the electronic machine device based on the first action; generate a control command based on the second action; and transmit the command to the control device. The control device controls the electronic machine device to perform the second action based on the control command.
- the processing device determines whether the user changes from the initial action to the first action based on the acquired image, wherein the initial action and the first action are different types of actions.
- the image capture device collects motion information of the user and generates at least a first captured image and a second captured image; the processing device compares the amount of image information change between the first captured image and the second captured image, and determines, based on that amount of change, whether the user changes from the initial action to the first action.
- the processing device separately performs information extraction on the first captured image and the second captured image, and determines, based on the amount of change in image information between the extracted information, whether the user changes from the initial action to the first action.
- the processing device separately performs binarization processing on the first captured image and the second captured image, and determines, based on the amount of image information change between the binarized first captured image and second captured image, whether the user changes from the initial action to the first action.
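The binarize-and-compare step described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: frames are modeled as 2D lists of grayscale values (0-255), and both the pixel threshold and the action-change threshold are assumed values.

```python
def binarize(frame, threshold=128):
    """Map each pixel to 1 (foreground) or 0 (background)."""
    return [[1 if px >= threshold else 0 for px in row] for row in frame]

def change_amount(frame_a, frame_b, threshold=128):
    """Fraction of pixels whose binarized value differs between two frames."""
    a, b = binarize(frame_a, threshold), binarize(frame_b, threshold)
    total = len(a) * len(a[0])
    diff = sum(pa != pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))
    return diff / total

def action_changed(frame_a, frame_b, change_threshold=0.25):
    """Declare an action change when enough of the binarized image differs."""
    return change_amount(frame_a, frame_b) >= change_threshold
```

Binarizing first discards lighting detail, so only gross silhouette changes (such as a transition from sleeping to getting up) push the change amount past the threshold.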
- the image capture device collects motion information of the user and generates at least a first captured image and a second captured image; the processing device analyzes the position change information of the user between the first captured image and the second captured image, and determines whether the user changes from the initial action to the first action based on the position change information.
- the processing device analyzes the coordinate position change information of the user in the first captured image and the second captured image, and determines, based on the coordinate position change information, whether the user changes from the initial action to the first action.
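One way to read "coordinate position change information" is the displacement of the user's silhouette between frames. A minimal sketch under that assumption, where each frame is a binary mask with the user as foreground; the function names are hypothetical:

```python
def centroid(mask):
    """Centroid (row, col) of foreground pixels in a binary mask."""
    pts = [(r, c) for r, row in enumerate(mask)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def position_change(mask_a, mask_b):
    """Euclidean distance between the user's centroids in two frames."""
    ca, cb = centroid(mask_a), centroid(mask_b)
    if ca is None or cb is None:
        return 0.0
    return ((ca[0] - cb[0]) ** 2 + (ca[1] - cb[1]) ** 2) ** 0.5
```

A displacement above some threshold would then indicate a change such as the user starting to walk, while small jitter would not.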
- the electronic machine device may further include a wireless signal transmitting device configured to transmit a wireless signal to the user and receive the wireless signal returned from the user; the processing device determines whether the user changes from the initial action to the first action based on the amount of change between the transmitted wireless signal and the returned wireless signal.
- the first action is a displacement action
- the processing device determines a motion direction and a motion speed of the first action; based on the motion direction and motion speed of the first action, it determines the motion direction and motion speed of the electronic machine device, so that the motion direction and motion speed of the second action match those of the first action.
- the processing device further acquires the location of the user and determines the motion direction and motion speed of the second action based on the user's location, so that the electronic machine device performs the second action while remaining at a predetermined distance in front of or to the side of the user.
- the electronic machine device may further include a first sensor, wherein the first sensor is configured to recognize the brightness of ambient light and to notify the processing device when the ambient light brightness is greater than a first brightness threshold; the processing device stops execution of the second action based on the brightness notification.
- the electronic machine device may further include a second sensor, wherein the second sensor is configured to identify an obstacle within a predetermined range around the electronic machine device and, when an obstacle is identified, to send an obstacle notification to the processing device; the processing device changes the direction and/or speed of the second action based on the obstacle notification.
- the electronic machine device may further include a third sensor and a prompting device, wherein the third sensor detects a radio signal within a predetermined range and notifies the prompting device when the radio signal is detected; the prompting device prompts the user with information based on the radio signal notification.
- the electronic machine device may further include a fourth sensor, wherein the second action is a displacement action; the fourth sensor detects the position of the user within a predetermined range and, when the user's location is detected, transmits location information to the processing device; the processing device determines a path of the electronic machine device to that location based on the location information, and determines the displacement action toward the user based on the path.
- the fourth sensor detects a plurality of pieces of position information of the user within a predetermined time and transmits them to the processing device; the processing device determines, based on the plurality of pieces of position information, whether the user's position has changed; when it determines that the position has not changed, it determines a path of the electronic machine device to that position based on the position information, and determines the displacement action toward the user based on the path.
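The "move toward the user only when the user stays put" logic could be reduced to a spread test over the sampled readings. A sketch under stated assumptions: `positions` is a list of (x, y) readings taken within the predetermined time, and the tolerance value is illustrative:

```python
def user_is_stationary(positions, tolerance=0.2):
    """True if every reading lies within `tolerance` of the first one."""
    if not positions:
        return False
    x0, y0 = positions[0]
    return all(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 <= tolerance
               for x, y in positions[1:])
```

Only when this returns True would the device compute a path to the sampled position; otherwise it skips path planning for a target that is still moving.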
- the electronic machine device may further include a storage unit, wherein the first action is a plurality of consecutive actions; the processing device determines a plurality of consecutive second actions of the electronic machine device based on the consecutive first actions, and generates a movement path based on the plurality of consecutive second actions; the storage unit is configured to store the movement path.
- the electronic machine device may further include a function button, wherein the storage unit stores at least one movement path, and the function button is configured to determine, based on an input of the user, the movement path corresponding to that input; the processing device determines the second action of the electronic machine device based on the movement path and the first action.
- the electronic machine device may further include a second sensor, wherein the second sensor is configured to identify an obstacle within a predetermined range around the electronic machine device and, in response to identifying the obstacle, to transmit an obstacle notification to the processing device; the processing device determines the second action of the electronic machine device based on the obstacle notification, so that the electronic machine device avoids the obstacle.
- the processing device changes the movement path based on the second action and sends the changed movement path to the storage unit; the storage unit stores the changed movement path.
- the second sensor transmits an obstacle notification to the processing device; the processing device determines the second action of the electronic machine device based on the obstacle notification and on the movement path and the first action.
- the electronic machine device does not need to plan a path in advance; it can determine the actions it needs to perform according to the user's actions and complete various service tasks.
- FIG. 1 shows a schematic structural view of an electronic machine device according to an embodiment of the present disclosure
- FIG. 2 is a schematic diagram showing the outline design of an electronic machine device according to an embodiment of the present disclosure
- FIG. 3 illustrates another structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure
- FIG. 4 shows a third structural schematic view of an electronic machine device in accordance with an embodiment of the present disclosure
- FIG. 5 shows a fourth structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure
- FIG. 6 illustrates a flowchart of obstacle processing in accordance with an embodiment of the present disclosure.
- an electronic machine device refers to a machine device that is based on digital and logical computing and that can move by itself without external commands, such as an artificial intelligence device, a robot, or a robotic pet.
- FIG. 1 shows a schematic structural view of an electronic machine device according to an embodiment of the present disclosure.
- FIG. 2 shows a schematic diagram of the outline design of an electronic machine device in accordance with an embodiment of the present disclosure.
- the electronic machine device 100 includes an image capture device 110, a processing device 120, and a control device 130.
- the electronic machine device may include a driving device, which may include a power component such as a motor and moving parts such as wheels or tracks, and which can start, stop, go straight, turn, climb over obstacles, and so on according to instructions.
- a driving device which may include a power component such as a motor and moving parts such as wheels, tracks, etc., and may be activated and stopped according to instructions. Stop, go straight, turn, climb obstacles, etc.
- Embodiments of the present disclosure are not limited to a particular type of drive device.
- the image capture device 110 is configured to collect motion information of the user and generate a captured image.
- Image capture device 110 may, for example, include one or more cameras, video cameras, and the like.
- the image capture device 110 may collect images in a fixed direction or may flip to capture image information at different positions and at different angles.
- the image capture device 110 can be configured to capture not only an image of visible light, but also an image of infrared light for use in a nighttime environment.
- the image acquired by the image capture device 110 can be stored in a storage device automatically or according to a user's instruction.
- the processing device 120 is configured to acquire, based on the image captured by the image capture device 110, a first action that the user wants to perform, then determine a second action of the electronic machine device based on the first action, generate a control command based on the second action, and send it to the control device.
- Processing device 120 may be, for example, a general purpose processor, such as a central processing unit (CPU), or a special purpose processor, such as a programmable logic circuit (PLC), a field programmable gate array (FPGA), or the like.
- the control device 130 controls the electronic machine device to perform the second action based on the control command.
- the control device 130 can, for example, control the electronic machine device to walk, turn on a specific internal function, or emit a sound.
- the control commands can be stored in a predetermined storage device and read into the control device 130 when the electronic machine device is in operation.
- the electronic machine device 100 includes, for example, wheels 210, function buttons 220, a light source 230, and the like.
- the electronic machine device 100 can acquire images through the image capture device 110.
- the electronic machine device 100 provides various function buttons 220 for the user to input commands.
- Light source 230 can be turned on as needed for illumination or the like, which can be, for example, a brightness-adjustable LED light.
- the various functional components in FIG. 2 are not all necessary for embodiments of the present disclosure; those skilled in the art will understand that functional components may be added or removed according to actual needs.
- the function button 220 can be replaced with a touch screen or the like.
- FIG. 3 illustrates another structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure.
- the structure and operation of an electronic machine device that can move by itself according to an embodiment of the present disclosure will be described below with reference to FIG.
- the processing device 120 determines a first action of the user and determines a second action of the electronic machine device based on the first action.
- the first action may be, for example, a displacement action, a gesture action, or the like.
- the processing device 120 determines the action direction and action speed of the displacement action, and based on the action direction and action speed of the first action determines the motion direction and motion speed of the electronic machine device, so that the motion direction and motion speed of the device's second action match the action direction and action speed of the user's first action.
- the electronic machine device can lead the way, provide illumination, and so on while the user walks.
- the processing device 120 may also determine the action direction and action speed of the user's other gesture actions, and determine the motion direction and motion speed of the electronic machine device based on the action direction and action speed of the first action, so that the motion direction and motion speed of the device's second action match those of the user's first action.
- for example, the electronic machine device can deliver a medical device or the like according to the user's gesture.
- the embodiments of the present disclosure will be described below taking only the user's displacement action as an example.
- when the processing device 120 determines that the user is walking, the electronic machine device may help lead the way or act as a companion in order to protect the user's safety, for example when the user is a child or an elderly person.
- the electronic machine device can walk in front of or to the side of the user. If no path is pre-stored inside the device at this time, the device does not know where the user wants to go; the image capture device 110 can then continuously collect images containing the user, and by analyzing and comparing multiple images, determine the user's direction of movement.
- the electronic machine device can also determine the user's movement speed through parameters such as the amount of change between images and the elapsed time.
- after determining the user's direction and speed of movement, the electronic machine device can determine its own direction and speed of movement so that it stays close to the user, remaining at a predetermined distance in front of or to the side of the user. This avoids the device being too far from the user to serve as a companion, or so close that it collides with the user.
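The keep-ahead behavior described here can be sketched as a small control step. All names, the lead distance, and the simple proportional speed rule are illustrative assumptions, not the patent's method:

```python
import math

def follow_step(robot_pos, user_pos, user_heading, lead_distance=1.0,
                gain=0.5, max_speed=1.5):
    """Return (direction_radians, speed) for the robot's next move."""
    # Target point: lead_distance ahead of the user along user_heading.
    tx = user_pos[0] + lead_distance * math.cos(user_heading)
    ty = user_pos[1] + lead_distance * math.sin(user_heading)
    dx, dy = tx - robot_pos[0], ty - robot_pos[1]
    direction = math.atan2(dy, dx)
    # Proportional speed: faster when far from the target point, capped.
    speed = min(max_speed, gain * math.hypot(dx, dy))
    return direction, speed
```

Because the target tracks a point ahead of the user, the robot naturally speeds up when it lags behind and slows down as it reaches the predetermined lead distance.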
- when the electronic machine device guides the user, it can also turn on a light source such as a night light for illumination, so that the user can see the road while walking at night, improving safety.
- the processing device 120 may further acquire the user's location, for example by analyzing the user's coordinates in the captured image, or based on indoor positioning technologies such as Wi-Fi, Bluetooth, ZigBee, and RFID. Based on the user's position, the motion direction and motion speed of the second action can be determined more accurately, so that the electronic machine device keeps moving at a predetermined distance in front of or to the side of the user.
- the electronic machine device 100 may further include a first sensor 140, which may be, for example, an ambient light sensor capable of recognizing the brightness of ambient light.
- when the ambient light brightness is greater than the first brightness threshold, the processing device 120 is notified and stops execution of the second action based on the brightness notification. For example, when the user turns on indoor lighting, the user may no longer need the electronic machine device to lead the way or provide light, so the device can stop moving after recognizing the indoor lighting, or return to a preset default location.
- the electronic machine device 100 may further include a second sensor 150, which may be, for example, a radar sensor, an infrared sensor, a distance sensor, or the like, which may sense an obstacle within a predetermined range around the electronic machine device.
- the processing device 120 can, for example, analyze the signal to determine if an obstacle is present on the forward route.
- the processing device 120 changes the direction and/or speed of the second action based on whether an obstacle is present.
- the second sensor itself may also have processing capabilities to determine if an obstacle is present and to feed back information about the presence or absence of the obstacle to the processing device 120.
- the radar sensor transmits a radar signal to its surroundings and determines whether there are obstacles nearby by detecting changes in the frequency or amplitude of the returned signal.
- the infrared sensor transmits an infrared signal to the surroundings and determines the distance between itself and the object ahead based on the returned signal, so that the processing device 120 can determine whether the user's walking is affected and whether the walking direction needs to be changed.
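The resulting decision, changing the second action's direction and/or speed when a sensed obstacle is too close, might look like the sketch below. The safety threshold, steering offset, and slow-down factor are assumed values for illustration:

```python
def avoid_obstacle(direction, speed, obstacle_distance,
                   safe_distance=0.5, turn_offset=0.8, slow_factor=0.5):
    """Return an adjusted (direction, speed) given an obstacle distance.

    obstacle_distance is the sensor reading in meters, or None when
    no obstacle is detected within range.
    """
    if obstacle_distance is not None and obstacle_distance < safe_distance:
        # Steer away from the obstacle and slow down while passing it.
        return direction + turn_offset, speed * slow_factor
    return direction, speed
```

A real device would likely pick the turn direction from which side the obstacle lies on; this sketch only shows the threshold-and-adjust structure.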
- the processing device 120 may change the direction of the second action performed by the device, or may alert the user.
- the electronic machine device may further include a third sensor 160 and a prompting device 170; the third sensor may be, for example, a wireless signal sensor that can detect a radio signal within a predetermined range and notify the prompting device 170 when the radio signal is detected.
- the prompting device 170 can be, for example, a speaker, an LED light, or the like, which can draw the user's attention as a reminder.
- when the wireless signal sensor of the electronic machine device senses an incoming call or a short message on the mobile phone, it can notify the user of the incoming call or short message information, so that the user does not miss important calls when the phone's volume is low or the phone is muted.
- the electronic machine device can also broadcast the mobile phone's incoming call information or short message information according to the user's preset settings.
- the electronic machine device may further include a fourth sensor 180, which may be a sensor such as an infrared sensor that can detect the position of the user within a predetermined range.
- the fourth sensor 180 may, for example, transmit user location information to the processing device 120 when the location of the user is detected.
- the processing device 120 determines a path of the electronic device to the user location based on the location information and determines a displacement action to walk toward the user based on the path.
- the electronic machine device can determine the user's location and then, based on the user's instructions, bring the user the item he or she wants.
- the infrared sensor can determine the user's position by detecting temperature and distance, and can also combine temperature with the body contour to avoid misjudgment.
- the fourth sensor 180 may detect a plurality of pieces of location information of the user within a predetermined time period and transmit them to the processing device 120.
- the processing device 120 determines, based on the plurality of pieces of location information, whether the user's position has changed. When it determines that the position has not changed, it determines a path of the electronic machine device to that location based on the location information and determines a displacement action toward the user based on the path. For example, if multiple images taken within 10 seconds show that the user stays in one position, the user's position has not changed.
- the processing device 120 can determine the distance between the user and the electronic machine device to establish the user's location, and deliver the item the user wants. If the multiple captured images show that the user is continuously moving, the user's position is changing; in that case, the electronic machine device may refrain from delivering the item, to avoid wasting processing resources on a changing user position.
- by determining the user's first action, the second action of the electronic machine device is determined so that the second action is coordinated with the first action. This solves the technical problem of guiding the user even when no path is preset, and ensures that the electronic machine device can perform the corresponding tasks at any time according to the user's needs.
- FIG. 4 shows a third structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure.
- the structure and operation of an electronic machine device that can move by itself according to an embodiment of the present disclosure will be described below with reference to FIG.
- the processing device 120 may determine, based on the captured image, whether the user changes from the initial action to the first action, where the initial action and the first action are different types of actions. That is, the processing device 120 can determine whether the user's action has changed.
- different types of actions, or action changes, refer to two actions with different attributes. For example, an eating action and a walking action, a getting-up action and a sleeping action, or a studying action and a playing action all belong to different types of actions. Conversely, if the user turns from lying on the left side to lying on the back or on the right side while sleeping, the action changes but still belongs to the sleeping action, and thus does not count as a different type of action as defined in the present disclosure.
- the image capture device 110 collects motion information of the user and generates a first captured image and a second captured image, or more captured images.
- the processing device 120 compares the image information change amount between the first captured image and the second captured image, or among the plurality of captured images, and determines, based on that change amount, whether the user changes from the initial action to the first action.
- the first acquired image and the second acquired image may be two consecutive frames of images, and the processing device 120 can effectively identify whether the user has an action change by comparing the preceding and succeeding frames.
- the processing device 120 can compare two or more captured images directly, or it can first perform information extraction on the first captured image and the second captured image separately, extracting the important information in each image, and then determine from the amount of change between the pieces of extracted information whether the user changes from the initial action to the first action.
- for example, the first captured image and the second captured image may each be binarized, and whether the user changes from the initial action to the first action is determined from the amount of image information change between the binarized first and second captured images.
- alternatively, the background information in the images may be removed, and the comparison performed on the foreground information to determine whether the user's action changes.
- contour extraction may also be performed on the images, with the comparison based on the contour information to determine the changes between the two frames. In this way, the amount of calculation can be effectively reduced and the processing efficiency improved.
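the background-removal variant can be sketched as follows, with nested lists standing in for grayscale frames; the reference background frame and the 10-gray-level noise margin are illustrative assumptions:

```python
def foreground_mask(frame, background, noise=10):
    """Suppress the static background: mark a pixel as foreground (1)
    only when it differs from the reference background frame by more
    than `noise` gray levels; the masks of two frames can then be
    compared instead of the full images."""
    return [[1 if abs(px - bg) > noise else 0
             for px, bg in zip(row, bg_row)]
            for row, bg_row in zip(frame, background)]
```

comparing the resulting 0/1 masks frame to frame touches far fewer values than comparing raw images, which is the efficiency gain the text describes.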
- the amount of change in the image information can be judged from the overall content of the processed images. For example, after binarizing the first captured image and the second captured image, the pixel values in each image are accumulated, and the difference between the two accumulated values is compared against a preset threshold.
- the threshold can be set to a value in the range of 20%-40% according to actual needs.
- if the difference between the accumulated values is greater than the preset threshold, the user may be considered to have changed from the initial action to the first action.
- if the difference is less than the preset threshold, the user can be considered to remain in the initial action. For example, if the user only rolls over during sleep, and the accumulated pixel value of the later frame differs from that of the earlier frame by 15%, the user can be considered to remain in the sleeping action.
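the binarize-accumulate-threshold decision above might look like the following sketch (toy frames as nested lists; the 128 binarization cutoff and the 30% default threshold are assumptions, the latter chosen from the 20%-40% range mentioned in the text):

```python
def binarize(frame, cutoff=128):
    """Map a grayscale frame to 0/1 pixels at the given cutoff."""
    return [[1 if px >= cutoff else 0 for px in row] for row in frame]

def action_changed(frame_a, frame_b, threshold=0.30):
    """Report an action change when the relative difference between the
    accumulated pixel values of the two binarized frames exceeds
    `threshold`."""
    sum_a = sum(map(sum, binarize(frame_a)))
    sum_b = sum(map(sum, binarize(frame_b)))
    if sum_a == 0:
        return sum_b > 0
    return abs(sum_b - sum_a) / sum_a > threshold
```

with this rule, a 15% difference such as the roll-over example stays below the threshold and is treated as the same (sleeping) action.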
- the image capture device 110 continuously collects motion information of the user and generates at least a consecutive first captured image and second captured image.
- the processing device 120 analyzes the position change information of the user in the first captured image and the second captured image, and determines, based on the position change information, whether the user changes from the initial action to the first action.
- the processing device 120 sets a unified coordinate system for the images acquired by the image capture device 110. For example, after the user enters the sleeping action, a point on the bed surface may be taken as the origin, the direction along the bed surface from the head of the bed to its foot set as the X axis (abscissa), and the direction perpendicular to the bed surface toward the ceiling set as the Y axis (ordinate).
- a coordinate change threshold may be set in advance; the threshold may be set based on historical data and may be, for example, a value in the range of 5%-20%.
- when the coordinate change exceeds the threshold, the user can be considered to have changed from the sleeping action to the wake-up action.
- if the ordinate of the user's head changes from 10 cm to 12 cm and the change value is less than the threshold, it can be judged that the user is still asleep.
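a sketch of the coordinate-based judgment, under the assumption that the threshold is a relative change in the head's ordinate; with the 20% default chosen here, the 10 cm to 12 cm example is, as in the text, still classified as asleep:

```python
def wake_up_detected(prev_y, curr_y, threshold=0.20):
    """Judge a sleep-to-wake change from the change in the ordinate
    (height above the bed surface) of the user's head between two
    frames; `threshold` is a relative change, an assumed default."""
    if prev_y == 0:
        return curr_y != 0
    return abs(curr_y - prev_y) / prev_y > threshold
```

a large ordinate jump (the user sitting up) crosses the threshold, while small movements during sleep do not.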
- the electronic machine device can also determine whether the user changes from the initial action to the first action by means of a wireless signal transmitting device.
- a wireless signal transmitting device 240 may further be disposed on the electronic machine device 100.
- the wireless signal transmitting device 240 may be, for example, a radar transmitting transducer, an ultrasonic transmitter, an infrared signal transmitter, or the like.
- the wireless signal transmitting device 240 can transmit various wireless signals to the user and receive wireless signals returned from the user.
- the wireless signal transmitting device 240 may also transmit signals not at the user directly but toward possible action areas around the user, to determine whether the user performs the corresponding action.
- the processing device 120 can determine the amount of change between the wireless signal transmitted by the wireless signal transmitting device 240 and the wireless signal returned from the user. When the transmitted wireless signal is occluded or blocked by an object, the strength of the returned signal changes; it is therefore possible to determine whether the user changes from the initial action to the first action based on the amount of change before and after the signal.
- the amount of change may be a signal frequency change amount, a signal amplitude change amount, a combination of the two, or the like.
- when the frequency change amount is 200-500 Hz, the change is small and the user's action can be considered unchanged; when the frequency change amount is 1000-3000 Hz, the change is large and the user can be considered to have changed from the initial action to the first action.
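the frequency-band rule can be written down directly; the band edges come from the text, while treating values outside both bands as indeterminate is an added assumption:

```python
def motion_from_frequency_shift(delta_hz):
    """Classify the frequency change between the transmitted and
    returned wireless signal: 200-500 Hz means no action change,
    1000-3000 Hz means the user changed from the initial action to
    the first action, and anything else is left indeterminate."""
    if 200 <= delta_hz <= 500:
        return "no_change"
    if 1000 <= delta_hz <= 3000:
        return "action_changed"
    return "indeterminate"
```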
- by determining from the captured images whether the user changes from one action to another, and determining the next action of the electronic machine device according to that change, the device can effectively anticipate what the user wants to do or where the user wants to go, and thus provide the user with timely and accurate services.
- FIG. 5 shows a fourth structural schematic diagram of an electronic machine device in accordance with an embodiment of the present disclosure.
- the electronic machine device 100 may further include a storage unit 190 in addition to the image capture device 110, the processing device 120, and the control device 130.
- the electronic machine device may be trained to memorize at least one movement path.
- the image capture device 110 can collect a plurality of first actions, and the first action can be a plurality of consecutive actions, for example, multiple displacement actions.
- the processing device 120 determines a plurality of consecutive second actions of the electronic device based on the plurality of consecutive first actions, and generates a movement path based on the plurality of consecutive second actions. That is to say, after the processing device 120 guides the user, the path can be memorized and sent to the storage unit 190, and the storage unit 190 stores the path.
- the electronic machine device 100 can also be provided with a plurality of function buttons 220, which can receive user input and determine the movement path stored in the storage unit 190 that corresponds to that input.
- based on the user's selection input, the processing device 120 may determine a second action of the electronic machine device from the stored movement path together with the user's first action. For example, by default the processing device 120 directs movement along the stored movement path, but it also considers the user's first action: if the user suddenly changes direction while walking, the electronic machine device 100 can change its second action accordingly to meet the user's needs.
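arbitration between the stored movement path and the user's observed first action could be sketched as follows (waypoints, headings in degrees, and the 45-degree tolerance are illustrative assumptions):

```python
import math

def segment_heading(p, q):
    """Heading, in degrees, of the path segment from waypoint p to q."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def angular_diff(a, b):
    """Smallest absolute difference between two headings in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def next_device_action(path, index, user_heading, tolerance=45.0):
    """Follow the stored path by default, but when the user's heading
    deviates from the next path segment by more than `tolerance`
    degrees (e.g. the user suddenly turns), follow the user instead."""
    heading = segment_heading(path[index], path[index + 1])
    if angular_diff(heading, user_heading) <= tolerance:
        return ("follow_path", path[index + 1])
    return ("follow_user", user_heading)
```

the tolerance is the knob that decides how strongly the device trusts the stored path over the user's momentary movements.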
- the electronic machine device also has a function of identifying an obstacle.
- FIG. 6 shows a flow chart of one example of an obstacle processing method according to an embodiment of the present disclosure.
- the electronic machine device may further include a second sensor 150, which may be, for example, a sensor that transmits a radar signal; by transmitting wireless signals to the surroundings and examining the returned signals, it can detect whether there is an obstacle within a predetermined range around the electronic machine device.
- in step 601, the processing device 120 may retrieve a route pre-stored in the storage unit 190.
- in step 602, the processing device 120 controls the electronic machine device to walk along the set route.
- in step 603, the second sensor 150 can be used to identify obstacles.
- in step 604, it is determined whether there is an obstacle.
- in step 605, when it is determined that there is an obstacle on the route ahead, an obstacle notification is transmitted to the processing device 120, and the processing device 120 determines the second action of the electronic machine device based on the obstacle notification so as to make the device avoid the obstacle.
- when no obstacle is identified, the second sensor 150 may instead send a no-obstacle notification to the processing device 120; based on the no-obstacle notification, the processing device 120 still determines its second action using the movement path pre-stored in the storage unit 190 and the user's first action, while instructing the second sensor 150 to continue detecting obstacles.
- in step 607, after avoiding the obstacle, the processing device 120 may record the current path taken to bypass the obstacle.
- the processing device 120 may also send the newly recorded movement path to the storage unit 190, which stores the new movement path for later selection and use by the user.
- the processing device 120 can then instruct the device to continue walking along the previously set route.
- the previously recorded path can be directly updated using the newly recorded path, after which the processing device 120 can determine the second action of the electronic device based on the updated movement path or according to further selection by the user.
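the flow of FIG. 6 (retrieve the stored route, walk it, detour on an obstacle notification, record the new path for the storage unit) can be condensed into a sketch; `obstacle_at` and `detour_for` are hypothetical stand-ins for the second sensor 150 and a path planner:

```python
def guide_along_route(route, obstacle_at, detour_for):
    """Walk the stored route waypoint by waypoint, query the obstacle
    sensor before each one, take a detour when an obstacle is reported,
    and return the updated route for the storage unit."""
    walked = []
    for waypoint in route:
        if obstacle_at(waypoint):
            # steps 605/607: avoid the obstacle and record the detour
            walked.extend(detour_for(waypoint))
        else:
            # steps 602/606: no obstacle, keep the stored route
            walked.append(waypoint)
    return walked  # new movement path to store in the storage unit

blocked = {(1, 0)}
new_path = guide_along_route(
    [(0, 0), (1, 0), (2, 0)],
    obstacle_at=lambda wp: wp in blocked,
    detour_for=lambda wp: [(wp[0], wp[1] + 1)],  # step around the obstacle
)
# new_path == [(0, 0), (1, 1), (2, 0)]
```

the returned path can either be stored alongside the old one or directly replace it, matching the two update options described above.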
- embodiments of the present disclosure train the electronic machine device to store one or more movement paths; the device can then act according to the path selected by the user's input and can also effectively avoid obstacles. This makes the functions of the electronic machine equipment more powerful and adaptable to the different needs of users.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
- Navigation (AREA)
- Telephone Function (AREA)
Abstract
Description
Claims (20)
- An electronic machine device, comprising an image capture device, a processing device, and a control device, wherein: the image capture device is configured to collect action information of a user and generate a captured image; the processing device is configured to acquire, based on the captured image, a first action that the user wants to perform, determine a second action of the electronic machine device based on the first action, and, based on the second action, generate a control instruction and send it to the control device; and the control device controls, based on the control instruction, the electronic machine device to perform the second action.
- The electronic machine device according to claim 1, wherein the processing device determines, based on the captured image, whether the user changes from an initial action to the first action, the initial action and the first action being different types of actions.
- The electronic machine device according to claim 2, wherein the image capture device collects action information of the user and generates at least a consecutive first captured image and second captured image; and the processing device compares an image information change amount between the first captured image and the second captured image and determines, based on the image information change amount, whether the user changes from the initial action to the first action.
- The electronic machine device according to claim 3, wherein the processing device performs information extraction on the first captured image and the second captured image respectively, and determines, based on the image information change amount between the pieces of extracted information, that the user changes from the initial action to the first action.
- The electronic machine device according to claim 4, wherein the processing device binarizes the first captured image and the second captured image respectively, and determines, based on the image information change amount between the binarized first captured image and second captured image, that the user changes from the initial action to the first action.
- The electronic machine device according to any one of claims 2-5, wherein the image capture device collects action information of the user and generates at least a consecutive first captured image and second captured image; and the processing device analyzes position change information of the user in the first captured image and the second captured image and determines, based on the position change information, whether the user changes from the initial action to the first action.
- The electronic machine device according to claim 6, wherein the processing device analyzes coordinate position change information of the user in the first captured image and the second captured image and determines, based on the coordinate position change information, whether the user changes from the initial action to the first action.
- The electronic machine device according to any one of claims 2-7, further comprising a wireless signal transmitting device, wherein the wireless signal transmitting device is configured to transmit a wireless signal to the user and receive a wireless signal returned from the user; and the processing device determines an image information change amount between the transmitted wireless signal and the returned wireless signal and determines, based on the image information change amount, whether the user changes from the initial action to the first action.
- The electronic machine device according to any one of claims 1-8, wherein the first action is a displacement action; the processing device determines, based on the first action, an action direction and an action speed of the first action, and determines a motion direction and a motion speed of the electronic machine device based on the action direction and action speed of the first action, so that the motion direction and motion speed of the second action match the action direction and action speed of the first action.
- The electronic machine device according to claim 9, wherein the processing device further acquires the position of the user and determines the motion direction and motion speed of the second action based on the user position, so that the electronic machine device performs the second action while remaining at a predetermined distance in front of or to the side of the user.
- The electronic machine device according to any one of claims 1-10, further comprising a first sensor, wherein the first sensor is configured to identify the brightness of ambient light and, when the ambient light brightness is greater than a first brightness threshold, notify the processing device; and the processing device stops execution of the second action based on the brightness notification.
- The electronic machine device according to any one of claims 1-11, further comprising a second sensor, wherein the second sensor is configured to identify an obstacle within a predetermined range around the electronic machine device and, when the obstacle is identified, send an obstacle notification to the processing device; and the processing device changes the direction and/or speed of the second action based on the obstacle notification.
- The electronic machine device according to any one of claims 1-12, further comprising a third sensor and a prompting device, wherein the third sensor detects a radio signal within a predetermined range and, upon detecting the radio signal, notifies the prompting device; and the prompting device issues an information reminder to the user based on the radio signal notification.
- The electronic machine device according to any one of claims 1-13, further comprising a fourth sensor, wherein the second action is a displacement action; the fourth sensor detects the position of the user within a predetermined range and, upon detecting the user's position, sends position information to the processing device; and the processing device determines, based on the position information, a path from the electronic machine device to the position and determines, based on the path, the displacement action in the direction of the user.
- The electronic machine device according to claim 14, wherein the fourth sensor detects a plurality of pieces of position information of the user within a predetermined time and sends the plurality of pieces of position information to the processing device; the processing device determines, based on the plurality of pieces of position information, whether the user has a position change; and when it is determined that there is no position change, the processing device determines, based on the position information, a path from the electronic machine device to the position and determines, based on the path, the displacement action in the direction of the user.
- The electronic machine device according to any one of claims 1-15, further comprising a storage unit, wherein the first action is a plurality of consecutive actions; the processing device determines, based on the plurality of consecutive first actions, a plurality of consecutive second actions of the electronic machine device and generates a movement path based on the plurality of consecutive second actions; and the storage unit is configured to store the movement path.
- The electronic machine device according to any one of claims 1-16, further comprising a function button, wherein the storage unit stores at least one movement path; the function button is configured to determine, based on an input of the user, a movement path corresponding to the input; and the processing device determines the second action of the electronic machine device based on the movement path and the first action.
- The electronic machine device according to claim 17, further comprising a second sensor, wherein the second sensor is configured to identify an obstacle within a predetermined range around the electronic machine device and, in response to identifying the obstacle, send an obstacle notification to the processing device; and the processing device determines, based on the obstacle notification, a second action of the electronic machine device that makes the electronic machine device avoid the obstacle.
- The electronic machine device according to claim 18, wherein the processing device changes the movement path based on the second action and sends the changed movement path to the storage unit; and the storage unit stores the changed movement path.
- The electronic machine device according to claim 18 or 19, wherein, in response to not identifying the obstacle, the second sensor sends a no-obstacle notification to the processing device; and the processor determines, based on the no-obstacle notification, the second action of the electronic machine device from the movement path and the first action.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/561,770 US20180245923A1 (en) | 2016-08-10 | 2017-03-16 | Electronic machine equipment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610652816.1 | 2016-08-10 | ||
CN201610652816.1A CN106092091B (zh) | 2016-08-10 | 2016-08-10 | 电子机器设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018028200A1 true WO2018028200A1 (zh) | 2018-02-15 |
Family
ID=57455394
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/076922 WO2018028200A1 (zh) | 2016-08-10 | 2017-03-16 | 电子机器设备 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180245923A1 (zh) |
CN (1) | CN106092091B (zh) |
WO (1) | WO2018028200A1 (zh) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106092091B (zh) * | 2016-08-10 | 2019-07-02 | 京东方科技集团股份有限公司 | 电子机器设备 |
ES2901649T3 (es) | 2016-12-23 | 2022-03-23 | Gecko Robotics Inc | Robot de inspección |
US11307063B2 (en) | 2016-12-23 | 2022-04-19 | Gtc Law Group Pc & Affiliates | Inspection robot for horizontal tube inspection having vertically positionable sensor carriage |
JP6681326B2 (ja) * | 2016-12-27 | 2020-04-15 | 本田技研工業株式会社 | 作業システムおよび作業方法 |
US10713487B2 (en) | 2018-06-29 | 2020-07-14 | Pixart Imaging Inc. | Object determining system and electronic apparatus applying the object determining system |
CN108958253A (zh) * | 2018-07-19 | 2018-12-07 | 北京小米移动软件有限公司 | 扫地机器人的控制方法及装置 |
CA3126283A1 (en) * | 2019-03-08 | 2020-09-17 | Gecko Robotics, Inc. | Inspection robot |
CN110277163A (zh) * | 2019-06-12 | 2019-09-24 | 合肥中科奔巴科技有限公司 | 基于视觉老人及病人床上状态识别与监控预警系统 |
WO2022225725A1 (en) | 2021-04-20 | 2022-10-27 | Gecko Robotics, Inc. | Flexible inspection robot |
EP4327047A1 (en) | 2021-04-22 | 2024-02-28 | Gecko Robotics, Inc. | Systems, methods, and apparatus for ultra-sonic inspection of a surface |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110158476A1 (en) * | 2009-12-24 | 2011-06-30 | National Taiwan University Of Science And Technology | Robot and method for recognizing human faces and gestures thereof |
CN103809734A (zh) * | 2012-11-07 | 2014-05-21 | 联想(北京)有限公司 | 一种电子设备的控制方法、控制器及电子设备 |
CN104842358A (zh) * | 2015-05-22 | 2015-08-19 | 上海思岚科技有限公司 | 一种可自主移动的多功能机器人 |
CN104985599A (zh) * | 2015-07-20 | 2015-10-21 | 百度在线网络技术(北京)有限公司 | 基于人工智能的智能机器人控制方法、系统及智能机器人 |
US20160161945A1 (en) * | 2013-06-13 | 2016-06-09 | Samsung Electronics Co., Ltd. | Cleaning robot and method for controlling the same |
CN105796289A (zh) * | 2016-06-03 | 2016-07-27 | 京东方科技集团股份有限公司 | 导盲机器人 |
CN106092091A (zh) * | 2016-08-10 | 2016-11-09 | 京东方科技集团股份有限公司 | 电子机器设备 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4087104B2 (ja) * | 2001-11-20 | 2008-05-21 | シャープ株式会社 | 群ロボットシステム |
JP3879848B2 (ja) * | 2003-03-14 | 2007-02-14 | 松下電工株式会社 | 自律移動装置 |
ATE524784T1 (de) * | 2005-09-30 | 2011-09-15 | Irobot Corp | Begleitroboter für persönliche interaktion |
US9144360B2 (en) * | 2005-12-02 | 2015-09-29 | Irobot Corporation | Autonomous coverage robot navigation system |
JP4528295B2 (ja) * | 2006-12-18 | 2010-08-18 | 株式会社日立製作所 | 案内ロボット装置及び案内システム |
CN103370039B (zh) * | 2011-02-23 | 2015-10-14 | 株式会社村田制作所 | 步行辅助车 |
US9229450B2 (en) * | 2011-05-31 | 2016-01-05 | Hitachi, Ltd. | Autonomous movement system |
WO2014122751A1 (ja) * | 2013-02-07 | 2014-08-14 | 富士機械製造株式会社 | 移動補助ロボット |
KR102094347B1 (ko) * | 2013-07-29 | 2020-03-30 | 삼성전자주식회사 | 자동 청소 시스템, 청소 로봇 및 그 제어 방법 |
ES2613138T3 (es) * | 2013-08-23 | 2017-05-22 | Lg Electronics Inc. | Robot limpiador y método para controlar el mismo |
WO2015052588A2 (en) * | 2013-10-10 | 2015-04-16 | Itay Katz | Systems, devices, and methods for touch-free typing |
CN111603094B (zh) * | 2014-02-28 | 2022-05-13 | 三星电子株式会社 | 清扫机器人及远程控制器 |
KR102328252B1 (ko) * | 2015-02-13 | 2021-11-19 | 삼성전자주식회사 | 청소 로봇 및 그 제어방법 |
US20160345137A1 (en) * | 2015-05-21 | 2016-11-24 | Toshiba America Business Solutions, Inc. | Indoor navigation systems and methods |
KR102431996B1 (ko) * | 2015-10-12 | 2022-08-16 | 삼성전자주식회사 | 로봇 청소기 및 그 제어 방법 |
US20170108874A1 (en) * | 2015-10-19 | 2017-04-20 | Aseco Investment Corp. | Vision-based system for navigating a robot through an indoor space |
GB201518652D0 (en) * | 2015-10-21 | 2015-12-02 | F Robotics Acquisitions Ltd | Domestic robotic system and method |
JP6697768B2 (ja) * | 2016-06-29 | 2020-05-27 | パナソニックIpマネジメント株式会社 | 歩行支援ロボット及び歩行支援方法 |
2016
- 2016-08-10 CN CN201610652816.1A patent/CN106092091B/zh active Active
2017
- 2017-03-16 WO PCT/CN2017/076922 patent/WO2018028200A1/zh active Application Filing
- 2017-03-16 US US15/561,770 patent/US20180245923A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20180245923A1 (en) | 2018-08-30 |
CN106092091B (zh) | 2019-07-02 |
CN106092091A (zh) | 2016-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018028200A1 (zh) | 电子机器设备 | |
KR102348041B1 (ko) | 복수의 이동 로봇을 포함하는 로봇 시스템의 제어 방법 | |
US8972054B2 (en) | Robot apparatus, information providing method carried out by the robot apparatus and computer storage media | |
US9747802B2 (en) | Collision avoidance system and method for an underground mine environment | |
CN111479662A (zh) | 学习障碍物的人工智能移动机器人及其控制方法 | |
JP6816767B2 (ja) | 情報処理装置およびプログラム | |
EP3051810B1 (en) | Surveillance | |
JP5318623B2 (ja) | 遠隔操作装置および遠隔操作プログラム | |
KR20180075176A (ko) | 이동 로봇 및 그 제어방법 | |
CN104842358A (zh) | 一种可自主移动的多功能机器人 | |
WO2018046015A1 (zh) | 车辆报警方法、装置及终端 | |
CN105058389A (zh) | 一种机器人系统、机器人控制方法及机器人 | |
CN107962573A (zh) | 陪伴型机器人及机器人控制方法 | |
US11200786B1 (en) | Canine assisted home monitoring | |
US11960285B2 (en) | Method for controlling robot, robot, and recording medium | |
JP2019139467A (ja) | 情報処理装置、情報処理方法およびプログラム | |
KR20180098040A (ko) | 이동 로봇 및 그 제어방법 | |
CN107485335B (zh) | 识别方法、装置、电子设备及存储介质 | |
US20240142997A1 (en) | Method for controlling robot, robot, and recording medium | |
US10593058B2 (en) | Human radar | |
JP2012110996A (ja) | ロボットの移動制御システム、ロボットの移動制御プログラムおよびロボットの移動制御方法 | |
JP6621220B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
US20160379416A1 (en) | Apparatus and method for controlling object movement | |
JP7374581B2 (ja) | ロボット、画像処理方法及びプログラム | |
JP2019139733A (ja) | 情報処理システム、情報処理装置、情報処理方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 15561770 Country of ref document: US |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17838333 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC , EPO FORM 1205A DATED 12.06.2019. |
122 | Ep: pct application non-entry in european phase |
Ref document number: 17838333 Country of ref document: EP Kind code of ref document: A1 |