WO2019196947A1 - Electronic device determining method, system, computer system and readable storage medium - Google Patents

Electronic device determining method, system, computer system and readable storage medium

Info

Publication number
WO2019196947A1
WO2019196947A1 (application PCT/CN2019/082567)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
determining
similarity
preset
candidate
Prior art date
Application number
PCT/CN2019/082567
Other languages
English (en)
French (fr)
Inventor
王雅卓
关煜
徐忠飞
Original Assignee
北京京东尚科信息技术有限公司
北京京东世纪贸易有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京京东尚科信息技术有限公司, 北京京东世纪贸易有限公司
Priority to US17/042,018 (US11481036B2)
Priority to JP2020551794 (JP7280888B2)
Priority to EP19785227.0 (EP3779645A4)
Publication of WO2019196947A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/758 Involving statistics of pixels or of feature values, e.g. histogram matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person

Definitions

  • the present disclosure relates to the field of Internet technologies, and more particularly, to an electronic device determining method, system, computer system, and computer readable storage medium.
  • the manner of determining the controlled device is mostly that the user sends a corresponding confirmation message to the control terminal through a contact method (such as a touch operation), and then the control terminal determines the controlled device according to the confirmation information.
  • the inventors have found that at least the following problems exist in the related art: the existing voice confirmation method not only suffers from inaccurate semantic understanding, but also places strict requirements on distance and environmental noise, so its use scenarios are limited.
  • the present disclosure provides an electronic device determining method and system in which a first electronic device determines, according to a recognition result of a first action performed by an operating body, a second electronic device that can be controlled by the first electronic device, so as to at least partially overcome the defects of the existing voice manipulation method, which not only suffers from inaccurate semantic understanding but also places strict requirements on distance and environmental noise, limiting its use scenarios.
  • An aspect of the present disclosure provides an electronic device determining method, where the electronic device determining method includes: identifying, by the first electronic device, a first action performed by the operating body to obtain a recognition result; and determining, according to the recognition result, a second electronic device that can be controlled by the first electronic device.
  • determining the second electronic device controllable by the first electronic device includes: determining at least one candidate electronic device controllable by the first electronic device; determining a coordinate origin; determining at least one first position vector, each corresponding to one of the at least one candidate electronic device, starting from the coordinate origin and ending at the position of that candidate electronic device; determining a second position vector starting from the coordinate origin and ending at the position of the operating body after the operating body performs a second action; and determining the second electronic device from the at least one candidate electronic device based on the at least one first position vector and the second position vector.
  • determining the second electronic device from the at least one candidate electronic device includes: determining at least one first angle, each formed by one of the at least one first position vector and a same coordinate axis; determining a second angle formed by the second position vector and the same coordinate axis; calculating an angle difference between each of the at least one first angle and the second angle to obtain at least one corresponding angle difference; and, in a case where an angle difference less than or equal to a preset angle exists in the at least one angle difference, determining the candidate electronic device corresponding to that angle difference as the second electronic device.
  • the electronic device determining method further includes: before determining the candidate electronic device corresponding to the angle difference less than or equal to the preset angle as the second electronic device, outputting name information of that electronic device; and, when the operating body performs a confirmation operation for the name information, determining that electronic device as the second electronic device.
  • determining the at least one candidate electronic device controllable by the first electronic device includes performing the following for each electronic device within a target area: acquiring an image of the electronic device; calculating a first integral map of the image of the electronic device; acquiring at least one preset integral map, wherein each preset integral map is obtained by calculation on a preset template image, and the preset template image is an image obtained by imaging an electronic device controllable by the first electronic device; calculating a first similarity between the first integral map and each preset integral map of the at least one preset integral map to obtain at least one first similarity; and, in a case where a similarity satisfying a first similarity threshold exists in the at least one first similarity, determining the electronic device as a candidate electronic device.
  • the electronic device determining method further includes: before determining the electronic device as a candidate electronic device, calculating a first histogram of the image corresponding to the electronic device; acquiring at least one preset histogram, wherein each preset histogram is obtained by calculation on a preset template image; calculating a second similarity between the first histogram and each preset histogram of the at least one preset histogram to obtain at least one second similarity; and, in a case where a similarity satisfying the first similarity threshold exists in the at least one first similarity and a similarity satisfying a second similarity threshold also exists in the at least one second similarity, determining the electronic device as a candidate electronic device.
  • the electronic device determining method further includes: after determining the second electronic device controllable by the first electronic device, the second electronic device receives user voice information and responds to the user voice message.
  • Another aspect of the present disclosure provides an electronic device determining system, where the electronic device determining system includes: a first obtaining device, configured to identify, by the first electronic device, a first action performed by the operating body to obtain a recognition result; And a determining device, configured to determine, according to the foregoing identification result, the second electronic device that can be controlled by the first electronic device.
  • the first determining device includes: a first determining unit, configured to determine at least one candidate electronic device controllable by the first electronic device; a second determining unit, configured to determine a coordinate origin; a third determining unit, configured to determine at least one first position vector, each corresponding to one of the at least one candidate electronic device, starting from the coordinate origin and ending at the position of that candidate electronic device; a fourth determining unit, configured to determine a second position vector starting from the coordinate origin and ending at the position of the operating body after the operating body performs a second action; and a fifth determining unit, configured to determine the second electronic device from the at least one candidate electronic device based on the at least one first position vector and the second position vector.
  • the fifth determining unit includes: a first determining subunit, configured to determine at least one first angle, each formed by one of the at least one first position vector and a same coordinate axis; a second determining subunit, configured to determine a second angle formed by the second position vector and the same coordinate axis; a calculating subunit, configured to calculate an angle difference between each of the at least one first angle and the second angle to obtain at least one corresponding angle difference; and a third determining subunit, configured to, in a case where an angle difference less than or equal to a preset angle exists in the at least one angle difference, determine the candidate electronic device corresponding to that angle difference as the second electronic device.
  • the electronic device determining system further includes: an output device, configured to output name information of the corresponding electronic device before the candidate electronic device corresponding to the angle difference less than or equal to the preset angle is determined as the second electronic device; and a second determining device, configured to determine the corresponding electronic device as the second electronic device when the operating body performs a confirmation operation for the name information.
  • in the process of determining the at least one candidate electronic device controllable by the first electronic device, the first determining unit, for each electronic device in the target area: acquires an image of the electronic device; calculates a first integral map of the image of the electronic device; acquires at least one preset integral map, wherein each preset integral map is obtained by calculation on a preset template image, and the preset template image is an image obtained by imaging an electronic device controllable by the first electronic device; calculates a first similarity between the first integral map and each preset integral map of the at least one preset integral map to obtain at least one first similarity; and, in a case where a first similarity satisfying the first similarity threshold exists in the at least one first similarity, determines the electronic device as a candidate electronic device.
  • the electronic device determining system further includes: a first calculating device, configured to calculate a first histogram of the image corresponding to the electronic device before the electronic device is determined as a candidate electronic device; a second acquiring device, configured to acquire at least one preset histogram, wherein each preset histogram is obtained by calculation on a preset template image; a second calculating device, configured to calculate a second similarity between the first histogram and each preset histogram of the at least one preset histogram to obtain at least one second similarity; and a third determining device, configured to determine the electronic device as a candidate electronic device in a case where a similarity satisfying the first similarity threshold exists in the at least one first similarity and a similarity satisfying the second similarity threshold also exists in the at least one second similarity.
  • an electronic device determining apparatus is provided, including a collecting device, an identifying device, and a processing device, wherein the collecting device is configured to collect a first action performed by an operating body, the identifying device is configured to identify the first action to obtain a recognition result, and the processing device is configured to determine a controllable second electronic device according to the recognition result.
  • the electronic device determining device further includes: a signal transmitting device configured to transmit a signal to the controllable second electronic device.
  • Another aspect of the present disclosure provides an electronic device determining system including at least one first electronic device and at least one second electronic device, wherein the at least one first electronic device is configured to identify a first action performed by an operating body to obtain a recognition result, and to determine, according to the recognition result, a second electronic device controllable by the first electronic device.
  • Another aspect of the present disclosure provides a computer system comprising: one or more processors; and a computer readable storage medium storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the electronic device determining method according to any one of the above embodiments.
  • Another aspect of the present disclosure provides a computer readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to implement the electronic device determining method of any of the above embodiments .
  • the technical means of the first electronic device determining, according to the recognition result of the first action performed by the operating body, the second electronic device controllable by the first electronic device at least partially overcomes the technical problem that the existing voice control method not only suffers from inaccurate semantic understanding but also places strict requirements on distance and environmental noise, which limits its use scenarios, and thus achieves the technical effect of helping the user determine the second electronic device in a way that is less restricted by the use scenario and more natural.
  • FIG. 1 schematically illustrates an application scenario of an electronic device determining method and system according to an embodiment of the present disclosure
  • FIG. 2 schematically shows a flowchart of an electronic device determining method according to an embodiment of the present disclosure
  • FIG. 3A schematically illustrates a flowchart of determining a second electronic device in accordance with an embodiment of the present disclosure
  • FIG. 3B schematically illustrates a flowchart of determining a second electronic device according to another embodiment of the present disclosure
  • FIG. 3C schematically illustrates a flowchart of an electronic device determining method according to another embodiment of the present disclosure
  • FIG. 3D schematically illustrates a flowchart of determining a candidate electronic device in accordance with an embodiment of the present disclosure
  • FIG. 3E schematically illustrates a flowchart of an electronic device determining method according to another embodiment of the present disclosure
  • FIG. 3F schematically illustrates a schematic diagram of a first electronic device being a binocular camera according to an embodiment of the present disclosure
  • FIG. 3G schematically illustrates a schematic diagram of determining a second electronic device in accordance with an embodiment of the present disclosure
  • FIG. 4 schematically illustrates a block diagram of an electronic device determination system in accordance with an embodiment of the present disclosure
  • FIG. 5A schematically shows a block diagram of a first determining device according to an embodiment of the present disclosure
  • FIG. 5B schematically shows a block diagram of a fifth determining unit according to an embodiment of the present disclosure
  • FIG. 5C schematically illustrates a block diagram of an electronic device determination system in accordance with another embodiment of the present disclosure
  • FIG. 5D schematically illustrates a block diagram of an electronic device determination system in accordance with another embodiment of the present disclosure
  • FIG. 6 schematically illustrates a block diagram of a computer system suitable for implementing an electronic device determination method in accordance with an embodiment of the present disclosure.
  • An embodiment of the present disclosure provides an electronic device determining method, where the electronic device determining method may include: identifying, by the first electronic device, a first action performed by the operating body to obtain a recognition result; and determining, according to the recognition result, a second electronic device that can be controlled by the first electronic device.
  • FIG. 1 schematically illustrates an application scenario of an electronic device determining method and system according to an embodiment of the present disclosure. It should be noted that FIG. 1 is only an example of an application scenario to which the embodiments of the present disclosure may be applied, to help those skilled in the art understand the technical content of the present disclosure, and does not mean that the embodiments of the present disclosure cannot be used in other devices, systems, environments, or scenarios.
  • the first electronic device is the camera 101
  • the operating body is the human hand 102
  • the second electronic device is the television 103.
  • the camera 101 can recognize the first action made by the human hand 102 and obtain the recognition result, and the camera 101 can determine the TV that can be controlled by the camera 101 according to the recognition result.
  • the camera 101 can also control the television 103, such as turning on the television 103, turning off the television 103, and the like.
  • in the embodiment of the present disclosure, the second electronic device that can be controlled by the first electronic device is determined by the first electronic device based on the recognition result of the first action performed by the operating body, which can overcome the defects of the existing voice control method, namely inaccurate semantic understanding and strict requirements on distance and environmental noise that result in limited use scenarios.
  • FIG. 2 schematically illustrates a flow chart of an electronic device determining method in accordance with an embodiment of the present disclosure.
  • the electronic device determining method may include an operation S201 and an operation S202, wherein:
  • in operation S201, the first electronic device recognizes the first action performed by the operating body and acquires the recognition result.
  • the electronic device determining method provided by the present disclosure may be applied to a first electronic device, which may include, but is not limited to, a mobile phone, a television, a notebook computer, a tablet computer, a camera, and the like; the first electronic device may communicate with the second electronic device, for example, to control the second electronic device.
  • the first electronic device may have an identification function. Specifically, the first electronic device can identify at least one electronic device in the scene, and further identify, from the at least one electronic device, at least one candidate electronic device that can be controlled by the first electronic device, where the at least one candidate electronic device that can be controlled by the first electronic device may include the second electronic device.
  • the first electronic device may further identify a first action performed by the operating body, thereby obtaining a recognition result.
  • the first electronic device is not in contact with the operating body, and the first action may be performed by the operating body in a space where the operating body itself and the first electronic device are co-located.
  • after determining a candidate second electronic device, the first electronic device may send secondary confirmation information to the candidate second electronic device. After receiving the secondary confirmation information, the candidate second electronic device may prompt the operating body, for example, in the form of sound and light. At this time, the first electronic device may collect an action of the operating body (for example, collecting the action of the operating body when the first electronic device detects the sound-and-light prompt, or collecting the action of the operating body when a set time threshold is reached after the first electronic device sends the secondary confirmation information). If the action of the operating body represents "confirmation" information, such as a confirming gesture, the candidate second electronic device is used as the second electronic device.
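  • As a rough, non-authoritative Python sketch of this secondary-confirmation flow, the helper methods below (send_confirmation_request, prompt_user, capture_gesture) and the five-second time threshold are hypothetical placeholders introduced only for illustration.

```python
import time

def confirm_candidate(first_device, candidate, timeout_s=5.0):
    """Return True if the operating body confirms the candidate second electronic device."""
    first_device.send_confirmation_request(candidate)  # secondary confirmation information
    candidate.prompt_user()                            # e.g. a sound-and-light prompt
    deadline = time.monotonic() + timeout_s            # set time threshold
    while time.monotonic() < deadline:
        gesture = first_device.capture_gesture()       # collect the operating body's action
        if gesture == "confirm":                       # e.g. a confirming gesture
            return True
        if gesture == "reject":
            return False
    return False                                       # no confirmation within the threshold
```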
  • the above-mentioned operating body may include a limb, a hand, a foot, and the like of the user, and may also include an object controlled by a user.
  • the first action performed by the operating body may be an action performed by the user's limbs, hands, feet, etc., or may be an action performed by the user to manipulate the object.
  • taking a human hand as an example of the operating body, the scheme in which the first electronic device recognizes the action performed by the human hand is described in detail below.
  • a gesture may be recognized prior to recognizing an action of a human hand, for example, a gesture may be recognized using a skin color based gesture segmentation method.
  • the skin color-based gesture segmentation method distinguishes between the skin color and the background color, and then further distinguishes the edge features (such as the outline of the human hand) that are unique to the gesture, thereby identifying and determining the gesture.
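  • By way of illustration only, such a skin-color-based segmentation step could be sketched with OpenCV as follows; the YCrCb thresholds and the minimum contour area are assumptions of this example, not values given by the disclosure.

```python
import cv2
import numpy as np

def segment_hand(frame_bgr):
    """Separate skin-colored regions from the background and return the largest
    contour, taken here as the outline of the human hand."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Typical skin-tone range in the Cr/Cb channels (illustrative values).
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, mask
    hand = max(contours, key=cv2.contourArea)  # edge feature: the hand outline
    return (hand if cv2.contourArea(hand) > 1000 else None), mask
```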
  • after the first electronic device (such as a camera) recognizes the gesture, the Meanshift algorithm can be used to track the motion of the human hand, and the manipulation command is then determined from the motion trajectory of the human hand.
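  • A minimal sketch of Meanshift-based hand tracking with OpenCV is shown below; the use of a hue back-projection and the externally supplied initial window are illustrative assumptions rather than details specified by the disclosure.

```python
import cv2

def track_hand(capture, init_window):
    """Track the hand region with Meanshift and return its trajectory (window centers)."""
    ok, frame = capture.read()
    x, y, w, h = init_window
    hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    roi_hist = cv2.calcHist([hsv_roi], [0], None, [32], [0, 180])   # hue histogram
    cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    window, trajectory = init_window, []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        _, window = cv2.meanShift(back_proj, window, term)          # Meanshift update
        x, y, w, h = window
        trajectory.append((x + w // 2, y + h // 2))                 # motion trajectory
    return trajectory
```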
  • in this way, the recognition result may be obtained. The recognition result may include a manipulation instruction corresponding to the first action, and may be obtained based on the recognized motion trajectory of the first action or based on the final gesture posture of the first action, which is not limited here.
  • in operation S202, the first electronic device acquires the recognition result and determines, according to the recognition result, the second electronic device that can be controlled by the first electronic device.
  • the first electronic device may further control the second electronic device according to the recognition result. For example, if the first action is a circle drawn by the operating body in the air, the first action may be used to cause the first electronic device to determine the second electronic device and control the second electronic device to sleep; if the first action is an "X" shape drawn by the operating body in the air, the first action may be used to cause the first electronic device to determine the second electronic device and control the second electronic device to increase the volume.
  • the first electronic device may be referred to as a master electronic device, and the second electronic device may be referred to as a slave electronic device.
  • the first electronic device of the embodiment of the present disclosure may control an electronic device (for example, the second electronic device) to perform an operation, or may perform the operation by itself; the second electronic device of the embodiment of the present disclosure may be controlled by the first electronic device, and may also perform operations on its own, such as playing music at scheduled times.
  • in one related art, a user sends a corresponding confirmation message to a control terminal through a contact method (such as a touch operation), and the control terminal then determines the controlled device according to the confirmation information. This contact approach does not provide a good user experience.
  • Other related technologies also provide a technical solution for remotely determining a controlled device by a contactless method such as voice operation.
  • the existing voice confirmation method not only has the defects of inaccurate semantic understanding, but also has high requirements on distance and environmental noise, resulting in limited use scenarios.
  • gesture recognition technology is only applicable to certain specific use scenarios, such as somatosensory games and VR device control, and the technology is limited to the user and a single operated device. The interaction between the two cannot involve the third type of device interaction.
  • in the embodiment of the present disclosure, the first electronic device can determine the second electronic device that can be controlled by the first electronic device based on the recognition result of the first action performed by the operating body, which helps the user determine the second electronic device in a way that is less restricted by the use scenario and more natural, and can also overcome the defects of the existing voice control method, namely inaccurate semantic understanding and strict requirements on distance and environmental noise that result in limited use scenarios.
  • embodiments of the present disclosure may also combine the action operation mode and the voice operation mode, so that the present disclosure can be applied to more scenarios.
  • taking a smart speaker as an example, the smart speaker has a working mode and a sleep mode. After the first electronic device confirms, through the action of the operating body, that the second electronic device is the smart speaker and wakes up the smart speaker (the smart speaker changes from the sleep mode to the working mode), the operating body can perform voice interaction with the smart speaker. In this way, the accuracy of confirming the second electronic device can be ensured without affecting the normal use of the second electronic device.
  • the second electronic device may be determined by an action of the operating body when it is inconvenient to determine which electronic device is the second electronic device by voice or the like. For example, when there are two smart air conditioners in the room that can perform voice interaction and the operating body issues the voice command "set the temperature to 25 degrees", it is not convenient to determine which smart air conditioner should perform the operation. At this time, the two smart air conditioners may prompt the operating body to confirm by sound and light. For example, the operating body points a finger at the smart air conditioner whose temperature parameter is to be reset, and the first electronic device determines, based on the action of the operating body, the smart air conditioner that should reset its temperature parameter and sends a message to that smart air conditioner.
  • the smart speaker is still taken as an example.
  • when a candidate device has been determined, the first electronic device may send secondary confirmation information to the candidate second electronic device; in response to receiving the secondary confirmation information, the smart speaker can send an acousto-optic prompt to the operating body, for example, the voice message "Do you want to wake up?" The smart speaker can then enter a standby mode and receive second confirmation information from the operating body, for example, the voice message "Yes". In this case, the smart speaker determines that it is the second electronic device and can send this information to the first electronic device. Combining action and voice operations in this way enhances the user experience.
  • The method shown in FIG. 2 will be further described below with reference to specific embodiments and FIGS. 3A to 3G.
  • FIG. 3A schematically illustrates a flow chart for determining a second electronic device in accordance with an embodiment of the present disclosure.
  • as shown in FIG. 3A, determining the second electronic device that can be controlled by the first electronic device may include operations S301 to S305, where:
  • At operation S301 at least one candidate electronic device operable by the first electronic device is determined.
  • At least one first position vector, each corresponding to one candidate electronic device in the at least one candidate electronic device, starting from the coordinate origin and ending at the position of that candidate electronic device, is determined.
  • the second electronic device is determined from the at least one candidate electronic device based on the at least one first position vector and the second position vector.
  • before determining the second electronic device, the at least one candidate electronic device that can be manipulated by the first electronic device may be determined, and the second electronic device may then be determined from the at least one candidate electronic device.
  • the electronic device that collects the first action and the second action includes various sensors that can be used for gesture recognition, including but not limited to: a camera, a laser radar, a proximity sensor, an infrared sensor, and the like, wherein The camera can include monocular, binocular, and the like.
  • the first electronic device may establish a three-dimensional coordinate system based on the identified at least one candidate electronic device and the operating body, and determine an origin of the three-dimensional coordinate system. Since there is a certain distance between the position of each candidate electronic device and the coordinate origin, a first position vector can be determined starting from the coordinate origin and ending at the position of each candidate electronic device, thereby obtaining at least one first position vector corresponding to the candidate electronic devices in the at least one candidate electronic device. Similarly, since the position of the operating body after the second action is also at a certain distance from the coordinate origin, a second position vector can be determined starting from the coordinate origin and ending at the position of the operating body after the second action. Further, the second electronic device can be determined from the at least one candidate electronic device based on the at least one first position vector and the second position vector.
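  • Purely as an illustration of this step, the following sketch builds the first position vectors and the second position vector from 3D coordinates expressed in the chosen coordinate system; the device names and coordinate values are assumptions of this example.

```python
import numpy as np

def position_vectors(origin, device_positions, hand_position):
    """First position vectors (origin -> each candidate device) and the second
    position vector (origin -> operating body after the second action)."""
    origin = np.asarray(origin, dtype=float)
    first_vectors = {name: np.asarray(pos, dtype=float) - origin
                     for name, pos in device_positions.items()}
    second_vector = np.asarray(hand_position, dtype=float) - origin
    return first_vectors, second_vector

# Example: coordinates in meters, with the coordinate origin at the user's neck.
firsts, second = position_vectors(
    origin=(0.0, 0.0, 0.0),
    device_positions={"television": (2.0, 1.5, 0.3), "speaker": (-1.0, 2.0, -0.2)},
    hand_position=(0.6, 0.5, 0.1),
)
```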
  • the second action performed by the operating body in the embodiment of the present disclosure may be an action performed by the user's limb, hand, foot, or the like, or may be an action performed by the user to manipulate the object.
  • the second action performed by the operating body may be an action of a human finger to the controlled electronic device.
  • the second action may be the same action as the first action, that is, both the first action and the second action may be used to determine the second electronic device; the second action may also be an action different from the first action, for example, the first action may be used to indicate that the first electronic device controls the second electronic device, and the second action may be used to indicate that the first electronic device determines the second electronic device.
  • by determining the second electronic device from the at least one candidate electronic device based on the at least one first position vector and the second position vector, not only can the accuracy of determining the second electronic device be improved, but the user experience can also be improved.
  • FIG. 3B schematically illustrates a flow chart of determining a second electronic device in accordance with another embodiment of the present disclosure.
  • determining the second electronic device from the at least one candidate electronic device may include operations S401 to S404, where:
  • At least one first angle formed by each position vector in the at least one first position vector and the same coordinate axis is determined.
  • an angle difference between each of the at least one first angle and the second angle is calculated to obtain a corresponding at least one angular difference.
  • the electronic device corresponding to the angle difference that is less than or equal to the preset angle in the at least one candidate electronic device is determined as the second electronic device.
  • At least one first angle formed by each first position vector and a certain coordinate axis may be determined according to the determined at least one first position vector; the coordinate axis may be, for example, the x-axis, the y-axis, or the z-axis.
  • a second angle formed by the second position vector with a certain coordinate axis may also be determined according to the determined second position vector. It should be understood that the first angle and the second angle are angles formed by the first position vector and the second position vector, respectively, with the same coordinate axis.
  • an angle difference between each of the at least one first angle and the second angle may be calculated separately, so that at least one angle difference is obtained. Each of the at least one angle difference is compared with the preset angle, and if an angle difference smaller than or equal to the preset angle exists in the at least one angle difference, the candidate electronic device corresponding to that angle difference is determined and is determined as the second electronic device.
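  • Continuing the illustrative sketch above, the angle-difference comparison could be written as follows; the choice of the x-axis as the common reference axis and the 15-degree preset angle are assumptions of this example only.

```python
import numpy as np

def angle_to_axis(vector, axis=(1.0, 0.0, 0.0)):
    """Angle in degrees between a position vector and the chosen coordinate axis."""
    vector, axis = np.asarray(vector, float), np.asarray(axis, float)
    cos = np.dot(vector, axis) / (np.linalg.norm(vector) * np.linalg.norm(axis))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def select_second_device(first_vectors, second_vector, preset_angle_deg=15.0):
    """Pick the candidate whose first angle differs least from the second angle,
    provided the angle difference does not exceed the preset angle."""
    second_angle = angle_to_axis(second_vector)
    best_name, best_diff = None, None
    for name, vec in first_vectors.items():
        diff = abs(angle_to_axis(vec) - second_angle)   # angle difference
        if diff <= preset_angle_deg and (best_diff is None or diff < best_diff):
            best_name, best_diff = name, diff
    return best_name                                    # None if no candidate qualifies

# selected = select_second_device(firsts, second)  # using the vectors built earlier
```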
  • the accuracy of determining the second electronic device may be further improved.
  • FIG. 3C schematically illustrates a flow chart of an electronic device determining method according to another embodiment of the present disclosure.
  • the electronic device determining method may further include an operation S501 and an operation S502, wherein:
  • in operation S501, in order to further ensure that the electronic device corresponding to the angle difference less than or equal to the preset angle in the at least one candidate electronic device is the second electronic device that the user wants to determine through the first electronic device, the name information of the corresponding electronic device may be output before that electronic device is determined as the second electronic device. Specifically, the name information may be broadcast by voice, displayed on a display screen, or displayed by holographic projection.
  • in operation S502, the user may perform a corresponding operation according to the name information output by the first electronic device. If the name information corresponds to the second electronic device that the user wants to determine through the first electronic device, the user may feed a confirmation operation for the name information back to the first electronic device, and after the first electronic device receives the confirmation operation, the corresponding electronic device may be determined as the second electronic device.
  • the confirmation operation may be an action performed by the operating body, and the first electronic device may receive the confirmation operation by identifying the action performed by the operating body.
  • in this way, the electronic device corresponding to the angle difference less than or equal to the preset angle in the at least one candidate electronic device is determined as the second electronic device, which can further improve the accuracy of determining the second electronic device that the user wants to determine.
  • determining the at least one candidate electronic device that can be controlled by the first electronic device includes performing the following for each electronic device in the target area: acquiring an image of the electronic device; calculating a first integral map of the image of the electronic device; acquiring at least one preset integral map, wherein each preset integral map is obtained by calculation on a preset template image, and the preset template image is an image obtained by imaging an electronic device controllable by the first electronic device; calculating a first similarity between the first integral map and each preset integral map of the at least one preset integral map to obtain at least one first similarity; and, in a case where a similarity satisfying the first similarity threshold exists in the at least one first similarity, determining the electronic device as a candidate electronic device.
  • FIG. 3D schematically illustrates a flowchart of determining a candidate electronic device in accordance with an embodiment of the present disclosure.
  • operations S601 to S605 may be performed for each electronic device in the target area, where:
  • At least one preset integral map is acquired, wherein each preset integral map is obtained by calculation on a preset template image, and the preset template image is an image obtained by imaging an electronic device controllable by the first electronic device.
  • a first similarity between the first integral map and each preset integral map of the at least one preset integral map is calculated to obtain at least one first similarity.
  • in a case where a similarity satisfying the first similarity threshold exists in the at least one first similarity, the electronic device is determined as a candidate electronic device.
  • the target area may include a first electronic device identifiable area, for example, when the first electronic device is a camera, the target area may include an area that the camera can scan.
  • the first electronic device may identify some of the electronic devices in the target area, or may identify all the electronic devices in the target area, wherein the electronic devices in the target area may include electronic devices that can be controlled by the first electronic device and may also include electronic devices that cannot be controlled by the first electronic device.
  • the first electronic device may acquire an image of the electronic device, wherein the image may be an image obtained by the first electronic device by scanning the electronic device.
  • a first integral map of the image of the electronic device is calculated, wherein the first integral map may include a first direction integral map, and may also include a first direction integral map and a first color integral map.
  • At least one preset integral map pre-stored in the first electronic device may be acquired, where each preset integral map may be obtained by calculation on a preset template image, and the preset integral map may include a preset direction integral map, or may include a preset direction integral map and a preset color integral map.
  • the preset template image may be pre-stored in the first electronic device, and may be an image obtained from an electronic device controllable by the first electronic device; for example, the preset template image may be obtained by the first electronic device scanning an electronic device that can be controlled by the first electronic device.
  • a first similarity between the first integral map and each preset integral map of the at least one preset integral map is calculated to obtain at least one first similarity. Specifically, a third similarity between the first direction integral map and each preset direction integral map in the at least one preset direction integral map may be calculated; alternatively, both a third similarity between the first direction integral map and each preset direction integral map and a fourth similarity between the first color integral map and each preset color integral map in the at least one preset color integral map may be calculated.
  • in a case where a similarity satisfying the third similarity threshold exists in the at least one third similarity, the electronic device may be determined as a candidate electronic device; alternatively, in a case where a similarity satisfying the third similarity threshold exists in the at least one third similarity and a similarity satisfying the fourth similarity threshold exists in the at least one fourth similarity, the electronic device is determined as a candidate electronic device.
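  • As a non-authoritative sketch of this matching step, the code below computes a direction integral map (integral images over gradient-orientation bins) and compares it against preset integral maps with a simple normalized similarity; the bin count, similarity measure, and threshold are assumptions of this example, and the image is assumed to have been resized to the template size beforehand.

```python
import cv2
import numpy as np

def direction_integral_map(image_gray, bins=8):
    """Integral images of gradient-orientation masks (a 'direction integral map')."""
    gx = cv2.Sobel(image_gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(image_gray, cv2.CV_32F, 0, 1)
    angle = np.arctan2(gy, gx) % np.pi                       # orientation in [0, pi)
    bin_idx = np.minimum((angle / np.pi * bins).astype(int), bins - 1)
    return np.stack([cv2.integral((bin_idx == b).astype(np.float32))
                     for b in range(bins)])

def integral_similarity(a, b):
    """Normalized (cosine) similarity between two integral maps of the same shape."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def is_candidate(first_map, preset_maps, first_similarity_threshold=0.9):
    """The device is a candidate if any first similarity satisfies the threshold."""
    return any(integral_similarity(first_map, p) >= first_similarity_threshold
               for p in preset_maps)
```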
  • if a similarity satisfying the first similarity threshold exists in the at least one first similarity, the electronic device is determined as a candidate electronic device, and at least one candidate electronic device is thus obtained, so that the first electronic device can determine, from the at least one candidate electronic device, the second electronic device that the user wants to determine.
  • FIG. 3E schematically illustrates a flow chart of an electronic device determining method according to another embodiment of the present disclosure.
  • the electronic device determining method may further include operations S701 to S704, where:
  • a first histogram of the image corresponding to the electronic device is calculated.
  • At operation S702 at least one preset histogram is acquired, wherein the preset histogram is obtained by calculating a preset template image.
  • a second similarity between the first histogram and each of the preset histograms in the at least one preset histogram is calculated to obtain at least one second similarity.
  • At operation S704, in a case where a similarity satisfying the first similarity threshold exists in the at least one first similarity and a similarity satisfying the second similarity threshold also exists in the at least one second similarity, the electronic device is determined as a candidate electronic device.
  • in order to further ensure that the determined candidate electronic device is indeed an electronic device that can be controlled by the first electronic device, the first histogram of the image corresponding to the electronic device may be further calculated before the electronic device is determined as a candidate electronic device.
  • the first histogram may include a first direction histogram, and may also include a first direction histogram and a first color histogram.
  • At least one preset histogram previously stored in the first electronic device may be acquired, wherein each preset histogram may be obtained by calculation on a preset template image, and the preset histogram may include a preset direction histogram, or may include a preset direction histogram and a preset color histogram.
  • calculating a second similarity between the first histogram and each preset histogram in the at least one preset histogram yields at least one second similarity. Specifically, a fifth similarity between the first direction histogram and each preset direction histogram in the at least one preset direction histogram may be calculated; alternatively, both a fifth similarity between the first direction histogram and each preset direction histogram and a sixth similarity between the first color histogram and each preset color histogram in the at least one preset color histogram may be calculated.
  • in a case where a similarity satisfying the fifth similarity threshold exists in the at least one fifth similarity, the electronic device is determined as a candidate electronic device; alternatively, in a case where a similarity satisfying the third similarity threshold exists in the at least one third similarity, a similarity satisfying the fourth similarity threshold exists in the at least one fourth similarity, a similarity satisfying the fifth similarity threshold exists in the at least one fifth similarity, and a similarity satisfying the sixth similarity threshold exists in the at least one sixth similarity, the electronic device is determined as a candidate electronic device.
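  • For illustration only, a direction/color histogram comparison of the kind described above could be sketched as follows; the histogram sizes, the correlation metric, and the thresholds are assumptions of this example, not values fixed by the disclosure.

```python
import cv2
import numpy as np

def direction_histogram(image_gray, bins=16):
    """Histogram of gradient orientations (a 'direction histogram')."""
    gx = cv2.Sobel(image_gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(image_gray, cv2.CV_32F, 0, 1)
    angle = (np.arctan2(gy, gx) % np.pi).ravel()
    hist, _ = np.histogram(angle, bins=bins, range=(0.0, np.pi))
    return hist.astype(np.float32).reshape(-1, 1)

def color_histogram(image_bgr, bins=32):
    """Hue histogram used here as a simple color histogram."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    return cv2.calcHist([hsv], [0], None, [bins], [0, 180])

def passes_histogram_check(image_bgr, preset_dir_hists, preset_color_hists,
                           fifth_threshold=0.8, sixth_threshold=0.8):
    """Candidate check combining the fifth (direction) and sixth (color) similarities."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    dir_hist = direction_histogram(gray)
    col_hist = color_histogram(image_bgr)
    dir_ok = any(cv2.compareHist(dir_hist, p, cv2.HISTCMP_CORREL) >= fifth_threshold
                 for p in preset_dir_hists)
    color_ok = any(cv2.compareHist(col_hist, p, cv2.HISTCMP_CORREL) >= sixth_threshold
                   for p in preset_color_hists)
    return dir_ok and color_ok
```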
  • the camera stores information of images of different angles (also referred to as preset template images) of all electronic devices that can be controlled by the camera.
  • the camera can scan the electronic device within the target area, and the image of the electronic device within the target area acquired by the camera can be referred to as a pre-processed image.
  • the camera can identify the electronic device in the target area by the physical property recommendation algorithm, and determine a set of candidate frame sets in a short time.
  • the statistical matching method is used to compare the similarity between the first integral map of the preprocessed image and the preset integral map of the preset template image to realize the object recognition function. Specific steps are as follows:
  • in this way, determining the electronic device as one of the candidate electronic devices can improve the accuracy of determining each candidate electronic device.
  • FIG. 3F schematically illustrates a schematic diagram of a first electronic device being a binocular camera in accordance with an embodiment of the present disclosure.
  • the camera can include a binocular camera 104, which can identify at least one candidate electronic device in the picture and a first action (for example, the user's gesture) made by the operating body, as well as the depth information of the first action made by the operating body, thereby implementing a completely new way of interacting that helps the user control other devices in the scene, such as the second electronic device.
  • FIG. 3G schematically illustrates a schematic diagram of determining a second electronic device in accordance with an embodiment of the present disclosure.
  • the first electronic device may recognize a specific action of the operating body (for example, a particular gesture) to initiate an action control function (for example, a gesture control function).
  • the first electronic device can identify the action of the operating body and the at least one candidate electronic device in the picture that can be manipulated by the first electronic device; when the final gesture of the action points to an electronic device, the first electronic device feeds back the name information of that electronic device, and if the name information matches the device the user intends, the electronic device is determined as the second electronic device, after which the user can cause the first electronic device to control the second electronic device through the first action of the operating body, such as a sliding gesture control.
  • the binocular camera 104 can acquire depth information of the objects (including at least one controlled device) in the scene 105 and of the user 106 to establish a three-dimensional coordinate system.
  • the binocular camera 104 detects that the user 106 initiates the gesture of device manipulation, assuming that the neck of the user 106 coincides with the shoulder position of the user 106, the three-dimensional coordinate system can be established with the neck position of the user 106 as the origin.
  • the front side of the user 106 is the positive x-axis direction, and the direction near the binocular camera 104 is the positive direction of the y-axis for recording depth information, and the upper side is the z-axis positive direction.
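  • As an illustrative aside, 3D coordinates of this kind could be recovered from a calibrated binocular camera roughly as sketched below; the focal length, baseline, and principal-point values are placeholders, and mapping the result into the user-centered axes described above would require an additional, calibration-dependent rotation.

```python
import numpy as np

# Placeholder calibration values for a hypothetical binocular camera.
FOCAL_PX = 700.0       # focal length in pixels
BASELINE_M = 0.06      # distance between the two camera centers, in meters
CX, CY = 320.0, 240.0  # principal point of the left camera

def pixel_to_camera_3d(u, v, disparity_px):
    """Back-project a left-image pixel and its stereo disparity into camera space."""
    depth = FOCAL_PX * BASELINE_M / max(disparity_px, 1e-6)  # depth from disparity
    x = (u - CX) * depth / FOCAL_PX
    y = (v - CY) * depth / FOCAL_PX
    return np.array([x, y, depth])

def to_user_frame(point_cam, neck_cam, rotation=np.eye(3)):
    """Express a camera-space point in the user-centered frame (origin at the neck)."""
    return rotation @ (np.asarray(point_cam, float) - np.asarray(neck_cam, float))
```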
  • the steps of the binocular camera 104 to lock the controlled electronic device pointed by the user 106 are as follows:
  • when the binocular camera 104 detects the activation gesture of the user 106, it starts to track the position of the human hand of the user 106. After the human hand moves and stays at a certain position, the binocular camera 104 records the second position vector of the human hand at this time;
  • each of the at least one first position vector is then compared with the second position vector: for example, the first angle between each first position vector and a coordinate axis is determined to obtain at least one first angle, the second angle between the second position vector and the same coordinate axis is determined, and each of the at least one first angle is compared with the second angle;
  • the name information of the electronic device corresponding to the angle close to or coincident with the second angle may be voiced by the binocular camera 104 at this time;
  • the electronic device among the at least one candidate electronic device corresponding to the angle close to or coincident with the second angle may be confirmed as the controlled electronic device, and the binocular camera 104 may manipulate the controlled electronic device according to the recognition result of the first action performed by the user 106, wherein the recognition result may identify a manipulation instruction, which may include starting, shutting down, increasing the gear position, reducing the gear position, and the like.
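  • Purely as an illustrative sketch, dispatching a recognized first action to such manipulation instructions could look like the following; the gesture labels and the device interface are assumptions of this example, not part of the disclosure.

```python
from typing import Protocol

class ControlledDevice(Protocol):
    """Hypothetical interface for a controllable second electronic device."""
    def start(self) -> None: ...
    def shut_down(self) -> None: ...
    def set_gear(self, gear: int) -> None: ...

# Illustrative mapping from recognized gestures to manipulation instructions.
GESTURE_COMMANDS = {
    "open_palm": "start",
    "fist": "shut_down",
    "swipe_up": "increase_gear",
    "swipe_down": "decrease_gear",
}

def dispatch(gesture: str, device: ControlledDevice, current_gear: int) -> int:
    """Apply the manipulation instruction corresponding to the recognized gesture."""
    command = GESTURE_COMMANDS.get(gesture)
    if command == "start":
        device.start()
    elif command == "shut_down":
        device.shut_down()
    elif command == "increase_gear":
        current_gear += 1
        device.set_gear(current_gear)
    elif command == "decrease_gear":
        current_gear -= 1
        device.set_gear(current_gear)
    return current_gear
```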
  • the binocular camera device provided by the embodiment of the present disclosure offers a brand-new smart home control mode: it can automatically identify other devices in the scene by using image recognition and gesture recognition technology, save the relevant identification data, and then identify the user's gesture actions, thereby achieving the purpose of using gestures to control other smart devices in the home.
  • embodiments of the present disclosure compensate for the shortcomings of current voice manipulation, which places strict requirements on distance and background noise, and help the user operate different devices in the home in a less restrictive and more natural way.
  • FIG. 4 schematically illustrates a block diagram of an electronic device determination system in accordance with an embodiment of the present disclosure.
  • the electronic device determining system can include a first obtaining device 410 and a first determining device 420, wherein:
  • the first obtaining device 410 is configured to acquire the recognition result by the first electronic device identifying the first action performed by the operating body.
  • the first determining device 420 is configured to determine, according to the recognition result, the second electronic device that can be controlled by the first electronic device.
  • in the embodiment of the present disclosure, the system helps the user manipulate the controlled electronic device in a way that is less restricted by the use scenario and more natural, and can also overcome the defects of the existing voice manipulation mode, namely inaccurate semantic understanding and strict requirements on distance and environmental noise that result in limited use scenarios.
  • FIG. 5A schematically illustrates a block diagram of a first determining device in accordance with an embodiment of the present disclosure.
  • As shown in FIG. 5A, the first determining device 420 may include a first determining unit 421, a second determining unit 422, a third determining unit 423, a fourth determining unit 424 and a fifth determining unit 425, wherein:
  • The first determining unit 421 is configured to determine at least one candidate electronic device that can be controlled by the first electronic device.
  • The second determining unit 422 is configured to determine a coordinate origin.
  • The third determining unit 423 is configured to determine, for each candidate electronic device of the at least one candidate electronic device, at least one first position vector that starts at the coordinate origin and ends at the position of that candidate electronic device.
  • The fourth determining unit 424 is configured to determine a second position vector that starts at the coordinate origin and ends at the position where the operating body is located after performing the second action.
  • The fifth determining unit 425 is configured to determine the second electronic device from the at least one candidate electronic device based on the at least one first position vector and the second position vector.
  • Through the embodiments of the present disclosure, determining the controlled electronic device from the at least one candidate electronic device based on the at least one first position vector and the second position vector, and then performing a control operation on the controlled electronic device, can not only improve the accuracy of the control but also improve the user experience.
  • FIG. 5B schematically illustrates a block diagram of a fifth determining unit in accordance with an embodiment of the present disclosure.
  • As shown in FIG. 5B, the fifth determining unit 425 may include a first determining subunit 4251, a second determining subunit 4252, a calculating subunit 4253 and a third determining subunit 4254, wherein:
  • The first determining subunit 4251 is configured to determine at least one first angle formed between each of the at least one first position vector and a same coordinate axis.
  • The second determining subunit 4252 is configured to determine a second angle formed between the second position vector and the same coordinate axis.
  • The calculating subunit 4253 is configured to calculate an angle difference between each of the at least one first angle and the second angle to obtain at least one corresponding angle difference.
  • The third determining subunit 4254 is configured to, when an angle difference less than or equal to a preset angle exists among the at least one angle difference, determine the electronic device corresponding to that angle difference among the at least one candidate electronic device as the second electronic device.
  • Through the embodiments of the present disclosure, determining the electronic device corresponding to an angle difference less than or equal to the preset angle among the at least one candidate electronic device as the controlled electronic device can further improve the accuracy of determining the controlled electronic device.
  • FIG. 5C schematically illustrates a block diagram of an electronic device determination system in accordance with another embodiment of the present disclosure.
  • As shown in FIG. 5C, the electronic device determination system 400 may further include an output device 510 and a second determining device 520, wherein:
  • The output device 510 is configured to output the name information of the corresponding electronic device before the electronic device corresponding to the angle difference less than or equal to the preset angle among the at least one candidate electronic device is determined as the second electronic device.
  • The second determining device 520 is configured to determine the corresponding electronic device as the second electronic device when the operating body performs a confirmation operation for the name information.
  • Through the embodiments of the present disclosure, determining the electronic device corresponding to the angle difference less than or equal to the preset angle as the controlled electronic device only after the operating body has confirmed the name information makes it possible to identify more accurately the controlled electronic device that the user wants to control through the master electronic device. A minimal sketch of this confirmation step follows.
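The sketch below is a small, hypothetical illustration of the confirmation flow: it assumes a text-to-speech helper and a confirmation-gesture reader are available on the first electronic device; both helpers, the prompt wording and the check-mark gesture are placeholders, not part of the disclosure.

```python
from typing import Callable, Optional

def confirm_candidate(candidate_name: str,
                      speak: Callable[[str], None],
                      read_confirmation_gesture: Callable[[], bool]) -> Optional[str]:
    """Announce the candidate's name, then wait for the operating body's
    confirmation gesture before treating it as the second electronic device."""
    speak(f"Do you want to control {candidate_name}?")
    if read_confirmation_gesture():          # e.g. a check-mark gesture
        return candidate_name                # confirmed second electronic device
    return None                              # no confirmation: keep looking / retry

# Hypothetical usage with stand-ins for the speech and gesture interfaces:
print(confirm_candidate("air conditioner", print, lambda: True))
```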
  • As an optional embodiment, in the process in which the first determining unit determines the at least one candidate electronic device that can be controlled by the first electronic device, the following is performed for each electronic device in the target area: acquiring an image of the electronic device; calculating a first integral map of the image of the electronic device; acquiring at least one preset integral map, where each preset integral map is obtained by calculating a preset template image and each preset template image is an image obtained from an electronic device that can be controlled by the first electronic device; calculating a first similarity between the first integral map and each of the at least one preset integral map to obtain at least one first similarity; and, when a first similarity satisfying a first similarity threshold exists among the at least one first similarity, determining the electronic device as a candidate electronic device.
  • Through the embodiments of the present disclosure, determining the electronic device as a candidate electronic device when a similarity satisfying the first similarity threshold exists among the at least one first similarity yields at least one candidate electronic device, so that the master electronic device can determine, from the at least one candidate electronic device, the controlled electronic device that the user wants to control. A minimal sketch of this first-stage check is given below.
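The disclosure does not fix how the integral maps are computed or compared, so the Python sketch below is only one possible reading: it resizes each image, computes a grayscale integral image with OpenCV, and uses cosine similarity with an illustrative threshold as the first similarity; all of these choices are assumptions for illustration.

```python
import cv2
import numpy as np

def integral_map(image_bgr, size=(64, 64)):
    """Grayscale integral image of a resized patch (a stand-in for the
    'first integral map' / 'preset integral map' of the description)."""
    gray = cv2.cvtColor(cv2.resize(image_bgr, size), cv2.COLOR_BGR2GRAY)
    return cv2.integral(gray).astype(np.float64)

def integral_similarity(a, b):
    """Cosine similarity between two integral maps; the disclosure does not
    specify a metric, so this choice is illustrative only."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def is_candidate(device_image, preset_template_images, first_threshold=0.99):
    """Return True if the device image is similar enough to any preset template
    image of a controllable electronic device (first-stage check)."""
    first = integral_map(device_image)
    return any(integral_similarity(first, integral_map(t)) >= first_threshold
               for t in preset_template_images)
```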
  • FIG. 5D schematically illustrates a block diagram of an electronic device determination system in accordance with another embodiment of the present disclosure.
  • As shown in FIG. 5D, the electronic device determination system 400 may further include a first computing device 610, a second obtaining device 620, a second computing device 630 and a third determining device 640, wherein:
  • The first computing device 610 is configured to calculate, before the electronic device is determined as a candidate electronic device, a first histogram of the image corresponding to the electronic device.
  • The second obtaining device 620 is configured to acquire at least one preset histogram, where each preset histogram is obtained by calculating a preset template image.
  • The second computing device 630 is configured to calculate a second similarity between the first histogram and each of the at least one preset histogram to obtain at least one second similarity.
  • The third determining device 640 is configured to determine the electronic device as a candidate electronic device when a similarity satisfying the first similarity threshold exists among the at least one first similarity and a similarity satisfying the second similarity threshold also exists among the at least one second similarity.
  • Through the embodiments of the present disclosure, determining the electronic device as one of the candidate electronic devices only when the first similarity satisfies the first similarity threshold and the second similarity also satisfies the second similarity threshold can improve the correctness of each determined candidate electronic device. A minimal sketch of this second-stage check follows.
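Again, the disclosure does not specify the histogram type, bin counts or similarity metric. The sketch below assumes a hue-saturation colour histogram compared by correlation, with an illustrative second threshold, as the second-stage check that complements the integral-map check above; these are assumptions, not the patented implementation.

```python
import cv2

def hs_histogram(image_bgr, size=(64, 64)):
    """Hue-saturation histogram used as a stand-in for the
    'first histogram' / 'preset histogram' of the description."""
    hsv = cv2.cvtColor(cv2.resize(image_bgr, size), cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 1, cv2.NORM_MINMAX)
    return hist

def passes_second_check(device_image, preset_template_images, second_threshold=0.8):
    """Second-stage check: correlation between the device's histogram and any
    preset histogram must reach the (illustrative) second similarity threshold."""
    first_hist = hs_histogram(device_image)
    return any(cv2.compareHist(first_hist, hs_histogram(t), cv2.HISTCMP_CORREL)
               >= second_threshold for t in preset_template_images)

# A device would be kept as a candidate only if both is_candidate(...) above
# and passes_second_check(...) return True.
```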
  • It can be understood that the first obtaining device 410, the first determining device 420, the output device 510, the second determining device 520, the first computing device 610, the second obtaining device 620, the second computing device 630, the third determining device 640, the first determining unit 421, the second determining unit 422, the third determining unit 423, the fourth determining unit 424, the fifth determining unit 425, the first determining subunit 4251, the second determining subunit 4252, the calculating subunit 4253 and the third determining subunit 4254 may be combined and implemented in one device/unit/subunit, or any one of these devices/units/subunits may be split into multiple devices/units/subunits. Alternatively, at least part of the functionality of one or more of these devices/units/subunits may be combined with at least part of the functionality of other devices/units/subunits and implemented in one device/unit/subunit.
  • According to the embodiments of the present disclosure, at least one of the devices/units/subunits listed above may be implemented at least in part as a hardware circuit, such as a field programmable gate array (FPGA), a programmable logic array (PLA), a system on chip, a system on substrate, a system in package or an application-specific integrated circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or by an appropriate combination of software, hardware and firmware. Alternatively, at least one of them may be implemented at least in part as a computer program module which, when run by a computer, performs the function of the corresponding device/unit/subunit.
  • Another aspect of the present disclosure provides an electronic device determining apparatus including a collecting device, an identifying device and a processing device, wherein the collecting device is configured to capture a first action performed by an operating body, the identifying device is configured to identify the first action to obtain a recognition result, and the processing device is configured to determine a controllable second electronic device according to the recognition result.
  • According to an embodiment of the present disclosure, the electronic device determining apparatus further includes a signal transmitting device configured to transmit a signal to the controllable second electronic device.
  • Another aspect of the present disclosure provides an electronic device determination system including at least one first electronic device and at least one second electronic device, wherein the at least one first electronic device is configured to recognize a first action performed by an operating body, obtain a recognition result, and determine, according to the recognition result, a second electronic device that can be controlled by the first electronic device.
  • Another aspect of the present disclosure provides a computer system including one or more processors and a computer-readable storage medium storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the electronic device control method of any one of the above embodiments.
  • FIG. 6 schematically illustrates a block diagram of a computer system suitable for implementing an electronic device determination method in accordance with an embodiment of the present disclosure.
  • It should be noted that the computer system shown in FIG. 6 is merely an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
  • As shown in FIG. 6, a computer system 700 according to an embodiment of the present disclosure includes a processor 701, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage portion 708 into a random access memory (RAM) 703.
  • Processor 701 can include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor, and/or a related chipset and/or a special purpose microprocessor (e.g., an application specific integrated circuit (ASIC)), and the like.
  • Processor 701 can also include onboard memory for caching purposes.
  • Processor 701 can include a single processing unit or a plurality of processing units for performing different actions of a method flow in accordance with an embodiment of the present disclosure.
  • Various programs and data required for the operation of the computer system 700 are stored in the RAM 703. The processor 701, the ROM 702 and the RAM 703 are connected to one another through a bus 704.
  • The processor 701 performs the various operations described above by executing the programs in the ROM 702 and/or the RAM 703. It should be noted that the programs may also be stored in one or more memories other than the ROM 702 and the RAM 703.
  • The processor 701 may also perform the various operations described above by executing programs stored in the one or more memories.
  • According to an embodiment of the present disclosure, the computer system 700 may further include an input/output (I/O) interface 705, which is also connected to the bus 704.
  • The computer system 700 may also include one or more of the following components connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse and the like; an output portion 707 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker and the like; a storage portion 708 including a hard disk and the like; and a communication portion 709 including a network interface card such as a LAN card or a modem.
  • The communication portion 709 performs communication processing via a network such as the Internet.
  • A drive 710 is also connected to the I/O interface 705 as needed.
  • A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 710 as needed, so that a computer program read therefrom can be installed into the storage portion 708 as needed.
  • According to an embodiment of the present disclosure, the method described above with reference to the flowcharts may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowcharts.
  • In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 709 and/or installed from the removable medium 711.
  • When the computer program is executed by the processor 701, the above-described functions defined in the system of the embodiments of the present disclosure are performed.
  • According to the embodiments of the present disclosure, the systems, apparatuses, devices, units and the like described above may be implemented by computer program modules.
  • It should be noted that the computer-readable medium shown in the present disclosure may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two.
  • The computer-readable storage medium may be, for example but not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to, an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • In the present disclosure, a computer-readable storage medium may be any tangible medium that contains or stores a program which can be used by or in combination with an instruction execution system, apparatus or device.
  • In the present disclosure, a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • The computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device.
  • The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to wireless, wire, optical cable, RF and the like, or any suitable combination of the foregoing.
  • According to the embodiments of the present disclosure, the computer-readable medium may include the ROM 702 and/or the RAM 703 described above and/or one or more memories other than the ROM 702 and the RAM 703.
  • The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions and operations of the systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a device, a program segment or a portion of code, which contains one or more executable instructions for implementing the specified logical function.
  • It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved.
  • It should also be noted that each block of the block diagrams or flowcharts, and combinations of blocks in the block diagrams or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified function or operation, or by a combination of dedicated hardware and computer instructions.
  • As another aspect, the present disclosure further provides a computer-readable medium having stored thereon executable instructions which, when executed by a processor, cause the processor to implement the electronic device determination method of any one of the above embodiments.
  • The computer-readable medium may be included in the device described in the above embodiments, or it may exist separately without being assembled into the device.
  • The computer-readable medium carries one or more programs which, when executed by the device, cause the device to perform: recognizing, by the first electronic device, the first action performed by the operating body to obtain a recognition result; and determining, according to the recognition result, the second electronic device that can be controlled by the first electronic device.
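As a wrap-up, the sketch below strings the two operations together (obtain the recognition result, then determine and control the second electronic device). The gesture-to-instruction mapping, the stub recognizer and the stand-in callbacks are purely illustrative assumptions; they are not the disclosed implementation.

```python
from typing import Callable, Dict, Optional

# Illustrative mapping from recognized first actions to control instructions;
# the actual instruction set (power on/off, gear up/down, ...) is device-specific.
INSTRUCTIONS: Dict[str, str] = {
    "circle": "sleep",
    "cross": "volume_up",
    "swipe_up": "gear_up",
    "swipe_down": "gear_down",
}

def determine_and_control(recognize_first_action: Callable[[], Optional[str]],
                          lock_controlled_device: Callable[[], Optional[str]],
                          send_instruction: Callable[[str, str], None]) -> None:
    """S201: obtain the recognition result; S202: determine the second electronic
    device; then forward the corresponding instruction to it."""
    gesture = recognize_first_action()          # e.g. from the binocular camera
    if gesture is None or gesture not in INSTRUCTIONS:
        return
    device = lock_controlled_device()           # e.g. the angle-based selection above
    if device is not None:
        send_instruction(device, INSTRUCTIONS[gesture])

# Hypothetical usage with trivial stand-ins:
determine_and_control(lambda: "circle",
                      lambda: "tv",
                      lambda dev, cmd: print(f"{dev} <- {cmd}"))
```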

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for determining an electronic device, a system for determining an electronic device, a computer system and a computer-readable storage medium. The method includes: recognizing, by a first electronic device, a first action performed by an operating body to obtain a recognition result (S201); and determining, according to the recognition result, a second electronic device that can be controlled by the first electronic device (S202).

Description

电子设备确定方法、系统、计算机系统和可读存储介质
本申请要求于2018年4月13日提交的、申请号为201810331949.8的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及互联网技术领域,更具体地,涉及一种电子设备确定方法、系统、计算机系统和计算机可读存储介质。
背景技术
目前,在进行智能控制前,一般需要先确定哪些设备是被控制对象。
相关技术中的确定被控制设备的方式大多是使用者通过接触式方式(如触摸操作)向控制终端发出相应的确认信息,然后再由控制终端根据该确认信息确定出被控制设备。
而随着智能家居的发展,接触式方式已无法提供更好的用户体验。为了克服上述缺陷,另一些相关技术还提供了通过非接触式方式(如语音操作)实现遥控确定被控制设备的技术方案。
然而,在实现本公开构思的过程中,发明人发现相关技术中至少存在如下问题:现有的语音确认方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限。
发明内容
有鉴于此,本公开提供了一种通过第一电子设备根据识别操作体所做的第一动作的识别结果,确定可以通过第一电子设备进行控制的第二电子设备,以解决现有的语音操控方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限的缺陷的电子设备确定方法和系统。
本公开的一个方面提供了一种电子设备确定方法,上述电子设备确定方法包括:通过第一电子设备识别操作体所做的第一动作,获取识别结果;以及根据上述识别结果,确定可通过上述第一电子设备进行控制的第二电子设备。
根据本公开的实施例,确定可通过上述第一电子设备进行控制的第二电子设备包括:确定可通过上述第一电子设备进行控制的至少一个候选电子设备;确定坐标原点;确定上 述至少一个候选电子设备中各候选电子设备对应的以上述坐标原点为起点且以自身位置为终点的至少一个第一位置矢量;确定以上述坐标原点为起点且以上述操作体做出第二动作后所在的位置为终点的第二位置矢量;以及基于上述至少一个第一位置矢量和上述第二位置矢量,从上述至少一个候选电子设备中确定出上述第二电子设备。
根据本公开的实施例,从上述至少一个候选电子设备中确定出上述第二电子设备包括:确定上述至少一个第一位置矢量中各位置矢量与同一坐标轴形成的至少一个第一角度;确定上述第二位置矢量与上述同一坐标轴形成的第二角度;计算上述至少一个第一角度中每个角度与上述第二角度的角度差,得到对应的至少一个角度差;以及在上述至少一个角度差中存在小于等于预设角度的角度差的情况下,将上述至少一个候选电子设备中小于等于上述预设角度的角度差对应的电子设备确定为上述第二电子设备。
根据本公开的实施例,上述电子设备确定方法还包括:在将上述至少一个候选电子设备中小于等于上述预设角度的角度差对应的电子设备确定为上述第二电子设备之前,输出上述对应的电子设备的名称信息;以及在上述操作体针对上述名称信息作出确认操作的情况下,将上述对应的电子设备确定为上述第二电子设备。
根据本公开的实施例,在确定可通过上述第一电子设备进行控制的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备:获取该电子设备的图像;计算该电子设备的图像的第一积分图;获取至少一个预设积分图,其中,上述预设积分图是通过对预设模板图像进行计算得到的,上述预设模板图像是可通过上述第一电子设备进行控制的电子设备得到的图像;计算上述第一积分图与上述至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度;以及在上述至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为上述候选电子设备。
根据本公开的实施例,上述电子设备确定方法还包括:在将该电子设备确定为上述候选电子设备之前,计算与该电子设备对应的图像的第一直方图;获取至少一个预设直方图,其中,上述预设直方图是通过对预设模板图像进行计算得到的;计算上述第一直方图与上述至少一个预设直方图中每个预设直方图的第二相似度,得到至少一个第二相似度;以及在上述至少一个第一相似度中存在满足第一相似度阈值的相似度且在上述至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确定为上述候选电子设备。
根据本公开的实施例,上述电子设备确定方法还包括:在确定可通过所述第一电子设备进行控制的第二电子设备之后,所述第二电子设备接收用户语音信息,并响应所述用户 语音信息。
本公开的另一个方面提供了一种电子设备确定系统,上述电子设备确定系统包括:第一获取装置,用于通过第一电子设备识别操作体所做的第一动作,获取识别结果;以及第一确定装置,用于根据上述识别结果,确定可通过上述第一电子设备进行控制的第二电子设备。
根据本公开的实施例,第一确定装置包括:第一确定单元,用于确定可通过上述第一电子设备进行控制的至少一个候选电子设备;第二确定单元,用于确定坐标原点;第三确定单元,用于确定上述至少一个候选电子设备中各候选电子设备对应的以上述坐标原点为起点且以自身位置为终点的至少一个第一位置矢量;第四确定单元,用于确定以上述坐标原点为起点且以上述操作体做出第二动作后所在的位置为终点的第二位置矢量;以及第五确定单元,用于基于上述至少一个第一位置矢量和上述第二位置矢量,从上述至少一个候选电子设备中确定出上述第二电子设备。
根据本公开的实施例,第五确定单元包括:第一确定子单元,用于确定上述至少一个第一位置矢量中各位置矢量与同一坐标轴形成的至少一个第一角度;第二确定子单元,用于确定上述第二位置矢量与上述同一坐标轴形成的第二角度;计算子单元,用于计算上述至少一个第一角度中每个角度与上述第二角度的角度差,得到对应的至少一个角度差;以及第三确定子单元,用于在上述至少一个角度差中存在小于等于预设角度的角度差的情况下,将上述至少一个候选电子设备中小于等于上述预设角度的角度差对应的电子设备确定为上述第二电子设备。
根据本公开的实施例,上述电子设备确定系统还包括:输出装置,用于在将上述至少一个候选电子设备中小于等于上述预设角度的角度差对应的电子设备确定为上述第二电子设备之前,输出上述对应的电子设备的名称信息;以及第二确定装置,用于在上述操作体针对上述名称信息作出确认操作的情况下,将上述对应的电子设备确定为上述第二电子设备。
根据本公开的实施例,在上述第一确定单元确定可通过上述第一电子设备进行控制的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备:获取该电子设备的图像;计算该电子设备的图像的第一积分图;获取至少一个预设积分图,其中,上述预设积分图是通过对预设模板图像进行计算得到的,上述预设模板图像是可通过上述第一电子设备进行控制的电子设备得到的图像;计算上述第一积分图与上述至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度;以及在上述至少一个第一相似度 中存在满足第一相似度阈值的第一相似度的情况下,将该电子设备确定为上述候选电子设备。
根据本公开的实施例,上述电子设备确定系统还包括:第一计算装置,用于在将该电子设备确定为上述候选电子设备之前,计算与该电子设备对应的图像的第一直方图;第二获取装置,用于获取至少一个预设直方图,其中,上述预设直方图是通过对预设模板图像进行计算得到的;第二计算装置,用于计算上述第一直方图与上述至少一个预设直方图中每个预设直方图的第二相似度,得到至少一个第二相似度;以及第三确定装置,用于在上述至少一个第一相似度中存在满足第一相似度阈值的相似度且在上述至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确定为上述候选电子设备。
本公开的另一个方面提供了一种电子设备确定设备,包括采集装置、识别装置和处理装置,其中,所述采集装置用于操作体所做的第一动作,所述识别装置用于识别所述第一动作,获取识别结果,所述处理装置用于根据所述识别结果,确定可控制的第二电子设备。
根据本公开的实施例,上述电子设备确定设备还包括:信号发送装置,所述信号发送装置用于向所述可控制的第二电子设备发送信号。
本公开的另一个方面提供了一种电子设备确定系统,包括至少一个第一电子设备和至少一个第二电子设备,其中,所述至少一个第一电子设备用于识别操作体所做的第一动作,获取识别结果,并根据所述识别结果,确定可通过所述第一电子设备进行控制的第二电子设备。
本公开的另一个方面提供了一种计算机系统,包括:一个或多个处理器;计算机可读存储介质,用于存储一个或多个程序,其中,当上述一个或多个程序被上述一个或多个处理器执行时,使得上述一个或多个处理器实现上述实施例中任一项所述的电子设备确定方法。
本公开的另一个方面提供了一种计算机可读存储介质,其上存储有可执行指令,该指令被处理器执行时使上述处理器实现上述实施例中任一项所述的电子设备确定方法。
根据本公开的实施例,因为采用了通过第一电子设备根据识别操作体所做的第一动作的识别结果,确定可通过第一电子设备进行控制的第二电子设备的技术手段,可以至少部分地解决现有的语音操控方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限的技术问题,并因此可以实现以一种使用场景受限较小且更为自然的方式帮助用户确定第二电子设备的技术效果。
附图说明
通过以下参照附图对本公开实施例的描述,本公开的上述以及其他目的、特征和优点将更为清楚,在附图中:
图1示意性示出了根据本公开实施例的电子设备确定方法和系统的应用场景;
图2示意性示出了根据本公开实施例的电子设备确定方法的流程图;
图3A示意性示出了根据本公开实施例的确定第二电子设备的流程图;
图3B示意性示出了根据本公开另一实施例的确定第二电子设备的流程图;
图3C示意性示出了根据本公开另一实施例的电子设备确定方法的流程图;
图3D示意性示出了根据本公开实施例的确定候选电子设备的流程图;
图3E示意性示出了根据本公开另一实施例的电子设备确定方法的流程图;
图3F示意性示出了根据本公开实施例的第一电子设备为双目摄像图的示意图;
图3G示意性示出了根据本公开实施例的确定第二电子设备的示意图;
图4示意性示出了根据本公开实施例的电子设备确定系统的框图;
图5A示意性示出了根据本公开实施例的第一确定装置的框图;
图5B示意性示出了根据本公开实施例的第五确定单元的框图;
图5C示意性示出了根据本公开另一实施例的电子设备确定系统的框图;
图5D示意性示出了根据本公开另一实施例的电子设备确定系统的框图;以及
图6示意性示出了根据本公开实施例的适于实现电子设备确定方法的计算机系统的框图。
具体实施方式
以下,将参照附图来描述本公开的实施例。但是应该理解,这些描述只是示例性的,而并非要限制本公开的范围。此外,在以下说明中,省略了对公知结构和技术的描述,以避免不必要地混淆本公开的概念。
在此使用的术语仅仅是为了描述具体实施例,而并非意在限制本公开。在此使用的术语“包括”、“包含”等表明了所述特征、步骤、操作和/或部件的存在,但是并不排除存在或添加一个或多个其他特征、步骤、操作或部件。
在此使用的所有术语(包括技术和科学术语)具有本领域技术人员通常所理解的含义,除非另外定义。应注意,这里使用的术语应解释为具有与本说明书的上下文相一致的含义,而不应以理想化或过于刻板的方式来解释。
在使用类似于“A、B和C等中至少一个”这样的表述的情况下,一般来说应该按照本领域技术人员通常理解该表述的含义来予以解释(例如,“具有A、B和C中至少一个的系统”应包括但不限于单独具有A、单独具有B、单独具有C、具有A和B、具有A和C、具有B和C、和/或具有A、B、C的系统等)。在使用类似于“A、B或C等中至少一个”这样的表述的情况下,一般来说应该按照本领域技术人员通常理解该表述的含义来予以解释(例如,“具有A、B或C中至少一个的系统”应包括但不限于单独具有A、单独具有B、单独具有C、具有A和B、具有A和C、具有B和C、和/或具有A、B、C的系统等)。本领域技术人员还应理解,实质上任意表示两个或更多可选项目的转折连词和/或短语,无论是在说明书、权利要求书还是附图中,都应被理解为给出了包括这些项目之一、这些项目任一方、或两个项目的可能性。例如,短语“A或B”应当被理解为包括“A”或“B”、或“A和B”的可能性。
本公开的实施例提供了一种电子设备确定方法,该电子设备确定方法可以包括:通过第一电子设备识别操作体所做的第一动作,获取识别结果;以及根据识别结果,确定可通过第一电子设备进行控制的第二电子设备。
图1示意性示出了根据本公开实施例的电子设备确定方法和系统的应用场景。需要注意的是,图1所示仅为可以应用本公开实施例的应用场景的示例,以帮助本领域技术人员理解本公开的技术内容,但并不意味着本公开实施例不可以用于其他设备、系统、环境或场景。
在本公开实施例的应用场景中,如图1所示,假设第一电子设备为摄像头101,操作体为人手102,第二电子设备为电视103。在用户可以通过人手102做出第一动作后,摄像头101可以识别人手102所做出的第一动作,并获得识别结果,该摄像头101可以根据该识别结果对确定出可被摄像头101控制的电视103。进一步,摄像头101还可以对电视103进行控制,例如打开电视103、关闭电视103等。
换言之,在本公开实施例的应用场景中,可以实现通过第一电子设备基于识别操作体所做的第一动作的识别结果,确定出可被第一电子设备进行控制的第二电子设备,进而可以解决现有的语音操控方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限得缺陷。
图2示意性示出了根据本公开实施例的电子设备确定方法的流程图。
如图2所示,该电子设备确定方法可以包括操作S201和操作S202,其中:
在操作S201,通过第一电子设备识别操作体所做的第一动作,获取识别结果。
在操作S202,根据识别结果,确定可通过第一电子设备进行控制的第二电子设备。
本公开提供的电子设备确定方法可以应用于第一电子设备,该第一电子设备可以包括但不限于手机、台视电脑、笔记本电脑、平板电脑、摄像头等,该第一电子设备可以与第二电子设备进行通信,比如对第二电子设备进行控制。
根据本公开的实施例,该第一电子设备可以具有识别功能。具体地,该第一电子设备可以识别场景中的至少一个电子设备,进而从该至少一个电子设备中识别出可通过第一电子设备进行控制的至少一个候选电子设备,其中,该可通过第一电子设备进行控制的至少一个候选电子设备可以包括第二电子设备。
在本公开的实施例中,该第一电子设备还可以识别操作体所做的第一动作,进而得到识别结果。其中,该第一电子设备与该操作体非接触,且该第一动作可以是该操作体在操作体本身与该第一电子设备共同所在的空间内执行的。
根据本公开的实施例,当所述第一电子设备的识别结果表征某个电子设备为候选第二电子设备时,可以由所述第一电子设备给所述候选第二电子设备发送二次确认信息;响应于接收到该二次确认信息,所述候选第二电子设备可以对操作体进行提示,例如,以声光的形式对操作体进行提示,此时,所述第一电子设备可以再次接收操作体的动作(例如,在所述第一电子设备检测到该声光的形式的提示时采集操作体的动作,或者在所述第一电子设备发送二次确认信息起达到设定时间阈值时采集操作体的动作),如果操作体的动作表征“确认”信息,如√动作,则将所述候选第二电子设备作为所述第二电子设备。
需要说明的是,上述操作体可以包括用户的肢体、手、脚等,也可以包括由用户操控的物体。相应地,该操作体所做的第一动作可以为用户的肢体、手、脚等所做的动作,也可以为用户操控物体做出的动作。
下面以操作体为人手为例,详细说明第一电子设备识别人手所做的动作的方案。
根据本公开的实施例,在识别人手的动作之前,可以先识别手势,例如,可以利用基于肤色的手势分割方法识别手势。具体地,该基于肤色的手势分割方法是利用肤色与背景颜色的差异进行区别,然后利用手势独特的边缘特性(例如人手的轮廓)来进一步区分,以此来识别和确定手势。进一步,在确定手势后,第一电子设备(例如摄像头)可以使用Meanshift算法来跟踪人手的运动,进而通过人手的运动轨迹来判断操控指令。
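The preceding paragraph describes skin-colour-based hand segmentation followed by Meanshift tracking of the hand. The Python sketch below is one possible, hypothetical realisation with OpenCV (4.x API assumed); the YCrCb skin-colour bounds, the morphology kernel and the termination criteria are illustrative assumptions, not values taken from the disclosure.

```python
import cv2
import numpy as np

# Approximate YCrCb skin-colour bounds; illustrative values, not from the patent.
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)

def skin_mask(frame_bgr):
    """Skin-colour segmentation: separate hand pixels from the background."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def largest_contour_box(mask):
    """Hand region as the largest skin-coloured contour; returns an (x, y, w, h) box."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))

def track_hand(video_capture):
    """Track the segmented hand with Meanshift and yield its window per frame."""
    ok, frame = video_capture.read()
    if not ok:
        return
    window = largest_contour_box(skin_mask(frame))
    if window is None:
        return
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    while True:
        ok, frame = video_capture.read()
        if not ok:
            break
        _, window = cv2.meanShift(skin_mask(frame), window, criteria)
        yield window  # (x, y, w, h) of the tracked hand in this frame
```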
根据本公开的实施例,该第一电子设备识别该操作体所做的第一动作后,可以得到识别结果。其中,该识别结果可以包括该第一动作所对应的操控指令,且该识别结果可以是基于识别到的该第一动作的运动轨迹得到的,也可以是基于识别到的该第一动作的最终展 现姿态得到的,在此不做限定。进一步,第一电子设备可以获取该识别结果,并根据该识别结果确定出可被第一电子设备进行控制的第二电子设备。
根据本公开的实施例,在确定出第二电子设备之后,第一电子设备还可以根据识别结果对第二电子设备进行控制。例如,若第一动作为操作体在空中画出的圆形,则该第一动作可以用于使第一电子设备确定出第二电子设备并控制第二电子设备睡眠;若第一动作为操作体在空中画出的“X”形,则该第一动作可以用于使第一电子设备确定出第二电子设备并控制第二电子设备增加音量。
需要说明的是,在本公开的实施例中,第一电子设备可以称为主电子设备,第二电子设备可以称为从电子设备。其中,本公开实施例的第一电子设备可以控制电子设备(例如第二电子设备)执行操作,也可以自行执行操作;本公开实施例的第二电子设备可以被第一电子设备进行控制,也可以自行发出动作,例如定时播放音乐。
与本公开的实施例不同,目前一种相关技术是使用者通过接触式方式(如触摸操作)向控制终端发出相应的确认信息,然后再由控制终端根据该确认信息确定出被控制设备。然而,接触式方式已无法提供更好的用户体验。另一些相关技术还提供了通过非接触式方式(如语音操作)实现遥控确定被控制设备的技术方案。但是,现有的语音确认方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限。此外,另一些相关技术还提供了手势识别技术,但是目前的手势识别技术只适用于某种特定使用场景,例如体感游戏和VR设备控制等场景,且该技术局限于用户与单一被操作设备两者之间的交互,无法涉及第三类设备交互。
而通过本公开的实施例,通过使第一电子设备基于识别操作体所做的第一动作的识别结果,确定出可被第一电子设备进行控制的第二电子设备,可以以一种使用场景受限较小且更为自然的方式帮助用户确定出第二电子设备,还可以解决现有的语音操控方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限得缺陷。
需要说明的是,本公开的实施例还可以将动作操作模式和语音操作模式相结合,使得本公开可以适用于更多场景中。
在一个实施例中,以智能音箱为例进行说明,该智能音箱具有工作模式和休眠模式。当所述第一电子设备通过操作体的动作确认所述第二电子设备为智能音箱并唤醒该智能音箱时(智能音箱从休眠模式变为工作模式),操作体就可以与智能音箱进行语音交互,这样既能保证确认的第二电子设备的准确性,还不会影响第二电子设备的正常使用。
在另一个实施例中,可以在不便于通过语音等方式确定哪个电子设备为第二电子设备 时,通过操作体的动作确定第二电子设备。例如,室内具有两台可以进行语音交互的智能空调在工作时,操作体发出语音“把温度设定在25度”,则不便于确定哪台智能空调执行该操作,此时,两台智能空调可以通过声光的形式提示操作体进行确认,例如,操作体将手指向希望重新设置温度参数的智能空调,所述第一电子设备基于操作体的动作确定对应的智能空调重新设置温度参数,并发送信息给智能空调。
在另一个实施例中,仍然以智能音箱为例进行说明,当所述第一电子设备的识别结果表征智能音箱为候选第二电子设备时,可以由所述第一电子设备给所述候选第二电子设备发送二次确认信息;响应于接收到该二次确认信息,所述智能音箱可以对操作体发送声光提示,例如,发送语音信息“您是要唤醒叮咚吗”,此时,所述智能音箱可以进入待机模式,并接收操作体的二次确认信息,例如操作体发出语音信息“是的”,此时,智能音箱确定自身为所述第二电子设备,并可以发送信息给所述第一电子设备。这样可以将动作操作和语音操作等结合起来,提升用户体验。
下面参考图3A~图3G,结合具体实施例对图2所示的方法做进一步说明。
图3A示意性示出了根据本公开实施例的确定第二电子设备的流程图。
如图3A所示,确定可通过第一电子设备进行控制的被控电子设备可以包括操作S301~操作S305,其中:
在操作S301,确定可通过第一电子设备进行操控的至少一个候选电子设备。
在操作S302,确定坐标原点。
在操作S303,确定至少一个候选电子设备中各候选电子设备对应的以坐标原点为起点且以自身位置为终点的至少一个第一位置矢量。
在操作S304,确定以坐标原点为起点且以操作体做出第二动作后所在的位置为终点的第二位置矢量。
在操作S305,基于至少一个第一位置矢量和第二位置矢量,从至少一个候选电子设备中确定出第二电子设备。
在本公开的实施例中,在确定出第二电子设备之前,可以先确定出可通过第一电子设备进行操控的至少一个候选电子设备,进而可以从该至少一个候选电子设备中确定出第二电子设备。
需要说明的是,采集所述第一动作和所述第二动作的电子设备包括可以用于手势识别的各种传感器,包括但不限于:摄像头、激光雷达、接近传感器和红外传感器等,其中,摄像头可以包括单目、双目等。
根据本公开的实施例,第一电子设备可以基于识别到的至少一个候选电子设备和操作体,建立三维坐标系,并确定出该三维坐标系的原点。由于每个候选电子设备的自身位置与该坐标原点之间有一定距离,可以确定出以坐标原点为起点且以每个候选电子设备的自身位置为终点第一位置矢量,进而可以得到上述至少一个候选电子设备中各候选电子设备对应的至少一个第一位置矢量。同理,由于操作体做出第二动作后所在的位置也与该坐标原点之间有一定距离,可以确定出以该坐标原点为起点且以操作体做出第二动作后所在的位置为终点的第二位置矢量。进而,可以基于该至少一个第一位置矢量和该第二位置矢量,从上述至少一个候选电子设备中确定出第二电子设备。
需要说明的是,本公开实施例中的操作体所做的第二动作可以为用户的肢体、手、脚等所做的动作,也可以为用户操控物体做出的动作。例如,在操作体为人手时,该操作体所做的第二动作可以是人手指向被控电子设备的动作。此外,第二动作可以是与第一动作相同的动作,即第一动作和第二动作均可以是用于确定出第二电子设备;第二动作也可以是与第一动作不同的动作,比如第一动作可以是用于指示第一电子设备对第二电子设备进行控制,第二动作可以是用于指示第一电子设备确定出第二电子设备。
通过本公开的实施例,通过基于至少一个第一位置矢量和第二位置矢量,从至少一个候选电子设备中确定出第二电子设备,不仅可以提高确定出第二电子设备的准确性,还可以提高用户体验。
图3B示意性示出了根据本公开另一实施例的确定第二电子设备的流程图。
如图3B所示,从至少一个候选电子设备中确定出第二电子设备可以包括操作S401~操作S404,其中:
在操作S401,确定至少一个第一位置矢量中各位置矢量与同一坐标轴形成的至少一个第一角度。
在操作S402,确定第二位置矢量与同一坐标轴形成的第二角度。
在操作S403,计算至少一个第一角度中每个角度与第二角度的角度差,得到对应的至少一个角度差。
在操作S404,在至少一个角度差中存在小于等于预设角度的角度差的情况下,将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为第二电子设备。
在本公开的实施例中,可以根据确定出的至少一个第一位置矢量,确定各第一位置矢量与某一坐标轴形成的至少一个第一角度,该坐标轴例如可以是x轴、y轴、z轴等。还可以根据确定出的第二位置矢量,确定该第二位置矢量与某一坐标轴形成的第二角度。应 该理解,该第一角度和该第二角度分别是第一位置矢量和第二位置矢量与同一坐标轴形成的角度。
根据本公开的实施例,可以分别计算至少一个第一角度中每个角度与第二角度的角度差,进而可以得到至少一个角度差。将上述至少一个角度差中各个角度差分别与预设角度作比较,在至少一个角度差中存在小于等于该预设角度的角度差的情况下,可以确定出小于等于该预设角度的角度差对应的候选电子设备,并将该候选电子设备确定为第二电子设备。
通过本公开的实施例,通过将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为第二电子设备,可以进一步提高了确定第二电子设备的准确度。
图3C示意性示出了根据本公开另一实施例的电子设备确定方法的流程图。
如图3C所示,该电子设备确定方法还可以包括操作S501和操作S502,其中:
在操作S501,在将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为第二电子设备之前,输出对应的电子设备的名称信息。
在操作S502,在操作体针对名称信息作出确认操作的情况下,将对应的电子设备确定为第二电子设备。
在本公开的实施例中,为了进一步保证至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备是用户想要通过确定出的第二电子设备,则可以在将该对应的电子设备确定为第二电子设备之前,输出该对应的电子设备的名称信息,具体地,可以语音播报该名称信息,也可以通过显示屏显示该名称信息,还可以通过全息投影方式显示该名称信息。
根据本公开的实施例,用户可以根据第一电子设备输出的名称信息执行相应操作,若该名称信息是用户想要通过第一电子设备确定出的第二电子设备对应的名称信息,则用户可以向第一电子设备反馈针对该名称信息作出的确认操作,在第一电子设备接收到该确认操作后,可以将该对应的电子设备确定为第二电子设备。其中,该确认操作可以是操作体所做的动作,第一电子设备接收该确认操作可以是通过识别该操作体所做的动作。
通过本公开的实施例,在接收到操作体针对名称信息作出确认操作的情况下,将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为第二电子设备,可以进一步提高用户想要确定出的第二电子设备的准确性。
作为一种可选的实施例,在确定可通过第一电子设备进行控制的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备:获取该电子设备的图像;计算该电子设备 的图像的第一积分图;获取至少一个预设积分图,其中,预设积分图是通过对预设模板图像进行计算得到的,预设模板图像是可通过第一电子设备进行控制的电子设备得到的图像;计算第一积分图与至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度;以及在至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。
图3D示意性示出了根据本公开实施例的确定候选电子设备的流程图。
如图3D所示,在确定可通过第一电子设备进行控制的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备可以执行操作S601~操作S605,其中:
在操作S601,获取该电子设备的图像。
在操作S602,计算该电子设备的图像的第一积分图。
在操作S603,获取至少一个预设积分图,其中,预设积分图是通过对预设模板图像进行计算得到的,预设模板图像是可通过第一电子设备进行控制的电子设备得到的图像。
在操作S604,计算第一积分图与至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度。
在操作S605,在至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。
在本公开的实施例中,目标区域可以包括第一电子设备可以识别区域,例如在第一电子设备为摄像头时,该目标区域可以包括该摄像头可以扫描到的区域。
根据本公开的实施例,第一电子设备可以识别目标区域内的部分电子设备,也可以识别目标区域内的所有电子设备,其中,该目标区域内的电子设备可以包括可通过第一电子设备进行控制的电子设备,也可以包括不可通过该第一电子设备进行控制的电子设备。
根据本公开的实施例,针对上述每个电子设备,该第一电子设备可以获取该电子设备的图像,其中,该图像可以是第一电子设备通过扫描该电子设备得到的图像。计算该电子设备的图像的第一积分图,其中,该第一积分图可以包括第一方向积分图,也可以包括第一方向积分图和第一颜色积分图。
在本公开的实施例中,可以获取预先存储在主控电子设备中的至少一个预设积分图,其中,该预设积分图可以是通过对预设模板图像进行计算得到的,其中,该预设积分图可以包括预设方向积分图,也可以包括预设方向积分图和预设颜色积分图。上述预设模板图像可以预先存储在第一电子设备中,且该预设模板图像还可以是由可通过第一电子设备进行控制的电子设备得到的图像,例如,该预设模板图像可以是第一电子设备对可通过第一 电子设备进行控制的电子设备进行扫描得到的。
进一步,计算上述第一积分图与至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度。具体地,可以计算第一方向积分图与与至少一个预设方向积分图中每个预设方向积分图的第三相似度;也可以计算第一方向积分图与至少一个预设方向积分图中每个预设方向积分图的第三相似度,以及第一颜色积分图与至少一个预设颜色积分图中每个预设颜色积分图的第四相似度。
检测在至少一个第一相似度中是否存在满足第一相似度阈值的相似度,可以是检测在至少一个第三相似度中是否存在满足第三相似度阈值的相似度;也可以是检测在至少一个第三相似度中是否存在满足第三相似度阈值的相似度,以及在至少一个第四相似度中是否存在满足第四相似度阈值的相似度。
在至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。换言之,可以是在至少一个第三相似度中存在满足第三相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备;也可以是在至少一个第三相似度中存在满足第三相似度阈值的相似度,且在至少一个第四相似度中存在满足第四相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。
通过本公开的实施例,在至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备,进而可以得到至少一个候选电子设备,以便第一电子设备可以从该至少一个候选电子设备中确定出用户想要确定的第二电子设备。
图3E示意性示出了根据本公开另一实施例的电子设备确定方法的流程图。
如图3E所示,该电子设备确定方法还可以包括操作S701~操作S704,其中:
在操作S701,在将该电子设备确定为候选电子设备之前,计算与该电子设备对应的图像的第一直方图。
在操作S702,获取至少一个预设直方图,其中,预设直方图是通过对预设模板图像进行计算得到的
在操作S703,计算第一直方图与至少一个预设直方图中每个预设直方图的第二相似度,得到至少一个第二相似度。
在操作S704,在至少一个第一相似度中存在满足第一相似度阈值的相似度且在至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。
在本公开的实施例中,为了进一步保证确定出的候选电子设备是可通过第一电子设备 进行控制的电子设备的准确性,在将该电子设备确定为候选电子设备之前,可以进一步计算与该电子设备对应的图像的第一直方图。其中,该该第一直方图可以包括第一方向直方图,也可以包括第一方向直方图和第一颜色直方图。
根据本公开的实施例,可以获取预先存储在第一电子设备中的至少一个预设直方图,其中,该预设直方图可以是通过对预设模板图像进行计算得到的,其中,该预设直方图可以包括预设方向直方图,也可以包括预设方向直方图和预设颜色直方图。
进一步,计算上述第一直方图与至少一个预设直方图中每个预设直方图的第一相似度,可以得到至少一个第一相似度。具体地,可以计算第一方向直方图与至少一个预设方向直方图中每个预设方向直方图的第五相似度;也可以计算第一方向直方图与至少一个预设方向直方图中每个预设方向直方图的第五相似度,以及第一颜色直方图与至少一个预设颜色直方图中每个预设颜色直方图的第六相似度。
检测在至少一个第二相似度中是否存在满足第二相似度阈值的相似度,可以是检测在至少一个第五相似度中是否存在满足第五相似度阈值的相似度;也可以是检测在至少一个第五相似度中是否存在满足第五相似度阈值的相似度,以及在至少一个第六相似度中是否存在满足第六相似度阈值的相似度。
进一步,在至少一个第一相似度中存在满足第一相似度阈值的相似度且在至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。换言之,可以是在至少一个第三相似度中存在满足第三相似度阈值的相似度,且至少一个第五相似度中存在满足第五相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备;也可以是在至少一个第三相似度中存在满足第三相似度阈值的相似度,且在至少一个第四相似度中存在满足第四相似度阈值的相似度,且在至少一个第五相似度中存在满足第五相似度阈值的相似度,且在至少一个第六相似度中存在满足第六相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备。
下面以第一电子设备为摄像头为例,举例说明本公开的实施例。
假设第一电子设备为摄像头,该摄像头中存储了所有可被摄像头控制的电子设备的不同角度的图像(又称为预设模板图像)信息。此外,摄像头可以扫描目标区域内的电子设备,且通过摄像头获取的该目标区域内的电子设备的图像可以称为预处理图像。其中,摄像头可以通过似物性推荐算法识别目标区域内的电子设备,且在短时间内确定一组候选框集合。在此基础上采用统计匹配方法将预处理图像的第一积分图和预设模板图像的预设积分图进行相似度比较,实现物体识别功能。具体步骤如下:
a.计算预处理图像的第一方向积分图和第一颜色积分图与预设模板图像的预设方向积分图和预设颜色积分图进行比较;
b.通过似物性推荐算法,得到相似图像的候选框集合,并计算每个候选框的第一方向直方图和第一颜色直方图,其中,该候选框可以是上述实施例中至少一个相似度中满足第一相似度阈值的相似度对应的电子设备的图像;
c.计算各候选框中的第一直方图与预设模板图像的预设直方图的第二相似度,若第二相似度达到第二相似度阈值即判断匹配成功。
应该理解,基于上述实施例,可以识别到拍摄画面中的所有可被第一电子设备进行控制的电子设备。
通过本公开的实施例,在第一相似度满足第一相似度阈值且在第二相似度也满足第二相似度阈值的情况下,将该电子设备确定为候选电子设备中的一个,可以提高确定出的每个候选电子设备的正确率。
图3F示意性示出了根据本公开实施例的第一电子设备为双目摄像头的示意图。
如图3F所示,假设第一电子设备可以包括摄像头,该摄像头可以包括双目摄像头104,该双目摄像头104可以通过识别画面中至少一个候选电子设备和操作体所做的第一动作(例如用户的手势),以及操作体所做的第一动作的深度信息,实现一种全新的交互方式来帮助用户控制场景中的其他设备,例如第二电子设备。
图3G示意性示出了根据本公开实施例的确定第二电子设备的示意图。
需要说明的是,在操作体与至少一个候选电子设备处于第一电子设备(例如摄像头)所扫描到的画面(又称为目标区域)中时,该第一电子设备可通过识别操作体的特定动作(例如特定手势)启动第一电子设备的动作控制功能(例如手势控制功能)。该第一电子设备可以识别操作体的动作及画面中可被第一电子设备操控的至少一个候选电子设备,且该动作的最终展现姿态指向某一电子设备时,第一电子设备反馈该电子设备的名称信息,如该名称信息符合用户期望,则将该电子设备确定为第二电子设备,且用户可以通过操作体的第一动作使第一电子设备控制该第二电子设备,例如手势滑动控制。
在本公开的实施例中,如图3G所示,假设第一电子设备为双目摄像头104,该双目摄像头104可以获取场景105中的物体(包括至少一个被控设备)和用户106的深度信息,以此来建立三维坐标系。在双目摄像头104检测到用户106启动设备操控的手势时,假定用户106的脖子与用户106的肩膀位置是重合的,则可以以该用户106的脖子位置作为原点建立三维坐标系。用户106的正前方为x轴正方向,靠近双目摄像头104的方向为y轴 正方向,用于记录深度信息,上方为z轴正方向。
三维坐标建好后双目摄像头104锁定用户106的手势所指的被控电子设备步骤如下:
a.根据三维坐标系计算场景105中至少一个候选电子设备中每台候选电子设备的位置坐标,以记录位置矢量,得到至少一个第一位置矢量;
b.双目摄像头104检测到用户106的启动手势时,开始跟踪用户106的人手的位置,之后人手发生位移并停留在某一位置时双目摄像头104记录此时人手的第二位置矢量,记作α;
c.比较至少一个第一位置矢量中的每个位置矢量与第二位置矢量α,例如可以确定至少一个第一位置矢量中的每个位置矢量与坐标轴的第一角度,得到至少一个第一角度,确定第二位置矢量与该坐标轴的第二角度,进而将至少一个第一角度中的每个角度与第二角度进行比较;
d.如果至少一个第一角度中存在与第二角度接近或重合的角度,此时可以通过双目摄像头104通过语音提示该与第二角度接近或重合的角度对应的电子设备的名称信息;
e.在用户106针对所述名称信息做出确认手势后,可以将至少一个候选电子设备中与第二角度接近或重合的角度对应的电子设备确认为被控电子设备,且该双目摄像头104可以根据识别用户106所做的第一动作的识别结果操控被控电子设备,其中,该识别结果可以用于标识操控指令,该操控指令可以包括开机、关机、档位增加、档位减小等。
本公开实施例提供的双目摄像头设备,提供了一种全新的智能家居控制方式,其可以利用图像识别和手势识别技术,自动识别场景中的其它设备,保存相关识别数据后,再识别用户的手势动作,可以实现利用手势控制家庭中其它智能设备的目的。
此外,本公开的实施例补充了目前语音操控对距离和背景噪音要求较高的缺点,且以一种场景受限较小且更为自然的方式帮助用户操作家庭中的不同设备。
图4示意性示出了根据本公开实施例的电子设备确定系统的框图。
如图4所示,该电子设备确定系统可以包括第一获取装置410和第一确定装置420,其中:
第一获取装置410用于通过第一电子设备识别操作体所做的第一动作,获取识别结果。
第一确定装置420用于根据识别结果,确定可通过第一电子设备进行控制的第二电子设备。
通过本公开的实施例,通过使主控电子设备基于识别操作体所做的第一动作的识别结果,对被控电子设备进行控制,可以以一种使用场景受限较小且更为自然的方式帮助用户 操控被控电子设备,还可以解决现有的语音操控方式不仅存在语义理解不准确的缺陷,而且对距离和环境噪音要求较高,导致使用场景受限的缺陷。
图5A示意性示出了根据本公开实施例的第一确定装置的框图。
如图5A所示,该第一确定装置420可以包括第一确定单元421、第二确定单元422、第三确定单元423、第四确定单元424和第五确定单元425,其中:
第一确定单元421用于确定可通过第一电子设备进行控制的至少一个候选电子设备。
第二确定单元422用于确定坐标原点。
第三确定单元423用于确定至少一个候选电子设备中各候选电子设备对应的以坐标原点为起点且以自身位置为终点的至少一个第一位置矢量。
第四确定单元424用于确定以坐标原点为起点且以操作体做出第二动作后所在的位置为终点的第二位置矢量。
第五确定单元425用于基于至少一个第一位置矢量和第二位置矢量,从至少一个候选电子设备中确定出第二电子设备。
通过本公开的实施例,通过基于至少一个第一位置矢量和第二位置矢量,从至少一个候选电子设备中确定出被控电子设备,进而使对该被控电子设备执行控制操作,不仅可以提高控制的准确性,还可以提高用户体验。
图5B示意性示出了根据本公开实施例的第五确定单元的框图。
如图5B所示,该第五确定单元425可以包括第一确定子单元4251、第二确定子单元4252、计算子单元4253和第三确定子单元4254,其中:
第一确定子单元4251用于确定至少一个第一位置矢量中各位置矢量与同一坐标轴形成的至少一个第一角度。
第二确定子单元4252用于确定第二位置矢量与同一坐标轴形成的第二角度。
计算子单元4253用于计算至少一个第一角度中每个角度与第二角度的角度差,得到对应的至少一个角度差。
第三确定子单元4254用于在至少一个角度差中存在小于等于预设角度的角度差的情况下,将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为第二电子设备。
通过本公开的实施例,通过将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为被控电子设备,可以进一步提高了确定被控电子设备的准确度。
图5C示意性示出了根据本公开另一实施例的电子设备确定系统的框图。
如图5C所示,该电子设备确定系统400还可以包括输出装置510和第二确定装置520,其中:
输出装置510用于在将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为第二电子设备之前,输出对应的电子设备的名称信息。
第二确定装置520用于在操作体针对名称信息作出确认操作的情况下,将对应的电子设备确定为第二电子设备。
通过本公开的实施例,在接收到操作体针对名称信息作出确认操作的情况下,将至少一个候选电子设备中小于等于预设角度的角度差对应的电子设备确定为被控电子设备,可以更加准确的确定出用户想要通过主控电子设备操控的被控电子设备。
作为一种可选的实施例,在第一确定单元确定可通过第一电子设备进行操控的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备:获取该电子设备的图像;计算该电子设备的图像的第一积分图;获取至少一个预设积分图,其中,预设积分图是通过对预设模板图像进行计算得到的,预设模板图像是可通过第一电子设备进行操控的电子设备得到的图像;计算第一积分图与至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度;以及在至少一个第一相似度中存在满足第一相似度阈值的第一相似度的情况下,将该电子设备确定为候选电子设备。
通过本公开的实施例,在至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为候选电子设备,进而可以得到至少一个候选电子设备,以便主控电子设备可以从该至少一个候选电子设备中确定出用户想要操控的被控电子设备。
图5D示意性示出了根据本公开另一实施例的电子设备确定系统的框图。
如图5D所示,该电子设备确定系统400还可以包括第一计算装置610、第二获取装置620、第二计算装置630和第三确定装置640,其中:
第一计算装置610用于在将该电子设备确定为候选电子设备之前,计算与该电子设备对应的图像的第一直方图。
第二获取装置620用于获取至少一个预设直方图,其中,预设直方图是通过对预设模板图像进行计算得到的。
第二计算装置630用于计算第一直方图与至少一个预设直方图中每个预设直方图的第二相似度,得到至少一个第二相似度。
第三确定装置640用于在至少一个第一相似度中存在满足第一相似度阈值的相似度且在至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确 定为候选电子设备。
通过本公开的实施例,在第一相似度满足第一相似度阈值且在第二相似度也满足第二相似度阈值的情况下,将该电子设备确定为候选电子设备中的一个,可以提高确定出的每个候选电子设备的正确率。
可以理解的是,第一获取装置410、第一确定装置420、输出装置510、第二确定装置520、第一计算装置610、第二获取装置620、第二计算装置630、第三确定装置640、第一确定单元421、第二确定单元422、第三确定单元423、第四确定单元424、第五确定单元425、第一确定子单元4251、第二确定子单元4252、计算子单元4253以及第三确定子单元4254可以合并在一个装置/单元/子单元中实现,或者其中的任意一个装置/单元/子单元可以被拆分成多个装置/单元/子单元。或者,这些装置/单元/子单元中的一个或多个装置/单元/子单元的至少部分功能可以与其他装置/单元/子单元的至少部分功能相结合,并在一个装置/单元/子单元中实现。根据本发明的实施例,第一获取装置410、第一确定装置420、输出装置510、第二确定装置520、第一计算装置610、第二获取装置620、第二计算装置630、第三确定装置640、第一确定单元421、第二确定单元422、第三确定单元423、第四确定单元424、第五确定单元425、第一确定子单元4251、第二确定子单元4252、计算子单元4253以及第三确定子单元4254中的至少一个可以至少被部分地实现为硬件电路,例如现场可编程门阵列(FPGA)、可编程逻辑阵列(PLA)、片上系统、基板上的系统、封装上的系统、专用集成电路(ASIC),或可以以对电路进行集成或封装的任何其他的合理方式等硬件或固件来实现,或以软件、硬件以及固件三种实现方式的适当组合来实现。或者,第一获取装置410、第一确定装置420、输出装置510、第二确定装置520、第一计算装置610、第二获取装置620、第二计算装置630、第三确定装置640、第一确定单元421、第二确定单元422、第三确定单元423、第四确定单元424、第五确定单元425、第一确定子单元4251、第二确定子单元4252、计算子单元4253以及第三确定子单元4254中的至少一个可以至少被部分地实现为计算机程序装置/单元/子单元,当该程序被计算机运行时,可以执行相应装置/单元/子单元的功能。
本公开的另一个方面提供了一种电子设备确定设备,包括采集装置、识别装置和处理装置,其中,所述采集装置用于操作体所做的第一动作,所述识别装置用于识别所述第一动作,获取识别结果,所述处理装置用于根据所述识别结果,确定可控制的第二电子设备。
根据本公开的实施例,上述电子设备确定设备还包括:信号发送装置,所述信号发送装置用于向所述可控制的第二电子设备发送信号。
本公开的另一个方面提供了一种电子设备确定系统,包括至少一个第一电子设备和至少一个第二电子设备,其中,所述至少一个第一电子设备用于识别操作体所做的第一动作,获取识别结果,并根据所述识别结果,确定可通过所述第一电子设备进行控制的第二电子设备。
本公开的另一个方面提供了一种计算机系统,包括:一个或多个处理器;计算机可读存储介质,用于存储一个或多个程序,其中,当上述一个或多个程序被上述一个或多个处理器执行时,使得上述一个或多个处理器实现上述实施例中任一项所述的电子设备控制方法。
图6示意性示出了根据本公开实施例的适于实现电子设备确定方法的计算机系统的框图。图6示出的计算机系统仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图6所示,根据本公开实施例的计算机系统700包括处理器701,其可以根据存储在只读存储器(ROM)702中的程序或者从存储部分708加载到随机访问存储器(RAM)703中的程序而执行各种适当的动作和处理。处理器701例如可以包括通用微处理器(例如CPU)、指令集处理器和/或相关芯片组和/或专用微处理器(例如,专用集成电路(ASIC)),等等。处理器701还可以包括用于缓存用途的板载存储器。处理器701可以包括用于执行根据本公开实施例的方法流程的不同动作的单一处理单元或者是多个处理单元。
在RAM 703中,存储有计算机系统700操作所需的各种程序和数据。处理器701、ROM 702以及RAM 703通过总线704彼此相连。处理器701通过执行ROM 702和/或RAM703中的程序来执行以上各种操作。需要注意,所述程序也可以存储在除ROM 702和RAM703以外的一个或多个存储器中。处理器701也可以通过执行存储在所述一个或多个存储器中的程序来执行以上各种操作。
根据本公开的实施例,计算机系统700还可以包括输入/输出(I/O)接口705,输入/输出(I/O)接口705也连接至总线704。计算机系统700还可以包括连接至I/O接口705的以下部件中的一项或多项:包括键盘、鼠标等的输入部分706;包括诸如阴极射线管(CRT)、液晶显示器(LCD)等以及扬声器等的输出部分707;包括硬盘等的存储部分708;以及包括诸如LAN卡、调制解调器等的网络接口卡的通信部分709。通信部分709经由诸如因特网的网络执行通信处理。驱动器710也根据需要连接至I/O接口705。可拆卸介质711,诸如磁盘、光盘、磁光盘、半导体存储器等等,根据需要安装在驱动器710上,以便于从其上读出的计算机程序根据需要被安装入存储部分708。
根据本公开的实施例,上文参考流程图描述的方法可以被实现为计算机软件程序。例如,本公开的实施例包括一种计算机程序产品,其包括承载在计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信部分709从网络上被下载和安装,和/或从可拆卸介质711被安装。在该计算机程序被处理器701执行时,执行本公开实施例的系统中限定的上述功能。根据本公开的实施例,上文描述的系统、设备、装置、单元等可以通过计算机程序模块来实现。
需要说明的是,本公开所示的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本公开中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本公开中,计算机可读的信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:无线、电线、光缆、RF等等,或者上述的任意合适的组合。根据本公开的实施例,计算机可读介质可以包括上文描述的ROM 702和/或RAM 703和/或ROM 702和RAM 703以外的一个或多个存储器。
附图中的流程图和框图,图示了按照本公开各种实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个装置、程序段、或代码的一部分,上述装置、程序段、或代码的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个接连地表示的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图或流程图中的每个方框、以及框图或流程图中的方框的组合,可以用 执行规定的功能或操作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
作为另一方面,本公开还提供了一种计算机可读介质,其上存储有可执行指令,该指令被处理器执行时使上述处理器实现上述实施例中任一项所述的电子设备确定方法,该计算机可读介质可以是上述实施例中描述的设备中所包含的;也可以是单独存在,而未装配入该设备中。上述计算机可读介质承载有一个或者多个程序,当上述一个或者多个程序被一个该设备执行时,使得该设备执行:通过第一电子设备识别操作体所做的第一动作,获取识别结果;以及根据识别结果,确定可通过第一电子设备进行控制的第二电子设备
以上对本公开的实施例进行了描述。但是,这些实施例仅仅是为了说明的目的,而并非为了限制本公开的范围。尽管在以上分别描述了各实施例,但是这并不意味着各个实施例中的措施不能有利地结合使用。本公开的范围由所附权利要求及其等同物限定。不脱离本公开的范围,本领域技术人员可以做出多种替代和修改,这些替代和修改都应落在本公开的范围之内。

Claims (18)

  1. 一种电子设备确定方法,包括:
    通过第一电子设备识别操作体所做的第一动作,获取识别结果;以及
    根据所述识别结果,确定可通过所述第一电子设备进行控制的第二电子设备。
  2. 根据权利要求1所述的方法,其中,确定可通过所述第一电子设备进行控制的第二电子设备包括:
    确定可通过所述第一电子设备进行控制的至少一个候选电子设备;
    确定坐标原点;
    确定所述至少一个候选电子设备中各候选电子设备对应的以所述坐标原点为起点且以自身位置为终点的至少一个第一位置矢量;
    确定以所述坐标原点为起点且以所述操作体做出第二动作后所在的位置为终点的第二位置矢量;以及
    基于所述至少一个第一位置矢量和所述第二位置矢量,从所述至少一个候选电子设备中确定出所述第二电子设备。
  3. 根据权利要求2所述的方法,其中,从所述至少一个候选电子设备中确定出所述第二电子设备包括:
    确定所述至少一个第一位置矢量中各位置矢量与同一坐标轴形成的至少一个第一角度;
    确定所述第二位置矢量与所述同一坐标轴形成的第二角度;
    计算所述至少一个第一角度中每个角度与所述第二角度的角度差,得到对应的至少一个角度差;以及
    在所述至少一个角度差中存在小于等于预设角度的角度差的情况下,将所述至少一个候选电子设备中小于等于所述预设角度的角度差对应的电子设备确定为所述第二电子设备。
  4. 根据权利要求3所述的方法,其中,所述方法还包括:
    在将所述至少一个候选电子设备中小于等于所述预设角度的角度差对应的电子设备确定为所述第二电子设备之前,输出所述对应的电子设备的名称信息;以及
    在所述操作体针对所述名称信息作出确认操作的情况下,将所述对应的电子设备确定为所述第二电子设备。
  5. 根据权利要求2所述的方法,其中,在确定可通过所述第一电子设备进行控制的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备:
    获取该电子设备的图像;
    计算该电子设备的图像的第一积分图;
    获取至少一个预设积分图,其中,所述预设积分图是通过对预设模板图像进行计算得到的,所述预设模板图像是可通过所述第一电子设备进行控制的电子设备得到的图像;
    计算所述第一积分图与所述至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度;以及
    在所述至少一个第一相似度中存在满足第一相似度阈值的相似度的情况下,将该电子设备确定为所述候选电子设备。
  6. 根据权利要求5所述的方法,其中,所述方法还包括:
    在将该电子设备确定为所述候选电子设备之前,计算与该电子设备对应的图像的第一直方图;
    获取至少一个预设直方图,其中,所述预设直方图是通过对预设模板图像进行计算得到的;
    计算所述第一直方图与所述至少一个预设直方图中每个预设直方图的第二相似度,得到至少一个第二相似度;以及
    在所述至少一个第一相似度中存在满足第一相似度阈值的相似度且在所述至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确定为所述候选电子设备。
  7. 根据权利要求1所述的方法,还包括:
    在确定可通过所述第一电子设备进行控制的第二电子设备之后,所述第二电子设备接收用户语音信息,并响应所述用户语音信息。
  8. 一种电子设备确定系统,包括:
    第一获取装置,用于通过第一电子设备识别操作体所做的第一动作,获取识别结果;以及
    第一确定装置,用于根据所述识别结果,确定可通过所述第一电子设备进行控制的第二电子设备。
  9. 根据权利要求8所述的系统,其中,第一确定装置包括:
    第一确定单元,用于确定可通过所述第一电子设备进行控制的至少一个候选电子设备;
    第二确定单元,用于确定坐标原点;
    第三确定单元,用于确定所述至少一个候选电子设备中各候选电子设备对应的以所述坐标原点为起点且以自身位置为终点的至少一个第一位置矢量;
    第四确定单元,用于确定以所述坐标原点为起点且以所述操作体做出第二动作后所在的位置为终点的第二位置矢量;以及
    第五确定单元,用于基于所述至少一个第一位置矢量和所述第二位置矢量,从所述至少一个候选电子设备中确定出所述第二电子设备。
  10. 根据权利要求9所述的系统,其中,第五确定单元包括:
    第一确定子单元,用于确定所述至少一个第一位置矢量中各位置矢量与同一坐标轴形成的至少一个第一角度;
    第二确定子单元,用于确定所述第二位置矢量与所述同一坐标轴形成的第二角度;
    计算子单元,用于计算所述至少一个第一角度中每个角度与所述第二角度的角度差,得到对应的至少一个角度差;以及
    第三确定子单元,用于在所述至少一个角度差中存在小于等于预设角度的角度差的情况下,将所述至少一个候选电子设备中小于等于所述预设角度的角度差对应的电子设备确定为所述第二电子设备。
  11. 根据权利要求10所述的系统,其中,所述系统还包括:
    输出装置,用于在将所述至少一个候选电子设备中小于等于所述预设角度的角度差对应的电子设备确定为所述第二电子设备之前,输出所述对应的电子设备的名称信息;以及
    第二确定装置,用于在所述操作体针对所述名称信息作出确认操作的情况下,将所述对应的电子设备确定为所述第二电子设备。
  12. 根据权利要求9所述的系统,其中,在所述第一确定单元确定可通过所述第一电子设备进行控制的至少一个候选电子设备的过程中,针对目标区域内的每个电子设备:
    获取该电子设备的图像;
    计算该电子设备的图像的第一积分图;
    获取至少一个预设积分图,其中,所述预设积分图是通过对预设模板图像进行计算得到的,所述预设模板图像是可通过所述第一电子设备进行控制的电子设备得到的图像;
    计算所述第一积分图与所述至少一个预设积分图中每个预设积分图的第一相似度,得到至少一个第一相似度;以及
    在所述至少一个第一相似度中存在满足第一相似度阈值的第一相似度的情况下,将该 电子设备确定为所述候选电子设备。
  13. 根据权利要求12所述的系统,其中,所述系统还包括:
    第一计算装置,用于在将该电子设备确定为所述候选电子设备之前,计算与该电子设备对应的图像的第一直方图;
    第二获取装置,用于获取至少一个预设直方图,其中,所述预设直方图是通过对预设模板图像进行计算得到的;
    第二计算装置,用于计算所述第一直方图与所述至少一个预设直方图中每个预设直方图的第二相似度,得到至少一个第二相似度;以及
    第三确定装置,用于在所述至少一个第一相似度中存在满足第一相似度阈值的相似度且在所述至少一个第二相似度也存在满足第二相似度阈值的相似度的情况下,将该电子设备确定为所述候选电子设备。
  14. 一种电子设备确定设备,包括:
    采集装置,用于操作体所做的第一动作;
    识别装置,用于识别所述第一动作,获取识别结果;以及
    处理装置,用于根据所述识别结果,确定可控制的第二电子设备。
  15. 根据权利要求14所述的设备,还包括:
    信号发送装置,用于向所述可控制的第二电子设备发送信号。
  16. 一种电子设备确定系统,包括:
    至少一个第一电子设备;以及
    至少一个第二电子设备;
    其中,所述至少一个第一电子设备用于识别操作体所做的第一动作,获取识别结果,并根据所述识别结果,确定可通过所述第一电子设备进行控制的第二电子设备。
  17. 一种计算机系统,包括:
    一个或多个处理器;
    计算机可读存储介质,用于存储一个或多个程序,
    其中,当所述一个或多个程序被所述一个或多个处理器执行时,使得所述一个或多个处理器实现权利要求1至7中任一项所述的电子设备确定方法。
  18. 一种计算机可读存储介质,其上存储有可执行指令,该指令被处理器执行时使所述处理器实现权利要求1至7中任一项所述的电子设备确定方法。
PCT/CN2019/082567 2018-04-13 2019-04-12 电子设备确定方法、系统、计算机系统和可读存储介质 WO2019196947A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/042,018 US11481036B2 (en) 2018-04-13 2019-04-12 Method, system for determining electronic device, computer system and readable storage medium
JP2020551794A JP7280888B2 (ja) 2018-04-13 2019-04-12 電子機器確定方法、システム、コンピュータシステムおよび読取り可能な記憶媒体
EP19785227.0A EP3779645A4 (en) 2018-04-13 2019-04-12 Electronic device determining method and system, computer system, and readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810331949.8 2018-04-13
CN201810331949.8A CN110377145B (zh) 2018-04-13 2018-04-13 电子设备确定方法、系统、计算机系统和可读存储介质

Publications (1)

Publication Number Publication Date
WO2019196947A1 true WO2019196947A1 (zh) 2019-10-17

Family

ID=68163196

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/082567 WO2019196947A1 (zh) 2018-04-13 2019-04-12 电子设备确定方法、系统、计算机系统和可读存储介质

Country Status (5)

Country Link
US (1) US11481036B2 (zh)
EP (1) EP3779645A4 (zh)
JP (1) JP7280888B2 (zh)
CN (1) CN110377145B (zh)
WO (1) WO2019196947A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021111359A (ja) * 2020-01-07 2021-08-02 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド 音声ウェイクアップ方法及び装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103152467A (zh) * 2011-12-07 2013-06-12 智易科技股份有限公司 手持式电子装置及遥控方法
CN103616965A (zh) * 2013-11-22 2014-03-05 深圳Tcl新技术有限公司 基于空间定位设备的菜单控制方法
US20140184528A1 (en) * 2013-01-02 2014-07-03 Elan Microelectronics Corporation Method for identifying gesture
CN104007865A (zh) * 2013-02-27 2014-08-27 联想(北京)有限公司 识别方法和电子设备

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000231427A (ja) 1999-02-08 2000-08-22 Nec Corp マルチモーダル情報解析装置
JP4035610B2 (ja) 2002-12-18 2008-01-23 独立行政法人産業技術総合研究所 インタフェース装置
JP4516042B2 (ja) * 2006-03-27 2010-08-04 株式会社東芝 機器操作装置および機器操作方法
WO2009045861A1 (en) * 2007-10-05 2009-04-09 Sensory, Incorporated Systems and methods of performing speech recognition using gestures
TWI423144B (zh) * 2009-11-10 2014-01-11 Inst Information Industry Combined with the audio and video behavior identification system, identification methods and computer program products
US8400548B2 (en) * 2010-01-05 2013-03-19 Apple Inc. Synchronized, interactive augmented reality displays for multifunction devices
US8296151B2 (en) * 2010-06-18 2012-10-23 Microsoft Corporation Compound gesture-speech commands
JP5730086B2 (ja) * 2011-03-18 2015-06-03 Necパーソナルコンピュータ株式会社 入力装置および入力方法
US20120304067A1 (en) * 2011-05-25 2012-11-29 Samsung Electronics Co., Ltd. Apparatus and method for controlling user interface using sound recognition
RU2455676C2 (ru) * 2011-07-04 2012-07-10 Общество с ограниченной ответственностью "ТРИДИВИ" Способ управления устройством с помощью жестов и 3d-сенсор для его осуществления
JP5909455B2 (ja) * 2013-04-01 2016-04-26 富士通フロンテック株式会社 二段バーコード読取装置および二段バーコード読取方法
KR20130088104A (ko) * 2013-04-09 2013-08-07 삼성전자주식회사 비접촉 방식의 인터페이스를 제공하기 위한 휴대 장치 및 방법
KR102188090B1 (ko) * 2013-12-11 2020-12-04 엘지전자 주식회사 스마트 가전제품, 그 작동방법 및 스마트 가전제품을 이용한 음성인식 시스템
CN104866083B (zh) * 2014-02-25 2020-03-17 中兴通讯股份有限公司 手势识别方法、装置和系统
US9785247B1 (en) * 2014-05-14 2017-10-10 Leap Motion, Inc. Systems and methods of tracking moving hands and recognizing gestural interactions
KR101603553B1 (ko) * 2014-12-15 2016-03-15 현대자동차주식회사 차량에서 웨어러블 기기를 이용한 제스쳐 인식 방법 및 이를 수행하는 차량
WO2016200197A1 (ko) * 2015-06-10 2016-12-15 (주)브이터치 사용자 기준 공간좌표계 상에서의 제스처 검출 방법 및 장치
CN105425954B (zh) * 2015-11-04 2018-09-18 哈尔滨工业大学深圳研究生院 应用于智能家居中的人机交互方法及系统
EP3451335B1 (en) * 2016-04-29 2023-06-14 Vtouch Co., Ltd. Optimum control method based on multi-mode command of operation-voice, and electronic device to which same is applied
CN107203756B (zh) * 2016-06-06 2020-08-28 亮风台(上海)信息科技有限公司 一种识别手势的方法与设备
CN109416583B (zh) * 2016-07-05 2022-06-03 索尼公司 信息处理装置、信息处理方法及程序
KR20180098079A (ko) * 2017-02-24 2018-09-03 삼성전자주식회사 비전 기반의 사물 인식 장치 및 그 제어 방법
CN107484072A (zh) * 2017-04-06 2017-12-15 宝沃汽车(中国)有限公司 汽车娱乐系统的控制方法、装置及车辆

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103152467A (zh) * 2011-12-07 2013-06-12 智易科技股份有限公司 手持式电子装置及遥控方法
US20140184528A1 (en) * 2013-01-02 2014-07-03 Elan Microelectronics Corporation Method for identifying gesture
CN104007865A (zh) * 2013-02-27 2014-08-27 联想(北京)有限公司 识别方法和电子设备
CN103616965A (zh) * 2013-11-22 2014-03-05 深圳Tcl新技术有限公司 基于空间定位设备的菜单控制方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3779645A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021111359A (ja) * 2020-01-07 2021-08-02 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド 音声ウェイクアップ方法及び装置
JP7239544B2 (ja) 2020-01-07 2023-03-14 バイドゥ オンライン ネットワーク テクノロジー(ペキン) カンパニー リミテッド 音声ウェイクアップ方法及び装置

Also Published As

Publication number Publication date
US20210165494A1 (en) 2021-06-03
JP7280888B2 (ja) 2023-05-24
CN110377145B (zh) 2021-03-30
EP3779645A1 (en) 2021-02-17
US11481036B2 (en) 2022-10-25
EP3779645A4 (en) 2021-12-29
CN110377145A (zh) 2019-10-25
JP2021517314A (ja) 2021-07-15

Similar Documents

Publication Publication Date Title
WO2018000200A1 (zh) 对电子设备进行控制的终端及其处理方法
US11198508B2 (en) Electronic device moved based on distance from external object and control method thereof
US10068373B2 (en) Electronic device for providing map information
US8525876B2 (en) Real-time embedded vision-based human hand detection
US20140062862A1 (en) Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium
EP2708981B1 (en) Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium
US9690475B2 (en) Information processing apparatus, information processing method, and program
US20140300542A1 (en) Portable device and method for providing non-contact interface
US20190339856A1 (en) Electronic device and touch gesture control method thereof
TW201805744A (zh) 控制系統、控制處理方法及裝置
US10990748B2 (en) Electronic device and operation method for providing cover of note in electronic device
KR102665643B1 (ko) 아바타 표시를 제어하기 위한 방법 및 그 전자 장치
WO2021052008A1 (zh) 一种投屏方法及系统
CN108881544B (zh) 一种拍照的方法及移动终端
CN110619656B (zh) 基于双目摄像头的人脸检测跟踪方法、装置及电子设备
US20200226356A1 (en) Electronic device and controlling method thereof
CN111163906A (zh) 能够移动的电子设备及其操作方法
CN113289327A (zh) 移动终端的显示控制方法及装置、存储介质及电子设备
CN104007815A (zh) 电子设备和操作电子设备的方法
US11886643B2 (en) Information processing apparatus and information processing method
WO2018076720A1 (zh) 单手操控方法及操控系统
WO2019196947A1 (zh) 电子设备确定方法、系统、计算机系统和可读存储介质
CN112818733B (zh) 信息处理方法、装置、存储介质及终端
US20120206348A1 (en) Display device and method of controlling the same
US9041646B2 (en) Information processing system, information processing system control method, information processing apparatus, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19785227

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020551794

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2019785227

Country of ref document: EP