CN112180748A - Target device control method, target device control apparatus, and control device - Google Patents


Info

Publication number
CN112180748A
Authority
CN
China
Prior art keywords
target
instruction
control
determining
target device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011044329.XA
Other languages
Chinese (zh)
Inventor
陈士勇 (Chen Shiyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202011044329.XA priority Critical patent/CN112180748A/en
Publication of CN112180748A publication Critical patent/CN112180748A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The present disclosure relates to a target device control method, a target device control apparatus, a control device, and a non-transitory computer-readable storage medium. The target device control method is applied to the control device and includes: receiving a control instruction; determining, in response to the control instruction, the target device and generating a target instruction; sending the target instruction to the target device; determining the execution condition of the target instruction by the target device; and sending corresponding feedback information based on the execution condition. After sending the instruction to the target device, the control device continues to confirm how the target device executed it and feeds the result back to the user, so that the user learns in time whether the target device executed the required instruction, can respond accordingly, and enjoys greater convenience and a better experience.

Description

Target device control method, target device control apparatus, and control device
Technical Field
The present disclosure relates to the field of smart homes, and in particular to a target device control method, a target device control apparatus, a control device, and a non-transitory computer-readable storage medium.
Background
Smart devices have developed rapidly in recent years and can be interconnected through WiFi networking, Bluetooth networking, and the like. In the smart-home field, one control device can control any other target device in the same home environment: the user can operate the control device directly or send it an instruction remotely, and the control device then controls the associated target device. However, after issuing a command, the user cannot tell whether the target device ultimately executed the relevant instruction, and so cannot discover a control failure in time.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a target device control method, a target device control apparatus, a control device, and a non-transitory computer-readable storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a target device control method applied to a control device, the method including: receiving a control instruction; responding to the control instruction, determining target equipment corresponding to the control instruction, and generating a target instruction; sending the target instruction to the target device; determining the execution condition of the target instruction by the target equipment; and sending corresponding feedback information based on the execution condition.
In an embodiment, receiving the control instruction includes receiving a control instruction sent by a first device, and sending corresponding feedback information based on the execution condition includes: if the target device does not execute the target instruction, sending failure information to the first device.
In an embodiment, the control device further comprises an infrared emitter, and sending the target instruction to the target device includes: sending the target instruction to the target device through an infrared signal emitted by the infrared emitter.
In an embodiment, the control device comprises a camera, and determining the execution condition of the target instruction by the target device includes: acquiring an image of the target device through the camera; and determining, based on the image, the execution condition of the target instruction by the target device.
In an embodiment, the control device further comprises a microphone, and determining the execution condition of the target instruction by the target device further includes: acquiring the sound of the target device through the microphone; and determining, based on the sound, the execution condition of the target instruction by the target device.
In an embodiment, the control device further comprises a steering mechanism, and the method further comprises: directing the infrared emitter and the camera toward the target device through the steering mechanism.
In an embodiment, the method further comprises: adjusting the shooting angle of the camera through the steering mechanism and acquiring a complete image of the field of view through the camera; and determining, based on the complete image, orientation information corresponding to one or more controllable devices in the field of view. Directing the infrared emitter and the camera toward the target device through the steering mechanism includes: adjusting the steering mechanism based on the orientation information corresponding to the target device.
In an embodiment, the method further comprises: determining device information corresponding to the one or more controllable devices. Determining the target device and generating a target instruction in response to the control instruction includes: generating the target instruction corresponding to the target device based on the device information corresponding to the target device.
According to a second aspect of the embodiments of the present disclosure, there is provided a target device control apparatus applied to a control device, the apparatus including: a receiving unit for receiving a control instruction; an instruction generating unit for determining, in response to the control instruction, a target device corresponding to the control instruction and generating a target instruction; an instruction sending unit for sending the target instruction to the target device; a processing unit for determining the execution condition of the target instruction by the target device; and a feedback unit for sending corresponding feedback information based on the execution condition.
In an embodiment, when the receiving unit receives a control instruction sent by a first device, the feedback unit is further configured to: when the target device does not execute the target instruction, send failure information to the first device.
In an embodiment, the control device further comprises an infrared emitter, and the instruction sending unit is further configured to: send the target instruction to the target device through an infrared signal emitted by the infrared emitter.
In an embodiment, the control device comprises a camera, and the processing unit further comprises: an image acquisition subunit for acquiring an image of the target device through the camera; and a determining subunit for determining, based on the image, the execution condition of the target instruction by the target device.
In an embodiment, the control device further comprises a microphone, and the processing unit further comprises: a sound acquisition subunit for acquiring the sound of the target device through the microphone; the determining subunit is further configured to determine, based on the sound, the execution condition of the target instruction by the target device.
In an embodiment, the control device further comprises a steering mechanism, and the processing unit further comprises: a steering subunit for directing the infrared emitter and the camera toward the target device through the steering mechanism.
In an embodiment, the steering subunit is further configured to adjust the shooting angle of the camera through the steering mechanism; the image acquisition subunit is further configured to acquire a complete image of the field of view through the camera; the determining subunit is further configured to determine, based on the complete image, orientation information corresponding to one or more controllable devices in the field of view; and the steering subunit is further configured to adjust the steering mechanism based on the orientation information corresponding to the target device.
In an embodiment, the determining subunit is further configured to determine device information corresponding to the one or more controllable devices, and the instruction sending unit is further configured to generate the target instruction corresponding to the target device based on the device information corresponding to the target device.
According to a third aspect of the embodiments of the present disclosure, there is provided a control device that controls the target device by the target device control method according to the first aspect.
In one embodiment, the control device includes: an infrared emitter for transmitting the target instruction to the target device.
In an embodiment, the control device further comprises: a camera for acquiring an image of the target device.
In an embodiment, the control device further comprises: a microphone for acquiring the sound of the target device.
In an embodiment, the control device further comprises: a steering mechanism for adjusting the directions of the infrared emitter and the camera.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium; when instructions in the storage medium are executed by a processor, the target device control method according to the first aspect is implemented.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: after the control device sends the instruction to the target device, it continues to confirm how the target device executed the instruction and feeds the result back to the user, so that the user learns in time whether the target device executed the required instruction and can respond accordingly, which brings convenience and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a target device control method according to an exemplary embodiment.
Fig. 2 is a flow chart illustrating another target device control method according to an example embodiment.
FIG. 3 is a schematic diagram illustrating a control device according to an exemplary embodiment.
Fig. 4 is a schematic block diagram illustrating a target device control apparatus according to an exemplary embodiment.
Fig. 5 is a schematic block diagram illustrating another target device control apparatus according to an example embodiment.
FIG. 6 is a block diagram illustrating an apparatus in accordance with an example embodiment.
FIG. 7 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Currently, in some related technologies, smart-home devices such as smart speakers can receive a user's instruction by voice or remotely and control other target devices accordingly, for example turning a television on or off, switching channels, and adjusting volume, or turning an air conditioner on or off and adjusting its air volume. However, after sending an instruction to the control device, the user cannot tell whether the target device actually performed the corresponding operation. For example, a user far from home may remotely instruct a smart speaker to turn on the air conditioner so that the room is cooled in advance. After the smart speaker sends the power-on instruction, it cannot tell whether the air conditioner turned on, and the user cannot learn in time whether the instruction was completed. The air conditioner may fail to execute the instruction, for instance because it never received it or because it is powered off and cannot execute it. In such cases the user only discovers whether the air conditioner is on upon entering the room, the intended advance adjustment of the room temperature does not happen, and the user experience is seriously harmed.
To solve these problems, the present disclosure provides a target device control method 10 that lets a user know in time whether an instruction was executed by the target device. Fig. 1 is a flowchart illustrating the target device control method 10 according to an exemplary embodiment. The method 10 may be applied to a control device capable of receiving and sending instructions, such as a smart speaker, and includes steps S11-S15, described in detail below.
In step S11, a control instruction is received.
In the embodiments of the present disclosure, the control device may receive the control instruction in various ways. In some embodiments, the user sends the control instruction through an interface on the control device, such as a key or a touch screen. In other embodiments, the control device has a microphone, as some smart speakers do, and the user sends the instruction by voice. In still other embodiments, the control device is interconnected with the user's terminal device through a network, and the user controls it remotely and sends the control instruction through an application program on the terminal device. The ways of receiving the control instruction are not limited to these forms. The control instruction causes the control device to send a corresponding instruction to the target device; for example, if the user wants to turn on a television, the user sends a power-on control instruction to the control device, which then sends the corresponding instruction to the television to turn it on.
In step S12, in response to the control instruction, the control device determines the target device to which the control instruction corresponds and generates a target instruction.
After receiving a control instruction sent by the user directly or through a remote device, the control device responds to it, determines the target device it refers to, and generates a corresponding target instruction. The control instruction generally carries information identifying the target device; for example, when the user wants to turn on the air conditioner, the control instruction includes information about the air conditioner, so the control device can determine that the target device is the air conditioner. The control instruction also carries the operation the target device is to perform, and the control device generates the target instruction for that device accordingly. The target instruction is sent to the target device, must be receivable by it, and must cause it to execute the corresponding function. That is, the target instruction must match the way the target device receives instructions; for example, if the target device receives instructions over Bluetooth, the control device and the target device must be paired in advance and the target instruction sent over the Bluetooth protocol. The target instruction must also be readable by the target device, conform to its instruction format, and be executable by it. For example, after the user asks to turn on the air conditioner and sends the corresponding control instruction to the control device, the control device generates a power-on target instruction specific to that air conditioner.
How the target instruction is generated can be determined from the device type, brand, model, and similar attributes of the target device. Information about the corresponding devices can be created in the control device in advance, so that a target instruction that both controls the target device and can be received by it is generated quickly and conveniently.
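The lookup described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the registry contents, command names, and code values are all hypothetical placeholders standing in for pre-created device information.

```python
# Hypothetical registry of pre-created device information (step S12).
DEVICE_REGISTRY = {
    "living_room_ac": {"type": "air_conditioner", "brand": "BrandA",
                       "model": "X1", "transport": "infrared"},
    "living_room_tv": {"type": "tv", "brand": "BrandB",
                       "model": "T9", "transport": "bluetooth"},
}

# Hypothetical instruction tables keyed by (type, brand, model).
INSTRUCTION_TABLES = {
    ("air_conditioner", "BrandA", "X1"): {"power_on": 0xA1, "power_off": 0xA0},
    ("tv", "BrandB", "T9"): {"power_on": 0xB1, "channel_up": 0xB2},
}

def generate_target_instruction(control_instruction: dict) -> dict:
    """Resolve the target device and build a device-readable target instruction."""
    device_id = control_instruction["device"]
    info = DEVICE_REGISTRY[device_id]
    key = (info["type"], info["brand"], info["model"])
    code = INSTRUCTION_TABLES[key][control_instruction["action"]]
    # The target instruction must match both the device's transport
    # (e.g. infrared vs. Bluetooth) and its instruction format.
    return {"device": device_id, "transport": info["transport"], "code": code}
```

For example, `generate_target_instruction({"device": "living_room_ac", "action": "power_on"})` would yield an infrared power-on instruction for the registered air conditioner.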
In step S13, the control apparatus transmits a target instruction to the target apparatus.
After the target instruction corresponding to the target device is generated, it may be sent in a manner suited to the target device, for example via WiFi, Bluetooth, or infrared, so that the target device can receive it.
In step S14, the control apparatus determines the execution of the target instruction by the target apparatus.
In the embodiment of the present disclosure, the control device does not end its task after issuing the target instruction; it continues to acquire the state of the target device to determine whether the issued target instruction was executed. Because some target devices provide no active feedback, neither the control device nor the user can directly confirm that the target instruction was received or executed. In this embodiment, the control device actively checks the state of the target device and determines how the target instruction was executed, so that feedback can be given and the user learns in time whether the request was carried out.
In step S15, the control device issues corresponding feedback information based on the execution condition.
After determining the execution condition of the target device, the control device sends corresponding feedback, for example a reminder delivered by sound, light, or an on-screen display. If the control device confirms that the target device completed the target instruction, an LED on the control device may show green; if not, the LED may show red. The execution condition may also be fed back through different sounds. In some embodiments, feedback is given only when the target device fails to complete the target instruction; if the instruction is completed, the control device simply ends the task.
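Steps S11 through S15 can be wired together as one pass of a control loop. The sketch below is illustrative, not the patent's implementation: the five callables are hypothetical stand-ins for the receiving, resolving, sending, verifying, and feedback behavior, and the LED colours mirror the example above.

```python
def control_loop(receive, resolve, send, verify, feedback):
    """Run one pass of the method: S11 receive, S12 resolve and generate,
    S13 send, S14 verify execution, S15 feed back the result."""
    control_instruction = receive()                             # S11
    target, target_instruction = resolve(control_instruction)   # S12
    send(target, target_instruction)                            # S13
    executed = verify(target)                                   # S14
    # S15: e.g. green LED for success, red LED for failure (illustrative).
    feedback("green" if executed else "red")
    return executed
```

In use, each callable would be backed by real hardware or network I/O; here they can be exercised with simple stubs.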
With the target device control method of the embodiments of the present disclosure, after controlling other devices through the control device, the user learns the instruction execution result in time. The method suits many device types: even when the target device itself provides no feedback, the control device determines the execution condition and reports it to the user, meeting the user's needs and improving the user experience.
In some embodiments, the control instruction received in step S11 is sent by a first device, and step S15 includes: if the target device does not execute the target instruction, sending failure information to the first device. Here the user sends the control instruction to the control device remotely through the first device, which may be a terminal device such as the user's mobile phone or tablet computer, via a network server, WiFi, or Bluetooth; the user therefore need not be in the same space as the control device or the target device. After the control device learns the execution condition of the target device, it feeds the result back to the first device, so that the user obtains the instruction execution result in the same remote manner.
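The remote-feedback variant can be sketched as below. This is an illustrative sketch under stated assumptions: the message format and the `send_to_first_device` callable are hypothetical, chosen only to show that failure information is sent back to the first device exactly when the target instruction was not executed.

```python
def feed_back_to_first_device(executed: bool, send_to_first_device) -> bool:
    """S15 remote variant: notify the first device only on failure.

    Returns True if a failure message was sent. `send_to_first_device`
    is a hypothetical callable standing in for the network path back
    to the user's phone or tablet.
    """
    if not executed:
        send_to_first_device({
            "status": "failure",
            "reason": "target device did not execute the target instruction",
        })
        return True
    return False
```

A success could also be reported over the same path; the sketch follows the embodiment in which the control device simply ends the task when the instruction completed.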
In some embodiments, the control device may further include an infrared emitter, and step S13 may include: sending the target instruction to the target device through an infrared signal emitted by the infrared emitter. More and more smart appliances can now be interconnected through WiFi, Bluetooth, and the like, but many devices, such as air conditioners and televisions, are still controlled by infrared. Because infrared transmission is one-way, such a device cannot feed back an execution result, and it cannot otherwise be known whether the device executed the command. In this embodiment the control device sends the target instruction by infrared in step S13, adapting to the many devices that require infrared control. For devices that cannot feed back a result, the control device determines the execution condition by the method of the embodiments of the present disclosure, and the user learns through the control device whether the target device completed the task, such as powering on or off, without any active feedback from the target device.
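To illustrate the one-way nature of infrared control, the sketch below encodes an 8-bit command code as alternating mark/space pulse durations. This is not a faithful implementation of any real remote-control protocol (the leader, timings, and framing are hypothetical, loosely inspired by NEC-style mark/space encoding); it only shows that the emitter produces a pulse train and receives nothing back.

```python
def encode_ir_pulses(code: int) -> list:
    """Encode an 8-bit code as alternating mark/space durations in microseconds.

    Illustrative only: timings and framing are hypothetical.
    """
    pulses = [9000, 4500]      # leader burst and gap (illustrative timing)
    for bit in range(8):       # least-significant bit first
        pulses.append(560)     # fixed-length mark
        # Long space encodes a 1 bit, short space encodes a 0 bit.
        pulses.append(1690 if (code >> bit) & 1 else 560)
    pulses.append(560)         # trailing mark
    return pulses
```

The emitter would drive an IR LED with this train; because nothing comes back on this channel, steps S14 and S15 are what tell the user whether the command took effect.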
In some embodiments, the control device may further include a camera, and step S14 may include: acquiring an image of the target device through the camera, and determining the execution condition of the target instruction by the target device based on the image. The image may be a single photograph, several photographs taken in succession, or a video, and image acquisition may start either before or after the target instruction is sent, so that the judgment can rest on the change between earlier and later images. After executing the target instruction, the target device typically changes in appearance: a television screen changes from black to a picture after power-on; switching channels changes the picture, possibly with a black frame in between; a volume-adjustment window appears when the volume is changed; a power indicator lights and the air outlet opens after an air conditioner starts. By acquiring the image of the target device, the control device can determine through image recognition whether the corresponding change occurred. In one example, after the image is acquired, it is recognized by a local image recognition model, such as a convolutional neural network, and whether the target instruction was executed is judged from a recognition result that corresponds to the instruction; for example, the images corresponding to power-on and power-off differ.
In other embodiments, after acquiring the image, the control device uploads it to a cloud server, for example over WiFi, and recognition is performed by the server's image recognition model. Because the server has far more storage and computing capacity, it can run more complex models and heavier computations, yielding a more accurate result; cloud recognition is especially more reliable when the before and after images for a target instruction differ only slightly. The cloud server then sends the recognition result to the control device or feeds it back directly to the user's terminal device.
In some embodiments, the control device may further include a microphone, and step S14 may further include: acquiring the sound of the target device through the microphone, and determining the execution condition of the target instruction by the target device based on the sound. The acquired sound may be a continuously captured segment, starting before or after the target instruction is sent. After receiving the target instruction, the target device may produce a characteristic sound or sound change during or after execution: a prompt tone when powering on or off, the volume change after adjusting television volume, or the change in airflow noise after adjusting an air conditioner's air volume. Sound recognition then determines whether the target device executed the target instruction. The sound-based judgment may be used alone or combined with the image recognition of the preceding embodiments; joint judgment by image and sound gives a more accurate result. For example, for a target instruction that adjusts television volume, the image detects whether a volume-adjustment window appears on the screen while the sound detects the volume change. Recognition may be performed by a local model or a cloud-server model. The sound and image recognition models may be independent, with conclusions drawn from their respective results, or a single model may take both image and sound information (for example, obtained by analog-to-digital conversion) as input.
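A minimal stand-in for the sound-based check compares loudness (root-mean-square amplitude) before and after the instruction. This is an illustrative sketch, not the patent's recognition model; the change ratio is a hypothetical threshold, and a real embodiment would use sound recognition as described above.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a list of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def loudness_changed(before, after, ratio=1.5):
    """Heuristic: did loudness change by at least `ratio` times either way?

    `before` and `after` are sample lists captured around the instruction.
    The 1.5x ratio is illustrative, not a calibrated value.
    """
    b, a = rms(before), rms(after)
    if b == 0 or a == 0:
        return b != a          # sound appeared from, or fell to, silence
    return a / b >= ratio or b / a >= ratio
```

For example, airflow noise appearing after an air conditioner starts, or television volume rising after an adjustment, would register as a loudness change.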
In the foregoing embodiments, the model may be a pre-trained image recognition model, sound recognition model, or joint image-sound recognition model, trained in advance on images and sounds corresponding to different instructions, with the training data distinguishing product type, brand, model, and so on.
In some embodiments, the control device may further include a steering mechanism, and as shown in fig. 2 the target device control method 10 may further include step S16: directing the infrared emitter and the camera toward the target device through the steering mechanism. Because the emission angle of an infrared signal is limited, the infrared emitter should be aligned as closely as possible with the infrared receiver of the target device so that the signal is received reliably. Similarly, although a wide-angle camera could capture a larger field of view, aiming the camera at the target device places it in the central area of the image, which raises image quality, makes the target device clearer, and helps guarantee the accuracy of subsequent image recognition. In this embodiment the control device has a steering mechanism, such as a pan-tilt head or another mechanism capable of changing angle. The infrared emitter and the camera are mounted directly or indirectly on the steering mechanism so that their angle can be adjusted; their relative positions may be fixed, with a consistent orientation. After determining the target device, the control device aligns the infrared emitter and the camera toward it through the steering mechanism; the camera can then start capturing images, and the infrared emitter may send the target instruction before, after, or at the same time as image capture begins.
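The aiming step can be reduced to a small calculation: given where the target device appears in the camera frame, how far should the mechanism pan to centre it? The sketch below is illustrative, assumes a pinhole-like camera with a known horizontal field of view, and uses a linear small-angle approximation; the parameter values in the example are hypothetical.

```python
def pan_correction(target_x: float, image_width: int, hfov_deg: float) -> float:
    """Degrees to pan so the target lands at image centre.

    `target_x` is the target's horizontal pixel position, `hfov_deg`
    the camera's horizontal field of view. Positive result means pan
    toward the right edge of the frame. Linear approximation only.
    """
    offset = target_x - image_width / 2
    return offset / image_width * hfov_deg
```

With a 1920-pixel-wide frame and a 60-degree field of view, a target a quarter-frame right of centre calls for a 15-degree pan.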
As described above, some devices that must be controlled by infrared can also be controlled through a smart home hub (the control device). Because the infrared emission and reception mode is directional, alignment is needed to ensure reception. In some embodiments of the present disclosure, the control device therefore has a steering mechanism that can adjust the directions of the infrared emitter and the camera, meeting the requirements of various types of devices. In addition, in this example, the control device may also include a microphone. Given how sound propagates, the microphone may be fixedly disposed on the control device, or it may be mounted directly or indirectly on the steering mechanism so that its direction is adjusted along with the mechanism and it faces the target device, making the received sound clearer and improving the accuracy of the determination.
In some embodiments, the target device control method 10 may further include: adjusting the shooting angle of the camera through the steering mechanism and acquiring a complete image of the field of view through the camera; and determining, based on the complete image, orientation information corresponding to one or more controllable devices in the field of view. Step S16 may then include: adjusting the steering mechanism based on the orientation information corresponding to the target device. This embodiment can be applied when registering controllable devices with the control device: the steering mechanism rotates the camera to capture an image of the full field of view, that is, an image covering the camera's maximum shooting range, and any device appearing in that complete image can later have its instruction execution judged by image recognition. Once the complete image of the field of view has been acquired, the orientation of each device in the image can be determined; when the user needs to control any of these devices, the steering mechanism can be adjusted based on the predetermined orientation, aiming the infrared emitter and the camera at the target device. By determining orientations in advance, the device can be adjusted quickly and conveniently when control is needed, without real-time operations such as wide-range image scanning and device identification to locate the target device.
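A minimal sketch of the pre-scan described above, assuming recognition results arrive as (pan angle, detected device ids) pairs; the `OrientationMap` class and its interface are invented for illustration, with the image-recognition step abstracted away.

```python
class OrientationMap:
    """Records, during a one-time camera sweep, the pan angle at which each
    controllable device was recognised, for reuse when that device must
    later be controlled."""

    def __init__(self):
        self._bearings = {}

    def scan(self, sweep):
        """sweep: iterable of (pan_angle, detected_device_ids), e.g. the
        per-angle output of an image-recognition pass over the full view."""
        for pan, device_ids in sweep:
            for dev in device_ids:
                self._bearings[dev] = pan

    def bearing_for(self, device_id):
        """Pan angle at which to aim the emitter and camera for a
        previously seen device, or None if it was never detected."""
        return self._bearings.get(device_id)
```

With the map built once, controlling a device reduces to a dictionary lookup followed by a steering-mechanism adjustment, avoiding a fresh scan on every instruction.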
In some embodiments, the target device control method 10 may further include: determining device information corresponding to one or more controllable devices. Step S12 may then include: generating the target instruction corresponding to the target device based on the device information corresponding to the target device. Different devices may use different instruction formats. The target device control method 10 according to embodiments of the present disclosure may therefore determine the device information of the controllable devices in advance, for example by using the complete image of the full field of view to determine each controllable device's type (such as a television or an air conditioner), brand, and model, the required instruction transport such as Bluetooth or infrared, and the format for generating the target instruction, such as the infrared code sequence of an infrared signal. Thus, after the control device receives a control instruction, it generates, based on the device information of the target device, a corresponding target instruction that fulfills the user's request.
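The device-information-to-instruction step might be sketched as follows, with a made-up per-(type, brand) infrared code table standing in for real remote-control code libraries. The codes, field names, and helper are illustrative assumptions only.

```python
# Invented code table: maps (device type, brand) to named actions and
# the raw infrared payload that triggers them. Real systems would load
# this from a vendor code database.
IR_CODES = {("tv", "brandA"): {"power_on": 0x20DF10EF},
            ("ac", "brandB"): {"power_on": 0xB24D7B84}}

def build_target_instruction(device, action):
    """device: dict with 'type', 'brand', and 'transport' keys, as might
    be stored after the field-of-view pre-scan. Returns a target
    instruction combining the transport with the device-specific code."""
    key = (device["type"], device["brand"])
    code = IR_CODES.get(key, {}).get(action)
    if code is None:
        raise ValueError(f"no {action!r} code known for {key}")
    return {"transport": device["transport"], "payload": code}
```

This mirrors the paragraph above: the same user-level request ("power on") yields different payloads depending on the stored device information.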
Based on the same concept, embodiments of the present disclosure also provide a control device 100 and a target device control apparatus 400.
It is to be understood that the control device 100 and the target device control apparatus 400 provided in the embodiments of the present disclosure include hardware structures and/or software modules corresponding to the respective functions for implementing the functions described above. The disclosed embodiments can be implemented in hardware or in a combination of hardware and computer software, in conjunction with the exemplary elements and algorithm steps disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Fig. 3 is a schematic diagram of a control device 100 according to an exemplary embodiment. The control device 100 can control a target device 200 through the target device control method 10 of any of the foregoing embodiments, and can determine the execution condition of the target device 200 so as to give timely feedback to the user. The control device 100 may receive an operation instruction directly from the user, or may receive a control instruction remotely, via WiFi or Bluetooth, from the user's terminal device 300. The control device 100 may further include a speaker, an LED lamp, a display screen, and the like, for emitting feedback information by means of sound, light, a displayed pattern, and so on.
In an embodiment, the control device 100 may include an infrared transmitter 110 for transmitting the target instruction to the target device 200 in an infrared manner. This addresses the problem that, for devices which receive instructions via infrared, the one-way nature of infrared transmission prevents the execution result from being fed back.
In an embodiment, the control device 100 may further include a camera 120 for acquiring an image of the target device 200. By recognizing the acquired image, whether the target device 200 has executed the target instruction can thus be determined from the visual information of the target device 200.
In an embodiment, the control device 100 may further include a microphone 130 for acquiring the sound of the target device 200. By recognizing the acquired sound, whether the target device 200 has executed the target instruction can thus be determined from the auditory information of the target device 200.
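As one hedged illustration of this auditory check, a device that makes noise when running (such as an air-conditioner fan) could be judged by comparing the sound energy captured before and after sending the instruction. The RMS comparison, the energy-ratio threshold, and the sample format below are assumptions, not the disclosed recognition method.

```python
import math

def rms(samples):
    """Root-mean-square energy of a list of audio samples in [-1, 1]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def executed_power_on(before, after, ratio=2.0):
    """Heuristic: treat a power-on instruction as executed if the sound
    energy after the instruction rose by at least `ratio` compared with
    the capture taken before it. The small floor avoids division issues
    when the room was silent."""
    return rms(after) >= ratio * max(rms(before), 1e-9)
```

A quiet room followed by steady fan noise would pass this check, while unchanged background noise would not.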
In an embodiment, the control device 100 may further include a steering mechanism 140 for adjusting the orientations of the infrared emitter 110 and the camera 120. Through a steering mechanism such as a pan-tilt, the infrared emitter 110 and the camera 120 can be adjusted so that infrared emission is aimed at the target device 200, and both the quality of image acquisition and the visual range of the acquired images are improved, meeting the need to control devices over a wider range.
With regard to the control apparatus 100 in the above-described embodiment, the specific manner in which the respective modules perform operations has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 4 is a block diagram illustrating a target device control apparatus 400 according to an example embodiment. Referring to fig. 4, the target device control apparatus 400 is applied to a control device and includes: a receiving unit 410 configured to receive a control instruction; an instruction generating unit 420 configured to determine, in response to the control instruction, the target device corresponding to the control instruction and to generate a target instruction; an instruction sending unit 430 configured to send the target instruction to the target device; a processing unit 440 configured to determine the execution condition of the target instruction by the target device; and a feedback unit 450 configured to send out corresponding feedback information based on the execution condition.
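The unit structure of Fig. 4 can be paraphrased as a small pipeline, with each unit reduced to a callable. The wiring below is an illustrative sketch under that assumption, not the claimed implementation; the callables and feedback strings are invented.

```python
class TargetDeviceControlApparatus:
    """Sketch of Fig. 4: receive -> generate -> send -> determine -> feed back."""

    def __init__(self, resolve, send, check, feedback):
        self.resolve = resolve      # stands in for instruction generating unit 420
        self.send = send            # stands in for instruction sending unit 430
        self.check = check          # stands in for processing unit 440
        self.feedback = feedback    # stands in for feedback unit 450

    def on_control_instruction(self, instruction):
        # Entry point playing the role of receiving unit 410.
        device, target = self.resolve(instruction)
        self.send(device, target)
        executed = self.check(device, target)
        self.feedback("done" if executed else "failed")
        return executed
```

Wiring the slots with real transports (infrared send) and checkers (image or sound recognition) recovers the behaviour the embodiments above describe.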
In an embodiment, when the receiving unit 410 receives a control instruction sent by a first device, the feedback unit 450 is further configured to: send failure information to the first device when the target device does not execute the target instruction.
In an embodiment, the control device further comprises an infrared emitter; the instruction sending unit 430 is further configured to: send the target instruction to the target device through an infrared signal emitted by the infrared emitter.
In an embodiment, the control device comprises a camera; as shown in fig. 5, the processing unit 440 further includes: an image acquisition subunit 441 configured to acquire an image of the target device through the camera; and a determining subunit 442 configured to determine, based on the image, the execution of the target instruction by the target device.
In an embodiment, the control device further comprises a microphone; as shown in fig. 5, the processing unit 440 further includes: a sound collection subunit 443 configured to obtain the sound of the target device through the microphone; the determination subunit 442 is further configured to: based on the sound, execution of the target instruction by the target device is determined.
In an embodiment, the control apparatus further comprises a steering mechanism; as shown in fig. 5, the processing unit 440 further includes: and a steering subunit 444 for directing the infrared emitter and the camera towards the target device through a steering mechanism.
In one embodiment, the steering subunit 444 is further configured to: adjust the shooting angle of the camera through the steering mechanism; the image acquisition subunit 441 is further configured to: acquire a complete image within the field of view through the camera; the determining subunit 442 is further configured to: determine, based on the complete image, orientation information corresponding to one or more controllable devices in the field of view; and the steering subunit 444 is further configured to: adjust the steering mechanism based on the orientation information corresponding to the target device.
In one embodiment, the determining subunit 442 is further configured to: determine device information corresponding to one or more controllable devices; and the instruction sending unit 430 is further configured to: generate the target instruction corresponding to the target device based on the device information corresponding to the target device.
With regard to the target device control apparatus 400 in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
FIG. 6 is a block diagram illustrating an apparatus for target device control in accordance with an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 6, the apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 806 provides power to the various components of device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800; it may also detect a change in the position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 7 is a block diagram illustrating an apparatus 1100 for target device control according to an example embodiment. For example, the apparatus 1100 may be provided as a server. Referring to fig. 7, the apparatus 1100 includes a processing component 1122, which in turn includes one or more processors, and memory resources, represented by a memory 1132, for storing instructions executable by the processing component 1122, such as application programs. The application programs stored in the memory 1132 may include one or more modules, each corresponding to a set of instructions. In addition, the processing component 1122 is configured to execute instructions to perform the target device control method described above.
The apparatus 1100 may also include a power component 1126 configured to perform power management of the apparatus 1100, a wired or wireless network interface 1150 configured to connect the apparatus 1100 to a network, and an input/output (I/O) interface 1158. The apparatus 1100 may operate based on an operating system stored in the memory 1132, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
It is understood that "a plurality" in this disclosure means two or more, and other quantifying words are analogous. "And/or" describes the association relationship of associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that the terms "central," "longitudinal," "lateral," "front," "rear," "upper," "lower," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for convenience in describing the present embodiment and to simplify the description, but do not indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation.
It will be further understood that, unless otherwise specified, "connected" includes direct connections between the two without the presence of other elements, as well as indirect connections between the two with the presence of other elements.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (22)

1. A target device control method is applied to a control device; the method comprises the following steps:
receiving a control instruction;
responding to the control instruction, determining target equipment corresponding to the control instruction, and generating a target instruction;
sending the target instruction to the target device;
determining the execution condition of the target instruction by the target equipment;
and sending corresponding feedback information based on the execution condition.
2. The target device control method according to claim 1,
when receiving the control instruction includes receiving the control instruction sent by a first device, the sending out of corresponding feedback information based on the execution condition includes: sending failure information to the first device if the target device does not execute the target instruction.
3. The target device control method according to claim 1, wherein the control device further includes an infrared transmitter;
the sending the target instruction to the target device includes: and sending the target instruction to the target equipment through the infrared signal sent by the infrared transmitter.
4. The target apparatus control method according to claim 3, wherein the control apparatus includes a camera;
the determining the execution condition of the target instruction by the target device includes:
acquiring an image of the target equipment through the camera;
and determining the execution condition of the target instruction by the target equipment based on the image.
5. The target apparatus control method according to claim 1, wherein the control apparatus further includes a microphone;
the determining the execution condition of the target instruction by the target device further includes:
acquiring the sound of the target equipment through the microphone;
and determining the execution condition of the target instruction by the target equipment based on the sound.
6. The target apparatus control method according to claim 4, wherein the control apparatus further includes a steering mechanism; the method further comprises the following steps: and the infrared emitter and the camera face the target equipment through the steering mechanism.
7. The target device control method according to claim 6, characterized in that the method further comprises:
adjusting the shooting angle of the camera through the steering mechanism, and acquiring a complete image in a visual field through the camera; and
determining orientation information corresponding to one or more controllable devices in the field of view based on the complete image;
the directing the infrared emitter and the camera toward the target device by the steering mechanism includes:
and adjusting the steering mechanism based on the azimuth information corresponding to the target equipment.
8. The target device control method according to claim 7, characterized in that the method further comprises: determining device information corresponding to the one or more controllable devices;
the determining the target device and generating a target instruction in response to the control instruction comprises:
and generating the target instruction corresponding to the target equipment based on the equipment information corresponding to the target equipment.
9. A target device control apparatus is characterized by being applied to a control device; the device comprises:
a receiving unit for receiving a control instruction;
the instruction generating unit is used for responding to the control instruction, determining target equipment corresponding to the control instruction and generating a target instruction;
an instruction sending unit, configured to send the target instruction to the target device;
the processing unit is used for determining the execution condition of the target instruction by the target equipment;
and the feedback unit is used for sending out corresponding feedback information based on the execution condition.
10. The target device control apparatus according to claim 9,
when the receiving unit receives the control instruction sent by the first device, the feedback unit is further configured to: and when the target device does not execute the target instruction, sending failure information to the first device.
11. The object device control apparatus according to claim 9, wherein the control device further comprises an infrared transmitter;
the instruction sending unit is further configured to: and sending the target instruction to the target equipment through the infrared signal sent by the infrared transmitter.
12. The object device control apparatus according to claim 11, wherein the control device includes a camera;
the processing unit further comprises:
the image acquisition subunit is used for acquiring an image of the target equipment through the camera;
and the determining subunit is used for determining the execution condition of the target instruction by the target equipment based on the image.
13. The target device control apparatus according to claim 9, wherein the control device further includes a microphone;
the processing unit further comprises: the sound acquisition subunit is used for acquiring the sound of the target equipment through the microphone;
and the determining subunit is used for determining the execution condition of the target instruction by the target equipment based on the sound.
14. The object device control apparatus according to claim 12, wherein the control device further comprises a steering mechanism;
the processing unit further comprises: and the steering subunit is used for enabling the infrared emitter and the camera to face the target equipment through the steering mechanism.
15. The target device control apparatus according to claim 14,
the steering subunit is further configured to: adjusting the shooting angle of the camera through the steering mechanism;
the image acquisition subunit is further configured to: acquiring a complete image in a visual field through the camera;
the determining subunit is further to: determining orientation information corresponding to one or more controllable devices in the field of view based on the complete image;
the steering subunit is further configured to: and adjusting the steering mechanism based on the azimuth information corresponding to the target equipment.
16. The target device control apparatus according to claim 15,
the determining subunit is further to: determining device information corresponding to the one or more controllable devices;
the instruction sending unit is further configured to: and generating the target instruction corresponding to the target equipment based on the equipment information corresponding to the target equipment.
17. A control apparatus characterized in that the control apparatus controls the target apparatus by the target apparatus control method according to any one of claims 1 to 8.
18. The control apparatus according to claim 17, characterized in that the control apparatus comprises: and the infrared transmitter is used for transmitting the target instruction to the target equipment.
19. The control apparatus according to claim 18, characterized in that the control apparatus further comprises: and the camera is used for acquiring the image of the target equipment.
20. The control apparatus according to claim 18 or 19, characterized by further comprising: and the microphone is used for acquiring the sound of the target equipment.
21. The control apparatus according to claim 19, characterized in that the control apparatus further comprises: and the steering mechanism is used for adjusting the directions of the infrared emitter and the camera.
22. A non-transitory computer readable storage medium, instructions in which, when executed by a processor, implement the target device control method of any one of claims 1-8.
CN202011044329.XA 2020-09-28 2020-09-28 Target device control method, target device control apparatus, and control device Pending CN112180748A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011044329.XA CN112180748A (en) 2020-09-28 2020-09-28 Target device control method, target device control apparatus, and control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011044329.XA CN112180748A (en) 2020-09-28 2020-09-28 Target device control method, target device control apparatus, and control device

Publications (1)

Publication Number Publication Date
CN112180748A true CN112180748A (en) 2021-01-05

Family

ID=73946420

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011044329.XA Pending CN112180748A (en) 2020-09-28 2020-09-28 Target device control method, target device control apparatus, and control device

Country Status (1)

Country Link
CN (1) CN112180748A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112953796A (en) * 2021-02-03 2021-06-11 北京小米移动软件有限公司 Equipment state judgment method and device and storage medium
WO2023284562A1 (en) * 2021-07-14 2023-01-19 海信视像科技股份有限公司 Control device, household appliance, and control method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106504510A (en) * 2016-11-11 2017-03-15 青岛海尔智能家电科技有限公司 A kind of remote infrared control method and device
CN109581886A (en) * 2018-12-13 2019-04-05 深圳绿米联创科技有限公司 Apparatus control method, device, system and storage medium
WO2019102897A1 (en) * 2017-11-27 2019-05-31 ソニー株式会社 Control device, control method, electronic apparatus, and program
CN110224901A (en) * 2019-05-24 2019-09-10 北京小米移动软件有限公司 Intelligent device interaction method, device and storage medium
CN110661685A (en) * 2019-08-22 2020-01-07 深圳绿米联创科技有限公司 Information feedback method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHENG MENGGANG; JIAN BOYU; TAN CHANGHUI; CUI ZHI: "Design and Implementation of a Remote Intelligent Housekeeper", Data Acquisition and Processing, no. 2, 15 November 2012 (2012-11-15) *

Similar Documents

Publication Publication Date Title
US11375120B2 (en) Method and device for assisting capturing of an image
JP6486970B2 (en) Flight control method, flight control device, and electronic apparatus
EP3136793B1 (en) Method and apparatus for awakening electronic device
EP3131315A1 (en) Working method and working device of intelligent electric apparatus
CN106355852B (en) Equipment control method and device
KR101815229B1 (en) Method, device, program and recording medium for reminding
EP3125152B1 (en) Method and device for collecting sounds corresponding to surveillance images
CN106789461A (en) The method and device of intelligent home device connection
CN112216088B (en) Remote control mode determining method and device and remote control method and device
WO2017024713A1 (en) Video image controlling method, apparatus and terminal
US10191708B2 (en) Method, apparatrus and computer-readable medium for displaying image data
CN112180748A (en) Target device control method, target device control apparatus, and control device
CN111123716B (en) Remote control method, remote control device, and computer-readable storage medium
CN106448104B (en) Equipment control method and device
CN112188089A (en) Distance acquisition method and device, focal length adjustment method and device, and distance measurement assembly
CN106303211B (en) Method, device and system for controlling shooting
CN111025921A (en) Local automation control method, local automation control device and electronic equipment
CN107948876B (en) Method, device and medium for controlling sound box equipment
CN114339022B (en) Camera shooting parameter determining method and neural network model training method
CN115576417A (en) Interaction control method, device and equipment based on image recognition
CN112953796A (en) Equipment state judgment method and device and storage medium
CN106254919B (en) The method of adjustment and device of smart television working condition
CN110572582A (en) Shooting parameter determining method and device and shooting data processing method and device
CN112860827B (en) Inter-device interaction control method, inter-device interaction control device and storage medium
CN113138384B (en) Image acquisition method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination