CN110908566A - Information processing method and device - Google Patents

Information processing method and device

Info

Publication number
CN110908566A
Authority
CN
China
Prior art keywords
image
target
characteristic
matrix
sampling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811089795.2A
Other languages
Chinese (zh)
Inventor
张龙
文旷瑜
连园园
宋德超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201811089795.2A priority Critical patent/CN110908566A/en
Publication of CN110908566A publication Critical patent/CN110908566A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The invention discloses an information processing method and device. The method comprises: photographing a target object with an image acquisition device to obtain a target image; extracting a feature image of the target object from the target image, wherein the feature image contains motion features of the target object; analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object, wherein the first model is trained by machine learning on multiple groups of data, each group comprising a feature image and the control parameter corresponding to it; and controlling the corresponding target device based on the control parameter. The invention solves the technical problem that existing smart household appliances, being controlled by the user's voice or a remote controller, suffer delayed and inefficient control actions.

Description

Information processing method and device
Technical Field
The invention relates to the field of equipment control, in particular to an information processing method and device.
Background
With the development of artificial intelligence technology, the perceptual user interface has become one of the research focuses in the field of human-computer interaction. It is a highly interactive, multi-channel user interface modeled on the interaction between people and between people and the real world. Its goal is to make human-computer interaction as consistent as human interaction with the real world, achieving an intuitive and natural interaction boundary and thereby a human-centered interface: during human-computer interaction the computer adapts to people's natural interaction habits, rather than requiring people to adapt to the computer's specific operating requirements.
Existing smart household appliances are generally controlled by the user's voice or a remote controller, so the control action is delayed, the efficiency is low, and the degree of intelligence is limited.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
Embodiments of the invention provide an information processing method and device that at least solve the technical problem that existing smart household appliances, being controlled by the user's voice or a remote controller, suffer delayed and inefficient control actions.
According to one aspect of the embodiments of the present invention, an information processing method is provided, comprising: photographing a target object with an image acquisition device to obtain a target image; extracting a feature image of the target object from the target image, wherein the feature image contains motion features of the target object; analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object, wherein the first model is trained by machine learning on multiple groups of data, each group comprising a feature image and the control parameter corresponding to it; and controlling the corresponding target device based on the control parameter.
Optionally, extracting the feature image of the target object from the target image comprises: establishing a matrix sampling channel; processing the target image through the matrix sampling channel to obtain a feature matrix of the target image; and recombining the feature matrix into the feature image.
Optionally, processing the target image through the matrix sampling channel to obtain the feature matrix of the target image comprises: sampling the target image through a sampling window to obtain sampled data; and processing the sampled data with different parameters through different matrix sampling channels to obtain the feature matrix.
Optionally, controlling the corresponding target device based on the control parameter comprises: generating a control instruction from the control parameter; and sending the control instruction to the target device to control the target device.
Optionally, the target device comprises a household appliance.
According to another aspect of the embodiments of the present invention, an information processing apparatus is also provided, comprising: a camera for photographing a target object to obtain a target image; a processor for extracting a feature image of the target object from the target image, wherein the feature image contains motion features of the target object, and for analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object, wherein the first model is trained by machine learning on multiple groups of data, each group comprising a feature image and the control parameter corresponding to it; and a controller for controlling the corresponding target device based on the control parameter.
Optionally, the processor is configured to extract the feature image of the target object from the target image by: establishing a matrix sampling channel; processing the target image through the matrix sampling channel to obtain a feature matrix of the target image; and recombining the feature matrix into the feature image.
Optionally, the processor is configured to process the target image through the matrix sampling channel to obtain the feature matrix of the target image by: sampling the target image through a sampling window to obtain sampled data; and processing the sampled data with different parameters through different matrix sampling channels to obtain the feature matrix.
Optionally, the controller is configured to control the corresponding target device based on the control parameter by: generating a control instruction from the control parameter; and sending the control instruction to the target device to control the target device.
Optionally, the target device comprises a household appliance.
In the embodiments of the invention, a target object is photographed by an image acquisition device to obtain a target image; a feature image of the target object, containing its motion features, is extracted from the target image; the feature image is analyzed with a first model, trained by machine learning on multiple groups of data in which each group comprises a feature image and its corresponding control parameter, to determine the control parameter corresponding to the feature image; and the corresponding target device is controlled based on that control parameter. By capturing the user's motion with the image acquisition device, extracting the feature image, analyzing it with the first model, and controlling the corresponding target device from the resulting control parameter, the user is spared the trouble of a remote controller and the human-computer interaction experience is improved. This achieves the technical effect of higher interaction accuracy and a lower error rate, and thereby solves the technical problem that existing smart household appliances, being controlled by the user's voice or a remote controller, suffer delayed and inefficient control actions.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow diagram illustrating an alternative information processing method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an alternative information processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, a method embodiment of an information processing method is provided. It should be noted that the steps illustrated in the flowchart of the accompanying drawings may be performed in a computer system, for example as a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in a different order.
FIG. 1 shows an information processing method according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
Step S102: photographing a target object with an image acquisition device to obtain a target image.
Step S104: extracting a feature image of the target object from the target image.
The feature image contains motion features of the target object.
Optionally, extracting the feature image of the target object from the target image comprises: establishing a matrix sampling channel; processing the target image through the matrix sampling channel to obtain a feature matrix of the target image; and recombining the feature matrix into the feature image.
Processing the target image through the matrix sampling channel to obtain the feature matrix of the target image comprises: sampling the target image through a sampling window to obtain sampled data; and processing the sampled data with different parameters through different matrix sampling channels to obtain the feature matrix.
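For illustration only, the sampling step above can be sketched in Python as follows. The window size, the per-channel scale and offset parameters, and the recombination of the feature matrix by horizontal concatenation are assumptions made for this sketch and are not specified by the disclosure.

```python
import numpy as np

def extract_feature_matrix(target_image, channel_params, window=8):
    """Sample the target image with a sliding sampling window, then let each
    matrix sampling channel process the sampled data with its own parameters
    (illustrative assumptions only)."""
    h, w = target_image.shape[:2]
    rows = range(0, h - window + 1, window)
    cols = range(0, w - window + 1, window)

    # Sampling: average each window position to obtain the sampled data.
    sampled = np.array([[target_image[i:i + window, j:j + window].mean()
                         for j in cols] for i in rows])

    # Each matrix sampling channel applies different parameters to the
    # sampled data (here a scale and an offset, purely as an assumption).
    channels = [scale * sampled + offset for scale, offset in channel_params]

    # Stack the per-channel outputs into the feature matrix.
    return np.stack(channels, axis=0)

def reassemble_feature_image(feature_matrix):
    # Recombine the feature matrix into a single feature image, e.g. by
    # concatenating the channel outputs side by side (an assumption).
    return np.concatenate(list(feature_matrix), axis=1)
```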
Step S106: analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object.
The first model is trained by machine learning on multiple groups of data, each group comprising a feature image and the control parameter corresponding to it.
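The description states only that the first model is obtained by machine learning from groups of (feature image, control parameter) data. The following PyTorch-style sketch shows one way such training could look; the small convolutional architecture, the mean-squared-error loss, and the optimizer settings are assumptions for illustration, not part of the disclosure.

```python
import torch
import torch.nn as nn

class FirstModel(nn.Module):
    """Illustrative convolutional model mapping a single-channel feature
    image to a vector of control parameters (architecture is assumed)."""
    def __init__(self, num_params=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_params)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_first_model(dataset, epochs=10):
    # `dataset` yields (feature_image, control_parameter) pairs, matching the
    # groups of data described above; feature images are (1, H, W) tensors.
    model = FirstModel()
    loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for feature_image, control_parameter in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(feature_image), control_parameter)
            loss.backward()
            optimizer.step()
    return model
```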
Step S108: controlling the corresponding target device based on the control parameter.
Wherein the target device comprises a household appliance.
Optionally, controlling the corresponding target device based on the control parameter comprises: generating a control instruction from the control parameter; and sending the control instruction to the target device to control the target device.
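As a hedged illustration of this control step, the sketch below wraps a control parameter in a control instruction and sends it to the target device over TCP; the JSON instruction layout and the transport are assumptions, since the disclosure does not specify an instruction format.

```python
import json
import socket

def control_target_device(control_parameter, device_address):
    # Generate a control instruction from the control parameter
    # (the JSON layout here is an assumption for illustration).
    instruction = json.dumps({"command": "set", "parameter": control_parameter})

    # Send the control instruction to the target device, e.g. over TCP;
    # device_address is a (host, port) tuple.
    with socket.create_connection(device_address, timeout=2.0) as conn:
        conn.sendall(instruction.encode("utf-8"))
```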
Through these steps, the user's motion is captured by the image acquisition device, the feature image is extracted and analyzed with the first model, and the corresponding target device is controlled based on the resulting control parameter. This spares the user the trouble of a remote controller and improves the human-computer interaction experience, achieving the technical effect of higher interaction accuracy and a lower error rate, and thereby solving the technical problem that existing smart household appliances, being controlled by the user's voice or a remote controller, suffer delayed and inefficient control actions.
In this embodiment, when a household appliance is to be controlled, the image acquisition device captures the user's motion, a feature image is extracted from the captured target picture, a model is built with a deep learning method and trained on such data, and the trained model takes the target picture as input, extracts the feature picture, and outputs the control parameters of the household appliance; a control instruction generated from these parameters then controls the appliance. For example, an air conditioner may determine from the user's movement and posture whether its temperature should be raised or lowered.
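For illustration of the air-conditioner example only, the following sketch maps a model's per-gesture scores to a temperature action; the gesture classes, the action names, and the use of a classification-style output are assumptions and are not part of the disclosure.

```python
# Hypothetical mapping from a predicted gesture class to an air-conditioner
# action; the class indices and action names are assumed for illustration.
GESTURE_TO_ACTION = {0: "raise_temperature", 1: "lower_temperature", 2: "no_change"}

def decide_air_conditioner_action(model, feature_image):
    # `model` is any callable returning one score per gesture class for a
    # batch of feature images (e.g. a classification head on the first model).
    scores = model(feature_image.unsqueeze(0))   # add a batch dimension
    gesture = int(scores.argmax(dim=1).item())
    return GESTURE_TO_ACTION.get(gesture, "no_change")
```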
When the feature image is obtained, a matrix sampling channel is first established, the target image is processed through the matrix sampling channel to obtain a feature matrix of the target image, and the feature matrix is then recombined into the feature image. Different parameters of the target image are handled by different matrix sampling channels, and the resulting outputs are processed by a convolutional network model to obtain the final feature image. The user's control gesture or control posture is recognized by image recognition, so the user can control the smart home with a fixed gesture or body posture.
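A minimal sketch of the combination step described above, in which a convolutional network collapses the matrix sampling channel outputs into the final feature image, is given below; the single random convolution kernel stands in for learned weights and is an assumption for illustration.

```python
import torch
import torch.nn.functional as F

def combine_channels(feature_matrix):
    # feature_matrix: (channels, H, W) output of the matrix sampling channels
    # from the earlier extraction sketch.
    x = torch.as_tensor(feature_matrix, dtype=torch.float32).unsqueeze(0)
    # One convolution collapses the channel outputs into a single final
    # feature image; the random kernel is a stand-in for learned weights.
    kernel = torch.randn(1, x.shape[1], 3, 3)
    return F.conv2d(x, kernel, padding=1).squeeze(0).squeeze(0)
```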
In the information processing method of this embodiment, the user's motion is captured by the image acquisition device; a feature image is extracted from the captured target picture; a model is built with a deep learning method and trained; and the extracted features of the target picture are fed into the model, which outputs the control parameters of the household appliance. This spares the user the trouble of a remote controller, improves the human-computer interaction experience, greatly raises the accuracy of human-computer interaction, and effectively reduces its error rate. The technical problems that the control mode of existing smart household appliances is outdated and their control actions are delayed and inefficient are thus effectively solved.
Example 2
According to an embodiment of the present invention, an embodiment of an information processing apparatus is provided. FIG. 2 shows an information processing apparatus according to an embodiment of the present invention. As shown in FIG. 2, the apparatus includes:
a camera 20 for photographing a target object to obtain a target image;
a processor 22 for extracting a feature image of the target object from the target image, wherein the feature image contains motion features of the target object, and for analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object, wherein the first model is trained by machine learning on multiple groups of data, each group comprising a feature image and the control parameter corresponding to it; and
a controller 24 for controlling the corresponding target device based on the control parameter.
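For illustration, the three components above can be composed as in the following sketch; the class and method names are hypothetical and do not describe the apparatus's actual interfaces.

```python
class InformationProcessingApparatus:
    """Sketch of the camera -> processor -> controller pipeline of this
    embodiment; the interfaces are assumed for illustration only."""
    def __init__(self, camera, processor, controller):
        self.camera = camera          # photographs the target object
        self.processor = processor    # extracts and analyses the feature image
        self.controller = controller  # drives the corresponding target device

    def run_once(self):
        target_image = self.camera.capture()
        feature_image = self.processor.extract_feature_image(target_image)
        control_parameter = self.processor.analyse(feature_image)
        self.controller.control(control_parameter)
        return control_parameter
```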
Optionally, the processor is configured to extract the feature image of the target object from the target image by: establishing a matrix sampling channel; processing the target image through the matrix sampling channel to obtain a feature matrix of the target image; and recombining the feature matrix into the feature image.
Optionally, the processor is configured to process the target image through the matrix sampling channel to obtain the feature matrix of the target image by: sampling the target image through a sampling window to obtain sampled data; and processing the sampled data with different parameters through different matrix sampling channels to obtain the feature matrix.
Optionally, the controller is configured to control the corresponding target device based on the control parameter by: generating a control instruction from the control parameter; and sending the control instruction to the target device to control the target device.
Optionally, the target device comprises a household appliance.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium capable of storing program code.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.

Claims (10)

1. An information processing method, characterized by comprising:
photographing a target object with an image acquisition device to obtain a target image;
extracting a feature image of the target object from the target image, wherein the feature image contains motion features of the target object;
analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object, wherein the first model is trained by machine learning on multiple groups of data, each group of the multiple groups of data comprising: a feature image and the control parameter corresponding to the feature image;
and controlling the corresponding target device based on the control parameter.
2. The method of claim 1, wherein extracting the feature image of the target object from the target image comprises:
establishing a matrix sampling channel;
processing the target image through the matrix sampling channel to obtain a feature matrix of the target image;
and recombining the feature matrix into the feature image.
3. The method of claim 2, wherein processing the target image through the matrix sampling channel to obtain the feature matrix of the target image comprises:
sampling the target image through a sampling window to obtain sampled data;
and processing the sampled data with different parameters through different matrix sampling channels to obtain the feature matrix.
4. The method of claim 1, wherein controlling the corresponding target device based on the control parameter comprises:
generating a control instruction from the control parameter;
and sending the control instruction to the target device to control the target device.
5. The method of any one of claims 1 to 4, wherein the target device comprises a household appliance.
6. An information processing apparatus, characterized by comprising:
a camera for photographing a target object to obtain a target image;
a processor for extracting a feature image of the target object from the target image, wherein the feature image contains motion features of the target object, and for analyzing the feature image with a first model and determining a control parameter corresponding to the feature image of the target object, wherein the first model is trained by machine learning on multiple groups of data, each group of the multiple groups of data comprising: a feature image and the control parameter corresponding to the feature image;
and a controller for controlling the corresponding target device based on the control parameter.
7. The apparatus of claim 6, wherein the processor is configured to extract the feature image of the target object from the target image by:
establishing a matrix sampling channel;
processing the target image through the matrix sampling channel to obtain a feature matrix of the target image;
and recombining the feature matrix into the feature image.
8. The apparatus of claim 7, wherein the processor is configured to process the target image through the matrix sampling channel to obtain the feature matrix of the target image by:
sampling the target image through a sampling window to obtain sampled data;
and processing the sampled data with different parameters through different matrix sampling channels to obtain the feature matrix.
9. The apparatus of claim 6, wherein the controller is configured to control the corresponding target device based on the control parameter by:
generating a control instruction from the control parameter;
and sending the control instruction to the target device to control the target device.
10. The apparatus of any one of claims 6 to 9, wherein the target device comprises a household appliance.
CN201811089795.2A 2018-09-18 2018-09-18 Information processing method and device Pending CN110908566A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811089795.2A CN110908566A (en) 2018-09-18 2018-09-18 Information processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811089795.2A CN110908566A (en) 2018-09-18 2018-09-18 Information processing method and device

Publications (1)

Publication Number Publication Date
CN110908566A true CN110908566A (en) 2020-03-24

Family

ID=69812957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811089795.2A Pending CN110908566A (en) 2018-09-18 2018-09-18 Information processing method and device

Country Status (1)

Country Link
CN (1) CN110908566A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509086A (en) * 2011-11-22 2012-06-20 西安理工大学 Pedestrian object detection method based on object posture projection and multi-features fusion
CN102831404A (en) * 2012-08-15 2012-12-19 深圳先进技术研究院 Method and system for detecting gestures
US20160353093A1 (en) * 2015-05-28 2016-12-01 Todd Michael Lyon Determining inter-pupillary distance
US20180188928A1 (en) * 2015-10-15 2018-07-05 International Business Machines Corporation Display control of an image on a display screen
CN105353634A (en) * 2015-11-30 2016-02-24 北京地平线机器人技术研发有限公司 Household appliance and method for controlling operation by gesture recognition
CN105654037A (en) * 2015-12-21 2016-06-08 浙江大学 Myoelectric signal gesture recognition method based on depth learning and feature images
CN107133960A (en) * 2017-04-21 2017-09-05 武汉大学 Image crack dividing method based on depth convolutional neural networks
CN107610140A (en) * 2017-08-07 2018-01-19 中国科学院自动化研究所 Near edge detection method, device based on depth integration corrective networks
CN108052199A (en) * 2017-10-30 2018-05-18 珠海格力电器股份有限公司 Control method, device and the smoke exhaust ventilator of smoke exhaust ventilator
CN108052858A (en) * 2017-10-30 2018-05-18 珠海格力电器股份有限公司 The control method and smoke exhaust ventilator of smoke exhaust ventilator
CN108105136A (en) * 2017-11-03 2018-06-01 珠海格力电器股份有限公司 Control method, device and the fan of fan

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨红玲 (Yang Hongling): "基于卷积神经网络的手势识别" [Gesture Recognition Based on Convolutional Neural Networks], 《计算机技术与发展》 [Computer Technology and Development] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111931762A (en) * 2020-09-25 2020-11-13 广州佰锐网络科技有限公司 AI-based image recognition solution method, device and readable storage medium
CN111931762B (en) * 2020-09-25 2021-07-30 广州佰锐网络科技有限公司 AI-based image recognition solution method, device and readable storage medium

Similar Documents

Publication Publication Date Title
CN110459214B (en) Voice interaction method and device
CN108181819B (en) Linkage control method, device and system for household electrical appliance and household electrical appliance
CN106454481B (en) A kind of method and device of live broadcast of mobile terminal interaction
CN113138705A (en) Method, device and equipment for adjusting display mode of display interface
CN108062971A (en) The method, apparatus and computer readable storage medium that refrigerator menu is recommended
CN104053060A (en) Intelligent television and television program playing method thereof
CN104182048A (en) Brain-computer interface based telephone system and call method thereof
CN114821236A (en) Smart home environment sensing method, system, storage medium and electronic device
KR102292243B1 (en) Scalp and hair management system for providing status information at the stage of change
CN110908566A (en) Information processing method and device
CN112019800A (en) Image sharing method and device, range hood and storage medium
CN109726808B (en) Neural network training method and device, storage medium and electronic device
CN105979331A (en) Smart television data recommend method and device
CN108006902B (en) Air conditioner control method and device
CN111007806B (en) Smart home control method and device
CN114707004B (en) Method and system for extracting and processing case-affair relation based on image model and language model
CN110941187A (en) Household appliance control method and device
JP6472509B2 (en) Data disturbance device and data disturbance system
CN110824930B (en) Control method, device and system of household appliance
CN104298442A (en) Information processing method and electronic device
CN110347247B (en) Man-machine interaction method and device, storage medium and electronic equipment
CN106303607A (en) A kind of Household diet control system based on intelligent television
CN111128135A (en) Voice communication method and device
CN110673737B (en) Display content adjusting method and device based on intelligent home operating system
CN109858380A (en) Expansible gesture identification method, device, system, gesture identification terminal and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination