CN112085942A - Method and system for configuring remote control function data according to remote control equipment image - Google Patents

Method and system for configuring remote control function data according to remote control equipment image

Info

Publication number
CN112085942A
Authority
CN
China
Prior art keywords
remote control
control function
equipment
function data
image
Prior art date
Legal status
Granted
Application number
CN202011019103.4A
Other languages
Chinese (zh)
Other versions
CN112085942B (en)
Inventor
陈玮
Current Assignee
Hansong Nanjing Technology Co ltd
Original Assignee
Hansong Nanjing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hansong Nanjing Technology Co ltd
Priority to CN202011019103.4A
Priority to CN202210126675.5A
Publication of CN112085942A
Application granted
Publication of CN112085942B
Current status: Active

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C23/00 Non-electrical signal transmission systems, e.g. optical systems
    • G08C23/04 Non-electrical signal transmission systems, e.g. optical systems using light waves, e.g. infrared
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Selective Calling Equipment (AREA)

Abstract

An embodiment of the present specification provides a method and a system for configuring remote control function data according to a remote control device image. The method includes: acquiring a first image of a remote control device and determining a first feature based on the first image, where the first feature includes a key layout feature of the remote control device; and determining the remote control function data corresponding to the remote control device based on the first feature. By recognizing the image of the remote control device and determining the corresponding remote control function data according to the key layout feature, the remote control function data can be configured more conveniently and accurately, and the user experience is improved.

Description

Method and system for configuring remote control function data according to remote control equipment image
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and a system for configuring remote control function data according to a remote control device image.
Background
With the development of information technology, remote control of electronic devices such as household appliances and production equipment has been applied in many fields. When unified remote control of multiple such electronic devices is to be realized, or when the remote control device of an electronic device is inconvenient to use or has been lost, the remote control function data corresponding to the remote control device of each electronic device needs to be configured. To configure an electronic device with the corresponding remote control function data, the brand and/or model of each electronic device conventionally has to be looked up and the remote control functions configured accordingly, which is inefficient and inconvenient.
Therefore, a need exists for a method and system for configuring remote control function data based on a remote control device image.
Disclosure of Invention
One aspect of the present specification provides a method of configuring remote control function data from a remote control device image. The method comprises: acquiring a first image of a remote control device and determining a first feature based on the first image, wherein the first feature comprises a key layout feature of the remote control device; determining first graph data corresponding to the remote control device based on the first feature, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge, each virtual edge connects two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the keys corresponding to the first nodes connected by the virtual edge; determining a first result through a trained recognition model based on the first graph data, wherein the first result indicates either that remote control function data corresponding to the remote control device is obtained or that there is no remote control function data corresponding to the remote control device; if the first result is that the remote control function data corresponding to the remote control device is obtained, determining the remote control function data corresponding to the remote control device based on the first result; and if the first result is that there is no remote control function data corresponding to the remote control device: acquiring a second feature of a controlled device matched with the remote control device, determining second graph data based on the second feature and the first graph data, and obtaining the remote control function data corresponding to the remote control device through the trained recognition model based on the second graph data. The recognition model comprises a graph neural network model. The remote control function data comprises at least one group of remote control codes, according to which an intelligent device controls the controlled device matched with the remote control device.
Another aspect of the specification provides a system for configuring remote control function data based on a remote control device image. The system comprises: a first obtaining module, configured to acquire a first image of a remote control device and determine a first feature based on the first image, wherein the first feature comprises a key layout feature of the remote control device, and to determine first graph data corresponding to the remote control device based on the first feature, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge, each virtual edge connects two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the keys corresponding to the first nodes connected by the virtual edge; a first recognition module, configured to determine a first result through a trained recognition model based on the first graph data, wherein the first result indicates either that remote control function data corresponding to the remote control device is obtained or that there is no remote control function data corresponding to the remote control device; a first determination module, configured to determine, if the first result is that the remote control function data corresponding to the remote control device is obtained, the remote control function data corresponding to the remote control device based on the first result; and a second determination module, configured to, if the first result is that there is no remote control function data corresponding to the remote control device, acquire a second feature of a controlled device matched with the remote control device, determine second graph data based on the second feature and the first graph data, and obtain the remote control function data corresponding to the remote control device through the trained recognition model based on the second graph data. The recognition model comprises a graph neural network model. The remote control function data comprises at least one group of remote control codes, according to which an intelligent device controls the controlled device matched with the remote control device.
Another aspect of the specification provides an apparatus for configuring remote control function data based on a remote control device image, comprising a processor for performing a method of configuring remote control function data based on a remote control device image.
Another aspect of the present specification provides a computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform a method of configuring remote control function data based on a remote control device image.
The beneficial effects brought by the specification include but are not limited to: (1) by recognizing the image of the remote control device and determining the corresponding remote control function data according to the key layout feature, there is no need to look up the brand and/or model before configuring, so the remote control function data can be configured more conveniently and accurately, improving the user experience; (2) the graph data is constructed based on the image features of the remote control device, which more completely captures the key features and the layout relationships among the keys, and recognition based on graph data yields more accurate predictions; (3) the features of the controlled device can be incorporated when constructing the graph data, so that the remote control function data is determined not only from the remote control device but also in combination with the features of the controlled device, making the prediction more accurate; (4) the feature information of the controlled device can be obtained directly from an image of the controlled device or from the user's voice or text input, so the corresponding remote control function data can be determined more conveniently; (5) after an identifier is added to the determined remote control function data and the data is stored, the remote control function data corresponding to a controlled device can be called directly to realize control, which is more convenient and efficient and improves the user experience; (6) testing the determined remote control function data ensures the accuracy of the configured data and avoids incorrect configurations; (7) when the determined remote control function data is invalid, the parameters of the recognition model are updated based on the re-determined result, which can improve the recognition accuracy of the model.
Drawings
The present description will be further explained by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like reference numerals indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a system for configuring remote control function data from a remote control device image, according to some embodiments of the present description;
FIG. 2 is a block diagram illustrating an exemplary system for configuring remote control function data based on a remote control device image in accordance with some embodiments of the present description;
FIG. 3 is an exemplary flow chart of a method for configuring remote control function data from a remote control device image according to some embodiments shown herein;
FIG. 4 is an exemplary flow diagram of a method of recognition model training, according to some embodiments shown in the present description;
FIG. 5 is an exemplary flow chart of another method for configuring remote control function data from a remote control device image according to some embodiments of the present description;
FIG. 6 is an exemplary flow diagram of another method of recognition model training, according to some embodiments of the present description;
FIG. 7 is an exemplary diagram illustrating determining second graph data based on first graph data and second features according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used in this specification is a method for distinguishing different components, elements, parts or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the terms "a," "an," "one," and/or "the" do not refer exclusively to the singular and may also include the plural, unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
Flowcharts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order shown. Instead, the steps may be processed in reverse order or simultaneously. Moreover, other operations may be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an application scenario of a system for configuring remote control function data from a remote control device image according to some embodiments of the present description.
A scenario 100 of a system for configuring remote control function data from a remote control device image may include a terminal device 110, a processor 120, and a network 130.
The scenario 100 of the system for configuring remote control function data according to a remote control device image may be used for remote control data configuration of each appliance in a smart home, of each electronic device in a factory, and the like. When unified control of multiple electronic devices such as household appliances is to be realized, or when the remote control devices of these appliances or electronic devices are inconvenient to use or have been lost, the remote control function data corresponding to each device's remote control device needs to be configured. The image of the remote control device may be acquired through the terminal device 110; the processor 120 may then recognize the image of the remote control device, obtain relevant features of the remote control device such as key layout features, and determine the corresponding remote control function data through the recognition model according to the key layout features, which may make configuring the remote control function data more convenient. Data transmission between the processor 120 and the terminal device 110 may occur via the network 130.
Terminal device 110 refers to one or more terminal devices or software that may be used by a user. In some embodiments, terminal device 110 may be used by one or more users, which may include users who directly use the service, as well as other associated users. In some embodiments, the terminal device 110 may be one or any combination of mobile device 110-1, tablet computer 110-2, laptop computer 110-3, or other device having image acquisition, input, and/or output capabilities.
Processor 120 may process data and/or information obtained from other devices or system components. The processor may execute program instructions based on the data, information, and/or processing results to perform one or more of the functions described herein. In some embodiments, processor 120 may include one or more sub-processing devices (e.g., single-core or multi-core processing devices). Merely by way of example, the processor 120 may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a micro-controller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof. In some embodiments, processor 120, as well as other possible system components, may include a memory device.
The network 130 may connect the various components of the system and/or connect the system with external resource components. The network 130 allows communication between the various components and with other components outside the system to facilitate the exchange of data and/or information. In some embodiments, the network 130 may be any one or more of a wired network or a wireless network. For example, network 130 may include a cable network, a fiber-optic network, a telecommunications network, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, Near Field Communication (NFC), an in-device bus, an in-device line, a cable connection, and the like, or any combination thereof. The network connections between components may take a single form or multiple forms. In some embodiments, the network may use topologies such as point-to-point, shared, or centralized, or a combination of topologies. In some embodiments, the network 130 may include one or more network access points. For example, the network 130 may include wired or wireless network access points, such as base stations and/or network switching points 130-1, 130-2, …, through which one or more components of the scenario 100 may connect to the network 130 to exchange data and/or information.
FIG. 2 is a block diagram illustrating an exemplary system for configuring remote control function data based on a remote control device image in accordance with some embodiments of the present description.
In some embodiments, the system 200 for configuring remote control function data based on remote control device images may include a first obtaining module 210, a first recognition module 220, a first determination module 230, a second determination module 240, a storage module 250, a recognition model training module 260, a testing module 270, and a speech determination module 280.
The first obtaining module 210 may be configured to obtain a first image of a remote control device, and determine a first feature based on the first image, where the first feature includes a key layout feature of the remote control device; determining first graph data corresponding to the remote control device based on the first characteristic, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge; each virtual edge is connected with two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the key corresponding to the first node connected with the virtual edge.
The first recognition module 220 may be configured to determine a first result through the trained recognition model based on the first graph data, where the first result includes obtaining remote control function data corresponding to the remote control device or not obtaining the remote control function data corresponding to the remote control device. The recognition model may include a graph neural network model. The remote control function data comprises at least one group of remote control codes, and the intelligent equipment realizes control over controlled equipment matched with the remote control equipment according to the at least one group of remote control codes.
The first determining module 230 may be configured to determine, based on the first result, the remote control function data corresponding to the remote control device if the first result is that the remote control function data corresponding to the remote control device is obtained.
The second determining module 240 may be configured to, if the first result is that there is no remote control function data corresponding to the remote control device: and acquiring second characteristics of the controlled equipment matched with the remote control equipment, determining second graph data based on the second characteristics and the first graph data, and acquiring the remote control function data corresponding to the remote control equipment through the trained recognition model based on the second graph data. In some embodiments, the second determination module 240 may be further configured to: determining a second node, a second node attribute of the second node being determined based on the second feature; determining the second graph data from the second node and the first graph data. In some embodiments, the second determination module 240 may be further configured to: acquiring a second image of the controlled device matched with the remote control device; and inputting the second image into a feature generation model to obtain the second feature corresponding to the second image. In some embodiments, the second determination module 240 may be further configured to: acquiring audio information and/or text information input by a user, wherein the audio information and the text information comprise equipment information of the controlled equipment; and obtaining the second characteristic of the controlled equipment according to the audio information and/or the text information.
The storage module 250 may be configured to add an identifier to the remote control function data corresponding to the remote control device and store the identifier, where the identifier is related to a controlled device matched with the remote control device, so as to obtain a remote control function database including at least one remote control function data corresponding to the remote control device; and the intelligent equipment searches an identifier related to the target controlled equipment in the remote control function database, and calls the remote control function data corresponding to the identifier related to the target controlled equipment to realize the control of the target controlled equipment.
The recognition model training module 260 may be configured to obtain at least one first graph data sample corresponding to at least one remote control device and a first label corresponding to the at least one first graph data sample, where the first label indicates either the remote control function data corresponding to the first graph data sample or that there is no remote control function data corresponding to the first graph data sample; obtain at least one second feature sample corresponding to at least one controlled device matched with the remote control device, obtain at least one second graph data sample based on the at least one second feature sample and the at least one first graph data sample, and obtain a second label corresponding to the at least one second graph data sample, where the second label includes the remote control function data corresponding to the second graph data sample; and train an initial recognition model using the at least one first graph data sample with the first label and the at least one second graph data sample with the second label as training samples to obtain the trained recognition model.
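As an illustration of the training flow just described, the following is a minimal Python sketch assuming PyTorch and PyTorch Geometric, where each graph data sample has already been converted to a torch_geometric.data.Data object whose label y is a class index (a known set of remote control codes, or a class meaning "no corresponding data"). The function name, batch size, and optimizer are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical training sketch for the recognition model described above.
import torch
from torch_geometric.loader import DataLoader


def train_recognition_model(model, first_graph_samples, second_graph_samples,
                            epochs=10, lr=1e-3):
    # The patent trains on both kinds of samples: first graph data (keys only,
    # first labels) and second graph data (keys plus a controlled-device node,
    # second labels).
    loader = DataLoader(first_graph_samples + second_graph_samples,
                        batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for batch in loader:
            optimizer.zero_grad()
            logits = model(batch)              # graph-level class scores
            loss = criterion(logits, batch.y)  # label: code class or "none"
            loss.backward()
            optimizer.step()
    return model
```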
The testing module 270 may be configured to test the controlled device matched with the corresponding remote control device by using the remote control function data; if the test is successful, the remote control function data is valid; if the test fails, acquire a brand list and/or a model list corresponding to the controlled device, and select the brand and/or model of the controlled device matched with the remote control device based on the brand list and/or the model list; and re-determine the remote control function data corresponding to the remote control device based on the selected brand and/or model and the first feature. In some embodiments, the testing module 270 may also be used to construct a loss function according to the difference between the remote control function data that failed the test and the re-determined remote control function data corresponding to the remote control device, and to update the parameters of the recognition model based on the loss function.
The voice determination module 280 may be configured to determine voice control data corresponding to the remote control device based on the at least one set of remote control codes of the remote control function data corresponding to the remote control device; and the intelligent equipment realizes the voice control of the controlled equipment matched with the remote control equipment according to the voice control data.
It should be understood that the illustrated system and its modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory for execution by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD-or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also by software executed by various types of processors, for example, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system 200 and its modules for configuring remote control function data according to a remote control device image is for convenience of description only and is not intended to limit the present disclosure to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings. For example, the first obtaining module 210, the first recognition module 220, the first determining module 230, the second determining module 240, the storage module 250, the recognition model training module 260, the testing module 270, and the voice determining module 280 may share one storage module, and each of the modules may also have a respective storage module. Such variations are within the scope of the present application.
FIG. 3 is an exemplary flow chart illustrating a method of configuring remote control function data from a remote control device image according to some embodiments of the present description.
As shown in fig. 3, the method 300 of configuring remote control function data from a remote control device image may include:
step 310, obtaining a first image of a remote control device, and determining a first feature based on the first image, where the first feature includes a key layout feature of the remote control device.
In particular, this step 310 may be performed by the first obtaining module 210.
The remote control device refers to a device for remotely controlling an electronic device, such as a remote controller, a mobile phone, a computer, or a terminal device capable of realizing remote control.
An image containing content associated with the remote control device may be referred to as the first image. The first image at least includes an image showing the key layout of the remote control device, and may also include an identification image of the remote control device, an overall image showing the shape, size and the like of the remote control device, an image of the application environment of the remote control device, and other images related to the remote control device. The first image may be a 2D image, a 3D image, or another multi-dimensional image, which is not limited in this embodiment.
In some embodiments, the image of the remote control device may be obtained by scanning, photographing, shooting, and the like, which is not limited in this embodiment.
The remote control function data refers to control data for remotely controlling the electronic device, and may include remote control codes, control data packets, and the like. The remote control function data may be used for a remote control device or other intelligent devices to implement remote control of the electronic device, and the remote control mode may be infrared remote control, radio signal remote control, voice remote control, and the like, which is not limited in this embodiment.
The electronic device remotely controlled by the remote control device may be referred to as a controlled device; the remote control device has a matched controlled device, and the remote control function data corresponding to the remote control device may be used to remotely control the matched controlled device.
In some embodiments, the remote function data includes at least one set of remote control codes, for example, a set of remote control codes corresponding to a set of keys corresponding to functions of up, down, on, and off. The remote control device and other intelligent devices (mobile phones, computers and other intelligent devices capable of realizing remote control) can realize control on controlled devices matched with the remote control device according to at least one group of remote control codes.
In some embodiments, the voice control data corresponding to the remote control device may also be determined based on at least one set of remote control codes corresponding to the remote control device. The remote control code may be converted into corresponding voice control data by a computer or a data conversion device, for example. The remote control device or other intelligent devices can realize voice control on controlled devices matched with the remote control device according to the voice control data.
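As a rough illustration of how voice control data might be derived from a set of remote control codes, the sketch below pairs each code with spoken trigger phrases; the phrase table, key-function names, and code format are all assumptions for illustration only.

```python
# Minimal, hypothetical sketch: map spoken phrases to remote control codes.
def build_voice_control_data(remote_codes: dict[str, bytes]) -> dict[str, bytes]:
    """remote_codes maps a key function name (e.g. 'power_on') to its code."""
    phrase_templates = {
        "power_on": ["turn on", "switch on"],
        "power_off": ["turn off", "switch off"],
        "volume_up": ["louder", "volume up"],
        "volume_down": ["quieter", "volume down"],
    }
    voice_control_data = {}
    for function_name, code in remote_codes.items():
        for phrase in phrase_templates.get(function_name, [function_name]):
            voice_control_data[phrase] = code  # spoken phrase -> remote code
    return voice_control_data
```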
In some embodiments, the remote control function data for different brands and/or models of electronic devices may be different, e.g., the remote control function data for brands a and B electronic devices are type a or type B, respectively. The remote control function data corresponding to different key functions may be different, for example, the remote control function data corresponding to the up key and the down key are different.
In some embodiments, after obtaining the first image of the remote control device, the first image may be used to determine remote control function data corresponding to the remote control device. For example, different key layouts correspond to different brands, models and key functions, that is, different remote control function data, and according to an image including a key layout of the remote control device included in the first image, the remote control function data corresponding to the first image, that is, the remote control function data of the remote control device corresponding to the first image, can be determined.
In some embodiments, the first feature may be determined based on the first image. The first feature refers to relevant image features contained in the first image; it at least includes the key layout feature of the remote control device, and may also include the identification feature of the remote control device, appearance features such as the shape and size of the remote control device, application environment features, and other image features related to the remote control device.
In some embodiments, the corresponding first feature may be obtained by performing image recognition on the image, or the first feature corresponding to the first image may be obtained by inputting the first image into a feature generation model. The input of the feature generation model is an image, and the output is a feature vector corresponding to the image. Specifically, the first image is input into the feature generation model, which outputs a feature vector corresponding to the first image, and this feature vector can be used as the first feature corresponding to the remote control device. In some embodiments, the feature generation model may employ a neural network model, such as a conventional convolutional neural network model like CNN or VGG. Taking VGG as an example, the first image is input into the VGG, and the feature vector corresponding to the first image is obtained through the convolutional layers, pooling layers, and fully connected layers.
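As an illustration of such a feature generation model, the following sketch uses a pretrained VGG16 from torchvision as an assumed stand-in: the image passes through the convolutional and pooling layers, and the output of the classifier's penultimate fully connected layer is taken as the feature vector. The network choice, preprocessing, and layer selection are assumptions, since the patent does not fix a specific architecture.

```python
# Hypothetical feature-generation sketch: image -> feature vector via VGG16.
import torch
from torchvision import models, transforms
from PIL import Image


def extract_first_feature(image_path: str) -> torch.Tensor:
    vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
    vgg.classifier = vgg.classifier[:-1]   # drop the final classification layer
    vgg.eval()
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])
    image = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        feature_vector = vgg(image)        # shape: (1, 4096)
    return feature_vector.squeeze(0)
```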
The first feature may be used to determine the remote control function data corresponding to the remote control device. For example, the remote control function data of the corresponding remote control device may be determined directly based on the first feature; or the brand and/or model and the key functions of the corresponding remote control device may be determined based on the first feature, and the corresponding remote control function data determined based on that brand and/or model and those key functions. Specifically, a database containing the first feature and/or the brand, model, and key functions corresponding to the first feature together with the corresponding remote control function data may be established, and the remote control function data corresponding to the remote control device may be determined based on the first feature by searching the database. This may also be realized by training a neural network model to obtain a recognition model whose input is the first feature and whose output is the remote control function data corresponding to the remote control device. The recognition model may be a logistic regression model, a classification and regression tree, a support vector machine, a random forest, or another neural network model used for classification, which is not limited in this embodiment.
Step 320, determining first graph data corresponding to the remote control device based on the first characteristic, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge; each virtual edge is connected with two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the key corresponding to the first node connected with the virtual edge.
In particular, this step 320 may be performed by the first obtaining module 210.
In some embodiments, the first graph data corresponding to the remote control device may be determined based on the first feature. Graph data refers to data represented and stored in a graph structure, including nodes, edges, and attributes. The graph data determined from the first feature may be referred to as the first graph data, as shown in fig. 7.
A first node is a node in the first graph data representing a key and key-related information; each key corresponds to one first node, and the attribute of each first node may be determined based on the attribute of the corresponding key. As shown in 700 in fig. 7, the first graph data includes first nodes U1, U2, U3, U4, and U5. The attribute of a key refers to information related to the key, such as the function of the key, the position of the key, the shape of the key, and the size of the key. Since the remote control device may include a plurality of keys, the corresponding first graph data may include a plurality of first nodes, one for each key.
A virtual edge is an edge connecting nodes in the graph data. Each virtual edge connects two of the plurality of first nodes, and the corresponding first graph data may include at least one virtual edge. The attribute of a virtual edge may be determined based on the association information between the two first nodes it connects. The association information between the two connected first nodes may include position information of the keys corresponding to the two connected first nodes, and may further include functional associations, shape associations, and the like between those keys. In some embodiments, the edge attribute of a virtual edge may be determined based on the position information of the keys corresponding to the two first nodes it connects. For example, if the two first nodes connected by virtual edge A correspond to key a and key b, the edge attribute of virtual edge A may include the position (xa, ya) of key a, the position (xb, yb) of key b, the straight-line distance between key a and key b, the direction of the line connecting key a and key b in the coordinate system, and so on.
The constructed first graph data can more completely characterize the key layout of the remote control device, such as the relative positions of keys and the distribution of the plurality of keys. This helps the key layout to be better taken into account when determining the remote control function data corresponding to the remote control device, yielding more accurate results.
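A minimal sketch of how such first graph data might be assembled is shown below. Connecting every pair of keys with a virtual edge is only one possible choice (the patent requires at least one virtual edge), and the node and edge attribute fields are illustrative.

```python
# Hypothetical sketch: build first graph data from detected keys.
import math
from dataclasses import dataclass
from itertools import combinations


@dataclass
class Key:                      # one key extracted from the first feature
    function: str               # e.g. "power", "volume_up"
    x: float
    y: float
    shape: str = "rect"
    size: float = 1.0


def build_first_graph(keys: list[Key]) -> dict:
    # first nodes: one per key, attributes taken from the key attributes
    nodes = [{"function": k.function, "position": (k.x, k.y),
              "shape": k.shape, "size": k.size} for k in keys]
    # virtual edges: here every key pair, with position-derived attributes
    edges = []
    for i, j in combinations(range(len(keys)), 2):
        a, b = keys[i], keys[j]
        edges.append({
            "nodes": (i, j),
            "position_a": (a.x, a.y),
            "position_b": (b.x, b.y),
            "distance": math.hypot(b.x - a.x, b.y - a.y),
            "direction": math.atan2(b.y - a.y, b.x - a.x),  # radians
        })
    return {"nodes": nodes, "edges": edges}
```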
Step 330, determining a first result through the trained recognition model based on the first graph data, where the first result includes obtaining remote control function data corresponding to the remote control device or no remote control function data corresponding to the remote control device.
In particular, this step 330 may be performed by the first identification module 220.
In some embodiments, the remote control function data corresponding to the remote control device may be determined based on the first graph data. Specifically, the remote control function data corresponding to the remote control device may be determined from the output of the recognition model, whose input may be the first graph data.
In some embodiments, the result output by the recognition model may be referred to as the first result, and the first result may indicate either that the remote control function data corresponding to the remote control device is obtained or that there is no remote control function data corresponding to the remote control device. For example, remote control device 1 is of brand A and carries keys for up, down, on, and off; inputting the first graph data corresponding to remote control device 1 into the recognition model yields the remote control function data corresponding to the remote control device, namely type-a remote control function data including up, down, on, and off functions. For another example, remote control device 2 is a universal remote controller with no corresponding brand, or whose brand has no corresponding remote control function data; inputting the first graph data corresponding to remote control device 2 into the recognition model may yield the first result that there is no remote control function data corresponding to the remote control device.
In some embodiments, the recognition model may include a graph neural network model, such as a common graph neural network model like GNN or R-GCN. Taking R-GCN as an example, the first graph data corresponding to remote control device 1 is input into an R-GCN model; a vector representation of the first graph data is obtained through the R-GCN convolutions, and finally the prediction type corresponding to the first graph data is obtained using a softmax activation function.
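The following is a hypothetical sketch of such a recognition model as an R-GCN graph classifier built with PyTorch Geometric. Because R-GCN convolutions expect discrete edge relation types, the continuous edge attributes described above are assumed to have been bucketed into a small number of relation types; feature sizes, layer widths, and the convention that class 0 means "no corresponding remote control function data" are also assumptions.

```python
# Hypothetical R-GCN recognition model sketch (graph classification).
import torch
import torch.nn.functional as F
from torch_geometric.nn import RGCNConv, global_mean_pool


class RecognitionModel(torch.nn.Module):
    def __init__(self, node_dim=16, hidden_dim=64,
                 num_relations=4, num_classes=100):
        super().__init__()
        self.conv1 = RGCNConv(node_dim, hidden_dim, num_relations)
        self.conv2 = RGCNConv(hidden_dim, hidden_dim, num_relations)
        self.classifier = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, data):
        # data.x: node attributes, data.edge_index: virtual edges,
        # data.edge_type: relation type per edge, data.batch: graph membership
        x = F.relu(self.conv1(data.x, data.edge_index, data.edge_type))
        x = F.relu(self.conv2(x, data.edge_index, data.edge_type))
        graph_vec = global_mean_pool(x, data.batch)   # one vector per graph
        # logits; a softmax over these yields the predicted type as described
        return self.classifier(graph_vec)
```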
In some embodiments, step 332 or step 334 may be selectively executed according to whether the remote control function data corresponding to the remote control device is obtained.
Step 332, if the first result is that the remote control function data corresponding to the remote control device is obtained, determining the remote control function data corresponding to the remote control device based on the first result.
In particular, this step 332 may be performed by the first determination module 230.
When the first result is that the remote control function data corresponding to the remote control device is obtained, the first result also contains the determined remote control function data. Determining the remote control function data corresponding to the remote control device based on the first result means that the remote control function data contained in the first result can be used as the remote control function data corresponding to the remote control device.
Step 334, if the first result is that there is no remote control function data corresponding to the remote control device, then: and acquiring second characteristics of the controlled equipment matched with the remote control equipment, determining second graph data based on the second characteristics and the first graph data, and acquiring the remote control function data corresponding to the remote control equipment through the trained recognition model based on the second graph data.
In particular, this step 334 may be performed by the second determination module 240.
When the first result is that there is no remote control function data corresponding to the remote control device, the corresponding remote control function data may be determined in combination with information about the controlled device matched with the remote control device. For example, if the remote control device is a universal remote control device with no corresponding remote control function data for a corresponding brand/model, the recognition model outputs that there is no remote control function data corresponding to the remote control device; the corresponding remote control function data may then be determined based on information such as the brand and model of the matched controlled device together with the first feature, or based on that information together with the first graph data.
In some embodiments, a second feature of the controlled device matched with the remote control device may be obtained, where the second feature refers to related information features of the controlled device, and the second feature may include a brand feature, a model feature, an identification feature, an appearance feature, and other features of the controlled device related to identifying the brand and/or model of the controlled device.
In some embodiments, after the second feature is obtained, the corresponding remote control function data may be determined based on the second feature and the first feature. For example, a database containing the second feature and/or the brand and/or model of the controlled device corresponding to the second feature and the corresponding type of remote control function data may be established; by searching the database, the corresponding type of remote control function data, such as type-a or type-b remote control function data, can be determined based on the second feature. Based on the determined type of remote control function data and the first feature, the remote control function data corresponding to the remote control device may be determined. This may also be realized by training a neural network model to obtain a recognition model whose inputs are the first feature corresponding to the remote control device and the second feature corresponding to the matched controlled device, and whose output is the remote control function data of the remote control device. The recognition model may be a logistic regression model, a classification and regression tree, a support vector machine, a random forest, or another neural network model used for classification, which is not limited in this embodiment.
In some embodiments, the second graph data may be determined based on the second feature and the first graph data. The second graph data refers to graph data that can characterize both the first graph data and the second feature information. The remote control function data corresponding to the remote control device may be determined based on the second graph data. Specifically, the second graph data may be input into the trained recognition model, and the remote control function data corresponding to the remote control device obtained from its output. For example, remote control device 2 is a universal remote controller with keys for timing, on, and off; it has no corresponding brand, or its brand has no corresponding remote control function data, and the brand of the corresponding controlled device is A. Based on the second feature corresponding to remote control device 2 and the corresponding first graph data, the corresponding second graph data is determined; the second graph data is input into the recognition model, which outputs the remote control function data corresponding to the remote control device, namely type-a remote control function data including timing, on, and off functions.
In some embodiments, the recognition model may include a graph neural network model, such as a common graph neural network model like GNN or R-GCN. For more details on the recognition model including a graph neural network model, refer to the related description of step 330, which is not repeated here.
In some embodiments, determining the second graph data based on the second feature and the first graph data may be by: and determining a second node, determining a second node attribute of the second node based on the second characteristic, and determining second graph data according to the second node and the first graph data.
The second node is a node used for characterizing related information of the controlled device in the graph data, and one controlled device may correspond to at least one second node. As shown at 700 in fig. 7, a second node I1 is included in the second graph data. The attribute of the second node may include information about the controlled device, and may be determined based on the second characteristic. For example, the attributes of the second node may include a brand feature, a model feature, an identification feature, a profile feature, etc. of the controlled device that are relevant to identifying the brand and/or model of the controlled device.
In some embodiments, when the second graph data is determined according to the second node and the first graph data, the second graph data may be obtained by adding the second node to the first graph data, or it may be re-determined by combining the second node with the first nodes and the virtual-edge information of the first graph data.
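A minimal sketch of forming the second graph data by adding a second node to the first graph data (using the same dictionary representation as the earlier graph-construction sketch) is given below; linking the controlled-device node to every first node is an assumption, since the patent also allows reconstructing the graph from the nodes and virtual-edge information.

```python
# Hypothetical sketch: second graph data = first graph data + second node.
def build_second_graph(first_graph: dict, second_feature) -> dict:
    nodes = list(first_graph["nodes"])
    edges = list(first_graph["edges"])
    second_index = len(nodes)
    # second node carries the controlled-device feature as its attribute
    nodes.append({"type": "controlled_device", "feature": second_feature})
    for i in range(second_index):          # link device node to each key node
        edges.append({"nodes": (i, second_index), "relation": "key_of_device"})
    return {"nodes": nodes, "edges": edges}
```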
In some embodiments, the second characteristic may be obtained from other available data such as image information, text information, voice information, video information, and binary data containing information about the controlled device.
In some embodiments, the obtaining of the second feature may be performed by obtaining a second image of the controlled device matched with the remote control device, and inputting the second image into the feature generation model to obtain a second feature corresponding to the second image.
An image containing content related to the controlled device may be referred to as the second image. The second image may include an identification image of the controlled device, an overall image showing the shape, size and the like of the controlled device, or other images that can reflect the brand and/or model of the controlled device, and may also include an image of the application environment of the controlled device and other images related to the controlled device. The second image may be a 2D image, a 3D image, or another multi-dimensional image, which is not limited in this embodiment.
In some embodiments, the image of the controlled device may be obtained by scanning, photographing, shooting, and the like, which is not limited in this embodiment.
The input of the feature generation model is an image, and the output is a feature vector corresponding to the image. Specifically, the second image is input into the feature generation model, which outputs a feature vector corresponding to the second image, and this feature vector can be used as the second feature corresponding to the controlled device. In some embodiments, the feature generation model may employ a neural network model, such as a conventional convolutional neural network model like CNN or VGG. Taking VGG as an example, the second image is input into the VGG, and the feature vector corresponding to the second image is obtained through the convolutional layers, pooling layers, and fully connected layers.
In some embodiments, the second feature may be obtained by acquiring audio information and/or text information input by the user, where the audio information and the text information include device information of the controlled device; the second feature of the controlled device may then be obtained according to the audio information and/or the text information. For example, the user may input the device information of the controlled device by voice or by text, which may include information such as brand, model, identification, and appearance features; the second feature of the controlled device can be obtained by reading the audio information of the voice or the text content.
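As a toy illustration of extracting the second feature from the user's text input (speech would first be transcribed to text by a speech-to-text step, not shown), the sketch below matches the text against assumed brand and model vocabularies; the vocabularies and output fields are purely illustrative.

```python
# Hypothetical sketch: derive a simple second feature from user text input.
KNOWN_BRANDS = ["BrandA", "BrandB", "BrandC"]          # illustrative only


def second_feature_from_text(text: str) -> dict:
    text_lower = text.lower()
    brand = next((b for b in KNOWN_BRANDS if b.lower() in text_lower), None)
    # crude model-number guess: the first token containing a digit
    model = next((tok for tok in text.split()
                  if any(c.isdigit() for c in tok)), None)
    return {"brand": brand, "model": model, "device_type": None}
```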
In some embodiments, after the remote control function data corresponding to the remote control device is obtained, the remote control function data corresponding to each remote control device may be stored, so that the remote control device and other intelligent devices call the corresponding remote control function data when implementing control over each controlled device. In some embodiments, this may be achieved by the following method of steps 340 and 342. In some embodiments, the methods of steps 340 and 342 may be performed by the storage module 250.
And 340, adding an identifier to the remote control function data corresponding to the remote control device and storing the identifier, wherein the identifier is related to the controlled device matched with the remote control device, so as to obtain a remote control function database comprising at least one remote control function data corresponding to the remote control device.
In some embodiments, after obtaining the remote control function data corresponding to the remote control device, an identifier may be added to the remote control function data corresponding to each remote control device and stored. Specifically, the information may be stored in a memory of the remote control device and other intelligent devices, or in a server, or in other storage spaces such as a cloud platform, which is not limited in this embodiment.
The identifier refers to a label or data used to mark each set of remote control function data. The identifier may be determined based on device information related to the controlled device matched with the remote control device. For example, if the controlled device is an air conditioner, the identifier of the remote control function data of the matched remote control device may be "air conditioner"; if the air conditioner is in a bedroom, the identifier may be "bedroom air conditioner".
By adding and storing the identification to the remote control function data corresponding to each remote control device, a corresponding remote control function database can be obtained.
Step 342, the intelligent device searches for an identifier related to the target controlled device in the remote control function database, and invokes the remote control function data corresponding to the identifier related to the target controlled device to realize control of the target controlled device.
In some embodiments, when the remote control device and other intelligent devices control their respective controlled devices, the corresponding remote control function data may be called from the remote control function database to control the controlled devices. Specifically, the identifier of the target controlled device is searched in the remote control function database, and the remote control function data corresponding to that identifier is called. The target controlled device refers to the controlled device that needs to be controlled. For example, the remote control function database stores remote control function data identified as "bedroom air conditioner", "living room sound box" and "refrigerator"; if the target controlled device is the bedroom air conditioner, the "bedroom air conditioner" identifier is searched in the remote control function database and the corresponding remote control function data is called to realize control of the bedroom air conditioner.
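As a non-limiting illustration, the identifier-keyed storage and lookup described in steps 340 and 342 could be sketched as follows; the in-memory dictionary and the example codes stand in for whatever actual store (device memory, server, cloud platform) and code sets are used.

```python
# Minimal sketch: a remote control function database keyed by identifier.
from typing import Optional

class RemoteFunctionDatabase:
    def __init__(self):
        self._store = {}                               # identifier -> function data

    def add(self, identifier: str, function_data: dict) -> None:
        self._store[identifier] = function_data        # step 340: tag and store

    def lookup(self, target_identifier: str) -> Optional[dict]:
        return self._store.get(target_identifier)      # step 342: search by identifier

db = RemoteFunctionDatabase()
db.add("bedroom air conditioner", {"codes": {"power_on": "0xA1", "power_off": "0xA2"}})
db.add("living room sound box", {"codes": {"volume_up": "0xB1"}})

data = db.lookup("bedroom air conditioner")            # called by the smart device
if data is not None:
    print(data["codes"]["power_on"])                   # code to send to the target device
```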
In some embodiments, after the remote control function data corresponding to the remote control device is obtained, the remote control function data may be used to test the matched controlled device to verify whether the remote control function data is correct or not. In some embodiments, this may be accomplished by the methods of step 344, step 346, and step 348. In some embodiments, the methods of step 344, step 346, and step 348 may be performed by the test module 270.
Step 344, testing the controlled device matched with the corresponding remote control device by using the remote control function data.
The test refers to a control test of the controlled device. Specifically, testing the controlled device matched with the remote control device by using the remote control function data means that the remote control device or another intelligent device calls the remote control function data and controls the matched controlled device through control modes such as infrared, wireless or voice, to check whether the controlled device responds correctly: if not, the test fails; if so, the test succeeds. For example, the remote control function data is called to turn on an air conditioner; if the air conditioner does not turn on, the response is incorrect and the test fails; if the air conditioner turns on, the response is correct and the test succeeds.
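As a non-limiting illustration, the verification step might be sketched as below; send_code and read_power_state are hypothetical hooks for the actual infrared/wireless transport and the device status query, and the two-second wait is an arbitrary choice.

```python
# Minimal sketch: send a power-on code and check whether the device reacted.
import time

def test_remote_function_data(function_data: dict, send_code, read_power_state) -> bool:
    power_on = function_data.get("codes", {}).get("power_on")
    if power_on is None:
        return False                       # nothing to test with
    send_code(power_on)                    # e.g. emit the infrared code
    time.sleep(2)                          # allow the device time to respond
    return bool(read_power_state())        # True -> test succeeded, data is valid
```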
Step 346, if the test is successful, the remote control function data is valid.
In some embodiments, when the test is successful, the corresponding remote control function data may be considered valid data. That is, effective control of the matched controlled device can be achieved using the remote control function data.
Step 348, if the test fails, acquiring a brand list and/or a model list corresponding to the controlled device, and selecting a brand and/or a model of the controlled device matched with the remote control device based on the brand list and/or the model list; and re-determining the remote control function data corresponding to the remote control equipment based on the selected brand and/or model and the first characteristic.
In some embodiments, when the test fails, it indicates that the originally determined remote control function data is incorrect, and the remote control function data corresponding to the remote control device may be re-determined.
The brand list and/or model list includes the brands and/or models corresponding to each controlled device. The brand list and/or model list may be obtained from a database in which it is stored, or by inputting related information of each controlled device, which is not limited in this embodiment.
The selection of the corresponding brand and/or model from the brand list and/or model list may be implemented by a user or an intelligent device such as an intelligent robot, which is not limited in this embodiment.
After the brand and/or model of the controlled device is obtained, the remote control function data corresponding to the remote control device can be re-determined based on the brand and/or model and the first characteristic. For example, a database may be established that maps the brand and/or model of each controlled device to the corresponding type of remote control function data; by searching this database, the type of remote control function data, such as type-a or type-b remote control function data, can be determined from the brand and/or model of the controlled device. Based on the determined type of remote control function data and the first characteristic, the remote control function data corresponding to the remote control device can then be determined. Alternatively, this may be achieved by training an identification model whose input is the first characteristic of the remote control device together with the brand and/or model characteristic of the matched controlled device, and whose output is the remote control function data of the remote control device. The identification model may be a logistic regression model, a classification and regression tree, a support vector machine, a random forest, or another neural network model for classification, which is not limited in this embodiment.
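As a non-limiting illustration, the database-lookup variant of this re-determination could be sketched as follows; the table contents, the code library and the filtering by detected keys are assumptions of the example.

```python
# Minimal sketch: pick a code type from brand/model, then keep only the codes
# whose keys actually appear in the first (key-layout) characteristic.
from typing import Optional

CODE_TYPE_BY_BRAND_MODEL = {("brand_a", "model_1"): "type_a",
                            ("brand_b", "model_2"): "type_b"}

def redetermine_function_data(brand: str, model: str, first_feature: dict,
                              code_library: dict) -> Optional[dict]:
    code_type = CODE_TYPE_BY_BRAND_MODEL.get((brand, model))
    if code_type is None:
        return None
    full_codes = code_library[code_type]                  # all codes of that type
    present_keys = set(first_feature.get("keys", []))     # keys detected in the image
    return {name: code for name, code in full_codes.items() if name in present_keys}

library = {"type_a": {"power_on": "0xA1", "power_off": "0xA2", "temp_up": "0xA3"}}
print(redetermine_function_data("brand_a", "model_1",
                                {"keys": ["power_on", "power_off"]}, library))
```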
In some embodiments, the original recognition model can be optimized according to the re-determined remote control function data corresponding to the remote control device, so that the accuracy of the recognition model prediction is improved. In some embodiments, this may be accomplished by the method in step 352.
Step 352, constructing a loss function according to the difference between the remote control function data that failed the test and the re-determined remote control function data corresponding to the remote control device, and updating the parameters of the identification model based on the loss function.
In some embodiments, step 352 may be performed by test module 270.
In some embodiments, a loss function is constructed based on the difference between the remote control function data that failed the test and the re-determined remote control function data corresponding to the remote control device, and the parameters of the identification model are updated based on this loss function. That is, training of the recognition model may be continued: with this difference taken as the loss function and minimization of the loss function as the objective, the parameters of the recognition model are updated.
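As a non-limiting illustration, a single corrective update of the recognition model might look like the sketch below; treating the model as a PyTorch classifier over code types and using cross-entropy as the loss are assumptions of the example.

```python
# Minimal sketch: one gradient step toward the re-determined (corrected) data.
import torch
import torch.nn.functional as F

def update_on_failed_case(model, optimizer, graph_inputs, corrected_label: int) -> float:
    model.train()
    optimizer.zero_grad()
    logits = model(graph_inputs)                 # scores over remote-code types
    target = torch.tensor([corrected_label])     # label of the re-determined data
    loss = F.cross_entropy(logits, target)       # difference used as the loss function
    loss.backward()
    optimizer.step()                             # update the identification model parameters
    return loss.item()
```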
Steps 340-352 are not shown in fig. 3.
FIG. 4 is an exemplary flow diagram of a method of recognition model training, shown in accordance with some embodiments of the present description.
As shown in fig. 4, the method 400 may include:
Step 410, obtaining at least one first graph data sample corresponding to at least one remote control device and a first label corresponding to the at least one first graph data sample, where the first label includes the remote control function data corresponding to the first graph data sample or an indication that there is no remote control function data corresponding to the first graph data sample.
In particular, this step 410 may be performed by the recognition model training module 260.
The first graph data sample refers to sample data corresponding to the first graph data, and can be obtained based on a remote control device sample or from a graph database corresponding to the remote control devices. The at least one first graph data sample may include the first graph data of at least one remote control device sample, and each remote control device sample may correspond to at least one first graph data sample.
A label refers to the annotation of sample data. In some embodiments, labels may be obtained by manually annotating the sample data, or in other manners, for example from historical sample data with known identification results, which is not limited in this embodiment.
The label of a first graph data sample may be referred to as a first label. The first label may be the remote control function data corresponding to the first graph data sample, or an indication that no remote control function data corresponds to the first graph data sample. For example, remote control device 1 is a remote control device sample of brand A whose key layout includes up, down, on and off keys; the first label of the first graph data sample corresponding to remote control device 1 is type-A remote control function data including the up, down, on and off functions. For another example, remote control device 2 is a universal remote controller that has no corresponding brand, or whose brand has no corresponding remote control function data; the first label of the first graph data sample corresponding to remote control device 2 is that there is no corresponding remote control function data.
Step 420, obtaining at least one second feature sample corresponding to at least one controlled device matched with the remote control device, obtaining at least one second graph data sample based on the at least one second feature sample and the at least one first graph data sample, and obtaining a second label corresponding to the at least one second graph data sample, where the second label includes the remote control function data corresponding to the second graph data sample.
In particular, this step 420 may be performed by the recognition model training module 260.
The second feature sample refers to sample data corresponding to the second feature, and may be obtained based on the controlled device sample or obtained from a feature database corresponding to the controlled device. The at least one second characteristic sample may include a second characteristic sample of the at least one controlled device sample, and each controlled device sample may correspond to the at least one second characteristic sample.
The second graph data sample refers to sample data corresponding to the second graph data, and can be obtained based on the second feature sample and the first graph data sample. At least one second graph data sample can be obtained by pairing the at least one second feature sample with the at least one first graph data sample.
The label of a second graph data sample may be referred to as a second label. The second label may be the remote control function data corresponding to the second graph data sample. For example, remote control device 2 is a universal remote controller with timer, on and off keys; it has no corresponding brand, or its brand has no corresponding remote control function data, but the matched controlled device is of brand A. Based on the second feature sample corresponding to remote control device 2 and the corresponding first graph data sample, the second label of the resulting second graph data sample is type-A remote control function data including the timer, on and off functions.
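As a non-limiting illustration, one way to assemble a labeled second-graph-data sample is sketched below; the plain-dictionary graph representation and the choice to connect the device node to every key node are assumptions of the example.

```python
# Minimal sketch: extend first graph data (key nodes + virtual edges) with one
# extra node carrying the controlled device's second feature, then attach the
# second label.
def build_second_graph_sample(first_graph: dict, second_feature, label: str) -> dict:
    nodes = list(first_graph["nodes"])                 # key nodes with their attributes
    edges = list(first_graph["edges"])                 # (i, j, edge_attribute) tuples
    device_node = len(nodes)
    nodes.append({"attr": second_feature})             # node attribute from second feature
    edges.extend((device_node, i, None) for i in range(device_node))  # link to all keys
    return {"nodes": nodes, "edges": edges, "label": label}

sample = build_second_graph_sample(
    first_graph={"nodes": [{"attr": "timer"}, {"attr": "power_on"}, {"attr": "power_off"}],
                 "edges": [(0, 1, {"offset": (0, 40)}), (1, 2, {"offset": (40, 0)})]},
    second_feature=[0.12, 0.87],      # e.g. an embedding of controlled-device brand A
    label="type_a",                   # second label: the matching remote-code type
)
print(sample["label"], len(sample["nodes"]), len(sample["edges"]))
```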
Step 430, taking the at least one first graph data sample with the first label and the at least one second graph data sample with the second label as training samples, and training an initial recognition model to obtain the trained recognition model.
In particular, this step 430 may be performed by the recognition model training module 260.
The initial recognition model refers to an untrained original recognition model whose model parameters are still in their initialized state. After the initial recognition model is trained, the trained recognition model is obtained.
The recognition model may include a graph neural network model, and further details regarding the recognition model may be found in relation to step 330 in fig. 3, which will not be described herein.
Taking the at least one first graph data sample with the first label and the at least one second graph data sample with the second label as training samples to train the initial recognition model means that these labeled samples are respectively input into the initial recognition model, and the initial recognition model is trained so that its model parameters are updated.
In some embodiments, the recognition model may be trained by a common method based on the training samples to update the model parameters, for example by gradient descent. In some embodiments, training ends when the trained recognition model satisfies a preset condition, for example when the loss function converges or falls below a preset threshold.
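As a non-limiting illustration, such a training loop on synthetic data is sketched below; the tiny fully connected classifier over pooled graph features, the SGD optimizer and the stopping threshold are placeholders, not the graph neural network of the embodiments.

```python
# Minimal sketch: gradient-descent training that stops when the loss falls
# below a preset threshold (or after a maximum number of epochs).
import torch
import torch.nn as nn

def train_recognition_model(features: torch.Tensor, labels: torch.Tensor,
                            num_classes: int, loss_threshold: float = 0.05,
                            max_epochs: int = 200) -> nn.Module:
    model = nn.Sequential(nn.Linear(features.shape[1], 64), nn.ReLU(),
                          nn.Linear(64, num_classes))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # gradient descent
    criterion = nn.CrossEntropyLoss()
    for _ in range(max_epochs):
        optimizer.zero_grad()
        loss = criterion(model(features), labels)
        loss.backward()
        optimizer.step()
        if loss.item() < loss_threshold:                      # preset stopping condition
            break
    return model

# Made-up data: 100 samples of 16-dimensional pooled graph features, 3 code types.
X, y = torch.randn(100, 16), torch.randint(0, 3, (100,))
trained_model = train_recognition_model(X, y, num_classes=3)
```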
FIG. 5 is an exemplary flow chart illustrating another method for configuring remote control function data from a remote control device image according to some embodiments of the present description.
As shown in fig. 5, the method 500 may include:
step 510, obtaining a first image of a remote control device, and determining a first feature based on the first image, where the first feature includes a key layout feature of the remote control device.
In particular, this step 510 may be performed by the second obtaining module.
For the remote control function data, the remote control device, the first image and the first feature of the remote control device, the key layout feature of the remote control device, and the corresponding obtaining and determining method, reference may be made to the related description in step 310 in fig. 3, which is not described herein again.
Step 520, determining remote control function data corresponding to the remote control device through the trained recognition model based on the first characteristic.
In particular, this step 520 may be performed by the second identification module.
After the first characteristic is determined, the remote control function data corresponding to the remote control device can be determined based on it. In some embodiments, the trained recognition model may be used to determine the remote control function data corresponding to the remote control device based on the first characteristic. For more details on determining the remote control function data corresponding to the remote control device through the recognition model based on the first feature, reference may be made to the related description of step 310 in fig. 3, which is not repeated here.
Step 530, determining a controlled device matched with the remote control device based on the remote control function data.
In particular, this step 530 may be performed by a third determining module.
In some embodiments, when the remote control devices of a plurality of controlled devices, such as refrigerators, televisions and air conditioners, are identified together to determine their corresponding remote control function data, it may not be known which remote control device corresponds to which controlled device. After the remote control function data corresponding to a remote control device is determined, the controlled device matched with that remote control device can be determined based on the remote control function data.
In some embodiments, determining the controlled device matched with the remote control device based on the remote control function data may be accomplished by building a database that maps remote control function data types to the corresponding controlled device information. The type of the determined remote control function data is known, so the corresponding controlled device information can be found by searching the database, thereby determining the controlled device matched with the remote control device.
In some embodiments, determining the controlled device matched with the remote control device based on the remote control function data may be accomplished by training a controlled device identification model. The input of the controlled device identification model is the remote control function data, and the output is the corresponding controlled device. The controlled device identification model may be a logistic regression model, a classification and regression tree, a support vector machine, a random forest, or another neural network model for classification, which is not limited in this embodiment.
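As a non-limiting illustration, a controlled-device identification model of this kind could be sketched with scikit-learn as below; the binary encoding of the remote control function data and the three example classes are assumptions of the example.

```python
# Minimal sketch: classify the controlled device from which functions a code set contains.
from sklearn.ensemble import RandomForestClassifier

# Each row marks whether the code set has [power, temp_up, volume_up, timer] codes.
X_train = [[1, 1, 0, 1],   # air-conditioner-style code set
           [1, 0, 1, 0],   # sound-box-style code set
           [1, 0, 0, 1]]   # refrigerator-style code set
y_train = ["air conditioner", "sound box", "refrigerator"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print(clf.predict([[1, 1, 0, 0]]))   # likely 'air conditioner'
```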
FIG. 6 is an exemplary flow diagram of another method of recognition model training, shown in accordance with some embodiments of the present description.
As shown in fig. 6, the method 600 may include:
step 610, obtaining at least one first characteristic sample corresponding to at least one remote control device, and a sample label corresponding to the at least one first characteristic sample, where the sample label includes the remote control function data corresponding to the first characteristic sample.
In particular, this step 610 may be performed by the second recognition model training module.
The first feature sample refers to sample data corresponding to the first feature, and may be obtained based on a sample of the remote control device or obtained from a first feature library corresponding to the remote control device. The at least one first characteristic sample may include a first characteristic sample of at least one remote control device sample, and each remote control device sample may correspond to at least one first characteristic sample.
The label of a first feature sample may be referred to as a sample label. The sample label may be the remote control function data corresponding to the first feature sample. For example, remote control device 1 is a remote control device sample of brand A whose key layout includes up, down, on and off keys; the sample label of the first feature sample corresponding to remote control device 1 is type-A remote control function data including the up, down, on and off functions.
Step 620, training an initial recognition model by using the at least one first characteristic sample with the sample label to obtain the trained recognition model.
In particular, this step 620 may be performed by the second recognition model training module.
The initial recognition model refers to an untrained original recognition model whose model parameters are still in their initialized state. After the initial recognition model is trained, the trained recognition model is obtained.
Taking the at least one first feature sample with the sample label as training samples to train the initial recognition model means that these labeled samples are input into the initial recognition model, and the initial recognition model is trained so that its model parameters are updated.
In some embodiments, the initial recognition model may be trained by a common method based on the training samples to update the model parameters, for example by gradient descent. In some embodiments, training ends when the trained recognition model satisfies a preset condition, for example when the loss function converges or falls below a preset threshold.
Embodiments of the present specification also provide an apparatus, which includes a processor configured to execute the foregoing method for configuring remote control function data according to a remote control device image. The method can comprise the following steps: acquiring a first image of remote control equipment, and determining first characteristics based on the first image, wherein the first characteristics comprise key layout characteristics of the remote control equipment; determining first graph data corresponding to the remote control device based on the first characteristic, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge; each virtual edge is connected with two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the key corresponding to the first node connected with the virtual edge; determining a first result through a trained recognition model based on the first graph data, wherein the first result comprises obtaining remote control function data corresponding to the remote control equipment or no remote control function data corresponding to the remote control equipment; if the first result is that the remote control function data corresponding to the remote control equipment is obtained, determining the remote control function data corresponding to the remote control equipment based on the first result; if the first result is that the remote control function data corresponding to the remote control device does not exist, then: and acquiring second characteristics of the controlled equipment matched with the remote control equipment, determining second graph data based on the second characteristics and the first graph data, and acquiring the remote control function data corresponding to the remote control equipment through the trained recognition model based on the second graph data. Or the method may comprise: acquiring a first image of remote control equipment, and determining first characteristics based on the first image, wherein the first characteristics comprise key layout characteristics of the remote control equipment; determining remote control function data corresponding to the remote control equipment through a trained recognition model based on the first characteristic; and determining the controlled equipment matched with the remote control equipment based on the remote control function data.
The embodiment of the present specification further provides a computer-readable storage medium, where the storage medium stores computer instructions, and after the computer reads the computer instructions in the storage medium, the computer executes the foregoing method for configuring remote control function data according to a remote control device image. The method can comprise the following steps: acquiring a first image of remote control equipment, and determining first characteristics based on the first image, wherein the first characteristics comprise key layout characteristics of the remote control equipment; determining first graph data corresponding to the remote control device based on the first characteristic, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge; each virtual edge is connected with two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the key corresponding to the first node connected with the virtual edge; determining a first result through a trained recognition model based on the first graph data, wherein the first result comprises obtaining remote control function data corresponding to the remote control equipment or no remote control function data corresponding to the remote control equipment; if the first result is that the remote control function data corresponding to the remote control equipment is obtained, determining the remote control function data corresponding to the remote control equipment based on the first result; if the first result is that the remote control function data corresponding to the remote control device does not exist, then: and acquiring second characteristics of the controlled equipment matched with the remote control equipment, determining second graph data based on the second characteristics and the first graph data, and acquiring the remote control function data corresponding to the remote control equipment through the trained recognition model based on the second graph data. Or the method may comprise: acquiring a first image of remote control equipment, and determining first characteristics based on the first image, wherein the first characteristics comprise key layout characteristics of the remote control equipment; determining remote control function data corresponding to the remote control equipment through a trained recognition model based on the first characteristic; and determining the controlled equipment matched with the remote control equipment based on the remote control function data.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) by identifying the image of the remote control device and determining the corresponding remote control function data according to the key layout features, the remote control function data can be configured more conveniently and accurately without first searching for the brand and/or model, improving the user experience; (2) the graph data is constructed based on the image features of the remote control device, so that the key features and the layout information among the keys of the remote control device are represented more completely, and more accurate prediction is achieved through graph data identification; (3) the characteristics of the controlled device can be combined when constructing the graph data, so that when the remote control function data cannot be determined from the remote control device alone, it can still be determined by combining the characteristics of the controlled device, making the prediction more accurate; (4) the characteristic information of the controlled device can be obtained directly from an image of the controlled device, or from voice or text input by the user, so that the corresponding remote control function data can be determined more conveniently; (5) after an identifier is added to the determined remote control function data and the data is stored, the remote control function data corresponding to a controlled device can be called directly to realize control, which is more convenient and efficient and improves the user experience; (6) testing the determined remote control function data ensures the accuracy of the configured remote control function data and avoids incorrect configuration; (7) when the determined remote control function data is invalid, the parameters of the recognition model are updated based on the re-determined result, which can improve the recognition accuracy of the model. It is to be noted that different embodiments may produce different advantages; in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present description may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereof. Accordingly, aspects of this description may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block", "module", "engine", "unit", "component" or "system". Furthermore, aspects of the present description may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of this specification may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET and Python, conventional programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP and ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, the claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are qualified in some instances by the terms "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the stated number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, in specific examples such numerical values are set forth as precisely as practicable.
For each patent, patent application publication, and other material, such as articles, books, specifications, publications and documents, cited in this specification, the entire contents of each are hereby incorporated by reference into this specification. Application history documents that are inconsistent with or conflict with the contents of this specification are excluded, as are documents (now or later appended to this specification) that limit the broadest scope of the claims of this specification. It is to be understood that if the descriptions, definitions and/or uses of terms in the materials accompanying this specification are inconsistent with or contrary to those stated in this specification, the descriptions, definitions and/or uses of terms in this specification shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (20)

1. A method for configuring remote control function data from a remote control device image, comprising:
acquiring a first image of remote control equipment, and determining first characteristics based on the first image, wherein the first characteristics comprise key layout characteristics of the remote control equipment; determining first graph data corresponding to the remote control device based on the first characteristic, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge; each virtual edge is connected with two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the key corresponding to the first node connected with the virtual edge;
determining a first result through a trained recognition model based on the first graph data, wherein the first result comprises obtaining remote control function data corresponding to the remote control equipment or no remote control function data corresponding to the remote control equipment;
if the first result is that the remote control function data corresponding to the remote control equipment is obtained, determining the remote control function data corresponding to the remote control equipment based on the first result; if the first result is that the remote control function data corresponding to the remote control device does not exist, then:
acquiring second characteristics of controlled equipment matched with the remote control equipment, determining second graph data based on the second characteristics and the first graph data, and acquiring the remote control function data corresponding to the remote control equipment through the trained recognition model based on the second graph data;
the recognition model comprises a graph neural network model; the remote control function data comprises at least one group of remote control codes, and the intelligent equipment realizes control over controlled equipment matched with the remote control equipment according to the at least one group of remote control codes.
2. The method of claim 1, the determining the second graph data based on the second features and the first graph data comprising:
determining a second node, a second node attribute of the second node being determined based on the second feature;
determining the second graph data from the second node and the first graph data.
3. The method of claim 1, the obtaining a second characteristic of a controlled device matching the remote control device comprising:
acquiring a second image of the controlled device matched with the remote control device;
and inputting the second image into a feature generation model to obtain the second feature corresponding to the second image.
4. The method of claim 1, the obtaining a second characteristic of a controlled device matching the remote control device comprising:
acquiring audio information and/or text information input by a user, wherein the audio information and the text information comprise equipment information of the controlled equipment;
and obtaining the second characteristic of the controlled equipment according to the audio information and/or the text information.
5. The method of claim 1, further comprising:
adding an identifier to the remote control function data corresponding to the remote control equipment and storing the identifier, wherein the identifier is related to controlled equipment matched with the remote control equipment, and a remote control function database comprising at least one remote control function data corresponding to the remote control equipment is obtained;
and the intelligent equipment searches an identifier related to the target controlled equipment in the remote control function database, and calls the remote control function data corresponding to the identifier related to the target controlled equipment to realize the control of the target controlled equipment.
6. The method of claim 1, the training method of the recognition model comprising:
obtaining at least one first graph data sample corresponding to at least one remote control device and a first label corresponding to the at least one first graph data sample, wherein the first label comprises the remote control function data corresponding to the first graph data sample or the remote control function data not corresponding to the first graph data sample;
obtaining at least one second feature sample corresponding to at least one controlled device matched with the remote control device, obtaining at least one second image data sample based on the at least one second feature sample and the at least one first image data sample, and obtaining a second label corresponding to the at least one second image data sample, wherein the second label comprises the remote control function data corresponding to the second image data sample;
and taking the at least one first graph data sample with the first label and the at least one second graph data sample with the second label as training samples, and training an initial recognition model to obtain the trained recognition model.
7. The method of claim 1, further comprising:
testing the controlled equipment matched with the corresponding remote control equipment by adopting the remote control function data;
if the test is successful, the remote control function data is valid;
if the test fails, acquiring a brand list and/or a model list corresponding to the controlled equipment, and selecting the brand and/or the model of the controlled equipment matched with the remote control equipment based on the brand list and/or the model list; and re-determining the remote control function data corresponding to the remote control equipment based on the selected brand and/or model and the first characteristic.
8. The method of claim 7, further comprising:
and constructing a loss function according to the difference between the remote control function data corresponding to the test failure and the remote control function data corresponding to the remote control equipment which is determined again, and updating the parameters of the identification model based on the loss function.
9. The method of claim 1, further comprising:
determining voice control data corresponding to the remote control equipment based on the at least one group of remote control codes of the remote control function data corresponding to the remote control equipment;
and the intelligent equipment realizes the voice control of the controlled equipment matched with the remote control equipment according to the voice control data.
10. A system for configuring remote control function data from a remote control device image, comprising:
a first obtaining module, configured to: acquire a first image of a remote control device and determine a first characteristic based on the first image, wherein the first characteristic comprises a key layout characteristic of the remote control device; and determine first graph data corresponding to the remote control device based on the first characteristic, wherein the first graph data comprises a plurality of first nodes corresponding to a plurality of keys and at least one virtual edge; each virtual edge connects two of the plurality of first nodes, the first node attribute of each first node is determined based on the attribute of the corresponding key, and the edge attribute of each virtual edge is determined based on the position information of the keys corresponding to the first nodes connected by the virtual edge;
a first identification module, configured to determine a first result through a trained recognition model based on the first graph data, wherein the first result comprises that there is remote control function data corresponding to the remote control device or that there is no remote control function data corresponding to the remote control device;
a first determination module, configured to: if the first result is that the remote control function data corresponding to the remote control device is obtained, determine the remote control function data corresponding to the remote control device based on the first result;
a second determination module, configured to: if the first result is that there is no remote control function data corresponding to the remote control device, acquire a second characteristic of a controlled device matched with the remote control device, determine second graph data based on the second characteristic and the first graph data, and obtain the remote control function data corresponding to the remote control device through the trained recognition model based on the second graph data;
the recognition model comprises a graph neural network model; the remote control function data comprises at least one group of remote control codes, and the intelligent equipment realizes control over controlled equipment matched with the remote control equipment according to the at least one group of remote control codes.
11. The system of claim 10, the second determination module further to:
determining a second node, a second node attribute of the second node being determined based on the second feature;
determining the second graph data from the second node and the first graph data.
12. The system of claim 10, the second determination module further to:
acquiring a second image of the controlled device matched with the remote control device;
and inputting the second image into a feature generation model to obtain the second feature corresponding to the second image.
13. The system of claim 10, the second determination module further to:
acquiring audio information and/or text information input by a user, wherein the audio information and the text information comprise equipment information of the controlled equipment;
and obtaining the second characteristic of the controlled equipment according to the audio information and/or the text information.
14. The system of claim 10, further comprising a storage module to:
adding an identifier to the remote control function data corresponding to the remote control equipment and storing the identifier, wherein the identifier is related to controlled equipment matched with the remote control equipment, and a remote control function database comprising at least one remote control function data corresponding to the remote control equipment is obtained;
and the intelligent equipment searches an identifier related to the target controlled equipment in the remote control function database, and calls the remote control function data corresponding to the identifier related to the target controlled equipment to realize the control of the target controlled equipment.
15. The system of claim 10, further comprising a recognition model training module to:
obtaining at least one first graph data sample corresponding to at least one remote control device and a first label corresponding to the at least one first graph data sample, wherein the first label comprises the remote control function data corresponding to the first graph data sample or the remote control function data not corresponding to the first graph data sample;
obtaining at least one second feature sample corresponding to at least one controlled device matched with the remote control device, obtaining at least one second image data sample based on the at least one second feature sample and the at least one first image data sample, and obtaining a second label corresponding to the at least one second image data sample, wherein the second label comprises the remote control function data corresponding to the second image data sample;
and taking the at least one first graph data sample with the first label and the at least one second graph data sample with the second label as training samples, and training an initial recognition model to obtain the trained recognition model.
16. The system of claim 10, further comprising a test module to:
testing the controlled equipment matched with the corresponding remote control equipment by adopting the remote control function data;
if the test is successful, the remote control function data is valid;
if the test fails, acquiring a brand list and/or a model list corresponding to the controlled equipment, and selecting the brand and/or the model of the controlled equipment matched with the remote control equipment based on the brand list and/or the model list; and re-determining the remote control function data corresponding to the remote control equipment based on the selected brand and/or model and the first characteristic.
17. The system of claim 16, the test module further to:
and constructing a loss function according to the difference between the remote control function data corresponding to the test failure and the remote control function data corresponding to the remote control equipment which is determined again, and updating the parameters of the identification model based on the loss function.
18. The system of claim 10, further comprising a speech determination module to:
determining voice control data corresponding to the remote control equipment based on the at least one group of remote control codes of the remote control function data corresponding to the remote control equipment;
and the intelligent equipment realizes the voice control of the controlled equipment matched with the remote control equipment according to the voice control data.
19. An apparatus for configuring remote control function data from a remote control device image, comprising at least one storage medium and at least one processor, the at least one storage medium for storing computer instructions; the at least one processor is configured to execute the computer instructions to implement the method of any of claims 1-9.
20. A computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of any one of claims 1 to 9.
CN202011019103.4A 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image Active CN112085942B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011019103.4A CN112085942B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image
CN202210126675.5A CN114399896B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011019103.4A CN112085942B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210126675.5A Division CN114399896B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image

Publications (2)

Publication Number Publication Date
CN112085942A true CN112085942A (en) 2020-12-15
CN112085942B CN112085942B (en) 2021-12-31

Family

ID=73739847

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210126675.5A Active CN114399896B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image
CN202011019103.4A Active CN112085942B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210126675.5A Active CN114399896B (en) 2020-09-25 2020-09-25 Method and system for configuring remote control function data according to remote control equipment image

Country Status (1)

Country Link
CN (2) CN114399896B (en)

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7437062B2 (en) * 2005-11-10 2008-10-14 Eradas, Inc. Remote sensing system capable of coregistering data from sensors potentially having unique perspectives
KR101700616B1 (en) * 2012-06-27 2017-01-31 한국전자통신연구원 Multimedia apparatus and remote controller for frame synchronization and synchronizing method
KR102217238B1 (en) * 2014-07-23 2021-02-18 엘지전자 주식회사 Remote controller and operating method thereof
CN105635776B (en) * 2014-11-06 2019-03-01 深圳Tcl新技术有限公司 Pseudo operation graphical interface remoting control method and system
KR20190104317A (en) * 2016-12-13 2019-09-09 주식회사 랜티스 Back cover for mobile terminal with remote control function and control method of electronic products using same
KR102250078B1 (en) * 2018-08-10 2021-05-12 주식회사 더벨 A door-lock locking and unlocking apparatus, a central control server and a manufacturing system using the same
CN109448356B (en) * 2019-01-04 2020-04-28 恬家(上海)信息科技有限公司 Remote controller, intelligent equipment and configuration method
KR20190109337A (en) * 2019-09-06 2019-09-25 엘지전자 주식회사 Apparatus for controlling device based on augmentded reality and method thereof
KR102068277B1 (en) * 2019-10-04 2020-02-11 주식회사 루닛 Method and System for analysing image
CN110705551B (en) * 2019-10-09 2022-05-20 北京百度网讯科技有限公司 Key position identification method and device, electronic equipment and storage medium
CN111126917A (en) * 2019-12-24 2020-05-08 北京每日优鲜电子商务有限公司 Unit transfer value adjusting method, unit transfer value adjusting device, computer equipment and storage medium
CN111259828B (en) * 2020-01-20 2022-05-17 河海大学 High-resolution remote sensing image multi-feature-based identification method
CN111460132B (en) * 2020-03-10 2021-08-10 哈尔滨工业大学 Generation type conference abstract method based on graph convolution neural network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120161928A1 (en) * 2010-12-24 2012-06-28 Mstar Semiconductor, Inc. Display Apparatus, Remote Controller and Associated Display System
CN102346643A (en) * 2011-09-14 2012-02-08 华为终端有限公司 Realization method and device for learnable type remoter
CN102354273A (en) * 2011-09-21 2012-02-15 歌尔声学股份有限公司 Remote control method and electronic equipment for implementing same
CN103475806A (en) * 2012-06-08 2013-12-25 腾讯科技(深圳)有限公司 Remote-control adaptive control method, equipment and system thereof
CN103531015A (en) * 2013-10-08 2014-01-22 海信集团有限公司 Method for realizing remote control function, and electronic device thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095768A (en) * 2021-11-29 2022-02-25 四川长虹电器股份有限公司 Infrared remote controller learning method and system based on machine learning algorithm

Also Published As

Publication number Publication date
CN114399896A (en) 2022-04-26
CN114399896B (en) 2023-04-07
CN112085942B (en) 2021-12-31


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No.8, Kangping street, Jiangning Economic and Technological Development Zone, Nanjing, Jiangsu, 211106

Patentee after: Hansang (Nanjing) Technology Co.,Ltd.

Address before: No.8, Kangping street, Jiangning Economic and Technological Development Zone, Nanjing, Jiangsu, 211106

Patentee before: HANSONG (NANJING) TECHNOLOGY CO.,LTD.

CP01 Change in the name or title of a patent holder