
Method and device for establishing mapping relation, storage medium and electronic device

Info

Publication number
CN113961735A
Authority
CN
China
Prior art keywords
target
clothes
parameter set
determining
target clothes
Legal status
Pending
Application number
CN202111089516.4A
Other languages
Chinese (zh)
Inventor
孙晓鹏
Current Assignee
Qingdao Haier Technology Co Ltd
Haier Smart Home Co Ltd
Original Assignee
Qingdao Haier Technology Co Ltd
Haier Smart Home Co Ltd
Application filed by Qingdao Haier Technology Co Ltd, Haier Smart Home Co Ltd filed Critical Qingdao Haier Technology Co Ltd
Priority to CN202111089516.4A
Publication of CN113961735A

Classifications

    • G06F 16/58: Information retrieval of still image data; retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/68: Information retrieval of audio data; retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/84: Information retrieval of semi-structured data, e.g. markup language structured data such as SGML, XML or HTML; mapping; conversion
    • G06Q 50/10: ICT specially adapted for implementation of business processes of specific business sectors; services
    • G10L 15/22: Speech recognition; procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 25/54: Speech or voice analysis techniques specially adapted for comparison or discrimination, for retrieval

Abstract

The invention discloses a method and a device for establishing a mapping relation, a storage medium and an electronic device. The method comprises the following steps: acquiring image information of target clothes and first voice data obtained by a target object describing the target clothes in the image information; analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes; and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation indicates the mapping between the target clothes and the storage area. By adopting this technical solution, the problem in the related art that the target object can only manually input clothes parameter information and manually associate it with the storage area of the clothes is solved.

Description

Method and device for establishing mapping relation, storage medium and electronic device
Technical Field
The present invention relates to the field of communications, and in particular, to a method and an apparatus for establishing a mapping relationship, a storage medium, and an electronic apparatus.
Background
With the progress of science and technology and the development of artificial intelligence, intelligent algorithms are increasingly applied to daily life. For household appliances in particular, intelligent development is crucial, and the core of such intelligence lies in intelligently meeting the urgent needs of the target object.
In daily life, when looking for a piece of clothing, a user often has to rummage through boxes and cabinets and go through every wardrobe to determine where the clothing is. Existing clothes-storage schemes provide basic classification management, outfit recommendation and the like, but cannot automatically recognize clothes information by photographing, record the storage position through voice interaction, or automatically report where clothes are stored. The use experience is cumbersome: the target object needs to spend a lot of time entering the parameter information of the clothes, and usability is low.
In the related art, no effective solution has yet been proposed for the problem that the target object can only manually input clothes parameter information and manually associate it with the storage area of the clothes.
Disclosure of Invention
The embodiments of the invention provide a method and a device for establishing a mapping relation, a storage medium and an electronic device, so as to at least solve the problem in the related art that a target object can only manually input clothes parameter information and manually associate it with the storage area of the clothes.
According to an embodiment of the present invention, a method for establishing a mapping relationship is provided, including: acquiring image information of target clothes and first voice data obtained by describing the target clothes in the image information by a target object; analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes; and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes.
In an exemplary embodiment, after determining the set mapping relationship between the first parameter set and the second parameter set, the method further comprises: acquiring second voice data of the target object, wherein the second voice data is used for searching for target clothes; analyzing the second voice data to obtain a third parameter set of the target clothes; and inquiring a set mapping relation matched with the third parameter set in a database according to the third parameter set, and determining a storage area of the target clothes according to the set mapping relation.
In an exemplary embodiment, after determining the storage area of the target garment according to the set mapping relationship, the method further includes: determining a union set of the target clothes according to the set mapping relation, and determining a collocation manner corresponding to the target clothes according to the union set of the target clothes; and sending the collocation manner to display equipment so that the display equipment displays the collocation manner in a preset manner.
In an exemplary embodiment, determining a collocation manner corresponding to the target clothing according to the union set of the target clothing includes: acquiring weather information and schedule information of the target object; and determining a collocation mode corresponding to the target clothes according to the weather information, the schedule information and the union set.
In an exemplary embodiment, after determining the storage area of the target garment according to the set mapping relationship, the method further includes: determining first position information of the target object and a plurality of second position information of a plurality of display devices corresponding to the target object; and determining the distances between the plurality of display devices and the target object according to the plurality of second position information and the first position information.
In an exemplary embodiment, after determining the distances between the plurality of display devices and the target object according to the plurality of second location information and the first location information, the method further includes: determining a target display device closest to the target object according to the distances between the plurality of display devices and the target object; and sending the storage sub-area corresponding to the target clothes to the target display equipment so that the target display equipment displays the storage sub-area corresponding to the target clothes.
In an exemplary embodiment, parsing the image information to obtain a first parameter set of the target garment and parsing the first voice data to obtain a second parameter set of the target garment includes: identifying the image information to obtain a first parameter set of the target garment, wherein the first parameter set includes at least one of: a first category of the target garment, a first color of the target garment, a first material of the target garment; recognizing the first voice data to obtain a second parameter set of the target clothes, wherein the second parameter set at least comprises one of the following parameters: a second category of the target garment, a second color of the target garment, a second material of the target garment.
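As an illustrative aside (not part of the patent text), the first and second parameter sets above can be pictured as simple records whose defined fields form a set; the Python sketch below, with hypothetical names, shows one way to represent them and to take their union:

```python
from dataclasses import dataclass
from typing import Optional, Set, Tuple


@dataclass(frozen=True)
class GarmentParameterSet:
    """Category, color and material of a garment; any field may be missing."""
    category: Optional[str] = None
    color: Optional[str] = None
    material: Optional[str] = None

    def as_items(self) -> Set[Tuple[str, str]]:
        # Represent the parameter set as (field, value) pairs so a union can be taken.
        return {(k, v) for k, v in vars(self).items() if v is not None}


# First parameter set: recognized from the image information.
first_set = GarmentParameterSet(category="coat", color="black")
# Second parameter set: recognized from the first voice data.
second_set = GarmentParameterSet(category="coat", material="wool")

# Union set of the first parameter set and the second parameter set.
union_set = first_set.as_items() | second_set.as_items()
# -> {("category", "coat"), ("color", "black"), ("material", "wool")}
```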
According to another embodiment of the present invention, there is provided an apparatus for establishing a mapping relationship, including: the acquisition module is used for acquiring image information of target clothes and first voice data obtained by describing the target clothes in the image information by a target object; the analysis module is used for analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; the determining module is used for determining a union set of the first parameter set and the second parameter set and acquiring a storage area of the target clothes; the establishing module is used for establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, in which a computer program is stored, where the computer program is configured to execute the above method for establishing the mapping relationship when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the method for establishing the mapping relationship through the computer program.
In the embodiment of the invention, image information of target clothes and first voice data obtained by describing the target clothes in the image information by a target object are acquired; analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes; and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes, namely, a first parameter set and a second parameter set of the target clothes are obtained by identifying first voice data and image information, and further, the mapping relation is established according to the union set of the first parameter set and the second parameter set and the storage area of the clothes.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile terminal according to a method for establishing a mapping relationship in an embodiment of the present invention;
FIG. 2 is a flowchart of a method for establishing a mapping relationship according to an embodiment of the present invention;
FIG. 3 is a flowchart of a method for establishing a mapping relationship according to an alternative embodiment of the present invention;
FIG. 4 is a flow chart of a method of establishing a mapping relationship according to an alternative embodiment of the invention;
fig. 5 is a block diagram of a mapping relationship establishing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The method provided by the embodiment of the application can be executed in a mobile terminal, a computer terminal or a similar operation device. Taking the operation on the mobile terminal as an example, fig. 1 is a hardware structure block diagram of the mobile terminal of the method for establishing a mapping relationship according to the embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, which in an exemplary embodiment may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration, and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration with equivalent functionality to that shown in FIG. 1 or with more functionality than that shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of an application software, such as a computer program corresponding to the method for establishing the mapping relationship in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
In this embodiment, a method for establishing a mapping relationship is provided, which is applied to the mobile terminal described above, and fig. 2 is a flowchart of a method for establishing a mapping relationship according to an embodiment of the present invention, where the flowchart includes the following steps:
step S202, acquiring image information of target clothes and first voice data obtained by describing the target clothes in the image information by a target object;
step S204, analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes;
step S206, determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes;
step S208, establishing a set mapping relationship according to the union set and the storage area of the target clothes, wherein the set mapping relationship is used for indicating the mapping relationship between the target clothes and the storage area of the target clothes.
Through the steps, image information of the target clothes and first voice data obtained by describing the target clothes in the image information by the target object are obtained; analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes; and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes, namely, a first parameter set and a second parameter set of the target clothes are obtained by identifying first voice data and image information, and further, the mapping relation is established according to the union set of the first parameter set and the second parameter set and the storage area of the clothes.
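For illustration only, the following Python sketch walks through steps S202 to S208 end to end; the recognizer functions and the in-memory database are hypothetical stand-ins for the image-recognition and speech-recognition services and for the real storage backend:

```python
from typing import Dict, FrozenSet, Tuple


# Hypothetical stand-ins for the recognition services used in step S204;
# their outputs are (field, value) pairs describing the target clothes.
def recognize_image(image_bytes: bytes) -> set:
    return {("category", "coat"), ("color", "black")}    # first parameter set

def recognize_speech(voice_bytes: bytes) -> set:
    return {("category", "coat"), ("material", "wool")}  # second parameter set


# In-memory stand-in for the database of set mapping relations:
# union set of parameters -> storage area of the target clothes.
mapping_db: Dict[FrozenSet[Tuple[str, str]], str] = {}


def establish_mapping(image_bytes: bytes, voice_bytes: bytes, storage_area: str) -> None:
    first_set = recognize_image(image_bytes)       # step S204, image branch
    second_set = recognize_speech(voice_bytes)     # step S204, voice branch
    union_set = frozenset(first_set | second_set)  # step S206
    mapping_db[union_set] = storage_area           # step S208


establish_mapping(b"<photo>", b"<speech>", storage_area="wardrobe A, drawer 2")
```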
In an exemplary embodiment, the storage area of the target clothes is obtained by at least one of the following: determining a similarity value between each garment in the clothes storage cabinet and the target clothes in the image information, determining a first garment whose similarity value is larger than a preset threshold and a first storage area corresponding to the first garment, and determining the first storage area of the first garment as the storage area of the target clothes; acquiring fourth voice information obtained by the target object describing the storage area of the target clothes, and analyzing the fourth voice information to acquire the storage area of the target clothes; or acquiring a target operation performed by the target object on a mobile terminal for selecting the storage area of the target clothes, and analyzing the target operation to acquire the storage area of the target clothes.
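A minimal sketch of the first alternative (similarity against clothes already in the storage cabinet), assuming a hypothetical similarity scorer and an illustrative threshold value:

```python
from typing import Callable, List, Optional, Tuple

SIMILARITY_THRESHOLD = 0.8  # preset threshold; the value here is illustrative only


def storage_area_by_similarity(
    target_image: bytes,
    cabinet: List[Tuple[bytes, str]],             # (garment image, its storage area)
    similarity: Callable[[bytes, bytes], float],  # hypothetical scorer returning [0, 1]
) -> Optional[str]:
    # Compare the target clothes in the image information with each garment
    # already in the clothes storage cabinet; if a garment's similarity value
    # exceeds the preset threshold, reuse its storage area for the target clothes.
    best_area, best_score = None, SIMILARITY_THRESHOLD
    for garment_image, area in cabinet:
        score = similarity(target_image, garment_image)
        if score > best_score:
            best_area, best_score = area, score
    return best_area  # None means: fall back to voice description or manual selection
```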
In an exemplary embodiment, after determining the set mapping relationship between the first parameter set and the second parameter set, second voice data of the target object is obtained, wherein the second voice data is used for searching for target clothes; analyzing the second voice data to obtain a third parameter set of the target clothes; and inquiring a set mapping relation matched with the third parameter set in a database according to the third parameter set, and determining a storage area of the target clothes according to the set mapping relation.
That is to say, when the display device receives second voice data sent by the target object, it sends the second voice data to the cloud server. The cloud server analyzes the second voice data; when the analysis shows that the second voice data is used for searching for the target clothes, the cloud server determines the target object's description of the target clothes in the second voice data, namely the third parameter set, queries the database for a set mapping relation matching that description, and then determines the storage area corresponding to the target clothes according to the set mapping relation.
For example, the display device receives "find the black coat" sent by the target object and forwards it to the cloud server. The cloud server parses out "black" and "coat", determines from "find" that the target object wants to locate a target garment, queries the database for the set mapping relation of the "black coat" matching the target object's description, determines the storage area corresponding to the "black coat", and sends the result to the display device.
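Continuing the "find the black coat" example, the matching step could look like the sketch below; the toy keyword parser and the subset-matching rule are assumptions for illustration, not the patent's actual implementation:

```python
from typing import Dict, FrozenSet, Optional, Tuple

ParamSet = FrozenSet[Tuple[str, str]]

# Database stand-in: union set of (field, value) pairs -> storage area.
mapping_db: Dict[ParamSet, str] = {
    frozenset({("category", "coat"), ("color", "black"), ("material", "wool")}):
        "wardrobe A, drawer 2",
}


def parse_query(utterance: str) -> ParamSet:
    """Toy stand-in for the cloud server's parsing of the third parameter set."""
    third_set = set()
    if "black" in utterance:
        third_set.add(("color", "black"))
    if "coat" in utterance:
        third_set.add(("category", "coat"))
    return frozenset(third_set)


def find_storage_area(utterance: str) -> Optional[str]:
    third_set = parse_query(utterance)
    # Treat a set mapping relation as a match when the spoken description
    # is a subset of the stored union set.
    for union_set, area in mapping_db.items():
        if third_set and third_set <= union_set:
            return area
    return None


print(find_storage_area("find the black coat"))  # -> wardrobe A, drawer 2
```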
In an exemplary embodiment, after the storage area of the target clothes is determined according to the set mapping relationship, a union set of the target clothes is determined according to the set mapping relationship, and a collocation manner corresponding to the target clothes is determined according to the union set of the target clothes; and sending the collocation manner to display equipment so that the display equipment displays the collocation manner in a preset manner.
That is to say, the cloud server searches for a suitable collocation manner according to clothes parameter information such as the category, color and material of the target clothes selected by the target object, and sends the collocation manner to the display device. The display device may be determined as follows: determining first position information of the target object and a plurality of third position information of the plurality of display devices; determining the distances between the plurality of display devices and the target object according to the plurality of third position information and the first position information; and determining the display device according to those distances.
Further, when there are multiple target clothes, the collocation manners may be displayed in order of the use frequency of the target clothes, including but not limited to descending or ascending order of use frequency.
In an exemplary embodiment, determining a collocation manner corresponding to the target clothing according to the union set of the target clothing includes: acquiring weather information and schedule information of the target object; and determining a collocation mode corresponding to the target clothes according to the weather information, the schedule information and the union set.
This exemplary embodiment describes a method for determining the collocation manner corresponding to the target clothes. Specifically, current weather information and schedule information of the target object are obtained; a first collocation manner set corresponding to the union set, or the weather information, or the schedule information is determined; a second collocation manner set corresponding to the target clothes is determined in the first collocation manner set according to the union set, or the weather information, or the schedule information; and the collocation manner corresponding to the target clothes is determined in the second collocation manner set according to the union set, or the weather information, or the schedule information.
It should be noted that, the obtaining of the schedule information of the target object at least includes one of the following: acquiring the schedule information input by the target object; acquiring the schedule information from an account of the target object; acquiring the schedule information from a notebook of the target object; and acquiring the schedule information from the chat records of the target object.
For example, a "black overcoat" is selected for the target object, and the following first collocation set is determined according to the "black overcoat": black overcoat, sweater, jeans, sports shoes; black overcoat, shirt, jeans, sports shoes; black overcoat, shirt, half skirt, Martin boot; the black overcoat, the one-piece dress and the high-heeled shoes acquire current weather information and are cold, and the following second collocation mode set is determined in the first collocation mode set: black overcoat, sweater, jeans, sports shoes; black overcoat, shirt, jeans, sports shoes; black overcoat, one-piece dress, high-heeled shoes; and then climbing mountains according to the schedule information of the obtained target object, and determining the following collocation modes in the second collocation mode set: black overcoat, sweater, jeans, sports shoes; black overcoat, shirt, jeans, sports shoes. It should be noted that, the above examples are only for better understanding of the embodiment of the present invention, and the union set, the weather information, and the schedule information are not limited in the embodiment of the present invention.
In an exemplary embodiment, after determining the storage area of the target clothing according to the set mapping relationship, determining first position information of the target object and a plurality of second position information of a plurality of display devices corresponding to the target object; and determining the distances between the plurality of display devices and the target object according to the plurality of second position information and the first position information.
That is to say, when a plurality of display devices are bound to the account information of one target object, a target display device is determined among the plurality of display devices, specifically in one of the following ways: 1) sending query information to the target object, wherein the query information is used to ask which of the plurality of display devices should be the target display device; 2) determining the target display device among the plurality of display devices according to the distances between the plurality of display devices and the target object; 3) determining the display device with the highest use frequency among the plurality of display devices as the target display device.
In an exemplary embodiment, after determining the distances between the plurality of display devices and the target object according to the plurality of second location information and the first location information, the method further includes: determining a target display device closest to the target object according to the distances between the plurality of display devices and the target object; and sending the storage sub-area corresponding to the target clothes to the target display equipment so that the target display equipment displays the storage sub-area corresponding to the target clothes.
That is to say, after determining the distances between the plurality of display devices and the target object according to the plurality of second position information and the first position information, the plurality of distances are sorted according to a predetermined manner, a target display device closest to the target object is determined according to the sorted plurality of distances, and the storage sub-area corresponding to the target clothes is sent to the target display device, so that the display device displays the storage sub-area corresponding to the target clothes.
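A brief sketch of the distance-based selection described above, assuming positions are available as planar coordinates (the patent does not prescribe a coordinate system or distance metric):

```python
import math
from typing import Dict, Tuple

Position = Tuple[float, float]


def nearest_display_device(
    target_position: Position,              # first position information
    device_positions: Dict[str, Position],  # second position information per device
) -> str:
    # Compute the distance from each display device to the target object and
    # return the closest one; the storage sub-area of the target clothes is
    # then sent to this device for display.
    return min(
        device_positions,
        key=lambda device: math.dist(target_position, device_positions[device]),
    )


devices = {"bedroom screen": (1.0, 2.0), "living-room TV": (6.0, 5.0)}
print(nearest_display_device((0.5, 1.5), devices))  # -> bedroom screen
```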
In an exemplary embodiment, parsing the image information to obtain a first parameter set of the target garment and parsing the first voice data to obtain a second parameter set of the target garment includes: identifying the image information to obtain a first parameter set of the target garment, wherein the first parameter set includes at least one of: a first category of the target garment, a first color of the target garment, a first material of the target garment; recognizing the first voice data to obtain a second parameter set of the target clothes, wherein the second parameter set at least comprises one of the following parameters: a second category of the target garment, a second color of the target garment, a second material of the target garment.
That is, image information of the target clothes to be stored in the storage area is acquired; voice data of the target object is collected, and the parameter information of the target clothes carried in the voice data is analyzed, wherein the parameter information comprises at least one of the following: the category of the target clothes, the color of the target clothes and the material of the target clothes; the image information and the voice data are then analyzed jointly to determine the parameter information of the target clothes in the image information, and the parameter information is sent to the mobile terminal.
Further, after the union set of the first parameter set and the second parameter set is determined, third voice data of the target object may be obtained, wherein the third voice data is used for requesting washing and care of the target clothes; the third voice data is analyzed to obtain a fourth parameter set of the target clothes; and the target clothes matched with the fourth parameter set and the washing and care mode corresponding to the target clothes are queried in a database according to the fourth parameter set, wherein the database stores the correspondence among parameter sets, clothes and washing and care modes.
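A hedged sketch of this washing-and-care lookup in the same style; the matching rule and the care programs shown are placeholders rather than values from the patent:

```python
from typing import Dict, FrozenSet, Optional, Tuple

ParamSet = FrozenSet[Tuple[str, str]]

# Database stand-in: parameter set -> (garment label, washing-and-care mode).
care_db: Dict[ParamSet, Tuple[str, str]] = {
    frozenset({("category", "coat"), ("material", "wool")}):
        ("black wool coat", "wool program, cold wash, no tumble dry"),
}


def find_care_mode(fourth_set: ParamSet) -> Optional[Tuple[str, str]]:
    # Match the fourth parameter set (parsed from the third voice data) against
    # the stored parameter sets and return the garment and its care mode.
    for stored_set, result in care_db.items():
        if fourth_set and (fourth_set <= stored_set or stored_set <= fourth_set):
            return result
    return None


print(find_care_mode(frozenset({("category", "coat"), ("material", "wool")})))
```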
In order to better understand the process of the method for establishing the mapping relationship, the following describes a flow of the method for establishing the mapping relationship with an optional embodiment, but the flow is not limited to the technical solution of the embodiment of the present invention.
As shown in fig. 3, fig. 3 is a flowchart (I) of a method for establishing a mapping relationship according to an alternative embodiment of the present invention; the flow is as follows:
step S301: the target object opens the mobile terminal, enters the clothes-adding detail page, and taps to photograph the clothes image information or selects the clothes image information from the album;
step S302: after photographing is finished, a voice acquisition page is entered, prompting the target object to speak the category, color, material and other parameter information of the clothes to be stored, so as to obtain the first voice data;
step S303: the mobile terminal sends the image information and the first voice data to the cloud server, and the cloud server analyzes the category, color, material and other parameter information of the clothes through image recognition and voice recognition;
step S304: the clothes storage page is entered, and the cloud server returns the category, color, material and other parameter information of the clothes;
step S305: the target object confirms that the information is correct, selects the storage area in which the clothes are to be stored, supplements information such as the season and wearing occasions for which the clothes are suitable, and then confirms storage in that storage area.
As shown in fig. 4, fig. 4 is a flowchart (II) of a method for establishing a mapping relationship according to an alternative embodiment of the present invention; the flow is as follows:
step S401: the mobile terminal or the household appliance receives second voice data of the type, color and material of clothes to be searched by the target object;
step S402: the mobile terminal or the household appliance transmits the second voice data to the cloud server; the cloud server analyzes the second voice data to obtain field information such as the category, color and material of the clothes, and searches the database for the target clothes and the storage area of the target clothes;
step S403: the mobile terminal or the household appliance displays the target clothes and the storage area of the target clothes;
step S404: the target object taps the pushed clothes information on the mobile terminal or the household appliance to view the storage area;
step S405: the cloud server searches for a suitable collocation manner according to the color and style of the clothes selected by the target object and pushes the collocation manner to the mobile terminal or the household appliance.
By the embodiment, the image information of the target clothes and the first voice data obtained by describing the target clothes in the image information by the target object are acquired; analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes; and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes, namely, a first parameter set and a second parameter set of the target clothes are obtained by identifying first voice data and image information, and further, the mapping relation is established according to the union set of the first parameter set and the second parameter set and the storage area of the clothes.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Fig. 5 is a block diagram of a mapping relationship establishing apparatus according to an embodiment of the present invention; as shown in fig. 5, the apparatus includes:
an obtaining module 52, configured to obtain image information of a target garment and first voice data obtained by a target object describing the target garment in the image information;
the analysis module 54 is configured to analyze the image information to obtain a first parameter set of the target garment, and analyze the first voice data to obtain a second parameter set of the target garment;
a determining module 56, configured to determine a union set of the first parameter set and the second parameter set, and acquire a storage area of the target laundry;
an establishing module 58, configured to establish a set mapping relationship according to the union set and the storage area of the target garment, where the set mapping relationship is used to indicate a mapping relationship between the target garment and the storage area of the target garment.
By the device, image information of target clothes and first voice data obtained by describing the target clothes in the image information by a target object are obtained; analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes; determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes; and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes, namely, a first parameter set and a second parameter set of the target clothes are obtained by identifying first voice data and image information, and further, the mapping relation is established according to the union set of the first parameter set and the second parameter set and the storage area of the clothes.
In an exemplary embodiment, the determining module 56 is configured to obtain the storage area of the target clothes by at least one of the following: determining a similarity value between each garment in the clothes storage cabinet and the target clothes in the image information, determining a first garment whose similarity value is larger than a preset threshold and a first storage area corresponding to the first garment, and determining the first storage area of the first garment as the storage area of the target clothes; acquiring fourth voice information obtained by the target object describing the storage area of the target clothes, and analyzing the fourth voice information to acquire the storage area of the target clothes; or acquiring a target operation performed by the target object on a mobile terminal for selecting the storage area of the target clothes, and analyzing the target operation to acquire the storage area of the target clothes.
In an exemplary embodiment, the obtaining module 52 is further configured to obtain second voice data of the target object, where the second voice data is used to search for a target garment; analyzing the second voice data to obtain a third parameter set of the target clothes; and inquiring a set mapping relation matched with the third parameter set in a database according to the third parameter set, and determining a storage area of the target clothes according to the set mapping relation.
That is to say, when the display device receives second voice data sent by the target object, it sends the second voice data to the cloud server. The cloud server analyzes the second voice data; when the analysis shows that the second voice data is used for searching for the target clothes, the cloud server determines the target object's description of the target clothes in the second voice data, namely the third parameter set, queries the database for a set mapping relation matching that description, and then determines the storage area corresponding to the target clothes according to the set mapping relation.
For example, the display device receives "find the black coat" sent by the target object and forwards it to the cloud server. The cloud server parses out "black" and "coat", determines from "find" that the target object wants to locate a target garment, queries the database for the set mapping relation of the "black coat" matching the target object's description, determines the storage area corresponding to the "black coat", and sends the result to the display device.
In an exemplary embodiment, the determining module 56 is configured to determine a union set of the target clothes according to the set mapping relationship, and determine a collocation manner corresponding to the target clothes according to the union set of the target clothes; and sending the collocation manner to display equipment so that the display equipment displays the collocation manner in a preset manner.
That is to say, the cloud server searches for a suitable collocation manner according to clothes parameter information such as the category, color and material of the target clothes selected by the target object, and sends the collocation manner to the display device. The display device may be determined as follows: determining first position information of the target object and a plurality of third position information of the plurality of display devices; determining the distances between the plurality of display devices and the target object according to the plurality of third position information and the first position information; and determining the display device according to those distances.
Further, when there are multiple target clothes, the collocation manners may be displayed in order of the use frequency of the target clothes, including but not limited to descending or ascending order of use frequency.
In an exemplary embodiment, the determining module 56 is configured to obtain weather information and schedule information of the target object; and determining a collocation mode corresponding to the target clothes according to the weather information, the schedule information and the union set.
This exemplary embodiment describes a method for determining the collocation manner corresponding to the target clothes. Specifically, current weather information and schedule information of the target object are obtained; a first collocation manner set corresponding to the union set, or the weather information, or the schedule information is determined; a second collocation manner set corresponding to the target clothes is determined in the first collocation manner set according to the union set, or the weather information, or the schedule information; and the collocation manner corresponding to the target clothes is determined in the second collocation manner set according to the union set, or the weather information, or the schedule information.
It should be noted that, the obtaining of the schedule information of the target object at least includes one of the following: acquiring the schedule information input by the target object; acquiring the schedule information from an account of the target object; acquiring the schedule information from a notebook of the target object; and acquiring the schedule information from the chat records of the target object.
For example, a "black overcoat" is selected for the target object, and the following first collocation set is determined according to the "black overcoat": black overcoat, sweater, jeans, sports shoes; black overcoat, shirt, jeans, sports shoes; black overcoat, shirt, half skirt, Martin boot; the black overcoat, the one-piece dress and the high-heeled shoes acquire current weather information and are cold, and the following second collocation mode set is determined in the first collocation mode set: black overcoat, sweater, jeans, sports shoes; black overcoat, shirt, jeans, sports shoes; black overcoat, one-piece dress, high-heeled shoes; and then climbing mountains according to the schedule information of the obtained target object, and determining the following collocation modes in the second collocation mode set: black overcoat, sweater, jeans, sports shoes; black overcoat, shirt, jeans, sports shoes. It should be noted that, the above examples are only for better understanding of the embodiment of the present invention, and the union set, the weather information, and the schedule information are not limited in the embodiment of the present invention.
In an exemplary embodiment, the determining module 56 is configured to determine first position information of the target object and a plurality of second position information of a plurality of display devices corresponding to the target object; and determining the distances between the plurality of display devices and the target object according to the plurality of second position information and the first position information.
That is to say, when a plurality of display devices are bound to the account information of one target object, a target display device is determined among the plurality of display devices, specifically in one of the following ways: 1) sending query information to the target object, wherein the query information is used to ask which of the plurality of display devices should be the target display device; 2) determining the target display device among the plurality of display devices according to the distances between the plurality of display devices and the target object; 3) determining the display device with the highest use frequency among the plurality of display devices as the target display device.
In an exemplary embodiment, the determining module 56 is configured to determine a target display device closest to the target object according to the distances between the plurality of display devices and the target object; and sending the storage sub-area corresponding to the target clothes to the target display equipment so that the target display equipment displays the storage sub-area corresponding to the target clothes.
That is to say, after determining the distances between the plurality of display devices and the target object according to the plurality of second position information and the first position information, the plurality of distances are sorted according to a predetermined manner, a target display device closest to the target object is determined according to the sorted plurality of distances, and the storage sub-area corresponding to the target clothes is sent to the target display device, so that the display device displays the storage sub-area corresponding to the target clothes.
In an exemplary embodiment, the analyzing module 54 is configured to identify the image information to obtain a first set of parameters of the target garment, wherein the first set of parameters includes at least one of: a first category of the target garment, a first color of the target garment, a first material of the target garment; recognizing the first voice data to obtain a second parameter set of the target clothes, wherein the second parameter set at least comprises one of the following parameters: a second category of the target garment, a second color of the target garment, a second material of the target garment.
That is, image information of the target clothes to be stored in the storage area is acquired; voice data of the target object is collected, and the parameter information of the target clothes carried in the voice data is analyzed, wherein the parameter information comprises at least one of the following: the category of the target clothes, the color of the target clothes and the material of the target clothes; the image information and the voice data are then analyzed jointly to determine the parameter information of the target clothes in the image information, and the parameter information is sent to the mobile terminal.
An embodiment of the present invention further provides a storage medium including a stored program, wherein the program executes any one of the methods described above.
Alternatively, in the present embodiment, the storage medium may be configured to store program codes for performing the following steps:
s1, acquiring image information of the target clothes and first voice data obtained by describing the target clothes in the image information by the target object;
s2, analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes;
s3, determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes;
s4, establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, acquiring image information of the target clothes and first voice data obtained by describing the target clothes in the image information by the target object;
s2, analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes;
s3, determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes;
s4, establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing program codes, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, and in some cases, the steps shown or described may be performed in an order different than that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for establishing a mapping relation is characterized by comprising the following steps:
acquiring image information of target clothes and first voice data obtained by describing the target clothes in the image information by a target object;
analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes;
determining a union set of the first parameter set and the second parameter set, and acquiring a storage area of the target clothes;
and establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes.
2. The method for establishing a mapping relation according to claim 1, wherein after establishing the set mapping relation, the method further comprises:
acquiring second voice data of the target object, wherein the second voice data is used for searching for the target clothes;
analyzing the second voice data to obtain a third parameter set of the target clothes;
and querying a database for a set mapping relation matched with the third parameter set, and determining the storage area of the target clothes according to the set mapping relation.
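A hedged illustration of the lookup in claim 2 (the overlap-counting match strategy and the in-memory database are assumptions; the claim does not prescribe a particular matching algorithm):

```python
# Sketch of claim 2's lookup: the third parameter set obtained from the second
# voice data is compared against the stored set mapping relations; the entry with
# the largest overlap determines the storage area. Overlap counting is an assumption.

def find_storage_area(third_parameter_set, database):
    best_area, best_overlap = None, 0
    for union_set, storage_area in database.items():
        overlap = len(set(third_parameter_set) & set(union_set))
        if overlap > best_overlap:
            best_area, best_overlap = storage_area, overlap
    return best_area  # None when no stored mapping matches
```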
3. The method for establishing a mapping relation according to claim 2, wherein after determining the storage area of the target clothes according to the set mapping relation, the method further comprises:
determining a union set of the target clothes according to the set mapping relation, and determining a collocation manner corresponding to the target clothes according to the union set of the target clothes;
and sending the collocation manner to a display device so that the display device displays the collocation manner in a preset manner.
4. The method for establishing a mapping relation according to claim 3, wherein determining the collocation manner corresponding to the target clothes according to the union set of the target clothes comprises:
acquiring weather information and schedule information of the target object;
and determining the collocation manner corresponding to the target clothes according to the weather information, the schedule information and the union set.
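One possible, purely illustrative way to combine the union set with weather and schedule information when choosing a collocation manner for claims 3-4; the rule table and field names are hypothetical and not part of the claims:

```python
# Hypothetical rules: derive a collocation manner from the union set of the
# target clothes plus weather and schedule information.

def determine_collocation(union_set, weather, schedule):
    suggestions = []
    if weather.get("temperature_c", 20) < 10 or "coat" in union_set:
        suggestions.append("add a warm scarf")
    if weather.get("rain", False):
        suggestions.append("add a waterproof outer layer")
    if schedule.get("event") == "business meeting":
        suggestions.append("pair with formal trousers")
    return suggestions or ["no special collocation needed"]
```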
5. The method for establishing a mapping relation according to claim 2, wherein after determining the storage area of the target clothes according to the set mapping relation, the method further comprises:
determining first position information of the target object and a plurality of pieces of second position information of a plurality of display devices corresponding to the target object;
and determining the distances between the plurality of display devices and the target object according to the plurality of pieces of second position information and the first position information.
6. The method for establishing a mapping relation according to claim 5, wherein after determining the distances between the plurality of display devices and the target object according to the plurality of pieces of second position information and the first position information, the method further comprises:
determining a target display device closest to the target object according to the distances between the plurality of display devices and the target object;
and sending the storage sub-area corresponding to the target clothes to the target display device so that the target display device displays the storage sub-area corresponding to the target clothes.
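A minimal sketch of claims 5-6, assuming 2D coordinates for the target object and the display devices (the coordinate model is an assumption; the claims do not specify how positions are represented):

```python
# Sketch: compute distances from the target object to each display device and
# return the nearest one, to which the storage sub-area would then be pushed.
import math

def nearest_display_device(first_position, second_positions):
    """first_position: (x, y); second_positions: {device_id: (x, y)}."""
    def distance(device_id):
        dx = first_position[0] - second_positions[device_id][0]
        dy = first_position[1] - second_positions[device_id][1]
        return math.hypot(dx, dy)
    return min(second_positions, key=distance)
```

The returned device identifier would then be used to deliver the storage sub-area of the target clothes, for example through a hypothetical send_to_display(device_id, sub_area) call.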
7. The method for establishing a mapping relation according to claim 1, wherein analyzing the image information to obtain the first parameter set of the target clothes, and analyzing the first voice data to obtain the second parameter set of the target clothes comprise:
identifying the image information to obtain the first parameter set of the target clothes, wherein the first parameter set comprises at least one of the following: a first category of the target clothes, a first color of the target clothes, and a first material of the target clothes;
and recognizing the first voice data to obtain the second parameter set of the target clothes, wherein the second parameter set comprises at least one of the following: a second category of the target clothes, a second color of the target clothes, and a second material of the target clothes.
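For illustration only, the two parameter sets of claim 7 could be modelled as follows (the class and field names are hypothetical):

```python
# Hypothetical data structure for the parameter sets in claim 7: each set may
# carry a category, a color and/or a material of the target clothes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClothesParameterSet:
    category: Optional[str] = None   # e.g. "shirt"
    color: Optional[str] = None      # e.g. "white"
    material: Optional[str] = None   # e.g. "cotton"

first_parameter_set = ClothesParameterSet(category="shirt", color="white")   # from the image
second_parameter_set = ClothesParameterSet(color="white", material="cotton") # from the voice data
```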
8. An apparatus for establishing a mapping relation, comprising:
the acquisition module is used for acquiring image information of target clothes and first voice data obtained when a target object describes the target clothes in the image information;
the analysis module is used for analyzing the image information to obtain a first parameter set of the target clothes, and analyzing the first voice data to obtain a second parameter set of the target clothes;
the determining module is used for determining a union set of the first parameter set and the second parameter set and acquiring a storage area of the target clothes;
the establishing module is used for establishing a set mapping relation according to the union set and the storage area of the target clothes, wherein the set mapping relation is used for indicating the mapping relation between the target clothes and the storage area of the target clothes.
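As a rough, non-authoritative sketch, the apparatus of claim 8 can be pictured as four cooperating modules; all names and call signatures below are hypothetical:

```python
# Hypothetical decomposition of the apparatus in claim 8 into its four modules.
class MappingApparatus:
    def __init__(self, acquisition_module, analysis_module,
                 determining_module, establishing_module):
        self.acquisition = acquisition_module    # yields image info + first voice data
        self.analysis = analysis_module          # yields first and second parameter sets
        self.determining = determining_module    # yields union set + storage area
        self.establishing = establishing_module  # records the set mapping relation

    def run(self, target_clothes_source):
        image_info, voice_data = self.acquisition(target_clothes_source)
        first_set, second_set = self.analysis(image_info, voice_data)
        union_set, storage_area = self.determining(first_set, second_set)
        return self.establishing(union_set, storage_area)
```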
9. A computer-readable storage medium, comprising a stored program, wherein the program is operable to perform the method of any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to execute the method of any of claims 1 to 7 by means of the computer program.
CN202111089516.4A 2021-09-16 2021-09-16 Method and device for establishing mapping relation, storage medium and electronic device Pending CN113961735A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111089516.4A CN113961735A (en) 2021-09-16 2021-09-16 Method and device for establishing mapping relation, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111089516.4A CN113961735A (en) 2021-09-16 2021-09-16 Method and device for establishing mapping relation, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN113961735A true CN113961735A (en) 2022-01-21

Family

ID=79461937

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111089516.4A Pending CN113961735A (en) 2021-09-16 2021-09-16 Method and device for establishing mapping relation, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN113961735A (en)

Similar Documents

Publication Publication Date Title
CN104484450B (en) Clothing matching based on image recommends method and clothing matching recommendation apparatus
CN105224775B (en) Method and device for matching clothes based on picture processing
EP3438577A1 (en) Intelligent refrigerator and control method and control system thereof
CN110807085B (en) Fault information query method and device, storage medium and electronic device
CN109447714A (en) Advertisement recommended method, device, equipment, system and server
CN105700680B (en) A kind of control method and wearable device of smart machine
CN112507211A (en) Message pushing method and device, storage medium and electronic device
CN110532273A (en) The processing method and processing device of tables of data, storage medium, electronic device
CN112417277A (en) Clothes recommendation method and device, storage medium and electronic device
CN106708916A (en) Commodity picture searching method and commodity picture searching system
CN112030465A (en) Method and device for cleaning first object, storage medium and electronic device
CN114265927A (en) Data query method and device, storage medium and electronic device
CN111209368A (en) Information prompting method and device, computer readable storage medium and electronic device
CN111126457A (en) Information acquisition method and device, storage medium and electronic device
CN113009839B (en) Scene recommendation method and device, storage medium and electronic equipment
CN109408737B (en) User recommendation method, device and storage medium
CN108924915B (en) Data updating method and device based on flight mode
CN113961735A (en) Method and device for establishing mapping relation, storage medium and electronic device
CN111738181A (en) Object association method and device, and object retrieval method and device
CN113377970A (en) Information processing method and device
CN110895555A (en) Data retrieval method and device, storage medium and electronic device
CN112269894B (en) Article pool generation method, image search method, device, electronic equipment and medium
CN115221336A (en) Method and device for determining food storage information, storage medium and electronic device
KR20140006440U (en) System and method for plant idendification using both images and taxonomic characters
CN106776864A (en) A kind of image searching method and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination