CA3107808A1 - Photo assisted self-diagnosis of appliances - Google Patents

Photo assisted self-diagnosis of appliances

Info

Publication number
CA3107808A1
CA3107808A1
Authority
CA
Canada
Prior art keywords
appliance
electronic device
processor
data
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CA3107808A
Other languages
French (fr)
Inventor
The Vinh Nguyen
Thi Thanh Thuy Tran
Hoang Phuong Linh Vuong
Minh Man Huynh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fixease Services Inc
Original Assignee
Fixease Services Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fixease Services Inc filed Critical Fixease Services Inc
Priority to CA3107808A priority Critical patent/CA3107808A1/en
Publication of CA3107808A1 publication Critical patent/CA3107808A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/02 Reservations, e.g. for tickets, services or events
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 Operations research, analysis or management
    • G06Q 10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q 10/10 Office automation; Time management
    • G06Q 10/109 Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q 10/20 Administration of product repair or maintenance
    • G06Q 30/00 Commerce
    • G06Q 30/01 Customer relationship services
    • G06Q 30/04 Billing or invoicing

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Accounting & Taxation (AREA)
  • Molecular Biology (AREA)
  • Finance (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)
  • Image Analysis (AREA)

Abstract

Systems and methods for photograph assisted self-diagnosis of appliances. The method includes receiving, by a processor, a first set of data related to an appliance from an electronic device; identifying, by the processor, a model of the appliance based on the first set of data; receiving, by the processor, a second set of data from the electronic device; identifying, using a machine learning model, an issue of the appliance by using the second set of data; and transmitting, by the processor, a recommendation for fixing the issue to the electronic device.

Description

PHOTOGRAPH ASSISTED SELF-DIAGNOSIS OF APPLIANCES
TECHNICAL FIELD
[0001] Example embodiments relate to software-based appliance diagnosis, in particular, to systems and methods for photograph assisted self-diagnosis of appliances.
BACKGROUND
[0002] Repairing an appliance, including a home appliance, is a time-consuming and frustrating process. It can take more than two months to fix an appliance.
As well, repairing an appliance can be expensive, as the repair may involve one or several visits by paid technicians to fix the appliance.
[0003] Sometimes, even when operational manuals or technical documents of appliances are available, it may be difficult for a user to locate relevant information as too much information is included. For example, a user simply does not like to read through thick operational manuals or technical documents to find relevant information.
[0004] It would be advantageous to provide systems and methods which assist a user with step-by-step diagnosis and repair of appliances from photographs of the appliances.
SUMMARY
[0005] Example embodiments relate to responsive, convenient, efficient, and cost-effective systems and methods for photograph assisted self-diagnosis of appliances.
[0006] The system is convenient to use by providing step-by-step instructions to a user to perform self-diagnosis and repair. The system is efficient by providing the user recommendations for the user to fix the issue based on the diagnosed results. The system is also cost-effective in that a technician visit is needed only if the system cannot diagnose the issue or the user needs the technician to install a part or an appliance.
[0007] An example embodiment is a method, which includes: receiving, by a processor, a first set of data related to an appliance from an electronic device; identifying, by the processor, a model of the appliance based on the first set of data; receiving, by the processor, a second set of data from the electronic device; identifying, using a machine learning model, an issue of the appliance by using the second set of data; and transmitting, by the processor, a recommendation for fixing the issue to the electronic device.
[0008] Another example embodiment is a non-transitory memory containing instructions and statements which, when executed by a processor, cause the processor to perform the method.
[0009] Another example embodiment is a system, including one or more electronic devices for sending image data of an appliance, and a server for receiving the image data from the one or more electronic devices, the server being configured to perform the method.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Reference will now be made, by way of example, to the accompanying drawings which show example embodiments, and in which:
[0011] FIG. 1 is a schematic diagram of a system for photograph assisted self-diagnosis of appliances, according to an example embodiment;
[0012] FIG. 2A is a schematic diagram of an example architecture for a neural network including a convolutional neural network (CNN);
[0013] FIG. 2B illustrates example layers of the CNN in FIG. 2A;
[0014] FIG. 2C illustrates a convolution layer of the CNN in FIG. 2A;
[0015] FIG. 3 is a flow chart illustrating a process for photograph assisted self-diagnosis of appliances, according to an example embodiment;

Date Recue/Date Received 2021-02-02
[0016] FIGs. 4A-4I are user interfaces displayed on an electronic device during the process of FIG. 3, according to an example embodiment; and
[0017] FIG. 5 is a block diagram of the server in FIG. 1, according to an example embodiment.
[0018] Similar reference numerals may have been used in different figures to denote similar components.
DETAILED DESCRIPTION
[0019] Example embodiments relate to a system that provides step-by-step instructions to a user in a diagnosis and repair process using one or more images (photographs).
[0020] FIG. 1 illustrates a system 10 for photograph assisted self-diagnosis of appliances, according to an example embodiment. Appliances include home appliances, commercial or industrial appliances, or other appliances for home use or commercial use. Home appliances may include ovens, stoves, refrigerators, dishwashers, washers, dryers, air conditioners, vacuum cleaners, etc. Commercial or industrial appliances may include commercial freezers and ice makers, commercial dishwashers, commercial laundry machines, commercial cooking equipment, commercial refrigerators, etc.
[0021] The system 10 may include a server 100, a database 102, and one or more electronic devices 108. The server 100 is configured to output an appliance diagnosis result using artificial intelligence (AI) and a machine learning (ML) model. For example, as will be discussed in detail in relation to Figure 3, the server 100 is configured to detect the model of an appliance to be fixed and the cause of the issue of the appliance based on photographs or images of the appliance collected during a diagnosing process.
[0022] In some examples, the server 100 is configured to diagnose the appliance from images by using a machine learning model. For example, the server 100 can use a vision machine learning model to determine the issue of the appliance based on the images collected, e.g., using a convolutional neural network (CNN) based model. The server 100 is also configured to output recommendations or solutions for fixing the issue.
[0023] In some examples, the server 100 can be a cloud server. The system 10 in this case is a cloud platform, with which the collected image data may be accessed and stored in various physical locations globally.
[0024] The electronic device 108 can include a camera for acquiring or collecting images of an appliance, such as external appearance images, control panel images, internal condition images of the appliance, or other necessary images as determined by the server 100. The electronic device 108 can be a desktop, a laptop, or a mobile communication device, such as a smart phone or a tablet. In some examples, the electronic device 108 can take photographs on-site and transmit the photographs to the server 100 via the communication link 114. With the electronic device 108, the system 10 is suitable for use by home owners or business owners, or other users.
[0025] The electronic device 108 can include a dedicated application, for example for Android, iOS, Windows, or other appropriate operating systems, for taking pictures. A user can download the application to the electronic device 108. In some examples, the electronic device 108 can be configured to output instructions to a user, for example via a display screen, to capture various images of an appliance. The electronic device 108 can transmit captured images to the server for diagnosing the cause of the issue associated with the appliance.
[0026] The electronic device 108 is also configured to transmit the image data to the server 100 via one or more communication channels of a communication network, for example, a wireless communication network, the Internet, or other communication systems.
[0027] The server 100 can store the received image data in the database 102. In addition to image data, the database 102 can also store previous diagnosis results of different appliances, appliance product information, appliance parts information, maintenance information, operation manuals, supplier information, or other technical information of appliances, such as product recall notices, etc. The database 102 can be integrated in the server 100, or provided in a separate server. The database 102 may be configured to communicate with a remote server 112 of appliance or appliance parts suppliers, such as via a communication link 116, to receive updated appliance and part information from the remote server 112.
[0028] In some examples, the server 100 is configured to communicate with a third-party technician service server 118, via a communication link 120. The technician service server 118 allows a user to schedule a visit by a technician, or to have a live audio and/or video call with a certified technician by using the electronic device 108.
[0029] The communication links 106 can be provided by a wireless or wired communications network.
[0030] With image data of the appliance, the server 100 can use a machine learning model to identify the model and/or the cause of the issue of the appliance.
The machine learning model is configured to identify the cause of the issue of the appliance. For example, the machine learning model can generate recommendations for fixing the issue by retrieving relevant information from the database 102, and send the information or instructions to the electronic device 108 via the communication link 114.
[0031] The machine learning model can be implemented by an artificial neural network (also referred to simply as a "neural network") running on a computing platform such as server 100. Neural networks will be briefly described in general terms. A neural network can include multiple layers of neurons, each neuron receiving inputs from a previous layer, applying a set of weights to the inputs, and combining these weighted inputs to generate an output, which can in turn be provided as input to one or more neurons of a subsequent layer.
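The weighted-sum behaviour of a layer of neurons described above can be sketched in a few lines of Python; the layer sizes, weight values, and ReLU activation below are illustrative assumptions, not details taken from this application.

```python
import numpy as np

def dense_layer(inputs, weights, biases):
    """One layer of neurons: each neuron applies its weights to the
    previous layer's outputs, combines them, adds a bias, and passes
    the result through a non-linearity (a ReLU here)."""
    z = weights @ inputs + biases      # weighted inputs, combined
    return np.maximum(z, 0.0)          # ReLU non-linearity

# A 3-neuron layer fed by 2 inputs; its output would in turn be
# provided as input to neurons of a subsequent layer.
x = np.array([1.0, 2.0])
W = np.array([[0.5, -0.5],
              [1.0,  0.0],
              [-1.0, 1.0]])
b = np.array([0.0, 0.1, 0.0])
out = dense_layer(x, W, b)   # array([0. , 1.1, 1. ])
```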
[0032] A layer of neurons uses filters to define the relationship between the outputs of the neurons of the previous layer and the outputs of the neurons of the current layer. A layer of the neural network receives a data input, usually in the form of a data array of known dimensions. In the case of neural networks operating on 2D data such as image data, the data input to a layer of the network is generally a 3D array consisting of a set of multiple 2D input activation maps (also called input feature maps or input channels). By applying the set of filters to the set of input activation maps, the layer generates a data output, which is typically a data array having known dimensions. In the case of neural networks operating on 2D data, the data output of a layer of the network is generally a 3D array consisting of a set of multiple 2D output activation maps as described above.
[0033] A filter comprises a set of weights (also called parameters). In some neural networks, such as convolutional neural networks (CNNs), the weights of a filter are arranged into convolution kernels, such as 2D convolution kernels. Each kernel of a filter corresponds to a channel of the data input (i.e. an input activation map). The application of a single filter to the input volume (e.g. by convolving the kernels of the filter with the corresponding input activation maps of the data input) generates a single output activation map. The set of output activation maps generated by the set of filters of the convolution layer is the data output of the convolution layer.
[0034] The machine learning model is trained to infer the issue of an appliance. In the example of a neural network, training a neural network involves learning or determining the appropriate weight values at different weight locations throughout the network. After being optimally trained to perform a given inference task, the weights of the neural network will not all contribute equally to the final inference outputs: some weights will have high value due to their high contribution, while other weights will have low value due to their low contribution. If the weights are not properly trained (e.g., high value weights are misplaced or miscalibrated by training), then the trained network will perform with less accuracy. In system 10, the machine learning model can be trained by a suitable set of training data to determine appropriate weights.
The training data may include labeled images of various appliance products, various internal and external images of appliances, features of various issues that may be associated with appliances, image data of previous diagnoses, and other applicable data.
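The idea of "learning or determining the appropriate weight values" can be illustrated with a toy gradient-descent loop; the data, one-neuron model, and learning rate here are invented for illustration and are far simpler than training a CNN on labeled appliance images.

```python
import numpy as np

# Fit one linear neuron y = w*x + b to labeled examples by gradient
# descent on the mean squared error. The ground truth (w=3.0, b=0.5)
# and learning rate are illustrative placeholders.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 0.5                      # labels for the training data

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * x + b
    err = pred - y
    w -= lr * (2 * err * x).mean()     # gradient of MSE w.r.t. w
    b -= lr * (2 * err).mean()         # gradient of MSE w.r.t. b
# After training, (w, b) approach the values that best explain the labels.
```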
[0035] The trained machine learning model can be used to create and apply models for performing inference tasks, such as classifying images to be described below.
[0036] FIG. 2A illustrates an example architecture of a neural network 120 that is trained to classify an image of an appliance. The neural network 120 can be implemented by the server 100.

[0037] The neural network 120 can be a CNN. The neural network 120 in this example is designed for performing an object classification task, for example, to determine the cause of the issue of an appliance. The neural network 120 has been simplified, is not intended to be limiting, and is provided for the purpose of illustration only. The input data may be image data from the electronic device 108.
[0038] The neural network 120 includes a preprocessing block 122, which may perform various operations (e.g., normalization) to prepare the input data for the CNN 124. The input data include image data received from the electronic device 108.
[0039] The CNN 124 receives a set of preprocessed data and performs convolution operations, using convolution filters, to output a set of output activation maps. As will be discussed further below, the convolution filters each include a set of weights.
[0040] The output activation maps are output to a classifier block 126, which can then output the classification of an object represented by the input data. In some examples, the classifier block 126 may include one or more convolution layers or convolution blocks of a CNN, as well as additional operations such as one or more fully-connected layers and a SoftMax function to normalize the inference data generated by an output layer. The output classification may be the root cause of the malfunction inferred by the trained neural network 120. If the trained neural network 120 has not identified any root cause of the malfunction, the server 100 may be configured to send instructions to the electronic device 108 to acquire further image data, for example by taking further photographs at the specified position of the appliance.
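The SoftMax normalization mentioned above can be sketched as follows; the raw scores and the number of candidate root causes are hypothetical.

```python
import numpy as np

def softmax(logits):
    """Normalize raw scores from the output layer into a probability
    distribution over candidate classes (numerically stable form)."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical raw scores for three candidate root causes of a malfunction.
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)               # sums to 1.0
top = int(np.argmax(probs))           # index of the inferred root cause
```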
[0041] FIG. 2B illustrates example layers of the CNN 124. In the example of FIG. 2B, the CNN 124 may include one or more convolution layers 140 and pooling layers 150. An input image may be represented by a matrix. Each of the layers 140, 150, and 160 can be represented by a matrix.
[0042] The convolution layer 140 is configured to carry out a convolution operation using a kernel or filter to process an input image. The kernel may be a matrix with smaller dimensions than the matrix representing the image. The convolution layer 140 is configured to extract features of the input image, such as edges, color, and gradient orientation. The convolutional layer 140 is described in greater detail in relation to FIG. 2C.
[0043] The pooling layer 150 is configured to extract dominant features that are rotationally and positionally invariant. The pooling layer 150 thus further reduces the spatial size of the convolved image features and reduces the dimensionality of the matrix to be processed. The pooling layer 150 is configured to perform max pooling or average pooling. Max pooling returns the maximum value from the portion of the image covered by the kernel. Average pooling returns the average of all the values from the portion of the image covered by the kernel.
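Max pooling and average pooling as described can be sketched with NumPy; the 4x4 input map and 2x2 pooling window are illustrative.

```python
import numpy as np

def pool2d(img, k=2, mode="max"):
    """k x k max or average pooling with stride k over one channel."""
    H, W = img.shape
    blocks = img[:H - H % k, :W - W % k].reshape(H // k, k, W // k, k)
    return blocks.max(axis=(1, 3)) if mode == "max" else blocks.mean(axis=(1, 3))

a = np.array([[1., 2., 5., 6.],
              [3., 4., 7., 8.],
              [0., 0., 1., 1.],
              [0., 4., 1., 1.]])
mx = pool2d(a, mode="max")    # [[4., 8.], [4., 1.]]
avg = pool2d(a, mode="avg")   # [[2.5, 6.5], [1., 1.]]
```

Either mode halves each spatial dimension here, which is the dimensionality reduction the paragraph above refers to.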
[0044] The CNN 124 may include a fully-connected layer 160 for learning non-linear combinations of the high-level features represented by the output of the convolutional layer 140. The fully-connected layer 160 receives the output of the convolution layer 140 and/or the pooling layer 150 and predicts the best label to describe the input image, such as the issue associated with an appliance as indicated in the images received from the electronic device 108. The fully-connected layer 160 is configured to learn a possibly non-linear function in the space represented by the output of the convolutional layer.
[0045] FIG. 2C illustrates an example of the convolution layer 140 of the CNN 124, showing the dimensions of an input data array 144, an output data array 148, and a set of convolution filters 146 applied by the convolution layer 140. The input data array 144 is shown here as a multi-channel set of activation maps having a number of input channels (i.e. activation maps) equal to value C_in. Each channel of the input data array 144 consists of a 2D array, such as the image of an appliance consisting of a 2D pixel array, having a height H_in and a width W_in. Thus, the number of values stored in the input data array is equal to (H_in x W_in x C_in). The convolution filters 146 applied to the input data array 144 each have a height h, a width w, and a channel depth C_in. The convolution layer 140 uses a number of convolution filters 146 equal to value C_out.
[0046] The convolution layer 140 applies the convolution filters 146 to the input data array 144 in a series of convolution operations. Each convolution filter 146 is applied to the input data array 144 to generate a channel of the output data array 148, shown here as a multi-channel set of activation maps having a number of output channels (i.e. output activation maps) equal to value C_out. Each channel of the output data array 148 consists of a 2D array, such as an image consisting of a 2D pixel array, having a height H_out and a width W_out. The relationships between H_in and H_out, and between W_in and W_out, are determined by the kernel dimensions h and w and the stride, padding, and other convolution configuration values or hyperparameters used by the convolution operations of the convolution layer 140. In some embodiments, H_in = H_out and W_in = W_out. For example, an example embodiment may use kernel dimensions h = 3 and w = 3, with padding of 1 pixel and stride 1, to generate an output data array wherein H_in = H_out and W_in = W_out. The use of a convolution layer 140 wherein H_in = H_out and W_in = W_out may present certain advantages, for example in embodiments using hardware or software components optimized to process input channels having fixed dimensions.
[0047] As described above, a convolution layer applies a set of convolution filters to the set of input activation maps to generate the set of output activation maps. FIG. 2C shows an example convolution operation of a single filter of a convolution layer. The filter, having dimensions h x w x C_in, traverses the height and width of the set of input activation maps of dimensions (H_in x W_in x C_in), performing a convolution operation with pre-defined hyperparameter values (e.g. stride and padding values). Each computed dot product of the kernel with a single filter-sized portion of the set of input activation maps (i.e. a region of the input activation maps of dimensions h x w x C_in) generates a single value in a single output channel. By computing further dot products as the filter kernels traverse the height and width of the set of input activation maps, a single output channel, with dimensions (H_out x W_out), is generated.
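The relationship between input and output spatial sizes, as determined by kernel size, stride, and padding, follows standard convolution arithmetic; a small helper makes the kernel-3, padding-1, stride-1 case concrete (the 224-pixel input size is an arbitrary illustration, not taken from this application).

```python
def conv_out_size(n_in, kernel, stride=1, padding=0):
    """Standard convolution output size: floor((n_in + 2p - k) / s) + 1."""
    return (n_in + 2 * padding - kernel) // stride + 1

# With h = w = 3, padding of 1 pixel, and stride 1, the spatial size is
# preserved, matching the case where H_in = H_out and W_in = W_out.
same = conv_out_size(224, kernel=3, stride=1, padding=1)   # 224
# Without padding, each spatial dimension shrinks by k - 1.
valid = conv_out_size(224, kernel=3)                       # 222
```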
[0048] FIG. 3 is a flow chart illustrating a method 300 for photograph assisted self-diagnosis of an appliance, according to an example embodiment. The method 300 can be executed by the server 100. The method 300 can be applied to appliances, including home appliances and commercial appliances, or other electronic devices. The method 300 can identify a malfunction or an issue of an appliance using a camera. Based on the machine learning model of the server 100, which can be a "vision" machine learning model, the server 100 is configured to determine issues or malfunctions associated with an appliance, generate recommendations for fixing the malfunction, and, if necessary, order replacement parts or a replacement appliance.
In some examples, the server 100 is configured to schedule a technician visit to fix the malfunction or issue at the location of the appliance. The machine learning model can be implemented by using the trained neural network 120.
[0049] In method 300, at step 302, the server 100 receives a first set of data from the electronic device 108. The first set of data may be a barcode scanned by the electronic device 108, as shown in FIGs. 4A and 4B, which identifies the model of the appliance, or a model serial number of the appliance input to the electronic device 108 by a user. In some examples, the first set of data can be image data of the barcode or of the appliance model captured by the camera of the electronic device 108.
[0050] At step 304, the server 100 may identify the appliance model based on the scanned barcode, or the input model serial number of the appliance. When the first set of data is image data, the server 100 is configured to input the image data to the neural network 120, and the neural network 120 can use the weights or parameters determined for each layer in the model training process to infer the model of the appliance.
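Step 304 can be sketched as a lookup from the first set of data to an appliance model, with a fallback to image-based inference by the neural network; the barcode values and model names below are invented placeholders.

```python
# Hypothetical barcode-to-model table standing in for the server's
# model-identification data; real values would come from the database.
MODEL_TABLE = {
    "0123456789012": "Refrigerator RX-100",
    "9876543210987": "Dishwasher DW-20",
}

def identify_model(first_set_of_data):
    """Return the appliance model for a scanned barcode or serial number.

    Returns None when no direct match exists, in which case the server
    would fall back to inferring the model from image data instead."""
    return MODEL_TABLE.get(first_set_of_data)

assert identify_model("0123456789012") == "Refrigerator RX-100"
assert identify_model("unrecognized") is None
```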
[0051] After the server 100 has identified the model of the appliance, at step 306, the server 100 is configured to instruct the user to acquire a further set of data based on a predetermined troubleshooting procedure of the appliance model. In the example of FIG. 4C, the server 100 instructs the user to stand in front of the appliance. The further set of data may include image data, text data, or relevant information requested by the server 100. For example, the server 100 may instruct the user, for example by audio or text, to take further images of the appliance.
[0052] In an example, the server 100 is configured to instruct the user to first take a front image of the control panel. During the diagnosis process, the server 100 may instruct the user to take photographs of an appliance at the points of interest, such as at a control panel of the appliance. The photograph of a control panel may provide helpful information on the appliance.
For example, the number of lights lit on the console and the sequence of lighting may provide helpful information for identifying the issue.
[0053] If the signal indicated on the control panel is related to a specific aspect of the appliance, the server 100 may instruct the user to take further photographs of the appliance for that specific aspect. For example, if a home refrigerator is malfunctioning, the server 100 may first instruct the user to take a photograph of the control panel of the refrigerator. If the server 100 infers that the filter indicator on the control panel shows an abnormal condition, the server 100 may instruct the user to take further photographs of the filter or surrounding areas of the filter to further diagnose the root cause of the malfunction.
[0054] The electronic device 108 may send the requested further set of data to the server 100. Based on the image data received, the server 100 may be configured to provide further instructions. For example, based on the control panel images received, the server 100 may instruct the user to take certain actions, such as pressing a particular button related to temperature. After the button is pressed, the temperature information is displayed. The server 100 may read the temperature based on the photographs from the electronic device 108, and determine whether the temperature is within a normal range.
[0055] At step 308, the server 100 may use the machine learning model to diagnose the appliance using the image data from the electronic device 108. For example, the image data may first be preprocessed by the preprocessing block 122 so that it is suitable for further processing at the CNN 124. At the CNN 124, the convolutional layer 140 generates a convolved matrix from the input image data; the pooling layer 150 generates pooled convolved image data from the convolved image data; and based on the output of the convolution layer 140 and/or the pooling layer 150, the fully-connected layer 160 predicts the best label to describe the input image data. The server 100 uses the classifier 126 of the trained neural network 120 to identify features displayed in the image data of the appliance, and to classify possible issues of the appliance based on the identified features.
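The step 308 pipeline (preprocessing, convolution, pooling, fully-connected prediction, classification) can be sketched end to end in NumPy; the filter values, map sizes, and issue labels are illustrative assumptions, and a real deployment would use a trained CNN rather than the random weights below.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d(img, kernel):
    """Valid convolution of one kernel over a single-channel image."""
    h, w = kernel.shape
    H, W = img.shape
    out = np.empty((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + h, j:j + w] * kernel).sum()
    return out

img = rng.uniform(size=(8, 8))                 # preprocessed (normalized) photo
feat = np.maximum(conv2d(img, rng.normal(size=(3, 3))), 0)  # conv + ReLU -> 6x6
pooled = feat.reshape(3, 2, 3, 2).max(axis=(1, 3))          # 2x2 max pool -> 3x3
W_fc = rng.normal(size=(3, 9))                 # fully-connected layer, 3 classes
logits = W_fc @ pooled.ravel()
probs = np.exp(logits - logits.max())
probs /= probs.sum()                           # SoftMax over candidate issues
labels = ["door seal worn", "filter clogged", "no issue found"]  # hypothetical
prediction = labels[int(np.argmax(probs))]
```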
[0056] During the diagnosis process, the server 100 may generate a series of questions to the user on the electronic device 108. The questions may be designed based on possible issues classified by the trained neural network 120, the operational manual or maintenance manual of the model of the appliance, and diagnosing and troubleshooting data collected over years. The questions may be generated by the server 100 based on received image data and the answers provided by the user to previous questions. The questions may include requesting the user to confirm certain information, and/or to take one or more photographs of the appliance in question. In response, the user may provide the requested information, and/or use the electronic device 108 to take further photographs or videos at the requested positions. The user may then use the electronic device 108 to transmit to the server 100 the answers and/or the photographs or videos captured.
[0057] For example, in FIG. 4D, the server 100 instructs the user to point the camera of the electronic device 108 at the freezer area of a refrigerator. In FIG. 4E, the server 100 determines that the freezer works properly based on the photos of the freezer received from the electronic device 108, and displays this result on the screen of the electronic device 108. In FIG. 4F, the server 100 instructs the user to point the camera of the electronic device 108 inside the refrigerator, close to the back portion of the refrigerator. In FIG. 4G, the server 100 determines that the back portion has frost, and displays this result on the screen of the electronic device 108. In FIG. 4H, the server 100 instructs the user to point the camera of the electronic device 108 at the control panel on the front of the refrigerator. In FIG. 4I, the server 100 determines that the signals on the control panel indicate the possible issue of the refrigerator based on the photos of the control panel received from the electronic device 108, and displays this result on the screen of the electronic device 108.
[0058] As well, the trained neural network 120 may be configured to diagnose the appliance by identifying typical features or symptoms of possible issues of the appliance using the captured images. For example, the server 100 may diagnose an issue based on similar features or symptoms manifested in the image data of similar issues that were successfully diagnosed.
[0059] Based on the diagnosis result for the appliance, at step 309, the server 100 is configured to generate recommendations for fixing the issue of the appliance.
The server 100 may generate the recommendation based on the classified issue, the operational manual, maintenance manual, and/or other technical instructions provided by the manufacturer of the appliance. For example, the neural network 120 may be trained to output recommendations based on the identified issue and information stored in a database, including the operational manual or maintenance manual, or other technical instructions provided by the manufacturer of the appliance. The database may be database 102. The recommendations may include steps or operations to be performed by the user to fix the issue.
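The mapping from a classified issue to manual-derived fix steps can be sketched as a simple lookup with a fallback. The issue names and fix steps below are hypothetical illustrations; in the described system they would come from database 102 and the manufacturer's manuals.

```python
# Hypothetical mapping from classified issues to manual-derived fix steps.
RECOMMENDATIONS = {
    "frost_buildup": [
        "Unplug the refrigerator and let it defrost for 24 hours.",
        "Inspect the defrost heater and timer.",
    ],
    "dirty_air_filter": [
        "Replace the air filter with the part listed for your model.",
    ],
}

def recommend(issue):
    """Look up fix steps for a classified issue; fall back to a technician visit."""
    return RECOMMENDATIONS.get(issue, ["Schedule a visit by a certified technician."])
```

The fallback branch mirrors the later step in which unresolved issues are escalated to a technician booking.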
[0060] In some examples, the neural network 120 may infer that merely a reset of the appliance is required to address the issue. The server 100 can instruct the user to press the reset button of the appliance. In some examples, the neural network 120 may identify that a part, such as a filter of a refrigerator, needs to be replaced, and the server 100 can instruct the user to replace the part accordingly.
[0061] In some examples, the server 100 may also suggest a preventive maintenance schedule for the appliance to prevent certain issues from arising, such as every three months or every six months, based on the type of the appliance, the model of the appliance, and the maintenance recommendations of the manufacturer. The server 100 may be configured to send a maintenance reminder, or a warranty expiry reminder, to the user via the electronic device 108. Preventive maintenance may prolong the life of the appliance.
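Computing the next reminder date from a per-type interval is straightforward. A minimal sketch, assuming illustrative interval values; real intervals would come from the manufacturer's maintenance recommendations.

```python
from datetime import date, timedelta

# Assumed maintenance intervals in days (hypothetical values for illustration).
MAINTENANCE_INTERVALS = {"refrigerator": 90, "dishwasher": 180}

def next_maintenance(appliance_type, last_service):
    """Compute the next preventive-maintenance date from the last service date."""
    days = MAINTENANCE_INTERVALS.get(appliance_type, 180)
    return last_service + timedelta(days=days)

due = next_maintenance("refrigerator", date(2021, 2, 2))  # -> 2021-05-03
```

The server would persist `due` per appliance and push the reminder to the electronic device when the date arrives.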
FIGs. 4N and 4O are exemplary user interfaces of the electronic device 108, displaying maintenance information for defrosting and changing the filter.
[0062] In some examples, the neural network 120 may infer that one or more parts of the malfunctioning appliance need to be replaced; the recommendations may include instructing the user to order the replacement parts or the replacement appliance identified by the neural network 120. For example, the server 100 may recommend to the user webpages or web links for ordering replacement parts or appliances. In the example of FIG. 4J, the server 100 determines that the air filter of the refrigerator needs to be replaced, and displays the air filter information on the screen of the electronic device 108. The user may order the air filter with this information.
[0063] In some examples, the server 100 may communicate, for example via APIs, with a third party server 112, which provides up-to-date replacement part and appliance information, availability of the replacement parts or appliances, and price information.

Date Recue/Date Received 2021-02-02
[0064] As described above, the database 102 can store product information, parts information, and operational or maintenance manual information for the appliance. In some examples, the server 100 can query the database 102 to identify the proper replacement parts that may be used to fix the issue.
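The parts lookup against database 102 can be sketched with an in-memory SQLite table. The schema, model number, and part number below are hypothetical stand-ins for illustration only.

```python
import sqlite3

# In-memory stand-in for database 102; schema and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (model TEXT, issue TEXT, part_no TEXT)")
conn.execute("INSERT INTO parts VALUES ('RF-100', 'dirty_air_filter', 'AF-42')")

def find_parts(model, issue):
    """Query replacement parts that address a diagnosed issue for a given model."""
    cur = conn.execute(
        "SELECT part_no FROM parts WHERE model = ? AND issue = ?", (model, issue))
    return [row[0] for row in cur.fetchall()]

parts = find_parts("RF-100", "dirty_air_filter")  # -> ['AF-42']
```

Parameterized queries (the `?` placeholders) keep the lookup safe when the model string comes from a scanned bar code or user input.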
[0065] The server 100 is configured to allow a user to order the replacement parts or appliance using an ordering webpage or a web link via the electronic device 108. If the server 100 determines that the issue must be fixed by a professional, for example, when an electrical part of the appliance needs to be replaced, the neural network 120 may provide a certified technician scheduling application or webpage that allows the user to schedule a visit by a certified technician.
[0066] At step 310, the server 100 may send a query to the electronic device 108 to confirm whether the issue has been fixed. If the user confirms that the issue has been fixed, the server 100 may close the case.
[0067] After the issue has been fixed, the server 100 can also use the acquired images to further train the machine learning model to provide more practical or effective recommendations.
For example, a recommendation that does not help address the issue may result in further training of the neural network such that the recommendation in question is less likely to be recommended to future users with appliances having a similar issue. Training the machine learning model with previous successful diagnoses makes the diagnosing process more efficient and effective.
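The feedback loop above can be illustrated with per-recommendation scores that rise on success and fall on failure. This is a simplified sketch of the idea; the described system retrains the neural network itself, and the recommendation names and learning rate here are hypothetical.

```python
def update_score(scores, recommendation, fixed, lr=0.2):
    """Raise the score of a recommendation that fixed the issue; lower one that did not."""
    delta = lr if fixed else -lr
    scores[recommendation] = scores.get(recommendation, 0.5) + delta
    return scores

scores = {}
update_score(scores, "replace_filter", fixed=False)   # failed -> score drops below 0.5
update_score(scores, "reset_appliance", fixed=True)   # worked -> score rises above 0.5
```

Recommendations would then be ranked by score, so ineffective fixes are surfaced less often for similar issues.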
[0068] If the issue has not been fixed, at step 312, the server 100 determines whether any other aspect of the appliance should be checked. If there is at least one remaining aspect of the appliance to be checked, the server 100 is configured to repeat steps 306 to 310 described above.
[0069] If the issue has not been fixed, and all of the possible aspects of the appliance have been checked, at step 314, the server 100 may be configured to allow the user to schedule a visit by a professional or technician to fix the issue, or to install the ordered replacement parts. For example, the server 100 may be configured to provide an application or a webpage, or provide a web link, which allows a user to schedule a visit by a technician. For example, the server 100 may be configured to communicate with a third party technician booking system 118 via APIs.
FIG. 4K is an exemplary user interface on the electronic device 108, which allows the user to schedule a visit by a technician.
[0070] Alternatively, the server 100 may be configured to enable a user to communicate, via the electronic device 108, with a trained technician at a central service center, via a live voice or video call. The technician can guide the user to diagnose and troubleshoot the appliance. For example, this option can be used if the model of an appliance is too old and can no longer be found on the market.
[0071] The server 100 may be configured to provide an online payment option for goods and services. For example, the server 100 may support online payment for replacement part or appliance ordering by communicating with the third party server 112, or for technician visit by communicating with the third party technician booking system 118. FIG. 4L is an exemplary user interface on the electronic device 108, which allows the user to pay for the order placed, such as a replacement part, or a visit by a technician. FIG. 4M is an exemplary user interface on the electronic device 108, showing the summary of the orders placed by the user.
[0072] The method 300 may also be used by a user to fix an issue when the user installs an appliance.
[0073] The system 10 can also be configured to diagnose and troubleshoot electronic devices using the method 300 described above.
[0074] In some examples, the appliance models, acquired images, diagnosis results, associated recommendations, and replacement part or appliance information are stored in the memory 401 (FIG. 5) of the server 100.
[0075] FIG. 5 is a schematic diagram of a hardware structure of the server 100 according to an example embodiment. The server 100 shown in FIG. 5 includes a memory 401, a processor 402, a communications interface 403, and a bus 404. A communication connection is implemented between the memory 401, the processor 402, and the communications interface 403 by using the bus 404.
[0076] The processor 402 and the communications interface 403 are configured to perform, when the program stored in the memory 401 is executed by the processor 402, steps of method 300.
[0077] The memory 401 can be a read-only memory (Read Only Memory, ROM), a static storage device, a dynamic storage device, or a random access memory (Random Access Memory, RAM). The memory 401 may store a program. The memory 401 can be a non-transitory memory.
[0078] The processor 402 can be a general central processing unit (Central Processing Unit, CPU), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a graphics processing unit (graphics processing unit, GPU), or one or more integrated circuits.
[0079] In addition, the processor 402 may be an integrated circuit chip with a signal processing capability. In an implementation process, steps of method 300 can be performed by an integrated logical circuit in a form of hardware or by an instruction in a form of software in the processor 402. In addition, the processor 402 can be a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or a transistor logic device, or a discrete hardware assembly. The processor 402 can implement or execute the methods, steps, and logical block diagrams that are described in the embodiments of this application. The general purpose processor can be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed with reference to the embodiments of this application may be directly performed by a hardware decoding processor, or may be performed by using a combination of hardware in the decoding processor and a software module. The software module may be located in a mature storage medium in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 401. The processor 402 reads information from the memory 401, and completes, by using hardware in the processor 402, the steps of method 300.
[0080] The communications interface 403 implements communication between server 100 and another device or communications network by using a transceiver apparatus, for example, including but not limited to a transceiver. For example, the training data may be obtained by using the communications interface 403.
[0081] The bus 404 may include a path that transfers information between all the components of the server 100.
[0082] It should be noted that, although only the memory, the processor, and the communications interface are shown in the server 100 in FIG. 5, in a specific implementation process, a person skilled in the art should understand that the server 100 may further include other components that are necessary for implementing normal running. In addition, based on specific needs, a person skilled in the art should understand that the server 100 may further include hardware components that implement other additional functions. In addition, a person skilled in the art should understand that the server 100 may include only a component required for implementing the embodiments of the present invention, without a need to include all the components shown in FIG. 5.
[0083] It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, refer to a corresponding process in the foregoing method embodiments, and details are not described herein again.
[0084] In the several embodiments described in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiment is merely an example. For example, the unit division is merely logical function division and may be other division in actual implementation.
For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented by using some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
[0085] The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual requirements to achieve the objectives of the solutions of the embodiments.
[0086] In addition, functional units in the embodiments of this application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit.
[0087] When the functions are implemented in the form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or some of the technical solutions may be implemented in a form of a software product. The software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
[0088] Certain adaptations and modifications of the described embodiments can be made.
Therefore, the above discussed embodiments are considered to be illustrative and not restrictive.


Claims (20)

WHAT IS CLAIMED IS:
1. A method, comprising:
receiving, by a processor, a first set of data related to an appliance from an electronic device;
identifying, by the processor, a model of the appliance based on the first set of data;
receiving, by the processor, a second set of data from the electronic device;
identifying, using a machine learning model, an issue of the appliance by using the second set of data; and
transmitting, by the processor, a recommendation for fixing the issue to the electronic device.
2. The method as claimed in claim 1, further comprising:
if the issue is not fixed by the recommendation, and at least one aspect of the appliance needs to be checked, acquiring, by the processor, a third set of data;
identifying, using the machine learning model, a second issue of the appliance by using the third set of data; and
transmitting, by the processor, a second recommendation for fixing the second issue to the electronic device.
3. The method as claimed in claim 1 or 2, wherein the first set of data is a bar code scanned by the electronic device or a text character string transmitted from the electronic device.
4. The method as claimed in any one of claims 1 to 3, wherein the recommendation includes one or more instructions to fix the issue.
5. The method as claimed in claim 4, wherein the recommendation includes ordering one or more replacement parts from a webpage or a web link.

6. The method as claimed in claim 1, further comprising training the machine learning model with the second set of data.
7. The method as claimed in any one of claims 1 to 6, wherein the machine learning model includes a neural network or a convolutional neural network.
8. The method as claimed in claim 7, wherein the convolutional neural network comprises a convolutional layer for generating a convolved image data matrix from the second set of data.
9. The method as claimed in claim 8, wherein the convolutional neural network comprises a pooling layer for generating a pooled convolved image data matrix from the convolved image data matrix.
10. The method as claimed in claim 9, wherein the convolutional neural network comprises a fully-connected layer for predicting a label to describe the second set of data using the convolved image data matrix from the convolutional layer or the pooled convolved image data matrix from the pooling layer.
11. The method as claimed in any one of claims 1 to 10, wherein the second set of data comprises image data generated by a camera, a smart phone, a tablet, or a computer.
12. The method as claimed in claim 1, further comprising, when the issue has not been fixed and all aspects of the appliance have been checked, scheduling a visit by a technician to fix the issue.
13. The method as claimed in any one of claims 1 to 12, wherein the processor is configured to receive information about replacement parts of the appliance from a third party server.
14. The method as claimed in any one of claims 1 to 13, further comprising generating a preventive maintenance schedule for the appliance.
15. The method as claimed in claim 14, further comprising sending a maintenance reminder to the electronic device.
16. The method as claimed in any one of claims 1 to 15, further comprising sending a warranty expiry reminder to the electronic device.
17. The method as claimed in any one of claims 1 to 16, wherein the processor is configured to enable the electronic device to communicate with a trained technician via a live call, a video call, or an audio call.
18. The method as claimed in any one of claims 1 to 17, wherein the processor is configured to support online payment.
19. A non-transitory memory containing instructions and statements which, when executed by a processor, cause the processor to perform the method as claimed in any one of claims 1 to 18.
20. A system, comprising:
one or more electronic devices for sending image data of an appliance; and
a server for receiving the image data from the one or more electronic devices, wherein the server is configured to perform the method as claimed in any one of claims 1 to 18.

CA3107808A 2021-02-02 2021-02-02 Photo assisted self-diagnosis of appliances Pending CA3107808A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA3107808A CA3107808A1 (en) 2021-02-02 2021-02-02 Photo assisted self-diagnosis of appliances

Publications (1)

Publication Number Publication Date
CA3107808A1 true CA3107808A1 (en) 2022-08-02

Family

ID=82693781


