CN116128484B - Method and system for determining remaining maintenance time of automobile based on neural network

Method and system for determining remaining maintenance time of automobile based on neural network

Info

Publication number
CN116128484B
Authority
CN
China
Prior art keywords
automobile
neural network
network model
maintenance
proficiency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310218201.8A
Other languages
Chinese (zh)
Other versions
CN116128484A (en)
Inventor
崔鑫阳
徐焕军
刘金龙
马双
徐留琴
羊铁军
贺红卫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bangbang Automobile Sales Service Beijing Co ltd
Original Assignee
Bangbang Automobile Sales Service Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bangbang Automobile Sales Service Beijing Co ltd filed Critical Bangbang Automobile Sales Service Beijing Co ltd
Priority to CN202310218201.8A priority Critical patent/CN116128484B/en
Publication of CN116128484A publication Critical patent/CN116128484A/en
Application granted granted Critical
Publication of CN116128484B publication Critical patent/CN116128484B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Administration of product repair or maintenance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Human Resources & Organizations (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Operations Research (AREA)
  • Computational Linguistics (AREA)
  • Marketing (AREA)
  • Tourism & Hospitality (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Testing And Monitoring For Control Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a method and a system for determining the remaining maintenance time of an automobile based on a neural network, relating to the technical field of neural networks.

Description

Method and system for determining remaining maintenance time of automobile based on neural network
Technical Field
The invention relates to the technical field of neural networks, in particular to a method and a system for determining remaining maintenance time of an automobile based on a neural network.
Background
With the rapid development of the automobile industry, automobiles have become increasingly common in people's daily lives. At the same time, automobile structures have grown more complex, the degree of automation has risen, and the electronic control system of the whole vehicle has become more and more intricate, so that repair time has become ever harder to judge. After an owner leaves a vehicle at a repair shop, the owner cannot clearly know how long the repair will take, cannot predict when the vehicle will be ready, and therefore cannot plan vehicle use in advance, which disrupts the owner's schedule. In most existing schemes, repair shop staff estimate a remaining repair time from experience and inform the owner, but such estimates are often inaccurate.
Therefore, how to accurately determine the remaining maintenance time of a vehicle is an urgent problem to be solved.
Disclosure of Invention
The invention mainly solves the technical problem of how to accurately determine the remaining maintenance time of a vehicle.
According to a first aspect, in one embodiment, there is provided a method for determining the remaining repair time of an automobile based on a neural network, including: S1, obtaining an appearance image of the damaged automobile; S2, determining the damage degree of the automobile by using a convolutional neural network model based on the appearance image; S3, obtaining maintenance personnel information; S4, determining the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information; S5, acquiring an automobile maintenance monitoring video; and S6, determining the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video.
In some embodiments, the input of the convolutional neural network model includes the appearance image of the damaged automobile, and its output is the damage degree of the automobile; the input of the deep neural network model includes the maintenance personnel information, and its output is the proficiency of the maintenance personnel; the input of the LSTM neural network model includes the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video, and its output is the remaining maintenance time of the automobile.
In some embodiments, if the damage degree of the automobile exceeds a first threshold, a maintenance cost corresponding to the damage degree of the automobile is calculated, and if the maintenance cost is greater than a second threshold, maintenance of the automobile is abandoned.
In some embodiments, if the remaining repair time of the automobile is greater than a third threshold, a prompt message is sent to alert an automobile owner.
In some embodiments, the convolutional neural network model is obtained by a training process comprising: obtaining a plurality of training samples, wherein the training samples comprise sample input data and labels corresponding to the sample input data, the sample input data is an appearance image of a sample automobile after damage, and the labels are the damage degree of the sample automobile; and training an initial convolutional neural network model based on the plurality of training samples to obtain the convolutional neural network model.
In some embodiments, the method further comprises: calculating a plurality of similarities between a plurality of images in an automobile damage image library and the appearance image after the automobile damage, taking the maintenance time corresponding to the image with the highest similarity in the plurality of similarities as a historical reference maintenance time, and sending the historical reference maintenance time to an automobile owner.
According to a second aspect, there is provided in one embodiment a neural network-based system for determining the remaining repair time of an automobile, comprising: a first acquisition module, used for acquiring an appearance image of the damaged automobile; an automobile damage degree determining module, used for determining the damage degree of the automobile by using a convolutional neural network model based on the appearance image; a second acquisition module, used for acquiring maintenance personnel information; a proficiency determining module, used for determining the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information; a third acquisition module, used for acquiring an automobile maintenance monitoring video; and a maintenance time determining module, used for determining the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video.
In some embodiments, the input of the convolutional neural network model includes the appearance image of the damaged automobile, and its output is the damage degree of the automobile; the input of the deep neural network model includes the maintenance personnel information, and its output is the proficiency of the maintenance personnel; the input of the LSTM neural network model includes the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video, and its output is the remaining maintenance time of the automobile.
According to a third aspect, an embodiment provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the neural network-based method for determining the remaining repair time of an automobile as described in any one of the above.
According to a fourth aspect, there is provided in one embodiment an electronic device comprising: a memory; a processor; a computer program; wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method described above.
According to a fifth aspect, an embodiment provides a computer readable storage medium having stored thereon a program executable by a processor to implement a method as in any of the above aspects.
According to the method and system for determining the remaining maintenance time of an automobile based on a neural network, the damage degree of the automobile is determined by a convolutional neural network model from the appearance image of the damaged automobile, the proficiency of the maintenance personnel is determined by a deep neural network model from the maintenance personnel information, and the remaining maintenance time is determined by a long short-term memory (LSTM) neural network model from the damage degree, the proficiency and the automobile maintenance monitoring video, so that the remaining maintenance time of the automobile can be determined quickly and accurately.
Drawings
FIG. 1 is a schematic flow chart of a method for determining the remaining maintenance time of an automobile based on a neural network according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a system for determining the remaining maintenance time of an automobile based on a neural network according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an electronic device according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The invention will be described in further detail below with reference to the drawings by means of specific embodiments, in which like elements in different embodiments bear like reference numbers. In the following embodiments, numerous specific details are set forth in order to provide a better understanding of the present invention. However, one skilled in the art will readily recognize that some of the features may be omitted, or replaced by other elements, materials, or methods, in different situations. In some instances, certain operations related to the invention are not shown or described in the specification in order to avoid obscuring the core of the invention; a detailed description of such operations is unnecessary for persons skilled in the art, who can fully understand them from the remaining description and from general knowledge in the field.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
The numbering of components as such, e.g. "first", "second", etc., is used herein merely to distinguish the described objects and does not carry any sequential or technical meaning. The term "coupled" as used herein includes both direct and indirect coupling, unless otherwise indicated.
In an embodiment of the invention, a method for determining the remaining maintenance time of an automobile based on a neural network as shown in FIG. 1 is provided, comprising the following steps S1-S6:
and S1, obtaining an appearance image of the damaged automobile.
The appearance image of the damaged automobile is an image obtained by photographing the exterior of the automobile after it is damaged. In some embodiments, the appearance image may be a panoramic image of the automobile exterior or an image stitched together from multiple photographs. In some embodiments, the exterior of the automobile may be photographed with a panoramic camera to obtain the appearance image of the damaged automobile.
In some embodiments, the similarities between the images in an automobile damage image library and the appearance image of the damaged automobile may be calculated, the maintenance time corresponding to the most similar image may be taken as a historical reference maintenance time, and this historical reference maintenance time may be sent to the automobile owner as a reference. In some embodiments, the similarity between an image in the library and the appearance image may be calculated by cosine similarity, an average hash algorithm, a perceptual hash algorithm, or the like.
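As an illustration, the following sketch shows how such a historical-reference lookup might be implemented with the average hash algorithm mentioned above. It is a minimal sketch, not the patented implementation; the library structure, the 8x8 hash size and the function names are assumptions made for this example.

```python
# Minimal sketch of the historical-reference lookup, assuming a hypothetical
# image library stored as a list of (PIL image, repair_hours) pairs. Average
# hash (aHash) is one of the similarity measures named in the description.
import numpy as np
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> np.ndarray:
    """Downscale to size x size grayscale and threshold at the mean."""
    small = np.asarray(img.convert("L").resize((size, size)), dtype=np.float32)
    return (small > small.mean()).flatten()

def hash_similarity(h1: np.ndarray, h2: np.ndarray) -> float:
    """Similarity = 1 - normalized Hamming distance between the two hashes."""
    return 1.0 - float(np.mean(h1 != h2))

def historical_reference_time(damaged_img: Image.Image, library) -> float:
    """library: iterable of (image, repair_hours); returns hours of best match."""
    query = average_hash(damaged_img)
    best = max(library, key=lambda item: hash_similarity(query, average_hash(item[0])))
    return best[1]  # repair time of the most similar historical case
```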
S2, determining the damage degree of the automobile by using a convolutional neural network model based on the appearance image of the damaged automobile.
The damage degree of the automobile indicates how badly the automobile is damaged. In some embodiments, the damage degree may be a value between 0 and 1, with a higher value indicating more severe damage. In some embodiments, the damage degree may instead be graded, e.g. severe, moderate, or minor. It will be appreciated that the greater the damage degree, the more repair time is required.
The convolutional neural network model includes a convolutional neural network (CNN). The CNN may be a multi-layer neural network (e.g., comprising at least two layers). The at least two layers may include at least one of a convolutional layer (CONV), a rectified linear unit (ReLU) layer, a pooling layer (POOL), or a fully-connected layer (FC). The layers of the CNN may correspond to neurons arranged in three dimensions: width, height and depth. In some embodiments, the CNN may have an architecture of [input layer - convolutional layer - ReLU layer - pooling layer - fully-connected layer]. A convolutional layer computes the output of neurons connected to local regions in the input, taking the dot product between each neuron's weights and the small region of the input volume it is connected to. In some embodiments, the input of the convolutional neural network model includes the appearance image of the damaged automobile, and its output is the damage degree of the automobile. In some embodiments, the convolution kernels of the model may be 3x3 in size.
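For illustration, a minimal PyTorch sketch of a CNN with the [input - convolution - ReLU - pooling - fully-connected] architecture and 3x3 kernels described above might look as follows. The layer widths, the 224x224 input size and the sigmoid output head are assumptions of this example, not details taken from the patent.

```python
# Minimal sketch, assuming 224x224 RGB inputs and illustrative layer widths.
import torch
import torch.nn as nn

class DamageDegreeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3x3 kernels, as in the text
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 224 -> 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 112 -> 56
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 1),
            nn.Sigmoid(),  # maps the output into the 0-1 damage-degree range
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x)).squeeze(-1)

# Example: one 224x224 RGB appearance image -> scalar damage degree
degree = DamageDegreeCNN()(torch.rand(1, 3, 224, 224))
```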
The convolutional neural network model may be trained on a plurality of training samples. Each training sample comprises sample input data and a label corresponding to the sample input data: the sample input data is an appearance image of a damaged sample automobile, and the label is the damage degree of the sample automobile. The labels may be annotated manually by workers. In some embodiments, the convolutional neural network model is obtained by training an initial convolutional neural network model on the plurality of training samples.
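A training loop over such labeled samples might, under the same assumptions, look like the following sketch; the optimizer, learning rate, batch size and the stand-in random tensors are all illustrative.

```python
# Minimal sketch of the training process: MSE regression of the predicted
# damage degree against worker-labeled degrees. All data here is stand-in.
import torch
from torch.utils.data import DataLoader, TensorDataset

model = DamageDegreeCNN()  # defined in the previous sketch
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = torch.nn.MSELoss()

# Stand-in training set: 8 labeled samples (image tensor, damage-degree label)
dataset = TensorDataset(torch.rand(8, 3, 224, 224), torch.rand(8))
loader = DataLoader(dataset, batch_size=4, shuffle=True)

for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```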
In some embodiments, if the damage degree of the automobile exceeds a first threshold, a maintenance cost corresponding to the damage degree is calculated, and if the maintenance cost is greater than a second threshold, maintenance of the automobile is abandoned. In some embodiments, the maintenance cost corresponding to the damage degree may be obtained from a damage-degree-to-maintenance-cost lookup table. The first threshold and the second threshold may be set manually or by machine.
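The two-threshold decision could be sketched as below; the lookup-table values and both thresholds are invented for the example and would in practice be set manually or by machine, as stated above.

```python
# Minimal sketch of the two-threshold repair decision. The table entries and
# thresholds are illustrative assumptions, not values from the patent.
DAMAGE_TO_COST = {0.2: 500.0, 0.5: 3000.0, 0.8: 12000.0}  # degree -> repair cost

def should_repair(damage_degree: float,
                  first_threshold: float = 0.7,
                  second_threshold: float = 10000.0) -> bool:
    if damage_degree <= first_threshold:
        return True  # below the first threshold, repair without a cost check
    # Look up the cost for the nearest tabulated damage degree
    nearest = min(DAMAGE_TO_COST, key=lambda d: abs(d - damage_degree))
    return DAMAGE_TO_COST[nearest] <= second_threshold
```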
S3, obtaining maintenance personnel information.
The maintenance personnel information includes the serviceman's sex, age, years of service, in-shop repair score, repair times in historical repair orders, number of historical repair orders, repair shop equipment information, and the like.
The years of service of a serviceman is the period during which the serviceman has been engaged in maintenance work. The in-shop repair score is a score for the serviceman's repair skill within the repair shop: the higher the score, the higher the serviceman's repair skill, and vice versa. The repair times in the historical repair orders are the total repair times of the individual orders in the serviceman's history. The number of historical repair orders indicates how many repair orders the serviceman has handled; the larger the number, the more experienced the serviceman, and the shorter the repair time is likely to be. The repair shop equipment information indicates the model, price, service life, etc. of the repair shop's equipment; for example, the more advanced the equipment, the shorter the repair time is likely to be, and vice versa.
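Before such information can be fed to a neural network it must be encoded numerically. The following sketch shows one plausible encoding of the fields listed above into a feature vector; the field names and scaling constants are assumptions of the example, not part of the patent.

```python
# Minimal sketch of encoding serviceman information as a feature vector.
# Field names and normalization constants are illustrative assumptions.
import torch

def encode_serviceman(info: dict) -> torch.Tensor:
    return torch.tensor([
        1.0 if info["sex"] == "male" else 0.0,
        info["age"] / 100.0,              # normalized age
        info["service_years"] / 50.0,     # years engaged in maintenance work
        info["shop_score"] / 100.0,       # in-shop repair score
        info["avg_order_hours"] / 100.0,  # mean repair time per historical order
        info["order_count"] / 1000.0,     # number of historical repair orders
        info["equipment_grade"] / 10.0,   # repair shop equipment level
    ], dtype=torch.float32)
```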
S4, determining the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information.
The deep neural network model may include a deep neural network comprising a plurality of processing layers, each composed of a plurality of neurons, each neuron applying a matrix transformation to its input; the parameters of these matrices may be obtained through training. The deep neural network model may also be any existing neural network model capable of processing multiple features, such as an RNN, CNN or DNN, or a model customized as required. The input of the deep neural network model includes the maintenance personnel information, and its output is the proficiency of the maintenance personnel.
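A minimal sketch of such a proficiency model, assuming the 7-dimensional feature vector from the previous sketch, is a small multilayer perceptron; the hidden-layer sizes are illustrative, and the sigmoid keeps the output in the 0-1 proficiency range described below.

```python
# Minimal MLP sketch for proficiency; layer sizes are illustrative assumptions.
import torch.nn as nn

proficiency_model = nn.Sequential(
    nn.Linear(7, 32),  # 7 serviceman features in (see previous sketch)
    nn.ReLU(),
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
    nn.Sigmoid(),      # proficiency in [0, 1]
)
```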
The deep neural network model may be trained on a plurality of training samples. Each training sample comprises sample input data and a corresponding label: the sample input data is sample maintenance personnel information, and the label is the proficiency of the sample serviceman. The labels may be annotated manually; for example, a worker may judge the serviceman's proficiency from the serviceman information and record it as the label.
The proficiency of the serviceman indicates how skilled the serviceman is at repairing automobiles. The proficiency may be a number between 0 and 1, with a larger number indicating greater proficiency. It will be appreciated that the more proficient the serviceman, the less repair time is required.
S5, acquiring an automobile maintenance monitoring video.
The automobile maintenance monitoring video is a video obtained by filming the automobile repair process, for example with a surveillance camera in the repair shop. The video contains real-time information on the repair progress of the automobile: from it one can judge which step the repair has reached and roughly how much repair time remains. The video is a dynamic image recorded as an electrical signal and consists of a temporal sequence of still images, each of which is one frame of video data. In some embodiments, the duration of the video may be 10 seconds, 20 seconds, 30 seconds, 1 minute, etc.
In some embodiments, the format of the video data may include, but is not limited to: digital video disc (DVD), Flash Video (FLV), moving picture experts group (MPEG) formats, audio video interleave (AVI), video home system (VHS), RealMedia (RM), etc.
S6, determining the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video.
The remaining repair time of the automobile indicates how much time is required for the repair of the automobile to be completed. For example, the remaining repair time for an automobile is 1 day, indicating that the repair of the automobile can be completed in 1 day.
The long short-term neural network model includes a long short-term memory (LSTM) network, a type of recurrent neural network (RNN).
The LSTM model can process sequences of arbitrary length, capture sequence information, and produce outputs based on the relationships between earlier and later data in the sequence. Using the LSTM model to process the automobile maintenance monitoring video therefore yields features that take the relationships between all time points of the video into account, making the output more accurate and comprehensive. The input of the LSTM model includes the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video, and its output is the remaining maintenance time of the automobile.
The LSTM model can likewise be obtained through training. Each training sample comprises sample input data and a label corresponding to the sample input data. The training procedure is similar to that of the convolutional neural network model and is not repeated here.
After training, the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video are input into the trained LSTM model, which outputs the remaining maintenance time of the automobile. For example, given a damage degree of 0.5, a proficiency of 0.8 and a 30-second real-time maintenance monitoring video, the model might output a remaining maintenance time of 10 hours.
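The following sketch illustrates one plausible fusion of the three inputs, mirroring the example above: per-frame video features pass through an LSTM, and the damage degree and proficiency are concatenated to the final hidden state before a linear head predicts the remaining hours. The feature sizes, the frame count and the assumption that frames are already encoded as feature vectors are all illustrative.

```python
# Minimal sketch of the LSTM fusion; dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class RemainingTimeLSTM(nn.Module):
    def __init__(self, frame_feat: int = 128, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(frame_feat, hidden, batch_first=True)
        self.head = nn.Linear(hidden + 2, 1)  # +2 for damage degree, proficiency

    def forward(self, frames: torch.Tensor, damage: torch.Tensor,
                proficiency: torch.Tensor) -> torch.Tensor:
        _, (h_n, _) = self.lstm(frames)        # h_n: (num_layers, batch, hidden)
        fused = torch.cat([h_n[-1], damage.unsqueeze(1),
                           proficiency.unsqueeze(1)], dim=1)
        return self.head(fused).squeeze(-1)    # predicted remaining hours

# Example mirroring the text: damage 0.5, proficiency 0.8, a 30-frame clip
model = RemainingTimeLSTM()
frames = torch.rand(1, 30, 128)  # 30 per-frame feature vectors (assumed encoding)
hours = model(frames, torch.tensor([0.5]), torch.tensor([0.8]))
```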
In some embodiments, if the remaining maintenance time of the automobile is greater than a third threshold, a prompt message is sent to alert the automobile owner, for example by short message.
Based on the same inventive concept, FIG. 2 is a schematic diagram of a neural network-based system for determining the remaining repair time of an automobile according to an embodiment of the present invention, the system comprising:
a first obtaining module 21, configured to obtain an appearance image after the automobile is damaged;
an automobile damage degree determining module 22, configured to determine the damage degree of the automobile by using a convolutional neural network model based on the appearance image of the damaged automobile;
a second acquiring module 23, configured to acquire maintenance personnel information;
a proficiency determining module 24 for determining the proficiency of the serviceman using a deep neural network model based on the serviceman information;
a third obtaining module 25, configured to obtain an auto repair monitoring video;
a repair time determination module 26, configured to determine the remaining repair time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the repair personnel and the automobile maintenance monitoring video.
Based on the same inventive concept, an embodiment of the present invention provides an electronic device, as shown in FIG. 3, including:
a processor 31; and a memory 32 for storing instructions executable by the processor 31; wherein the processor 31 is configured to execute the instructions to implement the neural network-based method for determining the remaining repair time of an automobile as provided above, the method comprising:
S1, obtaining an appearance image of the damaged automobile; S2, determining the damage degree of the automobile by using a convolutional neural network model based on the appearance image; S3, obtaining maintenance personnel information; S4, determining the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information; S5, acquiring an automobile maintenance monitoring video; and S6, determining the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree, the proficiency and the automobile maintenance monitoring video.
Based on the same inventive concept, an embodiment provides a non-transitory computer-readable storage medium whose instructions, when executed by the processor 31 of an electronic device, enable the electronic device to perform the neural network-based method for determining the remaining repair time of an automobile as provided above, the method comprising: S1, obtaining an appearance image of the damaged automobile; S2, determining the damage degree of the automobile by using a convolutional neural network model based on the appearance image; S3, obtaining maintenance personnel information; S4, determining the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information; S5, acquiring an automobile maintenance monitoring video; and S6, determining the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree, the proficiency and the automobile maintenance monitoring video.
Based on the same inventive concept, the present embodiment also provides a computer program product, which when executed by a processor, implements the neural network-based vehicle remaining repair time determination method as provided above.
The neural network-based method for determining the remaining maintenance time of an automobile provided by the embodiments of the invention can be applied to electronic devices such as terminal devices (e.g. mobile phones), tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable devices (e.g. smart watches, smart glasses or smart helmets), augmented reality (AR) / virtual reality (VR) devices, smart home devices, vehicle-mounted computers, and the like; the embodiments of the invention place no restriction on the device type.
Taking the mobile phone 100 as an example of the electronic device, FIG. 4 shows a schematic structural diagram of the mobile phone 100.
As shown in FIG. 4, the mobile phone 100 may include a processing module 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
The sensor module 180 may include a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, and the like.
It should be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processing module 110 may include one or more processing units, such as an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the neural center and command center of the mobile phone 100, directing the components of the mobile phone 100 to work in coordination according to instructions. The controller generates operation control signals according to instruction operation codes and timing signals, and controls instruction fetching and instruction execution.
The application processor may have an operating system of the mobile phone 100 installed thereon for managing hardware and software resources of the mobile phone 100. Such as managing and configuring memory, prioritizing system resources, managing file systems, managing drivers, etc. The operating system may also be used to provide an operator interface for a user to interact with the system. Various types of software, such as drivers, applications (apps), etc., may be installed in the operating system. For example, the operating system of the mobile phone 100 may be an Android system, a Linux system, or the like.
A memory may also be provided in the processing module 110 for storing instructions and data. In some embodiments, the memory in the processing module 110 is a cache. The memory may hold instructions or data that the processing module 110 has just used or uses cyclically. If the processing module 110 needs to use the instructions or data again, it can call them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processing module 110, thereby improving system efficiency.
In the embodiment of the present invention, the processing module 110 may determine the remaining maintenance time of the automobile by using the long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the maintenance personnel and the automobile maintenance monitoring video.
In some embodiments, the processing module 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, the charge management module 140 and the processing module 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processing module 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 141 may also be disposed in the processing module 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide solutions for wireless communication applied to the mobile phone 100, including 2G/3G/4G/5G, etc. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit the result to the modem processor for demodulation. The mobile communication module 150 can also amplify signals modulated by the modem processor and convert them into electromagnetic waves radiated through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processing module 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processing module 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processing module 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processing module 110. The wireless communication module 160 may also receive a signal to be transmitted from the processing module 110, frequency-modulate and amplify it, and convert it into electromagnetic waves radiated through the antenna 2.
In some embodiments, the antenna 1 of the mobile phone 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
The mobile phone 100 implements display functions through a GPU, a display 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processing module 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In the embodiment of the invention, the display screen 194 can be used for displaying the appearance image after the automobile is damaged.
The mobile phone 100 may implement photographing functions through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. In some embodiments, the mobile phone 100 may implement video communication functions through the ISP, the camera 193, the video codec, the GPU and the application processor.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the cell phone 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In the embodiment of the invention, the camera 193 can shoot the damaged automobile to obtain an appearance image of the damaged automobile.
The digital signal processor is used for processing digital signals, and can process other digital signals in addition to digital image signals. For example, when the mobile phone 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, etc.
Video codecs are used to compress or decompress digital video. The mobile phone 100 may support one or more video codecs, so that it can play or record video in multiple coding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3 and MPEG-4.
The NPU is a neural-network (NN) computing processor that processes input information rapidly by drawing on the structure of biological neural networks, for example the transfer patterns between human brain neurons, and can also learn continuously. Applications such as intelligent cognition of the mobile phone 100, for example image recognition, face recognition, speech recognition and text understanding, can be realized through the NPU.
In the embodiment of the invention, the NPU may run the deep neural network model to determine the proficiency of the maintenance personnel.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 100. The external memory card communicates with the processing module 110 via the external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processing module 110 executes various functional applications and data processing of the mobile phone 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and the application programs required for at least one function (such as a sound playing function or an image playing function), etc. The data storage area may store data created during use of the mobile phone 100 (such as audio data or a phonebook), etc. In addition, the internal memory 121 may include high-speed random access memory, and may further include nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, or universal flash storage (UFS).
The handset 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processing module 110, or a portion of the functional modules of the audio module 170 may be disposed in the processing module 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The handset 100 may listen to music, or to hands-free calls, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When the handset 100 is answering a telephone call or voice message, the voice can be received by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal into it. The mobile phone 100 may be provided with at least one microphone 170C. In other embodiments, the mobile phone 100 may be provided with two microphones 170C and may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the mobile phone 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify the source of sound, implement directional recording, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The handset 100 may receive key inputs, generating key signal inputs related to user settings and function control of the handset 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate charging status and battery level changes, as well as messages, missed calls, notifications, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into or removed from the SIM card interface 195 to connect to or disconnect from the mobile phone 100. The mobile phone 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 195 may accept multiple cards simultaneously; the cards may be of the same or different types. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The mobile phone 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the mobile phone 100 employs an eSIM, i.e. an embedded SIM card, which can be embedded in the mobile phone 100 and cannot be separated from it.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated herein, various modifications, improvements, and adaptations of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested in this specification and are intended to remain within the spirit and scope of the exemplary embodiments of the invention.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, the order in which the elements and sequences are processed, the use of numerical letters, or other designations in the description are not intended to limit the order in which the processes and methods of the description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to streamline the disclosure and aid understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single disclosed embodiment.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (8)

1. A method for determining the remaining maintenance time of an automobile based on a neural network, characterized by comprising the following steps:
S1, obtaining an appearance image of the automobile after damage;
S2, determining the damage degree of the automobile by using a convolutional neural network model based on the appearance image of the automobile after damage, wherein the input of the convolutional neural network model comprises the appearance image of the automobile after damage, and the output of the convolutional neural network model is the damage degree of the automobile;
S3, obtaining maintenance personnel information;
S4, determining the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information, wherein the input of the deep neural network model comprises the maintenance personnel information, and the output of the deep neural network model is the proficiency of the maintenance personnel;
S5, obtaining an automobile maintenance monitoring video;
S6, determining the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the maintenance personnel, and the automobile maintenance monitoring video, wherein the input of the LSTM neural network model comprises the damage degree of the automobile, the proficiency of the maintenance personnel, and the automobile maintenance monitoring video, and the output of the LSTM neural network model is the remaining maintenance time of the automobile.
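For illustration only (not part of the claims), the following is a minimal PyTorch sketch of the three-model pipeline in steps S1 to S6. All architectures, layer sizes, feature dimensions, and names are assumptions introduced here for readability; the patent does not specify them.

```python
# Minimal sketch, assuming PyTorch; every architecture choice is illustrative.
import torch
import torch.nn as nn

class DamageCNN(nn.Module):
    """S2: appearance image after damage -> damage degree (scalar)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, img):                       # img: (B, 3, H, W)
        return self.head(self.features(img))      # (B, 1)

class ProficiencyDNN(nn.Module):
    """S4: maintenance-personnel feature vector -> proficiency (scalar)."""
    def __init__(self, n_features=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                 nn.Linear(32, 1))

    def forward(self, x):                          # x: (B, n_features)
        return self.net(x)

class RemainingTimeLSTM(nn.Module):
    """S6: per-frame video features + damage degree + proficiency
    -> remaining maintenance time (scalar)."""
    def __init__(self, frame_dim=128, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(frame_dim + 2, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, frames, damage, proficiency):
        # frames: (B, T, frame_dim); damage, proficiency: (B, 1)
        B, T, _ = frames.shape
        extras = torch.cat([damage, proficiency], dim=1)      # (B, 2)
        extras = extras.unsqueeze(1).expand(B, T, 2)          # (B, T, 2)
        out, _ = self.lstm(torch.cat([frames, extras], dim=2))
        return self.head(out[:, -1])                          # (B, 1)

# Wiring the steps together with random stand-in data:
cnn, dnn, lstm = DamageCNN(), ProficiencyDNN(), RemainingTimeLSTM()
damage = cnn(torch.randn(1, 3, 224, 224))                 # S1-S2
prof = dnn(torch.randn(1, 8))                             # S3-S4
remaining = lstm(torch.randn(1, 30, 128), damage, prof)   # S5-S6
```

Note that the video enters the LSTM as a sequence of per-frame feature vectors; how those features are extracted from the raw surveillance footage is left open by the claims.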
2. The neural network-based method for determining the remaining maintenance time of an automobile of claim 1, further comprising: if the damage degree of the automobile exceeds a first threshold, calculating a maintenance cost corresponding to the damage degree of the automobile, and if the maintenance cost is greater than a second threshold, abandoning maintenance of the automobile.
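A hedged Python sketch of the decision rule in claim 2 follows; the two thresholds and the damage-to-cost mapping are hypothetical placeholders, since the claim leaves them unspecified.

```python
# Sketch of claim 2's rule; thresholds and cost mapping are assumptions.
FIRST_THRESHOLD = 0.7        # hypothetical damage-degree threshold
SECOND_THRESHOLD = 50_000.0  # hypothetical maintenance-cost threshold

def estimate_cost(damage_degree: float) -> float:
    """Hypothetical mapping from damage degree to maintenance cost."""
    return 80_000.0 * damage_degree

def should_abandon_repair(damage_degree: float) -> bool:
    """Abandon repair only when both thresholds in claim 2 are exceeded."""
    if damage_degree > FIRST_THRESHOLD:
        return estimate_cost(damage_degree) > SECOND_THRESHOLD
    return False
```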
3. The neural network-based method for determining the remaining maintenance time of an automobile of claim 1, further comprising: if the remaining maintenance time of the automobile is greater than a third threshold, sending a prompt message to remind the automobile owner.
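Likewise, a minimal sketch of the reminder rule in claim 3, assuming a hypothetical third threshold and an arbitrary delivery callable:

```python
# Sketch of claim 3's reminder rule; the threshold value is an assumption.
THIRD_THRESHOLD_HOURS = 48.0

def maybe_notify_owner(remaining_hours: float, send_prompt) -> bool:
    """send_prompt: any callable that delivers the message (SMS, app push, ...)."""
    if remaining_hours > THIRD_THRESHOLD_HOURS:
        send_prompt(f"Estimated remaining repair time: {remaining_hours:.1f} h")
        return True
    return False

# Example: maybe_notify_owner(52.0, print) prints the reminder and returns True.
```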
4. The neural network-based method for determining the remaining maintenance time of an automobile of claim 1, wherein the convolutional neural network model is obtained by a training process comprising:
obtaining a plurality of training samples, wherein each training sample comprises sample input data and a label corresponding to the sample input data, the sample input data being an appearance image of a sample automobile after damage and the label being the damage degree of the sample automobile;
training an initial convolutional neural network model based on the plurality of training samples to obtain the convolutional neural network model.
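A minimal, self-contained training sketch for the process in claim 4, assuming PyTorch, a mean-squared-error regression loss, and random stand-in data; none of these choices are specified by the claim.

```python
# Training sketch for claim 4; model, loss, optimizer, and data are assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

images = torch.randn(64, 3, 224, 224)  # stand-in sample inputs (damaged cars)
labels = torch.rand(64, 1)             # stand-in damage-degree labels
loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)

model = nn.Sequential(                 # the "initial convolutional neural network model"
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                 # damage degree treated as a regression target

for epoch in range(5):                 # iterate to obtain the trained model
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```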
5. The neural network-based method for determining the remaining maintenance time of an automobile of claim 1, further comprising: calculating a plurality of similarities between a plurality of images in an automobile damage image library and the appearance image of the automobile after damage, taking the maintenance time corresponding to the image with the highest similarity among the plurality of similarities as a historical reference maintenance time, and sending the historical reference maintenance time to the automobile owner.
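A hedged sketch of the lookup in claim 5; cosine similarity over raw pixels stands in for whatever similarity measure an implementation would actually use, and the library format is assumed.

```python
# Sketch of claim 5; the similarity measure and library format are assumptions.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def historical_reference_time(query_image: np.ndarray, library) -> float:
    """library: list of (image, repair_time_hours) pairs from past repairs."""
    sims = [cosine_similarity(query_image, img) for img, _ in library]
    best = int(np.argmax(sims))        # image with the highest similarity
    return library[best][1]            # its repair time = historical reference
```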
6. A neural network-based system for determining the remaining maintenance time of an automobile, comprising:
a first acquisition module configured to acquire an appearance image of the automobile after damage;
a damage degree determination module configured to determine the damage degree of the automobile by using a convolutional neural network model based on the appearance image of the automobile after damage, wherein the input of the convolutional neural network model comprises the appearance image of the automobile after damage, and the output of the convolutional neural network model is the damage degree of the automobile;
a second acquisition module configured to acquire maintenance personnel information;
a proficiency determination module configured to determine the proficiency of the maintenance personnel by using a deep neural network model based on the maintenance personnel information, wherein the input of the deep neural network model comprises the maintenance personnel information, and the output of the deep neural network model is the proficiency of the maintenance personnel;
a third acquisition module configured to acquire an automobile maintenance monitoring video;
a maintenance time determination module configured to determine the remaining maintenance time of the automobile by using a long short-term memory (LSTM) neural network model based on the damage degree of the automobile, the proficiency of the maintenance personnel, and the automobile maintenance monitoring video, wherein the input of the LSTM neural network model comprises the damage degree of the automobile, the proficiency of the maintenance personnel, and the automobile maintenance monitoring video, and the output of the LSTM neural network model is the remaining maintenance time of the automobile.
7. An electronic device, comprising: a memory; a processor; and a computer program, wherein the computer program is stored in the memory and is configured to be executed by the processor to implement the steps of the neural network-based method for determining the remaining maintenance time of an automobile according to any one of claims 1 to 5.
8. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the steps of the neural network-based method for determining the remaining maintenance time of an automobile according to any one of claims 1 to 5.
CN202310218201.8A 2023-03-08 2023-03-08 Method and system for determining remaining maintenance time of automobile based on neural network Active CN116128484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310218201.8A CN116128484B (en) 2023-03-08 2023-03-08 Method and system for determining remaining maintenance time of automobile based on neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310218201.8A CN116128484B (en) 2023-03-08 2023-03-08 Method and system for determining remaining maintenance time of automobile based on neural network

Publications (2)

Publication Number Publication Date
CN116128484A (en) 2023-05-16
CN116128484B (en) 2023-08-04

Family

ID=86301134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310218201.8A Active CN116128484B (en) 2023-03-08 2023-03-08 Method and system for determining remaining maintenance time of automobile based on neural network

Country Status (1)

Country Link
CN (1) CN116128484B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117557221A (en) * 2023-11-17 2024-02-13 德联易控科技(北京)有限公司 Method, device, equipment and readable medium for generating vehicle damage report

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108038613A (en) * 2017-12-08 2018-05-15 珠海华发汽车销售有限公司 A kind of automobile maintenance management system and method
CN113326954A (en) * 2021-06-25 2021-08-31 中国平安财产保险股份有限公司 Vehicle maintenance task scheduling method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN116128484A (en) 2023-05-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant