CN112446513A - Vehicle part detection method and device and computer readable storage medium - Google Patents
Vehicle part detection method and device and computer readable storage medium
- Publication number
- CN112446513A (application CN202011473095.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- vehicle
- current state
- maintenance
- component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/20—Administration of product repair or maintenance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
Abstract
The application discloses a vehicle part detection method and device and a computer-readable storage medium, which are used to solve the problem that existing vehicle part detection is costly. The solution provided by the application includes the following steps: scanning a predetermined position area of the vehicle to obtain a scanned image corresponding to the predetermined position area; identifying the part in the predetermined position area from the scanned image; detecting the current state of the part based on image recognition; and displaying detection feedback information for the part through augmented reality according to the current state of the part. The solution of the embodiments of the application can detect vehicle parts in a simple and effective manner, reducing the user's vehicle maintenance cost.
Description
Technical Field
The present disclosure relates to the field of vehicle technologies, and in particular, to a method and an apparatus for detecting a vehicle component, and a computer-readable storage medium.
Background
People often need to maintain and repair their vehicles in daily use. At present, users generally either consult the vehicle manual themselves and inspect and maintain vehicle parts according to personal experience, or drive the vehicle to a 4S store to be inspected by professionals; going to a 4S store usually involves waiting in line, so the detection cost is high. Whether maintenance relies on personal experience or on professional inspection at a 4S store, much of the user's time and energy is wasted, and the user's vehicle maintenance cost increases.
How to provide a simple and effective vehicle part detection method, thereby reducing the user's vehicle maintenance cost, is the technical problem to be solved by the application.
Disclosure of Invention
The embodiment of the application aims to provide a vehicle part detection method and device and a computer readable storage medium, which are used for solving the problem that the existing vehicle part detection cost is high.
In order to solve the above technical problem, the present specification is implemented as follows:
in a first aspect, a vehicle component detection method is provided, including: scanning a preset position area of a vehicle to obtain a scanning image corresponding to the preset position area; identifying parts in the preset position area from the scanning image; detecting a current state of the part based on image recognition; and displaying the detection feedback information of the part through augmented reality according to the current state of the part.
Optionally, identifying the component in the predetermined position region from the scanned image includes: comparing the scanned image with a first database, wherein the first database comprises first images of parts in various shapes; when there is a first image matching the current shape of the part in the scanned image, identifying the part within the predetermined location area as a matching part in the first image.
Optionally, detecting the current state of the component based on image recognition includes: comparing the scanned image with a second database, wherein the second database comprises a second image of the original factory shape of the part in the scanned image; and determining the current state of the part based on the change between the current shape of the part in the scanned image and the original factory shape of the part in the second image.
Optionally, determining the current state of the part based on the change between the current shape of the part in the scanned image and the original factory shape of the part in the second image includes: and determining the current state of the part based on the deformation degree of the current shape of the part in the scanned image and the original factory shape of the part in the second image.
Optionally, detecting the current state of the component based on image recognition includes: comparing the scanned image with a third database, wherein the part in the scanned image is an alarm lamp, and the third database comprises third images of different display states of the alarm lamp of the type corresponding to the part; when there is a third image that matches the current display state of the warning lamp in the scanned image, the display state of the warning lamp in the third image is determined as the current state of the component.
Optionally, the method further includes: and determining the current state of the part corresponding to the alarm prompt by the alarm lamp based on the display state of the alarm lamp in the third image.
Optionally, displaying the detection feedback information of the part through augmented reality according to the current state of the part includes at least one of the following:
when it is detected that the current state of the part is good, displaying at least one of a maintenance manual and a usage instruction corresponding to the part through augmented reality;
when it is detected that the current state of the part requires maintenance, displaying through augmented reality at least one of a maintenance suggestion for the part and a dynamic demonstration of the detailed maintenance steps for the part;
and when it is detected that the current state of the part requires repair, displaying through augmented reality at least one of a repair suggestion for the part and a dynamic demonstration of the detailed repair steps for the part.
Optionally, before displaying the detection feedback information of the component through augmented reality according to the current state of the component, the method further includes: acquiring vehicle type information of the vehicle; displaying a model of the same type of vehicle corresponding to the vehicle and highlighting the parts through augmented reality based on the vehicle type information of the vehicle; and displaying detection feedback information of the part at the periphery of the highlighted part.
In a second aspect, there is provided a vehicle part detection apparatus comprising: a memory and a processor electrically connected to the memory, the memory storing a computer program executable on the processor, the computer program, when executed by the processor, implementing the steps of the method according to the first aspect.
In a third aspect, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to the first aspect.
In the embodiments of the application, a predetermined position area of the vehicle is scanned to obtain a scanned image corresponding to the predetermined position area; the part in the predetermined position area is identified from the scanned image; the current state of the part is detected based on image recognition; and detection feedback information for the part is displayed through augmented reality according to the current state of the part. The detection of vehicle parts thus has the advantages of being simple, direct, and operable anytime and anywhere, reducing the time and money a user spends on vehicle detection and improving the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic flow chart of a vehicle component detection method according to an embodiment of the present application.
FIG. 2 is a schematic flow chart of a vehicle component detection method according to an embodiment of the present application.
FIG. 3 is a schematic flow chart of a vehicle component detection method according to an embodiment of the present application.
FIG. 4 is a schematic flow chart of a vehicle component detection method according to an embodiment of the present application.
Fig. 5 is an exemplary flowchart of a vehicle component detecting method according to an embodiment of the present application.
Fig. 6 is a block diagram showing a configuration of a vehicle component detection apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. The reference numbers in the present application are only used for distinguishing the steps in the scheme and are not used for limiting the execution sequence of the steps, and the specific execution sequence is described in the specification.
In order to solve the problems in the prior art, an embodiment of the present application provides a method for detecting a vehicle component, and fig. 1 is one of flow diagrams of the method for detecting a vehicle component according to the embodiment of the present application.
As shown in fig. 1, the method comprises the following steps:
Step 102: a predetermined position area of the vehicle is scanned to obtain a scanned image corresponding to the predetermined position area.
The vehicle part detection method is executed on a terminal device such as a mobile phone or a tablet computer. By opening the camera of the terminal device, any area of the inspected vehicle that the user wishes to detect can be scanned. The predetermined position area of the vehicle is typically an area containing vulnerable vehicle parts such as tires, wipers and brakes, or an area containing vehicle warning lamps, such as warning lamps indicating that a seatbelt is not fastened, that the engine oil level is insufficient, or that the generator is malfunctioning.
After the predetermined position area is scanned, a scanned image corresponding to the area is obtained.
Step 104: identifying the part in the predetermined position area from the scanned image.
Based on the solution provided by the foregoing embodiment, optionally, in the step 104, identifying the component in the predetermined position area from the scanned image includes the steps shown in fig. 2, and fig. 2 is one of the flow diagrams of the vehicle component detection method according to the embodiment of the present application.
As shown in fig. 2, includes:
Step 202: the scanned image is compared with a first database, wherein the first database includes first images of parts in various shapes.
The first database pre-stores images of vehicle parts of various different shapes, such as tires, wipers, brake pads, warning lamps, and front and rear mirrors, and records the name of the part contained in each first image, for example through a mapping table from the ID of each image to the name of the part it contains. Further, the model number of the part may also be recorded in the first database.
And comparing the scanned image with the first database, namely comparing the shape of the part in the first image with the shape of the part in the scanned image, thereby finding out the first image matched with the shape of the part in the scanned image from the first database. Image comparison and recognition can be performed through a pre-trained image recognition model.
Step 204: when a first image matching the current shape of the part in the scanned image exists, identifying the part in the predetermined position area as the matching part in the first image.
If a first image matching the shape of the part in the scanned image exists in the first database, the corresponding part name is obtained according to the ID of that first image, so that the part in the scanned area of the vehicle can be identified by name. For example, if the comparison determines that a matching first image exists in the first database and the part it contains is a tire, the part in the scanned image is identified as a tire.
If no first image in the first database matches the shape of the part in the scanned image, the area may be rescanned.
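The identification step described above amounts to a nearest-match lookup against the first database, with a rescan fallback when no stored image matches well enough. A minimal sketch, assuming a hypothetical similarity score produced by a pre-trained recognition model; the database entries, image IDs, and threshold here are illustrative assumptions, not part of the patent:

```python
# Hypothetical first database: image ID -> part record, mirroring the
# ID-to-name mapping table described in the text. Entries are made up.
FIRST_DB = {
    "img_001": {"name": "tire", "model": "205/55R16"},
    "img_002": {"name": "wiper", "model": "W-24"},
}

def identify_part(match_scores, threshold=0.8):
    """match_scores: image ID -> similarity from a recognition model.

    Returns the matched part name, or None to signal a rescan.
    """
    best_id = max(match_scores, key=match_scores.get)
    if match_scores[best_id] < threshold:
        return None  # no first image matches: prompt a rescan
    return FIRST_DB[best_id]["name"]
```

A score dict such as `{"img_001": 0.93, "img_002": 0.41}` would then identify the part as a tire, while uniformly low scores return `None` and trigger the rescan path.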
Step 106: detecting the current state of the part based on image recognition.
Based on the solutions provided by the foregoing embodiments, optionally, in an embodiment, the step 106 detects the current state of the component based on image recognition, including the steps shown in fig. 3, where fig. 3 is one of the flow diagrams of the vehicle component detection method according to the embodiments of the present application.
As shown in fig. 3, includes:
Step 302: the scanned image is compared with a second database, wherein the second database includes a second image of the original factory shape of the part in the scanned image.
The second database pre-stores the original factory shape of the part identified from the scanned image, that is, the shape of the part in a state where no wear or damage has occurred. To reduce computational cost, the second database may include only second images of the identified part taken from various different angles. For example, if the part in the scanned image is identified as a tire in step 204, the second images in the second database are all images of the original factory shape of the tire taken from various different angles.
In one embodiment, second images of parts of the same model may further be obtained from the second database based on the model number of the identified part, so that comparison images matching the part in the scanned image more accurately can be obtained.
Step 304: determining the current state of the part based on the change between the current shape of the part in the scanned image and its original factory shape in the second image.
The current state of the part includes at least one of good, in need of maintenance, and in need of repair.
For example, if the identified part is a wiper, aging deformation of the wiper's rubber strip can be determined by comparing the current shape of the wiper with its original factory shape in the second image. The original factory shape is straight and fits closely against the windshield; if the comparison determines that the current shape of the wiper is bent and the gap between the wiper and the windshield is large, the wiper has deformed.
For example, if the identified part is a tire, comparing the current shape of the tire with its original factory shape in the second image can determine that the tread pattern of the scanned tire has been reduced and the tire is worn.
In step 304, optionally, the current state of the component may be determined based on the magnitude of the deformation between the current shape of the component in the scanned image and the original factory shape of the component in the second image.
For example, the presence of wear in a tire may be determined from a reduction in its tread pattern, and further, the degree of wear may be determined by comparing the original factory depth of the pattern with its current depth: if the original factory pattern depth of the tire is 1 cm and the current pattern depth is 0.5 cm, the tire is 50% worn. In this case, the current state of the part is determined to be in need of maintenance.
For another example, if the original factory thickness of a brake pad is 15-20 cm and its current thickness is 14 cm, i.e., the brake pad has worn by about 1 cm, the current state of the brake pad is determined to be good; if the comparison determines that the brake pad has worn by 3-4 cm and needs to be replaced, the current state is determined to be in need of maintenance.
For example, if the current appearance of the windshield of the vehicle is determined to contain foreign matter, dirt, or the like and the vehicle needs to be washed, the current state is determined to be in need of maintenance.
Different parts correspond to different detection strategies.
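The tread and brake-pad examples above reduce to comparing an original factory dimension with the currently measured one. A toy sketch of that deformation-degree rule follows; the 50% cutoff is only what the single tire example in the text implies, not a stated rule, and the function names are illustrative:

```python
def wear_ratio(original, current):
    """Fraction of material lost relative to the original factory dimension."""
    return (original - current) / original

def tire_state(original_depth_cm, current_depth_cm):
    # Per the example in the text: 1 cm of original tread depth worn down
    # to 0.5 cm is 50% wear, and the tire then needs maintenance.
    if wear_ratio(original_depth_cm, current_depth_cm) >= 0.5:
        return "needs maintenance"
    return "good"
```

A real implementation would hold per-part thresholds (e.g. different ones for brake pads), which is one way to realize the "different detection strategies for different parts" noted above.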
Based on the solutions provided by the foregoing embodiments, optionally, in an embodiment, the step 106 detects the current state of the component based on image recognition, including the steps shown in fig. 4, where fig. 4 is one of the flow diagrams of the vehicle component detection method according to the embodiments of the present application.
As shown in fig. 4, the method comprises the following steps:
Step 402: the scanned image is compared with a third database, wherein the part in the scanned image is a warning lamp and the third database includes third images of the different display states of the corresponding type of warning lamp.
As described above, the parts identified in the scanned image may further include warning lamps, and different warning lamps indicate different parts, such as a warning lamp indicating that a seatbelt is not fastened, that the engine oil level is insufficient, or that the generator is malfunctioning.
In addition, a warning lamp of the same shape can serve different purposes depending on its current display state. For example, a warning lamp may be steadily lit, flashing, or displayed in different colors, each indicating a different part fault.
Step 404: when a third image matching the current display state of the warning lamp in the scanned image exists, determining the display state of the warning lamp in the third image as the current state of the part.
Optionally, the method further includes: and determining the current state of the part corresponding to the alarm prompt by the alarm lamp based on the display state of the alarm lamp in the third image.
For example, if a warning lamp for indicating a generator failure is present in the third database and the display state of the warning lamp is identified as blinking, it indicates that the generator of the vehicle is currently in failure, and it is determined that the current state of the generator is in need of maintenance.
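The third-database lookup described above can be thought of as keying on the lamp type together with its display state. A small sketch, where the lamp types, display states, and result strings are all illustrative assumptions rather than the patent's data:

```python
# Hypothetical third database: (lamp type, display state) -> the part the
# lamp warns about and its inferred state. Entries are made up.
THIRD_DB = {
    ("generator", "blinking"): ("generator", "needs maintenance"),
    ("seatbelt", "steady"): ("seatbelt", "not fastened"),
    ("engine_oil", "steady"): ("engine oil", "level low"),
}

def lamp_lookup(lamp_type, display_state):
    """Return (part, state) indicated by the lamp, or None if unmatched."""
    return THIRD_DB.get((lamp_type, display_state))
```

With this, a blinking generator lamp maps to the generator needing maintenance, matching the example above; an unmatched (type, state) pair returns `None`.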
Step 108: displaying the detection feedback information of the part through augmented reality according to the current state of the part.
Based on the solutions provided by the foregoing embodiments, optionally, in step 108, displaying the detection feedback information of the part through augmented reality according to the current state of the part includes at least one of the following:
when it is detected that the current state of the part is good, displaying at least one of a usage instruction and a maintenance manual corresponding to the part through augmented reality;
when it is detected that the current state of the part requires maintenance, displaying through augmented reality at least one of a maintenance suggestion for the part and a dynamic demonstration of the detailed maintenance steps for the part;
and when it is detected that the current state of the part requires repair, displaying through augmented reality at least one of a repair suggestion for the part and a dynamic demonstration of the detailed repair steps for the part.
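The branches listed above map the detected state directly to AR display content. A sketch of that mapping, where the state labels and content strings are placeholders standing in for the pre-recorded material the text describes:

```python
def ar_feedback(state):
    """Map a detected part state to the AR content to display."""
    if state == "good":
        return ["usage instructions", "maintenance manual"]
    if state == "needs maintenance":
        return ["maintenance suggestion", "dynamic maintenance steps"]
    if state == "needs repair":
        return ["repair suggestion", "dynamic repair steps"]
    raise ValueError(f"unknown state: {state!r}")
```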
Before displaying, by augmented reality, detection feedback information of the component according to the current state of the component in step 108, the method further includes: acquiring vehicle type information of the vehicle; displaying a model of the same type of vehicle corresponding to the vehicle and highlighting the parts through augmented reality based on the vehicle type information of the vehicle; and displaying detection feedback information of the part at the periphery of the highlighted part.
Relevant information about the scanned vehicle, such as its vehicle type, is stored in advance. After the scanned vehicle part has been detected in step 106, the terminal device invokes an Augmented Reality (AR) model and displays it, for example so as to project a model of the same type as the current vehicle onto a blank open space, and highlights the identified part.
The part usage instructions, maintenance manual, maintenance suggestions and detailed maintenance steps, and repair suggestions and detailed repair steps can each be short pre-recorded videos, which are invoked and displayed through the AR model, thereby presenting the detection feedback information of the part to the user in AR form.
In this way, the user projects an AR vehicle model identical to the user's own vehicle onto a blank area such as the ground through the terminal device, and the maintenance process of the vehicle part is demonstrated to the user in detail through the dynamic AR vehicle model, so that the owner can easily maintain vehicle parts without external guidance. The AR technology presents the maintenance process to the user more accurately, the guidance is more intuitive and convenient, the user can operate step by step following the AR instructions, the operation guidance information is precise, and vehicle maintenance becomes simpler and more convenient.
In addition to the AR displaying the corresponding detection feedback information, a purchase suggestion link or a 4S store reservation suggestion may be displayed to the user on an application interface of the terminal device.
For example, when it is determined that a scanned vehicle part needs maintenance or can be repaired manually by the user, in addition to providing the corresponding AR display suggestions, a purchase suggestion link may be provided if the part needs to be purchased; if it is determined that the scanned vehicle part needs repair and the problem is serious, requiring professional repair at a 4S store, a 4S store reservation suggestion may be provided to the user.
The vehicle component detecting method according to the embodiment of the present application may be executed in a terminal device, where the terminal device executes the steps 102 to 108. In one embodiment, in order to increase the operation speed of the vehicle component detection, the component identification in step 104 and/or the component current state detection in step 106 may also be performed at the server side, and the server side determines the result and feeds the corresponding result back to the terminal device for the subsequent steps.
The vehicle part detection method provided by the embodiments of the application scans a predetermined position area of the vehicle to obtain a scanned image corresponding to the predetermined position area, identifies the part in the predetermined position area from the scanned image, detects the current state of the part based on image recognition, and displays detection feedback information for the part through augmented reality according to the current state of the part. The detection of vehicle parts provided by the embodiments of the application is therefore simple, direct, and operable anytime and anywhere, reducing the time and money a user spends on vehicle detection and improving the user experience.
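The four-step chain summarized above (scan, identify, assess, display) can be sketched end to end. Everything here is a stand-in for the corresponding step, not the patent's implementation: the inputs abstract away the image, and the state thresholds are arbitrary placeholders:

```python
from dataclasses import dataclass

@dataclass
class DetectionResult:
    part: str        # step 104: identified part
    state: str       # step 106: current state from image recognition
    feedback: list   # step 108: content to display via augmented reality

def detect(part, wear):
    """Toy pipeline: a part name and wear fraction stand in for the scan."""
    # Step 106: a placeholder state rule; real thresholds are per-part.
    if wear < 0.2:
        state, feedback = "good", ["maintenance manual"]
    elif wear < 0.6:
        state, feedback = "needs maintenance", ["maintenance steps (AR)"]
    else:
        state, feedback = "needs repair", ["repair steps (AR)"]
    return DetectionResult(part, state, feedback)
```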
Compared with most existing automobile maintenance approaches, the embodiments of the application address the problems of inadequate user experience and the inability to accurately detect and subsequently maintain the state of vehicle wearing parts and other components. Moreover, with the continuous development of artificial intelligence technology, the recognition capability and the maintenance suggestions can be continuously optimized and updated, further improving the owner's experience.
Referring now to fig. 5, fig. 5 is an exemplary flowchart of a vehicle component detection method of an embodiment of the present application.
As shown in fig. 5, the method comprises the following steps:
step 502: opening a terminal device, such as a camera of a mobile phone, and scanning the position of a part corresponding to a vehicle to obtain a scanned image;
step 504: identifying information about the scanned position, namely the identity and current state of the parts in the corresponding area of the scanned image;
step 506: determining whether a valid component is identified, if not (shown in the figure "N"), returning to step 502, and if yes (shown in the figure "Y"), proceeding to step 508;
step 508: analyzing the state of the part through image recognition, where the corresponding state is in need of repair, good, or in need of maintenance;
Step 510: if the component is identified to be in good condition, ending the process, or further displaying the component maintenance manual and the operation instruction to the user through the AR;
step 512: if the part is identified as needing maintenance, displaying the relevant part through AR, providing the user with AR-displayed maintenance suggestions, and demonstrating the detailed maintenance steps through dynamic AR;
step 514: if the part is identified as needing repair, further judging whether the fault of the part is a serious problem; if not, proceeding to step 516, and if so, proceeding to step 518;
step 516: if the problem is not serious, the part can be repaired manually by the user; the relevant part is displayed through AR, AR repair suggestions are provided to the user, and the detailed repair steps are demonstrated through dynamic AR;
step 518: if the part is identified as needing repair and the problem is serious, requiring professional repair at the 4S store, a 4S store reservation suggestion is provided to the user.
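The Fig. 5 branching (steps 506 through 518) can be encoded as a small dispatch. The severity check is reduced to a boolean, and all return strings are placeholders for the AR/VR displays the flow describes:

```python
def next_action(part, state, serious=False):
    """Route a scan result per the Fig. 5 flow; part=None means no valid
    part was identified."""
    if part is None:
        return "rescan"                        # step 506 "N" -> step 502
    if state == "good":
        return "show manual via AR"            # step 510
    if state == "needs maintenance":
        return "show AR maintenance steps"     # step 512
    if state == "needs repair":
        if serious:                            # step 514
            return "suggest 4S store booking"  # step 518
        return "show AR repair steps"          # step 516
    raise ValueError(f"unknown state: {state!r}")
```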
With the above method, for example, when a user scans the wiper with a mobile phone, the phone automatically recognizes that the current part is a wiper, that its current wear is serious, and that the user can replace it personally; it informs the user through the AR display that the wiper needs to be replaced, and provides an official mall purchase link for the user to buy a replacement. The user can then follow the AR replacement instructions for the wiper on the mobile phone step by step and finally complete the replacement.
Optionally, an embodiment of the present application further provides a vehicle component detecting device, and fig. 6 is a block diagram of the vehicle component detecting device according to the embodiment of the present application.
As shown in fig. 6, the apparatus 2000 includes a memory 2200 and a processor 2400 electrically connected to the memory 2200, where the memory 2200 stores a computer program that can be executed by the processor 2400, and the computer program, when executed by the processor, implements each process of any one of the above embodiments of the vehicle component detecting method, and can achieve the same technical effect, and is not repeated here to avoid repetition.
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of any one of the above vehicle part detection method embodiments, and can achieve the same technical effect, and is not described herein again to avoid repetition. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a … …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, though in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present application may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and including instructions for enabling a terminal (such as a mobile phone, computer, server, air conditioner, or network device) to execute the methods according to the embodiments of the present application.
While the embodiments have been described with reference to the accompanying drawings, the invention is not limited to the precise embodiments described above, which are illustrative rather than restrictive; various changes may be made by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (10)
1. A vehicle component detection method, comprising:
scanning a preset position area of a vehicle to obtain a scanning image corresponding to the preset position area;
identifying parts in the preset position area from the scanning image;
detecting a current state of the part based on image recognition;
and displaying the detection feedback information of the part through augmented reality according to the current state of the part.
2. The method of claim 1, wherein identifying parts within the predetermined location area from the scanned image comprises:
comparing the scanned image with a first database, wherein the first database comprises first images of parts in various shapes;
when there is a first image matching the current shape of the part in the scanned image, identifying the part within the predetermined location area as a matching part in the first image.
3. The method of claim 1, wherein detecting the current state of the part based on image recognition comprises:
comparing the scanned image with a second database, wherein the second database comprises a second image of the original factory shape of the part in the scanned image;
and determining the current state of the part based on the change between the current shape of the part in the scanned image and the original factory shape of the part in the second image.
4. The method of claim 3, wherein determining the current state of the part based on the change in the current shape of the part in the scanned image from the original factory shape of the part in the second image comprises:
and determining the current state of the part based on the deformation degree of the current shape of the part in the scanned image and the original factory shape of the part in the second image.
5. The method of claim 1, wherein detecting the current state of the part based on image recognition comprises:
comparing the scanned image with a third database, wherein the part in the scanned image is an alarm lamp, and the third database comprises third images of different display states of the alarm lamp of the type corresponding to the part;
when there is a third image that matches the current display state of the alarm lamp in the scanned image, determining the display state of the alarm lamp in the third image as the current state of the part.
6. The method of claim 5, further comprising:
and determining, based on the display state of the alarm lamp in the third image, the current state of the part to which the alarm prompt of the alarm lamp corresponds.
7. The method of claim 1, wherein displaying the detection feedback information of the part through augmented reality according to the current state of the part comprises at least one of the following:
when the current state of the part is detected to be good, displaying at least one item of a maintenance manual and an instruction book corresponding to the part through augmented reality;
when the current state of the part is detected as needing maintenance, displaying through augmented reality at least one of a maintenance suggestion for the part and a dynamic presentation of the detailed maintenance steps for the part;
and when the current state of the part is detected as needing repair, displaying through augmented reality at least one of a repair suggestion for the part and a dynamic presentation of the detailed repair steps for the part.
8. The method of claim 7, further comprising, prior to displaying the detection feedback information of the part through augmented reality according to the current state of the part:
acquiring vehicle type information of the vehicle;
displaying a model of the same type of vehicle corresponding to the vehicle and highlighting the parts through augmented reality based on the vehicle type information of the vehicle; wherein,
and displaying detection feedback information of the part at the periphery of the highlighted part.
9. A vehicle component detection device, characterized by comprising: a memory and a processor electrically connected to the memory, the memory storing a computer program executable on the processor, the computer program, when executed by the processor, implementing the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
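As an illustration of the database comparisons recited in claims 2 through 4, the following is a minimal sketch. The per-pixel similarity metric, the wear threshold, and all names are assumptions for illustration, not part of the claims:

```python
# Illustrative sketch of the image comparisons in claims 2-4: identify a part
# by matching the scan against a first database of part shapes (claim 2), then
# derive its state from deformation versus the factory image (claims 3-4).
# The metric and threshold are assumptions, not from the patent.

def similarity(img_a, img_b):
    """Mean per-pixel agreement between two equal-sized grayscale images in [0, 1]."""
    flat_a = [p for row in img_a for p in row]
    flat_b = [p for row in img_b for p in row]
    diff = sum(abs(a - b) for a, b in zip(flat_a, flat_b))
    return 1.0 - diff / (255.0 * len(flat_a))

def identify_part(scan, first_database):
    """Claim 2: return the name of the part whose stored shape best matches the scan."""
    name, _ = max(first_database.items(), key=lambda kv: similarity(scan, kv[1]))
    return name

def current_state(scan, factory_image, wear_threshold=0.9):
    """Claims 3-4: map deformation relative to the factory shape to a state."""
    if similarity(scan, factory_image) >= wear_threshold:
        return "good"
    return "needs maintenance"
```

For example, with a tiny database `{"wiper": [[0, 0], [0, 0]], "mirror": [[255, 255], [255, 255]]}`, a scan of `[[10, 0], [0, 0]]` would be identified as the wiper and, being close to its factory image, reported as in good condition. A production system would use a learned matcher rather than raw pixel differences; this sketch only mirrors the claim structure.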
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011473095.0A CN112446513A (en) | 2020-12-15 | 2020-12-15 | Vehicle part detection method and device and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112446513A true CN112446513A (en) | 2021-03-05 |
Family ID: 74740411
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011473095.0A Pending CN112446513A (en) | 2020-12-15 | 2020-12-15 | Vehicle part detection method and device and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112446513A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103940620A (en) * | 2014-04-24 | 2014-07-23 | 成都华诚立信科技有限公司 | Method and system for distinguishing and prompting vehicle failures on basis of image identification |
CN106021548A (en) * | 2016-05-27 | 2016-10-12 | 大连楼兰科技股份有限公司 | Remote damage assessment method and system based on distributed artificial intelligent image recognition |
CN106296118A (en) * | 2016-08-03 | 2017-01-04 | 深圳市永兴元科技有限公司 | Car damage identification method based on image recognition and device |
CN107392218A (en) * | 2017-04-11 | 2017-11-24 | 阿里巴巴集团控股有限公司 | A kind of car damage identification method based on image, device and electronic equipment |
CN107403424A (en) * | 2017-04-11 | 2017-11-28 | 阿里巴巴集团控股有限公司 | A kind of car damage identification method based on image, device and electronic equipment |
CN110009508A (en) * | 2018-12-25 | 2019-07-12 | 阿里巴巴集团控股有限公司 | A kind of vehicle insurance compensates method and system automatically |
CN209147928U (en) * | 2019-01-22 | 2019-07-23 | 南昌工程学院 | Machine vision detection device for automobile parts |
CN110263615A (en) * | 2019-04-29 | 2019-09-20 | 阿里巴巴集团控股有限公司 | Interaction processing method, device, equipment and client in vehicle shooting |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113111091A (en) * | 2021-04-19 | 2021-07-13 | 深圳市轱辘车联数据技术有限公司 | Maintenance information acquisition method and device, computer equipment and storage medium |
CN114739293A (en) * | 2022-03-25 | 2022-07-12 | 北京博联众睿机器人科技有限公司 | Vehicle body measuring method, system, device and electronic equipment |
CN115212586A (en) * | 2022-04-29 | 2022-10-21 | 长城汽车股份有限公司 | Automobile model and automobile model interaction system |
CN115212586B (en) * | 2022-04-29 | 2024-01-02 | 长城汽车股份有限公司 | Automobile model and automobile model interaction system |
CN115063431A (en) * | 2022-08-19 | 2022-09-16 | 山东远盾网络技术股份有限公司 | Automobile part quality tracing method based on image processing |
CN115063431B (en) * | 2022-08-19 | 2022-11-11 | 山东远盾网络技术股份有限公司 | Automobile part quality tracing method based on image processing |
CN115633058A (en) * | 2022-09-30 | 2023-01-20 | 中国第一汽车股份有限公司 | Processing method and processing device for vehicle-mounted equipment |
WO2024170648A1 (en) * | 2023-02-16 | 2024-08-22 | Valeo Systèmes d'Essuyage | Method for associating a wiping blade and a windscreen-wiper |
FR3145912A1 (en) * | 2023-02-16 | 2024-08-23 | Valeo Systemes D'essuyage | Method of combining a wiper blade and a windshield wiper blade |
CN116500058A (en) * | 2023-06-28 | 2023-07-28 | 信丰云创(深圳)信息有限公司 | Display screen detection system and display screen detection method |
CN116500058B (en) * | 2023-06-28 | 2024-03-19 | 信丰云创(深圳)信息有限公司 | Display screen detection system and display screen detection method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112446513A (en) | Vehicle part detection method and device and computer readable storage medium | |
US11361426B2 (en) | Paint blending determination | |
CN110395260B (en) | Vehicle, safe driving method and device | |
CN111488875B (en) | Vehicle insurance claim settlement loss checking method and device based on image recognition and electronic equipment | |
KR102198296B1 (en) | Apparatus, method and computer program for automatically calculating the damage | |
EP1891580B1 (en) | Method and a system for detecting a road at night | |
CN208239883U (en) | Vehicle failure intelligent checking system | |
EP3413287A1 (en) | Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle | |
US10703299B2 (en) | Rear view mirror simulation | |
US20220041105A1 (en) | Rearview device simulation | |
CN113911073A (en) | Vehicle sensor cleaning | |
KR20190129264A (en) | Method for suggesting service related to vehicle by using information on vehicle | |
US11521393B2 (en) | Information processing system, program, and control method | |
KR20220024768A (en) | Damage detection device and method | |
US20240320751A1 (en) | Remote vehicle damage assessment | |
CN108280201A (en) | A kind of information of vehicles generation method, device and its system | |
CN111045636A (en) | Vehicle function display method and system | |
KR102354754B1 (en) | Vehicle-related products recommendation and sales system | |
CN115633058A (en) | Processing method and processing device for vehicle-mounted equipment | |
KR20220036308A (en) | Method and system for guiding car repair based on image analysis | |
KR102498615B1 (en) | AI Based Night Vision System | |
KR20220029930A (en) | Vehicle management apparatus | |
CN116605141B (en) | Display method and device of electronic rearview mirror, electronic equipment and storage medium | |
US20240083354A1 (en) | Vehicle lamp information notification device, vehicle lamp information notification system, vehicle lamp information notification method, and non-transitory storage medium | |
CN118781568A (en) | Behavior early warning method and device based on rearview mirror, electronic equipment and storage medium |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210305 |