CN117252867B - VR equipment production product quality monitoring analysis method based on image recognition - Google Patents

VR equipment production product quality monitoring analysis method based on image recognition Download PDF

Info

Publication number
CN117252867B
CN117252867B CN202311512282.9A CN202311512282A CN117252867B CN 117252867 B CN117252867 B CN 117252867B CN 202311512282 A CN202311512282 A CN 202311512282A CN 117252867 B CN117252867 B CN 117252867B
Authority
CN
China
Prior art keywords
target
equipment
coefficient
response
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311512282.9A
Other languages
Chinese (zh)
Other versions
CN117252867A (en
Inventor
林鸿郁
黄灿坚
刘畅
武翠光
刘伯德
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Pinzhong Electronic Technology Co ltd
Original Assignee
Guangzhou Pinzhong Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Pinzhong Electronic Technology Co ltd filed Critical Guangzhou Pinzhong Electronic Technology Co ltd
Priority to CN202311512282.9A priority Critical patent/CN117252867B/en
Publication of CN117252867A publication Critical patent/CN117252867A/en
Application granted granted Critical
Publication of CN117252867B publication Critical patent/CN117252867B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395Quality analysis or management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The invention relates to the field of quality monitoring and analysis of VR equipment production products, and particularly discloses a quality monitoring and analysis method of VR equipment production products based on image recognition, which evaluates the appearance quality of VR equipment from dimensions such as deformation, scratch, dirt, printing and the like, improves the reliability of appearance inspection of the VR equipment, and ensures safe, reliable and comfortable use of the VR equipment; evaluating the image display quality of the VR equipment from the dimensions of resolution, brightness, contrast, color accuracy and the like, and ensuring the clear and real image presentation of the VR equipment; evaluating audio output quality of the VR device from the frequency response and the sound balance dimension, such that the VR device provides a clear, realistic, fluent sound effect; and the sensor tracking capability of the VR device is evaluated from the gesture tracking and motion tracking dimensions, so that the VR device can accurately and rapidly track the action change and position movement of the user, thereby realizing more real and immersive virtual reality experience and meeting complex interaction requirements.

Description

VR equipment production product quality monitoring analysis method based on image recognition
Technical Field
The invention relates to the field of quality monitoring and analysis of VR equipment production products, in particular to a quality monitoring and analysis method of VR equipment production products based on image recognition.
Background
With the rapid development of VR technology, VR devices have been widely used in the fields of games, education, medical treatment, etc., to provide users with an immersive experience. VR equipment is as a high-tech product, and is very strict to the quality requirement, if VR equipment quality is up to standard, not only influence user's experience and feel, there is the potential safety hazard even. Through the product quality monitoring of VR equipment, defects and problems in the production process of the VR equipment can be timely found and solved, complaints and accidents caused by quality problems are reduced, the satisfaction and the trust degree of users are improved, the product performance and the reliability are improved, and the production efficiency and the product competitiveness are improved. Therefore, the importance of VR device product quality monitoring is not negligible.
The existing VR equipment production product quality monitoring methods are mostly concentrated on routine inspection of the appearance of VR equipment, and lack of monitoring on performance of VR equipment or insufficient comprehensive and deep performance monitoring, so that some defects exist: according to the method, in-depth monitoring analysis on the image display quality of the VR equipment is lacking, such as resolution, brightness, contrast, color accuracy and the like of the image display of the VR equipment, the problems of blurring of the image quality, color distortion, pixel missing and the like can occur due to poor image display quality of the VR equipment, visual experience of a user is further affected, sense of reality and sense of immersion of the user on a virtual reality scene are reduced, expected sense of immersion cannot be obtained, eye fatigue of the user after the user uses the equipment for a long time is increased, uncomfortable symptoms such as dizziness and vertigo even occur, and negative user comments and public praise are caused.
In the second aspect, the existing method lacks in-depth monitoring analysis on the audio output quality of the VR device, such as frequency response and sound balance of the audio output of the VR device, and the problems of tone quality distortion and the like can occur when the audio output quality of the VR device is poor, so that the hearing experience of a user is affected, the user cannot obtain a real and clear sound effect, the immersion and realism of the user on a virtual reality scene are reduced, and the smoothness and pleasure of the user on virtual reality application such as games, videos and the like are affected, so that the VR device is difficult to be completely integrated into the virtual reality environment.
In the third aspect, the existing method lacks in-depth monitoring analysis on the tracking capability of the VR device sensor, such as gesture tracking and motion tracking of the VR device, and poor tracking capability of the VR device sensor can lead to inaccurate or delayed tracking of motion and position, so that immersion and reality of a user are reduced, user experience is affected, complex interaction requirements cannot be met, and diversity and innovation of virtual reality application scenes are limited.
Disclosure of Invention
Aiming at the problems, the invention provides a VR equipment production product quality monitoring and analyzing method based on image recognition, which comprises the following specific technical scheme: a VR equipment production product quality monitoring analysis method based on image recognition includes the following steps: step one, monitoring appearance quality of VR equipment: and obtaining appearance information of each VR device in the current production batch of the target VR device production factory, recording the appearance information as the appearance information of each target VR device, wherein the appearance information comprises a deformation coefficient, a scratch coefficient, a dirt coefficient and a printing coincidence coefficient, and analyzing appearance quality evaluation indexes of each target VR device.
Step two, monitoring image display quality of VR equipment: and acquiring the resolution, brightness, contrast and color accuracy of each display image of each target VR device, and analyzing the image display quality evaluation index of each target VR device.
Third, monitoring audio output quality of VR equipment: and acquiring the frequency response coincidence coefficient and the sound balance coincidence coefficient of the audio output of each target VR device, and analyzing the audio output quality evaluation index of each target VR device.
Step four, monitoring tracking capability of VR equipment sensors: acquiring the attitude fitness and the attitude change response time of each target VR device in each attitude tracking test, acquiring the positioning fitness and the position movement response time of each target VR device in each motion tracking test, and analyzing the sensor tracking ability evaluation index of each target VR device.
Fifthly, VR equipment product quality assessment feedback: and feeding back the appearance quality evaluation index, the image display quality evaluation index, the audio output quality evaluation index and the sensor tracking ability evaluation index of each target VR device to a production quality management department of a target VR device production factory.
On the basis of the above embodiment, the specific analysis process in the first step includes: f1: the three-dimensional model of each target VR device is obtained through a laser scanner, the three-dimensional model of each target VR device is compared with the standard three-dimensional model of the target VR device stored in a database, the deformation coefficient of each target VR device is obtained through analysis, and the deformation coefficient is recorded as,/>Indicate->Number of individual target VR device,/->
F2: obtaining angle images of each target VR device, splicing to obtain live-action images of each target VR device, and dividing the surface area of the target VR device according to a preset principle to obtain each subarea of the surface of the target VR device.
According to the live-action image of each target VR device, obtaining the length of each scratch in each sub-area of each target VR device surface, analyzing to obtain the scratch coefficient of each target VR device, and recording the scratch coefficient as
F3: according to eachThe real-scene image of the target VR equipment is used for acquiring the dirty areas of each part in each subarea of the surface of each target VR equipment, analyzing and obtaining the dirty coefficients of each target VR equipment, and recording the dirty coefficients as
F4: according to the real image of each target VR device, the position and the outline of each identifier on the surface of each target VR device are obtained, the position deviation and the overlapping outline length of each identifier on the surface of each target VR device are analyzed, the printing matching coefficient of each target VR device is further obtained, and the printing matching coefficient is recorded as
On the basis of the above embodiment, the specific analysis process of the first step further includes: by analysis of formulasObtaining the appearance quality evaluation index of each target VR device>WhereinRespectively representing the weights of preset deformation coefficients, scratch coefficients, dirt coefficients and printing fit coefficients,
based on the above embodiment, the specific analysis process in the second step includes: acquiring a set number of display images of each target VR device according to a preset principle, further acquiring the resolution, brightness and contrast of each display image of each target VR device, and respectively marking the resolution, brightness and contrast as,/>Indicate->The number of the individual display images is set,
acquiring each color region of each display image in each target VR device, further acquiring the gray value of each color region in each display image in each target VR device, and recording the gray value as,/>Indicate->The number of the individual color zones is determined,extracting standard gray values of each color region in each display image stored in the database, and recording the standard gray values asBy analysis formula->Obtain color accuracy +.>Wherein->Representing the number of color areas +.>Representing the gray value deviation threshold of the color region of the display image.
On the basis of the above embodiment, the specific analysis process of the second step further includes: by analysis of formulasObtaining each target VR deviceImage display quality evaluation index +.>Wherein->Representing the number of images to be displayed,respectively representing a resolution threshold, a proper brightness, a proper contrast and a color accuracy threshold of the image display of the preset target VR device, < >>Respectively representing a preset brightness deviation threshold value and contrast deviation threshold value,/->Weights respectively representing preset resolution, brightness, contrast and color accuracy.
On the basis of the above embodiment, the specific analysis process in the third step includes: setting each test frequency audio signal played by each target VR device according to a preset principle, acquiring output response of each target VR device when playing each test frequency audio signal, further acquiring output response curves of each target VR device when playing each test frequency audio signal, extracting reference output response curves stored in a database when each target VR device plays each test frequency audio signal, comparing the output response curves of each target VR device when playing each test frequency audio signal with the corresponding reference output response curves, acquiring the coincidence degree of the output response curves of each target VR device when playing each test frequency audio signal and the corresponding reference output response curves, and marking the coincidence degree as the frequency response coincidence degree of each target VR device under each test frequency audio signal and representing the coincidence degree as,/>Indicate->Number of audio signal of test frequency, +.>
By analysis of formulasObtaining the frequency response coincidence coefficient of the audio output of each target VR device>Wherein->Representing the number of audio signals of the test frequency, +.>Indicating a preset frequency response compliance threshold.
On the basis of the above embodiment, the specific analysis process in the third step further includes: setting audio frequency bands played by target VR equipment, acquiring the audio frequency bands played by each target VR equipment, selecting each frequency point in a frequency range corresponding to the audio frequency bands played by each target VR equipment according to a preset principle, marking the frequency point as each monitoring frequency point of the audio frequency bands played by each target VR equipment, acquiring the volume corresponding to each monitoring frequency point in the audio frequency bands played by each target VR equipment, and marking the volume as,/>Indicate->Number of each monitoring frequency point, +.>
By analysis of formulasObtaining the sound balance coincidence coefficient of the audio output of each target VR device>Wherein->Indicating the number of monitoring frequency points, +.>Indicate->The (th) of the target VR device playing audio band>Volume corresponding to each monitoring frequency point, +.>Indicating a preset volume deviation threshold.
On the basis of the above embodiment, the specific analysis process in the third step further includes: by analysis of formulasObtaining audio output quality evaluation index of each target VR deviceWherein->Respectively representing the preset frequency response compliance coefficients and the threshold values of the sound balance compliance coefficients,respectively representing the preset frequency response coincidence coefficient and the weight factor of the sound balance coincidence coefficient.
Based on the above embodiment, the specific analysis process in the fourth step includes: t1: carrying out gesture tracking test for each target VR device for set times according to a preset principle to obtainThe actual response gesture of each target VR device in each gesture tracking test is compared with the standard response gesture of each gesture tracking test stored in the database, so as to obtain the coincidence degree of the actual response gesture of each target VR device in each gesture tracking test and the standard response gesture corresponding to the gesture tracking test, and the coincidence degree is recorded as the gesture coincidence degree of each target VR device in each gesture tracking test and expressed as,/>Indicate->The number of the secondary pose tracking test is given,
acquiring the gesture change response time of each target VR device in each gesture tracking test, and recording the gesture change response time as
T2: according to a preset principle, performing motion tracking tests for set times on each target VR device, obtaining actual response positions of each target VR device in each motion tracking test, comparing the actual response positions of each target VR device in each motion tracking test with standard response positions of each motion tracking test stored in a database, obtaining distances between the actual response positions of each target VR device in each motion tracking test and the corresponding standard response positions of each motion tracking test, and recording the distances as,/>Indicate->Number of secondary exercise tracking test, +.>By analysis formula->Obtaining the positioning fitness of each target VR device in each motion tracking test>Wherein->And the influence factor corresponding to the unit distance between the preset actual response position and the standard response position is represented.
Acquiring the position movement response time length of each target VR device in each motion tracking test, and recording the time length as
On the basis of the above embodiment, the specific analysis process in the fourth step further includes: by analysis of formulasObtaining the posture tracking evaluation coefficient of each target VR device>WhereinThreshold values respectively representing preset attitude fitness and attitude change response time length, < >>Representing a preset->The secondary pose tracks the weight of the test.
By analysis of formulasObtaining action tracking evaluation coefficient of each target VR device>Wherein->Threshold values respectively representing preset positioning fitness and position movement response time length, +.>Representing a preset->The secondary motion tracks the weights of the test.
By analysis of formulasObtaining a sensor traceability evaluation index for each target VR device>
Compared with the prior art, the VR equipment production product quality monitoring and analyzing method based on image recognition has the following beneficial effects: 1. according to the invention, the deformation coefficient, the scratch coefficient, the dirt coefficient and the printing matching coefficient of the target VR equipment are obtained, the appearance quality evaluation index of the target VR equipment is analyzed, the appearance quality of a VR equipment product is comprehensively evaluated from multiple dimensions, the reliability of appearance inspection of the VR equipment is improved, safe, reliable and comfortable use of the VR equipment is ensured, and good brand image is maintained.
2. According to the invention, the resolution, brightness, contrast and color accuracy of the image display of the target VR equipment are obtained, the image display quality of the target VR equipment is evaluated, the image display quality of the VR equipment is deeply analyzed from multiple dimensions, clear and real image presentation of the VR equipment is ensured, and the satisfactory visual effect and immersive experience of a user are provided.
3. According to the invention, the audio output quality of the target VR device is evaluated by acquiring the frequency response coincidence coefficient and the sound balance coincidence coefficient of the audio output of the target VR device, and the audio output quality of the VR device is deeply analyzed from multiple dimensions, so that the VR device provides clear, real and smooth sound effects, and the satisfaction degree and participation degree of users on virtual reality experience are improved.
4. According to the invention, through monitoring the gesture tracking and motion tracking effects of the VR equipment, the sensor tracking capability of the VR equipment is evaluated, and the sensor tracking capability of the VR equipment is deeply analyzed from multiple dimensions, so that the VR equipment can accurately and rapidly track the action change and position movement of a user, thereby realizing more real and immersive virtual reality experience and meeting complex interaction requirements.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of the method of the present invention.
Fig. 2 is a schematic diagram of a VR device production quality assessment model of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1 and 2, the method for monitoring and analyzing quality of VR equipment production products based on image recognition provided by the invention comprises the following steps: step one, monitoring appearance quality of VR equipment: and obtaining appearance information of each VR device in the current production batch of the target VR device production factory, recording the appearance information as the appearance information of each target VR device, wherein the appearance information comprises a deformation coefficient, a scratch coefficient, a dirt coefficient and a printing coincidence coefficient, and analyzing appearance quality evaluation indexes of each target VR device.
As a preferred embodiment, the specific analysis process of the first step includes: f1: the three-dimensional model of each target VR device is obtained through a laser scanner, the three-dimensional model of each target VR device is compared with the standard three-dimensional model of the target VR device stored in a database, the deformation coefficient of each target VR device is obtained through analysis, and the deformation coefficient is recorded as,/>Indicate->Number of individual target VR device,/->
The deformation coefficient of each target VR device is analyzed by the following specific method: comparing the three-dimensional model of each target VR device with the standard three-dimensional model of the target VR device stored in the database to obtain the coincidence degree of the three-dimensional model of each target VR device and the standard three-dimensional model, and recording the coincidence degree as,/>Indicate->The number of the individual target VR device(s),by analysis formula->Obtaining the deformation coefficient of each target VR device>Wherein->And representing a threshold value of the coincidence degree of the preset target VR equipment three-dimensional model and the standard three-dimensional model.
F2: obtaining angle images of each target VR device, splicing to obtain live-action images of each target VR device, and dividing the surface area of the target VR device according to a preset principle to obtain each subarea of the surface of the target VR device.
According to the live-action image of each target VR device, obtaining the length of each scratch in each sub-area of each target VR device surface, analyzing to obtain the scratch coefficient of each target VR device, and recording the scratch coefficient as
It should be noted that, the scratch coefficient of each target VR device is analyzed, and the specific method includes: obtaining the length of scratches on each part of the surface of each target VR device according to the live-action image of each target VR device, further obtaining the length of scratches on each part of each sub-area of the surface of each target VR device, counting to obtain the total length of scratches on each sub-area of the surface of each target VR device, and recording the total length as,/>Indicate->Number of sub-area>By analysis of the formulaObtaining scratch coefficient of each target VR device>Wherein->Indicating the influence factor corresponding to the preset unit scratch length, < ->Representing a preset->Weighting factors for the sub-regions.
F3: according to the real-scene image of each target VR device, the area of each dirty area in each sub-area of the surface of each target VR device is obtained, the dirty coefficient of each target VR device is obtained by analysis, and is recorded as
The specific method for analyzing the pollution coefficient of each target VR device is as follows: obtaining the area of each dirty area on the surface of each target VR device according to the live-action image of each target VR device, further obtaining the area of each dirty area in each sub-area on the surface of each target VR device, counting to obtain the total dirty area of each sub-area on the surface of each target VR device, and recording the total dirty area asBy analysis formula->Obtaining the fouling factor of each target VR device>WhereinAnd representing the influence factor corresponding to the preset unit dirt area.
F4: according to the real image of each target VR device, the position and the outline of each identifier on the surface of each target VR device are obtained, the position deviation and the overlapping outline length of each identifier on the surface of each target VR device are analyzed, the printing matching coefficient of each target VR device is further obtained, and the printing matching coefficient is recorded as
The printing matching coefficients of each target VR device are analyzed, and the specific method is as follows: acquiring each identifier of each target VR device surface according to the real image of each target VR device, further obtaining the position and the outline of each identifier of each target VR device surface, comparing the position of each identifier of each target VR device surface with the standard positions of each identifier of the target VR device surface stored in a database, obtaining the distance between the position of each identifier of each target VR device surface and the corresponding standard position, marking the distance as the position deviation of each identifier of each target VR device surface, and representing the distance as the position deviation of each identifier of each target VR device surface,/>Indicate->Number of the identifier->
Comparing the contour of each identifier on the surface of each target VR device with the standard contour of each identifier on the surface of each target VR device stored in the database to obtain the superposition length of the contour of each identifier on the surface of each target VR device and the corresponding standard contour, and marking the superposition length as the superposition contour length of each identifier on the surface of each target VR device and representing the superposition length as
By analysis of formulasObtaining the printing matching coefficient of each target VR device>Wherein->Representing natural constant->Indicating the influence factor corresponding to the deviation of the preset identifier unit position +.>Representing the target VR device surface stored in the database +.>The total length of the outline of the identifier.
It is noted that the identifier of the target VR device surface includes, but is not limited to: letters, words, numbers, symbols, etc.
As a preferred embodiment, the specific analysis process of the first step further includes: by analysis of formulasObtaining the appearance quality evaluation index of each target VR device>WhereinRespectively representing the weights of preset deformation coefficients, scratch coefficients, dirt coefficients and printing fit coefficients,
in the embodiment, the deformation coefficient, the scratch coefficient, the dirt coefficient and the printing matching coefficient of the target VR equipment are obtained, the appearance quality evaluation index of the target VR equipment is analyzed, the appearance quality of a VR equipment product is comprehensively evaluated from multiple dimensions, the reliability of appearance inspection of the VR equipment is improved, the safe, reliable and comfortable use of the VR equipment is ensured, and good brand image is maintained.
Step two, monitoring image display quality of VR equipment: and acquiring the resolution, brightness, contrast and color accuracy of each display image of each target VR device, and analyzing the image display quality evaluation index of each target VR device.
As a preferred scheme, the specific analysis process of the second step includes: acquiring a set number of display images of each target VR device according to a preset principle, further acquiring the resolution, brightness and contrast of each display image of each target VR device, and respectively marking the resolution, brightness and contrast as,/>Indicate->Number of each display image->
Acquiring each color region of each display image in each target VR device, further acquiring the gray value of each color region in each display image in each target VR device, and recording the gray value as,/>Indicate->The number of the individual color zones is determined,extracting standard gray values of each color region in each display image stored in the database, and recording the standard gray values asBy analysis formula->Obtain color accuracy +.>Wherein->Representing the number of color areas +.>Representing the gray value deviation threshold of the color region of the display image.
The resolution of the display image of the target VR device may be acquired by means of an image resolution detector.
The luminance of the display image of the target VR device may be acquired by means of a luminance meter.
The contrast of the display image of the target VR device may be acquired by means of a color analyzer.
The total number of display images of each target VR device is the same as the screen content of each display image.
As a preferred scheme, the specific analysis process of the second step further includes: by analysis of formulasObtaining an image display quality evaluation index of each target VR device>Wherein->Representing the number of images to be displayed,respectively representing a resolution threshold, a proper brightness, a proper contrast and a color accuracy threshold of the image display of the preset target VR device, < >>Respectively representing a preset brightness deviation threshold value and contrast deviation threshold value,/->Weights respectively representing preset resolution, brightness, contrast and color accuracy.
In the embodiment, the resolution, brightness, contrast and color accuracy of the image display of the target VR device are obtained, the image display quality of the target VR device is evaluated, the image display quality of the VR device is deeply analyzed from multiple dimensions, clear and real image presentation of the VR device is ensured, and a satisfactory visual effect and immersive experience of a user are provided.
Third, monitoring audio output quality of VR equipment: and acquiring the frequency response coincidence coefficient and the sound balance coincidence coefficient of the audio output of each target VR device, and analyzing the audio output quality evaluation index of each target VR device.
As a preferred scheme, the specific analysis process of the third step includes: setting each test frequency audio signal played by each target VR device according to a preset principle, acquiring output response of each target VR device when playing each test frequency audio signal, further acquiring output response curves of each target VR device when playing each test frequency audio signal, extracting reference output response curves stored in a database when each target VR device plays each test frequency audio signal, comparing the output response curves of each target VR device when playing each test frequency audio signal with the corresponding reference output response curves, acquiring the coincidence degree of the output response curves of each target VR device when playing each test frequency audio signal and the corresponding reference output response curves, and marking the coincidence degree as the frequency response coincidence degree of each target VR device under each test frequency audio signal and representing the coincidence degree as,/>Indicate->Number of audio signal of test frequency, +.>
By analysis of formulasObtaining the frequency response coincidence coefficient of the audio output of each target VR device>Wherein->Representing the number of audio signals of the test frequency, +.>Indicating a preset frequency response compliance threshold.
It should be noted that, the frequency response of the audio output of the target VR device may be monitored by an audio analyzer or software, such as a spectrum analyzer and a signal generator.
As a preferred embodiment, the specific analysis process in the third step further includes: setting audio frequency bands played by target VR equipment, acquiring the audio frequency bands played by each target VR equipment, selecting each frequency point in a frequency range corresponding to the audio frequency bands played by each target VR equipment according to a preset principle, marking the frequency point as each monitoring frequency point of the audio frequency bands played by each target VR equipment, acquiring the volume corresponding to each monitoring frequency point in the audio frequency bands played by each target VR equipment, and marking the volume as,/>Indicate->Number of each monitoring frequency point, +.>
By analysis of formulasObtaining the sound balance coincidence coefficient of the audio output of each target VR device>Wherein->Indicating the number of monitoring frequency points, +.>Indicate->The (th) of the target VR device playing audio band>Volume corresponding to each monitoring frequency point, +.>Indicating a preset volume deviation threshold.
It should be noted that, the monitoring of the sound balance of the audio output of the target VR device may be performed by means of audio measurement tools, such as a sound level meter and a spectrum analyzer.
As a preferred embodiment, the specific analysis process in the third step further includes: by analysis of formulasObtaining audio output quality evaluation index of each target VR deviceWherein->Respectively representing the preset frequency response compliance coefficients and the threshold values of the sound balance compliance coefficients,respectively representing the preset frequency response coincidence coefficient and the weight factor of the sound balance coincidence coefficient.
In this embodiment, the audio output quality of the target VR device is evaluated by acquiring the frequency response coincidence coefficient and the sound balance coincidence coefficient of the audio output of the target VR device, and the audio output quality of the VR device is deeply analyzed from multiple dimensions, so that the VR device provides clear, real and smooth sound effects, thereby improving the satisfaction degree and participation degree of the user in the virtual reality experience.
Step four, monitoring tracking capability of VR equipment sensors: acquiring the attitude fitness and the attitude change response time of each target VR device in each attitude tracking test, acquiring the positioning fitness and the position movement response time of each target VR device in each motion tracking test, and analyzing the sensor tracking ability evaluation index of each target VR device.
As a preferred embodiment, the specific analysis process in the fourth step includes: t1: according to a preset principle, carrying out gesture tracking tests for set times on each target VR device, obtaining actual response gestures of each target VR device in each gesture tracking test, comparing the actual response gestures of each target VR device in each gesture tracking test with standard response gestures of each gesture tracking test stored in a database, obtaining the coincidence degree of the actual response gestures of each target VR device in each gesture tracking test and the standard response gestures corresponding to the gesture tracking test, and marking the coincidence degree as the gesture coincidence degree of each target VR device in each gesture tracking test,/>Indicate->The number of the secondary pose tracking test is given,
acquiring the gesture change response time of each target VR device in each gesture tracking test, and recording the gesture change response time as
T2: according to a preset principle, performing motion tracking tests for set times on each target VR device, obtaining actual response positions of each target VR device in each motion tracking test, comparing the actual response positions of each target VR device in each motion tracking test with standard response positions of each motion tracking test stored in a database, obtaining distances between the actual response positions of each target VR device in each motion tracking test and the corresponding standard response positions of each motion tracking test, and recording the distances as,/>Indicate->Number of secondary exercise tracking test, +.>By analysis formula->Obtaining the positioning fitness of each target VR device in each motion tracking test>Wherein->And the influence factor corresponding to the unit distance between the preset actual response position and the standard response position is represented.
Acquiring the position movement response time length of each target VR device in each motion tracking test, and recording the time length as
It should be noted that the gesture of each gesture tracking test is different.
It should be noted that the positions of the motion tracking tests are different from one another.
It should be noted that, the gesture tracking test is a test for performing a series of gesture changes, such as rotation, tilting, or translation, on the target VR device, and observing whether the gesture tracking system or the sensor of the target VR device can accurately capture the gesture changes of the target VR device.
It should be noted that, the motion tracking test is a series of motion tests performed on the target VR device, such as arm swing, body movement, or head rotation, and observe whether the motion tracking system or sensor of the target VR device can accurately capture the motion change of the target VR device.
As a preferred embodiment, the specific analysis process in the fourth step further includes: by analysis of formulasObtaining the posture tracking evaluation coefficient of each target VR device>Wherein->Threshold values respectively representing preset attitude fitness and attitude change response time length, < >>Representing a preset->The secondary pose tracks the weight of the test.
By analysis of formulasObtaining action tracking evaluation coefficient of each target VR device>Wherein->Threshold values respectively representing preset positioning fitness and position movement response time length, +.>Representing a preset->The secondary motion tracks the weights of the test.
By analysis of formulasObtaining sensor traceability evaluation index of each target VR device
In the embodiment, the gesture tracking and motion tracking effects of the VR equipment are monitored, the sensor tracking capability of the VR equipment is evaluated, the sensor tracking capability of the VR equipment is deeply analyzed from multiple dimensions, and the VR equipment can accurately and rapidly track the action change and the position movement of a user, so that a more real and immersive virtual reality experience is realized, and the complex interaction requirement is met.
Fifthly, VR equipment product quality assessment feedback: and feeding back the appearance quality evaluation index, the image display quality evaluation index, the audio output quality evaluation index and the sensor tracking ability evaluation index of each target VR device to a production quality management department of a target VR device production factory.
The foregoing is merely illustrative and explanatory of the principles of this invention, as various modifications and additions may be made to the specific embodiments described, or similar arrangements may be substituted by those skilled in the art, without departing from the principles of this invention or beyond the scope of this invention as defined in the claims.

Claims (7)

1. The VR equipment production product quality monitoring and analyzing method based on image recognition is characterized by comprising the following steps:
step one, monitoring appearance quality of VR equipment: obtaining appearance information of each VR device in the current production batch of the target VR device production factory, recording the appearance information as appearance information of each target VR device, wherein the appearance information comprises a deformation coefficient, a scratch coefficient, a dirt coefficient and a printing coincidence coefficient, and analyzing appearance quality evaluation indexes of each target VR device;
step two, monitoring image display quality of VR equipment: acquiring resolution, brightness, contrast and color accuracy of each display image of each target VR device, and analyzing image display quality evaluation indexes of each target VR device;
third, monitoring audio output quality of VR equipment: acquiring a frequency response coincidence coefficient and a sound balance coincidence coefficient of audio output of each target VR device, and analyzing an audio output quality evaluation index of each target VR device;
step four, monitoring tracking capability of VR equipment sensors: acquiring the attitude fitness and the attitude change response time of each target VR device in each attitude tracking test, acquiring the positioning fitness and the position movement response time of each target VR device in each motion tracking test, and analyzing the sensor tracking ability evaluation index of each target VR device;
fifthly, VR equipment product quality assessment feedback: the appearance quality evaluation index, the image display quality evaluation index, the audio output quality evaluation index and the sensor tracking ability evaluation index of each target VR device are fed back to a production quality management department of a target VR device production factory;
the specific analysis process of the third step comprises the following steps:
setting each test frequency audio signal played by the target VR equipment according to a preset principle, acquiring output response of each target VR equipment when playing each test frequency audio signal, further acquiring output response curves of each target VR equipment when playing each test frequency audio signal, extracting reference output response curves of each target VR equipment stored in a database when playing each test frequency audio signal, comparing the output response curves of each target VR equipment when playing each test frequency audio signal with the corresponding reference output response curves, and obtaining each target VR equipment when playing each test frequency audio signalThe coincidence degree of the output response curve of the frequency signal and the corresponding reference output response curve is recorded as the frequency response coincidence degree of each target VR equipment under each test frequency audio signal and expressed as,/>Indicate->Number of audio signal of test frequency, +.>
By analysis of formulasObtaining the frequency response coincidence coefficient of the audio output of each target VR deviceWherein->Representing the number of audio signals of the test frequency, +.>Representing a preset frequency response conformity threshold;
the specific analysis process in the third step further comprises the following steps:
setting audio frequency bands played by target VR equipment, acquiring the audio frequency bands played by each target VR equipment, selecting each frequency point in a frequency range corresponding to the audio frequency bands played by each target VR equipment according to a preset principle, marking the frequency point as each monitoring frequency point of the audio frequency bands played by each target VR equipment, acquiring the volume corresponding to each monitoring frequency point in the audio frequency bands played by each target VR equipment, and marking the volume as,/>Indicate->Number of each monitoring frequency point, +.>
By analysis of formulasObtaining the sound balance coincidence coefficient of the audio output of each target VR device>Wherein->Indicating the number of monitoring frequency points, +.>Indicate->The (th) of the target VR device playing audio band>Volume corresponding to each monitoring frequency point, +.>Representing a preset volume deviation threshold;
the specific analysis process in the third step further comprises the following steps:
by analysis of formulasObtaining an audio output quality evaluation index of each target VR device>Wherein->Threshold values respectively representing preset frequency response coincidence coefficients and sound balance coincidence coefficients, +.>Respectively representing the preset frequency response coincidence coefficient and the weight factor of the sound balance coincidence coefficient.
2. The VR device production product quality monitoring and analyzing method based on image recognition of claim 1, wherein: the specific analysis process of the first step comprises the following steps:
f1: the three-dimensional model of each target VR device is obtained through a laser scanner, the three-dimensional model of each target VR device is compared with the standard three-dimensional model of the target VR device stored in a database, the deformation coefficient of each target VR device is obtained through analysis, and the deformation coefficient is recorded as,/>Indicate->Number of individual target VR device,/->
F2: acquiring each angle image of each target VR device, splicing to obtain a live-action image of each target VR device, and dividing the surface area of the target VR device according to a preset principle to obtain each subarea of the surface of the target VR device;
according to the live-action image of each target VR device, obtaining each scratch in each sub-area of the surface of each target VR deviceThe length of the mark is analyzed to obtain the scratch coefficient of each target VR device, and the scratch coefficient is recorded as
F3: according to the real-scene image of each target VR device, the area of each dirty area in each sub-area of the surface of each target VR device is obtained, the dirty coefficient of each target VR device is obtained by analysis, and is recorded as
F4: according to the real image of each target VR device, the position and the outline of each identifier on the surface of each target VR device are obtained, the position deviation and the overlapping outline length of each identifier on the surface of each target VR device are analyzed, the printing matching coefficient of each target VR device is further obtained, and the printing matching coefficient is recorded as
3. The VR device production product quality monitoring and analyzing method based on image recognition of claim 2, wherein: the specific analysis process of the first step further comprises:
by analysis of formulasObtaining the appearance quality evaluation index of each target VR device>Wherein->Weights respectively representing preset deformation coefficient, scratch coefficient, smudge coefficient and printing coincidence coefficient, +.>
4. The VR device production product quality monitoring and analyzing method based on image recognition of claim 2, wherein: the specific analysis process of the second step comprises the following steps:
acquiring a set number of display images of each target VR device according to a preset principle, further acquiring the resolution, brightness and contrast of each display image of each target VR device, and respectively marking the resolution, brightness and contrast as,/>Indicate->Number of each display image->
Acquiring each color region of each display image in each target VR device, further acquiring the gray value of each color region in each display image in each target VR device, and recording the gray value as,/>Indicate->Number of individual color areas>Extracting standard gray values of each color region in each display image stored in the database, and recording the standard gray values as +.>By analysis of the formulaObtain color accuracy +.>Wherein->Representing the number of color areas +.>Representing the gray value deviation threshold of the color region of the display image.
5. The VR device production product quality monitoring and analysis method based on image recognition of claim 4, wherein: the specific analysis process of the second step further comprises the following steps:
by analysis of formulasObtaining an image display quality evaluation index of each target VR device>Wherein->Representing the number of displayed images +.>Respectively representing a resolution threshold, a proper brightness, a proper contrast and a color accuracy threshold of the image display of the preset target VR device, < >>Respectively representing a preset brightness deviation threshold value and contrast deviation threshold value,/->Respectively are provided withWeights representing preset resolution, brightness, contrast and color accuracy.
6. The VR device production product quality monitoring and analyzing method based on image recognition of claim 2, wherein: the specific analysis process of the fourth step comprises the following steps:
t1: according to a preset principle, carrying out gesture tracking tests for set times on each target VR device, obtaining actual response gestures of each target VR device in each gesture tracking test, comparing the actual response gestures of each target VR device in each gesture tracking test with standard response gestures of each gesture tracking test stored in a database, obtaining the coincidence degree of the actual response gestures of each target VR device in each gesture tracking test and the standard response gestures corresponding to the gesture tracking test, and marking the coincidence degree as the gesture coincidence degree of each target VR device in each gesture tracking test,/>Indicate->Number of secondary posture trace test, +.>
Acquiring the gesture change response time of each target VR device in each gesture tracking test, and recording the gesture change response time as;
T2: according to a preset principle, performing motion tracking test for each target VR device for set times, obtaining actual response positions of each target VR device in each motion tracking test, and storing the actual response positions of each target VR device in each motion tracking test and a databaseComparing the standard response positions of each motion tracking test to obtain the distance between the actual response position of each target VR device in each motion tracking test and the corresponding standard response position of each motion tracking test, and recording the distance as,/>Indicate->Number of secondary exercise tracking test, +.>By analysis of the formulaObtaining the positioning fitness of each target VR device in each motion tracking test>Wherein->The method comprises the steps of representing an influence factor corresponding to a unit distance between a preset actual response position and a standard response position;
acquiring the position movement response time length of each target VR device in each motion tracking test, and recording the time length as
7. The VR device production product quality monitoring and analysis method based on image recognition of claim 6, wherein: the specific analysis process in the fourth step further comprises:
by analysis of formulasObtaining the posture tracking evaluation coefficient of each target VR device>Wherein->Threshold values respectively representing preset attitude fitness and attitude change response time length, < >>Representing a preset->The weight of the secondary attitude tracking test;
by analysis of formulasObtaining action tracking evaluation coefficients of each target VR deviceWherein->Threshold values respectively representing preset positioning fitness and position movement response time length, +.>Representing a preset->Weights of the secondary motion tracking test;
by analysis of formulasObtaining a sensor traceability evaluation index for each target VR device>
CN202311512282.9A 2023-11-14 2023-11-14 VR equipment production product quality monitoring analysis method based on image recognition Active CN117252867B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311512282.9A CN117252867B (en) 2023-11-14 2023-11-14 VR equipment production product quality monitoring analysis method based on image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311512282.9A CN117252867B (en) 2023-11-14 2023-11-14 VR equipment production product quality monitoring analysis method based on image recognition

Publications (2)

Publication Number Publication Date
CN117252867A CN117252867A (en) 2023-12-19
CN117252867B true CN117252867B (en) 2024-02-27

Family

ID=89126658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311512282.9A Active CN117252867B (en) 2023-11-14 2023-11-14 VR equipment production product quality monitoring analysis method based on image recognition

Country Status (1)

Country Link
CN (1) CN117252867B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005006778A1 (en) * 2003-07-03 2005-01-20 Logiways France Method and system for testing the ability of a device to produce an error-free video and/or audio signal and recording medium containing instructions for the implementation of said method
CN105812768A (en) * 2016-03-18 2016-07-27 深圳市维尚境界显示技术有限公司 Method and system for playing 3D video in VR (Virtual Reality) device
JP2016206795A (en) * 2015-04-17 2016-12-08 Kddi株式会社 Device, program, and method for tracking object using discriminator that learns by real space information
CN106534968A (en) * 2016-11-14 2017-03-22 墨宝股份有限公司 Method and system for playing 3D video in VR device
CN106595518A (en) * 2016-11-30 2017-04-26 中航华东光电(上海)有限公司 Curved surface appearance shape detection VR system based on characteristic point stereo matching
CN107145706A (en) * 2017-03-30 2017-09-08 北京奇艺世纪科技有限公司 The appraisal procedure and device of Virtual Reality equipment blending algorithm performance parameter
KR20190004501A (en) * 2017-07-04 2019-01-14 정용철 Apparatus and method for virtual reality sound processing according to viewpoint change of a user
WO2021115255A1 (en) * 2019-12-09 2021-06-17 华为技术有限公司 Method, apparatus, and system for evaluating sensation of vr experience
CN113902045A (en) * 2021-12-09 2022-01-07 成都车晓科技有限公司 Vehicle insurance field rapid damage assessment method based on image recognition
CN113988517A (en) * 2021-09-27 2022-01-28 无界文化(北京)集团有限公司 Quality supervision safety tracing system based on VR technique
CN114862266A (en) * 2022-05-31 2022-08-05 多彩贵州印象网络传媒股份有限公司 Industrial product production quality monitoring and analyzing system based on big data
CN115272961A (en) * 2022-07-18 2022-11-01 陈迅 VR equipment early warning monitoring method and system based on artificial intelligence
CN116106717A (en) * 2023-04-12 2023-05-12 合肥瀚博智能科技有限公司 Intelligent detection and analysis system for integrated micro-optical-electromechanical semiconductor device
CN116909390A (en) * 2023-06-20 2023-10-20 之江实验室 Multi-mode data acquisition system based on glove

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI321299B (en) * 2006-01-27 2010-03-01 Taiwan Tft Lcd Ass System and method for evaluating a dynamic color deviation of a moving image of lcd
CN116660468B (en) * 2023-05-29 2024-02-13 广州昂博科技有限公司 Intelligent monitoring and analyzing method for cosmetic production line
CN116866719B (en) * 2023-07-12 2024-02-02 山东恒辉软件有限公司 Intelligent analysis processing method for high-definition video content based on image recognition
CN116959348B (en) * 2023-09-19 2023-12-12 山东泰克信息科技有限公司 Network data processing analysis method, device and storage medium

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005006778A1 (en) * 2003-07-03 2005-01-20 Logiways France Method and system for testing the ability of a device to produce an error-free video and/or audio signal and recording medium containing instructions for the implementation of said method
JP2016206795A (en) * 2015-04-17 2016-12-08 Kddi株式会社 Device, program, and method for tracking object using discriminator that learns by real space information
CN105812768A (en) * 2016-03-18 2016-07-27 深圳市维尚境界显示技术有限公司 Method and system for playing 3D video in VR (Virtual Reality) device
CN106534968A (en) * 2016-11-14 2017-03-22 墨宝股份有限公司 Method and system for playing 3D video in VR device
CN106595518A (en) * 2016-11-30 2017-04-26 中航华东光电(上海)有限公司 Curved surface appearance shape detection VR system based on characteristic point stereo matching
CN107145706A (en) * 2017-03-30 2017-09-08 北京奇艺世纪科技有限公司 The appraisal procedure and device of Virtual Reality equipment blending algorithm performance parameter
KR20190004501A (en) * 2017-07-04 2019-01-14 정용철 Apparatus and method for virtual reality sound processing according to viewpoint change of a user
WO2021115255A1 (en) * 2019-12-09 2021-06-17 华为技术有限公司 Method, apparatus, and system for evaluating sensation of vr experience
CN113988517A (en) * 2021-09-27 2022-01-28 无界文化(北京)集团有限公司 Quality supervision safety tracing system based on VR technique
CN113902045A (en) * 2021-12-09 2022-01-07 成都车晓科技有限公司 Vehicle insurance field rapid damage assessment method based on image recognition
CN114862266A (en) * 2022-05-31 2022-08-05 多彩贵州印象网络传媒股份有限公司 Industrial product production quality monitoring and analyzing system based on big data
CN115272961A (en) * 2022-07-18 2022-11-01 陈迅 VR equipment early warning monitoring method and system based on artificial intelligence
CN116106717A (en) * 2023-04-12 2023-05-12 合肥瀚博智能科技有限公司 Intelligent detection and analysis system for integrated micro-optical-electromechanical semiconductor device
CN116909390A (en) * 2023-06-20 2023-10-20 之江实验室 Multi-mode data acquisition system based on glove

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
VR头戴设备显示屏质量评价方法研究与实践;杨林 等;《中国传媒大学学报(自然科学版)》;第26卷(第3期);第45-50页 *
基于多模态传感的交互式手套系统;刘畅 等;《计算机时代》(第8期);第129-133页 *

Also Published As

Publication number Publication date
CN117252867A (en) 2023-12-19

Similar Documents

Publication Publication Date Title
Egan et al. An evaluation of Heart Rate and ElectroDermal Activity as an objective QoE evaluation method for immersive virtual reality environments
JP5610153B2 (en) Inspection device and inspection method for visual and cooperative actions
CN110717391A (en) Height measuring method, system, device and medium based on video image
JP5654341B2 (en) Apparatus and method for examining visual and neural processing of a subject
CN108074241A (en) Quality score method, apparatus, terminal and the storage medium of target image
WO2017006875A1 (en) Golf club swing display method and display chart
CN109923529A (en) Information processing unit, information processing method and program
CN111814718A (en) Attention detection method integrating multiple discrimination technologies
CN110729047A (en) Device and method for combining psychophysiological analysis and scale test based on face video
CN109741285A (en) A kind of construction method and system of underwater picture data set
CN112488047A (en) Piano fingering intelligent identification method
CN110490173B (en) Intelligent action scoring system based on 3D somatosensory model
CN107689039A (en) Estimate the method and apparatus of image blur
CN108304806A (en) A kind of gesture identification method integrating feature and convolutional neural networks based on log path
CN108921829A (en) A kind of advertisement design method for objectively evaluating of view-based access control model attention mechanism
CN117252867B (en) VR equipment production product quality monitoring analysis method based on image recognition
CN104506852B (en) A kind of objective quality assessment method towards video conference coding
Trivedi et al. Height estimation of children under five years using depth images
KR101932525B1 (en) Sensing device for calculating information on position of moving object and sensing method using the same
JP2015082247A (en) Electronic equipment, determination method, and program
KR20170140756A (en) Appratus for writing motion-script, appratus for self-learning montion and method for using the same
CN116386424A (en) Method, device and computer readable storage medium for music teaching
CN111652045B (en) Classroom teaching quality assessment method and system
Kerdvibulvech et al. Real-time guitar chord estimation by stereo cameras for supporting guitarists
CN115343279A (en) Test paper-based skin oil evaluation method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant