CN115082661B - Sensor assembly difficulty reducing method - Google Patents


Info

Publication number
CN115082661B
Authority
CN
China
Prior art keywords
data
group
motor
sensor
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210811516.9A
Other languages
Chinese (zh)
Other versions
CN115082661A (en)
Inventor
蔡勇
王淑娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asman Technology Shanghai Co ltd
Original Assignee
Asman Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asman Technology Shanghai Co ltd
Priority to CN202210811516.9A
Publication of CN115082661A
Application granted
Publication of CN115082661B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V 10/24 Aligning, centring, orientation detection or correction of the image
    • G06V 10/40 Extraction of image or video features
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • General Factory Administration (AREA)
  • Testing Or Calibration Of Command Recording Devices (AREA)

Abstract

The invention discloses a method for reducing the difficulty of sensor assembly, belonging to the technical field of sensor assembly, which comprises the following specific steps: (1) powering up the control system and confirming an initial position; (2) performing position calibration and recording movement data; (3) recording the position deviation and waiting for the next power-on verification; (4) periodically detecting check data and recovering the data. By collecting the initial position, the 0 position and the calibration positions multiple times for calibration, the invention greatly relaxes the requirements on the initial position and on system installation, and position deviations caused by long-term use or transportation can be corrected and avoided.

Description

Sensor assembly difficulty reducing method
Technical Field
The invention relates to the technical field of sensor assembly, in particular to a method for reducing difficulty in sensor assembly.
Background
A sensor is a detection device that senses the quantity to be measured and converts the sensed information, according to a defined rule, into an electrical signal or another required output form, so that the information can be transmitted, processed, stored, displayed, recorded and used for control. Sensors are characterized by miniaturization, digitalization, intelligence, multifunctionality, systematization and networking, and they are the first link in automatic detection and automatic control. The existence and development of sensors give objects senses such as touch, taste and smell, gradually bringing them to life. With successive waves of technological innovation, the world has entered the information age. To make use of information, one must first obtain accurate and reliable information, and sensors are the principal means of obtaining it in nature and in production. In modern industrial production, and especially in automated processes, many kinds of sensors are needed to monitor and control the parameters of the production process so that equipment operates in a normal or optimal state and products reach the best quality. It is fair to say that without numerous excellent sensors, modern production would not be possible;
In conventional methods for reducing sensor assembly difficulty, motion control places high demands on the initial position and on system installation, and position deviations caused by long-term use or transportation cannot be corrected or avoided; a method for reducing the difficulty of sensor assembly is therefore proposed.
Disclosure of Invention
The invention aims to solve the defects in the prior art, and provides a method for reducing the assembly difficulty of a sensor.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
A method for reducing the assembly difficulty of a sensor comprises the following specific steps:
(1) The control system is powered on and confirms the initial position;
(2) Performing position calibration and recording movement data;
(3) Recording position deviation and waiting for power-on verification again;
(4) Check data is periodically detected and data recovery is performed.
As a further aspect of the present invention, the initial position confirmation in step (1) specifically includes the following steps:
Step one: after the control system is powered on and started, it automatically builds and trains a detection network model, receives the position information sent by the sensor in real time and generates a corresponding plane image, while the detection network model acquires real-time image information of the device;
Step two: the detection network model extracts the feature data of each group of image information and feeds it into a bidirectional feature pyramid for feature fusion; after the fusion is completed, inference is performed on each group of image information at different resolutions, the sensors in each group of image information are located by detection frames, the sensor detection-frame information in the image information is collected, and the corresponding detection-frame coordinates are generated;
Step three: the relevant image information is enlarged and cropped according to the coordinates of each group of detection frames, the sensor pictures generated after enlarging and cropping are collected and stored, simple negative samples belonging to the background in each group of sensor pictures are filtered out by a region proposal network (RPN), regions that may contain targets are selected for classification and regression, and the position coordinates of the sensor on the device are confirmed;
Step four: the position information sent by the sensor is compared with the position coordinates detected by the detection network model; if the two are consistent, the position information sent by the sensor is taken as the initial position of the motor, and if they are inconsistent, both sets of coordinate information are fed back to a worker, who manually confirms and selects the correct position data as the initial position of the motor (a sketch of this comparison is given below).
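For illustration, the following minimal sketch shows how the comparison in step four could be carried out in software. It assumes simple 2-D coordinates and a Euclidean tolerance check; the function name, the tolerance value and the fallback behaviour are assumptions, not taken from the patent.

```python
import math

# Illustrative tolerance (assumption, not specified in the patent): positions
# closer than this are treated as "consistent".
TOLERANCE = 0.5

def confirm_initial_position(sensor_xy, detected_xy, tolerance=TOLERANCE):
    """Compare the sensor-reported position with the position detected by the
    network model and return the coordinates used as the motor initial position,
    or None when a worker must confirm the correct data manually."""
    distance = math.dist(sensor_xy, detected_xy)
    if distance <= tolerance:
        # Consistent result: the sensor-reported position becomes the initial position.
        return sensor_xy
    # Inconsistent result: both coordinate sets are fed back to the worker.
    print(f"Mismatch of {distance:.3f}: sensor={sensor_xy}, detected={detected_xy}")
    return None
```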
As a further scheme of the invention, the specific training steps of the detection network model in step one are as follows:
Step I: the detection network model is connected to a cloud server, the position verification data stored in the cloud server are extracted, the non-binary data in each group of verification data are converted into binary data, and each group of data is then mapped into a specified interval by normalization;
Step II: each group of processed verification data is extracted for feature dimension reduction, the processed groups of data are integrated into a simulation data set, and the simulation data set is divided into a validation set, a test set and a training set;
Step III: the accuracy of the detection network model is verified repeatedly with each group of data in the validation set, the root mean square error of each group of validation data is computed, each group of data is also predicted once, and the parameters giving the best prediction result are output as the optimal parameters;
Step IV: the training set is processed through input, convolution, pooling, fully connected and output layers with the optimal parameters to generate training samples, which are fed into the detection network model; the model is optimized in real time by long-term iteration and tested with the test set; if the test accuracy meets the expected value, training stops, and the model that meets the expected value is finally evaluated for performance, namely accuracy, detection rate and false-alarm rate (a sketch of this pipeline is given below).
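As a rough illustration of steps I to IV, the sketch below normalizes a toy data set, splits it into validation, test and training sets, and selects the candidate parameters with the lowest root mean square error on the validation set. The split ratios, the toy linear predictor and all names are assumptions made for the example; they stand in for the actual detection network model.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(data, lo=0.0, hi=1.0):
    """Step I: map each feature into the specified interval [lo, hi]."""
    d_min, d_max = data.min(axis=0), data.max(axis=0)
    return lo + (data - d_min) / (d_max - d_min + 1e-12) * (hi - lo)

def split(x, y, ratios=(0.2, 0.2, 0.6)):
    """Step II: divide the simulation data set into validation, test and training sets."""
    idx = rng.permutation(len(x))
    n_val, n_test = int(ratios[0] * len(x)), int(ratios[1] * len(x))
    val, test, train = idx[:n_val], idx[n_val:n_val + n_test], idx[n_val + n_test:]
    return (x[val], y[val]), (x[test], y[test]), (x[train], y[train])

def rmse(pred, target):
    """Step III: root mean square error used to rate candidates on the validation set."""
    return float(np.sqrt(np.mean((pred - target) ** 2)))

# Toy data and toy linear candidates stand in for the real detection network.
x = normalize(rng.normal(size=(200, 3)))
y = x @ np.array([0.4, -0.2, 0.7]) + 0.05 * rng.normal(size=200)
(val_x, val_y), (test_x, test_y), (train_x, train_y) = split(x, y)

candidates = [rng.normal(size=3) for _ in range(20)]          # candidate parameter sets
best = min(candidates, key=lambda w: rmse(val_x @ w, val_y))  # Step III: best predictor
print("Step IV (stand-in): test RMSE of best parameters:", rmse(test_x @ best, test_y))
```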
As a further aspect of the present invention, the specific steps of the position calibration in step (2) are as follows:
The first step: the motor position is moved to a 0 position manually by a worker or automatically by the device, where the 0 position is the initial position required by the worker; at the same time the control system records the 0 position coordinate information, and the motor then returns from the 0 position to the initial position of the motor;
The second step: the motor position is moved manually by a worker or automatically by the device to several groups of different calibration positions, the coordinate information of each group of calibration positions is recorded, and the control system extracts the recorded coordinate information of each group of calibration positions according to its built-in calibration rules to check whether the 0 position meets the actual requirements;
The third step: if the 0 position does not meet the actual requirements, the system waits for the worker to issue an operation instruction: if the worker issues a recalibration instruction, the first and second steps are repeated, and if the worker issues a calibration exit, calibration stops; if the 0 position meets the actual requirements, the flow proceeds to the next operation process (see the sketch after this list).
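A minimal sketch of the calibration loop described above is given here. The rule used to decide whether the 0 position meets the actual requirements (the spread of offsets to the calibration positions) and the callback names `move_to` and `ask_worker` are assumptions for illustration; the patent leaves the built-in calibration rules unspecified.

```python
import statistics

def zero_position_ok(zero_pos, calibration_positions, max_spread=0.2):
    """Assumed rule: the 0 position passes when the offsets to the recorded
    calibration positions vary by no more than max_spread."""
    offsets = [abs(p - zero_pos) for p in calibration_positions]
    return statistics.pstdev(offsets) <= max_spread

def calibrate(move_to, ask_worker, n_calibration=3):
    """Repeat the first and second steps until the 0 position is acceptable
    or the worker issues a calibration exit."""
    while True:
        zero_pos = move_to("zero")                                          # first step
        cal_positions = [move_to(f"cal_{i}") for i in range(n_calibration)] # second step
        if zero_position_ok(zero_pos, cal_positions):
            return zero_pos                                                 # proceed to next process
        if ask_worker() == "exit":                                          # third step
            return None
```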
As a further scheme of the present invention, the specific steps of power-on verification in step (3) are as follows:
S1: the control system calculates the distance between the 0 position and the initial position of the motor with a built-in distance calculation formula, records the direction of movement from the 0 position to the initial position of the motor, records the result as deviation information, and marks the deviation information with its generation time;
S2: each subsequent time the control system is powered on, it re-collects and verifies the initial motor position and compares it with the previous initial motor position; if the two are consistent, the corresponding 0 position is located from the initial motor position and the deviation information;
S3: if the initial motor positions are inconsistent, the reason for the inconsistency is analyzed, the initial motor position is used as check data to verify and confirm the 0 position, and the verified 0 position is fed back to the staff (a sketch of this check is given below).
As a further aspect of the present invention, the specific steps of data recovery in step (4) are as follows:
P1: the computer detects and counts the number of past check data stored by the control system in real time, and carries out recovery rate updating calculation according to the default of the system or the circulation time value set by staff;
P2: and then the computer periodically collects the calculated recovery rate value, extracts each group of data in the corresponding time from front to back according to the set circulation time value and according to the collected recovery rate value, recovers the quantity of past check data stored in the control system, and simultaneously automatically generates a recovery log to record the data recovery condition.
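How the recovery rate is derived is not spelled out in the patent, so the sketch below simply treats it as the fraction of stored check data to keep per cycle; the data structures, the target of 100 records per cycle and the log format are assumptions for illustration.

```python
import time
from collections import deque

check_data = deque()   # past check data stored by the control system
recovery_log = []      # automatically generated log of recovery actions

def update_recovery_rate(target_per_cycle=100):
    """P1: recovery rate as the fraction of stored records kept in one cycle."""
    stored = len(check_data)
    return min(1.0, target_per_cycle / stored) if stored else 1.0

def recover(cycle_seconds=3600):
    """P2: keep the newest fraction of records, drop the rest front to back,
    and record the action in the recovery log."""
    rate = update_recovery_rate()
    keep = int(len(check_data) * rate)
    removed = len(check_data) - keep
    for _ in range(removed):
        check_data.popleft()                       # oldest records removed first
    recovery_log.append({"time": time.time(), "cycle_s": cycle_seconds,
                         "rate": rate, "removed": removed})
```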
Compared with the prior art, the invention has the beneficial effects that:
Compared with existing, more complicated assembly methods, in this method for reducing sensor assembly difficulty the control system receives the position information sent by the sensor in real time after it is powered on and started, generates a corresponding plane image and confirms the initial position of the motor; the motor position is then moved to a 0 position manually by a worker or automatically, the control system records the 0 position coordinate information, and the motor returns from the 0 position to the initial position of the motor; several groups of different calibration positions are used to check whether the 0 position meets the actual requirements; after calibration, the control system calculates the distance between the 0 position and the initial position of the motor with a built-in distance calculation formula, records the direction of movement from the 0 position to the initial position of the motor, and records the result as deviation information; each subsequent time the system is powered on, it re-collects and verifies the initial motor position and compares it with the previous one: if the two are consistent, the corresponding 0 position is located from the initial motor position and the deviation information, and if they are inconsistent, the cause is analyzed and the 0 position is re-verified with the initial motor position as check data. In this way the requirements on the initial position and on system installation are greatly reduced, position deviations caused by long-term use or transportation are corrected and avoided, and calibration time is reduced.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate the invention and, together with the embodiments of the invention, serve to explain it.
FIG. 1 is a block flow diagram of a method for reducing difficulty in assembling a sensor according to the present invention;
fig. 2 is an algorithm block diagram of a method for reducing the difficulty of assembling a sensor according to the present invention.
Detailed Description
Referring to fig. 1, a method for reducing the difficulty of assembling a sensor includes the following specific steps:
The control system powers up and confirms the initial position.
Specifically, after the control system is powered on and started, it automatically builds and trains a detection network model, receives the position information sent by the sensor in real time and generates corresponding plane images, while the detection network model acquires real-time image information of the device. The model then extracts the feature data of each group of image information and feeds it into a bidirectional feature pyramid for feature fusion; after the fusion is completed, inference is performed on each group of image information at different resolutions, the sensors in each group of image information are located by detection frames, the sensor detection-frame information is collected, and the corresponding detection-frame coordinates are generated. The relevant image information is enlarged and cropped according to the coordinates of each group of detection frames, the resulting sensor pictures are collected and stored, simple negative samples belonging to the background are filtered out by the RPN, regions that may contain targets are classified and regressed, and the position coordinates of the sensor on the device are confirmed. Finally, the position information sent by the sensor is compared with the position coordinates detected by the detection network model: if the two are consistent, the position information sent by the sensor is taken as the initial position of the motor; if they are inconsistent, both sets of coordinate information are fed back to a worker, who manually confirms and selects the correct position data as the initial position of the motor.
The detection network model is connected to a cloud server and the position verification data stored in the cloud server are extracted; the non-binary data in each group of verification data are converted into binary data, and each group of data is mapped into a specified interval by normalization. Each group of processed verification data is then extracted for feature dimension reduction, the processed groups are integrated into a simulation data set, and the simulation data set is divided into a validation set, a test set and a training set. The accuracy of the detection network model is verified repeatedly with each group of data in the validation set, the root mean square error of each group of validation data is computed, each group of data is predicted once, and the parameters giving the best prediction result are output as the optimal parameters. Finally, the training set is processed through input, convolution, pooling, fully connected and output layers with the optimal parameters to generate training samples, which are fed into the detection network model; the model is optimized in real time by long-term iteration and tested with the test set; if the test accuracy meets the expected value, training stops, and the model is evaluated for accuracy, detection rate and false-alarm rate.
Position calibration is performed and movement data is recorded.
Specifically, the motor position is moved to a 0 position manually by a worker or automatically by the device, where the 0 position is the initial position required by the worker; at the same time the control system records the 0 position coordinate information, and the motor then returns from the 0 position to the initial position of the motor. The motor position is then moved manually by the worker or automatically by the device to several groups of different calibration positions, and the coordinate information of each group of calibration positions is recorded; the control system extracts the recorded coordinate information of each group of calibration positions according to its built-in calibration rules to check whether the 0 position meets the actual requirements. If the 0 position does not meet the actual requirements, the system waits for the worker to issue an operation instruction: if the worker issues a recalibration instruction, the 0 position is recalibrated, and if the worker issues a calibration exit, calibration stops; if the 0 position meets the actual requirements, the flow proceeds to the next operation process.
The position deviation is recorded and the system waits for the next power-on verification.
Specifically, the control system calculates the distance between the 0 position and the initial position of the motor with a built-in distance calculation formula, records the direction of movement from the 0 position to the initial position of the motor, records the result as deviation information and stamps it with its generation time. Each subsequent time the control system is powered on, it re-collects and verifies the initial motor position and compares it with the previous initial motor position: if the two are consistent, the corresponding 0 position is located from the initial motor position and the deviation information; if they are inconsistent, the reason for the inconsistency is analyzed, the initial motor position is used as check data to verify and confirm the 0 position, and the verified 0 position is fed back to the staff.
Check data is periodically detected and data recovery is performed.
Specifically, the computer detects and counts in real time the amount of past check data stored in the control system and updates the recovery-rate calculation according to the system default or the cycle time value set by the staff. The computer then periodically collects the calculated recovery-rate value, extracts each group of data within the corresponding time from front to back according to the set cycle time value, recovers the amount of past check data stored in the control system according to the collected recovery-rate value, and automatically generates a recovery log to record the data recovery.

Claims (3)

1. The method for reducing the assembly difficulty of the sensor is characterized by comprising the following specific steps of:
(1) The control system is powered on and confirms the initial position;
(2) Performing position calibration and recording movement data;
(3) Recording position deviation and waiting for power-on verification again;
(4) Periodically detecting check data and recovering the data;
The specific steps of confirming the initial position in step (1) are as follows:
Step one: after the control system is powered on and started, it automatically builds and trains a detection network model, receives the position information sent by the sensor in real time and generates a corresponding plane image, while the detection network model acquires real-time image information of the device;
Step two: the detection network model extracts the feature data of each group of image information and feeds it into a bidirectional feature pyramid for feature fusion; after the fusion is completed, inference is performed on each group of image information at different resolutions, the sensors in each group of image information are located by detection frames, the sensor detection-frame information in the image information is collected, and the corresponding detection-frame coordinates are generated;
Step three: the relevant image information is enlarged and cropped according to the coordinates of each group of detection frames, the sensor pictures generated after enlarging and cropping are collected and stored, simple negative samples belonging to the background in each group of sensor pictures are filtered out by an RPN, regions that may contain targets are selected for classification and regression, and the position coordinates of the sensor on the device are confirmed;
Step four: the position information sent by the sensor is compared with the position coordinates detected by the detection network model; if the two are consistent, the position information sent by the sensor is taken as the initial position of the motor, and if they are inconsistent, both sets of coordinate information are fed back to a worker, who manually confirms and selects the correct position data as the initial position of the motor;
The specific steps of the position calibration in step (2) are as follows:
The first step: the motor position is moved to a 0 position manually by a worker or automatically by the device, where the 0 position is the initial position required by the worker; at the same time the control system records the 0 position coordinate information, and the motor then returns from the 0 position to the initial position of the motor;
The second step: the motor position is moved manually by a worker or automatically by the device to several groups of different calibration positions, the coordinate information of each group of calibration positions is recorded, and the control system extracts the recorded coordinate information of each group of calibration positions according to its built-in calibration rules to check whether the 0 position meets the actual requirements;
The third step: if the 0 position does not meet the actual requirements, the system waits for the worker to issue an operation instruction: if the worker issues a recalibration instruction, the first and second steps are repeated, and if the worker issues a calibration exit, calibration stops; if the 0 position meets the actual requirements, the flow proceeds to the next operation process;
The specific steps of power-on verification in step (3) are as follows:
S1: the control system calculates the distance between the 0 position and the initial position of the motor with a built-in distance calculation formula, records the direction of movement from the 0 position to the initial position of the motor, records the result as deviation information, and marks the deviation information with its generation time;
S2: each subsequent time the control system is powered on, it re-collects and verifies the initial motor position and compares it with the previous initial motor position; if the two are consistent, the corresponding 0 position is located from the initial motor position and the deviation information;
S3: if the initial motor positions are inconsistent, the reason for the inconsistency is analyzed, the initial motor position is used as check data to verify and confirm the 0 position, and the verified 0 position is fed back to the staff.
2. The method for reducing sensor assembly difficulty according to claim 1, wherein the specific training steps of the detection network model in the step one are as follows:
Step I: the method comprises the steps that a detection network model is in communication connection with a cloud server, position verification data stored in the cloud server are extracted, non-binary data in each group of verification data are converted into binary data, and then each group of data are converted into a specified interval through a normalization method;
step II: extracting each group of verification data after processing to perform feature dimension reduction processing, integrating and summarizing each group of processed data into a simulation data set, and dividing the simulation data set into a verification set, a test set and a training set;
Step III: repeatedly verifying the accuracy of the detection network model by using each group of data in the verification set for multiple times, counting root mean square errors of each group of data in the verification set, simultaneously predicting each group of data once, and outputting data with the best prediction result as optimal parameters;
Step IV: and (3) performing input, convolution, pooling, full connection and output processing on the training set through optimal parameters to generate training samples, finally conveying the training samples into a detection network model, performing real-time optimization on the detection network model by adopting a long-term iteration method, testing the detection network model by utilizing the testing set, stopping training if the testing accuracy meets the expected value, and finally performing performance evaluation on the detection network model meeting the expected value, namely performing accuracy, detection rate and false alarm rate evaluation.
3. The method for reducing difficulty in assembling a sensor according to claim 1, wherein the data recovery in step (4) is specifically as follows:
P1: the computer detects and counts the number of past check data stored by the control system in real time, and carries out recovery rate updating calculation according to the default of the system or the circulation time value set by staff;
P2: and then the computer periodically collects the calculated recovery rate value, extracts each group of data in the corresponding time from front to back according to the set circulation time value and according to the collected recovery rate value, recovers the quantity of past check data stored in the control system, and simultaneously automatically generates a recovery log to record the data recovery condition.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210811516.9A CN115082661B (en) 2022-07-11 2022-07-11 Sensor assembly difficulty reducing method


Publications (2)

Publication Number Publication Date
CN115082661A (en) 2022-09-20
CN115082661B (en) 2024-05-10

Family

ID=83260470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210811516.9A Active CN115082661B (en) 2022-07-11 2022-07-11 Sensor assembly difficulty reducing method

Country Status (1)

Country Link
CN (1) CN115082661B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115329670A (en) * 2022-08-11 2022-11-11 深圳朗道智通科技有限公司 Data acquisition method for unmanned vehicle


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106405555B (en) * 2016-09-23 2019-01-01 百度在线网络技术(北京)有限公司 Obstacle detection method and device for Vehicular radar system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2592734C1 (en) * 2015-05-26 2016-07-27 Федеральное государственное бюджетное учреждение науки Институт автоматики и электрометрии Сибирского отделения Российской академии наук (ИАиЭ СО РАН) Method of calibrating angular sensor
CN105058388A (en) * 2015-08-17 2015-11-18 哈尔滨工业大学 Sensor data fusion method used for acquiring robot joint position feedback information
CN105397807A (en) * 2015-12-21 2016-03-16 珠海格力电器股份有限公司 Robot zero calibration device and robot zero calibration system as well as robot zero calibration method
CN110196075A (en) * 2018-02-27 2019-09-03 上海市计量测试技术研究院 A kind of environmental test equipment calibration long-range temperature and humidity test system and test method
DE112018007287T5 (en) * 2018-03-15 2020-12-10 Harman International Industries, Incorporated VEHICLE SYSTEM AND METHOD FOR DETECTING OBJECTS AND OBJECT DISTANCE
EP3693697A1 (en) * 2019-02-06 2020-08-12 OptiNav Sp. z o.o. Method for calibrating a 3d measurement arrangement
WO2020164282A1 (en) * 2019-02-14 2020-08-20 平安科技(深圳)有限公司 Yolo-based image target recognition method and apparatus, electronic device, and storage medium
CN110942144A (en) * 2019-12-05 2020-03-31 深圳牛图科技有限公司 Neural network construction method integrating automatic training, checking and reconstructing
DE102021101593B3 (en) * 2021-01-26 2022-03-31 Audi Aktiengesellschaft Method for operating environment sensors in a motor vehicle and motor vehicle
CN114581983A (en) * 2022-03-04 2022-06-03 浪潮(北京)电子信息产业有限公司 Detection frame processing method for target detection and related device
CN216851660U (en) * 2022-03-22 2022-06-28 阿斯曼尔科技(上海)有限公司 Mechanical structure for solving problem of inaccurate positioning caused by back clearance of gear box

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of a High-Precision Inclination Sensor Detection System; Chen Shuohui; Zhou Xiang; Wang Hanqi; Liao Chuanwei; Automation & Instrumentation; 2012-12-15 (No. 12); full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant