CN115082661A - Method for reducing assembly difficulty of sensor - Google Patents

Method for reducing assembly difficulty of sensor

Info

Publication number
CN115082661A
CN115082661A (application CN202210811516.9A)
Authority
CN
China
Prior art keywords
data
group
sensor
motor
initial position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210811516.9A
Other languages
Chinese (zh)
Other versions
CN115082661B (en)
Inventor
蔡勇 (Cai Yong)
王淑娟 (Wang Shujuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asman Technology Shanghai Co ltd
Original Assignee
Asman Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asman Technology Shanghai Co ltd filed Critical Asman Technology Shanghai Co ltd
Priority to CN202210811516.9A priority Critical patent/CN115082661B/en
Publication of CN115082661A publication Critical patent/CN115082661A/en
Application granted granted Critical
Publication of CN115082661B publication Critical patent/CN115082661B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/22: Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V10/24: Aligning, centring, orientation detection or correction of the image
    • G06V10/40: Extraction of image or video features
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77: Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; blind source separation
    • G06V10/774: Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V10/80: Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806: Fusion of extracted features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Testing Or Calibration Of Command Recording Devices (AREA)
  • General Factory Administration (AREA)

Abstract

The invention discloses a method for reducing the assembly difficulty of a sensor, belonging to the technical field of sensor assembly. The method comprises the following specific steps: (1) powering on the control system and confirming the initial position; (2) performing position calibration and recording movement data; (3) recording the position deviation and waiting for the next power-on verification; (4) periodically detecting the check data and recovering data. By collecting the initial position, the 0 position and the calibration positions and performing calibration multiple times, the invention greatly relaxes the requirements on the initial position and on system installation, and makes it possible to correct and avoid position offsets that arise during long-term use or transportation.

Description

Method for reducing assembly difficulty of sensor
Technical Field
The invention relates to the technical field of sensor assembly, in particular to a method for reducing the assembly difficulty of a sensor.
Background
A sensor is a detection device: it senses the quantity being measured and converts the sensed information, according to a defined rule, into an electrical signal or another required output form, so that the information can be transmitted, processed, stored, displayed, recorded and used for control. Sensors are characterized by miniaturization, digitization, intelligence, multi-functionality, systematization and networking, and they form the first link in automatic detection and automatic control. The existence and development of sensors give objects senses such as touch, taste and smell, gradually bringing them "to life". With the new technological revolution, the world has entered the information age; accurate and reliable information must first be acquired before it can be used, and sensors are the principal way and means of acquiring information in nature and in production. Modern production would therefore lose its foundation without a wide range of excellent sensors.
Existing methods for reducing sensor assembly difficulty place high demands on the initial position and on system installation during motion control, and cannot correct or avoid position offsets caused by long-term use or transportation. A method for reducing the assembly difficulty of a sensor is therefore proposed.
Disclosure of Invention
The invention aims to solve the defects in the prior art and provides a method for reducing the assembly difficulty of a sensor.
In order to achieve the purpose, the invention adopts the following technical scheme:
a method for reducing the assembly difficulty of a sensor comprises the following specific steps:
(1) the control system is powered on and confirms the initial position;
(2) carrying out position calibration and recording movement data;
(3) recording position deviation and waiting for power-on verification again;
(4) and detecting the check data periodically and recovering the data.
As a further aspect of the present invention, the initial position confirmation in step (1) specifically includes the following steps:
step one: after power-on and start-up, the control system automatically constructs and trains a detection network model, receives the position information sent by the sensor in real time and generates a corresponding plane image, while the detection network model acquires real-time image information of the device;
step two: the detection network model extracts feature data from each group of image information and feeds the features into a bidirectional feature pyramid for feature fusion; after feature fusion is completed, inference is performed on each group of image information at different resolutions, the sensor in each group of image information is locked with a detection frame, and the sensor detection frame information in the image information is then collected to generate the corresponding detection frame coordinates;
step three: cropping the relevant image information with an enlarged margin according to each group of detection frame coordinates, collecting and storing each group of sensor pictures generated by the enlarged cropping, filtering out the easy negative samples belonging to the background in each group of sensor pictures through a region proposal network (RPN), selecting regions that may contain the target for classification and regression, and confirming the position coordinates of the sensor on the device;
step four: comparing the position information sent by the sensor with the position coordinates detected by the detection network model; if the two results are consistent, the position information sent by the sensor is used as the initial position of the motor; if they are inconsistent, both sets of coordinate information are fed back to a worker for manual confirmation, and the correct position data is selected as the initial position of the motor.
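The comparison in step four can be sketched as a small routine. The tolerance value, coordinate convention and return behaviour below are illustrative assumptions, not details taken from the patent:

```python
# Hypothetical sketch of step four: comparing the sensor-reported position
# with the coordinates produced by the detection network model. The
# tolerance and the "None means manual confirmation" convention are
# assumptions made for illustration.

def confirm_initial_position(sensor_pos, detected_pos, tol=1.0):
    """Return the motor's initial position, or None if manual review is needed.

    sensor_pos, detected_pos: (x, y) coordinates in the same device frame.
    tol: maximum distance (same units) still treated as "consistent".
    """
    dx = sensor_pos[0] - detected_pos[0]
    dy = sensor_pos[1] - detected_pos[1]
    if (dx * dx + dy * dy) ** 0.5 <= tol:
        return sensor_pos          # consistent: trust the sensor reading
    # inconsistent: both candidates go to a worker for manual confirmation
    return None

assert confirm_initial_position((10.0, 5.0), (10.3, 5.2)) == (10.0, 5.0)
assert confirm_initial_position((10.0, 5.0), (14.0, 9.0)) is None
```

In practice the tolerance would be chosen from the device's positioning accuracy; a single scalar threshold is the simplest plausible consistency rule.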
As a further scheme of the present invention, the detection network model in step one is specifically trained as follows:
step I: the detection network model is communicatively connected to a cloud server; the position check data stored in the cloud server is extracted, non-binary data in each group of check data is converted into binary data, and each group of data is then mapped into a specified interval by a normalization method;
step II: each group of processed check data undergoes feature dimension reduction and is integrated into a simulation data set, which is divided into a verification set, a test set and a training set;
step III: the precision of the detection network model is verified repeatedly with each group of data in the verification set, the root mean square error of each group of data in the verification set is computed, each group of data is predicted once, and the data with the best prediction result is output as the optimal parameter;
step IV: the training set is processed through input, convolution, pooling, full-connection and output stages using the optimal parameters to generate training samples, which are finally fed into the detection network model; the model is optimized in real time by a long-term iteration method and tested with the test set; if the test accuracy meets the expected value, training stops, and the detection network model meeting the expected value is finally evaluated for performance, namely its accuracy, detection rate and false alarm rate.
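The data-preparation side of steps I-III can be illustrated with a minimal sketch, assuming min-max normalization into a target interval, a shuffled train/validation/test split, and a plain RMSE statistic; the actual network architecture and cloud-server interface are not specified in the source:

```python
# Sketch of steps I-III under stated assumptions. The normalization method,
# split ratios and RMSE formula are standard choices, not ones the patent
# discloses explicitly.
import random

def normalize(values, lo=0.0, hi=1.0):
    """Scale a list of numbers into the specified interval [lo, hi]."""
    vmin, vmax = min(values), max(values)
    span = vmax - vmin or 1.0      # avoid division by zero on constant data
    return [lo + (v - vmin) * (hi - lo) / span for v in values]

def split_dataset(samples, train=0.7, val=0.15, seed=42):
    """Shuffle and divide the simulation data set into training / verification / test sets."""
    rng = random.Random(seed)
    data = samples[:]
    rng.shuffle(data)
    n_train = int(len(data) * train)
    n_val = int(len(data) * val)
    return data[:n_train], data[n_train:n_train + n_val], data[n_train + n_val:]

def rmse(predicted, actual):
    """Root mean square error between two equal-length sequences."""
    return (sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)) ** 0.5

print(normalize([2, 4, 6]))        # prints [0.0, 0.5, 1.0]
```

The verification set would be scored with `rmse` across repeated runs, and the parameter set with the lowest error kept as the "optimal parameter" of step III.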
As a further scheme of the invention, the position calibration in step (2) comprises the following specific steps:
the first step: the motor is moved to the 0 position, manually by a worker or automatically by the device, the 0 position being the initial position required by the worker; the control system records the coordinate information of the 0 position, and the motor then returns from the 0 position to the motor initial position;
the second step: the motor is then moved, manually by a worker or automatically by the device, to several groups of different calibration positions, and the coordinate information of each group of calibration positions is recorded; the control system extracts the recorded coordinate information of each group of calibration positions according to an internally set calibration rule to check whether the 0 position meets the actual requirement;
the third step: if the 0 position does not meet the actual requirement, the system waits for the worker to issue an operation instruction; on a recalibration instruction the first and second steps are repeated, and on a withdrawal instruction calibration stops; if the 0 position meets the actual requirement, the flow proceeds to the next operation.
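The internally set calibration rule of the second step is not disclosed. As a stand-in only, the sketch below assumes the rule "the 0 position must lie within a tolerance of the mean of the recorded calibration positions"; both the rule and the tolerance are hypothetical:

```python
# Illustrative check of whether the 0 position "meets the actual
# requirement". The rule (distance to the centroid of the calibration
# positions) and tol are assumptions, not the patent's actual rule.

def zero_position_valid(zero_pos, calibration_positions, tol=0.5):
    n = len(calibration_positions)
    mean_x = sum(p[0] for p in calibration_positions) / n
    mean_y = sum(p[1] for p in calibration_positions) / n
    dist = ((zero_pos[0] - mean_x) ** 2 + (zero_pos[1] - mean_y) ** 2) ** 0.5
    return dist <= tol

cal = [(1.0, 1.0), (1.2, 0.8), (0.8, 1.2)]
assert zero_position_valid((1.0, 1.0), cal)      # 0 position passes
assert not zero_position_valid((5.0, 5.0), cal)  # recalibration needed
```

A failed check would trigger the third step: wait for the worker's instruction and either recalibrate or withdraw.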
As a further scheme of the invention, the power-on verification in step (3) comprises the following specific steps:
s1: the control system computes the distance between the 0 position and the motor initial position using a built-in distance formula, records the direction of movement from the 0 position to the motor initial position, stores the calculation result as deviation information, and marks the generation time of the deviation information;
s2: each subsequent time the control system is powered on, it re-acquires and verifies the motor initial position and compares it with the previous one; if the two initial positions are consistent, the corresponding 0 position is located from the motor initial position and the deviation information;
s3: if the two initial positions are inconsistent, the cause of the discrepancy is analysed, the current motor initial position is used as check data to verify and confirm the 0 position, and the verified 0 position is fed back to the worker.
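S1 and S2 can be sketched as a deviation record plus a lookup. The built-in distance formula is assumed here to be plain Euclidean distance, and the record's field names are invented for illustration:

```python
# Hedged sketch of S1-S2: build the deviation information (distance,
# direction, timestamp) and recover the 0 position from it on power-on.
# Euclidean distance and the dict layout are assumptions.
import math
import time

def make_deviation_record(zero_pos, motor_initial):
    """S1: distance and direction from the 0 position to the motor initial position."""
    dx = motor_initial[0] - zero_pos[0]
    dy = motor_initial[1] - zero_pos[1]
    return {
        "distance": math.hypot(dx, dy),
        "direction": (dx, dy),            # movement vector: 0 position -> initial
        "created_at": time.time(),        # generation time of the deviation info
    }

def locate_zero(motor_initial, deviation):
    """S2: find the corresponding 0 position from the initial position and deviation info."""
    dx, dy = deviation["direction"]
    return (motor_initial[0] - dx, motor_initial[1] - dy)

dev = make_deviation_record((0.0, 0.0), (3.0, 4.0))
assert dev["distance"] == 5.0
assert locate_zero((3.0, 4.0), dev) == (0.0, 0.0)
```

S3's discrepancy analysis is device-specific and is left as the point where a worker is brought back into the loop.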
As a further scheme of the invention, the data recovery in step (4) comprises the following specific steps:
p1: the computer detects and counts in real time the number of past check data records stored in the control system, and updates the recovery rate according to a cycle time value set by system default or by a worker;
p2: the computer then periodically collects the calculated recovery rate value, extracts each group of data within the corresponding time window, front to back by data generation time, according to the set cycle time value, reclaims the stored past check data according to the collected recovery rate value, and automatically generates a recovery log recording the data recovery situation.
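P1-P2 can be sketched under one reading of "recovery": reclaiming a fraction (the recovery rate) of the oldest stored check records each cycle. Both that reading and the log format below are assumptions:

```python
# Sketch of the periodic recovery step. "Recovery" is interpreted as
# removing the oldest recovery_rate fraction of stored check records,
# front to back by generation time; the log dict is illustrative.

def recover_check_data(records, recovery_rate):
    """Reclaim the oldest fraction of records; return (kept, log_entry)."""
    ordered = sorted(records, key=lambda r: r["created_at"])
    n_recover = int(len(ordered) * recovery_rate)
    recovered, kept = ordered[:n_recover], ordered[n_recover:]
    log_entry = {"recovered": len(recovered), "remaining": len(kept)}
    return kept, log_entry

records = [{"id": i, "created_at": i} for i in range(10)]
kept, log = recover_check_data(records, recovery_rate=0.3)
assert log == {"recovered": 3, "remaining": 7}
assert kept[0]["id"] == 3     # the oldest three records were reclaimed
```

In the described system this routine would run once per cycle time value, with its `log_entry` appended to the automatically generated recovery log.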
Compared with the prior art, the invention has the beneficial effects that:
Compared with existing, more complex methods of reducing assembly difficulty, the present method works as follows. After the control system is powered on and started, it receives the position information sent by the sensor in real time, generates a corresponding plane image and confirms the motor initial position. The motor is then moved to the 0 position, manually by a worker or automatically by the device, while the control system records the coordinate information of the 0 position; the motor returns from the 0 position to the motor initial position, and several groups of different calibration positions are used to check whether the 0 position meets the actual requirement. After calibration, the control system computes the distance between the 0 position and the motor initial position using a built-in distance formula, records the direction of movement from the 0 position to the motor initial position, and stores the result as deviation information. Each subsequent time the control system is powered on, it re-acquires and verifies the motor initial position and compares it with the previous one: if the two initial positions are consistent, the corresponding 0 position is located from the motor initial position and the deviation information; if they are inconsistent, the cause of the discrepancy is analysed and the current motor initial position is used as check data to verify and confirm the 0 position.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
FIG. 1 is a block diagram of a flow chart of a method for reducing the difficulty of assembling a sensor according to the present invention;
fig. 2 is an algorithm block diagram of a method for reducing the difficulty of assembling a sensor according to the present invention.
Detailed Description
Referring to fig. 1, a method for reducing difficulty in assembling a sensor includes the following steps:
the control system is powered on and confirms the initial position.
Specifically, after the control system is powered on and started, a detection network model is automatically constructed and trained. The control system receives the position information sent by the sensor in real time and generates a corresponding plane image, while the detection network model acquires real-time image information of the device. The detection network model extracts feature data from each group of image information and feeds it into a bidirectional feature pyramid for feature fusion; after fusion, inference is performed on each group of image information at different resolutions, the sensor in each group is locked with a detection frame, and the detection frame information is collected to generate the corresponding detection frame coordinates. The relevant image information is then cropped with an enlarged margin according to each group of detection frame coordinates, and the resulting sensor pictures are collected and stored. A region proposal network (RPN) filters out the easy negative samples belonging to the background in each group of sensor pictures and selects regions that may contain the target for classification and regression, confirming the position coordinates of the sensor on the device. Finally, the position information sent by the sensor is compared with the position coordinates detected by the detection network model: if they are consistent, the sensor-reported position is used as the motor initial position; if not, both sets of coordinate information are fed back to a worker for manual confirmation, and the correct position data is selected as the motor initial position.
Further, the detection network model is communicatively connected to a cloud server, and the position check data stored in the cloud server is extracted. Non-binary data in each group of check data is converted into binary data, and each group of data is then mapped into a specified interval by a normalization method. The processed check data undergoes feature dimension reduction, is integrated into a simulation data set, and the simulation data set is divided into a verification set, a test set and a training set. The precision of the detection network model is verified repeatedly with each group of data in the verification set, the root mean square error of each group is computed, each group of data is predicted once, and the data with the best prediction result is output as the optimal parameter. The training set is then processed through input, convolution, pooling, full-connection and output stages using the optimal parameters to generate training samples, which are fed into the detection network model. The model is optimized in real time by a long-term iteration method and tested with the test set; if the test accuracy meets the expected value, training stops, and the model is finally evaluated for performance, namely its accuracy, detection rate and false alarm rate.
Position calibration is performed and movement data is recorded.
Specifically, the motor is moved to the 0 position, manually by a worker or automatically by the device, the 0 position being the initial position required by the worker. The control system records the coordinate information of the 0 position, after which the motor returns from the 0 position to the motor initial position. The motor is then moved, manually or automatically, to several groups of different calibration positions, and the coordinate information of each group is recorded. The control system extracts the recorded coordinate information of each group of calibration positions according to an internally set calibration rule to check whether the 0 position meets the actual requirement. If it does not, the system waits for the worker to issue an operation instruction: on a recalibration instruction the 0 position is calibrated again, and on a withdrawal instruction calibration stops. If the 0 position meets the actual requirement, the flow proceeds to the next operation.
And recording the position deviation to wait for the power-on verification again.
Specifically, the control system computes the distance between the 0 position and the motor initial position using a built-in distance formula, records the direction of movement from the 0 position to the motor initial position, stores the result as deviation information, and marks the generation time of the deviation information. Each subsequent time the control system is powered on, it re-acquires and verifies the motor initial position and compares it with the previous one. If the two initial positions are consistent, the corresponding 0 position is located from the motor initial position and the deviation information; if they are inconsistent, the cause of the discrepancy is analysed, the current motor initial position is used as check data to verify and confirm the 0 position, and the verified 0 position is fed back to the worker.
And detecting the check data periodically and recovering the data.
Specifically, the computer detects and counts in real time the number of past check data records stored in the control system, and updates the recovery rate according to a cycle time value set by system default or by a worker. The computer then periodically collects the calculated recovery rate value, extracts each group of data within the corresponding time window, front to back by data generation time, according to the set cycle time value, reclaims the stored past check data according to the collected recovery rate value, and automatically generates a recovery log recording the data recovery situation.

Claims (6)

1. A method for reducing the difficulty of assembling a sensor is characterized by comprising the following specific steps:
(1) the control system is powered on and confirms the initial position;
(2) carrying out position calibration and recording movement data;
(3) recording position deviation and waiting for power-on verification again;
(4) and detecting the check data periodically and recovering the data.
2. The method for reducing the difficulty in assembling the sensor according to claim 1, wherein the initial position confirmation in the step (1) comprises the following specific steps:
step one: after power-on and start-up, the control system automatically constructs and trains a detection network model, receives the position information sent by the sensor in real time and generates a corresponding plane image, while the detection network model acquires real-time image information of the device;
step two: the detection network model extracts feature data of each group of image information and sends the feature data into the bidirectional feature pyramid for feature fusion, after the feature fusion is completed, each group of image information with different resolutions is deduced, a sensor in each group of image information is locked through a detection frame, then sensor detection frame information in the image information is collected, and corresponding detection frame coordinates are generated;
step three: cropping the relevant image information with an enlarged margin according to each group of detection frame coordinates, collecting and storing each group of sensor pictures generated by the enlarged cropping, filtering out the easy negative samples belonging to the background in each group of sensor pictures through a region proposal network (RPN), selecting regions that may contain the target for classification and regression, and confirming the position coordinates of the sensor on the device;
step four: and comparing the position information sent by the sensor with the position coordinates detected by the detection network model, if the comparison results are consistent, using the position information sent by the sensor as the initial position of the motor, and if the comparison results are inconsistent, feeding back two sets of coordinate information to a worker for manual confirmation to select correct position data as the initial position of the motor.
3. The method for reducing the difficulty in assembling the sensor according to claim 2, wherein the step of training the detection network model in the step one is as follows:
step I: the detection network model is in communication connection with the cloud server, position check data stored in the cloud server are extracted, non-binary data in each group of check data are converted into binary data, and then each group of data are converted into a specified interval through a normalization method;
and step II: extracting each group of processed check data to perform feature dimension reduction processing, integrating and summarizing each group of processed check data into a simulation data set, and dividing the simulation data set into a verification set, a test set and a training set;
step III: repeatedly verifying the precision of the detection network model by using each group of data in the verification set, counting the root mean square error of each group of data in the verification set, simultaneously predicting each group of data once, and outputting the data with the best prediction result as the optimal parameter;
step IV: the method comprises the steps of inputting, convolving, pooling, fully connecting and outputting a training set through optimal parameters to generate training samples, finally conveying the training samples to a detection network model, optimizing the detection network model in real time by adopting a long-term iteration method, testing the detection network model by using a test set, stopping training if the test accuracy meets an expected value, and finally evaluating the performance of the detection network model meeting the expected value, namely evaluating the accuracy, the detection rate and the false alarm rate.
4. The method for reducing the difficulty in assembling the sensor, according to claim 2, wherein the position calibration in the step (2) comprises the following specific steps:
the first step is as follows: the motor position is manually moved to a position 0 by a worker or automatically moved to the position 0 by a device, wherein the position 0 is specifically an initial position required by the worker, meanwhile, the control system records the coordinate information of the position 0, and then the motor returns to the initial position of the motor from the position 0;
the second step is that: then, the position of the motor is manually moved to a plurality of groups of different calibration positions by workers or automatically moved by a device, the coordinate information of each group of calibration positions is recorded respectively, and the control system extracts the recorded coordinate information of each group of calibration positions according to an internally set calibration rule so as to calibrate whether the position 0 meets the actual requirement;
the third step: if the position 0 does not meet the actual requirement, waiting for the staff to issue an operation instruction, if the staff issues a recalibration instruction, repeating the first step to the second step, if the staff issues a withdrawal calibration, stopping the calibration, and if the position 0 meets the actual requirement, transferring to the next operation process.
5. The method for reducing the assembly difficulty of the sensor according to claim 4, wherein the step (3) of the power-on verification specifically comprises the following steps:
s1: the control system calculates the distance between the 0 position and the initial position of the motor through a built-in distance calculation formula, records the moving direction when the 0 position moves to the initial position of the motor, simultaneously records the calculation result as deviation information, and marks the generation time of the deviation information;
s2: the control system collects and verifies the initial position of the motor again when the control system is electrified subsequently each time, compares the initial position of the motor after each verification with the initial position of the motor in the previous time, and searches a corresponding 0 position according to the initial position of the motor and deviation information if the initial positions of the motors in the two times are consistent;
s3: if the initial positions of the motors are inconsistent twice, analyzing the reason of the position inconsistency, simultaneously using the initial position of the motor as verification data to verify and confirm the position 0, and feeding back the verified position 0 to the staff.
6. The method for reducing the assembly difficulty of a sensor according to claim 1, wherein the data recovery in step (4) specifically comprises the following steps:
p1: the computer detects and counts in real time the number of past check data stored in the control system, and updates and calculates the recovery rate according to a cycle time value that is either the system default or set by a worker;
p2: the computer then periodically collects the calculated recovery rate value, extracts each group of data within the corresponding time window, from earliest to latest according to the set cycle time value and the data generation time, recovers the stored past check data according to the collected recovery rate value, and automatically generates a recovery log recording the data recovery.
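The recovery cycle in p1–p2 can be sketched as a periodic prune of the oldest stored check records plus a log entry. Everything concrete here is an assumption: the patent does not give the recovery-rate formula, so the occupancy-based rate, the store capacity, and the record shape below are illustrative only.

```python
import time

# Hypothetical sketch of the data-recovery cycle in p1-p2. The
# recovery-rate formula (occupancy-based, capped at 50%) and the record
# layout are assumptions, not taken from the patent.


def compute_recovery_rate(record_count, capacity=1000):
    """p1: the fuller the store of past check data, the larger the
    fraction reclaimed in the next cycle (capped at 50%)."""
    return min(0.5, record_count / capacity)


def recover(records, rate, log):
    """p2: drop the oldest records (front of the list, i.e. earliest
    generation time first) according to the rate, and append a recovery
    log entry describing what was removed."""
    n_remove = int(len(records) * rate)
    removed, kept = records[:n_remove], records[n_remove:]
    log.append({"time": time.time(), "removed": len(removed),
                "remaining": len(kept)})
    return kept


log = []
records = [{"generated_at": i, "data": None} for i in range(100)]  # oldest first
records = recover(records, compute_recovery_rate(len(records)), log)
print(len(records), log[0]["removed"])  # prints: 90 10
```

The earliest-to-latest ordering matches the claim's "from front to back according to ... the data generation time"; a real system would run `recover` on a timer set to the cycle time value.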
CN202210811516.9A 2022-07-11 2022-07-11 Sensor assembly difficulty reducing method Active CN115082661B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210811516.9A CN115082661B (en) 2022-07-11 2022-07-11 Sensor assembly difficulty reducing method

Publications (2)

Publication Number Publication Date
CN115082661A true CN115082661A (en) 2022-09-20
CN115082661B CN115082661B (en) 2024-05-10

Family

ID=83260470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210811516.9A Active CN115082661B (en) 2022-07-11 2022-07-11 Sensor assembly difficulty reducing method

Country Status (1)

Country Link
CN (1) CN115082661B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115329670A (en) * 2022-08-11 2022-11-11 深圳朗道智通科技有限公司 Data acquisition method for unmanned vehicle

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105058388A (en) * 2015-08-17 2015-11-18 哈尔滨工业大学 Sensor data fusion method used for acquiring robot joint position feedback information
CN105397807A (en) * 2015-12-21 2016-03-16 珠海格力电器股份有限公司 Robot zero calibration device and robot zero calibration system as well as robot zero calibration method
RU2592734C1 (en) * 2015-05-26 2016-07-27 Федеральное государственное бюджетное учреждение науки Институт автоматики и электрометрии Сибирского отделения Российской академии наук (ИАиЭ СО РАН) Method of calibrating angular sensor
US20180088228A1 (en) * 2016-09-23 2018-03-29 Baidu Online Network Technology (Beijing) Co., Ltd. Obstacle detection method and apparatus for vehicle-mounted radar system
CN110196075A (en) * 2018-02-27 2019-09-03 上海市计量测试技术研究院 A kind of environmental test equipment calibration long-range temperature and humidity test system and test method
CN110942144A (en) * 2019-12-05 2020-03-31 深圳牛图科技有限公司 Neural network construction method integrating automatic training, checking and reconstructing
EP3693697A1 (en) * 2019-02-06 2020-08-12 OptiNav Sp. z o.o. Method for calibrating a 3d measurement arrangement
WO2020164282A1 (en) * 2019-02-14 2020-08-20 平安科技(深圳)有限公司 Yolo-based image target recognition method and apparatus, electronic device, and storage medium
DE112018007287T5 (en) * 2018-03-15 2020-12-10 Harman International Industries, Incorporated VEHICLE SYSTEM AND METHOD FOR DETECTING OBJECTS AND OBJECT DISTANCE
DE102021101593B3 (en) * 2021-01-26 2022-03-31 Audi Aktiengesellschaft Method for operating environment sensors in a motor vehicle and motor vehicle
CN114581983A (en) * 2022-03-04 2022-06-03 浪潮(北京)电子信息产业有限公司 Detection frame processing method for target detection and related device
CN216851660U (en) * 2022-03-22 2022-06-28 阿斯曼尔科技(上海)有限公司 Mechanical structure for solving problem of inaccurate positioning caused by back clearance of gear box

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN SHUOHUI; ZHOU XIANG; WANG HANQI; LIAO CHUANWEI: "Design and Implementation of a High-Precision Inclination Sensor Detection System", Automation & Instrumentation, no. 12, 15 December 2012 (2012-12-15) *

Also Published As

Publication number Publication date
CN115082661B (en) 2024-05-10

Similar Documents

Publication Publication Date Title
CN115082661A (en) Method for reducing assembly difficulty of sensor
CN113723325B (en) Frock defect detecting system of prefabricated component of assembled
CN113298194B (en) Data fusion method and system based on multiple sensors and storage medium
CN115017578A (en) Intelligent actual measurement method and device for building, UGV and storage medium
KR102198028B1 (en) Position Verification Method for Equipment Layout at 3D Design of Smart Factory
CN112766253A (en) Intelligent system for full-automatic instrument analysis in complex environment based on image
CN113420016A (en) Real-time acquisition system for surveying and mapping data
CN114998597A (en) Target detection method and device based on artificial intelligence
CN112540404B (en) Automatic speed analysis method and system based on deep learning
CN205300712U (en) Ultrasonic wave gas surface low volume point coefficient correcting unit
CN114722960A (en) Method and system for detecting incomplete track of event log in business process
CN113689439A (en) Unmanned aerial vehicle image capturing method based on reinforcement learning image processing technology
CN114034260A (en) Deep foundation pit support structure deformation diagnosis system based on streaming media and BIM
CN106338770A (en) Shot detection point data mutual checking method and system
CN110703183A (en) Intelligent electric energy meter fault data analysis method and system
CN116664699B (en) Automobile production line data management system and method
CN116090692B (en) Engineering cost management system and method based on BIM technology
CN117761717B (en) Automatic loop three-dimensional reconstruction system and operation method
CN116776086B (en) Signal fault discriminating method and device based on self-attention mechanism self-encoder
CN116801192B (en) Indoor electromagnetic fingerprint updating method and system by end cloud cooperation
CN118032008A (en) Multi-sensor combined calibration system and method for autonomous mobile robot
CN117272704B (en) Digital twin-drive data processing system for multi-source heterogeneous data
CN111814953B (en) Positioning method of deep convolution neural network model based on channel pruning
CN118013465A (en) Non-motor vehicle identification method and system based on multi-sensor cooperation
CN114674977A (en) Gas detection alarm instrument and gas monitoring system based on cloud computing and edge computing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant