CN113172624A - Positioning guide device and method and electronic equipment - Google Patents

Positioning guide device and method and electronic equipment

Info

Publication number
CN113172624A
CN113172624A (application number CN202110443057.9A)
Authority
CN
China
Prior art keywords
image
image sensor
reflection
positioning
center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110443057.9A
Other languages
Chinese (zh)
Inventor
黄威
吕炜俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Chuangyuan Microsoft Co ltd
Original Assignee
Beijing Chuangyuan Microsoft Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chuangyuan Microsoft Co ltd filed Critical Beijing Chuangyuan Microsoft Co ltd
Priority to CN202110443057.9A priority Critical patent/CN113172624A/en
Publication of CN113172624A publication Critical patent/CN113172624A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661 Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Abstract

The present application relates to the field of industrial vision equipment and, in particular, to a positioning guide device, a positioning guide method, and an electronic device. The positioning guide device provided by the embodiments of the application comprises a reflecting part, a first image sensor, a processor, and an actuator. The reflecting part is arranged on a carrier stage on which a first component under test is also placed, with the center of the reflecting part and the center of the first component under test lying in the same horizontal plane. The first image sensor is arranged in the vertical direction of the carrier stage, the lens center of the first image sensor and the center of the reflecting part lie in the same vertical plane, and the first image sensor captures a first image of the first component under test via reflection by the reflecting part. Because the reflecting part redirects the optical path, the first image sensor can acquire images that could not be obtained from a conventional position, increasing the space utilization of the machine platform.

Description

Positioning guide device and method and electronic equipment
Technical Field
The present disclosure relates to the field of industrial vision devices, and particularly, to a positioning guide device, a positioning guide method, and an electronic device.
Background
In the common approach of visual positioning and guidance by CCD photography, the distance between the camera lens and the object under test cannot always be made compatible with the size of the machine platform. Moreover, under complex working conditions a product may need to be photographed from multiple directions, and the available space and dimensions of the machine platform often make this impossible, so the positioning guidance requirements cannot be met. Alternatively, a much more expensive large-field-of-view lens can be used, but such a lens introduces significant distortion that degrades the final assembly precision.
Disclosure of Invention
In view of the above, embodiments of the present disclosure provide a positioning guiding apparatus, a positioning guiding method and an electronic device, so as to overcome the above-mentioned drawbacks in the prior art.
The application mainly comprises the following aspects:
in a first aspect, an embodiment of the present application provides a positioning guide device, including: a reflecting part arranged on a carrier stage on which a first component under test is also placed, the center of the reflecting part and the center of the first component under test lying in the same horizontal plane; a first image sensor arranged in the vertical direction of the carrier stage, the lens center of the first image sensor and the center of the reflecting part lying in the same vertical plane, the first image sensor capturing a first image of the first component under test via reflection by the reflecting part; a processor for acquiring the first image of the first component under test from the first image sensor, recognizing the image to obtain pixel coordinates of a plurality of detection points on the first component under test, and generating a motion control signal based on the optical parameters of the reflecting part and the pixel coordinates of the plurality of detection points; and an actuator that, in response to the motion control signal generated by the processor, performs a corresponding action on the first component under test.
In one possible embodiment, in the positioning and guiding device, a first distance between a center of the reflection portion and a center of the first measured member and a second distance between a lens center of the first image sensor and the center of the reflection portion are determined based on the imaging parameters of the first image sensor.
In a possible embodiment, the positioning and guiding device further comprises a rotating structure, and the reflection part is fixedly arranged on the bearing platform or is connected with the bearing platform through the rotating structure.
In one possible embodiment, the positioning and guiding device further comprises: a first light source disposed between the first measured member and the reflecting portion to vertically irradiate the first measured member; and a second light source disposed between the second measured member and the reflection part to vertically irradiate the second measured member.
In one possible embodiment, the positioning and guiding device further comprises: and the second image sensor is arranged in the vertical direction of the bearing table, the center of a lens of the second image sensor is positioned in the same vertical plane with the center of the first part to be measured, and the second image sensor captures a third image of the first part to be measured.
In one possible embodiment, in the positioning and guiding device, the angle between the reflecting surface of the reflecting part and the horizontal plane is 45 degrees.
In a second aspect, an embodiment of the present application further provides a positioning and guiding method, which is used for the positioning and guiding device of the first aspect, and the positioning and guiding method includes: acquiring a first image of a first part under test from a first image sensor; identifying a first image to obtain pixel coordinates of a plurality of detection points on a first detected part; generating a motion control signal based on the optical parameter of the reflection part and the pixel coordinates of the plurality of detection points; and controlling an actuating mechanism to execute corresponding action aiming at the first tested part according to the motion control signal.
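The four steps of the method in the second aspect can be sketched as follows. This is a minimal illustration only; the sensor, recognizer, and actuator interfaces, and the pixels-per-millimeter parameter, are hypothetical stand-ins, not part of the disclosure:

```python
def positioning_guide(sensor, recognize, mirror_params, actuator):
    """One positioning-guide cycle: capture, recognize, convert, act."""
    # Step 1: acquire the first image of the component under test.
    image = sensor.capture()
    # Step 2: identify pixel coordinates of the preset detection points.
    points = recognize(image)            # e.g. [(u1, v1), (u2, v2), ...]
    # Step 3: convert pixel coordinates to mechanical coordinates using
    # the optical parameters of the reflecting part (at 45 degrees the
    # image-to-motion scale is uniform after calibration).
    scale = mirror_params["pixels_per_mm"]
    targets = [(u / scale, v / scale) for (u, v) in points]
    # Step 4: drive the actuator to the computed targets.
    for x, y in targets:
        actuator.move_to(x, y)
    return targets
```

In practice the step-3 conversion would come from the calibration described later in the specification rather than a single scale factor.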
In a possible embodiment, the positioning guide device further comprises a rotating structure, and the reflecting part is connected to the carrier stage through the rotating structure. The step of acquiring a first image of the first component under test from the first image sensor then comprises: generating a first rotation control signal, and controlling the rotating structure to rotate by a first preset angle based on the first rotation control signal so that the reflecting part faces the first component under test; generating a first image capture signal, and controlling the first image sensor to capture a first image of the first component under test, via reflection by the reflecting part, based on the first image capture signal; and acquiring the first image of the first component under test from the first image sensor. A second component under test may also be placed on the carrier stage, in which case the positioning guide method further includes: generating a second rotation control signal, and controlling the rotating structure to rotate by a second preset angle based on the second rotation control signal so that the reflecting part faces the second component under test; generating a second image capture signal, and controlling the first image sensor to capture a second image of the second component under test, via reflection by the reflecting part, based on the second image capture signal; and acquiring the second image of the second component under test from the first image sensor.
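The rotate-then-capture sequence for imaging two components with one sensor can be sketched as below. The rotator and sensor interfaces and the two angles are illustrative assumptions, not values from the disclosure:

```python
def capture_both(rotator, sensor, angle_first=0.0, angle_second=180.0):
    """Aim the reflecting part at each component in turn and capture."""
    rotator.rotate_to(angle_first)    # first rotation control signal
    first_image = sensor.capture()    # first image capture signal
    rotator.rotate_to(angle_second)   # second rotation control signal
    second_image = sensor.capture()   # second image capture signal
    return first_image, second_image
```

The design point is that one fixed sensor serves both components; only the lightweight reflecting part moves between exposures.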
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions being executed by the processor to perform the steps of the positioning guidance method according to the second aspect or any of the possible embodiments of the second aspect.
In a fourth aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the positioning and guiding method in the second aspect or any possible implementation manner of the second aspect.
The positioning guide device provided by the embodiments of the application comprises a reflecting part, a first image sensor, a processor, and an actuator. The reflecting part is arranged on the carrier stage; light from the first component under test is reflected by the reflecting part toward the first image sensor, with the incident light from the component perpendicular to the light reflected by the reflecting part. The processor acquires an image of the first component under test through the first image sensor, computes the pixel coordinates of a plurality of detection points on the component, and generates the corresponding motion control signal from the optical parameters of the reflecting part and those pixel coordinates. The actuator receives the motion control signal and, in response, performs the corresponding action on the first component under test, so that the manipulator is positioned and guided to complete the assembly work. Because the reflecting part redirects the optical path, the first image sensor can obtain images that could not be captured from a conventional position, increasing the space utilization of the machine platform.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic structural view of a conventional positioning guide device;
fig. 2 is a schematic structural diagram illustrating a positioning and guiding device provided in an embodiment of the present application;
fig. 3 shows one of the schematic structural diagrams of a positioning and guiding device provided by the embodiment of the present application;
fig. 4 shows a second schematic structural diagram of a positioning and guiding device provided in the embodiment of the present application;
FIG. 5 is a schematic structural diagram of another positioning and guiding device provided in the embodiments of the present application;
fig. 6 is a flowchart illustrating a positioning guidance method according to an embodiment of the present application;
fig. 7 is a flowchart illustrating another positioning guidance method provided by an embodiment of the present application;
fig. 8 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Description of the main element symbols:
in the figure:
101-a reflection part; 102-a first image sensor; 103-a first component under test; 104-a second component under test; 105-a first light source; 106-a second light source; 107-a second image sensor; 108-a third light source;
400-an electronic device; 401-a processor; 402-a communication bus; 403-a user interface; 404-a network interface; 405-a memory; 4051-operating system; 4052-application program.
Detailed Description
To make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and that steps without logical context may be performed in reverse order or concurrently. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
To enable one of ordinary skill in the art to utilize the present disclosure, the following embodiments are presented in conjunction with a specific application scenario "industrial vision device positioning guidance," and it would be apparent to one of ordinary skill in the art that the general principles defined herein may be applied to other embodiments and application scenarios without departing from the spirit and scope of the present disclosure.
The following apparatus, method, electronic device or computer-readable storage medium in the embodiments of the present application may be applied to any scenario that needs to perform positioning guidance, and the embodiments of the present application do not limit specific application scenarios, and any scheme that uses the apparatus and method for positioning guidance provided in the embodiments of the present application is within the protection scope of the present application.
It should be noted that, before the present application, the lens of a CCD image sensor was aimed directly at the device under test in the manner shown in fig. 1, but in some cases the required distance between the lens and the device under test is not compatible with the size of the equipment platform. Moreover, complex working conditions may require photographing from multiple directions, and the installation of the lens is then further restricted by the space and dimensions of the equipment platform, so the photographing requirements cannot be met and positioning guidance becomes impossible. Choosing a large-field-of-view lens to complete the shot increases cost, and such a lens is prone to distortion, which degrades the precision of the captured image. In addition, most equipment platforms contain motion modules, whose trajectories in different motion states may also conflict with the lens position.
In view of the above problems, embodiments of the present application provide a positioning guiding apparatus, a positioning guiding method, and an electronic device, which are described below by way of embodiments.
For the convenience of understanding of the present application, the technical solutions provided in the present application will be described in detail below with reference to specific embodiments.
Example one
As shown in fig. 2, the positioning and guiding device provided in the embodiment of the present application includes a reflection portion 101, a first image sensor 102, a processor, and an actuator (not shown in the figure). The reflection part 101 is arranged on a bearing table, a first tested component 103 is further arranged on the bearing table, and the center of the reflection part 101 and the center of the first tested component 103 are located on the same horizontal plane. The first image sensor 102 is disposed in the vertical direction of the stage, the lens center of the first image sensor 102 is in the same vertical plane as the center of the reflection part 101, and the first image sensor 102 captures a first image of the first part under test 103 by reflection by the reflection part 101. The processor acquires a first image of the first measured part 103 from the first image sensor 102, recognizes the first image to obtain pixel coordinates of a plurality of detection points on the first measured part 103, and generates a motion control signal based on the optical parameters of the reflecting part 101 and the pixel coordinates of the plurality of detection points. The actuator performs a corresponding action with respect to the first part under test 103 in response to the processor-generated motion control signal.
For example, the angle between the reflecting surface of the reflecting part 101 and the horizontal plane is 45 degrees, and the reflecting part 101 may be made of a high-transmittance glass material. In one example, the reflection part 101 may be a right-angle prism, the reflection part 101 is disposed on the stage, and the center of the reflection part 101 and the center of the first measured component 103 are on the same horizontal plane, i.e. the incident light of the first measured component 103 in the horizontal direction is at 45 degrees to the reflection plane of the reflection part 101, it can be understood that the angle a in fig. 2 is also at 45 degrees.
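The 45-degree geometry can be checked with the law of reflection: a horizontal ray striking a surface inclined 45 degrees to the horizontal leaves vertically, toward the overhead sensor. A small numeric check (illustration only, not part of the disclosure):

```python
import math

def reflect(ray, normal):
    """Reflect a 2-D direction vector about a unit normal: r' = r - 2(r.n)n."""
    dot = ray[0] * normal[0] + ray[1] * normal[1]
    return (ray[0] - 2 * dot * normal[0], ray[1] - 2 * dot * normal[1])

# Unit normal of a reflecting surface inclined 45 degrees to the horizontal.
n = (1 / math.sqrt(2), -1 / math.sqrt(2))
# Horizontal light arriving from the component under test.
out = reflect((1.0, 0.0), n)
# out is (0, 1) up to rounding: the ray leaves vertically.
```

This is why the center of the reflection part and the center of the component must share a horizontal plane while the lens center and the reflection-part center share a vertical plane: the fold is exactly 90 degrees.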
The first image sensor 102 is disposed in the vertical direction of the carrier stage, and the lens center of the first image sensor 102 lies in the same vertical plane as the center of the reflection part 101, i.e. the light reflected by the reflection part 101 is likewise at 45 degrees to its reflecting surface. The first image sensor 102 captures the image through its photosensitive chip to generate a first image of the first measured component 103, which may be a side view of the first measured component 103.
The processor acquires a first image of the first measured component 103 from the first image sensor 102, recognizes it to obtain the pixel coordinates of a plurality of detection points on the first measured component 103, and generates a motion control signal from the optical parameters of the reflection part 101 and those pixel coordinates. A detection point is a preset pixel coordinate associated with the first measured component 103 and is used to position the component. The captured first image is mirrored and therefore requires image processing; the industrial vision system processes it into a surface image of the component under test. Using this image, the guide motion is simulated (i.e. the machine platform drives the component under test through a mechanical movement) to complete a nine-point calibration. The calibration yields the transformation between the image and the actual motion of the component, completing the image-capture, calibration, and computation steps.
That is, the processor determines mechanical motion coordinates based on the optical parameters of the reflection section 101 and the pixel coordinates of the plurality of detection points, and generates the manipulator's motion control signal from those coordinates. The optical parameters of the reflection part 101 may include the angle between its reflecting surface and the horizontal plane; when this angle is 45 degrees, the displacements ΔX and ΔY along the mechanical movement directions (the X-axis and Y-axis directions) are in a 1:1 relationship, and the conversion relationship between the image and the actual movement can be obtained by calibration.
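The calibration described above, which recovers the pixel-to-machine transform from marked correspondences, can be sketched as an affine fit. This is a generic sketch, not the patent's specific procedure: it solves the minimum case of three correspondences exactly by Cramer's rule, whereas a real nine-point calibration would fit all nine points by least squares.

```python
def _det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_affine(pixel_pts, machine_pts):
    """Exact affine map x = a*u + b*v + c, y = d*u + e*v + f
    from three pixel/machine correspondences (Cramer's rule)."""
    (u1, v1), (u2, v2), (u3, v3) = pixel_pts
    M = [[u1, v1, 1], [u2, v2, 1], [u3, v3, 1]]
    det = _det3(M)
    coeffs = []
    for k in range(2):                    # k=0 -> x equation, k=1 -> y
        rhs = [p[k] for p in machine_pts]
        row = []
        for col in range(3):
            Mc = [r[:] for r in M]        # replace one column with rhs
            for i in range(3):
                Mc[i][col] = rhs[i]
            row.append(_det3(Mc) / det)
        coeffs.append(row)
    return coeffs                         # [[a, b, c], [d, e, f]]

def pixel_to_machine(coeffs, u, v):
    (a, b, c), (d, e, f) = coeffs
    return a * u + b * v + c, d * u + e * v + f
```

Note that an affine fit absorbs the mirror-image effect automatically: a flipped axis simply shows up as a negative coefficient.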
The actuator responds to the motion control signal generated by the processor, so that the manipulator completes actions such as assembling the first part under test 103.
Preferably, the first image sensor 102 is mounted above the stage, but it may also be placed at the side of or below the stage as needed. When the first image sensor 102 is installed at the side of or below the stage, the installation position of the reflection portion 101 is adjusted correspondingly to keep the reflection portion 101 at its 45-degree placement (i.e. the angle between the reflected measurement light and the reflecting surface of the reflection portion 101 is 45 degrees), preventing image distortion from degrading the capture quality.
Specifically, when a bottom view of the product must be photographed, placing the lens facing upward in the conventional arrangement allows dust to fall onto the lens, and dust accumulated over time degrades the clarity of the captured images. Instead, the lens can be fixed to the side of the machine and the bottom view of the component under test obtained via reflection by the reflection part, which avoids the problem of lens dust deposition affecting clarity.
In this embodiment, the optical path is folded by the reflection portion 101 disposed between the component under test and the first image sensor 102, so the first image sensor 102 can be placed at a position that does not intrude on the overall space of the system while still imaging the component under test, with positioning guidance completed from the image information. This increases the space utilization of the machine platform, avoids the spatial conflict between the machine dimensions and the size of the object under test, and reduces the cost of the production system equipment.
Further, a first distance between the center of the reflection unit 101 and the center of the first measured member 103 and a second distance between the lens center of the first image sensor 102 and the center of the reflection unit 101 are determined based on the imaging parameters of the first image sensor 102.
In a specific embodiment, the sum of the object distance between the center of the first measured component 103 and the center of the reflection part 101 and the object distance between the lens center of the first image sensor 102 and the center of the reflection part 101 is equal to the object distance of the lens of the first image sensor 102. The first image sensor 102 may be fixed on the equipment machine platform through a sliding structure, and the sliding structure may receive an instruction of the processor, and drive the first image sensor 102 to move vertically or horizontally, so as to ensure that a sum of an object distance between a center of the first measured component 103 and a center of the reflection portion 101 and an object distance between a lens center of the first image sensor 102 and the center of the reflection portion 101 is equal to a lens object distance of the first image sensor 102. In a practical application scenario, a first distance between the center of the reflection part 101 and the center of the first measured component 103 may be affected by a volume limit of the measured component or a space of the device, and the first distance may be shortened, for example, from 300mm to 250mm, and then a second distance between the lens center of the first image sensor 102 and the center of the reflection part 101 needs to be adjusted from 300mm to 350mm through a sliding structure, so as to maintain a sum of an object distance between the center of the first measured component 103 and the center of the reflection part 101 and an object distance between the lens center of the first image sensor 102 and the center of the reflection part 101 equal to the lens object distance of the first image sensor 102.
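The constraint described above reduces to simple arithmetic: the component-to-mirror and mirror-to-lens segments must sum to the lens's fixed object distance. A minimal sketch, where the 600 mm total is inferred from the 300 mm + 300 mm example in the text:

```python
def second_distance(lens_object_distance, first_distance):
    """Mirror-to-lens distance that keeps the folded optical path equal
    to the lens object distance (first segment + second segment = total)."""
    if not 0 < first_distance < lens_object_distance:
        raise ValueError("first segment must be between 0 and the total")
    return lens_object_distance - first_distance

# Shortening the component-to-mirror segment from 300 mm to 250 mm means
# the sliding structure must extend the mirror-to-lens segment to 350 mm.
```

The sliding structure in the text is what realizes this adjustment mechanically, moving the sensor so the returned value is maintained.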
Further, as shown in fig. 3 and 4, the positioning and guiding device further includes a rotating structure, and the reflecting portion 101 is fixedly disposed on the carrier or the reflecting portion 101 is connected to the carrier through the rotating structure.
The processor generates a first rotation control signal, sends the first rotation control signal to the rotating structure, and controls the rotating structure to rotate by a first preset angle, so that the reflecting part 101 is opposite to the first part to be measured 103; the processor generates a first image capturing signal and transmits the first image capturing signal to the first image sensor 102, and controls the first image sensor 102 to capture a first image of the first component under test 103 through reflection by the reflection part 101.
A second tested part 104 is further placed on the bearing table, the processor generates a second rotation control signal, sends the second rotation control signal to the rotating structure, and controls the rotating structure to rotate by a second preset angle, so that the reflecting part 101 is opposite to the second tested part 104; the processor generates a second image capturing signal and transmits the second image capturing signal to the first image sensor 102, and controls the first image sensor 102 to capture a second image of the second part under test 104 by reflection of the reflection part 101.
In this embodiment, the reflection unit 101 is connected to the stage via a rotation structure, and the first image sensor 102 respectively acquires a first image of the first component under test 103 and a second image of the second component under test 104.
Specifically, the first image sensor 102 is disposed between the first measured component 103 and the second measured component 104. When the processor generates the first rotation control signal, the rotating structure rotates by a first predetermined angle in response, so that the reflecting surface of the reflection portion 101 faces the first measured component 103 at an angle of 45 degrees to the horizontal plane. When the processor generates the first image capture signal, the first image sensor 102 acquires a first image of the first measured component 103 via reflection by the reflection section 101. After the first image is acquired, the processor generates a second rotation control signal, and the rotating structure rotates by a second predetermined angle in response, so that the reflecting surface of the reflection portion 101 faces the second measured component 104, again at 45 degrees to the horizontal plane. When the processor generates the second image capture signal, the first image sensor 102 acquires a second image of the second measured component 104 via reflection by the reflection section 101; this second image is a side view of the second measured component 104. A single image sensor thus acquires images of both objects under test, improving the space utilization of the device.
Further, the positioning guide device also includes a first light source 105 and a second light source 106. The first light source 105 is disposed between the first measured member 103 and the reflection portion 101 and vertically irradiates the first measured member 103, enhancing its reflected light and improving the accuracy of the image acquired by the first image sensor 102. The second light source 106 is disposed between the second measured member 104 and the reflection portion 101 and vertically irradiates the second measured member 104, enhancing its reflected light and improving the accuracy of the captured image.
As shown in fig. 5, further, the positioning and guiding device further includes: a second image sensor 107. The second image sensor 107 is arranged in the vertical direction of the stage, the lens center of the second image sensor 107 is in the same vertical plane as the center of the first part under test 103, and the second image sensor 107 captures a third image of the first part under test 103.
Specifically, the second image sensor 107 is disposed in the vertical direction of the stage, the lens center of the second image sensor 107 is in the same vertical plane as the center of the first measured component 103, and the second image sensor 107 is configured to directly acquire the third image of the first measured component 103 in response to the third image capturing signal. The third image is a top view of the first measured component 103.
Alternatively, as shown in fig. 5, a third light source 108 is provided between the first part under test 103 and the second image sensor 107. The second image sensor 107 may also directly acquire a top view of the first part under test 103 in response to the third image capturing signal, while the first image sensor 102 simultaneously acquires a side view of the second part under test 104 via reflection by the reflection part 101 in response to the first image capturing signal. The vision system must position and guide both the first part under test 103 and the second part under test 104. The first part under test 103 lies on the plane of the bearing table, so its positioning and guiding can be completed by shooting vertically with the second image sensor 107. The second part under test 104, however, sits on the side of the bearing table, which travels on a horizontal assembly line and cannot be turned over or otherwise manipulated, and equipment-space constraints prevent mounting a camera on that side. The reflection part 101 is therefore provided so that the first image sensor 102 can obtain a side view of the second part under test 104 from above by reflection, meeting the system requirements.
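The simultaneous top-view and side-view acquisition described above can be sketched with two threads triggering both sensors at once. The sensor objects and their `grab()` method are assumptions made for illustration only.

```python
import threading


def grab_simultaneously(top_sensor, side_sensor):
    """Trigger the overhead sensor (top view, third image) and the
    mirror-folded sensor (side view) at the same time."""
    results = {}

    def grab(name, sensor):
        results[name] = sensor.grab()

    threads = [
        threading.Thread(target=grab, args=("top", top_sensor)),
        threading.Thread(target=grab, args=("side", side_sensor)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait until both frames are in
    return results["top"], results["side"]
```

In practice such cameras are usually triggered by a shared hardware signal rather than software threads; the sketch only illustrates that the two exposures are concurrent, not sequential.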
In the embodiment of the present application, the first image sensor 102 and the second image sensor 107 acquire the first image and the third image of the first part under test 103 simultaneously. The processor recognizes both images to obtain the pixel coordinates of a plurality of detection points on the first part under test 103, and generates a motion control signal based on the optical parameters of the reflection part 101 and those pixel coordinates. A detection point is a preset pixel coordinate corresponding to the first part under test 103 and is used to position it. Capturing the first image and the third image simultaneously calibrates the first part under test 103 from multiple angles, meets the need for multi-direction imaging under complex working conditions, and makes reasonable use of the device's space.
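One plausible way to turn detection-point pixel coordinates and the mirror's optical parameters into a motion control signal is a scaled pixel-to-millimetre mapping with an axis flip for the single mirror reflection. The pixel size, magnification, flip convention, and the averaging step below are illustrative assumptions, not values or algorithms stated in the patent.

```python
def pixel_to_offset(u, v, pixel_size_mm, magnification, mirror_flip=True):
    """Convert a pixel coordinate (relative to the image center) into a
    physical offset in millimetres on the part under test."""
    x = u * pixel_size_mm / magnification
    y = v * pixel_size_mm / magnification
    if mirror_flip:
        # A single mirror reflection reverses handedness: one axis flips sign.
        x = -x
    return x, y


def motion_signal(points, pixel_size_mm, magnification):
    """Average the detection-point offsets into one correction vector that
    an actuator could follow."""
    offsets = [pixel_to_offset(u, v, pixel_size_mm, magnification)
               for u, v in points]
    n = len(offsets)
    return (sum(o[0] for o in offsets) / n,
            sum(o[1] for o in offsets) / n)
```

For example, with an assumed 5 µm pixel and 0.5x magnification, a detection point 100 pixels right of center corresponds to a 1 mm physical offset, sign-flipped by the mirror.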
Further, the positioning and guiding device comprises a reflection part 101, and an included angle between a reflection surface of the reflection part 101 and a horizontal plane is 45 degrees.
In this embodiment, the positioning and guiding device includes a reflection part 101, which is a right-angle prism whose reflecting surface forms an angle of 45 degrees with the horizontal plane. Preferably, the reflection part 101 is made of a high-transmittance glass material, so that the image it forms is clear, image distortion is effectively avoided, and the illumination is not affected.
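As a small sanity check on the 45-degree geometry (an illustrative sketch, not part of the patent): a mirror tilted by t degrees deviates a ray by 2t degrees, so 45 degrees yields the 90-degree fold that turns the camera's vertical axis into a horizontal line of sight, and folding the axis leaves the total optical path length unchanged.

```python
def fold_angle_deg(mirror_tilt_deg):
    """A plane mirror tilted by t degrees deviates the beam by 2*t degrees;
    45 degrees therefore produces the 90-degree fold used by the device."""
    return 2.0 * mirror_tilt_deg


def effective_object_distance(cam_to_mirror_mm, mirror_to_part_mm):
    """Folding the axis does not change path length: the equivalent object
    distance seen by the lens is the sum of the two legs."""
    return cam_to_mirror_mm + mirror_to_part_mm
```

This is why claim 2 can fix the two distances (sensor-to-mirror and mirror-to-part) from the sensor's imaging parameters alone: only their sum matters for focus.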
Example two
Fig. 6 shows a positioning and guiding method provided in an embodiment of the present application, for use with the above positioning and guiding device. As shown in fig. 6, the method includes the following steps:
s201: acquiring a first image of the first part under test from the first image sensor;
s202: identifying the first image to obtain pixel coordinates of a plurality of detection points on the first tested part;
s203: generating a motion control signal based on the optical parameter of the reflection part and the pixel coordinates of the plurality of detection points;
s204: and controlling the actuating mechanism to execute corresponding actions aiming at the first tested part according to the motion control signal.
In this embodiment, a first image of the first part under test is acquired by the first image sensor; the first image is recognized to obtain the pixel coordinates of a plurality of detection points on the first part under test, and a motion control signal is generated based on the optical parameter of the reflection part and those pixel coordinates. The actuator then performs the corresponding action on the first part under test in response to the motion control signal generated by the processor. Image acquisition via prism reflection yields views that cannot be obtained in a conventional setup, and after image processing the part under test is positioned and guided.
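Steps S201 to S204 amount to an acquire-identify-plan-act loop. A minimal sketch, with the sensor, detector, planner, and actuator all supplied as assumed callables rather than real device interfaces:

```python
def positioning_guide_step(grab_image, find_points, plan_motion, actuator):
    """One pass of the method: S201 acquire, S202 identify detection points,
    S203 generate the motion control signal, S204 execute it."""
    image = grab_image()           # S201: first image via the reflection part
    points = find_points(image)    # S202: pixel coords of detection points
    signal = plan_motion(points)   # S203: motion control signal
    actuator(signal)               # S204: corresponding action on the part
    return signal
```

Structuring the step this way also shows where the optical parameters of the reflection part enter: they live entirely inside `plan_motion`, which converts pixel coordinates into actuator commands.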
EXAMPLE III
Fig. 7 shows a positioning and guiding method provided in an embodiment of the present application, applied to the positioning and guiding device in the configuration that further includes a rotating structure, with the reflection part connected to the bearing table through that rotating structure. As shown in fig. 7, step S201 specifically includes:
s301: generating a first rotation control signal, and controlling the rotating structure to rotate by a first preset angle based on the first rotation control signal so that the reflecting part is opposite to the first tested part;
s302: generating a first image capturing signal, and controlling the first image sensor to capture a first image of the first component under test through reflection of the reflection part based on the first image capturing signal;
s303: acquiring a first image of the first part under test from the first image sensor;
wherein a second part under test is also placed on the bearing table, and the positioning and guiding method further includes:
s304: generating a second rotation control signal, and controlling the rotating structure to rotate by a second preset angle based on the second rotation control signal so that the reflecting part is opposite to the second tested part;
s305: generating a second image capturing signal, and controlling the first image sensor to capture a second image of the second component under test by reflection of the reflection part based on the second image capturing signal;
s306: a second image of the second part under test is acquired from the first image sensor.
In this embodiment, the reflection part is connected with the bearing table through a rotating structure, and the first image sensor respectively acquires a first image of the first tested component and a second image of the second tested component.
Specifically, the first image sensor is disposed between the first part under test and the second part under test. When the processor generates a first rotation control signal, the rotating structure rotates by a first preset angle in response, so that the reflecting surface of the reflection part faces the first part under test at an angle of 45 degrees to the horizontal plane. When the processor generates a first image capturing signal, the first image sensor acquires a first image of the first part under test via reflection by the reflection part. After the first image is acquired, the processor generates a second rotation control signal, and the rotating structure rotates by a second preset angle in response, so that the reflecting surface of the reflection part faces the second part under test, again at 45 degrees to the horizontal plane. When the processor generates a second image capturing signal, the first image sensor acquires a second image of the second part under test via reflection by the reflection part; the second image is a side view of the second part under test. Using a single image sensor to acquire images of both objects under test improves the space utilization of the device.
Based on the same application concept, referring to fig. 8, the structure of an electronic device 400 provided in an embodiment of the present application is shown. The electronic device 400 includes: at least one processor 401, at least one network interface 404 or other user interface 403, a memory 405, and at least one communication bus 402. The communication bus 402 is used to enable communication among these components. The electronic device 400 optionally includes a user interface 403 with a display (e.g., a touchscreen, LCD, CRT, holographic display, or projector), a keyboard, or a pointing device (e.g., a mouse, trackball, touchpad, or touchscreen).
Memory 405 may include both read-only memory and random-access memory and provides instructions and data to processor 401. A portion of the memory 405 may also include non-volatile random access memory (NVRAM).
In some embodiments, memory 405 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof:
an operating system 4051, which contains various system programs, for implementing various basic services and processing hardware-based tasks;
the application programs 4052 include various application programs such as a desktop (launcher), a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services.
In an embodiment of the present application, the processor 401 is configured to execute the steps of the positioning and guiding method provided in any of the above embodiments by calling a program or instructions stored in the memory 405.
Based on the same application concept, embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the positioning and guiding method provided by the foregoing embodiments are performed.
In particular, the storage medium can be a general-purpose storage medium, such as a removable disk, a hard disk, or the like, and when executed, the computer program on the storage medium can execute the above positioning and guiding method.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A positioning guide device, characterized in that it comprises:
the reflection part is arranged on the bearing table, a first tested component is further placed on the bearing table, and the center of the reflection part and the center of the first tested component are located on the same horizontal plane;
a first image sensor disposed in a vertical direction of the stage, a lens center of the first image sensor being in a same vertical plane as a center of the reflection part, the first image sensor capturing a first image of the first component under test by reflection of the reflection part;
a processor for acquiring a first image of the first measured component from the first image sensor, recognizing the first image to obtain pixel coordinates of a plurality of detection points on the first measured component, and generating a motion control signal based on the optical parameter of the reflection part and the pixel coordinates of the plurality of detection points;
and the executing mechanism responds to the motion control signal generated by the processor to execute corresponding action for the first tested part.
2. The positioning and guiding device according to claim 1, wherein a first distance between a center of the reflecting portion and a center of the first measured member and a second distance between a lens center of the first image sensor and the center of the reflecting portion are determined based on imaging parameters of the first image sensor.
3. The device of claim 1, further comprising a rotating structure, wherein the reflecting part is fixedly disposed on the carrier or the reflecting part is connected to the carrier through the rotating structure.
4. The positioning guide according to claim 3, further comprising:
a first light source arranged between the first measured member and the reflection portion to vertically irradiate the first measured member;
a second light source disposed between a second measured member and the reflection part to vertically irradiate the second measured member.
5. The positioning guide according to claim 1, further comprising:
and the second image sensor is arranged in the vertical direction of the bearing table, the lens center of the second image sensor and the center of the first part to be measured are positioned on the same vertical plane, and the second image sensor captures a third image of the first part to be measured.
6. The positioning and guiding device of claim 1, wherein the reflecting surface of the reflecting portion is at an angle of 45 degrees to the horizontal.
7. A positioning and guiding method for the positioning and guiding device according to claim 1, the positioning and guiding method comprising:
acquiring a first image of the first part under test from the first image sensor;
identifying the first image to obtain pixel coordinates of a plurality of detection points on the first tested part;
generating a motion control signal based on the optical parameter of the reflection part and the pixel coordinates of the plurality of detection points;
and controlling the actuating mechanism to execute corresponding actions aiming at the first tested part according to the motion control signal.
8. The positioning and guiding method according to claim 7, wherein the positioning and guiding device further comprises a rotating structure, and the reflecting part is connected with the bearing table through the rotating structure;
wherein the step of acquiring a first image of the first component under test from the first image sensor comprises:
generating a first rotation control signal, and controlling the rotating structure to rotate by a first preset angle based on the first rotation control signal so that the reflecting part is opposite to the first tested part;
generating a first image capturing signal, and controlling the first image sensor to capture a first image of the first component under test through reflection of the reflection part based on the first image capturing signal;
acquiring a first image of the first part under test from the first image sensor;
wherein a second part under test is also placed on the bearing table, and the positioning and guiding method further includes:
generating a second rotation control signal, and controlling the rotating structure to rotate by a second preset angle based on the second rotation control signal so that the reflecting part is opposite to the second tested part;
generating a second image capturing signal, and controlling the first image sensor to capture a second image of the second component under test by reflection of the reflection part based on the second image capturing signal;
a second image of the second part under test is acquired from the first image sensor.
9. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is operated, the machine-readable instructions being executable by the processor to perform the steps of the method of claim 7 or 8.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, performs the steps of the method as claimed in claim 7 or 8.
CN202110443057.9A 2021-04-23 2021-04-23 Positioning guide device and method and electronic equipment Pending CN113172624A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110443057.9A CN113172624A (en) 2021-04-23 2021-04-23 Positioning guide device and method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113172624A true CN113172624A (en) 2021-07-27

Family

ID=76924527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110443057.9A Pending CN113172624A (en) 2021-04-23 2021-04-23 Positioning guide device and method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113172624A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040233280A1 (en) * 2003-05-19 2004-11-25 Honda Motor Co., Ltd. Distance measurement apparatus, distance measurement method, and distance measurement program
US20100220289A1 (en) * 2007-07-30 2010-09-02 Austen Hearn Optical Alignment Apparatus and Method Therefor
US20150253185A1 (en) * 2014-03-07 2015-09-10 Google Inc. Measuring parallelism in lightguide surfaces
CN107560544A (en) * 2017-09-12 2018-01-09 上海大学 One kind is used for robot hole positioning and normal direction measurement apparatus and method
CN107889522A (en) * 2015-08-26 2018-04-06 Abb瑞士股份有限公司 Object various visual angles detection device and its method
US20190290123A1 (en) * 2018-03-22 2019-09-26 Santec Corporation Topographical imaging using combined sensing inputs
US20200018869A1 (en) * 2018-07-16 2020-01-16 Faro Technologies, Inc. Laser scanner with enhanced dymanic range imaging
CN209978819U (en) * 2019-07-30 2020-01-21 王亚辉 Detection equipment and detection system
CN111692987A (en) * 2019-03-15 2020-09-22 上海图漾信息科技有限公司 Depth data measuring head, measuring device and measuring method
US20220050281A1 (en) * 2018-12-07 2022-02-17 Leica Microsystems Cms Gmbh Method for automatically determining the position in a sample arrangement and corresponding microscope


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination