WO2022183906A1 - Imaging method and apparatus, device, and storage medium - Google Patents

Imaging method and apparatus, device, and storage medium

Info

Publication number
WO2022183906A1
Authority
WO
WIPO (PCT)
Prior art keywords
space
measured object
aware
dimensional coordinate
coordinate system
Prior art date
Application number
PCT/CN2022/076387
Other languages
English (en)
Chinese (zh)
Inventor
罗迤宝
魏静波
贾蓉
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Publication of WO2022183906A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 - Details
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules

Definitions

  • the embodiments of the present application relate to the field of communications, and in particular, to an imaging method, apparatus, device, and storage medium.
  • In the related art, a laser projection measurement system based on a Lidar module is used to control the Lidar module of an intelligent mobile device to project laser point-shaped spots onto the measured object, and the distance between the camera and the measured object is calculated by direct time-of-flight; imaging is then performed based on the measurement.
  • However, the current Lidar-module-based measurement imaging method has at least the following shortcomings:
  • An embodiment of the present application provides an imaging method, which is applied to a camera device provided with a spatial perception component, and the method includes:
  • the space-aware labels are preset on the outer surface of the measured object, the outer surface of the measured object is provided with at least three of the space-aware labels, and the positions of any two of the space-aware labels do not overlap;
  • Image processing is performed on the measured object according to the standard three-dimensional coordinate system and the picture material.
  • the embodiment of the present application also provides an imaging device, including:
  • the determination module is configured to determine the position information of each space-aware label within the coverage of the space-aware component, where the space-aware labels are preset on the outer surface of the measured object, the outer surface of the measured object is provided with at least three of the space-aware labels, and the positions of any two of the space-aware labels do not overlap;
  • a building module for building a standard three-dimensional coordinate system according to the position information
  • a photographing module used for photographing the measured object to obtain picture material
  • the imaging module is configured to perform imaging processing on the measured object according to the standard three-dimensional coordinate system and the picture material.
  • the embodiment of the present application also provides an imaging device, including:
  • the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the imaging method as described above.
  • Embodiments of the present application further provide a computer-readable storage medium storing a computer program.
  • the computer program when executed by the processor, implements the imaging method described above.
  • FIG. 1 is a flowchart of the imaging method provided by the first embodiment of the present application.
  • FIG. 2 is a schematic diagram of a standard three-dimensional coordinate system constructed based on step 102 in the imaging method of FIG. 1;
  • FIG. 3 is a schematic diagram of the shooting position determined based on step 102 in the imaging method of FIG. 1;
  • FIG. 4 is a schematic diagram of imaging the measured object based on step 104 in the imaging method of FIG. 1;
  • FIG. 5 is a flowchart of the imaging method provided by the second embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an imaging device provided by a third embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of an imaging device provided by a fourth embodiment of the present application.
  • the first embodiment of the present application relates to an imaging method, which is applied to a camera device provided with a spatial perception component.
  • the lidar module needs to be implanted into the camera equipment in advance.
  • the imaging method provided in this embodiment uses spatial perception components with low price and high popularity, such as millimeter wave ranging components, ultrasonic ranging components, Ultra Wide Band (UWB) ranging components, etc.
  • These ranging components offer strong penetration, low power consumption, good anti-interference performance, high security, large spatial capacity, and accurate positioning, enabling 3D modeling to be completed on common terminal devices with cameras, such as mobile phones and tablet computers, without implanting a Lidar module with high power consumption and high cost, thus reducing the hardware threshold of 3D modeling as much as possible and making 3D acquisition accessible to ordinary individual users.
  • In practical applications, the above-listed spatial sensing components can be made as thin labels. Therefore, the spatial perception components required for imaging can be integrated inside the camera device, or directly attached to the outer shell of the camera device in the form of labels, thereby further reducing the hardware threshold of 3D modeling, so that camera equipment already on the market without a spatial awareness component can achieve 3D acquisition by means of an external spatial awareness label.
  • this embodiment takes the imaging method applied to a camera device provided with a UWB ranging component, such as a mobile phone, as an example for specific description.
  • Step 101 Determine the position information of each space-aware label within the coverage of the space-aware component.
  • each space-aware label within the coverage determined by the space-aware component is a space-aware label of the same type that is pre-attached on the outer surface of the measured object.
  • the imaging method of this embodiment is applied to a mobile phone provided with a UWB ranging component as an example, and the spatial awareness sticker pre-attached on the outer surface of the measured object is a UWB ranging sticker.
  • determining the position information of each space-aware label within the coverage of the space-aware component in step 101 specifically follows the process: sending a broadcast (emitting a pulse signal outward) -> ranging (determining the distance between any two space-aware labels/components) -> positioning (determining location information).
  • Specifically, the spatial sensing component broadcasts the pulse signal according to a preset period and searches for each of the spatial sensing labels within the coverage of the pulse signal; then, the identification information pre-allocated to each space-aware label is extracted from the pulse signal response data packet received from that label; finally, the extracted identification information is deduplicated and counted to obtain the number of space-aware labels within the coverage of the space-aware component.
  • Here, the pulse signal response data packet refers to the packet that a spatial sensing label attached to the outer surface of the measured object returns in response to the pulse signal broadcast by the spatial sensing component according to the preset period.
  • For example, suppose the space perception component set on the mobile phone is a UWB ranging label (for convenience of distinction, referred to as the first UWB ranging label), the measured object is a hexahedron, and a UWB ranging label is pre-attached to each outer surface of the hexahedron (for convenience of distinction, the UWB ranging labels attached to the outer surface of the measured object are referred to as the second UWB ranging labels).
  • After the first UWB ranging label and the second UWB ranging labels are powered on, each emits a UWB signal according to the preset cycle. In order to distinguish which second UWB ranging label a received UWB signal comes from, an identification number that uniquely identifies each second UWB ranging label can be assigned in advance, so that after receiving a UWB signal emitted by a second UWB ranging label, the first UWB ranging label can extract the corresponding identification number from the signal, deduplicate the second UWB ranging labels according to these identification numbers, and tally the quantity; the number of second UWB ranging labels within the coverage of the first UWB ranging label can thus be determined to be 6.
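  • As an illustration, the de-duplication and counting step can be sketched in Python as follows; the response-packet layout with an "id" field is an assumption for illustration, not something the patent specifies:

        # Count space-aware labels in coverage by de-duplicating the
        # identification numbers carried in pulse-signal response packets.
        def count_labels_in_coverage(response_packets):
            unique_ids = {pkt["id"] for pkt in response_packets}  # de-duplicate IDs
            return len(unique_ids)

        # Example: six labels on a hexahedron, two of which responded twice.
        packets = [{"id": i} for i in (1, 2, 3, 4, 5, 6, 3, 5)]
        assert count_labels_in_coverage(packets) == 6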
  • the specific types of the spatial sensing components set on the camera equipment and the spatial sensing labels set on the measured object may be millimeter waves, ultrasonic waves, or UWB. Therefore, when determining the above-mentioned first distance and second distance, according to the type of space-aware stickers/components actually set, the corresponding space-aware principle can be selected for ranging.
  • Taking the first UWB ranging label as an example, the first distance S between the first UWB ranging label and any second UWB ranging label on the measured object can be expressed by the following symmetric double-sided two-way ranging formula:

        S = C × [(B1R1 - B1T1) - (B2T1 - B2R1) + (B2R2 - B2T1) - (B1T2 - B1R1)] / 4

  • Here B1 denotes the first UWB ranging label and B2 denotes the second UWB ranging label. B1T1 is the time at which the first label sends a ranging packet to the second label (time T1); B2R1 is the time at which the second label first receives a ranging packet from the first label, i.e., the packet sent at T1; B2T1 is the time at which the second label sends a ranging packet back to the first label; B1R1 is the time at which the first label first receives a ranging packet from the second label, i.e., the packet sent at B2T1; B1T2 is the time at which the first label sends a ranging packet to the second label for the second time (time T2); and B2R2 is the time at which the second label receives, for the second time, a ranging packet from the first label, i.e., the packet sent at T2.
  • C represents the propagation speed of the ranging packets exchanged between the first UWB ranging label and the second UWB ranging label. The divisor is 4 because four transmissions of ranging packets take place between the first and second UWB ranging labels.
  • The second distance between any two second UWB ranging labels can also be calculated based on the above formula.
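  • A minimal sketch of the four-timestamp calculation above in Python (the function and variable names are illustrative; each label timestamps packets with its own local clock, which is exactly why the four-packet exchange is needed):

        C = 299_792_458.0  # propagation speed of the ranging packets (speed of light, m/s)

        def first_distance(b1t1, b2r1, b2t1, b1r1, b1t2, b2r2):
            round1 = b1r1 - b1t1   # round trip measured at the first label
            reply1 = b2t1 - b2r1   # processing delay at the second label
            round2 = b2r2 - b2t1   # round trip measured at the second label
            reply2 = b1t2 - b1r1   # processing delay at the first label
            time_of_flight = (round1 - reply1 + round2 - reply2) / 4.0
            return C * time_of_flight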
  • After the distances are obtained, a corresponding positioning algorithm can be selected according to the specific types of space-aware components/labels in use to determine the location information of each space-aware label.
  • For UWB ranging labels, positioning can be implemented based on typical ultra-wideband positioning algorithms, such as the RSSI (Received Signal Strength Indication) algorithm based on received signal strength, the AOA (Angle of Arrival) algorithm, the TOA (Time of Arrival) algorithm based on signal arrival time, and the TDOA (Time Difference of Arrival) algorithm based on differences in signal arrival times, to determine the position information of each second UWB ranging label on the measured object, i.e., its coordinate point in three-dimensional space.
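  • The patent names RSSI, AOA, TOA and TDOA as candidate algorithms. As an illustrative stand-in (not one of the named algorithms), relative three-dimensional coordinates can also be recovered from the pairwise distances alone using classical multidimensional scaling:

        import numpy as np

        def relative_positions(D, dim=3):
            # D: symmetric (n, n) matrix of pairwise distances between the
            # space-aware labels/component; returns relative coordinates,
            # determined up to rotation, reflection and translation.
            n = D.shape[0]
            J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
            B = -0.5 * J @ (D ** 2) @ J           # Gram matrix of centered points
            w, v = np.linalg.eigh(B)              # eigenvalues in ascending order
            idx = np.argsort(w)[::-1][:dim]       # keep the dim largest eigenpairs
            return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))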
  • For a polyhedral measured object, at least one space-aware label is attached to each outer surface; for an irregular measured object, or a spherical measured object with only one outer surface, at least three space-aware labels are attached to the outer surface at preset intervals.
  • Styles of the space-aware labels include but are not limited to: Parallel Lines, Single Circle, Dot Matrix (dot-matrix light spot), Cross Hair, Concentric Circles, Dots, etc.
  • Step 102 Construct a standard three-dimensional coordinate system according to the position information.
  • Generally, the measured object that needs to undergo three-dimensional imaging is stationary, so the position information of the space-aware labels set on its outer surface, i.e., their coordinates in three-dimensional space, is stable. Therefore, in this embodiment, by constructing a three-dimensional coordinate system from this fixed position information, a stable and invariable standard three-dimensional coordinate system is obtained, so that subsequent imaging based on this standard three-dimensional coordinate system can more accurately and truly restore the measured object.
  • To construct a three-dimensional coordinate system, the position information of at least three space-aware labels is required. That is, when constructing a standard three-dimensional coordinate system based on position information, at least three space-aware labels must be selected from all the space-aware labels set on the measured object as position reference labels, and the standard three-dimensional coordinate system is then constructed according to the position information of the selected position reference labels.
  • this embodiment takes three selected position reference labels as an example, and is described in detail with reference to FIG. 2 :
  • A, B, and C are the selected position reference labels, respectively
  • A', B', and C' are the projections of A, B, and C, respectively; a three-dimensional coordinate system is constructed based on A, B, C, A', B' and C'.
  • the coordinate axis perpendicular to the ground plane is selected as the Z-axis, and the remaining two coordinate axes are designated as the X-axis and the Y-axis respectively, and the standard three-dimensional coordinate system can be obtained.
  • For example, the coordinate axis where position reference label A is located is the Z axis, the coordinate axis where position reference label C is located is the X axis, and the coordinate axis where the projection B' corresponding to position reference label B is located is the Y axis.
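  • Under the reading of FIG. 2 just described, one plausible construction of the axes can be sketched as follows (assumptions: the ground-plane normal is known, Z is the vertical through A's projection A', X points from A' toward C's projection, and Y is taken as Z cross X so the frame is exactly orthonormal even if the labels are not):

        import numpy as np

        def standard_frame(a, c, up=np.array([0.0, 0.0, 1.0])):
            def to_ground(p):                    # project a point onto the ground plane
                return p - np.dot(p, up) * up
            origin = to_ground(np.asarray(a, dtype=float))    # A' in FIG. 2
            z = up / np.linalg.norm(up)                       # Z axis (vertical)
            x = to_ground(np.asarray(c, dtype=float)) - origin
            x = x / np.linalg.norm(x)                         # X axis toward C's projection
            y = np.cross(z, x)                                # Y axis completes the frame
            return origin, x, y, z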
  • For example, when four position reference labels A, B, C and D are selected, four reference three-dimensional coordinate systems can be constructed from different triples of labels: the first from A, B and C; the second from A, B and D; the third from A, C and D; and the fourth from B, C and D.
  • The first one is selected as the initial standard three-dimensional coordinate system, and the remaining three reference three-dimensional coordinate systems serve as calibration three-dimensional coordinate systems.
  • the initial standard three-dimensional coordinate system is calibrated to obtain the standard three-dimensional coordinate system.
  • By means of the calibration coordinate systems, the initial standard three-dimensional coordinate system is continuously aligned and adjusted, and finally a stable and unchanging standard three-dimensional coordinate system is obtained.
  • That is, the standard three-dimensional coordinate system is obtained by calibrating and aligning the three-dimensional coordinate systems constructed from the position information of any three position reference labels, so that subsequent imaging based on this coordinate system can more realistically restore the actual situation of the measured object.
  • In addition, shooting positions need to be determined in order to ensure that the camera equipment can capture every area on the outer surface of the object to be measured.
  • After the standard three-dimensional coordinate system is obtained, several shooting positions corresponding to different areas of the measured object can be determined according to the standard three-dimensional coordinate system and the position information of the three position reference labels used to construct it, so that at each shooting position the camera device can capture picture material of a corresponding area on the outer surface of the measured object, thereby ensuring that complete and accurate 3D imaging of the measured object can be performed based on the standard three-dimensional coordinate system and the obtained picture material.
  • A shooting position is a position at which the imaging device finally shoots the measured object. In order to restore the measured object as faithfully as possible, each determined shooting position must allow the imaging device, when placed there, to capture the characteristic information of the measured object as clearly as possible.
  • Since the label style may be Parallel Lines, Single Circle, Dot Matrix, Cross Hair, Concentric Circles, Dots, etc., the shooting position must be determined such that, when the camera equipment is at this position, it can clearly capture the UWB ranging label on the surface of the measured object corresponding to that shooting position.
  • Since the measured object is usually three-dimensional, in order to ensure that the final 3D imaging effect restores the measured object as truly as possible, more than one shooting position is usually determined; that is, multiple shooting positions can be set around the measured object.
  • The shooting positions are determined from the constructed standard three-dimensional coordinate system and the position information of the at least three position reference labels used in constructing it.
  • the determined shooting positions may only include three groups.
  • As shown in FIG. 3, three groups of shooting positions are determined by respectively using the Z-axis coordinate positions of the position reference labels A, B and C as horizontal planes.
  • each of the shooting tracks is divided at preset intervals, so that several shooting positions can be obtained.
  • a plurality of shooting positions are respectively determined on the first shooting track, the second shooting track and the third shooting track.
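  • A hedged sketch of this track division in Python (the circle center, radius and angular step are assumptions; the patent only speaks of dividing each track at preset intervals):

        import numpy as np

        def shooting_positions(center_xy, radius, track_heights, step_deg=30.0):
            # One horizontal circular track per height (e.g. the Z coordinates
            # of position reference labels A, B and C), divided every step_deg.
            positions = []
            for h in track_heights:
                for ang in np.arange(0.0, 360.0, step_deg):
                    t = np.radians(ang)
                    positions.append((center_xy[0] + radius * np.cos(t),
                                      center_xy[1] + radius * np.sin(t),
                                      h))
            return positions

        # Example: three tracks at the heights of labels A, B and C.
        tracks = shooting_positions((0.0, 0.0), radius=2.0, track_heights=(0.5, 1.0, 1.5))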
  • Step 103 Photograph the measured object to obtain picture material.
  • From step 102, it can be seen that multiple shooting positions are finally determined. Therefore, when shooting the measured object according to the shooting positions, the camera equipment is in fact moved to each shooting position in turn, the measured object is shot, and the picture material of the measured object corresponding to each shooting position is obtained.
  • Because the picture material obtained differs from one shooting position to another, in order to ensure that the obtained picture material clearly reflects the space-aware labels set on the surface of the measured object corresponding to each shooting position, it is necessary, when shooting the measured object, to first determine the position information of the space-aware labels corresponding to each shooting position; then to adjust the shooting angle according to each shooting position and the corresponding label position information; and finally to photograph the measured object at the adjusted shooting angle, obtaining a number of the picture materials.
  • Since the space-aware labels are pre-arranged on the outer surface of the object to be measured, determining the position information of the space-aware labels corresponding to each shooting position specifically means first determining the outer surface of the measured object facing each shooting position, and then determining the position information of the space-aware labels on that outer surface relative to the corresponding shooting position.
  • In practical applications, a visual interactive interface can be provided, the constructed standard three-dimensional coordinate system can be displayed in it, and the intersections of the lines of latitude and longitude of a virtual sphere formed based on the standard three-dimensional coordinate system can be used to determine the shooting positions; the interface then guides the camera device to move to each shooting position and guides the camera's center normal to align with the center of the virtual sphere, after which shooting is performed to obtain the picture material.
  • In addition, the space-aware labels set on the measured object can be chosen in a color that contrasts with the measured object, so that they can be better recognized under strong light, thereby ensuring the final imaging effect.
  • Step 104 Perform imaging processing on the measured object according to the standard three-dimensional coordinate system and the picture material.
  • Specifically, the position information of each space-aware label set on the measured object is first extracted; then, according to the position information of each space-aware label, the outline of the measured object is drawn in the standard three-dimensional coordinate system; finally, the feature information of the measured object is extracted from each piece of picture material, and the outline of the measured object is mapped and rendered.
  • In other words, the location information of the UWB ranging labels on the outer surface of the measured object is determined, the outline of the measured object is drawn in the constructed standard three-dimensional coordinate system according to the determined location information of each UWB ranging label, and finally the characteristic information of the measured object, such as color, is extracted from each piece of picture material and used to map and render the outline drawn in the standard three-dimensional coordinate system; in this way the measured object can be truly and accurately restored, i.e., 3D imaging of the measured object is achieved.
  • Before step 104 is performed, it can also be judged whether the picture material obtained by the current shooting covers every area of the outer surface of the object to be measured.
  • If not, the shooting position corresponding to the unphotographed area of the measured object is determined according to the standard three-dimensional coordinate system, and a position movement prompt is made according to the determined shooting position, for example by showing in the interactive interface the specific shooting position to move to, or by prompting the user by voice to move the camera equipment to that shooting position; the object under test is then shot to obtain the picture material corresponding to the unphotographed area, and finally three-dimensional imaging processing is performed on the measured object based on the obtained picture material and the standard three-dimensional coordinate system.
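  • One simple way to bookkeep this coverage check (an assumption; the patent does not spell out how coverage is tracked) is to treat a surface area as covered once the label attached to it has been clearly captured in at least one picture:

        def uncovered_label_ids(all_label_ids, ids_seen_in_pictures):
            # Returns the identification numbers of labels (and hence areas)
            # for which no picture material has been obtained yet.
            return set(all_label_ids) - set(ids_seen_in_pictures)

        # Example: labels 1-6 on a hexahedron; pictures so far show labels 1-4.
        missing = uncovered_label_ids(range(1, 7), [1, 2, 3, 4])   # {5, 6}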
  • In this way, the imaging method provided in this embodiment constructs a fixed standard three-dimensional coordinate system based on the position information of the space-aware labels fixed on the object to be measured, and performs imaging processing on the object according to the standard three-dimensional coordinate system and the picture material obtained by shooting at the shooting positions; this turns the dynamic photographing, ranging, and imaging process into a relatively static operation, thereby greatly improving the accuracy of the ranging results and imaging results.
  • Moreover, by changing the existing Lidar-module-based ranging imaging to ranging imaging based on spatial perception components or spatial perception labels, the entire imaging process does not need to rely on spot projection and is therefore not subject to interference from ambient brightness, which further ensures the accuracy of the measurement results and imaging results.
  • Furthermore, the imaging method provided by this embodiment does not require a Lidar module to be implanted in the terminal device, nor does it require the spatial awareness component or label to be built into the interior of the terminal device; the component or label can instead be externally attached, so that existing terminal devices without an implanted Lidar module or spatial perception component can realize ranging imaging by directly attaching an external spatial awareness label, which greatly reduces the hardware threshold of the 3D modeling function and is conducive to the popularization of 3D acquisition.
  • a second embodiment of the present application relates to an imaging method.
  • The second embodiment is an improvement on the first embodiment, the main improvement being that, before the measured object is shot to obtain the picture material, it is first determined whether the imaging device is at a determined shooting position; if not, the camera device is guided to move to the shooting position, so as to ensure that the picture material obtained by subsequent shooting more accurately reflects the actual information of the measured object.
  • the imaging method involved in the second embodiment includes the following steps:
  • Step 501 Determine the position information of each space-aware label within the coverage of the space-aware component.
  • Step 502 Construct a standard three-dimensional coordinate system according to the position information, and determine the shooting positions.
  • step 501 and step 502 in this embodiment are substantially the same as step 101 and step 102 in the first embodiment, which will not be repeated here.
  • Step 503 Determine the location information of the spatial awareness component.
  • The determination of the location information of the spatial perception component is roughly the same as that of the spatial perception labels disposed on the outer surface of the measured object: the distance between the spatial perception component and each space-aware label is determined, and the location of the spatial perception component is then determined according to these distances.
  • Step 504 Determine whether the location information of the spatial perception component matches the shooting location.
  • If they match, the operation in step 506 can be performed; otherwise, the remaining shooting positions are traversed and each traversed shooting position is matched against the current position of the spatial perception component. If the comparison determines that the current position of the spatial perception component matches none of the determined shooting positions, the operation in step 505 is performed.
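  • A minimal sketch of this traversal and matching (the tolerance value is an assumption; the patent does not specify a matching threshold):

        import numpy as np

        def match_shooting_position(component_pos, shooting_positions, tol=0.1):
            # Return the index of the first shooting position within tol metres
            # of the spatial perception component, or None (go to step 505).
            p = np.asarray(component_pos, dtype=float)
            for i, s in enumerate(shooting_positions):
                if np.linalg.norm(p - np.asarray(s, dtype=float)) <= tol:
                    return i
            return None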
  • Step 505 Make a position movement prompt, so that the imaging device moves to the shooting position according to the position movement prompt.
  • In practical applications, a visual interactive interface is provided directly to users. Therefore, when it is determined that the position information of the spatial perception component does not match any shooting position, the position movement prompt can be displayed directly on the visual interactive interface.
  • Because the spatial perception component is integrated in or attached to the camera device, its position information can be regarded as the camera device's position information. Therefore, the position movement prompt displayed in the visual interactive interface may be, for example, "the current camera device is not at a shooting position, please move".
  • The position movement prompt can also be refined to indicate in which direction and how far to move, so that the user holding the camera device can move it according to the prompt.
  • In addition, the distance between the current position of the imaging device and the shooting position can be displayed in the visual interaction interface, so as to precisely guide the user to the shooting position.
  • When the camera device is an intelligent machine such as a drone or an intelligent robot, the position movement prompt may directly be the coordinate information of the specific shooting position to which the device needs to move.
  • Step 506 Photograph the measured object according to the photographing position to obtain picture material.
  • Step 507 Perform imaging processing on the measured object according to the standard three-dimensional coordinate system and the picture material.
  • step 506 and step 507 in this embodiment are substantially the same as step 103 and step 104 in the first embodiment, which will not be repeated here.
  • With the imaging method provided in this embodiment, before the measured object is photographed according to the shooting position to obtain the picture material, it is first determined whether the imaging device is at the determined shooting position; if not, the camera device is guided to move there, ensuring that the picture material obtained by subsequent shooting more accurately reflects the actual information of the measured object and thus making the final imaging result, obtained based on the standard three-dimensional coordinate system and the picture material, more accurate.
  • the third embodiment of the present application relates to an imaging device, as shown in FIG. 6 , including: a determination module 601 , a construction module 602 , a photographing module 603 and an imaging module 604 .
  • The determining module 601 is configured to determine the position information of each space-aware label within the coverage of the space-aware component, where the space-aware labels are preset on the outer surface of the measured object, the outer surface of the measured object is provided with at least three of the space-aware labels, and the positions of any two of the space-aware labels do not overlap;
  • the building module 602 is used for building a standard three-dimensional coordinate system according to the position information; the photographing module 603 is used for photographing the measured object to obtain picture material;
  • the imaging module 604 is used for imaging the measured object according to the standard three-dimensional coordinate system and the picture material.
  • the determining module 601 determines the position information of each space-aware sticker within the coverage of the space-aware component, specifically:
  • the location information of each of the space-aware stickers is determined.
  • the determining module 601 determines the number of space-aware stickers within the coverage of the space-aware component, specifically:
  • the spatial sensing component broadcasts a pulse signal outward according to a preset period, and searches for each of the spatial sensing labels within the coverage of the pulse signal;
  • the extracted identification information is de-duplicated, and the de-duplicated identification information is counted to obtain the number of space-aware stickers within the coverage of the space-aware component.
  • When the construction module 602 constructs a standard three-dimensional coordinate system according to the position information, specifically:
  • the shooting position is determined according to the standard three-dimensional coordinate system and the position information of the at least three position reference stickers.
  • the specific steps are:
  • N is an integer greater than or equal to 3;
  • the initial standard three-dimensional coordinate system is calibrated to obtain the standard three-dimensional coordinate system.
  • the imaging device further includes a location determination module.
  • a position determination module configured to determine a number of shooting positions according to the standard three-dimensional coordinate system and the position information of at least three position reference labels; wherein, each shooting position corresponds to the measured object an area of the outer surface.
  • the camera module 603 shoots the measured object to obtain the picture material, specifically, the camera module 603 shoots the measured object according to each of the shooting positions to obtain the picture material.
  • the specific steps are:
  • Each of the shooting tracks is divided at preset intervals to obtain several shooting positions.
  • the specific steps are:
  • the imaging device further includes: a position matching module.
  • the position matching module is used to perform the following operations:
  • the imaging module 604 performs imaging processing on the measured object according to the standard three-dimensional coordinate system and the picture material, specifically:
  • the feature information of the measured object is extracted from each of the picture materials, and the outline of the measured object is mapped and rendered.
  • the imaging device further includes: a picture material checking module.
  • the picture material checking module is used to judge whether the picture material covers each area of the outer surface of the measured object.
  • if not, trigger the position determination module to perform the operation of determining, according to the standard three-dimensional coordinate system, the shooting position corresponding to the uncaptured area of the measured object, and trigger the shooting module 603 to perform the shooting operation according to that shooting position;
  • or make a position movement prompt, so that the camera device moves to the shooting position according to the position movement prompt to shoot the measured object, and obtains the picture material corresponding to the unphotographed area of the measured object.
  • this embodiment is a device embodiment corresponding to the first or second embodiment, and this embodiment can be implemented in cooperation with the first or second embodiment.
  • the related technical details mentioned in the first or second embodiment are still valid in this embodiment, and are not repeated here in order to reduce repetition.
  • the related technical details mentioned in this embodiment can also be applied in the first or second embodiment.
  • a logical unit may be a physical unit, a part of a physical unit, or multiple physical units.
  • A unit may also be implemented by a combination of multiple physical units. In order to highlight the innovative part of the present application, this embodiment does not introduce units that are not closely related to solving the technical problem raised by the present application, but this does not mean that no other units exist in this embodiment.
  • the fourth embodiment of the present application relates to an imaging device, as shown in FIG. 7 , comprising: at least one processor 701 ; and a memory 702 communicatively connected with the at least one processor 701 ; wherein the memory 702 stores information that can be Instructions executed by the at least one processor 701, the instructions are executed by the at least one processor 701, so that the at least one processor 701 can execute the imaging method described in the above method embodiments.
  • the memory 702 and the processor 701 are connected by a bus, and the bus may include any number of interconnected buses and bridges, and the bus connects one or more processors 701 and various circuits of the memory 702 together.
  • the bus may also connect together various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and therefore will not be described further herein.
  • the bus interface provides the interface between the bus and the transceiver.
  • a transceiver may be a single element or multiple elements, such as multiple receivers and transmitters, providing a means for communicating with various other devices over a transmission medium.
  • the data processed by the processor 701 is transmitted on the wireless medium through the antenna, and further, the antenna also receives the data and transmits the data to the processor 701 .
  • Processor 701 is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interface, voltage regulation, power management, and other control functions.
  • the memory 702 may be used to store data used by the processor 701 in performing operations.
  • the fifth embodiment of the present application relates to a computer-readable storage medium storing a computer program.
  • the imaging method described in the above method embodiments is implemented when the computer program is executed by the processor.
  • The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of the present application relate to the field of communications. Disclosed are an imaging method and apparatus, a device, and a storage medium. In the present application, the imaging method is applied to a photographing device provided with a spatial awareness component. The method comprises the following steps: determining position information of each spatial awareness label within the coverage of the spatial awareness component, the spatial awareness labels being arranged in advance on the outer surface of a measured object, the outer surface of the measured object being provided with at least three spatial awareness labels, and the positions of any two of the spatial awareness labels not overlapping; constructing a standard three-dimensional coordinate system according to the position information; photographing the measured object to obtain picture material; and performing imaging processing on the measured object according to the standard three-dimensional coordinate system and the picture material.
PCT/CN2022/076387 2021-03-03 2022-02-15 Imaging method and apparatus, device, and storage medium WO2022183906A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110235434.XA CN115037866A (zh) Imaging method, apparatus, device and storage medium
CN202110235434.X 2021-03-03

Publications (1)

Publication Number Publication Date
WO2022183906A1 2022-09-09

Family

ID=83118042

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/076387 WO2022183906A1 Imaging method and apparatus, device, and storage medium

Country Status (2)

Country Link
CN (1) CN115037866A (fr)
WO (1) WO2022183906A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101509763A (zh) * 2009-03-20 2009-08-19 天津工业大学 Monocular high-precision three-dimensional digital measurement system for large objects and measurement method thereof
US20140286536A1 (en) * 2011-12-06 2014-09-25 Hexagon Technology Center Gmbh Position and orientation determination in 6-dof
CN104217554A (zh) * 2014-09-19 2014-12-17 武汉理工大学 Student healthy learning posture reminding system and method
CN109490825A (zh) * 2018-11-20 2019-03-19 武汉万集信息技术有限公司 Positioning and navigation method, apparatus, device, system and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117326172A (zh) * 2023-11-01 2024-01-02 东莞市新立智能设备有限公司 Adaptive labeling adjustment method and system for a labeling machine, and automatic labeling machine
CN117326172B (zh) 2023-11-01 2024-05-24 东莞市新立智能设备有限公司 Adaptive labeling adjustment method and system for a labeling machine, and automatic labeling machine

Also Published As

Publication number Publication date
CN115037866A (zh) 2022-09-09

Similar Documents

Publication Publication Date Title
US10984554B2 (en) Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
US9761049B2 (en) Determination of mobile display position and orientation using micropower impulse radar
CN109801374B Method, medium and system for reconstructing a three-dimensional model from multi-angle image sets
CN109559349B Method and apparatus for calibration
US11380016B2 (en) Fisheye camera calibration system, method and electronic device
CN110136207B Fisheye camera calibration system, method, apparatus, electronic device and storage medium
US20210231810A1 (en) Camera apparatus
US11514608B2 (en) Fisheye camera calibration system, method and electronic device
US20190012809A1 (en) Stereo vision measuring system and stereo vision measuring method
CN106408614B Camera intrinsic parameter calibration method and system suitable for field application
CN112423191B Video call device and audio gain method
WO2022183906A1 Imaging method and apparatus, device, and storage medium
CN111354037A Positioning method and system
CN114088012B Compensation method and apparatus for a measuring device, three-dimensional scanning system, and storage medium
CN115359130A Joint calibration method and apparatus for radar and camera, electronic device, and storage medium
CN107564051A Depth information acquisition method and system
WO2022088613A1 Robot positioning method and apparatus, device, and storage medium
WO2022227875A1 Three-dimensional imaging method, apparatus and device, and storage medium
CN107229055B Mobile device positioning method and mobile device positioning apparatus
CN110163922B Fisheye camera calibration system, method, apparatus, electronic device and storage medium
CN116091701A Three-dimensional reconstruction method, apparatus, computer device, and storage medium
CN115334247A Camera module calibration method, visual positioning method, apparatus, and electronic device
CN107478227B Positioning algorithm for interactive large spaces
JP7328382B2 Wearable device
CN109922331B Image processing method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22762372

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24/01/2024)