CN116152045A - Vehicle joint projection boundary determination method and device, terminal equipment and storage medium - Google Patents


Publication number
CN116152045A
CN116152045A (application CN202111373865.9A)
Authority
CN
China
Prior art keywords
boundary line
projection area
vehicle
projection
boundary
Prior art date
Legal status
Pending
Application number
CN202111373865.9A
Other languages
Chinese (zh)
Inventor
丁磊
李坤显
吕建明
Current Assignee
Human Horizons Shanghai Internet Technology Co Ltd
Original Assignee
Human Horizons Shanghai Internet Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Human Horizons Shanghai Internet Technology Co Ltd filed Critical Human Horizons Shanghai Internet Technology Co Ltd
Priority to CN202111373865.9A
Publication of CN116152045A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/06 - Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/12 - Edge-based segmentation
    • G06T7/13 - Edge detection
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a vehicle joint projection boundary determination method, a device, terminal equipment and a storage medium, wherein the method comprises the following steps: acquiring a vehicle projection area of each vehicle in a fleet, the fleet comprising N vehicles, N being a positive integer greater than 1, and the N vehicles forming a joint projection; obtaining a first boundary line and a third boundary line of the joint projection area according to the N vehicle projection areas, so as to maximize the width of the joint projection area; and obtaining a second boundary line and a fourth boundary line of the joint projection area according to the first boundary line and the third boundary line of the joint projection area and the N vehicle projection areas, so as to maximize the height of the joint projection area; wherein the joint projection area and each vehicle projection area divide the first boundary line, the second boundary line, the third boundary line and the fourth boundary line in the same order. The method can determine the boundary of the joint projection area.

Description

Vehicle joint projection boundary determination method and device, terminal equipment and storage medium
Technical Field
The present invention relates to the field of projection technologies, and in particular, to a method and apparatus for determining a joint projection boundary of a vehicle, a terminal device, and a storage medium.
Background
With the continuous improvement of people's living standards, the requirements on vehicle configuration are getting higher and higher: a vehicle is no longer only a means of transport, but also recreation and entertainment equipment when going out. In the process of using a vehicle, people not only need the vehicle to play audio, but also need it to play pictures and videos, for example to watch movies or videos when resting in the vehicle.
Currently, a vehicle can project simple images, for example onto a building surface, through DLP (Digital Light Processing) headlamps. Meanwhile, when a fleet of vehicles performs joint projection, in order to ensure the joint projection effect and prevent the projection material from exceeding the joint projection area, the boundary of the joint projection area needs to be determined.
Disclosure of Invention
The invention provides a vehicle joint projection boundary determining method, a vehicle joint projection boundary determining device, terminal equipment and a storage medium, so as to determine the boundary of a joint projection area.
In order to solve the above technical problem, an embodiment of the present invention provides a method for determining a joint projection boundary of a vehicle, including:
acquiring a vehicle projection area of each vehicle in a motorcade; the motorcade comprises N vehicles, N is a positive integer greater than 1, and N vehicles form combined projection;
Obtaining a first boundary line and a third boundary line of a combined projection area according to N vehicle projection areas so as to maximize the width of the combined projection area;
obtaining a second boundary line and a fourth boundary line of the combined projection area according to the first boundary line, the third boundary line and the N vehicle projection areas of the combined projection area so as to maximize the height of the combined projection area; wherein the joint projection area and each vehicle projection area divide the first boundary line, the second boundary line, the third boundary line, and the fourth boundary line in the same order;
and determining the whole boundary of the joint projection area according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area.
Preferably, the obtaining a first boundary line and a third boundary line of the joint projection area according to the N vehicle projection areas includes:
when N is smaller than the preset number or N is an odd number, sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain a first boundary line of the 1 st vehicle projection area and a third boundary line of the N th vehicle projection area; wherein the joint projection area and each vehicle projection area take a first vertical boundary line as a first boundary line and a second vertical boundary line as a third boundary line;
and determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area, and determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
Preferably, the obtaining the second boundary line and the fourth boundary line of the joint projection area according to the first boundary line, the third boundary line and the N vehicle projection areas of the joint projection area includes:
sequencing the second boundary lines of the vehicle projection areas according to the height of the horizontal position, and determining the lowest boundary line in the second boundary lines of the vehicle projection areas as the second boundary line of the joint projection areas;
sequencing the fourth boundary lines of the vehicle projection areas according to the height of the horizontal position, and determining the highest boundary line in the fourth boundary lines of the vehicle projection areas as the fourth boundary line of the combined projection areas; the combined projection area and each vehicle projection area take the first horizontal boundary line as a second boundary line and take the second horizontal boundary line as a fourth boundary line.
Preferably, the obtaining a first boundary line and a third boundary line of the joint projection area according to the N vehicle projection areas includes:
When N is greater than or equal to the preset number and is an even number, judging whether any 4 adjacent vehicle projection areas satisfy adjacent intersection and whether at least one vehicle projection area intersects with the other three vehicle projection areas;
if yes, judging that the fleet performs multi-dimensional joint projection, and sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain the first boundary line of the 1st vehicle projection area, the first boundary line of the 2nd vehicle projection area, the third boundary line of the (N-1)th vehicle projection area and the third boundary line of the Nth vehicle projection area;
determining one of the first boundary line of the 1st vehicle projection area and the third boundary line of the 2nd vehicle projection area, whichever is close to the Nth vehicle projection area, as the first boundary line of the joint projection area;
determining one of the third boundary line of the (N-1)th vehicle projection area and the first boundary line close to the 1st vehicle projection area as the third boundary line of the joint projection area;
if not, judging that the fleet performs horizontal joint projection, and sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain the first boundary line of the 1st vehicle projection area and the third boundary line of the Nth vehicle projection area;
and determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area, and determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
Preferably, after determining that the fleet performs multi-dimensional joint projection, the method further includes:
when the second boundary line of the vehicle projection area is positioned between the first boundary line and the third boundary line of the combined projection area and is adjacent to the non-projection area, determining the second boundary line of the vehicle projection area as a second boundary line to be selected;
sequencing the second boundary lines to be selected according to the height of the horizontal position, and determining the lowest boundary line in the second boundary lines to be selected as the second boundary line of the joint projection area;
when the fourth boundary line of the vehicle projection area is positioned between the first boundary line and the third boundary line of the combined projection area and is adjacent to the non-projection area, determining the fourth boundary line of the vehicle projection area as a fourth boundary line to be selected;
and sequencing the fourth boundary lines to be selected according to the height of the horizontal position, and determining the highest boundary line in the fourth boundary lines to be selected as the fourth boundary line of the joint projection area.
Preferably, the method further comprises:
establishing a two-dimensional coordinate system by taking a first vertex of a 1 st vehicle projection area as an origin to obtain an ordinate of a second boundary line and an ordinate of a fourth boundary line of the vehicle projection area;
sequencing the second boundary lines of the vehicle projection areas according to the ordinate size of the second boundary lines of the vehicle projection areas to obtain the lowest boundary line in the second boundary lines of the vehicle projection areas, and determining the lowest boundary line as the second boundary line of the joint projection area;
and sequencing the fourth boundary lines of the vehicle projection areas according to the ordinate sizes of the fourth boundary lines of the vehicle projection areas to obtain the highest boundary line in the fourth boundary lines of the vehicle projection areas, and determining the highest boundary line as the fourth boundary line of the combined projection areas.
Preferably, after determining that the fleet performs multi-dimensional joint projection, the method further includes:
establishing a two-dimensional coordinate system by taking a first vertex of a 1 st vehicle projection area as an origin to obtain an abscissa of a first boundary line and an abscissa of a third boundary line of the vehicle projection area;
judging whether the abscissa of the first boundary line of the 1st vehicle projection area is smaller than the abscissa of the first boundary line of the 2nd vehicle projection area;
if yes, determining the first boundary line of the 2nd vehicle projection area as the first boundary line of the joint projection area;
if not, determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area;
judging whether the abscissa of the third boundary line of the (N-1)th vehicle projection area is smaller than the abscissa of the third boundary line of the Nth vehicle projection area;
if yes, determining the third boundary line of the (N-1)th vehicle projection area as the third boundary line of the joint projection area;
if not, determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
In a second aspect, the present invention provides a vehicle joint projection boundary determination apparatus, including:
the projection area acquisition module is used for acquiring a vehicle projection area of each vehicle in the motorcade; the motorcade comprises N vehicles, N is a positive integer greater than 1, and N vehicles form combined projection;
the vertical boundary determining module is used for obtaining a first boundary line and a third boundary line of the combined projection area according to the N vehicle projection areas so as to maximize the width of the combined projection area;
The horizontal boundary determining module is used for obtaining a second boundary line and a fourth boundary line of the combined projection area according to the first boundary line, the third boundary line and the N vehicle projection areas of the combined projection area so as to maximize the height of the combined projection area; wherein the joint projection area and each vehicle projection area divide the first boundary line, the second boundary line, the third boundary line, and the fourth boundary line in the same order;
and the whole boundary determining module is used for determining the whole boundary of the joint projection area according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area.
In a third aspect, the present invention further provides a terminal device, including a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, where the processor implements the vehicle joint projection boundary determination method according to any one of the above when executing the computer program.
In a fourth aspect, the present invention further provides a computer readable storage medium, where the computer readable storage medium includes a stored computer program, and when the computer program runs, a device on which the computer readable storage medium is located is controlled to execute the vehicle joint projection boundary determination method according to any one of the above.
Compared with the prior art, the invention has the following beneficial effects:
the method comprises the steps of obtaining a vehicle projection area of each vehicle in a motorcade, obtaining a first boundary line and a third boundary line of the combined projection area so as to maximize the width of the combined projection area, and obtaining a second boundary line and a fourth boundary line of the combined projection area so as to maximize the height of the combined projection area. By the method, the maximum boundary of the joint projection area can be determined while the joint projection effect is ensured. When the method is applied specifically, a user can timely adjust or change the projection materials according to the maximum boundary, and the projection materials are prevented from exceeding the joint projection area.
Drawings
FIG. 1 is a schematic flow chart of a method for determining a joint projection boundary of a vehicle according to a first embodiment of the present invention;
FIG. 2 is a schematic illustration of a fleet of vehicles performing horizontal joint projection in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a multi-dimensional joint projection of a fleet of vehicles in an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a vehicle joint projection boundary determining apparatus according to a second embodiment of the present invention.
Detailed Description
The following describes the technical solutions in the embodiments of the present invention clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, rather than all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort shall fall within the scope of the present invention.
Referring to fig. 1, a first embodiment of the present invention provides a vehicle joint projection boundary determination method, including the following steps S11 to S14:
s11, acquiring a vehicle projection area of each vehicle in a motorcade; the motorcade comprises N vehicles, N is a positive integer greater than 1, and N vehicles form combined projection;
s12, obtaining a first boundary line and a third boundary line of a combined projection area according to N vehicle projection areas so as to maximize the width of the combined projection area;
s13, obtaining a second boundary line and a fourth boundary line of the combined projection area according to the first boundary line, the third boundary line and N vehicle projection areas of the combined projection area so as to maximize the height of the combined projection area; wherein the joint projection area and each vehicle projection area divide the first boundary line, the second boundary line, the third boundary line, and the fourth boundary line in the same order;
s14, determining the whole boundary of the joint projection area according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area.
In step S11, the vehicle projection area of each vehicle in the fleet needs to be acquired. In this embodiment, a DLP headlight is provided on the vehicle, and the DLP headlight can project images. Meanwhile, a camera is also arranged on the vehicle, and the camera collects data about the vehicle projection area based on a motion imaging technique. Each vehicle transmits the data about its vehicle projection area to the terminal device, and the terminal device can reproduce, to scale, the vehicle projection areas in the fleet and the intersection states between them according to the received data. The size of the projected area of a vehicle depends on the model of the DLP headlight; for example, the size of the projectable area may be 1 x 1. Of course, in other embodiments, the DLP headlight may be mounted at other locations on the vehicle, and the invention is not limited in this regard.
Illustratively, the fleet includes N vehicles, N being a positive integer greater than 1, and the N vehicles form a joint projection. When the fleet performs joint projection, the projection material can be segmented to obtain N projection material fragments, and the N fragments are sent to the N vehicles respectively. Each vehicle projects the received projection material fragment through its DLP headlight, and the projections of the N vehicles are spliced together to finally form the complete projection of the projection material.
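As an illustration of this segmentation step, the following minimal sketch (not taken from the patent itself) splits a projection material image into N vertical strips, one per vehicle; the function name split_material, the use of NumPy and the image size are assumptions made only for the example.

    import numpy as np

    def split_material(material: np.ndarray, n_vehicles: int) -> list:
        """Split an H x W (x C) material image into n_vehicles vertical strips, one per vehicle."""
        return np.array_split(material, n_vehicles, axis=1)

    material = np.zeros((1080, 1920, 3), dtype=np.uint8)   # placeholder projection material
    fragments = split_material(material, 3)                 # one fragment is sent to each of 3 vehicles
    print([f.shape for f in fragments])                      # three 1080 x 640 strips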
In step S12, according to the N vehicle projection areas, a first boundary line and a third boundary line of a joint projection area are obtained so as to maximize the width of the joint projection area. In determining the projection area, the width of the joint projection area, i.e. the first boundary line and the third boundary line, which are vertical boundary lines, is generally determined first.
In step S13, a second boundary line and a fourth boundary line of the joint projection area are obtained according to the first boundary line, the third boundary line, and the N vehicle projection areas of the joint projection area, so as to maximize the height of the joint projection area. In determining the height of the joint projection region, the effect of the joint projection needs to be simultaneously considered.
In step S14, the overall boundary of the joint projection area is determined according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area. In this embodiment, the vehicle projection areas are all square, and the closed rectangular area enclosed by the first boundary line, the second boundary line, the third boundary line and the fourth boundary line is the joint projection area.
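To make the representation concrete, the sketch below models a projection area as an axis-aligned rectangle described by its four boundary lines, with the joint projection area obtained in the same form in step S14; the class and field names, the 1 x 1 size and the downward-increasing ordinate are assumptions for illustration only.

    from dataclasses import dataclass

    @dataclass
    class ProjectionArea:
        first: float    # abscissa of the first (vertical) boundary line
        second: float   # ordinate of the second (horizontal) boundary line
        third: float    # abscissa of the third (vertical) boundary line
        fourth: float   # ordinate of the fourth (horizontal) boundary line

        def corners(self):
            """Vertices of the closed rectangular area enclosed by the four boundary lines."""
            return [(self.first, self.second), (self.third, self.second),
                    (self.third, self.fourth), (self.first, self.fourth)]

    # A 1 x 1 vehicle projection area whose first vertex lies at the origin
    # (the ordinate is taken to increase downward, as in the figures).
    area = ProjectionArea(first=0.0, second=0.0, third=1.0, fourth=-1.0)
    print(area.corners())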
In this embodiment, the vehicle projection area of each vehicle in the fleet is acquired, then the first boundary line and the third boundary line of the joint projection area are obtained so as to maximize the width of the joint projection area, and then the second boundary line and the fourth boundary line of the joint projection area are obtained so as to maximize the height of the joint projection area. By the method, the maximum boundary of the joint projection area can be determined while the joint projection effect is ensured. When the method is applied specifically, a user can timely adjust or change the projection materials according to the maximum boundary, and the projection materials are prevented from exceeding the joint projection area.
It should be noted that the fleet may perform either horizontal joint projection or multi-dimensional joint projection, depending on the specific number of vehicles. In order to facilitate an understanding of the invention, some preferred embodiments of the invention are further described below.
In one implementation, the fleet performs a horizontal joint projection. Obtaining a first boundary line and a third boundary line of a combined projection area according to the N vehicle projection areas, wherein the method comprises the following steps:
when N is smaller than the preset number or N is an odd number, sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain a first boundary line of the 1 st vehicle projection area and a third boundary line of the N th vehicle projection area; wherein the joint projection area and each vehicle projection area take a first vertical boundary line as a first boundary line and a second vertical boundary line as a third boundary line;
and determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area, and determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
In a specific projection scene, a plane, for example a white building outer wall, needs to be selected as the projection plane for the joint projection of the fleet. After the projection plane is determined, in order to prevent the projection of one vehicle from being blocked by other vehicles, the vehicles in the fleet are arranged in a substantially straight line and all on the same side of the projection plane. At this time, the vehicles can be numbered in sequence according to their position information in the fleet, so that the fleet has a head-end vehicle and a tail-end vehicle. When the vehicles perform horizontal joint projection, the first boundary line of the head-end vehicle is determined as the first boundary line of the joint projection area, and the third boundary line of the tail-end vehicle is determined as the third boundary line of the joint projection area.
Further, the obtaining the second boundary line and the fourth boundary line of the joint projection area according to the first boundary line, the third boundary line and the N vehicle projection areas of the joint projection area includes:
sequencing the second boundary lines of the vehicle projection areas according to the height of the horizontal position, and determining the lowest boundary line in the second boundary lines of the vehicle projection areas as the second boundary line of the joint projection areas;
sequencing the fourth boundary lines of the vehicle projection areas according to the height of the horizontal position, and determining the highest boundary line in the fourth boundary lines of the vehicle projection areas as the fourth boundary line of the combined projection areas; the combined projection area and each vehicle projection area take the first horizontal boundary line as a second boundary line and take the second horizontal boundary line as a fourth boundary line.
Referring to fig. 2, N is equal to 3 for illustration. Specifically, a two-dimensional coordinate system is established with the first vertex of the 1st vehicle projection area as the origin, and the ordinates of the second boundary lines of the vehicle projection areas are 0, Y1 and Y3, respectively, while the ordinates of the fourth boundary lines of the vehicle projection areas are Y2, Y4 and Y5, respectively. In this coordinate system, a larger ordinate corresponds to a lower horizontal position.
Then, the second boundary lines of the vehicle projection areas are sorted according to their ordinates, and the lowest boundary line among the second boundary lines of the vehicle projection areas is obtained and determined as the second boundary line of the joint projection area. Here 0 < Y3 < Y1, which means that the boundary line at Y1 has the lowest horizontal position, so the boundary line at Y1 is determined as the second boundary line of the joint projection area.
Finally, the fourth boundary lines of the vehicle projection areas are sorted according to their ordinates, and the highest boundary line among the fourth boundary lines of the vehicle projection areas is obtained and determined as the fourth boundary line of the joint projection area. Here Y2 < Y5 < Y4, which means that the boundary line at Y2 has the highest horizontal position, so the boundary line at Y2 is determined as the fourth boundary line of the joint projection area. In horizontal joint projection, the finally determined joint projection area is shown as the shaded portion in fig. 2.
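A minimal sketch of this horizontal case is given below, assuming each vehicle projection area is described by a dictionary of its four boundary-line coordinates and, consistent with the example above, that a larger ordinate means a lower position; the field names and numeric values are illustrative assumptions.

    def horizontal_joint_boundary(areas):
        """Boundary of the joint projection area for horizontal joint projection (Fig. 2)."""
        areas = sorted(areas, key=lambda a: a['first'])   # order the vehicles by their first boundary line
        return {
            'first': areas[0]['first'],                   # first boundary line of the 1st area
            'third': areas[-1]['third'],                  # third boundary line of the Nth area
            'second': max(a['second'] for a in areas),    # lowest second boundary line (ordinate grows downward)
            'fourth': min(a['fourth'] for a in areas),    # highest fourth boundary line
        }

    # A fleet of 3 arranged as in Fig. 2 (coordinates are made up for illustration).
    fleet = [
        {'first': 0.0, 'second': 0.0, 'third': 1.0, 'fourth': -1.0},
        {'first': 0.8, 'second': 0.3, 'third': 1.8, 'fourth': -0.7},
        {'first': 1.6, 'second': 0.1, 'third': 2.6, 'fourth': -0.9},
    ]
    print(horizontal_joint_boundary(fleet))   # {'first': 0.0, 'third': 2.6, 'second': 0.3, 'fourth': -1.0}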
In another implementation, the fleet performs multi-dimensional joint projection. The method for obtaining the first boundary line and the third boundary line of the combined projection area according to the N vehicle projection areas comprises the following steps of S21-S26:
S21, when N is larger than or equal to the preset number and is even, judging whether any 4 adjacent vehicle projection areas meet adjacent intersection or not and at least one vehicle projection area is intersected with other three vehicle projection areas;
S22, if yes, judging that the fleet performs multi-dimensional joint projection, and sorting the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain the first boundary line of the 1st vehicle projection area, the first boundary line of the 2nd vehicle projection area, the third boundary line of the (N-1)th vehicle projection area and the third boundary line of the Nth vehicle projection area;
S23, determining one of the first boundary line of the 1st vehicle projection area and the third boundary line of the 2nd vehicle projection area, whichever is close to the Nth vehicle projection area, as the first boundary line of the joint projection area;
S24, determining one of the third boundary line of the (N-1)th vehicle projection area and the first boundary line close to the 1st vehicle projection area as the third boundary line of the joint projection area;
S25, if not, judging that the fleet performs horizontal joint projection, and sorting the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain the first boundary line of the 1st vehicle projection area and the third boundary line of the Nth vehicle projection area;
S26, determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area, and determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
The preset number is the minimum number of vehicles required for the fleet to perform multi-dimensional joint projection. In this embodiment, the description is given with a preset number of 4. In other embodiments, the user may set the preset number according to actual needs, for example to 6, 8, and so on.
It should be noted that, because the projection height of the DLP headlight of a vehicle is limited, when multi-dimensional joint projection is performed, the vehicle projection areas can only be overlapped in two rows to achieve a better projection effect. Referring to fig. 3, an example where N is equal to 4 is described below.
In step S21, it is determined whether any 4 adjacent vehicle projection areas satisfy adjacent intersection and whether at least one vehicle projection area intersects with all the other three vehicle projection areas. As shown in fig. 3, the 4 adjacent projection areas intersect adjacently, and the 4th vehicle projection area intersects with the other three projection areas, so it can be determined that the fleet performs multi-dimensional joint projection.
In step S22, if yes, it is determined that the fleet performs multi-dimensional joint projection, and the vehicle projection areas are sorted according to the first boundary line of the vehicle projection areas, so as to obtain the first boundary line of the 1st vehicle projection area, the first boundary line of the 2nd vehicle projection area, the third boundary line of the (N-1)th vehicle projection area, and the third boundary line of the Nth vehicle projection area.
Specifically, referring to fig. 3, a two-dimensional coordinate system is established with the first vertex of the 1st vehicle projection area as the origin, and the abscissa of the first boundary line and the abscissa of the third boundary line of each vehicle projection area are obtained. It is judged whether the abscissa of the first boundary line of the 1st vehicle projection area is smaller than the abscissa of the first boundary line of the 2nd vehicle projection area; if yes, the first boundary line of the 2nd vehicle projection area is determined as the first boundary line of the joint projection area; if not, the first boundary line of the 1st vehicle projection area is determined as the first boundary line of the joint projection area. Then it is judged whether the abscissa of the third boundary line of the (N-1)th vehicle projection area is smaller than the abscissa of the third boundary line of the Nth vehicle projection area; if yes, the third boundary line of the (N-1)th vehicle projection area is determined as the third boundary line of the joint projection area; if not, the third boundary line of the Nth vehicle projection area is determined as the third boundary line of the joint projection area.
Illustratively, in fig. 3, the abscissas of the first boundary lines of the 1st and 2nd vehicle projection areas are 0 and X1, respectively, where 0 < X1, indicating that the first boundary line of the 2nd vehicle projection area is closer to the third boundary line of the 4th vehicle projection area, so the boundary line at X1 is determined as the first boundary line of the joint projection area, i.e., the content of step S23. The abscissas of the third boundary lines of the 3rd and 4th vehicle projection areas are X2 and X3, respectively, where X2 < X3, indicating that the third boundary line of the 3rd vehicle projection area is closer to the first boundary line of the 1st vehicle projection area, so the boundary line at X2 is determined as the third boundary line of the joint projection area, i.e., the content of step S24.
In steps S25-S26, when the condition for multi-dimensional joint projection is not met, it is determined that the fleet is performing horizontal joint projection at this time. It should be noted that, because the projection height of the DLP headlight of a vehicle is limited, when N is smaller than 4 or N is an odd number, only horizontal joint projection can be performed and multi-dimensional joint projection is not applicable; when N is greater than or equal to 4 and is an even number, multi-dimensional joint projection can be performed, and the judgment for horizontal joint projection can also be made. In the case of horizontal joint projection, the first boundary line of the projection area of the 1st vehicle is directly determined as the first boundary line of the joint projection area, and the third boundary line of the projection area of the Nth vehicle is determined as the third boundary line of the joint projection area, which will not be described in detail herein.
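The sketch below illustrates, under the same assumed representation, how the multi-dimensional case of steps S21 to S24 might look in code, following the worked example of fig. 3; the rectangle intersection test and the adjacency criterion are simplified assumptions rather than the patent's exact conditions.

    def rects_intersect(a, b):
        """Axis-aligned overlap test between two vehicle projection areas."""
        ay_lo, ay_hi = sorted((a['second'], a['fourth']))
        by_lo, by_hi = sorted((b['second'], b['fourth']))
        return a['first'] < b['third'] and b['first'] < a['third'] and ay_lo < by_hi and by_lo < ay_hi

    def is_multidimensional(areas, preset=4):
        """Simplified judgment of step S21: adjacent areas intersect and one area meets the other three."""
        n = len(areas)
        if n < preset or n % 2 != 0:
            return False
        areas = sorted(areas, key=lambda a: a['first'])
        adjacent_intersect = all(rects_intersect(areas[i], areas[i + 1]) for i in range(n - 1))
        one_meets_three = any(sum(rects_intersect(a, b) for b in areas if b is not a) >= 3 for a in areas)
        return adjacent_intersect and one_meets_three

    def multidimensional_first_third(areas):
        """First and third boundary lines of the joint projection area (steps S22 to S24)."""
        areas = sorted(areas, key=lambda a: a['first'])      # order the vehicles by first boundary line
        first = max(areas[0]['first'], areas[1]['first'])    # the candidate closer to the Nth area
        third = min(areas[-2]['third'], areas[-1]['third'])  # the candidate closer to the 1st area
        return first, third

For a fleet of three vehicles, for example, is_multidimensional returns False simply because N is smaller than the preset number, so the horizontal branch of steps S25-S26 applies.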
Further, when the fleet performs multi-dimensional joint projection, the method of determining the second boundary line and the fourth boundary line of the joint projection area includes:
when the second boundary line of the vehicle projection area is positioned between the first boundary line and the third boundary line of the combined projection area and is adjacent to the non-projection area, determining the second boundary line of the vehicle projection area as a second boundary line to be selected;
sequencing the second boundary lines to be selected according to the height of the horizontal position, and determining the lowest boundary line in the second boundary lines to be selected as the second boundary line of the joint projection area;
when the fourth boundary line of the vehicle projection area is positioned between the first boundary line and the third boundary line of the combined projection area and is adjacent to the non-projection area, determining the fourth boundary line of the vehicle projection area as a fourth boundary line to be selected;
and sequencing the fourth boundary lines to be selected according to the height of the horizontal position, and determining the highest boundary line in the fourth boundary lines to be selected as the fourth boundary line of the joint projection area.
The non-projection area refers to an area not covered by any vehicle projection. In fig. 3, the boundary line at X1 is the first boundary line of the joint projection area, and the boundary line at X2 is the third boundary line of the joint projection area. The portions of the second boundary lines of the 1st and 3rd vehicle projection areas located between the first boundary line and the third boundary line of the joint projection area are adjacent to the non-projection area, while the portions of the second boundary lines of the 2nd and 4th vehicle projection areas located between the first boundary line and the third boundary line of the joint projection area are not adjacent to the non-projection area, so the second boundary lines of the 1st and 3rd vehicle projection areas are determined as the second boundary lines to be selected. The ordinates of the second boundary lines of the 1st and 3rd vehicle projection areas are 0 and Y2, respectively, and 0 > Y2, meaning that the boundary line at 0 has the lowest horizontal position, so the second boundary line at 0, i.e. the X-axis, is determined as the second boundary line of the joint projection area.
Referring to fig. 3, the portions of the fourth boundary lines of the 2nd and 4th vehicle projection areas located between the first boundary line and the third boundary line of the joint projection area are adjacent to the non-projection area, while the portions of the fourth boundary lines of the 1st and 3rd vehicle projection areas located between the first boundary line and the third boundary line of the joint projection area are not adjacent to the non-projection area, so the fourth boundary lines of the 2nd and 4th vehicle projection areas are determined as the fourth boundary lines to be selected. The ordinates of the fourth boundary lines of the 2nd and 4th vehicle projection areas are Y1 and Y3, respectively, and Y1 < Y3, meaning that the boundary line at Y1 has the highest horizontal position, so the fourth boundary line at Y1 is determined as the fourth boundary line of the joint projection area. In multi-dimensional joint projection, the finally determined joint projection area is shown as the shaded portion in fig. 3.
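A simplified sketch of this second/fourth boundary selection is given below for the two-row layout of fig. 3; instead of testing adjacency to the non-projection area geometrically, it assumes the areas are already grouped into a lower row and an upper row, which, like the field names and coordinates, is an assumption made only for illustration.

    def second_fourth_two_rows(lower_row, upper_row):
        """Second and fourth boundary lines of the joint projection area for a two-row layout."""
        # Candidate second boundary lines border the non-projection area below, i.e. they
        # come from the lower row; the lowest one (largest ordinate, since the ordinate
        # grows downward) is kept as the second boundary line of the joint projection area.
        second = max(a['second'] for a in lower_row)
        # Candidate fourth boundary lines come from the upper row; the highest one
        # (smallest ordinate) is kept as the fourth boundary line.
        fourth = min(a['fourth'] for a in upper_row)
        return second, fourth

    # Fig. 3-style fleet of 4 (made-up coordinates): areas 1 and 3 form the lower row,
    # areas 2 and 4 the upper row.
    lower = [{'first': 0.0, 'second': 0.0, 'third': 1.0, 'fourth': -1.0},
             {'first': 0.9, 'second': -0.1, 'third': 1.9, 'fourth': -1.1}]
    upper = [{'first': 0.2, 'second': -0.8, 'third': 1.2, 'fourth': -1.9},
             {'first': 1.1, 'second': -0.7, 'third': 2.1, 'fourth': -1.8}]
    print(second_fourth_two_rows(lower, upper))   # (0.0, -1.9): the X-axis and the highest fourth boundary line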
Referring to fig. 4, a second embodiment of the present invention provides a vehicle joint projection boundary determination apparatus, including:
the projection area acquisition module is used for acquiring a vehicle projection area of each vehicle in the motorcade; the motorcade comprises N vehicles, N is a positive integer greater than 1, and N vehicles form combined projection;
the vertical boundary determining module is used for obtaining a first boundary line and a third boundary line of the combined projection area according to the N vehicle projection areas so as to maximize the width of the combined projection area;
the horizontal boundary determining module is used for obtaining a second boundary line and a fourth boundary line of the combined projection area according to the first boundary line, the third boundary line and the N vehicle projection areas of the combined projection area so as to maximize the height of the combined projection area; wherein the joint projection area and each vehicle projection area divide the first boundary line, the second boundary line, the third boundary line, and the fourth boundary line in the same order;
And the whole boundary determining module is used for determining the whole boundary of the joint projection area according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area.
Preferably, the vertical boundary determining module includes:
the first sequencing unit is used for sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas when N is smaller than the preset number or N is an odd number, so as to obtain the first boundary line of the 1 st vehicle projection area and the third boundary line of the N th vehicle projection area; wherein the joint projection area and each vehicle projection area take a first vertical boundary line as a first boundary line and a second vertical boundary line as a third boundary line;
a first vertical boundary determining unit configured to determine a first boundary line of the 1 st vehicle projection area as a first boundary line of the joint projection area and determine a third boundary line of the N-th vehicle projection area as a third boundary line of the joint projection area.
Preferably, the horizontal boundary determining module includes:
a first horizontal boundary determining unit configured to rank the second boundary lines of the vehicle projection areas according to the heights of the horizontal positions, and determine a lowest boundary line among the second boundary lines of the vehicle projection areas as a second boundary line of the joint projection area;
A second horizontal boundary determining unit configured to rank fourth boundary lines of the vehicle projection areas according to heights of horizontal positions, and determine a highest boundary line among the fourth boundary lines of the vehicle projection areas as a fourth boundary line of a joint projection area; the combined projection area and each vehicle projection area take the first horizontal boundary line as a second boundary line and take the second horizontal boundary line as a fourth boundary line.
Preferably, the vertical boundary determining module further comprises:
the judging unit is used for judging whether any 4 adjacent vehicle projection areas meet adjacent intersection or not and whether at least one vehicle projection area is intersected with other three vehicle projection areas or not when N is larger than or equal to the preset number and is an even number;
the second sequencing unit is used for, if yes, judging that the fleet performs multi-dimensional joint projection, and sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain the first boundary line of the 1st vehicle projection area, the first boundary line of the 2nd vehicle projection area, the third boundary line of the (N-1)th vehicle projection area and the third boundary line of the Nth vehicle projection area;
a second vertical boundary determining unit configured to determine one of the first boundary line of the 1st vehicle projection area and the third boundary line of the 2nd vehicle projection area, whichever is close to the Nth vehicle projection area, as the first boundary line of the joint projection area;
a third vertical boundary determining unit configured to determine one of the third boundary line of the (N-1)th vehicle projection area and the first boundary line close to the 1st vehicle projection area as the third boundary line of the joint projection area;
the third sequencing unit is used for judging that the motorcade carries out horizontal joint projection if not, sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain the first boundary line of the 1 st vehicle projection area and the third boundary line of the N th vehicle projection area;
a fourth vertical boundary determining unit configured to determine a first boundary line of the 1 st vehicle projection area as a first boundary line of the joint projection area and determine a third boundary line of the N-th vehicle projection area as a third boundary line of the joint projection area.
Preferably, the horizontal boundary determining module further includes:
a first candidate unit configured to determine, when a portion of the second boundary line of the vehicle projection area located between the first boundary line and the third boundary line of the joint projection area is adjacent to the non-projection area, the second boundary line of the vehicle projection area as a second boundary line to be selected;
A third horizontal boundary determining unit, configured to rank the second boundary lines to be selected according to the height of the horizontal position, and determine the lowest boundary line in the second boundary lines to be selected as the second boundary line of the joint projection area;
a second candidate unit configured to determine a fourth boundary line of the vehicle projection area as a fourth boundary line to be selected when a portion of the fourth boundary line of the vehicle projection area between the first boundary line and the third boundary line of the joint projection area is adjacent to a non-projection area;
and the fourth horizontal boundary determining unit is used for sequencing the fourth boundary lines to be selected according to the height of the horizontal position, and determining the highest boundary line in the fourth boundary lines to be selected as the fourth boundary line of the joint projection area.
It should be noted that, the vehicle joint projection boundary determining apparatus provided by the embodiment of the present invention is configured to execute all the flow steps of the vehicle joint projection boundary determining method in the foregoing embodiment, and the working principles and beneficial effects of the two correspond to each other one by one, so that a detailed description is omitted.
According to the device provided by the embodiment, the vehicle projection area of each vehicle in the motorcade is obtained, then the first boundary line and the third boundary line of the combined projection area are obtained so as to maximize the width of the combined projection area, and then the second boundary line and the fourth boundary line of the combined projection area are obtained so as to maximize the height of the combined projection area. By the method, the maximum boundary of the joint projection area can be determined while the joint projection effect is ensured. When the method is applied specifically, a user can timely adjust or change the projection materials according to the maximum boundary, and the projection materials are prevented from exceeding the joint projection area.
The embodiment of the invention also provides a terminal device. The terminal device includes: a processor, a memory, and a computer program stored in the memory and executable on the processor, such as a vehicle joint projection boundary determination program. The processor, when executing the computer program, implements the steps in each of the above-described embodiments of the vehicle joint projection boundary determination method, for example, step S11 shown in fig. 1. Alternatively, the processor, when executing the computer program, performs the functions of the modules/units of the apparatus embodiments described above, such as the vertical boundary determining module.
The computer program may be divided into one or more modules/units, which are stored in the memory and executed by the processor to accomplish the present invention, for example. The one or more modules/units may be a series of computer program instruction segments capable of performing the specified functions, which instruction segments are used for describing the execution of the computer program in the terminal device.
The terminal equipment can be a desktop computer, a notebook computer, a palm computer, an intelligent tablet and other computing equipment. The terminal device may include, but is not limited to, a processor, a memory. It will be appreciated by those skilled in the art that the above components are merely examples of terminal devices and do not constitute a limitation of terminal devices, and may include more or fewer components than described above, or may combine certain components, or different components, e.g., the terminal devices may also include input and output devices, network access devices, buses, etc.
The processor may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. The general purpose processor may be a microprocessor or the processor may be any conventional processor or the like, which is a control center of the terminal device, and which connects various parts of the entire terminal device using various interfaces and lines.
The memory may be used to store the computer program and/or module, and the processor may implement various functions of the terminal device by running or executing the computer program and/or module stored in the memory and invoking data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the terminal device (such as audio data, a phonebook, etc.), and the like. In addition, the memory may include a high-speed random access memory, and may also include a non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one disk storage device, a flash memory device, or other non-volatile solid-state storage device.
Wherein, if the integrated modules/units of the terminal device are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, which may also be completed by a computer program instructing related hardware. The computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that the above-described apparatus embodiments are merely illustrative, and the units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the embodiment of the device provided by the invention, the connection relation between the modules represents that the modules have communication connection, and can be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The foregoing embodiments have been provided for the purpose of illustrating the general principles of the present invention, and are not to be construed as limiting the scope of the invention. It should be noted that any modifications, equivalent substitutions, improvements, etc. made by those skilled in the art without departing from the spirit and principles of the present invention are intended to be included in the scope of the present invention.

Claims (10)

1. A method for determining a joint projection boundary of a vehicle, comprising:
acquiring a vehicle projection area of each vehicle in a motorcade; the motorcade comprises N vehicles, N is a positive integer greater than 1, and N vehicles form combined projection;
obtaining a first boundary line and a third boundary line of a combined projection area according to N vehicle projection areas so as to maximize the width of the combined projection area;
obtaining a second boundary line and a fourth boundary line of the combined projection area according to the first boundary line, the third boundary line and the N vehicle projection areas of the combined projection area so as to maximize the height of the combined projection area; wherein the joint projection area and each vehicle projection area divide the first boundary line, the second boundary line, the third boundary line, and the fourth boundary line in the same order;
and determining the whole boundary of the joint projection area according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area.
2. The method for determining a joint projection boundary of a vehicle according to claim 1, wherein the obtaining a first boundary line and a third boundary line of the joint projection region from the N vehicle projection regions includes:
When N is smaller than the preset number or N is an odd number, sequencing the vehicle projection areas according to the first boundary line of the vehicle projection areas to obtain a first boundary line of the 1 st vehicle projection area and a third boundary line of the N th vehicle projection area; wherein the joint projection area and each vehicle projection area take a first vertical boundary line as a first boundary line and a second vertical boundary line as a third boundary line;
and determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area, and determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
3. The vehicle joint projection boundary determination method according to claim 2, wherein the obtaining the second boundary line and the fourth boundary line of the joint projection region from the first boundary line, the third boundary line, and the N vehicle projection regions of the joint projection region includes:
sequencing the second boundary lines of the vehicle projection areas according to the height of the horizontal position, and determining the lowest boundary line in the second boundary lines of the vehicle projection areas as the second boundary line of the joint projection areas;
Sequencing the fourth boundary lines of the vehicle projection areas according to the height of the horizontal position, and determining the highest boundary line in the fourth boundary lines of the vehicle projection areas as the fourth boundary line of the combined projection areas; the combined projection area and each vehicle projection area take the first horizontal boundary line as a second boundary line and take the second horizontal boundary line as a fourth boundary line.
4. The vehicle joint projection boundary determination method according to claim 2, wherein the obtaining a first boundary line and a third boundary line of the joint projection area according to the N vehicle projection areas comprises:
when N is greater than or equal to the preset number and N is an even number, judging whether, among any 4 adjacent vehicle projection areas, adjacent areas intersect one another and whether at least one vehicle projection area intersects the other three vehicle projection areas;
if yes, judging that the fleet performs multi-dimensional joint projection, and sorting the vehicle projection areas by their first boundary lines to obtain the first boundary line of the 1st vehicle projection area, the first boundary line of the 2nd vehicle projection area, the third boundary line of the (N-1)th vehicle projection area and the third boundary line of the Nth vehicle projection area;
determining, of the first boundary line of the 1st vehicle projection area and the first boundary line of the 2nd vehicle projection area, the one closer to the Nth vehicle projection area as the first boundary line of the joint projection area;
determining, of the third boundary line of the (N-1)th vehicle projection area and the third boundary line of the Nth vehicle projection area, the one closer to the 1st vehicle projection area as the third boundary line of the joint projection area;
if not, judging that the fleet performs horizontal joint projection, and sorting the vehicle projection areas by their first boundary lines to obtain the first boundary line of the 1st vehicle projection area and the third boundary line of the Nth vehicle projection area; and
determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area, and determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
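The branch logic of claim 4 could look like the sketch below. The intersection test, the numeric value of the preset number and the way groups of 4 adjacent areas are enumerated are assumptions made only for illustration.

```python
from collections import namedtuple
from itertools import combinations

Area = namedtuple("Area", ["first", "second", "third", "fourth"])
PRESET_NUMBER = 4  # assumed threshold; the claims do not fix a numeric value

def intersects(a, b):
    # Axis-aligned rectangle overlap test.
    return a.first < b.third and b.first < a.third and a.second < b.fourth and b.second < a.fourth

def is_multi_dimensional(areas):
    # Approximation of the claim's test: within some group of 4 areas,
    # neighbouring areas intersect and at least one area intersects the other three.
    for group in combinations(areas, 4):
        adjacent_ok = all(intersects(group[i], group[i + 1]) for i in range(3))
        hub_ok = any(all(intersects(a, b) for b in group if b is not a) for a in group)
        if adjacent_ok and hub_ok:
            return True
    return False

def vertical_boundaries(areas):
    ordered = sorted(areas, key=lambda a: a.first)
    if len(areas) >= PRESET_NUMBER and len(areas) % 2 == 0 and is_multi_dimensional(areas):
        # Multi-dimensional joint projection: take the inner candidate on each side.
        first = max(ordered[0].first, ordered[1].first)
        third = min(ordered[-2].third, ordered[-1].third)
    else:
        # Horizontal joint projection: take the outermost boundary lines.
        first, third = ordered[0].first, ordered[-1].third
    return first, third
```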
5. The vehicle joint projection boundary determination method according to claim 4, wherein after the judging that the fleet performs multi-dimensional joint projection if yes, the method further comprises:
when a second boundary line of a vehicle projection area is positioned between the first boundary line and the third boundary line of the joint projection area and is adjacent to a non-projection area, determining the second boundary line of the vehicle projection area as a candidate second boundary line;
sorting the candidate second boundary lines by vertical height, and determining the lowest of the candidate second boundary lines as the second boundary line of the joint projection area;
when a fourth boundary line of a vehicle projection area is positioned between the first boundary line and the third boundary line of the joint projection area and is adjacent to a non-projection area, determining the fourth boundary line of the vehicle projection area as a candidate fourth boundary line; and
sorting the candidate fourth boundary lines by vertical height, and determining the highest of the candidate fourth boundary lines as the fourth boundary line of the joint projection area.
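One way to approximate the candidate selection of claim 5; the probe-point test for "adjacent to a non-projection area" and the containment filter are illustrative assumptions rather than the claimed procedure.

```python
from collections import namedtuple

Area = namedtuple("Area", ["first", "second", "third", "fourth"])
EPS = 1e-6

def borders_non_projection(areas, area, y, outward):
    # Probe a point just outside the horizontal boundary line of `area`
    # (at its x midpoint); if no projection area covers it, the boundary
    # line is treated as adjacent to a non-projection area.
    mid_x = (area.first + area.third) / 2.0
    probe_y = y + outward * EPS
    return not any(a.first <= mid_x <= a.third and a.second <= probe_y <= a.fourth
                   for a in areas)

def horizontal_boundaries_multi_dim(areas, joint_first, joint_third):
    # Only areas lying between the joint first and third boundary lines are considered.
    inside = [a for a in areas
              if a.first >= joint_first - EPS and a.third <= joint_third + EPS]
    second_candidates = [a.second for a in inside
                         if borders_non_projection(areas, a, a.second, -1)]
    fourth_candidates = [a.fourth for a in inside
                         if borders_non_projection(areas, a, a.fourth, +1)]
    if not second_candidates or not fourth_candidates:
        raise ValueError("no candidate boundary lines between the joint vertical boundaries")
    # Lowest candidate -> second boundary line; highest candidate -> fourth boundary line.
    return min(second_candidates), max(fourth_candidates)
```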
6. The vehicle joint projection boundary determination method according to claim 3, further comprising:
establishing a two-dimensional coordinate system with a first vertex of the 1st vehicle projection area as the origin, to obtain the ordinate of the second boundary line and the ordinate of the fourth boundary line of each vehicle projection area;
sorting the second boundary lines of the vehicle projection areas by their ordinates to obtain the lowest of the second boundary lines of the vehicle projection areas, and determining it as the second boundary line of the joint projection area; and
sorting the fourth boundary lines of the vehicle projection areas by their ordinates to obtain the highest of the fourth boundary lines of the vehicle projection areas, and determining it as the fourth boundary line of the joint projection area.
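A sketch of the coordinate-frame variant in claim 6, assuming the "first vertex" of the 1st vehicle projection area is the corner where its first and second boundary lines meet; that frame choice is an assumption, not stated in the claims.

```python
from collections import namedtuple

Area = namedtuple("Area", ["first", "second", "third", "fourth"])

def horizontal_boundaries_in_local_frame(areas):
    # Two-dimensional coordinate system with the 1st area's first vertex as origin.
    oy = areas[0].second
    # Ordinates of the second and fourth boundary lines in that frame.
    seconds = sorted(a.second - oy for a in areas)
    fourths = sorted(a.fourth - oy for a in areas)
    # Lowest second boundary line and highest fourth boundary line, mapped back
    # to the original frame, give the joint area's horizontal boundary lines.
    return seconds[0] + oy, fourths[-1] + oy
```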
7. The vehicle joint projection boundary determination method according to claim 4, wherein after the judging that the fleet performs multi-dimensional joint projection if yes, the method further comprises:
establishing a two-dimensional coordinate system with a first vertex of the 1st vehicle projection area as the origin, to obtain the abscissa of the first boundary line and the abscissa of the third boundary line of each vehicle projection area;
judging whether the abscissa of the first boundary line of the 1st vehicle projection area is smaller than the abscissa of the first boundary line of the 2nd vehicle projection area;
if yes, determining the first boundary line of the 2nd vehicle projection area as the first boundary line of the joint projection area;
if not, determining the first boundary line of the 1st vehicle projection area as the first boundary line of the joint projection area;
judging whether the abscissa of the third boundary line of the (N-1)th vehicle projection area is smaller than the abscissa of the third boundary line of the Nth vehicle projection area;
if yes, determining the third boundary line of the (N-1)th vehicle projection area as the third boundary line of the joint projection area; and
if not, determining the third boundary line of the Nth vehicle projection area as the third boundary line of the joint projection area.
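The abscissa comparisons of claim 7 might be realised as follows; the local coordinate frame is written out explicitly even though the fixed offset cancels in each comparison, and the rectangle layout is the same assumed representation used above.

```python
from collections import namedtuple

Area = namedtuple("Area", ["first", "second", "third", "fourth"])

def vertical_boundaries_multi_dim(areas):
    # 1st ... Nth vehicle projection areas ordered by their first boundary lines.
    ordered = sorted(areas, key=lambda a: a.first)
    ox = ordered[0].first  # abscissa origin at the 1st area's first vertex
    # First boundary line: the 2nd area's if the 1st area's lies further left.
    if ordered[0].first - ox < ordered[1].first - ox:
        joint_first = ordered[1].first
    else:
        joint_first = ordered[0].first
    # Third boundary line: the (N-1)th area's if it lies further left than the Nth's.
    if ordered[-2].third - ox < ordered[-1].third - ox:
        joint_third = ordered[-2].third
    else:
        joint_third = ordered[-1].third
    return joint_first, joint_third
```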
8. A vehicle joint projection boundary determination apparatus, characterized by comprising:
a projection area acquisition module, configured to acquire a vehicle projection area of each vehicle in a fleet, wherein the fleet comprises N vehicles, N is a positive integer greater than 1, and the N vehicles form a joint projection;
a vertical boundary determination module, configured to obtain a first boundary line and a third boundary line of a joint projection area according to the N vehicle projection areas, so as to maximize the width of the joint projection area;
a horizontal boundary determination module, configured to obtain a second boundary line and a fourth boundary line of the joint projection area according to the first boundary line and the third boundary line of the joint projection area and the N vehicle projection areas, so as to maximize the height of the joint projection area, wherein the joint projection area and each vehicle projection area designate their first, second, third and fourth boundary lines in the same order; and
an overall boundary determination module, configured to determine the overall boundary of the joint projection area according to the first boundary line, the second boundary line, the third boundary line and the fourth boundary line of the joint projection area.
9. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, wherein the processor, when executing the computer program, implements the vehicle joint projection boundary determination method according to any one of claims 1 to 7.
10. A computer readable storage medium, characterized in that the computer readable storage medium comprises a stored computer program, wherein the computer program, when run, controls a device in which the computer readable storage medium is located to perform the vehicle joint projection boundary determination method according to any one of claims 1 to 7.
CN202111373865.9A 2021-11-19 2021-11-19 Vehicle joint projection boundary determination method and device, terminal equipment and storage medium Pending CN116152045A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111373865.9A CN116152045A (en) 2021-11-19 2021-11-19 Vehicle joint projection boundary determination method and device, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111373865.9A CN116152045A (en) 2021-11-19 2021-11-19 Vehicle joint projection boundary determination method and device, terminal equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116152045A true CN116152045A (en) 2023-05-23

Family

ID=86351078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111373865.9A Pending CN116152045A (en) 2021-11-19 2021-11-19 Vehicle joint projection boundary determination method and device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116152045A (en)

Similar Documents

Publication Publication Date Title
JP4513871B2 (en) Image processing method, image processing program, and image processing apparatus
CN107730457B (en) Image completion method and device, electronic equipment and storage medium
CN108833784B (en) Self-adaptive composition method, mobile terminal and computer readable storage medium
CN104978750B (en) Method and apparatus for handling video file
CN110246081B (en) Image splicing method and device and readable storage medium
CN110765799B (en) Client code scanning identification method, device, equipment and storage medium
CN113055615B (en) Conference all-in-one machine, screen segmentation display method and storage device
CN110992244A (en) Picture generation method, system, equipment and storage medium with Moire patterns
CN103702032A (en) Image processing method, device and terminal equipment
CN111179402B (en) Rendering method, device and system of target object
EP4252413A2 (en) Methods and apparatus for receiving virtual relocation during a network conference
CN111833399A (en) Target detection method based on fisheye image and related equipment
CN109690611A (en) A kind of method for correcting image and device
CN110392266B (en) Light field video coding method based on pseudo video sequence, terminal equipment and storage medium
US10877811B1 (en) Scheduler for vector processing operator allocation
CN111223169A (en) Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform
CN113077524B (en) Automatic calibration method, device and equipment for binocular fisheye camera and storage medium
US11194474B1 (en) Link-list shortening logic
CN116152045A (en) Vehicle joint projection boundary determination method and device, terminal equipment and storage medium
CN115713678A (en) Arrow picture data augmentation method and system, electronic device and storage medium
CN111028357B (en) Soft shadow processing method and device of augmented reality equipment
JP7466689B2 (en) Image display method, device, equipment, and storage medium
CN116156124A (en) Imaging adjustment method, device, equipment and storage medium for vehicle joint projection
US11216307B1 (en) Scheduler for vector processing operator readiness
US10445883B1 (en) ID recycle mechanism for connected component labeling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination