WO2022000856A1 - Speed measurement method and device, electronic device and storage medium - Google Patents
Speed measurement method and device, electronic device and storage medium
- Publication number
- WO2022000856A1 (PCT/CN2020/121491)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- transmission parameter
- image
- processed
- size
- pixel
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present disclosure relates to the field of computer technology, and in particular, to a speed measurement method and device, an electronic device, and a storage medium.
- In traditional methods, the moving speed of an object is obtained based on the moving distance of the object in the image and the moving time of the object.
- The accuracy of the moving speed obtained in this way is low.
- the present disclosure provides a speed measurement method and device, an electronic device and a storage medium.
- a speed measurement method comprising:
- acquiring a first image to be processed and a second image to be processed, wherein both the first image to be processed and the second image to be processed include a first object; acquiring a first position of the first object in the first image to be processed, a second position of the first object in the second image to be processed, and a first transmission parameter of a first moving distance; wherein the first moving distance is the distance between the first position and the second position; the first transmission parameter represents the conversion relationship between the first moving distance and a first physical distance; the first physical distance is the physical distance corresponding to the first moving distance; the first physical distance is negatively correlated with the scale of the first position in the first image to be processed, and/or the first physical distance is negatively correlated with the scale of the second position in the second image to be processed; and obtaining the speed of the first object according to the first moving distance, the first transmission parameter and a moving time; wherein the moving time is obtained based on the timestamp of the first image to be processed and the timestamp of the second image to be processed.
- The speed measuring device obtains the speed of the first object according to the first transmission parameter, the first moving distance and the moving time, which can improve the accuracy of the speed measurement.
- Obtaining the speed of the first object according to the first moving distance, the first transmission parameter and the moving time includes: obtaining a second moving distance according to the first transmission parameter and the first moving distance; and obtaining the speed according to the second moving distance and the moving time.
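The step above can be sketched in Python. This is a minimal illustration only, not the patent's implementation: it assumes the first transmission parameter is a single meters-per-pixel conversion factor and that timestamps are in seconds.

```python
import math

def estimate_speed(pos1, pos2, transmission_param, t1, t2):
    """Estimate real-world speed from two image positions.

    pos1, pos2: (x, y) pixel coordinates of the object in the two frames.
    transmission_param: factor converting pixel distance to physical
        distance (assumed here to be meters per pixel).
    t1, t2: timestamps (seconds) of the two frames.
    """
    # First moving distance: pixel distance between the two positions.
    d1 = math.hypot(pos2[0] - pos1[0], pos2[1] - pos1[1])
    # Second moving distance: physical distance in the real world.
    d2 = d1 * transmission_param
    # Moving time from the two frame timestamps.
    dt = t2 - t1
    return d2 / dt
```

For instance, a 5-pixel move at 0.04 m/pixel over 0.2 s yields 1.0 m/s.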
- Acquiring the first transmission parameter of the first moving distance includes: acquiring a second transmission parameter of a third position; wherein the third position is a position on the line connecting the first position and the second position; the second transmission parameter represents the conversion relationship between the size of a first pixel point and the size of a first object point; the first pixel point may be the pixel point determined in the first image to be processed according to the third position, or the pixel point determined in the second image to be processed according to the third position; the first object point is the object point corresponding to the first pixel point; a first ratio is negatively correlated with the scale of the first pixel point in the image; the first ratio is the ratio between the size of the first pixel point and the size of the first object point; and obtaining the first transmission parameter according to the second transmission parameter; wherein the first transmission parameter is positively correlated with the second transmission parameter.
- the third position is an intermediate position between the first position and the second position.
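As a sketch of the midpoint lookup, assuming a hypothetical per-pixel transmission parameter map indexed as `[row][column]` (the patent describes such a map but not its storage layout):

```python
def midpoint_transmission_param(pos1, pos2, transmission_map):
    """Look up the transmission parameter at the midpoint of the segment
    joining the object's two positions (the 'third position').

    pos1, pos2: (x, y) pixel coordinates of the first and second positions.
    transmission_map: 2D array giving a per-pixel transmission parameter
        (a hypothetical per-pixel map produced by curve fitting).
    """
    mx = (pos1[0] + pos2[0]) / 2
    my = (pos1[1] + pos2[1]) / 2
    # Round to the nearest pixel; the map is indexed [row][col] = [y][x].
    return transmission_map[round(my)][round(mx)]
```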
- Acquiring the second transmission parameter of the third position includes: performing object detection processing on the first image to be processed to obtain the position of a first object frame and the position of a second object frame; wherein the first object frame contains the first object, and the second object frame contains a second object; obtaining a first size of the first object according to the position of the first object frame, and obtaining a second size of the second object according to the position of the second object frame; obtaining a third transmission parameter according to the first size and a third size, and obtaining a fourth transmission parameter according to the second size and a fourth size; wherein the third size is the physical size of the first object; the third transmission parameter represents the conversion relationship between a fifth size and a sixth size; the fifth size is the size of a second pixel point; the position of the second pixel point in the first image to be processed is determined according to the position of the first object frame; the sixth size is the size of the object point corresponding to the second pixel point; the fourth size is the physical size of the second object; the fourth transmission parameter represents the conversion relationship between a seventh size and an eighth size; the seventh size is the size of a third pixel point; the position of the third pixel point in the second image to be processed is determined according to the position of the second object frame; the eighth size is the size of the object point corresponding to the third pixel point; and performing curve fitting processing on the third transmission parameter and the fourth transmission parameter to obtain a transmission parameter map of the image to be processed.
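The curve-fitting step might be sketched as follows. For illustration only, this assumes that detected objects are persons with a known typical physical height and that the transmission parameter varies linearly with the image row; the patent itself uses per-object physical sizes and does not specify the fitting curve.

```python
import numpy as np

# Hypothetical prior: typical physical height of a detected person, in
# meters (an assumption for this sketch, not part of the patent).
ASSUMED_PERSON_HEIGHT_M = 1.7

def fit_transmission_map(boxes, image_height):
    """Fit a per-row transmission parameter (meters per pixel).

    boxes: list of (y_center, pixel_height) for detected object frames.
    image_height: number of rows in the image.
    Returns an array mapping each row index to a transmission parameter.
    """
    ys = np.array([y for y, _ in boxes], dtype=float)
    # Per-detection transmission parameter: physical size / pixel size.
    params = np.array([ASSUMED_PERSON_HEIGHT_M / h for _, h in boxes])
    # Linear fit of the transmission parameter against the image row.
    a, b = np.polyfit(ys, params, 1)
    rows = np.arange(image_height)
    return a * rows + b
```

Distant objects sit higher in the frame with smaller pixel heights, so the fitted parameter grows as the row index decreases, matching the negative correlation with scale described above.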
- The method further includes: acquiring a confidence map; wherein the confidence map represents the mapping between object types and the confidence of transmission parameters; and obtaining a first confidence of the third transmission parameter according to the object type of the first object and the confidence map.
- Performing curve fitting processing on the third transmission parameter and the fourth transmission parameter to obtain the transmission parameter map of the image to be processed includes: obtaining a fifth transmission parameter according to the first confidence and the third transmission parameter; wherein the fifth transmission parameter is positively correlated with the first confidence; and performing curve fitting on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map.
- The method further includes: performing feature extraction processing on the pixel point area in the first object frame to obtain feature data; and obtaining a score of the first object according to the feature data; wherein the score is positively correlated with the confidence of the size of the first object; obtaining the first confidence of the third transmission parameter according to the object type of the first object and the confidence map includes: obtaining a second confidence of the third transmission parameter according to the object type of the first object and the confidence map; and obtaining the first confidence according to the score and the second confidence; wherein the first confidence is correlated with the score.
- Obtaining the fifth transmission parameter according to the first confidence and the third transmission parameter includes: determining the product of the first confidence and the third transmission parameter to obtain the fifth transmission parameter.
- The method further includes: acquiring a depth image of the first image to be processed; obtaining first depth information of the second pixel point and second depth information of the third pixel point according to the depth image; obtaining a first data point according to the first depth information and the fifth transmission parameter, and obtaining a second data point according to the second depth information and the fourth transmission parameter; and performing the curve fitting processing on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map includes: performing curve fitting processing on the first data point and the second data point to obtain the transmission parameter map.
- The first image to be processed and the second image to be processed are acquired by the same imaging device, and the pose of the imaging device during acquisition of the first image to be processed is the same as the pose of the imaging device during acquisition of the second image to be processed.
- The first object is a human object; the human object belongs to a monitored crowd, and the method further includes: in the case that the speed does not exceed a safe speed threshold, acquiring the location of the imaging device; and sending an alarm instruction including the location to a terminal; wherein the alarm instruction is used to instruct the terminal to output alarm information indicating that the crowd density of the monitored crowd is too large.
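A minimal sketch of this monitoring logic, with a hypothetical threshold value and alarm payload (the patent specifies neither): a crowd moving unusually slowly is taken as a sign of excessive density.

```python
# Hypothetical safe speed threshold in m/s (an assumption for this sketch).
SAFE_SPEED_THRESHOLD = 0.3

def check_crowd_speed(speed, camera_location, send_alarm):
    """If the measured crowd speed does not exceed the safe threshold,
    emit an alarm instruction carrying the imaging device's location.

    send_alarm: callable standing in for the link to the terminal
        (an assumption; the patent does not define this interface).
    """
    if speed <= SAFE_SPEED_THRESHOLD:
        send_alarm({
            "location": camera_location,
            "message": "crowd density of the monitored crowd is too large",
        })
        return True
    return False
```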
- A speed measuring device comprising: a first acquiring unit, configured to acquire a first image to be processed and a second image to be processed, wherein both the first image to be processed and the second image to be processed include a first object; a second acquiring unit, configured to acquire a first position of the first object in the first image to be processed, a second position of the first object in the second image to be processed, and a first transmission parameter of a first moving distance; wherein the first moving distance is the distance between the first position and the second position; the first transmission parameter represents the conversion relationship between the first moving distance and a first physical distance; the first physical distance is the physical distance corresponding to the first moving distance; the first physical distance is negatively correlated with the scale of the first position in the first image to be processed, and/or the first physical distance is negatively correlated with the scale of the second position in the second image to be processed; and a first processing unit, configured to obtain the speed of the first object according to the first moving distance, the first transmission parameter and the moving time; wherein the moving time is obtained based on the timestamp of the first image to be processed and the timestamp of the second image to be processed.
- The first processing unit is configured to: obtain a second moving distance according to the first transmission parameter and the first moving distance; and obtain the speed according to the second moving distance and the moving time.
- The second acquiring unit is configured to: acquire a second transmission parameter of a third position; wherein the third position is a position on the line connecting the first position and the second position; the second transmission parameter represents the conversion relationship between the size of a first pixel point and the size of a first object point; the first pixel point may be the pixel point determined in the first image to be processed according to the third position, or the pixel point determined in the second image to be processed according to the third position; the first object point is the object point corresponding to the first pixel point; a first ratio is negatively correlated with the scale of the first pixel point in the image; the first ratio is the ratio between the size of the first pixel point and the size of the first object point; and obtain the first transmission parameter according to the second transmission parameter, wherein the first transmission parameter is positively correlated with the second transmission parameter.
- the third position is an intermediate position between the first position and the second position.
- The second acquiring unit is configured to: perform object detection processing on the first image to be processed to obtain the position of a first object frame and the position of a second object frame; wherein the first object frame contains the first object, and the second object frame contains a second object; obtain a first size of the first object according to the position of the first object frame, and obtain a second size of the second object according to the position of the second object frame; obtain a third transmission parameter according to the first size and a third size, and obtain a fourth transmission parameter according to the second size and a fourth size; wherein the third size is the physical size of the first object; the third transmission parameter represents the conversion relationship between a fifth size and a sixth size; the fifth size is the size of a second pixel point; the position of the second pixel point in the first image to be processed is determined according to the position of the first object frame; the sixth size is the size of the object point corresponding to the second pixel point; the fourth size is the physical size of the second object; and the fourth transmission parameter represents the conversion relationship between a seventh size and an eighth size; the seventh size is the size of a third pixel point; the position of the third pixel point in the second image to be processed is determined according to the position of the second object frame; the eighth size is the size of the object point corresponding to the third pixel point.
- The first acquiring unit is further configured to acquire a confidence map before the curve fitting processing is performed on the third transmission parameter and the fourth transmission parameter to obtain the transmission parameter map of the image to be processed; wherein the confidence map represents the mapping between object types and the confidence of transmission parameters.
- The speed measuring device further includes: a second processing unit, configured to obtain a first confidence of the third transmission parameter according to the object type of the first object and the confidence map; the second acquiring unit is configured to: obtain a fifth transmission parameter according to the first confidence and the third transmission parameter; wherein the fifth transmission parameter is positively correlated with the first confidence; and perform curve fitting processing on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map.
- The speed measuring device further includes: a third processing unit, configured to perform feature extraction processing on the pixel point area in the first object frame to obtain feature data before the first confidence of the third transmission parameter is obtained according to the object type of the first object and the confidence map; and a fourth processing unit, configured to obtain a score of the first object according to the feature data; wherein the score is positively correlated with the confidence of the size of the first object; the second processing unit is configured to: obtain a second confidence of the third transmission parameter according to the object type of the first object and the confidence map; and obtain the first confidence according to the score and the second confidence; wherein the first confidence is correlated with the score.
- The second acquiring unit is configured to: determine the product of the first confidence and the third transmission parameter to obtain the fifth transmission parameter.
- The first acquiring unit is further configured to acquire a depth image of the first image to be processed before the curve fitting processing is performed on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map; the second acquiring unit is further configured to: obtain first depth information of the second pixel point and second depth information of the third pixel point according to the depth image; obtain a first data point according to the first depth information and the fifth transmission parameter, and obtain a second data point according to the second depth information and the fourth transmission parameter; and perform curve fitting processing on the first data point and the second data point to obtain the transmission parameter map.
- The first image to be processed and the second image to be processed are acquired by the same imaging device, and the pose of the imaging device during acquisition of the first image to be processed is the same as the pose of the imaging device during acquisition of the second image to be processed.
- The first object is a human object; the human object belongs to a monitored crowd, and the first acquiring unit is further configured to acquire the location of the imaging device when the speed does not exceed a safe speed threshold; the speed measuring device further includes: a sending unit, configured to send an alarm instruction including the location to a terminal; wherein the alarm instruction is used to instruct the terminal to output alarm information indicating that the crowd density of the monitored crowd is too large.
- a processor configured to execute the method according to the above-mentioned first aspect and any possible implementation manner thereof.
- An electronic device comprising: a processor, a sending device, an input device, an output device, and a memory, the memory being used to store computer program code, the computer program code comprising computer instructions; when the processor executes the computer instructions, the electronic device executes the method according to the first aspect and any possible implementation manner thereof.
- A computer-readable storage medium in which a computer program is stored, the computer program including program instructions that, when executed by a processor, cause the processor to execute the method according to the first aspect and any possible implementation manner thereof.
- A computer program product includes a computer program or instructions; when the computer program or instructions are run on a computer, the computer is caused to perform the method according to the first aspect and any possible implementation manner thereof.
- FIG. 1 is a schematic diagram of a crowd image according to an embodiment of the present disclosure
- FIG. 2 is a schematic diagram of a pixel coordinate system according to an embodiment of the present disclosure
- FIG. 3 is a schematic flowchart of a speed measurement method provided by an embodiment of the present disclosure.
- FIG. 5 is a schematic diagram of an object provided by an embodiment of the present disclosure.
- FIG. 6 is a schematic structural diagram of a speed measuring device provided by an embodiment of the present disclosure.
- FIG. 7 is a schematic diagram of a hardware structure of a speed measuring device according to an embodiment of the present disclosure.
- "At least one (item)" refers to one or more
- "multiple" refers to two or more
- "at least two (items)" refers to two or more
- "and/or” is used to describe the association relationship of related objects, indicating that three kinds of relationships can exist, for example, "A and/or B” can mean: only A exists, only B exists, and A exists at the same time and B three cases, where A, B can be singular or plural.
- the character “/" can indicate that the related objects are an "or” relationship, which refers to any combination of these items, including any combination of a single item (a) or a plurality of items (a).
- At least one (a) of a, b or c can mean: a, b, c, "a and b", “a and c", “b and c", or "a and b and c” ", where a, b, c can be single or multiple.
- an object point refers to a point in the real world
- a physical distance refers to a distance in the real world
- a physical size refers to a size in the real world.
- Object points correspond to pixels in the image.
- For example, image A is obtained by photographing a table with a camera. If the table includes object point a, and pixel point b in image A is obtained by imaging object point a, then object point a corresponds to pixel point b.
- the physical area corresponds to the pixel area in the image.
- For example, image B is obtained by photographing a basketball court with a camera. If pixel point area c in image B is obtained by imaging the basketball court, then the basketball court corresponds to pixel point area c.
- a near object has a large scale in the image
- a distant object has a small scale in the image.
- Here, "far" means that the distance between the real object corresponding to the object in the image and the imaging device that collects the image is large, and "near" means that this distance is small.
- The scale of a pixel point is positively correlated with the size of the object point corresponding to the pixel point: the larger the scale of a pixel point in the image, the larger the size of the corresponding object point.
- For example, image A includes pixel point a and pixel point b, where the object point corresponding to pixel point a is object point 1 and the object point corresponding to pixel point b is object point 2. If the scale of pixel point a in image A is larger than that of pixel point b in image A, the size of object point 1 is larger than the size of object point 2.
- The scale of a location refers to the ratio between the size of an object at that location in the image and the physical size of the object.
- For example, if the scale of the position of person A is larger than that of the position of person B, and the size difference between people is small (that is, the difference between the physical sizes of different people is small), then the area of the pixel point region covered by person A is larger than the area of the pixel point region covered by person B.
- the positions in the image all refer to the positions under the pixel coordinates of the image.
- the abscissa of the pixel coordinate system is used to represent the number of columns where the pixel points are located
- the ordinate in the pixel coordinate system is used to represent the number of rows where the pixel points are located.
- The pixel coordinate system XOY is constructed with the upper left corner of the image as the coordinate origin O, the direction parallel to the rows of the image as the direction of the X-axis, and the direction parallel to the columns of the image as the direction of the Y-axis.
- the units of the abscissa and ordinate are pixels.
- the coordinates of the pixel point A11 in FIG. 2 are (1, 1)
- the coordinates of the pixel point A23 are (3, 2)
- the coordinates of the pixel point A42 are (2, 4)
- the coordinates of the pixel point A34 are (4, 3)
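A small helper illustrating the coordinate convention above (the abscissa is the column number, the ordinate is the row number, both 1-based); the function name is ours, purely for illustration:

```python
def pixel_coords(row, col):
    """Convert 1-based (row, column) indices to pixel coordinates (x, y)
    in the XOY pixel coordinate system: x is the column number and
    y is the row number, with the origin at the image's upper left."""
    return (col, row)
```

For example, the pixel in row 2, column 3 (A23 in FIG. 2) has coordinates (3, 2).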
- the execution body of the embodiment of the present disclosure is a speed measuring device.
- the speed measuring device may be one of the following: a mobile phone, a computer, a server, or a tablet computer.
- FIG. 3 is a schematic flowchart of a speed measurement method provided by an embodiment of the present disclosure.
- both the first image to be processed and the second image to be processed may contain any content.
- the first image to be processed may contain people; the first image to be processed may also contain roads and vehicles.
- the second image to be processed may contain people; the second image to be processed may also contain animals.
- the present disclosure does not limit the content contained in the first image to be processed and the content contained in the second image to be processed.
- the first object may be one of the following: a person and an object.
- the first object may be a person; the first object may also be a car; the first object may also be an animal.
- Both the first image to be processed and the second image to be processed contain the first object.
- the first image to be processed includes Zhang San
- the second image to be processed also includes Zhang San.
- both the first image to be processed and the second image to be processed include vehicle a.
- the speed measuring device receives the first image to be processed input by the user through the input component.
- the above input components include: keyboard, mouse, touch screen, touch pad, audio input and so on.
- the speed measuring device receives the first image to be processed sent by the first terminal.
- the first terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, or a wearable device.
- the speed measuring device is equipped with a camera assembly, wherein the camera assembly includes a camera.
- the speed measuring device acquires the first image to be processed by using the camera assembly to capture images.
- the speed measuring device selects one frame of image from the acquired video stream as the first image to be processed.
- the speed measuring device receives the second image to be processed input by the user through the input component.
- the above input components include: keyboard, mouse, touch screen, touch pad, audio input and so on.
- the speed measuring device receives the second image to be processed sent by the second terminal.
- the second terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, or a wearable device.
- the speed measuring device is equipped with a camera assembly, wherein the camera assembly includes a camera.
- the speed measuring device acquires the second image to be processed by using the camera assembly to capture images.
- the speed measuring device selects one frame of image from the acquired video stream as the second image to be processed.
- the speed measuring device acquires the surveillance video stream collected by the surveillance camera through the communication connection, and selects two frames of images from the surveillance video stream as the first image to be processed and the second image to be processed.
- The position of the first object in the first image to be processed may be the position, in the first image to be processed, of an object frame containing the first object; it may also be the position, in the first image to be processed, of a pixel point in the pixel point area covered by the first object.
- Likewise, the position of the first object in the second image to be processed may be the position, in the second image to be processed, of an object frame containing the first object; it may also be the position, in the second image to be processed, of a pixel point in the pixel point area covered by the first object.
- the first moving distance is the distance between the first position and the second position, that is, the first moving distance is the distance in the pixel coordinate system.
- the first position is (3, 5), that is, the position of the first object in the first image to be processed is (3, 5);
- the second position is (7, 8), that is, the position of the first object in the second image to be processed is (7, 8).
- the first moving distance is the Euclidean distance between (3, 5) and (7, 8): √((7−3)² + (8−5)²) = 5.
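A minimal sketch of this pixel-distance computation (the function name is illustrative, not from the disclosure):

```python
import math

def pixel_distance(p1, p2):
    """Euclidean distance between two positions in the pixel coordinate system."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Positions from the example above: first position (3, 5), second position (7, 8).
d1 = pixel_distance((3, 5), (7, 8))  # sqrt(4**2 + 3**2) = 5.0
```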
- the physical distance refers to the distance in the real world.
- the physical distance corresponding to the first moving distance is the first physical distance.
- the first transmission parameter represents the conversion relationship between the first moving distance and the first physical distance. For example, it is assumed that the timestamp of the first image to be processed is t 1 , the timestamp of the second image to be processed is t 2 , and the first object is Zhang San. If the first moving distance obtained by the speed measuring device according to the first position and the second position is d 1 , the speed measuring device can use the first transmission parameter to convert d 1 into the moving distance of Zhang San in the real world from t 1 to t 2 (ie the first physical distance). In some possible implementations, the first transmission parameter is a ratio between the first moving distance and the first physical distance.
- the first physical distance is negatively correlated with the scale of the first position in the first image to be processed, and/or the first physical distance is negatively correlated with the scale of the second position in the second image to be processed.
- the correlation relationship includes at least one of the following situations:
- the first physical distance obtained by the speed measuring device according to the first moving distance and the first transmission parameter is negatively correlated with the scale of the first position in the first image to be processed;
- the first physical distance obtained by the speed measuring device according to the first moving distance and the first transmission parameter is negatively correlated with the scale of the first position in the first image to be processed, and the first physical distance is negatively correlated with the scale of the second position in the second image to be processed.
- the moving time is the time consumed by the first object moving the first moving distance.
- the speed measuring device can obtain the moving time according to the time stamp of the first image to be processed and the time stamp of the second image to be processed.
- the smaller of the timestamp of the first image to be processed and the timestamp of the second image to be processed is called the small timestamp, and the larger of the two timestamps is called the large timestamp.
- the starting time of the moving time is the small timestamp
- the ending time of the moving time is the large timestamp.
- the timestamp of the first image to be processed is 16:54:30 on June 27, 2020
- the timestamp of the second image to be processed is 16:54:33 on June 27, 2020.
- the small timestamp is 16:54:30 on June 27, 2020
- the large timestamp is 16:54:33 on June 27, 2020.
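The moving-time computation described above (elapsed time between the small and the large timestamp) can be sketched as follows; the function name is illustrative:

```python
from datetime import datetime

def moving_time_seconds(ts_a, ts_b):
    """Elapsed time between two frame timestamps; argument order does not matter,
    since the small timestamp is always taken as the start time."""
    small, large = sorted([ts_a, ts_b])
    return (large - small).total_seconds()

# Timestamps from the example above.
t1 = datetime(2020, 6, 27, 16, 54, 30)
t2 = datetime(2020, 6, 27, 16, 54, 33)
dt = moving_time_seconds(t2, t1)  # 3.0 seconds
```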
- the speed of the first object is the speed of the first object in the real world.
- the speed measuring device obtains the moving distance of the first object in the real world (hereinafter referred to as the second moving distance) according to the first transmission parameter and the first moving distance.
- the speed measuring device can obtain the speed of the first object according to the first physical distance and the moving time. For example, assuming that the first transmission parameter represents the conversion relationship between the first moving distance and the first physical distance, the first transmission parameter is 0.1 cm (per unit of pixel distance), the first moving distance is 10, and the moving time is 0.5 seconds, then the first physical distance is 10 × 0.1 cm = 1 cm, and the speed of the first object is 1 cm / 0.5 s = 2 cm/s.
- the speed measuring device obtains the speed of the first object in the image (hereinafter referred to as virtual speed) according to the first moving distance and the moving time.
- the speed measuring device obtains the speed of the first object in the real world according to the virtual speed and the first transmission parameter.
- the speed measuring device obtains the speed of the first object according to the first transmission parameter, the first moving distance and the moving time, which can improve the accuracy of the speed.
- the speed measuring device obtains the first transmission parameter by performing the following steps:
- the third position is a position on the line connecting the first position and the second position. It should be understood that although the first position is a position in the first image to be processed and the second position is a position in the second image to be processed, in the embodiment of the present disclosure the pixel coordinate system of the first image to be processed is the same as the pixel coordinate system of the second image to be processed, so the speed measuring device can determine the third position in the pixel coordinate system according to the first position and the second position.
- the third position may be a position in the first image to be processed, and the third position may also be a position in the second image to be processed.
- the first position is (3, 4) and the second position is (7, 8).
- the third position is the middle position between the first position and the second position: (5, 6).
- the third position may represent the pixel point in the 5th row and the 6th column in the first image to be processed, and the third position may also represent the pixel point in the 5th row and 6th column in the second image to be processed.
- the speed measuring device may determine the pixel point in the first image to be processed according to the third position, and the speed measuring device may also determine the pixel point in the second image to be processed at the third position.
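The midpoint computation in the example above can be sketched as (function name illustrative):

```python
def midpoint(p1, p2):
    """Middle position on the line connecting two positions in the pixel coordinate system."""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)

# Positions from the example above: first position (3, 4), second position (7, 8).
third = midpoint((3, 4), (7, 8))  # (5.0, 6.0)
```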
- the pixel point determined by the speed measuring device according to the third position is called the first pixel point
- the second transmission parameter represents the conversion relationship between the size of the first pixel point and the size of the first object point, wherein the first object point is the object point in the real world corresponding to the first pixel point.
- the second transmission parameter represents the conversion relationship between the length of the first pixel point and the length of the first object point.
- the second transmission parameter represents a conversion relationship between the height of the first pixel point and the height of the first object point.
- the second transmission parameter represents the conversion relationship between the width of the first pixel point and the width of the first object point.
- the ratio between the size of the first pixel point and the size of the first object point is referred to as the first ratio.
- the first ratio is negatively correlated with the size of the first pixel in the image.
- if the first pixel belongs to the first image to be processed and the first ratio is the ratio of the length of the first pixel to the length of the first object point, then the larger the scale of the first pixel in the first image to be processed, the smaller the first ratio.
- since the length of any two pixels in the first image to be processed is the same, that is, the length of the first pixel is unchanged, the larger the scale of the first pixel, the smaller the length of the first object point; in other words, the size of the first object point is negatively correlated with the scale of the first pixel point.
- similarly, if the first ratio is the ratio of the length of the first pixel to the length of the first object point and the first pixel belongs to the second image to be processed, then the larger the scale of the first pixel in the second image to be processed, the smaller the first ratio.
- the second transmission parameter carries the scale information of the first pixel point.
- the speed measuring device receives the second transmission parameter input by the user through the input component.
- the above input components include: keyboard, mouse, touch screen, touch pad, audio input and so on.
- the speed measuring device receives the second transmission parameter sent by the third terminal.
- the third terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, and a wearable device.
- the third terminal and the first terminal may be the same or different.
- the scale of the pixel point is linearly related to the abscissa of the pixel point, and/or the scale of the pixel point is linearly related to the ordinate of the pixel point.
- the scale of the third position is linearly related to the scale of the first position, and/or the scale of the third position is linearly related to the scale of the second position. Therefore, the speed measuring device can determine the transmission parameter of the first moving distance according to the transmission parameter of the intermediate position between the first position and the second position, that is, determine the first transmission parameter according to the second transmission parameter.
- the third position is an intermediate position between the first position and the second position.
- the first transmission parameter is positively correlated with the second transmission parameter.
- b 1 and b 2 satisfy formula (1):
- b 1 , b 2 satisfy formula (3):
- FIG. 4 is a schematic flowchart of a possible implementation method of step 1 provided by an embodiment of the present disclosure.
- the object detection processing detects objects whose size is near a determined value.
- the average length of a human face is 20 cm
- the detection object of the object detection process may be a human face.
- the average height of a person is 1.65 meters
- the detection object of the object detection processing may be a human body.
- the heights of the goals shown in FIG. 5 are all determined (for example, 2.44 meters), and the detection object of the object detection processing may be the goal.
- the object frame may be any shape, and the present disclosure does not limit the shape of the object frame (including the above-mentioned first object frame and second object frame).
- the shape of the object frame includes at least one of the following: rectangle, diamond, circle, ellipse, and polygon.
- the position of the object frame (including the position of the first object frame and the position of the second object frame) is used to determine the pixel area included in the object frame, that is, the position of the object frame in the image to be processed.
- the position of the object frame may include coordinates of any pair of diagonal corners in the rectangle, wherein a pair of diagonal corners refers to two vertices on the diagonal of the rectangle.
- the position of the object frame may include: the position of the geometric center of the rectangle, the length of the rectangle, and the width of the rectangle.
- the position of the object frame may include: the position of the center of the object frame and the radius of the object frame.
- the number of objects to be detected in the object detection process is not less than one.
- when the detection object is a human face, the position of the face frame including the human face can be obtained by performing object detection processing on the image to be processed.
- when the detection object includes a face and a human body, the position of the face frame including the face and the position of the human body frame including the human body can be obtained by performing object detection processing on the image to be processed.
- the detection object may also include a face, a human body, and a screw at the same time.
- the detection objects of the object detection process include at least one of the following: a human face, a human foot, a human body, a screw, and a goal.
- the object detection processing on the image to be processed can be implemented by a convolutional neural network.
- the convolutional neural network is trained by using the image with annotation information as the training data, so that the trained convolutional neural network can complete the object detection processing on the image.
- the annotation information of the images in the training data is the position information of the object frame, and the object frame contains the detection object of the object detection process.
- the object detection processing may be implemented by an object detection algorithm, wherein the object detection algorithm may be one of the following: you only look once (YOLO), deformable part model (DPM), single shot multibox detector (SSD), Faster R-CNN, etc.
- the present disclosure does not limit the object detection algorithm for implementing object detection processing.
- the speed measuring device obtains the position of the first object frame including the first object and the position of the second object frame including the second object by performing object detection processing on the first image to be processed.
- the detection object included in the first object frame is different from the detection object included in the second object frame.
- the detection object included in the first object frame is the face of Zhang San
- the detection object included in the second object frame is the face of Li Si.
- the detection object included in the first object frame is Zhang San's face
- the detection object included in the second object frame is a sign.
- the speed measuring device can determine the size of the detection object contained in the object frame according to the position of the object frame. For example, when the shape of the object frame is a rectangle, the speed measuring device can determine the length and width of the object frame according to the position of the object frame, and then determine the length and width of the detection object in the object frame.
- the speed measuring device can obtain the first size of the first object according to the position of the first object frame, and obtain the second size of the second object according to the position of the second object frame.
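A sketch of obtaining an object's size in the image from its frame position, assuming a rectangular frame given by one pair of diagonal vertices (the coordinates below are hypothetical):

```python
def frame_size(corner_a, corner_b):
    """Length and width of a rectangular object frame given one pair of
    diagonal vertices (x, y) in the pixel coordinate system."""
    length = abs(corner_b[0] - corner_a[0])
    width = abs(corner_b[1] - corner_a[1])
    return length, width

# Hypothetical first object frame with diagonal corners (10, 20) and (50, 120).
s1 = frame_size((10, 20), (50, 120))  # (40, 100)
```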
- the third size is the physical size of the first object
- the fourth size is the physical size of the second object.
- when the detection object contained in the first object frame is a human body, the third size may be the height of the human (eg, 170 cm).
- the detection object contained in the second object frame is a human face
- the fourth size may be the length of the human face (eg, 20 cm).
- the speed measuring device can determine a pixel (ie, a second pixel) in the first image to be processed according to the position of the first object frame. For example, when the shape of the first object frame is a rectangle, the speed measuring device determines the position of the geometric center of the first object frame according to the position of the first object frame, and uses the pixel corresponding to the geometric center as the second pixel. For another example, when the shape of the first object frame is a rectangle, the speed measuring device determines the position of any vertex of the first object frame according to the position of the first object frame, and uses the pixel corresponding to the vertex as the second pixel.
- the speed measuring device determines the position of the center of the circle of the first object frame according to the position of the first object frame, and uses the pixel point corresponding to the center of the circle as the second pixel point.
- the speed measuring device may determine a pixel point, that is, a third pixel point, in the first image to be processed according to the position of the second object frame.
- the size of the second pixel is referred to as the fifth size
- the size of the object point corresponding to the second pixel is referred to as the sixth size
- the size of the third pixel is referred to as the seventh size
- the size of the object point corresponding to the third pixel point is called the eighth size.
- the conversion relationship between the fifth size and the sixth size is referred to as the third transmission parameter
- the conversion relationship between the seventh size and the eighth size is referred to as the fourth transmission parameter.
- the speed measuring device can obtain the third transmission parameter according to the first size and the third size, and can obtain the fourth transmission parameter according to the second size and the fourth size.
- the first dimension is s 1
- the second dimension is s 2
- the third dimension is s 3
- the fourth dimension is s 4
- the third transmission parameter is b 3
- the fourth transmission parameter is b 4 .
- s 1 , s 3 , b 3 satisfy formula (4):
- s 1 , s 3 , b 3 satisfy formula (6):
- s 1 , s 3 , b 3 satisfy formula (8):
- the speed measuring device performs curve fitting processing on the third transmission parameter and the fourth transmission parameter to obtain a transmission parameter map of the first image to be processed. According to the pixel values in the transmission parameter map, the transmission parameter of any pixel in the first image to be processed can be determined.
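A minimal sketch of building such a map from two sampled transmission parameters, under the assumption stated earlier that a pixel's scale is linearly related to its ordinate (row index); all names and numbers are illustrative:

```python
def fit_transmission_map(samples, height, width):
    """Fit transmission parameters over the whole image by linear
    interpolation along the row (ordinate) axis.
    `samples` is a list of (row, transmission_parameter) pairs."""
    (r0, b0), (r1, b1) = samples[0], samples[-1]
    slope = (b1 - b0) / (r1 - r0)
    return [[b0 + slope * (r - r0) for _ in range(width)] for r in range(height)]

# Two hypothetical samples: parameter 2.0 at row 0 and 4.0 at row 100.
tmap = fit_transmission_map([(0, 2.0), (100, 4.0)], height=101, width=3)
```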
- the speed measuring device can determine the conversion relationship between the size of the fourth pixel point (ie, the ninth size) and the tenth size according to the first pixel value, wherein the tenth size is the size of the object point corresponding to the fourth pixel point.
- the ninth size is s 5
- the tenth size is s 6 .
- p 1 , s 5 , s 6 satisfy formula (10):
- p 1 , s 5 , and s 6 satisfy formula (11):
- p 1 , s 5 , and s 6 satisfy formula (12):
- the speed measuring device can determine the transmission parameter of any pixel point except the fourth pixel point in the first image to be processed according to the transmission parameter map.
- the speed measuring device may determine the reference pixel value from the transmission parameter map according to the third position, wherein the position of the pixel point corresponding to the reference pixel value in the transmission parameter map Same as the third position.
- the speed measuring device can further obtain the second transmission parameter according to the reference pixel value.
- the speed measuring device obtains the third transmission parameter according to the first size and the third size, and obtains the fourth transmission parameter according to the second size and the fourth size.
- a transmission parameter map is obtained, and then the transmission parameter of any pixel point in the first image to be processed can be determined according to the transmission parameter map.
- the speed measuring device can obtain a transmission parameter map of the second image to be processed by performing object detection processing on the second image to be processed, and further determine the second transmission parameter.
- before performing step 404, the speed measuring device further performs the following steps:
- the precision of the transmission parameter of a pixel is positively correlated with the precision of the size of the object point corresponding to the pixel.
- the precision of the transmission parameter map is positively correlated with the precision of the size of the first object and the size of the second object.
- the accuracy of the size of an object with a fixed size is higher than that of an object whose size varies within a range.
- a standard soccer goal has a width of 7.32 meters and a height of 2.44 meters. 90% of people's height is between 1.4 meters and 2 meters. The accuracy of the dimensions of a soccer goal is higher than that of a person's height.
- the height of a standard basketball hoop is 3.05 meters. 95% of human faces are between 17 cm and 30 cm in length. The accuracy of the height of the basketball hoop is higher than the accuracy of the length of the face.
- Another example is a screw with a fixed length. 95% of people's feet are between 20 cm and 35 cm long. The precision of the length of a screw with a fixed length is higher than that of a human foot.
- the above-mentioned object with a fixed size may be an object with a fixed size in a specific scene. For example, boarding signs in the departure lounge. Another example is the chairs in the gym. Another example is a desk in an office.
- the confidence map represents the mapping between the object type and the confidence of the transmission parameter. For example, see Table 1 for the confidence map.
- the speed measuring device receives the confidence map input by the user through the input component.
- the above input components include: keyboard, mouse, touch screen, touch pad, audio input and so on.
- the speed measuring apparatus receives the confidence map sent by the fourth terminal.
- the fourth terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, and a wearable device.
- the fourth terminal and the first terminal may be the same or different.
- the first confidence of the third transmission parameter can be obtained according to the confidence map and the object type of the first object. For example, suppose that the confidence map is as shown in Table 1 above, and the object type of the first object is a human body. In this case, the first confidence level is 0.9.
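The confidence lookup described above is a simple mapping from object type to confidence. A sketch with hypothetical values (the actual Table 1 is not reproduced in this text):

```python
# Hypothetical confidence map: object type -> confidence of the
# corresponding transmission parameter (illustrative values only).
CONFIDENCE_MAP = {"goal": 0.95, "human body": 0.9, "human face": 0.8}

def transmission_confidence(object_type):
    """Look up the confidence of a transmission parameter by object type."""
    return CONFIDENCE_MAP[object_type]

c1 = transmission_confidence("human body")  # 0.9
```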
- the speed measuring device may determine the object type of the first object by performing feature extraction processing on the pixel area included in the first object frame.
- the speed measuring device may determine the transmission parameters corresponding to the objects in each object frame respectively according to the object type of the objects in each object frame. For example, the speed measuring device may obtain the confidence level of the fourth transmission parameter (which will be referred to as the third confidence level in this embodiment of the present application) according to the object type of the second object and the confidence level map.
- after obtaining the first confidence level, the speed measuring device performs the following steps in the process of executing step 404:
- the fifth transmission parameter is positively correlated with the first confidence level.
- the first confidence level is c 1 and the fifth transmission parameter is b 5 .
- c 1 , b 5 satisfy formula (13):
- c 1 , b 5 satisfy formula (14):
- c 1 , b 5 satisfy formula (15):
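Formulas (13)-(15) are not reproduced in this text. One plausible form, consistent with the stated requirement that the fifth transmission parameter be positively correlated with the first confidence, is sketched below as an assumption:

```python
def weighted_transmission_parameter(confidence, transmission_param):
    """Assumed form of formulas (13)-(15): the fifth transmission parameter
    as the product of the first confidence and the third transmission
    parameter. This is positively correlated with the confidence, as
    required, but the exact formulas are not given in this text."""
    return confidence * transmission_param

b5 = weighted_transmission_parameter(0.9, 2.0)  # 1.8
```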
- the speed measuring device can improve the accuracy of the transmission parameter map by performing curve fitting processing on the fourth transmission parameter and the fifth transmission parameter.
- in the case that the speed measuring device obtains the third confidence level by performing step 4 and obtains the sixth transmission parameter according to the third confidence level and the fourth transmission parameter, the speed measuring device can obtain the transmission parameter map by performing curve fitting processing on the fifth transmission parameter and the sixth transmission parameter.
- the speed measuring device can obtain the transmission parameter map of the second image to be processed based on the confidence map, so as to improve the accuracy of the transmission parameter map of the second image to be processed.
- before performing step 4, the speed measuring device further performs the following steps:
- the feature extraction processing may be convolution processing, pooling processing, or a combination of convolution processing and pooling processing.
- the feature extraction process may be implemented by a trained convolutional neural network or a feature extraction model, which is not limited in the present disclosure.
- the speed measuring device can extract the semantic information in the pixel point area in the object frame by performing feature extraction processing on the pixel point area in the object frame, and obtain the characteristic data of the object frame.
- convolution processing is performed on the pixel point area in the first object frame layer by layer through at least two convolutional layers to complete the feature extraction processing of the pixel area within the first object frame. The convolutional layers are connected in series in sequence, that is, the output of the previous convolutional layer is the input of the next convolutional layer, and the semantic information extracted by each convolutional layer is different.
- the feature extraction process abstracts the features of the pixel area in the first object frame step by step, and also gradually discards relatively minor feature data, wherein the relatively minor feature information refers to feature information other than the feature information that can be used to determine the object type of the object in the first object frame. Therefore, the feature data extracted later is smaller in size, but its semantic information is more concentrated.
- Convolution processing is performed on the pixel point area in the first object frame step by step through the multi-layer convolution layer, so that semantic information of the pixel point area in the first object frame can be obtained.
- the speed measuring device may perform feature extraction processing on the pixel point area in each object frame, respectively, to obtain feature data of the pixel point area in each object frame.
- the state of the object is determined according to the feature data of the object, and then a score for characterizing the confidence of the size of the object is obtained, wherein the score of the object is positively correlated with the confidence in the size of the object.
- the object is a human body and the size of the object is the height of the person.
- when the person is in a standing state, the height of the person is equal to the person's real height, and the confidence of the person's height is the highest; when the person is in a walking state, the height of the person is close to the person's real height, and the confidence of the person's height is second; when the person is in a state of bowing his head (such as looking down at a mobile phone), there is a small error between the person's height and the person's real height, and the confidence of the person's height is lower than that in the walking state; when the person is sitting, there is a large error between the person's height and the person's real height, and the confidence of the person's height is the lowest.
- the speed measuring device may determine the score of the object in the object frame according to the feature data extracted from the pixel area in the object frame.
- the speed measuring device may use a classifier (eg, support vector machine, softmax function) to process the feature data of the object frame to obtain a score of the objects in the object frame.
- the speed measuring device may use a neural network to process the pixel point area in the object frame to obtain the score of the object in the object frame.
- the speed measuring device uses the labeled image set as training data to train the neural network to obtain the trained neural network.
- the unlabeled image set is processed using the trained neural network to obtain the label of the unlabeled image set.
- the trained neural network is trained using the labels of the labeled image set, the unlabeled image set, and the unlabeled image set to obtain an image processing neural network.
- the information carried by the tag includes the position of the object frame in the image and the score of the object in the object frame.
- the speed measuring device can obtain the score of the first object according to the characteristic data of the first object.
- the speed measuring device may obtain a score for each object in the first image to be processed, respectively.
- the speed measuring device performs the following steps in the process of executing step 4:
- the implementation process of this step can refer to step 4, but in this step, the speed measuring device obtains the second confidence degree, not the first confidence degree, according to the object type of the first object and the confidence map.
- the first confidence level is related to the score.
- the first confidence level is c 1
- the second confidence level is c 2
- the score is s.
- c 1 , c 2 , s satisfy formula (16):
- c 1 , c 2 , s satisfy formula (17):
- c 1 , c 2 , s satisfy formula (18):
- the speed measuring device obtains the first confidence level according to the score of the first object and the second confidence level, which can improve the accuracy of the first confidence level.
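Formulas (16)-(18) are not reproduced in this text. One plausible form, consistent with the requirement that the first confidence be related to the score, is sketched below as an assumption:

```python
def first_confidence(score, second_confidence):
    """Assumed form of formulas (16)-(18): scale the second confidence by
    the object's score, so that a lower-scored state (e.g. a sitting
    person) lowers the final confidence. The exact formulas are not
    given in this text."""
    return score * second_confidence

c1 = first_confidence(0.5, 0.9)  # 0.45
```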
- the speed measuring device may obtain the fourth confidence level of the fourth transmission parameter by performing step 9, and the speed measuring device may obtain the above-mentioned third confidence level according to the score of the second object and the fourth confidence level.
- before performing step 6, the speed measuring device also performs the following steps:
- the depth image of the first image to be processed carries the depth information of the pixels in the first image to be processed.
- the speed measuring device receives the depth image input by the user through the input component.
- the above input components include: keyboard, mouse, touch screen, touch pad, audio input and so on.
- the speed measuring device is equipped with an RGB camera and a depth camera. During the process of using the RGB camera to collect the first image to be processed, the speed measuring device uses the depth camera to collect the depth image of the first image to be processed.
- the depth camera may be any one of the following: a structured light camera, a time of flight (TOF) camera, or a binocular stereo vision camera.
- the speed measuring device receives the depth image sent by the fifth terminal, where the fifth terminal includes a mobile phone, a computer, a tablet computer, a server, and the like.
- the fifth terminal and the first terminal may be the same or different.
- the depth image carries the depth information of the pixels in the first image to be processed.
- the speed measuring device can determine the depth information of the second pixel (ie the first depth information) and the depth information of the third pixel (ie the second depth information) according to the depth image.
- the abscissa of the first data point is the first depth information
- the abscissa of the second data point is the second depth information
- the ordinate of the first data point is the fifth transmission parameter
- the ordinate of the second data point is the fourth transmission parameter. That is, the speed measuring device takes the depth information of the pixel point as the abscissa and the transmission parameter of the pixel point as the ordinate.
- the ordinate of the first data point is the first depth information
- the ordinate of the second data point is the second depth information
- the abscissa of the first data point is the fifth transmission parameter
- the abscissa of the second data point is the fourth transmission parameter. That is, the speed measuring device takes the depth information of the pixel point as the ordinate and the transmission parameter of the pixel point as the abscissa.
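The two data points pair a pixel's depth with its transmission parameter; fitting a curve through them lets the device look up a transmission parameter for any depth. A sketch under the assumption of a linear depth-parameter relation (names and values are illustrative):

```python
def fit_depth_to_parameter(p1, p2):
    """Fit a line through two (depth, transmission_parameter) data points,
    assuming - for this sketch - a linear relation between depth and
    transmission parameter. Returns a lookup function over depth."""
    (d0, b0), (d1, b1) = p1, p2
    slope = (b1 - b0) / (d1 - d0)
    return lambda depth: b0 + slope * (depth - d0)

# Hypothetical data points: parameter 4.0 at depth 2 m, 2.0 at depth 4 m
# (nearer pixels get larger transmission parameters in this sketch).
lookup = fit_depth_to_parameter((2.0, 4.0), (4.0, 2.0))
b = lookup(3.0)  # 3.0
```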
- after obtaining the first data point and the second data point, the speed measuring device performs the following steps in the process of performing step 6:
- both the first data point and the second data point carry the depth information of the pixel point.
- the transmission parameter map obtained by the speed measuring device also carries depth information.
- the speed measuring device obtains the transmission parameter map by performing step 14, which can improve the accuracy of the size of the pixel in the first image to be processed.
- the precision of the transmission parameter map can further improve the precision of the transmission parameters of the pixels in the first image to be processed, thereby improving the precision of the speed of the first object.
- the speed measuring device can obtain the transmission parameter map of the second image to be processed based on the depth map of the second image to be processed, thereby improving the accuracy of the speed.
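The construction of data points and the curve fitting described above can be sketched as follows; the depth values, transmission parameter values, and the degree-1 fit are illustrative assumptions rather than the disclosed fitting procedure:

```python
import numpy as np

# Data points: abscissa is the depth information of a pixel point,
# ordinate is its transmission parameter (first and second data points).
depths = np.array([2.1, 4.2])    # first and second depth information (assumed)
params = np.array([0.04, 0.08])  # fifth and fourth transmission parameters (assumed)

# Curve fitting over the data points; a degree-1 polynomial is the
# minimal choice with two points.
coeffs = np.polyfit(depths, params, deg=1)

# Evaluating the fitted curve at every pixel's depth yields the
# transmission parameter map: one transmission parameter per pixel.
depth_image = np.array([[2.1, 3.0],
                        [3.5, 4.2]])
transmission_map = np.polyval(coeffs, depth_image)
```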
- the first image to be processed and the second image to be processed are acquired by the same imaging device, and the pose of the imaging device in the process of acquiring the first image to be processed is the same as the pose of the imaging device in the process of acquiring the second image to be processed.
- the pixel points at the same position in the first image to be processed and the second image to be processed have the same scale.
- pixel a belongs to the first image to be processed
- pixel b belongs to the second image to be processed
- the position of pixel a in the first image to be processed is the same as the position of pixel b in the second image to be processed.
- the scale of the pixel point a in the first image to be processed is the same as the scale of the pixel point b in the second image to be processed.
- the imaging device is a surveillance camera.
- the embodiments of the present disclosure also provide a possible application scenario.
- in order to enhance safety in work, life, or social environments, surveillance camera equipment will be installed in various public places to perform security protection according to video stream information.
- by using the technical solutions provided by the embodiments of the present disclosure to process the video stream collected by the surveillance camera device, the density of people in the public place can be determined, thereby effectively preventing the occurrence of public safety accidents.
- the above-mentioned first object is a human object, and the human object belongs to a monitored crowd, where the monitored crowd refers to the crowd in the monitoring picture of a surveillance camera.
- the greater the crowd density, the smaller the distance between people, which in turn leads to slower movement of people. Therefore, whether the crowd density of the monitored crowd is too large can be determined from the speed of the first object.
- when the speed does not exceed the safe speed threshold, the speed measuring device determines that the crowd density of the monitored crowd is too large.
- the speed measuring device further obtains the location of the imaging device (the location of the imaging device carries at least one of the following information: the serial number of the imaging device, the longitude and latitude information of the imaging device), and sends an alarm instruction including the location to the terminal of the relevant manager to remind the manager that the crowd density of the monitored crowd is too large, thereby reducing the probability of public safety accidents.
- the alarm instruction may instruct the terminal to output alarm information that the crowd density of the monitored crowd is too large in at least one of the following ways: light, voice, text, and vibration.
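The alarm logic described above can be sketched as follows; the threshold value, dictionary fields, and device location are illustrative assumptions, not part of the disclosure:

```python
SAFE_SPEED_THRESHOLD = 0.5  # m/s, assumed value

def check_crowd_density(speed, device_location):
    """When the measured speed of a human object does not exceed the safe
    speed threshold, the crowd density is judged too large and an alarm
    instruction carrying the imaging device's location is produced."""
    if speed <= SAFE_SPEED_THRESHOLD:
        return {"alarm": "crowd density too large", "location": device_location}
    return None  # normal movement speed: no alarm

alarm = check_crowd_density(0.3, {"serial": "cam-42", "lat": 31.23, "lon": 121.47})
```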
- the writing order of the steps does not imply a strict execution order and does not constitute any limitation on the implementation process; the specific execution order of each step should be determined by its function and possible internal logic.
- FIG. 6 is a schematic structural diagram of a speed measuring device provided by an embodiment of the present disclosure.
- the speed measuring device includes: a first obtaining unit 11, a second obtaining unit 12, and a first processing unit 13, wherein: the first obtaining unit 11 is used to acquire the first image to be processed and the second image to be processed; wherein both the first image to be processed and the second image to be processed include a first object;
- the second acquisition unit 12 is used to acquire the first position of the first object in the first image to be processed, the second position of the first object in the second image to be processed, and a first transmission parameter of a first moving distance; wherein,
- the first moving distance is the distance between the first position and the second position;
- the first transmission parameter represents the conversion relationship between the first moving distance and the first physical distance;
- the first physical distance is the physical distance corresponding to the first moving distance; the first physical distance is negatively correlated with the scale of the first position in the first image to be processed, and/or the first physical distance is negatively correlated with the scale of the second position in the second image to be processed;
- the first processing unit 13 is configured to: obtain a second moving distance according to the first transmission parameter and the first moving distance; and obtain the speed according to the second moving distance and the moving time.
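The speed computation performed by the first processing unit can be sketched as follows; the function name, units, and numeric values are illustrative assumptions:

```python
def object_speed(first_moving_distance, first_transmission_parameter, t_first, t_second):
    """Convert the pixel-domain first moving distance into the physical
    second moving distance via the first transmission parameter, then
    divide by the moving time derived from the two image timestamps."""
    second_moving_distance = first_transmission_parameter * first_moving_distance
    moving_time = abs(t_second - t_first)
    return second_moving_distance / moving_time

# 100-pixel displacement, 0.02 m per pixel, frames taken 2 s apart -> 1.0 m/s
speed = object_speed(100.0, 0.02, 10.0, 12.0)
```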
- the second obtaining unit 12 is used for:
- acquire a second transmission parameter of a third position; wherein the third position is a position on the line connecting the first position and the second position; the second transmission parameter represents a conversion relationship between the size of the first pixel point and the size of the first object point;
- the first pixel point is a pixel point determined in the first image to be processed according to the third position, or a pixel point determined in the second image to be processed according to the third position;
- the first object point is the object point corresponding to the first pixel point; the first ratio is negatively correlated with the scale of the first pixel point in the image;
- the first ratio is the ratio between the size of the first pixel point and the size of the first object point;
- the first transmission parameter is obtained according to the second transmission parameter; wherein the first transmission parameter is positively correlated with the second transmission parameter.
- the third position is an intermediate position between the first position and the second position.
- the second obtaining unit 12 is used for:
- the first object frame includes the first object
- the second object frame includes the second object
- the third transmission parameter is obtained according to the first size and the third size
- the fourth transmission parameter is obtained according to the second size and the fourth size
- the third size is the physical size of the first object
- the third transmission parameter represents the conversion relationship between the fifth size and the sixth size
- the fifth size is the size of the second pixel
- the position of the second pixel in the first image to be processed is determined according to the position of the first object frame
- the sixth size is the size of the object point corresponding to the second pixel point
- the fourth size is the physical size of the second object
- the fourth transmission parameter represents the conversion relationship between the seventh size and the eighth size
- the seventh size is the size of the third pixel
- the position of the third pixel in the second image to be processed is determined according to the position of the second object frame
- the eighth size is the size of the object point corresponding to the third pixel point
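The derivation of the third and fourth transmission parameters from the detected object frames can be sketched as follows; the pixel heights and the physical sizes (a person assumed 1.7 m tall, a goal assumed 2.44 m tall) are illustrative assumptions:

```python
def transmission_parameter(pixel_size, physical_size):
    """Conversion ratio between a pixel-domain size and the physical size
    of the corresponding object points: physical = parameter * pixel."""
    return physical_size / pixel_size

# First object frame: 120 px tall; third size (physical size) assumed 1.7 m.
third_transmission_parameter = transmission_parameter(120.0, 1.7)

# Second object frame: 60 px tall; fourth size (physical size) assumed 2.44 m.
# A smaller pixel size for a larger physical size yields a larger parameter.
fourth_transmission_parameter = transmission_parameter(60.0, 2.44)
```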
- the conversion relationship between the ninth size and the tenth size is determined according to the first pixel value in the transmission parameter map; the ninth size is the size of the fourth pixel in the first image to be processed; the tenth size is the size of the object point corresponding to the fourth pixel;
- the first pixel value is the pixel value of the fifth pixel point; the fifth pixel point is the pixel point corresponding to the fourth pixel point in the transmission parameter map;
- the second transmission parameter is obtained according to the pixel value corresponding to the third position in the transmission parameter map.
- the first acquisition unit 11 is further configured to obtain a confidence map before curve fitting processing is performed on the third transmission parameter and the fourth transmission parameter to obtain the transmission parameter map of the image to be processed;
- the confidence map represents the mapping between the object type and the confidence of the transmission parameter;
- the speed measuring device further includes: a second processing unit 14, configured to obtain the first confidence level of the third transmission parameter according to the object type of the first object and the confidence mapping; the second obtaining unit 12 is configured to: obtain a fifth transmission parameter according to the first confidence level and the third transmission parameter, wherein the fifth transmission parameter is positively correlated with the first confidence level; and perform curve fitting processing on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map.
- the speed measuring device 1 further includes: a third processing unit 15, configured to perform feature extraction processing on the pixel area in the first object frame to obtain feature data before the first confidence level of the third transmission parameter is obtained according to the object type of the first object and the confidence mapping; and a fourth processing unit 16, configured to obtain the score of the first object according to the feature data, wherein the score is positively correlated with the confidence of the size of the first object; the second processing unit 14 is configured to: obtain a second confidence level of the third transmission parameter according to the object type of the first object and the confidence map; and obtain the first confidence level according to the score and the second confidence level, wherein the first confidence level is correlated with the score.
- the second obtaining unit 12 is configured to: determine the product of the first confidence level and the third transmission parameter to obtain the fifth transmission parameter.
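The product described above can be sketched as follows; the confidence values come from the object-type table in the description, while the object-type keys, function name, and parameter value are illustrative assumptions (in the fuller embodiment the first confidence is additionally derived from a score):

```python
# Confidence mapping between object type and transmission-parameter
# confidence (values taken from the table in the description).
CONFIDENCE_MAP = {
    "goal/basketball hoop/boarding sign": 0.9,
    "human body": 0.8,
    "human face": 0.7,
    "human foot": 0.65,
}

def fifth_transmission_parameter(object_type, third_transmission_parameter):
    """The fifth transmission parameter is the product of the first
    confidence level and the third transmission parameter."""
    first_confidence = CONFIDENCE_MAP[object_type]
    return first_confidence * third_transmission_parameter

p5 = fifth_transmission_parameter("human body", 0.05)  # 0.8 * 0.05
```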
- the first obtaining unit 11 is further configured to obtain the depth image of the first image to be processed before curve fitting processing is performed on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map; the second obtaining unit 12 is further configured to: obtain first depth information of the second pixel point and second depth information of the third pixel point according to the depth image; obtain a first data point according to the first depth information and the fifth transmission parameter, and obtain a second data point according to the second depth information and the fourth transmission parameter; and perform curve fitting processing on the first data point and the second data point to obtain the transmission parameter map.
- the first image to be processed and the second image to be processed are acquired by the same imaging device, and the pose of the imaging device in the process of acquiring the first image to be processed is the same as the pose of the imaging device in the process of acquiring the second image to be processed.
- the first object is a human object; the human object belongs to a monitored crowd; the first obtaining unit 11 is further configured to obtain the position of the imaging device when the speed does not exceed a safe speed threshold; the speed measuring device further includes: a sending unit 17, configured to send an alarm instruction including the position to a terminal, wherein the alarm instruction is used to instruct the terminal to output alarm information that the crowd density of the monitored crowd is too large.
- the speed measuring device obtains the speed of the first object according to the first transmission parameter, the first moving distance and the moving time, which can improve the accuracy of the speed.
- the functions or modules included in the apparatuses provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments.
- FIG. 7 is a schematic diagram of a hardware structure of a speed measuring device according to an embodiment of the present disclosure.
- the speed measuring device 2 includes a processor 21 , a memory 22 , an input device 23 and an output device 24 .
- the processor 21 , the memory 22 , the input device 23 , and the output device 24 are coupled through a connector, and the connector includes various types of interfaces, transmission lines, or buses, which are not limited in this embodiment of the present disclosure. It should be understood that, in various embodiments of the present disclosure, coupling refers to mutual connection in a specific manner, including direct connection or indirect connection through other devices, such as various interfaces, transmission lines, and buses.
- the processor 21 may be one or more graphics processing units (graphics processing units, GPUs).
- the GPU may be a single-core GPU or a multi-core GPU.
- the processor 21 may be a processor group composed of multiple GPUs, and the multiple processors are coupled to each other through one or more buses.
- the processor may also be a processor of another type, which is not limited in this embodiment of the present disclosure.
- the memory 22 may be used to store computer program instructions, as well as various types of computer program code, including program code for implementing the disclosed aspects.
- the memory includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), which is used for related instructions and data.
- the input device 23 is used for inputting data and/or signals
- the output device 24 is used for outputting data and/or signals.
- the input device 23 and the output device 24 may be independent devices or may be an integral device.
- the memory 22 can not only be used to store related instructions, but also can be used to store related data.
- the memory 22 can be used to store the first image to be processed obtained through the input device 23, or to store the speed of the first object obtained by the processor 21, etc.; the embodiment of the present disclosure does not limit the data specifically stored in the memory.
- FIG. 7 only shows a simplified design of a speed measuring device.
- the speed measuring device may also include other necessary components, including but not limited to any number of input/output devices, processors, memories, etc., and all speed measuring devices that can implement the embodiments of the present disclosure are within the scope of protection of the present disclosure.
- the disclosed system, apparatus and method may be implemented in other manners.
- the apparatus embodiments described above are only illustrative.
- the division of the units is only a logical function division. In actual implementation, there may be other division methods.
- multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
- each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
- when implemented by software, the embodiments can be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions.
- the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
- the computer instructions may be stored in or transmitted over a computer-readable storage medium.
- the computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means.
- the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, a data center, or the like that includes an integration of one or more available media.
- the available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., digital versatile disc (DVD)), or semiconductor media (e.g., solid state disk (SSD)), etc.
- the process can be completed by a computer program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium; when the program is executed, it may include the processes of the foregoing method embodiments.
- the aforementioned storage medium includes: read-only memory (read-only memory, ROM) or random access memory (random access memory, RAM), magnetic disk or optical disk and other media that can store program codes.
- the present disclosure discloses a speed measurement method and device, an electronic device and a storage medium.
- the method includes: acquiring a first image to be processed and a second image to be processed, both of which include a first object; acquiring a first position of the first object in the first image to be processed, a second position of the first object in the second image to be processed, and a first transmission parameter of a first moving distance; and obtaining the speed of the first object according to the first moving distance, the first transmission parameter, and a moving time; the moving time is obtained according to the timestamp of the first image to be processed and the timestamp of the second image to be processed.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
Abstract
Description
Object type | Confidence |
Goal, basketball hoop, boarding sign | 0.9 |
Human body | 0.8 |
Human face | 0.7 |
Human foot | 0.65 |
Claims (14)
- A speed measurement method, the method comprising: acquiring a first image to be processed and a second image to be processed, wherein both the first image to be processed and the second image to be processed include a first object; acquiring a first position of the first object in the first image to be processed, a second position of the first object in the second image to be processed, and a first transmission parameter of a first moving distance, wherein the first moving distance is the distance between the first position and the second position; the first transmission parameter represents a conversion relationship between the first moving distance and a first physical distance; the first physical distance is the physical distance corresponding to the first moving distance; the first physical distance is negatively correlated with the scale of the first position in the first image to be processed, and/or the first physical distance is negatively correlated with the scale of the second position in the second image to be processed; and obtaining a speed of the first object according to the first moving distance, the first transmission parameter, and a moving time, wherein the moving time is obtained according to a timestamp of the first image to be processed and a timestamp of the second image to be processed.
- The method according to claim 1, wherein obtaining the speed of the first object according to the first moving distance, the first transmission parameter, and the moving time comprises: obtaining a second moving distance according to the first transmission parameter and the first moving distance; and obtaining the speed according to the second moving distance and the moving time.
- The method according to claim 1 or 2, wherein acquiring the first transmission parameter of the first moving distance comprises: acquiring a second transmission parameter of a third position, wherein the third position is a position on the line connecting the first position and the second position; the second transmission parameter represents a conversion relationship between the size of a first pixel point and the size of a first object point; the first pixel point is a pixel point determined in the first image to be processed according to the third position, or a pixel point determined in the second image to be processed according to the third position; the first object point is the object point corresponding to the first pixel point; a first ratio is negatively correlated with the scale of the first pixel point in the image; the first ratio is the ratio between the size of the first pixel point and the size of the first object point; and obtaining the first transmission parameter according to the second transmission parameter, wherein the first transmission parameter is positively correlated with the second transmission parameter.
- The method according to claim 3, wherein the third position is an intermediate position between the first position and the second position.
- The method according to claim 3 or 4, wherein acquiring the second transmission parameter of the third position comprises: performing object detection processing on the first image to be processed to obtain a position of a first object frame and a position of a second object frame, wherein the first object frame contains a first object and the second object frame contains a second object; obtaining a first size of the first object according to the position of the first object frame, and obtaining a second size of the second object according to the position of the second object frame; obtaining a third transmission parameter according to the first size and a third size, and obtaining a fourth transmission parameter according to the second size and a fourth size, wherein the third size is the physical size of the first object; the third transmission parameter represents a conversion relationship between a fifth size and a sixth size; the fifth size is the size of a second pixel point; the position of the second pixel point in the first image to be processed is determined according to the position of the first object frame; the sixth size is the size of the object point corresponding to the second pixel point; the fourth size is the physical size of the second object; the fourth transmission parameter represents a conversion relationship between a seventh size and an eighth size; the seventh size is the size of a third pixel point; the position of the third pixel point in the second image to be processed is determined according to the position of the second object frame; the eighth size is the size of the object point corresponding to the third pixel point; performing curve fitting processing on the third transmission parameter and the fourth transmission parameter to obtain a transmission parameter map of the first image to be processed, wherein a conversion relationship between a ninth size and a tenth size is determined according to a first pixel value in the transmission parameter map; the ninth size is the size of a fourth pixel point in the first image to be processed; the tenth size is the size of the object point corresponding to the fourth pixel point; the first pixel value is the pixel value of a fifth pixel point; the fifth pixel point is the pixel point corresponding to the fourth pixel point in the transmission parameter map; and obtaining the second transmission parameter according to the pixel value corresponding to the third position in the transmission parameter map.
- The method according to claim 5, wherein before the curve fitting processing is performed on the third transmission parameter and the fourth transmission parameter to obtain the transmission parameter map of the image to be processed, the method further comprises: obtaining a confidence mapping, wherein the confidence mapping represents a mapping between object types and confidences of transmission parameters; and obtaining a first confidence of the third transmission parameter according to the object type of the first object and the confidence mapping; and wherein performing the curve fitting processing on the third transmission parameter and the fourth transmission parameter to obtain the transmission parameter map of the image to be processed comprises: obtaining a fifth transmission parameter according to the first confidence and the third transmission parameter, wherein the fifth transmission parameter is positively correlated with the first confidence; and performing curve fitting processing on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map.
- The method according to claim 6, wherein before the first confidence of the third transmission parameter is obtained according to the object type of the first object and the confidence mapping, the method further comprises: performing feature extraction processing on the pixel area within the first object frame to obtain feature data; and obtaining a score of the first object according to the feature data, wherein the score is positively correlated with the confidence of the size of the first object; and wherein obtaining the first confidence of the third transmission parameter according to the object type of the first object and the confidence mapping comprises: obtaining a second confidence of the third transmission parameter according to the object type of the first object and the confidence mapping; and obtaining the first confidence according to the score and the second confidence, wherein the first confidence is correlated with the score.
- The method according to claim 5 or 6, wherein obtaining the fifth transmission parameter according to the first confidence and the third transmission parameter comprises: determining the product of the first confidence and the third transmission parameter to obtain the fifth transmission parameter.
- The method according to any one of claims 6 to 8, wherein before the curve fitting processing is performed on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map, the method further comprises: acquiring a depth image of the first image to be processed; obtaining first depth information of the second pixel point and second depth information of the third pixel point according to the depth image; and obtaining a first data point according to the first depth information and the fifth transmission parameter, and obtaining a second data point according to the second depth information and the fourth transmission parameter; and wherein performing the curve fitting processing on the fourth transmission parameter and the fifth transmission parameter to obtain the transmission parameter map comprises: performing curve fitting processing on the first data point and the second data point to obtain the transmission parameter map.
- The method according to any one of claims 1 to 9, wherein the first image to be processed and the second image to be processed are acquired by the same imaging device, and the pose of the imaging device in the process of acquiring the first image to be processed is the same as the pose of the imaging device in the process of acquiring the second image to be processed.
- The method according to claim 10, wherein the first object is a human object and the human object belongs to a monitored crowd, and the method further comprises: acquiring the position of the imaging device when the speed does not exceed a safe speed threshold; and sending an alarm instruction containing the position to a terminal, wherein the alarm instruction is used to instruct the terminal to output alarm information that the crowd density of the monitored crowd is too large.
- A speed measuring device, the device comprising: a first acquisition unit, configured to acquire a first image to be processed and a second image to be processed, both of which include a first object; a second acquisition unit, configured to acquire a first position of the first object in the first image to be processed, a second position of the first object in the second image to be processed, and a first transmission parameter of a first moving distance, wherein the first moving distance is the distance between the first position and the second position; the first transmission parameter represents a conversion relationship between the first moving distance and a first physical distance; the first physical distance is the physical distance corresponding to the first moving distance; the first physical distance is negatively correlated with the scale of the first position in the first image to be processed, and/or the first physical distance is negatively correlated with the scale of the second position in the second image to be processed; and a first processing unit, configured to obtain a speed of the first object according to the first moving distance, the first transmission parameter, and a moving time, wherein the moving time is obtained according to a timestamp of the first image to be processed and a timestamp of the second image to be processed.
- An electronic device, comprising a processor and a memory, the memory being configured to store computer program code, the computer program code comprising computer instructions, wherein when the processor executes the computer instructions, the electronic device performs the method according to any one of claims 1 to 11.
- A computer-readable storage medium storing a computer program, the computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 11.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217030375A KR20220004016A (ko) | 2020-06-30 | 2020-10-16 | 속도 측정 방법 및 장치, 전자 기기 및 저장 매체 |
JP2021547718A JP2022542205A (ja) | 2020-06-30 | 2020-10-16 | 速度測定方法及び装置、電子デバイス並びに記憶媒体 |
US17/477,746 US20220005208A1 (en) | 2020-10-16 | 2021-09-17 | Speed measurement method and apparatus, electronic device, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010613426.XA CN111739098B (zh) | 2020-06-30 | 2020-06-30 | 测速方法及装置、电子设备及存储介质 |
CN202010613426.X | 2020-06-30 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/477,746 Continuation US20220005208A1 (en) | 2020-10-16 | 2021-09-17 | Speed measurement method and apparatus, electronic device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022000856A1 true WO2022000856A1 (zh) | 2022-01-06 |
Family
ID=72653708
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/121491 WO2022000856A1 (zh) | 2020-06-30 | 2020-10-16 | 测速方法及装置、电子设备及存储介质 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN111739098B (zh) |
TW (1) | TWI735367B (zh) |
WO (1) | WO2022000856A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111739098B (zh) * | 2020-06-30 | 2024-05-24 | 上海商汤智能科技有限公司 | 测速方法及装置、电子设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101877174A (zh) * | 2009-09-29 | 2010-11-03 | 杭州海康威视软件有限公司 | 车速测量方法、监控机及车速测量系统 |
CN108957024A (zh) * | 2017-05-22 | 2018-12-07 | 阿里巴巴集团控股有限公司 | 一种速度测量的方法、装置以及电子设备 |
CN109979206A (zh) * | 2017-12-28 | 2019-07-05 | 杭州海康威视系统技术有限公司 | 车辆测速方法、装置、系统、电子设备及存储介质 |
CN110824188A (zh) * | 2019-10-17 | 2020-02-21 | 浙江大华技术股份有限公司 | 高速公路车辆的测速方法、装置、编解码器及存储装置 |
CN111739098A (zh) * | 2020-06-30 | 2020-10-02 | 上海商汤智能科技有限公司 | 测速方法及装置、电子设备及存储介质 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE112007003284T5 (de) * | 2007-01-26 | 2009-12-24 | Trimble Jena Gmbh | Optisches Instrument und Verfahren zum Erhalten von Abstands- und Bildinformation |
TW201013214A (en) * | 2008-09-17 | 2010-04-01 | Altek Corp | Photography device and method for sensing the moving speed of the photographed object |
JP2015025657A (ja) * | 2011-10-07 | 2015-02-05 | 川崎重工業株式会社 | 速度計測装置及び方法 |
US10586339B2 (en) * | 2015-03-18 | 2020-03-10 | Riken | Device for measuring rotation of spherical body, measurement method, and program |
CN107255812A (zh) * | 2017-06-30 | 2017-10-17 | 努比亚技术有限公司 | 基于3d技术的测速方法、移动终端、及存储介质 |
CN107462741B (zh) * | 2017-07-26 | 2019-12-31 | 武汉船用机械有限责任公司 | 一种运动物体速度及加速度测量装置 |
CN110889890B (zh) * | 2019-11-29 | 2023-07-28 | 深圳市商汤科技有限公司 | 图像处理方法及装置、处理器、电子设备及存储介质 |
-
2020
- 2020-06-30 CN CN202010613426.XA patent/CN111739098B/zh active Active
- 2020-10-16 WO PCT/CN2020/121491 patent/WO2022000856A1/zh active Application Filing
- 2020-10-29 TW TW109137705A patent/TWI735367B/zh active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101877174A (zh) * | 2009-09-29 | 2010-11-03 | 杭州海康威视软件有限公司 | 车速测量方法、监控机及车速测量系统 |
CN108957024A (zh) * | 2017-05-22 | 2018-12-07 | 阿里巴巴集团控股有限公司 | 一种速度测量的方法、装置以及电子设备 |
CN109979206A (zh) * | 2017-12-28 | 2019-07-05 | 杭州海康威视系统技术有限公司 | 车辆测速方法、装置、系统、电子设备及存储介质 |
CN110824188A (zh) * | 2019-10-17 | 2020-02-21 | 浙江大华技术股份有限公司 | 高速公路车辆的测速方法、装置、编解码器及存储装置 |
CN111739098A (zh) * | 2020-06-30 | 2020-10-02 | 上海商汤智能科技有限公司 | 测速方法及装置、电子设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN111739098B (zh) | 2024-05-24 |
TW202202845A (zh) | 2022-01-16 |
CN111739098A (zh) | 2020-10-02 |
TWI735367B (zh) | 2021-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI752466B (zh) | 影像處理方法、處理器、電子設備、儲存媒介 | |
US10121076B2 (en) | Recognizing entity interactions in visual media | |
US11430265B2 (en) | Video-based human behavior recognition method, apparatus, device and storage medium | |
CN105051754B (zh) | 用于通过监控系统检测人的方法和装置 | |
US9996731B2 (en) | Human head detection in depth images | |
WO2022135511A1 (zh) | 运动物体的定位方法、装置、电子设备及存储介质 | |
US10922536B2 (en) | Age classification of humans based on image depth and human pose | |
US20220351390A1 (en) | Method for generating motion capture data, electronic device and storage medium | |
WO2021097750A1 (zh) | 人体姿态的识别方法、装置、存储介质及电子设备 | |
EP4050305A1 (en) | Visual positioning method and device | |
US20240104744A1 (en) | Real-time multi-view detection of objects in multi-camera environments | |
CN104484814A (zh) | 一种基于视频地图的广告方法及系统 | |
WO2022000856A1 (zh) | 测速方法及装置、电子设备及存储介质 | |
CN111177811A (zh) | 一种应用于云平台的消防点位自动布图的方法 | |
CN109816628B (zh) | 人脸评价方法及相关产品 | |
WO2021238151A1 (zh) | 图像标注方法、装置、电子设备、存储介质及计算机程序 | |
CN114005140A (zh) | 一种人员识别方法、装置、设备、行人监控系统及存储介质 | |
US20220005208A1 (en) | Speed measurement method and apparatus, electronic device, and storage medium | |
CN109785439A (zh) | 人脸素描图像生成方法及相关产品 | |
CN111739086A (zh) | 测量面积的方法及装置、电子设备及存储介质 | |
US20230089845A1 (en) | Visual Localization Method and Apparatus | |
WO2023273154A1 (zh) | 图像处理方法、装置、设备、介质及程序 | |
JP2022542205A (ja) | 速度測定方法及び装置、電子デバイス並びに記憶媒体 | |
TWI739601B (zh) | 圖像處理方法、電子設備和儲存介質 | |
CN111739097A (zh) | 测距方法及装置、电子设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2021547718 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20942674 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 521431113 Country of ref document: SA |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20942674 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 05.07.2023) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20942674 Country of ref document: EP Kind code of ref document: A1 |