WO2021208258A1 - Tracking-target-based search method and device, and handheld camera thereof - Google Patents
Tracking-target-based search method and device, and handheld camera thereof
- Publication number: WO2021208258A1 (PCT/CN2020/099835)
- Authority: WO — WIPO (PCT)
- Prior art keywords: area, search, tracking, image frame, algorithm
Classifications
- G—PHYSICS; G06—COMPUTING, CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
- G06T7/223—Analysis of motion using block-matching
- G06T2207/10016—Video; Image sequence (G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/10—Image acquisition modality)
Definitions
- the embodiments of the present application relate to the field of computer vision technology, and in particular, to a search method, equipment and handheld camera based on tracking a target.
- Target detection and tracking is a fast-developing direction in the field of computer vision in recent years.
- with the development of visual processing and artificial intelligence technology, household handheld cameras can also track a target to be photographed and perform object recognition and scene recognition based on that target, so that users can classify and manage the resulting photos or videos and carry out other subsequent automatic processing operations.
- the current single-target tracking algorithms share a problem: when at least one of the shape, lighting conditions, scene, or position of the target to be tracked changes, the tracking effect is seriously degraded and tracking shooting may fail.
- one of the technical problems solved by the embodiments of the present invention is to provide a tracking-target-based search method, device, and handheld camera, so as to overcome the prior-art defect that tracking shooting is prone to failure.
- An embodiment of the present application provides a search method based on a tracking target, including: determining, according to the effective identification information of the tracking area corresponding to the tracking target, a first search area corresponding to a first search algorithm in a first image frame, and using the first search algorithm to search the first search area; and determining, according to the same effective identification information, at least one second search area corresponding to a second search algorithm in a second image frame, and using the second search algorithm to search the second search area.
- An embodiment further provides a tracking-target-based search device, which includes a memory, a processor, and a video collector. The video collector is used to collect the target to be tracked in a target area; the memory is used to store program code; the processor is used to call and execute the program code, which when executed performs the following operations: determine, according to the effective identification information of the tracking area corresponding to the tracking target, a first search area corresponding to the first search algorithm in the first image frame, and use the first search algorithm to search the first search area; and determine, according to the same effective identification information, at least one second search area corresponding to the second search algorithm in the second image frame, and use the second search algorithm to search the second search area.
- An embodiment further provides a handheld camera, which includes the tracking-target-based search device described in the foregoing embodiment and further includes a carrier fixedly connected to the video collector for carrying at least a part of the video collector.
- the search method provided by the embodiments of the present application determines, according to the effective identification information of the tracking area corresponding to the tracking target, the first search area corresponding to the first search algorithm in an image frame and searches it, and likewise determines and searches the second search area corresponding to the second search algorithm. By combining the first and second search algorithms, the accuracy of the search can be improved and the probability of losing the tracking target reduced.
- FIG. 1 is a schematic flowchart of a search method based on tracking targets according to an embodiment of the application
- FIG. 2 is a flowchart of an embodiment of a first search algorithm in a tracking target-based search method provided by an embodiment of this application;
- FIG. 3 is a flowchart of an embodiment of a second search algorithm in a tracking target-based search method provided by an embodiment of the application;
- FIG. 4 is a schematic diagram of a second search area generated based on a second search algorithm according to an embodiment of the application;
- FIG. 5 is a flowchart of an embodiment of a third search algorithm in a tracking target-based search method provided by an embodiment of this application;
- FIG. 6 is a schematic framework diagram of a tracking target-based search device provided by an embodiment of this application.
- FIGS. 7 to 9 are schematic structural diagrams of a handheld camera provided by an embodiment of the application.
- the technical solutions provided by the embodiments of the present application improve the accuracy of the tracking search by improving the existing tracking algorithm, and improve the use experience of tracking and shooting.
- FIG. 1 is a schematic flowchart of the tracking target-based search method provided in the first embodiment of this application.
- the above tracking target-based search method can be applied to various shooting devices or any electronic devices with shooting functions.
- it can be applied to portable shooting devices such as pocket cameras, sports cameras, and handheld cameras, as well as to electronic devices with shooting functions such as smart phones and tablets; the present invention does not limit this.
- the tracking target-based search method in the embodiment of the present application mainly includes the following steps:
- Step S11 Determine a first search area corresponding to the first search algorithm in the first image frame according to the effective identification information of the tracking area corresponding to the tracking target, and use the first search algorithm to search in the first search area.
- the effective identification information can be used to identify the location, shape, size and other identification features of the tracking area, but it is not limited to this.
- the effective identification information can also be used to identify other identification features of the tracking area, such as color and material.
- the tracking area is determined based on an effective frame in the image frame, where the effective frame identifies the position, shape, and size of the tracking target in the first image frame. The effective frame is usually rectangular and scales with the apparent size of the tracking target (for example, an effective frame that frames a human face grows or shrinks as the photographed face moves closer or farther away). When the position or size of the effective frame changes, the center position and size of the tracking area are adjusted accordingly.
- the center position of the tracking area coincides with the center point of the effective frame, and the area of the tracking area is a predetermined multiple of the area of the effective frame.
- for example, the side length of the tracking area may be 4 times the side length of the effective frame (that is, 4 times the side length of the tracking target). It should be noted that the center position and size (i.e., side length) of the tracking area can also be adjusted according to actual requirements; the present invention does not limit this.
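As a rough illustration of the relationship just described (a sketch, not the patent's implementation; the type and function names are assumptions, and the 4x factor is taken from the example above), deriving the tracking area from the effective frame might look like:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    cx: float  # center x
    cy: float  # center y
    w: float   # width
    h: float   # height

def tracking_area_from_effective_frame(effective: Rect, multiple: float = 4.0) -> Rect:
    """The tracking area shares the effective frame's center point; each of its
    side lengths is a predetermined multiple (4x in the patent's example) of the
    corresponding side length of the effective frame."""
    return Rect(effective.cx, effective.cy,
                effective.w * multiple, effective.h * multiple)
```

When the effective frame moves or resizes (e.g. the face comes closer), recomputing with the new frame yields the correspondingly shifted and scaled tracking area.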
- the first search algorithm is used to determine a first search area with the current position of the tracking target as the center, and search and identify the main body part of the tracking target accordingly.
- the center point of the first search area coincides with the center position of the tracking area, and the search range of the first search area is the same or different from the area size of the tracking area, depending on the value of the adjustment parameter.
- the first search algorithm can be executed once for each of at least 12 consecutive first image frames (a total time of roughly 1 second), but this is not limiting; the number of first image frames can also be adjusted according to actual needs.
- the search ranges of the first search areas generated in any two adjacent first image frames are not equal, so as to improve search accuracy.
- Step S12 According to the effective identification information of the tracking area corresponding to the tracking target, determine at least one second search area corresponding to the second search algorithm in the second image frame, and use the second search algorithm to search in the second search area.
- the definitions of the effective identification information and the tracking area are the same as those described in step S11 above, and will not be repeated here.
- the second search algorithm is used to determine at least one second search area in the surrounding area of the tracking target, so as to search for and recognize the surroundings of the tracking target.
- the area of each second search region equals the area of the tracking area, while the center point of the second search area differs from the center position of the tracking area.
- the first image frame and the second image frame may be image frames with the same frame number (that is, the same image frame in the image sequence); for example, the third image frame in the sequence serves as both the first image frame and the second image frame.
- alternatively, the first and second image frames may be two frames with different frame numbers in the same image sequence; for example, the third frame is the first image frame and the fourth frame is the second image frame.
- the execution sequence of the first search algorithm (ie, step S11) and the second search algorithm (ie, step S12) can be adjusted according to actual needs.
- after the first search algorithm finishes searching the first search area, the second search algorithm is used to search the second search area at least once; in this case the first and second image frames have different frame numbers. The second search algorithm can also be applied multiple times, searching multiple subsequent second image frames in the sequence (for example, the 13th to 20th frames).
- alternatively, while the first search algorithm is searching the first search area, the second search algorithm may simultaneously search the second search area; in this case the first and second image frames have the same frame number, and applying two search algorithms to the same image frame improves search accuracy.
- for example, both the first and second search algorithms can be applied to the third image frame in the sequence.
- alternatively, exactly one of the first and second search algorithms may be applied to each image frame in the sequence.
- the search algorithms can alternate every other frame: the first search algorithm searches the second image frame in the sequence, the second search algorithm searches the third image frame, and so on.
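The two per-frame scheduling options above (running both algorithms on one frame vs. alternating every other frame) can be pictured with a small dispatch helper; the function and policy names are illustrative assumptions, not the patent's terminology:

```python
def algorithms_for_frame(frame_idx: int, policy: str = "alternate"):
    """Return which search algorithms run on a given frame index.

    'both'      -> first and second search algorithm on the same frame
    'alternate' -> first algorithm on even-numbered frames, second on odd ones
                   (matching the example: first on frame 2, second on frame 3)
    """
    if policy == "both":
        return ["first", "second"]
    return ["first"] if frame_idx % 2 == 0 else ["second"]
```

The 'both' policy trades extra computation per frame for accuracy; 'alternate' spreads the two algorithms across the sequence.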
- the embodiments of the present application are based on the effective identification information of the tracking area corresponding to the tracking target, and combined with the first search algorithm and the second search algorithm for searching, the search accuracy can be improved and the probability of tracking loss can be reduced.
- the second embodiment of the present application provides a search method based on tracking targets. Please refer to FIG. 2, this embodiment of the present application describes an exemplary processing flow of determining the first search area in the first image frame in step S11 shown in FIG. 1. As shown in the figure, the search method of the embodiment of the present application mainly includes the following:
- Step S21 Determine the center position and the area size of the tracking area in the first image frame according to the effective identification information of the tracking area.
- the effective identification information of the tracking area is used to identify the center position (center point) and the area size of the tracking area, but it is not limited to this, and can also be used to identify other identification features of the tracking area.
- Step S22 Determine the first search area corresponding to the first search algorithm in the first image frame according to the center position and area size of the tracking area.
- the tracking area is a rectangular area, the side length of the first search area is n times the corresponding side length of the tracking area, where n is an adjustment parameter, and the center point of the first search area coincides with the center position of the tracking area.
- the first search area generated from the tracking area is thus also rectangular, its four side lengths corresponding one-to-one to those of the tracking area: the left and right sides of the first search area are n times the left and right sides of the tracking area, the upper and lower sides are n times the upper and lower sides of the tracking area, and the center point of the first search area is the center position of the tracking area.
- the deviation of n from 1.0 is less than or equal to 0.3; that is, the adjustment parameter n enlarges or shrinks the first search area relative to the tracking area by no more than 0.3 times its size.
- the first search algorithm may be used to search for multiple consecutive first image frames in the image sequence, and the values of adjustment parameters corresponding to two adjacent first image frames are different.
- the first search algorithm can be used to search for 12 consecutive first image frames in the image sequence (search once per frame).
- the difference between the adjustment parameters corresponding to two adjacent first image frames does not exceed 0.3, enlarging adjustment parameters (values greater than 1.0) and shrinking adjustment parameters (values less than 1.0) are set alternately, and constant adjustment parameters (value equal to 1.0) can be inserted arbitrarily, for example between adjacent enlarging and shrinking parameters.
- as a concrete example, each of 12 consecutive first image frames in the image sequence (for example, frames 2 to 13) is searched once using the first search algorithm, where the side length of the first search area generated in frame 2 is 1.0 times the corresponding side length of the tracking area, in frame 3 it is 1.1 times, in frame 4 it is 1.0 times, in frame 5 it is 0.9 times, and so on.
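This alternating schedule (1.0, 1.1, 1.0, 0.9, ...) can be sketched as follows; the helper names are illustrative assumptions, and the cycle values are those from the example above:

```python
from itertools import cycle, islice

def adjustment_parameters(num_frames: int = 12):
    """Alternate an enlarging (>1.0) and a shrinking (<1.0) parameter with a
    constant (1.0) parameter inserted between them, as in the example:
    1.0, 1.1, 1.0, 0.9, 1.0, 1.1, ...  Adjacent values differ by <= 0.3,
    and every value stays within 0.3 of 1.0."""
    return list(islice(cycle([1.0, 1.1, 1.0, 0.9]), num_frames))

def first_search_side_lengths(tracking_side: float, num_frames: int = 12):
    """Side length of the first search area in each consecutive frame."""
    return [n * tracking_side for n in adjustment_parameters(num_frames)]
```

This gives search areas that breathe around the tracking area, so slight growth or shrinkage of the target between frames still falls inside some search area.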
- in this way, the second embodiment uses the first search algorithm to generate and search, in the first image frame, first search areas that share the tracking area's center point but differ in size, so as to search for and recognize the main body of the tracking target, thereby improving both search efficiency and search accuracy.
- the third embodiment of the present application provides a search method based on tracking targets. Please refer to FIG. 3, this embodiment of the present application describes an exemplary processing flow of determining the second search area in the second image frame in step S12 shown in FIG. 1. As shown in the figure, the search method of the embodiment of the present application mainly includes the following:
- Step S31 Determine the center position and the area of the tracking area in the second image frame according to the effective identification information of the tracking area.
- the effective identification information of the tracking area is used to identify the center position (center point) and the area of the tracking area, but it is not limited to this and can also identify other features of the tracking area.
- Step S32 Determine a second search area corresponding to the second search algorithm in the second image frame according to the center position and area of the tracking area.
- the tracking area is a rectangular area, and the area of each second search region equals the area of the tracking area.
- from the area of the tracking area, the side length a of a square of equal area can be determined; that is, the area of the rectangular tracking area is converted into a square of the same area with side length a.
- the shortest distance between the center points of two adjacent second search areas can be set to 0.3 times the square's side length (0.3a), and the shortest distance between the center point of any second search area and the center of the tracking area is at least 0.3 times the square's side length. This is not limiting; generally speaking, it is better that the shortest distance between the center points of two adjacent second search areas not exceed 0.5 times the square's side length.
- the process of determining the second search areas corresponding to the second search algorithm in the second image frame includes: taking the center position of the tracking area as the origin of a rectangular coordinate system, the center point of each second search area is located at position (±0.3ma, ±0.3m'a) in that coordinate system, where m and m' are integers and a is the side length of a square whose area equals that of the tracking area.
- the shortest distance between the center points of any two second search areas is thus 0.3a, and the shortest distance between the center point of each second search area and the center of the tracking area is at least 0.3a.
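A minimal sketch of this center-point grid (±0.3ma, ±0.3m'a); the range of m and m' and the function name are assumptions, since the patent does not bound the grid:

```python
import math

def second_search_centers(track_cx, track_cy, track_area, max_m=2):
    """Centers of candidate second search areas on a grid spaced 0.3*a,
    where a is the side of a square whose area equals the tracking area's.
    The origin (m = m' = 0) is excluded, so every center lies at least
    0.3*a from the tracking area's center."""
    a = math.sqrt(track_area)  # side length of the equal-area square
    centers = []
    for m in range(-max_m, max_m + 1):
        for mp in range(-max_m, max_m + 1):
            if m == 0 and mp == 0:
                continue  # skip the tracking area's own center point
            centers.append((track_cx + 0.3 * m * a, track_cy + 0.3 * mp * a))
    return centers
```

Each returned point is the center of a second search area of the same area as the tracking area, giving a ring of overlapping candidates around the target's last known position.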
- in this way, the second search algorithm can generate, in the second image frame, multiple second search areas at different positions but with the same search area, and accordingly search for and identify the surroundings of the tracking target, thereby improving the accuracy of the search algorithm.
- the fourth embodiment of the present application provides a tracking target-based search method. Please refer to FIG. 5.
- the tracking target-based search method of the embodiment of the present application mainly includes the following:
- Step S51 Determine the area size of the tracking area in the third image frame according to the effective identification information of the tracking area corresponding to the tracking target.
- the effective identification information of the tracking area is used to identify the area of the tracking area, but it is not limited to this, and can also be used to identify other identification features of the tracking area.
- Step S52 Determine the area size of the third search area corresponding to the third search algorithm in the third image frame according to the area size of the tracking area, and randomly determine the center point of the third search area in the third image frame.
- the area size of the third search area is the same as the area size of the tracking area.
- the center point of the third search area is any position in the entire third image frame, and the number of the third search area is at least one.
- Step S53 Use the third search algorithm to search in the third search area.
- the third search algorithm is used to randomly determine one or more third search areas at arbitrary positions in the third image frame and perform a global random search, so as to search for and identify the tracking target whether it lies in the main-body, surrounding, or background part of the frame.
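The global random placement of third search areas might be sketched like this (a hedged illustration; clamping the centers so each area stays inside the frame is our assumption, not something the patent states):

```python
import math
import random

def random_third_search_areas(frame_w, frame_h, track_area, count=3, rng=None):
    """Randomly place `count` square search areas, each with the same area
    as the tracking area, anywhere inside the third image frame."""
    rng = rng or random.Random()
    side = math.sqrt(track_area)  # side of a square with the tracking area's area
    areas = []
    for _ in range(count):
        # Clamp centers so the square lies fully within the frame (an assumption).
        cx = rng.uniform(side / 2, frame_w - side / 2)
        cy = rng.uniform(side / 2, frame_h - side / 2)
        areas.append((cx, cy, side, side))
    return areas
```

Because the placement ignores the tracking area's last position, this acts as a recovery step when the local (first and second) searches have lost the target.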
- the first search algorithm, the second search algorithm, and the third search algorithm may be executed alternately, and correspondingly, the frame numbers of the first image frame, the second image frame, and the third image frame are all different.
- in other embodiments, any two of the first search algorithm, the second search algorithm, and the third search algorithm are performed simultaneously, and correspondingly, the frame numbers of the two corresponding image frames are the same.
- the first search algorithm, the second search algorithm, and the third search algorithm are performed simultaneously, and correspondingly, the frame numbers of the first image frame, the second image frame, and the third image frame are all the same.
- step S52 may also be: determine the area of the third search area corresponding to the third search algorithm according to the area of the third image frame, and take the center position of the third image frame as the center point of the third search area, so that a single third search area is determined in the third image frame (that is, the third search area is the entire third image frame); in step S53, the third search algorithm then performs a global search over the entire third image frame.
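The full-frame variant of step S52 reduces to taking the whole third image frame as the single third search area; as a trivial sketch (the function name is assumed):

```python
def full_frame_third_search_area(frame_w: float, frame_h: float):
    """Variant of step S52: the third search area is the entire third image
    frame, centered at the frame's center, returned as (cx, cy, w, h)."""
    return (frame_w / 2, frame_h / 2, frame_w, frame_h)
```
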
- in this way, the fourth embodiment uses the third search algorithm to randomly generate and search third search areas at different positions in the third image frame.
- Fig. 6 shows the main architecture of a tracking target-based search device according to the fifth embodiment of the present invention.
- the tracking-target-based search device mainly includes a memory 602, a processor 604, and a video collector 606. The video collector 606 is used to collect the tracking target in the target area, the memory 602 is used to store program code, and the processor 604 is used to call and execute the program code.
- according to the effective identification information of the tracking area corresponding to the tracking target, determine a first search area corresponding to the first search algorithm in the first image frame, and use the first search algorithm to search the first search area; and according to the same effective identification information, determine at least one second search area corresponding to the second search algorithm in the second image frame, and use the second search algorithm to search the second search area.
- program code is further used to perform the following operations:
- determine the center position and the size of the tracking area in the first image frame according to the effective identification information, and determine, according to that center position and size, the first search area corresponding to the first search algorithm in the first image frame.
- the tracking area is a rectangular area
- the side length of the first search area is n times the corresponding side length of the tracking area, where n is an adjustment parameter, and
- the values of the adjustment parameters corresponding to the adjacent image frames are different.
- program code is further used to perform the following operations:
- determine the center position and the area of the tracking area in the second image frame according to the effective identification information, and determine, according to that center position and area, the second search area corresponding to the second search algorithm in the second image frame.
- program code is further used to perform the following operations:
- according to the area of the tracking area, determine the side length of a square with the same area, and set the shortest distance between the center points of adjacent second search areas to 0.3 times that side length; the shortest distance between the center point of each second search area and the center position of the tracking area is at least 0.3 times the square's side length; and the area of each second search region equals the area of the tracking area.
- after the first search algorithm finishes searching the first search area, the second search algorithm is used to search the second search area at least once, wherein the frame numbers of the first image frame and the second image frame are different.
- alternatively, while the first search algorithm is searching the first search area, the second search algorithm is used to search the second search area, wherein the frame numbers of the first image frame and the second image frame are the same.
- program code is further used to perform the following operations:
- the effective identification information of the tracking area corresponding to the tracking target determine the area size of the tracking area in the third image frame; according to the area size of the tracking area, determine the third search in the third image frame The area size of the third search area corresponding to the algorithm, and randomly determine the center point of the third search area in the third image frame; and use the third search algorithm to search in the third search area.
- the program code is further configured to perform the following operation: for each frame of the first image frame of 12 consecutive frames, the first search algorithm is used to search.
- the sixth embodiment of the present invention provides a handheld camera, which includes the tracking-target-based search device described in the fifth embodiment above. In addition, it also includes a carrier fixedly connected to the video collector to carry at least a part of the video collector.
- the handheld camera is a handheld pan-tilt camera.
- the carrier includes at least a handheld pan/tilt, and the handheld pan/tilt includes but is not limited to a handheld three-axis pan/tilt.
- the video capture device includes, but is not limited to, a handheld three-axis pan/tilt camera.
- taking a handheld gimbal (pan-tilt) camera as an example, the basic structure of the handheld gimbal camera is briefly introduced below.
- the handheld pan/tilt camera of the embodiment of the present invention (as shown in FIG. 7) includes a handle 11 and a photographing device 12 loaded on the handle 11.
- the photographing device 12 may include a three-axis pan-tilt camera; in other embodiments it includes a two-axis or more-than-three-axis pan-tilt camera.
- the handle 11 is provided with a display screen 13 for displaying the shooting content of the shooting device 12.
- the invention does not limit the type of the display screen 13.
- by setting the display screen 13 on the handle 11 of the handheld PTZ camera, the screen can display the content shot by the shooting device 12, so that the user can quickly browse the pictures or videos taken by the shooting device 12 through the display screen 13, thereby improving the interaction and fun between the handheld PTZ camera and the user and meeting the user's diverse needs.
- the handle 11 is further provided with an operating function unit for controlling the shooting device 12. By operating the operating function unit, the operation of the shooting device 12 can be controlled, for example, controlling the turning on and off of the shooting device 12, controlling the shooting of the shooting device 12, and controlling the posture change of the pan-tilt part of the shooting device 12, so that the user can quickly operate the shooting device 12.
- the operation function part may be in the form of a button, a knob or a touch screen.
- the operating function unit includes a photographing button 14 for controlling the photographing of the photographing device 12, a power/function button 15 for controlling the turning on and off of the photographing device 12 and other functions, and a universal key 16 for controlling the pan/tilt. Of course, the operating function unit may also include other control buttons, such as image storage buttons, image playback control buttons, etc., which can be set according to actual needs.
- the operation function part and the display screen 13 are arranged on the same side of the handle 11.
- the operation function part and the display screen 13 shown in the figure are both arranged on the front of the handle 11, which conforms to ergonomics.
- the overall appearance and layout of the handheld PTZ camera is more reasonable and beautiful.
- the side of the handle 11 is provided with a function operation key A, which enables the user to quickly and intelligently produce a finished video clip with one key.
- the handle 11 is further provided with a card slot 17 for inserting a storage element.
- the card slot 17 is provided on the side of the handle 11 adjacent to the display screen 13, and a memory card can be inserted into the card slot 17 to store the images taken by the camera 12.
- arranging the card slot 17 on the side does not affect the use of other functions, and the user experience is better.
- a power supply battery for supplying power to the handle 11 and the imaging device 12 may be provided inside the handle 11.
- the power supply battery can be a lithium battery with large capacity and small size to realize the miniaturization design of the handheld pan/tilt camera.
- the handle 11 is also provided with a charging interface/USB interface 18.
- the charging interface/USB interface 18 is provided at the bottom of the handle 11 to facilitate connection with an external power source or storage device, so as to charge the power supply battery or perform data transmission.
- the handle 11 is further provided with a sound pickup hole 19 for receiving audio signals, and the sound pickup hole 19 communicates with a microphone inside.
- the sound pickup hole 19 may include one or more holes. The handle 11 also includes an indicator light 20 for displaying status. The user can realize audio interaction with the display screen 13 through the sound pickup hole 19.
- the indicator light 20 can serve as a reminder, and the user can obtain the power status of the handheld pan/tilt camera and the current execution function status through the indicator light 20.
- the sound pickup hole 19 and the indicator light 20 can also be arranged on the front of the handle 11, which is more in line with the user's usage habits and operation convenience.
- the imaging device 12 includes a pan-tilt support and a camera mounted on the pan-tilt support.
- the camera may be a camera body, or an image pickup element composed of a lens and an image sensor (such as a CMOS or CCD sensor), which can be selected according to needs.
- the camera may be integrated on the pan-tilt support, so that the photographing device 12 is a pan-tilt camera; it may also be an external photographing device that can be detachably connected to or clamped onto the pan-tilt support.
- the pan/tilt support is a three-axis pan/tilt support
- the photographing device 12 is a three-axis pan/tilt camera.
- the three-axis pan/tilt head bracket includes a yaw axis assembly 22, a roll axis assembly 23 movably connected to the yaw axis assembly 22, and a pitch axis assembly 24 movably connected to the roll axis assembly 23.
- the camera is mounted on the pitch axis assembly 24.
- the yaw axis assembly 22 drives the camera 12 to rotate in the yaw direction.
- the pan/tilt support can also be a two-axis pan/tilt, a four-axis pan/tilt, etc., which can be specifically selected according to needs.
- a mounting portion is further provided at one end of the connecting arm connected to the roll shaft assembly; the yaw shaft assembly may be arranged in the handle, and the yaw shaft assembly drives the photographing device 12 to rotate in the yaw direction.
- the handle 11 is provided with an adapter 26 for coupling with a mobile device 2 (such as a mobile phone), and the adapter 26 is detachably connected to the handle 11.
- the adapter 26 protrudes from the side of the handle 11 for connecting to the mobile device 2.
- the handheld pan/tilt camera is docked with the adapter 26 and supported at one end of the mobile device 2.
- the handle 11 is provided with an adapter 26 for connecting with the mobile device 2 to connect the handle 11 and the mobile device 2 to each other.
- the handle 11 can be used as a base of the mobile device 2.
- the user can hold the other end of the mobile device 2 to pick up and operate the handheld pan/tilt camera together with it.
- the connection is convenient and fast, and the product is beautiful.
- a communication connection between the handheld pan-tilt camera and the mobile device 2 can be realized, and data can be transmitted between the camera 12 and the mobile device 2.
- the adapter 26 and the handle 11 are detachably connected, that is, the adapter 26 can be mechanically mounted on or removed from the handle 11. Further, the adapter 26 is provided with an electrical contact portion, and the handle 11 is provided with an electrical contact mating portion that matches the electrical contact portion.
- the adapter 26 can be removed from the handle 11.
- the adapter 26 is then mounted on the handle 11 to complete the mechanical connection between the adapter 26 and the handle 11; at the same time, the connection between the electrical contact part and the electrical contact mating part ensures the electrical connection between the two, so as to realize data transmission between the camera 12 and the mobile device 2 through the adapter 26.
- the side of the handle 11 is provided with a receiving groove 27, and the adapter 26 is slidably clamped in the receiving groove 27. After the adapter 26 is installed in the receiving slot 27, the adapter 26 partially protrudes from the receiving slot 27, and the portion of the adapter 26 protruding from the receiving slot 27 is used to connect with the mobile device 2.
- when the adapter 26 is reversely inserted into the receiving groove 27, the adapter 26 is flush with the receiving groove 27, and the adapter 26 is thereby stored in the receiving groove 27 of the handle 11.
- the adapter 26 can be installed in the receiving slot 27 so that it protrudes from the receiving slot 27, allowing the mobile device 2 and the handle 11 to be connected to each other.
- the adapter 26 can be taken out of the receiving slot 27 of the handle 11, reversed, and inserted back into the receiving slot 27, so that the adapter 26 is stored in the handle 11.
- after being reversely inserted, the adapter 26 is flush with the receiving groove 27 of the handle 11; storing the adapter 26 in the handle 11 keeps the surface of the handle 11 flat and makes the camera easier to carry.
- the receiving groove 27 is semi-opened on one side surface of the handle 11, which makes it easier for the adapter 26 to be slidably connected to the receiving groove 27.
- the adapter 26 can also be detachably connected to the receiving slot 27 of the handle 11 by means of a snap connection, a plug connection, or the like.
- the receiving groove 27 is provided on the side of the handle 11.
- the receiving groove 27 is clamped and covered by a cover 28, which is convenient for the user to operate and does not affect the overall appearance of the front and sides of the handle.
- the electrical contact part and the electrical contact mating part may be electrically connected by direct contact.
- the electrical contact portion may be a telescopic probe, an electrical plug-in interface, or an electrical contact.
- the electrical contact portion and the electrical contact mating portion can also be directly connected to each other in a surface-to-surface contact manner.
- A1. A search method based on a tracking target, characterized in that the method comprises:
- determining, according to the effective identification information of the tracking area corresponding to the tracking target, a first search area corresponding to a first search algorithm in a first image frame, and using the first search algorithm to search in the first search area; and determining, according to the effective identification information of the tracking area corresponding to the tracking target, at least one second search area corresponding to a second search algorithm in a second image frame, and using the second search algorithm to search in the second search area.
- A2. The search method according to A1, wherein the determining, according to the effective identification information of the tracking area corresponding to the tracking target, of a first search area corresponding to the first search algorithm in the first image frame includes: determining, according to the effective identification information of the tracking area, the center position and the area size of the tracking area in the first image frame; and determining, according to the center position and the area size of the tracking area, the first search area corresponding to the first search algorithm in the first image frame.
- A3. The search method according to A2, wherein the tracking area is a rectangular area, and the side length of the first search area is n times the corresponding side length of the tracking area, where n is an adjustment parameter and |n-1| is less than or equal to 0.3; the center point of the first search area coincides with the center position of the tracking area.
- determining, according to the center position and the area of the tracking area, the second search area corresponding to the second search algorithm in the second image frame.
- A6. The search method according to A5, wherein the determining, according to the center position and the area of the tracking area, of the second search area corresponding to the second search algorithm in the second image frame includes: determining, according to the area of the tracking area, the side length of a square whose area is the same as the area of the tracking area, and setting the shortest distance between the center points of the second search areas to 0.3 times the side length of the square, wherein the shortest distance between the center point of a second search area and the center position of the tracking area is at least 0.3 times the side length of the square, and the search area of each second search area is equal to the area of the tracking area.
- in the process of using the first search algorithm to search in the first search area, the second search algorithm is used to search in the second search area, wherein the frame numbers of the first image frame and the second image frame are the same.
- A9 The search method according to A1, wherein the method further includes:
- determining, according to the area size of the tracking area, the area size of the third search area corresponding to the third search algorithm in the third image frame, and randomly determining the center point of the third search area in the third image frame;
- A10. The search method according to A1, wherein for each of 12 consecutive first image frames, the first search algorithm is used to search.
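As an illustrative sketch (not part of the claims' wording), the first-search-area construction of A2-A4 can be written in a few lines of Python; the sample values of n below are hypothetical, chosen only to show adjacent frames using different adjustment parameters:

```python
def first_search_area(center, size, n):
    """First search area per A2-A4 (sketch): a rectangle concentric with
    the tracking area whose side lengths are n times the tracking area's,
    with the adjustment parameter n satisfying |n - 1| <= 0.3."""
    if abs(n - 1.0) > 0.3:
        raise ValueError("A3 requires |n - 1| <= 0.3")
    (cx, cy), (w, h) = center, size
    # The center point of the first search area coincides with the
    # center position of the tracking area.
    return (cx, cy), (n * w, n * h)

# Per A4, adjacent first image frames use different values of n
# (this alternating sequence is illustrative, not from the patent).
for n in (0.8, 1.2, 0.9, 1.1):
    print(first_search_area((320.0, 240.0), (60.0, 80.0), n))
```

Varying n from frame to frame slightly shrinks or enlarges the window around the last known target position, which helps the search tolerate small scale changes of the target.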
- A11. A tracking target-based search device, characterized by comprising a memory, a processor, and a video collector, wherein the video collector is used to collect the target to be tracked in a target area; the memory is used to store program code; and the processor is used to call and execute the program code, and when the program code is executed, it is used to perform the following operations:
- determining, according to the effective identification information of the tracking area corresponding to the tracking target, a first search area corresponding to the first search algorithm in the first image frame, and using the first search algorithm to search in the first search area;
- determining, according to the effective identification information of the tracking area corresponding to the tracking target, at least one second search area corresponding to the second search algorithm in the second image frame, and using the second search algorithm to search in the second search area.
- the first search area corresponding to the first search algorithm in the first image frame is determined according to the center position and the area size of the tracking area.
- A13. The search device according to A11, wherein the tracking area is a rectangular area, and the side length of the first search area is n times the corresponding side length of the tracking area, where n is an adjustment parameter and |n-1| is less than or equal to 0.3; the center point of the first search area coincides with the center position of the tracking area.
- determining, according to the area of the tracking area, the side length of a square whose area is the same as the area of the tracking area, and setting the shortest distance between the center points of the second search areas to 0.3 times the side length of the square, wherein the shortest distance between the center point of a second search area and the center position of the tracking area is at least 0.3 times the side length of the square, and the search area of each second search area is equal to the area of the tracking area.
- in the process of using the first search algorithm to search in the first search area, the second search algorithm is used to search in the second search area, wherein the frame numbers of the first image frame and the second image frame are the same.
- determining, according to the area size of the tracking area, the area size of the third search area corresponding to the third search algorithm in the third image frame, and randomly determining the center point of the third search area in the third image frame;
- for each of 12 consecutive first image frames, the first search algorithm is used to search.
- A21. A handheld camera, characterized by comprising the tracking target-based search device according to any one of A11-A20, and further comprising: a carrier, the carrier being fixedly connected to the video collector and used to carry at least part of the video collector.
- A22 The handheld camera according to A21, wherein the carrier includes a handheld pan/tilt.
- A23 The handheld camera according to A22, wherein the carrier is a handheld three-axis pan/tilt.
- A24 The handheld camera according to A21, wherein the video capture device comprises a handheld three-axis pan-tilt camera.
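The second-search-area layout of A5-A6 (and the mirrored device claims) fixes only the window size and two distance constraints; the ring placement below is one hypothetical arrangement that satisfies them, sketched in Python:

```python
import math

def second_search_areas(center, tracking_area):
    """Second search areas per A5-A6 (sketch). Each window has the same
    area as the tracking area; s is the side of the square of equal area.
    Window centers sit 0.3*s from the tracking center, and with six points
    on that ring adjacent centers are also exactly 0.3*s apart."""
    s = math.sqrt(tracking_area)      # side of the equal-area square
    r = 0.3 * s                       # offset of each window center
    cx, cy = center
    centers = [
        (cx + r * math.cos(2 * math.pi * k / 6),
         cy + r * math.sin(2 * math.pi * k / 6))
        for k in range(6)             # hypothetical six-window ring
    ]
    return s, centers
```

Six windows is a deliberate choice here: on a circle, six equally spaced points are separated by a chord equal to the radius, so the 0.3*s minimum spacing between window centers is met with equality.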
- for the improvement of a technology, it can be clearly distinguished whether it is a hardware improvement (for example, an improvement in a circuit structure such as a diode, a transistor, or a switch) or a software improvement (an improvement in a method flow).
- however, the improvement of many method flows today can be regarded as a direct improvement of the hardware circuit structure. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module.
- a programmable logic device (PLD), for example a Field Programmable Gate Array (FPGA), is one such hardware entity. Hardware description languages (HDLs) used to program such devices include, but are not limited to: ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), HDCal, JHDL, Lava, Lola, MyHDL, PALASM, RHDL, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language), and Verilog.
- the controller can be implemented in any suitable manner.
- the controller can take the form of, for example, a microprocessor or a processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, application-specific integrated circuits (ASICs), programmable logic controllers, and embedded microcontrollers. Examples of controllers include, but are not limited to, the following microcontrollers: ARC625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller can also be implemented as part of the memory control logic.
- in addition to implementing the controller purely as computer-readable program code, it is entirely possible to program the method steps so that the controller realizes the same function in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included in it for realizing various functions can also be regarded as structures within the hardware component. Or even, the devices for realizing various functions can be regarded both as software modules for implementing the method and as structures within the hardware component.
- a typical implementation device is a computer.
- the computer may be, for example, a personal computer, a laptop computer, a cell phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
- these computer program instructions can also be stored in a computer-readable memory that can guide a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device implementing the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
- these computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing, such that the instructions executed on the computer or other programmable equipment provide steps for implementing the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
- this application can be provided as a method, a system, or a computer program product. Therefore, this application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
- This application may be described in the general context of computer-executable instructions executed by a computer, such as a program module.
- generally, program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
- this application can also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network.
- program modules can be located in local and remote computer storage media including storage devices.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims (10)
- 1. A search method based on a tracking target, characterized in that the method comprises: determining, according to effective identification information of a tracking area corresponding to the tracking target, a first search area corresponding to a first search algorithm in a first image frame, and using the first search algorithm to search in the first search area; and determining, according to the effective identification information of the tracking area corresponding to the tracking target, at least one second search area corresponding to a second search algorithm in a second image frame, and using the second search algorithm to search in the second search area.
- 2. The search method according to claim 1, wherein the determining, according to the effective identification information of the tracking area corresponding to the tracking target, of a first search area corresponding to the first search algorithm in the first image frame comprises: determining, according to the effective identification information of the tracking area, the center position and the area size of the tracking area in the first image frame; and determining, according to the center position and the area size of the tracking area, the first search area corresponding to the first search algorithm in the first image frame.
- 3. The search method according to claim 2, wherein the tracking area is a rectangular area, and the side length of the first search area is n times the corresponding side length of the tracking area, where n is an adjustment parameter and |n-1| is less than or equal to 0.3; and the center point of the first search area coincides with the center position of the tracking area.
- 4. The search method according to claim 3, wherein the adjustment parameters corresponding to adjacent first image frames take different values.
- 5. The search method according to claim 1, wherein the determining, according to the effective identification information of the tracking area corresponding to the tracking target, of at least one second search area corresponding to the second search algorithm in the second image frame comprises: determining, according to the effective identification information of the tracking area, the center position and the area of the tracking area in the second image frame; and determining, according to the center position and the area of the tracking area, the second search area corresponding to the second search algorithm in the second image frame.
- 6. The search method according to claim 5, wherein the determining, according to the center position and the area of the tracking area, of the second search area corresponding to the second search algorithm in the second image frame comprises: determining, according to the area of the tracking area, the side length of a square whose area is the same as the area of the tracking area, and setting the shortest distance between the center points of the second search areas to 0.3 times the side length of the square, wherein the shortest distance between the center point of a second search area and the center position of the tracking area is at least 0.3 times the side length of the square, and the search area of each second search area is equal to the area of the tracking area.
- 7. The search method according to claim 1, wherein after the first search algorithm has been used to search in the first search area at least once, the second search algorithm is used to search in the second search area at least once, wherein the frame numbers of the first image frame and the second image frame are different.
- 8. The search method according to claim 1, wherein, in the process of using the first search algorithm to search in the first search area, the second search algorithm is used to search in the second search area, wherein the frame numbers of the first image frame and the second image frame are the same.
- 9. The search method according to claim 1, wherein the method further comprises: determining, according to the effective identification information of the tracking area corresponding to the tracking target, the area size of the tracking area in a third image frame; determining, according to the area size of the tracking area, the area size of a third search area corresponding to a third search algorithm in the third image frame, and randomly determining the center point of the third search area in the third image frame; and using the third search algorithm to search in the third search area.
- 10. The search method according to claim 1, wherein for each of 12 consecutive first image frames, the first search algorithm is used to search.
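A minimal sketch of claim 9's third search window, assuming (hypothetically) that the third search area keeps the tracking area's size and that "randomly determine the center point" means a uniform draw over positions that keep the window inside the frame:

```python
import random

def third_search_area(frame_size, tracking_size, rng=None):
    """Third search area per claim 9 (sketch): its size is derived from
    the tracking area's size (kept equal here as an assumption), while
    its center point is chosen at random within the third image frame."""
    rng = rng or random.Random()
    fw, fh = frame_size
    w, h = tracking_size                  # assumed: same size as tracking area
    cx = rng.uniform(w / 2, fw - w / 2)   # keep the window inside the frame
    cy = rng.uniform(h / 2, fh - h / 2)
    return (cx, cy), (w, h)

print(third_search_area((640, 480), (60, 80), random.Random(0)))
```

Because the center is not anchored to the last known target position, repeated draws of this window let the search eventually cover the whole frame when the target has been lost.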
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010296054.2A CN111563913B (zh) | 2020-04-15 | 2020-04-15 | 基于跟踪目标的搜索方法、设备及其手持相机 |
CN202010296054.2 | 2020-04-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021208258A1 true WO2021208258A1 (zh) | 2021-10-21 |
Family
ID=72073102
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/099835 WO2021208258A1 (zh) | 2020-04-15 | 2020-07-02 | 基于跟踪目标的搜索方法、设备及其手持相机 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111563913B (zh) |
WO (1) | WO2021208258A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113869163B (zh) * | 2021-09-18 | 2022-08-23 | 北京远度互联科技有限公司 | 目标跟踪方法、装置、电子设备及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101807300A (zh) * | 2010-03-05 | 2010-08-18 | 北京智安邦科技有限公司 | 一种目标碎片区域融合的方法及装置 |
US20120099765A1 (en) * | 2010-10-21 | 2012-04-26 | SET Corporation | Method and system of video object tracking |
CN106920252A (zh) * | 2016-06-24 | 2017-07-04 | 阿里巴巴集团控股有限公司 | 一种图像数据处理方法、装置及电子设备 |
CN108765458A (zh) * | 2018-04-16 | 2018-11-06 | 上海大学 | 基于相关滤波的高海况无人艇海面目标尺度自适应跟踪方法 |
CN110503662A (zh) * | 2019-07-09 | 2019-11-26 | 科大讯飞(苏州)科技有限公司 | 跟踪方法及相关产品 |
CN110853076A (zh) * | 2019-11-08 | 2020-02-28 | 重庆市亿飞智联科技有限公司 | 一种目标跟踪方法、装置、设备及存储介质 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8582811B2 (en) * | 2011-09-01 | 2013-11-12 | Xerox Corporation | Unsupervised parameter settings for object tracking algorithms |
US9760791B2 (en) * | 2015-09-01 | 2017-09-12 | Sony Corporation | Method and system for object tracking |
CN105631895B (zh) * | 2015-12-18 | 2018-05-29 | 重庆大学 | 结合粒子滤波的时空上下文视频目标跟踪方法 |
CN108537726B (zh) * | 2017-03-03 | 2022-01-04 | 杭州海康威视数字技术股份有限公司 | 一种跟踪拍摄的方法、设备和无人机 |
US10796185B2 (en) * | 2017-11-03 | 2020-10-06 | Facebook, Inc. | Dynamic graceful degradation of augmented-reality effects |
CN107959798B (zh) * | 2017-12-18 | 2020-07-07 | 北京奇虎科技有限公司 | 视频数据实时处理方法及装置、计算设备 |
CN108062763B (zh) * | 2017-12-29 | 2020-10-16 | 纳恩博(北京)科技有限公司 | 目标跟踪方法及装置、存储介质 |
CN109785385B (zh) * | 2019-01-22 | 2021-01-29 | 中国科学院自动化研究所 | 视觉目标跟踪方法及系统 |
-
2020
- 2020-04-15 CN CN202010296054.2A patent/CN111563913B/zh active Active
- 2020-07-02 WO PCT/CN2020/099835 patent/WO2021208258A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN111563913A (zh) | 2020-08-21 |
CN111563913B (zh) | 2021-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021208256A1 (zh) | 一种视频处理方法、设备及手持相机 | |
CN110555883B (zh) | 相机姿态追踪过程的重定位方法、装置及存储介质 | |
US10924641B2 (en) | Wearable video camera medallion with circular display | |
WO2021208253A1 (zh) | 一种跟踪对象确定方法、设备和手持相机 | |
US9380207B1 (en) | Enabling multiple field of view image capture within a surround image mode for multi-lense mobile devices | |
WO2021208249A1 (zh) | 图像处理方法、设备及手持相机 | |
US8605158B2 (en) | Image pickup control apparatus, image pickup control method and computer readable medium for changing an image pickup mode | |
US20140139425A1 (en) | Image processing apparatus, image processing method, image capture apparatus and computer program | |
CN109040600A (zh) | 全景景象拍摄及浏览的移动装置、系统及方法 | |
WO2021208255A1 (zh) | 一种视频片段标记方法、设备及手持相机 | |
CN106657455B (zh) | 一种带可旋转摄像头的电子设备 | |
CN109981944A (zh) | 电子装置及其控制方法 | |
US20120120267A1 (en) | Electronic apparatus, control method, program, and image-capturing system | |
WO2021185374A1 (zh) | 一种拍摄图像的方法及电子设备 | |
CN111724412A (zh) | 确定运动轨迹的方法、装置及计算机存储介质 | |
WO2021208251A1 (zh) | 人脸跟踪方法及人脸跟踪设备 | |
CN110661979B (zh) | 摄像方法、装置、终端及存储介质 | |
WO2021208258A1 (zh) | 基于跟踪目标的搜索方法、设备及其手持相机 | |
WO2021208252A1 (zh) | 一种跟踪目标确定方法、装置和手持相机 | |
WO2021208257A1 (zh) | 跟踪状态确定方法、设备及手持相机 | |
WO2021208254A1 (zh) | 一种跟踪目标的找回方法、设备以及手持相机 | |
WO2021208259A1 (zh) | 云台驱动方法、设备及手持相机 | |
WO2021208260A1 (zh) | 目标对象的跟踪框显示方法、设备及手持相机 | |
WO2021208261A1 (zh) | 一种跟踪目标的找回方法、设备及手持相机 | |
CN114697570B (zh) | 用于显示图像的方法、电子设备及芯片 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20931166 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20931166 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.07.2023) |
|